%global _empty_manifest_terminate_build 0
Name: python-fastai
Version: 2.7.12
Release: 1
Summary: fastai simplifies training fast and accurate neural nets using modern best practices
License:	Apache-2.0
URL: https://github.com/fastai/fastai
Source0: https://mirrors.nju.edu.cn/pypi/web/packages/a8/8e/46a85646fd441d5beebf7db5d3c6086c1744f3511fe99ba00c5af91e5ec3/fastai-2.7.12.tar.gz
BuildArch: noarch
Requires: python3-pip
Requires: python3-packaging
Requires: python3-fastdownload
Requires: python3-fastcore
Requires: python3-torchvision
Requires: python3-matplotlib
Requires: python3-pandas
Requires: python3-requests
Requires: python3-pyyaml
Requires: python3-fastprogress
Requires: python3-pillow
Requires: python3-scikit-learn
Requires: python3-scipy
Requires: python3-spacy
Requires: python3-torch
Requires: python3-ipywidgets
Requires: python3-pytorch-lightning
Requires: python3-pytorch-ignite
Requires: python3-transformers
Requires: python3-sentencepiece
Requires: python3-tensorboard
Requires: python3-pydicom
Requires: python3-catalyst
Requires: python3-flask-compress
Requires: python3-captum
Requires: python3-flask
Requires: python3-wandb
Requires: python3-kornia
Requires: python3-scikit-image
Requires: python3-neptune-client
Requires: python3-comet-ml
Requires: python3-albumentations
Requires: python3-opencv-python
Requires: python3-pyarrow
Requires: python3-ninja
Requires: python3-timm
Requires: python3-accelerate
%description
![CI](https://github.com/fastai/fastai/workflows/CI/badge.svg)
[![PyPI](https://img.shields.io/pypi/v/fastai?color=blue&label=pypi%20version.png)](https://pypi.org/project/fastai/#description)
[![Conda (channel
only)](https://img.shields.io/conda/vn/fastai/fastai?color=seagreen&label=conda%20version.png)](https://anaconda.org/fastai/fastai)
[![Build fastai
images](https://github.com/fastai/docker-containers/workflows/Build%20fastai%20images/badge.svg)](https://github.com/fastai/docker-containers)
![docs](https://github.com/fastai/fastai/workflows/docs/badge.svg)
## Installing
You can use fastai without any installation by using [Google
Colab](https://colab.research.google.com/). In fact, every page of this
documentation is also available as an interactive notebook: click “Open
in Colab” at the top of any page to open it (be sure to change the Colab
runtime to “GPU” so it runs fast!). See the fast.ai documentation on
[Using Colab](https://course.fast.ai/start_colab) for more information.
You can install fastai on your own machines with conda (highly
recommended), as long as you’re running Linux or Windows (NB: Mac is not
supported). For Windows, please see the “Windows Support” section below
for important notes.
If you’re using
[miniconda](https://docs.conda.io/en/latest/miniconda.html)
(recommended) then run (note that if you replace `conda` with
[mamba](https://github.com/mamba-org/mamba) the install process will be
much faster and more reliable):
``` bash
conda install -c fastchan fastai
```
…or if you’re using
[Anaconda](https://www.anaconda.com/products/individual) then run:
``` bash
conda install -c fastchan fastai anaconda
```
To install with pip, use: `pip install fastai`. If you install with pip,
you should install PyTorch first by following the PyTorch [installation
instructions](https://pytorch.org/get-started/locally/).
If you plan to develop fastai yourself, or want to be on the cutting
edge, you can use an editable install (if you do this, you should also
use an editable install of
[fastcore](https://github.com/fastai/fastcore) to go with it). First
install PyTorch, and then:
``` bash
git clone https://github.com/fastai/fastai
pip install -e "fastai[dev]"
```
## Learning fastai
The best way to get started with fastai (and deep learning) is to read
[the
book](https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/dp/1492045527),
and complete [the free course](https://course.fast.ai).
To see what’s possible with fastai, take a look at the [Quick
Start](https://docs.fast.ai/quick_start.html), which shows how to use
around 5 lines of code to build an image classifier, an image
segmentation model, a text sentiment model, a recommendation system, and
a tabular model. For each of the applications, the code is much the
same.
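For a sense of what those few lines look like, here is roughly the shape of the Quick Start image-classifier example (a sketch based on the fastai vision API, not run here; it requires a working fastai install, downloads the Oxford-IIIT Pets dataset, and is best run on a GPU):

```python
# Sketch of the Quick Start image classifier using the fastai vision API.
# Requires fastai installed; downloads the Oxford-IIIT Pets dataset.
from fastai.vision.all import *

def is_cat(f):
    # In this dataset, cat images have filenames starting with an uppercase letter.
    return f[0].isupper()

path = untar_data(URLs.PETS)/'images'
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2,
    label_func=is_cat, item_tfms=Resize(224))

learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)
```

The other applications (segmentation, text, tabular, collaborative filtering) follow the same pattern: build `DataLoaders`, create a `Learner`, then call `fine_tune` or `fit`.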
Read through the [Tutorials](https://docs.fast.ai/tutorial.html) to
learn how to train your own models on your own datasets. Use the
navigation sidebar to look through the fastai documentation. Every
class, function, and method is documented here.
To learn about the design and motivation of the library, read the
[peer-reviewed paper](https://www.mdpi.com/2078-2489/11/2/108/htm).
## About fastai
fastai is a deep learning library which provides practitioners with
high-level components that can quickly and easily provide
state-of-the-art results in standard deep learning domains, and provides
researchers with low-level components that can be mixed and matched to
build new approaches. It aims to do both things without substantial
compromises in ease of use, flexibility, or performance. This is
possible thanks to a carefully layered architecture, which expresses
common underlying patterns of many deep learning and data processing
techniques in terms of decoupled abstractions. These abstractions can be
expressed concisely and clearly by leveraging the dynamism of the
underlying Python language and the flexibility of the PyTorch library.
fastai includes:
- A new type dispatch system for Python along with a semantic type
hierarchy for tensors
- A GPU-optimized computer vision library which can be extended in pure
Python
- An optimizer which refactors out the common functionality of modern
optimizers into two basic pieces, allowing optimization algorithms to
be implemented in 4–5 lines of code
- A novel 2-way callback system that can access any part of the data,
model, or optimizer and change it at any point during training
- A new data block API
- And much more…
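To make the “two-way” idea concrete, here is a minimal pure-Python sketch of such a callback loop (illustrative only; the `Loop`, `Callback`, and `CancelBatch` names below are hypothetical stand-ins, not fastai’s actual `Learner`/`Callback` API): callbacks can both read and mutate loop state, and can alter control flow by raising a cancellation exception.

```python
# Illustrative sketch of a two-way callback loop in plain Python.
class CancelBatch(Exception):
    "Raised by a callback to skip the rest of the current batch."

class Callback:
    def before_batch(self, loop): pass
    def after_batch(self, loop): pass

class GradClip(Callback):
    "Two-way: reads loop state (the gradient) and mutates it in place."
    def after_batch(self, loop):
        loop.grad = max(min(loop.grad, 1.0), -1.0)

class SkipEven(Callback):
    "Control flow: cancel processing of even-numbered batches."
    def before_batch(self, loop):
        if loop.batch_num % 2 == 0:
            raise CancelBatch()

class Loop:
    def __init__(self, cbs):
        self.cbs, self.grad, self.seen = cbs, 0.0, []

    def run(self, batches):
        for self.batch_num, b in enumerate(batches):
            try:
                for cb in self.cbs: cb.before_batch(self)
                self.grad = b * 2.0          # stand-in for forward/backward
                self.seen.append(b)
                for cb in self.cbs: cb.after_batch(self)
            except CancelBatch:
                continue

loop = Loop([SkipEven(), GradClip()])
loop.run([0.1, 0.9, 0.4])
print(loop.seen)   # batches 0 and 2 were cancelled -> [0.9]
print(loop.grad)   # 0.9 * 2.0 = 1.8, clipped to 1.0
```

In fastai the same mechanism lets a single callback implement, say, mixed precision or gradient accumulation without touching the training loop itself.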
fastai is organized around two main design goals: to be approachable and
rapidly productive, while also being deeply hackable and configurable.
It is built on top of a hierarchy of lower-level APIs which provide
composable building blocks. This way, a user wanting to rewrite part of
the high-level API or add particular behavior to suit their needs does
not have to learn how to use the lowest level.
## Migrating from other libraries
It’s very easy to migrate from plain PyTorch, Ignite, or any other
PyTorch-based library, or even to use fastai in conjunction with other
libraries. Generally, you’ll be able to use all your existing data
processing code, but will be able to reduce the amount of code you
require for training, and more easily take advantage of modern best
practices. Here are migration guides from some popular libraries to help
you on your way:
- [Plain PyTorch](https://docs.fast.ai/examples/migrating_pytorch.html)
- [Ignite](https://docs.fast.ai/examples/migrating_ignite.html)
- [Lightning](https://docs.fast.ai/examples/migrating_lightning.html)
- [Catalyst](https://docs.fast.ai/examples/migrating_catalyst.html)
## Windows Support
When installing with `mamba` or `conda`, replace `-c fastchan` in the
installation command with `-c pytorch -c nvidia -c fastai`, since
fastchan is not currently supported on Windows.
Due to Python multiprocessing issues on Jupyter and Windows, the
`num_workers` argument of `DataLoader` is automatically reset to 0 to
avoid Jupyter hanging. This makes tasks such as computer vision in
Jupyter on Windows many times slower than on Linux. This limitation
doesn’t exist if you use fastai from a script.
See [this
example](https://github.com/fastai/fastai/blob/master/nbs/examples/dataloader_spawn.py)
to fully leverage the fastai API on Windows.
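The linked example relies on the standard spawn-safe pattern from the Python standard library; in outline (a minimal stdlib sketch, not fastai-specific code):

```python
# On Windows (and in notebooks), multiprocessing uses the "spawn" start
# method: each worker re-imports the script, so code that creates workers
# must be protected by a __main__ guard or it would recurse endlessly.
import multiprocessing as mp

def square(x):
    return x * x

def main():
    with mp.Pool(2) as pool:
        print(pool.map(square, [1, 2, 3]))  # -> [1, 4, 9]

if __name__ == "__main__":
    main()
```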
## Tests
To run the tests in parallel, launch:
``` bash
nbdev_test
```
For all the tests to pass, you’ll need to install the dependencies
specified as part of `dev_requirements` in `settings.ini`:
``` bash
pip install -e .[dev]
```
Tests are written using `nbdev`, for example see the documentation for
`test_eq`.
## Contributing
After you clone this repository, make sure you have run
`nbdev_install_hooks` in your terminal. This installs Jupyter and git
hooks that automatically clean, trust, and fix merge conflicts in
notebooks.
After making changes in the repo, you should run `nbdev_prepare` and
make any additional changes needed for all the tests to pass.
## Docker Containers
Official Docker containers for this project can be found
[here](https://github.com/fastai/docker-containers#fastai).
%package -n python3-fastai
Summary: fastai simplifies training fast and accurate neural nets using modern best practices
Provides: python-fastai
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-fastai
![CI](https://github.com/fastai/fastai/workflows/CI/badge.svg)
[![PyPI](https://img.shields.io/pypi/v/fastai?color=blue&label=pypi%20version.png)](https://pypi.org/project/fastai/#description)
[![Conda (channel
only)](https://img.shields.io/conda/vn/fastai/fastai?color=seagreen&label=conda%20version.png)](https://anaconda.org/fastai/fastai)
[![Build fastai
images](https://github.com/fastai/docker-containers/workflows/Build%20fastai%20images/badge.svg)](https://github.com/fastai/docker-containers)
![docs](https://github.com/fastai/fastai/workflows/docs/badge.svg)
## Installing
You can use fastai without any installation by using [Google
Colab](https://colab.research.google.com/). In fact, every page of this
documentation is also available as an interactive notebook: click “Open
in Colab” at the top of any page to open it (be sure to change the Colab
runtime to “GPU” so it runs fast!). See the fast.ai documentation on
[Using Colab](https://course.fast.ai/start_colab) for more information.
You can install fastai on your own machines with conda (highly
recommended), as long as you’re running Linux or Windows (NB: Mac is not
supported). For Windows, please see the “Windows Support” section below
for important notes.
If you’re using
[miniconda](https://docs.conda.io/en/latest/miniconda.html)
(recommended) then run (note that if you replace `conda` with
[mamba](https://github.com/mamba-org/mamba) the install process will be
much faster and more reliable):
``` bash
conda install -c fastchan fastai
```
…or if you’re using
[Anaconda](https://www.anaconda.com/products/individual) then run:
``` bash
conda install -c fastchan fastai anaconda
```
To install with pip, use: `pip install fastai`. If you install with pip,
you should install PyTorch first by following the PyTorch [installation
instructions](https://pytorch.org/get-started/locally/).
If you plan to develop fastai yourself, or want to be on the cutting
edge, you can use an editable install (if you do this, you should also
use an editable install of
[fastcore](https://github.com/fastai/fastcore) to go with it). First
install PyTorch, and then:
``` bash
git clone https://github.com/fastai/fastai
pip install -e "fastai[dev]"
```
## Learning fastai
The best way to get started with fastai (and deep learning) is to read
[the
book](https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/dp/1492045527),
and complete [the free course](https://course.fast.ai).
To see what’s possible with fastai, take a look at the [Quick
Start](https://docs.fast.ai/quick_start.html), which shows how to use
around 5 lines of code to build an image classifier, an image
segmentation model, a text sentiment model, a recommendation system, and
a tabular model. For each of the applications, the code is much the
same.
Read through the [Tutorials](https://docs.fast.ai/tutorial.html) to
learn how to train your own models on your own datasets. Use the
navigation sidebar to look through the fastai documentation. Every
class, function, and method is documented here.
To learn about the design and motivation of the library, read the
[peer-reviewed paper](https://www.mdpi.com/2078-2489/11/2/108/htm).
## About fastai
fastai is a deep learning library which provides practitioners with
high-level components that can quickly and easily provide
state-of-the-art results in standard deep learning domains, and provides
researchers with low-level components that can be mixed and matched to
build new approaches. It aims to do both things without substantial
compromises in ease of use, flexibility, or performance. This is
possible thanks to a carefully layered architecture, which expresses
common underlying patterns of many deep learning and data processing
techniques in terms of decoupled abstractions. These abstractions can be
expressed concisely and clearly by leveraging the dynamism of the
underlying Python language and the flexibility of the PyTorch library.
fastai includes:
- A new type dispatch system for Python along with a semantic type
hierarchy for tensors
- A GPU-optimized computer vision library which can be extended in pure
Python
- An optimizer which refactors out the common functionality of modern
optimizers into two basic pieces, allowing optimization algorithms to
be implemented in 4–5 lines of code
- A novel 2-way callback system that can access any part of the data,
model, or optimizer and change it at any point during training
- A new data block API
- And much more…
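As a rough analogy for the type dispatch idea, Python’s standard library already offers single-argument dispatch via `functools.singledispatch`; fastai (via fastcore’s `typedispatch`) generalizes this to multiple arguments and to its semantic tensor types. A stdlib-only sketch:

```python
# Stdlib analogue of type dispatch: pick an implementation by argument type.
# fastai's own system is richer (multi-argument dispatch on tensor subclasses).
from functools import singledispatch

@singledispatch
def show(x):
    return f"object: {x!r}"      # fallback for unregistered types

@show.register
def _(x: int):
    return f"int: {x}"

@show.register
def _(x: list):
    return f"list of {len(x)} items"

print(show(3))        # int: 3
print(show([1, 2]))   # list of 2 items
print(show("hi"))     # object: 'hi'
```

In fastai this pattern is what lets a single `show_batch` call display images, text, or tabular rows appropriately, based on the types flowing through the pipeline.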
fastai is organized around two main design goals: to be approachable and
rapidly productive, while also being deeply hackable and configurable.
It is built on top of a hierarchy of lower-level APIs which provide
composable building blocks. This way, a user wanting to rewrite part of
the high-level API or add particular behavior to suit their needs does
not have to learn how to use the lowest level.
## Migrating from other libraries
It’s very easy to migrate from plain PyTorch, Ignite, or any other
PyTorch-based library, or even to use fastai in conjunction with other
libraries. Generally, you’ll be able to use all your existing data
processing code, but will be able to reduce the amount of code you
require for training, and more easily take advantage of modern best
practices. Here are migration guides from some popular libraries to help
you on your way:
- [Plain PyTorch](https://docs.fast.ai/examples/migrating_pytorch.html)
- [Ignite](https://docs.fast.ai/examples/migrating_ignite.html)
- [Lightning](https://docs.fast.ai/examples/migrating_lightning.html)
- [Catalyst](https://docs.fast.ai/examples/migrating_catalyst.html)
## Windows Support
When installing with `mamba` or `conda`, replace `-c fastchan` in the
installation command with `-c pytorch -c nvidia -c fastai`, since
fastchan is not currently supported on Windows.
Due to Python multiprocessing issues on Jupyter and Windows, the
`num_workers` argument of `DataLoader` is automatically reset to 0 to
avoid Jupyter hanging. This makes tasks such as computer vision in
Jupyter on Windows many times slower than on Linux. This limitation
doesn’t exist if you use fastai from a script.
See [this
example](https://github.com/fastai/fastai/blob/master/nbs/examples/dataloader_spawn.py)
to fully leverage the fastai API on Windows.
## Tests
To run the tests in parallel, launch:
``` bash
nbdev_test
```
For all the tests to pass, you’ll need to install the dependencies
specified as part of `dev_requirements` in `settings.ini`:
``` bash
pip install -e .[dev]
```
Tests are written using `nbdev`, for example see the documentation for
`test_eq`.
## Contributing
After you clone this repository, make sure you have run
`nbdev_install_hooks` in your terminal. This installs Jupyter and git
hooks that automatically clean, trust, and fix merge conflicts in
notebooks.
After making changes in the repo, you should run `nbdev_prepare` and
make any additional changes needed for all the tests to pass.
## Docker Containers
Official Docker containers for this project can be found
[here](https://github.com/fastai/docker-containers#fastai).
%package help
Summary: Development documents and examples for fastai
Provides: python3-fastai-doc
%description help
![CI](https://github.com/fastai/fastai/workflows/CI/badge.svg)
[![PyPI](https://img.shields.io/pypi/v/fastai?color=blue&label=pypi%20version.png)](https://pypi.org/project/fastai/#description)
[![Conda (channel
only)](https://img.shields.io/conda/vn/fastai/fastai?color=seagreen&label=conda%20version.png)](https://anaconda.org/fastai/fastai)
[![Build fastai
images](https://github.com/fastai/docker-containers/workflows/Build%20fastai%20images/badge.svg)](https://github.com/fastai/docker-containers)
![docs](https://github.com/fastai/fastai/workflows/docs/badge.svg)
## Installing
You can use fastai without any installation by using [Google
Colab](https://colab.research.google.com/). In fact, every page of this
documentation is also available as an interactive notebook: click “Open
in Colab” at the top of any page to open it (be sure to change the Colab
runtime to “GPU” so it runs fast!). See the fast.ai documentation on
[Using Colab](https://course.fast.ai/start_colab) for more information.
You can install fastai on your own machines with conda (highly
recommended), as long as you’re running Linux or Windows (NB: Mac is not
supported). For Windows, please see the “Windows Support” section below
for important notes.
If you’re using
[miniconda](https://docs.conda.io/en/latest/miniconda.html)
(recommended) then run (note that if you replace `conda` with
[mamba](https://github.com/mamba-org/mamba) the install process will be
much faster and more reliable):
``` bash
conda install -c fastchan fastai
```
…or if you’re using
[Anaconda](https://www.anaconda.com/products/individual) then run:
``` bash
conda install -c fastchan fastai anaconda
```
To install with pip, use: `pip install fastai`. If you install with pip,
you should install PyTorch first by following the PyTorch [installation
instructions](https://pytorch.org/get-started/locally/).
If you plan to develop fastai yourself, or want to be on the cutting
edge, you can use an editable install (if you do this, you should also
use an editable install of
[fastcore](https://github.com/fastai/fastcore) to go with it). First
install PyTorch, and then:
``` bash
git clone https://github.com/fastai/fastai
pip install -e "fastai[dev]"
```
## Learning fastai
The best way to get started with fastai (and deep learning) is to read
[the
book](https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/dp/1492045527),
and complete [the free course](https://course.fast.ai).
To see what’s possible with fastai, take a look at the [Quick
Start](https://docs.fast.ai/quick_start.html), which shows how to use
around 5 lines of code to build an image classifier, an image
segmentation model, a text sentiment model, a recommendation system, and
a tabular model. For each of the applications, the code is much the
same.
Read through the [Tutorials](https://docs.fast.ai/tutorial.html) to
learn how to train your own models on your own datasets. Use the
navigation sidebar to look through the fastai documentation. Every
class, function, and method is documented here.
To learn about the design and motivation of the library, read the
[peer-reviewed paper](https://www.mdpi.com/2078-2489/11/2/108/htm).
## About fastai
fastai is a deep learning library which provides practitioners with
high-level components that can quickly and easily provide
state-of-the-art results in standard deep learning domains, and provides
researchers with low-level components that can be mixed and matched to
build new approaches. It aims to do both things without substantial
compromises in ease of use, flexibility, or performance. This is
possible thanks to a carefully layered architecture, which expresses
common underlying patterns of many deep learning and data processing
techniques in terms of decoupled abstractions. These abstractions can be
expressed concisely and clearly by leveraging the dynamism of the
underlying Python language and the flexibility of the PyTorch library.
fastai includes:
- A new type dispatch system for Python along with a semantic type
hierarchy for tensors
- A GPU-optimized computer vision library which can be extended in pure
Python
- An optimizer which refactors out the common functionality of modern
optimizers into two basic pieces, allowing optimization algorithms to
be implemented in 4–5 lines of code
- A novel 2-way callback system that can access any part of the data,
model, or optimizer and change it at any point during training
- A new data block API
- And much more…
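The optimizer refactoring can be illustrated with a pure-Python sketch (hypothetical names, not fastai’s actual `Optimizer`): each update rule becomes a small stepper function, and an optimizer is just a composition of steppers, so a variant like “SGD with weight decay” needs only a few lines.

```python
# Illustrative sketch only: an optimizer decomposed into composable
# "stepper" functions, so new algorithms are a few lines each.
def sgd_step(p, lr, **state):
    p["val"] -= lr * p["grad"]
    return state

def weight_decay(p, lr, wd, **state):
    p["val"] *= 1 - lr * wd
    return state

class Optimizer:
    def __init__(self, params, steppers, **hypers):
        self.params, self.steppers, self.hypers = params, steppers, hypers

    def step(self):
        # Thread each parameter through the chain of steppers.
        for p in self.params:
            state = {}
            for stepper in self.steppers:
                state = stepper(p, **self.hypers, **state)

# "SGD with weight decay" is just the composition of the two pieces:
p = {"val": 1.0, "grad": 0.5}
opt = Optimizer([p], [weight_decay, sgd_step], lr=0.1, wd=0.01)
opt.step()
print(round(p["val"], 4))  # 1.0*(1 - 0.1*0.01) - 0.1*0.5 -> 0.949
```

Adding momentum or Adam-style statistics then only requires steppers that carry running values in `state`, rather than a whole new optimizer class.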
fastai is organized around two main design goals: to be approachable and
rapidly productive, while also being deeply hackable and configurable.
It is built on top of a hierarchy of lower-level APIs which provide
composable building blocks. This way, a user wanting to rewrite part of
the high-level API or add particular behavior to suit their needs does
not have to learn how to use the lowest level.
## Migrating from other libraries
It’s very easy to migrate from plain PyTorch, Ignite, or any other
PyTorch-based library, or even to use fastai in conjunction with other
libraries. Generally, you’ll be able to use all your existing data
processing code, but will be able to reduce the amount of code you
require for training, and more easily take advantage of modern best
practices. Here are migration guides from some popular libraries to help
you on your way:
- [Plain PyTorch](https://docs.fast.ai/examples/migrating_pytorch.html)
- [Ignite](https://docs.fast.ai/examples/migrating_ignite.html)
- [Lightning](https://docs.fast.ai/examples/migrating_lightning.html)
- [Catalyst](https://docs.fast.ai/examples/migrating_catalyst.html)
## Windows Support
When installing with `mamba` or `conda`, replace `-c fastchan` in the
installation command with `-c pytorch -c nvidia -c fastai`, since
fastchan is not currently supported on Windows.
Due to Python multiprocessing issues on Jupyter and Windows, the
`num_workers` argument of `DataLoader` is automatically reset to 0 to
avoid Jupyter hanging. This makes tasks such as computer vision in
Jupyter on Windows many times slower than on Linux. This limitation
doesn’t exist if you use fastai from a script.
See [this
example](https://github.com/fastai/fastai/blob/master/nbs/examples/dataloader_spawn.py)
to fully leverage the fastai API on Windows.
## Tests
To run the tests in parallel, launch:
``` bash
nbdev_test
```
For all the tests to pass, you’ll need to install the dependencies
specified as part of `dev_requirements` in `settings.ini`:
``` bash
pip install -e .[dev]
```
Tests are written using `nbdev`, for example see the documentation for
`test_eq`.
## Contributing
After you clone this repository, make sure you have run
`nbdev_install_hooks` in your terminal. This installs Jupyter and git
hooks that automatically clean, trust, and fix merge conflicts in
notebooks.
After making changes in the repo, you should run `nbdev_prepare` and
make any additional changes needed for all the tests to pass.
## Docker Containers
Official Docker containers for this project can be found
[here](https://github.com/fastai/docker-containers#fastai).
%prep
%autosetup -n fastai-2.7.12
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-fastai -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Fri Apr 21 2023 Python_Bot - 2.7.12-1
- Package Spec generated