%global _empty_manifest_terminate_build 0
Name: python-pytorch-adapt
Version: 0.0.83
Release: 1
Summary: Domain adaptation made easy. Fully featured, modular, and customizable.
License: MIT License
URL: https://github.com/KevinMusgrave/pytorch-adapt
Source0: https://mirrors.nju.edu.cn/pypi/web/packages/db/e5/96520821bbb5f2f38d3f77458e9b47e155b37a35e1e577b34f6dd5a55a49/pytorch-adapt-0.0.83.tar.gz
BuildArch: noarch
Requires: python3-numpy
Requires: python3-torch
Requires: python3-torchvision
Requires: python3-torchmetrics
Requires: python3-pytorch-metric-learning
Requires: python3-albumentations
Requires: python3-black
Requires: python3-isort
Requires: python3-nbqa
Requires: python3-flake8
Requires: python3-mkdocs-material
Requires: python3-mkdocstrings[python]
Requires: python3-griffe
Requires: python3-mkdocs-gen-files
Requires: python3-mkdocs-section-index
Requires: python3-mkdocs-literate-nav
Requires: python3-pytorch-ignite
Requires: python3-pytorch-lightning
Requires: python3-record-keeper
Requires: python3-tensorboard
Requires: python3-timm
%description
## Why use PyTorch Adapt?
PyTorch Adapt provides tools for **domain adaptation**, a type of machine learning technique that repurposes existing models to work in new domains. This library is:
### 1. **Fully featured**
Build a complete train/val domain adaptation pipeline in a few lines of code.
### 2. **Modular**
Use just the parts that suit your needs, whether it's the algorithms, loss functions, or validation methods.
### 3. **Highly customizable**
Customize and combine complex algorithms with ease.
### 4. **Compatible with frameworks**
Add additional functionality to your code by using one of the framework wrappers. Converting an algorithm into a PyTorch Lightning module is as simple as wrapping it with `Lightning`.
## Documentation
- [**Documentation**](https://kevinmusgrave.github.io/pytorch-adapt/)
- [**Installation instructions**](https://github.com/KevinMusgrave/pytorch-adapt#installation)
- [**List of papers implemented**](https://kevinmusgrave.github.io/pytorch-adapt/algorithms/uda)
## Examples
See the **[examples folder](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/examples/README.md)** for notebooks you can download or run on Google Colab.
## How to...
### Use in vanilla PyTorch
```python
from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, and dataloader are already created.
hook = DANNHook(optimizers)
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    _, loss = hook({**models, **data})
```
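The snippet above assumes `models`, `optimizers`, `dataloader`, and `device` already exist. As a minimal, hypothetical sketch of that setup (the `G`/`C`/`D` naming follows the library's convention for DANN, but the toy layer sizes, learning rate, and optimizer arrangement below are illustrative assumptions, not requirements):
```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy stand-ins: G extracts features, C classifies them, D discriminates domains.
models = {
    "G": torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(784, 128)).to(device),
    "C": torch.nn.Linear(128, 10).to(device),
    "D": torch.nn.Linear(128, 1).to(device),
}
# One optimizer per model is one common arrangement.
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]
```
Each batch yielded by `dataloader` is expected to be a dict of source and target tensors; see the datasets documentation for the exact keys.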
### Build complex algorithms
Let's customize `DANNHook` with:
- minimum class confusion
- virtual adversarial training
```python
import torch

from pytorch_adapt.hooks import DANNHook, MCCHook, VATHook

# G and C are the Generator and Classifier models.
G, C = models["G"], models["C"]
misc = {"combined_model": torch.nn.Sequential(G, C)}
hook = DANNHook(optimizers, post_g=[MCCHook(), VATHook()])
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    _, loss = hook({**models, **data, **misc})
```
### Wrap with your favorite PyTorch framework
First, set up the adapter and dataloaders:
```python
from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models
from pytorch_adapt.datasets import DataloaderCreator
models_cont = Models(models)
adapter = DANN(models=models_cont)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)
```
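Here, `datasets` is assumed to be a dict of train/val splits. One way to produce such a dict, used in the library's example notebooks, is the built-in MNIST→MNIST-M downloader; treat the exact call signature below as an assumption to check against the current documentation:
```python
from pytorch_adapt.datasets import DataloaderCreator, get_mnist_mnistm

# Downloads MNIST and MNIST-M, returning a dict of datasets
# keyed the way DataloaderCreator expects (train/val source/target splits).
datasets = get_mnist_mnistm(["mnist"], ["mnistm"], folder=".", download=True)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)
```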
Then use a framework wrapper:
#### PyTorch Lightning
```python
import pytorch_lightning as pl
from pytorch_adapt.frameworks.lightning import Lightning
L_adapter = Lightning(adapter)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, dataloaders["train"])
```
#### PyTorch Ignite
```python
from pytorch_adapt.frameworks.ignite import Ignite

trainer = Ignite(adapter)
trainer.run(datasets, dataloader_creator=dc)
```
### Check your model's performance
You can do this in vanilla PyTorch:
```python
from pytorch_adapt.validators import SNDValidator
# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator(target_train=target_train)
```
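SND scores the softmaxed predictions of the unlabeled target training set, so "predictions have been collected" might look like the following sketch (the `target_dataloader` batch key and the reuse of `G`, `C`, and `device` from earlier are assumptions):
```python
import torch
import torch.nn.functional as F

# Gather softmax predictions over the unlabeled target train set.
preds = []
with torch.no_grad():
    for data in target_dataloader:  # assumed to yield {"target_imgs": ...}
        logits = C(G(data["target_imgs"].to(device)))
        preds.append(F.softmax(logits, dim=1).cpu())
preds = torch.cat(preds, dim=0)
```
`SNDValidator` then computes the soft neighborhood density of these predictions.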
You can also do this during training with a framework wrapper:
#### PyTorch Lightning
```python
from pytorch_adapt.frameworks.utils import filter_datasets

# Reusing pl, Lightning, SNDValidator, adapter, dc, and datasets from above.
validator = SNDValidator()
dataloaders = dc(**filter_datasets(datasets, validator))
train_loader = dataloaders.pop("train")
L_adapter = Lightning(adapter, validator=validator)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, train_loader, list(dataloaders.values()))
```
#### PyTorch Ignite
```python
from pytorch_adapt.validators import ScoreHistory
validator = ScoreHistory(SNDValidator())
trainer = Ignite(adapter, validator=validator)
trainer.run(datasets, dataloader_creator=dc)
```
### Run the above examples
See [this notebook](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/examples/other/ReadmeExamples.ipynb) and [the examples page](https://github.com/KevinMusgrave/pytorch-adapt/tree/main/examples/) for other notebooks.
## Installation
### Pip
```
pip install pytorch-adapt
```
**To get the latest dev version**:
```
pip install pytorch-adapt --pre
```
**To use `pytorch_adapt.frameworks.lightning`**:
```
pip install pytorch-adapt[lightning]
```
**To use `pytorch_adapt.frameworks.ignite`**:
```
pip install pytorch-adapt[ignite]
```
### Conda
Coming soon...
### Dependencies
See [setup.py](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/setup.py)
## Acknowledgements
### Contributors
Thanks to the contributors who made pull requests!
| Contributor | Highlights |
| -- | -- |
| [deepseek-eoghan](https://github.com/deepseek-eoghan) | Improved the TargetDataset class |
### Advisors
Thank you to [Ser-Nam Lim](https://research.fb.com/people/lim-ser-nam/), and my research advisor, [Professor Serge Belongie](https://vision.cornell.edu/se3/people/serge-belongie/).
### Logo
Thanks to [Jeff Musgrave](https://www.designgenius.ca/) for designing the logo.
### Citing this library
If you'd like to cite pytorch-adapt in your paper, please cite [this paper](https://arxiv.org/abs/2211.15673) using the following BibTeX entry:
```latex
@article{Musgrave2022PyTorchA,
title={PyTorch Adapt},
author={Kevin Musgrave and Serge J. Belongie and Ser Nam Lim},
journal={ArXiv},
year={2022},
volume={abs/2211.15673}
}
```
### Code references (in no particular order)
- https://github.com/wgchang/DSBN
- https://github.com/jihanyang/AFN
- https://github.com/thuml/Versatile-Domain-Adaptation
- https://github.com/tim-learn/ATDOC
- https://github.com/thuml/CDAN
- https://github.com/takerum/vat_chainer
- https://github.com/takerum/vat_tf
- https://github.com/RuiShu/dirt-t
- https://github.com/lyakaap/VAT-pytorch
- https://github.com/9310gaurav/virtual-adversarial-training
- https://github.com/thuml/Deep-Embedded-Validation
- https://github.com/lr94/abas
- https://github.com/thuml/Batch-Spectral-Penalization
- https://github.com/jvanvugt/pytorch-domain-adaptation
- https://github.com/ptrblck/pytorch_misc
%package -n python3-pytorch-adapt
Summary: Domain adaptation made easy. Fully featured, modular, and customizable.
Provides: python-pytorch-adapt
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-pytorch-adapt
## Why use PyTorch Adapt?
PyTorch Adapt provides tools for **domain adaptation**, a type of machine learning technique that repurposes existing models to work in new domains. This library is:
### 1. **Fully featured**
Build a complete train/val domain adaptation pipeline in a few lines of code.
### 2. **Modular**
Use just the parts that suit your needs, whether it's the algorithms, loss functions, or validation methods.
### 3. **Highly customizable**
Customize and combine complex algorithms with ease.
### 4. **Compatible with frameworks**
Add additional functionality to your code by using one of the framework wrappers. Converting an algorithm into a PyTorch Lightning module is as simple as wrapping it with `Lightning`.
## Documentation
- [**Documentation**](https://kevinmusgrave.github.io/pytorch-adapt/)
- [**Installation instructions**](https://github.com/KevinMusgrave/pytorch-adapt#installation)
- [**List of papers implemented**](https://kevinmusgrave.github.io/pytorch-adapt/algorithms/uda)
## Examples
See the **[examples folder](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/examples/README.md)** for notebooks you can download or run on Google Colab.
## How to...
### Use in vanilla PyTorch
```python
from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, and dataloader are already created.
hook = DANNHook(optimizers)
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    _, loss = hook({**models, **data})
```
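The snippet above assumes `models`, `optimizers`, `dataloader`, and `device` already exist. As a minimal, hypothetical sketch of that setup (the `G`/`C`/`D` naming follows the library's convention for DANN, but the toy layer sizes, learning rate, and optimizer arrangement below are illustrative assumptions, not requirements):
```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy stand-ins: G extracts features, C classifies them, D discriminates domains.
models = {
    "G": torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(784, 128)).to(device),
    "C": torch.nn.Linear(128, 10).to(device),
    "D": torch.nn.Linear(128, 1).to(device),
}
# One optimizer per model is one common arrangement.
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]
```
Each batch yielded by `dataloader` is expected to be a dict of source and target tensors; see the datasets documentation for the exact keys.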
### Build complex algorithms
Let's customize `DANNHook` with:
- minimum class confusion
- virtual adversarial training
```python
import torch

from pytorch_adapt.hooks import DANNHook, MCCHook, VATHook

# G and C are the Generator and Classifier models.
G, C = models["G"], models["C"]
misc = {"combined_model": torch.nn.Sequential(G, C)}
hook = DANNHook(optimizers, post_g=[MCCHook(), VATHook()])
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    _, loss = hook({**models, **data, **misc})
```
### Wrap with your favorite PyTorch framework
First, set up the adapter and dataloaders:
```python
from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models
from pytorch_adapt.datasets import DataloaderCreator
models_cont = Models(models)
adapter = DANN(models=models_cont)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)
```
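Here, `datasets` is assumed to be a dict of train/val splits. One way to produce such a dict, used in the library's example notebooks, is the built-in MNIST→MNIST-M downloader; treat the exact call signature below as an assumption to check against the current documentation:
```python
from pytorch_adapt.datasets import DataloaderCreator, get_mnist_mnistm

# Downloads MNIST and MNIST-M, returning a dict of datasets
# keyed the way DataloaderCreator expects (train/val source/target splits).
datasets = get_mnist_mnistm(["mnist"], ["mnistm"], folder=".", download=True)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)
```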
Then use a framework wrapper:
#### PyTorch Lightning
```python
import pytorch_lightning as pl
from pytorch_adapt.frameworks.lightning import Lightning
L_adapter = Lightning(adapter)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, dataloaders["train"])
```
#### PyTorch Ignite
```python
from pytorch_adapt.frameworks.ignite import Ignite

trainer = Ignite(adapter)
trainer.run(datasets, dataloader_creator=dc)
```
### Check your model's performance
You can do this in vanilla PyTorch:
```python
from pytorch_adapt.validators import SNDValidator
# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator(target_train=target_train)
```
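SND scores the softmaxed predictions of the unlabeled target training set, so "predictions have been collected" might look like the following sketch (the `target_dataloader` batch key and the reuse of `G`, `C`, and `device` from earlier are assumptions):
```python
import torch
import torch.nn.functional as F

# Gather softmax predictions over the unlabeled target train set.
preds = []
with torch.no_grad():
    for data in target_dataloader:  # assumed to yield {"target_imgs": ...}
        logits = C(G(data["target_imgs"].to(device)))
        preds.append(F.softmax(logits, dim=1).cpu())
preds = torch.cat(preds, dim=0)
```
`SNDValidator` then computes the soft neighborhood density of these predictions.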
You can also do this during training with a framework wrapper:
#### PyTorch Lightning
```python
from pytorch_adapt.frameworks.utils import filter_datasets

# Reusing pl, Lightning, SNDValidator, adapter, dc, and datasets from above.
validator = SNDValidator()
dataloaders = dc(**filter_datasets(datasets, validator))
train_loader = dataloaders.pop("train")
L_adapter = Lightning(adapter, validator=validator)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, train_loader, list(dataloaders.values()))
```
#### PyTorch Ignite
```python
from pytorch_adapt.validators import ScoreHistory
validator = ScoreHistory(SNDValidator())
trainer = Ignite(adapter, validator=validator)
trainer.run(datasets, dataloader_creator=dc)
```
### Run the above examples
See [this notebook](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/examples/other/ReadmeExamples.ipynb) and [the examples page](https://github.com/KevinMusgrave/pytorch-adapt/tree/main/examples/) for other notebooks.
## Installation
### Pip
```
pip install pytorch-adapt
```
**To get the latest dev version**:
```
pip install pytorch-adapt --pre
```
**To use `pytorch_adapt.frameworks.lightning`**:
```
pip install pytorch-adapt[lightning]
```
**To use `pytorch_adapt.frameworks.ignite`**:
```
pip install pytorch-adapt[ignite]
```
### Conda
Coming soon...
### Dependencies
See [setup.py](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/setup.py)
## Acknowledgements
### Contributors
Thanks to the contributors who made pull requests!
| Contributor | Highlights |
| -- | -- |
| [deepseek-eoghan](https://github.com/deepseek-eoghan) | Improved the TargetDataset class |
### Advisors
Thank you to [Ser-Nam Lim](https://research.fb.com/people/lim-ser-nam/), and my research advisor, [Professor Serge Belongie](https://vision.cornell.edu/se3/people/serge-belongie/).
### Logo
Thanks to [Jeff Musgrave](https://www.designgenius.ca/) for designing the logo.
### Citing this library
If you'd like to cite pytorch-adapt in your paper, please cite [this paper](https://arxiv.org/abs/2211.15673) using the following BibTeX entry:
```latex
@article{Musgrave2022PyTorchA,
title={PyTorch Adapt},
author={Kevin Musgrave and Serge J. Belongie and Ser Nam Lim},
journal={ArXiv},
year={2022},
volume={abs/2211.15673}
}
```
### Code references (in no particular order)
- https://github.com/wgchang/DSBN
- https://github.com/jihanyang/AFN
- https://github.com/thuml/Versatile-Domain-Adaptation
- https://github.com/tim-learn/ATDOC
- https://github.com/thuml/CDAN
- https://github.com/takerum/vat_chainer
- https://github.com/takerum/vat_tf
- https://github.com/RuiShu/dirt-t
- https://github.com/lyakaap/VAT-pytorch
- https://github.com/9310gaurav/virtual-adversarial-training
- https://github.com/thuml/Deep-Embedded-Validation
- https://github.com/lr94/abas
- https://github.com/thuml/Batch-Spectral-Penalization
- https://github.com/jvanvugt/pytorch-domain-adaptation
- https://github.com/ptrblck/pytorch_misc
%package help
Summary: Development documents and examples for pytorch-adapt
Provides: python3-pytorch-adapt-doc
%description help
## Why use PyTorch Adapt?
PyTorch Adapt provides tools for **domain adaptation**, a type of machine learning technique that repurposes existing models to work in new domains. This library is:
### 1. **Fully featured**
Build a complete train/val domain adaptation pipeline in a few lines of code.
### 2. **Modular**
Use just the parts that suit your needs, whether it's the algorithms, loss functions, or validation methods.
### 3. **Highly customizable**
Customize and combine complex algorithms with ease.
### 4. **Compatible with frameworks**
Add additional functionality to your code by using one of the framework wrappers. Converting an algorithm into a PyTorch Lightning module is as simple as wrapping it with `Lightning`.
## Documentation
- [**Documentation**](https://kevinmusgrave.github.io/pytorch-adapt/)
- [**Installation instructions**](https://github.com/KevinMusgrave/pytorch-adapt#installation)
- [**List of papers implemented**](https://kevinmusgrave.github.io/pytorch-adapt/algorithms/uda)
## Examples
See the **[examples folder](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/examples/README.md)** for notebooks you can download or run on Google Colab.
## How to...
### Use in vanilla PyTorch
```python
from tqdm import tqdm

from pytorch_adapt.hooks import DANNHook
from pytorch_adapt.utils.common_functions import batch_to_device

# Assuming that models, optimizers, and dataloader are already created.
hook = DANNHook(optimizers)
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    # Optimization is done inside the hook.
    # The returned loss is for logging.
    _, loss = hook({**models, **data})
```
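The snippet above assumes `models`, `optimizers`, `dataloader`, and `device` already exist. As a minimal, hypothetical sketch of that setup (the `G`/`C`/`D` naming follows the library's convention for DANN, but the toy layer sizes, learning rate, and optimizer arrangement below are illustrative assumptions, not requirements):
```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy stand-ins: G extracts features, C classifies them, D discriminates domains.
models = {
    "G": torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(784, 128)).to(device),
    "C": torch.nn.Linear(128, 10).to(device),
    "D": torch.nn.Linear(128, 1).to(device),
}
# One optimizer per model is one common arrangement.
optimizers = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in models.values()]
```
Each batch yielded by `dataloader` is expected to be a dict of source and target tensors; see the datasets documentation for the exact keys.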
### Build complex algorithms
Let's customize `DANNHook` with:
- minimum class confusion
- virtual adversarial training
```python
import torch

from pytorch_adapt.hooks import DANNHook, MCCHook, VATHook

# G and C are the Generator and Classifier models.
G, C = models["G"], models["C"]
misc = {"combined_model": torch.nn.Sequential(G, C)}
hook = DANNHook(optimizers, post_g=[MCCHook(), VATHook()])
for data in tqdm(dataloader):
    data = batch_to_device(data, device)
    _, loss = hook({**models, **data, **misc})
```
### Wrap with your favorite PyTorch framework
First, set up the adapter and dataloaders:
```python
from pytorch_adapt.adapters import DANN
from pytorch_adapt.containers import Models
from pytorch_adapt.datasets import DataloaderCreator
models_cont = Models(models)
adapter = DANN(models=models_cont)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)
```
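Here, `datasets` is assumed to be a dict of train/val splits. One way to produce such a dict, used in the library's example notebooks, is the built-in MNIST→MNIST-M downloader; treat the exact call signature below as an assumption to check against the current documentation:
```python
from pytorch_adapt.datasets import DataloaderCreator, get_mnist_mnistm

# Downloads MNIST and MNIST-M, returning a dict of datasets
# keyed the way DataloaderCreator expects (train/val source/target splits).
datasets = get_mnist_mnistm(["mnist"], ["mnistm"], folder=".", download=True)
dc = DataloaderCreator(num_workers=2)
dataloaders = dc(**datasets)
```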
Then use a framework wrapper:
#### PyTorch Lightning
```python
import pytorch_lightning as pl
from pytorch_adapt.frameworks.lightning import Lightning
L_adapter = Lightning(adapter)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, dataloaders["train"])
```
#### PyTorch Ignite
```python
from pytorch_adapt.frameworks.ignite import Ignite

trainer = Ignite(adapter)
trainer.run(datasets, dataloader_creator=dc)
```
### Check your model's performance
You can do this in vanilla PyTorch:
```python
from pytorch_adapt.validators import SNDValidator
# Assuming predictions have been collected
target_train = {"preds": preds}
validator = SNDValidator()
score = validator(target_train=target_train)
```
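SND scores the softmaxed predictions of the unlabeled target training set, so "predictions have been collected" might look like the following sketch (the `target_dataloader` batch key and the reuse of `G`, `C`, and `device` from earlier are assumptions):
```python
import torch
import torch.nn.functional as F

# Gather softmax predictions over the unlabeled target train set.
preds = []
with torch.no_grad():
    for data in target_dataloader:  # assumed to yield {"target_imgs": ...}
        logits = C(G(data["target_imgs"].to(device)))
        preds.append(F.softmax(logits, dim=1).cpu())
preds = torch.cat(preds, dim=0)
```
`SNDValidator` then computes the soft neighborhood density of these predictions.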
You can also do this during training with a framework wrapper:
#### PyTorch Lightning
```python
from pytorch_adapt.frameworks.utils import filter_datasets

# Reusing pl, Lightning, SNDValidator, adapter, dc, and datasets from above.
validator = SNDValidator()
dataloaders = dc(**filter_datasets(datasets, validator))
train_loader = dataloaders.pop("train")
L_adapter = Lightning(adapter, validator=validator)
trainer = pl.Trainer(gpus=1, max_epochs=1)
trainer.fit(L_adapter, train_loader, list(dataloaders.values()))
```
#### PyTorch Ignite
```python
from pytorch_adapt.validators import ScoreHistory
validator = ScoreHistory(SNDValidator())
trainer = Ignite(adapter, validator=validator)
trainer.run(datasets, dataloader_creator=dc)
```
### Run the above examples
See [this notebook](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/examples/other/ReadmeExamples.ipynb) and [the examples page](https://github.com/KevinMusgrave/pytorch-adapt/tree/main/examples/) for other notebooks.
## Installation
### Pip
```
pip install pytorch-adapt
```
**To get the latest dev version**:
```
pip install pytorch-adapt --pre
```
**To use `pytorch_adapt.frameworks.lightning`**:
```
pip install pytorch-adapt[lightning]
```
**To use `pytorch_adapt.frameworks.ignite`**:
```
pip install pytorch-adapt[ignite]
```
### Conda
Coming soon...
### Dependencies
See [setup.py](https://github.com/KevinMusgrave/pytorch-adapt/blob/main/setup.py)
## Acknowledgements
### Contributors
Thanks to the contributors who made pull requests!
| Contributor | Highlights |
| -- | -- |
| [deepseek-eoghan](https://github.com/deepseek-eoghan) | Improved the TargetDataset class |
### Advisors
Thank you to [Ser-Nam Lim](https://research.fb.com/people/lim-ser-nam/), and my research advisor, [Professor Serge Belongie](https://vision.cornell.edu/se3/people/serge-belongie/).
### Logo
Thanks to [Jeff Musgrave](https://www.designgenius.ca/) for designing the logo.
### Citing this library
If you'd like to cite pytorch-adapt in your paper, please cite [this paper](https://arxiv.org/abs/2211.15673) using the following BibTeX entry:
```latex
@article{Musgrave2022PyTorchA,
title={PyTorch Adapt},
author={Kevin Musgrave and Serge J. Belongie and Ser Nam Lim},
journal={ArXiv},
year={2022},
volume={abs/2211.15673}
}
```
### Code references (in no particular order)
- https://github.com/wgchang/DSBN
- https://github.com/jihanyang/AFN
- https://github.com/thuml/Versatile-Domain-Adaptation
- https://github.com/tim-learn/ATDOC
- https://github.com/thuml/CDAN
- https://github.com/takerum/vat_chainer
- https://github.com/takerum/vat_tf
- https://github.com/RuiShu/dirt-t
- https://github.com/lyakaap/VAT-pytorch
- https://github.com/9310gaurav/virtual-adversarial-training
- https://github.com/thuml/Deep-Embedded-Validation
- https://github.com/lr94/abas
- https://github.com/thuml/Batch-Spectral-Penalization
- https://github.com/jvanvugt/pytorch-domain-adaptation
- https://github.com/ptrblck/pytorch_misc
%prep
%autosetup -n pytorch-adapt-0.0.83
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-pytorch-adapt -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Fri May 05 2023 Python_Bot <Python_Bot@openeuler.org> - 0.0.83-1
- Package Spec generated