path: root/python-ml4a.spec
Diffstat (limited to 'python-ml4a.spec')
-rw-r--r--  python-ml4a.spec  257
1 file changed, 257 insertions(+), 0 deletions(-)
diff --git a/python-ml4a.spec b/python-ml4a.spec
new file mode 100644
index 0000000..ebccab2
--- /dev/null
+++ b/python-ml4a.spec
@@ -0,0 +1,257 @@
+%global _empty_manifest_terminate_build 0
+Name: python-ml4a
+Version: 0.1.3
+Release: 1
+Summary: A toolkit for making art with machine learning, including an API for popular deep learning models, recipes for combining them, and a suite of educational examples
+License: MIT
+URL: https://github.com/ml4a/ml4a
+Source0: https://mirrors.aliyun.com/pypi/web/packages/80/1a/7452b2ec23b55f8ad283d99c0ba57ecbcc332262bb2ca74837b78e446622/ml4a-0.1.3.tar.gz
+BuildArch: noarch
+
+Requires: python3-bs4
+Requires: python3-dill
+Requires: python3-imutils
+Requires: python3-inflect
+Requires: python3-face-recognition
+Requires: python3-gdown
+Requires: python3-ipython
+Requires: python3-ipywidgets
+Requires: python3-librosa
+Requires: python3-lxml
+Requires: python3-matplotlib
+Requires: python3-moviepy
+Requires: python3-ninja
+Requires: python3-noise
+Requires: python3-numba
+Requires: python3-numpy
+Requires: python3-omegaconf
+Requires: python3-opencv-python
+Requires: python3-Pillow
+Requires: python3-pytorch-lightning
+Requires: python3-psutil
+Requires: python3-scikit-image
+Requires: python3-scikit-learn
+Requires: python3-tensorflow-gpu
+Requires: python3-torch
+Requires: python3-torchvision
+Requires: python3-tqdm
+Requires: python3-unidecode
+Requires: python3-yacs
+
+%description
+<h1 align="center">
+ <br>
+ <a href="https://ml4a.net/"><img src="https://pbs.twimg.com/profile_images/717391151041540096/K3Z09zCg_400x400.jpg" alt="ml4a" width="200"></a>
+ <br>
+ Machine Learning for Artists
+ <br>
+</h1>
+<div align="center">
+ <a href="https://ml-4a.slack.com/"><img src="https://img.shields.io/badge/chat-on%20slack-7A5979.svg" /></a>
+ <a href="https://mybinder.org/v2/gh/ml4a/ml4a/ml4a.net"><img src="https://mybinder.org/badge.svg" /></a>
+ <a href="http://colab.research.google.com/github/ml4a/ml4a/blob/ml4a.net"><img src="https://colab.research.google.com/assets/colab-badge.svg" /></a>
+ <a href="https://twitter.com/ml4a_"><img src="https://img.shields.io/twitter/follow/ml4a_?label=Follow&style=social"></a>
+</div>
+
+[ml4a](https://ml4a.net) is a Python library for making art with machine learning. It features:
+
+* an API wrapping popular deep learning models with creative applications, including [StyleGAN2](https://github.com/NVLabs/stylegan2/), [SPADE](https://github.com/NVlabs/SPADE), [Neural Style Transfer](https://github.com/genekogan/neural_style), [DeepDream](https://github.com/genekogan/deepdream), and [many others](https://github.com/ml4a/ml4a/tree/master/ml4a/models/submodules).
+* a collection of [Jupyter notebooks](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/examples) explaining the basics of deep learning for beginners, and providing [recipes for using the materials creatively](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/examples/models).
+
+## Example
+
+ml4a bundles the source code of various open source repositories as [git submodules](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/ml4a/models/submodules) and contains wrappers to streamline and simplify them. For example, to generate sample images with StyleGAN2:
+
+```python
+from ml4a import image
+from ml4a.models import stylegan
+
+# fetch the pretrained FFHQ (faces) checkpoint and load it into the wrapper
+network_pkl = stylegan.get_pretrained_model('ffhq')
+stylegan.load_model(network_pkl)
+
+# draw three random samples and display them inline
+samples = stylegan.random_sample(3, labels=None, truncation=1.0)
+image.display(samples)
+```
+
+Every model in `ml4a.models`, including the `stylegan` module above, imports all of the original repository's code into its namespace, allowing low-level access.
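+
+As a quick way to see what a wrapper re-exports, you can introspect its namespace. The snippet below is a minimal sketch that relies only on the `stylegan` module shown above and standard Python introspection:
+
+```python
+from ml4a.models import stylegan
+
+# list the public names available on the wrapper, which include both its own
+# helpers (e.g. get_pretrained_model, random_sample) and symbols pulled in
+# from the upstream StyleGAN2 code
+public_names = [name for name in dir(stylegan) if not name.startswith('_')]
+print(sorted(public_names))
+```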
+
+## Support ml4a
+
+### Become a sponsor
+
+You can support ml4a by [donating through GitHub sponsors](https://github.com/sponsors/ml4a/).
+
+### How to contribute
+
+Start by joining the [Slack](https://join.slack.com/t/ml-4a/shared_invite/enQtNjA4MjgzODk1MjA3LTlhYjQ5NWQ2OTNlODZiMDRjZTFmNDZiYjlmZWYwNGM0YjIxNjE3Yjc0NWVjMmVlZjNmZDhmYTkzZjk0ZTg1ZGM%3E) or following us on [Twitter](https://www.twitter.com/ml4a_). Contribute to the codebase, or help write tutorials.
+
+
+## License
+
+ml4a itself is [licensed MIT](https://github.com/ml4a/ml4a/blob/master/LICENSE), but you are also bound to the licenses of any [models](https://github.com/ml4a/ml4a/tree/master/ml4a/models/submodules) you use.
+
+
+
+
+%package -n python3-ml4a
+Summary: A toolkit for making art with machine learning, including an API for popular deep learning models, recipes for combining them, and a suite of educational examples
+Provides: python-ml4a
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-ml4a
+<h1 align="center">
+ <br>
+ <a href="https://ml4a.net/"><img src="https://pbs.twimg.com/profile_images/717391151041540096/K3Z09zCg_400x400.jpg" alt="ml4a" width="200"></a>
+ <br>
+ Machine Learning for Artists
+ <br>
+</h1>
+<div align="center">
+ <a href="https://ml-4a.slack.com/"><img src="https://img.shields.io/badge/chat-on%20slack-7A5979.svg" /></a>
+ <a href="https://mybinder.org/v2/gh/ml4a/ml4a/ml4a.net"><img src="https://mybinder.org/badge.svg" /></a>
+ <a href="http://colab.research.google.com/github/ml4a/ml4a/blob/ml4a.net"><img src="https://colab.research.google.com/assets/colab-badge.svg" /></a>
+ <a href="https://twitter.com/ml4a_"><img src="https://img.shields.io/twitter/follow/ml4a_?label=Follow&style=social"></a>
+</div>
+
+[ml4a](https://ml4a.net) is a Python library for making art with machine learning. It features:
+
+* an API wrapping popular deep learning models with creative applications, including [StyleGAN2](https://github.com/NVLabs/stylegan2/), [SPADE](https://github.com/NVlabs/SPADE), [Neural Style Transfer](https://github.com/genekogan/neural_style), [DeepDream](https://github.com/genekogan/deepdream), and [many others](https://github.com/ml4a/ml4a/tree/master/ml4a/models/submodules).
+* a collection of [Jupyter notebooks](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/examples) explaining the basics of deep learning for beginners, and providing [recipes for using the materials creatively](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/examples/models).
+
+## Example
+
+ml4a bundles the source code of various open source repositories as [git submodules](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/ml4a/models/submodules) and contains wrappers to streamline and simplify them. For example, to generate sample images with StyleGAN2:
+
+```python
+from ml4a import image
+from ml4a.models import stylegan
+
+# fetch the pretrained FFHQ (faces) checkpoint and load it into the wrapper
+network_pkl = stylegan.get_pretrained_model('ffhq')
+stylegan.load_model(network_pkl)
+
+# draw three random samples and display them inline
+samples = stylegan.random_sample(3, labels=None, truncation=1.0)
+image.display(samples)
+```
+
+Every model in `ml4a.models`, including the `stylegan` module above, imports all of the original repository's code into its namespace, allowing low-level access.
+
+## Support ml4a
+
+### Become a sponsor
+
+You can support ml4a by [donating through GitHub sponsors](https://github.com/sponsors/ml4a/).
+
+### How to contribute
+
+Start by joining the [Slack](https://join.slack.com/t/ml-4a/shared_invite/enQtNjA4MjgzODk1MjA3LTlhYjQ5NWQ2OTNlODZiMDRjZTFmNDZiYjlmZWYwNGM0YjIxNjE3Yjc0NWVjMmVlZjNmZDhmYTkzZjk0ZTg1ZGM%3E) or following us on [Twitter](https://www.twitter.com/ml4a_). Contribute to the codebase, or help write tutorials.
+
+
+## License
+
+ml4a itself is [licensed MIT](https://github.com/ml4a/ml4a/blob/master/LICENSE), but you are also bound to the licenses of any [models](https://github.com/ml4a/ml4a/tree/master/ml4a/models/submodules) you use.
+
+
+
+
+%package help
+Summary: Development documents and examples for ml4a
+Provides: python3-ml4a-doc
+%description help
+<h1 align="center">
+ <br>
+ <a href="https://ml4a.net/"><img src="https://pbs.twimg.com/profile_images/717391151041540096/K3Z09zCg_400x400.jpg" alt="ml4a" width="200"></a>
+ <br>
+ Machine Learning for Artists
+ <br>
+</h1>
+<div align="center">
+ <a href="https://ml-4a.slack.com/"><img src="https://img.shields.io/badge/chat-on%20slack-7A5979.svg" /></a>
+ <a href="https://mybinder.org/v2/gh/ml4a/ml4a/ml4a.net"><img src="https://mybinder.org/badge.svg" /></a>
+ <a href="http://colab.research.google.com/github/ml4a/ml4a/blob/ml4a.net"><img src="https://colab.research.google.com/assets/colab-badge.svg" /></a>
+ <a href="https://twitter.com/ml4a_"><img src="https://img.shields.io/twitter/follow/ml4a_?label=Follow&style=social"></a>
+</div>
+
+[ml4a](https://ml4a.net) is a Python library for making art with machine learning. It features:
+
+* an API wrapping popular deep learning models with creative applications, including [StyleGAN2](https://github.com/NVLabs/stylegan2/), [SPADE](https://github.com/NVlabs/SPADE), [Neural Style Transfer](https://github.com/genekogan/neural_style), [DeepDream](https://github.com/genekogan/deepdream), and [many others](https://github.com/ml4a/ml4a/tree/master/ml4a/models/submodules).
+* a collection of [Jupyter notebooks](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/examples) explaining the basics of deep learning for beginners, and providing [recipes for using the materials creatively](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/examples/models).
+
+## Example
+
+ml4a bundles the source code of various open source repositories as [git submodules](https://github.com/ml4a/ml4a-guides/tree/ml4a.net/ml4a/models/submodules) and contains wrappers to streamline and simplify them. For example, to generate sample images with StyleGAN2:
+
+```python
+from ml4a import image
+from ml4a.models import stylegan
+
+# fetch the pretrained FFHQ (faces) checkpoint and load it into the wrapper
+network_pkl = stylegan.get_pretrained_model('ffhq')
+stylegan.load_model(network_pkl)
+
+# draw three random samples and display them inline
+samples = stylegan.random_sample(3, labels=None, truncation=1.0)
+image.display(samples)
+```
+
+Every model in `ml4a.models`, including the `stylegan` module above, imports all of the original repository's code into its namespace, allowing low-level access.
+
+## Support ml4a
+
+### Become a sponsor
+
+You can support ml4a by [donating through GitHub sponsors](https://github.com/sponsors/ml4a/).
+
+### How to contribute
+
+Start by joining the [Slack](https://join.slack.com/t/ml-4a/shared_invite/enQtNjA4MjgzODk1MjA3LTlhYjQ5NWQ2OTNlODZiMDRjZTFmNDZiYjlmZWYwNGM0YjIxNjE3Yjc0NWVjMmVlZjNmZDhmYTkzZjk0ZTg1ZGM%3E) or following us on [Twitter](https://www.twitter.com/ml4a_). Contribute to the codebase, or help write tutorials.
+
+
+## License
+
+ml4a itself is [licensed MIT](https://github.com/ml4a/ml4a/blob/master/LICENSE), but you are also bound to the licenses of any [models](https://github.com/ml4a/ml4a/tree/master/ml4a/models/submodules) you use.
+
+
+
+
+%prep
+%autosetup -n ml4a-0.1.3
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-ml4a -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Tue Jun 20 2023 Python_Bot <Python_Bot@openeuler.org> - 0.1.3-1
+- Package Spec generated