author     CoprDistGit <infra@openeuler.org>  2023-04-11 02:06:00 +0000
committer  CoprDistGit <infra@openeuler.org>  2023-04-11 02:06:00 +0000
commit     864321740e9eab3bd708c0d35468adcab296a694 (patch)
tree       aac44e52f92b761f7f332f39d53bf88a25ab68ad
parent     46a0cb60a97ceebacf051ba1a200361bbedc56b2 (diff)
automatic import of python-nevergrad
 .gitignore            |   1 +
 python-nevergrad.spec | 457 +++++++++++++++++++++++++++++++++++++++++++++++++
 sources               |   1 +
 3 files changed, 459 insertions(+), 0 deletions(-)
diff --git a/.gitignore b/.gitignore
index e69de29..8535456 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/nevergrad-0.6.0.tar.gz
diff --git a/python-nevergrad.spec b/python-nevergrad.spec
new file mode 100644
index 0000000..99d2cd4
--- /dev/null
+++ b/python-nevergrad.spec
@@ -0,0 +1,457 @@
+%global _empty_manifest_terminate_build 0
+Name: python-nevergrad
+Version: 0.6.0
+Release: 1
+Summary: A Python toolbox for performing gradient-free optimization
+License: MIT
+URL: https://github.com/facebookresearch/nevergrad
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/06/ea/2e1f13a237258c30444aa3573040ef81723f4442c58de4af476700e62797/nevergrad-0.6.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-numpy
+Requires: python3-cma
+Requires: python3-bayesian-optimization
+Requires: python3-typing-extensions
+Requires: python3-pandas
+Requires: python3-black
+Requires: python3-mypy
+Requires: python3-pytest
+Requires: python3-pytest-cov
+Requires: python3-pylint
+Requires: python3-wheel
+Requires: python3-setuptools
+Requires: python3-sphinx
+Requires: python3-sphinx-rtd-theme
+Requires: python3-recommonmark
+Requires: python3-twine
+Requires: python3-autodocsumm
+Requires: python3-pandas
+Requires: python3-pyparsing
+Requires: python3-docutils
+Requires: python3-requests
+Requires: python3-xlwt
+Requires: python3-xlrd
+Requires: python3-opencv-python
+Requires: python3-matplotlib
+Requires: python3-gym
+Requires: python3-gym-anm
+Requires: python3-pygame
+Requires: python3-torch
+Requires: python3-hiplot
+Requires: python3-fcmaes
+Requires: python3-openpyxl
+Requires: python3-pyproj
+Requires: python3-Pillow
+Requires: python3-tqdm
+Requires: python3-torchvision
+Requires: python3-pyomo
+Requires: python3-mixsimulator
+Requires: python3-hyperopt
+Requires: python3-IOHexperimenter
+Requires: python3-cdt
+Requires: python3-tensorflow-estimator
+Requires: python3-scikit-learn
+Requires: python3-scikit-image
+Requires: python3-tensorflow
+Requires: python3-image-quality
+Requires: python3-keras
+Requires: python3-pymoo
+Requires: python3-Keras-Preprocessing
+Requires: python3-silence-tensorflow
+Requires: python3-tensorflow-probability
+Requires: python3-bayes-optim
+Requires: python3-nlopt
+Requires: python3-pybullet
+Requires: python3-box2d-py
+Requires: python3-glfw
+Requires: python3-mujoco
+Requires: python3-olymp
+Requires: python3-requests
+Requires: python3-xlwt
+Requires: python3-xlrd
+Requires: python3-opencv-python
+Requires: python3-matplotlib
+Requires: python3-gym
+Requires: python3-gym-anm
+Requires: python3-pygame
+Requires: python3-torch
+Requires: python3-hiplot
+Requires: python3-fcmaes
+Requires: python3-pandas
+Requires: python3-openpyxl
+Requires: python3-pyproj
+Requires: python3-Pillow
+Requires: python3-tqdm
+Requires: python3-torchvision
+Requires: python3-pyomo
+Requires: python3-mixsimulator
+Requires: python3-hyperopt
+Requires: python3-IOHexperimenter
+Requires: python3-cdt
+Requires: python3-tensorflow-estimator
+Requires: python3-scikit-learn
+Requires: python3-scikit-image
+Requires: python3-tensorflow
+Requires: python3-image-quality
+Requires: python3-keras
+Requires: python3-pymoo
+Requires: python3-Keras-Preprocessing
+Requires: python3-silence-tensorflow
+Requires: python3-tensorflow-probability
+Requires: python3-bayes-optim
+Requires: python3-nlopt
+Requires: python3-pybullet
+Requires: python3-box2d-py
+Requires: python3-glfw
+Requires: python3-mujoco
+Requires: python3-olymp
+Requires: python3-black
+Requires: python3-mypy
+Requires: python3-pytest
+Requires: python3-pytest-cov
+Requires: python3-pylint
+Requires: python3-wheel
+Requires: python3-setuptools
+Requires: python3-sphinx
+Requires: python3-sphinx-rtd-theme
+Requires: python3-recommonmark
+Requires: python3-twine
+Requires: python3-autodocsumm
+Requires: python3-pandas
+Requires: python3-pyparsing
+Requires: python3-docutils
+
+%description
+[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)
+
+# Nevergrad - A gradient-free optimization platform
+
+![Nevergrad](https://raw.githubusercontent.com/facebookresearch/nevergrad/0.6.0/docs/resources/Nevergrad-LogoMark.png)
+
+
+`nevergrad` is a Python 3.6+ library. It can be installed with:
+
+```
+pip install nevergrad
+```
+
+More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the [**documentation**](https://facebookresearch.github.io/nevergrad/).
+
+You can join the Nevergrad users Facebook group [here](https://www.facebook.com/groups/nevergradusers/).
+
+Minimizing a function using an optimizer (here `NGOpt`) is straightforward:
+
+```python
+import nevergrad as ng
+
+def square(x):
+ return sum((x - .5)**2)
+
+optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
+recommendation = optimizer.minimize(square)
+print(recommendation.value) # recommended value
+>>> [0.49971112 0.5002944]
+```
+
+`nevergrad` also supports bounded continuous variables, discrete variables, and mixtures of the two.
+To use them, specify the input space:
+
+```python
+import nevergrad as ng
+
+def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
+ # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
+ return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)
+
+# Instrumentation class is used for functions with multiple inputs
+# (positional and/or keywords)
+parametrization = ng.p.Instrumentation(
+ # a log-distributed scalar between 0.001 and 1.0
+ learning_rate=ng.p.Log(lower=0.001, upper=1.0),
+ # an integer from 1 to 12
+ batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
+ # either "conv" or "fc"
+ architecture=ng.p.Choice(["conv", "fc"])
+)
+
+optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
+recommendation = optimizer.minimize(fake_training)
+
+# show the recommended keyword arguments of the function
+print(recommendation.kwargs)
+>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
+```
+
+Learn more about parametrization in the [**documentation**](https://facebookresearch.github.io/nevergrad/)!
+
+![Example of optimization](https://raw.githubusercontent.com/facebookresearch/nevergrad/0.6.0/docs/resources/TwoPointsDE.gif)
+
+*Convergence of a population of points to the minimum with two-points DE.*
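
The gradient-free idea illustrated above can also be sketched with no dependencies at all. The following stdlib-only random-search baseline minimizes the same `square` objective; `random_search` is a hypothetical helper written for illustration here, not nevergrad's actual algorithm:

```python
import random

def square(x):
    # same objective as the README example: sum of squared distances to 0.5
    return sum((xi - 0.5) ** 2 for xi in x)

def random_search(func, dim, budget, seed=0):
    # minimal gradient-free baseline: sample uniformly, keep the best point
    rng = random.Random(seed)
    best_x, best_loss = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(-1.0, 2.0) for _ in range(dim)]
        loss = func(x)
        if loss < best_loss:
            best_x, best_loss = x, loss
    return best_x, best_loss

best_x, best_loss = random_search(square, dim=2, budget=100)
print(best_x, best_loss)
```

Nevergrad's optimizers replace the uniform sampling with smarter strategies (evolution strategies, differential evolution, Bayesian optimization, ...) behind a similarly simple minimize-style interface.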
+
+
+## Documentation
+
+Check out our [**documentation**](https://facebookresearch.github.io/nevergrad/)! It's still a work in progress; don't hesitate to submit issues and/or PRs to update it and make it clearer!
+
+
+## Citing
+
+```bibtex
+@misc{nevergrad,
+ author = {J. Rapin and O. Teytaud},
+ title = {{Nevergrad - A gradient-free optimization platform}},
+ year = {2018},
+ publisher = {GitHub},
+ journal = {GitHub repository},
+ howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
+}
+```
+
+## License
+
+`nevergrad` is released under the MIT license. See [LICENSE](https://github.com/facebookresearch/nevergrad/blob/0.6.0/LICENSE) for additional details about it.
+See also our [Terms of Use](https://opensource.facebook.com/legal/terms) and [Privacy Policy](https://opensource.facebook.com/legal/privacy).
+
+
+
+
+%package -n python3-nevergrad
+Summary: A Python toolbox for performing gradient-free optimization
+Provides: python-nevergrad
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-nevergrad
+[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)
+
+# Nevergrad - A gradient-free optimization platform
+
+![Nevergrad](https://raw.githubusercontent.com/facebookresearch/nevergrad/0.6.0/docs/resources/Nevergrad-LogoMark.png)
+
+
+`nevergrad` is a Python 3.6+ library. It can be installed with:
+
+```
+pip install nevergrad
+```
+
+More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the [**documentation**](https://facebookresearch.github.io/nevergrad/).
+
+You can join the Nevergrad users Facebook group [here](https://www.facebook.com/groups/nevergradusers/).
+
+Minimizing a function using an optimizer (here `NGOpt`) is straightforward:
+
+```python
+import nevergrad as ng
+
+def square(x):
+ return sum((x - .5)**2)
+
+optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
+recommendation = optimizer.minimize(square)
+print(recommendation.value) # recommended value
+>>> [0.49971112 0.5002944]
+```
+
+`nevergrad` also supports bounded continuous variables, discrete variables, and mixtures of the two.
+To use them, specify the input space:
+
+```python
+import nevergrad as ng
+
+def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
+ # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
+ return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)
+
+# Instrumentation class is used for functions with multiple inputs
+# (positional and/or keywords)
+parametrization = ng.p.Instrumentation(
+ # a log-distributed scalar between 0.001 and 1.0
+ learning_rate=ng.p.Log(lower=0.001, upper=1.0),
+ # an integer from 1 to 12
+ batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
+ # either "conv" or "fc"
+ architecture=ng.p.Choice(["conv", "fc"])
+)
+
+optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
+recommendation = optimizer.minimize(fake_training)
+
+# show the recommended keyword arguments of the function
+print(recommendation.kwargs)
+>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
+```
+
+Learn more about parametrization in the [**documentation**](https://facebookresearch.github.io/nevergrad/)!
+
+![Example of optimization](https://raw.githubusercontent.com/facebookresearch/nevergrad/0.6.0/docs/resources/TwoPointsDE.gif)
+
+*Convergence of a population of points to the minimum with two-points DE.*
+
+
+## Documentation
+
+Check out our [**documentation**](https://facebookresearch.github.io/nevergrad/)! It's still a work in progress; don't hesitate to submit issues and/or PRs to update it and make it clearer!
+
+
+## Citing
+
+```bibtex
+@misc{nevergrad,
+ author = {J. Rapin and O. Teytaud},
+ title = {{Nevergrad - A gradient-free optimization platform}},
+ year = {2018},
+ publisher = {GitHub},
+ journal = {GitHub repository},
+ howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
+}
+```
+
+## License
+
+`nevergrad` is released under the MIT license. See [LICENSE](https://github.com/facebookresearch/nevergrad/blob/0.6.0/LICENSE) for additional details about it.
+See also our [Terms of Use](https://opensource.facebook.com/legal/terms) and [Privacy Policy](https://opensource.facebook.com/legal/privacy).
+
+
+
+
+%package help
+Summary: Development documents and examples for nevergrad
+Provides: python3-nevergrad-doc
+%description help
+[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine) [![CircleCI](https://circleci.com/gh/facebookresearch/nevergrad/tree/main.svg?style=svg)](https://circleci.com/gh/facebookresearch/nevergrad/tree/main)
+
+# Nevergrad - A gradient-free optimization platform
+
+![Nevergrad](https://raw.githubusercontent.com/facebookresearch/nevergrad/0.6.0/docs/resources/Nevergrad-LogoMark.png)
+
+
+`nevergrad` is a Python 3.6+ library. It can be installed with:
+
+```
+pip install nevergrad
+```
+
+More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the [**documentation**](https://facebookresearch.github.io/nevergrad/).
+
+You can join the Nevergrad users Facebook group [here](https://www.facebook.com/groups/nevergradusers/).
+
+Minimizing a function using an optimizer (here `NGOpt`) is straightforward:
+
+```python
+import nevergrad as ng
+
+def square(x):
+ return sum((x - .5)**2)
+
+optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
+recommendation = optimizer.minimize(square)
+print(recommendation.value) # recommended value
+>>> [0.49971112 0.5002944]
+```
+
+`nevergrad` also supports bounded continuous variables, discrete variables, and mixtures of the two.
+To use them, specify the input space:
+
+```python
+import nevergrad as ng
+
+def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
+ # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
+ return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)
+
+# Instrumentation class is used for functions with multiple inputs
+# (positional and/or keywords)
+parametrization = ng.p.Instrumentation(
+ # a log-distributed scalar between 0.001 and 1.0
+ learning_rate=ng.p.Log(lower=0.001, upper=1.0),
+ # an integer from 1 to 12
+ batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
+ # either "conv" or "fc"
+ architecture=ng.p.Choice(["conv", "fc"])
+)
+
+optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
+recommendation = optimizer.minimize(fake_training)
+
+# show the recommended keyword arguments of the function
+print(recommendation.kwargs)
+>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}
+```
+
+Learn more about parametrization in the [**documentation**](https://facebookresearch.github.io/nevergrad/)!
+
+![Example of optimization](https://raw.githubusercontent.com/facebookresearch/nevergrad/0.6.0/docs/resources/TwoPointsDE.gif)
+
+*Convergence of a population of points to the minimum with two-points DE.*
+
+
+## Documentation
+
+Check out our [**documentation**](https://facebookresearch.github.io/nevergrad/)! It's still a work in progress; don't hesitate to submit issues and/or PRs to update it and make it clearer!
+
+
+## Citing
+
+```bibtex
+@misc{nevergrad,
+ author = {J. Rapin and O. Teytaud},
+ title = {{Nevergrad - A gradient-free optimization platform}},
+ year = {2018},
+ publisher = {GitHub},
+ journal = {GitHub repository},
+ howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
+}
+```
+
+## License
+
+`nevergrad` is released under the MIT license. See [LICENSE](https://github.com/facebookresearch/nevergrad/blob/0.6.0/LICENSE) for additional details about it.
+See also our [Terms of Use](https://opensource.facebook.com/legal/terms) and [Privacy Policy](https://opensource.facebook.com/legal/privacy).
+
+
+
+
+%prep
+%autosetup -n nevergrad-0.6.0
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-nevergrad -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Tue Apr 11 2023 Python_Bot <Python_Bot@openeuler.org> - 0.6.0-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..6ac8a66
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+404b64a20e0501d40d7e7abe1302ebdb nevergrad-0.6.0.tar.gz