%global _empty_manifest_terminate_build 0
Name:           python-mlrose
Version:        1.3.0
Release:        1
Summary:        mlrose: Machine Learning, Randomized Optimization and SEarch
License:        BSD
URL:            https://github.com/gkhayes/mlrose
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/2d/d3/d1e626bbc828aa2f3bbadb8a4616093cc09dd87f86f22603e90ddb410151/mlrose-1.3.0.tar.gz
BuildArch:      noarch

Requires:       python3-numpy
Requires:       python3-scipy
Requires:       python3-sklearn

%description
# mlrose: Machine Learning, Randomized Optimization and SEarch

mlrose is a Python package for applying some of the most common randomized optimization and search algorithms to a range of different optimization problems, over both discrete- and continuous-valued parameter spaces.

## Project Background

mlrose was initially developed to support students of Georgia Tech's OMSCS/OMSA offering of CS 7641: Machine Learning. It includes implementations of all randomized optimization algorithms taught in this course, as well as functionality to apply these algorithms to integer-string optimization problems, such as N-Queens and the Knapsack problem; continuous-valued optimization problems, such as the neural network weight problem; and tour optimization problems, such as the Travelling Salesperson problem. It also has the flexibility to solve user-defined optimization problems.

At the time of development, no single Python package collected all of this functionality in one place.

## Main Features

#### *Randomized Optimization Algorithms*
- Implementations of: hill climbing, randomized hill climbing, simulated annealing, the genetic algorithm and (discrete) MIMIC;
- Solve both maximization and minimization problems;
- Define the algorithm's initial state or start from a random state;
- Define your own simulated annealing decay schedule or use one of three pre-defined, customizable decay schedules: geometric decay, arithmetic decay or exponential decay.

#### *Problem Types*
- Solve discrete-value (bit-string and integer-string), continuous-value and tour optimization (travelling salesperson) problems;
- Define your own fitness function for optimization or use a pre-defined function;
- Pre-defined fitness functions exist for the One Max, Flip Flop, Four Peaks, Six Peaks, Continuous Peaks, Knapsack, Travelling Salesperson, N-Queens and Max-K Color optimization problems.

#### *Machine Learning Weight Optimization*
- Optimize the weights of neural networks, linear regression models and logistic regression models using randomized hill climbing, simulated annealing, the genetic algorithm or gradient descent;
- Supports classification and regression neural networks.

## Installation

mlrose was written in Python 3 and requires NumPy, SciPy and Scikit-Learn (sklearn).

The latest released version is available at the [Python package index](https://pypi.org/project/mlrose/) and can be installed using `pip`:

```
pip install mlrose
```
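Once installed, the pre-defined fitness functions and optimizers can be combined in a few lines. The sketch below, loosely adapted from the mlrose tutorial, solves the built-in One Max problem with simulated annealing; the problem length, decay schedule and attempt limits shown are illustrative choices, not package defaults.

```
# Minimal sketch (assumes mlrose and NumPy are installed): maximize the One Max
# fitness function over an 8-bit string using simulated annealing.
import numpy as np
import mlrose

fitness = mlrose.OneMax()                          # pre-defined fitness function
problem = mlrose.DiscreteOpt(length=8, fitness_fn=fitness,
                             maximize=True, max_val=2)

best_state, best_fitness = mlrose.simulated_annealing(
    problem,
    schedule=mlrose.ExpDecay(),                    # exponential decay schedule
    max_attempts=10, max_iters=1000,
    init_state=np.array([0] * 8),                  # user-defined initial state
    random_state=1)

print(best_state, best_fitness)                    # expected optimum: all ones, fitness 8.0
```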
## Documentation

The official mlrose documentation can be found [here](https://mlrose.readthedocs.io/).

A Jupyter notebook containing the examples used in the documentation is also available [here](https://github.com/gkhayes/mlrose/blob/master/tutorial_examples.ipynb).

## Licensing, Authors, Acknowledgements

mlrose was written by Genevieve Hayes and is distributed under the [3-Clause BSD license](https://github.com/gkhayes/mlrose/blob/master/LICENSE).

You can cite mlrose in research publications and reports as follows:
* Hayes, G. (2019). ***mlrose: Machine Learning, Randomized Optimization and SEarch package for Python***. https://github.com/gkhayes/mlrose. Accessed: *day month year*.

BibTeX entry:
```
@misc{Hayes19,
  author = {Hayes, G},
  title = {{mlrose: Machine Learning, Randomized Optimization and SEarch package for Python}},
  year = 2019,
  howpublished = {\url{https://github.com/gkhayes/mlrose}},
  note = {Accessed: day month year}
}
```

%package -n python3-mlrose
Summary:        mlrose: Machine Learning, Randomized Optimization and SEarch
Provides:       python-mlrose
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip

%description -n python3-mlrose
# mlrose: Machine Learning, Randomized Optimization and SEarch

mlrose is a Python package for applying some of the most common randomized optimization and search algorithms to a range of different optimization problems, over both discrete- and continuous-valued parameter spaces.

## Project Background

mlrose was initially developed to support students of Georgia Tech's OMSCS/OMSA offering of CS 7641: Machine Learning. It includes implementations of all randomized optimization algorithms taught in this course, as well as functionality to apply these algorithms to integer-string optimization problems, such as N-Queens and the Knapsack problem; continuous-valued optimization problems, such as the neural network weight problem; and tour optimization problems, such as the Travelling Salesperson problem. It also has the flexibility to solve user-defined optimization problems.

At the time of development, no single Python package collected all of this functionality in one place.

## Main Features

#### *Randomized Optimization Algorithms*
- Implementations of: hill climbing, randomized hill climbing, simulated annealing, the genetic algorithm and (discrete) MIMIC;
- Solve both maximization and minimization problems;
- Define the algorithm's initial state or start from a random state;
- Define your own simulated annealing decay schedule or use one of three pre-defined, customizable decay schedules: geometric decay, arithmetic decay or exponential decay.

#### *Problem Types*
- Solve discrete-value (bit-string and integer-string), continuous-value and tour optimization (travelling salesperson) problems;
- Define your own fitness function for optimization or use a pre-defined function;
- Pre-defined fitness functions exist for the One Max, Flip Flop, Four Peaks, Six Peaks, Continuous Peaks, Knapsack, Travelling Salesperson, N-Queens and Max-K Color optimization problems.

#### *Machine Learning Weight Optimization*
- Optimize the weights of neural networks, linear regression models and logistic regression models using randomized hill climbing, simulated annealing, the genetic algorithm or gradient descent;
- Supports classification and regression neural networks.

## Installation

mlrose was written in Python 3 and requires NumPy, SciPy and Scikit-Learn (sklearn).

The latest released version is available at the [Python package index](https://pypi.org/project/mlrose/) and can be installed using `pip`:

```
pip install mlrose
```
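As a sketch of the machine learning weight optimization feature, the example below fits a small classification network whose weights are found by randomized hill climbing rather than back-propagation. It loosely follows the pattern in the mlrose tutorial; the Iris dataset and all hyperparameter values are purely illustrative.

```
# Minimal sketch (assumes mlrose and scikit-learn are installed): a small
# classification network whose weights are tuned by randomized hill climbing.
import mlrose
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler, OneHotEncoder

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=3)

scaler = MinMaxScaler()                            # scale features to [0, 1]
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

onehot = OneHotEncoder()                           # one-hot encode the class labels
y_train = onehot.fit_transform(y_train.reshape(-1, 1)).todense()
y_test = onehot.transform(y_test.reshape(-1, 1)).todense()

model = mlrose.NeuralNetwork(hidden_nodes=[2], activation='relu',
                             algorithm='random_hill_climb', max_iters=1000,
                             learning_rate=0.0001, early_stopping=True,
                             max_attempts=100, random_state=3)
model.fit(X_train, y_train)
print(model.predict(X_test)[:5])                   # one-hot predictions for five test rows
```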
## Documentation

The official mlrose documentation can be found [here](https://mlrose.readthedocs.io/).

A Jupyter notebook containing the examples used in the documentation is also available [here](https://github.com/gkhayes/mlrose/blob/master/tutorial_examples.ipynb).

## Licensing, Authors, Acknowledgements

mlrose was written by Genevieve Hayes and is distributed under the [3-Clause BSD license](https://github.com/gkhayes/mlrose/blob/master/LICENSE).

You can cite mlrose in research publications and reports as follows:
* Hayes, G. (2019). ***mlrose: Machine Learning, Randomized Optimization and SEarch package for Python***. https://github.com/gkhayes/mlrose. Accessed: *day month year*.

BibTeX entry:
```
@misc{Hayes19,
  author = {Hayes, G},
  title = {{mlrose: Machine Learning, Randomized Optimization and SEarch package for Python}},
  year = 2019,
  howpublished = {\url{https://github.com/gkhayes/mlrose}},
  note = {Accessed: day month year}
}
```

%package help
Summary:        Development documents and examples for mlrose
Provides:       python3-mlrose-doc

%description help
# mlrose: Machine Learning, Randomized Optimization and SEarch

mlrose is a Python package for applying some of the most common randomized optimization and search algorithms to a range of different optimization problems, over both discrete- and continuous-valued parameter spaces.

## Project Background

mlrose was initially developed to support students of Georgia Tech's OMSCS/OMSA offering of CS 7641: Machine Learning. It includes implementations of all randomized optimization algorithms taught in this course, as well as functionality to apply these algorithms to integer-string optimization problems, such as N-Queens and the Knapsack problem; continuous-valued optimization problems, such as the neural network weight problem; and tour optimization problems, such as the Travelling Salesperson problem. It also has the flexibility to solve user-defined optimization problems.

At the time of development, no single Python package collected all of this functionality in one place.

## Main Features

#### *Randomized Optimization Algorithms*
- Implementations of: hill climbing, randomized hill climbing, simulated annealing, the genetic algorithm and (discrete) MIMIC;
- Solve both maximization and minimization problems;
- Define the algorithm's initial state or start from a random state;
- Define your own simulated annealing decay schedule or use one of three pre-defined, customizable decay schedules: geometric decay, arithmetic decay or exponential decay.

#### *Problem Types*
- Solve discrete-value (bit-string and integer-string), continuous-value and tour optimization (travelling salesperson) problems;
- Define your own fitness function for optimization or use a pre-defined function;
- Pre-defined fitness functions exist for the One Max, Flip Flop, Four Peaks, Six Peaks, Continuous Peaks, Knapsack, Travelling Salesperson, N-Queens and Max-K Color optimization problems.

#### *Machine Learning Weight Optimization*
- Optimize the weights of neural networks, linear regression models and logistic regression models using randomized hill climbing, simulated annealing, the genetic algorithm or gradient descent;
- Supports classification and regression neural networks.

## Installation

mlrose was written in Python 3 and requires NumPy, SciPy and Scikit-Learn (sklearn).

The latest released version is available at the [Python package index](https://pypi.org/project/mlrose/) and can be installed using `pip`:

```
pip install mlrose
```
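For the tour optimization problem type, a sketch along the following lines finds a short Travelling Salesperson route with the genetic algorithm. The city coordinates and genetic-algorithm settings are illustrative, following the pattern used in the mlrose tutorial.

```
# Minimal sketch (assumes mlrose is installed): a small Travelling Salesperson
# instance defined by city coordinates and minimized with the genetic algorithm.
import mlrose

coords = [(1, 1), (4, 2), (5, 2), (6, 4), (4, 4), (3, 6), (1, 5), (2, 3)]
fitness = mlrose.TravellingSales(coords=coords)        # pre-defined TSP fitness function
problem = mlrose.TSPOpt(length=len(coords), fitness_fn=fitness,
                        maximize=False)                # minimize total tour length

best_route, best_length = mlrose.genetic_alg(problem, mutation_prob=0.2,
                                             max_attempts=100, random_state=2)

print(best_route)    # order in which to visit the eight cities
print(best_length)   # total length of that tour
```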
## Documentation

The official mlrose documentation can be found [here](https://mlrose.readthedocs.io/).

A Jupyter notebook containing the examples used in the documentation is also available [here](https://github.com/gkhayes/mlrose/blob/master/tutorial_examples.ipynb).

## Licensing, Authors, Acknowledgements

mlrose was written by Genevieve Hayes and is distributed under the [3-Clause BSD license](https://github.com/gkhayes/mlrose/blob/master/LICENSE).

You can cite mlrose in research publications and reports as follows:
* Hayes, G. (2019). ***mlrose: Machine Learning, Randomized Optimization and SEarch package for Python***. https://github.com/gkhayes/mlrose. Accessed: *day month year*.

BibTeX entry:
```
@misc{Hayes19,
  author = {Hayes, G},
  title = {{mlrose: Machine Learning, Randomized Optimization and SEarch package for Python}},
  year = 2019,
  howpublished = {\url{https://github.com/gkhayes/mlrose}},
  note = {Accessed: day month year}
}
```

%prep
%autosetup -n mlrose-1.3.0

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-mlrose -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Tue Apr 25 2023 Python_Bot - 1.3.0-1
- Package Spec generated