%global _empty_manifest_terminate_build 0
Name: python-bayes-optim
Version: 0.2.6
Release: 1
Summary: A Bayesian Optimization Library
License: GNU General Public License v3 (GPLv3)
URL: https://github.com/wangronin/Bayesian-Optimization
Source0: https://mirrors.nju.edu.cn/pypi/web/packages/a0/36/9665f88aba063959f25b367c1e8ffbb1202ca5829076ad797ec05e36d647/bayes-optim-0.2.6.tar.gz
BuildArch: noarch
Requires: python3-torch
Requires: python3-dill
Requires: python3-joblib
Requires: python3-numpy
Requires: python3-pyDOE
Requires: python3-scikit-learn
Requires: python3-scipy
Requires: python3-sklearn
Requires: python3-tabulate
Requires: python3-threadpoolctl
Requires: python3-requests
Requires: python3-requests-oauthlib
Requires: python3-sobol-seq
Requires: python3-py-expression-eval
%description
[![Actions Status](https://github.com/wangronin/Bayesian-Optimization/workflows/Build%20and%20Test/badge.svg)](https://github.com/wangronin/Bayesian-Optimization/actions)
# Bayesian Optimization Library
A `Python` implementation of the Bayesian Optimization (BO) algorithm working on decision spaces composed of real, integer, or categorical variables, or a mixture thereof.
Underpinned by surrogate models, BO iteratively proposes candidate solutions using the so-called **acquisition function** which balances exploration with exploitation, and updates the surrogate model with newly observed objective values. This algorithm is designed to optimize **expensive black-box** problems efficiently.
![](assets/BO-example.gif)
## Installation
You could either install the stable version on `pypi`:
```shell
pip install bayes-optim
```
Or, install the latest version from GitHub:
```shell
git clone https://github.com/wangronin/Bayesian-Optimization.git
cd Bayesian-Optimization && python setup.py install --user
```
## Example
For real-valued search variables, the simplest usage is via the `fmin` function:
```python
from bayes_optim import fmin
def f(x):
    return sum(x ** 2)
minimum = fmin(f, [-5] * 2, [5] * 2, max_FEs=30, seed=42)
```
And you could also have much finer control over most ingredients of BO, e.g., the surrogate
model and acquisition functions. Please see the example below:
```python
import numpy as np

from bayes_optim import BO, RealSpace
from bayes_optim.Surrogate import GaussianProcess

dim = 5
lb, ub = -5, 5
space = RealSpace([lb, ub]) * dim  # create the search space

def fitness(x):  # objective function (sphere function, for illustration)
    return sum(x_i ** 2 for x_i in x)

# hyperparameters (length-scale bounds) of the GPR model
thetaL = 1e-10 * (ub - lb) * np.ones(dim)
thetaU = 10 * (ub - lb) * np.ones(dim)
model = GaussianProcess(  # create the GPR model
    thetaL=thetaL, thetaU=thetaU
)
opt = BO(
    search_space=space,
    obj_fun=fitness,
    model=model,
    DoE_size=5,   # number of initial sample points
    max_FEs=50,   # maximal number of function evaluations
    verbose=True
)
opt.run()
```
For more detailed usage and examples, please check out our [wiki page](https://github.com/wangronin/Bayesian-Optimization/wiki).
## Features
This implementation differs from alternative packages/libraries in the following features:
* **Parallelization**, also known as _batch-sequential optimization_, for which several different approaches are implemented here (a minimal sketch follows this list).
* **Moment-Generating Function of the Improvement** (MGFI) [WvSEB17a] is a recently proposed acquisition function, which implicitly controls the exploration-exploitation trade-off.
* **Mixed-Integer Evolution Strategy** for optimizing the acquisition function, which is enabled when the search space is a mixture of real, integer, and categorical variables.
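As a minimal sketch of the batch-sequential mode, the snippet below reuses the `BO` example from above but assumes that `ParallelBO` is exported from the top-level package and that the batch size is set via an `n_point` argument; please check the [wiki page](https://github.com/wangronin/Bayesian-Optimization/wiki) for the exact signature.
```python
import numpy as np

from bayes_optim import ParallelBO, RealSpace  # assumes ParallelBO is a top-level export
from bayes_optim.Surrogate import GaussianProcess

dim = 5
lb, ub = -5, 5
space = RealSpace([lb, ub]) * dim

def fitness(x):  # sphere function, for illustration
    return sum(x_i ** 2 for x_i in x)

model = GaussianProcess(
    thetaL=1e-10 * (ub - lb) * np.ones(dim),
    thetaU=10 * (ub - lb) * np.ones(dim),
)
opt = ParallelBO(
    search_space=space,
    obj_fun=fitness,
    model=model,
    DoE_size=5,
    max_FEs=50,
    verbose=True,
    n_point=3,  # assumed argument: number of candidate points proposed per iteration
)
opt.run()
```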
## Project Structure
* `bayes-optim/SearchSpace.py`: implementation of the search/decision space.
* `bayes-optim/base.py`: the base class of Bayesian Optimization.
* `bayes-optim/AcquisitionFunction.py`: the implementation of acquisition functions (see below for the list of implemented ones).
* `bayes-optim/Surrogate`: we implemented the Gaussian Process Regression (GPR) and Random Forest (RF).
* `bayes-optim/BayesOpt.py` contains several BO variants:
* `BO`: noiseless + sequential
* `ParallelBO`: noiseless + parallel (a.k.a. batch-sequential)
* `AnnealingBO`: noiseless + parallel + annealing [WEB18]
* `SelfAdaptiveBO`: noiseless + parallel + self-adaptive [WEB19]
* `NoisyBO`: noisy + parallel
* `bayes-optim/Extension.py` is meant to include the latest developments that are not extensively tested:
* `PCABO`: noiseless + parallel + PCA-assisted dimensionality reduction [RaponiWBBD20] **[Under Construction]**
* `MultiAcquisitionBO`: noiseless + parallelization with multiple different acquisition functions **[Under Construction]**
## Acquisition Functions
The following infill-criteria are implemented in the library:
* _Expected Improvement_ (EI)
* _Probability of Improvement_ (PI)
* _Upper Confidence Bound_ (UCB)
* _Moment-Generating Function of Improvement_ (MGFI)
* _Generalized Expected Improvement_ (GEI) **[Under Construction]**
For sequential working mode, Expected Improvement is used by default. For parallelization mode, MGFI is enabled by default.
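For reference, the sketch below shows the textbook EI criterion for minimization under a Gaussian predictive distribution with mean `mu` and standard deviation `sigma`; it only illustrates the formula and is not necessarily the library's exact implementation.
```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Textbook EI for minimization: E[max(f_best - f(x), 0)] under N(mu, sigma^2)."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (f_best - mu) / sigma
        ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0, ei, 0.0)  # EI vanishes where the model is certain
```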
## Surrogate Model
This is the meta- (surrogate) model used in Bayesian optimization. The basic requirement for such a model is that it provides an uncertainty quantification (either empirical or theoretical) for its predictions. To handle categorical data easily, a __random forest__ model is used by default. The implementation here is based on the one in _scikit-learn_, with modifications to the uncertainty quantification.
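As an illustration of such an empirical uncertainty estimate (a sketch of the idea, not the library's code), the spread of the per-tree predictions of a _scikit-learn_ random forest can serve as a variance estimate:
```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_predict_with_uncertainty(rf, X):
    """Mean and standard deviation over the predictions of the individual trees."""
    per_tree = np.stack([tree.predict(X) for tree in rf.estimators_])
    return per_tree.mean(axis=0), per_tree.std(axis=0)

# usage sketch on synthetic data
rng = np.random.default_rng(0)
X_train = rng.uniform(-5, 5, size=(30, 5))
y_train = (X_train ** 2).sum(axis=1)
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
mean, std = rf_predict_with_uncertainty(rf, rng.uniform(-5, 5, size=(3, 5)))
```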
## A brief Introduction to Bayesian Optimization
Bayesian Optimization (BO) [Moc74, JSW98] is a sequential optimization strategy originally proposed to solve single-objective black-box optimization problems that are costly to evaluate. Here, we shall restrict our discussion to the single-objective case. BO typically starts by sampling an initial design of experiments (DoE) of size n, X = {x1, x2, ..., xn}, which is usually generated by simple random sampling, Latin Hypercube Sampling [SWN03], or a more sophisticated low-discrepancy sequence [Nie88] (e.g., the Sobol sequence). Taking the initial DoE X and its corresponding objective values, Y = {f(x1), f(x2), ..., f(xn)} ⊆ ℝ, we proceed to construct a statistical model M describing the probability distribution of the objective function conditioned on the initial evidence, namely Pr(f | X, Y). In most application scenarios of BO, there is a lack of a priori knowledge about f, and therefore nonparametric models (e.g., Gaussian process regression or random forests) are commonly chosen for M. This gives rise to a predictor f'(x) for all candidate points x and an uncertainty quantification s'(x) that estimates, for instance, the mean squared error of the prediction, E[(f'(x) − f(x))²]. Based on f' and s', promising points can be identified via the so-called acquisition function, which balances exploitation with exploration of the optimization process.
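Schematically, the loop described above can be written as the following generic sketch; it uses _scikit-learn_'s Gaussian process and a random-candidate search over the acquisition function purely for brevity, and is independent of this library's classes.
```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def bo_minimize(f, lb, ub, n_init=5, max_FEs=30, seed=42):
    """Generic BO loop: fit a model on the evidence, maximize EI over random
    candidates, evaluate the most promising candidate, and update the evidence."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    X = rng.uniform(lb, ub, size=(n_init, len(lb)))                 # initial DoE
    y = np.array([f(x) for x in X])
    while len(y) < max_FEs:
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)   # model M ~ Pr(f | X, Y)
        cand = rng.uniform(lb, ub, size=(512, len(lb)))
        mu, sigma = gp.predict(cand, return_std=True)
        z = (y.min() - mu) / np.maximum(sigma, 1e-12)
        ei = (y.min() - mu) * norm.cdf(z) + sigma * norm.pdf(z)     # acquisition function
        x_next = cand[np.argmax(ei)]                                # next point to evaluate
        X, y = np.vstack([X, x_next]), np.append(y, f(x_next))
    best = np.argmin(y)
    return X[best], y[best]

x_best, y_best = bo_minimize(lambda x: sum(x ** 2), lb=[-5, -5], ub=[5, 5])
```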
## Reference
* [Moc74] Jonas Mockus. "On bayesian methods for seeking the extremum". In Guri I. Marchuk, editor, _Optimization Techniques, IFIP Technical Conference, Novosibirsk_, USSR, July 1-7, 1974, volume 27 of _Lecture Notes in Computer Science_, pages 400–404. Springer, 1974.
* [JSW98] Donald R. Jones, Matthias Schonlau, and William J. Welch. "Efficient global optimization of expensive black-box functions". _J. Glob. Optim._, 13(4):455–492, 1998.
* [SWN03] Thomas J. Santner, Brian J. Williams, and William I. Notz. "The Design and Analysis of Computer Experiments". _Springer series in statistics._ Springer, 2003.
* [Nie88] Harald Niederreiter. "Low-discrepancy and low-dispersion sequences". _Journal of number theory_, 30(1):51–70, 1988.
* [WvSEB17a] Hao Wang, Bas van Stein, Michael Emmerich, and Thomas Bäck. "A New Acquisition Function for Bayesian Optimization Based on the Moment-Generating Function". In _Systems, Man, and Cybernetics (SMC), 2017 IEEE International Conference on_, pages 507–512. IEEE, 2017.
* [WEB18] Hao Wang, Michael Emmerich, and Thomas Bäck. "Cooling Strategies for the Moment-Generating Function in Bayesian Global Optimization". In _2018 IEEE Congress on Evolutionary Computation_, CEC 2018, Rio de Janeiro, Brazil, July 8-13, 2018, pages 1–8. IEEE, 2018.
* [WEB19] Hao Wang, Michael Emmerich, and Thomas Bäck. "Towards self-adaptive efficient global optimization". In _AIP Conference Proceedings_, vol. 2070, no. 1, p. 020056. AIP Publishing LLC, 2019.
* [RaponiWBBD20] Elena Raponi, Hao Wang, Mariusz Bujny, Simonetta Boria, and Carola Doerr. "High Dimensional Bayesian Optimization Assisted by Principal Component Analysis". In _International Conference on Parallel Problem Solving from Nature_, pp. 169-183. Springer, Cham, 2020.
%package -n python3-bayes-optim
Summary: A Bayesian Optimization Library
Provides: python-bayes-optim
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-bayes-optim
[![Actions Status](https://github.com/wangronin/Bayesian-Optimization/workflows/Build%20and%20Test/badge.svg)](https://github.com/wangronin/Bayesian-Optimization/actions)
# Bayesian Optimization Library
A `Python` implementation of the Bayesian Optimization (BO) algorithm working on decision spaces composed of real, integer, or categorical variables, or a mixture thereof.
Underpinned by surrogate models, BO iteratively proposes candidate solutions using the so-called **acquisition function** which balances exploration with exploitation, and updates the surrogate model with newly observed objective values. This algorithm is designed to optimize **expensive black-box** problems efficiently.
![](assets/BO-example.gif)
## Installation
You could either install the stable version on `pypi`:
```shell
pip install bayes-optim
```
Or, install the latest version from GitHub:
```shell
git clone https://github.com/wangronin/Bayesian-Optimization.git
cd Bayesian-Optimization && python setup.py install --user
```
## Example
For real-valued search variables, the simplest usage is via the `fmin` function:
```python
from bayes_optim import fmin
def f(x):
    return sum(x ** 2)
minimum = fmin(f, [-5] * 2, [5] * 2, max_FEs=30, seed=42)
```
And you could also have much finer control over most ingredients of BO, e.g., the surrogate
model and acquisition functions. Please see the example below:
```python
import numpy as np

from bayes_optim import BO, RealSpace
from bayes_optim.Surrogate import GaussianProcess

dim = 5
lb, ub = -5, 5
space = RealSpace([lb, ub]) * dim  # create the search space

def fitness(x):  # objective function (sphere function, for illustration)
    return sum(x_i ** 2 for x_i in x)

# hyperparameters (length-scale bounds) of the GPR model
thetaL = 1e-10 * (ub - lb) * np.ones(dim)
thetaU = 10 * (ub - lb) * np.ones(dim)
model = GaussianProcess(  # create the GPR model
    thetaL=thetaL, thetaU=thetaU
)
opt = BO(
    search_space=space,
    obj_fun=fitness,
    model=model,
    DoE_size=5,   # number of initial sample points
    max_FEs=50,   # maximal number of function evaluations
    verbose=True
)
opt.run()
```
For more detailed usage and examples, please check out our [wiki page](https://github.com/wangronin/Bayesian-Optimization/wiki).
## Features
This implementation differs from alternative packages/libraries in the following features:
* **Parallelization**, also known as _batch-sequential optimization_, for which several different approaches are implemented here.
* **Moment-Generating Function of the Improvement** (MGFI) [WvSEB17a] is a recently proposed acquisition function, which implicitly controls the exploration-exploitation trade-off.
* **Mixed-Integer Evolution Strategy** for optimizing the acquisition function, which is enabled when the search space is a mixture of real, integer, and categorical variables.
## Project Structure
* `bayes-optim/SearchSpace.py`: implementation of the search/decision space.
* `bayes-optim/base.py`: the base class of Bayesian Optimization.
* `bayes-optim/AcquisitionFunction.py`: the implementation of acquisition functions (see below for the list of implemented ones).
* `bayes-optim/Surrogate`: we implemented the Gaussian Process Regression (GPR) and Random Forest (RF).
* `bayes-optim/BayesOpt.py` contains several BO variants:
* `BO`: noiseless + sequential
* `ParallelBO`: noiseless + parallel (a.k.a. batch-sequential)
* `AnnealingBO`: noiseless + parallel + annealing [WEB18]
* `SelfAdaptiveBO`: noiseless + parallel + self-adaptive [WEB19]
* `NoisyBO`: noisy + parallel
* `bayes-optim/Extension.py` is meant to include the latest developments that are not extensively tested:
* `PCABO`: noiseless + parallel + PCA-assisted dimensionality reduction [RaponiWBBD20] **[Under Construction]**
* `MultiAcquisitionBO`: noiseless + parallelization with multiple different acquisition functions **[Under Construction]**
## Acquisition Functions
The following infill-criteria are implemented in the library:
* _Expected Improvement_ (EI)
* _Probability of Improvement_ (PI)
* _Upper Confidence Bound_ (UCB)
* _Moment-Generating Function of Improvement_ (MGFI)
* _Generalized Expected Improvement_ (GEI) **[Under Construction]**
For sequential working mode, Expected Improvement is used by default. For parallelization mode, MGFI is enabled by default.
## Surrogate Model
This is the meta- (surrogate) model used in Bayesian optimization. The basic requirement for such a model is that it provides an uncertainty quantification (either empirical or theoretical) for its predictions. To handle categorical data easily, a __random forest__ model is used by default. The implementation here is based on the one in _scikit-learn_, with modifications to the uncertainty quantification.
## A brief Introduction to Bayesian Optimization
Bayesian Optimization (BO) [Moc74, JSW98] is a sequential optimization strategy originally proposed to solve single-objective black-box optimization problems that are costly to evaluate. Here, we shall restrict our discussion to the single-objective case. BO typically starts by sampling an initial design of experiments (DoE) of size n, X = {x1, x2, ..., xn}, which is usually generated by simple random sampling, Latin Hypercube Sampling [SWN03], or a more sophisticated low-discrepancy sequence [Nie88] (e.g., the Sobol sequence). Taking the initial DoE X and its corresponding objective values, Y = {f(x1), f(x2), ..., f(xn)} ⊆ ℝ, we proceed to construct a statistical model M describing the probability distribution of the objective function conditioned on the initial evidence, namely Pr(f | X, Y). In most application scenarios of BO, there is a lack of a priori knowledge about f, and therefore nonparametric models (e.g., Gaussian process regression or random forests) are commonly chosen for M. This gives rise to a predictor f'(x) for all candidate points x and an uncertainty quantification s'(x) that estimates, for instance, the mean squared error of the prediction, E[(f'(x) − f(x))²]. Based on f' and s', promising points can be identified via the so-called acquisition function, which balances exploitation with exploration of the optimization process.
## Reference
* [Moc74] Jonas Mockus. "On bayesian methods for seeking the extremum". In Guri I. Marchuk, editor, _Optimization Techniques, IFIP Technical Conference, Novosibirsk_, USSR, July 1-7, 1974, volume 27 of _Lecture Notes in Computer Science_, pages 400–404. Springer, 1974.
* [JSW98] Donald R. Jones, Matthias Schonlau, and William J. Welch. "Efficient global optimization of expensive black-box functions". _J. Glob. Optim._, 13(4):455–492, 1998.
* [SWN03] Thomas J. Santner, Brian J. Williams, and William I. Notz. "The Design and Analysis of Computer Experiments". _Springer series in statistics._ Springer, 2003.
* [Nie88] Harald Niederreiter. "Low-discrepancy and low-dispersion sequences". _Journal of number theory_, 30(1):51–70, 1988.
* [WvSEB17a] Hao Wang, Bas van Stein, Michael Emmerich, and Thomas Bäck. "A New Acquisition Function for Bayesian Optimization Based on the Moment-Generating Function". In _Systems, Man, and Cybernetics (SMC), 2017 IEEE International Conference on_, pages 507–512. IEEE, 2017.
* [WEB18] Hao Wang, Michael Emmerich, and Thomas Bäck. "Cooling Strategies for the Moment-Generating Function in Bayesian Global Optimization". In _2018 IEEE Congress on Evolutionary Computation_, CEC 2018, Rio de Janeiro, Brazil, July 8-13, 2018, pages 1–8. IEEE, 2018.
* [WEB19] Hao Wang, Michael Emmerich, and Thomas Bäck. "Towards self-adaptive efficient global optimization". In _AIP Conference Proceedings_, vol. 2070, no. 1, p. 020056. AIP Publishing LLC, 2019.
* [RaponiWBBD20] Elena Raponi, Hao Wang, Mariusz Bujny, Simonetta Boria, and Carola Doerr. "High Dimensional Bayesian Optimization Assisted by Principal Component Analysis". In _International Conference on Parallel Problem Solving from Nature_, pp. 169-183. Springer, Cham, 2020.
%package help
Summary: Development documents and examples for bayes-optim
Provides: python3-bayes-optim-doc
%description help
[![Actions Status](https://github.com/wangronin/Bayesian-Optimization/workflows/Build%20and%20Test/badge.svg)](https://github.com/wangronin/Bayesian-Optimization/actions)
# Bayesian Optimization Library
A `Python` implementation of the Bayesian Optimization (BO) algorithm working on decision spaces composed of real, integer, or categorical variables, or a mixture thereof.
Underpinned by surrogate models, BO iteratively proposes candidate solutions using the so-called **acquisition function** which balances exploration with exploitation, and updates the surrogate model with newly observed objective values. This algorithm is designed to optimize **expensive black-box** problems efficiently.
![](assets/BO-example.gif)
## Installation
You could either install the stable version on `pypi`:
```shell
pip install bayes-optim
```
Or, install the latest version from GitHub:
```shell
git clone https://github.com/wangronin/Bayesian-Optimization.git
cd Bayesian-Optimization && python setup.py install --user
```
## Example
For real-valued search variables, the simplest usage is via the `fmin` function:
```python
from bayes_optim import fmin
def f(x):
    return sum(x ** 2)
minimum = fmin(f, [-5] * 2, [5] * 2, max_FEs=30, seed=42)
```
And you could also have much finer control over most ingredients of BO, e.g., the surrogate
model and acquisition functions. Please see the example below:
```python
import numpy as np

from bayes_optim import BO, RealSpace
from bayes_optim.Surrogate import GaussianProcess

dim = 5
lb, ub = -5, 5
space = RealSpace([lb, ub]) * dim  # create the search space

def fitness(x):  # objective function (sphere function, for illustration)
    return sum(x_i ** 2 for x_i in x)

# hyperparameters (length-scale bounds) of the GPR model
thetaL = 1e-10 * (ub - lb) * np.ones(dim)
thetaU = 10 * (ub - lb) * np.ones(dim)
model = GaussianProcess(  # create the GPR model
    thetaL=thetaL, thetaU=thetaU
)
opt = BO(
    search_space=space,
    obj_fun=fitness,
    model=model,
    DoE_size=5,   # number of initial sample points
    max_FEs=50,   # maximal number of function evaluations
    verbose=True
)
opt.run()
```
For more detailed usage and examples, please check out our [wiki page](https://github.com/wangronin/Bayesian-Optimization/wiki).
## Features
This implementation differs from alternative packages/libraries in the following features:
* **Parallelization**, also known as _batch-sequential optimization_, for which several different approaches are implemented here.
* **Moment-Generating Function of the Improvement** (MGFI) [WvSEB17a] is a recently proposed acquisition function, which implicitly controls the exploration-exploitation trade-off.
* **Mixed-Integer Evolution Strategy** for optimizing the acquisition function, which is enabled when the search space is a mixture of real, integer, and categorical variables.
## Project Structure
* `bayes-optim/SearchSpace.py`: implementation of the search/decision space.
* `bayes-optim/base.py`: the base class of Bayesian Optimization.
* `bayes-optim/AcquisitionFunction.py`: the implementation of acquisition functions (see below for the list of implemented ones).
* `bayes-optim/Surrogate`: we implemented the Gaussian Process Regression (GPR) and Random Forest (RF).
* `bayes-optim/BayesOpt.py` contains several BO variants:
* `BO`: noiseless + sequential
* `ParallelBO`: noiseless + parallel (a.k.a. batch-sequential)
* `AnnealingBO`: noiseless + parallel + annealing [WEB18]
* `SelfAdaptiveBO`: noiseless + parallel + self-adaptive [WEB19]
* `NoisyBO`: noisy + parallel
* `bayes-optim/Extension.py` is meant to include the latest developments that are not extensively tested:
* `PCABO`: noiseless + parallel + PCA-assisted dimensionality reduction [RaponiWBBD20] **[Under Construction]**
* `MultiAcquisitionBO`: noiseless + parallelization with multiple different acquisition functions **[Under Construction]**
## Acquisition Functions
The following infill-criteria are implemented in the library:
* _Expected Improvement_ (EI)
* _Probability of Improvement_ (PI)
* _Upper Confidence Bound_ (UCB)
* _Moment-Generating Function of Improvement_ (MGFI)
* _Generalized Expected Improvement_ (GEI) **[Under Construction]**
For sequential working mode, Expected Improvement is used by default. For parallelization mode, MGFI is enabled by default.
## Surrogate Model
This is the meta- (surrogate) model used in Bayesian optimization. The basic requirement for such a model is that it provides an uncertainty quantification (either empirical or theoretical) for its predictions. To handle categorical data easily, a __random forest__ model is used by default. The implementation here is based on the one in _scikit-learn_, with modifications to the uncertainty quantification.
## A brief Introduction to Bayesian Optimization
Bayesian Optimization (BO) [Moc74, JSW98] is a sequential optimization strategy originally proposed to solve single-objective black-box optimization problems that are costly to evaluate. Here, we shall restrict our discussion to the single-objective case. BO typically starts by sampling an initial design of experiments (DoE) of size n, X = {x1, x2, ..., xn}, which is usually generated by simple random sampling, Latin Hypercube Sampling [SWN03], or a more sophisticated low-discrepancy sequence [Nie88] (e.g., the Sobol sequence). Taking the initial DoE X and its corresponding objective values, Y = {f(x1), f(x2), ..., f(xn)} ⊆ ℝ, we proceed to construct a statistical model M describing the probability distribution of the objective function conditioned on the initial evidence, namely Pr(f | X, Y). In most application scenarios of BO, there is a lack of a priori knowledge about f, and therefore nonparametric models (e.g., Gaussian process regression or random forests) are commonly chosen for M. This gives rise to a predictor f'(x) for all candidate points x and an uncertainty quantification s'(x) that estimates, for instance, the mean squared error of the prediction, E[(f'(x) − f(x))²]. Based on f' and s', promising points can be identified via the so-called acquisition function, which balances exploitation with exploration of the optimization process.
## Reference
* [Moc74] Jonas Mockus. "On bayesian methods for seeking the extremum". In Guri I. Marchuk, editor, _Optimization Techniques, IFIP Technical Conference, Novosibirsk_, USSR, July 1-7, 1974, volume 27 of _Lecture Notes in Computer Science_, pages 400–404. Springer, 1974.
* [JSW98] Donald R. Jones, Matthias Schonlau, and William J. Welch. "Efficient global optimization of expensive black-box functions". _J. Glob. Optim._, 13(4):455–492, 1998.
* [SWN03] Thomas J. Santner, Brian J. Williams, and William I. Notz. "The Design and Analysis of Computer Experiments". _Springer series in statistics._ Springer, 2003.
* [Nie88] Harald Niederreiter. "Low-discrepancy and low-dispersion sequences". _Journal of number theory_, 30(1):51–70, 1988.
* [WvSEB17a] Hao Wang, Bas van Stein, Michael Emmerich, and Thomas Bäck. "A New Acquisition Function for Bayesian Optimization Based on the Moment-Generating Function". In _Systems, Man, and Cybernetics (SMC), 2017 IEEE International Conference on_, pages 507–512. IEEE, 2017.
* [WEB18] Hao Wang, Michael Emmerich, and Thomas Bäck. "Cooling Strategies for the Moment-Generating Function in Bayesian Global Optimization". In _2018 IEEE Congress on Evolutionary Computation_, CEC 2018, Rio de Janeiro, Brazil, July 8-13, 2018, pages 1–8. IEEE, 2018.
* [WEB19] Hao Wang, Michael Emmerich, and Thomas Bäck. "Towards self-adaptive efficient global optimization". In _AIP Conference Proceedings_, vol. 2070, no. 1, p. 020056. AIP Publishing LLC, 2019.
* [RaponiWBBD20] Elena Raponi, Hao Wang, Mariusz Bujny, Simonetta Boria, and Carola Doerr. "High Dimensional Bayesian Optimization Assisted by Principal Component Analysis". In _International Conference on Parallel Problem Solving from Nature_, pp. 169-183. Springer, Cham, 2020.
%prep
%autosetup -n bayes-optim-0.2.6
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-bayes-optim -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Mon May 29 2023 Python_Bot - 0.2.6-1
- Package Spec generated