%global _empty_manifest_terminate_build 0
Name: python-ngboost
Version: 0.4.1
Release: 1
Summary: Library for probabilistic predictions via gradient boosting.
License: Apache-2.0
URL: https://github.com/stanfordmlgroup/ngboost
Source0: https://mirrors.nju.edu.cn/pypi/web/packages/90/a5/ee3cf88698107fcbe59ca8231e43cba338c0be6c229ba1e2ec26cd6fff1f/ngboost-0.4.1.tar.gz
BuildArch: noarch
Requires: python3-lifelines
Requires: python3-numpy
Requires: python3-pandas
Requires: python3-scikit-learn
Requires: python3-scipy
Requires: python3-tqdm
%description
# NGBoost: Natural Gradient Boosting for Probabilistic Prediction

ngboost is a Python library that implements Natural Gradient Boosting, as described in ["NGBoost: Natural Gradient Boosting for Probabilistic Prediction"](https://stanfordmlgroup.github.io/projects/ngboost/). It is built on top of [Scikit-Learn](https://scikit-learn.org/stable/), and is designed to be scalable and modular with respect to choice of proper scoring rule, distribution, and base learner. A didactic introduction to the methodology underlying NGBoost is available in this [slide deck](https://drive.google.com/file/d/183BWFAdFms81MKy6hSku8qI97OwS_JH_/view?usp=sharing).
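The same estimator class accepts the scoring rule and output distribution as arguments. A minimal sketch of that configurability, assuming the `Dist`/`Score` keyword names exposed by the ngboost API (details may vary by version):
```python
from ngboost import NGBRegressor
from ngboost.distns import Exponential, Normal
from ngboost.scores import CRPScore, LogScore

# NLL-trained Gaussian outputs (roughly the default configuration)
ngb_log = NGBRegressor(Dist=Normal, Score=LogScore)

# CRPS-trained Exponential outputs, same fit/predict interface
ngb_crps = NGBRegressor(Dist=Exponential, Score=CRPScore)
```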
## Installation
```sh
# via pip
pip install --upgrade ngboost

# via conda-forge
conda install -c conda-forge ngboost
```
## Usage
Probabilistic regression example on the Boston housing dataset:
```python
from ngboost import NGBRegressor
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
X, Y = load_boston(return_X_y=True)  # note: load_boston was removed in scikit-learn >= 1.2
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2)

ngb = NGBRegressor().fit(X_train, Y_train)
Y_preds = ngb.predict(X_test)    # point predictions
Y_dists = ngb.pred_dist(X_test)  # predicted (conditional) distributions

# test Mean Squared Error
test_MSE = mean_squared_error(Y_test, Y_preds)
print('Test MSE', test_MSE)
# test Negative Log Likelihood
test_NLL = -Y_dists.logpdf(Y_test).mean()
print('Test NLL', test_NLL)
```
Details on available distributions, scoring rules, learners, tuning, and model interpretation are available in our [user guide](https://stanfordmlgroup.github.io/ngboost/intro.html), which also includes numerous usage examples and information on how to add new distributions or scores to NGBoost.
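As a hedged follow-up to the example above (reusing `ngb` and `X_test`), the object returned by `pred_dist` exposes the fitted parameters of each per-observation distribution via its `params` attribute; the available keys depend on the chosen distribution:
```python
# For the default Normal distribution, params holds per-row 'loc' and 'scale'.
Y_dists = ngb.pred_dist(X_test)
params = Y_dists.params
print(params['loc'][:5])    # predicted means for the first five test rows
print(params['scale'][:5])  # predicted standard deviations for the same rows
```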
## License
[Apache License 2.0](https://github.com/stanfordmlgroup/ngboost/blob/master/LICENSE).
## Reference
Tony Duan, Anand Avati, Daisy Yi Ding, Khanh K. Thai, Sanjay Basu, Andrew Y. Ng, Alejandro Schuler. 2019.
NGBoost: Natural Gradient Boosting for Probabilistic Prediction.
[arXiv](https://arxiv.org/abs/1910.03225)
%package -n python3-ngboost
Summary: Library for probabilistic predictions via gradient boosting.
Provides: python-ngboost
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-ngboost
# NGBoost: Natural Gradient Boosting for Probabilistic Prediction

ngboost is a Python library that implements Natural Gradient Boosting, as described in ["NGBoost: Natural Gradient Boosting for Probabilistic Prediction"](https://stanfordmlgroup.github.io/projects/ngboost/). It is built on top of [Scikit-Learn](https://scikit-learn.org/stable/), and is designed to be scalable and modular with respect to choice of proper scoring rule, distribution, and base learner. A didactic introduction to the methodology underlying NGBoost is available in this [slide deck](https://drive.google.com/file/d/183BWFAdFms81MKy6hSku8qI97OwS_JH_/view?usp=sharing).
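Because any scikit-learn-style regressor can serve as the base learner, swapping it is a one-line change. A rough sketch, assuming the `Base` keyword of `NGBRegressor` (the library's default base learner is a shallow decision tree):
```python
from sklearn.tree import DecisionTreeRegressor
from ngboost import NGBRegressor

# Use a slightly shallower tree than the default and a smaller learning rate.
shallow_tree = DecisionTreeRegressor(criterion="friedman_mse", max_depth=2)
ngb = NGBRegressor(Base=shallow_tree, n_estimators=500, learning_rate=0.01)
```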
## Installation
```sh
# via pip
pip install --upgrade ngboost

# via conda-forge
conda install -c conda-forge ngboost
```
## Usage
Probabilistic regression example on the Boston housing dataset:
```python
from ngboost import NGBRegressor
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
X, Y = load_boston(return_X_y=True)  # note: load_boston was removed in scikit-learn >= 1.2
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2)

ngb = NGBRegressor().fit(X_train, Y_train)
Y_preds = ngb.predict(X_test)    # point predictions
Y_dists = ngb.pred_dist(X_test)  # predicted (conditional) distributions

# test Mean Squared Error
test_MSE = mean_squared_error(Y_test, Y_preds)
print('Test MSE', test_MSE)
# test Negative Log Likelihood
test_NLL = -Y_dists.logpdf(Y_test).mean()
print('Test NLL', test_NLL)
```
Details on available distributions, scoring rules, learners, tuning, and model interpretation are available in our [user guide](https://stanfordmlgroup.github.io/ngboost/intro.html), which also includes numerous usage examples and information on how to add new distributions or scores to NGBoost.
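For tuning, one common pattern is validation-based early stopping. A self-contained sketch on synthetic data, assuming the `X_val`/`Y_val`/`early_stopping_rounds` arguments accepted by `NGBRegressor.fit` in recent releases:
```python
import numpy as np
from ngboost import NGBRegressor
from sklearn.model_selection import train_test_split

# Synthetic regression data so the snippet runs stand-alone.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
Y = X @ rng.normal(size=5) + rng.normal(scale=0.5, size=500)

X_tr, X_val, Y_tr, Y_val = train_test_split(X, Y, test_size=0.2)

# Fit with a large boosting budget; stop once the validation loss
# has not improved for 50 consecutive rounds.
ngb = NGBRegressor(n_estimators=2000)
ngb.fit(X_tr, Y_tr, X_val=X_val, Y_val=Y_val, early_stopping_rounds=50)
```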
## License
[Apache License 2.0](https://github.com/stanfordmlgroup/ngboost/blob/master/LICENSE).
## Reference
Tony Duan, Anand Avati, Daisy Yi Ding, Khanh K. Thai, Sanjay Basu, Andrew Y. Ng, Alejandro Schuler. 2019.
NGBoost: Natural Gradient Boosting for Probabilistic Prediction.
[arXiv](https://arxiv.org/abs/1910.03225)
%package help
Summary: Development documents and examples for ngboost
Provides: python3-ngboost-doc
%description help
# NGBoost: Natural Gradient Boosting for Probabilistic Prediction

ngboost is a Python library that implements Natural Gradient Boosting, as described in ["NGBoost: Natural Gradient Boosting for Probabilistic Prediction"](https://stanfordmlgroup.github.io/projects/ngboost/). It is built on top of [Scikit-Learn](https://scikit-learn.org/stable/), and is designed to be scalable and modular with respect to choice of proper scoring rule, distribution, and base learner. A didactic introduction to the methodology underlying NGBoost is available in this [slide deck](https://drive.google.com/file/d/183BWFAdFms81MKy6hSku8qI97OwS_JH_/view?usp=sharing).
## Installation
```sh
# via pip
pip install --upgrade ngboost

# via conda-forge
conda install -c conda-forge ngboost
```
## Usage
Probabilistic regression example on the Boston housing dataset:
```python
from ngboost import NGBRegressor
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
X, Y = load_boston(return_X_y=True)  # note: load_boston was removed in scikit-learn >= 1.2
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2)

ngb = NGBRegressor().fit(X_train, Y_train)
Y_preds = ngb.predict(X_test)    # point predictions
Y_dists = ngb.pred_dist(X_test)  # predicted (conditional) distributions

# test Mean Squared Error
test_MSE = mean_squared_error(Y_test, Y_preds)
print('Test MSE', test_MSE)
# test Negative Log Likelihood
test_NLL = -Y_dists.logpdf(Y_test).mean()
print('Test NLL', test_NLL)
```
Details on available distributions, scoring rules, learners, tuning, and model interpretation are available in our [user guide](https://stanfordmlgroup.github.io/ngboost/intro.html), which also includes numerous usage examples and information on how to add new distributions or scores to NGBoost.
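For model interpretation, fitted NGBoost estimators expose impurity-based feature importances, one set per distribution parameter. A hedged sketch reusing the fitted `ngb` from the example above (attribute name taken from the ngboost API; exact shape may vary by version):
```python
# feature_importances_ has one row per distribution parameter,
# e.g. 'loc' and 'scale' for the default Normal distribution.
importances = ngb.feature_importances_
for name, row in zip(["loc", "scale"], importances):
    print(name, row.round(3))
```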
## License
[Apache License 2.0](https://github.com/stanfordmlgroup/ngboost/blob/master/LICENSE).
## Reference
Tony Duan, Anand Avati, Daisy Yi Ding, Khanh K. Thai, Sanjay Basu, Andrew Y. Ng, Alejandro Schuler. 2019.
NGBoost: Natural Gradient Boosting for Probabilistic Prediction.
[arXiv](https://arxiv.org/abs/1910.03225)
%prep
%autosetup -n ngboost-0.4.1
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-ngboost -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Tue Apr 11 2023 Python_Bot - 0.4.1-1
- Package Spec generated