%global _empty_manifest_terminate_build 0
Name:		python-dbispipeline
Version:	0.8.30
Release:	1
Summary:	should make things more reproducible
License:	BSD License
URL:		https://git.uibk.ac.at/dbis/software/dbispipeline
Source0:	https://mirrors.nju.edu.cn/pypi/web/packages/fd/fe/3a6aa98a849db50ac9a415b301a6132338c816c4149d545a20281d1a140a/dbispipeline-0.8.30.tar.gz
BuildArch:	noarch

Requires:	python3-gitpython
Requires:	python3-matplotlib
Requires:	python3-pandas
Requires:	python3-psycopg2-binary
Requires:	python3-scikit-learn
Requires:	python3-sqlalchemy
Requires:	python3-click
Requires:	python3-logzero
Requires:	python3-pyyaml

%description
    ```yaml
    datasets:
      - music/acousticbrainz
      - music/billboard
      - music/millionsongdataset
    ```
    would assume that a physical directory exists at
    `/storage/nas3/datasets/music/billboard` and after calling the script
    `dbispipeline-link` without parameters using the above configuration, the
    following symlinks will be created:
    ```
    data/acousticbrainz -> /storage/nas3/datasets/music/acousticbrainz
    data/billboard -> /storage/nas3/datasets/music/billboard
    data/millionsongdataset -> /storage/nas3/datasets/music/millionsongdataset
    ```
    The value of `dataset_dir` from the config can be overridden on the
    command line with the `-p` option.
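The linking behaviour described above can be sketched in plain Python. This is a rough illustration of the idea, not the actual `dbispipeline-link` implementation; the `dataset_dir` value and the dataset list stand in for the values read from the configuration and `data/links.yaml`:

```python
import os

def link_datasets(dataset_dir, datasets, target="data"):
    """Create target/<name> symlinks pointing into dataset_dir."""
    os.makedirs(target, exist_ok=True)
    for entry in datasets:
        # e.g. /storage/nas3/datasets/music/billboard
        source = os.path.join(dataset_dir, entry)
        # e.g. data/billboard
        link = os.path.join(target, os.path.basename(entry))
        if not os.path.islink(link):
            os.symlink(source, link)

link_datasets("/storage/nas3/datasets",
              ["music/acousticbrainz", "music/billboard",
               "music/millionsongdataset"])
```

Note that the symlinks are created even if the source directories do not exist yet; the real script presumably behaves similarly, since the NAS mount may not be available everywhere.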
## Requirements
* python >= 3.6
* a PostgreSQL database
* an email server if you want to use notification emails
## Installation
1. Install dbispipeline into your Python environment. We recommend using
   pipenv to keep your dependencies clean: `pipenv install dbispipeline`.
   This creates a virtual environment and installs all dependencies.
2. Write your plan(s). See the example plan files for guidance.
3. Run `pipenv run dp <yourplanfile.py>`
Enjoy!
## Configuration
The framework looks for its configuration files in multiple directories:
* `/usr/local/etc/dbispipeline.ini` for system-wide defaults.
* `$HOME/.config/dbispipeline.ini` for user-specific configuration.
* `./dbispipeline.ini` for project-specific configuration.
An example configuration file looks like this:
```ini
[database]
# hostname of your postgres database
host = your.personal.database
# your database user name
user = user
# port of your postgres database, default = 5432
# port = 5432
# password of your database user
password = <secure-password>
# database to use
database = pipelineresults
# table to be used
result_table = my_super_awesome_results
[project]
# this will be stored in the database
name = dbispipeline-test
# this is used to store backups of the execution
# it is possible to override this by setting the DBISPIPELINE_BACKUP_DIR
# environment variable
# the default is the temp dir of the os if this option is not set.
backup_dir = tmp
# this is used to link the datasets specified in data/links.yaml
# it is possible to override this by setting the DBISPIPELINE_DATASET_DIR
# environment variable
dataset_dir = /storage/nas/datasets
[mail]
# email address to use as sender
sender = botname@yourserver.com
# recipient. This should probably be set on a home-directory-basis.
recipient = you@yourserver.com
# smtp server address to use
smtp_server = smtp.yourserver.com
# use smtp authentication, default = no
# authenticate = no
# username for smtp authentication, required if authenticate = yes
# username = foo
# password for smtp authentication, required if authenticate = yes
# password = bar
# port to use for smtp server connection, default = 465
# port = 465
```
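The three configuration locations can be merged with Python's standard `configparser`, where files read later override values from files read earlier. The sketch below assumes the precedence implied by the list above (project over user over system); the actual merge logic inside dbispipeline may differ:

```python
import configparser
import os

def load_config():
    """Read layered dbispipeline configuration files."""
    config = configparser.ConfigParser()
    # configparser.read() silently skips missing files; later files
    # override values from earlier ones, so project-local settings
    # (assumed here) win over user and system-wide ones.
    config.read([
        "/usr/local/etc/dbispipeline.ini",
        os.path.expanduser("~/.config/dbispipeline.ini"),
        "./dbispipeline.ini",
    ])
    return config

cfg = load_config()
if cfg.has_section("database"):
    print(cfg.get("database", "host"))
```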

%package -n python3-dbispipeline
Summary:	should make things more reproducible
Provides:	python-dbispipeline
BuildRequires:	python3-devel
BuildRequires:	python3-setuptools
BuildRequires:	python3-pip
%description -n python3-dbispipeline
    ```yaml
    datasets:
      - music/acousticbrainz
      - music/billboard
      - music/millionsongdataset
    ```
    would assume that a physical directory exists at
    `/storage/nas3/datasets/music/billboard` and after calling the script
    `dbispipeline-link` without parameters using the above configuration, the
    following symlinks will be created:
    ```
    data/acousticbrainz -> /storage/nas3/datasets/music/acousticbrainz
    data/billboard -> /storage/nas3/datasets/music/billboard
    data/millionsongdataset -> /storage/nas3/datasets/music/millionsongdataset
    ```
    The value of `dataset_dir` from the config can be overridden on the
    command line with the `-p` option.
## Requirements
* python >= 3.6
* a PostgreSQL database
* an email server if you want to use notification emails
## Installation
1. Install dbispipeline into your Python environment. We recommend using
   pipenv to keep your dependencies clean: `pipenv install dbispipeline`.
   This creates a virtual environment and installs all dependencies.
2. Write your plan(s). See the example plan files for guidance.
3. Run `pipenv run dp <yourplanfile.py>`
Enjoy!
## Configuration
The framework looks for its configuration files in multiple directories:
* `/usr/local/etc/dbispipeline.ini` for system-wide defaults.
* `$HOME/.config/dbispipeline.ini` for user-specific configuration.
* `./dbispipeline.ini` for project-specific configuration.
An example configuration file looks like this:
```ini
[database]
# hostname of your postgres database
host = your.personal.database
# your database user name
user = user
# port of your postgres database, default = 5432
# port = 5432
# password of your database user
password = <secure-password>
# database to use
database = pipelineresults
# table to be used
result_table = my_super_awesome_results
[project]
# this will be stored in the database
name = dbispipeline-test
# this is used to store backups of the execution
# it is possible to override this by setting the DBISPIPELINE_BACKUP_DIR
# environment variable
# the default is the temp dir of the os if this option is not set.
backup_dir = tmp
# this is used to link the datasets specified in data/links.yaml
# it is possible to override this by setting the DBISPIPELINE_DATASET_DIR
# environment variable
dataset_dir = /storage/nas/datasets
[mail]
# email address to use as sender
sender = botname@yourserver.com
# recipient. This should probably be set on a home-directory-basis.
recipient = you@yourserver.com
# smtp server address to use
smtp_server = smtp.yourserver.com
# use smtp authentication, default = no
# authenticate = no
# username for smtp authentication, required if authenticate = yes
# username = foo
# password for smtp authentication, required if authenticate = yes
# password = bar
# port to use for smtp server connection, default = 465
# port = 465
```

%package help
Summary:	Development documents and examples for dbispipeline
Provides:	python3-dbispipeline-doc
%description help
    ```yaml
    datasets:
      - music/acousticbrainz
      - music/billboard
      - music/millionsongdataset
    ```
    would assume that a physical directory exists at
    `/storage/nas3/datasets/music/billboard` and after calling the script
    `dbispipeline-link` without parameters using the above configuration, the
    following symlinks will be created:
    ```
    data/acousticbrainz -> /storage/nas3/datasets/music/acousticbrainz
    data/billboard -> /storage/nas3/datasets/music/billboard
    data/millionsongdataset -> /storage/nas3/datasets/music/millionsongdataset
    ```
    The value of `dataset_dir` from the config can be overridden on the
    command line with the `-p` option.
## Requirements
* python >= 3.6
* a PostgreSQL database
* an email server if you want to use notification emails
## Installation
1. Install dbispipeline into your Python environment. We recommend using
   pipenv to keep your dependencies clean: `pipenv install dbispipeline`.
   This creates a virtual environment and installs all dependencies.
2. Write your plan(s). See the example plan files for guidance.
3. Run `pipenv run dp <yourplanfile.py>`
Enjoy!
## Configuration
The framework looks for its configuration files in multiple directories:
* `/usr/local/etc/dbispipeline.ini` for system-wide defaults.
* `$HOME/.config/dbispipeline.ini` for user-specific configuration.
* `./dbispipeline.ini` for project-specific configuration.
An example configuration file looks like this:
```ini
[database]
# hostname of your postgres database
host = your.personal.database
# your database user name
user = user
# port of your postgres database, default = 5432
# port = 5432
# password of your database user
password = <secure-password>
# database to use
database = pipelineresults
# table to be used
result_table = my_super_awesome_results
[project]
# this will be stored in the database
name = dbispipeline-test
# this is used to store backups of the execution
# it is possible to override this by setting the DBISPIPELINE_BACKUP_DIR
# environment variable
# the default is the temp dir of the os if this option is not set.
backup_dir = tmp
# this is used to link the datasets specified in data/links.yaml
# it is possible to override this by setting the DBISPIPELINE_DATASET_DIR
# environment variable
dataset_dir = /storage/nas/datasets
[mail]
# email address to use as sender
sender = botname@yourserver.com
# recipient. This should probably be set on a home-directory-basis.
recipient = you@yourserver.com
# smtp server address to use
smtp_server = smtp.yourserver.com
# use smtp authentication, default = no
# authenticate = no
# username for smtp authentication, required if authenticate = yes
# username = foo
# password for smtp authentication, required if authenticate = yes
# password = bar
# port to use for smtp server connection, default = 465
# port = 465
```

%prep
%autosetup -n dbispipeline-0.8.30

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
	find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
	find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
	find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
	find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
	find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-dbispipeline -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Tue May 30 2023 Python_Bot <Python_Bot@openeuler.org> - 0.8.30-1
- Package Spec generated