author    | CoprDistGit <infra@openeuler.org> | 2023-05-05 05:59:42 +0000
committer | CoprDistGit <infra@openeuler.org> | 2023-05-05 05:59:42 +0000
commit    | ecbc73f848c58c4137816bed44e1c879c634d2f6
tree      | 38e3b8612815b4a3b702d67d6357cbda2f291f74
parent    | af7449dabd052a208420dc202971bcc8744ef234
automatic import of python-airflow-commons (openeuler20.03)
-rw-r--r-- | .gitignore                  |   1
-rw-r--r-- | python-airflow-commons.spec | 193
-rw-r--r-- | sources                     |   1
3 files changed, 195 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
new file mode 100644
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1 @@
+/airflow-commons-0.0.67.tar.gz
diff --git a/python-airflow-commons.spec b/python-airflow-commons.spec
new file mode 100644
index 0000000..f95df91
--- /dev/null
+++ b/python-airflow-commons.spec
@@ -0,0 +1,193 @@
+%global _empty_manifest_terminate_build 0
+Name: python-airflow-commons
+Version: 0.0.67
+Release: 1
+Summary: Common functions for airflow
+License: MIT License
+URL: https://github.com/migroscomtr/airflow-commons/
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/fc/3b/3a72e278fa1cb4bf5cd07a74348d2871d25c3ff1590dbcb4270c8a6160cb/airflow-commons-0.0.67.tar.gz
+BuildArch: noarch
+
+Requires: python3-pytz
+Requires: python3-datetime
+Requires: python3-google-cloud-bigquery
+Requires: python3-pandas
+Requires: python3-sqlalchemy
+Requires: python3-pymysql
+Requires: python3-boto3
+Requires: python3-botocore
+Requires: python3-aiobotocore
+Requires: python3-pyyaml
+Requires: python3-s3fs
+Requires: python3-s3transfer
+Requires: python3-pyarrow
+
+%description
+# airflow-commons
+A Python package that contains common functionality for Airflow.
+
+## Installation
+Use the package manager pip to install airflow-commons.
+```bash
+pip install airflow-commons
+```
+## Modules
+* bigquery_operator: With this module you can manage your Google BigQuery operations.
+* mysql_operator: Using this module, you can connect to your MySQL data source and manage your data operations.
+* s3_operator: This operator connects to your S3 bucket and lets you manage it.
+* glossary: This module consists of constants used across the project.
+* sql_resources: Template BigQuery and MySQL queries such as merge, delete, and select are located here.
+* utils: Generic methods such as connection and querying are implemented in this module.
+
+## Usage
+* Sample deduplication code looks like this:
+```python
+from airflow_commons import bigquery_operator
+
+bigquery_operator.deduplicate(
+    service_account_file="path_to_file",
+    start_date="01-01-2020 14:00:00",
+    end_date="01-01-2020 15:00:00",
+    project_id="bigquery_project_id",
+    source_dataset="source_dataset",
+    source_table="source_table",
+    target_dataset="target_dataset",
+    target_table="target_table",
+    oldest_allowable_target_partition="01-01-2015 00:00:00",
+    primary_keys=["primary_keys"],
+    time_columns=["time_columns"],
+    allow_partition_pruning=True,
+)
+```
+
+
+%package -n python3-airflow-commons
+Summary: Common functions for airflow
+Provides: python-airflow-commons
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-airflow-commons
+# airflow-commons
+A Python package that contains common functionality for Airflow.
+
+## Installation
+Use the package manager pip to install airflow-commons.
+```bash
+pip install airflow-commons
+```
+## Modules
+* bigquery_operator: With this module you can manage your Google BigQuery operations.
+* mysql_operator: Using this module, you can connect to your MySQL data source and manage your data operations.
+* s3_operator: This operator connects to your S3 bucket and lets you manage it.
+* glossary: This module consists of constants used across the project.
+* sql_resources: Template BigQuery and MySQL queries such as merge, delete, and select are located here.
+* utils: Generic methods such as connection and querying are implemented in this module.
+
+## Usage
+* Sample deduplication code looks like this:
+```python
+from airflow_commons import bigquery_operator
+
+bigquery_operator.deduplicate(
+    service_account_file="path_to_file",
+    start_date="01-01-2020 14:00:00",
+    end_date="01-01-2020 15:00:00",
+    project_id="bigquery_project_id",
+    source_dataset="source_dataset",
+    source_table="source_table",
+    target_dataset="target_dataset",
+    target_table="target_table",
+    oldest_allowable_target_partition="01-01-2015 00:00:00",
+    primary_keys=["primary_keys"],
+    time_columns=["time_columns"],
+    allow_partition_pruning=True,
+)
+```
+
+
+%package help
+Summary: Development documents and examples for airflow-commons
+Provides: python3-airflow-commons-doc
+%description help
+# airflow-commons
+A Python package that contains common functionality for Airflow.
+
+## Installation
+Use the package manager pip to install airflow-commons.
+```bash
+pip install airflow-commons
+```
+## Modules
+* bigquery_operator: With this module you can manage your Google BigQuery operations.
+* mysql_operator: Using this module, you can connect to your MySQL data source and manage your data operations.
+* s3_operator: This operator connects to your S3 bucket and lets you manage it.
+* glossary: This module consists of constants used across the project.
+* sql_resources: Template BigQuery and MySQL queries such as merge, delete, and select are located here.
+* utils: Generic methods such as connection and querying are implemented in this module.
+
+## Usage
+* Sample deduplication code looks like this:
+```python
+from airflow_commons import bigquery_operator
+
+bigquery_operator.deduplicate(
+    service_account_file="path_to_file",
+    start_date="01-01-2020 14:00:00",
+    end_date="01-01-2020 15:00:00",
+    project_id="bigquery_project_id",
+    source_dataset="source_dataset",
+    source_table="source_table",
+    target_dataset="target_dataset",
+    target_table="target_table",
+    oldest_allowable_target_partition="01-01-2015 00:00:00",
+    primary_keys=["primary_keys"],
+    time_columns=["time_columns"],
+    allow_partition_pruning=True,
+)
+```
+
+
+%prep
+%autosetup -n airflow-commons-0.0.67
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-airflow-commons -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Fri May 05 2023 Python_Bot <Python_Bot@openeuler.org> - 0.0.67-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+1b83721b19f643728dd9bae48f0655c5 airflow-commons-0.0.67.tar.gz
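To try the spec above outside Copr, the sketch below shows one way to rebuild the package locally. It is a minimal example and not part of the imported spec: it assumes rpmdevtools, rpm-build, and curl are installed, uses the default `~/rpmbuild` tree created by `rpmdev-setuptree`, downloads the Source0 tarball listed in the spec, and compares its checksum against the md5 recorded in the `sources` file.

```bash
# Create the default ~/rpmbuild/{SPECS,SOURCES,...} layout.
rpmdev-setuptree

# Put the spec and the Source0 tarball where rpmbuild expects them.
cp python-airflow-commons.spec ~/rpmbuild/SPECS/
curl -L -o ~/rpmbuild/SOURCES/airflow-commons-0.0.67.tar.gz \
  "https://mirrors.nju.edu.cn/pypi/web/packages/fc/3b/3a72e278fa1cb4bf5cd07a74348d2871d25c3ff1590dbcb4270c8a6160cb/airflow-commons-0.0.67.tar.gz"

# Optional: the digest should match the md5 in the sources file.
md5sum ~/rpmbuild/SOURCES/airflow-commons-0.0.67.tar.gz

# Build the source RPM and the noarch binary RPM.
rpmbuild -ba ~/rpmbuild/SPECS/python-airflow-commons.spec
```

The noarch package should end up under `~/rpmbuild/RPMS/noarch/`; its file list is generated at build time by the `find` calls in the `%install` section of the spec.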