-rw-r--r--  .gitignore                1
-rw-r--r--  python-dag-factory.spec   446
-rw-r--r--  sources                   1
3 files changed, 448 insertions(+), 0 deletions(-)
diff --git a/.gitignore b/.gitignore
index e69de29..546ab56 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/dag-factory-0.17.3.tar.gz
diff --git a/python-dag-factory.spec b/python-dag-factory.spec
new file mode 100644
index 0000000..d7d878a
--- /dev/null
+++ b/python-dag-factory.spec
@@ -0,0 +1,446 @@
+%global _empty_manifest_terminate_build 0
+Name: python-dag-factory
+Version: 0.17.3
+Release: 1
+Summary: Dynamically build Airflow DAGs from YAML files
+License: MIT
+URL: https://github.com/ajbosco/dag-factory
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/5e/84/2f2c0eecea7f36e96626e02641cf91f8680d6a5ecec29007071510209489/dag-factory-0.17.3.tar.gz
+BuildArch: noarch
+
+Requires: python3-apache-airflow[http,kubernetes]
+Requires: python3-pyyaml
+Requires: python3-packaging
+Requires: python3-black
+Requires: python3-pytest
+Requires: python3-pylint
+Requires: python3-pytest-cov
+Requires: python3-tox
+
+%description
+
+# dag-factory
+
+[![Github Actions](https://github.com/ajbosco/dag-factory/workflows/build/badge.svg?branch=master&event=push)](https://github.com/ajbosco/dag-factory/actions?workflow=build)
+[![Coverage](https://codecov.io/github/ajbosco/dag-factory/coverage.svg?branch=master)](https://codecov.io/github/ajbosco/dag-factory?branch=master)
+[![PyPi](https://img.shields.io/pypi/v/dag-factory.svg)](https://pypi.org/project/dag-factory/)
+[![Code Style](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
+[![Downloads](https://pepy.tech/badge/dag-factory)](https://pepy.tech/project/dag-factory)
+
+*dag-factory* is a library for dynamically generating [Apache Airflow](https://github.com/apache/incubator-airflow) DAGs from YAML configuration files.
+- [Installation](#installation)
+- [Usage](#usage)
+- [Benefits](#benefits)
+- [Contributing](#contributing)
+
+## Installation
+
+To install *dag-factory* run `pip install dag-factory`. It requires Python 3.6.0+ and Apache Airflow 2.0+.
+
+## Usage
+
+After installing *dag-factory* in your Airflow environment, there are two steps to creating DAGs. First, we need to create a YAML configuration file. For example:
+
+```yaml
+example_dag1:
+ default_args:
+ owner: 'example_owner'
+ start_date: 2018-01-01 # or '2 days'
+ end_date: 2018-01-05
+ retries: 1
+ retry_delay_sec: 300
+ schedule_interval: '0 3 * * *'
+ concurrency: 1
+ max_active_runs: 1
+ dagrun_timeout_sec: 60
+ default_view: 'tree' # or 'graph', 'duration', 'gantt', 'landing_times'
+ orientation: 'LR' # or 'TB', 'RL', 'BT'
+ description: 'this is an example dag!'
+ on_success_callback_name: print_hello
+ on_success_callback_file: /usr/local/airflow/dags/print_hello.py
+ on_failure_callback_name: print_hello
+ on_failure_callback_file: /usr/local/airflow/dags/print_hello.py
+ tasks:
+ task_1:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 1'
+ task_2:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 2'
+ dependencies: [task_1]
+ task_3:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 3'
+ dependencies: [task_1]
+```
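
The `on_success_callback_name` and `on_failure_callback_name` keys above name a function defined in the file given by the matching `*_callback_file` key. A minimal sketch of what `/usr/local/airflow/dags/print_hello.py` might contain (the function body is an assumption; dag-factory only requires that the function name matches the config):

```python
# Hypothetical contents of /usr/local/airflow/dags/print_hello.py.
# Airflow invokes success/failure callbacks with the task context dict;
# the function name must match on_success_callback_name above.
def print_hello(context=None):
    dag_id = context["dag"].dag_id if context else "unknown"
    message = f"hello from DAG {dag_id}"
    print(message)
    return message
```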
+
+Then, in the DAGs folder of your Airflow environment, create a Python file like this:
+
+```python
+from airflow import DAG
+import dagfactory
+
+dag_factory = dagfactory.DagFactory("/path/to/dags/config_file.yml")
+
+dag_factory.clean_dags(globals())
+dag_factory.generate_dags(globals())
+```
+
+And this DAG will be generated and ready to run in Airflow!
+
+If you have several configuration files you can import them like this:
+
+```python
+# The word 'airflow' must appear in this file for the Airflow DagBag to parse it
+from dagfactory import load_yaml_dags
+
+load_yaml_dags(globals_dict=globals(), suffix=['dag.yaml'])
+```
+
+![screenshot](/img/example_dag.png)
+
+## Notes
+
+### HttpSensor (since 0.10.0)
+
+The `airflow.sensors.http_sensor` module works with all supported versions of Airflow. In Airflow 2.0+, the new provider module path can also be used in the operator value: `airflow.providers.http.sensors.http`.
+
+The following example shows `response_check` logic in a python file:
+
+```yaml
+task_2:
+ operator: airflow.sensors.http_sensor.HttpSensor
+ http_conn_id: 'test-http'
+ method: 'GET'
+ response_check_name: check_sensor
+ response_check_file: /path/to/example1/http_conn.py
+ dependencies: [task_1]
+```
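
For illustration, the `check_sensor` function referenced by `response_check_name` above might look like this (a sketch; only the contract is fixed by the sensor: it receives the HTTP response and the sensor keeps poking until it returns a truthy value):

```python
# Hypothetical response_check function for the HttpSensor above,
# living in the file named by response_check_file. The function name
# must match response_check_name in the YAML config.
def check_sensor(response):
    # Succeed once the endpoint's body reports "ok".
    return "ok" in response.text
```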
+
+The `response_check` logic can also be provided as a lambda:
+
+```yaml
+task_2:
+ operator: airflow.sensors.http_sensor.HttpSensor
+ http_conn_id: 'test-http'
+ method: 'GET'
+  response_check_lambda: 'lambda response: "ok" in response.text'
+ dependencies: [task_1]
+```
+
+## Benefits
+
+* Construct DAGs without knowing Python
+* Construct DAGs without learning Airflow primitives
+* Avoid duplicative code
+* Everyone loves YAML! ;)
+
+## Contributing
+
+Contributions are welcome! Just submit a Pull Request or GitHub Issue.
+
+
+
+
+%package -n python3-dag-factory
+Summary: Dynamically build Airflow DAGs from YAML files
+Provides: python-dag-factory
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-dag-factory
+
+# dag-factory
+
+[![Github Actions](https://github.com/ajbosco/dag-factory/workflows/build/badge.svg?branch=master&event=push)](https://github.com/ajbosco/dag-factory/actions?workflow=build)
+[![Coverage](https://codecov.io/github/ajbosco/dag-factory/coverage.svg?branch=master)](https://codecov.io/github/ajbosco/dag-factory?branch=master)
+[![PyPi](https://img.shields.io/pypi/v/dag-factory.svg)](https://pypi.org/project/dag-factory/)
+[![Code Style](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
+[![Downloads](https://pepy.tech/badge/dag-factory)](https://pepy.tech/project/dag-factory)
+
+*dag-factory* is a library for dynamically generating [Apache Airflow](https://github.com/apache/incubator-airflow) DAGs from YAML configuration files.
+- [Installation](#installation)
+- [Usage](#usage)
+- [Benefits](#benefits)
+- [Contributing](#contributing)
+
+## Installation
+
+To install *dag-factory* run `pip install dag-factory`. It requires Python 3.6.0+ and Apache Airflow 2.0+.
+
+## Usage
+
+After installing *dag-factory* in your Airflow environment, there are two steps to creating DAGs. First, we need to create a YAML configuration file. For example:
+
+```yaml
+example_dag1:
+ default_args:
+ owner: 'example_owner'
+ start_date: 2018-01-01 # or '2 days'
+ end_date: 2018-01-05
+ retries: 1
+ retry_delay_sec: 300
+ schedule_interval: '0 3 * * *'
+ concurrency: 1
+ max_active_runs: 1
+ dagrun_timeout_sec: 60
+ default_view: 'tree' # or 'graph', 'duration', 'gantt', 'landing_times'
+ orientation: 'LR' # or 'TB', 'RL', 'BT'
+ description: 'this is an example dag!'
+ on_success_callback_name: print_hello
+ on_success_callback_file: /usr/local/airflow/dags/print_hello.py
+ on_failure_callback_name: print_hello
+ on_failure_callback_file: /usr/local/airflow/dags/print_hello.py
+ tasks:
+ task_1:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 1'
+ task_2:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 2'
+ dependencies: [task_1]
+ task_3:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 3'
+ dependencies: [task_1]
+```
+
+Then, in the DAGs folder of your Airflow environment, create a Python file like this:
+
+```python
+from airflow import DAG
+import dagfactory
+
+dag_factory = dagfactory.DagFactory("/path/to/dags/config_file.yml")
+
+dag_factory.clean_dags(globals())
+dag_factory.generate_dags(globals())
+```
+
+And this DAG will be generated and ready to run in Airflow!
+
+If you have several configuration files you can import them like this:
+
+```python
+# The word 'airflow' must appear in this file for the Airflow DagBag to parse it
+from dagfactory import load_yaml_dags
+
+load_yaml_dags(globals_dict=globals(), suffix=['dag.yaml'])
+```
+
+![screenshot](/img/example_dag.png)
+
+## Notes
+
+### HttpSensor (since 0.10.0)
+
+The `airflow.sensors.http_sensor` module works with all supported versions of Airflow. In Airflow 2.0+, the new provider module path can also be used in the operator value: `airflow.providers.http.sensors.http`.
+
+The following example shows `response_check` logic in a python file:
+
+```yaml
+task_2:
+ operator: airflow.sensors.http_sensor.HttpSensor
+ http_conn_id: 'test-http'
+ method: 'GET'
+ response_check_name: check_sensor
+ response_check_file: /path/to/example1/http_conn.py
+ dependencies: [task_1]
+```
+
+The `response_check` logic can also be provided as a lambda:
+
+```yaml
+task_2:
+ operator: airflow.sensors.http_sensor.HttpSensor
+ http_conn_id: 'test-http'
+ method: 'GET'
+  response_check_lambda: 'lambda response: "ok" in response.text'
+ dependencies: [task_1]
+```
+
+## Benefits
+
+* Construct DAGs without knowing Python
+* Construct DAGs without learning Airflow primitives
+* Avoid duplicative code
+* Everyone loves YAML! ;)
+
+## Contributing
+
+Contributions are welcome! Just submit a Pull Request or GitHub Issue.
+
+
+
+
+%package help
+Summary: Development documents and examples for dag-factory
+Provides: python3-dag-factory-doc
+%description help
+
+# dag-factory
+
+[![Github Actions](https://github.com/ajbosco/dag-factory/workflows/build/badge.svg?branch=master&event=push)](https://github.com/ajbosco/dag-factory/actions?workflow=build)
+[![Coverage](https://codecov.io/github/ajbosco/dag-factory/coverage.svg?branch=master)](https://codecov.io/github/ajbosco/dag-factory?branch=master)
+[![PyPi](https://img.shields.io/pypi/v/dag-factory.svg)](https://pypi.org/project/dag-factory/)
+[![Code Style](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
+[![Downloads](https://pepy.tech/badge/dag-factory)](https://pepy.tech/project/dag-factory)
+
+*dag-factory* is a library for dynamically generating [Apache Airflow](https://github.com/apache/incubator-airflow) DAGs from YAML configuration files.
+- [Installation](#installation)
+- [Usage](#usage)
+- [Benefits](#benefits)
+- [Contributing](#contributing)
+
+## Installation
+
+To install *dag-factory* run `pip install dag-factory`. It requires Python 3.6.0+ and Apache Airflow 2.0+.
+
+## Usage
+
+After installing *dag-factory* in your Airflow environment, there are two steps to creating DAGs. First, we need to create a YAML configuration file. For example:
+
+```yaml
+example_dag1:
+ default_args:
+ owner: 'example_owner'
+ start_date: 2018-01-01 # or '2 days'
+ end_date: 2018-01-05
+ retries: 1
+ retry_delay_sec: 300
+ schedule_interval: '0 3 * * *'
+ concurrency: 1
+ max_active_runs: 1
+ dagrun_timeout_sec: 60
+ default_view: 'tree' # or 'graph', 'duration', 'gantt', 'landing_times'
+ orientation: 'LR' # or 'TB', 'RL', 'BT'
+ description: 'this is an example dag!'
+ on_success_callback_name: print_hello
+ on_success_callback_file: /usr/local/airflow/dags/print_hello.py
+ on_failure_callback_name: print_hello
+ on_failure_callback_file: /usr/local/airflow/dags/print_hello.py
+ tasks:
+ task_1:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 1'
+ task_2:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 2'
+ dependencies: [task_1]
+ task_3:
+ operator: airflow.operators.bash_operator.BashOperator
+ bash_command: 'echo 3'
+ dependencies: [task_1]
+```
+
+Then, in the DAGs folder of your Airflow environment, create a Python file like this:
+
+```python
+from airflow import DAG
+import dagfactory
+
+dag_factory = dagfactory.DagFactory("/path/to/dags/config_file.yml")
+
+dag_factory.clean_dags(globals())
+dag_factory.generate_dags(globals())
+```
+
+And this DAG will be generated and ready to run in Airflow!
+
+If you have several configuration files you can import them like this:
+
+```python
+# The word 'airflow' must appear in this file for the Airflow DagBag to parse it
+from dagfactory import load_yaml_dags
+
+load_yaml_dags(globals_dict=globals(), suffix=['dag.yaml'])
+```
+
+![screenshot](/img/example_dag.png)
+
+## Notes
+
+### HttpSensor (since 0.10.0)
+
+The `airflow.sensors.http_sensor` module works with all supported versions of Airflow. In Airflow 2.0+, the new provider module path can also be used in the operator value: `airflow.providers.http.sensors.http`.
+
+The following example shows `response_check` logic in a python file:
+
+```yaml
+task_2:
+ operator: airflow.sensors.http_sensor.HttpSensor
+ http_conn_id: 'test-http'
+ method: 'GET'
+ response_check_name: check_sensor
+ response_check_file: /path/to/example1/http_conn.py
+ dependencies: [task_1]
+```
+
+The `response_check` logic can also be provided as a lambda:
+
+```yaml
+task_2:
+ operator: airflow.sensors.http_sensor.HttpSensor
+ http_conn_id: 'test-http'
+ method: 'GET'
+  response_check_lambda: 'lambda response: "ok" in response.text'
+ dependencies: [task_1]
+```
+
+## Benefits
+
+* Construct DAGs without knowing Python
+* Construct DAGs without learning Airflow primitives
+* Avoid duplicative code
+* Everyone loves YAML! ;)
+
+## Contributing
+
+Contributions are welcome! Just submit a Pull Request or GitHub Issue.
+
+
+
+
+%prep
+%autosetup -n dag-factory-0.17.3
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-dag-factory -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 0.17.3-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..1941f56
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+f52aad858e5d60a1a7b5eed8d3c44914 dag-factory-0.17.3.tar.gz