author     CoprDistGit <infra@openeuler.org>    2023-04-11 11:04:14 +0000
committer  CoprDistGit <infra@openeuler.org>    2023-04-11 11:04:14 +0000
commit     73b1fc1c29358b7db6cf16c53ac079f22635f9a6 (patch)
tree       c18e7628964dcf998b86e1c94d1da0f406f299ad
parent     8b025a6402d319965fd065977a8a8262ff245d0a (diff)
automatic import of python-airflow-dbt
-rw-r--r--  .gitignore                 1
-rw-r--r--  python-airflow-dbt.spec  568
-rw-r--r--  sources                    1
3 files changed, 570 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..399fb0e 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/airflow_dbt-0.4.0.tar.gz
diff --git a/python-airflow-dbt.spec b/python-airflow-dbt.spec
new file mode 100644
index 0000000..1377bfc
--- /dev/null
+++ b/python-airflow-dbt.spec
@@ -0,0 +1,568 @@
+%global _empty_manifest_terminate_build 0
+Name: python-airflow-dbt
+Version: 0.4.0
+Release: 1
+Summary: Apache Airflow integration for dbt
+License: MIT
+URL: https://github.com/gocardless/airflow-dbt
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/60/c3/2519922ca9550170975299ff0c918421dcf382ae60181750b96226a1f098/airflow_dbt-0.4.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-apache-airflow
+
+%description
+# airflow-dbt
+
+This is a collection of [Airflow](https://airflow.apache.org/) operators to provide easy integration with [dbt](https://www.getdbt.com).
+
+```py
+from airflow import DAG
+from airflow_dbt.operators.dbt_operator import (
+ DbtSeedOperator,
+ DbtSnapshotOperator,
+ DbtRunOperator,
+ DbtTestOperator
+)
+from airflow.utils.dates import days_ago
+
+default_args = {
+ 'dir': '/srv/app/dbt',
+ 'start_date': days_ago(0)
+}
+
+with DAG(dag_id='dbt', default_args=default_args, schedule_interval='@daily') as dag:
+
+ dbt_seed = DbtSeedOperator(
+ task_id='dbt_seed',
+ )
+
+ dbt_snapshot = DbtSnapshotOperator(
+ task_id='dbt_snapshot',
+ )
+
+ dbt_run = DbtRunOperator(
+ task_id='dbt_run',
+ )
+
+ dbt_test = DbtTestOperator(
+ task_id='dbt_test',
+ retries=0, # Failing tests would fail the task, and we don't want Airflow to try again
+ )
+
+ dbt_seed >> dbt_snapshot >> dbt_run >> dbt_test
+```
+
+## Installation
+
+Install from PyPI:
+
+```sh
+pip install airflow-dbt
+```
+
+The operators also need access to the `dbt` CLI, which should either be on your `PATH` or be set with the `dbt_bin` argument in each operator.
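+
+For example, if `dbt` lives outside your `PATH`, you can point an operator at the binary explicitly (the path below is purely illustrative):
+
+```py
+dbt_run = DbtRunOperator(
+    task_id='dbt_run',
+    dbt_bin='/opt/dbt-venv/bin/dbt',  # hypothetical path to your dbt install
+)
+```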
+
+## Usage
+
+There are six operators currently implemented:
+
+* `DbtDocsGenerateOperator`
+ * Calls [`dbt docs generate`](https://docs.getdbt.com/reference/commands/cmd-docs)
+* `DbtDepsOperator`
+ * Calls [`dbt deps`](https://docs.getdbt.com/docs/deps)
+* `DbtSeedOperator`
+ * Calls [`dbt seed`](https://docs.getdbt.com/docs/seed)
+* `DbtSnapshotOperator`
+ * Calls [`dbt snapshot`](https://docs.getdbt.com/docs/snapshot)
+* `DbtRunOperator`
+ * Calls [`dbt run`](https://docs.getdbt.com/docs/run)
+* `DbtTestOperator`
+ * Calls [`dbt test`](https://docs.getdbt.com/docs/test)
+
+
+Each of the above operators accepts the following arguments (a combined example follows the list):
+
+* `profiles_dir`
+ * If set, passed as the `--profiles-dir` argument to the `dbt` command
+* `target`
+ * If set, passed as the `--target` argument to the `dbt` command
+* `dir`
+ * The directory to run the `dbt` command in
+* `full_refresh`
+ * If set to `True`, passes `--full-refresh`
+* `vars`
+  * If set, passed as the `--vars` argument to the `dbt` command. Should be set as a Python dictionary, as it will be passed to the `dbt` command as YAML
+* `models`
+ * If set, passed as the `--models` argument to the `dbt` command
+* `exclude`
+ * If set, passed as the `--exclude` argument to the `dbt` command
+* `select`
+ * If set, passed as the `--select` argument to the `dbt` command
+* `dbt_bin`
+  * The path to the `dbt` CLI. Defaults to `dbt`, so it is assumed to be on your `PATH`
+* `verbose`
+  * If set to `True`, the operator will log verbosely to the Airflow logs
+* `warn_error`
+  * If set to `True`, passes the `--warn-error` argument to the `dbt` command, treating warnings as errors
+
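+Here is a sketch combining several of these arguments; the paths and values are illustrative only:
+
+```py
+dbt_run = DbtRunOperator(
+    task_id='dbt_run',
+    profiles_dir='/srv/app/dbt/profiles',  # passed as --profiles-dir
+    target='prod',                         # passed as --target
+    models='staging.*',                    # passed as --models
+    full_refresh=True,                     # passes --full-refresh
+    vars={'run_date': '2023-04-11'},       # passed as --vars, rendered as YAML
+)
+```
+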
+Typically you will want to use the `DbtRunOperator`, followed by the `DbtTestOperator`, as shown earlier.
+
+You can also use the hook directly. This is useful when you need to combine a `dbt` command with another step in the same operator, for example running `dbt docs` and then uploading the generated docs somewhere they can be served from.
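+
+A minimal sketch of direct hook usage follows; the import path and `run_cli` signature are assumptions based on this package's layout, so check the source before relying on them:
+
+```py
+from airflow_dbt.hooks.dbt_hook import DbtCliHook  # assumed module path
+
+def generate_docs():
+    hook = DbtCliHook(dir='/srv/app/dbt')
+    hook.run_cli('docs', 'generate')  # assumed: one CLI token per argument
+    # ...then upload the generated docs (e.g. the target/ directory)
+```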
+
+## Building Locally
+
+To install from the repository, it's recommended to first create a virtual environment:
+```bash
+python3 -m venv .venv
+
+source .venv/bin/activate
+```
+
+Install using `pip`:
+```bash
+pip install .
+```
+
+## Testing
+
+To run tests locally, first create a virtual environment (see the [Building Locally](https://github.com/gocardless/airflow-dbt#building-locally) section).
+
+Install dependencies:
+```bash
+pip install . pytest
+```
+
+Run the tests:
+```bash
+pytest tests/
+```
+
+## Code style
+
+This project uses [flake8](https://flake8.pycqa.org/en/latest/).
+
+To check your code, first create a virtual environment (see the [Building Locally](https://github.com/gocardless/airflow-dbt#building-locally) section), then:
+```bash
+pip install flake8
+flake8 airflow_dbt/ tests/ setup.py
+```
+
+## Package management
+
+If you use dbt's package manager, you should include all dependencies before deploying your dbt project.
+
+For Docker users, packages specified in `packages.yml` should be included as part of your Docker image by calling `dbt deps` in your `Dockerfile`.
+
+## Amazon Managed Workflows for Apache Airflow (MWAA)
+
+If you use MWAA, you just need to update the `requirements.txt` file and add `airflow-dbt` and `dbt` to it.
+
+Then place your dbt code in a folder `{DBT_FOLDER}` inside the dags folder on S3 and configure the dbt task as below:
+
+```python
+dbt_run = DbtRunOperator(
+ task_id='dbt_run',
+ dbt_bin='/usr/local/airflow/.local/bin/dbt',
+ profiles_dir='/usr/local/airflow/dags/{DBT_FOLDER}/',
+ dir='/usr/local/airflow/dags/{DBT_FOLDER}/'
+)
+```
+
+## License & Contributing
+
+* This is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
+* Bug reports and pull requests are welcome on GitHub at https://github.com/gocardless/airflow-dbt.
+
+GoCardless ♥ open source. If you do too, come [join us](https://gocardless.com/about/jobs).
+
+
+
+
+%package -n python3-airflow-dbt
+Summary: Apache Airflow integration for dbt
+Provides: python-airflow-dbt
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-airflow-dbt
+# airflow-dbt
+
+This is a collection of [Airflow](https://airflow.apache.org/) operators to provide easy integration with [dbt](https://www.getdbt.com).
+
+```py
+from airflow import DAG
+from airflow_dbt.operators.dbt_operator import (
+ DbtSeedOperator,
+ DbtSnapshotOperator,
+ DbtRunOperator,
+ DbtTestOperator
+)
+from airflow.utils.dates import days_ago
+
+default_args = {
+ 'dir': '/srv/app/dbt',
+ 'start_date': days_ago(0)
+}
+
+with DAG(dag_id='dbt', default_args=default_args, schedule_interval='@daily') as dag:
+
+ dbt_seed = DbtSeedOperator(
+ task_id='dbt_seed',
+ )
+
+ dbt_snapshot = DbtSnapshotOperator(
+ task_id='dbt_snapshot',
+ )
+
+ dbt_run = DbtRunOperator(
+ task_id='dbt_run',
+ )
+
+ dbt_test = DbtTestOperator(
+ task_id='dbt_test',
+ retries=0, # Failing tests would fail the task, and we don't want Airflow to try again
+ )
+
+ dbt_seed >> dbt_snapshot >> dbt_run >> dbt_test
+```
+
+## Installation
+
+Install from PyPI:
+
+```sh
+pip install airflow-dbt
+```
+
+The operators also need access to the `dbt` CLI, which should either be on your `PATH` or be set with the `dbt_bin` argument in each operator.
+
+## Usage
+
+There are six operators currently implemented:
+
+* `DbtDocsGenerateOperator`
+ * Calls [`dbt docs generate`](https://docs.getdbt.com/reference/commands/cmd-docs)
+* `DbtDepsOperator`
+ * Calls [`dbt deps`](https://docs.getdbt.com/docs/deps)
+* `DbtSeedOperator`
+ * Calls [`dbt seed`](https://docs.getdbt.com/docs/seed)
+* `DbtSnapshotOperator`
+ * Calls [`dbt snapshot`](https://docs.getdbt.com/docs/snapshot)
+* `DbtRunOperator`
+ * Calls [`dbt run`](https://docs.getdbt.com/docs/run)
+* `DbtTestOperator`
+ * Calls [`dbt test`](https://docs.getdbt.com/docs/test)
+
+
+Each of the above operators accepts the following arguments:
+
+* `profiles_dir`
+ * If set, passed as the `--profiles-dir` argument to the `dbt` command
+* `target`
+ * If set, passed as the `--target` argument to the `dbt` command
+* `dir`
+ * The directory to run the `dbt` command in
+* `full_refresh`
+ * If set to `True`, passes `--full-refresh`
+* `vars`
+  * If set, passed as the `--vars` argument to the `dbt` command. Should be set as a Python dictionary, as it will be passed to the `dbt` command as YAML
+* `models`
+ * If set, passed as the `--models` argument to the `dbt` command
+* `exclude`
+ * If set, passed as the `--exclude` argument to the `dbt` command
+* `select`
+ * If set, passed as the `--select` argument to the `dbt` command
+* `dbt_bin`
+  * The path to the `dbt` CLI. Defaults to `dbt`, so it is assumed to be on your `PATH`
+* `verbose`
+  * If set to `True`, the operator will log verbosely to the Airflow logs
+* `warn_error`
+  * If set to `True`, passes the `--warn-error` argument to the `dbt` command, treating warnings as errors
+
+Typically you will want to use the `DbtRunOperator`, followed by the `DbtTestOperator`, as shown earlier.
+
+You can also use the hook directly. This is useful when you need to combine a `dbt` command with another step in the same operator, for example running `dbt docs` and then uploading the generated docs somewhere they can be served from.
+
+## Building Locally
+
+To install from the repository, it's recommended to first create a virtual environment:
+```bash
+python3 -m venv .venv
+
+source .venv/bin/activate
+```
+
+Install using `pip`:
+```bash
+pip install .
+```
+
+## Testing
+
+To run tests locally, first create a virtual environment (see the [Building Locally](https://github.com/gocardless/airflow-dbt#building-locally) section).
+
+Install dependencies:
+```bash
+pip install . pytest
+```
+
+Run the tests:
+```bash
+pytest tests/
+```
+
+## Code style
+
+This project uses [flake8](https://flake8.pycqa.org/en/latest/).
+
+To check your code, first create a virtual environment (see the [Building Locally](https://github.com/gocardless/airflow-dbt#building-locally) section), then:
+```bash
+pip install flake8
+flake8 airflow_dbt/ tests/ setup.py
+```
+
+## Package management
+
+If you use dbt's package manager, you should include all dependencies before deploying your dbt project.
+
+For Docker users, packages specified in `packages.yml` should be included as part of your Docker image by calling `dbt deps` in your `Dockerfile`.
+
+## Amazon Managed Workflows for Apache Airflow (MWAA)
+
+If you use MWAA, you just need to update the `requirements.txt` file and add `airflow-dbt` and `dbt` to it.
+
+Then place your dbt code in a folder `{DBT_FOLDER}` inside the dags folder on S3 and configure the dbt task as below:
+
+```python
+dbt_run = DbtRunOperator(
+ task_id='dbt_run',
+ dbt_bin='/usr/local/airflow/.local/bin/dbt',
+ profiles_dir='/usr/local/airflow/dags/{DBT_FOLDER}/',
+ dir='/usr/local/airflow/dags/{DBT_FOLDER}/'
+)
+```
+
+## License & Contributing
+
+* This is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
+* Bug reports and pull requests are welcome on GitHub at https://github.com/gocardless/airflow-dbt.
+
+GoCardless ♥ open source. If you do too, come [join us](https://gocardless.com/about/jobs).
+
+
+
+
+%package help
+Summary: Development documents and examples for airflow-dbt
+Provides: python3-airflow-dbt-doc
+%description help
+# airflow-dbt
+
+This is a collection of [Airflow](https://airflow.apache.org/) operators to provide easy integration with [dbt](https://www.getdbt.com).
+
+```py
+from airflow import DAG
+from airflow_dbt.operators.dbt_operator import (
+ DbtSeedOperator,
+ DbtSnapshotOperator,
+ DbtRunOperator,
+ DbtTestOperator
+)
+from airflow.utils.dates import days_ago
+
+default_args = {
+ 'dir': '/srv/app/dbt',
+ 'start_date': days_ago(0)
+}
+
+with DAG(dag_id='dbt', default_args=default_args, schedule_interval='@daily') as dag:
+
+ dbt_seed = DbtSeedOperator(
+ task_id='dbt_seed',
+ )
+
+ dbt_snapshot = DbtSnapshotOperator(
+ task_id='dbt_snapshot',
+ )
+
+ dbt_run = DbtRunOperator(
+ task_id='dbt_run',
+ )
+
+ dbt_test = DbtTestOperator(
+ task_id='dbt_test',
+ retries=0, # Failing tests would fail the task, and we don't want Airflow to try again
+ )
+
+ dbt_seed >> dbt_snapshot >> dbt_run >> dbt_test
+```
+
+## Installation
+
+Install from PyPI:
+
+```sh
+pip install airflow-dbt
+```
+
+The operators also need access to the `dbt` CLI, which should either be on your `PATH` or be set with the `dbt_bin` argument in each operator.
+
+## Usage
+
+There are six operators currently implemented:
+
+* `DbtDocsGenerateOperator`
+ * Calls [`dbt docs generate`](https://docs.getdbt.com/reference/commands/cmd-docs)
+* `DbtDepsOperator`
+ * Calls [`dbt deps`](https://docs.getdbt.com/docs/deps)
+* `DbtSeedOperator`
+ * Calls [`dbt seed`](https://docs.getdbt.com/docs/seed)
+* `DbtSnapshotOperator`
+ * Calls [`dbt snapshot`](https://docs.getdbt.com/docs/snapshot)
+* `DbtRunOperator`
+ * Calls [`dbt run`](https://docs.getdbt.com/docs/run)
+* `DbtTestOperator`
+ * Calls [`dbt test`](https://docs.getdbt.com/docs/test)
+
+
+Each of the above operators accepts the following arguments:
+
+* `profiles_dir`
+ * If set, passed as the `--profiles-dir` argument to the `dbt` command
+* `target`
+ * If set, passed as the `--target` argument to the `dbt` command
+* `dir`
+ * The directory to run the `dbt` command in
+* `full_refresh`
+ * If set to `True`, passes `--full-refresh`
+* `vars`
+  * If set, passed as the `--vars` argument to the `dbt` command. Should be set as a Python dictionary, as it will be passed to the `dbt` command as YAML
+* `models`
+ * If set, passed as the `--models` argument to the `dbt` command
+* `exclude`
+ * If set, passed as the `--exclude` argument to the `dbt` command
+* `select`
+ * If set, passed as the `--select` argument to the `dbt` command
+* `dbt_bin`
+  * The path to the `dbt` CLI. Defaults to `dbt`, so it is assumed to be on your `PATH`
+* `verbose`
+  * If set to `True`, the operator will log verbosely to the Airflow logs
+* `warn_error`
+  * If set to `True`, passes the `--warn-error` argument to the `dbt` command, treating warnings as errors
+
+Typically you will want to use the `DbtRunOperator`, followed by the `DbtTestOperator`, as shown earlier.
+
+You can also use the hook directly. This is useful when you need to combine a `dbt` command with another step in the same operator, for example running `dbt docs` and then uploading the generated docs somewhere they can be served from.
+
+## Building Locally
+
+To install from the repository, it's recommended to first create a virtual environment:
+```bash
+python3 -m venv .venv
+
+source .venv/bin/activate
+```
+
+Install using `pip`:
+```bash
+pip install .
+```
+
+## Testing
+
+To run tests locally, first create a virtual environment (see the [Building Locally](https://github.com/gocardless/airflow-dbt#building-locally) section).
+
+Install dependencies:
+```bash
+pip install . pytest
+```
+
+Run the tests:
+```bash
+pytest tests/
+```
+
+## Code style
+
+This project uses [flake8](https://flake8.pycqa.org/en/latest/).
+
+To check your code, first create a virtual environment (see the [Building Locally](https://github.com/gocardless/airflow-dbt#building-locally) section), then:
+```bash
+pip install flake8
+flake8 airflow_dbt/ tests/ setup.py
+```
+
+## Package management
+
+If you use dbt's package manager, you should include all dependencies before deploying your dbt project.
+
+For Docker users, packages specified in `packages.yml` should be included as part of your Docker image by calling `dbt deps` in your `Dockerfile`.
+
+## Amazon Managed Workflows for Apache Airflow (MWAA)
+
+If you use MWAA, you just need to update the `requirements.txt` file and add `airflow-dbt` and `dbt` to it.
+
+Then place your dbt code in a folder `{DBT_FOLDER}` inside the dags folder on S3 and configure the dbt task as below:
+
+```python
+dbt_run = DbtRunOperator(
+ task_id='dbt_run',
+ dbt_bin='/usr/local/airflow/.local/bin/dbt',
+ profiles_dir='/usr/local/airflow/dags/{DBT_FOLDER}/',
+ dir='/usr/local/airflow/dags/{DBT_FOLDER}/'
+)
+```
+
+## License & Contributing
+
+* This is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
+* Bug reports and pull requests are welcome on GitHub at https://github.com/gocardless/airflow-dbt.
+
+GoCardless ♥ open source. If you do too, come [join us](https://gocardless.com/about/jobs).
+
+
+
+
+%prep
+%autosetup -n airflow-dbt-0.4.0
+
+%build
+%py3_build
+
+%install
+%py3_install
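+# Install any bundled documentation (doc/, docs/, example/, examples/) into the package docdir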
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
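+# Build file lists for the %files sections by walking the buildroot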
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-airflow-dbt -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Tue Apr 11 2023 Python_Bot <Python_Bot@openeuler.org> - 0.4.0-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..c13e402
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+7d93c0f549884125ba8c2762ff883cff airflow_dbt-0.4.0.tar.gz