Diffstat (limited to 'python-airflow-provider-fivetran.spec')
-rw-r--r--  python-airflow-provider-fivetran.spec  278
1 file changed, 278 insertions, 0 deletions
diff --git a/python-airflow-provider-fivetran.spec b/python-airflow-provider-fivetran.spec
new file mode 100644
index 0000000..343feda
--- /dev/null
+++ b/python-airflow-provider-fivetran.spec
@@ -0,0 +1,278 @@
+%global _empty_manifest_terminate_build 0
+Name: python-airflow-provider-fivetran
+Version: 1.1.4
+Release: 1
+Summary: A Fivetran provider for Apache Airflow
+License: Apache License 2.0
+URL: https://github.com/fivetran/airflow-provider-fivetran
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/00/f4/187ef2c73cb5778e3db716750557db5b5fd7c4d1d7153d38ecaa65609796/airflow-provider-fivetran-1.1.4.tar.gz
+BuildArch: noarch
+
+Requires: python3-requests
+Requires: python3-apache-airflow
+
+%description
+# Fivetran Provider for Apache Airflow
+
+This package provides an operator, a sensor, and a hook that integrate [Fivetran](https://fivetran.com) into Apache Airflow. `FivetranOperator` allows you to start Fivetran sync jobs from Airflow, and `FivetranSensor` allows you to monitor a Fivetran sync job for completion before running downstream processes.
+
+[Fivetran automates your data pipeline, and Airflow automates your data processing.](https://www.youtube.com/watch?v=siSx6L2ckSw&ab_channel=Fivetran)
+
+## Installation
+
+Prerequisites: An environment running `apache-airflow`.
+
+```
+pip install airflow-provider-fivetran
+```
+
+## Configuration
+
+In the Airflow user interface, configure a Connection for Fivetran. Most of the Connection config fields can be left blank. Configure the following fields:
+
+* `Conn Id`: `fivetran_default`
+* `Conn Type`: `Fivetran`
+* `Fivetran API Key`: Your Fivetran API Key
+* `Fivetran API Secret`: Your Fivetran API Secret
+
+Find the Fivetran API Key and Secret in the [Fivetran Account Settings](https://fivetran.com/account/settings), under the **API Config** section. See our documentation for more information on [Fivetran API Authentication](https://fivetran.com/docs/rest-api/getting-started#authentication).
+
+The sensor and operator assume the `Conn Id` is set to `fivetran_default`; however, if you are managing multiple Fivetran accounts, you can set it to anything you like. See the example DAG for how to specify a custom `Conn Id`.
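+
+If you would rather not configure the connection through the UI, Airflow can also read connections from `AIRFLOW_CONN_*` environment variables. The sketch below is illustrative only: it assumes the provider's hook reads the API key and secret from the connection's `login` and `password` fields, so verify that mapping for your provider version before relying on it.
+
+```
+from airflow.models.connection import Connection
+
+# Placeholder key/secret values; the login/password mapping is an assumption about the hook.
+conn = Connection(
+    conn_id="fivetran_default",
+    conn_type="fivetran",
+    login="<FIVETRAN_API_KEY>",
+    password="<FIVETRAN_API_SECRET>",
+)
+
+# Airflow resolves connections from AIRFLOW_CONN_<CONN_ID> environment variables;
+# print the URI and export it as AIRFLOW_CONN_FIVETRAN_DEFAULT in your environment.
+print(conn.get_uri())
+```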
+
+## Modules
+
+### [Fivetran Operator](https://github.com/fivetran/airflow-provider-fivetran/blob/main/fivetran_provider/operators/fivetran.py)
+
+`FivetranOperator` starts a Fivetran sync job. Note that when a Fivetran sync job is controlled via the operator, it no longer runs on the schedule managed by Fivetran; in other words, it is scheduled only from Airflow.
+
+`FivetranOperator` requires that you specify the `connector_id` of the sync job to start. You can find `connector_id` in the Settings page of the connector you configured in the [Fivetran dashboard](https://fivetran.com/dashboard/connectors).
+
+Import into your DAG via:
+```
+from fivetran_provider.operators.fivetran import FivetranOperator
+```
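+
+A minimal DAG sketch using the operator (the `fivetran_conn_id` argument name and the placeholder connector id are assumptions for illustration; only `connector_id` is documented above):
+
+```
+from datetime import datetime
+
+from airflow import DAG
+from fivetran_provider.operators.fivetran import FivetranOperator
+
+with DAG(
+    dag_id="fivetran_sync_example",
+    start_date=datetime(2021, 1, 1),
+    schedule_interval="@daily",
+    catchup=False,
+) as dag:
+    trigger_sync = FivetranOperator(
+        task_id="trigger_fivetran_sync",
+        connector_id="<your_connector_id>",   # from the connector's Settings page
+        fivetran_conn_id="fivetran_default",  # assumed argument name for a custom Conn Id
+    )
+```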
+
+### [Fivetran Sensor](https://github.com/fivetran/airflow-provider-fivetran/blob/main/fivetran_provider/sensors/fivetran.py)
+
+`FivetranSensor` monitors a Fivetran sync job for completion. Monitoring with `FivetranSensor` allows you to trigger downstream processes only when the Fivetran sync jobs have completed, ensuring data consistency. You can use multiple instances of `FivetranSensor` to monitor multiple Fivetran connectors.
+
+Note that it is possible to monitor a sync that is scheduled and managed from Fivetran; in other words, you can use `FivetranSensor` without using `FivetranOperator`. If used in this way, your DAG will wait until the sync job starts on its Fivetran-controlled schedule and then completes.
+
+`FivetranSensor` requires that you specify the `connector_id` of the sync job to monitor. You can find `connector_id` in the Settings page of the connector you configured in the [Fivetran dashboard](https://fivetran.com/dashboard/connectors).
+
+Import into your DAG via:
+```
+from fivetran_provider.sensors.fivetran import FivetranSensor
+```
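+
+Continuing the operator sketch above, a hedged example of gating downstream work on the sync (`poke_interval` is the standard Airflow sensor argument; task and variable names are illustrative):
+
+```
+from fivetran_provider.sensors.fivetran import FivetranSensor
+
+# Inside the same DAG as trigger_sync above:
+wait_for_sync = FivetranSensor(
+    task_id="wait_for_fivetran_sync",
+    connector_id="<your_connector_id>",  # the connector to monitor
+    poke_interval=60,                    # seconds between status checks
+)
+
+# Downstream tasks run only once the Fivetran sync has completed.
+trigger_sync >> wait_for_sync
+```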
+
+## Examples
+
+See the [**examples**](https://github.com/fivetran/airflow-provider-fivetran/tree/main/fivetran_provider/example_dags) directory for an example DAG.
+
+## Issues
+
+Please submit [issues](https://github.com/fivetran/airflow-provider-fivetran/issues) and [pull requests](https://github.com/fivetran/airflow-provider-fivetran/pulls) in our official repo:
+[https://github.com/fivetran/airflow-provider-fivetran](https://github.com/fivetran/airflow-provider-fivetran)
+
+We are happy to hear from you. Please email any feedback to the authors at [devrel@fivetran.com](mailto:devrel@fivetran.com).
+
+
+## Acknowledgements
+
+Special thanks to [Pete DeJoy](https://github.com/petedejoy), [Plinio Guzman](https://github.com/pgzmnk), and [David Koenitzer](https://github.com/sunkickr) of [Astronomer.io](https://www.astronomer.io/) for their contributions and support in getting this provider off the ground.
+
+
+%package -n python3-airflow-provider-fivetran
+Summary: A Fivetran provider for Apache Airflow
+Provides: python-airflow-provider-fivetran
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-airflow-provider-fivetran
+# Fivetran Provider for Apache Airflow
+
+This package provides an operator, a sensor, and a hook that integrate [Fivetran](https://fivetran.com) into Apache Airflow. `FivetranOperator` allows you to start Fivetran sync jobs from Airflow, and `FivetranSensor` allows you to monitor a Fivetran sync job for completion before running downstream processes.
+
+[Fivetran automates your data pipeline, and Airflow automates your data processing.](https://www.youtube.com/watch?v=siSx6L2ckSw&ab_channel=Fivetran)
+
+## Installation
+
+Prerequisites: An environment running `apache-airflow`.
+
+```
+pip install airflow-provider-fivetran
+```
+
+## Configuration
+
+In the Airflow user interface, configure a Connection for Fivetran. Most of the Connection config fields can be left blank. Configure the following fields:
+
+* `Conn Id`: `fivetran_default`
+* `Conn Type`: `Fivetran`
+* `Fivetran API Key`: Your Fivetran API Key
+* `Fivetran API Secret`: Your Fivetran API Secret
+
+Find the Fivetran API Key and Secret in the [Fivetran Account Settings](https://fivetran.com/account/settings), under the **API Config** section. See our documentation for more information on [Fivetran API Authentication](https://fivetran.com/docs/rest-api/getting-started#authentication).
+
+The sensor and operator assume the `Conn Id` is set to `fivetran_default`; however, if you are managing multiple Fivetran accounts, you can set it to anything you like. See the example DAG for how to specify a custom `Conn Id`.
+
+## Modules
+
+### [Fivetran Operator](https://github.com/fivetran/airflow-provider-fivetran/blob/main/fivetran_provider/operators/fivetran.py)
+
+`FivetranOperator` starts a Fivetran sync job. Note that when a Fivetran sync job is controlled via the operator, it no longer runs on the schedule managed by Fivetran; in other words, it is scheduled only from Airflow.
+
+`FivetranOperator` requires that you specify the `connector_id` of the sync job to start. You can find `connector_id` in the Settings page of the connector you configured in the [Fivetran dashboard](https://fivetran.com/dashboard/connectors).
+
+Import into your DAG via:
+```
+from fivetran_provider.operators.fivetran import FivetranOperator
+```
+
+### [Fivetran Sensor](https://github.com/fivetran/airflow-provider-fivetran/blob/main/fivetran_provider/sensors/fivetran.py)
+
+`FivetranSensor` monitors a Fivetran sync job for completion. Monitoring with `FivetranSensor` allows you to trigger downstream processes only when the Fivetran sync jobs have completed, ensuring data consistency. You can use multiple instances of `FivetranSensor` to monitor multiple Fivetran connectors.
+
+Note that it is possible to monitor a sync that is scheduled and managed from Fivetran; in other words, you can use `FivetranSensor` without using `FivetranOperator`. If used in this way, your DAG will wait until the sync job starts on its Fivetran-controlled schedule and then completes.
+
+`FivetranSensor` requires that you specify the `connector_id` of the sync job to monitor. You can find `connector_id` in the Settings page of the connector you configured in the [Fivetran dashboard](https://fivetran.com/dashboard/connectors).
+
+Import into your DAG via:
+```
+from fivetran_provider.sensors.fivetran import FivetranSensor
+```
+
+## Examples
+
+See the [**examples**](https://github.com/fivetran/airflow-provider-fivetran/tree/main/fivetran_provider/example_dags) directory for an example DAG.
+
+## Issues
+
+Please submit [issues](https://github.com/fivetran/airflow-provider-fivetran/issues) and [pull requests](https://github.com/fivetran/airflow-provider-fivetran/pulls) in our official repo:
+[https://github.com/fivetran/airflow-provider-fivetran](https://github.com/fivetran/airflow-provider-fivetran)
+
+We are happy to hear from you. Please email any feedback to the authors at [devrel@fivetran.com](mailto:devrel@fivetran.com).
+
+
+## Acknowledgements
+
+Special thanks to [Pete DeJoy](https://github.com/petedejoy), [Plinio Guzman](https://github.com/pgzmnk), and [David Koenitzer](https://github.com/sunkickr) of [Astronomer.io](https://www.astronomer.io/) for their contributions and support in getting this provider off the ground.
+
+
+%package help
+Summary: Development documents and examples for airflow-provider-fivetran
+Provides: python3-airflow-provider-fivetran-doc
+%description help
+# Fivetran Provider for Apache Airflow
+
+This package provides an operator, a sensor, and a hook that integrate [Fivetran](https://fivetran.com) into Apache Airflow. `FivetranOperator` allows you to start Fivetran sync jobs from Airflow, and `FivetranSensor` allows you to monitor a Fivetran sync job for completion before running downstream processes.
+
+[Fivetran automates your data pipeline, and Airflow automates your data processing.](https://www.youtube.com/watch?v=siSx6L2ckSw&ab_channel=Fivetran)
+
+## Installation
+
+Prerequisites: An environment running `apache-airflow`.
+
+```
+pip install airflow-provider-fivetran
+```
+
+## Configuration
+
+In the Airflow user interface, configure a Connection for Fivetran. Most of the Connection config fields can be left blank. Configure the following fields:
+
+* `Conn Id`: `fivetran_default`
+* `Conn Type`: `Fivetran`
+* `Fivetran API Key`: Your Fivetran API Key
+* `Fivetran API Secret`: Your Fivetran API Secret
+
+Find the Fivetran API Key and Secret in the [Fivetran Account Settings](https://fivetran.com/account/settings), under the **API Config** section. See our documentation for more information on [Fivetran API Authentication](https://fivetran.com/docs/rest-api/getting-started#authentication).
+
+The sensor and operator assume the `Conn Id` is set to `fivetran_default`; however, if you are managing multiple Fivetran accounts, you can set it to anything you like. See the example DAG for how to specify a custom `Conn Id`.
+
+## Modules
+
+### [Fivetran Operator](https://github.com/fivetran/airflow-provider-fivetran/blob/main/fivetran_provider/operators/fivetran.py)
+
+`FivetranOperator` starts a Fivetran sync job. Note that when a Fivetran sync job is controlled via the operator, it no longer runs on the schedule managed by Fivetran; in other words, it is scheduled only from Airflow.
+
+`FivetranOperator` requires that you specify the `connector_id` of the sync job to start. You can find `connector_id` in the Settings page of the connector you configured in the [Fivetran dashboard](https://fivetran.com/dashboard/connectors).
+
+Import into your DAG via:
+```
+from fivetran_provider.operators.fivetran import FivetranOperator
+```
+
+### [Fivetran Sensor](https://github.com/fivetran/airflow-provider-fivetran/blob/main/fivetran_provider/sensors/fivetran.py)
+
+`FivetranSensor` monitors a Fivetran sync job for completion. Monitoring with `FivetranSensor` allows you to trigger downstream processes only when the Fivetran sync jobs have completed, ensuring data consistency. You can use multiple instances of `FivetranSensor` to monitor multiple Fivetran connectors.
+
+Note that it is possible to monitor a sync that is scheduled and managed from Fivetran; in other words, you can use `FivetranSensor` without using `FivetranOperator`. If used in this way, your DAG will wait until the sync job starts on its Fivetran-controlled schedule and then completes.
+
+`FivetranSensor` requires that you specify the `connector_id` of the sync job to monitor. You can find `connector_id` in the Settings page of the connector you configured in the [Fivetran dashboard](https://fivetran.com/dashboard/connectors).
+
+Import into your DAG via:
+```
+from fivetran_provider.sensors.fivetran import FivetranSensor
+```
+
+## Examples
+
+See the [**examples**](https://github.com/fivetran/airflow-provider-fivetran/tree/main/fivetran_provider/example_dags) directory for an example DAG.
+
+## Issues
+
+Please submit [issues](https://github.com/fivetran/airflow-provider-fivetran/issues) and [pull requests](https://github.com/fivetran/airflow-provider-fivetran/pulls) in our official repo:
+[https://github.com/fivetran/airflow-provider-fivetran](https://github.com/fivetran/airflow-provider-fivetran)
+
+We are happy to hear from you. Please email any feedback to the authors at [devrel@fivetran.com](mailto:devrel@fivetran.com).
+
+
+## Acknowledgements
+
+Special thanks to [Pete DeJoy](https://github.com/petedejoy), [Plinio Guzman](https://github.com/pgzmnk), and [David Koenitzer](https://github.com/sunkickr) of [Astronomer.io](https://www.astronomer.io/) for their contributions and support in getting this provider off the ground.
+
+
+%prep
+%autosetup -n airflow-provider-fivetran-1.1.4
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-airflow-provider-fivetran -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Wed Apr 12 2023 Python_Bot <Python_Bot@openeuler.org> - 1.1.4-1
+- Package Spec generated