author     CoprDistGit <infra@openeuler.org>  2023-05-15 08:03:27 +0000
committer  CoprDistGit <infra@openeuler.org>  2023-05-15 08:03:27 +0000
commit     69747bf4080d00debf8ebea06ce85771410e8830 (patch)
tree       dd871cb04a95a0f833e201cf6b8ba383c39c7c57
parent     d9e1b761be7611ba7b2eaf3a0a19396e48a38f39 (diff)
automatic import of python-bigeye-airflow
-rw-r--r--  .gitignore                    1
-rw-r--r--  python-bigeye-airflow.spec  243
-rw-r--r--  sources                       1
3 files changed, 245 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..f85b412 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/bigeye_airflow-0.1.20.tar.gz
diff --git a/python-bigeye-airflow.spec b/python-bigeye-airflow.spec
new file mode 100644
index 0000000..0365b7a
--- /dev/null
+++ b/python-bigeye-airflow.spec
@@ -0,0 +1,243 @@
+%global _empty_manifest_terminate_build 0
+Name: python-bigeye-airflow
+Version: 0.1.20
+Release: 1
+Summary: Bigeye Airflow Library supports Airflow 2.4.3 and offers custom operators for interacting with your Bigeye workspace.
+License: Proprietary
+URL: https://docs.bigeye.com/docs
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/ae/6d/6da37db5f2c2350edc4b2b8e8597f6f1d0c97b2b8bc94c3cc57b79780ba6/bigeye_airflow-0.1.20.tar.gz
+BuildArch: noarch
+
+Requires: python3-Flask-OpenID
+Requires: python3-apache-airflow
+Requires: python3-bigeye-sdk
+
+%description
+# Bigeye Airflow Operators for Airflow Versions 2.x
+
+## Operators
+### Create Metric Operator (bigeye_airflow.operators.create_metric_operator)
+
+The CreateMetricOperator creates metrics from a list of metric configurations provided to the operator.
+The operator fills in reasonable defaults, such as thresholds. It authenticates through an Airflow
+connection ID and can optionally run the metrics once they have been created. Please review the link
+below to understand the structure of the configurations.
+
+[Create or Update Metric Swagger](https://docs.bigeye.com/reference/createmetric)
+
+#### Parameters
+1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
+2. warehouse_id: int - The Bigeye source/warehouse id to which the metric configurations will be deployed.
+3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
+ ```
+ schema_name: str
+ table_name: str
+ column_name: str
+ metric_template_id: uuid.UUID
+ metric_name: str
+ description: str
+ notifications: List[str]
+ thresholds: List[dict]
+ filters: List[str]
+ group_by: List[str]
+ user_defined_metric_name: str
+ metric_type: SimpleMetricCategory
+ default_check_frequency_hours: int
+ update_schedule: str
+ delay_at_update: str
+ timezone: str
+ should_backfill: bool
+ lookback_type: str
+ lookback_days: int
+ window_size: str
+ _window_size_seconds
+ ```
+4. run_after_upsert: bool - If true, the metrics are run after they are created. Defaults to False (see the usage sketch after this list).
+
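+Below is a minimal usage sketch. The operator and parameter names follow the documentation above; the
+import path, DAG setup, connection name, and configuration values are illustrative assumptions rather
+than verified usage.
+
+```
+from datetime import datetime
+
+from airflow import DAG
+from bigeye_airflow.operators.create_metric_operator import CreateMetricOperator
+
+with DAG(
+    dag_id="bigeye_create_metrics",
+    start_date=datetime(2023, 1, 1),
+    schedule_interval=None,
+    catchup=False,
+) as dag:
+    create_metrics = CreateMetricOperator(
+        task_id="create_metrics",
+        connection_id="bigeye_conn",   # assumed Airflow connection holding the Bigeye credential
+        warehouse_id=123,              # Bigeye source/warehouse the configurations are deployed to
+        configuration=[
+            {
+                "schema_name": "analytics",
+                "table_name": "orders",
+                "column_name": "order_total",
+                "metric_name": "order_total_null_rate",
+                "notifications": ["data-team@example.com"],
+                "thresholds": [],      # left empty so the operator fills in default thresholds
+                "default_check_frequency_hours": 24,
+            }
+        ],
+        run_after_upsert=True,         # run the metrics immediately after they are created
+    )
+```
+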
+### Run Metrics Operator
+
+The RunMetricsOperator runs metrics in Bigeye based on either of the following:
+
+1. All metrics for a given table, identified by warehouse ID, schema name, and table name.
+2. An explicit set of metrics, identified by a list of metric IDs.
+
+Currently, if a list of metric IDs is provided, those metrics are run instead of the metrics selected
+by warehouse_id, schema_name, and table_name. A usage sketch follows the parameter list below.
+
+#### Parameters
+1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
+2. warehouse_id: int - The Bigeye source/warehouse id for which metrics will be run.
+3. schema_name: str - The schema name for which metrics will be run.
+4. table_name: str - The table name for which metrics will be run.
+5. metric_ids: List[int] - The metric ids to run.
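+
+Below is a minimal usage sketch. The operator and parameter names follow the documentation above; the
+import path, DAG setup, connection name, and IDs are illustrative assumptions rather than verified usage.
+
+```
+from datetime import datetime
+
+from airflow import DAG
+from bigeye_airflow.operators.run_metrics_operator import RunMetricsOperator
+
+with DAG(
+    dag_id="bigeye_run_metrics",
+    start_date=datetime(2023, 1, 1),
+    schedule_interval="@daily",
+    catchup=False,
+) as dag:
+    # Run every metric defined on a single table.
+    run_table_metrics = RunMetricsOperator(
+        task_id="run_table_metrics",
+        connection_id="bigeye_conn",   # assumed Airflow connection holding the Bigeye credential
+        warehouse_id=123,
+        schema_name="analytics",
+        table_name="orders",
+    )
+
+    # Or run an explicit list of metrics; per the note above, metric_ids takes
+    # precedence over the table selection when both are supplied.
+    run_selected_metrics = RunMetricsOperator(
+        task_id="run_selected_metrics",
+        connection_id="bigeye_conn",
+        warehouse_id=123,
+        metric_ids=[1001, 1002],
+    )
+```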
+
+%package -n python3-bigeye-airflow
+Summary: Bigeye Airflow Library supports Airflow 2.4.3 and offers custom operators for interacting with your Bigeye workspace.
+Provides: python-bigeye-airflow
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-bigeye-airflow
+# Bigeye Airflow Operators for Airflow Versions 2.x
+
+## Operators
+### Create Metric Operator (bigeye_airflow.operators.create_metric_operator)
+
+The CreateMetricOperator creates metrics from a list of metric configurations provided to the operator.
+The operator fills in reasonable defaults, such as thresholds. It authenticates through an Airflow
+connection ID and can optionally run the metrics once they have been created. Please review the link
+below to understand the structure of the configurations.
+
+[Create or Update Metric Swagger](https://docs.bigeye.com/reference/createmetric)
+
+#### Parameters
+1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
+2. warehouse_id: int - The Bigeye source/warehouse id to which the metric configurations will be deployed.
+3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
+ ```
+ schema_name: str
+ table_name: str
+ column_name: str
+ metric_template_id: uuid.UUID
+ metric_name: str
+ description: str
+ notifications: List[str]
+ thresholds: List[dict]
+ filters: List[str]
+ group_by: List[str]
+ user_defined_metric_name: str
+ metric_type: SimpleMetricCategory
+ default_check_frequency_hours: int
+ update_schedule: str
+ delay_at_update: str
+ timezone: str
+ should_backfill: bool
+ lookback_type: str
+ lookback_days: int
+ window_size: str
+ _window_size_seconds
+ ```
+4. run_after_upsert: bool - If true, the metrics are run after they are created. Defaults to False.
+
+### Run Metrics Operator
+
+The RunMetricsOperator runs metrics in Bigeye based on either of the following:
+
+1. All metrics for a given table, identified by warehouse ID, schema name, and table name.
+2. An explicit set of metrics, identified by a list of metric IDs.
+
+Currently, if a list of metric IDs is provided, those metrics are run instead of the metrics selected
+by warehouse_id, schema_name, and table_name.
+
+#### Parameters
+1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
+2. warehouse_id: int - The Bigeye source/warehouse id for which metrics will be run.
+3. schema_name: str - The schema name for which metrics will be run.
+4. table_name: str - The table name for which metrics will be run.
+5. metric_ids: List[int] - The metric ids to run.
+
+%package help
+Summary: Development documents and examples for bigeye-airflow
+Provides: python3-bigeye-airflow-doc
+%description help
+# Bigeye Airflow Operators for Airflow Versions 2.x
+
+## Operators
+### Create Metric Operator (bigeye_airflow.operators.create_metric_operator)
+
+The CreateMetricOperator creates metrics from a list of metric configurations provided to the operator.
+The operator fills in reasonable defaults, such as thresholds. It authenticates through an Airflow
+connection ID and can optionally run the metrics once they have been created. Please review the link
+below to understand the structure of the configurations.
+
+[Create or Update Metric Swagger](https://docs.bigeye.com/reference/createmetric)
+
+#### Parameters
+1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
+2. warehouse_id: int - The Bigeye source/warehouse id to which the metric configurations will be deployed.
+3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
+ ```
+ schema_name: str
+ table_name: str
+ column_name: str
+ metric_template_id: uuid.UUID
+ metric_name: str
+ description: str
+ notifications: List[str]
+ thresholds: List[dict]
+ filters: List[str]
+ group_by: List[str]
+ user_defined_metric_name: str
+ metric_type: SimpleMetricCategory
+ default_check_frequency_hours: int
+ update_schedule: str
+ delay_at_update: str
+ timezone: str
+ should_backfill: bool
+ lookback_type: str
+ lookback_days: int
+ window_size: str
+ _window_size_seconds
+ ```
+4. run_after_upsert: bool - If true, the metrics are run after they are created. Defaults to False.
+
+### Run Metrics Operator
+
+The RunMetricsOperator runs metrics in Bigeye based on either of the following:
+
+1. All metrics for a given table, identified by warehouse ID, schema name, and table name.
+2. An explicit set of metrics, identified by a list of metric IDs.
+
+Currently, if a list of metric IDs is provided, those metrics are run instead of the metrics selected
+by warehouse_id, schema_name, and table_name.
+
+#### Parameters
+1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
+2. warehouse_id: int - The Bigeye source/warehouse id for which metrics will be run.
+3. schema_name: str - The schema name for which metrics will be run.
+4. table_name: str - The table name for which metrics will be run.
+5. metric_ids: List[int] - The metric ids to run.
+
+%prep
+%autosetup -n bigeye-airflow-0.1.20
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-bigeye-airflow -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon May 15 2023 Python_Bot <Python_Bot@openeuler.org> - 0.1.20-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..7cdd898
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+99df0b923cc5a70f3bd30c514b862214 bigeye_airflow-0.1.20.tar.gz