authorCoprDistGit <infra@openeuler.org>2023-04-10 15:22:22 +0000
committerCoprDistGit <infra@openeuler.org>2023-04-10 15:22:22 +0000
commitfbc00d2f7212e7243671dc66465f5549732fb735 (patch)
treea8e8d2d526a25e02398b041bf52059defebfd39e
parent658753c9c786897cea630082bb42d3e758e451e6 (diff)
automatic import of python-airflow-prometheus-exporter
-rw-r--r--.gitignore1
-rw-r--r--python-airflow-prometheus-exporter.spec395
-rw-r--r--sources1
3 files changed, 397 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..152abfc 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/airflow_prometheus_exporter-1.0.8.tar.gz
diff --git a/python-airflow-prometheus-exporter.spec b/python-airflow-prometheus-exporter.spec
new file mode 100644
index 0000000..b4f4aad
--- /dev/null
+++ b/python-airflow-prometheus-exporter.spec
@@ -0,0 +1,395 @@
+%global _empty_manifest_terminate_build 0
+Name: python-airflow-prometheus-exporter
+Version: 1.0.8
+Release: 1
+Summary: Prometheus Exporter for Airflow Metrics
+License: BSD 3-Clause
+URL: https://github.com/robinhood/airflow_prometheus_exporter
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/2b/6a/ba5031cd8b10f9ed8cdc6915c2ec2366770a74268f7f8af367e412bb9040/airflow_prometheus_exporter-1.0.8.tar.gz
+BuildArch: noarch
+
+Requires: python3-apache-airflow
+Requires: python3-prometheus-client
+Requires: python3-bumpversion
+Requires: python3-tox
+Requires: python3-twine
+
+%description
+# Airflow Prometheus Exporter
+
+[![Build Status](https://travis-ci.org/robinhood/airflow-prometheus-exporter.svg?branch=master)](https://travis-ci.org/robinhood/airflow-prometheus-exporter)
+
+The Airflow Prometheus Exporter exposes various metrics about the Scheduler, DAGs, and Tasks, which help improve the observability of an Airflow cluster.
+
+The exporter is based on this [prometheus exporter for Airflow](https://github.com/epoch8/airflow-exporter).
+
+## Requirements
+
+The plugin has been tested with:
+
+- Airflow >= 1.10.4
+- Python 3.6+
+
+The scheduler metrics assume that there is a DAG named `canary_dag`. In our setup, the `canary_dag` is a DAG with tasks that perform very simple actions, such as establishing database connections. This DAG is used to test the uptime of the Airflow scheduler itself.
+
+## Installation
+
+The exporter can be installed as an Airflow Plugin using:
+
+```pip install airflow-prometheus-exporter```
+
+This should ideally be installed in your Airflow virtualenv.
+
+## Metrics
+
+Metrics will be available at
+
+`http://<your_airflow_host_and_port>/admin/metrics/`
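For scraping, a minimal Prometheus configuration sketch; the target host and port are placeholders for your Airflow webserver (the `/admin/metrics/` path is the one exposed above):

```yaml
scrape_configs:
  - job_name: airflow
    metrics_path: /admin/metrics/
    static_configs:
      - targets: ["airflow-webserver.example.com:8080"]
```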
+
+### Task Specific Metrics
+
+#### `airflow_task_status`
+
+Number of tasks with a specific status.
+
+All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L46).
+
+#### `airflow_task_duration`
+
+Duration of successful tasks in seconds.
+
+#### `airflow_task_fail_count`
+
+Number of times a particular task has failed.
+
+#### `airflow_xcom_param`
+
+Value of a configurable parameter in the XCom table.
+
+The XCom field is deserialized as a dictionary and, if the key is found for a particular `task_id`, the value is reported as a gauge.
+
+Add task/key combinations in `config.yaml`:
+
+```yaml
+xcom_params:
+ -
+ task_id: abc
+ key: count
+ -
+ task_id: def
+ key: errors
+
+```
+
+
+A `task_id` of `all` will match against all Airflow tasks:
+
+```yaml
+xcom_params:
+ -
+ task_id: all
+ key: count
+```
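The key lookup described above can be sketched in plain Python. This mirrors the documented behavior rather than the plugin's actual code, and the sample payload is hypothetical:

```python
import json

# Hypothetical XCom value as stored (JSON-serialized); "count" is the
# key configured in config.yaml above.
xcom_value = '{"count": 42, "status": "ok"}'

def extract_gauge(serialized, key):
    """Deserialize an XCom payload and return the value for `key`,
    or None if the payload is not a JSON dict or the key is absent."""
    try:
        data = json.loads(serialized)
    except (TypeError, ValueError):
        return None
    if isinstance(data, dict):
        return data.get(key)
    return None

print(extract_gauge(xcom_value, "count"))  # prints 42
```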
+
+
+
+### Dag Specific Metrics
+
+#### `airflow_dag_status`
+
+Number of DAGs with a specific status.
+
+All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L59).
+
+#### `airflow_dag_run_duration`
+Duration of successful DagRun in seconds.
+
+### Scheduler Metrics
+
+#### `airflow_dag_scheduler_delay`
+
+Scheduling delay for a DAG Run in seconds. This metric assumes there is a `canary_dag`.
+
+The scheduling delay is measured as the delay between when a DAG is marked as `SCHEDULED` and when it actually starts `RUNNING`.
+
+#### `airflow_task_scheduler_delay`
+
+Scheduling delay for a Task in seconds. This metric assumes there is a `canary_dag`.
+
+#### `airflow_num_queued_tasks`
+
+Number of tasks in the `QUEUED` state at any given time.
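The endpoint serves the standard Prometheus text exposition format. A minimal parsing sketch follows; the sample output and its label names are illustrative, not taken from the plugin's source:

```python
# Hypothetical sample of the exporter's text output.
sample = """\
airflow_task_status{dag_id="canary_dag",task_id="ping",status="success"} 3.0
airflow_task_status{dag_id="canary_dag",task_id="ping",status="failed"} 1.0
airflow_num_queued_tasks 0.0
"""

def parse_metrics(text):
    """Parse Prometheus text exposition lines into a
    {metric-with-labels: float value} dict, skipping comments."""
    out = {}
    for line in text.splitlines():
        if not line or line.startswith("#"):
            continue
        name, _, value = line.rpartition(" ")
        out[name] = float(value)
    return out

print(parse_metrics(sample)["airflow_num_queued_tasks"])  # prints 0.0
```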
+
+
+
+
+%package -n python3-airflow-prometheus-exporter
+Summary: Prometheus Exporter for Airflow Metrics
+Provides: python-airflow-prometheus-exporter
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-airflow-prometheus-exporter
+# Airflow Prometheus Exporter
+
+[![Build Status](https://travis-ci.org/robinhood/airflow-prometheus-exporter.svg?branch=master)](https://travis-ci.org/robinhood/airflow-prometheus-exporter)
+
+The Airflow Prometheus Exporter exposes various metrics about the Scheduler, DAGs, and Tasks, which help improve the observability of an Airflow cluster.
+
+The exporter is based on this [prometheus exporter for Airflow](https://github.com/epoch8/airflow-exporter).
+
+## Requirements
+
+The plugin has been tested with:
+
+- Airflow >= 1.10.4
+- Python 3.6+
+
+The scheduler metrics assume that there is a DAG named `canary_dag`. In our setup, the `canary_dag` is a DAG with tasks that perform very simple actions, such as establishing database connections. This DAG is used to test the uptime of the Airflow scheduler itself.
+
+## Installation
+
+The exporter can be installed as an Airflow Plugin using:
+
+```pip install airflow-prometheus-exporter```
+
+This should ideally be installed in your Airflow virtualenv.
+
+## Metrics
+
+Metrics will be available at
+
+`http://<your_airflow_host_and_port>/admin/metrics/`
+
+### Task Specific Metrics
+
+#### `airflow_task_status`
+
+Number of tasks with a specific status.
+
+All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L46).
+
+#### `airflow_task_duration`
+
+Duration of successful tasks in seconds.
+
+#### `airflow_task_fail_count`
+
+Number of times a particular task has failed.
+
+#### `airflow_xcom_param`
+
+Value of a configurable parameter in the XCom table.
+
+The XCom field is deserialized as a dictionary and, if the key is found for a particular `task_id`, the value is reported as a gauge.
+
+Add task/key combinations in `config.yaml`:
+
+```yaml
+xcom_params:
+ -
+ task_id: abc
+ key: count
+ -
+ task_id: def
+ key: errors
+
+```
+
+
+A `task_id` of `all` will match against all Airflow tasks:
+
+```yaml
+xcom_params:
+ -
+ task_id: all
+ key: count
+```
+
+
+
+### Dag Specific Metrics
+
+#### `airflow_dag_status`
+
+Number of DAGs with a specific status.
+
+All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L59).
+
+#### `airflow_dag_run_duration`
+Duration of successful DagRun in seconds.
+
+### Scheduler Metrics
+
+#### `airflow_dag_scheduler_delay`
+
+Scheduling delay for a DAG Run in seconds. This metric assumes there is a `canary_dag`.
+
+The scheduling delay is measured as the delay between when a DAG is marked as `SCHEDULED` and when it actually starts `RUNNING`.
+
+#### `airflow_task_scheduler_delay`
+
+Scheduling delay for a Task in seconds. This metric assumes there is a `canary_dag`.
+
+#### `airflow_num_queued_tasks`
+
+Number of tasks in the `QUEUED` state at any given time.
+
+
+
+
+%package help
+Summary: Development documents and examples for airflow-prometheus-exporter
+Provides: python3-airflow-prometheus-exporter-doc
+%description help
+# Airflow Prometheus Exporter
+
+[![Build Status](https://travis-ci.org/robinhood/airflow-prometheus-exporter.svg?branch=master)](https://travis-ci.org/robinhood/airflow-prometheus-exporter)
+
+The Airflow Prometheus Exporter exposes various metrics about the Scheduler, DAGs, and Tasks, which help improve the observability of an Airflow cluster.
+
+The exporter is based on this [prometheus exporter for Airflow](https://github.com/epoch8/airflow-exporter).
+
+## Requirements
+
+The plugin has been tested with:
+
+- Airflow >= 1.10.4
+- Python 3.6+
+
+The scheduler metrics assume that there is a DAG named `canary_dag`. In our setup, the `canary_dag` is a DAG with tasks that perform very simple actions, such as establishing database connections. This DAG is used to test the uptime of the Airflow scheduler itself.
+
+## Installation
+
+The exporter can be installed as an Airflow Plugin using:
+
+```pip install airflow-prometheus-exporter```
+
+This should ideally be installed in your Airflow virtualenv.
+
+## Metrics
+
+Metrics will be available at
+
+`http://<your_airflow_host_and_port>/admin/metrics/`
+
+### Task Specific Metrics
+
+#### `airflow_task_status`
+
+Number of tasks with a specific status.
+
+All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L46).
+
+#### `airflow_task_duration`
+
+Duration of successful tasks in seconds.
+
+#### `airflow_task_fail_count`
+
+Number of times a particular task has failed.
+
+#### `airflow_xcom_param`
+
+Value of a configurable parameter in the XCom table.
+
+The XCom field is deserialized as a dictionary and, if the key is found for a particular `task_id`, the value is reported as a gauge.
+
+Add task/key combinations in `config.yaml`:
+
+```yaml
+xcom_params:
+ -
+ task_id: abc
+ key: count
+ -
+ task_id: def
+ key: errors
+
+```
+
+
+A `task_id` of `all` will match against all Airflow tasks:
+
+```yaml
+xcom_params:
+ -
+ task_id: all
+ key: count
+```
+
+
+
+### Dag Specific Metrics
+
+#### `airflow_dag_status`
+
+Number of DAGs with a specific status.
+
+All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L59).
+
+#### `airflow_dag_run_duration`
+Duration of successful DagRun in seconds.
+
+### Scheduler Metrics
+
+#### `airflow_dag_scheduler_delay`
+
+Scheduling delay for a DAG Run in seconds. This metric assumes there is a `canary_dag`.
+
+The scheduling delay is measured as the delay between when a DAG is marked as `SCHEDULED` and when it actually starts `RUNNING`.
+
+#### `airflow_task_scheduler_delay`
+
+Scheduling delay for a Task in seconds. This metric assumes there is a `canary_dag`.
+
+#### `airflow_num_queued_tasks`
+
+Number of tasks in the `QUEUED` state at any given time.
+
+
+
+
+%prep
+%autosetup -n airflow-prometheus-exporter-1.0.8
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-airflow-prometheus-exporter -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 1.0.8-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..fde8239
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+49c4897af3bd928c31eddee0ffd20b94 airflow_prometheus_exporter-1.0.8.tar.gz