%global _empty_manifest_terminate_build 0
Name:           python-airflow-prometheus-exporter
Version:        1.0.8
Release:        1
Summary:        Prometheus Exporter for Airflow Metrics
License:        BSD 3-Clause
URL:            https://github.com/robinhood/airflow_prometheus_exporter
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/2b/6a/ba5031cd8b10f9ed8cdc6915c2ec2366770a74268f7f8af367e412bb9040/airflow_prometheus_exporter-1.0.8.tar.gz
BuildArch:      noarch

Requires:       python3-apache-airflow
Requires:       python3-prometheus-client
Requires:       python3-bumpversion
Requires:       python3-tox
Requires:       python3-twine

%description
# Airflow Prometheus Exporter

[![Build Status](https://travis-ci.org/robinhood/airflow-prometheus-exporter.svg?branch=master)](https://travis-ci.org/robinhood/airflow-prometheus-exporter)

The Airflow Prometheus Exporter exposes various metrics about the Scheduler, DAGs and Tasks, which help improve the observability of an Airflow cluster.

The exporter is based on this [prometheus exporter for Airflow](https://github.com/epoch8/airflow-exporter).

## Requirements

The plugin has been tested with:

- Airflow >= 1.10.4
- Python 3.6+

The scheduler metrics assume that there is a DAG named `canary_dag`. In our setup, the `canary_dag` is a DAG with tasks that perform very simple actions, such as establishing database connections. This DAG is used to test the uptime of the Airflow scheduler itself.

## Installation

The exporter can be installed as an Airflow plugin using:

```bash
pip install airflow-prometheus-exporter
```

This should ideally be installed in your Airflow virtualenv.

## Metrics

Metrics will be available at `http://<your-airflow-host>/admin/metrics/`.

### Task Specific Metrics

#### `airflow_task_status`

Number of tasks with a specific status. All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L46).

#### `airflow_task_duration`

Duration of successful tasks in seconds.

#### `airflow_task_fail_count`

Number of times a particular task has failed.
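As an illustration, the failure counter above can drive a Prometheus alerting rule. The following sketch assumes a hypothetical group name, alert name, and threshold, and assumes the metric carries a `task_id` label:

```yaml
groups:
  - name: airflow-task-alerts        # hypothetical group name
    rules:
      - alert: AirflowTaskFailures   # hypothetical alert name
        # Fire when a task has accumulated new failures over the last hour.
        expr: increase(airflow_task_fail_count[1h]) > 0
        for: 5m
        labels:
          severity: warning
        annotations:
          summary: "Airflow task {{ $labels.task_id }} has recent failures"
```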
#### `airflow_xcom_param`

Value of a configurable parameter in the XCom table. The XCom field is deserialized as a dictionary, and if the key is found for a particular task ID, the value is reported as a gauge.

Add task / key combinations in `config.yaml`:

```yaml
xcom_params:
  - task_id: abc
    key: count
  - task_id: def
    key: errors
```

A `task_id` of `all` will match against all Airflow tasks:

```yaml
xcom_params:
  - task_id: all
    key: count
```

### Dag Specific Metrics

#### `airflow_dag_status`

Number of DAGs with a specific status. All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L59).

#### `airflow_dag_run_duration`

Duration of successful DagRuns in seconds.

### Scheduler Metrics

#### `airflow_dag_scheduler_delay`

Scheduling delay for a DAG Run in seconds. This metric assumes there is a `canary_dag`. The scheduling delay is measured as the delay between when a DAG is marked as `SCHEDULED` and when it actually starts `RUNNING`.

#### `airflow_task_scheduler_delay`

Scheduling delay for a Task in seconds. This metric assumes there is a `canary_dag`.

#### `airflow_num_queued_tasks`

Number of tasks in the `QUEUED` state at any given instance.

%package -n python3-airflow-prometheus-exporter
Summary:        Prometheus Exporter for Airflow Metrics
Provides:       python-airflow-prometheus-exporter
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip

%description -n python3-airflow-prometheus-exporter
# Airflow Prometheus Exporter

[![Build Status](https://travis-ci.org/robinhood/airflow-prometheus-exporter.svg?branch=master)](https://travis-ci.org/robinhood/airflow-prometheus-exporter)

The Airflow Prometheus Exporter exposes various metrics about the Scheduler, DAGs and Tasks, which help improve the observability of an Airflow cluster.

The exporter is based on this [prometheus exporter for Airflow](https://github.com/epoch8/airflow-exporter).
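To actually collect the metrics described below, Prometheus needs a scrape job pointed at the endpoint the plugin exposes (`/admin/metrics/`). A minimal sketch, where the job name and the target host:port are placeholders for your deployment:

```yaml
scrape_configs:
  - job_name: airflow                           # hypothetical job name
    metrics_path: /admin/metrics/               # endpoint exposed by the plugin
    static_configs:
      - targets: ["airflow-webserver:8080"]     # placeholder host:port
```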
## Requirements

The plugin has been tested with:

- Airflow >= 1.10.4
- Python 3.6+

The scheduler metrics assume that there is a DAG named `canary_dag`. In our setup, the `canary_dag` is a DAG with tasks that perform very simple actions, such as establishing database connections. This DAG is used to test the uptime of the Airflow scheduler itself.

## Installation

The exporter can be installed as an Airflow plugin using:

```bash
pip install airflow-prometheus-exporter
```

This should ideally be installed in your Airflow virtualenv.

## Metrics

Metrics will be available at `http://<your-airflow-host>/admin/metrics/`.

### Task Specific Metrics

#### `airflow_task_status`

Number of tasks with a specific status. All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L46).

#### `airflow_task_duration`

Duration of successful tasks in seconds.

#### `airflow_task_fail_count`

Number of times a particular task has failed.

#### `airflow_xcom_param`

Value of a configurable parameter in the XCom table. The XCom field is deserialized as a dictionary, and if the key is found for a particular task ID, the value is reported as a gauge.

Add task / key combinations in `config.yaml`:

```yaml
xcom_params:
  - task_id: abc
    key: count
  - task_id: def
    key: errors
```

A `task_id` of `all` will match against all Airflow tasks:

```yaml
xcom_params:
  - task_id: all
    key: count
```

### Dag Specific Metrics

#### `airflow_dag_status`

Number of DAGs with a specific status. All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L59).

#### `airflow_dag_run_duration`

Duration of successful DagRuns in seconds.

### Scheduler Metrics

#### `airflow_dag_scheduler_delay`

Scheduling delay for a DAG Run in seconds. This metric assumes there is a `canary_dag`. The scheduling delay is measured as the delay between when a DAG is marked as `SCHEDULED` and when it actually starts `RUNNING`.
#### `airflow_task_scheduler_delay`

Scheduling delay for a Task in seconds. This metric assumes there is a `canary_dag`.

#### `airflow_num_queued_tasks`

Number of tasks in the `QUEUED` state at any given instance.

%package help
Summary:        Development documents and examples for airflow-prometheus-exporter
Provides:       python3-airflow-prometheus-exporter-doc

%description help
# Airflow Prometheus Exporter

[![Build Status](https://travis-ci.org/robinhood/airflow-prometheus-exporter.svg?branch=master)](https://travis-ci.org/robinhood/airflow-prometheus-exporter)

The Airflow Prometheus Exporter exposes various metrics about the Scheduler, DAGs and Tasks, which help improve the observability of an Airflow cluster.

The exporter is based on this [prometheus exporter for Airflow](https://github.com/epoch8/airflow-exporter).

## Requirements

The plugin has been tested with:

- Airflow >= 1.10.4
- Python 3.6+

The scheduler metrics assume that there is a DAG named `canary_dag`. In our setup, the `canary_dag` is a DAG with tasks that perform very simple actions, such as establishing database connections. This DAG is used to test the uptime of the Airflow scheduler itself.

## Installation

The exporter can be installed as an Airflow plugin using:

```bash
pip install airflow-prometheus-exporter
```

This should ideally be installed in your Airflow virtualenv.

## Metrics

Metrics will be available at `http://<your-airflow-host>/admin/metrics/`.

### Task Specific Metrics

#### `airflow_task_status`

Number of tasks with a specific status. All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L46).

#### `airflow_task_duration`

Duration of successful tasks in seconds.

#### `airflow_task_fail_count`

Number of times a particular task has failed.
#### `airflow_xcom_param`

Value of a configurable parameter in the XCom table. The XCom field is deserialized as a dictionary, and if the key is found for a particular task ID, the value is reported as a gauge.

Add task / key combinations in `config.yaml`:

```yaml
xcom_params:
  - task_id: abc
    key: count
  - task_id: def
    key: errors
```

A `task_id` of `all` will match against all Airflow tasks:

```yaml
xcom_params:
  - task_id: all
    key: count
```

### Dag Specific Metrics

#### `airflow_dag_status`

Number of DAGs with a specific status. All the possible states are listed [here](https://github.com/apache/airflow/blob/master/airflow/utils/state.py#L59).

#### `airflow_dag_run_duration`

Duration of successful DagRuns in seconds.

### Scheduler Metrics

#### `airflow_dag_scheduler_delay`

Scheduling delay for a DAG Run in seconds. This metric assumes there is a `canary_dag`. The scheduling delay is measured as the delay between when a DAG is marked as `SCHEDULED` and when it actually starts `RUNNING`.

#### `airflow_task_scheduler_delay`

Scheduling delay for a Task in seconds. This metric assumes there is a `canary_dag`.

#### `airflow_num_queued_tasks`

Number of tasks in the `QUEUED` state at any given instance.
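The XCom matching rule described above (exact `task_id` match, with `all` acting as a wildcard) can be sketched in plain Python. The function name and structure here are illustrative only, not the plugin's actual code:

```python
import json

def xcom_gauge_values(xcom_rows, xcom_params):
    """Yield (task_id, key, value) gauges for XCom rows matching the config.

    xcom_rows: iterable of (task_id, json_value) as stored in the xcom table.
    xcom_params: list of {"task_id": ..., "key": ...} dicts from config.yaml.
    """
    for task_id, raw in xcom_rows:
        try:
            data = json.loads(raw)  # XCom value deserialized as a dictionary
        except (TypeError, ValueError):
            continue
        if not isinstance(data, dict):
            continue
        for param in xcom_params:
            # 'all' matches every task; otherwise require an exact task_id match.
            if param["task_id"] not in (task_id, "all"):
                continue
            if param["key"] in data:
                yield task_id, param["key"], float(data[param["key"]])

rows = [("abc", '{"count": 3}'), ("xyz", '{"count": 7, "errors": 0}')]
params = [{"task_id": "abc", "key": "count"}, {"task_id": "all", "key": "errors"}]
print(list(xcom_gauge_values(rows, params)))
```

Non-dictionary or unparseable XCom values are simply skipped rather than reported, mirroring the "if key is found" behavior described above.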
%prep
%autosetup -n airflow-prometheus-exporter-1.0.8

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-airflow-prometheus-exporter -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Mon Apr 10 2023 Python_Bot - 1.0.8-1
- Package Spec generated