%global _empty_manifest_terminate_build 0
Name:		python-bigeye-airflow
Version:	0.1.20
Release:	1
Summary:	Bigeye Airflow Library supports Airflow 2.4.3 and offers custom operators for interacting with your Bigeye workspace.
License:	Proprietary
URL:		https://docs.bigeye.com/docs
Source0:	https://mirrors.aliyun.com/pypi/web/packages/ae/6d/6da37db5f2c2350edc4b2b8e8597f6f1d0c97b2b8bc94c3cc57b79780ba6/bigeye_airflow-0.1.20.tar.gz
BuildArch:	noarch

Requires:	python3-Flask-OpenID
Requires:	python3-apache-airflow
Requires:	python3-bigeye-sdk

%description
# Bigeye Airflow Operators for Airflow Versions 2.x

## Operators
### Create Metric Operator (bigeye_airflow.operators.create_metric_operator)

The CreateMetricOperator creates metrics from a list of metric configurations provided to the operator.
It fills in reasonable defaults, such as thresholds, authenticates through an Airflow connection
ID, and can optionally run the metrics after they have been created. Please review the link below to
understand the structure of the configurations.

[Create or Update Metric Swagger](https://docs.bigeye.com/reference/createmetric)

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse id to which the metric configurations will be deployed.
3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
    ```
    schema_name: str
    table_name: str
    column_name: str
    metric_template_id: uuid.UUID
    metric_name: str
    description: str
    notifications: List[str]
    thresholds: List[dict]
    filters: List[str]
    group_by: List[str]
    user_defined_metric_name: str
    metric_type: SimpleMetricCategory
    default_check_frequency_hours: int
    update_schedule: str
    delay_at_update: str
    timezone: str
    should_backfill: bool
    lookback_type: str
    lookback_days: int
    window_size: str
    _window_size_seconds
    ```
4. run_after_upsert: bool - If True, the metrics will be run after creation. Defaults to False.
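As an illustration, a single configuration entry might look like the following. Only the keys come from the schema above; the concrete values (schema, table, metric name, notification address) are hypothetical:

```python
# Hypothetical example of one metric configuration entry. The keys follow
# the schema documented above; the values are illustrative only.
freshness_config = {
    "schema_name": "analytics",
    "table_name": "orders",
    "column_name": "updated_at",
    "metric_name": "HOURS_SINCE_MAX_TIMESTAMP",
    "notifications": ["datahealth@example.com"],
    "thresholds": [],  # empty: let the operator apply its default thresholds
    "default_check_frequency_hours": 6,
}

# The operator accepts a List[dict], so several entries can be deployed at once.
configuration = [freshness_config]
```

The operator would then be instantiated with the connection_id, warehouse_id, and this configuration list.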

### Run Metrics Operator

The RunMetricsOperator will run metrics in Bigeye based on the following:

1. All metrics for a given table, by providing the warehouse ID, schema name, and table name.
2. A specific set of metrics, given a list of metric IDs.

Currently, if a list of metric IDs is provided, those metrics are run instead of the metrics
identified by warehouse_id, schema_name, and table_name.

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse id for which metrics will be run.
3. schema_name: str - The schema name for which metrics will be run.
4. table_name: str - The table name for which metrics will be run.
5. metric_ids: List[int] - The metric ids to run.
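The precedence described above can be sketched in plain Python. This is an illustration of the documented behavior, not the operator's actual implementation:

```python
from typing import List, Optional


def select_run_mode(metric_ids: Optional[List[int]],
                    warehouse_id: int,
                    schema_name: str,
                    table_name: str) -> str:
    """Illustrates the documented precedence: an explicit metric ID list
    wins over the warehouse/schema/table lookup."""
    if metric_ids:
        return f"run metrics {metric_ids}"
    return f"run all metrics for {warehouse_id}.{schema_name}.{table_name}"


# Explicit IDs take precedence over the table coordinates.
print(select_run_mode([101, 102], 5, "analytics", "orders"))
# Without IDs, all metrics for the named table are run.
print(select_run_mode(None, 5, "analytics", "orders"))
```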

%package -n python3-bigeye-airflow
Summary:	Bigeye Airflow Library supports Airflow 2.4.3 and offers custom operators for interacting with your Bigeye workspace.
Provides:	python-bigeye-airflow
BuildRequires:	python3-devel
BuildRequires:	python3-setuptools
BuildRequires:	python3-pip
%description -n python3-bigeye-airflow
# Bigeye Airflow Operators for Airflow Versions 2.x

## Operators
### Create Metric Operator (bigeye_airflow.operators.create_metric_operator)

The CreateMetricOperator creates metrics from a list of metric configurations provided to the operator.
It fills in reasonable defaults, such as thresholds, authenticates through an Airflow connection
ID, and can optionally run the metrics after they have been created. Please review the link below to
understand the structure of the configurations.

[Create or Update Metric Swagger](https://docs.bigeye.com/reference/createmetric)

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse id to which the metric configurations will be deployed.
3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
    ```
    schema_name: str
    table_name: str
    column_name: str
    metric_template_id: uuid.UUID
    metric_name: str
    description: str
    notifications: List[str]
    thresholds: List[dict]
    filters: List[str]
    group_by: List[str]
    user_defined_metric_name: str
    metric_type: SimpleMetricCategory
    default_check_frequency_hours: int
    update_schedule: str
    delay_at_update: str
    timezone: str
    should_backfill: bool
    lookback_type: str
    lookback_days: int
    window_size: str
    _window_size_seconds
    ```
4. run_after_upsert: bool - If True, the metrics will be run after creation. Defaults to False.

### Run Metrics Operator

The RunMetricsOperator will run metrics in Bigeye based on the following:

1. All metrics for a given table, by providing the warehouse ID, schema name, and table name.
2. A specific set of metrics, given a list of metric IDs.

Currently, if a list of metric IDs is provided, those metrics are run instead of the metrics
identified by warehouse_id, schema_name, and table_name.

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse id for which metrics will be run.
3. schema_name: str - The schema name for which metrics will be run.
4. table_name: str - The table name for which metrics will be run.
5. metric_ids: List[int] - The metric ids to run.

%package help
Summary:	Development documents and examples for bigeye-airflow
Provides:	python3-bigeye-airflow-doc
%description help
# Bigeye Airflow Operators for Airflow Versions 2.x

## Operators
### Create Metric Operator (bigeye_airflow.operators.create_metric_operator)

The CreateMetricOperator creates metrics from a list of metric configurations provided to the operator.
It fills in reasonable defaults, such as thresholds, authenticates through an Airflow connection
ID, and can optionally run the metrics after they have been created. Please review the link below to
understand the structure of the configurations.

[Create or Update Metric Swagger](https://docs.bigeye.com/reference/createmetric)

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse id to which the metric configurations will be deployed.
3. configuration: List[dict] - A list of metric configurations conforming to the following schema.
    ```
    schema_name: str
    table_name: str
    column_name: str
    metric_template_id: uuid.UUID
    metric_name: str
    description: str
    notifications: List[str]
    thresholds: List[dict]
    filters: List[str]
    group_by: List[str]
    user_defined_metric_name: str
    metric_type: SimpleMetricCategory
    default_check_frequency_hours: int
    update_schedule: str
    delay_at_update: str
    timezone: str
    should_backfill: bool
    lookback_type: str
    lookback_days: int
    window_size: str
    _window_size_seconds
    ```
4. run_after_upsert: bool - If True, the metrics will be run after creation. Defaults to False.

### Run Metrics Operator

The RunMetricsOperator will run metrics in Bigeye based on the following:

1. All metrics for a given table, by providing the warehouse ID, schema name, and table name.
2. A specific set of metrics, given a list of metric IDs.

Currently, if a list of metric IDs is provided, those metrics are run instead of the metrics
identified by warehouse_id, schema_name, and table_name.

#### Parameters
1. connection_id: str - The Airflow connection ID used to store the required Bigeye credential.
2. warehouse_id: int - The Bigeye source/warehouse id for which metrics will be run.
3. schema_name: str - The schema name for which metrics will be run.
4. table_name: str - The table name for which metrics will be run.
5. metric_ids: List[int] - The metric ids to run.

%prep
%autosetup -n bigeye_airflow-0.1.20

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
	find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
	find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
	find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
	find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
	find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-bigeye-airflow -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Thu Jun 08 2023 Python_Bot <Python_Bot@openeuler.org> - 0.1.20-1
- Package Spec generated