%global _empty_manifest_terminate_build 0
Name: python-tanbih-pipeline
Version: 0.12.9
Release: 1
Summary:	A pipeline framework for stream processing
License: MIT
URL: https://github.com/yifan/pipeline
Source0: https://mirrors.nju.edu.cn/pypi/web/packages/13/94/701e46057017c28da87d6279b3d447d4084493053aa968aaf6776f945308/tanbih-pipeline-0.12.9.tar.gz
BuildArch: noarch
Requires: python3-prometheus-client
Requires: python3-zstandard
Requires: python3-pydantic
Requires: python3-dotenv
Requires: python3-azure-cosmosdb-table
Requires: python3-elasticsearch
Requires: python3-redis
Requires: python3-confluent-kafka
Requires: python3-pulsar-client
Requires: python3-pika
Requires: python3-pymongo
Requires: python3-rq
Requires: python3-mysql-connector-python
%description
Pipeline provides a unified interface for setting up data stream processing systems with Kafka, Pulsar,
RabbitMQ, Redis and other backends. The idea is to decouple a worker from the messaging technology chosen
at deployment time, so that a Docker image released for a certain task can be run against Kafka or Redis
simply by changing environment variables.
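As an illustration, the same worker image could be switched between backends purely through its
environment; the variable names below are illustrative placeholders, not the library's documented
settings:
    docker run -e IN_KIND=KAFKA -e OUT_KIND=KAFKA mytask:latest
    docker run -e IN_KIND=REDIS -e OUT_KIND=REDIS mytask:latest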
%package -n python3-tanbih-pipeline
Summary:	A pipeline framework for stream processing
Provides: python-tanbih-pipeline
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-tanbih-pipeline
Pipeline provides a unified interface for setting up data stream processing systems with Kafka, Pulsar,
RabbitMQ, Redis and other backends. The idea is to decouple a worker from the messaging technology chosen
at deployment time, so that a Docker image released for a certain task can be run against Kafka or Redis
simply by changing environment variables.
%package help
Summary:	Development documentation and examples for tanbih-pipeline
Provides: python3-tanbih-pipeline-doc
%description help
Pipeline provides a unified interface for setting up data stream processing systems with Kafka, Pulsar,
RabbitMQ, Redis and other backends. The idea is to decouple a worker from the messaging technology chosen
at deployment time, so that a Docker image released for a certain task can be run against Kafka or Redis
simply by changing environment variables.
%prep
%autosetup -n tanbih-pipeline-0.12.9
%build
%py3_build
%install
%py3_install
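# Ship any upstream documentation and example directories with the help subpackage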
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
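# Collect every file installed under the buildroot so it can be fed to the files sections below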
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
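# Man pages are compressed at build time, so record them with a .gz suffix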
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-tanbih-pipeline -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Mon May 15 2023 Python_Bot <Python_Bot@openeuler.org> - 0.12.9-1
- Package Spec generated