author     CoprDistGit <infra@openeuler.org>    2023-04-11 23:34:02 +0000
committer  CoprDistGit <infra@openeuler.org>    2023-04-11 23:34:02 +0000
commit     4f5b41a6e9ac604ff533fac6cd6129395bc49b58 (patch)
tree       fb8682255e646827a030b2d7b97787f859fff648
parent     4c31ec055ba2453a3a34f713273e0231b54e49f0 (diff)
automatic import of python-apache-flink
-rw-r--r--  .gitignore                   1
-rw-r--r--  python-apache-flink.spec   303
-rw-r--r--  sources                      1
3 files changed, 305 insertions(+), 0 deletions(-)
diff --git a/.gitignore b/.gitignore
index e69de29..75e2fb9 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/apache-flink-1.17.0.tar.gz
diff --git a/python-apache-flink.spec b/python-apache-flink.spec
new file mode 100644
index 0000000..c95fa8c
--- /dev/null
+++ b/python-apache-flink.spec
@@ -0,0 +1,303 @@
+%global _empty_manifest_terminate_build 0
+Name: python-apache-flink
+Version: 1.17.0
+Release: 1
+Summary: Apache Flink Python API
+License: Apache-2.0
+URL: https://flink.apache.org
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/79/d3/dee7bb8fd0aac3d91950e901dbbcc64e97cb2cbd054723a98a5976f9db48/apache-flink-1.17.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-py4j
+Requires: python3-dateutil
+Requires: python3-apache-beam
+Requires: python3-cloudpickle
+Requires: python3-avro-python3
+Requires: python3-pytz
+Requires: python3-fastavro
+Requires: python3-requests
+Requires: python3-protobuf
+Requires: python3-numpy
+Requires: python3-pandas
+Requires: python3-pyarrow
+Requires: python3-httplib2
+Requires: python3-apache-flink-libraries
+Requires: python3-pemja
+
+%description
+# Apache Flink
+
+Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink has been designed to run in all common cluster environments, perform computations at in-memory speed and at any scale.
+
+Learn more about Flink at [https://flink.apache.org/](https://flink.apache.org/)
+
+## Python Packaging
+
+PyFlink is a Python API for Apache Flink that allows you to build scalable batch and streaming workloads,
+such as real-time data processing pipelines, large-scale exploratory data analysis, Machine Learning (ML)
+pipelines and ETL processes. If you’re already familiar with Python and libraries such as Pandas,
+then PyFlink makes it simpler to leverage the full capabilities of the Flink ecosystem.
+Depending on the level of abstraction you need, PyFlink offers two APIs: the PyFlink Table API and the PyFlink DataStream API.
+
+The PyFlink Table API allows you to write powerful relational queries in a way that is similar to
+using SQL or working with tabular data in Python. You can find more information in the tutorial at
+[https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/table_api_tutorial/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/table_api_tutorial/)
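+
+As a quick illustration, here is a minimal Table API sketch (assuming a working local PyFlink installation):
+
+```python
+from pyflink.table import EnvironmentSettings, TableEnvironment
+from pyflink.table.expressions import col
+
+# Create a TableEnvironment running in batch mode.
+t_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())
+
+# Build a small in-memory table and run a relational query over it.
+table = t_env.from_elements([(1, 'Hi'), (2, 'Hello')], ['id', 'data'])
+table.where(col('id') > 1).select(col('id'), col('data')).execute().print()
+```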
+
+The PyFlink DataStream API gives you lower-level control over the core building blocks of Flink,
+state and time, to build more complex stream processing use cases.
+A tutorial can be found at [https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/datastream_tutorial/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/datastream_tutorial/)
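+
+For comparison, a minimal DataStream sketch (again assuming a local PyFlink installation) could look like this:
+
+```python
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import StreamExecutionEnvironment
+
+# Obtain the execution environment that drives the job.
+env = StreamExecutionEnvironment.get_execution_environment()
+
+# Create a stream from a local collection, double each element, and print the results.
+ds = env.from_collection([1, 2, 3, 4, 5], type_info=Types.INT())
+ds.map(lambda x: x * 2, output_type=Types.INT()).print()
+
+# 'datastream_sketch' is an arbitrary job name.
+env.execute('datastream_sketch')
+```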
+
+You can find more information via the documentation at [https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/overview/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/overview/)
+
+The auto-generated Python docs can be found at [https://nightlies.apache.org/flink/flink-docs-stable/api/python/](https://nightlies.apache.org/flink/flink-docs-stable/api/python/)
+
+## Python Requirements
+
+The Apache Flink Python API depends on Py4J (currently version 0.10.9.7), CloudPickle (currently version 2.2.0), python-dateutil (currently version >=2.8.0,<3), and Apache Beam (currently version 2.43.0).
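+
+One quick way to confirm the installed versions locally is a sketch like the following (it assumes each of these packages exposes a `__version__` attribute, which they currently do):
+
+```python
+import apache_beam, cloudpickle, dateutil, py4j
+
+# Print the version of each core PyFlink dependency.
+for mod in (py4j, cloudpickle, dateutil, apache_beam):
+    print(mod.__name__, mod.__version__)
+```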
+
+## Development Notices
+
+### Protobuf Code Generation
+
+Protocol Buffers are used in `flink_fn_execution_pb2.py`, which is generated from `flink-fn-execution.proto`. Whenever `flink-fn-execution.proto` is updated, regenerate `flink_fn_execution_pb2.py` by executing:
+
+```shell
+python pyflink/gen_protos.py
+```
+
+PyFlink depends on the following libraries to execute the above script:
+1. grpcio-tools (>=1.29.0,<=1.46.3)
+2. setuptools (>=37.0.0)
+3. pip (>=20.3)
+
+### Running Test Cases
+
+Currently, we use conda and tox to verify the compatibility of the Flink Python API across multiple Python versions; tox also integrates useful plugins such as flake8.
+Enter the directory where this README.md file is located and run the test cases by executing:
+
+```shell
+./dev/lint-python.sh
+```
+
+To use your system conda environment, set the `FLINK_CONDA_HOME` variable:
+
+```shell
+export FLINK_CONDA_HOME=$(dirname $(dirname $CONDA_EXE))
+```
+
+Create a conda environment:
+```shell
+conda create -n pyflink_38 python=3.8
+```
+
+Then you can activate your environment and run tests, for example:
+
+```shell
+conda activate pyflink_38
+pip install -r ./dev/dev-requirements.txt
+./dev/lint-python.sh
+```
+
+
+%package -n python3-apache-flink
+Summary: Apache Flink Python API
+Provides: python-apache-flink
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-apache-flink
+# Apache Flink
+
+Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink has been designed to run in all common cluster environments, perform computations at in-memory speed and at any scale.
+
+Learn more about Flink at [https://flink.apache.org/](https://flink.apache.org/)
+
+## Python Packaging
+
+PyFlink is a Python API for Apache Flink that allows you to build scalable batch and streaming workloads,
+such as real-time data processing pipelines, large-scale exploratory data analysis, Machine Learning (ML)
+pipelines and ETL processes. If you’re already familiar with Python and libraries such as Pandas,
+then PyFlink makes it simpler to leverage the full capabilities of the Flink ecosystem.
+Depending on the level of abstraction you need, PyFlink offers two APIs: the PyFlink Table API and the PyFlink DataStream API.
+
+The PyFlink Table API allows you to write powerful relational queries in a way that is similar to
+using SQL or working with tabular data in Python. You can find more information in the tutorial at
+[https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/table_api_tutorial/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/table_api_tutorial/)
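+
+As a quick illustration, here is a minimal Table API sketch (assuming a working local PyFlink installation):
+
+```python
+from pyflink.table import EnvironmentSettings, TableEnvironment
+from pyflink.table.expressions import col
+
+# Create a TableEnvironment running in batch mode.
+t_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())
+
+# Build a small in-memory table and run a relational query over it.
+table = t_env.from_elements([(1, 'Hi'), (2, 'Hello')], ['id', 'data'])
+table.where(col('id') > 1).select(col('id'), col('data')).execute().print()
+```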
+
+The PyFlink DataStream API gives you lower-level control over the core building blocks of Flink,
+state and time, to build more complex stream processing use cases.
+A tutorial can be found at [https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/datastream_tutorial/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/datastream_tutorial/)
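+
+For comparison, a minimal DataStream sketch (again assuming a local PyFlink installation) could look like this:
+
+```python
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import StreamExecutionEnvironment
+
+# Obtain the execution environment that drives the job.
+env = StreamExecutionEnvironment.get_execution_environment()
+
+# Create a stream from a local collection, double each element, and print the results.
+ds = env.from_collection([1, 2, 3, 4, 5], type_info=Types.INT())
+ds.map(lambda x: x * 2, output_type=Types.INT()).print()
+
+# 'datastream_sketch' is an arbitrary job name.
+env.execute('datastream_sketch')
+```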
+
+You can find more information via the documentation at [https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/overview/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/overview/)
+
+The auto-generated Python docs can be found at [https://nightlies.apache.org/flink/flink-docs-stable/api/python/](https://nightlies.apache.org/flink/flink-docs-stable/api/python/)
+
+## Python Requirements
+
+The Apache Flink Python API depends on Py4J (currently version 0.10.9.7), CloudPickle (currently version 2.2.0), python-dateutil (currently version >=2.8.0,<3), and Apache Beam (currently version 2.43.0).
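+
+One quick way to confirm the installed versions locally is a sketch like the following (it assumes each of these packages exposes a `__version__` attribute, which they currently do):
+
+```python
+import apache_beam, cloudpickle, dateutil, py4j
+
+# Print the version of each core PyFlink dependency.
+for mod in (py4j, cloudpickle, dateutil, apache_beam):
+    print(mod.__name__, mod.__version__)
+```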
+
+## Development Notices
+
+### Protobuf Code Generation
+
+Protocol Buffers are used in `flink_fn_execution_pb2.py`, which is generated from `flink-fn-execution.proto`. Whenever `flink-fn-execution.proto` is updated, regenerate `flink_fn_execution_pb2.py` by executing:
+
+```shell
+python pyflink/gen_protos.py
+```
+
+PyFlink depends on the following libraries to execute the above script:
+1. grpcio-tools (>=1.29.0,<=1.46.3)
+2. setuptools (>=37.0.0)
+3. pip (>=20.3)
+
+### Running Test Cases
+
+Currently, we use conda and tox to verify the compatibility of the Flink Python API across multiple Python versions; tox also integrates useful plugins such as flake8.
+Enter the directory where this README.md file is located and run the test cases by executing:
+
+```shell
+./dev/lint-python.sh
+```
+
+To use your system conda environment, set the `FLINK_CONDA_HOME` variable:
+
+```shell
+export FLINK_CONDA_HOME=$(dirname $(dirname $CONDA_EXE))
+```
+
+Create a conda environment:
+```shell
+conda create -n pyflink_38 python=3.8
+```
+
+Then you can activate your environment and run tests, for example:
+
+```shell
+conda activate pyflink_38
+pip install -r ./dev/dev-requirements.txt
+./dev/lint-python.sh
+```
+
+
+%package help
+Summary: Development documents and examples for apache-flink
+Provides: python3-apache-flink-doc
+%description help
+# Apache Flink
+
+Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink has been designed to run in all common cluster environments, perform computations at in-memory speed and at any scale.
+
+Learn more about Flink at [https://flink.apache.org/](https://flink.apache.org/)
+
+## Python Packaging
+
+PyFlink is a Python API for Apache Flink that allows you to build scalable batch and streaming workloads,
+such as real-time data processing pipelines, large-scale exploratory data analysis, Machine Learning (ML)
+pipelines and ETL processes. If you’re already familiar with Python and libraries such as Pandas,
+then PyFlink makes it simpler to leverage the full capabilities of the Flink ecosystem.
+Depending on the level of abstraction you need, PyFlink offers two APIs: the PyFlink Table API and the PyFlink DataStream API.
+
+The PyFlink Table API allows you to write powerful relational queries in a way that is similar to
+using SQL or working with tabular data in Python. You can find more information in the tutorial at
+[https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/table_api_tutorial/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/table_api_tutorial/)
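+
+As a quick illustration, here is a minimal Table API sketch (assuming a working local PyFlink installation):
+
+```python
+from pyflink.table import EnvironmentSettings, TableEnvironment
+from pyflink.table.expressions import col
+
+# Create a TableEnvironment running in batch mode.
+t_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())
+
+# Build a small in-memory table and run a relational query over it.
+table = t_env.from_elements([(1, 'Hi'), (2, 'Hello')], ['id', 'data'])
+table.where(col('id') > 1).select(col('id'), col('data')).execute().print()
+```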
+
+The PyFlink DataStream API gives you lower-level control over the core building blocks of Flink,
+state and time, to build more complex stream processing use cases.
+A tutorial can be found at [https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/datastream_tutorial/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/datastream_tutorial/)
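+
+For comparison, a minimal DataStream sketch (again assuming a local PyFlink installation) could look like this:
+
+```python
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import StreamExecutionEnvironment
+
+# Obtain the execution environment that drives the job.
+env = StreamExecutionEnvironment.get_execution_environment()
+
+# Create a stream from a local collection, double each element, and print the results.
+ds = env.from_collection([1, 2, 3, 4, 5], type_info=Types.INT())
+ds.map(lambda x: x * 2, output_type=Types.INT()).print()
+
+# 'datastream_sketch' is an arbitrary job name.
+env.execute('datastream_sketch')
+```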
+
+You can find more information via the documentation at [https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/overview/](https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/overview/)
+
+The auto-generated Python docs can be found at [https://nightlies.apache.org/flink/flink-docs-stable/api/python/](https://nightlies.apache.org/flink/flink-docs-stable/api/python/)
+
+## Python Requirements
+
+The Apache Flink Python API depends on Py4J (currently version 0.10.9.7), CloudPickle (currently version 2.2.0), python-dateutil (currently version >=2.8.0,<3), and Apache Beam (currently version 2.43.0).
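+
+One quick way to confirm the installed versions locally is a sketch like the following (it assumes each of these packages exposes a `__version__` attribute, which they currently do):
+
+```python
+import apache_beam, cloudpickle, dateutil, py4j
+
+# Print the version of each core PyFlink dependency.
+for mod in (py4j, cloudpickle, dateutil, apache_beam):
+    print(mod.__name__, mod.__version__)
+```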
+
+## Development Notices
+
+### Protobuf Code Generation
+
+Protocol Buffers are used in `flink_fn_execution_pb2.py`, which is generated from `flink-fn-execution.proto`. Whenever `flink-fn-execution.proto` is updated, regenerate `flink_fn_execution_pb2.py` by executing:
+
+```shell
+python pyflink/gen_protos.py
+```
+
+PyFlink depends on the following libraries to execute the above script:
+1. grpcio-tools (>=1.29.0,<=1.46.3)
+2. setuptools (>=37.0.0)
+3. pip (>=20.3)
+
+### Running Test Cases
+
+Currently, we use conda and tox to verify the compatibility of the Flink Python API across multiple Python versions; tox also integrates useful plugins such as flake8.
+Enter the directory where this README.md file is located and run the test cases by executing:
+
+```shell
+./dev/lint-python.sh
+```
+
+To use your system conda environment, set the `FLINK_CONDA_HOME` variable:
+
+```shell
+export FLINK_CONDA_HOME=$(dirname $(dirname $CONDA_EXE))
+```
+
+Create a conda environment:
+```shell
+conda create -n pyflink_38 python=3.8
+```
+
+Then you can activate your environment and run tests, for example:
+
+```shell
+conda activate pyflink_38
+pip install -r ./dev/dev-requirements.txt
+./dev/lint-python.sh
+```
+
+
+%prep
+%autosetup -n apache-flink-1.17.0
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-apache-flink -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Tue Apr 11 2023 Python_Bot <Python_Bot@openeuler.org> - 1.17.0-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..4195d42
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+6c151dd1710f8b47bb95e0988f3d1ac3 apache-flink-1.17.0.tar.gz