%global _empty_manifest_terminate_build 0
Name:           python-delta-spark
Version:        2.3.0
Release:        1
Summary:        Python APIs for using Delta Lake with Apache Spark
License:        Apache-2.0
URL:            https://github.com/delta-io/delta/
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/f9/7a/e74098331739fd854ef34afcbc4493a6056bf02f13d5187380938cb151c6/delta-spark-2.3.0.tar.gz
BuildArch:      noarch

Requires:       python3-pyspark
Requires:       python3-importlib-metadata

%description
# Delta Lake

[Delta Lake](https://delta.io) is an open source storage layer that brings reliability to data lakes.
Delta Lake provides ACID transactions, scalable metadata handling, and unifies streaming and batch
data processing. Delta Lake runs on top of your existing data lake and is fully compatible with
Apache Spark APIs.

This PyPI package contains the Python APIs for using Delta Lake with Apache Spark.

## Installation and usage

1. Install using `pip install delta-spark`.
2. To use Delta Lake with Apache Spark, you have to set additional configurations when creating the
   SparkSession. See the online [project web page](https://docs.delta.io/latest/delta-intro.html) for details.

## Documentation

This README file only contains basic information related to the pip-installed Delta Lake.
You can find the full documentation on the [project web page](https://docs.delta.io/latest/delta-intro.html).

%package -n python3-delta-spark
Summary:        Python APIs for using Delta Lake with Apache Spark
Provides:       python-delta-spark
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip
%description -n python3-delta-spark
# Delta Lake

[Delta Lake](https://delta.io) is an open source storage layer that brings reliability to data lakes.
Delta Lake provides ACID transactions, scalable metadata handling, and unifies streaming and batch
data processing. Delta Lake runs on top of your existing data lake and is fully compatible with
Apache Spark APIs.

This PyPI package contains the Python APIs for using Delta Lake with Apache Spark.

## Installation and usage

1. Install using `pip install delta-spark`.
2. To use Delta Lake with Apache Spark, you have to set additional configurations when creating the
   SparkSession. See the online [project web page](https://docs.delta.io/latest/delta-intro.html) for details.

## Documentation

This README file only contains basic information related to the pip-installed Delta Lake.
You can find the full documentation on the [project web page](https://docs.delta.io/latest/delta-intro.html).

%package help
Summary:        Development documents and examples for delta-spark
Provides:       python3-delta-spark-doc
%description help
# Delta Lake

[Delta Lake](https://delta.io) is an open source storage layer that brings reliability to data lakes.
Delta Lake provides ACID transactions, scalable metadata handling, and unifies streaming and batch
data processing. Delta Lake runs on top of your existing data lake and is fully compatible with
Apache Spark APIs.

This PyPI package contains the Python APIs for using Delta Lake with Apache Spark.

## Installation and usage

1. Install using `pip install delta-spark`.
2. To use Delta Lake with Apache Spark, you have to set additional configurations when creating the
   SparkSession. See the online [project web page](https://docs.delta.io/latest/delta-intro.html) for
   details; a minimal example is sketched below.

## Documentation

This README file only contains basic information related to the pip-installed Delta Lake.
You can find the full documentation on the [project web page](https://docs.delta.io/latest/delta-intro.html).
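
The sketch below illustrates the SparkSession setup mentioned in step 2, using the configuration
keys documented in the Delta Lake quickstart; the application name and table path are placeholders,
not part of this package.

```python
# Minimal sketch: create a SparkSession configured for Delta Lake.
# "my-app" and "/tmp/delta-table" are illustrative placeholders.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("my-app")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)

# configure_spark_with_delta_pip() adds the matching Delta Lake JARs to the
# builder so the pip-installed Python APIs can find them at runtime.
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write and read back a small Delta table as a smoke test.
spark.range(5).write.format("delta").mode("overwrite").save("/tmp/delta-table")
spark.read.format("delta").load("/tmp/delta-table").show()
```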

%prep
%autosetup -n delta-spark-2.3.0

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-delta-spark -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Fri Apr 21 2023 Python_Bot <Python_Bot@openeuler.org> - 2.3.0-1
- Package Spec generated