%global _empty_manifest_terminate_build 0
Name:		python-gs-chunked-io
Version:	0.5.2
Release:	1
Summary:	Streaming reads/writes to Google Storage blobs with asynchronous buffering.
License:	MIT
URL:		https://github.com/xbrianh/gs-chunked-io.git
Source0:	https://mirrors.nju.edu.cn/pypi/web/packages/aa/76/5f86e2c4cc5d09c74e39f633a01bb75c3ad78d59e00b90105bb51e611288/gs-chunked-io-0.5.2.tar.gz
BuildArch:	noarch

%description
# gs-chunked-io: Streams for Google Storage
_gs-chunked-io_ provides transparently chunked IO streams for Google Storage objects.
Writable streams are managed as multipart objects and composed when the stream is closed.
IO operations are concurrent by default. The number of concurrent threads can be adjusted
with the `threads` parameter, or concurrency can be disabled entirely with `threads=None`.
```
import gs_chunked_io as gscio
from google.cloud.storage import Client

client = Client()
bucket = client.bucket("my-bucket")
blob = bucket.get_blob("my-key")

# Readable stream:
with gscio.Reader(blob) as fh:
    fh.read(size)

# Writable stream:
with gscio.Writer("my_new_key", bucket) as fh:
    fh.write(data)

# Process blob in chunks:
for chunk in gscio.for_each_chunk(blob):
    my_chunk_processor(chunk)

# Multipart copy with processing:
dst_bucket = client.bucket("my_dest_bucket")
with gscio.Writer("my_dest_key", dst_bucket) as writer:
    for chunk in gscio.for_each_chunk(blob):
        process_my_chunk(chunk)
        writer.write(chunk)

# Extract .tar.gz on the fly:
import gzip
import tarfile
with gscio.Reader(blob) as fh:
    gzip_reader = gzip.GzipFile(fileobj=fh)
    tf = tarfile.TarFile(fileobj=gzip_reader)
    for tarinfo in tf:
        process_my_tarinfo(tarinfo)
```

## Installation
```
pip install gs-chunked-io
```

## Links
Project home page [GitHub](https://github.com/xbrianh/gs-chunked-io)
Package distribution [PyPI](https://pypi.org/project/gs-chunked-io/)

### Bugs
Please report bugs, issues, feature requests, etc. on [GitHub](https://github.com/xbrianh/gs-chunked-io).

![](https://travis-ci.org/xbrianh/gs-chunked-io.svg?branch=master) ![](https://badge.fury.io/py/gs-chunked-io.svg)

%package -n python3-gs-chunked-io
Summary:	Streaming reads/writes to Google Storage blobs with asynchronous buffering.
Provides:	python-gs-chunked-io
BuildRequires:	python3-devel
BuildRequires:	python3-setuptools
BuildRequires:	python3-pip
%description -n python3-gs-chunked-io
# gs-chunked-io: Streams for Google Storage
_gs-chunked-io_ provides transparently chunked IO streams for Google Storage objects.
Writable streams are managed as multipart objects and composed when the stream is closed.
IO operations are concurrent by default. The number of concurrent threads can be adjusted
with the `threads` parameter, or concurrency can be disabled entirely with `threads=None`.
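As a minimal sketch of tuning that concurrency (assuming `Reader` and `Writer` accept the `threads` keyword described above; the exact signature may differ):
```
import gs_chunked_io as gscio
from google.cloud.storage import Client

bucket = Client().bucket("my-bucket")
blob = bucket.get_blob("my-key")

# Assumed keyword: limit the read buffer to four background threads.
with gscio.Reader(blob, threads=4) as fh:
    data = fh.read()

# Assumed keyword: threads=None disables concurrent buffering for this writer.
with gscio.Writer("my_new_key", bucket, threads=None) as fh:
    fh.write(data)
```
The general usage examples below rely on the default concurrency.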
```
import gs_chunked_io as gscio
from google.cloud.storage import Client

client = Client()
bucket = client.bucket("my-bucket")
blob = bucket.get_blob("my-key")

# Readable stream:
with gscio.Reader(blob) as fh:
    fh.read(size)

# Writable stream:
with gscio.Writer("my_new_key", bucket) as fh:
    fh.write(data)

# Process blob in chunks:
for chunk in gscio.for_each_chunk(blob):
    my_chunk_processor(chunk)

# Multipart copy with processing:
dst_bucket = client.bucket("my_dest_bucket")
with gscio.Writer("my_dest_key", dst_bucket) as writer:
    for chunk in gscio.for_each_chunk(blob):
        process_my_chunk(chunk)
        writer.write(chunk)

# Extract .tar.gz on the fly:
import gzip
import tarfile
with gscio.Reader(blob) as fh:
    gzip_reader = gzip.GzipFile(fileobj=fh)
    tf = tarfile.TarFile(fileobj=gzip_reader)
    for tarinfo in tf:
        process_my_tarinfo(tarinfo)
```

## Installation
```
pip install gs-chunked-io
```

## Links
Project home page [GitHub](https://github.com/xbrianh/gs-chunked-io)
Package distribution [PyPI](https://pypi.org/project/gs-chunked-io/)

### Bugs
Please report bugs, issues, feature requests, etc. on [GitHub](https://github.com/xbrianh/gs-chunked-io).

![](https://travis-ci.org/xbrianh/gs-chunked-io.svg?branch=master) ![](https://badge.fury.io/py/gs-chunked-io.svg)

%package help
Summary:	Development documents and examples for gs-chunked-io
Provides:	python3-gs-chunked-io-doc
%description help
# gs-chunked-io: Streams for Google Storage
_gs-chunked-io_ provides transparently chunked IO streams for Google Storage objects.
Writable streams are managed as multipart objects and composed when the stream is closed.
IO operations are concurrent by default. The number of concurrent threads can be adjusted
with the `threads` parameter, or concurrency can be disabled entirely with `threads=None`.
```
import gs_chunked_io as gscio
from google.cloud.storage import Client

client = Client()
bucket = client.bucket("my-bucket")
blob = bucket.get_blob("my-key")

# Readable stream:
with gscio.Reader(blob) as fh:
    fh.read(size)

# Writable stream:
with gscio.Writer("my_new_key", bucket) as fh:
    fh.write(data)

# Process blob in chunks:
for chunk in gscio.for_each_chunk(blob):
    my_chunk_processor(chunk)

# Multipart copy with processing:
dst_bucket = client.bucket("my_dest_bucket")
with gscio.Writer("my_dest_key", dst_bucket) as writer:
    for chunk in gscio.for_each_chunk(blob):
        process_my_chunk(chunk)
        writer.write(chunk)

# Extract .tar.gz on the fly:
import gzip
import tarfile
with gscio.Reader(blob) as fh:
    gzip_reader = gzip.GzipFile(fileobj=fh)
    tf = tarfile.TarFile(fileobj=gzip_reader)
    for tarinfo in tf:
        process_my_tarinfo(tarinfo)
```

## Installation
```
pip install gs-chunked-io
```

## Links
Project home page [GitHub](https://github.com/xbrianh/gs-chunked-io)
Package distribution [PyPI](https://pypi.org/project/gs-chunked-io/)

### Bugs
Please report bugs, issues, feature requests, etc. on [GitHub](https://github.com/xbrianh/gs-chunked-io).
![](https://travis-ci.org/xbrianh/gs-chunked-io.svg?branch=master) ![](https://badge.fury.io/py/gs-chunked-io.svg)

%prep
%autosetup -n gs-chunked-io-0.5.2

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
	find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
	find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
	find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
	find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
	find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-gs-chunked-io -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Mon May 29 2023 Python_Bot - 0.5.2-1
- Package Spec generated