%global _empty_manifest_terminate_build 0
Name:           python-s3-concat
Version:        0.2.3
Release:        1
Summary:        Concat files in s3
License:        MIT
URL:            https://github.com/xtream1101/s3-concat
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/e2/83/ee700da614ed6746d985b76245d604950cad63558f76cc220fbe8b29d02f/s3-concat-0.2.3.tar.gz
BuildArch:      noarch

%description
# Python S3 Concat

[![PyPI](https://img.shields.io/pypi/v/s3-concat.svg)](https://pypi.python.org/pypi/s3-concat)
[![PyPI](https://img.shields.io/pypi/l/s3-concat.svg)](https://pypi.python.org/pypi/s3-concat)

S3 Concat is used to concatenate many small files in an S3 bucket into fewer, larger files.

## Install
`pip install s3-concat`

## Usage

### Command Line
`$ s3-concat -h`

### Import
```python
from s3_concat import S3Concat

bucket = 'YOUR_BUCKET_NAME'
path_to_concat = 'PATH_TO_FILES_TO_CONCAT'
concatenated_file = 'FILE_TO_SAVE_TO.json'
# Setting this to a size will always add a part number at the end of the file name
min_file_size = '50MB'  # ex: FILE_TO_SAVE_TO-1.json, FILE_TO_SAVE_TO-2.json, ...
# Setting this to None will concat all files into a single file
# min_file_size = None  # ex: FILE_TO_SAVE_TO.json

# Init the job
job = S3Concat(bucket, concatenated_file, min_file_size,
               content_type='application/json',
               # session=boto3.session.Session(),  # For a custom aws session
               # s3_client_kwargs={}  # Use to pass arguments allowed by the s3 client: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
               )
# Add files; can be called multiple times to add files from other directories
job.add_files(path_to_concat)
# Add a single file at a time
job.add_file('some/file_key.json')
# Only use small_parts_threads if you need to. See the Advanced Usage section below.
job.concat(small_parts_threads=4)
```

## Advanced Usage

Depending on your use case, you may want to use `small_parts_threads`.

- `small_parts_threads` is only used when the files you are trying to concat are smaller than 5MB. Due to the limitations of the S3 multipart upload API (see *Limitations* below), any file smaller than 5MB has to be downloaded locally, concatenated with the others, and then re-uploaded. Setting this thread count downloads those small parts in parallel, which speeds up the concatenation process. The right value depends on your use case and the system you are running this on.

## Limitations

This uses the multipart upload API of S3 and is subject to its limits: https://docs.aws.amazon.com/AmazonS3/latest/dev/qfacts.html
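
To illustrate the mechanism behind this limit, the following sketch uses boto3 directly (not s3-concat's internal code) to join existing objects server-side with a multipart upload. The bucket and key names are placeholders, and it assumes every source object except the last is at least 5MB, since S3 rejects smaller non-final parts.

```python
# Illustrative only: server-side concatenation via S3 multipart upload,
# the same S3 feature s3-concat builds on. Not the library's own code.
import boto3

s3 = boto3.client('s3')
bucket = 'YOUR_BUCKET_NAME'                 # placeholder
source_keys = ['logs/part-0001.json',       # placeholders; each >= 5MB
               'logs/part-0002.json']
target_key = 'logs/combined.json'           # placeholder

mpu = s3.create_multipart_upload(Bucket=bucket, Key=target_key,
                                 ContentType='application/json')
parts = []
for number, key in enumerate(source_keys, start=1):
    # upload_part_copy copies the bytes inside S3; nothing is downloaded
    resp = s3.upload_part_copy(
        Bucket=bucket, Key=target_key, UploadId=mpu['UploadId'],
        PartNumber=number, CopySource={'Bucket': bucket, 'Key': key})
    parts.append({'PartNumber': number,
                  'ETag': resp['CopyPartResult']['ETag']})

s3.complete_multipart_upload(
    Bucket=bucket, Key=target_key, UploadId=mpu['UploadId'],
    MultipartUpload={'Parts': parts})
```

Objects below the 5MB part minimum cannot be stitched together this way, which is why s3-concat falls back to downloading and re-uploading them, as described under Advanced Usage.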

%package -n python3-s3-concat
Summary:        Concat files in s3
Provides:       python-s3-concat
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip

%description -n python3-s3-concat
# Python S3 Concat

[![PyPI](https://img.shields.io/pypi/v/s3-concat.svg)](https://pypi.python.org/pypi/s3-concat)
[![PyPI](https://img.shields.io/pypi/l/s3-concat.svg)](https://pypi.python.org/pypi/s3-concat)

S3 Concat is used to concatenate many small files in an S3 bucket into fewer, larger files.

## Install
`pip install s3-concat`

## Usage

### Command Line
`$ s3-concat -h`

### Import
```python
from s3_concat import S3Concat

bucket = 'YOUR_BUCKET_NAME'
path_to_concat = 'PATH_TO_FILES_TO_CONCAT'
concatenated_file = 'FILE_TO_SAVE_TO.json'
# Setting this to a size will always add a part number at the end of the file name
min_file_size = '50MB'  # ex: FILE_TO_SAVE_TO-1.json, FILE_TO_SAVE_TO-2.json, ...
# Setting this to None will concat all files into a single file
# min_file_size = None  # ex: FILE_TO_SAVE_TO.json

# Init the job
job = S3Concat(bucket, concatenated_file, min_file_size,
               content_type='application/json',
               # session=boto3.session.Session(),  # For a custom aws session
               # s3_client_kwargs={}  # Use to pass arguments allowed by the s3 client: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
               )
# Add files; can be called multiple times to add files from other directories
job.add_files(path_to_concat)
# Add a single file at a time
job.add_file('some/file_key.json')
# Only use small_parts_threads if you need to. See the Advanced Usage section below.
job.concat(small_parts_threads=4)
```

## Advanced Usage

Depending on your use case, you may want to use `small_parts_threads`.

- `small_parts_threads` is only used when the files you are trying to concat are smaller than 5MB. Due to the limitations of the S3 multipart upload API (see *Limitations* below), any file smaller than 5MB has to be downloaded locally, concatenated with the others, and then re-uploaded. Setting this thread count downloads those small parts in parallel, which speeds up the concatenation process. The right value depends on your use case and the system you are running this on.

## Limitations

This uses the multipart upload API of S3 and is subject to its limits: https://docs.aws.amazon.com/AmazonS3/latest/dev/qfacts.html

%package help
Summary:        Development documents and examples for s3-concat
Provides:       python3-s3-concat-doc

%description help
# Python S3 Concat

[![PyPI](https://img.shields.io/pypi/v/s3-concat.svg)](https://pypi.python.org/pypi/s3-concat)
[![PyPI](https://img.shields.io/pypi/l/s3-concat.svg)](https://pypi.python.org/pypi/s3-concat)

S3 Concat is used to concatenate many small files in an S3 bucket into fewer, larger files.

## Install
`pip install s3-concat`

## Usage

### Command Line
`$ s3-concat -h`

### Import
```python
from s3_concat import S3Concat

bucket = 'YOUR_BUCKET_NAME'
path_to_concat = 'PATH_TO_FILES_TO_CONCAT'
concatenated_file = 'FILE_TO_SAVE_TO.json'
# Setting this to a size will always add a part number at the end of the file name
min_file_size = '50MB'  # ex: FILE_TO_SAVE_TO-1.json, FILE_TO_SAVE_TO-2.json, ...
# Setting this to None will concat all files into a single file
# min_file_size = None  # ex: FILE_TO_SAVE_TO.json

# Init the job
job = S3Concat(bucket, concatenated_file, min_file_size,
               content_type='application/json',
               # session=boto3.session.Session(),  # For a custom aws session
               # s3_client_kwargs={}  # Use to pass arguments allowed by the s3 client: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/core/session.html
               )
# Add files; can be called multiple times to add files from other directories
job.add_files(path_to_concat)
# Add a single file at a time
job.add_file('some/file_key.json')
# Only use small_parts_threads if you need to. See the Advanced Usage section below.
job.concat(small_parts_threads=4)
```

## Advanced Usage

Depending on your use case, you may want to use `small_parts_threads`.

- `small_parts_threads` is only used when the files you are trying to concat are smaller than 5MB. Due to the limitations of the S3 multipart upload API (see *Limitations* below), any file smaller than 5MB has to be downloaded locally, concatenated with the others, and then re-uploaded. Setting this thread count downloads those small parts in parallel, which speeds up the concatenation process. The right value depends on your use case and the system you are running this on. A sketch of this small-parts path follows below.
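
The sketch below mimics the small-parts path just described, using boto3 and a thread pool directly rather than s3-concat's internal code. The bucket, keys, and thread count are placeholders; it assumes each listed object is under 5MB.

```python
# Illustrative only: download small objects in parallel, concatenate
# locally, and re-upload as one object. Not s3-concat's own code.
from concurrent.futures import ThreadPoolExecutor

import boto3

s3 = boto3.client('s3')
bucket = 'YOUR_BUCKET_NAME'               # placeholder
small_keys = ['logs/tiny-0001.json',      # placeholders; each < 5MB
              'logs/tiny-0002.json',
              'logs/tiny-0003.json']

def download(key):
    # Each small object is fetched in full; running several threads hides
    # per-request latency, which is what small_parts_threads controls.
    return s3.get_object(Bucket=bucket, Key=key)['Body'].read()

with ThreadPoolExecutor(max_workers=4) as pool:    # ~ small_parts_threads=4
    chunks = list(pool.map(download, small_keys))  # map preserves key order

# Concatenate locally and re-upload as a single object
s3.put_object(Bucket=bucket, Key='logs/combined-small.json',
              Body=b''.join(chunks), ContentType='application/json')
```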

## Limitations

This uses the multipart upload API of S3 and is subject to its limits: https://docs.aws.amazon.com/AmazonS3/latest/dev/qfacts.html

%prep
%autosetup -n s3-concat-0.2.3

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-s3-concat -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Tue Apr 25 2023 Python_Bot - 0.2.3-1
- Package Spec generated