-rw-r--r-- | .gitignore             |   1
-rw-r--r-- | python-dynamodump.spec | 656
-rw-r--r-- | sources                |   1
3 files changed, 658 insertions, 0 deletions
@@ -0,0 +1 @@ +/dynamodump-1.9.0.tar.gz diff --git a/python-dynamodump.spec b/python-dynamodump.spec new file mode 100644 index 0000000..8c732dc --- /dev/null +++ b/python-dynamodump.spec @@ -0,0 +1,656 @@ +%global _empty_manifest_terminate_build 0 +Name: python-dynamodump +Version: 1.9.0 +Release: 1 +Summary: Simple backup and restore for Amazon DynamoDB using AWS SDK for Python (boto3) +License: MIT License +URL: https://github.com/bchew/dynamodump +Source0: https://mirrors.aliyun.com/pypi/web/packages/04/85/437ad19d560f955aac8387c0dbc70b21fb4da4221e70cb4b77a5b8a5d1b5/dynamodump-1.9.0.tar.gz +BuildArch: noarch + +Requires: python3-boto3 +Requires: python3-six + +%description +# dynamodump + + + +[](https://hub.docker.com/r/bchew/dynamodump) + + +[](https://pypi.org/project/dynamodump) +[](https://pypi.org/project/dynamodump) + + +Simple backup and restore script for Amazon DynamoDB using AWS SDK for Python (boto3) to work similarly to mysqldump. + +Suitable for DynamoDB usages of smaller data volume which do not warrant the usage of AWS Data Pipeline for backup/restores/empty. + +dynamodump supports local DynamoDB instances as well (tested with [DynamoDB Local](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html)). + +## Table of Contents + +- [Installation](#installation) +- [Usage](#usage) +- [Script (unattended) usage](#script-unattended-usage) +- [Docker CLI usage](#docker-cli-usage) +- [AWS example](#aws-example) +- [Local example](#local-example) +- [Development](#development) + +## Installation + +``` +pip install dynamodump +``` + +## Usage + +``` +usage: dynamodump.py [-h] [-a {zip,tar}] [-b BUCKET] [-m {backup,restore,empty}] [-r REGION] [--host HOST] [--port PORT] [--accessKey ACCESSKEY] [--secretKey SECRETKEY] [-p PROFILE] [-s SRCTABLE] [-d DESTTABLE] + [--prefixSeparator PREFIXSEPARATOR] [--noSeparator] [--readCapacity READCAPACITY] [-t TAG] [--writeCapacity WRITECAPACITY] [--schemaOnly] [--dataOnly] [--noConfirm] [--skipThroughputUpdate] + [--dumpPath DUMPPATH] [--billingMode {PROVISIONED,PAY_PER_REQUEST}] [--log LOG] [--limit LIMIT] [-f FILTEROPTION] + +Simple DynamoDB backup/restore/empty. + +options: + -h, --help show this help message and exit + -a {zip,tar}, --archive {zip,tar} + Type of compressed archive to create. If unset, don't create archive + -b BUCKET, --bucket BUCKET + S3 bucket in which to store or retrieve backups. [must already exist] + -m {backup,restore,empty}, --mode {backup,restore,empty} + Operation to perform + -r REGION, --region REGION + AWS region to use, e.g. 'us-west-1'. Can use any region for local testing + --host HOST Host of local DynamoDB. This parameter initialises dynamodump for local DynamoDB testing [required only for local] + --port PORT Port of local DynamoDB [required only for local] + --accessKey ACCESSKEY + Access key of local DynamoDB [required only for local] + --secretKey SECRETKEY + Secret key of local DynamoDB [required only for local] + -p PROFILE, --profile PROFILE + AWS credentials file profile to use. Allows you to use a profile instead accessKey, secretKey authentication + -s SRCTABLE, --srcTable SRCTABLE + Source DynamoDB table name to backup or restore from, use 'tablename*' for wildcard prefix selection or '*' for all tables. 
Mutually exclusive with --tag + -d DESTTABLE, --destTable DESTTABLE + Destination DynamoDB table name to backup or restore to, use 'tablename*' for wildcard prefix selection (defaults to use '-' separator) [optional, defaults to source] + --prefixSeparator PREFIXSEPARATOR + Specify a different prefix separator, e.g. '.' [optional] + --noSeparator Overrides the use of a prefix separator for backup wildcard searches [optional] + --readCapacity READCAPACITY + Change the temp read capacity of the DynamoDB table to backup from [optional] + -t TAG, --tag TAG Tag to use for identifying tables to back up. Mutually exclusive with srcTable. Provided as KEY=VALUE + --writeCapacity WRITECAPACITY + Change the temp write capacity of the DynamoDB table to restore to [defaults to 25, optional] + --schemaOnly Backup or restore the schema only. Do not backup/restore data. Can be used with both backup and restore modes. Cannot be used with the --dataOnly [optional] + --dataOnly Restore data only. Do not delete/recreate schema [optional for restore] + --noConfirm Don't ask for confirmation before deleting existing schemas. + --skipThroughputUpdate + Skip updating throughput values across tables [optional] + --dumpPath DUMPPATH Directory to place and search for DynamoDB table backups (defaults to use 'dump') [optional] + --billingMode {PROVISIONED,PAY_PER_REQUEST} + Set billing mode between PROVISIONED|PAY_PER_REQUEST (defaults to use 'PROVISIONED') [optional] + --log LOG Logging level - DEBUG|INFO|WARNING|ERROR|CRITICAL [optional] + --limit LIMIT Limit option for backup, will stop the back up process after number of backed up items reaches the limit [optional] + -f FILTEROPTION, --filterOption FILTEROPTION + Filter option for backup, JSON file of which keys are ['FilterExpression', 'ExpressionAttributeNames', 'ExpressionAttributeValues'] +``` + +Backup files are stored in a 'dump' subdirectory, and are restored from there as well by default. + +## Script (unattended) usage + +As of v1.2.0, note that `--noConfirm` is required to perform data restores involving deletions without any confirmation. + +## Docker CLI usage + +``` +docker run --rm -it bchew/dynamodump -h +``` + +dynamodump container images are also published to Amazon ECR Public and GitHub Packages: + +``` +public.ecr.aws/bchew/dynamodump +ghcr.io/bchew/dynamodump +``` + +## AWS example + +Single table backup/restore: + +``` +dynamodump -m backup -r us-west-1 -s testTable + +dynamodump -m restore -r us-west-1 -s testTable +``` + +Multiple table backup/restore (assumes prefix of 'production-' of table names, use --prefixSeparator to specify a +different separator): + +``` +dynamodump -m backup -r us-west-1 -s production* + +dynamodump -m restore -r us-west-1 -s production* +``` + +The above, but between different environments (e.g. production-_ tables to development-_ tables): + +``` +dynamodump -m backup -r us-west-1 -s production* + +dynamodump -m restore -r us-west-1 -s production* -d development* +``` + +Backup all tables and restore only data (will not delete and recreate schema): + +``` +dynamodump -m backup -r us-west-1 -s "*" + +dynamodump -m restore -r us-west-1 -s "*" --dataOnly +``` + +Dump all table schemas and create the schemas (e.g. 
creating blank tables in a different AWS account): + +``` +dynamodump -m backup -r us-west-1 -p source_credentials -s "*" --schemaOnly + +dynamodump -m restore -r us-west-1 -p destination_credentials -s "*" --schemaOnly +``` + +Backup all tables based on AWS tag `key=value` + +``` +dynamodump -p profile -r us-east-1 -m backup -t KEY=VALUE +``` + +Backup all tables based on AWS tag, compress and store in specified S3 bucket. + +``` +dynamodump -p profile -r us-east-1 -m backup -a tar -b some_s3_bucket -t TAG_KEY=TAG_VALUE + +dynamodump -p profile -r us-east-1 -m backup -a zip -b some_s3_bucket -t TAG_KEY=TAG_VALUE +``` + +Restore from S3 bucket to specified destination table + +``` +## source_table identifies archive file in S3 bucket from which backup data is restored +dynamodump -a tar -b some_s3_bucket -m restore -r us-east-1 -p profile -d destination_table -s source_table +``` + +## Local example + +The following assumes your local DynamoDB is running on localhost:8000 and is accessible via 'a' as access/secret keys. +You must specify the host to get local behavior. + +``` +dynamodump -m backup -r local -s testTable --host localhost --port 8000 --accessKey a --secretKey a + +dynamodump -m restore -r local -s testTable --host localhost --port 8000 --accessKey a --secretKey a +``` + +Multiple table backup/restore as stated in the AWS examples are also available for local. + +## Development + +``` +python3 -m venv env +source env/bin/activate + +# install dev requirements +pip3 install -r requirements-dev.txt + +# one-time install of pre-commit hooks +pre-commit install +``` + + +%package -n python3-dynamodump +Summary: Simple backup and restore for Amazon DynamoDB using AWS SDK for Python (boto3) +Provides: python-dynamodump +BuildRequires: python3-devel +BuildRequires: python3-setuptools +BuildRequires: python3-pip +%description -n python3-dynamodump +# dynamodump + + + +[](https://hub.docker.com/r/bchew/dynamodump) + + +[](https://pypi.org/project/dynamodump) +[](https://pypi.org/project/dynamodump) + + +Simple backup and restore script for Amazon DynamoDB using AWS SDK for Python (boto3) to work similarly to mysqldump. + +Suitable for DynamoDB usages of smaller data volume which do not warrant the usage of AWS Data Pipeline for backup/restores/empty. + +dynamodump supports local DynamoDB instances as well (tested with [DynamoDB Local](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html)). + +## Table of Contents + +- [Installation](#installation) +- [Usage](#usage) +- [Script (unattended) usage](#script-unattended-usage) +- [Docker CLI usage](#docker-cli-usage) +- [AWS example](#aws-example) +- [Local example](#local-example) +- [Development](#development) + +## Installation + +``` +pip install dynamodump +``` + +## Usage + +``` +usage: dynamodump.py [-h] [-a {zip,tar}] [-b BUCKET] [-m {backup,restore,empty}] [-r REGION] [--host HOST] [--port PORT] [--accessKey ACCESSKEY] [--secretKey SECRETKEY] [-p PROFILE] [-s SRCTABLE] [-d DESTTABLE] + [--prefixSeparator PREFIXSEPARATOR] [--noSeparator] [--readCapacity READCAPACITY] [-t TAG] [--writeCapacity WRITECAPACITY] [--schemaOnly] [--dataOnly] [--noConfirm] [--skipThroughputUpdate] + [--dumpPath DUMPPATH] [--billingMode {PROVISIONED,PAY_PER_REQUEST}] [--log LOG] [--limit LIMIT] [-f FILTEROPTION] + +Simple DynamoDB backup/restore/empty. + +options: + -h, --help show this help message and exit + -a {zip,tar}, --archive {zip,tar} + Type of compressed archive to create. 
If unset, don't create archive + -b BUCKET, --bucket BUCKET + S3 bucket in which to store or retrieve backups. [must already exist] + -m {backup,restore,empty}, --mode {backup,restore,empty} + Operation to perform + -r REGION, --region REGION + AWS region to use, e.g. 'us-west-1'. Can use any region for local testing + --host HOST Host of local DynamoDB. This parameter initialises dynamodump for local DynamoDB testing [required only for local] + --port PORT Port of local DynamoDB [required only for local] + --accessKey ACCESSKEY + Access key of local DynamoDB [required only for local] + --secretKey SECRETKEY + Secret key of local DynamoDB [required only for local] + -p PROFILE, --profile PROFILE + AWS credentials file profile to use. Allows you to use a profile instead accessKey, secretKey authentication + -s SRCTABLE, --srcTable SRCTABLE + Source DynamoDB table name to backup or restore from, use 'tablename*' for wildcard prefix selection or '*' for all tables. Mutually exclusive with --tag + -d DESTTABLE, --destTable DESTTABLE + Destination DynamoDB table name to backup or restore to, use 'tablename*' for wildcard prefix selection (defaults to use '-' separator) [optional, defaults to source] + --prefixSeparator PREFIXSEPARATOR + Specify a different prefix separator, e.g. '.' [optional] + --noSeparator Overrides the use of a prefix separator for backup wildcard searches [optional] + --readCapacity READCAPACITY + Change the temp read capacity of the DynamoDB table to backup from [optional] + -t TAG, --tag TAG Tag to use for identifying tables to back up. Mutually exclusive with srcTable. Provided as KEY=VALUE + --writeCapacity WRITECAPACITY + Change the temp write capacity of the DynamoDB table to restore to [defaults to 25, optional] + --schemaOnly Backup or restore the schema only. Do not backup/restore data. Can be used with both backup and restore modes. Cannot be used with the --dataOnly [optional] + --dataOnly Restore data only. Do not delete/recreate schema [optional for restore] + --noConfirm Don't ask for confirmation before deleting existing schemas. + --skipThroughputUpdate + Skip updating throughput values across tables [optional] + --dumpPath DUMPPATH Directory to place and search for DynamoDB table backups (defaults to use 'dump') [optional] + --billingMode {PROVISIONED,PAY_PER_REQUEST} + Set billing mode between PROVISIONED|PAY_PER_REQUEST (defaults to use 'PROVISIONED') [optional] + --log LOG Logging level - DEBUG|INFO|WARNING|ERROR|CRITICAL [optional] + --limit LIMIT Limit option for backup, will stop the back up process after number of backed up items reaches the limit [optional] + -f FILTEROPTION, --filterOption FILTEROPTION + Filter option for backup, JSON file of which keys are ['FilterExpression', 'ExpressionAttributeNames', 'ExpressionAttributeValues'] +``` + +Backup files are stored in a 'dump' subdirectory, and are restored from there as well by default. + +## Script (unattended) usage + +As of v1.2.0, note that `--noConfirm` is required to perform data restores involving deletions without any confirmation. 
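
For example, a minimal sketch of an unattended restore run from cron or CI could look like the following (the table name and region are illustrative, mirroring the AWS examples below):

```
dynamodump -m restore -r us-west-1 -s testTable --noConfirm
```
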
+ +## Docker CLI usage + +``` +docker run --rm -it bchew/dynamodump -h +``` + +dynamodump container images are also published to Amazon ECR Public and GitHub Packages: + +``` +public.ecr.aws/bchew/dynamodump +ghcr.io/bchew/dynamodump +``` + +## AWS example + +Single table backup/restore: + +``` +dynamodump -m backup -r us-west-1 -s testTable + +dynamodump -m restore -r us-west-1 -s testTable +``` + +Multiple table backup/restore (assumes prefix of 'production-' of table names, use --prefixSeparator to specify a +different separator): + +``` +dynamodump -m backup -r us-west-1 -s production* + +dynamodump -m restore -r us-west-1 -s production* +``` + +The above, but between different environments (e.g. production-_ tables to development-_ tables): + +``` +dynamodump -m backup -r us-west-1 -s production* + +dynamodump -m restore -r us-west-1 -s production* -d development* +``` + +Backup all tables and restore only data (will not delete and recreate schema): + +``` +dynamodump -m backup -r us-west-1 -s "*" + +dynamodump -m restore -r us-west-1 -s "*" --dataOnly +``` + +Dump all table schemas and create the schemas (e.g. creating blank tables in a different AWS account): + +``` +dynamodump -m backup -r us-west-1 -p source_credentials -s "*" --schemaOnly + +dynamodump -m restore -r us-west-1 -p destination_credentials -s "*" --schemaOnly +``` + +Backup all tables based on AWS tag `key=value` + +``` +dynamodump -p profile -r us-east-1 -m backup -t KEY=VALUE +``` + +Backup all tables based on AWS tag, compress and store in specified S3 bucket. + +``` +dynamodump -p profile -r us-east-1 -m backup -a tar -b some_s3_bucket -t TAG_KEY=TAG_VALUE + +dynamodump -p profile -r us-east-1 -m backup -a zip -b some_s3_bucket -t TAG_KEY=TAG_VALUE +``` + +Restore from S3 bucket to specified destination table + +``` +## source_table identifies archive file in S3 bucket from which backup data is restored +dynamodump -a tar -b some_s3_bucket -m restore -r us-east-1 -p profile -d destination_table -s source_table +``` + +## Local example + +The following assumes your local DynamoDB is running on localhost:8000 and is accessible via 'a' as access/secret keys. +You must specify the host to get local behavior. + +``` +dynamodump -m backup -r local -s testTable --host localhost --port 8000 --accessKey a --secretKey a + +dynamodump -m restore -r local -s testTable --host localhost --port 8000 --accessKey a --secretKey a +``` + +Multiple table backup/restore as stated in the AWS examples are also available for local. + +## Development + +``` +python3 -m venv env +source env/bin/activate + +# install dev requirements +pip3 install -r requirements-dev.txt + +# one-time install of pre-commit hooks +pre-commit install +``` + + +%package help +Summary: Development documents and examples for dynamodump +Provides: python3-dynamodump-doc +%description help +# dynamodump + + + +[](https://hub.docker.com/r/bchew/dynamodump) + + +[](https://pypi.org/project/dynamodump) +[](https://pypi.org/project/dynamodump) + + +Simple backup and restore script for Amazon DynamoDB using AWS SDK for Python (boto3) to work similarly to mysqldump. + +Suitable for DynamoDB usages of smaller data volume which do not warrant the usage of AWS Data Pipeline for backup/restores/empty. + +dynamodump supports local DynamoDB instances as well (tested with [DynamoDB Local](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html)). 
+ +## Table of Contents + +- [Installation](#installation) +- [Usage](#usage) +- [Script (unattended) usage](#script-unattended-usage) +- [Docker CLI usage](#docker-cli-usage) +- [AWS example](#aws-example) +- [Local example](#local-example) +- [Development](#development) + +## Installation + +``` +pip install dynamodump +``` + +## Usage + +``` +usage: dynamodump.py [-h] [-a {zip,tar}] [-b BUCKET] [-m {backup,restore,empty}] [-r REGION] [--host HOST] [--port PORT] [--accessKey ACCESSKEY] [--secretKey SECRETKEY] [-p PROFILE] [-s SRCTABLE] [-d DESTTABLE] + [--prefixSeparator PREFIXSEPARATOR] [--noSeparator] [--readCapacity READCAPACITY] [-t TAG] [--writeCapacity WRITECAPACITY] [--schemaOnly] [--dataOnly] [--noConfirm] [--skipThroughputUpdate] + [--dumpPath DUMPPATH] [--billingMode {PROVISIONED,PAY_PER_REQUEST}] [--log LOG] [--limit LIMIT] [-f FILTEROPTION] + +Simple DynamoDB backup/restore/empty. + +options: + -h, --help show this help message and exit + -a {zip,tar}, --archive {zip,tar} + Type of compressed archive to create. If unset, don't create archive + -b BUCKET, --bucket BUCKET + S3 bucket in which to store or retrieve backups. [must already exist] + -m {backup,restore,empty}, --mode {backup,restore,empty} + Operation to perform + -r REGION, --region REGION + AWS region to use, e.g. 'us-west-1'. Can use any region for local testing + --host HOST Host of local DynamoDB. This parameter initialises dynamodump for local DynamoDB testing [required only for local] + --port PORT Port of local DynamoDB [required only for local] + --accessKey ACCESSKEY + Access key of local DynamoDB [required only for local] + --secretKey SECRETKEY + Secret key of local DynamoDB [required only for local] + -p PROFILE, --profile PROFILE + AWS credentials file profile to use. Allows you to use a profile instead accessKey, secretKey authentication + -s SRCTABLE, --srcTable SRCTABLE + Source DynamoDB table name to backup or restore from, use 'tablename*' for wildcard prefix selection or '*' for all tables. Mutually exclusive with --tag + -d DESTTABLE, --destTable DESTTABLE + Destination DynamoDB table name to backup or restore to, use 'tablename*' for wildcard prefix selection (defaults to use '-' separator) [optional, defaults to source] + --prefixSeparator PREFIXSEPARATOR + Specify a different prefix separator, e.g. '.' [optional] + --noSeparator Overrides the use of a prefix separator for backup wildcard searches [optional] + --readCapacity READCAPACITY + Change the temp read capacity of the DynamoDB table to backup from [optional] + -t TAG, --tag TAG Tag to use for identifying tables to back up. Mutually exclusive with srcTable. Provided as KEY=VALUE + --writeCapacity WRITECAPACITY + Change the temp write capacity of the DynamoDB table to restore to [defaults to 25, optional] + --schemaOnly Backup or restore the schema only. Do not backup/restore data. Can be used with both backup and restore modes. Cannot be used with the --dataOnly [optional] + --dataOnly Restore data only. Do not delete/recreate schema [optional for restore] + --noConfirm Don't ask for confirmation before deleting existing schemas. 
+ --skipThroughputUpdate + Skip updating throughput values across tables [optional] + --dumpPath DUMPPATH Directory to place and search for DynamoDB table backups (defaults to use 'dump') [optional] + --billingMode {PROVISIONED,PAY_PER_REQUEST} + Set billing mode between PROVISIONED|PAY_PER_REQUEST (defaults to use 'PROVISIONED') [optional] + --log LOG Logging level - DEBUG|INFO|WARNING|ERROR|CRITICAL [optional] + --limit LIMIT Limit option for backup, will stop the back up process after number of backed up items reaches the limit [optional] + -f FILTEROPTION, --filterOption FILTEROPTION + Filter option for backup, JSON file of which keys are ['FilterExpression', 'ExpressionAttributeNames', 'ExpressionAttributeValues'] +``` + +Backup files are stored in a 'dump' subdirectory, and are restored from there as well by default. + +## Script (unattended) usage + +As of v1.2.0, note that `--noConfirm` is required to perform data restores involving deletions without any confirmation. + +## Docker CLI usage + +``` +docker run --rm -it bchew/dynamodump -h +``` + +dynamodump container images are also published to Amazon ECR Public and GitHub Packages: + +``` +public.ecr.aws/bchew/dynamodump +ghcr.io/bchew/dynamodump +``` + +## AWS example + +Single table backup/restore: + +``` +dynamodump -m backup -r us-west-1 -s testTable + +dynamodump -m restore -r us-west-1 -s testTable +``` + +Multiple table backup/restore (assumes prefix of 'production-' of table names, use --prefixSeparator to specify a +different separator): + +``` +dynamodump -m backup -r us-west-1 -s production* + +dynamodump -m restore -r us-west-1 -s production* +``` + +The above, but between different environments (e.g. production-_ tables to development-_ tables): + +``` +dynamodump -m backup -r us-west-1 -s production* + +dynamodump -m restore -r us-west-1 -s production* -d development* +``` + +Backup all tables and restore only data (will not delete and recreate schema): + +``` +dynamodump -m backup -r us-west-1 -s "*" + +dynamodump -m restore -r us-west-1 -s "*" --dataOnly +``` + +Dump all table schemas and create the schemas (e.g. creating blank tables in a different AWS account): + +``` +dynamodump -m backup -r us-west-1 -p source_credentials -s "*" --schemaOnly + +dynamodump -m restore -r us-west-1 -p destination_credentials -s "*" --schemaOnly +``` + +Backup all tables based on AWS tag `key=value` + +``` +dynamodump -p profile -r us-east-1 -m backup -t KEY=VALUE +``` + +Backup all tables based on AWS tag, compress and store in specified S3 bucket. + +``` +dynamodump -p profile -r us-east-1 -m backup -a tar -b some_s3_bucket -t TAG_KEY=TAG_VALUE + +dynamodump -p profile -r us-east-1 -m backup -a zip -b some_s3_bucket -t TAG_KEY=TAG_VALUE +``` + +Restore from S3 bucket to specified destination table + +``` +## source_table identifies archive file in S3 bucket from which backup data is restored +dynamodump -a tar -b some_s3_bucket -m restore -r us-east-1 -p profile -d destination_table -s source_table +``` + +## Local example + +The following assumes your local DynamoDB is running on localhost:8000 and is accessible via 'a' as access/secret keys. +You must specify the host to get local behavior. + +``` +dynamodump -m backup -r local -s testTable --host localhost --port 8000 --accessKey a --secretKey a + +dynamodump -m restore -r local -s testTable --host localhost --port 8000 --accessKey a --secretKey a +``` + +Multiple table backup/restore as stated in the AWS examples are also available for local. 
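
For instance, a wildcard backup against the same local endpoint might look like this sketch (the 'production' table prefix is illustrative):

```
dynamodump -m backup -r local -s "production*" --host localhost --port 8000 --accessKey a --secretKey a
```
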
+ +## Development + +``` +python3 -m venv env +source env/bin/activate + +# install dev requirements +pip3 install -r requirements-dev.txt + +# one-time install of pre-commit hooks +pre-commit install +``` + + +%prep +%autosetup -n dynamodump-1.9.0 + +%build +%py3_build + +%install +%py3_install +install -d -m755 %{buildroot}/%{_pkgdocdir} +if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi +if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi +if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi +if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi +pushd %{buildroot} +if [ -d usr/lib ]; then + find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst +fi +if [ -d usr/lib64 ]; then + find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst +fi +if [ -d usr/bin ]; then + find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst +fi +if [ -d usr/sbin ]; then + find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst +fi +touch doclist.lst +if [ -d usr/share/man ]; then + find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst +fi +popd +mv %{buildroot}/filelist.lst . +mv %{buildroot}/doclist.lst . + +%files -n python3-dynamodump -f filelist.lst +%dir %{python3_sitelib}/* + +%files help -f doclist.lst +%{_docdir}/* + +%changelog +* Tue Jun 20 2023 Python_Bot <Python_Bot@openeuler.org> - 1.9.0-1 +- Package Spec generated @@ -0,0 +1 @@ +368520568271cf9bda2a255d82daa482 dynamodump-1.9.0.tar.gz |