diff options
| author | CoprDistGit <infra@openeuler.org> | 2023-04-10 16:16:22 +0000 |
|---|---|---|
| committer | CoprDistGit <infra@openeuler.org> | 2023-04-10 16:16:22 +0000 |
| commit | 7cf4b05f740226cd8f6cc4c073a7eee14d0d9170 (patch) | |
| tree | 8d497f2764abab8bd5a73541ee2fe2341efd2171 /python-azfs.spec | |
| parent | 152e43719905bcbdf96cb818206f49722b40f4e0 (diff) | |
automatic import of python-azfs
Diffstat (limited to 'python-azfs.spec')
| -rw-r--r-- | python-azfs.spec | 542 |
1 file changed, 542 insertions, 0 deletions
diff --git a/python-azfs.spec b/python-azfs.spec
new file mode 100644
index 0000000..07d0fde
--- /dev/null
+++ b/python-azfs.spec
@@ -0,0 +1,542 @@
+%global _empty_manifest_terminate_build 0
+Name: python-azfs
+Version: 0.2.14
+Release: 1
+Summary: Convenient Python read/write functions for Azure Storage Accounts
+License: MIT
+URL: https://github.com/gsy0911/azfs
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/88/f6/ef1d5eb6bc4c111d1f3a8178229d6b61f7c40decb6b53a639cf656fce229/azfs-0.2.14.tar.gz
+BuildArch: noarch
+
+Requires: python3-pandas
+Requires: python3-azure-cosmosdb-table
+Requires: python3-azure-identity
+Requires: python3-azure-storage-blob
+Requires: python3-azure-storage-file-datalake
+Requires: python3-azure-storage-queue
+Requires: python3-fsspec
+Requires: python3-click
+
+%description
+# AzFS
+
+[pytest](https://github.com/gsy0911/azfs/actions?query=workflow%3Apytest)
+[codecov](https://codecov.io/gh/gsy0911/azfs)
+[lgtm](https://lgtm.com/projects/g/gsy0911/azfs/context:python)
+[documentation](https://azfs.readthedocs.io/en/latest/?badge=latest)
+[python](https://www.python.org/downloads/release/python-377/)
+[PyPI](https://pypi.org/project/azfs/)
+[downloads](https://pepy.tech/project/azfs)
+
+AzFS provides convenient Python read/write functions for Azure Storage Accounts.
+
+`AzFS` can
+
+* list files in blob (also with wildcard `*`),
+* check if a file exists,
+* read csv as pd.DataFrame, and json as dict from blob,
+* write pd.DataFrame as csv, and dict as json to blob.
+
+## install
+
+```bash
+$ pip install azfs
+```
+
+## usage
+
+For `Blob` Storage:
+
+```python
+import azfs
+from azure.identity import DefaultAzureCredential
+import pandas as pd
+
+# a credential is not required if your environment is on AAD (Azure Active Directory)
+azc = azfs.AzFileClient()
+
+# a credential is required if your environment is not on AAD
+credential = "[your storage account credential]"
+# or
+credential = DefaultAzureCredential()
+azc = azfs.AzFileClient(credential=credential)
+
+# a connection_string is also supported
+connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net"
+azc = azfs.AzFileClient(connection_string=connection_string)
+
+# data paths
+csv_path = "https://testazfs.blob.core.windows.net/test_container/test_file.csv"
+
+# read csv as pd.DataFrame
+df = azc.read_csv(csv_path, index_col=0)
+# or
+with azc:
+    df = pd.read_csv_az(csv_path, header=None)
+
+# write csv
+azc.write_csv(path=csv_path, df=df)
+# or
+with azc:
+    df.to_csv_az(path=csv_path, index=False)
+
+# you can read multiple files at once
+csv_pattern_path = "https://testazfs.blob.core.windows.net/test_container/*.csv"
+df = azc.read().csv(csv_pattern_path)
+
+# to apply an additional filter or another process
+df = azc.read().apply(function=lambda x: x[x['id'] == 'AAA']).csv(csv_pattern_path)
+
+# in addition, you can use multiprocessing
+df = azc.read(use_mp=True).apply(function=lambda x: x[x['id'] == 'AAA']).csv(csv_pattern_path)
+```
+
+For `Queue` Storage:
+
+```python
+import azfs
+
+queue_url = "https://{storage_account}.queue.core.windows.net/{queue_name}"
+
+azc = azfs.AzFileClient()
+queue_message = azc.get(queue_url)
+# the message will not be deleted if `delete=False`
+# queue_message = azc.get(queue_url, delete=False)
+
+# get the message content
+queue_content = queue_message.get('content')
+```
+
+For `Table` Storage:
+
+```python
+import azfs
+
+cons = {
+    "account_name": "{storage_account_name}",
+    "account_key": "{credential}",
+    "database_name": "{database_name}"
+}
+
+table_client = azfs.TableStorageWrapper(**cons)
+
+# put data; the entity is built from the keyword arguments you pass
+table_client.put(id_="1", message="hello_world")
+
+# get data
+table_client.get(id_="1")
+```
+
+See the [documentation](https://azfs.readthedocs.io/en/latest/) for more details.
+
+### types of authorization
+
+Supported authentication types are
+
+* [Azure Active Directory (AAD) token credential](https://docs.microsoft.com/azure/storage/common/storage-auth-aad)
+* connection_string, like `DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net`
+
+### types of storage account kind
+
+The table below shows whether `AzFS` provides read/write functions for each storage account kind.
+
+| account kind | Blob | Data Lake | Queue | File | Table |
+|:--|:--:|:--:|:--:|:--:|:--:|
+| StorageV2 | O | O | O | X | O |
+| StorageV1 | O | O | O | X | O |
+| BlobStorage | O | - | - | - | - |
+
+* O: basic functions provided
+* X: not provided
+* -: storage type unavailable
+
+## dependencies
+
+```
+pandas
+azure-identity >= "1.3.1"
+azure-storage-blob >= "12.3.0"
+azure-storage-file-datalake >= "12.0.0"
+azure-storage-queue >= "12.1.1"
+azure-cosmosdb-table
+```
+
+## references
+
+* [azure-sdk-for-python/storage](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage)
+* [filesystem_spec](https://github.com/intake/filesystem_spec)
+
+%package -n python3-azfs
+Summary: Convenient Python read/write functions for Azure Storage Accounts
+Provides: python-azfs
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-azfs
+# AzFS
+
+[pytest](https://github.com/gsy0911/azfs/actions?query=workflow%3Apytest)
+[codecov](https://codecov.io/gh/gsy0911/azfs)
+[lgtm](https://lgtm.com/projects/g/gsy0911/azfs/context:python)
+[documentation](https://azfs.readthedocs.io/en/latest/?badge=latest)
+[python](https://www.python.org/downloads/release/python-377/)
+[PyPI](https://pypi.org/project/azfs/)
+[downloads](https://pepy.tech/project/azfs)
+
+AzFS provides convenient Python read/write functions for Azure Storage Accounts.
+
+`AzFS` can
+
+* list files in blob (also with wildcard `*`),
+* check if a file exists,
+* read csv as pd.DataFrame, and json as dict from blob,
+* write pd.DataFrame as csv, and dict as json to blob.
+
+## install
+
+```bash
+$ pip install azfs
+```
+
+## usage
+
+For `Blob` Storage:
+
+```python
+import azfs
+from azure.identity import DefaultAzureCredential
+import pandas as pd
+
+# a credential is not required if your environment is on AAD (Azure Active Directory)
+azc = azfs.AzFileClient()
+
+# a credential is required if your environment is not on AAD
+credential = "[your storage account credential]"
+# or
+credential = DefaultAzureCredential()
+azc = azfs.AzFileClient(credential=credential)
+
+# a connection_string is also supported
+connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net"
+azc = azfs.AzFileClient(connection_string=connection_string)
+
+# data paths
+csv_path = "https://testazfs.blob.core.windows.net/test_container/test_file.csv"
+
+# read csv as pd.DataFrame
+df = azc.read_csv(csv_path, index_col=0)
+# or
+with azc:
+    df = pd.read_csv_az(csv_path, header=None)
+
+# write csv
+azc.write_csv(path=csv_path, df=df)
+# or
+with azc:
+    df.to_csv_az(path=csv_path, index=False)
+
+# you can read multiple files at once
+csv_pattern_path = "https://testazfs.blob.core.windows.net/test_container/*.csv"
+df = azc.read().csv(csv_pattern_path)
+
+# to apply an additional filter or another process
+df = azc.read().apply(function=lambda x: x[x['id'] == 'AAA']).csv(csv_pattern_path)
+
+# in addition, you can use multiprocessing
+df = azc.read(use_mp=True).apply(function=lambda x: x[x['id'] == 'AAA']).csv(csv_pattern_path)
+```
+
+For `Queue` Storage:
+
+```python
+import azfs
+
+queue_url = "https://{storage_account}.queue.core.windows.net/{queue_name}"
+
+azc = azfs.AzFileClient()
+queue_message = azc.get(queue_url)
+# the message will not be deleted if `delete=False`
+# queue_message = azc.get(queue_url, delete=False)
+
+# get the message content
+queue_content = queue_message.get('content')
+```
+
+For `Table` Storage:
+
+```python
+import azfs
+
+cons = {
+    "account_name": "{storage_account_name}",
+    "account_key": "{credential}",
+    "database_name": "{database_name}"
+}
+
+table_client = azfs.TableStorageWrapper(**cons)
+
+# put data; the entity is built from the keyword arguments you pass
+table_client.put(id_="1", message="hello_world")
+
+# get data
+table_client.get(id_="1")
+```
+
+See the [documentation](https://azfs.readthedocs.io/en/latest/) for more details.
+
+### types of authorization
+
+Supported authentication types are
+
+* [Azure Active Directory (AAD) token credential](https://docs.microsoft.com/azure/storage/common/storage-auth-aad)
+* connection_string, like `DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net`
+
+### types of storage account kind
+
+The table below shows whether `AzFS` provides read/write functions for each storage account kind.
+
+| account kind | Blob | Data Lake | Queue | File | Table |
+|:--|:--:|:--:|:--:|:--:|:--:|
+| StorageV2 | O | O | O | X | O |
+| StorageV1 | O | O | O | X | O |
+| BlobStorage | O | - | - | - | - |
+
+* O: basic functions provided
+* X: not provided
+* -: storage type unavailable
+
+## dependencies
+
+```
+pandas
+azure-identity >= "1.3.1"
+azure-storage-blob >= "12.3.0"
+azure-storage-file-datalake >= "12.0.0"
+azure-storage-queue >= "12.1.1"
+azure-cosmosdb-table
+```
+
+## references
+
+* [azure-sdk-for-python/storage](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage)
+* [filesystem_spec](https://github.com/intake/filesystem_spec)
+
+%package help
+Summary: Development documents and examples for azfs
+Provides: python3-azfs-doc
+%description help
+# AzFS
+
+[pytest](https://github.com/gsy0911/azfs/actions?query=workflow%3Apytest)
+[codecov](https://codecov.io/gh/gsy0911/azfs)
+[lgtm](https://lgtm.com/projects/g/gsy0911/azfs/context:python)
+[documentation](https://azfs.readthedocs.io/en/latest/?badge=latest)
+[python](https://www.python.org/downloads/release/python-377/)
+[PyPI](https://pypi.org/project/azfs/)
+[downloads](https://pepy.tech/project/azfs)
+
+AzFS provides convenient Python read/write functions for Azure Storage Accounts.
+
+`AzFS` can
+
+* list files in blob (also with wildcard `*`),
+* check if a file exists,
+* read csv as pd.DataFrame, and json as dict from blob,
+* write pd.DataFrame as csv, and dict as json to blob.
+
+## install
+
+```bash
+$ pip install azfs
+```
+
+## usage
+
+For `Blob` Storage:
+
+```python
+import azfs
+from azure.identity import DefaultAzureCredential
+import pandas as pd
+
+# a credential is not required if your environment is on AAD (Azure Active Directory)
+azc = azfs.AzFileClient()
+
+# a credential is required if your environment is not on AAD
+credential = "[your storage account credential]"
+# or
+credential = DefaultAzureCredential()
+azc = azfs.AzFileClient(credential=credential)
+
+# a connection_string is also supported
+connection_string = "DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net"
+azc = azfs.AzFileClient(connection_string=connection_string)
+
+# data paths
+csv_path = "https://testazfs.blob.core.windows.net/test_container/test_file.csv"
+
+# read csv as pd.DataFrame
+df = azc.read_csv(csv_path, index_col=0)
+# or
+with azc:
+    df = pd.read_csv_az(csv_path, header=None)
+
+# write csv
+azc.write_csv(path=csv_path, df=df)
+# or
+with azc:
+    df.to_csv_az(path=csv_path, index=False)
+
+# you can read multiple files at once
+csv_pattern_path = "https://testazfs.blob.core.windows.net/test_container/*.csv"
+df = azc.read().csv(csv_pattern_path)
+
+# to apply an additional filter or another process
+df = azc.read().apply(function=lambda x: x[x['id'] == 'AAA']).csv(csv_pattern_path)
+
+# in addition, you can use multiprocessing
+df = azc.read(use_mp=True).apply(function=lambda x: x[x['id'] == 'AAA']).csv(csv_pattern_path)
+```
+
+For `Queue` Storage:
+
+```python
+import azfs
+
+queue_url = "https://{storage_account}.queue.core.windows.net/{queue_name}"
+
+azc = azfs.AzFileClient()
+queue_message = azc.get(queue_url)
+# the message will not be deleted if `delete=False`
+# queue_message = azc.get(queue_url, delete=False)
+
+# get the message content
+queue_content = queue_message.get('content')
+```
+
+For `Table` Storage:
+
+```python
+import azfs
+
+cons = {
+    "account_name": "{storage_account_name}",
+    "account_key": "{credential}",
+    "database_name": "{database_name}"
+}
+
+table_client = azfs.TableStorageWrapper(**cons)
+
+# put data; the entity is built from the keyword arguments you pass
+table_client.put(id_="1", message="hello_world")
+
+# get data
+table_client.get(id_="1")
+```
+
+See the [documentation](https://azfs.readthedocs.io/en/latest/) for more details.
+
+### types of authorization
+
+Supported authentication types are
+
+* [Azure Active Directory (AAD) token credential](https://docs.microsoft.com/azure/storage/common/storage-auth-aad)
+* connection_string, like `DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxx;EndpointSuffix=core.windows.net`
+
+### types of storage account kind
+
+The table below shows whether `AzFS` provides read/write functions for each storage account kind.
+
+| account kind | Blob | Data Lake | Queue | File | Table |
+|:--|:--:|:--:|:--:|:--:|:--:|
+| StorageV2 | O | O | O | X | O |
+| StorageV1 | O | O | O | X | O |
+| BlobStorage | O | - | - | - | - |
+
+* O: basic functions provided
+* X: not provided
+* -: storage type unavailable
+
+## dependencies
+
+```
+pandas
+azure-identity >= "1.3.1"
+azure-storage-blob >= "12.3.0"
+azure-storage-file-datalake >= "12.0.0"
+azure-storage-queue >= "12.1.1"
+azure-cosmosdb-table
+```
+
+## references
+
+* [azure-sdk-for-python/storage](https://github.com/Azure/azure-sdk-for-python/tree/master/sdk/storage)
+* [filesystem_spec](https://github.com/intake/filesystem_spec)
+
+%prep
+%autosetup -n azfs-0.2.14
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-azfs -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 0.2.14-1
+- Package Spec generated
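
The README sections in this spec address blobs with full `https://<account>.blob.core.windows.net/<container>/<key>` URLs. As a rough, standard-library-only illustration of how such a URL decomposes into the pieces the Azure SDK works with (this is a standalone sketch, not AzFS's actual parsing code; the helper name `split_blob_url` is made up for this example):

```python
from urllib.parse import urlparse

def split_blob_url(url: str) -> tuple:
    """Split an Azure Blob URL into (account, container, blob key).

    Illustrative only: assumes the host follows the public-cloud
    <account>.blob.core.windows.net convention used in the README.
    """
    parsed = urlparse(url)
    # the storage account name is the first label of the hostname
    account = parsed.netloc.split(".")[0]
    # the first path segment is the container; the rest is the blob key
    container, _, key = parsed.path.lstrip("/").partition("/")
    return account, container, key

print(split_blob_url("https://testazfs.blob.core.windows.net/test_container/test_file.csv"))
# → ('testazfs', 'test_container', 'test_file.csv')
```

Nested keys stay intact because only the first `/` after the container is split on, so `.../container/dir/file.csv` yields the key `dir/file.csv`.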
