 .gitignore                    |   1 +
 python-databricks-client.spec | 506 +++++++++++++++++++++++++++++++++++++++++
 sources                       |   1 +
 3 files changed, 508 insertions(+), 0 deletions(-)
diff --git a/.gitignore b/.gitignore
index e69de29..fad7506 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/databricks_client-0.0.3.tar.gz
diff --git a/python-databricks-client.spec b/python-databricks-client.spec
new file mode 100644
index 0000000..eb419e5
--- /dev/null
+++ b/python-databricks-client.spec
@@ -0,0 +1,506 @@
+%global _empty_manifest_terminate_build 0
+Name: python-databricks-client
+Version: 0.0.3
+Release: 1
+Summary: REST client for Databricks
+License: MIT License
+URL: https://github.com/microsoft/DataOps
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/d3/2d/ce9b221b49889d17ecac400ac54b7e434abec6176e42fecc7cde637ed5d0/databricks_client-0.0.3.tar.gz
+BuildArch: noarch
+
+Requires: python3-requests
+Requires: python3-azure-core
+
+%description
+# databricks-client
+
+## About
+
+A REST client for the [Databricks REST API](https://docs.databricks.com/dev-tools/api/latest/index.html).
+
+This module is a thin layer for building HTTP [Requests](https://requests.readthedocs.io/en/master/).
+It does not expose API operations as distinct methods; instead, it provides generic methods
+for building arbitrary API calls.
+
+The Databricks API sometimes returns a 200 status code with HTML content when the request is
+not properly authenticated. The client detects such responses (by checking for non-JSON
+content) and wraps them in an exception.
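The detection described above amounts to something like the following sketch (the names `parse_response_text` and `DatabricksPayloadError` are illustrative, not the library's actual API):

```python
import json

class DatabricksPayloadError(Exception):
    """Raised when the API returns content that is not valid JSON."""

def parse_response_text(text):
    """Parse a response body, rejecting non-JSON content such as HTML login pages."""
    try:
        return json.loads(text)
    except ValueError:
        raise DatabricksPayloadError(f"Unexpected non-JSON response: {text[:80]!r}")
```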
+
+_This open-source project is not developed by nor affiliated with Databricks._
+
+## Installing
+
+```
+pip install databricks-client
+```
+
+## Usage
+
+```python
+import databricks_client
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_pat_token(pat_token)
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+## Usage with a newly provisioned workspace
+
+If using this module as part of a provisioning job, you need to call `client.ensure_available()`.
+
+When the first user logs in to a new Databricks workspace, workspace provisioning is triggered,
+and the API is not available until that job has completed (usually under a minute, but
+potentially longer depending on the network configuration). Until provisioning finishes, API
+calls fail with an error such as:
+
+```
+"Succeeded{"error_code":"INVALID_PARAMETER_VALUE","message":"Unknown worker environment WorkerEnvId(workerenv-4312344789891641)"}
+```
+
+The method `client.ensure_available(url="instance-pools/list", retries=100, delay_seconds=6)`
+prevents this error by polling the provided URL, retrying as long as the workspace is in a
+provisioning state or until the given number of retries is exhausted.
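The polling behavior can be pictured with this minimal sketch (hypothetical helper, not the library's implementation; `get_fn` stands in for the client's `get`):

```python
import time

def ensure_available_sketch(get_fn, url="instance-pools/list", retries=100, delay_seconds=6):
    """Poll the API until a request succeeds or the retry budget is exhausted."""
    last_error = None
    for _ in range(retries):
        try:
            get_fn(url)  # any successful JSON response means the workspace is up
            return
        except Exception as exc:  # provisioning-phase errors surface as exceptions
            last_error = exc
            time.sleep(delay_seconds)
    raise last_error  # give up after the final retry
```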
+
+## Usage with Azure Active Directory
+
+Note: Azure AD authentication for Databricks is currently in preview.
+
+The client generates short-lived Azure AD tokens. If you need to use your client for longer
+than the token lifetime (typically 30 minutes), rerun `client.auth_azuread` periodically.
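For long-running jobs, one simple pattern (a sketch, not part of the library) is to re-authenticate before the token's typical 30-minute lifetime expires:

```python
import time

TOKEN_LIFETIME_SECONDS = 25 * 60  # refresh comfortably before the ~30-minute expiry

class ReauthingCaller:
    """Wraps a client and re-runs auth_azuread when the token is getting old."""

    def __init__(self, client, resource_id):
        self.client = client
        self.resource_id = resource_id
        self.authed_at = None  # monotonic timestamp of the last authentication

    def get(self, url):
        now = time.monotonic()
        if self.authed_at is None or now - self.authed_at > TOKEN_LIFETIME_SECONDS:
            self.client.auth_azuread(self.resource_id)
            self.authed_at = now
        return self.client.get(url)
```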
+
+### Azure AD authentication with Azure CLI
+
+[Install the Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest).
+
+```
+pip install databricks-client[azurecli]
+az login
+```
+
+```python
+import databricks_client
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace")
+# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace")
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+This approach is recommended for Azure DevOps Pipelines, using the [Azure CLI task](https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-cli?view=azure-devops).
+
+### Azure AD authentication with ADAL
+
+```
+pip install databricks-client
+pip install adal
+```
+
+```python
+import databricks_client
+import adal
+
+authority_host_uri = 'https://login.microsoftonline.com'
+authority_uri = authority_host_uri + '/' + tenant_id
+context = adal.AuthenticationContext(authority_uri)
+
+def token_callback(resource):
+ return context.acquire_token_with_client_credentials(resource, client_id, client_secret)["accessToken"]
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace", token_callback)
+# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace", token_callback=token_callback)
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+## Example usages
+
+### Generating a PAT token
+
+```python
+response = client.post(
+ 'token/create',
+ json={"lifetime_seconds": 60, "comment": "Unit Test Token"}
+)
+pat_token = response['token_value']
+```
+
+### Uploading a notebook
+
+```python
+import base64
+
+with open(notebook_file, "rb") as f:
+ file_content = f.read()
+
+client.post(
+ 'workspace/import',
+ json={
+ "content": base64.b64encode(file_content).decode('ascii'),
+ "path": notebook_path,
+ "overwrite": False,
+ "language": "PYTHON",
+ "format": "SOURCE"
+ }
+)
+```
+
+
+
+
+
+%package -n python3-databricks-client
+Summary: REST client for Databricks
+Provides: python-databricks-client
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-databricks-client
+# databricks-client
+
+## About
+
+A REST client for the [Databricks REST API](https://docs.databricks.com/dev-tools/api/latest/index.html).
+
+This module is a thin layer for building HTTP [Requests](https://requests.readthedocs.io/en/master/).
+It does not expose API operations as distinct methods; instead, it provides generic methods
+for building arbitrary API calls.
+
+The Databricks API sometimes returns a 200 status code with HTML content when the request is
+not properly authenticated. The client detects such responses (by checking for non-JSON
+content) and wraps them in an exception.
+
+_This open-source project is not developed by nor affiliated with Databricks._
+
+## Installing
+
+```
+pip install databricks-client
+```
+
+## Usage
+
+```python
+import databricks_client
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_pat_token(pat_token)
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+## Usage with a newly provisioned workspace
+
+If using this module as part of a provisioning job, you need to call `client.ensure_available()`.
+
+When the first user logs in to a new Databricks workspace, workspace provisioning is triggered,
+and the API is not available until that job has completed (usually under a minute, but
+potentially longer depending on the network configuration). Until provisioning finishes, API
+calls fail with an error such as:
+
+```
+"Succeeded{"error_code":"INVALID_PARAMETER_VALUE","message":"Unknown worker environment WorkerEnvId(workerenv-4312344789891641)"}
+```
+
+The method `client.ensure_available(url="instance-pools/list", retries=100, delay_seconds=6)`
+prevents this error by polling the provided URL, retrying as long as the workspace is in a
+provisioning state or until the given number of retries is exhausted.
+
+## Usage with Azure Active Directory
+
+Note: Azure AD authentication for Databricks is currently in preview.
+
+The client generates short-lived Azure AD tokens. If you need to use your client for longer
+than the token lifetime (typically 30 minutes), rerun `client.auth_azuread` periodically.
+
+### Azure AD authentication with Azure CLI
+
+[Install the Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest).
+
+```
+pip install databricks-client[azurecli]
+az login
+```
+
+```python
+import databricks_client
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace")
+# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace")
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+This approach is recommended for Azure DevOps Pipelines, using the [Azure CLI task](https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-cli?view=azure-devops).
+
+### Azure AD authentication with ADAL
+
+```
+pip install databricks-client
+pip install adal
+```
+
+```python
+import databricks_client
+import adal
+
+authority_host_uri = 'https://login.microsoftonline.com'
+authority_uri = authority_host_uri + '/' + tenant_id
+context = adal.AuthenticationContext(authority_uri)
+
+def token_callback(resource):
+ return context.acquire_token_with_client_credentials(resource, client_id, client_secret)["accessToken"]
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace", token_callback)
+# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace", token_callback=token_callback)
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+## Example usages
+
+### Generating a PAT token
+
+```python
+response = client.post(
+ 'token/create',
+ json={"lifetime_seconds": 60, "comment": "Unit Test Token"}
+)
+pat_token = response['token_value']
+```
+
+### Uploading a notebook
+
+```python
+import base64
+
+with open(notebook_file, "rb") as f:
+ file_content = f.read()
+
+client.post(
+ 'workspace/import',
+ json={
+ "content": base64.b64encode(file_content).decode('ascii'),
+ "path": notebook_path,
+ "overwrite": False,
+ "language": "PYTHON",
+ "format": "SOURCE"
+ }
+)
+```
+
+
+
+
+
+%package help
+Summary: Development documents and examples for databricks-client
+Provides: python3-databricks-client-doc
+%description help
+# databricks-client
+
+## About
+
+A REST client for the [Databricks REST API](https://docs.databricks.com/dev-tools/api/latest/index.html).
+
+This module is a thin layer for building HTTP [Requests](https://requests.readthedocs.io/en/master/).
+It does not expose API operations as distinct methods; instead, it provides generic methods
+for building arbitrary API calls.
+
+The Databricks API sometimes returns a 200 status code with HTML content when the request is
+not properly authenticated. The client detects such responses (by checking for non-JSON
+content) and wraps them in an exception.
+
+_This open-source project is not developed by nor affiliated with Databricks._
+
+## Installing
+
+```
+pip install databricks-client
+```
+
+## Usage
+
+```python
+import databricks_client
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_pat_token(pat_token)
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+## Usage with a newly provisioned workspace
+
+If using this module as part of a provisioning job, you need to call `client.ensure_available()`.
+
+When the first user logs in to a new Databricks workspace, workspace provisioning is triggered,
+and the API is not available until that job has completed (usually under a minute, but
+potentially longer depending on the network configuration). Until provisioning finishes, API
+calls fail with an error such as:
+
+```
+"Succeeded{"error_code":"INVALID_PARAMETER_VALUE","message":"Unknown worker environment WorkerEnvId(workerenv-4312344789891641)"}
+```
+
+The method `client.ensure_available(url="instance-pools/list", retries=100, delay_seconds=6)`
+prevents this error by polling the provided URL, retrying as long as the workspace is in a
+provisioning state or until the given number of retries is exhausted.
+
+## Usage with Azure Active Directory
+
+Note: Azure AD authentication for Databricks is currently in preview.
+
+The client generates short-lived Azure AD tokens. If you need to use your client for longer
+than the token lifetime (typically 30 minutes), rerun `client.auth_azuread` periodically.
+
+### Azure AD authentication with Azure CLI
+
+[Install the Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest).
+
+```
+pip install databricks-client[azurecli]
+az login
+```
+
+```python
+import databricks_client
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace")
+# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace")
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+This approach is recommended for Azure DevOps Pipelines, using the [Azure CLI task](https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-cli?view=azure-devops).
+
+### Azure AD authentication with ADAL
+
+```
+pip install databricks-client
+pip install adal
+```
+
+```python
+import databricks_client
+import adal
+
+authority_host_uri = 'https://login.microsoftonline.com'
+authority_uri = authority_host_uri + '/' + tenant_id
+context = adal.AuthenticationContext(authority_uri)
+
+def token_callback(resource):
+ return context.acquire_token_with_client_credentials(resource, client_id, client_secret)["accessToken"]
+
+client = databricks_client.create("https://northeurope.azuredatabricks.net/api/2.0")
+client.auth_azuread("/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/my-rg/providers/Microsoft.Databricks/workspaces/my-workspace", token_callback)
+# or client.auth_azuread(resource_group="my-rg", workspace_name="my-workspace", token_callback=token_callback)
+client.ensure_available()
+clusters_list = client.get('clusters/list')
+for cluster in clusters_list["clusters"]:
+ print(cluster)
+```
+
+## Example usages
+
+### Generating a PAT token
+
+```python
+response = client.post(
+ 'token/create',
+ json={"lifetime_seconds": 60, "comment": "Unit Test Token"}
+)
+pat_token = response['token_value']
+```
+
+### Uploading a notebook
+
+```python
+import base64
+
+with open(notebook_file, "rb") as f:
+ file_content = f.read()
+
+client.post(
+ 'workspace/import',
+ json={
+ "content": base64.b64encode(file_content).decode('ascii'),
+ "path": notebook_path,
+ "overwrite": False,
+ "language": "PYTHON",
+ "format": "SOURCE"
+ }
+)
+```
+
+
+
+
+
+%prep
+%autosetup -n databricks-client-0.0.3
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-databricks-client -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Tue Apr 11 2023 Python_Bot <Python_Bot@openeuler.org> - 0.0.3-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..e6e8d0f
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+ef38fb080f27ae16e73da7f7098f4b4c databricks_client-0.0.3.tar.gz