author    CoprDistGit <infra@openeuler.org>  2023-04-23 04:51:22 +0000
committer CoprDistGit <infra@openeuler.org>  2023-04-23 04:51:22 +0000
commit    a5a542b33e105445a424844d217a6ee8b14c358d
tree      1c4d4ad29b85555ed5fe65217f8755414df79162
parent    778db38c44b84617115c74eb36092aac5a8a6d94

automatic import of python-jina (openeuler20.03)
-rw-r--r--  .gitignore         1
-rw-r--r--  python-jina.spec  35
-rw-r--r--  sources            2

3 files changed, 24 insertions, 14 deletions
diff --git a/.gitignore b/.gitignore
index c4eb844..5fd5f1f 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1 +1,2 @@
/jina-3.14.1.tar.gz
+/jina-3.15.0.tar.gz
diff --git a/python-jina.spec b/python-jina.spec
index 8a007a6..a26271e 100644
--- a/python-jina.spec
+++ b/python-jina.spec
@@ -1,11 +1,11 @@
%global _empty_manifest_terminate_build 0
Name: python-jina
-Version: 3.14.1
+Version: 3.15.0
Release: 1
Summary: Build multimodal AI services via cloud native technologies · Neural Search · Generative AI · MLOps
License: Apache 2.0
URL: https://github.com/jina-ai/jina/
-Source0: https://mirrors.nju.edu.cn/pypi/web/packages/d6/93/909b20eeddce3941d76a06c357e1d9d7386159e9420f04750d023116ff48/jina-3.14.1.tar.gz
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/68/fd/559d5809832dfc49615aaf5626f1ff28b1de54453a72e2056ad9f702203a/jina-3.15.0.tar.gz
BuildArch: noarch
@@ -13,14 +13,14 @@ BuildArch: noarch
<p align="center">
<a href="https://docs.jina.ai"><img src="https://github.com/jina-ai/jina/blob/master/.github/readme/streamline-banner.png?raw=true" alt="Jina: Streamline AI & ML Product Delivery" width="100%"></a>
</p>
-### Build AI & ML Services
+### Build AI Services
<!-- start build-ai-services -->
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb)
Let's build a fast, reliable and scalable gRPC-based AI service. In Jina we call this an **[Executor](https://docs.jina.ai/concepts/executor/)**. Our simple Executor will use Facebook's mBART-50 model to translate French to English. We'll then use a **Deployment** to serve it.
> **Note**
> A Deployment serves just one Executor. To combine multiple Executors into a pipeline and serve that, use a [Flow](#build-a-pipeline).
> **Note**
-> Run the [code in Colab](https://colab.research.google.com/assets/colab-badge.svg) to install all dependencies.
+> Run the [code in Colab](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb#scrollTo=0l-lkmz4H-jW) to install all dependencies.
Let's implement the service's logic:
<table>
<tr>
@@ -67,6 +67,7 @@ Then we deploy it with either the Python API or YAML:
<td>
```python
from jina import Deployment
+from translate_executor import Translator
with Deployment(uses=Translator, timeout_ready=-1) as dep:
    dep.block()
```
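The `Translator` imported above is defined in `translate_executor.py`, whose body falls outside the hunks shown in this diff. As a rough sketch only — assuming the Hugging Face `transformers` API and the `facebook/mbart-large-50-many-to-many-mmt` checkpoint, neither of which this diff confirms — it might look like:
```python
# translate_executor.py — hypothetical sketch; the real file is not part of this diff
from docarray import DocumentArray
from jina import Executor, requests
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer


class Translator(Executor):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # assumed checkpoint: mBART-50, loaded once when the Executor starts
        self.tokenizer = AutoTokenizer.from_pretrained(
            'facebook/mbart-large-50-many-to-many-mmt', src_lang='fr_XX'
        )
        self.model = AutoModelForSeq2SeqLM.from_pretrained(
            'facebook/mbart-large-50-many-to-many-mmt'
        )

    @requests
    def translate(self, docs: DocumentArray, **kwargs):
        # translate each Document's French text to English, in place
        for doc in docs:
            encoded = self.tokenizer(doc.text, return_tensors='pt')
            generated = self.model.generate(
                **encoded,
                forced_bos_token_id=self.tokenizer.lang_code_to_id['en_XX'],
            )
            doc.text = self.tokenizer.batch_decode(
                generated, skip_special_tokens=True
            )[0]
```
Because `@requests` is used without an `on=` argument, the method is bound to every endpoint of the service, so any request reaching the Deployment passes through it.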
@@ -109,9 +110,11 @@ print(response[0].text)
an astronaut is walking in a park
```
<!-- end build-ai-services -->
+> **Note**
+> In a notebook you can't call `dep.block()` and then make requests from the client afterwards; see the Colab link above for reproducible Jupyter Notebook snippets.
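For completeness, one way to follow that advice is to keep the Deployment context open and query it from the same process, instead of calling `dep.block()` and connecting from a separate client. A minimal sketch, assuming an arbitrary port of 12345 and the hypothetical `Translator` sketched earlier:
```python
from docarray import Document, DocumentArray
from jina import Client, Deployment

from translate_executor import Translator  # the Executor sketched above

# port 12345 is an arbitrary choice for this sketch
with Deployment(uses=Translator, timeout_ready=-1, port=12345):
    client = Client(port=12345)
    docs = client.post(
        on='/',
        inputs=DocumentArray([Document(text='un astronaute se promène dans le parc')]),
    )
    # prints the English translation, e.g. "an astronaut is walking in a park"
    print(docs[0].text)
```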
### Build a pipeline
<!-- start build-pipelines -->
-[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/docs-readme-changes/.github/getting-started/notebook.ipynb)
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb#scrollTo=YfNm1nScH30U)
Sometimes you want to chain microservices together into a pipeline. That's where a [Flow](https://docs.jina.ai/concepts/flow/) comes in.
A Flow is a [DAG](https://de.wikipedia.org/wiki/DAG) pipeline composed of a set of steps. It orchestrates a set of [Executors](https://docs.jina.ai/concepts/executor/) and a [Gateway](https://docs.jina.ai/concepts/gateway/) to offer an end-to-end service.
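As a hedged illustration of that idea (the second Executor and the port below are placeholders invented for this sketch, not taken from the README), chaining two Executors into a Flow might look like:
```python
from docarray import DocumentArray
from jina import Executor, Flow, requests

from translate_executor import Translator  # the Executor sketched above


class UpperCaser(Executor):
    """A trivial, made-up second step, only here to show chaining."""

    @requests
    def upper(self, docs: DocumentArray, **kwargs):
        for doc in docs:
            doc.text = doc.text.upper()


# each .add() appends one Executor to the DAG; the Gateway fronts the whole chain
flow = Flow(port=12345).add(uses=Translator, timeout_ready=-1).add(uses=UpperCaser)

with flow:
    flow.block()
```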
> **Note**
@@ -199,14 +202,14 @@ BuildRequires: python3-pip
<p align="center">
<a href="https://docs.jina.ai"><img src="https://github.com/jina-ai/jina/blob/master/.github/readme/streamline-banner.png?raw=true" alt="Jina: Streamline AI & ML Product Delivery" width="100%"></a>
</p>
-### Build AI & ML Services
+### Build AI Services
<!-- start build-ai-services -->
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb)
Let's build a fast, reliable and scalable gRPC-based AI service. In Jina we call this an **[Executor](https://docs.jina.ai/concepts/executor/)**. Our simple Executor will use Facebook's mBART-50 model to translate French to English. We'll then use a **Deployment** to serve it.
> **Note**
> A Deployment serves just one Executor. To combine multiple Executors into a pipeline and serve that, use a [Flow](#build-a-pipeline).
> **Note**
-> Run the [code in Colab](https://colab.research.google.com/assets/colab-badge.svg) to install all dependencies.
+> Run the [code in Colab](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb#scrollTo=0l-lkmz4H-jW) to install all dependencies.
Let's implement the service's logic:
<table>
<tr>
@@ -253,6 +256,7 @@ Then we deploy it with either the Python API or YAML:
<td>
```python
from jina import Deployment
+from translate_executor import Translator
with Deployment(uses=Translator, timeout_ready=-1) as dep:
    dep.block()
```
@@ -295,9 +299,11 @@ print(response[0].text)
an astronaut is walking in a park
```
<!-- end build-ai-services -->
+> **Note**
+> In a notebook you can't call `dep.block()` and then make requests from the client afterwards; see the Colab link above for reproducible Jupyter Notebook snippets.
### Build a pipeline
<!-- start build-pipelines -->
-[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/docs-readme-changes/.github/getting-started/notebook.ipynb)
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb#scrollTo=YfNm1nScH30U)
Sometimes you want to chain microservices together into a pipeline. That's where a [Flow](https://docs.jina.ai/concepts/flow/) comes in.
A Flow is a [DAG](https://de.wikipedia.org/wiki/DAG) pipeline composed of a set of steps. It orchestrates a set of [Executors](https://docs.jina.ai/concepts/executor/) and a [Gateway](https://docs.jina.ai/concepts/gateway/) to offer an end-to-end service.
> **Note**
@@ -382,14 +388,14 @@ Provides: python3-jina-doc
<p align="center">
<a href="https://docs.jina.ai"><img src="https://github.com/jina-ai/jina/blob/master/.github/readme/streamline-banner.png?raw=true" alt="Jina: Streamline AI & ML Product Delivery" width="100%"></a>
</p>
-### Build AI & ML Services
+### Build AI Services
<!-- start build-ai-services -->
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb)
Let's build a fast, reliable and scalable gRPC-based AI service. In Jina we call this an **[Executor](https://docs.jina.ai/concepts/executor/)**. Our simple Executor will use Facebook's mBART-50 model to translate French to English. We'll then use a **Deployment** to serve it.
> **Note**
> A Deployment serves just one Executor. To combine multiple Executors into a pipeline and serve that, use a [Flow](#build-a-pipeline).
> **Note**
-> Run the [code in Colab](https://colab.research.google.com/assets/colab-badge.svg) to install all dependencies.
+> Run the [code in Colab](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb#scrollTo=0l-lkmz4H-jW) to install all dependencies.
Let's implement the service's logic:
<table>
<tr>
@@ -436,6 +442,7 @@ Then we deploy it with either the Python API or YAML:
<td>
```python
from jina import Deployment
+from translate_executor import Translator
with Deployment(uses=Translator, timeout_ready=-1) as dep:
    dep.block()
```
@@ -478,9 +485,11 @@ print(response[0].text)
an astronaut is walking in a park
```
<!-- end build-ai-services -->
+> **Note**
+> In a notebook you can't call `dep.block()` and then make requests from the client afterwards; see the Colab link above for reproducible Jupyter Notebook snippets.
### Build a pipeline
<!-- start build-pipelines -->
-[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/docs-readme-changes/.github/getting-started/notebook.ipynb)
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jina-ai/jina/blob/master/.github/getting-started/notebook.ipynb#scrollTo=YfNm1nScH30U)
Sometimes you want to chain microservices together into a pipeline. That's where a [Flow](https://docs.jina.ai/concepts/flow/) comes in.
A Flow is a [DAG](https://de.wikipedia.org/wiki/DAG) pipeline composed of a set of steps. It orchestrates a set of [Executors](https://docs.jina.ai/concepts/executor/) and a [Gateway](https://docs.jina.ai/concepts/gateway/) to offer an end-to-end service.
> **Note**
@@ -559,7 +568,7 @@ Read more about [deploying Flows to JCloud](https://docs.jina.ai/concepts/jcloud
<!-- end build-pipelines -->
%prep
-%autosetup -n jina-3.14.1
+%autosetup -n jina-3.15.0
%build
%py3_build
@@ -599,5 +608,5 @@ mv %{buildroot}/doclist.lst .
%{_docdir}/*
%changelog
-* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 3.14.1-1
+* Sun Apr 23 2023 Python_Bot <Python_Bot@openeuler.org> - 3.15.0-1
- Package Spec generated
diff --git a/sources b/sources
index 5c83cb1..258ba29 100644
--- a/sources
+++ b/sources
@@ -1 +1 @@
-4b6a33345d4c258924fcb6cebe45642a jina-3.14.1.tar.gz
+0d50ec365278099aa914c711fa5b7456 jina-3.15.0.tar.gz