author    CoprDistGit <infra@openeuler.org>    2023-06-20 07:49:25 +0000
committer CoprDistGit <infra@openeuler.org>    2023-06-20 07:49:25 +0000
commit    7a605d4f26ed18ce88e60a317653bd38f87353df (patch)
tree      48cc23fb5287a9e3dd63b801dc6db59273940a54
parent    b7c4af38cdfb45d65f8433053fc89fdea5845984 (diff)
automatic import of python-naogi (openeuler20.03)
-rw-r--r--  .gitignore           1
-rw-r--r--  python-naogi.spec  379
-rw-r--r--  sources              1
3 files changed, 381 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..70bde62 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/naogi-0.0.6.tar.gz
diff --git a/python-naogi.spec b/python-naogi.spec
new file mode 100644
index 0000000..5072b1b
--- /dev/null
+++ b/python-naogi.spec
@@ -0,0 +1,379 @@
+%global _empty_manifest_terminate_build 0
+Name: python-naogi
+Version: 0.0.6
+Release: 1
+Summary: Abstract class for Naogi ML deployment
+License: MIT License
+URL: https://github.com/Naogi/naogi_model
+Source0: https://mirrors.aliyun.com/pypi/web/packages/e4/2f/334fdce0d9b6a3672e3dab71d5e638faccaf9d39fc6911f4a5f89dab64c5/naogi-0.0.6.tar.gz
+BuildArch: noarch
+
+Requires: python3-flask
+
+%description
+# naogi_model
+
+NaogiModel is an abstract class for the naogi.com ML deployment platform.
+
+## How to deploy via naogi.com
+* Add `naogi` to your project's requirements.txt
+* Create a `naogi.py` file in the root directory (copy the file from [naogi.py](https://github.com/Naogi/naogi_model))
+* Implement your model loading, preparing and calling logic (a sketch follows this list)
+* Go to your naogi.com profile, create a project and connect your git repository
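+
+A minimal sketch of what `naogi.py` could look like (the class name, the `from naogi import ...` path and the `load_my_model` helper are assumptions for illustration; the method bodies mirror the examples below):
+```python
+# naogi.py -- illustrative sketch only; adapt it to your model.
+# Assumes NaogiModel and JsonRenderer can be imported from the naogi package.
+from naogi import NaogiModel, JsonRenderer
+
+class Model(NaogiModel):
+    def load_model(self):
+        # Runs once at server start: load and initialize the model.
+        self.model = load_my_model()  # hypothetical helper
+
+    def prepare(self, params_dict):
+        # Runs per request: normalize the incoming params.
+        self.text = params_dict['text_data'].strip()
+
+    def predict(self):
+        # Runs per request: return data the renderer can handle.
+        return {'result': self.model.predict(self.text)}
+
+    def renderer(self):
+        return JsonRenderer
+```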
+<br>
+<br>
+
+## How does it work? (What to implement in naogi.py)
+### Loading model (server starting time)
+When the naogi server starts, it calls `load_model(self)` -- implement your model-loading logic in that function (loading from a file, the internet, etc.).
+
+Here you should load and initialize your model and save the model object to an instance attribute.
+
+Example:
+```python
+def load_model(self):
+ self.model = __get_model()
+ self.model.load_weights()
+```
+<br>
+<br>
+
+### Preparing (request time)
+When you call [GET/POST] `/predict` on your API, `prepare(self, params_dict)` is called first.
+
+All request params can be found in `params_dict`. Here you can prepare your params: open and modify an image, transform and normalize text, and save the prepared data to a self attribute for `predict` to use.
+
+Example:
+```python
+# now you can make GET /predict?text_data=My-long-text
+# (and not worry about spaces)
+def prepare(self, params_dict):
+ self.text = params_dict['text_data'].strip()
+```
+<br>
+<br>
+
+### Predicting (request time)
+After the request params are prepared, `predict(self)` is called.
+
+```python
+def predict(self):
+ raw = self.model.predict(self.text)
+ return __from_raw_to_some(raw)
+```
+
+Here you have to return a value that is valid for some Renderer class (JSON, file or custom).
+<br>
+<br>
+
+### Rendering (request time)
+The last step is calling `renderer().render(...)` with the result of `predict`.
+Out of the box you can use `JsonRenderer` and `FileRenderer`,
+
+or
+
+you can create a custom renderer from `AbstractRenderer`:
+```python
+class MyRenderer(AbstractRenderer):
+ def render(data):
+ return ...
+```
+```python
+def renderer(self):
+ return MyRenderer
+```
+
+`JsonRenderer` accepts any data that is valid for `json.dumps`.
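+
+For example, a hypothetical `predict` that returns a plain dict works directly with `JsonRenderer`:
+```python
+def predict(self):
+    # Any json.dumps-serializable value can be returned here.
+    return {'label': 'cat', 'score': 0.97}
+```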
+
+`FileRenderer` uses Flask's `send_file` under the hood, so you can pass any bytes. [Additional params can be found here](https://github.com/Naogi/naogi_model/blob/main/src/naogi/__init__.py#L17)
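+
+For instance, a hypothetical `predict` that streams image bytes through `FileRenderer` might look like this (the PIL-style `self.output_image` and the import path are assumptions):
+```python
+import io
+
+from naogi import FileRenderer  # assumed import path
+
+def predict(self):
+    # Serialize the in-memory image into a bytes buffer for send_file.
+    buf = io.BytesIO()
+    self.output_image.save(buf, format='PNG')
+    buf.seek(0)
+    return buf
+
+def renderer(self):
+    return FileRenderer
+```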
+
+<br>
+<br>
+
+### Fin
+Finally, you can make API calls to `<your-naogi-project-url>/predict` with your params.
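+
+For example, with curl (the URL and parameter value are placeholders):
+```shell
+curl "https://<your-naogi-project-url>/predict?text_data=My-long-text"
+```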
+
+
+## Development
+...
+
+### Testing
+Before testing, you should install **pytest**.
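+
+For example, with pip:
+```shell
+pip3 install pytest
+```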
+
+From the root folder:
+```shell
+PYTHONPATH='./' pytest tests/renderers/pil_image_renderer.py
+```
+
+### Deploy
+```shell
+rm -rf dist/*
+python3 -m build
+python3 -m twine upload --repository pypi dist/*
+```
+
+
+
+
+%package -n python3-naogi
+Summary: Abstract class for Naogi ML deployment
+Provides: python-naogi
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-naogi
+# naogi_model
+
+NaogiModel is an abstract class for the naogi.com ML deployment platform.
+
+## How to deploy via naogi.com
+* Add `naogi` to your project's requirements.txt
+* Create a `naogi.py` file in the root directory (copy the file from [naogi.py](https://github.com/Naogi/naogi_model))
+* Implement your model loading, preparing and calling logic
+* Go to your naogi.com profile, create a project and connect your git repository
+<br>
+<br>
+
+## How does it work? (What to implement in naogi.py)
+### Loading model (server starting time)
+When the naogi server starts, it calls `load_model(self)` -- implement your model-loading logic in that function (loading from a file, the internet, etc.).
+
+Here you should load and initialize your model and save the model object to an instance attribute.
+
+Example:
+```python
+def load_model(self):
+ self.model = __get_model()
+ self.model.load_weights()
+```
+<br>
+<br>
+
+### Preparing (request time)
+When you call [GET/POST] `/predict` on your API, `prepare(self, params_dict)` is called first.
+
+All request params can be found in `params_dict`. Here you can prepare your params: open and modify an image, transform and normalize text, and save the prepared data to a self attribute for `predict` to use.
+
+Example:
+```python
+# now you can make GET /predict?text_data=My-long-text
+# (and not worry about spaces)
+def prepare(self, params_dict):
+ self.text = params_dict['text_data'].strip()
+```
+<br>
+<br>
+
+### Predicting (request time)
+After the request params are prepared, `predict(self)` is called.
+
+```python
+def predict(self):
+ raw = self.model.predict(self.text)
+ return __from_raw_to_some(raw)
+```
+
+Here you have to return a value that is valid for some Renderer class (JSON, file or custom).
+<br>
+<br>
+
+### Rendering (request time)
+The last step is calling `renderer().render(...)` with the result of `predict`.
+Out of the box you can use `JsonRenderer` and `FileRenderer`,
+
+or
+
+you can create a custom renderer from `AbstractRenderer`:
+```python
+class MyRenderer(AbstractRenderer):
+ def render(data):
+ return ...
+```
+```python
+def renderer(self):
+ return MyRenderer
+```
+
+`JsonRenderer` accepts any data that is valid for `json.dumps`.
+
+`FileRenderer` uses Flask's `send_file` under the hood, so you can pass any bytes. [Additional params can be found here](https://github.com/Naogi/naogi_model/blob/main/src/naogi/__init__.py#L17)
+
+<br>
+<br>
+
+### Fin
+Finally, you can make API calls to `<your-naogi-project-url>/predict` with your params.
+
+
+## Development
+...
+
+### Testing
+Before testing, you should install **pytest**.
+
+From the root folder:
+```shell
+PYTHONPATH='./' pytest tests/renderers/pil_image_renderer.py
+```
+
+### Deploy
+```shell
+rm -rf dist/*
+python3 -m build
+python3 -m twine upload --repository pypi dist/*
+```
+
+
+
+
+%package help
+Summary: Development documents and examples for naogi
+Provides: python3-naogi-doc
+%description help
+# naogi_model
+
+NaogiModel is an abstract class for the naogi.com ML deployment platform.
+
+## How to deploy via naogi.com
+* Add `naogi` to your project's requirements.txt
+* Create a `naogi.py` file in the root directory (copy the file from [naogi.py](https://github.com/Naogi/naogi_model))
+* Implement your model loading, preparing and calling logic
+* Go to your naogi.com profile, create a project and connect your git repository
+<br>
+<br>
+
+## How does it work? (What to implement in naogi.py)
+### Loading model (server starting time)
+When the naogi server starts, it calls `load_model(self)` -- implement your model-loading logic in that function (loading from a file, the internet, etc.).
+
+Here you should load and initialize your model and save the model object to an instance attribute.
+
+Example:
+```python
+def load_model(self):
+ self.model = __get_model()
+ self.model.load_weights()
+```
+<br>
+<br>
+
+### Preparing (request time)
+When you call [GET/POST] `/predict` on your API, `prepare(self, params_dict)` is called first.
+
+All request params can be found in `params_dict`. Here you can prepare your params: open and modify an image, transform and normalize text, and save the prepared data to a self attribute for `predict` to use.
+
+Example:
+```python
+# now you can make GET /predict?text_data=My-long-text
+# (and not worry about spaces)
+def prepare(self, params_dict):
+ self.text = params_dict['text_data'].strip()
+```
+<br>
+<br>
+
+### Predicting (request time)
+After the request params are prepared, `predict(self)` is called.
+
+```python
+def predict(self):
+ raw = self.model.predict(self.text)
+ return __from_raw_to_some(raw)
+```
+
+Here you have to return a value that is valid for some Renderer class (JSON, file or custom).
+<br>
+<br>
+
+### Rendering (request time)
+The last step is calling `renderer().render(...)` with the result of `predict`.
+Out of the box you can use `JsonRenderer` and `FileRenderer`,
+
+or
+
+you can create a custom renderer from `AbstractRenderer`:
+```python
+class MyRenderer(AbstractRenderer):
+ def render(data):
+ return ...
+```
+```python
+def renderer(self):
+ return MyRenderer
+```
+
+`JsonRenderer` accepts any data that is valid for `json.dumps`.
+
+`FileRenderer` uses Flask's `send_file` under the hood, so you can pass any bytes. [Additional params can be found here](https://github.com/Naogi/naogi_model/blob/main/src/naogi/__init__.py#L17)
+
+<br>
+<br>
+
+### Fin
+Finally, you can make API calls to `<your-naogi-project-url>/predict` with your params.
+
+
+## Development
+...
+
+### Testing
+Before testing, you should install **pytest**.
+
+From the root folder:
+```shell
+PYTHONPATH='./' pytest tests/renderers/pil_image_renderer.py
+```
+
+### Deploy
+```shell
+rm -rf dist/*
+python3 -m build
+python3 -m twine upload --repository pypi dist/*
+```
+
+
+
+
+%prep
+%autosetup -n naogi-0.0.6
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-naogi -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Tue Jun 20 2023 Python_Bot <Python_Bot@openeuler.org> - 0.0.6-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..4978c05
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+21b391e3c7e83c9a1c1bf0a922b0f333 naogi-0.0.6.tar.gz