| author    | CoprDistGit <infra@openeuler.org>        | 2023-05-31 03:37:04 +0000 |
|-----------|------------------------------------------|---------------------------|
| committer | CoprDistGit <infra@openeuler.org>        | 2023-05-31 03:37:04 +0000 |
| commit    | e98cbad51b618aaee500eda8cd1801c968de6e94 |                           |
| tree      | 9810a6aaea912cf84028ba68fddd6d2bd5baf114 |                           |
| parent    | 08806913f3808fb61862d5965c8fd92e8e4c1172 |                           |
automatic import of python-inferrd
| -rw-r--r-- | .gitignore          |   1 |
| -rw-r--r-- | python-inferrd.spec | 219 |
| -rw-r--r-- | sources             |   1 |
3 files changed, 221 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
@@ -0,0 +1 @@
+/inferrd-0.1.47.tar.gz
diff --git a/python-inferrd.spec b/python-inferrd.spec
new file mode 100644
index 0000000..e35ad6c
--- /dev/null
+++ b/python-inferrd.spec
@@ -0,0 +1,219 @@
+%global _empty_manifest_terminate_build 0
+Name: python-inferrd
+Version: 0.1.47
+Release: 1
+Summary: inferrd.com
+License: MIT
+URL: https://inferrd.com
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/f5/a2/e2c1f3dea6003537bac93472be6e2e704756db160e444404a425ff04da04/inferrd-0.1.47.tar.gz
+BuildArch: noarch
+
+Requires: python3-joblib
+Requires: python3-flask
+Requires: python3-dill
+Requires: python3-easysettings
+Requires: python3-numpy
+Requires: python3-tqdm
+Requires: python3-pandas
+Requires: python3-tensorflow
+Requires: python3-requests
+
+%description
+# Inferrd
+
+Inferrd is a hosting platform for TensorFlow.
+
+### Authentication
+
+In order to use this library you need to get an API token from [inferrd.com](https://inferrd.com).
+
+Authenticate with the `inferrd.auth` method:
+
+```python
+import inferrd
+
+inferrd.auth('<token>')
+```
+
+### Deploying TensorFlow
+
+First, create a model on [inferrd.com](https://inferrd.com) and select the kind of instance you want. Then simply call `inferrd.deploy_tf`:
+
+```python
+import inferrd
+
+# this only needs to be done once
+inferrd.auth('<token>')
+
+# deploy TF
+inferrd.deploy_tf(tf_model, '<name of the model>')
+```
+
+### Fetching predictions
+
+Inferrd allows you to pull predictions back into your notebook using `inferrd.get_requests`:
+
+```python
+import inferrd
+
+# this only needs to be done once
+inferrd.auth('<token>')
+
+# get the requests
+requests = inferrd.get_requests('<name of the model>', limit=100, page=0, includeFailures=False)
+```
+
+
+
+
+%package -n python3-inferrd
+Summary: inferrd.com
+Provides: python-inferrd
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-inferrd
+# Inferrd
+
+Inferrd is a hosting platform for TensorFlow.
+
+### Authentication
+
+In order to use this library you need to get an API token from [inferrd.com](https://inferrd.com).
+
+Authenticate with the `inferrd.auth` method:
+
+```python
+import inferrd
+
+inferrd.auth('<token>')
+```
+
+### Deploying TensorFlow
+
+First, create a model on [inferrd.com](https://inferrd.com) and select the kind of instance you want. Then simply call `inferrd.deploy_tf`:
+
+```python
+import inferrd
+
+# this only needs to be done once
+inferrd.auth('<token>')
+
+# deploy TF
+inferrd.deploy_tf(tf_model, '<name of the model>')
+```
+
+### Fetching predictions
+
+Inferrd allows you to pull predictions back into your notebook using `inferrd.get_requests`:
+
+```python
+import inferrd
+
+# this only needs to be done once
+inferrd.auth('<token>')
+
+# get the requests
+requests = inferrd.get_requests('<name of the model>', limit=100, page=0, includeFailures=False)
+```
+
+
+
+
+%package help
+Summary: Development documents and examples for inferrd
+Provides: python3-inferrd-doc
+%description help
+# Inferrd
+
+Inferrd is a hosting platform for TensorFlow.
+
+### Authentication
+
+In order to use this library you need to get an API token from [inferrd.com](https://inferrd.com).
+
+Authenticate with the `inferrd.auth` method:
+
+```python
+import inferrd
+
+inferrd.auth('<token>')
+```
+
+### Deploying TensorFlow
+
+First, create a model on [inferrd.com](https://inferrd.com) and select the kind of instance you want. Then simply call `inferrd.deploy_tf`:
+
+```python
+import inferrd
+
+# this only needs to be done once
+inferrd.auth('<token>')
+
+# deploy TF
+inferrd.deploy_tf(tf_model, '<name of the model>')
+```
+
+### Fetching predictions
+
+Inferrd allows you to pull predictions back into your notebook using `inferrd.get_requests`:
+
+```python
+import inferrd
+
+# this only needs to be done once
+inferrd.auth('<token>')
+
+# get the requests
+requests = inferrd.get_requests('<name of the model>', limit=100, page=0, includeFailures=False)
+```
+
+
+
+
+%prep
+%autosetup -n inferrd-0.1.47
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-inferrd -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Wed May 31 2023 Python_Bot <Python_Bot@openeuler.org> - 0.1.47-1
+- Package Spec generated
diff --git a/sources b/sources
@@ -0,0 +1 @@
+8807b23438dacf4fc24dd1d1fc04bd93 inferrd-0.1.47.tar.gz
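To sanity-check an automatic import like this one, the imported spec can be rebuilt locally from the files added in this commit. The sketch below is illustrative only, not part of the commit: it assumes a standard `~/rpmbuild` tree, the `rpm-build`, `rpmdevtools`, and `dnf-plugins-core` packages, network access to the mirror in `Source0`, and that the 32-character hash recorded in `sources` is an MD5 checksum, as in the traditional dist-git layout.

```shell
# Hypothetical local rebuild of the imported spec (illustrative only).
mkdir -p ~/rpmbuild/SPECS ~/rpmbuild/SOURCES
cp python-inferrd.spec ~/rpmbuild/SPECS/

# Download the tarball referenced by Source0 into ~/rpmbuild/SOURCES.
spectool -g -R ~/rpmbuild/SPECS/python-inferrd.spec

# Compare the download against the hash recorded in the "sources" file.
md5sum ~/rpmbuild/SOURCES/inferrd-0.1.47.tar.gz   # expect 8807b23438dacf4fc24dd1d1fc04bd93

# Install the BuildRequires listed in the spec, then build the noarch RPMs.
sudo dnf builddep ~/rpmbuild/SPECS/python-inferrd.spec
rpmbuild -ba ~/rpmbuild/SPECS/python-inferrd.spec
```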
