%global _empty_manifest_terminate_build 0
Name: python-backend.ai-accelerator-cuda
Version: 2.0.1
Release: 1
Summary: Backend.AI Accelerator Plugin for CUDA
License: LGPLv3
URL: https://backend.ai
Source0: https://mirrors.aliyun.com/pypi/web/packages/09/01/957e77caa7001c9b5e863dba92e83892da93a49847231878f74f2e83890e/backend.ai-accelerator-cuda-2.0.1.tar.gz
BuildArch: noarch

Requires: python3-aiohttp
Requires: python3-attrs
Requires: python3-wheel
Requires: python3-twine
Requires: python3-pytest-sugar
Requires: python3-pytest
Requires: python3-pytest-asyncio
Requires: python3-pytest-cov
Requires: python3-pytest-mock
Requires: python3-asynctest
Requires: python3-flake8
Requires: python3-mypy
Requires: python3-codecov

%description
Just install this along with Backend.AI agents, using the same virtual environment.
This will allow the agents to detect CUDA devices on their hosts and make them
available to Backend.AI kernel sessions.

```console
$ pip install backend.ai-accelerator-cuda
```

This open-source edition of the CUDA plugin supports allocation of one or more
CUDA devices to a container, slot-by-slot.

%package -n python3-backend.ai-accelerator-cuda
Summary: Backend.AI Accelerator Plugin for CUDA
Provides: python-backend.ai-accelerator-cuda
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip

%description -n python3-backend.ai-accelerator-cuda
Just install this along with Backend.AI agents, using the same virtual environment.
This will allow the agents to detect CUDA devices on their hosts and make them
available to Backend.AI kernel sessions.

```console
$ pip install backend.ai-accelerator-cuda
```

This open-source edition of the CUDA plugin supports allocation of one or more
CUDA devices to a container, slot-by-slot.

%package help
Summary: Development documents and examples for backend.ai-accelerator-cuda
Provides: python3-backend.ai-accelerator-cuda-doc

%description help
Just install this along with Backend.AI agents, using the same virtual environment.
This will allow the agents to detect CUDA devices on their hosts and make them
available to Backend.AI kernel sessions.

```console
$ pip install backend.ai-accelerator-cuda
```

This open-source edition of the CUDA plugin supports allocation of one or more
CUDA devices to a container, slot-by-slot.

%prep
%autosetup -n backend.ai-accelerator-cuda-2.0.1

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-backend.ai-accelerator-cuda -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Tue Jun 20 2023 Python_Bot - 2.0.1-1
- Package Spec generated