%global _empty_manifest_terminate_build 0
%define debug_package %{nil}

Name: onnxruntime
Version: 1.17.0
Release: 2
Summary: ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
License: MIT
URL: https://github.com/microsoft/onnxruntime
Source0: https://atomgit.com/havefun/onnxruntime/raw/master/onnxruntime-1.17.0.tar.gz
BuildRequires: gcc-c++

%description
ONNX Runtime is a cross-platform inference and training machine-learning
accelerator. ONNX Runtime inference can enable faster customer experiences and
lower costs, supporting models from deep learning frameworks such as PyTorch
and TensorFlow/Keras as well as classical machine learning libraries such as
scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with different
hardware, drivers, and operating systems, and provides optimal performance by
leveraging hardware accelerators where applicable alongside graph
optimizations and transforms.

%package -n python3-onnxruntime
Summary: ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
Provides: python-onnxruntime
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-setuptools_scm
BuildRequires: python3-pbr
BuildRequires: python3-pip
BuildRequires: python3-wheel
BuildRequires: python3-hatchling
BuildRequires: python3-astunparse
BuildRequires: python3-numpy
BuildRequires: python3-pyyaml
BuildRequires: cmake
BuildRequires: python3-typing-extensions
BuildRequires: python3-requests
BuildRequires: python3-pytorch
Requires: python3-future
Requires: python3-numpy
AutoReqProv: no

%description -n python3-onnxruntime
ONNX Runtime is a cross-platform inference and training machine-learning
accelerator. ONNX Runtime inference can enable faster customer experiences and
lower costs, supporting models from deep learning frameworks such as PyTorch
and TensorFlow/Keras as well as classical machine learning libraries such as
scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with different
hardware, drivers, and operating systems, and provides optimal performance by
leveraging hardware accelerators where applicable alongside graph
optimizations and transforms.

%package help
Summary: Development documents and examples for onnxruntime
Provides: python3-onnxruntime-doc

%description help
ONNX Runtime is a cross-platform inference and training machine-learning
accelerator. ONNX Runtime inference can enable faster customer experiences and
lower costs, supporting models from deep learning frameworks such as PyTorch
and TensorFlow/Keras as well as classical machine learning libraries such as
scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with different
hardware, drivers, and operating systems, and provides optimal performance by
leveraging hardware accelerators where applicable alongside graph
optimizations and transforms.

%prep
%autosetup -p1 -n %{name}-%{version}

%build
%py3_build
#python3 setup.py build

%install
%define _unpackaged_files_terminate_build 0
%py3_install
#python3 setup.py install

%files -n python3-onnxruntime
%license LICENSE
%{python3_sitelib}/*

%files help
%doc *.md

%changelog
* Mon Apr 15 2024 Hongyu Li <543306408@qq.com> - 1.17.0-2
- Package init