%global debug_package %{nil}

Name:           onnxruntime
Version:        1.17.1
Release:        1%{?dist}
Summary:        ONNX Runtime with source build for CPU
License:        MIT
URL:            https://onnxruntime.ai
Source0:        https://github.com/microsoft/%{name}/archive/refs/tags/v%{version}.tar.gz

%description
ONNX Runtime is an open-source, high-performance inference engine for running
computational graphs on various platforms. ONNX stands for "Open Neural
Network Exchange," an open standard for representing neural network models
that makes it easier to share and interchange models between deep learning
frameworks. ONNX Runtime supports running models on CPUs, GPUs, and other
devices, providing cross-platform and cross-framework performance
optimizations.

%package -n python3-onnxruntime
Summary:        ONNX Runtime APIs for loading, running, and optimizing models from Python
Provides:       python-onnxruntime
BuildRequires:  tar, ca-certificates, build-essential, cmake, curl
BuildRequires:  python3-devel, python3-setuptools, python3-wheel, python3-pip
BuildRequires:  python3-numpy, python3-flatbuffers, python3-packaging
BuildRequires:  python3-protobuf, python3-mpmath, python3-sympy
Requires:       ca-certificates, python3-setuptools, python3-wheel, python3-pip
Requires:       python3-numpy, python3-flatbuffers, python3-packaging
Requires:       python3-protobuf, python3-mpmath, python3-sympy
# Use the automatic Python dependency generator
%{?python_enable_dependency_generator}

%description -n python3-onnxruntime
ONNX Runtime is a cross-platform inference and training machine-learning
accelerator. This package provides the Python bindings for the CPU source
build, licensed under the MIT License.
%package help
Summary:        Documentation for onnxruntime
Provides:       python3-onnxruntime-doc

%description help
Documentation for the usage of python3-onnxruntime.

%prep
%autosetup -p1 -n onnxruntime-%{version}

%build
%pyproject_build

%install
%pyproject_install

%files -n python3-%{name}
%license LICENSE
%{python3_sitelib}/%{name}/
%{python3_sitelib}/%{name}*.dist-info/

%files help
%doc *.md

%changelog
* Sun Mar 31 2024 Your Name - 1.17.1-1
- Initial build of ONNX Runtime for CPU