%global _empty_manifest_terminate_build 0
Name:           onnxruntime
Version:        1.17.1
Release:        1
Summary:        Open Source Neural Network Inference Engine
License:        MIT
URL:            https://github.com/microsoft/onnxruntime
Source0:        https://openi.pcl.ac.cn/JunJun-Liu/onnx-runtime/onnxruntime-1.17.1.tar.gz

BuildRequires:  gcc-c++
BuildRequires:  make
BuildRequires:  python3-devel
BuildRequires:  cmake
BuildRequires:  python3-pytorch >= 2.0.1

Requires:       python3-numpy
Requires:       python3-pytorch >= 2.0.1

%description
ONNX Runtime is a performance-focused inference engine for ONNX (Open Neural
Network Exchange) models. These models can come from a variety of frameworks,
such as TensorFlow, PyTorch, or Keras. ONNX Runtime runs them quickly and
efficiently with minimal dependencies.

%prep
%setup -q -n onnxruntime-1.17.1

%build
export CFLAGS+=" -Wno-error=maybe-uninitialized -Wno-error=uninitialized -Wno-error=restrict -fPIC"
export CXXFLAGS+=" -Wno-error=maybe-uninitialized -Wno-error=uninitialized -Wno-error=restrict -fPIC"
mkdir -p build && cd build
cmake ../cmake -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_INSTALL_PREFIX=%{_prefix} \
    -DCMAKE_C_FLAGS_RELEASE="%{optflags}" \
    -DCMAKE_CXX_FLAGS_RELEASE="%{optflags}"
%make_build

%install
# %%install runs in a fresh shell at the source root, so enter the build
# directory before installing.
pushd build
%make_install
popd
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/doclist.lst .

%files -f doclist.lst
%doc *.md
%license LICENSE
%{_bindir}/*
%{python3_sitearch}/*

%changelog
* Wed Feb 28 2024 JunjunLiu <172074482@qq.com> - 1.17.1-1
- Initial package version for ONNX Runtime 1.17.1