-rw-r--r--  .gitignore               1
-rw-r--r--  python-opt-einsum.spec 245
-rw-r--r--  sources                  1
3 files changed, 247 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..0032e12 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/opt_einsum-3.3.0.tar.gz
diff --git a/python-opt-einsum.spec b/python-opt-einsum.spec
new file mode 100644
index 0000000..2fefe9b
--- /dev/null
+++ b/python-opt-einsum.spec
@@ -0,0 +1,245 @@
+%global _empty_manifest_terminate_build 0
+Name: python-opt-einsum
+Version: 3.3.0
+Release: 1
+Summary: Optimizing NumPy's einsum function
+License: MIT
+URL: https://github.com/dgasmith/opt_einsum
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/7d/bf/9257e53a0e7715bc1127e15063e831f076723c6cd60985333a1c18878fb8/opt_einsum-3.3.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-numpy
+Requires: python3-sphinx
+Requires: python3-sphinxcontrib-napoleon
+Requires: python3-sphinx-rtd-theme
+Requires: python3-numpydoc
+Requires: python3-pytest
+Requires: python3-pytest-cov
+Requires: python3-pytest-pep8
+
+%description
+Optimized einsum can significantly reduce the overall execution time of einsum-like expressions (e.g.,
+[`np.einsum`](https://docs.scipy.org/doc/numpy/reference/generated/numpy.einsum.html),
+[`dask.array.einsum`](https://docs.dask.org/en/latest/array-api.html#dask.array.einsum),
+[`pytorch.einsum`](https://pytorch.org/docs/stable/torch.html#torch.einsum), and
+[`tensorflow.einsum`](https://www.tensorflow.org/api_docs/python/tf/einsum))
+by optimizing the expression's contraction order and dispatching many
+operations to canonical BLAS, cuBLAS, or other specialized routines. Optimized
+einsum is agnostic to the backend and can handle NumPy, Dask, PyTorch,
+TensorFlow, CuPy, Sparse, Theano, JAX, and Autograd arrays, as well as potentially
+any library which conforms to a standard API. See the
+[**documentation**](http://optimized-einsum.readthedocs.io) for more
+information.
+## Example usage
+The [`opt_einsum.contract`](https://optimized-einsum.readthedocs.io/en/latest/autosummary/opt_einsum.contract.html#opt-einsum-contract)
+function can often act as a drop-in replacement for `einsum`
+functions without further changes to the code while providing superior performance.
+Here, a tensor contraction is performed with and without optimization:
+```python
+import numpy as np
+from opt_einsum import contract
+N = 10
+C = np.random.rand(N, N)
+I = np.random.rand(N, N, N, N)
+%timeit np.einsum('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
+1 loops, best of 3: 934 ms per loop
+%timeit contract('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
+1000 loops, best of 3: 324 us per loop
+```
+In this particular example, we see a ~3000x performance improvement, which is
+not uncommon when compared against unoptimized contractions. See the [backend
+examples](https://optimized-einsum.readthedocs.io/en/latest/backends.html)
+for more information on using other backends.
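+As a minimal sketch of this backend dispatch (assuming PyTorch is installed),
+`contract` infers the backend from the array types it is given, so passing
+torch tensors runs the contraction on PyTorch kernels:
+```python
+import torch
+from opt_einsum import contract
+
+N = 10
+C = torch.rand(N, N)
+I = torch.rand(N, N, N, N)
+
+# Same contraction as above, executed by PyTorch rather than NumPy.
+result = contract('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
+```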
+## Features
+The algorithms found in this repository often power the `einsum` optimizations
+in many of the above projects. For example, the optimization of `np.einsum`
+has been passed upstream and most of the same features that can be found in
+this repository can be enabled with `np.einsum(..., optimize=True)`. However,
+this repository often has more up-to-date algorithms for complex contractions.
+The following capabilities are enabled by `opt_einsum`:
+* Inspect [detailed information](http://optimized-einsum.readthedocs.io/en/latest/path_finding.html) about the path chosen.
+* Perform contractions with [numerous backends](http://optimized-einsum.readthedocs.io/en/latest/backends.html), including on the GPU and with libraries such as [TensorFlow](https://www.tensorflow.org) and [PyTorch](https://pytorch.org).
+* Generate [reusable expressions](http://optimized-einsum.readthedocs.io/en/latest/reusing_paths.html), potentially with [constant tensors](http://optimized-einsum.readthedocs.io/en/latest/reusing_paths.html#specifying-constants), that can be compiled for greater performance (see the sketch after this list).
+* Use an arbitrary number of indices to find contractions for [hundreds or even thousands of tensors](http://optimized-einsum.readthedocs.io/en/latest/ex_large_expr_with_greedy.html).
+* Share [intermediate computations](http://optimized-einsum.readthedocs.io/en/latest/sharing_intermediates.html) among multiple contractions.
+* Compute gradients of tensor contractions using [autograd](https://github.com/HIPS/autograd) or [jax](https://github.com/google/jax).
+Please see the [documentation](http://optimized-einsum.readthedocs.io/en/latest/?badge=latest) for more features!
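+As a minimal sketch of the reusable-expression feature above, the contraction
+path can be found once from the operand shapes alone and then reapplied to
+concrete arrays via `contract_expression`:
+```python
+import numpy as np
+from opt_einsum import contract_expression
+
+N = 10
+# Build the expression once: the path is optimized from the shapes alone.
+expr = contract_expression('pi,qj,ijkl,rk,sl->pqrs',
+                           (N, N), (N, N), (N, N, N, N), (N, N), (N, N))
+
+C = np.random.rand(N, N)
+I = np.random.rand(N, N, N, N)
+# Evaluate the precomputed expression on concrete arrays, reusing the path.
+result = expr(C, C, I, C, C)
+```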
+## Installation
+`opt_einsum` can be installed either via pip (`pip install opt_einsum`) or via conda (`conda install opt_einsum -c conda-forge`). See the installation [documentation](http://optimized-einsum.readthedocs.io/en/latest/install.html) for further methods.
+## Citation
+If this code has benefited your research, please support us by citing:
+Daniel G. A. Smith and Johnnie Gray, opt_einsum - A Python package for optimizing contraction order for einsum-like expressions. *Journal of Open Source Software*, **2018**, 3(26), 753
+DOI: https://doi.org/10.21105/joss.00753
+## Contributing
+All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
+A detailed overview on how to contribute can be found in the [contributing guide](https://github.com/dgasmith/opt_einsum/blob/master/.github/CONTRIBUTING.md).
+
+%package -n python3-opt-einsum
+Summary: Optimizing NumPy's einsum function
+Provides: python-opt-einsum
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-opt-einsum
+Optimized einsum can significantly reduce the overall execution time of einsum-like expressions (e.g.,
+[`np.einsum`](https://docs.scipy.org/doc/numpy/reference/generated/numpy.einsum.html),
+[`dask.array.einsum`](https://docs.dask.org/en/latest/array-api.html#dask.array.einsum),
+[`pytorch.einsum`](https://pytorch.org/docs/stable/torch.html#torch.einsum), and
+[`tensorflow.einsum`](https://www.tensorflow.org/api_docs/python/tf/einsum))
+by optimizing the expression's contraction order and dispatching many
+operations to canonical BLAS, cuBLAS, or other specialized routines. Optimized
+einsum is agnostic to the backend and can handle NumPy, Dask, PyTorch,
+TensorFlow, CuPy, Sparse, Theano, JAX, and Autograd arrays, as well as potentially
+any library which conforms to a standard API. See the
+[**documentation**](http://optimized-einsum.readthedocs.io) for more
+information.
+## Example usage
+The [`opt_einsum.contract`](https://optimized-einsum.readthedocs.io/en/latest/autosummary/opt_einsum.contract.html#opt-einsum-contract)
+function can often act as a drop-in replacement for `einsum`
+functions without further changes to the code while providing superior performance.
+Here, a tensor contraction is performed with and without optimization:
+```python
+import numpy as np
+from opt_einsum import contract
+N = 10
+C = np.random.rand(N, N)
+I = np.random.rand(N, N, N, N)
+%timeit np.einsum('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
+1 loops, best of 3: 934 ms per loop
+%timeit contract('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
+1000 loops, best of 3: 324 us per loop
+```
+In this particular example, we see a ~3000x performance improvement, which is
+not uncommon when compared against unoptimized contractions. See the [backend
+examples](https://optimized-einsum.readthedocs.io/en/latest/backends.html)
+for more information on using other backends.
+## Features
+The algorithms found in this repository often power the `einsum` optimizations
+in many of the above projects. For example, the optimization of `np.einsum`
+has been passed upstream and most of the same features that can be found in
+this repository can be enabled with `np.einsum(..., optimize=True)`. However,
+this repository often has more up-to-date algorithms for complex contractions.
+The following capabilities are enabled by `opt_einsum`:
+* Inspect [detailed information](http://optimized-einsum.readthedocs.io/en/latest/path_finding.html) about the path chosen.
+* Perform contractions with [numerous backends](http://optimized-einsum.readthedocs.io/en/latest/backends.html), including on the GPU and with libraries such as [TensorFlow](https://www.tensorflow.org) and [PyTorch](https://pytorch.org).
+* Generate [reusable expressions](http://optimized-einsum.readthedocs.io/en/latest/reusing_paths.html), potentially with [constant tensors](http://optimized-einsum.readthedocs.io/en/latest/reusing_paths.html#specifying-constants), that can be compiled for greater performance.
+* Use an arbitrary number of indices to find contractions for [hundreds or even thousands of tensors](http://optimized-einsum.readthedocs.io/en/latest/ex_large_expr_with_greedy.html).
+* Share [intermediate computations](http://optimized-einsum.readthedocs.io/en/latest/sharing_intermediates.html) among multiple contractions.
+* Compute gradients of tensor contractions using [autograd](https://github.com/HIPS/autograd) or [jax](https://github.com/google/jax).
+Please see the [documentation](http://optimized-einsum.readthedocs.io/en/latest/?badge=latest) for more features!
+## Installation
+`opt_einsum` can be installed either via pip (`pip install opt_einsum`) or via conda (`conda install opt_einsum -c conda-forge`). See the installation [documentation](http://optimized-einsum.readthedocs.io/en/latest/install.html) for further methods.
+## Citation
+If this code has benefited your research, please support us by citing:
+Daniel G. A. Smith and Johnnie Gray, opt_einsum - A Python package for optimizing contraction order for einsum-like expressions. *Journal of Open Source Software*, **2018**, 3(26), 753
+DOI: https://doi.org/10.21105/joss.00753
+## Contributing
+All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
+A detailed overview on how to contribute can be found in the [contributing guide](https://github.com/dgasmith/opt_einsum/blob/master/.github/CONTRIBUTING.md).
+
+%package help
+Summary: Development documents and examples for opt-einsum
+Provides: python3-opt-einsum-doc
+%description help
+Optimized einsum can significantly reduce the overall execution time of einsum-like expressions (e.g.,
+[`np.einsum`](https://docs.scipy.org/doc/numpy/reference/generated/numpy.einsum.html),
+[`dask.array.einsum`](https://docs.dask.org/en/latest/array-api.html#dask.array.einsum),
+[`pytorch.einsum`](https://pytorch.org/docs/stable/torch.html#torch.einsum), and
+[`tensorflow.einsum`](https://www.tensorflow.org/api_docs/python/tf/einsum))
+by optimizing the expression's contraction order and dispatching many
+operations to canonical BLAS, cuBLAS, or other specialized routines. Optimized
+einsum is agnostic to the backend and can handle NumPy, Dask, PyTorch,
+TensorFlow, CuPy, Sparse, Theano, JAX, and Autograd arrays, as well as potentially
+any library which conforms to a standard API. See the
+[**documentation**](http://optimized-einsum.readthedocs.io) for more
+information.
+## Example usage
+The [`opt_einsum.contract`](https://optimized-einsum.readthedocs.io/en/latest/autosummary/opt_einsum.contract.html#opt-einsum-contract)
+function can often act as a drop-in replacement for `einsum`
+functions without further changes to the code while providing superior performance.
+Here, a tensor contraction is performed with and without optimization:
+```python
+import numpy as np
+from opt_einsum import contract
+N = 10
+C = np.random.rand(N, N)
+I = np.random.rand(N, N, N, N)
+%timeit np.einsum('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
+1 loops, best of 3: 934 ms per loop
+%timeit contract('pi,qj,ijkl,rk,sl->pqrs', C, C, I, C, C)
+1000 loops, best of 3: 324 us per loop
+```
+In this particular example, we see a ~3000x performance improvement, which is
+not uncommon when compared against unoptimized contractions. See the [backend
+examples](https://optimized-einsum.readthedocs.io/en/latest/backends.html)
+for more information on using other backends.
+## Features
+The algorithms found in this repository often power the `einsum` optimizations
+in many of the above projects. For example, the optimization of `np.einsum`
+has been passed upstream and most of the same features that can be found in
+this repository can be enabled with `np.einsum(..., optimize=True)`. However,
+this repository often has more up-to-date algorithms for complex contractions.
+The following capabilities are enabled by `opt_einsum`:
+* Inspect [detailed information](http://optimized-einsum.readthedocs.io/en/latest/path_finding.html) about the path chosen.
+* Perform contractions with [numerous backends](http://optimized-einsum.readthedocs.io/en/latest/backends.html), including on the GPU and with libraries such as [TensorFlow](https://www.tensorflow.org) and [PyTorch](https://pytorch.org).
+* Generate [reusable expressions](http://optimized-einsum.readthedocs.io/en/latest/reusing_paths.html), potentially with [constant tensors](http://optimized-einsum.readthedocs.io/en/latest/reusing_paths.html#specifying-constants), that can be compiled for greater performance.
+* Use an arbitrary number of indices to find contractions for [hundreds or even thousands of tensors](http://optimized-einsum.readthedocs.io/en/latest/ex_large_expr_with_greedy.html).
+* Share [intermediate computations](http://optimized-einsum.readthedocs.io/en/latest/sharing_intermediates.html) among multiple contractions.
+* Compute gradients of tensor contractions using [autograd](https://github.com/HIPS/autograd) or [jax](https://github.com/google/jax).
+Please see the [documentation](http://optimized-einsum.readthedocs.io/en/latest/?badge=latest) for more features!
+## Installation
+`opt_einsum` can be installed either via pip (`pip install opt_einsum`) or via conda (`conda install opt_einsum -c conda-forge`). See the installation [documentation](http://optimized-einsum.readthedocs.io/en/latest/install.html) for further methods.
+## Citation
+If this code has benefited your research, please support us by citing:
+Daniel G. A. Smith and Johnnie Gray, opt_einsum - A Python package for optimizing contraction order for einsum-like expressions. *Journal of Open Source Software*, **2018**, 3(26), 753
+DOI: https://doi.org/10.21105/joss.00753
+## Contributing
+All contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.
+A detailed overview on how to contribute can be found in the [contributing guide](https://github.com/dgasmith/opt_einsum/blob/master/.github/CONTRIBUTING.md).
+
+%prep
+%autosetup -n opt_einsum-3.3.0
+
+%build
+%py3_build
+
+%install
+%py3_install
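+# Ship any upstream doc/example directories alongside the package documentation.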
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
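+# Build a manifest of every installed file so the %files sections can consume it via -f.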
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
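+# Man pages are gzip-compressed by brp-compress after %install, so record their final .gz names.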
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
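+# filelist.lst and doclist.lst were generated and moved out of %{buildroot} during %install.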
+%files -n python3-opt-einsum -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 3.3.0-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..dd4450f
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+acf0a3997aab84b4e9a854296cc34971 opt_einsum-3.3.0.tar.gz