author    CoprDistGit <infra@openeuler.org>  2023-04-10 17:13:22 +0000
committer CoprDistGit <infra@openeuler.org>  2023-04-10 17:13:22 +0000
commit    9836fbcd4218a814526672caa21399014a093c29 (patch)
tree      2c398599888bc1752e2fedb439ada09e0b72193d
parent    3714f5f3d044ab54fa96db6666c371e20901c3cc (diff)
automatic import of python-spacy-transformers
-rw-r--r--  .gitignore                        1
-rw-r--r--  python-spacy-transformers.spec  320
-rw-r--r--  sources                           1
3 files changed, 322 insertions(+), 0 deletions(-)
diff --git a/.gitignore b/.gitignore
index e69de29..30d63b8 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/spacy-transformers-1.2.2.tar.gz
diff --git a/python-spacy-transformers.spec b/python-spacy-transformers.spec
new file mode 100644
index 0000000..cfb5ab2
--- /dev/null
+++ b/python-spacy-transformers.spec
@@ -0,0 +1,320 @@
+%global _empty_manifest_terminate_build 0
+Name: python-spacy-transformers
+Version: 1.2.2
+Release: 1
+Summary: spaCy pipelines for pre-trained BERT and other transformers
+License: MIT
+URL: https://spacy.io
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/a1/fa/711780a25596a4254c81734a5ea3aa09874732a24b02cf36503e5399a407/spacy-transformers-1.2.2.tar.gz
+
+Requires: python3-spacy
+Requires: python3-numpy
+Requires: python3-transformers
+Requires: python3-torch
+Requires: python3-srsly
+Requires: python3-spacy-alignments
+Requires: python3-dataclasses
+Requires: python3-cupy
+Requires: python3-cupy-cuda100
+Requires: python3-cupy-cuda101
+Requires: python3-cupy-cuda102
+Requires: python3-cupy-cuda110
+Requires: python3-cupy-cuda111
+Requires: python3-cupy-cuda112
+Requires: python3-cupy-cuda80
+Requires: python3-cupy-cuda90
+Requires: python3-cupy-cuda91
+Requires: python3-cupy-cuda92
+
+%description
+<a href="https://explosion.ai"><img src="https://explosion.ai/assets/img/logo.svg" width="125" height="125" align="right" /></a>
+
+# spacy-transformers: Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
+
+This package provides [spaCy](https://github.com/explosion/spaCy) components and
+architectures to use transformer models via
+[Hugging Face's `transformers`](https://github.com/huggingface/transformers) in
+spaCy. The result is convenient access to state-of-the-art transformer
+architectures, such as BERT, GPT-2, XLNet, etc.
+
+> **This release requires [spaCy v3](https://spacy.io/usage/v3).** For
+> the previous version of this library, see the
+> [`v0.6.x` branch](https://github.com/explosion/spacy-transformers/tree/v0.6.x).
+
+[![Azure Pipelines](https://img.shields.io/azure-devops/build/explosion-ai/public/18/master.svg?logo=azure-pipelines&style=flat-square)](https://dev.azure.com/explosion-ai/public/_build?definitionId=18)
+[![PyPI](https://img.shields.io/pypi/v/spacy-transformers.svg?style=flat-square&logo=pypi&logoColor=white)](https://pypi.python.org/pypi/spacy-transformers)
+[![GitHub](https://img.shields.io/github/release/explosion/spacy-transformers/all.svg?style=flat-square&logo=github)](https://github.com/explosion/spacy-transformers/releases)
+[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square)](https://github.com/ambv/black)
+
+## Features
+
+- Use pretrained transformer models like **BERT**, **RoBERTa** and **XLNet** to
+ power your spaCy pipeline.
+- Easy **multi-task learning**: backprop to one transformer model from several
+ pipeline components.
+- Train using spaCy v3's powerful and extensible config system.
+- Automatic alignment of transformer output to spaCy's tokenization.
+- Easily customize what transformer data is saved in the `Doc` object.
+- Easily customize how long documents are processed.
+- Out-of-the-box serialization and model packaging.
+
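As one hedged illustration of the config-driven training mentioned above, a `config.cfg` excerpt might wire a transformer component into the pipeline like this (the `@architectures` string is the name registered by spacy-transformers; `roberta-base` is only an example model choice):

```ini
[components.transformer]
factory = "transformer"

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "roberta-base"
```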
+## 🚀 Installation
+
+Installing the package from pip will automatically install all dependencies,
+including PyTorch and spaCy. Make sure you install this package **before** you
+install the models. Also note that this package requires **Python 3.6+**,
+**PyTorch v1.5+** and **spaCy v3.0+**.
+
+```bash
+pip install 'spacy[transformers]'
+```
+
+For GPU installation, find your CUDA version using `nvcc --version` and add the
+[version in brackets](https://spacy.io/usage/#gpu), e.g.
+`spacy[transformers,cuda92]` for CUDA9.2 or `spacy[transformers,cuda100]` for
+CUDA10.0.
+
+If you are having trouble installing PyTorch, follow the
+[instructions](https://pytorch.org/get-started/locally/) on the official website
+for your specific operating system and requirements, or try the following:
+
+```bash
+pip install spacy-transformers -f https://download.pytorch.org/whl/torch_stable.html
+```
+
+## 📖 Documentation
+
+> ⚠️ **Important note:** This package has been extensively refactored to take
+> advantage of [spaCy v3.0](https://spacy.io). Previous versions that
+> were built for [spaCy v2.x](https://v2.spacy.io) worked considerably
+> differently. Please see previous tagged versions of this README for
+> documentation on prior versions.
+
+- 📘
+ [Embeddings, Transformers and Transfer Learning](https://spacy.io/usage/embeddings-transformers):
+ How to use transformers in spaCy
+- 📘 [Training Pipelines and Models](https://spacy.io/usage/training):
+ Train and update components on your own data and integrate custom models
+- 📘
+ [Layers and Model Architectures](https://spacy.io/usage/layers-architectures):
+ Power spaCy components with custom neural networks
+- 📗 [`Transformer`](https://spacy.io/api/transformer): Pipeline
+ component API reference
+- 📗
+ [Transformer architectures](https://spacy.io/api/architectures#transformers):
+ Architectures and registered functions
+
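The automatic alignment of transformer output to spaCy's tokenization, listed under Features above, can be sketched in plain Python. This is a simplified illustration of the idea only — the package itself delegates alignment to `spacy-alignments` — and `align_subwords` is a hypothetical helper name, not part of the library's API:

```python
# Simplified sketch: map each whitespace token to the WordPiece-style
# subword pieces that cover it, so per-subword transformer vectors can
# be pooled back onto spaCy tokens. Continuation pieces start with "##".

def align_subwords(words, subwords):
    """Return, for each word, the list of subword indices covering it."""
    alignment = []
    sub_idx = 0
    for word in words:
        covered = []
        rebuilt = ""
        while sub_idx < len(subwords) and rebuilt != word:
            piece = subwords[sub_idx].lstrip("#")  # drop "##" marker
            rebuilt += piece
            covered.append(sub_idx)
            sub_idx += 1
        alignment.append(covered)
    return alignment

words = ["transformers", "rock"]
subwords = ["transform", "##ers", "rock"]
print(align_subwords(words, subwords))  # [[0, 1], [2]]
```

The real implementation handles the harder cases this sketch ignores, such as subwords that span token boundaries and tokenizer-specific normalization.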
+
+%package -n python3-spacy-transformers
+Summary: spaCy pipelines for pre-trained BERT and other transformers
+Provides: python-spacy-transformers
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+BuildRequires: python3-cffi
+BuildRequires: gcc
+BuildRequires: gdb
+%description -n python3-spacy-transformers
+<a href="https://explosion.ai"><img src="https://explosion.ai/assets/img/logo.svg" width="125" height="125" align="right" /></a>
+
+# spacy-transformers: Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
+
+This package provides [spaCy](https://github.com/explosion/spaCy) components and
+architectures to use transformer models via
+[Hugging Face's `transformers`](https://github.com/huggingface/transformers) in
+spaCy. The result is convenient access to state-of-the-art transformer
+architectures, such as BERT, GPT-2, XLNet, etc.
+
+> **This release requires [spaCy v3](https://spacy.io/usage/v3).** For
+> the previous version of this library, see the
+> [`v0.6.x` branch](https://github.com/explosion/spacy-transformers/tree/v0.6.x).
+
+[![Azure Pipelines](https://img.shields.io/azure-devops/build/explosion-ai/public/18/master.svg?logo=azure-pipelines&style=flat-square)](https://dev.azure.com/explosion-ai/public/_build?definitionId=18)
+[![PyPI](https://img.shields.io/pypi/v/spacy-transformers.svg?style=flat-square&logo=pypi&logoColor=white)](https://pypi.python.org/pypi/spacy-transformers)
+[![GitHub](https://img.shields.io/github/release/explosion/spacy-transformers/all.svg?style=flat-square&logo=github)](https://github.com/explosion/spacy-transformers/releases)
+[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square)](https://github.com/ambv/black)
+
+## Features
+
+- Use pretrained transformer models like **BERT**, **RoBERTa** and **XLNet** to
+ power your spaCy pipeline.
+- Easy **multi-task learning**: backprop to one transformer model from several
+ pipeline components.
+- Train using spaCy v3's powerful and extensible config system.
+- Automatic alignment of transformer output to spaCy's tokenization.
+- Easily customize what transformer data is saved in the `Doc` object.
+- Easily customize how long documents are processed.
+- Out-of-the-box serialization and model packaging.
+
+## 🚀 Installation
+
+Installing the package from pip will automatically install all dependencies,
+including PyTorch and spaCy. Make sure you install this package **before** you
+install the models. Also note that this package requires **Python 3.6+**,
+**PyTorch v1.5+** and **spaCy v3.0+**.
+
+```bash
+pip install 'spacy[transformers]'
+```
+
+For GPU installation, find your CUDA version using `nvcc --version` and add the
+[version in brackets](https://spacy.io/usage/#gpu), e.g.
+`spacy[transformers,cuda92]` for CUDA9.2 or `spacy[transformers,cuda100]` for
+CUDA10.0.
+
+If you are having trouble installing PyTorch, follow the
+[instructions](https://pytorch.org/get-started/locally/) on the official website
+for your specific operating system and requirements, or try the following:
+
+```bash
+pip install spacy-transformers -f https://download.pytorch.org/whl/torch_stable.html
+```
+
+## 📖 Documentation
+
+> ⚠️ **Important note:** This package has been extensively refactored to take
+> advantage of [spaCy v3.0](https://spacy.io). Previous versions that
+> were built for [spaCy v2.x](https://v2.spacy.io) worked considerably
+> differently. Please see previous tagged versions of this README for
+> documentation on prior versions.
+
+- 📘
+ [Embeddings, Transformers and Transfer Learning](https://spacy.io/usage/embeddings-transformers):
+ How to use transformers in spaCy
+- 📘 [Training Pipelines and Models](https://spacy.io/usage/training):
+ Train and update components on your own data and integrate custom models
+- 📘
+ [Layers and Model Architectures](https://spacy.io/usage/layers-architectures):
+ Power spaCy components with custom neural networks
+- 📗 [`Transformer`](https://spacy.io/api/transformer): Pipeline
+ component API reference
+- 📗
+ [Transformer architectures](https://spacy.io/api/architectures#transformers):
+ Architectures and registered functions
+
+
+%package help
+Summary: Development documents and examples for spacy-transformers
+Provides: python3-spacy-transformers-doc
+%description help
+<a href="https://explosion.ai"><img src="https://explosion.ai/assets/img/logo.svg" width="125" height="125" align="right" /></a>
+
+# spacy-transformers: Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
+
+This package provides [spaCy](https://github.com/explosion/spaCy) components and
+architectures to use transformer models via
+[Hugging Face's `transformers`](https://github.com/huggingface/transformers) in
+spaCy. The result is convenient access to state-of-the-art transformer
+architectures, such as BERT, GPT-2, XLNet, etc.
+
+> **This release requires [spaCy v3](https://spacy.io/usage/v3).** For
+> the previous version of this library, see the
+> [`v0.6.x` branch](https://github.com/explosion/spacy-transformers/tree/v0.6.x).
+
+[![Azure Pipelines](https://img.shields.io/azure-devops/build/explosion-ai/public/18/master.svg?logo=azure-pipelines&style=flat-square)](https://dev.azure.com/explosion-ai/public/_build?definitionId=18)
+[![PyPI](https://img.shields.io/pypi/v/spacy-transformers.svg?style=flat-square&logo=pypi&logoColor=white)](https://pypi.python.org/pypi/spacy-transformers)
+[![GitHub](https://img.shields.io/github/release/explosion/spacy-transformers/all.svg?style=flat-square&logo=github)](https://github.com/explosion/spacy-transformers/releases)
+[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg?style=flat-square)](https://github.com/ambv/black)
+
+## Features
+
+- Use pretrained transformer models like **BERT**, **RoBERTa** and **XLNet** to
+ power your spaCy pipeline.
+- Easy **multi-task learning**: backprop to one transformer model from several
+ pipeline components.
+- Train using spaCy v3's powerful and extensible config system.
+- Automatic alignment of transformer output to spaCy's tokenization.
+- Easily customize what transformer data is saved in the `Doc` object.
+- Easily customize how long documents are processed.
+- Out-of-the-box serialization and model packaging.
+
+## 🚀 Installation
+
+Installing the package from pip will automatically install all dependencies,
+including PyTorch and spaCy. Make sure you install this package **before** you
+install the models. Also note that this package requires **Python 3.6+**,
+**PyTorch v1.5+** and **spaCy v3.0+**.
+
+```bash
+pip install 'spacy[transformers]'
+```
+
+For GPU installation, find your CUDA version using `nvcc --version` and add the
+[version in brackets](https://spacy.io/usage/#gpu), e.g.
+`spacy[transformers,cuda92]` for CUDA9.2 or `spacy[transformers,cuda100]` for
+CUDA10.0.
+
+If you are having trouble installing PyTorch, follow the
+[instructions](https://pytorch.org/get-started/locally/) on the official website
+for your specific operating system and requirements, or try the following:
+
+```bash
+pip install spacy-transformers -f https://download.pytorch.org/whl/torch_stable.html
+```
+
+## 📖 Documentation
+
+> ⚠️ **Important note:** This package has been extensively refactored to take
+> advantage of [spaCy v3.0](https://spacy.io). Previous versions that
+> were built for [spaCy v2.x](https://v2.spacy.io) worked considerably
+> differently. Please see previous tagged versions of this README for
+> documentation on prior versions.
+
+- 📘
+ [Embeddings, Transformers and Transfer Learning](https://spacy.io/usage/embeddings-transformers):
+ How to use transformers in spaCy
+- 📘 [Training Pipelines and Models](https://spacy.io/usage/training):
+ Train and update components on your own data and integrate custom models
+- 📘
+ [Layers and Model Architectures](https://spacy.io/usage/layers-architectures):
+ Power spaCy components with custom neural networks
+- 📗 [`Transformer`](https://spacy.io/api/transformer): Pipeline
+ component API reference
+- 📗
+ [Transformer architectures](https://spacy.io/api/architectures#transformers):
+ Architectures and registered functions
+
+
+%prep
+%autosetup -n spacy-transformers-1.2.2
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-spacy-transformers -f filelist.lst
+%dir %{python3_sitearch}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 1.2.2-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..7d5579d
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+4cc7caf08495c0a4838b0e0a2a59b48a spacy-transformers-1.2.2.tar.gz