%global _empty_manifest_terminate_build 0
Name: python-spacy-transformers
Version: 1.2.3
Release: 1
Summary: spaCy pipelines for pre-trained BERT and other transformers
License: MIT
URL: https://spacy.io
Source0: https://mirrors.nju.edu.cn/pypi/web/packages/53/95/9edb2e8412ff4877ce59c0b7aac44402037fa0a49a9ad3e859cc33339329/spacy-transformers-1.2.3.tar.gz
Requires: python3-spacy
Requires: python3-numpy
Requires: python3-transformers
Requires: python3-torch
Requires: python3-srsly
Requires: python3-spacy-alignments
Requires: python3-dataclasses
Requires: python3-cupy
Requires: python3-cupy-cuda100
Requires: python3-cupy-cuda101
Requires: python3-cupy-cuda102
Requires: python3-cupy-cuda110
Requires: python3-cupy-cuda111
Requires: python3-cupy-cuda112
Requires: python3-cupy-cuda80
Requires: python3-cupy-cuda90
Requires: python3-cupy-cuda91
Requires: python3-cupy-cuda92
%description
# spacy-transformers: Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
This package provides [spaCy](https://github.com/explosion/spaCy) components and
architectures to use transformer models via
[Hugging Face's `transformers`](https://github.com/huggingface/transformers) in
spaCy. The result is convenient access to state-of-the-art transformer
architectures, such as BERT, GPT-2, XLNet, etc.
> **This release requires [spaCy v3](https://spacy.io/usage/v3).** For
> the previous version of this library, see the
> [`v0.6.x` branch](https://github.com/explosion/spacy-transformers/tree/v0.6.x).
[Azure Pipelines build](https://dev.azure.com/explosion-ai/public/_build?definitionId=18) | [PyPI](https://pypi.python.org/pypi/spacy-transformers) | [GitHub releases](https://github.com/explosion/spacy-transformers/releases) | [Code style: black](https://github.com/ambv/black)
## Features
- Use pretrained transformer models like **BERT**, **RoBERTa** and **XLNet** to
power your spaCy pipeline.
- Easy **multi-task learning**: backprop to one transformer model from several
pipeline components.
- Train using spaCy v3's powerful and extensible config system.
- Automatic alignment of transformer output to spaCy's tokenization.
- Easily customize what transformer data is saved in the `Doc` object.
- Easily customize how long documents are processed (e.g. by splitting them into smaller spans).
- Out-of-the-box serialization and model packaging.
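As a quick illustration of the features above, the following minimal sketch loads a pretrained transformer pipeline and reads the transformer output stored on the `Doc`. It assumes the `en_core_web_trf` pipeline has already been downloaded (`python -m spacy download en_core_web_trf`); the attribute names follow the `TransformerData` API:
```python
import spacy

# Assumes the pretrained transformer pipeline has been downloaded first:
#   python -m spacy download en_core_web_trf
nlp = spacy.load("en_core_web_trf")
doc = nlp("spacy-transformers runs BERT-style models inside spaCy.")

# The transformer component stores its output on the Doc under `doc._.trf_data`,
# including the wordpiece tokens and the model's output tensors.
print(doc._.trf_data.wordpieces.strings)
print(doc._.trf_data.tensors[-1].shape)
```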
## 🚀 Installation
Installing the package from pip will automatically install all dependencies,
including PyTorch and spaCy. Make sure you install this package **before** you
install the models. Also note that this package requires **Python 3.6+**,
**PyTorch v1.5+** and **spaCy v3.0+**.
```bash
pip install 'spacy[transformers]'
```
For GPU installation, find your CUDA version using `nvcc --version` and add the
[version in brackets](https://spacy.io/usage/#gpu), e.g.
`spacy[transformers,cuda92]` for CUDA9.2 or `spacy[transformers,cuda100]` for
CUDA10.0.
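After a GPU-enabled install, you can ask spaCy to use the GPU before loading a pipeline. A minimal sketch (it simply falls back to the CPU if no GPU is available):
```python
import spacy

# Returns True if a GPU was activated, False if spaCy falls back to the CPU.
use_gpu = spacy.prefer_gpu()
print("Using GPU:", use_gpu)
```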
If you are having trouble installing PyTorch, follow the
[instructions](https://pytorch.org/get-started/locally/) on the official website
for your specific operating system and requirements, or try the following:
```bash
pip install spacy-transformers -f https://download.pytorch.org/whl/torch_stable.html
```
## 📖 Documentation
> ⚠️ **Important note:** This package has been extensively refactored to take
> advantage of [spaCy v3.0](https://spacy.io). Previous versions that
> were built for [spaCy v2.x](https://v2.spacy.io) worked considerably
> differently. Please see previous tagged versions of this README for
> documentation on prior versions.
- 📘
[Embeddings, Transformers and Transfer Learning](https://spacy.io/usage/embeddings-transformers):
How to use transformers in spaCy
- 📘 [Training Pipelines and Models](https://spacy.io/usage/training):
Train and update components on your own data and integrate custom models
- 📘
[Layers and Model Architectures](https://spacy.io/usage/layers-architectures):
Power spaCy components with custom neural networks
- 📗 [`Transformer`](https://spacy.io/api/transformer): Pipeline
component API reference
- 📗
[Transformer architectures](https://spacy.io/api/architectures#transformers):
Architectures and registered functions
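As a quick taste of the `Transformer` pipeline component and architectures documented above, the following sketch (based on the API reference) adds the component to a blank pipeline with a custom model; the model name is only an illustration, and any settings not given here are filled in from the component's default config:
```python
import spacy

nlp = spacy.blank("en")

# Override only the architecture and the Hugging Face model name; the
# remaining settings come from the component's default configuration.
config = {
    "model": {
        "@architectures": "spacy-transformers.TransformerModel.v3",
        "name": "bert-base-uncased",
    }
}
nlp.add_pipe("transformer", config=config)
print(nlp.pipe_names)  # ['transformer']
```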
## Bug reports and other issues
Please use [spaCy's issue tracker](https://github.com/explosion/spaCy/issues) to report a bug, or open a new thread on the
[discussion board](https://github.com/explosion/spaCy/discussions)
for any other issue.
%package -n python3-spacy-transformers
Summary: spaCy pipelines for pre-trained BERT and other transformers
Provides: python-spacy-transformers
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
BuildRequires: python3-cffi
BuildRequires: gcc
BuildRequires: gdb
%description -n python3-spacy-transformers
# spacy-transformers: Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
This package provides [spaCy](https://github.com/explosion/spaCy) components and
architectures to use transformer models via
[Hugging Face's `transformers`](https://github.com/huggingface/transformers) in
spaCy. The result is convenient access to state-of-the-art transformer
architectures, such as BERT, GPT-2, XLNet, etc.
> **This release requires [spaCy v3](https://spacy.io/usage/v3).** For
> the previous version of this library, see the
> [`v0.6.x` branch](https://github.com/explosion/spacy-transformers/tree/v0.6.x).
[Azure Pipelines build](https://dev.azure.com/explosion-ai/public/_build?definitionId=18) | [PyPI](https://pypi.python.org/pypi/spacy-transformers) | [GitHub releases](https://github.com/explosion/spacy-transformers/releases) | [Code style: black](https://github.com/ambv/black)
## Features
- Use pretrained transformer models like **BERT**, **RoBERTa** and **XLNet** to
power your spaCy pipeline.
- Easy **multi-task learning**: backprop to one transformer model from several
pipeline components.
- Train using spaCy v3's powerful and extensible config system.
- Automatic alignment of transformer output to spaCy's tokenization.
- Easily customize what transformer data is saved in the `Doc` object.
- Easily customize how long documents are processed (e.g. by splitting them into smaller spans).
- Out-of-the-box serialization and model packaging.
## 🚀 Installation
Installing the package from pip will automatically install all dependencies,
including PyTorch and spaCy. Make sure you install this package **before** you
install the models. Also note that this package requires **Python 3.6+**,
**PyTorch v1.5+** and **spaCy v3.0+**.
```bash
pip install 'spacy[transformers]'
```
For GPU installation, find your CUDA version using `nvcc --version` and add the
[version in brackets](https://spacy.io/usage/#gpu), e.g.
`spacy[transformers,cuda92]` for CUDA9.2 or `spacy[transformers,cuda100]` for
CUDA10.0.
If you are having trouble installing PyTorch, follow the
[instructions](https://pytorch.org/get-started/locally/) on the official website
for your specific operating system and requirements, or try the following:
```bash
pip install spacy-transformers -f https://download.pytorch.org/whl/torch_stable.html
```
## 📖 Documentation
> ⚠️ **Important note:** This package has been extensively refactored to take
> advantage of [spaCy v3.0](https://spacy.io). Previous versions that
> were built for [spaCy v2.x](https://v2.spacy.io) worked considerably
> differently. Please see previous tagged versions of this README for
> documentation on prior versions.
- 📘
[Embeddings, Transformers and Transfer Learning](https://spacy.io/usage/embeddings-transformers):
How to use transformers in spaCy
- 📘 [Training Pipelines and Models](https://spacy.io/usage/training):
Train and update components on your own data and integrate custom models
- 📘
[Layers and Model Architectures](https://spacy.io/usage/layers-architectures):
Power spaCy components with custom neural networks
- 📗 [`Transformer`](https://spacy.io/api/transformer): Pipeline
component API reference
- 📗
[Transformer architectures](https://spacy.io/api/architectures#transformers):
Architectures and registered functions
## Bug reports and other issues
Please use [spaCy's issue tracker](https://github.com/explosion/spaCy/issues) to report a bug, or open a new thread on the
[discussion board](https://github.com/explosion/spaCy/discussions)
for any other issue.
%package help
Summary: Development documents and examples for spacy-transformers
Provides: python3-spacy-transformers-doc
%description help
# spacy-transformers: Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
This package provides [spaCy](https://github.com/explosion/spaCy) components and
architectures to use transformer models via
[Hugging Face's `transformers`](https://github.com/huggingface/transformers) in
spaCy. The result is convenient access to state-of-the-art transformer
architectures, such as BERT, GPT-2, XLNet, etc.
> **This release requires [spaCy v3](https://spacy.io/usage/v3).** For
> the previous version of this library, see the
> [`v0.6.x` branch](https://github.com/explosion/spacy-transformers/tree/v0.6.x).
[Azure Pipelines build](https://dev.azure.com/explosion-ai/public/_build?definitionId=18) | [PyPI](https://pypi.python.org/pypi/spacy-transformers) | [GitHub releases](https://github.com/explosion/spacy-transformers/releases) | [Code style: black](https://github.com/ambv/black)
## Features
- Use pretrained transformer models like **BERT**, **RoBERTa** and **XLNet** to
power your spaCy pipeline.
- Easy **multi-task learning**: backprop to one transformer model from several
pipeline components.
- Train using spaCy v3's powerful and extensible config system.
- Automatic alignment of transformer output to spaCy's tokenization.
- Easily customize what transformer data is saved in the `Doc` object.
- Easily customize how long documents are processed (e.g. by splitting them into smaller spans).
- Out-of-the-box serialization and model packaging.
## 🚀 Installation
Installing the package from pip will automatically install all dependencies,
including PyTorch and spaCy. Make sure you install this package **before** you
install the models. Also note that this package requires **Python 3.6+**,
**PyTorch v1.5+** and **spaCy v3.0+**.
```bash
pip install 'spacy[transformers]'
```
For GPU installation, find your CUDA version using `nvcc --version` and add the
[version in brackets](https://spacy.io/usage/#gpu), e.g.
`spacy[transformers,cuda92]` for CUDA9.2 or `spacy[transformers,cuda100]` for
CUDA10.0.
If you are having trouble installing PyTorch, follow the
[instructions](https://pytorch.org/get-started/locally/) on the official website
for your specific operating system and requirements, or try the following:
```bash
pip install spacy-transformers -f https://download.pytorch.org/whl/torch_stable.html
```
## 📖 Documentation
> ⚠️ **Important note:** This package has been extensively refactored to take
> advantage of [spaCy v3.0](https://spacy.io). Previous versions that
> were built for [spaCy v2.x](https://v2.spacy.io) worked considerably
> differently. Please see previous tagged versions of this README for
> documentation on prior versions.
- 📘
[Embeddings, Transformers and Transfer Learning](https://spacy.io/usage/embeddings-transformers):
How to use transformers in spaCy
- 📘 [Training Pipelines and Models](https://spacy.io/usage/training):
Train and update components on your own data and integrate custom models
- 📘
[Layers and Model Architectures](https://spacy.io/usage/layers-architectures):
Power spaCy components with custom neural networks
- 📗 [`Transformer`](https://spacy.io/api/transformer): Pipeline
component API reference
- 📗
[Transformer architectures](https://spacy.io/api/architectures#transformers):
Architectures and registered functions
## Bug reports and other issues
Please use [spaCy's issue tracker](https://github.com/explosion/spaCy/issues) to report a bug, or open a new thread on the
[discussion board](https://github.com/explosion/spaCy/discussions)
for any other issue.
%prep
%autosetup -n spacy-transformers-1.2.3
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-spacy-transformers -f filelist.lst
%dir %{python3_sitearch}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Fri Apr 21 2023 Python_Bot - 1.2.3-1
- Package Spec generated