| author | CoprDistGit <infra@openeuler.org> | 2023-04-12 03:58:36 +0000 |
|---|---|---|
| committer | CoprDistGit <infra@openeuler.org> | 2023-04-12 03:58:36 +0000 |
| commit | 900e51c47fbe0176320a12d4c3e6546453459436 (patch) | |
| tree | 9edca5f30c772f8c6fafbefc4766d15bae5303f1 | |
| parent | f8674ed2320f58b373976018c4a2d897515d8f96 (diff) | |
automatic import of python-opennmt-tf
| -rw-r--r-- | .gitignore | 1 |
| -rw-r--r-- | python-opennmt-tf.spec | 305 |
| -rw-r--r-- | sources | 1 |
3 files changed, 307 insertions, 0 deletions
@@ -0,0 +1 @@
+/OpenNMT-tf-2.31.0.tar.gz
diff --git a/python-opennmt-tf.spec b/python-opennmt-tf.spec
new file mode 100644
index 0000000..c46f1b4
--- /dev/null
+++ b/python-opennmt-tf.spec
@@ -0,0 +1,305 @@
+%global _empty_manifest_terminate_build 0
+Name: python-OpenNMT-tf
+Version: 2.31.0
+Release: 1
+Summary: Neural machine translation and sequence learning using TensorFlow
+License: MIT
+URL: https://opennmt.net
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/5d/8d/4b12ae213b41eb63e19b1d0a7f4e71287dc0ff5b3e13ef8b098fbd8ab169/OpenNMT-tf-2.31.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-ctranslate2
+Requires: python3-packaging
+Requires: python3-pyonmttok
+Requires: python3-pyyaml
+Requires: python3-rouge
+Requires: python3-sacrebleu
+Requires: python3-tensorflow-addons
+Requires: python3-myst-parser
+Requires: python3-sphinx-rtd-theme
+Requires: python3-sphinx
+Requires: python3-tensorflow
+Requires: python3-tensorflow-text
+Requires: python3-black
+Requires: python3-flake8
+Requires: python3-isort
+Requires: python3-parameterized
+Requires: python3-pytest-cov
+
+%description
+OpenNMT-tf also implements most of the techniques commonly used to train and evaluate sequence models, such as:
+* automatic evaluation during the training
+* multiple decoding strategy: greedy search, beam search, random sampling
+* N-best rescoring
+* gradient accumulation
+* scheduled sampling
+* checkpoint averaging
+* ... and more!
+*See the [documentation](https://opennmt.net/OpenNMT-tf/) to learn how to use these features.*
+## Usage
+OpenNMT-tf requires:
+* Python 3.7 or above
+* TensorFlow 2.6, 2.7, 2.8, 2.9, 2.10, or 2.11
+We recommend installing it with `pip`:
+```bash
+pip install --upgrade pip
+pip install OpenNMT-tf
+```
+*See the [documentation](https://opennmt.net/OpenNMT-tf/installation.html) for more information.*
+### Command line
+OpenNMT-tf comes with several command line utilities to prepare data, train, and evaluate models.
+For all tasks involving a model execution, OpenNMT-tf uses a unique entrypoint: `onmt-main`. A typical OpenNMT-tf run consists of 3 elements:
+* the **model** type
+* the **parameters** described in a YAML file
+* the **run** type such as `train`, `eval`, `infer`, `export`, `score`, `average_checkpoints`, or `update_vocab`
+that are passed to the main script:
+```
+onmt-main --model_type <model> --config <config_file.yml> --auto_config <run_type> <run_options>
+```
+*For more information and examples on how to use OpenNMT-tf, please visit [our documentation](https://opennmt.net/OpenNMT-tf).*
+### Library
+OpenNMT-tf also exposes [well-defined and stable APIs](https://opennmt.net/OpenNMT-tf/package/overview.html), from high-level training utilities to low-level model layers and dataset transformations.
+For example, the `Runner` class can be used to train and evaluate models with few lines of code:
+```python
+import opennmt
+config = {
+    "model_dir": "/data/wmt-ende/checkpoints/",
+    "data": {
+        "source_vocabulary": "/data/wmt-ende/joint-vocab.txt",
+        "target_vocabulary": "/data/wmt-ende/joint-vocab.txt",
+        "train_features_file": "/data/wmt-ende/train.en",
+        "train_labels_file": "/data/wmt-ende/train.de",
+        "eval_features_file": "/data/wmt-ende/valid.en",
+        "eval_labels_file": "/data/wmt-ende/valid.de",
+    }
+}
+model = opennmt.models.TransformerBase()
+runner = opennmt.Runner(model, config, auto_config=True)
+runner.train(num_devices=2, with_eval=True)
+```
+Here is another example using OpenNMT-tf to run efficient beam search with a self-attentional decoder:
+```python
+decoder = opennmt.decoders.SelfAttentionDecoder(num_layers=6, vocab_size=32000)
+initial_state = decoder.initial_state(
+    memory=memory, memory_sequence_length=memory_sequence_length
+)
+batch_size = tf.shape(memory)[0]
+start_ids = tf.fill([batch_size], opennmt.START_OF_SENTENCE_ID)
+decoding_result = decoder.dynamic_decode(
+    target_embedding,
+    start_ids=start_ids,
+    initial_state=initial_state,
+    decoding_strategy=opennmt.utils.BeamSearch(4),
+)
+```
+More examples using OpenNMT-tf as a library can be found online:
+* The directory [examples/library](https://github.com/OpenNMT/OpenNMT-tf/tree/master/examples/library) contains additional examples that use OpenNMT-tf as a library
+* [nmt-wizard-docker](https://github.com/OpenNMT/nmt-wizard-docker) uses the high-level `opennmt.Runner` API to wrap OpenNMT-tf with a custom interface for training, translating, and serving
+*For a complete overview of the APIs, see the [package documentation](https://opennmt.net/OpenNMT-tf/package/overview.html).*
+## Additional resources
+* [Documentation](https://opennmt.net/OpenNMT-tf)
+* [Forum](https://forum.opennmt.net)
+* [Gitter](https://gitter.im/OpenNMT/OpenNMT-tf)
+
+%package -n python3-OpenNMT-tf
+Summary: Neural machine translation and sequence learning using TensorFlow
+Provides: python-OpenNMT-tf
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-OpenNMT-tf
+OpenNMT-tf also implements most of the techniques commonly used to train and evaluate sequence models, such as:
+* automatic evaluation during the training
+* multiple decoding strategy: greedy search, beam search, random sampling
+* N-best rescoring
+* gradient accumulation
+* scheduled sampling
+* checkpoint averaging
+* ... and more!
+*See the [documentation](https://opennmt.net/OpenNMT-tf/) to learn how to use these features.*
+## Usage
+OpenNMT-tf requires:
+* Python 3.7 or above
+* TensorFlow 2.6, 2.7, 2.8, 2.9, 2.10, or 2.11
+We recommend installing it with `pip`:
+```bash
+pip install --upgrade pip
+pip install OpenNMT-tf
+```
+*See the [documentation](https://opennmt.net/OpenNMT-tf/installation.html) for more information.*
+### Command line
+OpenNMT-tf comes with several command line utilities to prepare data, train, and evaluate models.
+For all tasks involving a model execution, OpenNMT-tf uses a unique entrypoint: `onmt-main`. A typical OpenNMT-tf run consists of 3 elements:
+* the **model** type
+* the **parameters** described in a YAML file
+* the **run** type such as `train`, `eval`, `infer`, `export`, `score`, `average_checkpoints`, or `update_vocab`
+that are passed to the main script:
+```
+onmt-main --model_type <model> --config <config_file.yml> --auto_config <run_type> <run_options>
+```
+*For more information and examples on how to use OpenNMT-tf, please visit [our documentation](https://opennmt.net/OpenNMT-tf).*
+### Library
+OpenNMT-tf also exposes [well-defined and stable APIs](https://opennmt.net/OpenNMT-tf/package/overview.html), from high-level training utilities to low-level model layers and dataset transformations.
+For example, the `Runner` class can be used to train and evaluate models with few lines of code:
+```python
+import opennmt
+config = {
+    "model_dir": "/data/wmt-ende/checkpoints/",
+    "data": {
+        "source_vocabulary": "/data/wmt-ende/joint-vocab.txt",
+        "target_vocabulary": "/data/wmt-ende/joint-vocab.txt",
+        "train_features_file": "/data/wmt-ende/train.en",
+        "train_labels_file": "/data/wmt-ende/train.de",
+        "eval_features_file": "/data/wmt-ende/valid.en",
+        "eval_labels_file": "/data/wmt-ende/valid.de",
+    }
+}
+model = opennmt.models.TransformerBase()
+runner = opennmt.Runner(model, config, auto_config=True)
+runner.train(num_devices=2, with_eval=True)
+```
+Here is another example using OpenNMT-tf to run efficient beam search with a self-attentional decoder:
+```python
+decoder = opennmt.decoders.SelfAttentionDecoder(num_layers=6, vocab_size=32000)
+initial_state = decoder.initial_state(
+    memory=memory, memory_sequence_length=memory_sequence_length
+)
+batch_size = tf.shape(memory)[0]
+start_ids = tf.fill([batch_size], opennmt.START_OF_SENTENCE_ID)
+decoding_result = decoder.dynamic_decode(
+    target_embedding,
+    start_ids=start_ids,
+    initial_state=initial_state,
+    decoding_strategy=opennmt.utils.BeamSearch(4),
+)
+```
+More examples using OpenNMT-tf as a library can be found online:
+* The directory [examples/library](https://github.com/OpenNMT/OpenNMT-tf/tree/master/examples/library) contains additional examples that use OpenNMT-tf as a library
+* [nmt-wizard-docker](https://github.com/OpenNMT/nmt-wizard-docker) uses the high-level `opennmt.Runner` API to wrap OpenNMT-tf with a custom interface for training, translating, and serving
+*For a complete overview of the APIs, see the [package documentation](https://opennmt.net/OpenNMT-tf/package/overview.html).*
+## Additional resources
+* [Documentation](https://opennmt.net/OpenNMT-tf)
+* [Forum](https://forum.opennmt.net)
+* [Gitter](https://gitter.im/OpenNMT/OpenNMT-tf)
+
+%package help
+Summary: Development documents and examples for OpenNMT-tf
+Provides: python3-OpenNMT-tf-doc
+%description help
+OpenNMT-tf also implements most of the techniques commonly used to train and evaluate sequence models, such as:
+* automatic evaluation during the training
+* multiple decoding strategy: greedy search, beam search, random sampling
+* N-best rescoring
+* gradient accumulation
+* scheduled sampling
+* checkpoint averaging
+* ... and more!
+*See the [documentation](https://opennmt.net/OpenNMT-tf/) to learn how to use these features.*
+## Usage
+OpenNMT-tf requires:
+* Python 3.7 or above
+* TensorFlow 2.6, 2.7, 2.8, 2.9, 2.10, or 2.11
+We recommend installing it with `pip`:
+```bash
+pip install --upgrade pip
+pip install OpenNMT-tf
+```
+*See the [documentation](https://opennmt.net/OpenNMT-tf/installation.html) for more information.*
+### Command line
+OpenNMT-tf comes with several command line utilities to prepare data, train, and evaluate models.
+For all tasks involving a model execution, OpenNMT-tf uses a unique entrypoint: `onmt-main`. A typical OpenNMT-tf run consists of 3 elements:
+* the **model** type
+* the **parameters** described in a YAML file
+* the **run** type such as `train`, `eval`, `infer`, `export`, `score`, `average_checkpoints`, or `update_vocab`
+that are passed to the main script:
+```
+onmt-main --model_type <model> --config <config_file.yml> --auto_config <run_type> <run_options>
+```
+*For more information and examples on how to use OpenNMT-tf, please visit [our documentation](https://opennmt.net/OpenNMT-tf).*
+### Library
+OpenNMT-tf also exposes [well-defined and stable APIs](https://opennmt.net/OpenNMT-tf/package/overview.html), from high-level training utilities to low-level model layers and dataset transformations.
+For example, the `Runner` class can be used to train and evaluate models with few lines of code:
+```python
+import opennmt
+config = {
+    "model_dir": "/data/wmt-ende/checkpoints/",
+    "data": {
+        "source_vocabulary": "/data/wmt-ende/joint-vocab.txt",
+        "target_vocabulary": "/data/wmt-ende/joint-vocab.txt",
+        "train_features_file": "/data/wmt-ende/train.en",
+        "train_labels_file": "/data/wmt-ende/train.de",
+        "eval_features_file": "/data/wmt-ende/valid.en",
+        "eval_labels_file": "/data/wmt-ende/valid.de",
+    }
+}
+model = opennmt.models.TransformerBase()
+runner = opennmt.Runner(model, config, auto_config=True)
+runner.train(num_devices=2, with_eval=True)
+```
+Here is another example using OpenNMT-tf to run efficient beam search with a self-attentional decoder:
+```python
+decoder = opennmt.decoders.SelfAttentionDecoder(num_layers=6, vocab_size=32000)
+initial_state = decoder.initial_state(
+    memory=memory, memory_sequence_length=memory_sequence_length
+)
+batch_size = tf.shape(memory)[0]
+start_ids = tf.fill([batch_size], opennmt.START_OF_SENTENCE_ID)
+decoding_result = decoder.dynamic_decode(
+    target_embedding,
+    start_ids=start_ids,
+    initial_state=initial_state,
+    decoding_strategy=opennmt.utils.BeamSearch(4),
+)
+```
+More examples using OpenNMT-tf as a library can be found online:
+* The directory [examples/library](https://github.com/OpenNMT/OpenNMT-tf/tree/master/examples/library) contains additional examples that use OpenNMT-tf as a library
+* [nmt-wizard-docker](https://github.com/OpenNMT/nmt-wizard-docker) uses the high-level `opennmt.Runner` API to wrap OpenNMT-tf with a custom interface for training, translating, and serving
+*For a complete overview of the APIs, see the [package documentation](https://opennmt.net/OpenNMT-tf/package/overview.html).*
+## Additional resources
+* [Documentation](https://opennmt.net/OpenNMT-tf)
+* [Forum](https://forum.opennmt.net)
+* [Gitter](https://gitter.im/OpenNMT/OpenNMT-tf)
+
+%prep
+%autosetup -n OpenNMT-tf-2.31.0
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-OpenNMT-tf -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Wed Apr 12 2023 Python_Bot <Python_Bot@openeuler.org> - 2.31.0-1
+- Package Spec generated
@@ -0,0 +1 @@
+bd979a3289f0f311a220a385ccf531fa OpenNMT-tf-2.31.0.tar.gz
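A spec like the one imported above is normally consumed by `rpmbuild`. The following is a minimal local-build sketch, not part of this commit; it assumes the default `~/rpmbuild` tree, network access to the Source0 mirror, and that `rpmbuild` and `dnf-plugins-core` (for `dnf builddep`) are installed.

```bash
# Minimal sketch (assumptions: default ~/rpmbuild layout, rpmbuild and dnf-plugins-core available).
mkdir -p ~/rpmbuild/{SPECS,SOURCES}
cp python-opennmt-tf.spec ~/rpmbuild/SPECS/

# Fetch the Source0 tarball referenced by the spec into SOURCES.
wget -P ~/rpmbuild/SOURCES \
  "https://mirrors.nju.edu.cn/pypi/web/packages/5d/8d/4b12ae213b41eb63e19b1d0a7f4e71287dc0ff5b3e13ef8b098fbd8ab169/OpenNMT-tf-2.31.0.tar.gz"

# Optionally compare against the md5 recorded in the sources file
# (expected: bd979a3289f0f311a220a385ccf531fa).
md5sum ~/rpmbuild/SOURCES/OpenNMT-tf-2.31.0.tar.gz

# Install the BuildRequires declared in the spec, then build source and binary RPMs.
sudo dnf builddep ~/rpmbuild/SPECS/python-opennmt-tf.spec
rpmbuild -ba ~/rpmbuild/SPECS/python-opennmt-tf.spec
```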
