author     CoprDistGit <infra@openeuler.org>  2023-05-10 06:34:17 +0000
committer  CoprDistGit <infra@openeuler.org>  2023-05-10 06:34:17 +0000
commit     2a6020530af99832387aaaedf6ea28803863412d (patch)
tree       6a9343f1038659f1a8652c6cb12a74bb097b72de
parent     b92ee3436b00dd02a04fbf14cee6b3115d525e03 (diff)
automatic import of python-bert-multitask-learning
-rw-r--r--  .gitignore                           |   1 +
-rw-r--r--  python-bert-multitask-learning.spec  | 364 +++++
-rw-r--r--  sources                              |   1 +
3 files changed, 366 insertions(+), 0 deletions(-)
diff --git a/.gitignore b/.gitignore
index e69de29..e36d738 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/bert_multitask_learning-0.7.0.tar.gz
diff --git a/python-bert-multitask-learning.spec b/python-bert-multitask-learning.spec
new file mode 100644
index 0000000..af42038
--- /dev/null
+++ b/python-bert-multitask-learning.spec
@@ -0,0 +1,364 @@
+%global _empty_manifest_terminate_build 0
+Name: python-bert-multitask-learning
+Version: 0.7.0
+Release: 1
+Summary: BERT for Multi-task Learning
+License: MIT
+URL: https://github.com/JayYip/bert-multitask-learning
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/55/9d/12581fd57c88e19308746a67f1d76f6356c91cbcbd1d123ec346c4e35620/bert_multitask_learning-0.7.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-numpy
+Requires: python3-joblib
+Requires: python3-tqdm
+Requires: python3-six
+Requires: python3-pandas
+Requires: python3-setuptools
+Requires: python3-nltk
+Requires: python3-scikit-learn
+Requires: python3-transformers
+Requires: python3-tensorflow-addons
+
+%description
+# Bert for Multi-task Learning
+
+
+
+![python](https://img.shields.io/badge/python%20-3.6.0-brightgreen.svg) [![tensorflow](https://img.shields.io/badge/tensorflow-1.13.1-green.svg)](https://www.tensorflow.org/) [![PyPI version fury.io](https://badge.fury.io/py/ansicolortags.svg)](https://pypi.python.org/pypi/bert-multitask-learning/) [![PyPI license](https://img.shields.io/pypi/l/ansicolortags.svg)](https://pypi.python.org/pypi/bert-multitask-learning/)
+
+[Chinese documentation](#Bert多任务学习)
+
+**Note: Since 0.4.0, tf version >= 2.1 is required.**
+
+## Install
+
+```
+pip install bert-multitask-learning
+```
+
+## What is it
+
+This is a project that uses transformers (based on Huggingface transformers) to do **multi-modal multi-task learning**.
+
+## Why do I need this
+
+In the original BERT code, neither multi-task learning nor multi-GPU training is possible. In addition, the original purpose of this project is NER, which does not have a working script in the original BERT code.
+
+To sum up, compared to the original BERT repo, this repo has the following features:
+
+1. Multi-modal multi-task learning (the major reason for rewriting the majority of the code).
+2. Multi-GPU training
+3. Support for sequence labeling (for example, NER) and Encoder-Decoder Seq2Seq (with a transformer decoder).
+
+## What type of problems are supported?
+
+- Masked LM and next sentence prediction pre-training (pretrain)
+- Classification (cls)
+- Sequence Labeling (seq_tag)
+- Multi-Label Classification (multi_cls)
+- Multi-modal Mask LM (mask_lm)
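+
+The codes in parentheses are the identifiers used when referring to a problem type. As a hypothetical sketch (the dict name `problem_type_dict` and the problem names below are illustrative assumptions, not taken from this spec), a mapping from problem names to these type codes might look like:
+
+```python
+# Hypothetical example only: maps each problem name to one of the
+# supported type codes listed above (the names here are illustrative).
+problem_type_dict = {
+    'weibo_ner': 'seq_tag',        # sequence labeling
+    'imdb_cls': 'cls',             # single-label classification
+    'imdb_multi_cls': 'multi_cls', # multi-label classification
+}
+```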
+
+## How to run pre-defined problems
+
+There are two chaining operators that can be used to chain problems:
+
+- `&`. If two problems have the same inputs, they can be chained using `&`. Problems chained by `&` will be trained at the same time.
+- `|`. If two problems don't have the same inputs, they need to be chained using `|`. Problems chained by `|` will be sampled for training at each instance.
+
+For example, given `cws|NER|weibo_ner&weibo_cws`, one problem will be sampled at each turn; say `weibo_ner&weibo_cws` is sampled, then `weibo_ner` and `weibo_cws` will be trained together for this turn. Therefore, in a particular batch, some tasks might not be sampled, and their loss could be 0 in that batch.
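+
+As a quick illustration, here is a minimal training sketch (assuming the package exposes a `train_bert_multitask` helper and that the `weibo_ner` and `weibo_cws` problems are already registered; neither is shown in this spec):
+
+```python
+# Hedged sketch: `train_bert_multitask` and its keyword arguments are
+# assumptions about the bert-multitask-learning API, not confirmed here.
+from bert_multitask_learning import train_bert_multitask
+
+# '&' joins problems that share inputs and are trained together;
+# '|' joins problem chunks that are sampled per training instance.
+problem = 'weibo_ner&weibo_cws'
+
+model = train_bert_multitask(problem=problem, num_epochs=1)
+```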
+
+Please see the examples in [notebooks](notebooks/) for more details about training, evaluating and exporting models.
+
+
+# Bert多任务学习
+
+**Note: tf>=2.1 is required since version 0.4.0**
+
+## Installation
+
+```
+pip install bert-multitask-learning
+```
+
+## What is it
+
+This is a project that uses transformers (based on Huggingface transformers) for **multi-modal multi-task learning**.
+
+## Why do I need this project
+
+In the original BERT code, there is no direct way to do multi-task learning with multiple GPUs. In addition, BERT does not provide training code for sequence labeling or Seq2seq.
+
+Therefore, compared to the original BERT, this project has the following features:
+
+1. Multi-task learning
+2. Multi-GPU training
+3. Support for sequence labeling and Encoder-decoder seq2seq (with a transformer decoder)
+
+## Currently supported task types
+
+- Masked LM and next sentence prediction pre-training (pretrain)
+- Single-label classification (cls)
+- Sequence labeling (seq_tag)
+- Multi-label classification (multi_cls)
+- Multi-modal Mask LM (mask_lm)
+
+## How to run pre-defined problems
+
+Two operators can be used to chain multiple problems together:
+
+- `&`. If two problems have the same inputs but different labels, they **can** be chained with `&`. Problems chained by `&` will be trained at the same time.
+- `|`. If two problems have different inputs, they **must** be chained with `|`. Problems chained by `|` will be randomly sampled for training.
+
+For example, if we define the problem `cws|NER|weibo_ner&weibo_cws`, then when generating each training instance one problem chunk is randomly sampled; say `weibo_ner&weibo_cws` is selected this time, then `weibo_ner` and `weibo_cws` will be trained together for this turn. Therefore, in a given batch, some tasks might not be sampled, and their loss will be 0.
+
+For training, evaluation and model export, see [notebooks](notebooks/)
+
+
+
+
+%package -n python3-bert-multitask-learning
+Summary: BERT for Multi-task Learning
+Provides: python-bert-multitask-learning
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-bert-multitask-learning
+# Bert for Multi-task Learning
+
+
+
+![python](https://img.shields.io/badge/python%20-3.6.0-brightgreen.svg) [![tensorflow](https://img.shields.io/badge/tensorflow-1.13.1-green.svg)](https://www.tensorflow.org/) [![PyPI version fury.io](https://badge.fury.io/py/ansicolortags.svg)](https://pypi.python.org/pypi/bert-multitask-learning/) [![PyPI license](https://img.shields.io/pypi/l/ansicolortags.svg)](https://pypi.python.org/pypi/bert-multitask-learning/)
+
+[Chinese documentation](#Bert多任务学习)
+
+**Note: Since 0.4.0, tf version >= 2.1 is required.**
+
+## Install
+
+```
+pip install bert-multitask-learning
+```
+
+## What is it
+
+This is a project that uses transformers (based on Huggingface transformers) to do **multi-modal multi-task learning**.
+
+## Why do I need this
+
+In the original BERT code, neither multi-task learning nor multi-GPU training is possible. In addition, the original purpose of this project is NER, which does not have a working script in the original BERT code.
+
+To sum up, compared to the original BERT repo, this repo has the following features:
+
+1. Multi-modal multi-task learning (the major reason for rewriting the majority of the code).
+2. Multi-GPU training
+3. Support for sequence labeling (for example, NER) and Encoder-Decoder Seq2Seq (with a transformer decoder).
+
+## What type of problems are supported?
+
+- Masked LM and next sentence prediction pre-training (pretrain)
+- Classification (cls)
+- Sequence Labeling (seq_tag)
+- Multi-Label Classification (multi_cls)
+- Multi-modal Mask LM (mask_lm)
+
+## How to run pre-defined problems
+
+There are two chaining operators that can be used to chain problems:
+
+- `&`. If two problems have the same inputs, they can be chained using `&`. Problems chained by `&` will be trained at the same time.
+- `|`. If two problems don't have the same inputs, they need to be chained using `|`. Problems chained by `|` will be sampled for training at each instance.
+
+For example, given `cws|NER|weibo_ner&weibo_cws`, one problem will be sampled at each turn; say `weibo_ner&weibo_cws` is sampled, then `weibo_ner` and `weibo_cws` will be trained together for this turn. Therefore, in a particular batch, some tasks might not be sampled, and their loss could be 0 in that batch.
+
+Please see the examples in [notebooks](notebooks/) for more details about training, evaluating and exporting models.
+
+
+# Bert多任务学习
+
+**Note: tf>=2.1 is required since version 0.4.0**
+
+## Installation
+
+```
+pip install bert-multitask-learning
+```
+
+## What is it
+
+This is a project that uses transformers (based on Huggingface transformers) for **multi-modal multi-task learning**.
+
+## Why do I need this project
+
+In the original BERT code, there is no direct way to do multi-task learning with multiple GPUs. In addition, BERT does not provide training code for sequence labeling or Seq2seq.
+
+Therefore, compared to the original BERT, this project has the following features:
+
+1. Multi-task learning
+2. Multi-GPU training
+3. Support for sequence labeling and Encoder-decoder seq2seq (with a transformer decoder)
+
+## Currently supported task types
+
+- Masked LM and next sentence prediction pre-training (pretrain)
+- Single-label classification (cls)
+- Sequence labeling (seq_tag)
+- Multi-label classification (multi_cls)
+- Multi-modal Mask LM (mask_lm)
+
+## How to run pre-defined problems
+
+Two operators can be used to chain multiple problems together:
+
+- `&`. If two problems have the same inputs but different labels, they **can** be chained with `&`. Problems chained by `&` will be trained at the same time.
+- `|`. If two problems have different inputs, they **must** be chained with `|`. Problems chained by `|` will be randomly sampled for training.
+
+For example, if we define the problem `cws|NER|weibo_ner&weibo_cws`, then when generating each training instance one problem chunk is randomly sampled; say `weibo_ner&weibo_cws` is selected this time, then `weibo_ner` and `weibo_cws` will be trained together for this turn. Therefore, in a given batch, some tasks might not be sampled, and their loss will be 0.
+
+For training, evaluation and model export, see [notebooks](notebooks/)
+
+
+
+
+%package help
+Summary: Development documents and examples for bert-multitask-learning
+Provides: python3-bert-multitask-learning-doc
+%description help
+# Bert for Multi-task Learning
+
+
+
+![python](https://img.shields.io/badge/python%20-3.6.0-brightgreen.svg) [![tensorflow](https://img.shields.io/badge/tensorflow-1.13.1-green.svg)](https://www.tensorflow.org/) [![PyPI version fury.io](https://badge.fury.io/py/ansicolortags.svg)](https://pypi.python.org/pypi/bert-multitask-learning/) [![PyPI license](https://img.shields.io/pypi/l/ansicolortags.svg)](https://pypi.python.org/pypi/bert-multitask-learning/)
+
+[Chinese documentation](#Bert多任务学习)
+
+**Note: Since 0.4.0, tf version >= 2.1 is required.**
+
+## Install
+
+```
+pip install bert-multitask-learning
+```
+
+## What is it
+
+This is a project that uses transformers (based on Huggingface transformers) to do **multi-modal multi-task learning**.
+
+## Why do I need this
+
+In the original BERT code, neither multi-task learning nor multi-GPU training is possible. In addition, the original purpose of this project is NER, which does not have a working script in the original BERT code.
+
+To sum up, compared to the original BERT repo, this repo has the following features:
+
+1. Multi-modal multi-task learning (the major reason for rewriting the majority of the code).
+2. Multi-GPU training
+3. Support for sequence labeling (for example, NER) and Encoder-Decoder Seq2Seq (with a transformer decoder).
+
+## What type of problems are supported?
+
+- Masked LM and next sentence prediction pre-training (pretrain)
+- Classification (cls)
+- Sequence Labeling (seq_tag)
+- Multi-Label Classification (multi_cls)
+- Multi-modal Mask LM (mask_lm)
+
+## How to run pre-defined problems
+
+There are two chaining operators that can be used to chain problems:
+
+- `&`. If two problems have the same inputs, they can be chained using `&`. Problems chained by `&` will be trained at the same time.
+- `|`. If two problems don't have the same inputs, they need to be chained using `|`. Problems chained by `|` will be sampled for training at each instance.
+
+For example, given `cws|NER|weibo_ner&weibo_cws`, one problem will be sampled at each turn; say `weibo_ner&weibo_cws` is sampled, then `weibo_ner` and `weibo_cws` will be trained together for this turn. Therefore, in a particular batch, some tasks might not be sampled, and their loss could be 0 in that batch.
+
+Please see the examples in [notebooks](notebooks/) for more details about training, evaluating and exporting models.
+
+
+# Bert多任务学习
+
+**Note: tf>=2.1 is required since version 0.4.0**
+
+## Installation
+
+```
+pip install bert-multitask-learning
+```
+
+## What is it
+
+This is a project that uses transformers (based on Huggingface transformers) for **multi-modal multi-task learning**.
+
+## Why do I need this project
+
+In the original BERT code, there is no direct way to do multi-task learning with multiple GPUs. In addition, BERT does not provide training code for sequence labeling or Seq2seq.
+
+Therefore, compared to the original BERT, this project has the following features:
+
+1. Multi-task learning
+2. Multi-GPU training
+3. Support for sequence labeling and Encoder-decoder seq2seq (with a transformer decoder)
+
+## Currently supported task types
+
+- Masked LM and next sentence prediction pre-training (pretrain)
+- Single-label classification (cls)
+- Sequence labeling (seq_tag)
+- Multi-label classification (multi_cls)
+- Multi-modal Mask LM (mask_lm)
+
+## How to run pre-defined problems
+
+Two operators can be used to chain multiple problems together:
+
+- `&`. If two problems have the same inputs but different labels, they **can** be chained with `&`. Problems chained by `&` will be trained at the same time.
+- `|`. If two problems have different inputs, they **must** be chained with `|`. Problems chained by `|` will be randomly sampled for training.
+
+For example, if we define the problem `cws|NER|weibo_ner&weibo_cws`, then when generating each training instance one problem chunk is randomly sampled; say `weibo_ner&weibo_cws` is selected this time, then `weibo_ner` and `weibo_cws` will be trained together for this turn. Therefore, in a given batch, some tasks might not be sampled, and their loss will be 0.
+
+For training, evaluation and model export, see [notebooks](notebooks/)
+
+
+
+
+%prep
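+# Unpack the PyPI sdist; -n matches the directory name inside the tarball.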
+%autosetup -n bert-multitask-learning-0.7.0
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
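+# Walk the install root and record every installed file, generating the
+# file lists (filelist.lst, doclist.lst) consumed by the sections below.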
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-bert-multitask-learning -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Wed May 10 2023 Python_Bot <Python_Bot@openeuler.org> - 0.7.0-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..bf987c4
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+aee2f11572da890d4964433e8dfcda3c bert_multitask_learning-0.7.0.tar.gz