| author | CoprDistGit <infra@openeuler.org> | 2023-05-15 04:35:23 +0000 |
|---|---|---|
| committer | CoprDistGit <infra@openeuler.org> | 2023-05-15 04:35:23 +0000 |
| commit | 403a0858b9b72eca586b7b87ba0690ec8ee86330 (patch) | |
| tree | 6052302effed595a4f28c06562fd032a4d8b7fea | |
| parent | 91c392c13cf87698ee24d98fc62e87de211707dd (diff) | |
automatic import of python-autonlp
| -rw-r--r-- | .gitignore | 1 |
| -rw-r--r-- | python-autonlp.spec | 429 |
| -rw-r--r-- | sources | 1 |
3 files changed, 431 insertions, 0 deletions
@@ -0,0 +1 @@
+/autonlp-0.3.7.tar.gz
diff --git a/python-autonlp.spec b/python-autonlp.spec
new file mode 100644
index 0000000..09932d9
--- /dev/null
+++ b/python-autonlp.spec
@@ -0,0 +1,429 @@
+%global _empty_manifest_terminate_build 0
+Name: python-autonlp
+Version: 0.3.7
+Release: 1
+Summary: HuggingFace/AutoNLP
+License: Apache 2.0
+URL: https://github.com/huggingface/autonlp
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/be/d3/e9843aa60363a0f21f5d02bbd71973694d74a75c187d9d67831af3f59cc4/autonlp-0.3.7.tar.gz
+BuildArch: noarch
+
+Requires: python3-loguru
+Requires: python3-requests
+Requires: python3-tqdm
+Requires: python3-prettytable
+Requires: python3-huggingface-hub
+Requires: python3-datasets
+Requires: python3-loguru
+Requires: python3-requests
+Requires: python3-tqdm
+Requires: python3-prettytable
+Requires: python3-huggingface-hub
+Requires: python3-datasets
+Requires: python3-black
+Requires: python3-isort
+Requires: python3-flake8
+Requires: python3-pytest
+Requires: python3-loguru
+Requires: python3-requests
+Requires: python3-tqdm
+Requires: python3-prettytable
+Requires: python3-huggingface-hub
+Requires: python3-datasets
+Requires: python3-recommonmark
+Requires: python3-sphinx
+Requires: python3-sphinx-markdown-tables
+Requires: python3-sphinx-rtd-theme
+Requires: python3-sphinx-copybutton
+Requires: python3-loguru
+Requires: python3-requests
+Requires: python3-tqdm
+Requires: python3-prettytable
+Requires: python3-huggingface-hub
+Requires: python3-datasets
+Requires: python3-black
+Requires: python3-isort
+Requires: python3-flake8
+
+%description
+# 🤗 AutoNLP
+
+AutoNLP: faster and easier training and deployment of SOTA NLP models
+
+## Installation
+
+You can install the AutoNLP Python package via pip. Note that you need Python >= 3.7 for AutoNLP to work properly.
+
+    pip install autonlp
+
+Please make sure that you have Git LFS installed. Check out the instructions here: https://github.com/git-lfs/git-lfs/wiki/Installation
+
+## Quick start - in the terminal
+
+Please take a look at the [AutoNLP Documentation](https://huggingface.co/docs/autonlp/) for a list of supported tasks and languages.
+
+Note:
+AutoNLP is currently in beta release. To participate in the beta, just go to https://huggingface.co/autonlp and apply 🤗
+
+First, create a project:
+
+```bash
+autonlp login --api-key YOUR_HUGGING_FACE_API_TOKEN
+autonlp create_project --name sentiment_detection --language en --task binary_classification --max_models 5
+```
+
+Upload files and start the training. You need a training and a validation split. Only CSV files are supported at the moment.
+```bash
+# Train split
+autonlp upload --project sentiment_detection --split train \
+    --col_mapping review:text,sentiment:target \
+    --files ~/datasets/train.csv
+# Validation split
+autonlp upload --project sentiment_detection --split valid \
+    --col_mapping review:text,sentiment:target \
+    --files ~/datasets/valid.csv
+```
+
+Once the files are uploaded, you can start training the model:
+```bash
+autonlp train --project sentiment_detection
+```
+
+Monitor the progress of your project.
+```bash
+# Project progress
+autonlp project_info --name sentiment_detection
+# Model metrics
+autonlp metrics --project PROJECT_ID
+```
+
+## Quick start - Python API
+
+Setting up:
+```python
+from autonlp import AutoNLP
+client = AutoNLP()
+client.login(token="YOUR_HUGGING_FACE_API_TOKEN")
+```
+
+Creating a project and uploading files to it:
+```python
+project = client.create_project(name="sentiment_detection", task="binary_classification", language="en", max_models=5)
+project.upload(
+    filepaths=["/path/to/train.csv"],
+    split="train",
+    col_mapping={
+        "review": "text",
+        "sentiment": "target",
+    })
+
+# also upload a validation split with split="valid"
+```
+
+Start the training of your models:
+```python
+project.train()
+```
+
+To monitor the progress of your training:
+```python
+project.refresh()
+print(project)
+```
+
+After the training of your models has succeeded, you can retrieve the metrics for each model and test them with the 🤗 Inference API:
+
+```python
+client.predict(project="sentiment_detection", model_id=42, input_text="i love autonlp")
+```
+
+or use the command line:
+
+```bash
+autonlp predict --project sentiment_detection --model_id 42 --sentence "i love autonlp"
+```
+
+## How much do I have to pay?
+
+It's difficult to provide an exact answer to this question; however, we have an estimator that might help you.
+Just enter the number of samples and the language and you will get an estimate. Please keep in mind that this is just an estimate and can easily over-estimate or under-estimate (we are actively working on this).
+
+```bash
+autonlp estimate --num_train_samples 10000 --project_name sentiment_detection
+```
+
+
+
+
+%package -n python3-autonlp
+Summary: HuggingFace/AutoNLP
+Provides: python-autonlp
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-autonlp
+# 🤗 AutoNLP
+
+AutoNLP: faster and easier training and deployment of SOTA NLP models
+
+## Installation
+
+You can install the AutoNLP Python package via pip. Note that you need Python >= 3.7 for AutoNLP to work properly.
+
+    pip install autonlp
+
+Please make sure that you have Git LFS installed. Check out the instructions here: https://github.com/git-lfs/git-lfs/wiki/Installation
+
+## Quick start - in the terminal
+
+Please take a look at the [AutoNLP Documentation](https://huggingface.co/docs/autonlp/) for a list of supported tasks and languages.
+
+Note:
+AutoNLP is currently in beta release. To participate in the beta, just go to https://huggingface.co/autonlp and apply 🤗
+
+First, create a project:
+
+```bash
+autonlp login --api-key YOUR_HUGGING_FACE_API_TOKEN
+autonlp create_project --name sentiment_detection --language en --task binary_classification --max_models 5
+```
+
+Upload files and start the training. You need a training and a validation split. Only CSV files are supported at the moment.
+```bash
+# Train split
+autonlp upload --project sentiment_detection --split train \
+    --col_mapping review:text,sentiment:target \
+    --files ~/datasets/train.csv
+# Validation split
+autonlp upload --project sentiment_detection --split valid \
+    --col_mapping review:text,sentiment:target \
+    --files ~/datasets/valid.csv
+```
+
+Once the files are uploaded, you can start training the model:
+```bash
+autonlp train --project sentiment_detection
+```
+
+Monitor the progress of your project.
+```bash
+# Project progress
+autonlp project_info --name sentiment_detection
+# Model metrics
+autonlp metrics --project PROJECT_ID
+```
+
+## Quick start - Python API
+
+Setting up:
+```python
+from autonlp import AutoNLP
+client = AutoNLP()
+client.login(token="YOUR_HUGGING_FACE_API_TOKEN")
+```
+
+Creating a project and uploading files to it:
+```python
+project = client.create_project(name="sentiment_detection", task="binary_classification", language="en", max_models=5)
+project.upload(
+    filepaths=["/path/to/train.csv"],
+    split="train",
+    col_mapping={
+        "review": "text",
+        "sentiment": "target",
+    })
+
+# also upload a validation split with split="valid"
+```
+
+Start the training of your models:
+```python
+project.train()
+```
+
+To monitor the progress of your training:
+```python
+project.refresh()
+print(project)
+```
+
+After the training of your models has succeeded, you can retrieve the metrics for each model and test them with the 🤗 Inference API:
+
+```python
+client.predict(project="sentiment_detection", model_id=42, input_text="i love autonlp")
+```
+
+or use the command line:
+
+```bash
+autonlp predict --project sentiment_detection --model_id 42 --sentence "i love autonlp"
+```
+
+## How much do I have to pay?
+
+It's difficult to provide an exact answer to this question; however, we have an estimator that might help you.
+Just enter the number of samples and the language and you will get an estimate. Please keep in mind that this is just an estimate and can easily over-estimate or under-estimate (we are actively working on this).
+
+```bash
+autonlp estimate --num_train_samples 10000 --project_name sentiment_detection
+```
+
+
+
+
+%package help
+Summary: Development documents and examples for autonlp
+Provides: python3-autonlp-doc
+%description help
+# 🤗 AutoNLP
+
+AutoNLP: faster and easier training and deployment of SOTA NLP models
+
+## Installation
+
+You can install the AutoNLP Python package via pip. Note that you need Python >= 3.7 for AutoNLP to work properly.
+
+    pip install autonlp
+
+Please make sure that you have Git LFS installed. Check out the instructions here: https://github.com/git-lfs/git-lfs/wiki/Installation
+
+## Quick start - in the terminal
+
+Please take a look at the [AutoNLP Documentation](https://huggingface.co/docs/autonlp/) for a list of supported tasks and languages.
+
+Note:
+AutoNLP is currently in beta release. To participate in the beta, just go to https://huggingface.co/autonlp and apply 🤗
+
+First, create a project:
+
+```bash
+autonlp login --api-key YOUR_HUGGING_FACE_API_TOKEN
+autonlp create_project --name sentiment_detection --language en --task binary_classification --max_models 5
+```
+
+Upload files and start the training. You need a training and a validation split. Only CSV files are supported at the moment.
+```bash
+# Train split
+autonlp upload --project sentiment_detection --split train \
+    --col_mapping review:text,sentiment:target \
+    --files ~/datasets/train.csv
+# Validation split
+autonlp upload --project sentiment_detection --split valid \
+    --col_mapping review:text,sentiment:target \
+    --files ~/datasets/valid.csv
+```
+
+Once the files are uploaded, you can start training the model:
+```bash
+autonlp train --project sentiment_detection
+```
+
+Monitor the progress of your project.
+```bash
+# Project progress
+autonlp project_info --name sentiment_detection
+# Model metrics
+autonlp metrics --project PROJECT_ID
+```
+
+## Quick start - Python API
+
+Setting up:
+```python
+from autonlp import AutoNLP
+client = AutoNLP()
+client.login(token="YOUR_HUGGING_FACE_API_TOKEN")
+```
+
+Creating a project and uploading files to it:
+```python
+project = client.create_project(name="sentiment_detection", task="binary_classification", language="en", max_models=5)
+project.upload(
+    filepaths=["/path/to/train.csv"],
+    split="train",
+    col_mapping={
+        "review": "text",
+        "sentiment": "target",
+    })
+
+# also upload a validation split with split="valid"
+```
+
+Start the training of your models:
+```python
+project.train()
+```
+
+To monitor the progress of your training:
+```python
+project.refresh()
+print(project)
+```
+
+After the training of your models has succeeded, you can retrieve the metrics for each model and test them with the 🤗 Inference API:
+
+```python
+client.predict(project="sentiment_detection", model_id=42, input_text="i love autonlp")
+```
+
+or use the command line:
+
+```bash
+autonlp predict --project sentiment_detection --model_id 42 --sentence "i love autonlp"
+```
+
+## How much do I have to pay?
+
+It's difficult to provide an exact answer to this question; however, we have an estimator that might help you.
+Just enter the number of samples and the language and you will get an estimate. Please keep in mind that this is just an estimate and can easily over-estimate or under-estimate (we are actively working on this).
+
+```bash
+autonlp estimate --num_train_samples 10000 --project_name sentiment_detection
+```
+
+
+
+
+%prep
+%autosetup -n autonlp-0.3.7
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-autonlp -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon May 15 2023 Python_Bot <Python_Bot@openeuler.org> - 0.3.7-1
+- Package Spec generated
@@ -0,0 +1 @@
+37cbfc7da9850cbc4558e9d6b4055985 autonlp-0.3.7.tar.gz
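The `--col_mapping review:text,sentiment:target` option in the README embedded above maps CSV columns onto the `text` and `target` fields AutoNLP expects for binary classification. As a minimal sketch with hypothetical data (the column names match the README; the review strings are invented for illustration), a matching `train.csv` could be produced like this:

```python
# Sketch: build a CSV in the shape expected by
#   autonlp upload ... --col_mapping review:text,sentiment:target
# The rows below are hypothetical example data, not from the package.
import csv
import io

rows = [
    {"review": "great product, works as advertised", "sentiment": 1},
    {"review": "arrived broken, very disappointed", "sentiment": 0},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["review", "sentiment"])
writer.writeheader()       # header row: review,sentiment
writer.writerows(rows)     # one data row per labeled example

csv_text = buf.getvalue()
print(csv_text)
```

Writing the same content to `~/datasets/train.csv` (and a second file for the `valid` split) gives files ready for the `autonlp upload` commands shown in the README.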