%global _empty_manifest_terminate_build 0
Name:           python-ai2-tango
Version:        1.2.1
Release:        1
Summary:        A library for choreographing your machine learning research.
License:        Apache
URL:            https://github.com/allenai/tango
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/f0/0d/12f9f4b125aeff7c4dc54e25c671afaaf63748a74109a460c524493d730f/ai2-tango-1.2.1.tar.gz
BuildArch:      noarch

Requires:       python3-cached-path
Requires:       python3-rjsonnet
Requires:       python3-GitPython
Requires:       python3-PyYAML
Requires:       python3-dill
Requires:       python3-base58
Requires:       python3-xxhash
Requires:       python3-filelock
Requires:       python3-click
Requires:       python3-click-help-colors
Requires:       python3-rich
Requires:       python3-tqdm
Requires:       python3-more-itertools
Requires:       python3-sqlitedict
Requires:       python3-glob2
Requires:       python3-petname
Requires:       python3-pytz
Requires:       python3-torch
Requires:       python3-numpy
Requires:       python3-datasets
Requires:       python3-wandb
Requires:       python3-retry
Requires:       python3-transformers
Requires:       python3-sentencepiece
Requires:       python3-fairscale
Requires:       python3-beaker-py
Requires:       python3-google-cloud-storage
Requires:       python3-google-cloud-datastore
Requires:       python3-sacremoses
Requires:       python3-jax
Requires:       python3-flax
Requires:       python3-optax
Requires:       python3-tensorflow-cpu
Requires:       python3-flake8
Requires:       python3-mypy
Requires:       python3-types-PyYAML
Requires:       python3-types-setuptools
Requires:       python3-types-pytz
Requires:       python3-types-retry
Requires:       python3-black
Requires:       python3-isort
Requires:       python3-pytest
Requires:       python3-pytest-sphinx
Requires:       python3-flaky
Requires:       python3-pytest-cov
Requires:       python3-coverage
Requires:       python3-codecov
Requires:       python3-twine
Requires:       python3-setuptools
Requires:       python3-wheel
Requires:       python3-Sphinx
Requires:       python3-furo
Requires:       python3-myst-parser
Requires:       python3-sphinx-copybutton
Requires:       python3-sphinx-autobuild
Requires:       python3-sphinx-autodoc-typehints
Requires:       python3-packaging
Requires:       python3-torchmetrics

%description



AI2 Tango replaces messy directories and spreadsheets full of file versions by organizing experiments into discrete steps that can be cached and reused throughout the lifetime of a research project.


## Quick links

- [Documentation](https://ai2-tango.readthedocs.io/)
- [PyPI Package](https://pypi.org/project/ai2-tango/)
- [Contributing](https://github.com/allenai/tango/blob/main/CONTRIBUTING.md)
- [License](https://github.com/allenai/tango/blob/main/LICENSE)

## In this README

- [Quick start](#quick-start)
- [Installation](#installation)
  - [Installing with PIP](#installing-with-pip)
  - [Installing with Conda](#installing-with-conda)
  - [Installing from source](#installing-from-source)
  - [Checking your installation](#checking-your-installation)
  - [Docker image](#docker-image)
- [FAQ](#faq)
- [Team](#team)
- [License](#license)

## Quick start

Create a Tango step:

```python
# hello.py
from tango import step


@step()
def hello(name: str) -> str:
    message = f"Hello, {name}!"
    print(message)
    return message
```

And create a corresponding experiment configuration file:

```jsonnet
// hello.jsonnet
{
  steps: {
    hello: {
      type: "hello",
      name: "World",
    }
  }
}
```

Then run the experiment using a local workspace to cache the result:

```bash
tango run hello.jsonnet -w /tmp/workspace
```

You'll see something like this in the output:

```
Starting new run expert-llama
● Starting step "hello"...
Hello, World!
✓ Finished step "hello"
✓ Finished run expert-llama
```

If you run this a second time the output will now look like this:

```
Starting new run open-crab
✓ Found output for step "hello" in cache...
✓ Finished run open-crab
```

You won't see "Hello, World!" this time because the result of the step was found in the cache, so it wasn't run again.

For a more detailed introduction check out the [First Steps](https://ai2-tango.readthedocs.io/en/latest/first_steps.html) walk-through.

## Installation

**ai2-tango** requires Python 3.8 or later.

### Installing with `pip`

**ai2-tango** is available [on PyPI](https://pypi.org/project/ai2-tango/). Just run

```bash
pip install ai2-tango
```

To install with a specific integration, such as `torch` for example, run

```bash
pip install 'ai2-tango[torch]'
```

To install with all integrations, run

```bash
pip install 'ai2-tango[all]'
```

### Installing with `conda`

**ai2-tango** is available on conda-forge. You can install just the base package with

```bash
conda install tango -c conda-forge
```

You can pick and choose from the integrations with one of these:

```bash
conda install tango-datasets -c conda-forge
conda install tango-torch -c conda-forge
conda install tango-wandb -c conda-forge
```

You can also install everything:

```bash
conda install tango-all -c conda-forge
```

Even though **ai2-tango** itself is quite small, installing everything will pull in a lot of dependencies. Don't be surprised if this takes a while!

### Installing from source

To install **ai2-tango** from source, first clone [the repository](https://github.com/allenai/tango):

```bash
git clone https://github.com/allenai/tango.git
cd tango
```

Then run

```bash
pip install -e '.[all]'
```

To install with only a specific integration, such as `torch` for example, run

```bash
pip install -e '.[torch]'
```

Or to install just the base tango library, you can run

```bash
pip install -e .
```

### Checking your installation

Run

```bash
tango info
```

to check your installation.
### Docker image

You can build a Docker image suitable for tango projects by using [the official Dockerfile](https://github.com/allenai/tango/blob/main/Dockerfile) as a starting point for your own Dockerfile, or you can simply use one of our [prebuilt images](https://github.com/allenai/tango/pkgs/container/tango) as a base image in your Dockerfile. For example:

```Dockerfile
# Start from a prebuilt tango base image.
# You can choose the right tag from the available options here:
# https://github.com/allenai/tango/pkgs/container/tango/versions
FROM ghcr.io/allenai/tango:cuda11.3

# Install your project's additional requirements.
COPY requirements.txt .
RUN /opt/conda/bin/pip install --no-cache-dir -r requirements.txt

# Install source code.
# This instruction copies EVERYTHING in the current directory (build context),
# which may not be what you want. Consider using a ".dockerignore" file to
# exclude files and directories that you don't want on the image.
COPY . .
```

Make sure to choose the right base image for your use case depending on the version of tango you're using and the CUDA version that your host machine supports. You can see a list of all available image tags [on GitHub](https://github.com/allenai/tango/pkgs/container/tango/versions).

## FAQ

### Why is the library named Tango?

The motivation behind this library is that we can make research easier by composing it into well-defined steps. What happens when you choreograph a number of steps together? Well, you get a dance. And since our [team's leader](https://nasmith.github.io/) is part of a tango band, "AI2 Tango" was an obvious choice!

### How can I debug my steps through the Tango CLI?

You can run the `tango` command through [pdb](https://docs.python.org/3/library/pdb.html). For example:

```bash
python -m pdb -m tango run config.jsonnet
```

### How is Tango different from [Metaflow](https://metaflow.org), [Airflow](https://airflow.apache.org), or [redun](https://github.com/insitro/redun)?

We've found that existing DAG execution engines like these tools are great for production workflows but not as well suited for messy, collaborative research projects where code is changing constantly. AI2 Tango was built *specifically* for these kinds of research projects.

### How does Tango's caching mechanism work?

AI2 Tango caches the results of steps based on the `unique_id` of the step. The `unique_id` is essentially a hash of all of the inputs to the step along with:

1. the step class's fully qualified name, and
2. the step class's `VERSION` class variable (an arbitrary string).

Unlike other workflow engines like [redun](https://github.com/insitro/redun), Tango does *not* take into account the source code of the class itself (other than its fully qualified name) because we've found that using a hash of the source code bytes is way too sensitive and less transparent for users. When you change the source code of your step in a meaningful way you can just manually change the `VERSION` class variable to indicate to Tango that the step has been updated.
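For example, here is a minimal sketch of what a `VERSION` bump looks like on a class-based step. The `add_numbers` step name and `AddNumbers` class are made up for illustration, and the `@Step.register`, `DETERMINISTIC`, and `CACHEABLE` pieces are assumed from the library's class-based `Step` API rather than taken from this README:

```python
# my_steps.py -- illustrative sketch only; the step name and class are hypothetical.
from tango import Step


@Step.register("add_numbers")
class AddNumbers(Step):
    DETERMINISTIC = True  # same inputs always produce the same output
    CACHEABLE = True      # results may be stored in the workspace cache
    VERSION = "002"       # bumped from "001" after a meaningful change to run()

    def run(self, a: int, b: int) -> int:
        return a + b
```

Because `VERSION` feeds into the step's `unique_id`, bumping it changes the cache key, so the next run executes the step again instead of reusing the stale cached result.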
## Team

**ai2-tango** is developed and maintained by the AllenNLP team, backed by [the Allen Institute for Artificial Intelligence (AI2)](https://allenai.org/). AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering. To learn more about who specifically contributed to this codebase, see [our contributors](https://github.com/allenai/tango/graphs/contributors) page.

## License

**ai2-tango** is licensed under [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0). A full copy of the license can be found [on GitHub](https://github.com/allenai/tango/blob/main/LICENSE).

%package -n python3-ai2-tango
Summary:        A library for choreographing your machine learning research.
Provides:       python-ai2-tango
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip
%description -n python3-ai2-tango



AI2 Tango replaces messy directories and spreadsheets full of file versions by organizing experiments into discrete steps that can be cached and reused throughout the lifetime of a research project.


## Quick links

- [Documentation](https://ai2-tango.readthedocs.io/)
- [PyPI Package](https://pypi.org/project/ai2-tango/)
- [Contributing](https://github.com/allenai/tango/blob/main/CONTRIBUTING.md)
- [License](https://github.com/allenai/tango/blob/main/LICENSE)

## In this README

- [Quick start](#quick-start)
- [Installation](#installation)
  - [Installing with PIP](#installing-with-pip)
  - [Installing with Conda](#installing-with-conda)
  - [Installing from source](#installing-from-source)
  - [Checking your installation](#checking-your-installation)
  - [Docker image](#docker-image)
- [FAQ](#faq)
- [Team](#team)
- [License](#license)

## Quick start

Create a Tango step:

```python
# hello.py
from tango import step


@step()
def hello(name: str) -> str:
    message = f"Hello, {name}!"
    print(message)
    return message
```

And create a corresponding experiment configuration file:

```jsonnet
// hello.jsonnet
{
  steps: {
    hello: {
      type: "hello",
      name: "World",
    }
  }
}
```

Then run the experiment using a local workspace to cache the result:

```bash
tango run hello.jsonnet -w /tmp/workspace
```

You'll see something like this in the output:

```
Starting new run expert-llama
● Starting step "hello"...
Hello, World!
✓ Finished step "hello"
✓ Finished run expert-llama
```

If you run this a second time the output will now look like this:

```
Starting new run open-crab
✓ Found output for step "hello" in cache...
✓ Finished run open-crab
```

You won't see "Hello, World!" this time because the result of the step was found in the cache, so it wasn't run again.

For a more detailed introduction check out the [First Steps](https://ai2-tango.readthedocs.io/en/latest/first_steps.html) walk-through.

## Installation

**ai2-tango** requires Python 3.8 or later.

### Installing with `pip`

**ai2-tango** is available [on PyPI](https://pypi.org/project/ai2-tango/). Just run

```bash
pip install ai2-tango
```

To install with a specific integration, such as `torch` for example, run

```bash
pip install 'ai2-tango[torch]'
```

To install with all integrations, run

```bash
pip install 'ai2-tango[all]'
```

### Installing with `conda`

**ai2-tango** is available on conda-forge. You can install just the base package with

```bash
conda install tango -c conda-forge
```

You can pick and choose from the integrations with one of these:

```bash
conda install tango-datasets -c conda-forge
conda install tango-torch -c conda-forge
conda install tango-wandb -c conda-forge
```

You can also install everything:

```bash
conda install tango-all -c conda-forge
```

Even though **ai2-tango** itself is quite small, installing everything will pull in a lot of dependencies. Don't be surprised if this takes a while!

### Installing from source

To install **ai2-tango** from source, first clone [the repository](https://github.com/allenai/tango):

```bash
git clone https://github.com/allenai/tango.git
cd tango
```

Then run

```bash
pip install -e '.[all]'
```

To install with only a specific integration, such as `torch` for example, run

```bash
pip install -e '.[torch]'
```

Or to install just the base tango library, you can run

```bash
pip install -e .
```

### Checking your installation

Run

```bash
tango info
```

to check your installation.
### Docker image

You can build a Docker image suitable for tango projects by using [the official Dockerfile](https://github.com/allenai/tango/blob/main/Dockerfile) as a starting point for your own Dockerfile, or you can simply use one of our [prebuilt images](https://github.com/allenai/tango/pkgs/container/tango) as a base image in your Dockerfile. For example:

```Dockerfile
# Start from a prebuilt tango base image.
# You can choose the right tag from the available options here:
# https://github.com/allenai/tango/pkgs/container/tango/versions
FROM ghcr.io/allenai/tango:cuda11.3

# Install your project's additional requirements.
COPY requirements.txt .
RUN /opt/conda/bin/pip install --no-cache-dir -r requirements.txt

# Install source code.
# This instruction copies EVERYTHING in the current directory (build context),
# which may not be what you want. Consider using a ".dockerignore" file to
# exclude files and directories that you don't want on the image.
COPY . .
```

Make sure to choose the right base image for your use case depending on the version of tango you're using and the CUDA version that your host machine supports. You can see a list of all available image tags [on GitHub](https://github.com/allenai/tango/pkgs/container/tango/versions).

## FAQ

### Why is the library named Tango?

The motivation behind this library is that we can make research easier by composing it into well-defined steps. What happens when you choreograph a number of steps together? Well, you get a dance. And since our [team's leader](https://nasmith.github.io/) is part of a tango band, "AI2 Tango" was an obvious choice!

### How can I debug my steps through the Tango CLI?

You can run the `tango` command through [pdb](https://docs.python.org/3/library/pdb.html). For example:

```bash
python -m pdb -m tango run config.jsonnet
```

### How is Tango different from [Metaflow](https://metaflow.org), [Airflow](https://airflow.apache.org), or [redun](https://github.com/insitro/redun)?

We've found that existing DAG execution engines like these tools are great for production workflows but not as well suited for messy, collaborative research projects where code is changing constantly. AI2 Tango was built *specifically* for these kinds of research projects.

### How does Tango's caching mechanism work?

AI2 Tango caches the results of steps based on the `unique_id` of the step. The `unique_id` is essentially a hash of all of the inputs to the step along with:

1. the step class's fully qualified name, and
2. the step class's `VERSION` class variable (an arbitrary string).

Unlike other workflow engines like [redun](https://github.com/insitro/redun), Tango does *not* take into account the source code of the class itself (other than its fully qualified name) because we've found that using a hash of the source code bytes is way too sensitive and less transparent for users. When you change the source code of your step in a meaningful way you can just manually change the `VERSION` class variable to indicate to Tango that the step has been updated.

## Team

**ai2-tango** is developed and maintained by the AllenNLP team, backed by [the Allen Institute for Artificial Intelligence (AI2)](https://allenai.org/). AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering. To learn more about who specifically contributed to this codebase, see [our contributors](https://github.com/allenai/tango/graphs/contributors) page.
## License

**ai2-tango** is licensed under [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0). A full copy of the license can be found [on GitHub](https://github.com/allenai/tango/blob/main/LICENSE).

%package help
Summary:        Development documents and examples for ai2-tango
Provides:       python3-ai2-tango-doc
%description help



AI2 Tango replaces messy directories and spreadsheets full of file versions by organizing experiments into discrete steps that can be cached and reused throughout the lifetime of a research project.


## Quick links

- [Documentation](https://ai2-tango.readthedocs.io/)
- [PyPI Package](https://pypi.org/project/ai2-tango/)
- [Contributing](https://github.com/allenai/tango/blob/main/CONTRIBUTING.md)
- [License](https://github.com/allenai/tango/blob/main/LICENSE)

## In this README

- [Quick start](#quick-start)
- [Installation](#installation)
  - [Installing with PIP](#installing-with-pip)
  - [Installing with Conda](#installing-with-conda)
  - [Installing from source](#installing-from-source)
  - [Checking your installation](#checking-your-installation)
  - [Docker image](#docker-image)
- [FAQ](#faq)
- [Team](#team)
- [License](#license)

## Quick start

Create a Tango step:

```python
# hello.py
from tango import step


@step()
def hello(name: str) -> str:
    message = f"Hello, {name}!"
    print(message)
    return message
```

And create a corresponding experiment configuration file:

```jsonnet
// hello.jsonnet
{
  steps: {
    hello: {
      type: "hello",
      name: "World",
    }
  }
}
```

Then run the experiment using a local workspace to cache the result:

```bash
tango run hello.jsonnet -w /tmp/workspace
```

You'll see something like this in the output:

```
Starting new run expert-llama
● Starting step "hello"...
Hello, World!
✓ Finished step "hello"
✓ Finished run expert-llama
```

If you run this a second time the output will now look like this:

```
Starting new run open-crab
✓ Found output for step "hello" in cache...
✓ Finished run open-crab
```

You won't see "Hello, World!" this time because the result of the step was found in the cache, so it wasn't run again.

For a more detailed introduction check out the [First Steps](https://ai2-tango.readthedocs.io/en/latest/first_steps.html) walk-through.

## Installation

**ai2-tango** requires Python 3.8 or later.

### Installing with `pip`

**ai2-tango** is available [on PyPI](https://pypi.org/project/ai2-tango/). Just run

```bash
pip install ai2-tango
```

To install with a specific integration, such as `torch` for example, run

```bash
pip install 'ai2-tango[torch]'
```

To install with all integrations, run

```bash
pip install 'ai2-tango[all]'
```

### Installing with `conda`

**ai2-tango** is available on conda-forge. You can install just the base package with

```bash
conda install tango -c conda-forge
```

You can pick and choose from the integrations with one of these:

```bash
conda install tango-datasets -c conda-forge
conda install tango-torch -c conda-forge
conda install tango-wandb -c conda-forge
```

You can also install everything:

```bash
conda install tango-all -c conda-forge
```

Even though **ai2-tango** itself is quite small, installing everything will pull in a lot of dependencies. Don't be surprised if this takes a while!

### Installing from source

To install **ai2-tango** from source, first clone [the repository](https://github.com/allenai/tango):

```bash
git clone https://github.com/allenai/tango.git
cd tango
```

Then run

```bash
pip install -e '.[all]'
```

To install with only a specific integration, such as `torch` for example, run

```bash
pip install -e '.[torch]'
```

Or to install just the base tango library, you can run

```bash
pip install -e .
```

### Checking your installation

Run

```bash
tango info
```

to check your installation.
### Docker image

You can build a Docker image suitable for tango projects by using [the official Dockerfile](https://github.com/allenai/tango/blob/main/Dockerfile) as a starting point for your own Dockerfile, or you can simply use one of our [prebuilt images](https://github.com/allenai/tango/pkgs/container/tango) as a base image in your Dockerfile. For example:

```Dockerfile
# Start from a prebuilt tango base image.
# You can choose the right tag from the available options here:
# https://github.com/allenai/tango/pkgs/container/tango/versions
FROM ghcr.io/allenai/tango:cuda11.3

# Install your project's additional requirements.
COPY requirements.txt .
RUN /opt/conda/bin/pip install --no-cache-dir -r requirements.txt

# Install source code.
# This instruction copies EVERYTHING in the current directory (build context),
# which may not be what you want. Consider using a ".dockerignore" file to
# exclude files and directories that you don't want on the image.
COPY . .
```

Make sure to choose the right base image for your use case depending on the version of tango you're using and the CUDA version that your host machine supports. You can see a list of all available image tags [on GitHub](https://github.com/allenai/tango/pkgs/container/tango/versions).

## FAQ

### Why is the library named Tango?

The motivation behind this library is that we can make research easier by composing it into well-defined steps. What happens when you choreograph a number of steps together? Well, you get a dance. And since our [team's leader](https://nasmith.github.io/) is part of a tango band, "AI2 Tango" was an obvious choice!

### How can I debug my steps through the Tango CLI?

You can run the `tango` command through [pdb](https://docs.python.org/3/library/pdb.html). For example:

```bash
python -m pdb -m tango run config.jsonnet
```

### How is Tango different from [Metaflow](https://metaflow.org), [Airflow](https://airflow.apache.org), or [redun](https://github.com/insitro/redun)?

We've found that existing DAG execution engines like these tools are great for production workflows but not as well suited for messy, collaborative research projects where code is changing constantly. AI2 Tango was built *specifically* for these kinds of research projects.

### How does Tango's caching mechanism work?

AI2 Tango caches the results of steps based on the `unique_id` of the step. The `unique_id` is essentially a hash of all of the inputs to the step along with:

1. the step class's fully qualified name, and
2. the step class's `VERSION` class variable (an arbitrary string).

Unlike other workflow engines like [redun](https://github.com/insitro/redun), Tango does *not* take into account the source code of the class itself (other than its fully qualified name) because we've found that using a hash of the source code bytes is way too sensitive and less transparent for users. When you change the source code of your step in a meaningful way you can just manually change the `VERSION` class variable to indicate to Tango that the step has been updated.

## Team

**ai2-tango** is developed and maintained by the AllenNLP team, backed by [the Allen Institute for Artificial Intelligence (AI2)](https://allenai.org/). AI2 is a non-profit institute with the mission to contribute to humanity through high-impact AI research and engineering. To learn more about who specifically contributed to this codebase, see [our contributors](https://github.com/allenai/tango/graphs/contributors) page.
## License

**ai2-tango** is licensed under [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0). A full copy of the license can be found [on GitHub](https://github.com/allenai/tango/blob/main/LICENSE).

%prep
%autosetup -n ai2-tango-1.2.1

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-ai2-tango -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Fri May 05 2023 Python_Bot - 1.2.1-1
- Package Spec generated