author     CoprDistGit <infra@openeuler.org>        2023-04-21 10:54:47 +0000
committer  CoprDistGit <infra@openeuler.org>        2023-04-21 10:54:47 +0000
commit     31cc52f868648868402249a21245ded3b60082e5 (patch)
tree       955e6a200a5293a8f20c98392565e75236b33066
parent     26ae3c8c6b30038607143eb08976a78577bf12d5 (diff)
automatic import of python-flax
openeuler20.03
-rw-r--r--  .gitignore        |  1
-rw-r--r--  python-flax.spec  | 24
-rw-r--r--  sources           |  2

3 files changed, 14 insertions(+), 13 deletions(-)
@@ -1 +1,2 @@
 /flax-0.6.8.tar.gz
+/flax-0.6.9.tar.gz
diff --git a/python-flax.spec b/python-flax.spec
index 840f3fc..6c83aac 100644
--- a/python-flax.spec
+++ b/python-flax.spec
@@ -1,18 +1,18 @@
 %global _empty_manifest_terminate_build 0
 Name: python-flax
-Version: 0.6.8
+Version: 0.6.9
 Release: 1
 Summary: Flax: A neural network library for JAX designed for flexibility
 License: Apache Software License
-URL: https://github.com/google/flax
-Source0: https://mirrors.nju.edu.cn/pypi/web/packages/dc/94/efee7afbcfdec16910f3b6bcc76ed5ed850c44e1b69630e2620a4faaf6c9/flax-0.6.8.tar.gz
+URL: https://pypi.org/project/flax/
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/f1/30/60fc59a1db0001d66f55b4f071b227377d0fc5f6a23a1c591580124af1c5/flax-0.6.9.tar.gz
 BuildArch: noarch
 Requires: python3-numpy
 Requires: python3-jax
 Requires: python3-msgpack
 Requires: python3-optax
-Requires: python3-orbax
+Requires: python3-orbax-checkpoint
 Requires: python3-tensorstore
 Requires: python3-rich
 Requires: python3-typing-extensions
@@ -225,7 +225,7 @@ decoded = model.apply(variables, encoded, method=model.decode)
 
 In-detail examples to train and evaluate a variety of Flax models for
 Natural Language Processing, Computer Vision, and Speech Recognition are
-actively maintained in the [🤗 Transformers repository](https://github.com/huggingface/transformers/tree/master/examples/flax).
+actively maintained in the [🤗 Transformers repository](https://github.com/huggingface/transformers/tree/main/examples/flax).
 
 As of October 2021, the [19 most-used Transformer architectures](https://huggingface.co/transformers/#supported-frameworks) are supported in Flax and over
 5000 pretrained checkpoints in Flax have been uploaded to the [🤗 Hub](https://huggingface.co/models?library=jax&sort=downloads).
@@ -239,7 +239,7 @@ To cite this repository:
   author = {Jonathan Heek and Anselm Levskaya and Avital Oliver and Marvin Ritter and Bertrand Rondepierre and Andreas Steiner and Marc van {Z}ee},
   title = {{F}lax: A neural network library and ecosystem for {JAX}},
   url = {http://github.com/google/flax},
-  version = {0.6.8},
+  version = {0.6.9},
   year = {2023},
 }
 ```
@@ -444,7 +444,7 @@ decoded = model.apply(variables, encoded, method=model.decode)
 
 In-detail examples to train and evaluate a variety of Flax models for
 Natural Language Processing, Computer Vision, and Speech Recognition are
-actively maintained in the [🤗 Transformers repository](https://github.com/huggingface/transformers/tree/master/examples/flax).
+actively maintained in the [🤗 Transformers repository](https://github.com/huggingface/transformers/tree/main/examples/flax).
 
 As of October 2021, the [19 most-used Transformer architectures](https://huggingface.co/transformers/#supported-frameworks) are supported in Flax and over
 5000 pretrained checkpoints in Flax have been uploaded to the [🤗 Hub](https://huggingface.co/models?library=jax&sort=downloads).
@@ -458,7 +458,7 @@ To cite this repository:
   author = {Jonathan Heek and Anselm Levskaya and Avital Oliver and Marvin Ritter and Bertrand Rondepierre and Andreas Steiner and Marc van {Z}ee},
   title = {{F}lax: A neural network library and ecosystem for {JAX}},
   url = {http://github.com/google/flax},
-  version = {0.6.8},
+  version = {0.6.9},
   year = {2023},
 }
 ```
@@ -660,7 +660,7 @@ decoded = model.apply(variables, encoded, method=model.decode)
 
 In-detail examples to train and evaluate a variety of Flax models for
 Natural Language Processing, Computer Vision, and Speech Recognition are
-actively maintained in the [🤗 Transformers repository](https://github.com/huggingface/transformers/tree/master/examples/flax).
+actively maintained in the [🤗 Transformers repository](https://github.com/huggingface/transformers/tree/main/examples/flax).
 
 As of October 2021, the [19 most-used Transformer architectures](https://huggingface.co/transformers/#supported-frameworks) are supported in Flax and over
 5000 pretrained checkpoints in Flax have been uploaded to the [🤗 Hub](https://huggingface.co/models?library=jax&sort=downloads).
@@ -674,7 +674,7 @@ To cite this repository:
   author = {Jonathan Heek and Anselm Levskaya and Avital Oliver and Marvin Ritter and Bertrand Rondepierre and Andreas Steiner and Marc van {Z}ee},
   title = {{F}lax: A neural network library and ecosystem for {JAX}},
   url = {http://github.com/google/flax},
-  version = {0.6.8},
+  version = {0.6.9},
   year = {2023},
 }
 ```
@@ -688,7 +688,7 @@ Flax is an open source project maintained by a dedicated team in Google Research
 
 %prep
-%autosetup -n flax-0.6.8
+%autosetup -n flax-0.6.9
 
 %build
 %py3_build
@@ -728,5 +728,5 @@ mv %{buildroot}/doclist.lst .
 %{_docdir}/*
 
 %changelog
-* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 0.6.8-1
+* Fri Apr 21 2023 Python_Bot <Python_Bot@openeuler.org> - 0.6.9-1
 - Package Spec generated
@@ -1 +1 @@
-b0f0264323f2e7006c1457df90c66833 flax-0.6.8.tar.gz
+bd74ddbdd6529de78c22a970beefdc40 flax-0.6.9.tar.gz
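The final hunk updates the `sources` file, which pairs the new tarball name with its MD5 checksum. As a minimal sketch of how such an entry is produced (the filename is only a stand-in; this is not the Copr import tooling itself, just Python's standard `hashlib`), the digest can be computed in chunks so a large tarball never has to fit in memory:

```python
import hashlib
import os
import tempfile

def md5_of_file(path, chunk_size=8192):
    """Compute the MD5 hex digest of a file, reading it in chunks
    so a large tarball does not have to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demonstrate on a small temporary file (a stand-in for the real tarball).
with tempfile.NamedTemporaryFile(delete=False, suffix=".tar.gz") as tmp:
    tmp.write(b"example payload")
    sample = tmp.name

# Print the same "<md5>  <filename>" shape the sources file uses.
print(md5_of_file(sample), os.path.basename(sample))
os.remove(sample)
```

Running the same function over the real `flax-0.6.9.tar.gz` downloaded from the `Source0` URL should reproduce the `bd74ddbd…` digest recorded above; a mismatch would indicate a corrupted or substituted source archive.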