author     CoprDistGit <infra@openeuler.org>  2023-04-23 06:53:22 +0000
committer  CoprDistGit <infra@openeuler.org>  2023-04-23 06:53:22 +0000
commit     d10f9037a58be4f71fdedb561c1315de68fd7cb9 (patch)
tree       a48497b9bebc8a30d86314d2febe5626bc2da361
parent     116e5a9467459154a405a3f5d0c53ed20fcc8f53 (diff)
automatic import of python-deepspeed (openeuler20.03)
-rw-r--r--   .gitignore              1
-rw-r--r--   python-deepspeed.spec  14
-rw-r--r--   sources                 2
3 files changed, 9 insertions, 8 deletions
diff --git a/.gitignore b/.gitignore
index 73e48d2..05d734e 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1 +1,2 @@
/deepspeed-0.8.3.tar.gz
+/deepspeed-0.9.1.tar.gz
diff --git a/python-deepspeed.spec b/python-deepspeed.spec
index 02b8d70..44391da 100644
--- a/python-deepspeed.spec
+++ b/python-deepspeed.spec
@@ -1,17 +1,17 @@
%global _empty_manifest_terminate_build 0
Name: python-deepspeed
-Version: 0.8.3
+Version: 0.9.1
Release: 1
Summary: DeepSpeed library
License: MIT
URL: http://deepspeed.ai
-Source0: https://mirrors.nju.edu.cn/pypi/web/packages/0f/c0/9b57e9ec56f6f405726a384b109f8da1267e41feea081850c2fce1735712/deepspeed-0.8.3.tar.gz
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/fc/4e/5eb2fd0f363b89e8f58673fb4a9bd04b8273a43e4e7aa3d397ed76b1ca63/deepspeed-0.9.1.tar.gz
BuildArch: noarch
%description
# Extreme Speed and Scale for DL Training and Inference
-[DeepSpeed](https://www.deepspeed.ai/) is an easy-to-use deep learning optimization software suite that enables unprecedented scale and speed for Deep Learning Training and Inference. With DeepSpeed you can:
+***[DeepSpeed](https://www.deepspeed.ai/) enables world's most powerful language models like [MT-530B](https://www.microsoft.com/en-us/research/blog/using-deepspeed-and-megatron-to-train-megatron-turing-nlg-530b-the-worlds-largest-and-most-powerful-generative-language-model/) and [BLOOM](https://huggingface.co/blog/bloom-megatron-deepspeed)***. It is an easy-to-use deep learning optimization software suite that powers unprecedented scale and speed for both training and inference. With DeepSpeed you can:
* Train/Inference dense or sparse models with billions or trillions of parameters
* Achieve excellent system throughput and efficiently scale to thousands of GPUs
* Train/Inference on resource constrained GPU systems
@@ -25,7 +25,7 @@ BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-deepspeed
# Extreme Speed and Scale for DL Training and Inference
-[DeepSpeed](https://www.deepspeed.ai/) is an easy-to-use deep learning optimization software suite that enables unprecedented scale and speed for Deep Learning Training and Inference. With DeepSpeed you can:
+***[DeepSpeed](https://www.deepspeed.ai/) enables world's most powerful language models like [MT-530B](https://www.microsoft.com/en-us/research/blog/using-deepspeed-and-megatron-to-train-megatron-turing-nlg-530b-the-worlds-largest-and-most-powerful-generative-language-model/) and [BLOOM](https://huggingface.co/blog/bloom-megatron-deepspeed)***. It is an easy-to-use deep learning optimization software suite that powers unprecedented scale and speed for both training and inference. With DeepSpeed you can:
* Train/Inference dense or sparse models with billions or trillions of parameters
* Achieve excellent system throughput and efficiently scale to thousands of GPUs
* Train/Inference on resource constrained GPU systems
@@ -36,14 +36,14 @@ Summary: Development documents and examples for deepspeed
Provides: python3-deepspeed-doc
%description help
# Extreme Speed and Scale for DL Training and Inference
-[DeepSpeed](https://www.deepspeed.ai/) is an easy-to-use deep learning optimization software suite that enables unprecedented scale and speed for Deep Learning Training and Inference. With DeepSpeed you can:
+***[DeepSpeed](https://www.deepspeed.ai/) enables world's most powerful language models like [MT-530B](https://www.microsoft.com/en-us/research/blog/using-deepspeed-and-megatron-to-train-megatron-turing-nlg-530b-the-worlds-largest-and-most-powerful-generative-language-model/) and [BLOOM](https://huggingface.co/blog/bloom-megatron-deepspeed)***. It is an easy-to-use deep learning optimization software suite that powers unprecedented scale and speed for both training and inference. With DeepSpeed you can:
* Train/Inference dense or sparse models with billions or trillions of parameters
* Achieve excellent system throughput and efficiently scale to thousands of GPUs
* Train/Inference on resource constrained GPU systems
* Achieve unprecedented low latency and high throughput for inference
%prep
-%autosetup -n deepspeed-0.8.3
+%autosetup -n deepspeed-0.9.1
%build
%py3_build
@@ -83,5 +83,5 @@ mv %{buildroot}/doclist.lst .
%{_docdir}/*
%changelog
-* Tue Apr 11 2023 Python_Bot <Python_Bot@openeuler.org> - 0.8.3-1
+* Sun Apr 23 2023 Python_Bot <Python_Bot@openeuler.org> - 0.9.1-1
- Package Spec generated
diff --git a/sources b/sources
index 2f4913a..678432d 100644
--- a/sources
+++ b/sources
@@ -1 +1 @@
-1bb22a6860bc57e66e77fc3d795b3169 deepspeed-0.8.3.tar.gz
+fce1079a15071c5cd03065834f86d4dc deepspeed-0.9.1.tar.gz
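
The updated `sources` line above pairs the 0.9.1 tarball name with its MD5 digest, which must match the file fetched from the Source0 mirror in the spec. Below is a minimal sanity-check sketch in Python, assuming the Source0 mirror URL is reachable; the helper name `md5_of` is illustrative and not part of any packaging tooling:

import hashlib
import urllib.request

# Mirror URL taken from the updated Source0 line in python-deepspeed.spec.
URL = ("https://mirrors.nju.edu.cn/pypi/web/packages/"
       "fc/4e/5eb2fd0f363b89e8f58673fb4a9bd04b8273a43e4e7aa3d397ed76b1ca63/"
       "deepspeed-0.9.1.tar.gz")
# Digest recorded in the updated `sources` file.
EXPECTED_MD5 = "fce1079a15071c5cd03065834f86d4dc"

def md5_of(data: bytes) -> str:
    # Hex MD5 digest of the given bytes, same format as the `sources` entry.
    return hashlib.md5(data).hexdigest()

tarball = urllib.request.urlopen(URL).read()
digest = md5_of(tarball)
print(digest, "matches sources" if digest == EXPECTED_MD5 else "MISMATCH")

If the digests disagree, the tarball should be re-fetched and the `sources` entry regenerated before rebuilding the spec.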