| | | |
|---|---|---|
| author | CoprDistGit <infra@openeuler.org> | 2023-05-31 07:23:08 +0000 |
| committer | CoprDistGit <infra@openeuler.org> | 2023-05-31 07:23:08 +0000 |
| commit | 918892ac79b61060bc1356ee12803749a88a6f86 (patch) | |
| tree | 0b882a34eb8404a78b259cfd47f5997227b3d682 /python-minerl.spec | |
| parent | 4488009471bded1df8bef906efc6e661f54a700a (diff) | |
automatic import of python-minerl
Diffstat (limited to 'python-minerl.spec')
-rw-r--r-- | python-minerl.spec | 351 |
1 files changed, 351 insertions, 0 deletions
diff --git a/python-minerl.spec b/python-minerl.spec
new file mode 100644
index 0000000..85989eb
--- /dev/null
+++ b/python-minerl.spec
@@ -0,0 +1,351 @@
+%global _empty_manifest_terminate_build 0
+Name: python-minerl
+Version: 0.4.4
+Release: 1
+Summary: MineRL environment and data loader for reinforcement learning from human demonstration in Minecraft
+License: MIT
+URL: http://github.com/minerllabs/minerl
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/08/c1/651340d34dc6a165821c8aba61903db7c5ac874827efb185e78e35086835/minerl-0.4.4.tar.gz
+BuildArch: noarch
+
+%description
+# The [MineRL](http://minerl.io) Python Package
+
+[Documentation Status](https://minerl.readthedocs.io/en/latest/?badge=latest)
+[Build Status](https://buildkite.com/openai-mono/minerl-public-dev)
+[Downloads](https://pepy.tech/project/minerl)
+[PyPI version](https://badge.fury.io/py/minerl)
+[Open issues](https://github.com/minerllabs/minerl/issues)
+[Open bugs](https://github.com/minerllabs/minerl/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+label%3Abug)
+[Discord](https://discord.gg/BT9uegr)
+
+A Python package providing easy-to-use Gym environments and a simple data API for the MineRL-v0 dataset.
+
+**To [get started, please read the docs here](http://minerl.io/docs/)!**
+
+## Installation
+
+With JDK 8 installed, run:
+```
+pip3 install --upgrade minerl
+```
+
+## Basic Usage
+
+Running an environment:
+```python
+import gym
+import minerl
+
+env = gym.make('MineRLNavigateDense-v0')
+
+obs = env.reset()
+
+done = False
+while not done:
+    action = env.action_space.sample()
+
+    # One can also take a no-op action with
+    # action = env.action_space.noop()
+
+    obs, reward, done, info = env.step(action)
+```
+
+Sampling the dataset:
+
+```python
+import minerl
+
+# YOU ONLY NEED TO DO THIS ONCE!
+minerl.data.download('/your/local/path')
+
+data = minerl.data.make(
+    'MineRLObtainDiamond-v0',
+    data_dir='/your/local/path')
+
+# Iterate through a single epoch, gathering sequences of at most 32 steps
+for current_state, action, reward, next_state, done \
+        in data.batch_iter(num_epochs=1, seq_len=32):
+
+    # Print the POV at the first step of the sequence
+    print(current_state['pov'][0])
+
+    # Print the final reward of the sequence!
+    print(reward[-1])
+
+    # Check whether the final (next_state) is terminal.
+    print(done[-1])
+
+    # ... do something with the data.
+    print("At the end of trajectories the length "
+          "can be < max_sequence_len", len(reward))
+```
+
+Visualizing the dataset:
+
+```bash
+# Make sure your MINERL_DATA_ROOT is set!
+export MINERL_DATA_ROOT='/your/local/path'
+
+# Visualize a random trajectory of MineRLObtainDiamondDense-v0
+python3 -m minerl.viewer MineRLObtainDiamondDense-v0
+```
+
+## MineRL Competition
+If you're here for the MineRL competition, please check [the main competition website](https://www.aicrowd.com/challenges/neurips-2021-minerl-competition).
+
+%package -n python3-minerl
+Summary: MineRL environment and data loader for reinforcement learning from human demonstration in Minecraft
+Provides: python-minerl
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-minerl
+# The [MineRL](http://minerl.io) Python Package
+
+[Documentation Status](https://minerl.readthedocs.io/en/latest/?badge=latest)
+[Build Status](https://buildkite.com/openai-mono/minerl-public-dev)
+[Downloads](https://pepy.tech/project/minerl)
+[PyPI version](https://badge.fury.io/py/minerl)
+[Open issues](https://github.com/minerllabs/minerl/issues)
+[Open bugs](https://github.com/minerllabs/minerl/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+label%3Abug)
+[Discord](https://discord.gg/BT9uegr)
+
+A Python package providing easy-to-use Gym environments and a simple data API for the MineRL-v0 dataset.
+
+**To [get started, please read the docs here](http://minerl.io/docs/)!**
+
+## Installation
+
+With JDK 8 installed, run:
+```
+pip3 install --upgrade minerl
+```
+
+## Basic Usage
+
+Running an environment:
+```python
+import gym
+import minerl
+
+env = gym.make('MineRLNavigateDense-v0')
+
+obs = env.reset()
+
+done = False
+while not done:
+    action = env.action_space.sample()
+
+    # One can also take a no-op action with
+    # action = env.action_space.noop()
+
+    obs, reward, done, info = env.step(action)
+```
+
+Sampling the dataset:
+
+```python
+import minerl
+
+# YOU ONLY NEED TO DO THIS ONCE!
+minerl.data.download('/your/local/path')
+
+data = minerl.data.make(
+    'MineRLObtainDiamond-v0',
+    data_dir='/your/local/path')
+
+# Iterate through a single epoch, gathering sequences of at most 32 steps
+for current_state, action, reward, next_state, done \
+        in data.batch_iter(num_epochs=1, seq_len=32):
+
+    # Print the POV at the first step of the sequence
+    print(current_state['pov'][0])
+
+    # Print the final reward of the sequence!
+    print(reward[-1])
+
+    # Check whether the final (next_state) is terminal.
+    print(done[-1])
+
+    # ... do something with the data.
+    print("At the end of trajectories the length "
+          "can be < max_sequence_len", len(reward))
+```
+
+Visualizing the dataset:
+
+```bash
+# Make sure your MINERL_DATA_ROOT is set!
+export MINERL_DATA_ROOT='/your/local/path'
+
+# Visualize a random trajectory of MineRLObtainDiamondDense-v0
+python3 -m minerl.viewer MineRLObtainDiamondDense-v0
+```
+
+## MineRL Competition
+If you're here for the MineRL competition, please check [the main competition website](https://www.aicrowd.com/challenges/neurips-2021-minerl-competition).
+
+%package help
+Summary: Development documents and examples for minerl
+Provides: python3-minerl-doc
+%description help
+# The [MineRL](http://minerl.io) Python Package
+
+[Documentation Status](https://minerl.readthedocs.io/en/latest/?badge=latest)
+[Build Status](https://buildkite.com/openai-mono/minerl-public-dev)
+[Downloads](https://pepy.tech/project/minerl)
+[PyPI version](https://badge.fury.io/py/minerl)
+[Open issues](https://github.com/minerllabs/minerl/issues)
+[Open bugs](https://github.com/minerllabs/minerl/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+label%3Abug)
+[Discord](https://discord.gg/BT9uegr)
+
+A Python package providing easy-to-use Gym environments and a simple data API for the MineRL-v0 dataset.
+
+**To [get started, please read the docs here](http://minerl.io/docs/)!**
+
+## Installation
+
+With JDK 8 installed, run:
+```
+pip3 install --upgrade minerl
+```
+
+## Basic Usage
+
+Running an environment:
+```python
+import gym
+import minerl
+
+env = gym.make('MineRLNavigateDense-v0')
+
+obs = env.reset()
+
+done = False
+while not done:
+    action = env.action_space.sample()
+
+    # One can also take a no-op action with
+    # action = env.action_space.noop()
+
+    obs, reward, done, info = env.step(action)
+```
+
+Sampling the dataset:
+
+```python
+import minerl
+
+# YOU ONLY NEED TO DO THIS ONCE!
+minerl.data.download('/your/local/path')
+
+data = minerl.data.make(
+    'MineRLObtainDiamond-v0',
+    data_dir='/your/local/path')
+
+# Iterate through a single epoch, gathering sequences of at most 32 steps
+for current_state, action, reward, next_state, done \
+        in data.batch_iter(num_epochs=1, seq_len=32):
+
+    # Print the POV at the first step of the sequence
+    print(current_state['pov'][0])
+
+    # Print the final reward of the sequence!
+    print(reward[-1])
+
+    # Check whether the final (next_state) is terminal.
+    print(done[-1])
+
+    # ... do something with the data.
+    print("At the end of trajectories the length "
+          "can be < max_sequence_len", len(reward))
+```
+
+Visualizing the dataset:
+
+```bash
+# Make sure your MINERL_DATA_ROOT is set!
+export MINERL_DATA_ROOT='/your/local/path'
+
+# Visualize a random trajectory of MineRLObtainDiamondDense-v0
+python3 -m minerl.viewer MineRLObtainDiamondDense-v0
+```
+
+## MineRL Competition
+If you're here for the MineRL competition, please check [the main competition website](https://www.aicrowd.com/challenges/neurips-2021-minerl-competition).
+
+%prep
+%autosetup -n minerl-0.4.4
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-minerl -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Wed May 31 2023 Python_Bot <Python_Bot@openeuler.org> - 0.4.4-1
+- Package Spec generated
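The `%install` scriptlet above builds `filelist.lst` by running `find <dir> -type f -printf "/%h/%f\n"` from inside the build root for each of `usr/lib`, `usr/lib64`, `usr/bin`, and `usr/sbin`: every regular file is recorded as the absolute path it will have once installed. As a rough illustration (not part of the spec), the same logic can be sketched in Python; `collect_filelist` is a hypothetical helper name, and the demo build root below is fabricated for the example:

```python
import os
import tempfile

def collect_filelist(buildroot,
                     subdirs=("usr/lib", "usr/lib64", "usr/bin", "usr/sbin")):
    """Mimic `find <sub> -type f -printf "/%h/%f\n"` run from the build root:
    return one absolute path per regular file, skipping missing directories."""
    entries = []
    for sub in subdirs:
        top = os.path.join(buildroot, sub)
        if not os.path.isdir(top):
            continue  # the spec guards each find with `if [ -d ... ]`
        for dirpath, _dirnames, filenames in os.walk(top):
            rel = os.path.relpath(dirpath, buildroot)
            for name in sorted(filenames):
                # "/" + relative dir + filename reproduces find's "/%h/%f"
                entries.append("/" + os.path.join(rel, name))
    return entries

# Demo with a throwaway fake build root.
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "usr/bin"))
    os.makedirs(os.path.join(root, "usr/lib/python3/site-packages/minerl"))
    open(os.path.join(root, "usr/bin/minerl-viewer"), "w").close()
    open(os.path.join(root, "usr/lib/python3/site-packages/minerl/__init__.py"), "w").close()
    print("\n".join(collect_filelist(root)))
```

This is why `%files -n python3-minerl -f filelist.lst` needs no hand-maintained file list: the manifest is regenerated from whatever `%py3_install` actually laid down.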