-rw-r--r--  .gitignore                     1
-rw-r--r--  python-rctorchprivate.spec   300
-rw-r--r--  sources                        1
3 files changed, 302 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..4e6d2d6 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/rctorchprivate-0.9819998.tar.gz
diff --git a/python-rctorchprivate.spec b/python-rctorchprivate.spec
new file mode 100644
index 0000000..b618ab7
--- /dev/null
+++ b/python-rctorchprivate.spec
@@ -0,0 +1,300 @@
+%global _empty_manifest_terminate_build 0
+Name: python-rctorchprivate
+Version: 0.9819998
+Release: 1
+Summary: A Python 3 toolset for creating and optimizing Echo State Networks
+License: Harvard
+URL: https://github.com/blindedjoy/RcTorch-private
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/bf/70/4da4e6d44e451dd964b55b1b4b5e7d2dab919609d4168920ab5dc4090970/rctorchprivate-0.9819998.tar.gz
+BuildArch: noarch
+
+
+%description
+A PyTorch toolset for creating and optimizing Echo State Networks.
+> License: MIT, 2020-2021
+> Authors: Hayden Joy, Marios Mattheakis
+Contains:
+- An ESN reservoir architecture class, `rc.py`
+- A Bayesian Optimization (BO) class, `rc_bayes.py`, with optimized routines for Echo State Networks through `BoTorch` (GPU optimized); it can train multiple RCs in parallel during BO
+  - includes an implementation of the TuRBO-1 algorithm as outlined in the TuRBO paper (code: https://github.com/uber-research/TuRBO)
+- Capable of solving differential equations (the population equation, the Bernoulli equation, a simple harmonic oscillator, and a nonlinear oscillator)
+Reference to the prior implementation:
+This library is an extension and expansion of a previous library written by Reinier Maat: https://github.com/1Reinier/Reservoir
+2018 International Joint Conference on Neural Networks (IJCNN), pp. 1-7. IEEE, 2018
+https://arxiv.org/abs/1903.05071
+For example usage, please see the notebooks folder.
+# Installation
+## Using pip
+Like most standard libraries, `rctorch` is hosted on [PyPI](https://pypi.org/project/RcTorch/). To install the latest stable release:
+```bash
+pip install -U rctorch # '-U' means update to latest version
+```
+## Example Usages
+### Imports
+```python
+import rctorch            # needed so that rctorch.data.load(...) below resolves
+from rctorch import *
+import torch
+```
+### Load data
+RcTorch has several built-in datasets. Among these is the forced pendulum dataset. Here we demonstrate how to load it:
+```python
+fp_data = rctorch.data.load("forced_pendulum", train_proportion = 0.2)
+force_train, force_test = fp_data["force"]
+target_train, target_test = fp_data["target"]
+#Alternatively you can use sklearn's train_test_split.
+```
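+As the last comment notes, the same split can also be produced with scikit-learn. The sketch below only illustrates that alternative and is not part of the RcTorch API; the bare `force` and `target` arrays and the 20/80 split are assumptions chosen to mirror `train_proportion = 0.2`.
+```python
+# illustrative sketch of the scikit-learn alternative mentioned above,
+# assuming `force` and `target` are the full-length, un-split arrays
+from sklearn.model_selection import train_test_split
+
+force_train, force_test, target_train, target_test = train_test_split(
+    force, target,
+    train_size=0.2,   # mirrors train_proportion = 0.2
+    shuffle=False)    # keep temporal order for time-series data
+```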
+### Hyper-parameters
+```python
+# declare the hyper-parameters
+hps = {'connectivity': 0.4,
+       'spectral_radius': 1.13,
+       'n_nodes': 202,
+       'regularization': 1.69,
+       'leaking_rate': 0.0098085,
+       'bias': 0.49}
+```
+### Setting up your very own EchoStateNetwork
+```python
+my_rc = RcNetwork(**hps, random_state = 210, feedback = True)
+# fit the model to the training data
+my_rc.fit(y = target_train)
+# make a prediction on the test set and score it
+score, prediction = my_rc.test(y = target_test)
+my_rc.combined_plot()
+```
+![](https://raw.githubusercontent.com/blindedjoy/RcTorch-private/blob/master/resources/pure_prediction1.jpg)
+Feedback allows the network to feed in the prediction at the previous timestep as an input. This helps the RC to make longer and more stable predictions in many situations.
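+To make the role of feedback concrete, below is a minimal, self-contained sketch of one leaky ESN update step with output feedback. It is written from the standard ESN equations rather than taken from RcTorch internals, so every name in it (`W_in`, `W_res`, `W_fb`, `W_out`, `esn_step`) is an illustrative assumption.
+```python
+import torch
+
+# Minimal sketch of a leaky ESN step with output feedback (standard ESN
+# equations, not RcTorch internals); all weight matrices are random stand-ins.
+n_nodes, n_inputs, n_outputs = 202, 1, 1
+leaking_rate = 0.0098085
+
+W_in  = torch.randn(n_nodes, n_inputs)    # input weights
+W_res = torch.randn(n_nodes, n_nodes)     # recurrent reservoir weights
+W_fb  = torch.randn(n_nodes, n_outputs)   # feedback weights
+W_out = torch.randn(n_outputs, n_nodes)   # readout (normally fit by ridge regression)
+
+def esn_step(u, state, y_prev):
+    """Advance the reservoir one timestep given input u and the previous prediction."""
+    pre = W_in @ u + W_res @ state + W_fb @ y_prev
+    state = (1 - leaking_rate) * state + leaking_rate * torch.tanh(pre)
+    y = W_out @ state                     # this prediction is fed back on the next step
+    return state, y
+
+state, y = torch.zeros(n_nodes, 1), torch.zeros(n_outputs, 1)
+for u in torch.zeros(10, n_inputs, 1):    # ten dummy inputs, just to show the loop
+    state, y = esn_step(u, state, y)
+```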
+### Bayesian Optimization
+Unlike most other reservoir neural network packages, ours offers automatic hyper-parameter tuning.
+```python
+# any hyper-parameter can be prefixed with 'log_'; RcTorch interprets this and searches that parameter in log space
+bounds_dict = {"log_connectivity" : (-2.5, -0.1),
+ "spectral_radius" : (0.1, 3),
+ "n_nodes" : (300,302),
+ "log_regularization" : (-3, 1),
+ "leaking_rate" : (0, 0.2),
+ "bias": (-1,1),
+ }
+rc_specs = {"feedback" : True,
+ "reservoir_weight_dist" : "uniform",
+ "output_activation" : "tanh",
+ "random_seed" : 209}
+rc_bo = RcBayesOpt(bounds = bounds_dict,
+ scoring_method = "nmse",
+ n_jobs = 1,
+ cv_samples = 3,
+ initial_samples= 25,
+ **rc_specs
+ )
+```
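+The description stops after constructing `rc_bo`, so nothing above actually runs the search. The upstream RcTorch examples then call an optimize method on the training data and pass the returned hyper-parameters to a fresh `RcNetwork`; the method name and arguments in the sketch below (`optimize`, `n_trust_regions`, `max_evals`, `x`, `y`) are recalled from those examples and should be verified against the installed version before use.
+```python
+# Hedged sketch of running the BO search and reusing its result; verify the
+# optimize() signature against your installed RcTorch version.
+opt_hps = rc_bo.optimize(n_trust_regions = 4,
+                         max_evals = 500,
+                         x = force_train,
+                         y = target_train)
+
+best_rc = RcNetwork(**opt_hps, random_state = 210, feedback = True)
+best_rc.fit(x = force_train, y = target_train)
+score, prediction = best_rc.test(x = force_test, y = target_test)
+```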
+
+%package -n python3-rctorchprivate
+Summary: A Python 3 toolset for creating and optimizing Echo State Networks
+Provides: python-rctorchprivate
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-rctorchprivate
+A PyTorch toolset for creating and optimizing Echo State Networks.
+> License: MIT, 2020-2021
+> Authors: Hayden Joy, Marios Mattheakis
+Contains:
+- An ESN reservoir architecture class, `rc.py`
+- A Bayesian Optimization (BO) class, `rc_bayes.py`, with optimized routines for Echo State Networks through `BoTorch` (GPU optimized); it can train multiple RCs in parallel during BO
+  - includes an implementation of the TuRBO-1 algorithm as outlined in the TuRBO paper (code: https://github.com/uber-research/TuRBO)
+- Capable of solving differential equations (the population equation, the Bernoulli equation, a simple harmonic oscillator, and a nonlinear oscillator)
+Reference to the prior implementation:
+This library is an extension and expansion of a previous library written by Reinier Maat: https://github.com/1Reinier/Reservoir
+2018 International Joint Conference on Neural Networks (IJCNN), pp. 1-7. IEEE, 2018
+https://arxiv.org/abs/1903.05071
+For example usage, please see the notebooks folder.
+# Installation
+## Using pip
+Like most standard libraries, `rctorch` is hosted on [PyPI](https://pypi.org/project/RcTorch/). To install the latest stable release:
+```bash
+pip install -U rctorch # '-U' means update to latest version
+```
+## Example Usages
+### Imports
+```python
+import rctorch            # needed so that rctorch.data.load(...) below resolves
+from rctorch import *
+import torch
+```
+### Load data
+RcTorch has several built-in datasets. Among these is the forced pendulum dataset. Here we demonstrate how to load it:
+```python
+fp_data = rctorch.data.load("forced_pendulum", train_proportion = 0.2)
+force_train, force_test = fp_data["force"]
+target_train, target_test = fp_data["target"]
+#Alternatively you can use sklearn's train_test_split.
+```
+### Hyper-parameters
+```python
+# declare the hyper-parameters
+hps = {'connectivity': 0.4,
+       'spectral_radius': 1.13,
+       'n_nodes': 202,
+       'regularization': 1.69,
+       'leaking_rate': 0.0098085,
+       'bias': 0.49}
+```
+### Setting up your very own EchoStateNetwork
+```python
+my_rc = RcNetwork(**hps, random_state = 210, feedback = True)
+# fit the model to the training data
+my_rc.fit(y = target_train)
+# make a prediction on the test set and score it
+score, prediction = my_rc.test(y = target_test)
+my_rc.combined_plot()
+```
+![](https://raw.githubusercontent.com/blindedjoy/RcTorch-private/blob/master/resources/pure_prediction1.jpg)
+Feedback allows the network to feed in the prediction at the previous timestep as an input. This helps the RC to make longer and more stable predictions in many situations.
+### Bayesian Optimization
+Unlike most other reservoir neural network packages, ours offers automatic hyper-parameter tuning.
+```python
+# any hyper-parameter can be prefixed with 'log_'; RcTorch interprets this and searches that parameter in log space
+bounds_dict = {"log_connectivity" : (-2.5, -0.1),
+ "spectral_radius" : (0.1, 3),
+ "n_nodes" : (300,302),
+ "log_regularization" : (-3, 1),
+ "leaking_rate" : (0, 0.2),
+ "bias": (-1,1),
+ }
+rc_specs = {"feedback" : True,
+ "reservoir_weight_dist" : "uniform",
+ "output_activation" : "tanh",
+ "random_seed" : 209}
+rc_bo = RcBayesOpt(bounds = bounds_dict,
+ scoring_method = "nmse",
+ n_jobs = 1,
+ cv_samples = 3,
+ initial_samples= 25,
+ **rc_specs
+ )
+```
+
+%package help
+Summary: Development documents and examples for rctorchprivate
+Provides: python3-rctorchprivate-doc
+%description help
+A PyTorch toolset for creating and optimizing Echo State Networks.
+> License: MIT, 2020-2021
+> Authors: Hayden Joy, Marios Mattheakis
+Contains:
+- An ESN reservoir architecture class, `rc.py`
+- A Bayesian Optimization (BO) class, `rc_bayes.py`, with optimized routines for Echo State Networks through `BoTorch` (GPU optimized); it can train multiple RCs in parallel during BO
+  - includes an implementation of the TuRBO-1 algorithm as outlined in the TuRBO paper (code: https://github.com/uber-research/TuRBO)
+- Capable of solving differential equations (the population equation, the Bernoulli equation, a simple harmonic oscillator, and a nonlinear oscillator)
+Reference to the prior implementation:
+This library is an extension and expansion of a previous library written by Reinier Maat: https://github.com/1Reinier/Reservoir
+2018 International Joint Conference on Neural Networks (IJCNN), pp. 1-7. IEEE, 2018
+https://arxiv.org/abs/1903.05071
+For example usage, please see the notebooks folder.
+# Installation
+## Using pip
+Like most standard libraries, `rctorch` is hosted on [PyPI](https://pypi.org/project/RcTorch/). To install the latest stable release:
+```bash
+pip install -U rctorch # '-U' means update to latest version
+```
+## Example Usages
+### Imports
+```python
+import rctorch            # needed so that rctorch.data.load(...) below resolves
+from rctorch import *
+import torch
+```
+### Load data
+RcTorch has several built-in datasets. Among these is the forced pendulum dataset. Here we demonstrate how to load it:
+```python
+fp_data = rctorch.data.load("forced_pendulum", train_proportion = 0.2)
+force_train, force_test = fp_data["force"]
+target_train, target_test = fp_data["target"]
+#Alternatively you can use sklearn's train_test_split.
+```
+### Hyper-parameters
+```python
+# declare the hyper-parameters
+hps = {'connectivity': 0.4,
+       'spectral_radius': 1.13,
+       'n_nodes': 202,
+       'regularization': 1.69,
+       'leaking_rate': 0.0098085,
+       'bias': 0.49}
+```
+### Setting up your very own EchoStateNetwork
+```python
+my_rc = RcNetwork(**hps, random_state = 210, feedback = True)
+# fit the model to the training data
+my_rc.fit(y = target_train)
+# make a prediction on the test set and score it
+score, prediction = my_rc.test(y = target_test)
+my_rc.combined_plot()
+```
+![](https://raw.githubusercontent.com/blindedjoy/RcTorch-private/blob/master/resources/pure_prediction1.jpg)
+Feedback allows the network to feed in the prediction at the previous timestep as an input. This helps the RC to make longer and more stable predictions in many situations.
+### Bayesian Optimization
+Unlike most other reservoir neural network packages, ours offers automatic hyper-parameter tuning.
+```python
+# any hyper-parameter can be prefixed with 'log_'; RcTorch interprets this and searches that parameter in log space
+bounds_dict = {"log_connectivity" : (-2.5, -0.1),
+ "spectral_radius" : (0.1, 3),
+ "n_nodes" : (300,302),
+ "log_regularization" : (-3, 1),
+ "leaking_rate" : (0, 0.2),
+ "bias": (-1,1),
+ }
+rc_specs = {"feedback" : True,
+ "reservoir_weight_dist" : "uniform",
+ "output_activation" : "tanh",
+ "random_seed" : 209}
+rc_bo = RcBayesOpt(bounds = bounds_dict,
+ scoring_method = "nmse",
+ n_jobs = 1,
+ cv_samples = 3,
+ initial_samples= 25,
+ **rc_specs
+ )
+```
+
+%prep
+%autosetup -n rctorchprivate-0.9819998
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-rctorchprivate -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon May 15 2023 Python_Bot <Python_Bot@openeuler.org> - 0.9819998-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..cf63f7b
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+afb0949432aed7a30eaad46eafa09568 rctorchprivate-0.9819998.tar.gz