author     CoprDistGit <infra@openeuler.org>  2023-05-29 10:27:16 +0000
committer  CoprDistGit <infra@openeuler.org>  2023-05-29 10:27:16 +0000
commit     ff8aba06bbb5518625f174bf7c76ef50267fcfca (patch)
tree       c9d3ccf03848e5be2157451176bcf8dfcbd6b86d
parent     4bb38a33b4e2d65e4eef9ef9cf7fa55675452393 (diff)
automatic import of python-gpjax
-rw-r--r--  .gitignore           1
-rw-r--r--  python-gpjax.spec  699
-rw-r--r--  sources              1
3 files changed, 701 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..704bf0a 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/gpjax-0.6.1.tar.gz
diff --git a/python-gpjax.spec b/python-gpjax.spec
new file mode 100644
index 0000000..227a1ca
--- /dev/null
+++ b/python-gpjax.spec
@@ -0,0 +1,699 @@
+%global _empty_manifest_terminate_build 0
+Name: python-gpjax
+Version: 0.6.1
+Release: 1
+Summary: Gaussian processes in JAX.
+License: Apache-2.0
+URL: https://github.com/JaxGaussianProcesses/GPJax
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/39/0a/4296668bec6e4468018d9898615e84ce92abd76ba70cbe1b64d286049388/gpjax-0.6.1.tar.gz
+BuildArch: noarch
+
+Requires: python3-jax
+Requires: python3-optax
+Requires: python3-jaxtyping
+Requires: python3-tqdm
+Requires: python3-simple-pytree
+Requires: python3-tensorflow-probability
+Requires: python3-orbax-checkpoint
+Requires: python3-beartype
+Requires: python3-jaxlib
+
+%description
+<!-- <h1 align='center'>GPJax</h1>
+<h2 align='center'>Gaussian processes in Jax.</h2> -->
+<p align="center">
+<img width="700" height="300" src="https://raw.githubusercontent.com/JaxGaussianProcesses/GPJax/main/docs/_static/gpjax_logo.svg" alt="GPJax's logo">
+</p>
+
+[![codecov](https://codecov.io/gh/JaxGaussianProcesses/GPJax/branch/master/graph/badge.svg?token=DM1DRDASU2)](https://codecov.io/gh/JaxGaussianProcesses/GPJax)
+[![CodeFactor](https://www.codefactor.io/repository/github/jaxgaussianprocesses/gpjax/badge)](https://www.codefactor.io/repository/github/jaxgaussianprocesses/gpjax)
+[![Documentation Status](https://readthedocs.org/projects/gpjax/badge/?version=latest)](https://gpjax.readthedocs.io/en/latest/?badge=latest)
+[![PyPI version](https://badge.fury.io/py/GPJax.svg)](https://badge.fury.io/py/GPJax)
+[![DOI](https://joss.theoj.org/papers/10.21105/joss.04455/status.svg)](https://doi.org/10.21105/joss.04455)
+[![Downloads](https://pepy.tech/badge/gpjax)](https://pepy.tech/project/gpjax)
+[![Slack Invite](https://img.shields.io/badge/Slack_Invite--blue?style=social&logo=slack)](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw)
+
+[**Quickstart**](#simple-example)
+| [**Install guide**](#installation)
+| [**Documentation**](https://gpjax.readthedocs.io/en/latest/)
+| [**Slack Community**](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw)
+
+GPJax aims to provide a low-level interface to Gaussian process (GP) models in
+[Jax](https://github.com/google/jax), structured to give researchers maximum
+flexibility in extending the code to suit their own needs. The idea is that the
+code should be as close as possible to the maths we write on paper when working
+with GP models.
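+
+To make the "as close as possible to the maths" aim concrete, the example
+below rests on the standard conjugate GP regression results (a textbook recap,
+not a description of GPJax internals): a prior $f \sim \mathcal{GP}(m, k)$
+combined with a Gaussian likelihood $y_i = f(x_i) + \varepsilon_i$,
+$\varepsilon_i \sim \mathcal{N}(0, \sigma^2)$, yields the posterior predictive
+at test inputs $X_\star$
+
+$$
+\mu_\star = m(X_\star) + K_\star^\top (K + \sigma^2 I)^{-1} \big(y - m(X)\big),
+\qquad
+\Sigma_\star = K_{\star\star} - K_\star^\top (K + \sigma^2 I)^{-1} K_\star,
+$$
+
+where $K = k(X, X)$, $K_\star = k(X, X_\star)$ and
+$K_{\star\star} = k(X_\star, X_\star)$. The hyperparameters are selected by
+maximising the marginal log-likelihood
+
+$$
+\log p(y \mid X) = -\tfrac{1}{2}\big(y - m(X)\big)^\top (K + \sigma^2 I)^{-1}
+\big(y - m(X)\big) - \tfrac{1}{2}\log\lvert K + \sigma^2 I\rvert
+- \tfrac{n}{2}\log 2\pi.
+$$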
+
+# Package support
+
+GPJax was founded by [Thomas Pinder](https://github.com/thomaspinder). Today,
+the maintenance of GPJax is undertaken by [Thomas
+Pinder](https://github.com/thomaspinder) and [Daniel
+Dodd](https://github.com/Daniel-Dodd).
+
+We would be delighted to receive contributions from interested individuals and
+groups. To learn how you can get involved, please read our [guide for
+contributing](https://github.com/JaxGaussianProcesses/GPJax/blob/master/CONTRIBUTING.md).
+If you have any questions, we encourage you to [open an
+issue](https://github.com/JaxGaussianProcesses/GPJax/issues/new/choose). For
+broader conversations, such as best GP fitting practices or questions about the
+mathematics of GPs, we invite you to [open a
+discussion](https://github.com/JaxGaussianProcesses/GPJax/discussions).
+
+Feel free to join our [Slack
+Channel](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw),
+where we can discuss the development of GPJax and broader support for Gaussian
+process modelling.
+
+# Supported methods and interfaces
+
+## Notebook examples
+
+> - [**Conjugate Inference**](https://gpjax.readthedocs.io/en/latest/examples/regression.html)
+> - [**Classification with MCMC**](https://gpjax.readthedocs.io/en/latest/examples/classification.html)
+> - [**Sparse Variational Inference**](https://gpjax.readthedocs.io/en/latest/examples/uncollapsed_vi.html)
+> - [**BlackJax Integration**](https://gpjax.readthedocs.io/en/latest/examples/classification.html)
+> - [**Laplace Approximation**](https://gpjax.readthedocs.io/en/latest/examples/classification.html#Laplace-approximation)
+> - [**Inference on Non-Euclidean Spaces**](https://gpjax.readthedocs.io/en/latest/examples/kernels.html#Custom-Kernel)
+> - [**Inference on Graphs**](https://gpjax.readthedocs.io/en/latest/examples/graph_kernels.html)
+> - [**Learning Gaussian Process Barycentres**](https://gpjax.readthedocs.io/en/latest/examples/barycentres.html)
+> - [**Deep Kernel Regression**](https://gpjax.readthedocs.io/en/latest/examples/haiku.html)
+
+## Guides for customisation
+>
+> - [**Custom kernels**](https://gpjax.readthedocs.io/en/latest/examples/kernels.html#Custom-Kernel)
+> - [**UCI regression**](https://gpjax.readthedocs.io/en/latest/examples/yacht.html)
+
+## Conversion between `.ipynb` and `.py`
+The above examples are stored in the [examples](examples) directory in the
+double-percent (`py:percent`) format. See the [jupytext CLI
+documentation](https://jupytext.readthedocs.io/en/latest/using-cli.html) for
+more information.
+
+* To convert `example.py` to `example.ipynb`, run:
+
+```bash
+jupytext --to notebook example.py
+```
+
+* To convert `example.ipynb` to `example.py`, run:
+
+```bash
+jupytext --to py:percent example.ipynb
+```
+
+# Simple example
+
+Let us import some dependencies and simulate a toy dataset $\mathcal{D}$.
+
+```python
+import gpjax as gpx
+from jax import grad, jit
+import jax.numpy as jnp
+import jax.random as jr
+import optax as ox
+
+key = jr.PRNGKey(123)
+
+f = lambda x: 10 * jnp.sin(x)
+
+n = 50
+x = jr.uniform(key=key, minval=-3.0, maxval=3.0, shape=(n, 1)).sort(axis=0)
+y = f(x) + jr.normal(key, shape=(n,1))
+D = gpx.Dataset(X=x, y=y)
+
+# Construct the prior
+meanf = gpx.mean_functions.Zero()
+kernel = gpx.kernels.RBF()
+prior = gpx.Prior(mean_function=meanf, kernel=kernel)
+
+# Define a likelihood
+likelihood = gpx.Gaussian(num_datapoints=n)
+
+# Construct the posterior
+posterior = prior * likelihood
+
+# Define an optimiser
+optimiser = ox.adam(learning_rate=1e-2)
+
+# Define the marginal log-likelihood
+negative_mll = jit(gpx.objectives.ConjugateMLL(negative=True))
+
+# Obtain Type 2 MLEs of the hyperparameters
+opt_posterior, history = gpx.fit(
+ model=posterior,
+ objective=negative_mll,
+ train_data=D,
+ optim=optimiser,
+ num_iters=500,
+ safe=True,
+ key=key,
+)
+
+# Infer the predictive posterior distribution
+xtest = jnp.linspace(-3., 3., 100).reshape(-1, 1)
+latent_dist = opt_posterior(xtest, D)
+predictive_dist = opt_posterior.likelihood(latent_dist)
+
+# Obtain the predictive mean and standard deviation
+pred_mean = predictive_dist.mean()
+pred_std = predictive_dist.stddev()
+```
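+
+For a quick visual check of the fit, here is a minimal plotting sketch
+(assuming `matplotlib` is installed; illustrative only, not part of the
+upstream example):
+
+```python
+import matplotlib.pyplot as plt
+
+# Plot the noisy observations, the predictive mean and a ±2 std. dev. band.
+fig, ax = plt.subplots(figsize=(7, 3))
+ax.plot(x, y, "o", alpha=0.5, label="observations")
+ax.plot(xtest.squeeze(), pred_mean, label="predictive mean")
+ax.fill_between(
+    xtest.squeeze(),
+    pred_mean - 2 * pred_std,
+    pred_mean + 2 * pred_std,
+    alpha=0.2,
+    label="two standard deviations",
+)
+ax.legend(loc="best")
+plt.show()
+```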
+
+# Installation
+
+## Stable version
+
+The latest stable version of GPJax can be installed via
+pip:
+
+```bash
+pip install gpjax
+```
+
+> **Note**
+>
+> We recommend you verify the installed version:
+> ```bash
+> python -c 'import gpjax; print(gpjax.__version__)'
+> ```
+
+
+
+## Development version
+> **Warning**
+>
+> This version is possibly unstable and may contain bugs.
+
+Clone a copy of the repository to your local machine and install the package
+in development mode with Poetry:
+```bash
+git clone https://github.com/JaxGaussianProcesses/GPJax.git
+cd GPJax
+poetry install
+```
+
+> **Note**
+>
+> We advise you to create a virtual environment before installing:
+> ```bash
+> conda create -n gpjax_experimental python=3.10.0
+> conda activate gpjax_experimental
+> ```
+>
+> and recommend that you check your installation passes the supplied unit tests:
+>
+> ```bash
+> poetry run pytest
+> ```
+
+# Citing GPJax
+
+If you use GPJax in your research, please cite our [JOSS paper](https://joss.theoj.org/papers/10.21105/joss.04455#).
+
+```bibtex
+@article{Pinder2022,
+ doi = {10.21105/joss.04455},
+ url = {https://doi.org/10.21105/joss.04455},
+ year = {2022},
+ publisher = {The Open Journal},
+ volume = {7},
+ number = {75},
+ pages = {4455},
+ author = {Thomas Pinder and Daniel Dodd},
+ title = {GPJax: A Gaussian Process Framework in JAX},
+ journal = {Journal of Open Source Software}
+}
+```
+
+
+%package -n python3-gpjax
+Summary: Gaussian processes in JAX.
+Provides: python-gpjax
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-gpjax
+<!-- <h1 align='center'>GPJax</h1>
+<h2 align='center'>Gaussian processes in Jax.</h2> -->
+<p align="center">
+<img width="700" height="300" src="https://raw.githubusercontent.com/JaxGaussianProcesses/GPJax/main/docs/_static/gpjax_logo.svg" alt="GPJax's logo">
+</p>
+
+[![codecov](https://codecov.io/gh/JaxGaussianProcesses/GPJax/branch/master/graph/badge.svg?token=DM1DRDASU2)](https://codecov.io/gh/JaxGaussianProcesses/GPJax)
+[![CodeFactor](https://www.codefactor.io/repository/github/jaxgaussianprocesses/gpjax/badge)](https://www.codefactor.io/repository/github/jaxgaussianprocesses/gpjax)
+[![Documentation Status](https://readthedocs.org/projects/gpjax/badge/?version=latest)](https://gpjax.readthedocs.io/en/latest/?badge=latest)
+[![PyPI version](https://badge.fury.io/py/GPJax.svg)](https://badge.fury.io/py/GPJax)
+[![DOI](https://joss.theoj.org/papers/10.21105/joss.04455/status.svg)](https://doi.org/10.21105/joss.04455)
+[![Downloads](https://pepy.tech/badge/gpjax)](https://pepy.tech/project/gpjax)
+[![Slack Invite](https://img.shields.io/badge/Slack_Invite--blue?style=social&logo=slack)](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw)
+
+[**Quickstart**](#simple-example)
+| [**Install guide**](#installation)
+| [**Documentation**](https://gpjax.readthedocs.io/en/latest/)
+| [**Slack Community**](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw)
+
+GPJax aims to provide a low-level interface to Gaussian process (GP) models in
+[Jax](https://github.com/google/jax), structured to give researchers maximum
+flexibility in extending the code to suit their own needs. The idea is that the
+code should be as close as possible to the maths we write on paper when working
+with GP models.
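+
+To make the "as close as possible to the maths" aim concrete, the example
+below rests on the standard conjugate GP regression results (a textbook recap,
+not a description of GPJax internals): a prior $f \sim \mathcal{GP}(m, k)$
+combined with a Gaussian likelihood $y_i = f(x_i) + \varepsilon_i$,
+$\varepsilon_i \sim \mathcal{N}(0, \sigma^2)$, yields the posterior predictive
+at test inputs $X_\star$
+
+$$
+\mu_\star = m(X_\star) + K_\star^\top (K + \sigma^2 I)^{-1} \big(y - m(X)\big),
+\qquad
+\Sigma_\star = K_{\star\star} - K_\star^\top (K + \sigma^2 I)^{-1} K_\star,
+$$
+
+where $K = k(X, X)$, $K_\star = k(X, X_\star)$ and
+$K_{\star\star} = k(X_\star, X_\star)$. The hyperparameters are selected by
+maximising the marginal log-likelihood
+
+$$
+\log p(y \mid X) = -\tfrac{1}{2}\big(y - m(X)\big)^\top (K + \sigma^2 I)^{-1}
+\big(y - m(X)\big) - \tfrac{1}{2}\log\lvert K + \sigma^2 I\rvert
+- \tfrac{n}{2}\log 2\pi.
+$$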
+
+# Package support
+
+GPJax was founded by [Thomas Pinder](https://github.com/thomaspinder). Today,
+the maintenance of GPJax is undertaken by [Thomas
+Pinder](https://github.com/thomaspinder) and [Daniel
+Dodd](https://github.com/Daniel-Dodd).
+
+We would be delighted to receive contributions from interested individuals and
+groups. To learn how you can get involved, please read our [guide for
+contributing](https://github.com/JaxGaussianProcesses/GPJax/blob/master/CONTRIBUTING.md).
+If you have any questions, we encourage you to [open an
+issue](https://github.com/JaxGaussianProcesses/GPJax/issues/new/choose). For
+broader conversations, such as best GP fitting practices or questions about the
+mathematics of GPs, we invite you to [open a
+discussion](https://github.com/JaxGaussianProcesses/GPJax/discussions).
+
+Feel free to join our [Slack
+Channel](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw),
+where we can discuss the development of GPJax and broader support for Gaussian
+process modelling.
+
+# Supported methods and interfaces
+
+## Notebook examples
+
+> - [**Conjugate Inference**](https://gpjax.readthedocs.io/en/latest/examples/regression.html)
+> - [**Classification with MCMC**](https://gpjax.readthedocs.io/en/latest/examples/classification.html)
+> - [**Sparse Variational Inference**](https://gpjax.readthedocs.io/en/latest/examples/uncollapsed_vi.html)
+> - [**BlackJax Integration**](https://gpjax.readthedocs.io/en/latest/examples/classification.html)
+> - [**Laplace Approximation**](https://gpjax.readthedocs.io/en/latest/examples/classification.html#Laplace-approximation)
+> - [**Inference on Non-Euclidean Spaces**](https://gpjax.readthedocs.io/en/latest/examples/kernels.html#Custom-Kernel)
+> - [**Inference on Graphs**](https://gpjax.readthedocs.io/en/latest/examples/graph_kernels.html)
+> - [**Learning Gaussian Process Barycentres**](https://gpjax.readthedocs.io/en/latest/examples/barycentres.html)
+> - [**Deep Kernel Regression**](https://gpjax.readthedocs.io/en/latest/examples/haiku.html)
+
+## Guides for customisation
+>
+> - [**Custom kernels**](https://gpjax.readthedocs.io/en/latest/examples/kernels.html#Custom-Kernel)
+> - [**UCI regression**](https://gpjax.readthedocs.io/en/latest/examples/yacht.html)
+
+## Conversion between `.ipynb` and `.py`
+The above examples are stored in the [examples](examples) directory in the
+double-percent (`py:percent`) format. See the [jupytext CLI
+documentation](https://jupytext.readthedocs.io/en/latest/using-cli.html) for
+more information.
+
+* To convert `example.py` to `example.ipynb`, run:
+
+```bash
+jupytext --to notebook example.py
+```
+
+* To convert `example.ipynb` to `example.py`, run:
+
+```bash
+jupytext --to py:percent example.ipynb
+```
+
+# Simple example
+
+Let us import some dependencies and simulate a toy dataset $\mathcal{D}$.
+
+```python
+import gpjax as gpx
+from jax import grad, jit
+import jax.numpy as jnp
+import jax.random as jr
+import optax as ox
+
+key = jr.PRNGKey(123)
+
+f = lambda x: 10 * jnp.sin(x)
+
+n = 50
+x = jr.uniform(key=key, minval=-3.0, maxval=3.0, shape=(n, 1)).sort(axis=0)
+y = f(x) + jr.normal(key, shape=(n,1))
+D = gpx.Dataset(X=x, y=y)
+
+# Construct the prior
+meanf = gpx.mean_functions.Zero()
+kernel = gpx.kernels.RBF()
+prior = gpx.Prior(mean_function=meanf, kernel=kernel)
+
+# Define a likelihood
+likelihood = gpx.Gaussian(num_datapoints=n)
+
+# Construct the posterior
+posterior = prior * likelihood
+
+# Define an optimiser
+optimiser = ox.adam(learning_rate=1e-2)
+
+# Define the marginal log-likelihood
+negative_mll = jit(gpx.objectives.ConjugateMLL(negative=True))
+
+# Obtain Type 2 MLEs of the hyperparameters
+opt_posterior, history = gpx.fit(
+ model=posterior,
+ objective=negative_mll,
+ train_data=D,
+ optim=optimiser,
+ num_iters=500,
+ safe=True,
+ key=key,
+)
+
+# Infer the predictive posterior distribution
+xtest = jnp.linspace(-3., 3., 100).reshape(-1, 1)
+latent_dist = opt_posterior(xtest, D)
+predictive_dist = opt_posterior.likelihood(latent_dist)
+
+# Obtain the predictive mean and standard deviation
+pred_mean = predictive_dist.mean()
+pred_std = predictive_dist.stddev()
+```
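+
+For a quick visual check of the fit, here is a minimal plotting sketch
+(assuming `matplotlib` is installed; illustrative only, not part of the
+upstream example):
+
+```python
+import matplotlib.pyplot as plt
+
+# Plot the noisy observations, the predictive mean and a ±2 std. dev. band.
+fig, ax = plt.subplots(figsize=(7, 3))
+ax.plot(x, y, "o", alpha=0.5, label="observations")
+ax.plot(xtest.squeeze(), pred_mean, label="predictive mean")
+ax.fill_between(
+    xtest.squeeze(),
+    pred_mean - 2 * pred_std,
+    pred_mean + 2 * pred_std,
+    alpha=0.2,
+    label="two standard deviations",
+)
+ax.legend(loc="best")
+plt.show()
+```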
+
+# Installation
+
+## Stable version
+
+The latest stable version of GPJax can be installed via
+pip:
+
+```bash
+pip install gpjax
+```
+
+> **Note**
+>
+> We recommend you verify the installed version:
+> ```bash
+> python -c 'import gpjax; print(gpjax.__version__)'
+> ```
+
+
+
+## Development version
+> **Warning**
+>
+> This version is possibly unstable and may contain bugs.
+
+Clone a copy of the repository to your local machine and install the package
+in development mode with Poetry:
+```bash
+git clone https://github.com/JaxGaussianProcesses/GPJax.git
+cd GPJax
+poetry install
+```
+
+> **Note**
+>
+> We advise you to create a virtual environment before installing:
+> ```bash
+> conda create -n gpjax_experimental python=3.10.0
+> conda activate gpjax_experimental
+> ```
+>
+> and recommend that you check your installation passes the supplied unit tests:
+>
+> ```bash
+> poetry run pytest
+> ```
+
+# Citing GPJax
+
+If you use GPJax in your research, please cite our [JOSS paper](https://joss.theoj.org/papers/10.21105/joss.04455#).
+
+```bibtex
+@article{Pinder2022,
+ doi = {10.21105/joss.04455},
+ url = {https://doi.org/10.21105/joss.04455},
+ year = {2022},
+ publisher = {The Open Journal},
+ volume = {7},
+ number = {75},
+ pages = {4455},
+ author = {Thomas Pinder and Daniel Dodd},
+ title = {GPJax: A Gaussian Process Framework in JAX},
+ journal = {Journal of Open Source Software}
+}
+```
+
+
+%package help
+Summary: Development documents and examples for gpjax
+Provides: python3-gpjax-doc
+%description help
+<!-- <h1 align='center'>GPJax</h1>
+<h2 align='center'>Gaussian processes in Jax.</h2> -->
+<p align="center">
+<img width="700" height="300" src="https://raw.githubusercontent.com/JaxGaussianProcesses/GPJax/main/docs/_static/gpjax_logo.svg" alt="GPJax's logo">
+</p>
+
+[![codecov](https://codecov.io/gh/JaxGaussianProcesses/GPJax/branch/master/graph/badge.svg?token=DM1DRDASU2)](https://codecov.io/gh/JaxGaussianProcesses/GPJax)
+[![CodeFactor](https://www.codefactor.io/repository/github/jaxgaussianprocesses/gpjax/badge)](https://www.codefactor.io/repository/github/jaxgaussianprocesses/gpjax)
+[![Documentation Status](https://readthedocs.org/projects/gpjax/badge/?version=latest)](https://gpjax.readthedocs.io/en/latest/?badge=latest)
+[![PyPI version](https://badge.fury.io/py/GPJax.svg)](https://badge.fury.io/py/GPJax)
+[![DOI](https://joss.theoj.org/papers/10.21105/joss.04455/status.svg)](https://doi.org/10.21105/joss.04455)
+[![Downloads](https://pepy.tech/badge/gpjax)](https://pepy.tech/project/gpjax)
+[![Slack Invite](https://img.shields.io/badge/Slack_Invite--blue?style=social&logo=slack)](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw)
+
+[**Quickstart**](#simple-example)
+| [**Install guide**](#installation)
+| [**Documentation**](https://gpjax.readthedocs.io/en/latest/)
+| [**Slack Community**](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw)
+
+GPJax aims to provide a low-level interface to Gaussian process (GP) models in
+[Jax](https://github.com/google/jax), structured to give researchers maximum
+flexibility in extending the code to suit their own needs. The idea is that the
+code should be as close as possible to the maths we write on paper when working
+with GP models.
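+
+To make the "as close as possible to the maths" aim concrete, the example
+below rests on the standard conjugate GP regression results (a textbook recap,
+not a description of GPJax internals): a prior $f \sim \mathcal{GP}(m, k)$
+combined with a Gaussian likelihood $y_i = f(x_i) + \varepsilon_i$,
+$\varepsilon_i \sim \mathcal{N}(0, \sigma^2)$, yields the posterior predictive
+at test inputs $X_\star$
+
+$$
+\mu_\star = m(X_\star) + K_\star^\top (K + \sigma^2 I)^{-1} \big(y - m(X)\big),
+\qquad
+\Sigma_\star = K_{\star\star} - K_\star^\top (K + \sigma^2 I)^{-1} K_\star,
+$$
+
+where $K = k(X, X)$, $K_\star = k(X, X_\star)$ and
+$K_{\star\star} = k(X_\star, X_\star)$. The hyperparameters are selected by
+maximising the marginal log-likelihood
+
+$$
+\log p(y \mid X) = -\tfrac{1}{2}\big(y - m(X)\big)^\top (K + \sigma^2 I)^{-1}
+\big(y - m(X)\big) - \tfrac{1}{2}\log\lvert K + \sigma^2 I\rvert
+- \tfrac{n}{2}\log 2\pi.
+$$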
+
+# Package support
+
+GPJax was founded by [Thomas Pinder](https://github.com/thomaspinder). Today,
+the maintenance of GPJax is undertaken by [Thomas
+Pinder](https://github.com/thomaspinder) and [Daniel
+Dodd](https://github.com/Daniel-Dodd).
+
+We would be delighted to receive contributions from interested individuals and
+groups. To learn how you can get involved, please read our [guide for
+contributing](https://github.com/JaxGaussianProcesses/GPJax/blob/master/CONTRIBUTING.md).
+If you have any questions, we encourage you to [open an
+issue](https://github.com/JaxGaussianProcesses/GPJax/issues/new/choose). For
+broader conversations, such as best GP fitting practices or questions about the
+mathematics of GPs, we invite you to [open a
+discussion](https://github.com/JaxGaussianProcesses/GPJax/discussions).
+
+Feel free to join our [Slack
+Channel](https://join.slack.com/t/gpjax/shared_invite/zt-1da57pmjn-rdBCVg9kApirEEn2E5Q2Zw),
+where we can discuss the development of GPJax and broader support for Gaussian
+process modelling.
+
+# Supported methods and interfaces
+
+## Notebook examples
+
+> - [**Conjugate Inference**](https://gpjax.readthedocs.io/en/latest/examples/regression.html)
+> - [**Classification with MCMC**](https://gpjax.readthedocs.io/en/latest/examples/classification.html)
+> - [**Sparse Variational Inference**](https://gpjax.readthedocs.io/en/latest/examples/uncollapsed_vi.html)
+> - [**BlackJax Integration**](https://gpjax.readthedocs.io/en/latest/examples/classification.html)
+> - [**Laplace Approximation**](https://gpjax.readthedocs.io/en/latest/examples/classification.html#Laplace-approximation)
+> - [**Inference on Non-Euclidean Spaces**](https://gpjax.readthedocs.io/en/latest/examples/kernels.html#Custom-Kernel)
+> - [**Inference on Graphs**](https://gpjax.readthedocs.io/en/latest/examples/graph_kernels.html)
+> - [**Learning Gaussian Process Barycentres**](https://gpjax.readthedocs.io/en/latest/examples/barycentres.html)
+> - [**Deep Kernel Regression**](https://gpjax.readthedocs.io/en/latest/examples/haiku.html)
+
+## Guides for customisation
+>
+> - [**Custom kernels**](https://gpjax.readthedocs.io/en/latest/examples/kernels.html#Custom-Kernel)
+> - [**UCI regression**](https://gpjax.readthedocs.io/en/latest/examples/yacht.html)
+
+## Conversion between `.ipynb` and `.py`
+The above examples are stored in the [examples](examples) directory in the
+double-percent (`py:percent`) format. See the [jupytext CLI
+documentation](https://jupytext.readthedocs.io/en/latest/using-cli.html) for
+more information.
+
+* To convert `example.py` to `example.ipynb`, run:
+
+```bash
+jupytext --to notebook example.py
+```
+
+* To convert `example.ipynb` to `example.py`, run:
+
+```bash
+jupytext --to py:percent example.ipynb
+```
+
+# Simple example
+
+Let us import some dependencies and simulate a toy dataset $\mathcal{D}$.
+
+```python
+import gpjax as gpx
+from jax import grad, jit
+import jax.numpy as jnp
+import jax.random as jr
+import optax as ox
+
+key = jr.PRNGKey(123)
+
+f = lambda x: 10 * jnp.sin(x)
+
+n = 50
+x = jr.uniform(key=key, minval=-3.0, maxval=3.0, shape=(n, 1)).sort(axis=0)
+y = f(x) + jr.normal(key, shape=(n,1))
+D = gpx.Dataset(X=x, y=y)
+
+# Construct the prior
+meanf = gpx.mean_functions.Zero()
+kernel = gpx.kernels.RBF()
+prior = gpx.Prior(mean_function=meanf, kernel=kernel)
+
+# Define a likelihood
+likelihood = gpx.Gaussian(num_datapoints=n)
+
+# Construct the posterior
+posterior = prior * likelihood
+
+# Define an optimiser
+optimiser = ox.adam(learning_rate=1e-2)
+
+# Define the marginal log-likelihood
+negative_mll = jit(gpx.objectives.ConjugateMLL(negative=True))
+
+# Obtain Type 2 MLEs of the hyperparameters
+opt_posterior, history = gpx.fit(
+ model=posterior,
+ objective=negative_mll,
+ train_data=D,
+ optim=optimiser,
+ num_iters=500,
+ safe=True,
+ key=key,
+)
+
+# Infer the predictive posterior distribution
+xtest = jnp.linspace(-3., 3., 100).reshape(-1, 1)
+latent_dist = opt_posterior(xtest, D)
+predictive_dist = opt_posterior.likelihood(latent_dist)
+
+# Obtain the predictive mean and standard deviation
+pred_mean = predictive_dist.mean()
+pred_std = predictive_dist.stddev()
+```
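+
+For a quick visual check of the fit, here is a minimal plotting sketch
+(assuming `matplotlib` is installed; illustrative only, not part of the
+upstream example):
+
+```python
+import matplotlib.pyplot as plt
+
+# Plot the noisy observations, the predictive mean and a ±2 std. dev. band.
+fig, ax = plt.subplots(figsize=(7, 3))
+ax.plot(x, y, "o", alpha=0.5, label="observations")
+ax.plot(xtest.squeeze(), pred_mean, label="predictive mean")
+ax.fill_between(
+    xtest.squeeze(),
+    pred_mean - 2 * pred_std,
+    pred_mean + 2 * pred_std,
+    alpha=0.2,
+    label="two standard deviations",
+)
+ax.legend(loc="best")
+plt.show()
+```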
+
+# Installation
+
+## Stable version
+
+The latest stable version of GPJax can be installed via
+pip:
+
+```bash
+pip install gpjax
+```
+
+> **Note**
+>
+> We recommend you verify the installed version:
+> ```bash
+> python -c 'import gpjax; print(gpjax.__version__)'
+> ```
+
+
+
+## Development version
+> **Warning**
+>
+> This version is possibly unstable and may contain bugs.
+
+Clone a copy of the repository to your local machine and install the package
+in development mode with Poetry:
+```bash
+git clone https://github.com/JaxGaussianProcesses/GPJax.git
+cd GPJax
+poetry install
+```
+
+> **Note**
+>
+> We advise you to create a virtual environment before installing:
+> ```bash
+> conda create -n gpjax_experimental python=3.10.0
+> conda activate gpjax_experimental
+> ```
+>
+> and recommend that you check your installation passes the supplied unit tests:
+>
+> ```bash
+> poetry run pytest
+> ```
+
+# Citing GPJax
+
+If you use GPJax in your research, please cite our [JOSS paper](https://joss.theoj.org/papers/10.21105/joss.04455#).
+
+```bibtex
+@article{Pinder2022,
+ doi = {10.21105/joss.04455},
+ url = {https://doi.org/10.21105/joss.04455},
+ year = {2022},
+ publisher = {The Open Journal},
+ volume = {7},
+ number = {75},
+ pages = {4455},
+ author = {Thomas Pinder and Daniel Dodd},
+ title = {GPJax: A Gaussian Process Framework in JAX},
+ journal = {Journal of Open Source Software}
+}
+```
+
+
+%prep
+%autosetup -n gpjax-0.6.1
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-gpjax -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon May 29 2023 Python_Bot <Python_Bot@openeuler.org> - 0.6.1-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..07757d3
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+9fbb475a687fd03f514b0fc19e9f30bf gpjax-0.6.1.tar.gz