%global _empty_manifest_terminate_build 0
Name:		python-mtse
Version:	0.1.6
Release:	1
Summary:	Multi Time Series Encoders
License:	Apache-2.0
URL:		https://github.com/FractalySyn/mtse
Source0:	https://mirrors.aliyun.com/pypi/web/packages/7f/e2/83c5b6608acfaa38c6853d19cc5871f5409d3a8e89425b28f36443dcb077/mtse-0.1.6.tar.gz
BuildArch:	noarch

Requires:	python3-torch
Requires:	python3-numpy
Requires:	python3-pandas
Requires:	python3-matplotlib
Requires:	python3-setuptools

%description
# Multi Time Series Encoders

This Python package aims to make it easy to encode and classify/regress multivariate time series (**mts**) data, even when the series are asynchronous. Data are said to be of type **mts** when each observation is associated with multiple time series (e.g. the vital signs of a patient over a given period).
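
For intuition only, here is a minimal sketch of what asynchronous **mts** data can look like in long format with pandas. This is not the input schema expected by the package (see the documentation for that); the column names below are invented for the example.

```python
import pandas as pd

# Purely illustrative long-format example: one row per measurement.
# Each observation (here, a patient) carries several series (vital signs)
# sampled at their own irregular timestamps; that is what makes the data asynchronous.
sample = pd.DataFrame({
    'patient_id': [1, 1, 1, 1, 2, 2, 2],
    'time':       [0.0, 0.5, 0.5, 2.0, 0.0, 1.3, 4.1],
    'variable':   ['heart_rate', 'heart_rate', 'blood_pressure', 'heart_rate',
                   'blood_pressure', 'heart_rate', 'blood_pressure'],
    'value':      [72.0, 75.0, 118.0, 74.0, 121.0, 80.0, 115.0],
})
print(sample)
```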

## Installation

The current version was developed in Python 3.7 and also works in Python 3.8. If you encounter an issue, please try running it again in a virtual environment with Python 3.7 or 3.8.

```bash
pip install mtse
```

## Sample code

```python
import mtse

### Load sample data ###
train, val, test, norm = mtse.get_sample(return_norm=True)

### Using the class `mtse` ###
mtan = mtse.mtse(device='cuda', seed=1, experiment_id='mtan')
mtan.load_data(train, val, test, norm=norm)
mtan.build_model('mtan', 'regression', learn_emb=True, early_stop=10, cuda_empty_cache=True)
mtan.train(lossf='mape', n_iters=200, save_startegy='best')
mtan.predict(checkpoint='best')
mtan.encode_ts(data_to_embed='test', embed_pandas=True)
```
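
The sample above assumes a CUDA-capable GPU. As a small variation, assuming the `device` argument also accepts `'cpu'` (not verified against the API), the device can be chosen at runtime:

```python
import torch
import mtse

# Pick the device at runtime; assumes mtse.mtse also accepts device='cpu'.
device = 'cuda' if torch.cuda.is_available() else 'cpu'

train, val, test, norm = mtse.get_sample(return_norm=True)
mtan = mtse.mtse(device=device, seed=1, experiment_id='mtan')
mtan.load_data(train, val, test, norm=norm)
```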

**More details and examples are available in the documentation.**

## What can be implemented / improved

#### Encoders
  - [x] mTAN - Multi-Time Attention Network - encoder
  - [ ] mTAN - Multi-Time Attention Network - encoder-decoder
  - [ ] SeFT - Set Functions for Time Series
  - [ ] STraTS - Self-Supervised Transformer for Time-Series
  - [ ] ODE-based encoders

Note that only the mTAN encoder has been implemented so far, as a baseline. At this stage, this model supports supervised learning only, meaning that it uses the target variable to compute the loss and update the encoder weights. The priority is therefore to implement an unsupervised encoder next (an encoder-decoder model or a self-supervised encoder).

#### Other features
  - Cross-validation evaluation, prediction and encoding
  - Support for other data inputs in the dataset classes (currently the `mtan_Dataset` class)
  - Support for time-series forecasting and inference tasks

## References

Satya Narayan Shukla and Benjamin Marlin, ["Multi-Time Attention Networks for Irregularly Sampled Time Series"](https://openreview.net/forum?id=4c0J6lwQ4_), *International Conference on Learning Representations*, 2021.




%package -n python3-mtse
Summary:	Multi Time Series Encoders
Provides:	python-mtse
BuildRequires:	python3-devel
BuildRequires:	python3-setuptools
BuildRequires:	python3-pip
%description -n python3-mtse
# Multi Time Series Encoders

This Python package aims to make it easy to encode and classify/regress multivariate time series (**mts**) data, even when the series are asynchronous. Data are said to be of type **mts** when each observation is associated with multiple time series (e.g. the vital signs of a patient over a given period).

## Installation

The current version was developed in Python 3.7 and also works in Python 3.8. If you encounter an issue, please try running it again in a virtual environment with Python 3.7 or 3.8.

```bash
pip install mtse
```

## Sample code

```python
import mtse

### Load sample data ###
train, val, test, norm = mtse.get_sample(return_norm=True)

### Using the class `mtse` ###
mtan = mtse.mtse(device='cuda', seed=1, experiment_id='mtan')
mtan.load_data(train, val, test, norm=norm)
mtan.build_model('mtan', 'regression', learn_emb=True, early_stop=10, cuda_empty_cache=True)
mtan.train(lossf='mape', n_iters=200, save_startegy='best')
mtan.predict(checkpoint='best')
mtan.encode_ts(data_to_embed='test', embed_pandas=True)
```

**More details and examples are available in the documentation.**

## What can be implemented / improved

#### Encoders
  - [x] mTAN - Multi-Time Attention Network - encoder
  - [ ] mTAN - Multi-Time Attention Network - encoder-decoder
  - [ ] SeFT - Set Functions for Time Series
  - [ ] STraTS - Self-Supervised Transformer for Time-Series
  - [ ] ODE-based encoders

Note that only the mTAN encoder has been implemented so far, as a baseline. At this stage, this model supports supervised learning only, meaning that it uses the target variable to compute the loss and update the encoder weights. The priority is therefore to implement an unsupervised encoder next (an encoder-decoder model or a self-supervised encoder).

#### Other features
  - Cross-validation evaluation, prediction and encoding
  - Support for other data inputs in the dataset classes (currently the `mtan_Dataset` class)
  - Support for time-series forecasting and inference tasks

## References

Satya Narayan Shukla and Benjamin Marlin, ["Multi-Time Attention Networks for Irregularly Sampled Time Series"](https://openreview.net/forum?id=4c0J6lwQ4_), *International Conference on Learning Representations*, 2021.




%package help
Summary:	Development documents and examples for mtse
Provides:	python3-mtse-doc
%description help
# Multi Time Series Encoders

This Python package aims to make it easy to encode and classify/regress multivariate time series (**mts**) data, even when the series are asynchronous. Data are said to be of type **mts** when each observation is associated with multiple time series (e.g. the vital signs of a patient over a given period).

## Installation

The current version was developed in Python 3.7 and also works in Python 3.8. If you encounter an issue, please try running it again in a virtual environment with Python 3.7 or 3.8.

```bash
pip install mtse
```

## Sample code

```python
import mtse

### Load sample data ###
train, val, test, norm = mtse.get_sample(return_norm=True)

### Using the class `mtse` ###
mtan = mtse.mtse(device='cuda', seed=1, experiment_id='mtan')
mtan.load_data(train, val, test, norm=norm)
mtan.build_model('mtan', 'regression', learn_emb=True, early_stop=10, cuda_empty_cache=True)
mtan.train(lossf='mape', n_iters=200, save_startegy='best')
mtan.predict(checkpoint='best')
mtan.encode_ts(data_to_embed='test', embed_pandas=True)
```

**More details and examples are available in the documentation.**

## What can be implemented / improved

#### Encoders
  - [x] mTAN - Multi-Time Attention Network - encoder
  - [ ] mTAN - Multi-Time Attention Network - encoder-decoder
  - [ ] SeFT - Set Functions for Time Series
  - [ ] STraTS - Self-Supervised Transformer for Time-Series
  - [ ] ODE-based encoders

Note that only the mTAN encoder has been implemented so far, as a baseline. At this stage, this model supports supervised learning only, meaning that it uses the target variable to compute the loss and update the encoder weights. The priority is therefore to implement an unsupervised encoder next (an encoder-decoder model or a self-supervised encoder).

#### Other features
  - Cross-validation evaluation, prediction and encoding
  - Support for other data inputs in the dataset classes (currently the `mtan_Dataset` class)
  - Support for time-series forecasting and inference tasks

## References

Satya Narayan Shukla and Benjamin Marlin, ["Multi-Time Attention Networks for Irregularly Sampled Time Series"](https://openreview.net/forum?id=4c0J6lwQ4_), *International Conference on Learning Representations*, 2021.




%prep
%autosetup -n mtse-0.1.6

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
	find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
	find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
	find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
	find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
	find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-mtse -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Fri Jun 09 2023 Python_Bot <Python_Bot@openeuler.org> - 0.1.6-1
- Package Spec generated