%global _empty_manifest_terminate_build 0
Name: python-MetaNN
Version: 0.3.2
Release: 1
Summary: MetaNN provides extensions of PyTorch nn.Module for meta learning
License: MIT License
URL: https://github.com/yhqjohn/MetaModule
Source0: https://mirrors.aliyun.com/pypi/web/packages/6c/03/f8b81237ad7930ed19b3b7e4cdc5aa5ef4c60cc1936cdb102558d993bcb5/MetaNN-0.3.2.tar.gz
BuildArch: noarch
Requires: python3-torch
%description
1. Introduction
____________________
In meta-learning scenarios, it is common to use dependent variables as parameters and to back-propagate the gradients of those parameters. However, the parameters of a PyTorch Module are designed to be leaf nodes, and it is forbidden for parameters to have a grad_fn. Meta-learning developers are therefore forced to rewrite the basic layers to meet meta-learning requirements.
This module provides DependentModule, an extension of torch.nn.Module that has dependent parameters, allowing those parameters to be differentiable. It also provides a method to transform an nn.Module into a DependentModule, turning all of the parameters of the nn.Module into dependent parameters.
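The leaf-node restriction can be seen directly in plain PyTorch. The following is a minimal sketch (assuming only torch is installed, independent of MetaNN): module parameters are leaves with no grad_fn, while the updated tensor produced by a differentiable inner-loop step is a non-leaf that cannot serve as an ordinary Parameter:

```python
import torch
from torch import nn

lin = nn.Linear(3, 1)
w = lin.weight
# Module parameters are leaf tensors with no autograd history.
print(w.is_leaf)    # True
print(w.grad_fn)    # None

# A differentiable inner-loop SGD step, as meta-learning needs,
# produces a tensor that keeps its graph history...
loss = lin(torch.randn(2, 3)).sum()
grad, = torch.autograd.grad(loss, [w], create_graph=True)
w_new = w - 0.01 * grad

# ...so it is no longer a leaf, which is exactly what nn.Module
# parameters are forbidden from being.
print(w_new.is_leaf)             # False
print(w_new.grad_fn is not None) # True
```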
2. Installation
__________________
pip install MetaNN
3. Example
___________
PyTorch expects all parameters of a module to be independent variables. Using DependentModule, an arbitrary torch.nn.Module can be transformed into a dependent module.
from metann import DependentModule
from torch import nn

net = nn.Sequential(
    nn.Linear(10, 100),
    nn.Linear(100, 5))
net = DependentModule(net)
print(net)
Higher-level APIs such as the MAML class are recommended instead.
from metann.meta import MAML, default_evaluator_classification as evaluator
from torch import nn

net = nn.Sequential(
    nn.Linear(10, 100),
    nn.Linear(100, 5))
maml = MAML(net, steps_train=5, steps_eval=10, lr=0.01)
output = maml(data_train)
loss = evaluator(output, data_test)
loss.backward()
4. Documents
_____________
The documentation is available on ReadTheDocs.
`MetaNN `__
5. License
__________
`MIT `__
Copyright (c) 2019-present, Hanqiao Yu
%package -n python3-MetaNN
Summary: MetaNN provides extensions of PyTorch nn.Module for meta learning
Provides: python-MetaNN
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip
%description -n python3-MetaNN
1. Introduction
____________________
In meta-learning scenarios, it is common to use dependent variables as parameters and to back-propagate the gradients of those parameters. However, the parameters of a PyTorch Module are designed to be leaf nodes, and it is forbidden for parameters to have a grad_fn. Meta-learning developers are therefore forced to rewrite the basic layers to meet meta-learning requirements.
This module provides DependentModule, an extension of torch.nn.Module that has dependent parameters, allowing those parameters to be differentiable. It also provides a method to transform an nn.Module into a DependentModule, turning all of the parameters of the nn.Module into dependent parameters.
2. Installation
__________________
pip install MetaNN
3. Example
___________
PyTorch expects all parameters of a module to be independent variables. Using DependentModule, an arbitrary torch.nn.Module can be transformed into a dependent module.
from metann import DependentModule
from torch import nn

net = nn.Sequential(
    nn.Linear(10, 100),
    nn.Linear(100, 5))
net = DependentModule(net)
print(net)
Higher-level APIs such as the MAML class are recommended instead.
from metann.meta import MAML, default_evaluator_classification as evaluator
from torch import nn

net = nn.Sequential(
    nn.Linear(10, 100),
    nn.Linear(100, 5))
maml = MAML(net, steps_train=5, steps_eval=10, lr=0.01)
output = maml(data_train)
loss = evaluator(output, data_test)
loss.backward()
4. Documents
_____________
The documentation is available on ReadTheDocs.
`MetaNN `__
5. License
__________
`MIT `__
Copyright (c) 2019-present, Hanqiao Yu
%package help
Summary: Development documents and examples for MetaNN
Provides: python3-MetaNN-doc
%description help
1. Introduction
____________________
In meta-learning scenarios, it is common to use dependent variables as parameters and to back-propagate the gradients of those parameters. However, the parameters of a PyTorch Module are designed to be leaf nodes, and it is forbidden for parameters to have a grad_fn. Meta-learning developers are therefore forced to rewrite the basic layers to meet meta-learning requirements.
This module provides DependentModule, an extension of torch.nn.Module that has dependent parameters, allowing those parameters to be differentiable. It also provides a method to transform an nn.Module into a DependentModule, turning all of the parameters of the nn.Module into dependent parameters.
2. Installation
__________________
pip install MetaNN
3. Example
___________
PyTorch expects all parameters of a module to be independent variables. Using DependentModule, an arbitrary torch.nn.Module can be transformed into a dependent module.
from metann import DependentModule
from torch import nn

net = nn.Sequential(
    nn.Linear(10, 100),
    nn.Linear(100, 5))
net = DependentModule(net)
print(net)
Higher-level APIs such as the MAML class are recommended instead.
from metann.meta import MAML, default_evaluator_classification as evaluator
from torch import nn

net = nn.Sequential(
    nn.Linear(10, 100),
    nn.Linear(100, 5))
maml = MAML(net, steps_train=5, steps_eval=10, lr=0.01)
output = maml(data_train)
loss = evaluator(output, data_test)
loss.backward()
4. Documents
_____________
The documentation is available on ReadTheDocs.
`MetaNN `__
5. License
__________
`MIT `__
Copyright (c) 2019-present, Hanqiao Yu
%prep
%autosetup -n MetaNN-0.3.2
%build
%py3_build
%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-MetaNN -f filelist.lst
%dir %{python3_sitelib}/*
%files help -f doclist.lst
%{_docdir}/*
%changelog
* Thu Jun 08 2023 Python_Bot - 0.3.2-1
- Package Spec generated