%global _empty_manifest_terminate_build 0
Name: python-onecache
Version: 0.5.0
Release: 1
Summary: Python cache for sync and async code
License: MIT
URL: https://pypi.org/project/onecache/
Source0: https://mirrors.nju.edu.cn/pypi/web/packages/f8/b3/6acdbd261d8e879540ea06af832eeb55f8e60550bad033805621dd500890/onecache-0.5.0.tar.gz
BuildArch: noarch

%description
[![Coverage Status](https://coveralls.io/repos/github/sonic182/onecache/badge.svg?branch=master)](https://coveralls.io/github/sonic182/onecache?branch=master)
![github status](https://github.com/sonic182/onecache/actions/workflows/python.yml/badge.svg)

# OneCache

Python cache for sync and async code. The cache uses an LRU algorithm and can optionally have a TTL.

Tested on Python 3.7, 3.9, and PyPy 3.9 on Windows, macOS, and Linux (see the GitHub status badge); it should work on the versions in between and may work on Python 3.6.

# Usage

```python
# Running these tests requires pytest with the pytest-asyncio plugin.
import pytest

from onecache import CacheDecorator
from onecache import AsyncCacheDecorator


class Counter:
    def __init__(self, count=0):
        self.count = count


@pytest.mark.asyncio
async def test_async_cache_counter():
    """Test async cache, counter case."""
    counter = Counter()

    @AsyncCacheDecorator()
    async def mycoro(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (await mycoro(counter))
    assert 1 == (await mycoro(counter))


def test_cache_counter():
    """Test sync cache, counter case."""
    counter = Counter()

    @CacheDecorator()
    def sample(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (sample(counter))
    assert 1 == (sample(counter))
```

The decorator classes support the following arguments (a short sketch follows the list):

* **maxsize (int)**: maximum number of items to be cached. Default: 512
* **ttl (int)**: time to expire in milliseconds; if None, entries do not expire. Default: None
* **skip_args (bool)**: apply the cache as if the function had no arguments. Default: False
* **cache_class (class)**: class to use for the cache instance. Default: LRUCache
* **refresh_ttl (bool)**: when a TTL is set, refresh a key's expiration timestamp on each access. Default: False
* **thread_safe (bool)**: make the decorator use a thread-safe lock. Default: False
* **max_mem_size (int)**: maximum memory size in bytes, a ceiling for the sum of the cached values' sizes. Default: None, which means no limit. On PyPy this value is ignored because object sizes can change under JIT compilation.

If the number of records exceeds maxsize, the oldest entry is dropped.
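As a quick illustration of how these options combine, here is a minimal sketch. `expensive_lookup` and its argument values are made up for the example; only the keyword arguments come from the list above.

```python
from onecache import CacheDecorator


# Hypothetical function for illustration: keep up to 128 results for 5 seconds,
# refresh a key's TTL whenever it is read, and guard the cache with a lock.
@CacheDecorator(maxsize=128, ttl=5000, refresh_ttl=True, thread_safe=True)
def expensive_lookup(key):
    return key.upper()  # stand-in for a slow computation or I/O call


print(expensive_lookup("onecache"))  # computed on the first call
print(expensive_lookup("onecache"))  # served from the cache until the TTL lapses
```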
# Development

Install packages with pip-tools:

```bash
pip install pip-tools
pip-compile
pip-compile test-requirements.in
pip-sync requirements.txt test-requirements.txt
```

# Contribute

1. Fork
2. Create a branch `feature/your_feature`
3. Commit - push - pull request

Thanks :)

%package -n python3-onecache
Summary: Python cache for sync and async code
Provides: python-onecache
BuildRequires: python3-devel
BuildRequires: python3-setuptools
BuildRequires: python3-pip

%description -n python3-onecache
[![Coverage Status](https://coveralls.io/repos/github/sonic182/onecache/badge.svg?branch=master)](https://coveralls.io/github/sonic182/onecache?branch=master)
![github status](https://github.com/sonic182/onecache/actions/workflows/python.yml/badge.svg)

# OneCache

Python cache for sync and async code. The cache uses an LRU algorithm and can optionally have a TTL.

Tested on Python 3.7, 3.9, and PyPy 3.9 on Windows, macOS, and Linux (see the GitHub status badge); it should work on the versions in between and may work on Python 3.6.

# Usage

```python
# Running these tests requires pytest with the pytest-asyncio plugin.
import pytest

from onecache import CacheDecorator
from onecache import AsyncCacheDecorator


class Counter:
    def __init__(self, count=0):
        self.count = count


@pytest.mark.asyncio
async def test_async_cache_counter():
    """Test async cache, counter case."""
    counter = Counter()

    @AsyncCacheDecorator()
    async def mycoro(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (await mycoro(counter))
    assert 1 == (await mycoro(counter))


def test_cache_counter():
    """Test sync cache, counter case."""
    counter = Counter()

    @CacheDecorator()
    def sample(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (sample(counter))
    assert 1 == (sample(counter))
```

The decorator classes support the following arguments:

* **maxsize (int)**: maximum number of items to be cached. Default: 512
* **ttl (int)**: time to expire in milliseconds; if None, entries do not expire. Default: None
* **skip_args (bool)**: apply the cache as if the function had no arguments. Default: False
* **cache_class (class)**: class to use for the cache instance. Default: LRUCache
* **refresh_ttl (bool)**: when a TTL is set, refresh a key's expiration timestamp on each access. Default: False
* **thread_safe (bool)**: make the decorator use a thread-safe lock. Default: False
* **max_mem_size (int)**: maximum memory size in bytes, a ceiling for the sum of the cached values' sizes. Default: None, which means no limit. On PyPy this value is ignored because object sizes can change under JIT compilation.

If the number of records exceeds maxsize, the oldest entry is dropped.

# Development

Install packages with pip-tools:

```bash
pip install pip-tools
pip-compile
pip-compile test-requirements.in
pip-sync requirements.txt test-requirements.txt
```

# Contribute

1. Fork
2. Create a branch `feature/your_feature`
3. Commit - push - pull request

Thanks :)

%package help
Summary: Development documents and examples for onecache
Provides: python3-onecache-doc

%description help
[![Coverage Status](https://coveralls.io/repos/github/sonic182/onecache/badge.svg?branch=master)](https://coveralls.io/github/sonic182/onecache?branch=master)
![github status](https://github.com/sonic182/onecache/actions/workflows/python.yml/badge.svg)

# OneCache

Python cache for sync and async code. The cache uses an LRU algorithm and can optionally have a TTL.

Tested on Python 3.7, 3.9, and PyPy 3.9 on Windows, macOS, and Linux (see the GitHub status badge); it should work on the versions in between and may work on Python 3.6.

# Usage

```python
# Running these tests requires pytest with the pytest-asyncio plugin.
import pytest

from onecache import CacheDecorator
from onecache import AsyncCacheDecorator


class Counter:
    def __init__(self, count=0):
        self.count = count


@pytest.mark.asyncio
async def test_async_cache_counter():
    """Test async cache, counter case."""
    counter = Counter()

    @AsyncCacheDecorator()
    async def mycoro(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (await mycoro(counter))
    assert 1 == (await mycoro(counter))


def test_cache_counter():
    """Test sync cache, counter case."""
    counter = Counter()

    @CacheDecorator()
    def sample(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (sample(counter))
    assert 1 == (sample(counter))
```

The decorator classes support the following arguments (an async sketch follows the list):

* **maxsize (int)**: maximum number of items to be cached. Default: 512
* **ttl (int)**: time to expire in milliseconds; if None, entries do not expire. Default: None
* **skip_args (bool)**: apply the cache as if the function had no arguments. Default: False
* **cache_class (class)**: class to use for the cache instance. Default: LRUCache
* **refresh_ttl (bool)**: when a TTL is set, refresh a key's expiration timestamp on each access. Default: False
* **thread_safe (bool)**: make the decorator use a thread-safe lock. Default: False
* **max_mem_size (int)**: maximum memory size in bytes, a ceiling for the sum of the cached values' sizes. Default: None, which means no limit. On PyPy this value is ignored because object sizes can change under JIT compilation.

If the number of records exceeds maxsize, the oldest entry is dropped.
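For the async side, the sketch below applies `AsyncCacheDecorator` with a TTL. `fetch_value` is a made-up coroutine for the example; only the `ttl` keyword comes from the list above.

```python
import asyncio

from onecache import AsyncCacheDecorator


# Hypothetical coroutine for illustration; cached results expire after one second.
@AsyncCacheDecorator(ttl=1000)
async def fetch_value(key):
    return key * 2


async def main():
    print(await fetch_value(21))  # first call is computed (prints 42)
    print(await fetch_value(21))  # repeat call is served from cache until the TTL expires


asyncio.run(main())
```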
# Development

Install packages with pip-tools:

```bash
pip install pip-tools
pip-compile
pip-compile test-requirements.in
pip-sync requirements.txt test-requirements.txt
```

# Contribute

1. Fork
2. Create a branch `feature/your_feature`
3. Commit - push - pull request

Thanks :)

%prep
%autosetup -n onecache-0.5.0

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-onecache -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Fri May 05 2023 Python_Bot - 0.5.0-1
- Package Spec generated