%global _empty_manifest_terminate_build 0
Name:           python-shared-memory-dict
Version:        0.7.2
Release:        1
Summary:        A very simple shared memory dict implementation
License:        MIT
URL:            https://github.com/luizalabs/shared-memory-dict
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/34/0a/16c63e478658ce9ccb9e5d04b7650347a1e00f4db6da11487fdeb45b6b94/shared-memory-dict-0.7.2.tar.gz
BuildArch:      noarch

Requires:       python3-django
Requires:       python3-aiocache

%description
# Shared Memory Dict
A very simple [shared memory](https://docs.python.org/3/library/multiprocessing.shared_memory.html) dict implementation.

**Requires**: Python >= 3.8

```python
>>> # In the first Python interactive shell
>>> from shared_memory_dict import SharedMemoryDict
>>> smd = SharedMemoryDict(name='tokens', size=1024)
>>> smd['some-key'] = 'some-value-with-any-type'
>>> smd['some-key']
'some-value-with-any-type'

>>> # In either the same shell or a new Python shell on the same machine
>>> existing_smd = SharedMemoryDict(name='tokens', size=1024)
>>> existing_smd['some-key']
'some-value-with-any-type'
>>> existing_smd['new-key'] = 'some-value-with-any-type'

>>> # Back in the first Python interactive shell, smd reflects this change
>>> smd['new-key']
'some-value-with-any-type'

>>> # Clean up from within the second Python shell
>>> existing_smd.shm.close()  # or "del existing_smd"

>>> # Clean up from within the first Python shell
>>> smd.shm.close()
>>> smd.shm.unlink()  # Free and release the shared memory block at the very end
>>> del smd  # use of smd after calling unlink() is unsupported
```

> The arg `name` defines the location of the memory block, so if you want to share the memory between processes, use the same name.
> The size (in bytes) occupied by the contents of the dictionary depends on the serialization used for storage. Pickle is used by default.

## Installation
Using `pip`:
```shell
pip install shared-memory-dict
```

## Locks
To use [multiprocessing.Lock](https://docs.python.org/3.8/library/multiprocessing.html#multiprocessing.Lock) on write operations of the shared memory dict, set the environment variable `SHARED_MEMORY_USE_LOCK=1` (see the sketch after the Caveat section below).

## Serialization
We use [pickle](https://docs.python.org/3/library/pickle.html) by default to read and write the data into the shared memory block. You can create a custom serializer by implementing the `dumps` and `loads` methods.

Custom serializers should raise `SerializationError` if the serialization fails and `DeserializationError` if the deserialization fails. Both are defined in the `shared_memory_dict.serializers` module.

An example of a JSON serializer, extracted from the serializers module:

```python
import json
from typing import Final

from shared_memory_dict.serializers import (
    DeserializationError,
    SerializationError,
)

NULL_BYTE: Final = b"\x00"


class JSONSerializer:
    def dumps(self, obj: dict) -> bytes:
        try:
            return json.dumps(obj).encode() + NULL_BYTE
        except (ValueError, TypeError):
            raise SerializationError(obj)

    def loads(self, data: bytes) -> dict:
        data = data.split(NULL_BYTE, 1)[0]
        try:
            return json.loads(data)
        except json.JSONDecodeError:
            raise DeserializationError(data)
```

Note: a null byte is used to separate the dictionary contents from the remaining bytes in the memory block.

To use the custom serializer, set it when creating a new shared memory dict instance:

```python
>>> smd = SharedMemoryDict(name='tokens', size=1024, serializer=JSONSerializer())
```

### Caveat
The pickle module is not secure. Only unpickle data you trust. See more [here](https://docs.python.org/3/library/pickle.html).
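As promised in the Locks section above, here is a minimal sketch of enabling the lock for concurrent writers. The `worker` helper, the process count, and the assumption that the environment variable is read when the library is imported are illustrative, not part of the library's documented API:

```python
import os

# Illustrative assumption: set the flag before importing the library so the
# lock is picked up; consult the project docs for when the flag is read.
os.environ['SHARED_MEMORY_USE_LOCK'] = '1'

from multiprocessing import Process

from shared_memory_dict import SharedMemoryDict


def worker(n: int) -> None:
    # Hypothetical helper: each process attaches to the same block by name
    # and performs a (locked) write.
    smd = SharedMemoryDict(name='tokens', size=1024)
    smd[f'key-{n}'] = n
    smd.shm.close()


if __name__ == '__main__':
    smd = SharedMemoryDict(name='tokens', size=1024)
    procs = [Process(target=worker, args=(i,)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print([smd[f'key-{i}'] for i in range(4)])  # expected: [0, 1, 2, 3]
    smd.shm.close()
    smd.shm.unlink()  # free the block once no process needs it anymore
```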
## Django Cache Implementation
There's a [Django Cache Implementation](https://docs.djangoproject.com/en/3.0/topics/cache/) with Shared Memory Dict:

```python
# settings/base.py
CACHES = {
    'default': {
        'BACKEND': 'shared_memory_dict.caches.django.SharedMemoryCache',
        'LOCATION': 'memory',
        'OPTIONS': {'MEMORY_BLOCK_SIZE': 1024}
    }
}
```

**Install with**: `pip install "shared-memory-dict[django]"`

### Caveat
With the Django cache implementation, keys only expire when they're read. Be careful with memory usage.

## AioCache Backend
There's also an [AioCache Backend Implementation](https://aiocache.readthedocs.io/en/latest/caches.html) with Shared Memory Dict:

```python
from aiocache import caches

caches.set_config({
    'default': {
        'cache': 'shared_memory_dict.caches.aiocache.SharedMemoryCache',
        'size': 1024,
    },
})
```

> This implementation is largely based on aiocache's [SimpleMemoryCache](https://aiocache.readthedocs.io/en/latest/caches.html#simplememorycache).

**Install with**: `pip install "shared-memory-dict[aiocache]"`

%package -n python3-shared-memory-dict
Summary:        A very simple shared memory dict implementation
Provides:       python-shared-memory-dict
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip

%description -n python3-shared-memory-dict
# Shared Memory Dict
A very simple [shared memory](https://docs.python.org/3/library/multiprocessing.shared_memory.html) dict implementation.

**Requires**: Python >= 3.8

```python
>>> # In the first Python interactive shell
>>> from shared_memory_dict import SharedMemoryDict
>>> smd = SharedMemoryDict(name='tokens', size=1024)
>>> smd['some-key'] = 'some-value-with-any-type'
>>> smd['some-key']
'some-value-with-any-type'

>>> # In either the same shell or a new Python shell on the same machine
>>> existing_smd = SharedMemoryDict(name='tokens', size=1024)
>>> existing_smd['some-key']
'some-value-with-any-type'
>>> existing_smd['new-key'] = 'some-value-with-any-type'

>>> # Back in the first Python interactive shell, smd reflects this change
>>> smd['new-key']
'some-value-with-any-type'

>>> # Clean up from within the second Python shell
>>> existing_smd.shm.close()  # or "del existing_smd"

>>> # Clean up from within the first Python shell
>>> smd.shm.close()
>>> smd.shm.unlink()  # Free and release the shared memory block at the very end
>>> del smd  # use of smd after calling unlink() is unsupported
```

> The arg `name` defines the location of the memory block, so if you want to share the memory between processes, use the same name.
> The size (in bytes) occupied by the contents of the dictionary depends on the serialization used for storage. Pickle is used by default.

## Installation
Using `pip`:
```shell
pip install shared-memory-dict
```

## Locks
To use [multiprocessing.Lock](https://docs.python.org/3.8/library/multiprocessing.html#multiprocessing.Lock) on write operations of the shared memory dict, set the environment variable `SHARED_MEMORY_USE_LOCK=1`.

## Serialization
We use [pickle](https://docs.python.org/3/library/pickle.html) by default to read and write the data into the shared memory block. You can create a custom serializer by implementing the `dumps` and `loads` methods.

Custom serializers should raise `SerializationError` if the serialization fails and `DeserializationError` if the deserialization fails. Both are defined in the `shared_memory_dict.serializers` module.
An example of a JSON serializer, extracted from the serializers module:

```python
import json
from typing import Final

from shared_memory_dict.serializers import (
    DeserializationError,
    SerializationError,
)

NULL_BYTE: Final = b"\x00"


class JSONSerializer:
    def dumps(self, obj: dict) -> bytes:
        try:
            return json.dumps(obj).encode() + NULL_BYTE
        except (ValueError, TypeError):
            raise SerializationError(obj)

    def loads(self, data: bytes) -> dict:
        data = data.split(NULL_BYTE, 1)[0]
        try:
            return json.loads(data)
        except json.JSONDecodeError:
            raise DeserializationError(data)
```

Note: a null byte is used to separate the dictionary contents from the remaining bytes in the memory block.

To use the custom serializer, set it when creating a new shared memory dict instance:

```python
>>> smd = SharedMemoryDict(name='tokens', size=1024, serializer=JSONSerializer())
```

### Caveat
The pickle module is not secure. Only unpickle data you trust. See more [here](https://docs.python.org/3/library/pickle.html).

## Django Cache Implementation
There's a [Django Cache Implementation](https://docs.djangoproject.com/en/3.0/topics/cache/) with Shared Memory Dict:

```python
# settings/base.py
CACHES = {
    'default': {
        'BACKEND': 'shared_memory_dict.caches.django.SharedMemoryCache',
        'LOCATION': 'memory',
        'OPTIONS': {'MEMORY_BLOCK_SIZE': 1024}
    }
}
```

**Install with**: `pip install "shared-memory-dict[django]"`

### Caveat
With the Django cache implementation, keys only expire when they're read. Be careful with memory usage.

## AioCache Backend
There's also an [AioCache Backend Implementation](https://aiocache.readthedocs.io/en/latest/caches.html) with Shared Memory Dict:

```python
from aiocache import caches

caches.set_config({
    'default': {
        'cache': 'shared_memory_dict.caches.aiocache.SharedMemoryCache',
        'size': 1024,
    },
})
```

> This implementation is largely based on aiocache's [SimpleMemoryCache](https://aiocache.readthedocs.io/en/latest/caches.html#simplememorycache).

**Install with**: `pip install "shared-memory-dict[aiocache]"`

%package help
Summary:        Development documents and examples for shared-memory-dict
Provides:       python3-shared-memory-dict-doc

%description help
# Shared Memory Dict
A very simple [shared memory](https://docs.python.org/3/library/multiprocessing.shared_memory.html) dict implementation.

**Requires**: Python >= 3.8

```python
>>> # In the first Python interactive shell
>>> from shared_memory_dict import SharedMemoryDict
>>> smd = SharedMemoryDict(name='tokens', size=1024)
>>> smd['some-key'] = 'some-value-with-any-type'
>>> smd['some-key']
'some-value-with-any-type'

>>> # In either the same shell or a new Python shell on the same machine
>>> existing_smd = SharedMemoryDict(name='tokens', size=1024)
>>> existing_smd['some-key']
'some-value-with-any-type'
>>> existing_smd['new-key'] = 'some-value-with-any-type'

>>> # Back in the first Python interactive shell, smd reflects this change
>>> smd['new-key']
'some-value-with-any-type'

>>> # Clean up from within the second Python shell
>>> existing_smd.shm.close()  # or "del existing_smd"

>>> # Clean up from within the first Python shell
>>> smd.shm.close()
>>> smd.shm.unlink()  # Free and release the shared memory block at the very end
>>> del smd  # use of smd after calling unlink() is unsupported
```

> The arg `name` defines the location of the memory block, so if you want to share the memory between processes, use the same name.
> The size (in bytes) occupied by the contents of the dictionary depends on the serialization used for storage. Pickle is used by default.
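The two-shell walkthrough above can also be written as a single script. A minimal sketch, using only the calls shown above; the `reader` helper is illustrative and not part of the library:

```python
from multiprocessing import Process

from shared_memory_dict import SharedMemoryDict


def reader() -> None:
    # Illustrative helper: attach to the existing block by reusing the same name.
    existing_smd = SharedMemoryDict(name='tokens', size=1024)
    print(existing_smd['some-key'])  # -> 'some-value-with-any-type'
    existing_smd.shm.close()


if __name__ == '__main__':
    smd = SharedMemoryDict(name='tokens', size=1024)
    smd['some-key'] = 'some-value-with-any-type'

    p = Process(target=reader)
    p.start()
    p.join()

    smd.shm.close()
    smd.shm.unlink()  # free the block once no process needs it anymore
```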
## Installation
Using `pip`:
```shell
pip install shared-memory-dict
```

## Locks
To use [multiprocessing.Lock](https://docs.python.org/3.8/library/multiprocessing.html#multiprocessing.Lock) on write operations of the shared memory dict, set the environment variable `SHARED_MEMORY_USE_LOCK=1`.

## Serialization
We use [pickle](https://docs.python.org/3/library/pickle.html) by default to read and write the data into the shared memory block. You can create a custom serializer by implementing the `dumps` and `loads` methods.

Custom serializers should raise `SerializationError` if the serialization fails and `DeserializationError` if the deserialization fails. Both are defined in the `shared_memory_dict.serializers` module.

An example of a JSON serializer, extracted from the serializers module:

```python
import json
from typing import Final

from shared_memory_dict.serializers import (
    DeserializationError,
    SerializationError,
)

NULL_BYTE: Final = b"\x00"


class JSONSerializer:
    def dumps(self, obj: dict) -> bytes:
        try:
            return json.dumps(obj).encode() + NULL_BYTE
        except (ValueError, TypeError):
            raise SerializationError(obj)

    def loads(self, data: bytes) -> dict:
        data = data.split(NULL_BYTE, 1)[0]
        try:
            return json.loads(data)
        except json.JSONDecodeError:
            raise DeserializationError(data)
```

Note: a null byte is used to separate the dictionary contents from the remaining bytes in the memory block.

To use the custom serializer, set it when creating a new shared memory dict instance:

```python
>>> smd = SharedMemoryDict(name='tokens', size=1024, serializer=JSONSerializer())
```

### Caveat
The pickle module is not secure. Only unpickle data you trust. See more [here](https://docs.python.org/3/library/pickle.html).

## Django Cache Implementation
There's a [Django Cache Implementation](https://docs.djangoproject.com/en/3.0/topics/cache/) with Shared Memory Dict:

```python
# settings/base.py
CACHES = {
    'default': {
        'BACKEND': 'shared_memory_dict.caches.django.SharedMemoryCache',
        'LOCATION': 'memory',
        'OPTIONS': {'MEMORY_BLOCK_SIZE': 1024}
    }
}
```

**Install with**: `pip install "shared-memory-dict[django]"`

### Caveat
With the Django cache implementation, keys only expire when they're read. Be careful with memory usage.

## AioCache Backend
There's also an [AioCache Backend Implementation](https://aiocache.readthedocs.io/en/latest/caches.html) with Shared Memory Dict:

```python
from aiocache import caches

caches.set_config({
    'default': {
        'cache': 'shared_memory_dict.caches.aiocache.SharedMemoryCache',
        'size': 1024,
    },
})
```

> This implementation is largely based on aiocache's [SimpleMemoryCache](https://aiocache.readthedocs.io/en/latest/caches.html#simplememorycache).

**Install with**: `pip install "shared-memory-dict[aiocache]"`

%prep
%autosetup -n shared-memory-dict-0.7.2

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-shared-memory-dict -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Tue May 30 2023 Python_Bot - 0.7.2-1
- Package Spec generated