author    CoprDistGit <infra@openeuler.org>  2023-04-10 19:46:38 +0000
committer CoprDistGit <infra@openeuler.org>  2023-04-10 19:46:38 +0000
commit    4e0f70b4b17421caafef0bcecc9dbd80fc9f1f49 (patch)
tree      806a30b6b8d905b3734acca9b01bf2da0321c779
parent    0c4d4a6ebf990d420065c99b90e340412293d03f (diff)
automatic import of python-objsize
-rw-r--r--  .gitignore            1
-rw-r--r--  python-objsize.spec 718
-rw-r--r--  sources               1
3 files changed, 720 insertions(+), 0 deletions(-)
diff --git a/.gitignore b/.gitignore
index e69de29..b91c7f0 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/objsize-0.6.1.tar.gz
diff --git a/python-objsize.spec b/python-objsize.spec
new file mode 100644
index 0000000..810e73d
--- /dev/null
+++ b/python-objsize.spec
@@ -0,0 +1,718 @@
+%global _empty_manifest_terminate_build 0
+Name: python-objsize
+Version: 0.6.1
+Release: 1
+Summary: Traverses Python's object subtree and calculates the total size of the subtree in bytes (deep size)
+License: BSD-3-Clause
+URL: https://github.com/liran-funaro/objsize
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/59/12/f62869f8c5f436cc7beedaccc8bfc97113943e0541c22c8505c3b2555616/objsize-0.6.1.tar.gz
+BuildArch: noarch
+
+Requires: python3-black
+Requires: python3-bumpver
+Requires: python3-isort
+Requires: python3-pip-tools
+Requires: python3-pytest
+Requires: python3-pytest-cov
+Requires: python3-coveralls
+
+%description
+# objsize
+[![Coverage Status](https://coveralls.io/repos/github/liran-funaro/objsize/badge.svg?branch=master)](https://coveralls.io/github/liran-funaro/objsize?branch=master)
+Traverses Python's object subtree and calculates the total size of the subtree in bytes (deep size).
+This module traverses all child objects using Python's internal GC implementation.
+It attempts to ignore shared objects (i.e., `None`, types, modules, classes, functions, lambdas), as they are common
+among all objects.
+It is implemented without recursive calls for high performance.
+# Features
+- Traverse objects' subtree
+- Calculate objects' (deep) size in bytes
+- Exclude non-exclusive objects
+- Exclude specified objects subtree
+- Allow the user to specify unique handlers (see the sketch after this list) for:
+ - Object's size calculation
+ - Object's referents (i.e., its children)
+ - Object filter (skip specific objects)
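+As a rough illustration of these hooks, the sketch below (not taken from `objsize`'s own documentation) combines all
+three in a single call; it assumes the keyword names `get_size_func`, `get_referents_func`, and `filter_func` used in
+the examples further down.
+```python
+import gc
+import sys
+import objsize
+def my_size(o):
+    # Hook 1: how a single object's size is measured.
+    return sys.getsizeof(o)
+def my_referents(*objs):
+    # Hook 2: how each object's children (referents) are found.
+    yield from gc.get_referents(*objs)
+def my_filter(o):
+    # Hook 3: which objects to keep; here we simply reuse the default filter.
+    return objsize.shared_object_or_function_filter(o)
+print(objsize.get_deep_size(
+    {"a": [1, 2, 3]},
+    get_size_func=my_size,
+    get_referents_func=my_referents,
+    filter_func=my_filter,
+))
+```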
+[Pympler](https://pythonhosted.org/Pympler/) also supports determining an object's deep size via `pympler.asizeof()`.
+There are two main differences between `objsize` and `pympler`.
+1. `objsize` has additional features:
+ * Traversing the object subtree: iterating all the object's descendants one by one.
+ * Excluding non-exclusive objects. That is, objects that are also referenced from somewhere else in the program.
+ This is true for calculating the object's deep size and for traversing its descendants.
+2. `objsize` has a simple and robust implementation with significantly fewer lines of code, compared to `pympler`.
+ The Pympler implementation uses recursion, and thus has to use a maximal-depth argument to avoid exceeding Python's
+ recursion limit.
+ `objsize`, however, uses BFS, which is more efficient and simpler to follow (see the sketch below).
+ Moreover, the Pympler implementation carefully takes care of any object type.
+ `objsize` achieves the same goal with a simple and generic implementation, which has fewer lines of code.
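+To illustrate the recursion-free approach, here is a minimal, conceptual sketch of a BFS deep-size computation built
+on `gc.get_referents`; it is not `objsize`'s actual implementation and it omits the shared-object filtering described
+above.
+```python
+import gc
+import sys
+from collections import deque
+def bfs_deep_size(obj):
+    # Conceptual sketch: breadth-first walk of the object graph,
+    # summing sys.getsizeof for every object visited exactly once.
+    seen = set()
+    total = 0
+    queue = deque([obj])
+    while queue:
+        o = queue.popleft()
+        if id(o) in seen:
+            continue
+        seen.add(id(o))
+        total += sys.getsizeof(o)
+        queue.extend(gc.get_referents(o))
+    return total
+```
+Because the traversal is iterative, no recursion-depth limit is needed, which is the point of the comparison above.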
+# Install
+```bash
+pip install objsize==0.6.1
+```
+# Basic Usage
+Calculate the size of the object including all its members in bytes.
+```pycon
+>>> import objsize
+>>> objsize.get_deep_size(dict(arg1='hello', arg2='world'))
+340
+```
+It is possible to calculate the deep size of multiple objects by passing multiple arguments:
+```pycon
+>>> objsize.get_deep_size(['hello', 'world'], dict(arg1='hello', arg2='world'), {'hello', 'world'})
+628
+```
+# Complex Data
+`objsize` can calculate the size of an object's entire subtree in bytes, regardless of the types of objects in it and
+its depth.
+For example, here is a complex data structure that includes a self-reference:
+```python
+my_data = (list(range(3)), list(range(3, 6)))
+class MyClass:
+ def __init__(self, x, y):
+ self.x = x
+ self.y = y
+ self.d = {'x': x, 'y': y, 'self': self}
+ def __repr__(self):
+ return "MyClass"
+my_obj = MyClass(*my_data)
+```
+We can calculate `my_obj`'s deep size, including its stored data.
+```pycon
+>>> objsize.get_deep_size(my_obj)
+708
+```
+We might want to ignore non-exclusive objects such as the ones stored in `my_data`.
+```pycon
+>>> objsize.get_deep_size(my_obj, exclude=[my_data])
+384
+```
+Or simply let `objsize` detect that automatically:
+```pycon
+>>> objsize.get_exclusive_deep_size(my_obj)
+384
+```
+# Non Shared Functions or Classes
+`objsize` filters functions, lambdas, and classes by default since they are usually shared among many objects.
+For example:
+```pycon
+>>> method_dict = {"identity": lambda x: x, "double": lambda x: x*2}
+>>> objsize.get_deep_size(method_dict)
+232
+```
+Some objects, however, as illustrated in the above example, have unique functions not shared by other objects.
+Due to this, it may be useful to count their sizes.
+You can achieve this by providing an alternative filter function.
+```pycon
+>>> objsize.get_deep_size(method_dict, filter_func=objsize.shared_object_filter)
+986
+```
+Notes:
+* The default filter function is `objsize.shared_object_or_function_filter`.
+* When using `objsize.shared_object_filter`, shared functions and lambdas are also counted, but builtin functions are
+ still excluded.
+# Special Cases
+Some objects handle their data in a way that prevents Python's GC from detecting it.
+The user can supply a special way to calculate the actual size of these objects.
+## Case 1: `torch`
+Using a simple calculation of the object size won't work for `torch.Tensor`.
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200))
+72
+```
+So the user can define their own size-calculation handler for such cases:
+```python
+import objsize
+import sys
+import torch
+def get_size_of_torch(o):
+ # `objsize.safe_is_instance` catches `ReferenceError` caused by `weakref` objects
+ if objsize.safe_is_instance(o, torch.Tensor):
+ return sys.getsizeof(o.storage())
+ else:
+ return sys.getsizeof(o)
+```
+Then use it as follows:
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200), get_size_func=get_size_of_torch)
+848
+```
+However, this neglects the object's internal structure.
+The user can help `objsize` find the object's hidden storage by supplying their own referent and filter
+functions:
+```python
+import objsize
+import gc
+import torch
+def get_referents_torch(*objs):
+ # Yield all native referents
+ yield from gc.get_referents(*objs)
+ for o in objs:
+ # If the object is a torch tensor, then also yield its storage
+ if type(o) == torch.Tensor:
+ yield o.storage()
+def filter_func(o):
+ # Torch storage points to another meta storage that is
+ # already included in the outer storage calculation,
+ # so we need to filter it.
+ # Also, `torch.dtype` is a common object like Python's types.
+ return not objsize.safe_is_instance(o, (
+ *objsize.SharedObjectOrFunctionType, torch.storage._UntypedStorage, torch.dtype
+ ))
+```
+Then use these as follows:
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200), get_referents_func=get_referents_torch, filter_func=filter_func)
+1024
+```
+## Case 2: `weakref`
+Using a simple calculation of the object size won't work for `weakref.proxy`.
+```pycon
+>>> import weakref
+>>> class Foo(list):
+...     pass
+...
+>>> o = Foo([0]*100)
+>>> objsize.get_deep_size(o)
+896
+>>> o_ref = weakref.proxy(o)
+>>> objsize.get_deep_size(o_ref)
+72
+```
+To mitigate this, you can provide a method that attempts to fetch the proxy's referents:
+```python
+import weakref
+import gc
+def get_weakref_referents(*objs):
+ yield from gc.get_referents(*objs)
+ for o in objs:
+ if type(o) in weakref.ProxyTypes:
+ try:
+ yield o.__repr__.__self__
+ except ReferenceError:
+ pass
+```
+Then use it as follows:
+```pycon
+>>> objsize.get_deep_size(o_ref, get_referents_func=get_weakref_referents)
+968
+```
+After the referenced object is collected, the size of the proxy object is reduced accordingly.
+```pycon
+>>> del o
+>>> gc.collect()
+>>> # Wait for the object to be collected
+>>> objsize.get_deep_size(o_ref, get_referents_func=get_weakref_referents)
+72
+```
+# Traversal
+A user can implement their own function over the entire subtree using the traversal method, which traverses all the
+objects in the subtree.
+```pycon
+>>> for o in objsize.traverse_bfs(my_obj):
+...     print(o)
+...
+MyClass
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'd': {'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}}
+[0, 1, 2]
+[3, 4, 5]
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}
+2
+1
+0
+5
+4
+3
+```
+As before, non-exclusive objects can be ignored.
+```pycon
+>>> for o in objsize.traverse_exclusive_bfs(my_obj):
+...     print(o)
+...
+MyClass
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'd': {'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}}
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}
+```
+# License
+[BSD-3](LICENSE)
+
+%package -n python3-objsize
+Summary: Traverses Python's object subtree and calculates the total size of the subtree in bytes (deep size)
+Provides: python-objsize
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-objsize
+# objsize
+[![Coverage Status](https://coveralls.io/repos/github/liran-funaro/objsize/badge.svg?branch=master)](https://coveralls.io/github/liran-funaro/objsize?branch=master)
+Traverses Python's object subtree and calculates the total size of the subtree in bytes (deep size).
+This module traverses all child objects using Python's internal GC implementation.
+It attempts to ignore shared objects (i.e., `None`, types, modules, classes, functions, lambdas), as they are common
+among all objects.
+It is implemented without recursive calls for high performance.
+# Features
+- Traverse objects' subtree
+- Calculate objects' (deep) size in bytes
+- Exclude non-exclusive objects
+- Exclude specified objects subtree
+- Allow the user to specify unique handlers for:
+ - Object's size calculation
+ - Object's referents (i.e., its children)
+ - Object filter (skip specific objects)
+[Pympler](https://pythonhosted.org/Pympler/) also supports determining an object's deep size via `pympler.asizeof()`.
+There are two main differences between `objsize` and `pympler`.
+1. `objsize` has additional features:
+ * Traversing the object subtree: iterating all the object's descendants one by one.
+ * Excluding non-exclusive objects. That is, objects that are also referenced from somewhere else in the program.
+ This is true for calculating the object's deep size and for traversing its descendants.
+2. `objsize` has a simple and robust implementation with significantly fewer lines of code, compared to `pympler`.
+ The Pympler implementation uses recursion, and thus has to use a maximal-depth argument to avoid exceeding Python's
+ recursion limit.
+ `objsize`, however, uses BFS, which is more efficient and simpler to follow.
+ Moreover, the Pympler implementation carefully takes care of any object type.
+ `objsize` achieves the same goal with a simple and generic implementation, which has fewer lines of code.
+# Install
+```bash
+pip install objsize==0.6.1
+```
+# Basic Usage
+Calculate the size of the object including all its members in bytes.
+```pycon
+>>> import objsize
+>>> objsize.get_deep_size(dict(arg1='hello', arg2='world'))
+340
+```
+It is possible to calculate the deep size of multiple objects by passing multiple arguments:
+```pycon
+>>> objsize.get_deep_size(['hello', 'world'], dict(arg1='hello', arg2='world'), {'hello', 'world'})
+628
+```
+# Complex Data
+`objsize` can calculate the size of an object's entire subtree in bytes, regardless of the types of objects in it and
+its depth.
+For example, here is a complex data structure that includes a self-reference:
+```python
+my_data = (list(range(3)), list(range(3, 6)))
+class MyClass:
+ def __init__(self, x, y):
+ self.x = x
+ self.y = y
+ self.d = {'x': x, 'y': y, 'self': self}
+ def __repr__(self):
+ return "MyClass"
+my_obj = MyClass(*my_data)
+```
+We can calculate `my_obj`'s deep size, including its stored data.
+```pycon
+>>> objsize.get_deep_size(my_obj)
+708
+```
+We might want to ignore non-exclusive objects such as the ones stored in `my_data`.
+```pycon
+>>> objsize.get_deep_size(my_obj, exclude=[my_data])
+384
+```
+Or simply let `objsize` detect that automatically:
+```pycon
+>>> objsize.get_exclusive_deep_size(my_obj)
+384
+```
+# Non Shared Functions or Classes
+`objsize` filters functions, lambdas, and classes by default since they are usually shared among many objects.
+For example:
+```pycon
+>>> method_dict = {"identity": lambda x: x, "double": lambda x: x*2}
+>>> objsize.get_deep_size(method_dict)
+232
+```
+Some objects, however, as illustrated in the above example, have unique functions not shared by other objects.
+Due to this, it may be useful to count their sizes.
+You can achieve this by providing an alternative filter function.
+```pycon
+>>> objsize.get_deep_size(method_dict, filter_func=objsize.shared_object_filter)
+986
+```
+Notes:
+* The default filter function is `objsize.shared_object_or_function_filter`.
+* When using `objsize.shared_object_filter`, shared functions and lambdas are also counted, but builtin functions are
+ still excluded.
+# Special Cases
+Some objects handle their data in a way that prevents Python's GC from detecting it.
+The user can supply a special way to calculate the actual size of these objects.
+## Case 1: `torch`
+Using a simple calculation of the object size won't work for `torch.Tensor`.
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200))
+72
+```
+So the user can define their own size-calculation handler for such cases:
+```python
+import objsize
+import sys
+import torch
+def get_size_of_torch(o):
+ # `objsize.safe_is_instance` catches `ReferenceError` caused by `weakref` objects
+ if objsize.safe_is_instance(o, torch.Tensor):
+ return sys.getsizeof(o.storage())
+ else:
+ return sys.getsizeof(o)
+```
+Then use it as follows:
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200), get_size_func=get_size_of_torch)
+848
+```
+However, this neglects the object's internal structure.
+The user can help `objsize` find the object's hidden storage by supplying their own referent and filter
+functions:
+```python
+import objsize
+import gc
+import torch
+def get_referents_torch(*objs):
+ # Yield all native referents
+ yield from gc.get_referents(*objs)
+ for o in objs:
+ # If the object is a torch tensor, then also yield its storage
+ if type(o) == torch.Tensor:
+ yield o.storage()
+def filter_func(o):
+ # Torch storage points to another meta storage that is
+ # already included in the outer storage calculation,
+ # so we need to filter it.
+ # Also, `torch.dtype` is a common object like Python's types.
+ return not objsize.safe_is_instance(o, (
+ *objsize.SharedObjectOrFunctionType, torch.storage._UntypedStorage, torch.dtype
+ ))
+```
+Then use these as follows:
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200), get_referents_func=get_referents_torch, filter_func=filter_func)
+1024
+```
+## Case 2: `weakref`
+Using a simple calculation of the object size won't work for `weakref.proxy`.
+```pycon
+>>> import weakref
+>>> class Foo(list):
+...     pass
+...
+>>> o = Foo([0]*100)
+>>> objsize.get_deep_size(o)
+896
+>>> o_ref = weakref.proxy(o)
+>>> objsize.get_deep_size(o_ref)
+72
+```
+To mitigate this, you can provide a method that attempts to fetch the proxy's referents:
+```python
+import weakref
+import gc
+def get_weakref_referents(*objs):
+ yield from gc.get_referents(*objs)
+ for o in objs:
+ if type(o) in weakref.ProxyTypes:
+ try:
+ yield o.__repr__.__self__
+ except ReferenceError:
+ pass
+```
+Then use it as follows:
+```pycon
+>>> objsize.get_deep_size(o_ref, get_referents_func=get_weakref_referents)
+968
+```
+After the referenced object is collected, the size of the proxy object is reduced accordingly.
+```pycon
+>>> del o
+>>> gc.collect()
+>>> # Wait for the object to be collected
+>>> objsize.get_deep_size(o_ref, get_referents_func=get_weakref_referents)
+72
+```
+# Traversal
+A user can implement their own function over the entire subtree using the traversal method, which traverses all the
+objects in the subtree.
+```pycon
+>>> for o in objsize.traverse_bfs(my_obj):
+...     print(o)
+...
+MyClass
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'd': {'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}}
+[0, 1, 2]
+[3, 4, 5]
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}
+2
+1
+0
+5
+4
+3
+```
+As before, non-exclusive objects can be ignored.
+```pycon
+>>> for o in objsize.traverse_exclusive_bfs(my_obj):
+...     print(o)
+...
+MyClass
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'd': {'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}}
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}
+```
+# License
+[BSD-3](LICENSE)
+
+%package help
+Summary: Development documents and examples for objsize
+Provides: python3-objsize-doc
+%description help
+# objsize
+[![Coverage Status](https://coveralls.io/repos/github/liran-funaro/objsize/badge.svg?branch=master)](https://coveralls.io/github/liran-funaro/objsize?branch=master)
+Traverses Python's object subtree and calculates the total size of the subtree in bytes (deep size).
+This module traverses all child objects using Python's internal GC implementation.
+It attempts to ignore shared objects (i.e., `None`, types, modules, classes, functions, lambdas), as they are common
+among all objects.
+It is implemented without recursive calls for high performance.
+# Features
+- Traverse objects' subtree
+- Calculate objects' (deep) size in bytes
+- Exclude non-exclusive objects
+- Exclude specified objects subtree
+- Allow the user to specify unique handlers for:
+ - Object's size calculation
+ - Object's referents (i.e., its children)
+ - Object filter (skip specific objects)
+[Pympler](https://pythonhosted.org/Pympler/) also supports determining an object's deep size via `pympler.asizeof()`.
+There are two main differences between `objsize` and `pympler`.
+1. `objsize` has additional features:
+ * Traversing the object subtree: iterating all the object's descendants one by one.
+ * Excluding non-exclusive objects. That is, objects that are also referenced from somewhere else in the program.
+ This is true for calculating the object's deep size and for traversing its descendants.
+2. `objsize` has a simple and robust implementation with significantly fewer lines of code, compared to `pympler`.
+ The Pympler implementation uses recursion, and thus has to use a maximal-depth argument to avoid exceeding Python's
+ recursion limit.
+ `objsize`, however, uses BFS, which is more efficient and simpler to follow.
+ Moreover, the Pympler implementation carefully takes care of any object type.
+ `objsize` achieves the same goal with a simple and generic implementation, which has fewer lines of code.
+# Install
+```bash
+pip install objsize==0.6.1
+```
+# Basic Usage
+Calculate the size of the object including all its members in bytes.
+```pycon
+>>> import objsize
+>>> objsize.get_deep_size(dict(arg1='hello', arg2='world'))
+340
+```
+It is possible to calculate the deep size of multiple objects by passing multiple arguments:
+```pycon
+>>> objsize.get_deep_size(['hello', 'world'], dict(arg1='hello', arg2='world'), {'hello', 'world'})
+628
+```
+# Complex Data
+`objsize` can calculate the size of an object's entire subtree in bytes, regardless of the types of objects in it and
+its depth.
+For example, here is a complex data structure that includes a self-reference:
+```python
+my_data = (list(range(3)), list(range(3, 6)))
+class MyClass:
+ def __init__(self, x, y):
+ self.x = x
+ self.y = y
+ self.d = {'x': x, 'y': y, 'self': self}
+ def __repr__(self):
+ return "MyClass"
+my_obj = MyClass(*my_data)
+```
+We can calculate `my_obj`'s deep size, including its stored data.
+```pycon
+>>> objsize.get_deep_size(my_obj)
+708
+```
+We might want to ignore non-exclusive objects such as the ones stored in `my_data`.
+```pycon
+>>> objsize.get_deep_size(my_obj, exclude=[my_data])
+384
+```
+Or simply let `objsize` detect that automatically:
+```pycon
+>>> objsize.get_exclusive_deep_size(my_obj)
+384
+```
+# Non Shared Functions or Classes
+`objsize` filters functions, lambdas, and classes by default since they are usually shared among many objects.
+For example:
+```pycon
+>>> method_dict = {"identity": lambda x: x, "double": lambda x: x*2}
+>>> objsize.get_deep_size(method_dict)
+232
+```
+Some objects, however, as illustrated in the above example, have unique functions not shared by other objects.
+Due to this, it may be useful to count their sizes.
+You can achieve this by providing an alternative filter function.
+```pycon
+>>> objsize.get_deep_size(method_dict, filter_func=objsize.shared_object_filter)
+986
+```
+Notes:
+* The default filter function is `objsize.shared_object_or_function_filter`.
+* When using `objsize.shared_object_filter`, shared functions and lambdas are also counted, but builtin functions are
+ still excluded.
+# Special Cases
+Some objects handle their data in a way that prevents Python's GC from detecting it.
+The user can supply a special way to calculate the actual size of these objects.
+## Case 1: `torch`
+Using a simple calculation of the object size won't work for `torch.Tensor`.
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200))
+72
+```
+So the user can define their own size-calculation handler for such cases:
+```python
+import objsize
+import sys
+import torch
+def get_size_of_torch(o):
+ # `objsize.safe_is_instance` catches `ReferenceError` caused by `weakref` objects
+ if objsize.safe_is_instance(o, torch.Tensor):
+ return sys.getsizeof(o.storage())
+ else:
+ return sys.getsizeof(o)
+```
+Then use it as follows:
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200), get_size_func=get_size_of_torch)
+848
+```
+However, this neglects the object's internal structure.
+The user can help `objsize` find the object's hidden storage by supplying their own referent and filter
+functions:
+```python
+import objsize
+import gc
+import torch
+def get_referents_torch(*objs):
+ # Yield all native referents
+ yield from gc.get_referents(*objs)
+ for o in objs:
+ # If the object is a torch tensor, then also yield its storage
+ if type(o) == torch.Tensor:
+ yield o.storage()
+def filter_func(o):
+ # Torch storage points to another meta storage that is
+ # already included in the outer storage calculation,
+ # so we need to filter it.
+ # Also, `torch.dtype` is a common object like Python's types.
+ return not objsize.safe_is_instance(o, (
+ *objsize.SharedObjectOrFunctionType, torch.storage._UntypedStorage, torch.dtype
+ ))
+```
+Then use these as follows:
+```pycon
+>>> import torch
+>>> objsize.get_deep_size(torch.rand(200), get_referents_func=get_referents_torch, filter_func=filter_func)
+1024
+```
+## Case 2: `weakref`
+Using a simple calculation of the object size won't work for `weakref.proxy`.
+```pycon
+>>> import weakref
+>>> class Foo(list):
+...     pass
+...
+>>> o = Foo([0]*100)
+>>> objsize.get_deep_size(o)
+896
+>>> o_ref = weakref.proxy(o)
+>>> objsize.get_deep_size(o_ref)
+72
+```
+To mitigate this, you can provide a method that attempts to fetch the proxy's referents:
+```python
+import weakref
+import gc
+def get_weakref_referents(*objs):
+ yield from gc.get_referents(*objs)
+ for o in objs:
+ if type(o) in weakref.ProxyTypes:
+ try:
+ yield o.__repr__.__self__
+ except ReferenceError:
+ pass
+```
+Then use it as follows:
+```pycon
+>>> objsize.get_deep_size(o_ref, get_referents_func=get_weakref_referents)
+968
+```
+After the referenced object is collected, the size of the proxy object is reduced accordingly.
+```pycon
+>>> del o
+>>> gc.collect()
+>>> # Wait for the object to be collected
+>>> objsize.get_deep_size(o_ref, get_referents_func=get_weakref_referents)
+72
+```
+# Traversal
+A user can implement their own function over the entire subtree using the traversal method, which traverses all the
+objects in the subtree.
+```pycon
+>>> for o in objsize.traverse_bfs(my_obj):
+...     print(o)
+...
+MyClass
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'd': {'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}}
+[0, 1, 2]
+[3, 4, 5]
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}
+2
+1
+0
+5
+4
+3
+```
+As before, non-exclusive objects can be ignored.
+```pycon
+>>> for o in objsize.traverse_exclusive_bfs(my_obj):
+...     print(o)
+...
+MyClass
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'd': {'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}}
+{'x': [0, 1, 2], 'y': [3, 4, 5], 'self': MyClass}
+```
+# License
+[BSD-3](LICENSE)
+
+%prep
+%autosetup -n objsize-0.6.1
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-objsize -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Mon Apr 10 2023 Python_Bot <Python_Bot@openeuler.org> - 0.6.1-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..7bc272c
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+36b5bcd8032a2e64f6096fc719b28b5d objsize-0.6.1.tar.gz