authorCoprDistGit <infra@openeuler.org>2023-05-05 09:24:13 +0000
committerCoprDistGit <infra@openeuler.org>2023-05-05 09:24:13 +0000
commit41cbbf65d7bbb7286f980ee78261d9070bcc558a (patch)
tree9dcb32b8cf85fe2c3be00ebcdc0affad3d8000a0
parentca8fe8241db9f9c108a4341d3237fb5974ae2fce (diff)
automatic import of python-openleapopeneuler20.03
-rw-r--r--.gitignore1
-rw-r--r--python-openleap.spec384
-rw-r--r--sources1
3 files changed, 386 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..b96228b 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/openleap-0.5.6.tar.gz
diff --git a/python-openleap.spec b/python-openleap.spec
new file mode 100644
index 0000000..5d5a76b
--- /dev/null
+++ b/python-openleap.spec
@@ -0,0 +1,384 @@
+%global _empty_manifest_terminate_build 0
+Name: python-openleap
+Version: 0.5.6
+Release: 1
+Summary: Hand tracking and gesture recognition module
+License: LICENSE
+URL: https://github.com/szymciem8/OpenLeap
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/bd/6b/9a55dc6d2c2724205055282464ca90c195e52642521871fc8e216a0268a4/openleap-0.5.6.tar.gz
+BuildArch: noarch
+
+Requires: python3-mediapipe
+Requires: python3-opencv-python
+Requires: python3-pandas
+
+%description
+# OpenLeap
+
+## Table of contents
+- [OpenLeap](#openleap)
+ - [Table of contents](#table-of-contents)
+ - [General Info](#general-info)
+ - [Technologies](#technologies)
+ - [Setup](#setup)
+ - [Simple Example](#simple-example)
+ - [Access Hand Information](#access-hand-information)
+ - [Example](#example)
+ - [Another Example](#another-example)
+
+## General Info
+OpenLeap is an open source project that allows you to add hand gesture control to your Python projects.
+
+## Technologies
+
+The project was created with the following technologies:
+
+- Python
+- OpenCV
+- MediaPipe
+- SciKit Learn
+
+## Setup
+OpenLeap can be installed using pip, as shown below.
+
+```
+$ pip install openleap
+```
+
+## Simple Example
+
+Test the openleap controller with an example program. The code below will create an OpenCV window showing the camera feed.
+
+
+```
+import openleap
+
+controller = openleap.OpenLeap(screen_show=True,
+ screeen_type='BLACK',
+ show_data_on_image=True,
+ show_data_in_console=True,
+ gesture_model='sign_language')
+
+controller.loop()
+
+```
+
+<p align="center">
+ <img src="https://raw.githubusercontent.com/szymciem8/OpenLeap/main/Documentation/images/example_program.gif?token=AMBI64BGASHC4OPJW6OD3YDBV2BJK" width="850" />
+</p>
+
+OpenLeap returns the relative position of each hand, the distance between the thumb tip and the index finger tip, the rotation angle about the wrist point, and the recognized gesture. There are two models for gesture recognition.
+
+The first one can recognize whether the hand is open or closed into a fist; the second model can recognize the sign language alphabet, as shown below.
+
+<p align="center">
+ <img src="https://pastevents.impactcee.com/wp-content/uploads/2016/10/DayTranslationsBlog-Learn-American-Sign-Language.jpg" width="850" />
+</p>
+
+
+An OpenLeap object can be created with a couple of options:
+- **screen_show** - if set to True, a window with the camera feed will be created.
+- **screen_type** - "CAM" or "BLACK" background.
+- **show_data_on_image** - self-explanatory.
+- **show_data_in_console** - self-explanatory.
+- **gesture_model** - choose the gesture recognition model, "basic" or "sign_language".
+
+## Access Hand Information
+
+Recognized gestures, hand position, tilt, and so on are stored in a dictionary called 'data', which consists of two dataclass objects, one for the right hand and one for the left. The dataclass object has the following structure:
+
+```
+from dataclasses import dataclass
+from typing import Optional
+
+@dataclass
+class Data:
+    x: float = 0.0
+    y: float = 0.0
+    z: float = 0.0
+    distance: float = 0.0
+    angle: float = 0.0
+    gesture: Optional[str] = None
+```
+
+The dataclass containing the data above is continuously updated by the **main()** or **loop()** function, depending on which one is used.
+
+### Example
+
+```
+if controller.data['right'].gesture == 'open':
+ print('Right hand is opened!')
+elif controller.data['right'].gesture == 'fist':
+ print('Right hand is closed!')
+```
+
+### Another Example
+
+```
+if controller.data['right'].distance < 20:
+ print('Click has been detected!')
+```
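The two access patterns above can be combined. The snippet below is a self-contained sketch that mirrors the documented `data` dictionary using the `Data` dataclass; no camera or openleap installation is required, since the values are filled in by hand for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Data:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    distance: float = 0.0
    angle: float = 0.0
    gesture: Optional[str] = None

# In a real program this dictionary is controller.data,
# refreshed continuously by main() or loop().
data = {'right': Data(distance=15.0, gesture='open'),
        'left': Data(distance=40.0, gesture='fist')}

def describe(hand: Data) -> str:
    """Combine the recognized gesture and pinch distance into a status string."""
    status = 'opened' if hand.gesture == 'open' else 'closed'
    if hand.distance < 20:
        status += ', click detected'
    return status

print(describe(data['right']))
print(describe(data['left']))
```

With the hand-filled values above, the right hand reports an open gesture with a pinch below the click threshold, while the left reports a closed fist.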
+
+
+
+%package -n python3-openleap
+Summary: Hand tracking and gesture recognition module
+Provides: python-openleap
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-openleap
+# OpenLeap
+
+## Table of contents
+- [OpenLeap](#openleap)
+ - [Table of contents](#table-of-contents)
+ - [General Info](#general-info)
+ - [Technologies](#technologies)
+ - [Setup](#setup)
+ - [Simple Example](#simple-example)
+ - [Access Hand Information](#access-hand-information)
+ - [Example](#example)
+ - [Another Example](#another-example)
+
+## General Info
+OpenLeap is an open source project that allows you to add hand gesture control to your Python projects.
+
+## Technologies
+
+The project was created with the following technologies:
+
+- Python
+- OpenCV
+- MediaPipe
+- SciKit Learn
+
+## Setup
+OpenLeap can be installed using pip, as shown below.
+
+```
+$ pip install openleap
+```
+
+## Simple Example
+
+Test the openleap controller with an example program. The code below will create an OpenCV window showing the camera feed.
+
+
+```
+import openleap
+
+controller = openleap.OpenLeap(screen_show=True,
+ screeen_type='BLACK',
+ show_data_on_image=True,
+ show_data_in_console=True,
+ gesture_model='sign_language')
+
+controller.loop()
+
+```
+
+<p align="center">
+ <img src="https://raw.githubusercontent.com/szymciem8/OpenLeap/main/Documentation/images/example_program.gif?token=AMBI64BGASHC4OPJW6OD3YDBV2BJK" width="850" />
+</p>
+
+OpenLeap returns the relative position of each hand, the distance between the thumb tip and the index finger tip, the rotation angle about the wrist point, and the recognized gesture. There are two models for gesture recognition.
+
+The first one can recognize whether the hand is open or closed into a fist; the second model can recognize the sign language alphabet, as shown below.
+
+<p align="center">
+ <img src="https://pastevents.impactcee.com/wp-content/uploads/2016/10/DayTranslationsBlog-Learn-American-Sign-Language.jpg" width="850" />
+</p>
+
+
+An OpenLeap object can be created with a couple of options:
+- **screen_show** - if set to True, a window with the camera feed will be created.
+- **screen_type** - "CAM" or "BLACK" background.
+- **show_data_on_image** - self-explanatory.
+- **show_data_in_console** - self-explanatory.
+- **gesture_model** - choose the gesture recognition model, "basic" or "sign_language".
+
+## Access Hand Information
+
+Recognized gestures, hand position, tilt, and so on are stored in a dictionary called 'data', which consists of two dataclass objects, one for the right hand and one for the left. The dataclass object has the following structure:
+
+```
+from dataclasses import dataclass
+from typing import Optional
+
+@dataclass
+class Data:
+    x: float = 0.0
+    y: float = 0.0
+    z: float = 0.0
+    distance: float = 0.0
+    angle: float = 0.0
+    gesture: Optional[str] = None
+```
+
+The dataclass containing the data above is continuously updated by the **main()** or **loop()** function, depending on which one is used.
+
+### Example
+
+```
+if controller.data['right'].gesture == 'open':
+ print('Right hand is opened!')
+elif controller.data['right'].gesture == 'fist':
+ print('Right hand is closed!')
+```
+
+### Another Example
+
+```
+if controller.data['right'].distance < 20:
+ print('Click has been detected!')
+```
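The two access patterns above can be combined. The snippet below is a self-contained sketch that mirrors the documented `data` dictionary using the `Data` dataclass; no camera or openleap installation is required, since the values are filled in by hand for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Data:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    distance: float = 0.0
    angle: float = 0.0
    gesture: Optional[str] = None

# In a real program this dictionary is controller.data,
# refreshed continuously by main() or loop().
data = {'right': Data(distance=15.0, gesture='open'),
        'left': Data(distance=40.0, gesture='fist')}

def describe(hand: Data) -> str:
    """Combine the recognized gesture and pinch distance into a status string."""
    status = 'opened' if hand.gesture == 'open' else 'closed'
    if hand.distance < 20:
        status += ', click detected'
    return status

print(describe(data['right']))
print(describe(data['left']))
```

With the hand-filled values above, the right hand reports an open gesture with a pinch below the click threshold, while the left reports a closed fist.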
+
+
+
+%package help
+Summary: Development documents and examples for openleap
+Provides: python3-openleap-doc
+%description help
+# OpenLeap
+
+## Table of contents
+- [OpenLeap](#openleap)
+ - [Table of contents](#table-of-contents)
+ - [General Info](#general-info)
+ - [Technologies](#technologies)
+ - [Setup](#setup)
+ - [Simple Example](#simple-example)
+ - [Access Hand Information](#access-hand-information)
+ - [Example](#example)
+ - [Another Example](#another-example)
+
+## General Info
+OpenLeap is an open source project that allows you to add hand gesture control to your Python projects.
+
+## Technologies
+
+The project was created with the following technologies:
+
+- Python
+- OpenCV
+- MediaPipe
+- SciKit Learn
+
+## Setup
+OpenLeap can be installed using pip, as shown below.
+
+```
+$ pip install openleap
+```
+
+## Simple Example
+
+Test the openleap controller with an example program. The code below will create an OpenCV window showing the camera feed.
+
+
+```
+import openleap
+
+controller = openleap.OpenLeap(screen_show=True,
+ screeen_type='BLACK',
+ show_data_on_image=True,
+ show_data_in_console=True,
+ gesture_model='sign_language')
+
+controller.loop()
+
+```
+
+<p align="center">
+ <img src="https://raw.githubusercontent.com/szymciem8/OpenLeap/main/Documentation/images/example_program.gif?token=AMBI64BGASHC4OPJW6OD3YDBV2BJK" width="850" />
+</p>
+
+OpenLeap returns the relative position of each hand, the distance between the thumb tip and the index finger tip, the rotation angle about the wrist point, and the recognized gesture. There are two models for gesture recognition.
+
+The first one can recognize whether the hand is open or closed into a fist; the second model can recognize the sign language alphabet, as shown below.
+
+<p align="center">
+ <img src="https://pastevents.impactcee.com/wp-content/uploads/2016/10/DayTranslationsBlog-Learn-American-Sign-Language.jpg" width="850" />
+</p>
+
+
+An OpenLeap object can be created with a couple of options:
+- **screen_show** - if set to True, a window with the camera feed will be created.
+- **screen_type** - "CAM" or "BLACK" background.
+- **show_data_on_image** - self-explanatory.
+- **show_data_in_console** - self-explanatory.
+- **gesture_model** - choose the gesture recognition model, "basic" or "sign_language".
+
+## Access Hand Information
+
+Recognized gestures, hand position, tilt, and so on are stored in a dictionary called 'data', which consists of two dataclass objects, one for the right hand and one for the left. The dataclass object has the following structure:
+
+```
+from dataclasses import dataclass
+from typing import Optional
+
+@dataclass
+class Data:
+    x: float = 0.0
+    y: float = 0.0
+    z: float = 0.0
+    distance: float = 0.0
+    angle: float = 0.0
+    gesture: Optional[str] = None
+```
+
+The dataclass containing the data above is continuously updated by the **main()** or **loop()** function, depending on which one is used.
+
+### Example
+
+```
+if controller.data['right'].gesture == 'open':
+ print('Right hand is opened!')
+elif controller.data['right'].gesture == 'fist':
+ print('Right hand is closed!')
+```
+
+### Another Example
+
+```
+if controller.data['right'].distance < 20:
+ print('Click has been detected!')
+```
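The two access patterns above can be combined. The snippet below is a self-contained sketch that mirrors the documented `data` dictionary using the `Data` dataclass; no camera or openleap installation is required, since the values are filled in by hand for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Data:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    distance: float = 0.0
    angle: float = 0.0
    gesture: Optional[str] = None

# In a real program this dictionary is controller.data,
# refreshed continuously by main() or loop().
data = {'right': Data(distance=15.0, gesture='open'),
        'left': Data(distance=40.0, gesture='fist')}

def describe(hand: Data) -> str:
    """Combine the recognized gesture and pinch distance into a status string."""
    status = 'opened' if hand.gesture == 'open' else 'closed'
    if hand.distance < 20:
        status += ', click detected'
    return status

print(describe(data['right']))
print(describe(data['left']))
```

With the hand-filled values above, the right hand reports an open gesture with a pinch below the click threshold, while the left reports a closed fist.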
+
+
+
+%prep
+%autosetup -n openleap-0.5.6
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-openleap -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Fri May 05 2023 Python_Bot <Python_Bot@openeuler.org> - 0.5.6-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..824f98f
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+41c9b19947dbabb35b0f04beeb3d9a4b openleap-0.5.6.tar.gz