%global _empty_manifest_terminate_build 0
Name:           python-tinysegmenter3
Version:        0.1.0
Release:        1
Summary:        Super compact Japanese tokenizer
License:        New BSD
URL:            https://github.com/SamuraiT/tinysegmenter
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/fa/02/fcfeebe21e1e030da593f2151538c273e1eeccd8fb62d18811dbffc5cd6d/tinysegmenter3-0.1.0.tar.gz
BuildArch:      noarch

%description
TinySegmenter -- a super compact Japanese tokenizer -- was originally created
by Taku Kudo (c) 2008 for JavaScript, under the terms of the New BSD license.
For details, see [here](http://lilyx.net/pages/tinysegmenter_licence.txt).

tinysegmenter for Python 2.x was written by Masato Hagiwara; for more
information about his work, see [here](http://lilyx.net/pages/tinysegmenterp.html).

This tinysegmenter was modified by Tatsuro Yasukawa so that it can be
distributed for both Python 3.x and Python 2.x. Additionally, it was made
faster, thanks to @chezou, @cocoatomo and @methane.

See more about [tinysegmenter](https://github.com/SamuraiT/tinysegmenter).

%package -n python3-tinysegmenter3
Summary:        Super compact Japanese tokenizer
Provides:       python-tinysegmenter3
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip

%description -n python3-tinysegmenter3
TinySegmenter -- a super compact Japanese tokenizer -- was originally created
by Taku Kudo (c) 2008 for JavaScript, under the terms of the New BSD license.
For details, see [here](http://lilyx.net/pages/tinysegmenter_licence.txt).

tinysegmenter for Python 2.x was written by Masato Hagiwara; for more
information about his work, see [here](http://lilyx.net/pages/tinysegmenterp.html).

This tinysegmenter was modified by Tatsuro Yasukawa so that it can be
distributed for both Python 3.x and Python 2.x. Additionally, it was made
faster, thanks to @chezou, @cocoatomo and @methane.

See more about [tinysegmenter](https://github.com/SamuraiT/tinysegmenter).

%package help
Summary:        Development documents and examples for tinysegmenter3
Provides:       python3-tinysegmenter3-doc

%description help
TinySegmenter -- a super compact Japanese tokenizer -- was originally created
by Taku Kudo (c) 2008 for JavaScript, under the terms of the New BSD license.
For details, see [here](http://lilyx.net/pages/tinysegmenter_licence.txt).

tinysegmenter for Python 2.x was written by Masato Hagiwara; for more
information about his work, see [here](http://lilyx.net/pages/tinysegmenterp.html).

This tinysegmenter was modified by Tatsuro Yasukawa so that it can be
distributed for both Python 3.x and Python 2.x. Additionally, it was made
faster, thanks to @chezou, @cocoatomo and @methane.

See more about [tinysegmenter](https://github.com/SamuraiT/tinysegmenter).
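For orientation, a minimal usage sketch (not part of the packaged files),
assuming the installed module keeps the upstream API of a TinySegmenter class
with a tokenize() method:

    # Usage sketch only; module and class names follow the upstream README
    # and may differ in this distribution.
    import tinysegmenter
    segmenter = tinysegmenter.TinySegmenter()
    print(' | '.join(segmenter.tokenize('私の名前は中野です')))
    # expected: 私 | の | 名前 | は | 中野 | です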
%prep
%autosetup -n tinysegmenter3-0.1.0

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
    find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
    find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
    find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
    find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
    find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-tinysegmenter3 -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Wed May 17 2023 Python_Bot - 0.1.0-1
- Package Spec generated