%global _empty_manifest_terminate_build 0
Name:		python-scrapfly-sdk
Version:	0.8.6
Release:	1
Summary:	Scrapfly SDK for Scrapfly
License:	BSD
URL:		https://github.com/scrapfly/python-sdk
Source0:	https://mirrors.aliyun.com/pypi/web/packages/f3/7c/a9f781012e8ed997e585945bf2812a4bad56b628e994d24493f9b4799145/scrapfly-sdk-0.8.6.tar.gz
BuildArch:	noarch

Requires:	python3-decorator
Requires:	python3-requests
Requires:	python3-dateutil
Requires:	python3-loguru
Requires:	python3-urllib3
Requires:	python3-backoff
Requires:	python3-soupsieve
Requires:	python3-brotlipy
Requires:	python3-cchardet
Requires:	python3-msgpack
Requires:	python3-lxml
Requires:	python3-beautifulsoup4
Requires:	python3-scrapy
Requires:	python3-extruct
Requires:	python3-bumpversion
Requires:	python3-isort
Requires:	python3-readme-renderer
Requires:	python3-twine
Requires:	python3-setuptools
Requires:	python3-wheel
Requires:	python3-pdoc3

%description
# Scrapfly SDK

## Installation

`pip install scrapfly-sdk`

You can also install extra dependencies:

* `pip install "scrapfly-sdk[seepdup]"` for performance improvements
* `pip install "scrapfly-sdk[concurrency]"` for concurrency out of the box (asyncio / thread)
* `pip install "scrapfly-sdk[scrapy]"` for Scrapy integration
* `pip install "scrapfly-sdk[all]"` for everything

## Get Your API Key

You can create a free account on [Scrapfly](https://scrapfly.io/register) to get your API Key.

* [Usage](https://scrapfly.io/docs/sdk/python)
* [Python API](https://scrapfly.github.io/python-scrapfly/scrapfly)
* [Open API 3 Spec](https://scrapfly.io/docs/openapi#get-/scrape)
* [Scrapy Integration](https://scrapfly.io/docs/sdk/scrapy)

## Migration

### Migrate from 0.7.x to 0.8

The asyncio-pool dependency has been dropped. `scrapfly.concurrent_scrape` is now an async generator. If `concurrency` is `None` or not given, the maximum concurrency allowed by your current subscription is used.

```python
async for result in scrapfly.concurrent_scrape(concurrency=10, scrape_configs=[ScrapeConfig(...), ...]):
    print(result)
```

The `brotli` argument is deprecated and will be removed in the next minor release. In most cases it brings no size benefit over gzip and uses more CPU.

### What's new

### 0.8.x

* Better error logging
* Async improvements for concurrent scraping with asyncio
* Scrapy media pipelines are now supported out of the box

%package -n python3-scrapfly-sdk
Summary:	Scrapfly SDK for Scrapfly
Provides:	python-scrapfly-sdk
BuildRequires:	python3-devel
BuildRequires:	python3-setuptools
BuildRequires:	python3-pip

%description -n python3-scrapfly-sdk
# Scrapfly SDK

## Installation

`pip install scrapfly-sdk`

You can also install extra dependencies:

* `pip install "scrapfly-sdk[seepdup]"` for performance improvements
* `pip install "scrapfly-sdk[concurrency]"` for concurrency out of the box (asyncio / thread)
* `pip install "scrapfly-sdk[scrapy]"` for Scrapy integration
* `pip install "scrapfly-sdk[all]"` for everything

## Get Your API Key

You can create a free account on [Scrapfly](https://scrapfly.io/register) to get your API Key.
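
As a quick orientation, a single scrape might look like the minimal sketch below (assuming the `ScrapflyClient` and `ScrapeConfig` entry points exposed by this SDK; the key and target URL are placeholders):

```python
from scrapfly import ScrapeConfig, ScrapflyClient

# Placeholder key: substitute the API key from your Scrapfly dashboard.
scrapfly = ScrapflyClient(key='YOUR_API_KEY')

# Scrape a single page and print the returned page body.
result = scrapfly.scrape(ScrapeConfig(url='https://httpbin.org/html'))
print(result.content)
```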

* [Usage](https://scrapfly.io/docs/sdk/python)
* [Python API](https://scrapfly.github.io/python-scrapfly/scrapfly)
* [Open API 3 Spec](https://scrapfly.io/docs/openapi#get-/scrape)
* [Scrapy Integration](https://scrapfly.io/docs/sdk/scrapy)

## Migration

### Migrate from 0.7.x to 0.8

The asyncio-pool dependency has been dropped. `scrapfly.concurrent_scrape` is now an async generator. If `concurrency` is `None` or not given, the maximum concurrency allowed by your current subscription is used.

```python
async for result in scrapfly.concurrent_scrape(concurrency=10, scrape_configs=[ScrapeConfig(...), ...]):
    print(result)
```

The `brotli` argument is deprecated and will be removed in the next minor release. In most cases it brings no size benefit over gzip and uses more CPU.

### What's new

### 0.8.x

* Better error logging
* Async improvements for concurrent scraping with asyncio
* Scrapy media pipelines are now supported out of the box

%package help
Summary:	Development documents and examples for scrapfly-sdk
Provides:	python3-scrapfly-sdk-doc

%description help
# Scrapfly SDK

## Installation

`pip install scrapfly-sdk`

You can also install extra dependencies:

* `pip install "scrapfly-sdk[seepdup]"` for performance improvements
* `pip install "scrapfly-sdk[concurrency]"` for concurrency out of the box (asyncio / thread)
* `pip install "scrapfly-sdk[scrapy]"` for Scrapy integration
* `pip install "scrapfly-sdk[all]"` for everything

## Get Your API Key

You can create a free account on [Scrapfly](https://scrapfly.io/register) to get your API Key.

* [Usage](https://scrapfly.io/docs/sdk/python)
* [Python API](https://scrapfly.github.io/python-scrapfly/scrapfly)
* [Open API 3 Spec](https://scrapfly.io/docs/openapi#get-/scrape)
* [Scrapy Integration](https://scrapfly.io/docs/sdk/scrapy)

## Migration

### Migrate from 0.7.x to 0.8

The asyncio-pool dependency has been dropped. `scrapfly.concurrent_scrape` is now an async generator. If `concurrency` is `None` or not given, the maximum concurrency allowed by your current subscription is used.

```python
async for result in scrapfly.concurrent_scrape(concurrency=10, scrape_configs=[ScrapeConfig(...), ...]):
    print(result)
```

The `brotli` argument is deprecated and will be removed in the next minor release. In most cases it brings no size benefit over gzip and uses more CPU.

### What's new

### 0.8.x

* Better error logging
* Async improvements for concurrent scraping with asyncio
* Scrapy media pipelines are now supported out of the box

%prep
%autosetup -n scrapfly-sdk-0.8.6

%build
%py3_build

%install
%py3_install
# Ship any doc/example directories with the help subpackage
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
# Collect the installed files into filelist.lst / doclist.lst for the %%files sections
pushd %{buildroot}
if [ -d usr/lib ]; then
	find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
	find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
	find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
	find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
	find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .
%files -n python3-scrapfly-sdk -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Tue Jun 20 2023 Python_Bot - 0.8.6-1
- Package Spec generated