%global _empty_manifest_terminate_build 0
Name:           python-text-sensitivity
Version:        0.3.3
Release:        1
Summary:        Extension of text_explainability for sensitivity testing (robustness, fairness)
License:        GNU LGPL v3
URL:            https://git.science.uu.nl/m.j.robeer/text_sensitivity
Source0:        https://mirrors.nju.edu.cn/pypi/web/packages/3a/e4/ab3bae1ad306176c73838f59b68a809f1158a4ce4ed4bf97d820d694ce3a/text_sensitivity-0.3.3.tar.gz
BuildArch:      noarch

Requires:       python3-genbase
Requires:       python3-text-explainability
Requires:       python3-nlpaug
Requires:       python3-Faker
Requires:       python3-yaspin

%description
> Extension of [text_explainability](https://git.science.uu.nl/m.j.robeer/text_explainability)

Uses the **generic architecture** of `text_explainability` to also include tests of **safety** (_how safe the model is in production_, i.e. the types of inputs it can handle), **robustness** (_how generalizable the model is in production_, e.g. stability when adding typos, or the effect of adding random unrelated data) and **fairness** (_whether equal individuals are treated equally by the model_, e.g. subgroup fairness on sex and nationality).

© Marcel Robeer, 2021

## Quick tour
**Safety**: test if your model is able to handle different data types.

```python
from text_sensitivity import RandomAscii, RandomEmojis, combine_generators

# Generate 10 strings with random ASCII characters
RandomAscii().generate_list(n=10)

# Generate 5 strings with random ASCII characters and emojis
combine_generators(RandomAscii(), RandomEmojis()).generate_list(n=5)
```

(A short end-to-end sketch using these generators is included below, after the Releases section.)

**Robustness**: test whether your model performs equally for different entities ...

```python
from text_sensitivity import RandomAddress, RandomEmail

# Random addresses for your current locale (default = 'nl')
RandomAddress(sep=', ').generate_list(n=5)

# Random e-mail addresses in Spanish ('es') and Portuguese ('pt'), and include which country the e-mail is from
RandomEmail(languages=['es', 'pt']).generate_list(n=10, attributes=True)
```

... and whether it is robust under simple perturbations.

```python
from text_sensitivity import compare_accuracy
from text_sensitivity.perturbation import to_upper, add_typos

# Is model accuracy equal when we change all sentences to uppercase?
compare_accuracy(env, model, to_upper)

# Is model accuracy equal when we add typos in words?
compare_accuracy(env, model, add_typos)
```

**Fairness**: see if performance is equal among subgroups.

```python
from text_sensitivity import RandomName

# Generate random Dutch ('nl') and Russian ('ru') names, both 'male' and 'female' (+ return attributes)
RandomName(languages=['nl', 'ru'], sex=['male', 'female']).generate_list(n=10, attributes=True)
```

## Installation
See the [installation](docs/INSTALLATION.md) instructions for an extended installation guide.

| Method | Instructions |
|--------|--------------|
| `pip` | Install from [PyPI](https://pypi.org/project/text-sensitivity/) via `pip3 install text_sensitivity`. |
| Local | Clone this repository and install via `pip3 install -e .` or locally run `python3 setup.py install`. |

## Documentation
Full documentation of the latest version is provided at [https://text-sensitivity.readthedocs.io/](https://text-sensitivity.readthedocs.io/).

## Example usage
See [example_usage.md](example_usage.md) for an example of how the package can be used, or run the lines in `example_usage.py` to explore it interactively.

## Releases
`text_sensitivity` is officially released through [PyPI](https://pypi.org/project/text-sensitivity/). See [CHANGELOG.md](CHANGELOG.md) for a full overview of the changes for each version.
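As a short end-to-end sketch of how the safety generators from the Quick tour might be used against a model, the example below feeds generated strings to a stand-in `predict` function. The `predict` function is a placeholder for your own model and is not part of `text_sensitivity`.

```python
from text_sensitivity import RandomAscii, RandomEmojis, combine_generators

# Placeholder for your own model's prediction function (not part of text_sensitivity)
def predict(texts):
    return ['positive' for _ in texts]

# Safety check: the model should not raise an error on unusual character inputs
samples = combine_generators(RandomAscii(), RandomEmojis()).generate_list(n=25)
try:
    predict(samples)
    print('Model handled all generated inputs.')
except Exception as e:
    print(f'Model failed on a generated input: {e}')
```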
## Citation
```bibtex
@misc{text_sensitivity,
  title = {Python package text\_sensitivity},
  author = {Marcel Robeer},
  howpublished = {\url{https://git.science.uu.nl/m.j.robeer/text_sensitivity}},
  year = {2021}
}
```

## Maintenance
### Contributors
- [Marcel Robeer](https://www.uu.nl/staff/MJRobeer) (`@m.j.robeer`)
- [Elize Herrewijnen](https://www.uu.nl/staff/EHerrewijnen) (`@e.herrewijnen`)

### Todo
Tasks yet to be done:

* Word-level perturbations
* Add fairness-specific metrics:
    - Counterfactual fairness
* Add expected behavior
    - Robustness: equal to prior prediction, or in some cases might expect that it deviates
    - Fairness: may deviate from original prediction
* Tests
    - Add tests for perturbations
    - Add tests for sensitivity testing schemes
* Add visualization ability

## Credits
- Edward Ma. _[NLP Augmentation](https://github.com/makcedward/nlpaug)_. 2019.
- Daniele Faraglia and other contributors. _[Faker](https://github.com/joke2k/faker)_. 2012.
- Marco Tulio Ribeiro, Tongshuang Wu, Carlos Guestrin and Sameer Singh. [Beyond Accuracy: Behavioral Testing of NLP models with CheckList](https://paperswithcode.com/paper/beyond-accuracy-behavioral-testing-of-nlp). _Association for Computational Linguistics_ (_ACL_). 2020.

%package -n python3-text-sensitivity
Summary:        Extension of text_explainability for sensitivity testing (robustness, fairness)
Provides:       python-text-sensitivity
BuildRequires:  python3-devel
BuildRequires:  python3-setuptools
BuildRequires:  python3-pip
%description -n python3-text-sensitivity
> Extension of [text_explainability](https://git.science.uu.nl/m.j.robeer/text_explainability)

Uses the **generic architecture** of `text_explainability` to also include tests of **safety** (_how safe the model is in production_, i.e. the types of inputs it can handle), **robustness** (_how generalizable the model is in production_, e.g. stability when adding typos, or the effect of adding random unrelated data) and **fairness** (_whether equal individuals are treated equally by the model_, e.g. subgroup fairness on sex and nationality).

© Marcel Robeer, 2021

## Quick tour
**Safety**: test if your model is able to handle different data types.

```python
from text_sensitivity import RandomAscii, RandomEmojis, combine_generators

# Generate 10 strings with random ASCII characters
RandomAscii().generate_list(n=10)

# Generate 5 strings with random ASCII characters and emojis
combine_generators(RandomAscii(), RandomEmojis()).generate_list(n=5)
```

**Robustness**: test whether your model performs equally for different entities ...

```python
from text_sensitivity import RandomAddress, RandomEmail

# Random addresses for your current locale (default = 'nl')
RandomAddress(sep=', ').generate_list(n=5)

# Random e-mail addresses in Spanish ('es') and Portuguese ('pt'), and include which country the e-mail is from
RandomEmail(languages=['es', 'pt']).generate_list(n=10, attributes=True)
```

... and whether it is robust under simple perturbations.

```python
from text_sensitivity import compare_accuracy
from text_sensitivity.perturbation import to_upper, add_typos

# Is model accuracy equal when we change all sentences to uppercase?
compare_accuracy(env, model, to_upper)

# Is model accuracy equal when we add typos in words?
compare_accuracy(env, model, add_typos)
```

**Fairness**: see if performance is equal among subgroups.
```python
from text_sensitivity import RandomName

# Generate random Dutch ('nl') and Russian ('ru') names, both 'male' and 'female' (+ return attributes)
RandomName(languages=['nl', 'ru'], sex=['male', 'female']).generate_list(n=10, attributes=True)
```

## Installation
See the [installation](docs/INSTALLATION.md) instructions for an extended installation guide.

| Method | Instructions |
|--------|--------------|
| `pip` | Install from [PyPI](https://pypi.org/project/text-sensitivity/) via `pip3 install text_sensitivity`. |
| Local | Clone this repository and install via `pip3 install -e .` or locally run `python3 setup.py install`. |

## Documentation
Full documentation of the latest version is provided at [https://text-sensitivity.readthedocs.io/](https://text-sensitivity.readthedocs.io/).

## Example usage
See [example_usage.md](example_usage.md) for an example of how the package can be used, or run the lines in `example_usage.py` to explore it interactively.

## Releases
`text_sensitivity` is officially released through [PyPI](https://pypi.org/project/text-sensitivity/). See [CHANGELOG.md](CHANGELOG.md) for a full overview of the changes for each version.

## Citation
```bibtex
@misc{text_sensitivity,
  title = {Python package text\_sensitivity},
  author = {Marcel Robeer},
  howpublished = {\url{https://git.science.uu.nl/m.j.robeer/text_sensitivity}},
  year = {2021}
}
```

## Maintenance
### Contributors
- [Marcel Robeer](https://www.uu.nl/staff/MJRobeer) (`@m.j.robeer`)
- [Elize Herrewijnen](https://www.uu.nl/staff/EHerrewijnen) (`@e.herrewijnen`)

### Todo
Tasks yet to be done:

* Word-level perturbations
* Add fairness-specific metrics:
    - Counterfactual fairness
* Add expected behavior
    - Robustness: equal to prior prediction, or in some cases might expect that it deviates
    - Fairness: may deviate from original prediction
* Tests
    - Add tests for perturbations
    - Add tests for sensitivity testing schemes
* Add visualization ability

## Credits
- Edward Ma. _[NLP Augmentation](https://github.com/makcedward/nlpaug)_. 2019.
- Daniele Faraglia and other contributors. _[Faker](https://github.com/joke2k/faker)_. 2012.
- Marco Tulio Ribeiro, Tongshuang Wu, Carlos Guestrin and Sameer Singh. [Beyond Accuracy: Behavioral Testing of NLP models with CheckList](https://paperswithcode.com/paper/beyond-accuracy-behavioral-testing-of-nlp). _Association for Computational Linguistics_ (_ACL_). 2020.

%package help
Summary:        Development documents and examples for text-sensitivity
Provides:       python3-text-sensitivity-doc
%description help
> Extension of [text_explainability](https://git.science.uu.nl/m.j.robeer/text_explainability)

Uses the **generic architecture** of `text_explainability` to also include tests of **safety** (_how safe the model is in production_, i.e. the types of inputs it can handle), **robustness** (_how generalizable the model is in production_, e.g. stability when adding typos, or the effect of adding random unrelated data) and **fairness** (_whether equal individuals are treated equally by the model_, e.g. subgroup fairness on sex and nationality).

© Marcel Robeer, 2021

## Quick tour
**Safety**: test if your model is able to handle different data types.
```python
from text_sensitivity import RandomAscii, RandomEmojis, combine_generators

# Generate 10 strings with random ASCII characters
RandomAscii().generate_list(n=10)

# Generate 5 strings with random ASCII characters and emojis
combine_generators(RandomAscii(), RandomEmojis()).generate_list(n=5)
```

**Robustness**: test whether your model performs equally for different entities ...

```python
from text_sensitivity import RandomAddress, RandomEmail

# Random addresses for your current locale (default = 'nl')
RandomAddress(sep=', ').generate_list(n=5)

# Random e-mail addresses in Spanish ('es') and Portuguese ('pt'), and include which country the e-mail is from
RandomEmail(languages=['es', 'pt']).generate_list(n=10, attributes=True)
```

... and whether it is robust under simple perturbations.

```python
from text_sensitivity import compare_accuracy
from text_sensitivity.perturbation import to_upper, add_typos

# Is model accuracy equal when we change all sentences to uppercase?
compare_accuracy(env, model, to_upper)

# Is model accuracy equal when we add typos in words?
compare_accuracy(env, model, add_typos)
```

**Fairness**: see if performance is equal among subgroups.

```python
from text_sensitivity import RandomName

# Generate random Dutch ('nl') and Russian ('ru') names, both 'male' and 'female' (+ return attributes)
RandomName(languages=['nl', 'ru'], sex=['male', 'female']).generate_list(n=10, attributes=True)
```

(A short subgroup-comparison sketch based on this generator is included at the end of this description.)

## Installation
See the [installation](docs/INSTALLATION.md) instructions for an extended installation guide.

| Method | Instructions |
|--------|--------------|
| `pip` | Install from [PyPI](https://pypi.org/project/text-sensitivity/) via `pip3 install text_sensitivity`. |
| Local | Clone this repository and install via `pip3 install -e .` or locally run `python3 setup.py install`. |

## Documentation
Full documentation of the latest version is provided at [https://text-sensitivity.readthedocs.io/](https://text-sensitivity.readthedocs.io/).

## Example usage
See [example_usage.md](example_usage.md) for an example of how the package can be used, or run the lines in `example_usage.py` to explore it interactively.

## Releases
`text_sensitivity` is officially released through [PyPI](https://pypi.org/project/text-sensitivity/). See [CHANGELOG.md](CHANGELOG.md) for a full overview of the changes for each version.

## Citation
```bibtex
@misc{text_sensitivity,
  title = {Python package text\_sensitivity},
  author = {Marcel Robeer},
  howpublished = {\url{https://git.science.uu.nl/m.j.robeer/text_sensitivity}},
  year = {2021}
}
```

## Maintenance
### Contributors
- [Marcel Robeer](https://www.uu.nl/staff/MJRobeer) (`@m.j.robeer`)
- [Elize Herrewijnen](https://www.uu.nl/staff/EHerrewijnen) (`@e.herrewijnen`)

### Todo
Tasks yet to be done:

* Word-level perturbations
* Add fairness-specific metrics:
    - Counterfactual fairness
* Add expected behavior
    - Robustness: equal to prior prediction, or in some cases might expect that it deviates
    - Fairness: may deviate from original prediction
* Tests
    - Add tests for perturbations
    - Add tests for sensitivity testing schemes
* Add visualization ability

## Credits
- Edward Ma. _[NLP Augmentation](https://github.com/makcedward/nlpaug)_. 2019.
- Daniele Faraglia and other contributors. _[Faker](https://github.com/joke2k/faker)_. 2012.
- Marco Tulio Ribeiro, Tongshuang Wu, Carlos Guestrin and Sameer Singh. [Beyond Accuracy: Behavioral Testing of NLP models with CheckList](https://paperswithcode.com/paper/beyond-accuracy-behavioral-testing-of-nlp). _Association for Computational Linguistics_ (_ACL_). 2020.
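To round off the fairness example from the Quick tour above, the sketch below compares a stand-in model's average score across names generated per locale. The `predict_proba` function and the sentence template are placeholders for illustration and are not part of `text_sensitivity`.

```python
from text_sensitivity import RandomName

# Placeholder scoring function (replace with your own model; not part of text_sensitivity)
def predict_proba(texts):
    return [0.5 for _ in texts]

# Rough subgroup comparison: the average score per locale should be similar
for lang in ['nl', 'ru']:
    names = RandomName(languages=[lang]).generate_list(n=50)
    scores = predict_proba([f'Review written by {name}' for name in names])
    print(lang, sum(scores) / len(scores))
```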
%prep
%autosetup -n text-sensitivity-0.3.3

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
	find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
	find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
	find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
	find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
	find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-text-sensitivity -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Mon May 29 2023 Python_Bot - 0.3.3-1
- Package Spec generated