%global _empty_manifest_terminate_build 0
Name:		python-tf-keras-vis
Version:	0.8.5
Release:	1
Summary:	Neural network visualization toolkit for tf.keras
License:	MIT License
URL:		https://github.com/keisen/tf-keras-vis
Source0:	https://mirrors.aliyun.com/pypi/web/packages/fc/98/a542ccb528a764302b7db123c9ff19736879a81d5d326a5d2899f99561e8/tf-keras-vis-0.8.5.tar.gz
BuildArch:	noarch

Requires:	python3-scipy
Requires:	python3-pillow
Requires:	python3-deprecated
Requires:	python3-imageio
Requires:	python3-packaging
Requires:	python3-importlib-metadata
Requires:	python3-flake8
Requires:	python3-flake8-docstrings
Requires:	python3-isort
Requires:	python3-yapf
Requires:	python3-pytest
Requires:	python3-pytest-pycodestyle
Requires:	python3-pytest-cov
Requires:	python3-pytest-env
Requires:	python3-pytest-xdist
Requires:	python3-sphinx
Requires:	python3-sphinx-autobuild
Requires:	python3-sphinx-rtd-theme
Requires:	python3-myst-parser
Requires:	python3-nbsphinx
Requires:	python3-pandoc
Requires:	python3-jupyterlab
Requires:	python3-matplotlib

%description
# [tf-keras-vis](https://keisen.github.io/tf-keras-vis-docs/)

[![Downloads](https://pepy.tech/badge/tf-keras-vis)](https://pepy.tech/project/tf-keras-vis)
[![Python](https://img.shields.io/pypi/pyversions/tf-keras-vis.svg?style=plastic)](https://badge.fury.io/py/tf-keras-vis)
[![PyPI version](https://badge.fury.io/py/tf-keras-vis.svg)](https://badge.fury.io/py/tf-keras-vis)
[![Python package](https://github.com/keisen/tf-keras-vis/actions/workflows/python-package.yml/badge.svg)](https://github.com/keisen/tf-keras-vis/actions/workflows/python-package.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Documentation](https://img.shields.io/badge/api-reference-blue.svg)](https://keisen.github.io/tf-keras-vis-docs/)

## Web documents

https://keisen.github.io/tf-keras-vis-docs/

## Overview

tf-keras-vis is a visualization toolkit for debugging `tf.keras.Model` in TensorFlow 2.0+.
Currently supported methods for visualization include:

* Feature Visualization
   - ActivationMaximization ([web](https://distill.pub/2017/feature-visualization/), [github](https://github.com/raghakot/keras-vis))
* Class Activation Maps
   - GradCAM ([paper](https://arxiv.org/pdf/1610.02391v1.pdf))
   - GradCAM++ ([paper](https://arxiv.org/pdf/1710.11063.pdf))
   - ScoreCAM ([paper](https://arxiv.org/pdf/1910.01279.pdf), [github](https://github.com/haofanwang/Score-CAM))
   - Faster-ScoreCAM ([github](https://github.com/tabayashi0117/Score-CAM/blob/master/README.md#faster-score-cam))
   - LayerCAM ([paper](http://mftp.mmcheng.net/Papers/21TIP_LayerCAM.pdf), [github](https://github.com/PengtaoJiang/LayerCAM)) :new::zap:
* Saliency Maps
   - Vanilla Saliency ([paper](https://arxiv.org/pdf/1312.6034.pdf))
   - SmoothGrad ([paper](https://arxiv.org/pdf/1706.03825.pdf))

tf-keras-vis is designed to be lightweight, flexible, and easy to use. All visualization methods share the following features:

* Support for **N-dim image inputs**; not only 2D pictures but also, for example, 3D images.
* **Batch-wise** processing, so multiple input images can be processed efficiently.
* Support for models with **multiple inputs**, **multiple outputs**, or both.
* Support for **mixed-precision** models.

In addition, ActivationMaximization supports the optimizers built into tf.keras.

### Visualizations

#### Dense Unit

#### Convolutional Filter

#### Class Activation Map

The images above are generated by `GradCAM++`.

#### Saliency Map

The images above are generated by `SmoothGrad`.
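Every visualization above is driven by a score that maps the model output to the values to be maximized. In tf-keras-vis a score can be a `Score` subclass such as `CategoricalScore`, or any plain callable. A minimal numpy-only sketch of what such a callable computes (the array below stands in for a model's softmax output and is purely illustrative):

```python
import numpy as np

# A score is a callable: model output (batch, classes) -> one value per sample.
# Selecting class 1 for every sample mimics CategoricalScore(1).
def categorical_score(output, class_index=1):
    return output[:, class_index]

# Stand-in for a model's output on a batch of two inputs (hypothetical values).
output = np.array([[0.1, 0.7, 0.2],
                   [0.3, 0.4, 0.3]])

print(categorical_score(output))  # one score per sample: [0.7 0.4]
```

Because a score only sees the model output, the same callable works unchanged for saliency maps, CAMs, and ActivationMaximization.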
## Usage

### ActivationMaximization (Visualizing Convolutional Filter)

```python
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from matplotlib import pyplot as plt

from tf_keras_vis.activation_maximization import ActivationMaximization
from tf_keras_vis.activation_maximization.callbacks import Progress
from tf_keras_vis.activation_maximization.input_modifiers import Jitter, Rotate2D
from tf_keras_vis.activation_maximization.regularizers import TotalVariation2D, Norm
from tf_keras_vis.utils.model_modifiers import ExtractIntermediateLayer, ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create the visualization instance.
# All visualization classes accept a model and a model-modifier in their constructor,
# which, for example, replaces the activation of the last layer with a linear function.
activation_maximization = \
    ActivationMaximization(VGG16(),
                           model_modifier=[ExtractIntermediateLayer('block5_conv3'),
                                           ReplaceToLinear()],
                           clone=False)

# You can use a Score class to specify the visualization target,
# and add regularizers or input-modifiers as needed.
activations = \
    activation_maximization(CategoricalScore(FILTER_INDEX),
                            steps=200,
                            input_modifiers=[Jitter(jitter=16), Rotate2D(degree=1)],
                            regularizers=[TotalVariation2D(weight=1.0),
                                          Norm(weight=0.3, p=1)],
                            optimizer=tf.keras.optimizers.RMSprop(1.0, 0.999),
                            callbacks=[Progress()])

## Since v0.6.0, calling `astype()` is NOT necessary.
# activations = activations[0].astype(np.uint8)

# Render
plt.imshow(activations[0])
```

### GradCAM++

```python
import numpy as np
from matplotlib import pyplot as plt
from matplotlib import cm

from tf_keras_vis.gradcam_plus_plus import GradcamPlusPlus
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create the GradCAM++ object
gradcam = GradcamPlusPlus(YOUR_MODEL_INSTANCE,
                          model_modifier=ReplaceToLinear(),
                          clone=True)

# Generate a cam with GradCAM++
cam = gradcam(CategoricalScore(CATEGORICAL_INDEX), SEED_INPUT)

## Since v0.6.0, calling `normalize()` is NOT necessary.
# cam = normalize(cam)

plt.imshow(SEED_INPUT_IMAGE)
heatmap = np.uint8(cm.jet(cam[0])[..., :3] * 255)
plt.imshow(heatmap, cmap='jet', alpha=0.5)  # overlay
```

Please see the guides below for more details:

### Getting Started Guides

* [Saliency and CAMs](https://keisen.github.io/tf-keras-vis-docs/examples/attentions.html)
* [Visualize Dense Layer](https://keisen.github.io/tf-keras-vis-docs/examples/visualize_dense_layer.html)
* [Visualize Convolutional Filter](https://keisen.github.io/tf-keras-vis-docs/examples/visualize_conv_filters.html)

**[NOTES]**
If you have used [keras-vis](https://github.com/raghakot/keras-vis), tf-keras-vis may feel familiar. tf-keras-vis is in fact derived from keras-vis, and the visualization methods the two libraries provide are almost the same. Please note, however, that the tf-keras-vis APIs are NOT compatible with keras-vis.

## Requirements

* Python 3.7+
* TensorFlow 2.0+

## Installation

* PyPI

```bash
$ pip install tf-keras-vis tensorflow
```

* Source (for development)

```bash
$ git clone https://github.com/keisen/tf-keras-vis.git
$ cd tf-keras-vis
$ pip install -e .[develop] tensorflow
```

## Use Cases

* [chitra](https://github.com/aniketmaurya/chitra)
   * A Deep Learning Computer Vision library for easy data loading, model building, and model interpretation with GradCAM/GradCAM++.
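The `normalize()` call commented out in the GradCAM++ example refers to the min-max scaling that older versions required before rendering; since v0.6.0 the library performs it internally. For reference, a minimal numpy sketch of that scaling (the function name and the raw values are illustrative, not the library's implementation):

```python
import numpy as np

def min_max_normalize(cam, epsilon=1e-07):
    """Scale a raw attention map into [0, 1] so it can be fed to a colormap."""
    cam = cam.astype(np.float64)
    return (cam - cam.min()) / (cam.max() - cam.min() + epsilon)

# Hypothetical raw CAM values for a 2x2 map.
raw_cam = np.array([[0.5, 2.0],
                    [4.0, 1.0]])

normalized = min_max_normalize(raw_cam)
print(normalized.min(), normalized.max())  # ~0.0 and ~1.0
```

After scaling, the map can be passed to `cm.jet(...)` exactly as in the overlay snippet above.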
## Known Issues

* With InceptionV3, ActivationMaximization doesn't work well; it may generate meaninglessly blurred images.
* With cascading models, GradCAM and GradCAM++ don't work well; they may raise errors. We recommend using Faster-ScoreCAM in this case.
* `channels-first` models and data are unsupported.

## ToDo

* Guides
   * Visualizing multiple attention or activation images at once, utilizing the batch processing of the model
   * Defining various score functions
   * Visualizing attentions with multiple-input models
   * Visualizing attentions with multiple-output models
   * Advanced score functions
   * Tuning ActivationMaximization
   * Visualizing attentions for N-dim image inputs
* We're going to add methods such as:
   - Deep Dream
   - Style transfer

%package -n python3-tf-keras-vis
Summary:	Neural network visualization toolkit for tf.keras
Provides:	python-tf-keras-vis
BuildRequires:	python3-devel
BuildRequires:	python3-setuptools
BuildRequires:	python3-pip

%description -n python3-tf-keras-vis
# [tf-keras-vis](https://keisen.github.io/tf-keras-vis-docs/)

[![Downloads](https://pepy.tech/badge/tf-keras-vis)](https://pepy.tech/project/tf-keras-vis)
[![Python](https://img.shields.io/pypi/pyversions/tf-keras-vis.svg?style=plastic)](https://badge.fury.io/py/tf-keras-vis)
[![PyPI version](https://badge.fury.io/py/tf-keras-vis.svg)](https://badge.fury.io/py/tf-keras-vis)
[![Python package](https://github.com/keisen/tf-keras-vis/actions/workflows/python-package.yml/badge.svg)](https://github.com/keisen/tf-keras-vis/actions/workflows/python-package.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Documentation](https://img.shields.io/badge/api-reference-blue.svg)](https://keisen.github.io/tf-keras-vis-docs/)

## Web documents

https://keisen.github.io/tf-keras-vis-docs/

## Overview

tf-keras-vis is a visualization toolkit for debugging `tf.keras.Model` in TensorFlow 2.0+.
Currently supported methods for visualization include:

* Feature Visualization
   - ActivationMaximization ([web](https://distill.pub/2017/feature-visualization/), [github](https://github.com/raghakot/keras-vis))
* Class Activation Maps
   - GradCAM ([paper](https://arxiv.org/pdf/1610.02391v1.pdf))
   - GradCAM++ ([paper](https://arxiv.org/pdf/1710.11063.pdf))
   - ScoreCAM ([paper](https://arxiv.org/pdf/1910.01279.pdf), [github](https://github.com/haofanwang/Score-CAM))
   - Faster-ScoreCAM ([github](https://github.com/tabayashi0117/Score-CAM/blob/master/README.md#faster-score-cam))
   - LayerCAM ([paper](http://mftp.mmcheng.net/Papers/21TIP_LayerCAM.pdf), [github](https://github.com/PengtaoJiang/LayerCAM)) :new::zap:
* Saliency Maps
   - Vanilla Saliency ([paper](https://arxiv.org/pdf/1312.6034.pdf))
   - SmoothGrad ([paper](https://arxiv.org/pdf/1706.03825.pdf))

tf-keras-vis is designed to be lightweight, flexible, and easy to use. All visualization methods share the following features:

* Support for **N-dim image inputs**; not only 2D pictures but also, for example, 3D images.
* **Batch-wise** processing, so multiple input images can be processed efficiently.
* Support for models with **multiple inputs**, **multiple outputs**, or both.
* Support for **mixed-precision** models.

In addition, ActivationMaximization supports the optimizers built into tf.keras.

### Visualizations

#### Dense Unit

#### Convolutional Filter

#### Class Activation Map

The images above are generated by `GradCAM++`.

#### Saliency Map

The images above are generated by `SmoothGrad`.
## Usage

### ActivationMaximization (Visualizing Convolutional Filter)

```python
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from matplotlib import pyplot as plt

from tf_keras_vis.activation_maximization import ActivationMaximization
from tf_keras_vis.activation_maximization.callbacks import Progress
from tf_keras_vis.activation_maximization.input_modifiers import Jitter, Rotate2D
from tf_keras_vis.activation_maximization.regularizers import TotalVariation2D, Norm
from tf_keras_vis.utils.model_modifiers import ExtractIntermediateLayer, ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create the visualization instance.
# All visualization classes accept a model and a model-modifier in their constructor,
# which, for example, replaces the activation of the last layer with a linear function.
activation_maximization = \
    ActivationMaximization(VGG16(),
                           model_modifier=[ExtractIntermediateLayer('block5_conv3'),
                                           ReplaceToLinear()],
                           clone=False)

# You can use a Score class to specify the visualization target,
# and add regularizers or input-modifiers as needed.
activations = \
    activation_maximization(CategoricalScore(FILTER_INDEX),
                            steps=200,
                            input_modifiers=[Jitter(jitter=16), Rotate2D(degree=1)],
                            regularizers=[TotalVariation2D(weight=1.0),
                                          Norm(weight=0.3, p=1)],
                            optimizer=tf.keras.optimizers.RMSprop(1.0, 0.999),
                            callbacks=[Progress()])

## Since v0.6.0, calling `astype()` is NOT necessary.
# activations = activations[0].astype(np.uint8)

# Render
plt.imshow(activations[0])
```

### GradCAM++

```python
import numpy as np
from matplotlib import pyplot as plt
from matplotlib import cm

from tf_keras_vis.gradcam_plus_plus import GradcamPlusPlus
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create the GradCAM++ object
gradcam = GradcamPlusPlus(YOUR_MODEL_INSTANCE,
                          model_modifier=ReplaceToLinear(),
                          clone=True)

# Generate a cam with GradCAM++
cam = gradcam(CategoricalScore(CATEGORICAL_INDEX), SEED_INPUT)

## Since v0.6.0, calling `normalize()` is NOT necessary.
# cam = normalize(cam)

plt.imshow(SEED_INPUT_IMAGE)
heatmap = np.uint8(cm.jet(cam[0])[..., :3] * 255)
plt.imshow(heatmap, cmap='jet', alpha=0.5)  # overlay
```

Please see the guides below for more details:

### Getting Started Guides

* [Saliency and CAMs](https://keisen.github.io/tf-keras-vis-docs/examples/attentions.html)
* [Visualize Dense Layer](https://keisen.github.io/tf-keras-vis-docs/examples/visualize_dense_layer.html)
* [Visualize Convolutional Filter](https://keisen.github.io/tf-keras-vis-docs/examples/visualize_conv_filters.html)

**[NOTES]**
If you have used [keras-vis](https://github.com/raghakot/keras-vis), tf-keras-vis may feel familiar. tf-keras-vis is in fact derived from keras-vis, and the visualization methods the two libraries provide are almost the same. Please note, however, that the tf-keras-vis APIs are NOT compatible with keras-vis.

## Requirements

* Python 3.7+
* TensorFlow 2.0+

## Installation

* PyPI

```bash
$ pip install tf-keras-vis tensorflow
```

* Source (for development)

```bash
$ git clone https://github.com/keisen/tf-keras-vis.git
$ cd tf-keras-vis
$ pip install -e .[develop] tensorflow
```

## Use Cases

* [chitra](https://github.com/aniketmaurya/chitra)
   * A Deep Learning Computer Vision library for easy data loading, model building, and model interpretation with GradCAM/GradCAM++.
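As the Known Issues section below notes, tf-keras-vis does not support `channels-first` models or data. Data stored channels-first can be rearranged to channels-last with numpy before visualization; a minimal sketch (the shapes are illustrative):

```python
import numpy as np

# A hypothetical batch of 2 RGB images stored channels-first: (batch, C, H, W).
batch_chw = np.zeros((2, 3, 224, 224), dtype=np.float32)

# Move the channel axis to the end to get the channels-last layout
# (batch, H, W, C) that tf.keras defaults to.
batch_hwc = np.moveaxis(batch_chw, 1, -1)

print(batch_hwc.shape)  # (2, 224, 224, 3)
```

Note this only converts the data; a model built with `data_format='channels_first'` layers would still need to be rebuilt for channels-last input.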
## Known Issues

* With InceptionV3, ActivationMaximization doesn't work well; it may generate meaninglessly blurred images.
* With cascading models, GradCAM and GradCAM++ don't work well; they may raise errors. We recommend using Faster-ScoreCAM in this case.
* `channels-first` models and data are unsupported.

## ToDo

* Guides
   * Visualizing multiple attention or activation images at once, utilizing the batch processing of the model
   * Defining various score functions
   * Visualizing attentions with multiple-input models
   * Visualizing attentions with multiple-output models
   * Advanced score functions
   * Tuning ActivationMaximization
   * Visualizing attentions for N-dim image inputs
* We're going to add methods such as:
   - Deep Dream
   - Style transfer

%package help
Summary:	Development documents and examples for tf-keras-vis
Provides:	python3-tf-keras-vis-doc

%description help
# [tf-keras-vis](https://keisen.github.io/tf-keras-vis-docs/)

[![Downloads](https://pepy.tech/badge/tf-keras-vis)](https://pepy.tech/project/tf-keras-vis)
[![Python](https://img.shields.io/pypi/pyversions/tf-keras-vis.svg?style=plastic)](https://badge.fury.io/py/tf-keras-vis)
[![PyPI version](https://badge.fury.io/py/tf-keras-vis.svg)](https://badge.fury.io/py/tf-keras-vis)
[![Python package](https://github.com/keisen/tf-keras-vis/actions/workflows/python-package.yml/badge.svg)](https://github.com/keisen/tf-keras-vis/actions/workflows/python-package.yml)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Documentation](https://img.shields.io/badge/api-reference-blue.svg)](https://keisen.github.io/tf-keras-vis-docs/)

## Web documents

https://keisen.github.io/tf-keras-vis-docs/

## Overview

tf-keras-vis is a visualization toolkit for debugging `tf.keras.Model` in TensorFlow 2.0+.
Currently supported methods for visualization include:

* Feature Visualization
   - ActivationMaximization ([web](https://distill.pub/2017/feature-visualization/), [github](https://github.com/raghakot/keras-vis))
* Class Activation Maps
   - GradCAM ([paper](https://arxiv.org/pdf/1610.02391v1.pdf))
   - GradCAM++ ([paper](https://arxiv.org/pdf/1710.11063.pdf))
   - ScoreCAM ([paper](https://arxiv.org/pdf/1910.01279.pdf), [github](https://github.com/haofanwang/Score-CAM))
   - Faster-ScoreCAM ([github](https://github.com/tabayashi0117/Score-CAM/blob/master/README.md#faster-score-cam))
   - LayerCAM ([paper](http://mftp.mmcheng.net/Papers/21TIP_LayerCAM.pdf), [github](https://github.com/PengtaoJiang/LayerCAM)) :new::zap:
* Saliency Maps
   - Vanilla Saliency ([paper](https://arxiv.org/pdf/1312.6034.pdf))
   - SmoothGrad ([paper](https://arxiv.org/pdf/1706.03825.pdf))

tf-keras-vis is designed to be lightweight, flexible, and easy to use. All visualization methods share the following features:

* Support for **N-dim image inputs**; not only 2D pictures but also, for example, 3D images.
* **Batch-wise** processing, so multiple input images can be processed efficiently.
* Support for models with **multiple inputs**, **multiple outputs**, or both.
* Support for **mixed-precision** models.

In addition, ActivationMaximization supports the optimizers built into tf.keras.

### Visualizations

#### Dense Unit

#### Convolutional Filter

#### Class Activation Map

The images above are generated by `GradCAM++`.

#### Saliency Map

The images above are generated by `SmoothGrad`.
## Usage

### ActivationMaximization (Visualizing Convolutional Filter)

```python
import tensorflow as tf
from tensorflow.keras.applications import VGG16
from matplotlib import pyplot as plt

from tf_keras_vis.activation_maximization import ActivationMaximization
from tf_keras_vis.activation_maximization.callbacks import Progress
from tf_keras_vis.activation_maximization.input_modifiers import Jitter, Rotate2D
from tf_keras_vis.activation_maximization.regularizers import TotalVariation2D, Norm
from tf_keras_vis.utils.model_modifiers import ExtractIntermediateLayer, ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create the visualization instance.
# All visualization classes accept a model and a model-modifier in their constructor,
# which, for example, replaces the activation of the last layer with a linear function.
activation_maximization = \
    ActivationMaximization(VGG16(),
                           model_modifier=[ExtractIntermediateLayer('block5_conv3'),
                                           ReplaceToLinear()],
                           clone=False)

# You can use a Score class to specify the visualization target,
# and add regularizers or input-modifiers as needed.
activations = \
    activation_maximization(CategoricalScore(FILTER_INDEX),
                            steps=200,
                            input_modifiers=[Jitter(jitter=16), Rotate2D(degree=1)],
                            regularizers=[TotalVariation2D(weight=1.0),
                                          Norm(weight=0.3, p=1)],
                            optimizer=tf.keras.optimizers.RMSprop(1.0, 0.999),
                            callbacks=[Progress()])

## Since v0.6.0, calling `astype()` is NOT necessary.
# activations = activations[0].astype(np.uint8)

# Render
plt.imshow(activations[0])
```

### GradCAM++

```python
import numpy as np
from matplotlib import pyplot as plt
from matplotlib import cm

from tf_keras_vis.gradcam_plus_plus import GradcamPlusPlus
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create the GradCAM++ object
gradcam = GradcamPlusPlus(YOUR_MODEL_INSTANCE,
                          model_modifier=ReplaceToLinear(),
                          clone=True)

# Generate a cam with GradCAM++
cam = gradcam(CategoricalScore(CATEGORICAL_INDEX), SEED_INPUT)

## Since v0.6.0, calling `normalize()` is NOT necessary.
# cam = normalize(cam)

plt.imshow(SEED_INPUT_IMAGE)
heatmap = np.uint8(cm.jet(cam[0])[..., :3] * 255)
plt.imshow(heatmap, cmap='jet', alpha=0.5)  # overlay
```

Please see the guides below for more details:

### Getting Started Guides

* [Saliency and CAMs](https://keisen.github.io/tf-keras-vis-docs/examples/attentions.html)
* [Visualize Dense Layer](https://keisen.github.io/tf-keras-vis-docs/examples/visualize_dense_layer.html)
* [Visualize Convolutional Filter](https://keisen.github.io/tf-keras-vis-docs/examples/visualize_conv_filters.html)

**[NOTES]**
If you have used [keras-vis](https://github.com/raghakot/keras-vis), tf-keras-vis may feel familiar. tf-keras-vis is in fact derived from keras-vis, and the visualization methods the two libraries provide are almost the same. Please note, however, that the tf-keras-vis APIs are NOT compatible with keras-vis.

## Requirements

* Python 3.7+
* TensorFlow 2.0+

## Installation

* PyPI

```bash
$ pip install tf-keras-vis tensorflow
```

* Source (for development)

```bash
$ git clone https://github.com/keisen/tf-keras-vis.git
$ cd tf-keras-vis
$ pip install -e .[develop] tensorflow
```

## Use Cases

* [chitra](https://github.com/aniketmaurya/chitra)
   * A Deep Learning Computer Vision library for easy data loading, model building, and model interpretation with GradCAM/GradCAM++.
## Known Issues

* With InceptionV3, ActivationMaximization doesn't work well; it may generate meaninglessly blurred images.
* With cascading models, GradCAM and GradCAM++ don't work well; they may raise errors. We recommend using Faster-ScoreCAM in this case.
* `channels-first` models and data are unsupported.

## ToDo

* Guides
   * Visualizing multiple attention or activation images at once, utilizing the batch processing of the model
   * Defining various score functions
   * Visualizing attentions with multiple-input models
   * Visualizing attentions with multiple-output models
   * Advanced score functions
   * Tuning ActivationMaximization
   * Visualizing attentions for N-dim image inputs
* We're going to add methods such as:
   - Deep Dream
   - Style transfer

%prep
%autosetup -n tf-keras-vis-0.8.5

%build
%py3_build

%install
%py3_install
install -d -m755 %{buildroot}/%{_pkgdocdir}
if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
pushd %{buildroot}
if [ -d usr/lib ]; then
	find usr/lib -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/lib64 ]; then
	find usr/lib64 -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/bin ]; then
	find usr/bin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
if [ -d usr/sbin ]; then
	find usr/sbin -type f -printf "\"/%h/%f\"\n" >> filelist.lst
fi
touch doclist.lst
if [ -d usr/share/man ]; then
	find usr/share/man -type f -printf "\"/%h/%f.gz\"\n" >> doclist.lst
fi
popd
mv %{buildroot}/filelist.lst .
mv %{buildroot}/doclist.lst .

%files -n python3-tf-keras-vis -f filelist.lst
%dir %{python3_sitelib}/*

%files help -f doclist.lst
%{_docdir}/*

%changelog
* Fri Jun 09 2023 Python_Bot - 0.8.5-1
- Package Spec generated