author     CoprDistGit <infra@openeuler.org>  2023-05-05 10:14:43 +0000
committer  CoprDistGit <infra@openeuler.org>  2023-05-05 10:14:43 +0000
commit     097d0636307537b67ddb1f0aa1c972d5e8a2cf65 (patch)
tree       5a2d45a69f01f14abde61d988828e1c74db2077f
parent     78897cdf87a815d272ec59b79629059c3b4cfcf0 (diff)
automatic import of python-streamlit-webrtc (openeuler20.03)
-rw-r--r--  .gitignore                      1
-rw-r--r--  python-streamlit-webrtc.spec  529
-rw-r--r--  sources                         1
3 files changed, 531 insertions, 0 deletions
diff --git a/.gitignore b/.gitignore
index e69de29..cf54dee 100644
--- a/.gitignore
+++ b/.gitignore
@@ -0,0 +1 @@
+/streamlit_webrtc-0.45.0.tar.gz
diff --git a/python-streamlit-webrtc.spec b/python-streamlit-webrtc.spec
new file mode 100644
index 0000000..3016e84
--- /dev/null
+++ b/python-streamlit-webrtc.spec
@@ -0,0 +1,529 @@
+%global _empty_manifest_terminate_build 0
+Name: python-streamlit-webrtc
+Version: 0.45.0
+Release: 1
+Summary:	Real-time video and audio processing on Streamlit
+License: MIT
+URL: https://github.com/whitphx/streamlit-webrtc
+Source0: https://mirrors.nju.edu.cn/pypi/web/packages/41/1f/c82b7b3c72e5dc35217fed5a914d09214f9e0805e114614dbd63e4905fbb/streamlit_webrtc-0.45.0.tar.gz
+BuildArch: noarch
+
+Requires: python3-streamlit
+Requires: python3-aiortc
+Requires: python3-typing_extensions
+Requires: python3-packaging
+
+%description
+Create `app.py` with the content below.
+```py
+from streamlit_webrtc import webrtc_streamer
+webrtc_streamer(key="sample")
+```
+Unlike other Streamlit components, `webrtc_streamer()` requires the `key` argument as a unique identifier. Set an arbitrary string to it.
+Then run it with Streamlit and open http://localhost:8501/.
+```shell
+$ streamlit run app.py
+```
+You will see the app view; click the "START" button.
+Video and audio streaming then starts. If asked for permission to access the camera and microphone, allow it.
+![Basic example of streamlit-webrtc](./docs/images/streamlit_webrtc_basic.gif)
+Next, edit `app.py` as below and run it again.
+```py
+from streamlit_webrtc import webrtc_streamer
+import av
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ flipped = img[::-1,:,:]
+ return av.VideoFrame.from_ndarray(flipped, format="bgr24")
+webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+```
+Now the video is vertically flipped.
+![Vertically flipping example](./docs/images/streamlit_webrtc_flipped.gif)
+As in the example above, you can edit the video frames by defining a callback that receives and returns a frame and passing it to the `video_frame_callback` argument (or `audio_frame_callback` for audio manipulation).
+The input and output frames are instances of [`av.VideoFrame`](https://pyav.org/docs/develop/api/video.html#av.video.frame.VideoFrame) (or [`av.AudioFrame`](https://pyav.org/docs/develop/api/audio.html#av.audio.frame.AudioFrame) when dealing with audio) from the [`PyAV` library](https://pyav.org/).
+You can inject any kind of image (or audio) processing inside the callback.
+See the examples above for more applications.
+### Pass parameters to the callback
+You can also pass parameters to the callback.
+In the example below, a boolean `flip` flag is used to turn on/off the image flipping.
+```python
+import streamlit as st
+from streamlit_webrtc import webrtc_streamer
+import av
+flip = st.checkbox("Flip")
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ flipped = img[::-1,:,:] if flip else img
+ return av.VideoFrame.from_ndarray(flipped, format="bgr24")
+webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+```
+### Pull values from the callback
+Sometimes we want to read the values generated in the callback from the outer scope.
+Note that the callback is executed in a forked thread that runs independently of the main script, so we have to take care of the following points and use some implementation tricks, as in the example below (see also the section below on the callback's limitations due to multi-threading).
+* Thread-safety
+ * Passing values between the inside and outside of the callback must be thread-safe.
+* Using a loop to poll the values
+ * During media streaming, the callback continues to be called, but the main script finishes and stops at the bottom as usual. So we need a loop to keep the main script running and to read the values produced by the callback in the outer scope.
+The following example passes image frames from the callback to the outer scope and continuously processes them in a loop. In this example, a simple image analysis (calculating the histogram, as in [this OpenCV tutorial](https://docs.opencv.org/4.x/d1/db7/tutorial_py_histogram_begins.html)) is done on the image frames.
+[`threading.Lock`](https://docs.python.org/3/library/threading.html#lock-objects) is one standard way to control access to variables across threads.
+The dict `img_container` is a mutable container shared by the callback and the outer scope, and the `lock` object is used when assigning and reading values to/from the container, for thread-safety.
+```python
+import threading
+import cv2
+import streamlit as st
+from matplotlib import pyplot as plt
+from streamlit_webrtc import webrtc_streamer
+lock = threading.Lock()
+img_container = {"img": None}
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ with lock:
+ img_container["img"] = img
+ return frame
+ctx = webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+fig_place = st.empty()
+fig, ax = plt.subplots(1, 1)
+while ctx.state.playing:
+ with lock:
+ img = img_container["img"]
+ if img is None:
+ continue
+ gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
+ ax.cla()
+ ax.hist(gray.ravel(), 256, [0, 256])
+ fig_place.pyplot(fig)
+```
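+As an alternative to the lock-protected dict, a thread-safe `queue.Queue` can also be used to hand the latest frame from the callback to the main loop. The following is only a sketch of the same pulling pattern, not part of the original example; the key `"queue-example"` is an arbitrary placeholder.
+```python
+# Alternative sketch (not part of the original example): hand frames to the main
+# loop via a bounded, thread-safe queue instead of a lock-protected dict.
+import queue
+
+import streamlit as st
+from streamlit_webrtc import webrtc_streamer
+
+frame_queue: "queue.Queue" = queue.Queue(maxsize=1)
+
+def video_frame_callback(frame):
+    img = frame.to_ndarray(format="bgr24")
+    # Keep only the latest frame: drop the previous one if it was not consumed yet.
+    if frame_queue.full():
+        try:
+            frame_queue.get_nowait()
+        except queue.Empty:
+            pass
+    frame_queue.put(img)
+    return frame
+
+ctx = webrtc_streamer(key="queue-example", video_frame_callback=video_frame_callback)
+
+placeholder = st.empty()
+while ctx.state.playing:
+    try:
+        img = frame_queue.get(timeout=1.0)
+    except queue.Empty:
+        continue
+    placeholder.text(f"Latest frame shape: {img.shape}")
+```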
+## Callback limitations
+The callbacks are executed in forked threads different from the main one, so there are some limitations:
+* Streamlit methods (`st.*` such as `st.write()`) do not work inside the callbacks.
+* Variables inside the callbacks cannot be directly referred to from the outside.
+* The `global` keyword does not work as expected in the callbacks.
+* You have to take care of thread-safety when accessing the same objects from both inside and outside the callbacks, as stated in the section above.
+## Class-based callbacks
+Until v0.37, the class-based callbacks were the standard.
+See the [old version of the README](https://github.com/whitphx/streamlit-webrtc/blob/v0.37.0/README.md#quick-tutorial) about it.
+## Serving from remote host
+When deploying apps to remote servers, there are some things you need to be aware of.
+### HTTPS
+`streamlit-webrtc` uses [`getUserMedia()`](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia) API to access local media devices, and this method does not work in an insecure context.
+[This document](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia#privacy_and_security) says
+> A secure context is, in short, a page loaded using HTTPS or the file:/// URL scheme, or a page loaded from localhost.
+So, when hosting your app on a remote server, it must be served via HTTPS if it uses the webcam or microphone.
+If not, you will encounter an error when starting to use the device. On Chrome, for example, it looks like the following:
+> Error: navigator.mediaDevices is undefined. It seems the current document is not loaded securely.
+[Streamlit Cloud](https://streamlit.io/cloud) is a recommended way to serve over HTTPS. You can easily deploy Streamlit apps with it and, most importantly for this topic, it serves the apps via HTTPS automatically by default.
+### Configure the STUN server
+To deploy the app to the cloud, we have to configure the *STUN* server via the `rtc_configuration` argument on `webrtc_streamer()` like below.
+```python
+webrtc_streamer(
+ # ...
+ rtc_configuration={ # Add this config
+ "iceServers": [{"urls": ["stun:stun.l.google.com:19302"]}]
+ }
+ # ...
+)
+```
+This configuration is necessary to establish the media streaming connection when the server is on a remote host.
+`streamlit_webrtc` uses WebRTC for its video and audio streaming. It has to access a "STUN server" on the public network so that remote peers (more precisely, peers behind NATs) can establish WebRTC connections.
+We won't go into the details of STUN servers here; if you are interested, search for keywords such as STUN, TURN, or NAT traversal, or read these articles ([1](https://towardsdatascience.com/developing-web-based-real-time-video-audio-processing-apps-quickly-with-streamlit-7c7bcd0bc5a8#1cec), [2](https://dev.to/whitphx/python-webrtc-basics-with-aiortc-48id), [3](https://www.3cx.com/pbx/what-is-a-stun-server/)).
+The example above is configured to use `stun.l.google.com:19302`, which is a free STUN server provided by Google.
+You can also use any other STUN server.
+For example, [one user reported](https://github.com/whitphx/streamlit-webrtc/issues/283#issuecomment-889753789) that Google's STUN server had a huge delay when used from networks in China, and the problem was solved by switching to a different STUN server.
+For those familiar with the browser WebRTC API: the value of the `rtc_configuration` argument is passed to the [`RTCPeerConnection`](https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection/RTCPeerConnection) constructor on the frontend.
+### Configure the TURN server if necessary
+Even if the STUN server is properly configured, media streaming may not work in some network environments.
+For example, in some office or public networks, there are firewalls which drop the WebRTC packets.
+In such environments, setting up a [TURN server](https://webrtc.org/getting-started/turn-server) is a solution. See https://github.com/whitphx/streamlit-webrtc/issues/335#issuecomment-897326755.
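+If you have a TURN server available, it can be added to the same `rtc_configuration` argument alongside the STUN entry. The sketch below follows the standard ice-server configuration format; the hostname, username, and credential are placeholders, not real values.
+```python
+# Hypothetical sketch: the TURN host and credentials below are placeholders.
+webrtc_streamer(
+    key="turn-example",
+    rtc_configuration={
+        "iceServers": [
+            {"urls": ["stun:stun.l.google.com:19302"]},
+            {
+                "urls": ["turn:turn.example.com:3478"],
+                "username": "example-user",
+                "credential": "example-password",
+            },
+        ]
+    },
+)
+```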
+## Logging
+For logging, this library uses the standard `logging` module and follows the practice described in [the official logging tutorial](https://docs.python.org/3/howto/logging.html#advanced-logging-tutorial), so the logger names are the same as the module names: `streamlit_webrtc` or `streamlit_webrtc.*`.
+You can get the logger instance with `logging.getLogger("streamlit_webrtc")` and use it to control the logs emitted by this library.
+For example, to set this library's log level to WARNING, use the following code.
+```python
+import logging
+
+st_webrtc_logger = logging.getLogger("streamlit_webrtc")
+st_webrtc_logger.setLevel(logging.WARNING)
+```
+In practice, `aiortc`, a third-party package this library uses internally, also emits many INFO-level logs, and you may want to control those too (the example below targets its `aioice` dependency).
+You can do so in the same way:
+```python
+aioice_logger = logging.getLogger("aioice")
+aioice_logger.setLevel(logging.WARNING)
+```
+## API changes
+Currently there is no documentation about the interface. See the examples in [./pages/*.py](./pages) for the usage.
+The API is not finalized yet and can be changed without backward compatibility in the future releases until v1.0.
+### For users since versions `<0.20`
+`VideoTransformerBase` and its `transform` method were marked as deprecated in v0.20.0. Please use `VideoProcessorBase#recv()` instead.
+Note that the signature of `recv` differs from that of `transform`: `recv` has to return an instance of `av.VideoFrame` or `av.AudioFrame`.
+Also, `webrtc_streamer()`'s `video_transformer_factory` and `async_transform` arguments are deprecated; use `video_processor_factory` and `async_processing` instead, respectively.
+See the samples in [app.py](./app.py) for their usage.
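+For orientation, here is a minimal sketch of what the class-based API looks like with the non-deprecated names. It mirrors the flipping callback above and is an illustration, not a copy of the samples in app.py.
+```python
+import av
+from streamlit_webrtc import VideoProcessorBase, webrtc_streamer
+
+class FlipProcessor(VideoProcessorBase):
+    # recv receives an av.VideoFrame and must return one as well.
+    def recv(self, frame: av.VideoFrame) -> av.VideoFrame:
+        img = frame.to_ndarray(format="bgr24")
+        return av.VideoFrame.from_ndarray(img[::-1, :, :], format="bgr24")
+
+webrtc_streamer(
+    key="class-based-example",
+    video_processor_factory=FlipProcessor,
+    async_processing=True,
+)
+```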
+## Resources
+* [Developing web-based real-time video/audio processing apps quickly with Streamlit](https://www.whitphx.info/posts/20211231-streamlit-webrtc-video-app-tutorial/)
+ * A tutorial for real-time video app development using `streamlit-webrtc`.
+ * Crosspost on dev.to: https://dev.to/whitphx/developing-web-based-real-time-videoaudio-processing-apps-quickly-with-streamlit-4k89
+* [New Component: streamlit-webrtc, a new way to deal with real-time media streams (Streamlit Community)](https://discuss.streamlit.io/t/new-component-streamlit-webrtc-a-new-way-to-deal-with-real-time-media-streams/8669)
+ * A forum topic where `streamlit-webrtc` was introduced and discussed.
+## Support the project
+[![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/D1D2ERWFG)
+<a href="https://www.buymeacoffee.com/whitphx" target="_blank" rel="noreferrer"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" width="180" height="50" ></a>
+[![GitHub Sponsors](https://img.shields.io/github/sponsors/whitphx?label=Sponsor%20me%20on%20GitHub%20Sponsors&style=social)](https://github.com/sponsors/whitphx)
+
+%package -n python3-streamlit-webrtc
+Summary:	Real-time video and audio processing on Streamlit
+Provides: python-streamlit-webrtc
+BuildRequires: python3-devel
+BuildRequires: python3-setuptools
+BuildRequires: python3-pip
+%description -n python3-streamlit-webrtc
+Create `app.py` with the content below.
+```py
+from streamlit_webrtc import webrtc_streamer
+webrtc_streamer(key="sample")
+```
+Unlike other Streamlit components, `webrtc_streamer()` requires the `key` argument as a unique identifier. Set an arbitrary string to it.
+Then run it with Streamlit and open http://localhost:8501/.
+```shell
+$ streamlit run app.py
+```
+You will see the app view; click the "START" button.
+Video and audio streaming then starts. If asked for permission to access the camera and microphone, allow it.
+![Basic example of streamlit-webrtc](./docs/images/streamlit_webrtc_basic.gif)
+Next, edit `app.py` as below and run it again.
+```py
+from streamlit_webrtc import webrtc_streamer
+import av
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ flipped = img[::-1,:,:]
+ return av.VideoFrame.from_ndarray(flipped, format="bgr24")
+webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+```
+Now the video is vertically flipped.
+![Vertically flipping example](./docs/images/streamlit_webrtc_flipped.gif)
+As in the example above, you can edit the video frames by defining a callback that receives and returns a frame and passing it to the `video_frame_callback` argument (or `audio_frame_callback` for audio manipulation).
+The input and output frames are instances of [`av.VideoFrame`](https://pyav.org/docs/develop/api/video.html#av.video.frame.VideoFrame) (or [`av.AudioFrame`](https://pyav.org/docs/develop/api/audio.html#av.audio.frame.AudioFrame) when dealing with audio) from the [`PyAV` library](https://pyav.org/).
+You can inject any kind of image (or audio) processing inside the callback.
+See the examples above for more applications.
+### Pass parameters to the callback
+You can also pass parameters to the callback.
+In the example below, a boolean `flip` flag is used to turn on/off the image flipping.
+```python
+import streamlit as st
+from streamlit_webrtc import webrtc_streamer
+import av
+flip = st.checkbox("Flip")
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ flipped = img[::-1,:,:] if flip else img
+ return av.VideoFrame.from_ndarray(flipped, format="bgr24")
+webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+```
+### Pull values from the callback
+Sometimes we want to read the values generated in the callback from the outer scope.
+Note that the callback is executed in a forked thread that runs independently of the main script, so we have to take care of the following points and use some implementation tricks, as in the example below (see also the section below on the callback's limitations due to multi-threading).
+* Thread-safety
+ * Passing values between the inside and outside of the callback must be thread-safe.
+* Using a loop to poll the values
+ * During media streaming, the callback continues to be called, but the main script finishes and stops at the bottom as usual. So we need a loop to keep the main script running and to read the values produced by the callback in the outer scope.
+The following example passes image frames from the callback to the outer scope and continuously processes them in a loop. In this example, a simple image analysis (calculating the histogram, as in [this OpenCV tutorial](https://docs.opencv.org/4.x/d1/db7/tutorial_py_histogram_begins.html)) is done on the image frames.
+[`threading.Lock`](https://docs.python.org/3/library/threading.html#lock-objects) is one standard way to control access to variables across threads.
+The dict `img_container` is a mutable container shared by the callback and the outer scope, and the `lock` object is used when assigning and reading values to/from the container, for thread-safety.
+```python
+import threading
+import cv2
+import streamlit as st
+from matplotlib import pyplot as plt
+from streamlit_webrtc import webrtc_streamer
+lock = threading.Lock()
+img_container = {"img": None}
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ with lock:
+ img_container["img"] = img
+ return frame
+ctx = webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+fig_place = st.empty()
+fig, ax = plt.subplots(1, 1)
+while ctx.state.playing:
+ with lock:
+ img = img_container["img"]
+ if img is None:
+ continue
+ gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
+ ax.cla()
+ ax.hist(gray.ravel(), 256, [0, 256])
+ fig_place.pyplot(fig)
+```
+## Callback limitations
+The callbacks are executed in forked threads different from the main one, so there are some limitations:
+* Streamlit methods (`st.*` such as `st.write()`) do not work inside the callbacks.
+* Variables inside the callbacks cannot be directly referred to from the outside.
+* The `global` keyword does not work as expected in the callbacks.
+* You have to take care of thread-safety when accessing the same objects from both inside and outside the callbacks, as stated in the section above.
+## Class-based callbacks
+Until v0.37, the class-based callbacks were the standard.
+See the [old version of the README](https://github.com/whitphx/streamlit-webrtc/blob/v0.37.0/README.md#quick-tutorial) about it.
+## Serving from remote host
+When deploying apps to remote servers, there are some things you need to be aware of.
+### HTTPS
+`streamlit-webrtc` uses [`getUserMedia()`](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia) API to access local media devices, and this method does not work in an insecure context.
+[This document](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia#privacy_and_security) says
+> A secure context is, in short, a page loaded using HTTPS or the file:/// URL scheme, or a page loaded from localhost.
+So, when hosting your app on a remote server, it must be served via HTTPS if it uses the webcam or microphone.
+If not, you will encounter an error when starting to use the device. On Chrome, for example, it looks like the following:
+> Error: navigator.mediaDevices is undefined. It seems the current document is not loaded securely.
+[Streamlit Cloud](https://streamlit.io/cloud) is a recommended way to serve over HTTPS. You can easily deploy Streamlit apps with it and, most importantly for this topic, it serves the apps via HTTPS automatically by default.
+### Configure the STUN server
+To deploy the app to the cloud, we have to configure the *STUN* server via the `rtc_configuration` argument on `webrtc_streamer()` like below.
+```python
+webrtc_streamer(
+ # ...
+ rtc_configuration={ # Add this config
+ "iceServers": [{"urls": ["stun:stun.l.google.com:19302"]}]
+ }
+ # ...
+)
+```
+This configuration is necessary to establish the media streaming connection when the server is on a remote host.
+`streamlit_webrtc` uses WebRTC for its video and audio streaming. It has to access a "STUN server" on the public network so that remote peers (more precisely, peers behind NATs) can establish WebRTC connections.
+We won't go into the details of STUN servers here; if you are interested, search for keywords such as STUN, TURN, or NAT traversal, or read these articles ([1](https://towardsdatascience.com/developing-web-based-real-time-video-audio-processing-apps-quickly-with-streamlit-7c7bcd0bc5a8#1cec), [2](https://dev.to/whitphx/python-webrtc-basics-with-aiortc-48id), [3](https://www.3cx.com/pbx/what-is-a-stun-server/)).
+The example above is configured to use `stun.l.google.com:19302`, which is a free STUN server provided by Google.
+You can also use any other STUN server.
+For example, [one user reported](https://github.com/whitphx/streamlit-webrtc/issues/283#issuecomment-889753789) that Google's STUN server had a huge delay when used from networks in China, and the problem was solved by switching to a different STUN server.
+For those familiar with the browser WebRTC API: the value of the `rtc_configuration` argument is passed to the [`RTCPeerConnection`](https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection/RTCPeerConnection) constructor on the frontend.
+### Configure the TURN server if necessary
+Even if the STUN server is properly configured, media streaming may not work in some network environments.
+For example, in some office or public networks, there are firewalls which drop the WebRTC packets.
+In such environments, setting up a [TURN server](https://webrtc.org/getting-started/turn-server) is a solution. See https://github.com/whitphx/streamlit-webrtc/issues/335#issuecomment-897326755.
+## Logging
+For logging, this library uses the standard `logging` module and follows the practice described in [the official logging tutorial](https://docs.python.org/3/howto/logging.html#advanced-logging-tutorial), so the logger names are the same as the module names: `streamlit_webrtc` or `streamlit_webrtc.*`.
+You can get the logger instance with `logging.getLogger("streamlit_webrtc")` and use it to control the logs emitted by this library.
+For example, to set this library's log level to WARNING, use the following code.
+```python
+import logging
+
+st_webrtc_logger = logging.getLogger("streamlit_webrtc")
+st_webrtc_logger.setLevel(logging.WARNING)
+```
+In practice, `aiortc`, a third-party package this library uses internally, also emits many INFO-level logs, and you may want to control those too (the example below targets its `aioice` dependency).
+You can do so in the same way:
+```python
+aioice_logger = logging.getLogger("aioice")
+aioice_logger.setLevel(logging.WARNING)
+```
+## API changes
+Currently there is no documentation about the interface. See the examples in [./pages/*.py](./pages) for the usage.
+The API is not finalized yet and can be changed without backward compatibility in the future releases until v1.0.
+### For users since versions `<0.20`
+`VideoTransformerBase` and its `transform` method were marked as deprecated in v0.20.0. Please use `VideoProcessorBase#recv()` instead.
+Note that the signature of `recv` differs from that of `transform`: `recv` has to return an instance of `av.VideoFrame` or `av.AudioFrame`.
+Also, `webrtc_streamer()`'s `video_transformer_factory` and `async_transform` arguments are deprecated; use `video_processor_factory` and `async_processing` instead, respectively.
+See the samples in [app.py](./app.py) for their usage.
+## Resources
+* [Developing web-based real-time video/audio processing apps quickly with Streamlit](https://www.whitphx.info/posts/20211231-streamlit-webrtc-video-app-tutorial/)
+ * A tutorial for real-time video app development using `streamlit-webrtc`.
+ * Crosspost on dev.to: https://dev.to/whitphx/developing-web-based-real-time-videoaudio-processing-apps-quickly-with-streamlit-4k89
+* [New Component: streamlit-webrtc, a new way to deal with real-time media streams (Streamlit Community)](https://discuss.streamlit.io/t/new-component-streamlit-webrtc-a-new-way-to-deal-with-real-time-media-streams/8669)
+ * A forum topic where `streamlit-webrtc` was introduced and discussed.
+## Support the project
+[![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/D1D2ERWFG)
+<a href="https://www.buymeacoffee.com/whitphx" target="_blank" rel="noreferrer"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" width="180" height="50" ></a>
+[![GitHub Sponsors](https://img.shields.io/github/sponsors/whitphx?label=Sponsor%20me%20on%20GitHub%20Sponsors&style=social)](https://github.com/sponsors/whitphx)
+
+%package help
+Summary: Development documents and examples for streamlit-webrtc
+Provides: python3-streamlit-webrtc-doc
+%description help
+Create `app.py` with the content below.
+```py
+from streamlit_webrtc import webrtc_streamer
+webrtc_streamer(key="sample")
+```
+Unlike other Streamlit components, `webrtc_streamer()` requires the `key` argument as a unique identifier. Set an arbitrary string to it.
+Then run it with Streamlit and open http://localhost:8501/.
+```shell
+$ streamlit run app.py
+```
+You will see the app view; click the "START" button.
+Video and audio streaming then starts. If asked for permission to access the camera and microphone, allow it.
+![Basic example of streamlit-webrtc](./docs/images/streamlit_webrtc_basic.gif)
+Next, edit `app.py` as below and run it again.
+```py
+from streamlit_webrtc import webrtc_streamer
+import av
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ flipped = img[::-1,:,:]
+ return av.VideoFrame.from_ndarray(flipped, format="bgr24")
+webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+```
+Now the video is vertically flipped.
+![Vertically flipping example](./docs/images/streamlit_webrtc_flipped.gif)
+As in the example above, you can edit the video frames by defining a callback that receives and returns a frame and passing it to the `video_frame_callback` argument (or `audio_frame_callback` for audio manipulation).
+The input and output frames are instances of [`av.VideoFrame`](https://pyav.org/docs/develop/api/video.html#av.video.frame.VideoFrame) (or [`av.AudioFrame`](https://pyav.org/docs/develop/api/audio.html#av.audio.frame.AudioFrame) when dealing with audio) from the [`PyAV` library](https://pyav.org/).
+You can inject any kind of image (or audio) processing inside the callback.
+See the examples above for more applications.
+### Pass parameters to the callback
+You can also pass parameters to the callback.
+In the example below, a boolean `flip` flag is used to turn on/off the image flipping.
+```python
+import streamlit as st
+from streamlit_webrtc import webrtc_streamer
+import av
+flip = st.checkbox("Flip")
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ flipped = img[::-1,:,:] if flip else img
+ return av.VideoFrame.from_ndarray(flipped, format="bgr24")
+webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+```
+### Pull values from the callback
+Sometimes we want to read the values generated in the callback from the outer scope.
+Note that the callback is executed in a forked thread that runs independently of the main script, so we have to take care of the following points and use some implementation tricks, as in the example below (see also the section below on the callback's limitations due to multi-threading).
+* Thread-safety
+ * Passing values between the inside and outside of the callback must be thread-safe.
+* Using a loop to poll the values
+ * During media streaming, the callback continues to be called, but the main script finishes and stops at the bottom as usual. So we need a loop to keep the main script running and to read the values produced by the callback in the outer scope.
+The following example passes image frames from the callback to the outer scope and continuously processes them in a loop. In this example, a simple image analysis (calculating the histogram, as in [this OpenCV tutorial](https://docs.opencv.org/4.x/d1/db7/tutorial_py_histogram_begins.html)) is done on the image frames.
+[`threading.Lock`](https://docs.python.org/3/library/threading.html#lock-objects) is one standard way to control access to variables across threads.
+The dict `img_container` is a mutable container shared by the callback and the outer scope, and the `lock` object is used when assigning and reading values to/from the container, for thread-safety.
+```python
+import threading
+import cv2
+import streamlit as st
+from matplotlib import pyplot as plt
+from streamlit_webrtc import webrtc_streamer
+lock = threading.Lock()
+img_container = {"img": None}
+def video_frame_callback(frame):
+ img = frame.to_ndarray(format="bgr24")
+ with lock:
+ img_container["img"] = img
+ return frame
+ctx = webrtc_streamer(key="example", video_frame_callback=video_frame_callback)
+fig_place = st.empty()
+fig, ax = plt.subplots(1, 1)
+while ctx.state.playing:
+ with lock:
+ img = img_container["img"]
+ if img is None:
+ continue
+ gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
+ ax.cla()
+ ax.hist(gray.ravel(), 256, [0, 256])
+ fig_place.pyplot(fig)
+```
+## Callback limitations
+The callbacks are executed in forked threads different from the main one, so there are some limitations:
+* Streamlit methods (`st.*` such as `st.write()`) do not work inside the callbacks.
+* Variables inside the callbacks cannot be directly referred to from the outside.
+* The `global` keyword does not work as expected in the callbacks.
+* You have to take care of thread-safety when accessing the same objects from both inside and outside the callbacks, as stated in the section above.
+## Class-based callbacks
+Until v0.37, the class-based callbacks were the standard.
+See the [old version of the README](https://github.com/whitphx/streamlit-webrtc/blob/v0.37.0/README.md#quick-tutorial) about it.
+## Serving from remote host
+When deploying apps to remote servers, there are some things you need to be aware of.
+### HTTPS
+`streamlit-webrtc` uses [`getUserMedia()`](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia) API to access local media devices, and this method does not work in an insecure context.
+[This document](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia#privacy_and_security) says
+> A secure context is, in short, a page loaded using HTTPS or the file:/// URL scheme, or a page loaded from localhost.
+So, when hosting your app on a remote server, it must be served via HTTPS if it uses the webcam or microphone.
+If not, you will encounter an error when starting to use the device. On Chrome, for example, it looks like the following:
+> Error: navigator.mediaDevices is undefined. It seems the current document is not loaded securely.
+[Streamlit Cloud](https://streamlit.io/cloud) is a recommended way to serve over HTTPS. You can easily deploy Streamlit apps with it and, most importantly for this topic, it serves the apps via HTTPS automatically by default.
+### Configure the STUN server
+To deploy the app to the cloud, we have to configure the *STUN* server via the `rtc_configuration` argument on `webrtc_streamer()` like below.
+```python
+webrtc_streamer(
+ # ...
+ rtc_configuration={ # Add this config
+ "iceServers": [{"urls": ["stun:stun.l.google.com:19302"]}]
+ }
+ # ...
+)
+```
+This configuration is necessary to establish the media streaming connection when the server is on a remote host.
+`streamlit_webrtc` uses WebRTC for its video and audio streaming. It has to access a "STUN server" on the public network so that remote peers (more precisely, peers behind NATs) can establish WebRTC connections.
+We won't go into the details of STUN servers here; if you are interested, search for keywords such as STUN, TURN, or NAT traversal, or read these articles ([1](https://towardsdatascience.com/developing-web-based-real-time-video-audio-processing-apps-quickly-with-streamlit-7c7bcd0bc5a8#1cec), [2](https://dev.to/whitphx/python-webrtc-basics-with-aiortc-48id), [3](https://www.3cx.com/pbx/what-is-a-stun-server/)).
+The example above is configured to use `stun.l.google.com:19302`, which is a free STUN server provided by Google.
+You can also use any other STUN server.
+For example, [one user reported](https://github.com/whitphx/streamlit-webrtc/issues/283#issuecomment-889753789) that Google's STUN server had a huge delay when used from networks in China, and the problem was solved by switching to a different STUN server.
+For those familiar with the browser WebRTC API: the value of the `rtc_configuration` argument is passed to the [`RTCPeerConnection`](https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection/RTCPeerConnection) constructor on the frontend.
+### Configure the TURN server if necessary
+Even if the STUN server is properly configured, media streaming may not work in some network environments.
+For example, in some office or public networks, there are firewalls which drop the WebRTC packets.
+In such environments, setting up a [TURN server](https://webrtc.org/getting-started/turn-server) is a solution. See https://github.com/whitphx/streamlit-webrtc/issues/335#issuecomment-897326755.
+## Logging
+For logging, this library uses the standard `logging` module and follows the practice described in [the official logging tutorial](https://docs.python.org/3/howto/logging.html#advanced-logging-tutorial), so the logger names are the same as the module names: `streamlit_webrtc` or `streamlit_webrtc.*`.
+You can get the logger instance with `logging.getLogger("streamlit_webrtc")` and use it to control the logs emitted by this library.
+For example, to set this library's log level to WARNING, use the following code.
+```python
+import logging
+
+st_webrtc_logger = logging.getLogger("streamlit_webrtc")
+st_webrtc_logger.setLevel(logging.WARNING)
+```
+In practice, `aiortc`, a third-party package this library uses internally, also emits many INFO-level logs, and you may want to control those too (the example below targets its `aioice` dependency).
+You can do so in the same way:
+```python
+aioice_logger = logging.getLogger("aioice")
+aioice_logger.setLevel(logging.WARNING)
+```
+## API changes
+Currently there is no documentation about the interface. See the examples in [./pages/*.py](./pages) for the usage.
+The API is not finalized yet and can be changed without backward compatibility in the future releases until v1.0.
+### For users since versions `<0.20`
+`VideoTransformerBase` and its `transform` method were marked as deprecated in v0.20.0. Please use `VideoProcessorBase#recv()` instead.
+Note that the signature of `recv` differs from that of `transform`: `recv` has to return an instance of `av.VideoFrame` or `av.AudioFrame`.
+Also, `webrtc_streamer()`'s `video_transformer_factory` and `async_transform` arguments are deprecated; use `video_processor_factory` and `async_processing` instead, respectively.
+See the samples in [app.py](./app.py) for their usage.
+## Resources
+* [Developing web-based real-time video/audio processing apps quickly with Streamlit](https://www.whitphx.info/posts/20211231-streamlit-webrtc-video-app-tutorial/)
+ * A tutorial for real-time video app development using `streamlit-webrtc`.
+ * Crosspost on dev.to: https://dev.to/whitphx/developing-web-based-real-time-videoaudio-processing-apps-quickly-with-streamlit-4k89
+* [New Component: streamlit-webrtc, a new way to deal with real-time media streams (Streamlit Community)](https://discuss.streamlit.io/t/new-component-streamlit-webrtc-a-new-way-to-deal-with-real-time-media-streams/8669)
+ * A forum topic where `streamlit-webrtc` was introduced and discussed.
+## Support the project
+[![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/D1D2ERWFG)
+<a href="https://www.buymeacoffee.com/whitphx" target="_blank" rel="noreferrer"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" width="180" height="50" ></a>
+[![GitHub Sponsors](https://img.shields.io/github/sponsors/whitphx?label=Sponsor%20me%20on%20GitHub%20Sponsors&style=social)](https://github.com/sponsors/whitphx)
+
+%prep
+%autosetup -n streamlit-webrtc-0.45.0
+
+%build
+%py3_build
+
+%install
+%py3_install
+install -d -m755 %{buildroot}/%{_pkgdocdir}
+if [ -d doc ]; then cp -arf doc %{buildroot}/%{_pkgdocdir}; fi
+if [ -d docs ]; then cp -arf docs %{buildroot}/%{_pkgdocdir}; fi
+if [ -d example ]; then cp -arf example %{buildroot}/%{_pkgdocdir}; fi
+if [ -d examples ]; then cp -arf examples %{buildroot}/%{_pkgdocdir}; fi
+pushd %{buildroot}
+if [ -d usr/lib ]; then
+ find usr/lib -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/lib64 ]; then
+ find usr/lib64 -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/bin ]; then
+ find usr/bin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+if [ -d usr/sbin ]; then
+ find usr/sbin -type f -printf "/%h/%f\n" >> filelist.lst
+fi
+touch doclist.lst
+if [ -d usr/share/man ]; then
+ find usr/share/man -type f -printf "/%h/%f.gz\n" >> doclist.lst
+fi
+popd
+mv %{buildroot}/filelist.lst .
+mv %{buildroot}/doclist.lst .
+
+%files -n python3-streamlit-webrtc -f filelist.lst
+%dir %{python3_sitelib}/*
+
+%files help -f doclist.lst
+%{_docdir}/*
+
+%changelog
+* Fri May 05 2023 Python_Bot <Python_Bot@openeuler.org> - 0.45.0-1
+- Package Spec generated
diff --git a/sources b/sources
new file mode 100644
index 0000000..45b636f
--- /dev/null
+++ b/sources
@@ -0,0 +1 @@
+750822d89b2c3f094a134bd20b30df55 streamlit_webrtc-0.45.0.tar.gz