Screenshot of an OBS stream showing an observed latency of 120 milliseconds.
News recently broke that support for WebRTC has been added to the OBS Studio codebase. WebRTC can be used instead of the RTMP protocol for video streaming and, in P2P mode, can deliver content directly to the viewer's browser without an intermediate server.
For those unfamiliar with this software: OBS Studio is an application for broadcasting, compositing, and recording video. The development goal of OBS Studio is to create a free version of the Open Broadcaster Software application that is not tied to the Windows platform, supports OpenGL, and is extensible through plugins.
Among the reasons given for implementing WebRTC support is that it achieves latency of less than one second, and that this figure can be reduced even further in the future:
"With our initial measurements, we see ~120 milliseconds from broadcast to playback. We believe we can continue to reduce this number. OBS users can now create interactive experiences with their viewers. This would allow talk shows and other productions that require conversational latency to use OBS."
Another reason mentioned is the desire to remove support for the FTL protocol, which was created for Mixer. While convenient in its day, the protocol has been abandoned, so it no longer makes sense for the developers to maintain it. Adding WebRTC provides the required low latency, along with the many benefits inherent in the WebRTC stack, such as encryption, network topology strategies, and robust congestion control.
In addition, it is highlighted that WebRTC allows broadcasters to upload multiple streams of different quality; for example, OBS users could upload 'high', 'medium', and 'low' streams themselves.
It also enables sending video from OBS directly to users, since WebRTC makes it possible to establish a P2P connection.
The WebRTC implementation is based on the libdatachannel library, written in C++. In its current form, only streaming (video output) over WebRTC is supported, and a service is provided with support for the WHIP protocol, which is used to establish sessions between the WebRTC server and the client. The code to support WebRTC as a source is currently under review.
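To illustrate how WHIP establishes a session, here is a minimal sketch of the request a WHIP client assembles. The endpoint URL, token, and SDP string are hypothetical placeholders; a real client such as OBS generates the SDP offer from its WebRTC stack (libdatachannel in this case).

```python
def build_whip_request(endpoint, sdp_offer, token=None):
    """Assemble the HTTP POST that starts a WHIP session.

    Per the WHIP specification, the client POSTs its SDP offer with
    Content-Type: application/sdp (optionally with a bearer token).
    The server replies 201 Created, returning the SDP answer in the
    body and a session URL in the Location header, which the client
    later DELETEs to end the broadcast.
    """
    headers = {"Content-Type": "application/sdp"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    return {"method": "POST", "url": endpoint, "headers": headers, "body": sdp_offer}


# Hypothetical usage: endpoint and offer are placeholders, not real values.
req = build_whip_request("https://example.com/whip/endpoint", "v=0\r\n...", token="secret")
print(req["method"], req["headers"]["Content-Type"])
```

The appeal of WHIP is precisely this simplicity: session setup is a single HTTP round trip, rather than a bespoke signaling channel.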
WebRTC stands out because it reduces video-delivery delays to a fraction of a second, making it possible to create interactive content and interact with viewers in real time, for example when hosting a talk show. Using WebRTC, you can switch between networks without interrupting the stream (for example, from Wi-Fi to a mobile network) and transmit several video streams within a single session, for example to film from different angles or to organize interactive videos.
WebRTC also allows uploading multiple already-transcoded versions of a stream at different quality levels, serving viewers with different channel bandwidths without performing transcoding work on the server side. Different video codecs, such as H.265 and AV1, can be used to reduce bandwidth requirements.
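The idea above can be sketched as a quality ladder that the broadcaster encodes and uploads, from which the server picks a rendition per viewer without re-encoding anything. The names, resolutions, and bitrates below are illustrative assumptions, not OBS's actual settings.

```python
# Hypothetical quality ladder the broadcaster encodes locally and
# uploads in parallel (simulcast); values are illustrative only.
SIMULCAST_LADDER = [
    {"rid": "high",   "width": 1920, "height": 1080, "max_bitrate_kbps": 6000},
    {"rid": "medium", "width": 1280, "height": 720,  "max_bitrate_kbps": 2500},
    {"rid": "low",    "width": 640,  "height": 360,  "max_bitrate_kbps": 800},
]

def pick_rendition(viewer_bandwidth_kbps, ladder=SIMULCAST_LADDER):
    """Choose the best rendition that fits the viewer's bandwidth.

    The server simply forwards the already-encoded stream it selects,
    so no transcoding happens on the server side.
    """
    for rung in ladder:  # ladder is ordered high -> low
        if rung["max_bitrate_kbps"] <= viewer_bandwidth_kbps:
            return rung["rid"]
    return ladder[-1]["rid"]  # fall back to the lowest quality
```

This is why simulcast shifts the encoding cost to the broadcaster's machine: the server only needs enough logic to route one of the pre-encoded streams to each viewer.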
Finally, those interested in learning more can consult the details at the following link.
For those interested in testing WebRTC in their streams, it is currently proposed to use Broadcast Box as a reference server implementation for WebRTC-based broadcasts; however, for broadcasting to a small audience, you can do without a server by configuring P2P mode.
Information about the implementation, as well as configuration instructions, can be found at the following link.
Last but not least, it is worth mentioning that the implementation is expected to appear in upcoming releases of OBS Studio.