When it comes to live streaming, latency is of major importance. Stream latency is the delay between the camera capturing an event and that event being displayed to viewers. Depending on the streaming protocol used, this delay can range from less than a second up to 60 seconds.
Latency can affect the viewing experience in different ways. In an emergency, like a rescue mission in a collapsed building after an earthquake, a high-latency stream from a rescuer’s bodycam can lead to misleading instructions from the main station, which can result in even more harm. In a much less frightening example: you may hear your neighbor celebrate the match-winning goal almost a full minute before you even get to see it.
The rule is simple: the lower the latency, the closer the stream is to real time. So, what dictates latency? Streaming protocols: the standardized rules and methods that determine how live video and audio travel through a network.
Every streaming protocol breaks video into smaller pieces called “segments” so that they can be delivered to the end user for reassembly and playback, but each protocol comes with different strengths and trade-offs:
· WebRTC: offers ultra-low latency, down to 0.5 seconds, which is a great advantage for interaction between broadcasters and viewers in peer-to-peer streaming such as video calls with your boss or virtual classes. As you may have experienced during the pandemic, this super-fast transmission is easily affected by network fluctuations, and missed segments of information cannot be played back.
· HLS: is a streaming protocol developed by Apple as an HTML5-compatible replacement for Adobe’s Flash Player. This makes it widely supported across devices and browsers, which means great reach and scalability. HLS delivers top image and audio quality by cutting content into 10-second segments in the MPEG-TS (.ts) format, but can experience up to 45 seconds of latency (see the sample playlist after this list).
· MPEG-DASH: shares similar perks with HLS, such as multi-bitrate capability and a foundation on HTTP, but uses the (.mp4) format instead. Although DASH has lower latency than HLS, down to 6 seconds, it is supported mainly by Android devices (a sample manifest appears after this list).
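To make segment-based delivery concrete, here is a minimal sketch of an HLS media playlist (.m3u8) describing three 10-second MPEG-TS segments; the segment names are hypothetical:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment0.ts
    #EXTINF:10.0,
    segment1.ts
    #EXTINF:10.0,
    segment2.ts
    #EXT-X-ENDLIST

The player fetches the playlist, downloads the listed segments in order, and, for live streams, keeps refreshing the playlist. Classic HLS players also buffer roughly three segments before starting playback, which is a large part of why 10-second segments add up to such high latency.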
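DASH expresses the same idea in an XML manifest (.mpd). A minimal, hypothetical sketch describing 6-second fragmented-MP4 segments could look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"
         mediaPresentationDuration="PT18S"
         profiles="urn:mpeg:dash:profile:isoff-live:2011">
      <Period>
        <AdaptationSet mimeType="video/mp4" contentType="video">
          <!-- One Representation per bitrate; the player switches between them -->
          <Representation id="720p" bandwidth="3000000" codecs="avc1.64001f">
            <SegmentTemplate initialization="init.mp4"
                             media="chunk-$Number$.mp4"
                             duration="6" timescale="1" startNumber="1"/>
          </Representation>
        </AdaptationSet>
      </Period>
    </MPD>

The multi-bitrate capability mentioned above comes from listing several Representation elements at different bandwidths, so the player can switch quality mid-stream as network conditions change.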
For years, it seemed like broadcasters had to choose not only between low latency (WebRTC) and top quality (HLS/DASH), but also had to duplicate the streaming process to reach wider audiences. Every broadcaster had to encode and store the same video file twice, once for each media format, (.ts) and (.mp4). This requires additional storage, which makes the process more expensive, and the extra packaging step makes the workflow slower and less efficient.
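In practice, that duplication means packaging the same encoded video twice. A hedged sketch using ffmpeg (the input file name is hypothetical, and exact options vary across ffmpeg versions):

    # Package once for HLS: 10-second MPEG-TS (.ts) segments plus an .m3u8 playlist
    ffmpeg -i input.mp4 -c copy -f hls -hls_time 10 -hls_playlist_type vod hls/playlist.m3u8

    # Package the same video again for DASH: fragmented-MP4 (.mp4) segments plus an .mpd manifest
    ffmpeg -i input.mp4 -c copy -f dash -seg_duration 6 dash/manifest.mpd

Every rendition of every video now lives on disk twice, once as .ts segments and once as .mp4 segments.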
There must be a better way, right? Enter CMAF, which stands for Common Media Application Format. CMAF is a standard streaming format compatible with both the HLS and DASH protocols. So, instead of duplicating every single content segment, there is a single encoding, packaging, and storage step. This can help broadcasters reduce their costs by up to 75% [1].
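With CMAF, the fragmented-MP4 segments are packaged and stored once, and both an HLS playlist and a DASH manifest point at them. As a sketch, recent ffmpeg builds let the dash muxer emit an HLS playlist alongside the MPD (again assuming a hypothetical input file):

    # Package once as fragmented MP4; emit both a DASH .mpd and an HLS .m3u8
    # that reference the same segment files on disk
    ffmpeg -i input.mp4 -c copy -f dash -seg_duration 6 -hls_playlist 1 cmaf/manifest.mpd

HLS and DASH players now pull the same segment files, and that single set of stored segments is where the cost savings come from.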
[1] Streaming Media, “The State of CMAF: The Holy Grail or Just Another Format?”