One of the articles that caught our eye at the end of last year was a piece in the Times by its Chief Sports Writer, Martyn Ziegler, discussing the problem of slow streaming and latency when watching live sports. It’s a problem many people have suffered from in recent years, and complaints are only increasing as more live event coverage moves to streaming platforms, whether through new rights deals or as part of increasingly comprehensive coverage of large-scale tournaments and events like the Olympics.
"It is the moment of frustration that is all too familiar to viewers watching live football streamed on Amazon Prime or Now TV. A glance down at Twitter and up pops a tweet reporting a goal in the match you are watching, yet up to a minute can pass before the action unfolds on the screen.”
However, it doesn’t have to be like this! At Red Bee, we’ve developed extremely low-latency capabilities based on open standards that let us get a streamed event to a viewer faster (yes, faster) than the typical broadcast signal, using existing technology and infrastructure. The action reaches them faster than any tweet; so fast, in fact, that we might have to introduce a little artificial lag to synchronize the stream with everything else.
The global average for a video signal to go from the camera to the TV screen is about seven seconds. Martyn’s piece talks about a streaming signal taking anything from 15 seconds to one minute, but with the increased use of Adaptive Bit Rate (ABR) streaming, which varies the quality of the media stream depending on the available bandwidth and decoding processor speed at the user end, this is closer to the 15-second mark nowadays.
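To illustrate the adaptive part in the simplest possible terms, here is a minimal sketch (in Python, with a made-up bitrate ladder, not the settings of any real player) of how an ABR player might pick a rendition from the measured network throughput:

```python
# Minimal sketch of ABR rendition selection. Illustrative only: real players
# use more sophisticated throughput and buffer models than this.

# Hypothetical bitrate ladder, in kilobits per second.
BITRATE_LADDER_KBPS = [400, 800, 1600, 3200, 6000]

def pick_rendition(measured_throughput_kbps: float, safety_factor: float = 0.8) -> int:
    """Return the highest bitrate that fits within a safety margin of the
    measured throughput, falling back to the lowest rung."""
    budget = measured_throughput_kbps * safety_factor
    affordable = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return max(affordable) if affordable else BITRATE_LADDER_KBPS[0]

print(pick_rendition(5000))  # -> 3200: the 6000 kbps rung does not fit the budget
```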
There are proprietary technologies that can go faster than this, but they are expensive to implement and don’t always use the open internet, making them prohibitive for consumers.
The point is this: 15 seconds is too slow. At Red Bee, we decided to see what we could do about it. Using a technology called chunked CMAF (Common Media Application Format), in tests we’ve not only got the streaming signal below 15 seconds, we’ve got it below the seven seconds the traditional broadcast signal takes to reach the viewer. In fact, we’ve gone better than that: we’ve got it down to 3.5 seconds. This innovation was recognised no fewer than three times in 2019 (CSI Awards, SportsPro OTT Awards and VideoTech Innovation Awards).
How did we do it? The older technologies were developed for slower internet speeds and less powerful CPUs in consumer devices. This meant they took a very cautious approach to ensure there was none of the dreaded buffering in the signal. For instance, Apple’s HLS splits video into segments of between two and six seconds and mandates a three-segment buffer before playback starts. Add in the fact that a fourth segment is usually being downloaded at any one time, and this process alone introduces eight to 24 seconds of lag straight away.
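To make that arithmetic concrete, here is the back-of-the-envelope calculation as a short Python sketch; the segment durations and segment counts are simply the figures quoted above, not measurements from any particular player:

```python
# Lag introduced purely by segment-based HLS delivery: the player waits for
# three full segments before starting playback, while a fourth is typically
# still being downloaded.

def startup_lag_seconds(segment_duration: float,
                        buffered_segments: int = 3,
                        in_flight_segments: int = 1) -> float:
    """Latency contributed by segmenting and buffering alone."""
    return segment_duration * (buffered_segments + in_flight_segments)

for seg in (2, 6):  # the two-to-six second segment range quoted above
    print(f"{seg}s segments -> about {startup_lag_seconds(seg):.0f}s of lag")
# 2s segments -> about 8s of lag
# 6s segments -> about 24s of lag
```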
CMAF was developed by Apple and Microsoft working together to speed up streaming times. Admittedly, it’s not the only game in town: there are several standards being proposed for future streaming services and we’re keeping an eye on all of them, but they all speed up the process by letting work on encoding, transferring, and decoding the signal start as soon as each piece of it arrives. It’s by using this technology that we’ve got a streaming signal down to a 3.5-second lag, and between one and two seconds of that is a buffer to ensure a high-quality picture with no stalling for the end user.
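A simplified way to see why chunking helps is to compare when the first playable media can reach the player. The sketch below uses illustrative numbers (a four-second segment split into half-second chunks and a notional per-chunk forwarding cost), not our production pipeline, to model the difference between waiting for a whole segment and forwarding each CMAF chunk as soon as it is complete:

```python
# Simplified comparison: whole-segment delivery vs chunked CMAF delivery.
# Assumes a 4-second segment made of 0.5-second chunks and that each stage
# (encode, transfer, decode) can start on a chunk as soon as it is available.

SEGMENT_SECONDS = 4.0
CHUNK_SECONDS = 0.5
PER_CHUNK_TRANSFER = 0.05  # hypothetical network/CDN forwarding cost per chunk

def first_frame_delay_whole_segment() -> float:
    # The encoder must finish the whole segment, then every chunk of it is
    # transferred, and only then can the player start decoding.
    chunks = SEGMENT_SECONDS / CHUNK_SECONDS
    return SEGMENT_SECONDS + chunks * PER_CHUNK_TRANSFER

def first_frame_delay_chunked() -> float:
    # Only the first chunk needs to be encoded and forwarded before the
    # player can begin; later chunks stream in while earlier ones play.
    return CHUNK_SECONDS + PER_CHUNK_TRANSFER

print(f"Whole segment: ~{first_frame_delay_whole_segment():.2f}s to first frame")
print(f"Chunked CMAF:  ~{first_frame_delay_chunked():.2f}s to first frame")
```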
Of course, other factors will always play a part. Encoding, getting a signal ‘into’ the internet (the first mile, which from sports venues can be anything from satellite to microwave relays), distribution, and network delays all contribute, as can the last mile if it’s still over copper. Despite this, we are confident that a chunked CMAF system built on open standards can usually match, and typically outperform, the seven seconds it takes a broadcast signal to reach the viewer.
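To show how those pieces add up, here is a hypothetical latency budget for a 3.5-second glass-to-glass stream. The individual stage figures are assumptions chosen purely for illustration; the only numbers taken from this article are the roughly 3.5-second total and the one-to-two second player buffer:

```python
# Hypothetical end-to-end latency budget for a chunked CMAF stream.
# Individual stage figures are illustrative assumptions, not measured values.
latency_budget_seconds = {
    "encoding and packaging": 0.8,
    "first mile (venue to cloud)": 0.3,
    "CDN distribution and network": 0.4,
    "last mile and player decode": 0.5,
    "player buffer (picture-quality safety margin)": 1.5,
}

total = sum(latency_budget_seconds.values())
for stage, seconds in latency_budget_seconds.items():
    print(f"{stage:<45} {seconds:>4.1f}s")
print(f"{'total glass-to-glass latency':<45} {total:>4.1f}s")  # ~3.5s
```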
Martyn’s article says that what is needed is more ultra-fast fiber broadband deployment. That would be nice, but it isn’t strictly necessary, because the delays are introduced into the signal well before it hits the last mile to people’s houses. That’s the bad news. The good news is that the technology to end streaming lag exists now. It’s only a matter of time before a goal being announced on Twitter ahead of viewers actually seeing the ball hit the back of the net is nothing more than a fading memory.
By Anders Wassén, Head of OTT, Software and Integration, Red Bee Media