What’s behind the livestreaming boom? The answer is at the edge

For many, livestreaming is now the default way of experiencing the internet

Even before the pandemic, internet users watched 1.1 billion hours of live video in 2019. Since lockdowns began, our reliance on Zoom calls for work and on TikTok, Instagram and e-sports for entertainment will have blown that figure out of the water.

But 2020 has shown how important livestreaming technology is for businesses too: it enables business continuity in challenging times and offers companies and individuals new ways to sell and market products and services, opening up much-needed revenue streams. It’s therefore worth understanding how livestreaming services work, and why technologies like edge computing are crucial to business success in 2021.

The data growth challenge: explosion in livestreaming

Recent research from Seagate, conducted by IDC – Rethink Data – predicts that in just two years, from the beginning of 2020 through the beginning of 2022, enterprises will see a 42% annual increase in the volume of generated data. Streaming data now makes up a considerable and growing portion of that – whether it’s video calls, Instagram Lives, massively multiplayer online role-playing games (MMORPGs), e-sports tournaments or livestreamed e-commerce. Popular in Asia and increasingly prevalent in the UK, livestreaming e-commerce sees influencers promote and sell goods through shoppable livestreams on their own social media channels or live from online shopping sites.

In many countries, streaming data now makes up a considerable portion of all raw information in use, creating a heavy load on network infrastructure that impacts live video quality. For example, take the 1.1 billion hours of live video users watched in a year: at 1080p resolution this equates to 1.65 exabytes of data (equivalent to 1.65 billion gigabytes), and at 4K it equates to 7.92 exabytes – an almost incomprehensible amount. Service providers and enterprises are turning to edge computing to help carry this monumental load.
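The exabyte figures above can be reproduced with a quick back-of-envelope calculation. The bitrates below are illustrative assumptions (roughly 3.33 Mbps for a 1080p stream and 16 Mbps for 4K, which are in the range typical streaming services use), not numbers from the research itself:

```python
HOURS = 1.1e9  # 1.1 billion hours of live video watched in 2019

def exabytes(bitrate_mbps: float, hours: float) -> float:
    """Total data volume in exabytes for a stream at the given bitrate."""
    bytes_total = bitrate_mbps * 1e6 / 8 * hours * 3600  # bits/s -> bytes, x seconds
    return bytes_total / 1e18  # bytes -> exabytes

print(f"1080p: {exabytes(3.33, HOURS):.2f} EB")  # -> 1.65 EB
print(f"4K:    {exabytes(16.0, HOURS):.2f} EB")  # -> 7.92 EB
```

A few megabits per second per viewer seems trivial, but multiplied by a billion viewing hours it becomes an exabyte-scale transport problem.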

Streaming under the hood: where is the edge?

Edge computing is a set of technologies that bring data storage and computing power closer to the user to minimize latency and response time. Put simply, the edge is a location, not a thing. It is the outer boundary of the network – sometimes hundreds or thousands of miles from the nearest enterprise or cloud data centre, and as close to the data source as possible. The benefit for livestreaming is that viewers experience less lag and buffering, and content creators can add more interactive services, confident they will work smoothly.

IDC has found that the amount of data stored at the edge is increasing at a faster rate than data stored in the core. More and more, enterprises are using the edge to store critical data that fuels latency-sensitive transactions and services – watching and purchasing from a livestream, for example. Another way the edge helps spread the load of livestreaming is distributed computing: linking multiple servers over a network to share data and coordinate processing power.

Using edge computing to improve the streaming experience

Without edge and cloud computing supporting the weight of data traffic, bottlenecks will cause poor video quality and buffering once the number of people wanting to view a livestream passes a certain point. Edge computing practices can be used to dispatch source streams to edge servers positioned in various geographical locations. By doing this, the provider is better positioned to scale for greater user numbers, distributing viewer traffic in a logical manner that splits the load and reduces the physical distance between the viewer and where the content is stored.
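One simple way to picture that dispatch logic is geographic routing: send each viewer to the edge server closest to them. The sketch below uses hypothetical server locations and plain great-circle distance; a real provider would also weigh server load, link health, and measured latency:

```python
import math

# Hypothetical edge server locations as (latitude, longitude) pairs.
EDGE_SERVERS = {
    "london":    (51.51, -0.13),
    "frankfurt": (50.11,  8.68),
    "virginia":  (38.95, -77.45),
    "singapore": (1.35, 103.99),
}

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_edge(viewer_latlon):
    """Route a viewer to the geographically closest edge server."""
    return min(EDGE_SERVERS,
               key=lambda name: haversine_km(viewer_latlon, EDGE_SERVERS[name]))

print(nearest_edge((48.86, 2.35)))  # a viewer in Paris -> "london"
```

Shorter physical distance generally means fewer network hops and lower round-trip time, which is exactly the latency win the article describes.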

Streamed data processing at the edge requires two layers: a storage layer and a processing layer. The storage layer needs to support re-playable reads and writes of large streams of data. The playback needs to be fast, inexpensive, and consistent. The processing layer is responsible for consuming data from the storage layer, running computations on that data, and then notifying the storage layer to delete data that is no longer needed.
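The two-layer split above can be sketched in a few dozen lines. This is a minimal, in-memory illustration of the idea (not any particular product's API): the storage layer is a replayable log, and the processing layer consumes from it, computes, and then tells storage which data can be deleted:

```python
class StreamStorage:
    """Storage layer: a replayable log of events, retained until truncated."""
    def __init__(self):
        self._log = []    # ordered events
        self._start = 0   # offset of the oldest retained event

    def append(self, event):
        self._log.append(event)

    def read_from(self, offset):
        """Replayable read: any retained offset can be read again."""
        return self._log[offset - self._start:]

    def truncate(self, offset):
        """Delete events below `offset` once processing no longer needs them."""
        self._log = self._log[offset - self._start:]
        self._start = offset


class StreamProcessor:
    """Processing layer: consume, compute, then notify storage to delete."""
    def __init__(self, storage, compute):
        self.storage = storage
        self.compute = compute
        self.offset = 0   # next offset to consume

    def poll(self):
        batch = self.storage.read_from(self.offset)
        results = [self.compute(event) for event in batch]
        self.offset += len(batch)
        self.storage.truncate(self.offset)  # data no longer needed
        return results


storage = StreamStorage()
processor = StreamProcessor(storage, lambda x: x * 2)
storage.append(1)
storage.append(2)
print(processor.poll())  # -> [2, 4]
```

Because reads are replayable, a processor that crashes mid-batch can re-read from its last committed offset; only after results are safely produced does it ask storage to drop the consumed data.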

Streaming on the edge: what’s next?

The boom in livestreaming has created challenges for storing and processing data fast enough to provide a great user experience. The solutions are being found in edge computing. As edge computing evolves, it will offer more opportunities for providers to improve the overall user experience of their products. For example, with data storage and compute power at the edge, providers will be able to build in more effective interactive features, better search capabilities, and more personalisation and recommended content – subject to user opt-ins. As more and more livestreamed data is analysed and actioned at the edge, businesses need to plan for scalability, data durability, and fault tolerance in both storage and processing.

A vast range of businesses are already using edge computing to facilitate livestreaming applications, from tracking real-time equipment performance on oil rigs to quality control on manufacturers’ production lines. And as a relatively recent technology, not every possible use case will be readily apparent. Understanding this technology will enable business leaders and technology experts to spot new and innovative applications – and if one thing is certain, it’s that there will be plenty of them.


About the Author

Ravi Naik is CIO at Seagate. At Seagate, we know that data is always in motion, alive, connected – and we harness it to maximize human potential. Since 1979, we have been creating precision-engineered data storage technologies that deliver superior capacity, speed, safety, and performance.

Featured image: ©Grey_Coast_Media
