
How millions of people can watch the same video at the same time – a computer scientist explains the technology behind streaming

Live and on-demand video made up an estimated 66% of global internet traffic in 2022, and the top 10 days for internet traffic in 2024 coincided with live streaming events such as the Jake Paul vs. Mike Tyson boxing match and NFL coverage. Streaming enables seamless on-demand access to video content, from online games to short videos such as TikToks and longer content such as films, podcasts and NFL games.

A defining aspect of streaming is its on-demand nature. Consider the global reach of a Joe Rogan podcast episode or the live coverage of a SpaceX Crew Dragon spacecraft launch: both are examples of how streaming connects millions of viewers with real-time and on-demand content worldwide.

I am a computer scientist whose research includes cloud computing: the distribution of computing resources, such as video servers, across the internet.

https://www.youtube.com/watch?v=92zwlmo1aig

Netflix claimed 65 million simultaneous streams on Nov. 15, 2024, for the Jake Paul vs. Mike Tyson boxing match, although many users reported technical problems.

“Chunks” of video

When it comes to video content – whether it is a live stream or a recorded video – two major challenges must be addressed. First, video data is massive in size, which makes it time-consuming to transfer from the source to devices such as televisions, computers, tablets and smartphones.

Second, streaming must be adaptable to accommodate differences in users' devices and internet capabilities. For example, viewers with lower-resolution screens or slower internet speeds should still be able to watch a given video, albeit at lower quality, while those with higher-resolution displays and faster connections enjoy the best possible quality.
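
To get a sense of the first challenge, the sheer size of the data, a rough back-of-the-envelope calculation helps. The bitrate and duration below are assumptions for illustration, not figures from any particular service:

    # Rough illustration of how much data a single video involves (assumed numbers).
    bitrate_mbps = 5          # assumed average bitrate for a 1080p stream, in megabits per second
    duration_s = 2 * 60 * 60  # a two-hour film, in seconds

    total_megabits = bitrate_mbps * duration_s
    total_gigabytes = total_megabits / 8 / 1000  # 8 bits per byte, 1,000 megabytes per gigabyte

    print(f"One two-hour 1080p stream: ~{total_gigabytes:.1f} GB")  # about 4.5 GB per viewer

Multiply that by millions of simultaneous viewers and the scale of the transfer problem becomes clear.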

To address these challenges, video providers apply a series of optimizations. The first step is to break videos into smaller segments, commonly referred to as “chunks.” These chunks then undergo a process called encoding and compression, which optimizes the video for different resolutions and bitrates to accommodate varying devices and network conditions.
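
Conceptually, the result is a ladder of quality levels, each split into short chunks. The sketch below only illustrates that idea: the chunk length, resolutions and bitrates are assumptions, and real services use standardized formats such as HLS or MPEG-DASH rather than this toy manifest:

    # Sketch of chunking plus multi-bitrate encoding (illustrative; not any provider's real format).
    CHUNK_SECONDS = 6  # assumed chunk length

    # An assumed "encoding ladder": each rendition targets different devices and network speeds.
    RENDITIONS = [
        {"name": "360p",  "bitrate_kbps": 800},
        {"name": "720p",  "bitrate_kbps": 2500},
        {"name": "1080p", "bitrate_kbps": 5000},
    ]

    def build_manifest(video_id: str, duration_s: int) -> dict:
        """List every chunk of every rendition for one video."""
        num_chunks = -(-duration_s // CHUNK_SECONDS)  # ceiling division
        return {
            r["name"]: [f"{video_id}/{r['name']}/chunk_{i:05d}.ts" for i in range(num_chunks)]
            for r in RENDITIONS
        }

    manifest = build_manifest("podcast_ep_001", duration_s=3600)
    print(len(manifest["720p"]), "chunks at 720p")  # 600 six-second chunks for a one-hour video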

When a user requests an on-demand video, the system dynamically selects the appropriate stream of chunks based on the capabilities of the user's device, such as screen resolution and current internet speed. The video player installed on the user's device assembles and plays these chunks in sequence to create a seamless viewing experience.
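
A minimal sketch of that selection logic, assuming the player knows the screen height and a recent estimate of network throughput; the function name, quality ladder and safety margin are all hypothetical:

    # Hypothetical adaptive stream selection; real players use more elaborate algorithms.
    RENDITIONS = [             # assumed ladder: (name, height in pixels, bitrate in kbps)
        ("360p", 360, 800),
        ("720p", 720, 2500),
        ("1080p", 1080, 5000),
    ]

    def pick_rendition(screen_height_px: int, throughput_kbps: float, safety: float = 0.8):
        """Choose the best quality the device can display and the network can sustain."""
        usable = [r for r in RENDITIONS
                  if r[1] <= screen_height_px and r[2] <= throughput_kbps * safety]
        return max(usable, key=lambda r: r[2], default=RENDITIONS[0])  # fall back to lowest quality

    print(pick_rendition(screen_height_px=1080, throughput_kbps=4000))  # -> ('720p', 720, 2500)

The safety margin leaves headroom so that a brief dip in connection speed does not immediately empty the player's buffer.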

For users with slower internet connections, the system delivers lower-quality chunks to ensure smooth playback. This is why you may notice a drop in video quality when your connection slows down. Likewise, if the video pauses during playback, it is usually because your player is waiting for additional chunks from the provider.
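
Those quality drops and pauses fall naturally out of a simple download-and-play loop: the player keeps a small buffer of chunks ahead of the playhead, and if the buffer empties before the next chunk arrives, playback stalls. A toy simulation, with all timings invented:

    # Toy playback loop showing why slow downloads cause stalls (all numbers invented).
    CHUNK_SECONDS = 6

    def total_stall(download_times_s):
        """download_times_s: seconds each chunk takes to fetch. Returns seconds spent stalled."""
        buffer_s = 0.0   # seconds of video already downloaded and ready to play
        stall_s = 0.0
        for dt in download_times_s:
            if dt > buffer_s:             # buffer runs dry before the chunk arrives
                stall_s += dt - buffer_s  # ...so the viewer sees a pause
                buffer_s = 0.0
            else:
                buffer_s -= dt            # the buffer drains while the chunk downloads
            buffer_s += CHUNK_SECONDS     # the finished chunk joins the buffer
        return stall_s

    print(total_stall([2, 2, 2, 2]))  # 2.0: only the initial start-up wait
    print(total_stall([8, 8, 8, 8]))  # 14.0: chunks arrive slower than they play, so playback keeps pausing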

Video is streamed to users at different quality levels based on each user's device and internet connection.
Chetan Jaiswal

Handling distance and congestion

Delivering video content at scale, whether recorded or live, is a major challenge when you consider the immense number of videos consumed worldwide. Streaming services such as YouTube, Hulu and Netflix host enormous libraries of on-demand content while simultaneously managing countless live streams around the world.

A seemingly simple approach to delivering video content would be to build one massive data center to store all videos and related content and then stream it to users worldwide over the internet. However, this method is not preferred because it comes with significant challenges.

A main problem is geographic latency, where a user's location relative to the data center affects the delay they experience. For example, if a data center is in Virginia, a user in Washington, D.C., would experience minimal delay, while a user in Australia would face significant lag because of the greater distance and the need for data to traverse several interconnected networks. This added travel time slows down content delivery.
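
The physics of distance alone accounts for much of that delay. As a rough estimate, light in optical fiber travels at about 200,000 kilometers per second, so round-trip propagation time grows with distance; real-world latency is higher still because of routing, queuing and processing. The distances below are approximate and only for illustration:

    # Rough propagation-delay estimate (ignores routing hops, queuing and server processing time).
    SPEED_IN_FIBER_KM_PER_S = 200_000  # light in fiber travels at roughly two-thirds the speed of light

    def round_trip_ms(distance_km: float) -> float:
        return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

    print(round_trip_ms(150))     # Washington, D.C. to a Virginia data center (~150 km): ~1.5 ms
    print(round_trip_ms(16000))   # Australia to Virginia (~16,000 km): ~160 ms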

Another problem is network congestion. As more and more users worldwide connect to the central data center, the connecting networks become increasingly busy, leading to frustrating delays and video buffering. When the same video is sent to multiple users at once, duplicate data traveling over the same internet connections wastes bandwidth and further clogs the network.
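
A quick calculation shows how fast that duplicate traffic adds up when every stream comes from a single origin. The viewer count and bitrate below are assumed, round numbers:

    # Aggregate outbound traffic from a single data center (assumed round numbers).
    viewers = 10_000_000   # simultaneous viewers of one live event
    bitrate_mbps = 5       # assumed per-stream bitrate

    total_tbps = viewers * bitrate_mbps / 1_000_000  # megabits per second -> terabits per second
    print(f"Outbound traffic from one origin: ~{total_tbps:.0f} Tbps of largely identical data")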

A central data center also creates a single point of failure. If the data center experiences an outage, no users can access their content, resulting in a complete service disruption.

Content delivery networks

To address these challenges, most content providers rely on content delivery networks. These networks distribute content via globally dispersed points of presence, which are clusters of servers that store copies of high-demand content locally. This approach significantly reduces latency and improves reliability.
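
A minimal sketch of the steering idea, assuming the provider simply directs each viewer to the point of presence with the lowest measured round-trip time; real content delivery networks combine DNS, anycast routing and load information, so the locations and timings here are purely hypothetical:

    # Hypothetical viewer-to-point-of-presence steering by measured latency.
    POINTS_OF_PRESENCE = {   # assumed round-trip times from one viewer, in milliseconds
        "virginia":  195,
        "frankfurt": 230,
        "sydney":    12,
        "tokyo":     110,
    }

    def choose_pop(rtt_by_pop: dict) -> str:
        """Send the viewer to the closest (lowest-latency) cluster that holds the content."""
        return min(rtt_by_pop, key=rtt_by_pop.get)

    print(choose_pop(POINTS_OF_PRESENCE))  # -> 'sydney' for a viewer in Australia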

Content delivery network providers such as Akamai and Edgio use two main strategies for deploying points of presence.

The first is the “Enter Deep” approach, in which thousands of smaller server clusters are placed deep inside internet service providers' networks, close to users. This ensures minimal latency by bringing the content as close as possible to the end user.

With the internet backbone at the top and users at the bottom, this diagram shows the “Enter Deep” approach of placing content delivery servers deep in the network, close to users.
Chetan Jaiswal

The second strategy is “Bring Home,” in which hundreds of larger server clusters are deployed at strategic locations, typically internet exchange points where internet service providers interconnect. Although these clusters are farther from users than in the Enter Deep approach, they are larger, allowing them to handle higher traffic volumes efficiently.

With the internet backbone at the top and users at the bottom, this diagram shows the “Bring Home” approach of placing content delivery servers between the backbone and regional internet service providers.
Chetan Jaiswal

Infrastructure for a connected world

Both strategies aim to optimize video streaming by reducing delays, minimizing wasted bandwidth and ensuring a seamless viewing experience for users worldwide.

The rapid expansion of the internet and the rise of video streaming, both live and on demand, have changed how video content is delivered to users worldwide. But the challenges of handling huge amounts of video data, reducing geographic latency and accommodating diverse user devices and internet speeds demand sophisticated solutions.

Content delivery networks have emerged as the cornerstone of modern streaming, enabling efficient and reliable video delivery. This infrastructure supports the growing demand for high-quality video and underscores the innovative approaches needed to meet the expectations of a connected world.
