Why Some Live Streams are not ‘Smooth’

When streaming video across the internet, a number of factors cause latency (delay in receiving the video) and other problems that make for an unsatisfactory viewing experience for the online audience.

Producers can opt for dedicated networks like MPLS (which we won’t go into here), which let them secure enough bandwidth to ensure good quality of service (QoS). However, these mechanisms are expensive and often impractical for regular streaming, so when cost is a factor (as it most often is), transporting streams across the Public Internet is typically the preferred choice.

Using the Public Internet, however, can be challenging for a number of reasons, and streaming producers need to plan ahead, as they are likely to encounter several common issues:

Packet Loss:

In order to send video data across the internet, the encoder compresses the video and breaks it up into very small packets that are sent across the network one after the other. When faced with congestion, the network may be forced to drop some of those packets in transit. If packets are lost, video quality is compromised, because the decoder at the other end no longer has all of the information it needs to rebuild the video images. This results in gaps in the video stream at the viewer’s end.
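The idea above can be sketched in a few lines of code: because each packet carries a sequence number, the receiver can tell exactly where data went missing. This is an illustrative toy, not a real streaming stack.

```python
def find_missing_packets(received_seqs, expected_count):
    """Return the sequence numbers of packets that never arrived."""
    return sorted(set(range(expected_count)) - set(received_seqs))

# Suppose 10 packets were sent, but congestion dropped two of them:
arrived = [0, 1, 2, 4, 5, 6, 8, 9]
print(find_missing_packets(arrived, 10))  # packets 3 and 7 were lost
```

Those two missing packets are the "gaps" the viewer sees: the decoder has nothing to rebuild those slices of the picture from.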

Jitter:

This occurs when those same packets are delivered with inconsistent timing, or even out of sequence. Jitter shows up as jerky video, and because receivers must buffer extra data to smooth it out, it also adds to latency; it can become a significant problem if not resolved.
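This is why players use a "jitter buffer": by delaying playback slightly, packets that arrive late (but within the buffer window) can still be played on schedule. A minimal sketch, with made-up arrival times, shows the trade-off between buffering and smoothness:

```python
def playable_on_time(arrival_times_ms, packet_interval_ms, buffer_ms):
    """Count packets that arrive in time for their scheduled playout."""
    on_time = 0
    for seq, arrival in enumerate(arrival_times_ms):
        # Each packet is due at its ideal send time plus the buffer delay.
        playout_deadline = seq * packet_interval_ms + buffer_ms
        if arrival <= playout_deadline:
            on_time += 1
    return on_time

# Packets sent every 20 ms, but network jitter delays several arrivals:
arrivals = [0, 21, 55, 62, 80, 130]
print(playable_on_time(arrivals, 20, 0))   # no buffer: most packets miss
print(playable_on_time(arrivals, 20, 40))  # a 40 ms buffer absorbs the jitter
```

The larger buffer lets every packet play on time, but at the cost of 40 ms of added latency: smoothing out jitter always trades a little delay for a steadier picture.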

Intermediate Network Devices:

Devices between the sender and receiver of the video data, including streaming servers that replicate and repackage video streams, can add latency to a video transmission.

Physical Distance:

The speed of light, even though it is extremely fast, is a limiting factor in all optical networks, so the physical distance between end points must be factored into planning: the longer the distance, the greater the latency. Sending from Vancouver to Surrey takes far less time than sending data to Europe or the East.
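A quick back-of-the-envelope calculation makes this concrete. Light in optical fibre travels at roughly 200,000 km/s (about two-thirds of its speed in a vacuum), so every 1,000 km of fibre adds about 5 ms of one-way delay; the distances below are rough assumptions for illustration, and real routes add extra hops and switching delay on top.

```python
FIBRE_SPEED_KM_PER_S = 200_000  # approximate speed of light in fibre

def one_way_delay_ms(distance_km):
    """Minimum one-way propagation delay over a fibre path, in milliseconds."""
    return distance_km / FIBRE_SPEED_KM_PER_S * 1000

print(round(one_way_delay_ms(30), 2))    # Vancouver -> Surrey (~30 km): a fraction of a millisecond
print(round(one_way_delay_ms(7600), 1))  # Vancouver -> London (~7,600 km): tens of milliseconds
```

This is only the physical floor; no protocol or hardware can deliver data faster than the fibre itself carries it.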

Firewalls:

Bridging between networks or streaming protocols can also add latency, so the impact of firewalls, especially those in corporate networks, needs to be taken into account. In fact, some office firewalls are set up to block video streams completely for security reasons, so it is important to warn potential viewers that their firewalls could limit, or even completely block, the stream.

Although designing the ideal solution for streaming high-quality, live video over the internet can be extremely challenging for the inexperienced producer, the modern protocols, techniques and technologies available today make it possible to mitigate most of the problems above and produce a live stream of quite acceptable quality.

For information and pricing on streaming your next event Live, visit us at EventcastLive.ca