The joy (and pain) of measuring streaming media

www.nwfusion.com –

Everyone knows what it means when you say it takes 5 seconds to download a Web page. When you try to measure streaming audio and video quality on the Internet, a flood of factors comes into play: late and lost packets, rebuffering during playback, and bandwidth delivered versus bandwidth received all add to the complex mix of measurements needed to evaluate overall streaming audio and video quality.

Furthermore, streaming content itself is complicated:

  • More data is transferred in real time than with static Web pages.
  • The data is much more sensitive to latency.
  • Multiple data types are involved (audio and video).
  • Different default delivery protocols are involved (User Datagram Protocol (UDP) vs. TCP/IP).
  • Multiple technologies are involved (Real, Windows Media, QuickTime).

    By definition, streaming content isn't subjected to the delays associated with downloading very large rich media files to a storage device. But because it is more time-sensitive than standard Web pages, streaming media is subject to significant performance problems caused by today's Internet infrastructure, such as jittery video and audio static. Still, online users expect the same smooth, uninterrupted audio and video they get from television and radio.

    Because you can't improve what you don't measure, streaming content providers need to monitor and measure their streaming performance. What follows is a guide about some of the factors involved in measuring streaming media.

    Three major factors that affect streaming quality, which need to be measured, are startup time, audio quality and video quality. Within these factors are specific elements, such as connect time, redirect time, initial buffer time, video frame rate, recovered, lost and dropped packets, and bandwidth utilization.

    1. Startup time: The time from when you press the play button until the clip begins playing. Startup time is the sum of the time required for the initial connection (including DNS lookup and time to first byte), redirection and initial buffering.

    a. Initial connection: The time it takes to establish a Real Time Streaming Protocol (RTSP) connection between the streaming server and the streaming client (or player).

    b. Redirection: The time spent following redirects, when the first server sends the client to a second (or subsequent) server that actually delivers the stream.

    c. Initial buffering: The time between the first data arriving in the client computer's buffer and the clip beginning to play.

    2. Audio quality: Derived from audio encoding and audio delivery.

    a. Audio encoding includes the number of audio channels, the bitrate per channel and the quality of the original content.

    b. Audio delivery includes the delivered bandwidth and packet delivery.

    3. Video quality: Derived from video encoding and video delivery

    a. Video encoding includes the encoded bitrate, the encoded frame rate and the quality of the original content.

    b. Video delivery includes the delivered bandwidth and packet delivery.

    For both audio and video, the higher the encoded bitrate and the more successfully it is delivered to the user, the higher the quality.
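    The metrics above can be sketched in a few lines of Python. This is an illustrative sketch only; the function names, time values and packet counts are made up, not taken from any real measurement tool:

```python
# Illustrative sketch of the streaming-quality metrics described above.
# All names and sample values are hypothetical.

def startup_time(connect_ms, redirect_ms, buffer_ms):
    """Startup time = initial connection (DNS + time to first byte)
    + redirection + initial buffering, all in milliseconds."""
    return connect_ms + redirect_ms + buffer_ms

def effective_bitrate(encoded_bps, packets_sent, packets_on_time):
    """Higher encoded bitrate AND successful delivery mean higher
    quality: scale the encoded bitrate by the on-time delivery ratio."""
    return encoded_bps * packets_on_time / packets_sent

print(startup_time(250, 120, 3000))           # → 3370 (ms)
print(effective_bitrate(300_000, 1000, 950))  # → 285000.0 (bit/sec)
```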

    Two more pieces of the quality puzzle are encoding quality and delivery quality. Encoding quality is based on a complex set of decisions and tradeoffs that depend on the potential audience: how much bandwidth to dedicate to video (frame rate, window size, bit depth, resolution) and how much to audio (mono, stereo or surround sound). The tradeoffs become even more difficult when the audience is a mixed one (both dial-up and high-bandwidth users).

    For example, T-1 users can receive content at up to 1.4M bit/sec, which comes close to supporting full-motion video but is nowhere near DVD quality. Full-screen, full-motion video with near-DVD quality and stereo audio would require at least a 10M bit/sec stream, and the world is a long time away, in Internet years, from providing that kind of connectivity.

    Still, very few companies stream even at the 1M bit/sec rate. Because most small business and broadband home users connect at DSL or cable speeds of 300K bit/sec or less (and the majority of home users are still at 56K bit/sec dial-up or slower), the "largest" content currently streamed over the Internet runs at 300K bit/sec, and most content is streamed at a far lower bitrate. At 300K bit/sec, there is no way full-screen video can be streamed without compromising frame rate, audio quality or video resolution.
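    A back-of-the-envelope calculation shows why. The resolution, color depth and frame rate below are assumed values for "full-screen, full-motion" video, chosen for illustration:

```python
# Uncompressed bandwidth for assumed full-screen, full-motion video:
# 640x480 pixels, 24-bit color, 30 frames/sec (illustrative values).
width, height, bits_per_pixel, fps = 640, 480, 24, 30

raw_bps = width * height * bits_per_pixel * fps
print(raw_bps)               # → 221184000 (about 221M bit/sec)

# Compression ratio needed to fit that into a 300K bit/sec stream:
stream_bps = 300_000
print(raw_bps / stream_bps)  # → 737.28
```

    A roughly 700:1 compression ratio is why something has to give: frame rate, window size or resolution.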

    Delivery quality includes bandwidth, packet and frame rate data:

    1. Average frames per second (video only): The average number of video frames received, which can be compared with the average number of frames actually streamed. Television and movies display video at 30 frames/sec, roughly the rate at which humans perceive full motion.

    2. Late and lost packets: Late packets, which reach the client's play buffer too late to be used, are the worst case. Not only can the client not use them, but they consume bandwidth on the receiving side that could have carried usable packets. Lost packets never make it to the client at all. Both late and lost packets have very undesirable effects on audio and video, including pixelation, jitter, frozen video, audio popping and audio static. By improving connectivity to areas experiencing bad connections, hosting providers can reduce the number of late and lost packets for Webcasters.

    3. Average audio and video bandwidth: Reflects the negotiation between the Webcast server and the computers at the receiving end to minimize packet loss. When there are connection problems or insufficient bandwidth, the server will scale back, or "thin," the stream; it would rather deliver fewer frames than drop them. Thinning can appear as a smaller viewing window as well as slower video motion, and it varies dramatically by geography, time of day, the user's Internet connection, the location of caching servers, the regional backbone and peering problems. This is one reason it's so important to measure streaming performance from multiple locations: to discover significant geographic variation.
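    The delivery metrics above can be sketched together in a small script. The packet log format, the two-second usability deadline and all the numbers are hypothetical, chosen only to illustrate the classification:

```python
# Hypothetical sketch: classify packets as on-time, late or lost, and
# compute the average received frame rate. Each record is (sequence
# number, seconds until arrival, or None if the packet never arrived).
DEADLINE_S = 2.0  # assumed limit for a packet to still be usable

packets = [(1, 0.8), (2, 1.1), (3, None), (4, 2.7), (5, 1.4)]

on_time = sum(1 for _, t in packets if t is not None and t <= DEADLINE_S)
late    = sum(1 for _, t in packets if t is not None and t > DEADLINE_S)
lost    = sum(1 for _, t in packets if t is None)
print(on_time, late, lost)           # → 3 1 1

# Average frames per second received vs. the 30 frames/sec of TV:
frames_received, duration_s = 540, 30.0
print(frames_received / duration_s)  # → 18.0
```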

    Real Networks' Real Media and Microsoft's Windows Media handle the thinning issue elegantly by encoding several different bitrates into a single file, giving the content producer the ability to raise or lower the bitrate to accommodate changing network conditions during playback. These options have tradeoffs, however. Encoding content in this format can degrade video quality at encoding time (as opposed to delivery time), and the technology isn't yet perfected: it may not "step down" or "step up" the content when it is supposed to.
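    The "step down" decision amounts to picking the highest encoded bitrate that fits the bandwidth currently available. The bitrate ladder below is an assumed example, not the actual logic of Real's or Microsoft's servers:

```python
# Hypothetical multi-bitrate "step down" selection: choose the highest
# encoded bitrate that fits the measured bandwidth. The ladder of
# encoded bitrates is illustrative.
ENCODED_BITRATES = [28_800, 56_000, 150_000, 300_000]  # bit/sec

def select_bitrate(available_bps):
    fitting = [b for b in ENCODED_BITRATES if b <= available_bps]
    return max(fitting) if fitting else min(ENCODED_BITRATES)

print(select_bitrate(200_000))  # → 150000 (steps down from 300K)
print(select_bitrate(400_000))  # → 300000 (full rate fits)
```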

    Another factor that affects delivery quality is the default protocol. Because TCP/IP's acknowledgments and retransmissions can delay the delivery of time-sensitive content like streaming, most streaming applications default to UDP, which is better suited to content that changes frequently or can tolerate some packet loss.
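    The difference is visible at the socket level: a UDP datagram is sent with no connection setup and no retransmission, so a lost packet is simply skipped rather than stalling the stream. A minimal loopback sketch (the payload and use of an OS-assigned port are illustrative):

```python
import socket

# Minimal UDP send/receive on the loopback interface. Unlike TCP,
# there is no connect/accept handshake and no delivery guarantee;
# the datagram either arrives or it doesn't.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))        # let the OS pick a free port
port = recv_sock.getsockname()[1]

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"audio-packet-1", ("127.0.0.1", port))

data, _ = recv_sock.recvfrom(1024)
print(data)  # → b'audio-packet-1'

send_sock.close()
recv_sock.close()
```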

    Here are the four most common delivery scenarios that affect the quality of a stream:

    1. During the initial negotiation between the server and the client, if there is not enough network bandwidth or network conditions are less than optimal, the streaming server will decide not to deliver the ideal number of packets and will scale the stream back.

    2. After playback begins, if network conditions degrade, the server will again decide to scale back the number of packets it attempts to deliver.

    3. Packets arrive at the client too late for the software application to use them.

    4. Packets get dropped or lost on the way and never reach the client computer.

    Because of the large performance variations that occur on the Internet, it is important for content providers to measure the performance of their media to gain an objective view of what their users are experiencing. Measurement lets you know how your site is performing, compare yourself with the competition and discover where you can actually make improvements. It can reveal geographic differences that may be related to your provider's service, backbone problems that can be quickly identified and repaired, and insufficient caching or server power that can be beefed up.
