With ever-increasing network speeds and bandwidth available to mobile web users, content providers naturally want to make rich audio and video content available to the mobile world. In case of on-demand video and audio, playback support on mobile devices is generally good, especially due to the widespread adoption of HTML5 and its native support for video and audio. Things are more challenging for live-streaming however, especially when targeting the Android platform – a growing ecosystem of phone and tablet devices.
In this post, we take a look at the following 3 protocols and how they could be used for effective live streaming of video to Android devices:
- Apple’s HTTP Live Streaming (HLS)
- Adobe’s HTTP Dynamic Streaming (HDS)
- Dynamic Adaptive Streaming over HTTP (aka MPEG-DASH)
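All three protocols share the same basic idea: the video is cut into short segments delivered over plain HTTP, and a manifest tells the player which segments – and which bitrate variants – are available. As a concrete illustration, a minimal HLS master playlist (all URLs hypothetical) might look like this:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
http://example.com/live/mid.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=1280x720
http://example.com/live/high.m3u8
```

Each variant playlist in turn lists the individual media segments of the live stream, and the player refetches it periodically to discover newly published segments.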
A nice comparison between HLS and HDS can be found at .
So, what’s the issue?
In a nutshell, here’s what makes live-streaming to Android devices so challenging right now:
- Flash is no longer supported on newer Android devices (4.1.x and above), as Adobe has ceased development of Flash for mobile devices.
- RTMP (also by Adobe) used to be a good choice to stream to any mobile device with a Flash runtime.
- However, RTMP is a proprietary protocol that only works in conjunction with Flash.
- HLS support is unsatisfactory on most Android devices.
- No native HDS support.
Building your own video stack to support live streaming (e.g. via a custom application) introduces new challenges that many content providers are not willing to take on.
- Android offers less-than-optimal HLS support [1, 2]:
- Introduced in Android 3.0, but not all features of the specification were implemented, resulting in a buggy, poor overall user experience, with buffering common when seeking or moving between full-screen and windowed modes
- “Basic support was introduced in 3.0 (Honeycomb), but has never worked well enough for practical use.”
- “Android introduced HLS support in 4.0, only to have it dropped again in 4.1.”
- “HLS streaming is supported on Android version 2.3-4.0 via Flash” but “Flash support was discontinued in Android 4.1, meaning future versions of the OS cannot play back HLS streams natively”
- “Right now, content owners are left in an awkward state if they want to deliver live video to Android browsers. If Flash is present, you can deliver a basic Flash video player. If it is not, you can try to deliver HLS but the HLS manifests must either be hand-coded or created using Android-specific tools. If the HLS video can play without buffering you’ll find that there is no way to specify the aspect ratio, so in portrait mode it looks broken.”
- “Live video support for browser-based streaming within Android tablets and phones is a significant challenge with little help available from Google, and with Android still talking about removing H.264 video support, many content owners are wondering why they should even try to support Android any longer? “
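For comparison, on a browser where native HLS support actually works, playback requires nothing more than pointing a video element at the playlist (URL hypothetical):

```html
<video controls width="640" height="360"
       src="http://example.com/live/stream.m3u8">
  Sorry, your browser cannot play this live stream.
</video>
```

It is precisely this simplicity that makes the patchy Android support so frustrating – the markup is trivial, but whether it plays depends on the OS version.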
A common suggestion is to use an RTSP fallback when Flash is not available. This path seems to be hit and miss, so consistent playback across different OS versions is not guaranteed. Furthermore, many media servers or platforms do not provide RTSP streams by default.
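The fallback chain content providers end up with can be sketched as follows. This is a hypothetical decision function, not a real player implementation – in practice the capability flags would come from User-Agent sniffing or feature detection on the client:

```python
# Hypothetical sketch of the delivery fallback chain described above.
# Capability flags are plain booleans here; a real player page would
# derive them from feature detection or User-Agent sniffing.

def pick_delivery(has_flash: bool, native_hls: bool, has_rtsp: bool) -> str:
    """Return the delivery method a player page might choose for a device."""
    if has_flash:
        # A Flash-based player can consume RTMP (or HLS via a plugin).
        return "flash-rtmp"
    if native_hls:
        # Hand the .m3u8 playlist straight to the browser's <video> element.
        return "native-hls"
    if has_rtsp:
        # Hit and miss across Android versions, but better than nothing.
        return "rtsp-fallback"
    # Last resort: prompt installation of / launch an external player app.
    return "external-app"

if __name__ == "__main__":
    # e.g. an Android 2.3 handset with Flash installed:
    print(pick_delivery(True, False, True))
    # e.g. an Android 4.1+ device: no Flash, unreliable native HLS, no RTSP:
    print(pick_delivery(False, False, False))
```

The point of the sketch is how quickly the chain bottoms out on modern Android: with Flash gone and native HLS unreliable, many devices fall straight through to the external-app case.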
What about HDS or MPEG DASH?
Even though HDS can be considered superior to HLS, there are a few issues:
- no native browser support on any platform, including Android
- ”[…] HDS is a protocol only targeting Flash / AIR, while HLS works across the majority of devices. If you are aiming for a single workflow, HLS is the only logical choice.” (http://www.overdigital.com/2013/04/15/a-single-protocol-and-drm/)
The MPEG body has introduced DASH as a standard for adaptive streaming but no mainstream browser supports DASH at this stage.
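For comparison with the M3U8 playlist above, DASH describes a presentation in an XML manifest (an MPD, Media Presentation Description). A heavily abbreviated, hypothetical live MPD might look like this:

```xml
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic"
     minimumUpdatePeriod="PT10S" minBufferTime="PT4S"
     profiles="urn:mpeg:dash:profile:isoff-live:2011">
  <Period id="1">
    <AdaptationSet mimeType="video/mp4">
      <SegmentTemplate media="seg-$RepresentationID$-$Number$.m4s"
                       initialization="init-$RepresentationID$.mp4"
                       duration="10" startNumber="1"/>
      <Representation id="720p" bandwidth="1400000" width="1280" height="720"/>
      <Representation id="360p" bandwidth="800000" width="640" height="360"/>
    </AdaptationSet>
  </Period>
</MPD>
```

Conceptually this covers the same ground as an HLS master playlist – multiple bitrate representations of segmented media – which is why DASH is pitched as the codec- and vendor-neutral successor to the proprietary variants.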
Currently, content providers do not have a dependable single solution that reliably delivers live streams to Android mobile devices. If Flash is available on a given device, the RTMP or HLS streaming protocols can be used in conjunction with a Flash-based embedded player. If Flash is not available, things get hairy, as neither HLS nor HDS, let alone DASH, is reliably supported by the Android operating system or its default browser. In that case, either a custom Android app has to be developed or a native video player supporting HLS/HDS needs to be used. This results in a suboptimal user experience, as an external application has to be installed and launched first to enable playback of a live stream. DASH seems to be a promising standardisation effort; however, only time will tell whether the industry gets on the DASH bandwagon. Apple is notable in its absence from the DASH consortium, instead pushing its HLS format and taking early steps towards IETF standardisation via creation of a draft document published by the IETF.
Google needs to improve native support for live streaming on their Android platform if they don’t want to lose customers to Apple’s more accessible HLS solution, particularly those customers who rely heavily on a seamless live-streaming experience on their mobile devices. However, should their efforts be focused on improving support for the HLS standard, or on the emerging DASH standard? The best course of action is not clear as long as Apple and Adobe hold on to their proprietary streaming protocols and don’t come to an agreement with regard to DASH. 2013 is a critical year for the DASH standard – there is a great opportunity to unify the fragmented HTTP streaming space, bringing together the best elements of the Apple (HLS), Adobe (HDS) and Microsoft (Smooth Streaming) solutions, if buy-in can be secured from all key stakeholders.