Nov 25
HTTP Live Streaming to the iPhone
Update: Note that the vlc streaming guide has some pointers on how to do HTTP streaming with vlc as an all-in-one solution.
HTTP Live Streaming (specs here) essentially consists of a number of short video segments (~10 seconds each) and a continuously updated index file (.m3u8) that tells the iPhone where to fetch the next segment. Each segment is x264 video in an MPEG-TS container.
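For illustration, a live index file might look roughly like this (segment names, durations and sequence numbers are made up; see the pantos draft for the exact tag semantics):

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:10,
segment-120.ts
#EXTINF:10,
segment-121.ts
#EXTINF:10,
segment-122.ts
```

For a live stream, the server rewrites this file as new segments appear, bumping EXT-X-MEDIA-SEQUENCE and dropping old entries.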
You can set up a working streaming chain using ffmpeg as input, a segmenter and a simple HTTP webserver.
First you need to compile the open-source segmenter.c from assembla:
Download the segmenter.c and Makefile
If you get this error:
segmenter.c: In function 'add_output_stream':
segmenter.c:44: error: 'AVCodecContext' has no member named 'ticks_per_frame'
segmenter.c:46: error: 'AVCodecContext' has no member named 'ticks_per_frame'
segmenter.c:54: error: 'AVCodecContext' has no member named 'channel_layout'
segmenter.c:54: error: 'AVCodecContext' has no member named 'channel_layout'
segmenter.c: In function 'main':
segmenter.c:184: error: 'HUGE_VAL' undeclared (first use in this function)
segmenter.c:184: error: (Each undeclared identifier is reported only once
segmenter.c:184: error: for each function it appears in.)
segmenter.c:193: error: 'INT_MAX' undeclared (first use in this function)
segmenter.c:248: warning: implicit declaration of function 'avformat_alloc_context'
segmenter.c:248: warning: assignment makes pointer from integer without a cast
make: *** [all] Error 1
you need to modify the command line passed to gcc via the Makefile. So edit the Makefile and pass the path to your ffmpeg sources to the compiler via the -I flag.
Once you are past this one, you might get another error:
gcc -Wall -I/<path to my ffmpeg>/ffmpeg -g segmenter.c -o segmenter -lavformat -lavcodec -lavutil -lbz2 -lm -lz -lfaac -lmp3lame -lx264 -lfaad
/usr/bin/ld: cannot find -lbz2
collect2: ld returned 1 exit status
make: *** [all] Error 1
In this case you need the libbz2-dev package. Install it with apt-get install libbz2-dev.
If you still get errors after that, your system libs appear to be a bit messed up and gcc picks up an old libav<something> version from somewhere. A cure should be to build a recent version of ffmpeg with the --enable-shared flag set, so that new versions of the libav libraries end up installed in /usr/local/lib.
If your freshly compiled ffmpeg (and subsequently your segmenter) tells you "ffmpeg: error while loading shared libraries: libavdevice.so.52: cannot open shared object file: No such file or directory", then you need to add the shared libraries directory to your shell environment with "export LD_LIBRARY_PATH=/usr/local/lib". Help on that here.
Another approach, which permanently lets the system know where these libs reside: as root, create a file /etc/ld.so.conf.d/ffmpeg.conf and put the line "/usr/local/lib" into it. Then run ldconfig -v to rebuild the library linker cache.
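The steps above boil down to two commands (run as root; the file name ffmpeg.conf is arbitrary, any name under /etc/ld.so.conf.d/ works):

```shell
# make /usr/local/lib known to the dynamic linker permanently
echo "/usr/local/lib" > /etc/ld.so.conf.d/ffmpeg.conf
# rebuild the linker cache so the new path takes effect
ldconfig -v
```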
We've also discussed lib linking issues in my compiling ffmpeg post.
In order to stream proper videos to your audience, take these iPhone display specs and some calculations into account when choosing a video size:
480 x 320 — iPhone v1 + v2 native display resolution
640 x 360 — common 16:9 video resolution (divide the width like this: (640 / 16) * 9 = 360 to get the height)
480 x 270 — iPhone, 16:9 letterboxed, full-width video
320 x 180 — iPhone, common setting for 16:9 low-quality streaming video
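The 16:9 height calculation above is easy to do with shell arithmetic, e.g. for the letterboxed full-width case:

```shell
# height for a 16:9 video at a given width (integer math,
# assumes the width is divisible by 16)
width=480
height=$(( width * 9 / 16 ))
echo $height   # 270
```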
Now you need to figure out for which bandwidth you'd like to transcode your source video. The iPhone can, depending on hardware revision and telco provider, consume quite a number of transmission standards:
3G — iPhone v2, v1 — standard cellular data rate, below 100kbps in sum
GPRS — iPhone v2, v1 — 55-171kbps in sum
EDGE — iPhone v2, v1 — 108 up, 217 down, 236kbps in sum
3G — some tell me <384kbps for audio+video is okay for 3G
UMTS — iPhone v2 — 348kbps in sum
HSDPA — iPhone v2 — 700-5700kbps in sum
WiFi — iPhone v2
Use this list with caution. As a general rule of thumb, these bandwidth figures might suffice, but I am no expert here (!) and haven't checked them thoroughly, so correct me if I am wrong. A better comparison might be here. 192kbps video+audio streams should be playable on most iPhones.
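If you encode the same source at several bitrates, the spec lets you offer all of them through a single variant playlist and the iPhone picks one to match its connection. A hypothetical example (paths and BANDWIDTH values are made up):

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=192000
low/stream.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=384000
mid/stream.m3u8
```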
Encoding videos for segmentation with ffmpeg should be a no-brainer, but it turned out to be trickier than thought. Some useful command lines are shared here, here and here. One particular thing I ran into was the error "dts < pcr", which resulted in invalid segments and thus an invalid stream. Other users are getting this error as well. A bit of explanation is here. There are some related issues on the ffmpeg issue tracker, e.g. this.
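A sketch of the whole transcode-and-segment pipeline, roughly along the lines of the ioncannon posts. Treat the concrete flags, bitrates and the segmenter argument order (input, segment duration, segment prefix, index file, URL prefix) as assumptions to check against your ffmpeg and segmenter versions:

```shell
# sketch: transcode to an iPhone-friendly MPEG-TS stream and pipe it
# into the segmenter, which writes 10-second .ts files plus the .m3u8
# index (output path and URL prefix are placeholders)
ffmpeg -i input.mov \
    -s 320x180 -vcodec libx264 -b 128k \
    -acodec libmp3lame -ab 64k -ac 2 \
    -f mpegts - \
  | segmenter - 10 segment stream.m3u8 http://example.com/stream/
```

Serving the resulting .m3u8 and .ts files is then just a matter of putting them behind any plain HTTP webserver.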
Sadly, there is no desktop application that supports Apple's pantos HTTP Live Streaming standard, at least as of now. This makes it a bit cumbersome for a developer to debug an iPhone stream without an actual iPhone at hand.
Mplayer is the wonder-weapon of media players and is *nearly* capable of playing iPhone streams/segments. Start it with:
mplayer -playlist http://example.com/path/to/playlist.m3u8
and it will play the first segment but choke on subsequent ones. But as mplayer can play just about anything thrown at it, it will even play segments that the iPhone won't. So this is really for basic testing only.
Note:
The ioncannon posts got a working combination of ffmpeg and segmentation together, using mp3 for audio. Still, I don't know why my tests with AAC as the audio codec (as the original specs require) were so unsuccessful.
Further reading:
draft-pantos-http-live-streaming-01 - HTTP Live Streaming
draft-pantos-http-live-streaming-02 - HTTP Live Streaming
HTTP Live Streaming - Wikipedia