An Insight into FFmpeg Video Streaming
Posted By Oodles Technologies | 30-Sep-2014
FFmpeg is an open-source software project that produces programs and libraries for handling multimedia data. It includes libavcodec, an audio/video codec library; libavformat, an audio/video container muxing and demuxing library; and the ffmpeg command-line program for transcoding multimedia files.
FFmpeg is, in short, a set of tools dedicated to encoding, decoding and transcoding audio and video files.
Why do we need FFmpeg?
Got a video that no other player can recognize? Transcode it with FFmpeg. A short batch script that calls FFmpeg can automate cutting video segments out of a movie, and the same approach works for continuously encoding and processing video material. It is free and cross-platform, and you can install it on your Linux server as well as on a Windows PC.
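As an illustration, here is a minimal sketch of such a batch script. The filenames, segment length and total duration are made-up placeholders; the script only echoes one ffmpeg command per segment as a dry run, so drop the `echo` to actually execute them.

```shell
#!/bin/sh
# Dry run: print one ffmpeg cut command per 60-second segment of a
# 3-minute movie. Filenames and durations are placeholder assumptions.
SEGMENT=60        # segment length in seconds
DURATION=180      # total movie length in seconds (assumed known)
start=0
i=0
while [ "$start" -lt "$DURATION" ]; do
    # -ss seeks to the start offset, -t limits the duration, and
    # -c copy cuts without re-encoding, which is near-instant
    echo ffmpeg -ss "$start" -i movie.mp4 -t "$SEGMENT" -c copy "clip_$i.mp4"
    start=$((start + SEGMENT))
    i=$((i + 1))
done
```

Using `-c copy` keeps the original codecs; replace it with explicit codec options if the clips need to be transcoded as well.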
FFmpeg basically streams in two ways: it either streams to some other server which then re-streams the content, or it streams via UDP/TCP directly to a destination receiver or to a multicast destination.
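The direct mode might look like the sketch below; the input file, codec choice and destination address are illustrative assumptions. The command is built as a string and only echoed here, so it can be shown without ffmpeg installed.

```shell
# Sketch of direct UDP streaming (input file and destination are
# placeholders). -re reads the input at its native frame rate, as a
# live receiver expects; mpegts is the usual container for UDP transport.
CMD='ffmpeg -re -i input.mp4 -c:v libx264 -f mpegts udp://192.168.0.10:1234'
echo "$CMD"    # dry run; execute the command itself to stream
```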
Servers which can receive from FFmpeg include ffserver, Flash Media Server and Wowza Media Server. VLC can also pick up the stream from FFmpeg and redistribute it, acting more or less like a server. But since FFmpeg is more efficient than VLC at raw encoding, it is the better option for transcoding and streaming. With it we can also live-stream to online redistribution servers.
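Publishing to such a media server is typically done over RTMP. The sketch below assumes a hypothetical Flash Media Server or Wowza ingest URL; again the command is only echoed as a dry run.

```shell
# Sketch of re-streaming through a media server (the RTMP URL is a
# placeholder for a Flash Media Server / Wowza ingest point).
CMD='ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f flv rtmp://server.example.com/live/mystream'
echo "$CMD"    # dry run; execute the command itself to publish
```

The `-f flv` container is what RTMP transport expects, which is why it appears regardless of the input format.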
To stream audio/video content over the internet you need a streaming/broadcasting server which can collect multiple input sources and transcode/broadcast each of them as multiple output streams. The input sources feed the server with multimedia content, which it then distributes to multiple clients for viewing.
An insight into ffserver -
It is a streaming server for both audio and video. It even supports several live streams from different files and time shifting on live feeds. It is configured through a configuration file read at startup; if none is explicitly specified, it reads /etc/ffserver.conf.
Ffserver receives pre-recorded files or FFM streams from ffmpeg instances as input and streams them out over RTP/RTSP/HTTP. The instance listens on the port specified in the configuration file. You can point one or more ffmpeg instances at the FFM feeds where ffserver is expecting to receive them, or you can make ffserver launch those ffmpeg instances itself at startup.
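Assuming ffserver is already running on port 8090 and exposes a feed named feed1.ffm (both placeholders taken from a typical setup), the two steps might look like this; the commands are echoed as a dry run since a live ffserver is required to actually run them.

```shell
# Sketch: start ffserver with an explicit config file, then publish a
# local file into the feed it exposes. Paths, port and feed name are
# assumptions for illustration.
START='ffserver -f /etc/ffserver.conf'
PUBLISH='ffmpeg -i input.mp4 http://localhost:8090/feed1.ffm'
echo "$START"
echo "$PUBLISH"
```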
These input streams are called feeds, and each one is specified by a <Feed> section in the configuration file. For each feed you can have different output streams in various formats, each specified by a <Stream> section in the configuration file. The server works by forwarding streams encoded by ffmpeg, or pre-recorded streams read from disk.
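A minimal sketch of such a configuration file might look as follows; the port, feed name and encoding parameters are illustrative values, not a recommended setup.

```
Port 8090
BindAddress 0.0.0.0
MaxClients 100

# One input feed, written by an ffmpeg instance over HTTP
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 5M
</Feed>

# One output stream derived from that feed
<Stream test.mpg>
Feed feed1.ffm
Format mpeg
VideoBitRate 256
VideoFrameRate 25
VideoSize 352x288
AudioBitRate 64
AudioSampleRate 44100
</Stream>
```

Several <Stream> sections can reference the same feed, which is how one input is served in multiple formats and bitrates.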
Ffserver acts as an HTTP server that accepts POST requests from ffmpeg to acquire the stream to publish, and it serves RTSP clients with the streamed media content. Each feed is identified by a unique name corresponding to the name of the resource published on ffserver, and is configured by a dedicated feed section in the configuration file.
FFM and FFM2 are the two formats used by the server. They can store a wide variety of video and audio streams together with their encoding options, and can even store a moving time segment of a whole movie.