
Basics of Streaming Protocols

Streaming of audio and video is a confusing subject.

This page aims to provide some of the basic concepts.

Streaming means sending data, usually audio or video, in a way that allows it to start being processed before it is completely received. Video clips on Web pages are a familiar example.

Progressive streaming, also known as progressive downloading, means receiving an ordinary file and starting to process it before it is completely downloaded. It requires no special protocols, but it does require a format that can be processed from partial content. Progressive streaming lacks the flexibility of true streaming: the data rate can't be adjusted on the fly, and the transmission can't be separated into multiple streams. "True" streaming uses a streaming protocol to control the transfer.
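To make the distinction concrete: a progressive download is nothing more than a plain HTTP request, which the player starts decoding as bytes arrive. The host and path below are hypothetical examples; the optional Range header is what lets a player resume or probe a partially downloaded file.

```shell
# Progressive streaming over plain HTTP: the player requests the file and
# decodes as bytes arrive. Host and path are hypothetical examples.
HOST=example.com
FILE=/videos/clip.webm
REQUEST="GET $FILE HTTP/1.1
Host: $HOST
Range: bytes=0-"   # optional: ask for the file from byte 0 onward
echo "$REQUEST"
```

A true streaming protocol such as RTSP or RTMP, by contrast, keeps a control channel open so the server can adjust the data rate mid-transfer.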

bst-player - Add sound and video to GWT applications using Windows Media Player, QuickTime, Flash, VLC Media Player plugins, DivX Web Player, HTML 5 Video or even yours, the GWT way!

The BST Player API makes media player plug-ins available in GWT applications like other GWT widgets.

It also features a module that exports the widgets as JavaScript objects for use in non-GWT applications. The API includes a skin interface that makes creating custom HTML-based players straightforward; the custom players can then be used as cross-browser media players. Starting with version 2.0, the API supports additional player widgets via the newly introduced API/Provider architecture.

Player Providers

The following player provider packages are available:

Core Player Provider: wraps the Windows Media Player, QuickTime, Flash, VLC Media Player, HTML 5 Video and DivX Web Player plugins as GWT widgets
YouTube Player Provider: brings YouTube videos to GWT, the BST Player way
Vimeo Player Provider: provides support for Vimeo video players

Stream live WebM video to browser using Node.js and GStreamer

Live WebM video streaming with Flumotion

The objective of this post is to set up live WebM streaming using Flumotion on a Linux box running Ubuntu (10.04 in this case).

WebM is a container format based on the Matroska container; it uses the Vorbis codec for audio and the VP8 codec for video, and it is a royalty-free format for the Web. H.264 is an open standard but is encumbered with patents. Although Ogg is quite similar to WebM, Ogg suffers from FUD regarding its royalty-free standing.

Install Flumotion

We'll install Flumotion version 0.8.1-1flu1~lucid1.

Build and install the VP8 plugin and GStreamer

Ubuntu Natty Narwhal (11.04) includes version 0.10.32 of GStreamer. Install the following packages (and their dependencies) using the Synaptic Package Manager or sudo apt-get install: bison, flex, libasound2-dev, libglib2.0-dev, libx11-dev, libxml2-dev, libvorbis-dev, x11proto-video-dev and yasm (yasm is required to build the VP8 library from Google). Then build and install the VP8 codec library, version 0.9.6, from source, and prepare GStreamer.
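A from-source build of the VP8 reference library (libvpx 0.9.6) follows the usual configure/make pattern. This is a minimal sketch, assuming a release tarball named libvpx-v0.9.6.tar.bz2 and default install paths; consult the libvpx documentation for the exact steps for your version.

```shell
# Minimal sketch of a from-source libvpx 0.9.6 build; the tarball name and
# configure flags are assumptions -- check the libvpx docs for your version.
tar xjf libvpx-v0.9.6.tar.bz2
cd libvpx-v0.9.6
./configure --enable-shared    # build a shared library for GStreamer to link
make
sudo make install
sudo ldconfig                  # refresh the dynamic linker cache
```

With the library installed, the GStreamer VP8 plugin can then be built against it.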

stream-m - A HTML5 compatible WebM live streaming server

Stream.m is created to be an open source solution for streaming live video right into the web browser using the HTML5 video tag and Google's WebM video format.

The current version is a working prototype that showcases the main ideas; the main design goal is low resource usage. It has a web interface with a real-time bandwidth monitor (at a resolution of 1/10 second) for spotting network congestion, and it supports simultaneous streams (channels). Note: the recommended ffmpeg options changed (again).

The live stream consists of fragments: self-contained units of frames that do not reference any frame outside the fragment. The ideal fragment size is around 200 kBytes (1600 kbits). For example, if you are publishing a 500 kbit/s stream at 16 fps, each fragment lasts 1600 / 500 = 3.2 seconds, which spans 3.2 × 16 = 51.2 frames, so every 52nd video frame should be a keyframe.
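The keyframe arithmetic above can be scripted. A small sketch, using the example's numbers; mapping the result to ffmpeg's -g (keyframe interval) option is an assumption, since the recommended ffmpeg options change between versions.

```shell
# Keyframe interval for ~1600-kbit fragments (values from the example above)
BITRATE_KBIT=500    # stream bitrate in kbit/s
FPS=16              # frames per second
FRAGMENT_KBIT=1600  # target fragment size in kbits

# frames per fragment = FRAGMENT / BITRATE * FPS; round up to a whole frame
INTERVAL=$(( (FRAGMENT_KBIT * FPS + BITRATE_KBIT - 1) / BITRATE_KBIT ))
echo "$INTERVAL"    # 52 -> force a keyframe every 52nd frame (e.g. ffmpeg -g 52)
```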

The server splits fragments when necessary. All operations are done over HTTP. The server is started with:

java StreamingServer <configfile>