Using gstreamer

GStreamer has made a lot of progress lately, and some say it already outperforms MPlayer thanks to its focus on speed and hardware acceleration (OpenGL output, for example).

GStreamer is a modular, node-based player and encoder in a single application. Pipelines are built by chaining together so-called elements, chosen from a wide range of plugins.
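As a minimal illustration of this chaining model (stock elements only, no camera required), a test pattern source can be wired straight to an X video window; each "!" links one element to the next, exactly as in the camera pipelines below:

gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! xvimagesink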

GStreamer and live video processing over the network

GStreamer is well suited to live video and audio processing over the network, notably live decoding, encoding and audio muxing.

Requirements

You will need:

  • a fairly recent GStreamer installation (for example the "gstreamer0.10" packages on Ubuntu), including the rtpjpegdepay plugin (you can check for it as shown below)
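To verify that the plugin is present, ask GStreamer's introspection tool about the element (shown here for the 0.10 series):

gst-inspect-0.10 rtpjpegdepay

If this prints the element description, you are set; if it reports that no such element or plugin exists, install your distribution's "good" plugin set (gstreamer0.10-plugins-good on Ubuntu).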

Limitations

You will not (at this time) be able to decode the video streams from Elphel 353 cameras at resolutions higher than 1920x1088, because of RTP payloading limits.

To bypass this limitation you can build GStreamer yourself (see this guide from PiTiVi).

Tips

You won't get 25 fps if autoexposure is on and the scene is not bright enough: the camera will automatically lower the frame rate to keep a clear picture. Either add more light, or play with the image settings (notably gain).

Commandline examples

(Note: adjust the width and height according to your camera setup and your computer's horsepower :p).

Displaying

gst-launch-0.10 rtspsrc location=rtsp://*CAMERA_IP*:554 ! rtpjpegdepay ! jpegdec ! queue ! ffmpegcolorspace ! xvimagesink sync=false
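If display latency matters, rtspsrc also has a latency property that controls its jitter buffer in milliseconds; a lower-latency variant of the command above might look like this (the value 100 is only an example to experiment with, and *CAMERA_IP* is a placeholder as before):

gst-launch-0.10 rtspsrc location=rtsp://*CAMERA_IP*:554 latency=100 ! rtpjpegdepay ! jpegdec ! queue ! ffmpegcolorspace ! xvimagesink sync=false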

Dumping

MJPEG dumping

gst-launch -v rtspsrc location=rtsp://elphel:554 ! queue ! rtpjpegdepay ! videorate ! capsfilter caps="image/jpeg, framerate=(fraction)25/1, width=1024, height=768" ! queue ! matroskamux ! filesink location=/tmp/test.mkv
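To check the recording, the Matroska dump can be played back with a similar pipeline (the path matches the filesink location used above):

gst-launch filesrc location=/tmp/test.mkv ! matroskademux ! jpegdec ! ffmpegcolorspace ! xvimagesink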

YUV Dumping

gst-launch -v rtspsrc location=rtsp://elphel:554 ! queue ! rtpjpegdepay ! queue ! jpegdec ! queue ! videorate ! capsfilter caps="video/x-raw-yuv, format=(fourcc)I420, width=(int)1024, height=(int)768, framerate=(fraction)25/1" ! queue ! avimux ! filesink location=/tmp/test.avi
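Keep in mind that raw I420 video is large: at 1024x768 each frame takes 1024 × 768 × 1.5 ≈ 1.2 MB, so at 25 fps the AVI grows by roughly 30 MB per second (around 1.7 GB per minute).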

Dump transcoding example

gst-launch filesrc location=test.mkv ! matroskademux ! queue ! jpegdec ! queue ! theoraenc bitrate=4000 ! queue ! oggmux ! filesink location=test.ogg
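The resulting Ogg/Theora file can be checked with a simple playback pipeline (assuming test.ogg sits in the current directory, as in the command above):

gst-launch filesrc location=test.ogg ! oggdemux ! theoradec ! ffmpegcolorspace ! xvimagesink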

Live encoding

gst-launch -v rtspsrc location=rtsp://elphel:554 ! queue ! rtpjpegdepay ! queue ! jpegdec ! queue ! videorate ! capsfilter caps="video/x-raw-yuv, format=(fourcc)I420, width=(int)1024, height=(int)768, framerate=(fraction)25/1" ! queue ! theoraenc bitrate=4000 ! queue ! oggmux ! filesink location=/tmp/test1024.ogg
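If you want to watch the stream while it is being recorded, the tee element can split the decoded video into a display branch and an encoding branch. This is an untested sketch along the lines of the pipelines above (same caveats about width, height and frame rate):

gst-launch -v rtspsrc location=rtsp://elphel:554 ! queue ! rtpjpegdepay ! queue ! jpegdec ! tee name=t  t. ! queue ! ffmpegcolorspace ! xvimagesink sync=false  t. ! queue ! videorate ! capsfilter caps="video/x-raw-yuv, format=(fourcc)I420, width=(int)1024, height=(int)768, framerate=(fraction)25/1" ! queue ! theoraenc bitrate=4000 ! queue ! oggmux ! filesink location=/tmp/test1024.ogg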

I did some benchmarks; a Core 2 Quad Q6600 (2.4 GHz) is not powerful enough for H.264 encoding at full HD resolution (using 4 threads).
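For reference, an H.264 recording pipeline of the kind used for that benchmark could look roughly like this; x264enc's bitrate is in kbit/s and threads sets the number of encoding threads (both values here are just examples, and the pipeline is a sketch rather than a tested command):

gst-launch -v rtspsrc location=rtsp://elphel:554 ! queue ! rtpjpegdepay ! queue ! jpegdec ! queue ! ffmpegcolorspace ! x264enc bitrate=8000 threads=4 ! queue ! matroskamux ! filesink location=/tmp/test-h264.mkv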