Using gstreamer


GStreamer is a modular, node-based player and encoder in a single application. It lets you build chains of so-called elements from a wide range of plugins.

GStreamer

GStreamer is a library for constructing graphs of media-handling components. The applications it supports range from simple Ogg/Vorbis playback and audio/video streaming to complex audio (mixing) and video (non-linear editing) processing.

Applications can take advantage of advances in codec and filter technology transparently. Developers can add new codecs and filters by writing a simple plugin with a clean, generic interface.

Installing GStreamer

NicoLargo made a script that automatically installs all the required packages for GStreamer on Ubuntu 10.04. It should also work on releases up to (K)Ubuntu 16.04.
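
If the script is unavailable, the packages can also be installed by hand; on a recent (K)Ubuntu the following set (package names taken from the standard Ubuntu repositories) should be enough for the GStreamer 1.0 examples on this page:

sudo apt-get install gstreamer1.0-tools gstreamer1.0-plugins-base \
     gstreamer1.0-plugins-good gstreamer1.0-plugins-bad \
     gstreamer1.0-plugins-ugly gstreamer1.0-libav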

Tips

How to get 25 FPS?

This depends on the sensor and the exposure:

  • In the sensor's datasheet, find the resolutions that allow the required FPS.
  • You won't get 25 fps if autoexposure is enabled and the scene is not bright enough: the camera automatically lowers the frame rate to keep a properly exposed picture. Either add more light or adjust the image settings (notably gain). To check what frame rate you actually receive, see the sketch after this list.
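
One quick way to measure the frame rate actually arriving from the camera is to swap the video sink for fpsdisplaysink (from gst-plugins-bad), which overlays the measured FPS on the video; the address below is just the one used in the examples further down:

gst-launch-1.0 souphttpsrc is-live=true location=http://192.168.0.8:2323/mimg ! jpegdec ! videoconvert ! fpsdisplaysink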


Command line examples

Note: adjust the width and height according to your camera setup and your computer's horsepower.

GStreamer 1.0

Display

1. Multipart JPEG (mjpeg), 1 channel

gst-launch-1.0 souphttpsrc is-live=true location=http://192.168.0.8:2323/mimg ! jpegdec ! xvimagesink

Note:

  • On the 10393 the ports are 2323-2326 (one per sensor channel)
  • On the 10353 the port is 8081
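
Going by the note above, the same pipeline for a 10353-based camera should use port 8081 (the camera IP 192.168.0.9 is assumed here, as in the RTSP examples below):

gst-launch-1.0 souphttpsrc is-live=true location=http://192.168.0.9:8081/mimg ! jpegdec ! xvimagesink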

2. Multipart JPEGs (mjpeg), 4 channels in 1 window

  • Each source is resized to 640x480
  • Text overlay and borders added

display4in1.sh:

#!/bin/sh
LOC1="http://192.168.0.8:2323/mimg"
LOC2="http://192.168.0.8:2324/mimg"
LOC3="http://192.168.0.8:2325/mimg"
LOC4="http://192.168.0.8:2326/mimg"

LABEL1="CHN1"
LABEL2="CHN2"
LABEL3="CHN3"
LABEL4="CHN4"

#souphttpsrc
SOUP="souphttpsrc is-live=true"
#image parameters
PI="image/jpeg,width=1,height=1,framerate=1000/1,pixel-aspect-ratio=1/1"
#video parameters
PV="videoscale ! video/x-raw,width=640,height=480"
#videobox parameters
PVB="videobox fill=Black top=-4 bottom=-2 left=-4 right=-2 border-alpha=0.5"
#textoverlay parameters
PTO="textoverlay font-desc=\"Sans 24\" shaded-background=true valignment=top halignment=left"

gst-launch-1.0 -ve videomixer name=mix \
    sink_0::alpha=1 sink_0::xpos=0   sink_0::ypos=0 \
    sink_1::alpha=1 sink_1::xpos=646 sink_1::ypos=0 \
    sink_2::alpha=1 sink_2::xpos=0   sink_2::ypos=486 \
    sink_3::alpha=1 sink_3::xpos=646 sink_3::ypos=486 \
    ! \
    xvimagesink sync=false \
    $SOUP location=$LOC1 ! $PI ! jpegdec ! $PV ! $PVB ! $PTO text=$LABEL1 ! mix. \
    $SOUP location=$LOC2 ! $PI ! jpegdec ! $PV ! $PVB ! $PTO text=$LABEL2 ! mix. \
    $SOUP location=$LOC3 ! $PI ! jpegdec ! $PV ! $PVB ! $PTO text=$LABEL3 ! mix. \
    $SOUP location=$LOC4 ! $PI ! jpegdec ! $PV ! $PVB ! $PTO text=$LABEL4 ! mix.
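
Save the script, make it executable and run it (the camera IP and per-channel ports are the ones assumed above):

chmod +x display4in1.sh
./display4in1.sh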

3. RTSP stream from the camera

Notes:

  • Be careful with streams at resolutions above 1920x1088 - there may be problems even with resizing
  • 10353 - works
  • 10393 - does not have a streamer yet

a. native resolution

gst-launch-1.0 rtspsrc location=rtsp://192.168.0.9:554 ! rtpjpegdepay ! jpegdec ! xvimagesink sync=false

b. with resize

gst-launch-1.0 rtspsrc location=rtsp://192.168.0.9:554 ! rtpjpegdepay ! jpegdec ! videoscale ! video/x-raw,width=640,height=480 ! xvimagesink sync=false
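
For recording instead of displaying, a GStreamer 1.0 counterpart of the 0.10 MJPEG dump further down should look roughly like this (an untested sketch: the depayloaded JPEG frames are written into a Matroska container without re-encoding):

gst-launch-1.0 rtspsrc location=rtsp://192.168.0.9:554 ! rtpjpegdepay ! matroskamux ! filesink location=/tmp/test.mkv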

GStreamer 0.10

Display

Be careful with streams at resolutions above 1920x1088.

Display the camera's live video stream at its native resolution:

gst-launch-0.10 rtspsrc location=rtsp://192.168.0.9:554 ! rtpjpegdepay ! jpegdec ! queue ! ffmpegcolorspace ! xvimagesink sync=false

Display the camera's live video stream, resized to fit the window (the videoscale element takes care of that):

gst-launch-0.10 rtspsrc location=rtsp://192.168.0.9:554 ! rtpjpegdepay ! jpegdec ! queue ! ffmpegcolorspace ! videoscale ! xvimagesink sync=false

Dumping

MJPEG dumping

gst-launch -v rtspsrc location=rtsp://192.168.0.9:554 ! queue ! rtpjpegdepay ! videorate ! capsfilter caps="image/jpeg, framerate=(fraction)25/1, width=1024, height=768" ! queue ! matroskamux ! filesink location=/tmp/test.mkv

YUV Dumping

gst-launch -v rtspsrc location=rtsp://192.168.0.9:554 ! queue ! rtpjpegdepay ! queue ! jpegdec ! queue ! videorate ! capsfilter caps="video/x-raw-yuv, format=(fourcc)I420, width=(int)1024, height=(int)768, framerate=(fraction)25/1" ! queue ! avimux ! filesink location=/tmp/test.avi

Dump transcoding example

gst-launch filesrc location=test.mkv ! matroskademux ! queue ! jpegdec ! queue ! theoraenc bitrate=4000 ! queue ! oggmux ! filesink location=test.ogg
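
To check the result, the transcoded file can be played back with playbin2 (assuming test.ogg was written to the current directory):

gst-launch-0.10 playbin2 uri=file://$PWD/test.ogg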

Live encoding

gst-launch -v rtspsrc location=rtsp://192.168.0.9:554 ! queue ! rtpjpegdepay ! queue ! jpegdec ! queue ! videorate ! capsfilter caps="video/x-raw-yuv, format=(fourcc)I420, width=(int)1024, height=(int)768, framerate=(fraction)25/1" ! queue ! theoraenc bitrate=4000 ! queue ! oggmux ! filesink location=/tmp/test1024.ogg

I did some benchmarks; a Core 2 Quad Q6600 (2.4 GHz) is not powerful enough for H.264 encoding at full-HD resolution (using 4 threads).
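
For reference, an H.264 version of that benchmark pipeline would look roughly like the Theora one above with the encoder and muxer swapped out, for example (a sketch, assuming x264enc from gst-plugins-ugly is installed; its bitrate is given in kbit/s):

gst-launch -v rtspsrc location=rtsp://192.168.0.9:554 ! queue ! rtpjpegdepay ! queue ! jpegdec ! queue ! videorate ! capsfilter caps="video/x-raw-yuv, format=(fourcc)I420, width=(int)1920, height=(int)1088, framerate=(fraction)25/1" ! queue ! x264enc bitrate=4000 threads=4 ! queue ! matroskamux ! filesink location=/tmp/test1920.mkv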