RasPi in the Sky – gstreamer first steps

See this for an explanation of the name 🙂

I did a bit of research into video streaming on the Pi and there seem to be several methods in use, including VLC, netcat, and various others, but they all seemed to be a little high latency. As we are going to be doing live video to a screen we want to keep that down, so I went with GStreamer.

The first page I came across seemed ideal: http://pi.gbaman.info/?p=150. It got me up and going, but I ran into issues with the sending and listening commands. http://blog.tkjelectronics.dk/2013/06/how-to-stream-video-and-audio-from-a-raspberry-pi-with-no-latency/ helped with sending, but as the listening commands on both pages were for a Mac I was out of luck. After a quick poke around mailing lists and some Google-fu I now have it up and running.


So, after setting up the environment as described in either of the above, I have been using the following commands. I am not a GStreamer scientist and there are probably much better ways of doing it, but this seems to be working here.
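For reference, this is roughly what "setting up the environment" amounts to on the Pis, assuming a reasonably current Raspbian image. The guides above pre-date GStreamer 1.0 being in the standard repositories, so treat the package names below as my best guess rather than a tested recipe:

# enable the camera module (under the interfacing/camera options)
sudo raspi-config

# GStreamer 1.0 command-line tools plus the usual plugin sets;
# gstreamer1.0-omx provides the Pi's hardware decoder (omxh264dec)
sudo apt-get update
sudo apt-get install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-omx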

Sending Pi (pivate-eye). Note that you will have to change the IP address:

raspivid -t 999999 -w 1080 -h 720 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=10.42.0.112 port=5000

10.42.0.112 is the address of the sending Pi.
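To help make sense of it, here is the same sending pipeline again with each stage commented. This is just my reading of what each element does, not official documentation:

# raspivid: -t 999999 runs for ~1000 seconds, -w/-h set the frame size, -fps 25 is the frame rate,
# -hf flips the image horizontally, -b 2000000 asks for a 2 Mbit/s H.264 stream, -o - writes it to stdout
# fdsrc          reads that H.264 byte stream from stdin
# h264parse      parses it into well-formed H.264 units
# rtph264pay     packetises into RTP; config-interval=1 resends the SPS/PPS headers every second, pt=96 is the payload type
# gdppay         wraps everything in GStreamer Data Protocol so the stream caps survive the TCP link
# tcpserversink  listens on the sending Pi's own address and port and serves the stream over TCP
raspivid -t 999999 -w 1080 -h 720 -fps 25 -hf -b 2000000 -o - | \
  gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
  gdppay ! tcpserversink host=10.42.0.112 port=5000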

On the listening Pi (pi-tv), watch out for the IP address in it:

gst-launch-1.0 -v tcpclientsrc host=10.42.0.112 port=5000 ! gdpdepay ! rtph264depay ! h264parse ! omxh264dec ! videoconvert ! autovideosink sync=false
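And the receiving side broken down the same way, followed by an untested sketch for a non-Pi Linux box. omxh264dec is the Pi's hardware decoder, so my assumption is that a laptop would need a software decoder such as avdec_h264 (from the libav plugins) in its place:

# tcpclientsrc   connects to the sending Pi
# gdpdepay       removes the GStreamer Data Protocol framing
# rtph264depay   pulls the H.264 stream back out of the RTP packets
# h264parse      re-parses it for the decoder
# omxh264dec     the Pi's hardware H.264 decoder
# videoconvert   converts the decoded frames to whatever the sink wants
# autovideosink  picks a suitable video output; sync=false stops it waiting on timestamps, which keeps latency down
gst-launch-1.0 -v tcpclientsrc host=10.42.0.112 port=5000 ! gdpdepay ! \
  rtph264depay ! h264parse ! omxh264dec ! videoconvert ! autovideosink sync=false

# untested guess for a non-Pi Linux receiver: swap the hardware decoder for a software one
gst-launch-1.0 -v tcpclientsrc host=10.42.0.112 port=5000 ! gdpdepay ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false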

Latency:


I have done some quick tests of the latency of the video link and how it changes with resolution. These are only very quick tests, so they may not be that accurate. The methodology was to put a stopwatch on screen, stream the screen via the webcam with both the on-screen clock and the video playback of the clock visible, take a picture using a digital camera, and then subtract the difference between the two clocks.

320 x 240 = 176 ms

640 x 480 = 137 ms

800 x 600 = 325 ms

As you can see, these numbers don't quite look right (they don't scale with resolution the way you might expect), but it was just a quick test to get some ballpark figures. I have also not run multiple tests over time to see if there is any drift in the lag, so there is a lot more work required here. There is also a good chance that my GStreamer commands could be optimised and improved.
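One obvious thing to try on the optimisation front (untested here, so very much a sketch) would be switching from TCP to plain RTP over UDP, which drops the gdppay/gdpdepay framing and any TCP retransmission delay. Note that with udpsink the host is the receiver's address, not the sender's, and 10.42.0.113 below is just a made-up example address for the listening Pi:

# on the sending Pi: point udpsink at the receiver
raspivid -t 999999 -w 1080 -h 720 -fps 25 -hf -b 2000000 -o - | \
  gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=10.42.0.113 port=5000

# on the listening Pi: udpsrc needs the caps spelled out because there is no GDP framing to carry them
gst-launch-1.0 -v udpsrc port=5000 \
  caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! \
  rtph264depay ! h264parse ! omxh264dec ! videoconvert ! autovideosink sync=false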

Also, can I just say a big thanks to Odie_ on #raspberrypi on Freenode, who correctly spotted that I hadn't checked my camera when I couldn't get it working, suggested the latency methodology, and generally kept prodding me until I got it working and documented on the net 🙂

3 thoughts on “RasPi in the Sky – gstreamer first steps”

  1. Many thanks.
    I had exactly the same issue as you: using GStreamer 1.0 to transmit from a Raspberry Pi to a receiving Macintosh, but I wanted a transmitting-Pi-to-receiving-Pi setup.
    You have saved me hours of time working out the proper command line for receiving on the Pi.

    After trying out a few streaming solutions, I agree: GStreamer so far provides the lowest latency, which is especially important for bots and drones.

    My setup is a Raspberry Pi A with the camera board transmitting via Wi-Fi. My receiver is a Raspberry Pi B with Wi-Fi, attached via an HDMI cable to my HD TV. The picture is very clear on a 32-inch screen and the latency is low, even over Wi-Fi.

    The stream works over local-network and Internet IP addresses, using port forwarding through my home router to the transmitting Pi.

    Many thanks again to you and those who helped along the way.

  2. Hello. I am probably running into the same problems as you. I am trying to use my laptop with Debian 7.1.0 as the receiver, and when I use the command for the Mac the response is:

    Setting pipeline to PAUSED ...
    No protocol specified
    No protocol specified
    No protocol specified
    libEGL warning: DRI2: xcb_connect failed
    No protocol specified
    libEGL warning: DRI2: xcb_connect failed
    libEGL warning: GLX: failed to load GLX
    No protocol specified
    No protocol specified
    ERROR: Pipeline doesn't want to pause.
    ERROR: from element /GstXvImageSink:autovideosink0-actual-sink-xvimage: Could not initialise Xv output
    Additional debug info:
    xvimagesink.c(1291): gst_xvimagesink_xcontext_get (): /GstXvImageSink:autovideosink0-actual-sink-xvimage:
    Could not open display
    Setting pipeline to NULL ...
    Freeing pipeline ...

    and when using a command like yours it says:

    WARNING: erroneous pipeline: no element "omxh264dec"
    which means that there is no omxh264dec element.

    Any hints, please?

    Thanks / Borja

  3. Hi Borja,

    Have you got the same packages installed on the receiving laptop? You can use the same apt-get commands on the Pi and the laptop 🙂
