
SRT in GStreamer

Olivier Crête
February 16, 2018

Transmitting low-delay, high-quality video over the Internet is hard. The trade-off is normally between video quality and transmission delay (or latency). Internet video has so far been segregated into two segments: video streaming and video calls. On one side, streaming video has taken over the world of video distribution using segmented streaming technologies such as HLS and DASH, allowing services like Netflix to flourish. On the other side, you have VoIP systems, which generally target relatively low bitrates using low-latency technologies such as RTP and WebRTC, and which don't produce broadcast-grade results. SRT bridges that gap by allowing the transfer of broadcast-grade video at low latencies.

The SRT protocol achieves this goal using two techniques. First, if a packet is lost, it is retransmitted, but only for a certain amount of time determined by the configured latency; this means the latency is bounded by the application. Second, SRT estimates the available bandwidth using algorithms inherited from UDT. This lets it avoid sending at a rate that exceeds the link's capacity, and it also makes this information available to the application (i.e. the encoder) so that the encoding bitrate can be adjusted to stay within the available bandwidth, ensuring the best possible quality. Combining these techniques, we can achieve broadcast-grade video over the Internet if the bandwidth is sufficient.
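To make the first technique concrete, here is a toy model of the latency-bounded retransmission decision: a lost packet is only worth re-requesting while a retransmitted copy could still arrive before its play-out deadline. This is a sketch of the idea, not the real SRT implementation; the function name and the round-trip-time parameter are illustrative assumptions.

```python
def should_retransmit(send_time_ms: float, now_ms: float,
                      latency_ms: float, rtt_ms: float) -> bool:
    """Toy model of SRT's bounded retransmission: a lost packet is
    only worth re-requesting if a retransmitted copy (arriving about
    one RTT from now) can still make its play-out deadline, which is
    the original send time plus the configured latency."""
    deadline_ms = send_time_ms + latency_ms
    return now_ms + rtt_ms <= deadline_ms

# With a 120 ms latency budget and a 30 ms round trip, a loss noticed
# 50 ms after sending is still recoverable...
print(should_retransmit(0, 50, 120, 30))   # True
# ...but one noticed 100 ms after sending is not: SRT gives up, and
# that is what keeps the end-to-end delay bounded.
print(should_retransmit(0, 100, 120, 30))  # False
```

Past the deadline, SRT simply stops retrying and the receiver conceals the loss, which is exactly what keeps the end-to-end delay fixed instead of growing with every lost packet.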

At Collabora, we're very excited by the possibilities created by SRT, so we decided to integrate it into GStreamer, the most versatile multimedia framework out there. SRT is a connection-oriented protocol, so it connects two peers. It supports two different modes: one in which there is a caller and a listener (so it works like TCP), and one called "rendezvous mode" in which both sides call each other, making it firewall-friendly. An SRT connection can also act in one of two roles, either as a receiver or as a sender, or in GStreamer-speak as a source or as a sink. In GStreamer, we chose to create four different elements: srtserversink, srtclientsink, srtserversrc, and srtclientsrc. We decided on the client/server naming instead of caller/listener as we think it's easier to understand and it matches the naming we use for our TCP-based elements. We also chose to implement rendezvous mode inside the client elements since, after initialization, the code paths are the same.
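As a quick reference, the four elements above can be summarized by role (connection setup) and direction (data flow). This little lookup table is only an illustration of the naming described in the paragraph above, not part of any GStreamer API:

```python
# Mapping of (role, direction) to the GStreamer 1.14-era SRT elements.
# "listener"/"caller" are the SRT protocol terms; the element names use
# the client/server naming chosen for GStreamer.
ELEMENTS = {
    ("listener", "send"):    "srtserversink",
    ("caller",   "send"):    "srtclientsink",
    ("listener", "receive"): "srtserversrc",
    ("caller",   "receive"): "srtclientsrc",
}

def element_for(role: str, direction: str) -> str:
    """Look up the 1.14-era element for a given SRT role and direction."""
    return ELEMENTS[(role, direction)]

print(element_for("caller", "receive"))  # srtclientsrc
```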

A typical example would be to have an encoder which is also a server, with a pipeline like:

gst-launch-1.0 v4l2src ! video/x-raw, height=1080, width=1920 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtserversink uri=srt://:8888/

And a receiver on the other side would receive it with a pipeline like this one:

gst-launch-1.0 srtclientsrc uri=srt://192.168.1.55:8888 ! decodebin ! autovideosink

Using tools like gst-launch, it's very easy to prototype SRT and its integration into real-world pipelines that can be used in real applications. Our team at Collabora would love to help you integrate SRT into your platform, using GStreamer, FFmpeg, VLC or your own multimedia framework. Contact us today to see how we can help!

Update (Jan 2019):

In GStreamer 1.16, we've decided to merge the srtclientsrc and srtserversrc elements into a single source element, srtsrc, and likewise the two sinks into srtsink. So the example pipelines in 1.16 are:

  gst-launch-1.0 -v videotestsrc ! video/x-raw, height=1080, width=1920 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtsink uri=srt://:8888/

and

  gst-launch-1.0 srtsrc uri=srt://192.168.1.55:8888 ! decodebin ! autovideosink
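As the comment thread below shows, srt:// URIs can also carry options such as mode and latency as query parameters. The following sketch, plain Python independent of GStreamer, assembles and parses such URIs; the build_srt_uri and srt_uri_options helpers are hypothetical names introduced here for illustration:

```python
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

def build_srt_uri(host: str, port: int, **options) -> str:
    """Assemble an srt:// URI (illustrative helper, not a GStreamer
    API). An empty host (srt://:8888) means "bind on all interfaces";
    keyword arguments become query options such as mode=caller or
    latency=125."""
    return urlunsplit(("srt", f"{host}:{port}", "/", urlencode(options), ""))

def srt_uri_options(uri: str) -> dict:
    """Parse the query options back out of an srt:// URI."""
    return {k: v[0] for k, v in parse_qs(urlsplit(uri).query).items()}

uri = build_srt_uri("192.168.1.55", 8888, mode="caller", latency=125)
print(uri)                   # srt://192.168.1.55:8888/?mode=caller&latency=125
print(srt_uri_options(uri))  # {'mode': 'caller', 'latency': '125'}
```

The resulting string can be handed straight to the uri property of srtsrc or srtsink, as in the gst-launch examples above.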

Comments (27)

  1. nh2:
    Jan 22, 2019 at 03:02 AM

    gst-launch-1.0 srtclientsrc srt://192.168.1.55:8888 ! decodebin ! autovideosink doesn't seem to work (any more?).

    Should probably be gst-launch-1.0 srtclientsrc uri=srt://192.168.1.55:8888 ! decodebin ! autovideosink

    See also https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/issues/874 for various problems.


    1. Olivier Crête:
      Jan 22, 2019 at 03:00 PM

      Yes, git master has known issues; they will be fixed before the next release.


      1. nh2:
        Jan 22, 2019 at 04:58 PM

        The incompatibility of the commands in the post was also present in the 1.14 release, not only in master. But your reply below may already have captured that.


    2. Olivier Crête:
      Jan 22, 2019 at 04:37 PM

      I've just noticed the missing uri=, we've updated the blog post. Also in 1.16, we're merging the client & server src and sinks. I'll be updating the blog post to cover that.


  2. Manuel:
    May 20, 2019 at 07:02 AM

    I'm using...

    gst-launch-1.0 -v srtsrc uri="srt://server_ip:10006?mode=caller&pksize=1316&latency=500&blocksize=1316" ! tsdemux ! opusdec ! audioconvert ! wasapisink sync=false low-latency=true

    When I simulate packet loss I get this error:

    ERROR: from element /GstPipeline:pipeline0/GstSRTSrc:srtsrc0: Internal data stream error.
    Additional debug info:
    ../libs/gst/base/gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstSRTSrc:srtsrc0:
    streaming stopped, reason error (-5)

    The reception of the stream stops.
    When I perform the same simulation but sending UDP with srt-live-transmit on the same machine and receiving with GStreamer's udpsrc, the reception does not stop.
    Is this likely a bug in the plugin, or am I passing the wrong parameters to srtsrc?


    1. Olivier Crête:
      May 21, 2019 at 04:13 PM

      This does sound like a bug in the GStreamer plugin. Could you please file a bug at https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/issues/? Thank you!


      1. Niklas Hambuechen:
        May 21, 2019 at 05:35 PM

        When you've filed a bug, please link it here, as I'm also interested in that topic.

        I, too, observed that in some situations the GStreamer pipeline just stops, especially when the stream's bitrate exceeds the capacity of the connection.


  3. Anonymous:
    Aug 27, 2019 at 01:17 PM

    Hello,
    How can I replace "videotestsrc" with a real ".mp4" (h264) file? I did not manage to do it, yet it would certainly be of interest.
    Thank you.


    1. Olivier Crête:
      Aug 27, 2019 at 06:10 PM

      You probably want something like: filesrc location=myfile.mp4 ! qtdemux ! h264parse ! mpegtsmux alignment=7 ! srtsink


      1. Anonymous:
        Aug 28, 2019 at 10:19 AM

        Thanks for your help!
        My stream does arrive at the client.
        Well, it freezes after a few seconds and there is no sound, but that's already something ;D


  4. Marko:
    Jan 15, 2020 at 01:56 PM

    Hello,
    I've tried using GStreamer for SRT and always get an erroneous pipeline, be it as a client or as a server, using both the old and the new commands.
    It says there's no such thing as an srtsink or srtsrc.
    I've tried scouring the web to find a solution but was unable to find anything helpful.
    I'm using the latest version of Ubuntu.
    I'd appreciate some help with this problem
    Thanks!


    1. Olivier Crête:
      Jan 15, 2020 at 04:07 PM

      It's possible that Ubuntu doesn't ship the SRT plugin yet (Fedora doesn't either). So you would need to compile GStreamer yourself.


      1. Marko:
        Jan 16, 2020 at 06:20 AM

        Thank you for your response, I'll try compiling it myself.


        1. Andi:
          Jan 20, 2020 at 06:15 PM

          Hi,

          I was facing the same issue, and as the post is fresh, I want to share my experience.
          I am using Ubuntu 18.04 and GStreamer 1.14.5 (in this version, it is still srtserversink, srtclientsrc, etc., NOT srtsink/srtsrc). The SRT modules are part of the bad plugins; however, they are not included when installing those via apt-get. One can check via the command gst-inspect-1.0 srtserversink, srtclientsrc, etc.

          As already pointed out by Olivier, this means that you need to compile them yourself. To start with the bad plugins, I got version 1.14.5 of the source files from https://gstreamer.freedesktop.org/src/gst-plugins-bad/, unpacked them, cd'd into the directory and ran ./configure. It told me "Plug-ins with dependencies that will NOT be built: srt [and a lot more bad plugins]", because libsrt.so was missing. That means you also need to compile it yourself. I got the code from https://github.com/Haivision/srt and ran ./configure && make && sudo make install. It got installed to /usr/local/lib, /usr/local/bin and /usr/local/include.

          Unfortunately, in my case, that is not the location where GStreamer is installed and looks for libsrt; that is /usr/lib/x86_64-linux-gnu, /usr/include and /usr/bin. As the ./configure of srt ignores the libdir, includedir and bindir prefixes, I installed srt somewhere else and moved the files manually to the directories where GStreamer looks for them. After installing libsrt, I ran the bad plugins' ./configure --prefix=/usr/lib/x86_64-linux-gnu. Checking the console output, it now said it would install srt. I then installed the bad plugins via "make && sudo make install". Checking gst-inspect-1.0 srtserversink, the srt plugins are finally available.

          At last, I wanted to run the example of this page:
          SENDER:
          gst-launch-1.0 videotestsrc ! video/x-raw, height=1080, width=1920 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtserversink uri=srt://127.0.0.1:7001/

          RECEIVER:
          gst-launch-1.0 srtclientsrc uri=srt://127.0.0.1:7001 ! decodebin ! autovideosink

          The console outputs are:
          SENDER:
          Setting pipeline to PAUSED ...
          Pipeline is PREROLLING ...
          Redistribute latency...
          Pipeline is PREROLLED ...
          Setting pipeline to PLAYING ...
          New clock: GstSystemClock

          RECEIVER:
          Setting pipeline to PAUSED ...
          Pipeline is live and does not need PREROLL ...
          Setting pipeline to PLAYING ...
          New clock: GstSystemClock

          Everything seems to run fine, however, no render window appears showing my decoded videotestsrc, as I would have expected.

          Running the pipeline without the srt transmission, works and the window appears.
          PIPELINE:
          gst-launch-1.0 videotestsrc ! video/x-raw, height=1080, width=1920 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! decodebin ! autovideosink

          What am I missing at this point?

          Thanks in advance!

          Andi


          1. Olivier Crête:
            Jan 20, 2020 at 06:38 PM

            The solution to your problem is that the URI for the server should not have an IP address, so it would be something like srt://:7001. If you want to bind to a specific local address, you need to use the "local-address" property on srtsink in the new version.


            1. Andi:
              Jan 20, 2020 at 08:41 PM

              Unfortunately, this does not do the trick for me. I checked with gst-inspect-1.0 and the default uri is set with the localhost:
              gst-inspect-1.0 srtserversink
              [...]
              uri : URI in the form of srt://address:port
              flags: readable, writable
              String. Default: "srt://127.0.0.1:7001"
              [...]

              The two pipelines are aware of each other, as the receiver part remains in PAUSED and times out when the sender part is not launched. I also checked with Wireshark: the two parts exchange a few "UDT type: handshake" packets in the beginning and "UDT type: keepalive" packets every second afterwards, but no actual video packets.


              1. Olivier Crête:
                Jan 20, 2020 at 08:45 PM

                Tbh, I'm not sure what you should do. Maybe run it with GST_DEBUG=GST_SCHEDULING:5 and check if the buffers get to the sink; that environment variable makes GStreamer print a message every time a buffer enters an element. It's probably better if we continue this on the gstreamer-devel mailing list (or in the #gstreamer IRC channel).


  5. Gianluca:
    Mar 08, 2021 at 04:49 PM

    Hi, I am using Gstreamer to send my windows desktop via srt. I use the code you see below. How to add desktop audio to the video stream? (windows 10 pro)

    gst-launch-1.0 -v gdiscreencapsrc cursor=true ! video/x-raw ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! queue ! srtserversink uri=srt://192.168.1.30:7001


    1. Olivier Crête:
      Mar 30, 2021 at 05:09 PM

      You want to add another branch before the mpegtsmux. Something like this:

      gst-launch-1.0 -v gdiscreencapsrc cursor=true ! video/x-raw ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux name=mux ! queue ! srtserversink uri=srt://192.168.1.30:7001 autoaudiosrc ! avenc_aac ! aacparse ! mux.


  6. Edwin:
    Apr 12, 2022 at 05:44 PM

    Hello,

    Trying to play with the latency parameter but it looks like it has mostly no impact until some low values where it crashes the whole thing down.

    My pipe is quite complex (compose from multicam, adds audio, etc.) and ends up with `srtsink uri=srt://:8888?latency=5000`.
    On the receiver side it's just `srtsrc uri=srt://192.168.2.38:8888 ! ... ! xvimagesink`

    No matter the value of latency on the srtsink (1 or 5000) I always have about 2 sec between capture and display.

    If I put the latency parameter (the value has no impact) on srtsrc, I get one of these two errors (with a still image, sometimes just a partial image):

    16:39:00.195144/SRT:RcvQ:worker*E:SRT.c: %1008260167:No room to store incoming packet: offset=1 avail=0 ack.seq=1105721023 pkt.seq=1105721024 rcv-remain=8191
    WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped.
    Additional debug info:
    gstbasesink.c(3003): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
    There may be a timestamping problem, or this computer is too slow.

    Or more often :
    16:43:02.988411/SRT:RcvQ:worker*E:SRT.c: %243350002:No room to store incoming packet: offset=0 avail=0 ack.seq=2087558183 pkt.seq=2087558183 rcv-remain=8191
    16:43:02.988667/SRT:RcvQ:worker*E:SRT.c: %243350002:No room to store incoming packet: offset=1 avail=0 ack.seq=2087558183 pkt.seq=2087558184 rcv-remain=8191

    Any idea what I'm missing ?


    1. Olivier Crête:
      Apr 12, 2022 at 07:10 PM

      It's hard to know without testing the whole pipeline. But I tried with simple pipelines like:

      v4l2src ! videoconvert ! x264enc tune=zerolatency ! video/x-h264,profile=high ! mpegtsmux alignment=7 ! srtsink uri=srt://:8888 latency=125 mode=listener wait-for-connection=false

      and

      srtsrc latency=125 mode=caller uri=srt://127.0.0.1:8888 ! tsdemux latency=100 ! h264parse ! avdec_h264 ! xvimagesink

      And it seems to work. Notice how I reduced the latency of tsdemux. The MPEG-TS spec requires 700 ms of latency, but it can generally be reduced a lot.


      1. Edwin :
        Apr 15, 2022 at 10:03 AM

        Thanks for the suggestions.

        I'll test that and keep you posted.

        If you want to have a look at the full pipes this is the (open source) project : https://github.com/aethernet/portal/ (pipes are described in the readme).


  7. Sujin Kim:
    Jun 19, 2023 at 09:29 AM

    Hi,
    I've started to study SRT since reading your blog Olivier.
    However, I am suffering some issues with my simple gst-launch command line, which was made referring to your blog.
    Will be very thankful if you check and leave some comments here : https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1775


  8. usama:
    Sep 25, 2023 at 07:56 PM

    Can I use a single port for publishing multiple SRT streams, and play those streams from that same port based on different stream ids?


  9. Uku Kert:
    Apr 19, 2024 at 09:20 AM

    Hello! Thank you so much for the srtsink and srtsrc!

    Is there a possibility of pausing/resuming the SRT connection? I've managed to set the element state to PAUSED, and the element stops sending packets successfully. However, when trying to resume the element by setting the state to PLAYING, it fails, I start getting the following error, and the pipeline terminates:
    ```
    Error: gst-resource-error-quark: Failed to write to SRT socket: Failed to poll socket: Operation not supported: Bad parameters: Resource temporarily unavailable (10): ../subprojects/gst-plugins-bad/ext/srt/gstsrtsink.c(205): gst_srt_sink_render (): /GstPipeline:pipeline0/GstBin:srt-sink-bin/GstSRTSink:srtsink0
    ```
    I have been digging around the SRT and srtsink source code, and I found that both should support the PAUSED state. For the receiver, I am using the following ffplay command:
    ```
    ffplay -i "srt://172.17.0.1:9999/masp/devstream?rcvbuf=150000000&mode=listener"
    ```
    Any help regarding this topic would be greatly appreciated!


    1. Olivier Crête:
      Apr 19, 2024 at 03:59 PM

      Looking at the source code for srtsink, the PAUSED state is treated the same as the PLAYING state, so it shouldn't disconnect. The only difference is that in the PAUSED state it shouldn't be sending anything. I wonder if there is some kind of timeout which you may not be catching; are you listening on the GstBus for any errors? If you have further questions, the best place to ask is the GStreamer Discourse.


