This is a description of the GStreamer integration into the Mozilla code base.
The main integration work is done in nsGStreamerDecoder.cpp and its corresponding .hpp file. Both files can be found in the content/media/video/src folder.
Until the patch has landed in the main tree, it can be found in this bug that tracks the integration.
This is what it says in the comments for the function in nsMediaDecoder.h:
// Called by the element when the playback rate has been changed.
// Adjust the speed of the playback, optionally with pitch correction,
// when this is called.
As we found here, the element is the object in the code that corresponds to the HTML5 video element. I assume the intention of this is to allow the element to change the speed of the stream, e.g. to play in slow motion or at high speed.
Currently none of the other decoders implement this function; they just return NS_ERROR_NOT_IMPLEMENTED.
The previous implementation of this function called "GetPlaybackRate" on the element to retrieve the rate, but that function appears to have been removed.
Until I find a way to retrieve the playback rate, I have changed this function to return NS_ERROR_NOT_IMPLEMENTED like the other decoders. Since a search of the code base doesn't reveal any callers of this function, this should be safe.
TODO: Find how to retrieve the play rate
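Once the rate can be obtained from the element again, GStreamer itself supports rate changes through a flushing seek event. A minimal sketch, assuming the mPipeline member from below and a hypothetical aRate argument (GStreamer 0.10 API):

```cpp
// Sketch: change the playback rate of a running pipeline by seeking
// to the current position with a new rate. set_playback_rate and
// its arguments are illustrative, not part of the actual patch.
static gboolean
set_playback_rate(GstElement* pipeline, gdouble rate)
{
  gint64 pos;
  GstFormat fmt = GST_FORMAT_TIME;

  // Query the current position so playback continues from here.
  if (!gst_element_query_position(pipeline, &fmt, &pos))
    return FALSE;

  // Seek to the same position, but with the new rate applied.
  return gst_element_seek(pipeline, rate, GST_FORMAT_TIME,
                          GST_SEEK_FLAG_FLUSH,
                          GST_SEEK_TYPE_SET, pos,
                          GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE);
}
```

This would give us the plumbing for the slow motion / high speed case once GetPlaybackRate (or its replacement) is found.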
This is a new function I have added. It is used by the nsGStreamerDecoder::DoHandoff function to have the code call back into the class from the main thread.
The reason this dummy function is needed is that the helpers in nsThreadUtils.h expect the method to have a return type; otherwise the compiler gets confused about which function to call.
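For reference, the dispatch pattern with the nsThreadUtils.h helpers looks roughly like this; HandoffCallback stands in for the dummy method discussed above, and everything except NS_NewRunnableMethod and NS_DispatchToMainThread is illustrative:

```cpp
// Sketch: dispatching a method call back to the main thread using
// the nsThreadUtils.h helpers. The nsresult return type on
// HandoffCallback is what lets the template resolve the overload.
nsresult
nsGStreamerDecoder::HandoffCallback()
{
  // ... copy the decoded frame out on the main thread ...
  return NS_OK;
}

void
nsGStreamerDecoder::DoHandoff()
{
  nsCOMPtr<nsIRunnable> event =
    NS_NewRunnableMethod(this, &nsGStreamerDecoder::HandoffCallback);
  NS_DispatchToMainThread(event);
}
```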
NOTE: The below is the way it worked before I changed it too much.
The function configures GStreamer to load a file and play it back. It only accepts URIs to do this, as this is natively supported by GStreamer.
The first thing I did was to extract the URI from the nsIChannel, so it works even when no explicit URI is given.
The function uses the "playbin" high level interface of GStreamer to take care of the loading and decoding of media content. In order to extract data it uses the "fakesink" element and the "ffmpegcolorspace" converter to transform the output data into a raw RGB color format.
GstElement* video_bin = gst_bin_new("video_bin");
GstElement* converter = gst_element_factory_make("ffmpegcolorspace", "converter");
GstElement* fakesink = gst_element_factory_make("fakesink", "sink");
The converter and the fakesink are added to a bin, and the sink pad of the converter is made the sink of the whole bin:
gst_bin_add(GST_BIN(video_bin), converter);
gst_bin_add(GST_BIN(video_bin), fakesink);
// gst_element_link_pads(converter, "src", fakesink, "sink");
gst_element_link_filtered(converter, fakesink,
    gst_caps_new_simple("video/x-raw-rgb",
                        "bpp", G_TYPE_INT, 32,
                        "depth", G_TYPE_INT, 24,
                        NULL));
GstPad* pad = gst_element_get_pad(converter, "sink");
gst_element_add_pad(video_bin, gst_ghost_pad_new("sink", pad));
gst_object_unref(pad);
The video sink of the playbin pipeline is then set to the bin that was just created:
g_object_set(G_OBJECT(mPipeline), "video-sink", G_OBJECT(video_bin), NULL);
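To actually get the decoded frames out of the fakesink, its "handoff" signal can be used, which ties in with the DoHandoff function mentioned earlier. A sketch, where the callback name and the frame-copying step are illustrative:

```cpp
// Sketch: make fakesink emit a "handoff" signal for every buffer it
// receives, so the decoder can copy each raw RGB frame out.
static void
on_handoff(GstElement* sink, GstBuffer* buffer, GstPad* pad,
           gpointer user_data)
{
  // buffer holds one raw RGB video frame at this point; copy it
  // into the decoder's frame buffer here.
}

static void
attach_handoff(GstElement* fakesink, gpointer decoder)
{
  // fakesink only emits "handoff" when signal-handoffs is enabled.
  g_object_set(G_OBJECT(fakesink), "signal-handoffs", TRUE, NULL);
  g_signal_connect(fakesink, "handoff",
                   G_CALLBACK(on_handoff), decoder);
}
```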
In the solution described above we need to think about the following issues:
We do not control the input to the decoder directly (we only tell it where to fetch the data from), meaning it will bypass anything the browser does, like caching, proxy setups, security (?), etc.
(This one needs to be verified.) The playbin player is a pipeline, so all timing control inside it should be taken care of by the pipeline itself. However, since we add our own bin that acts as the sink for the video part of the pipeline, it could be that this bin is not under the timing control of the pipeline.
We could use a fake source ("fakesrc") to inject data into the pipeline, but this element pulls data, and its callbacks should be non-blocking, which might not fit streaming that well. A better choice might be to implement a GstPushSrc subclass, which is geared towards pushing data instead of pulling it.
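The skeleton of such a GstPushSrc subclass would look roughly as follows in GStreamer 0.10; all names here (MozSrc, moz_src_*) are illustrative, not the actual patch:

```cpp
// Sketch of a minimal GstPushSrc subclass. The base class calls our
// create() whenever the pipeline wants more data, so we can push
// data that arrives through the browser's networking/cache layer.
#include <gst/base/gstpushsrc.h>

typedef struct _MozSrc      { GstPushSrc parent; } MozSrc;
typedef struct _MozSrcClass { GstPushSrcClass parent_class; } MozSrcClass;

GST_BOILERPLATE(MozSrc, moz_src, GstPushSrc, GST_TYPE_PUSH_SRC);

static GstFlowReturn
moz_src_create(GstPushSrc* src, GstBuffer** buf)
{
  *buf = gst_buffer_new_and_alloc(4096);
  // ... copy data received from the browser into GST_BUFFER_DATA(*buf) ...
  return GST_FLOW_OK;
}

static void
moz_src_base_init(gpointer klass)
{
  // Element metadata and pad templates would go here.
}

static void
moz_src_class_init(MozSrcClass* klass)
{
  GST_PUSH_SRC_CLASS(klass)->create = moz_src_create;
}

static void
moz_src_init(MozSrc* src, MozSrcClass* klass)
{
}
```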
The back-end of the playbin pipeline is the "decodebin" bin. Using this one directly will allow us to attach our source element to one end of the decodebin and our video_bin to the other end, make all of it into a single pipeline, and use that one, hopefully with audio and video synchronization.
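The setup described above can be sketched as follows; the element names and the callback are illustrative. Note that decodebin only creates its output pads once the stream type is known, which is why the "new-decoded-pad" signal is used (GStreamer 0.10 API):

```cpp
// Sketch: a single pipeline of our own source, decodebin, and the
// video_bin built earlier, linked up once decodebin knows the stream.
static void
on_new_decoded_pad(GstElement* decodebin, GstPad* pad, gboolean last,
                   gpointer user_data)
{
  GstElement* video_bin = GST_ELEMENT(user_data);
  GstPad* sinkpad = gst_element_get_pad(video_bin, "sink");
  if (!gst_pad_is_linked(sinkpad))
    gst_pad_link(pad, sinkpad);      // connect decoded video to our bin
  gst_object_unref(sinkpad);
}

static GstElement*
build_pipeline(GstElement* src, GstElement* video_bin)
{
  GstElement* pipeline = gst_pipeline_new("pipeline");
  GstElement* decode = gst_element_factory_make("decodebin", "decode");

  gst_bin_add_many(GST_BIN(pipeline), src, decode, video_bin, NULL);
  gst_element_link(src, decode);
  g_signal_connect(decode, "new-decoded-pad",
                   G_CALLBACK(on_new_decoded_pad), video_bin);
  return pipeline;
}
```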