GStreamer 1.0 plugins for i.MX platforms
This is a set of GStreamer 1.0 plugins for Freescale's i.MX platform, which make use of the i.MX multimedia capabilities.
Currently, this software has been tested only with the i.MX6 and i.MX7 SoC families.
These plugins are licensed under the LGPL v2.
The compositor is a new feature in gstreamer-imx 0.11.0. Just like the compositor from gst-plugins-base 1.5.1 and newer, the compositor elements support an arbitrary number of request sink pads and one source pad.
```shell
gst-launch-1.0 \
    imxg2dcompositor name=c background-color=0x223344 \
    sink_0::xpos=0 sink_0::ypos=90 sink_0::width=160 sink_0::height=110 \
    sink_0::zorder=55 sink_0::fill_color=0xff00ff00 sink_0::alpha=0.39 sink_0::rotation=0 \
    sink_1::xpos=0 sink_1::ypos=20 sink_1::width=620 sink_1::height=380 sink_1::fill_color=0x44441133 ! \
    queue2 ! "video/x-raw, width=800, height=600" ! imxg2dvideosink \
    videotestsrc pattern=0 ! "video/x-raw, framerate=30/1" ! c.sink_0 \
    videotestsrc pattern=18 ! "video/x-raw, framerate=30/1" ! c.sink_1
```
This creates the following frame:
The compositor properties are accessible as usual by calling gst-inspect-1.0, like:
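For example, to list the G2D compositor's element and pad properties (assuming the plugins are installed on the target system):

```shell
gst-inspect-1.0 imxg2dcompositor
```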
Most of the sink pad properties are the same as those of the upstream compositor's GstCompositorPad:
xpos: The x-coordinate position of the top-left corner of the picture (gint)
ypos: The y-coordinate position of the top-left corner of the picture (gint)
width: The width of the picture; the input will be scaled if necessary (gint)
height: The height of the picture; the input will be scaled if necessary (gint)
alpha: The transparency of the picture, between 0.0 (fully transparent) and 1.0 (fully opaque); at 0.0 nothing is drawn, and at 1.0 the blending degenerates to a simple copy (gdouble)
zorder: The z-order position of the picture in the composition (guint)
In addition, the imx compositor pads have these properties:
left-margin: Left margin in pixels, defining an empty space at the left side between the border of the outer frame and the actual inner video frame
top-margin: Top margin in pixels, defining an empty space at the top side between the border of the outer frame and the actual inner video frame
right-margin: Right margin in pixels, defining an empty space at the right side between the border of the outer frame and the actual inner video frame
bottom-margin: Bottom margin in pixels, defining an empty space at the bottom side between the border of the outer frame and the actual inner video frame
rotation: 90-degree step rotation mode for the inner video frame
keep-aspect-ratio: If true, the aspect ratio of the inner video frame is maintained, potentially creating empty regions
input-crop: If true, GstVideoCropMeta data in input video frames is honored; instead of blitting the entire input video frame, only the rectangle specified by the meta is blitted
fill-color: What color to use to fill the aforementioned empty regions, specified as a 32-bit ABGR color value
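As a rough illustration of how `zorder` and `alpha` interact (a simplified software sketch, not the actual G2D/IPU blitting code): pads are drawn in ascending `zorder`, and each picture is alpha-blended over what has already been composited.

```python
def blend(dst, src, alpha):
    """Blend one source value over a destination value.

    alpha = 0.0 leaves the destination untouched (fully transparent),
    alpha = 1.0 degenerates to a plain copy (fully opaque).
    """
    return alpha * src + (1.0 - alpha) * dst


def composite(background, pads):
    """Composite pads over a background.

    pads: list of (zorder, alpha, value) tuples; lower zorder values
    are drawn first, so higher zorder values end up on top.
    """
    out = background
    for _, alpha, value in sorted(pads, key=lambda p: p[0]):
        out = blend(out, value, alpha)
    return out


# A fully opaque pad with the higher zorder hides the other one:
print(composite(0.0, [(1, 1.0, 80.0), (0, 1.0, 100.0)]))  # 80.0
```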
The compositors have the notion of "inner" and "outer" frames. The "inner" frame is the actual video frame, for example a movie. The "outer" frame is a superset of the inner one and also contains any empty spaces. If, for example, the outer frame is 1600x900 (16:9), the inner frame is 1280x960 (4:3), and `keep-aspect-ratio` is set to true, then the inner frame is scaled to fit in the middle of the outer frame, and the leftover spaces to the left and right are the "empty spaces". These get filled with the `fill-color`. If any of the margin values are nonzero, the empty spaces also include the margin regions. If `keep-aspect-ratio` is false, no empty regions exist unless at least one of the margins is nonzero.
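The 1600x900 / 1280x960 example can be worked through numerically; here is a small sketch of the scaling math (the function name and return layout are illustrative, not part of the plugin API):

```python
def fit_keep_aspect(outer_w, outer_h, inner_w, inner_h):
    """Scale an inner frame into an outer frame while keeping its
    aspect ratio; returns the scaled size and the empty space left
    on each side (horizontal, vertical) when the frame is centered."""
    scale = min(outer_w / inner_w, outer_h / inner_h)
    new_w = round(inner_w * scale)
    new_h = round(inner_h * scale)
    return new_w, new_h, (outer_w - new_w) // 2, (outer_h - new_h) // 2


# A 4:3 inner frame inside a 16:9 outer frame: the frame is scaled to
# 1200x900, leaving two 200-pixel-wide empty regions left and right,
# which the compositor fills with fill-color.
print(fit_keep_aspect(1600, 900, 1280, 960))  # (1200, 900, 200, 0)
```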
Current limitations:

* The G2D compositor is the preferred one. The IPU compositor suffers from IPU peculiarities like "jumps" in the frame positioning. Also, the IPU compositor currently does not support deinterlacing.
* There is no PxP compositor at the moment, since the PxP engine always fills the entire output frame with black pixels, even if only a subset is drawn to.
imxvpudec: video decoder plugin
imxvpuenc_h263: h.263 encoder
imxvpuenc_h264: h.264 baseline profile Annex.B encoder
imxvpuenc_mpeg4: MPEG-4 encoder
imxvpuenc_mjpeg: Motion JPEG encoder
imxipuvideosink: video sink using the IPU to output to Framebuffer (may not work well if X11 or Wayland are running)
imxipuvideotransform: video transform element using the IPU, capable of scaling, deinterlacing, rotating (in 90 degree steps), flipping frames, and converting between color spaces
imxipucompositor: video compositor element using the IPU for combining multiple input video streams into one output video stream
imxg2dvideosink: video sink using the GPU's 2D core (through the G2D API) to output to Framebuffer (may not work well if X11 or Wayland are running)
imxg2dvideotransform: video transform element using the GPU's 2D core (through the G2D API), capable of scaling, rotating (in 90 degree steps), flipping frames, and converting between color spaces
imxg2dcompositor: video compositor element using the GPU's 2D core (through the G2D API) for combining multiple input video streams into one output video stream
imxg2dtextoverlay: Adds text strings on top of a video buffer using Pango and G2D
imxg2dtimeoverlay: Overlays buffer time stamps on a video stream using Pango and G2D
imxg2dclockoverlay: Overlays the current clock time on a video stream using Pango and G2D
imxg2dtextrender: Renders a text string to an image bitmap using Pango and G2D
imxpxpvideosink: video sink using the PxP engine to output to Framebuffer (may not work well if X11 or Wayland are running)
imxpxpvideotransform: video transform element using the PxP engine, capable of scaling, rotating (in 90 degree steps), flipping frames, and converting between color spaces
imxeglvivsink: custom OpenGL ES 2.x based video sink that uses Vivante direct textures, which allow for smooth playback
imxv4l2videosrc: customized Video4Linux source with i.MX specific tweaks
imxv4l2videosink: customized Video4Linux sink with i.MX specific tweaks
imxuniaudiodec: audio decoder plugin based on Freescale's unified audio (UniAudio) architecture
imxmp3audioenc: MP3 audio encoder plugin based on Freescale's MP3 encoder
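As a usage sketch, a hardware-accelerated playback pipeline combining some of these elements might look like the following (the file name is a placeholder, and the exact demuxer/parser requirements depend on the stream):

```shell
gst-launch-1.0 filesrc location=video.mp4 ! qtdemux ! h264parse ! imxvpudec ! imxg2dvideosink
```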
The `imxg2dtextoverlay`, `imxg2dtimeoverlay`, `imxg2dclockoverlay`, and `imxg2dtextrender` elements render text using Pango and G2D. Currently, these elements do not support non-physically contiguous buffers. This means that, for example, this pipeline won't work:
```shell
videotestsrc ! imxg2dtimeoverlay ! imxg2dvideosink
```
For this reason, it is necessary to make sure that the video stream is made of physically contiguous buffers (allocated with an allocator derived from `GstPhysMemAllocator`). Captured frames from `imxv4l2videosrc`, decoded frames from `imxvpudec`, and transformed frames from any of the blitter-based transform elements deliver this type of buffer (however, the transform elements only do so if they actually have something to transform; if they switch to passthrough, data is left untouched). This limitation will be lifted in later versions.
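Assuming the transform element actually performs a conversion (and therefore does not switch to passthrough), one hypothetical way to feed physically contiguous buffers into the overlay is to insert a blitter-based transform element in front of it:

```shell
gst-launch-1.0 videotestsrc ! imxipuvideotransform ! imxg2dtimeoverlay ! imxg2dvideosink
```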
There are two V4L2 elements, `imxv4l2videosrc` and `imxv4l2videosink`. Both are necessary because they allow for using physical memory addresses for the captured frames, thus enabling zerocopy: `imxv4l2videosrc` extracts such an address for each captured frame, while `imxv4l2videosink` draws the frame via DMA using that address. Note however that `imxv4l2videosink` does not support non-physically contiguous frames. This is because it is currently not possible to allocate a temporary input DMA buffer inside `imxv4l2videosink` (due to the lack of an appropriate allocator).
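A minimal zerocopy capture-and-display pipeline using these two elements might then look like this (the device path is an assumption; adjust it for your board):

```shell
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxv4l2videosink
```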
There are two ways in which gstreamer-imx video streams can be integrated with external elements:
You'll need a GStreamer 1.2 installation and the libimxvpuapi library. Also, the `videoparsersbad` plugin from the `gst-plugins-bad` package is needed, since this plugin contains video parsers like `mpegvideoparse` (for MPEG-1 and MPEG-2) and `mpeg4videoparse` (for MPEG-4). You must also use a Linux kernel with i.MX additions for the VPU, GPU, IPU, and PxP subsystems; mainline kernels do not contain these (yet).
This project uses the waf meta build system. To configure, first set whatever environment variables are necessary for cross compilation for your platform, then run:
```shell
./waf configure --prefix=PREFIX --kernel-headers=KERNEL-HEADER-PATH
```
(The aforementioned environment variables are only necessary for this configure call.) PREFIX defines the installation prefix, that is, where the built binaries will be installed. KERNEL-HEADER-PATH defines the path to the Linux kernel headers, where linux/ipu.h can be found (not to be confused with the ipu.h from the imx-lib). Unfortunately, it is currently necessary to set this path if linux/ipu.h is not already in the root filesystem's include directory; without it, the header is not found, and elements using the IPU will not be built.
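For reference, a purely hypothetical cross-compilation environment for an ARM hard-float toolchain might look like this (all names and paths here are assumptions; adjust them for your BSP and toolchain):

```shell
# Hypothetical cross-toolchain settings - adjust for your environment
SYSROOT=/path/to/target/sysroot
export CC=arm-linux-gnueabihf-gcc
export CXX=arm-linux-gnueabihf-g++
export PKG_CONFIG_PATH="$SYSROOT/usr/lib/pkgconfig"
export PKG_CONFIG_SYSROOT_DIR="$SYSROOT"
```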
If gstreamer-imx is to be built for Android, add the `--build-for-android` switch:

```shell
./waf configure --prefix=PREFIX --kernel-headers=KERNEL-HEADER-PATH --build-for-android
```
Note that for Android, plugins are built as static libraries.
Once configuration is complete, run:
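Presumably this is the standard waf invocation, where running waf without arguments executes the default build command:

```shell
./waf
```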
This builds the plugins. Finally, to install, run:
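Again assuming standard waf usage, installation would be:

```shell
./waf install
```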
Further notes on how to build for some Linux distributions:
Arch Linux ARM build instructions:
Yocto / OpenEmbedded build instructions: An OpenEmbedded recipe for gstreamer-imx is included in meta-freescale. Also check out the Freescale Github space. Add the meta-freescale layer to your setup's `bblayers.conf`. Then it should be possible to build the `gstreamer1.0-plugins-imx` recipe. This will also automatically build libimxvpuapi, which also has a recipe in meta-freescale.