This article explains the software resources for multimedia playback and encoding on Toradex modules. It starts with a generic section (applicable from BSP 5 onward), followed by SoM-specific sections.
Video encoding and decoding (playback) are CPU-intensive tasks. Therefore, various SoCs are equipped with dedicated hardware, often named a Video Processing Unit (VPU) or similar, and sometimes video processing operations can be offloaded to the Graphics Processing Unit (GPU). Especially when using GStreamer, make sure that you are using plugins that support hardware acceleration whenever possible. This article has specific sections for i.MX- and TK1-based SoMs; make sure to read them and consult the NXP and NVIDIA documentation.
Select your BSP version from the tabs below.
On BSP 5 we support Wayland/Weston, as explained in the BSP Layers and Reference Images for Yocto Project Software. GStreamer and Video4Linux2 (V4L2) come pre-installed in the Reference Multimedia Image. Please note that this is only the case for GPU/VPU-enabled modules. Modules without a GPU/VPU, such as the Colibri iMX6ULL or the Colibri iMX7, are not recommended for multimedia applications; for those, you must explicitly Build a Reference Image with Yocto Project/OpenEmbedded and add GStreamer yourself.
You must Build a Reference Image with Yocto Project/OpenEmbedded if you want to add or remove packages from the image. In any case, you must customize the reference images as they are not production-ready.
If you don't want to do a custom OpenEmbedded build, consider using Torizon, where you can install libraries and packages inside Debian Containers with commands such as apt install. Have a look at our collection of Multimedia, Camera and Video Processing articles for Torizon.
When using our Reference Multimedia Image, before you can play a video you must stop and disable the Qt demo that runs automatically at startup:
# systemctl stop wayland-app-launch
# systemctl disable wayland-app-launch
To play a fullscreen video, you may want to hide the desktop bar. One option is to remove the panel from Weston entirely by adding the following lines to /etc/xdg/weston/weston.ini:
[shell]
panel-position=none
Afterwards, you need to restart Weston:
$ systemctl restart weston@root
This is a generic section; SoM-specific information is provided later in this article.
For the basics, you most likely won't need to run V4L2 commands such as v4l2-ctl; you can focus on learning how to use GStreamer. People often start with command-line pipelines and later integrate those pipelines into their programs if needed.
While our article may help you with some tips, we strongly encourage you to study the GStreamer documentation. The command-line tools section may be particularly useful to get started.
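When moving from command-line experiments to an application, it helps to keep the pipeline description as a plain string, since GStreamer's gst_parse_launch() accepts the same syntax as gst-launch-1.0. A minimal sketch (the pipeline below is only an example using the software-only videotestsrc element):

```shell
# Keep the pipeline description in one variable so the identical string can
# be tested on the command line and later passed to gst_parse_launch()
# in application code.
PIPELINE="videotestsrc num-buffers=60 ! videoconvert ! autovideosink"
echo "gst-launch-1.0 ${PIPELINE}"
```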
Some videos are included in the /home/root/video directory of the Reference Multimedia Image, for easy evaluation:
# cd /home/root/video
# ls
You can play them with gplay-1.0, gst-play-1.0 or gst-launch-1.0, depending on your SoM:
Tip: keep in mind that, depending on the SoM's capabilities, a video with a specific encoding may not play. Try different formats.
# gplay-1.0 <video file>
# gst-play-1.0 <video file>
# gst-launch-1.0 playbin uri=file://<video file>
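Note that playbin takes a URI rather than a bare path, so <video file> must be an absolute path prefixed with file://. A small sketch of building such a URI (the file name is hypothetical):

```shell
# Build a playbin-compatible URI from an absolute file path
VIDEO=/home/root/video/sample.mp4   # hypothetical file name
URI="file://${VIDEO}"
echo "${URI}"                       # file:///home/root/video/sample.mp4
```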
For more advanced use cases, consult the GStreamer documentation, as suggested in the previous section.
For software video encoding on SoMs that don't have specialized video encoding and decoding units, such as the Colibri iMX7 and Colibri iMX6ULL, please consult the GStreamer documentation, as suggested in the previous section.
For SoMs that have hardware video encoding and decoding units, please consult the GStreamer documentation, as suggested in the previous section, and read the next sections that are hardware-specific.
NXP provides GStreamer plugins to access the multimedia libraries using the i.MX SoC's hardware acceleration units. You will find all the details on the page Embedded Linux for i.MX Applications Processors, especially on the Releases tab:
Browse the documentation using the NXP BSP version that matches your Toradex BSP version:
Toradex BSP 5.x.y | NXP BSP |
---|---|
Up to 5.1.0 | L5.4.24_2.1.0 |
From 5.2.0 onward | L5.4.70_2.3.0 |
On BSP 5, we only support Apalis TK1 running with an upstream kernel. Hardware-accelerated pipelines are not supported, and customer reports indicate that adding such support on your own is virtually impossible. The available features are therefore those provided by the upstream kernel and GStreamer without hardware acceleration.
You should not use BSP 4 anymore. Please migrate to BSP 5 as soon as possible.
NXP provides GStreamer plugins to access the multimedia libraries using the i.MX SoC's hardware acceleration units. The table below shows the current i.MX GStreamer plugins, extracted from the i.MX_Linux_Release_Notes.pdf document, available in the NXP i.MX Linux Documentation, version L4.14.98_2.3.0.
Plugin | Description | Note |
---|---|---|
aiurdemux | aiur universal demuxer plugin | Supports AVI, MKV, MP4, MPEG2, ASF, OGG, FLV, WebM, RMVB |
avenc_mp2 | MP2 audio encoder plugin from gst-libav | |
beepdec | unified audio decoder plugin | Supports MP3, AAC, AAC+, WMA, AC3, Vorbis, DD+, AMR, RA |
glcolorbalance | adjusting brightness, contrast, hue, and saturation on a video stream | |
glcolorconvert | video color space convert based on shaders | |
gldeinterlace | video deinterlacing based on shaders | |
gleffects | GL Shading Language effects plugin | |
glimagesink | video sink based on EGL | |
glvideomixer | compositing multiple videos together | |
imxcompositor_g2d | GPU2D-based video compositor plugin | |
imxcompositor_ipu | IPU-based video compositor plugin | |
imxcompositor_pxp | PXP-based video compositor plugin | |
imxvideoconvert_g2d | GPU2D-based video convert plugin | |
imxvideoconvert_ipu | IPU-based video convert plugin | |
imxvideoconvert_pxp | PXP-based video convert plugin | |
pulsesink | PulseAudio Audio Sink | Set the desired audio sink with pactl list sinks and pacmd set-default-sink {sink number} commands |
pulsesrc | PulseAudio Audio Source | Set the desired audio source with pactl list sources and pacmd set-default-source {source number} commands |
v4l2h264dec | V4L2 H.264 Decoder | |
v4l2h264enc | V4L2 H.264 encoder | |
v4l2h265dec | V4L2 H.265 Decoder | |
v4l2mpeg2dec | V4L2 MPEG2 Decoder | |
v4l2mpeg4dec | V4L2 MPEG4 Decoder | |
v4l2src | V4L2-based camera source plugin | |
waylandsink | video sink based on Wayland interfaces |
Plugin | Description | Note |
---|---|---|
aiurdemux | aiur universal demuxer plugin | Supports AVI, MKV, MP4, MPEG2, ASF, OGG, FLV, WebM, RMVB |
avenc_mp2 | MP2 audio encoder plugin from gst-libav | |
beepdec | unified audio decoder plugin | Supports MP3, AAC, AAC+, WMA, AC3, Vorbis, DD+, AMR, RA |
glcolorbalance | adjusting brightness, contrast, hue, and saturation on a video stream | |
glcolorconvert | video color space convert based on shaders | |
gldeinterlace | video deinterlacing based on shaders | |
gleffects | GL Shading Language effects plugin | |
glimagesink | video sink based on EGL | |
glvideomixer | compositing multiple videos together | |
imxcompositor_g2d | GPU2D-based video compositor plugin | |
imxcompositor_ipu | IPU-based video compositor plugin | |
imxcompositor_pxp | PXP-based video compositor plugin | |
imxvideoconvert_g2d | GPU2D-based video convert plugin | |
imxvideoconvert_ipu | IPU-based video convert plugin | |
imxvideoconvert_pxp | PXP-based video convert plugin | |
pulsesink | PulseAudio Audio Sink | Set the desired audio sink with pactl list sinks and pacmd set-default-sink {sink number} commands |
pulsesrc | PulseAudio Audio Source | Set the desired audio source with pactl list sources and pacmd set-default-source {source number} commands |
v4l2h264dec | V4L2 H.264 Decoder | |
v4l2h264enc | V4L2 H.264 encoder | |
v4l2h265dec | V4L2 H.265 Decoder | |
v4l2mpeg2dec | V4L2 MPEG2 Decoder | |
v4l2mpeg4dec | V4L2 MPEG4 Decoder | |
v4l2src | V4L2-based camera source plugin |
Plugin | Description | Note |
---|---|---|
aiurdemux | aiur universal demuxer plugin | Supports AVI, MKV, MP4, MPEG2, ASF, OGG, FLV, WebM, RMVB |
avenc_mp2 | MP2 audio encoder plugin from gst-libav | |
beepdec | unified audio decoder plugin | Supports MP3, AAC, AAC+, WMA, AC3, Vorbis, DD+, AMR, RA |
glcolorbalance | adjusting brightness, contrast, hue, and saturation on a video stream | |
glcolorconvert | video color space convert based on shaders | |
gldeinterlace | video deinterlacing based on shaders | |
gleffects | GL Shading Language effects plugin | |
glimagesink | video sink based on EGL | |
glvideomixer | compositing multiple videos together | |
imxcompositor_g2d | GPU2D-based video compositor plugin | |
imxcompositor_ipu | IPU-based video compositor plugin | |
imxcompositor_pxp | PXP-based video compositor plugin | |
imxvideoconvert_g2d | GPU2D-based video convert plugin | |
imxvideoconvert_ipu | IPU-based video convert plugin | |
imxvideoconvert_pxp | PXP-based video convert plugin | |
pulsesink | PulseAudio Audio Sink | Set the desired audio sink with pactl list sinks and pacmd set-default-sink {sink number} commands |
pulsesrc | PulseAudio Audio Source | Set the desired audio source with pactl list sources and pacmd set-default-source {source number} commands |
v4l2src | V4L2-based camera source plugin | |
vpuenc_h264 | VPU-based AVC/H.264 video encoder | |
vpuenc_vp8 | VPU-based VP8 video encoder |
Plugin | Description | Note |
---|---|---|
aiurdemux | aiur universal demuxer plugin | Supports AVI, MKV, MP4, MPEG2, ASF, OGG, FLV, WebM, RMVB |
avenc_mp2 | MP2 audio encoder plugin from gst-libav | |
beepdec | unified audio decoder plugin | Supports MP3, AAC, AAC+, WMA, AC3, Vorbis, DD+, AMR, RA |
glcolorbalance | adjusting brightness, contrast, hue, and saturation on a video stream | |
glcolorconvert | video color space convert based on shaders | |
gldeinterlace | video deinterlacing based on shaders | |
gleffects | GL Shading Language effects plugin | |
glvideomixer | compositing multiple videos together | |
imxcompositor_g2d | GPU2D-based video compositor plugin | |
imxcompositor_ipu | IPU-based video compositor plugin | |
imxcompositor_pxp | PXP-based video compositor plugin | |
imxv4l2sink | V4L2-based video sink plugin | |
imxv4l2src | V4L2-based camera source plugin | |
overlaysink | G2D-based video sink plugin | |
pulsesink | PulseAudio Audio Sink | Set the desired audio sink with pactl list sinks and pacmd set-default-sink {sink number} commands |
pulsesrc | PulseAudio Audio Source | Set the desired audio source with pactl list sources and pacmd set-default-source {source number} commands |
vpudec | VPU-based video decoder plugin | |
vpuenc_h263 | VPU-based H.263 video encoder | |
vpuenc_h264 | VPU-based AVC/H.264 video encoder | |
vpuenc_jpeg | VPU-based JPEG video encoder | |
vpuenc_mpeg4 | VPU-based MPEG4 video encoder |
Plugin | Description | Note |
---|---|---|
aiurdemux | aiur universal demuxer plugin | Supports AVI, MKV, MP4, MPEG2, ASF, OGG, FLV, WebM, RMVB |
avenc_mp2 | MP2 audio encoder plugin from gst-libav | |
beepdec | unified audio decoder plugin | Supports MP3, AAC, AAC+, WMA, AC3, Vorbis, DD+, AMR, RA |
pulsesink | PulseAudio Audio Sink | Set the desired audio sink with pactl list sinks and pacmd set-default-sink {sink number} commands |
pulsesrc | PulseAudio Audio Source | Set the desired audio source with pactl list sources and pacmd set-default-source {source number} commands |
Display a video on Apalis iMX6Q from a CSI Camera Module 5MP OV5640 source and concurrently store it H.264 encoded to a file:
root@apalis-imx6:~# gst-launch-1.0 imxv4l2videosrc device=/dev/video2 ! tee ! queue2 ! vpuenc_h264 ! qtmux ! filesink location=temp.mp4 tee0. ! imxeglvivsink -e
Using the same setup with Apalis iMX6Q from a CSI Camera Module 5MP OV5640 source, one can capture a still image:
root@apalis-imx6:~# gst-launch-1.0 v4l2src num-buffers=1 device=/dev/video1 ! jpegenc ! filesink location=test.jpg
Using the same setup with Apalis iMX6Q from a CSI Camera Module 5MP OV5640 source, one can capture a series of still images:
root@apalis-imx6:~# gst-launch-1.0 v4l2src device=/dev/video1 ! jpegenc ! multifilesink location=test%d.jpg
Attention: Colibri iMX6ULL modules lack hardware-accelerated video decoding, and their memory bandwidth is limited. Therefore, software decoders must be used. Depending on the video's resolution and encoding, the performance may be poor.
In order to encode videos, it is necessary to install GStreamer and its plugins with the following command:
root@colibri-imx6ull:~# opkg install gstreamer1.0-plugins-base-ximagesink gstreamer1.0-plugins-good-video4linux2 gstreamer1.0-plugins-base-videoconvert gstreamer1.0-plugins-bad-fbdevsink gstreamer1.0-plugins-base-theora gstreamer1.0-plugins-good-matroska gstreamer1.0-plugins-base-ogg
To encode and record MKV videos, one can use the following command:
root@colibri-imx6ull:~# gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, framerate=30/1' ! videoconvert ! theoraenc ! matroskamux ! filesink location=./test1.mkv
To encode and record OGG videos, one can use the following command:
root@colibri-imx6ull:~# gst-launch-1.0 imxv4l2src device=/dev/video0 ! video/x-raw,width=1280,height=720 ! videoconvert ! theoraenc ! oggmux ! filesink location=videotestsrc.ogg
Attention: Colibri iMX7 modules lack hardware-accelerated video decoding, and their memory bandwidth is limited. Therefore, software decoders must be used. Depending on the video's resolution and encoding, the performance may be poor.
Although Toradex Embedded Linux BSPs 2.7.4 and 2.8 currently do not include a JPEG encoder for Colibri iMX7, it is possible to download and install the GStreamer plugin that provides one:
root@colibri-imx7:~# opkg update
root@colibri-imx7:~# opkg install gstreamer1.0-plugins-good-jpeg
To take a still image with a JPEG encoder using a UVC webcam, one can use the following GStreamer pipeline:
root@colibri-imx7:~# gst-launch-1.0 v4l2src num-buffers=1 ! jpegenc ! filesink location=test.jpg
Toradex Embedded Linux BSPs 2.7.4 and 2.8 currently support only one video encoder and one video decoder (Theora), as you can check with the commands below:
root@colibri-imx7:~# gst-inspect-1.0 | grep enc
vorbis: vorbisenc: Vorbis audio encoder
theora: theoraenc: Theora video encoder
encoding: encodebin: Encoder Bin
wavenc: wavenc: WAV audio muxer
imxmp3enc.imx: imxmp3enc: imx mp3 audio encoder
coretracers: latency (GstTracerFactory)
root@colibri-imx7:~# gst-inspect-1.0 | grep dec
vorbis: vorbisdec: Vorbis audio decoder
ivorbisdec: ivorbisdec: Vorbis audio decoder
theora: theoradec: Theora video decoder
beep.imx: beepdec: Beep universal decoder
playback: uridecodebin: URI Decoder
playback: decodebin: Decoder Bin
To record a video with MKV extension using a UVC webcam, one can use the following GStreamer pipeline:
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, framerate=30/1, width=640,height=480' ! videoconvert ! theoraenc ! matroskamux ! filesink location=test.mkv
To record a video with OGG extension using a UVC webcam, one can use the following GStreamer pipeline:
gst-launch-1.0 imxv4l2src device=/dev/video0 ! 'video/x-raw, framerate=30/1, width=640,height=480' ! videoconvert ! theoraenc ! oggmux ! filesink location=test.ogg
To play the recorded videos, one can use the following GStreamer pipelines:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux ! theoradec ! videoconvert ! autovideosink
gst-launch-1.0 filesrc location=test.ogg ! oggdemux ! theoradec ! videoconvert ! autovideosink
Attention: Vybrid modules lack hardware-accelerated video decoding, and their memory bandwidth is limited. Therefore, software decoders must be used. Depending on the video's resolution and encoding, the performance may be poor.
A collection of open decoders may be installed along with other useful GStreamer packages using the following command:
opkg install gst-ffmpeg gst-plugins-base-ffmpegcolorspace gst-plugins-base-ximagesink gst-plugins-base-alsa gst-plugins-good-isomp4 gst-plugins-good-matroska
With these additions, an H.264-encoded MP4 file may be played back as follows:
gst-launch-0.10 filesrc location=test_vid.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_h264 ! ffmpegcolorspace ! ximagesink demux.audio_00 ! queue ! ffdec_aac ! alsasink device=hw:0,0
Note: Software video decoding is very CPU-intensive, which significantly limits the resolution and bitrate at which video can be played back smoothly.
Note: The VF50 Flash storage is particularly size constrained, plan accordingly by minimizing the OS & installed packages or by utilizing SD or other external storage.
A video may be played from an HTTP source by installing the souphttpsrc plugin for GStreamer:
opkg install gst-plugins-good-souphttpsrc
An example GStreamer pipeline for playback over HTTP (additionally requiring the vorbis decoder and audioconvert GStreamer plugin packages):
gst-launch-0.10 souphttpsrc location=http://upload.wikimedia.org/wikipedia/commons/4/4b/MS_Diana_genom_Bergs_slussar_16_maj_2014.webm ! matroskademux name=demux demux.video_00 ! queue ! ffdec_vp8 ! ffmpegcolorspace ! ximagesink demux.audio_00 ! queue ! vorbisdec ! audioconvert ! alsasink device=hw:0,0
The hardware-accelerated pipelines are supported by the Linux4Tegra BSPs from NVIDIA, which Toradex uses as the base for our BSP 2.8. Though we provide some basic examples in this section, you must consult the Jetson TK1/Tegra Linux Driver Package Multimedia User Guide for details.
The following pipeline displays video from a UVC webcam:
root@apalis-tk1:~# gst-launch-0.10 v4l2src ! xvimagesink
The examples below store still images or videos captured from a UVC webcam or our CSI Camera Module 5MP OV5640.
Note: The following pipeline runs on our CSI Camera Module 5MP OV5640. Unfortunately, the first few frames are currently black or greenish and need to be discarded.
root@apalis-tk1:~# gst-launch-0.10 v4l2src decimate=5 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=640,height=480' ! ffmpegcolorspace ! video/x-raw-rgb,framerate=1/1 ! ffmpegcolorspace ! pngenc ! filesink location=test.png
If you are using a UVC webcam, the following pipeline captures a still image:
root@apalis-tk1:~# gst-launch-0.10 v4l2src num-buffers=1 ! jpegenc ! filesink location=test.jpg
As our CSI Camera Module 5MP OV5640 uses YUV as its color encoding format, the pipeline converts the frames before encoding them to PNG:
root@apalis-tk1:~# gst-launch-0.10 v4l2src queue-size=1 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=640,height=480' ! ffmpegcolorspace ! videorate ! video/x-raw-rgb,framerate=1/1 ! ffmpegcolorspace ! pngenc snapshot=false ! multifilesink location=test%d.png
Note: For GStreamer 1.0 the gstreamer1.0-plugins-good-multifile package is also required, which was not yet part of our Embedded Linux demo images as of 2.7b5 and 2.8b1.
root@apalis-tk1:~# gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw,format={UYVY},width=640,height=480' ! videorate max-rate=10 ! videoconvert ! avenc_png ! multifilesink location=test%d.png
root@apalis-tk1:~# gst-launch-1.0 v4l2src ! 'video/x-raw,format={UYVY},width=1280,height=720,framerate=30/1' ! videoconvert ! 'video/x-raw,format={I420}' ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! omxvp8enc bitrate=4000000 ! avimux ! filesink location=test.avi
Note: In order for the bitrate property to take any effect, the low-latency property as well as rc-mode (GStreamer 0.10) resp. control-rate (GStreamer 1.0) also need to be set, as per the example pipelines below.
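As a rough sanity check when choosing a bitrate (our own rule of thumb, not from NVIDIA's documentation), you can relate the requested bitrate to the resolution and frame rate in bits per pixel:

```shell
# 4 Mbit/s at 1280x720, 30 fps works out to roughly 0.14 bits per pixel
BPP=$(awk -v w=1280 -v h=720 -v f=30 -v b=4000000 \
    'BEGIN { printf "%.2f", b / (w * h * f) }')
echo "${BPP} bits/pixel"
```

Values far below about 0.1 bits per pixel tend to show visible compression artifacts; raise the bitrate or lower the resolution accordingly.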
root@apalis-tk1:~# gst-launch-0.10 v4l2src queue-size=1 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=1280,height=960' ! ffmpegcolorspace ! 'video/x-raw-yuv, format=(fourcc)I420' ! nv_omx_h264enc low-latency=true rc-mode=0 bitrate=4000000 ! qtmux ! filesink location=test.mp4 -e
root@apalis-tk1:~# gst-launch-1.0 v4l2src ! 'video/x-raw,format={UYVY},width=1280,height=720,framerate=30/1' ! videoconvert ! 'video/x-raw,format={I420}' ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! omxh264enc low-latency=true bitrate=4000000 control-rate=2 ! 'video/x-h264,stream-format=(string)byte-stream' ! h264parse ! avimux ! filesink location=test.avi
Verify the supported formats of your camera using the command below:
root@apalis-tk1:~# v4l2-ctl --list-formats-ext
To ensure format compatibility, also inspect the format types supported by videoconvert in the GStreamer version used in the pipeline:
root@apalis-tk1:~# gst-inspect-1.0 videoconvert
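The v4l2-ctl output maps directly onto the caps filter used in the pipelines above. A sketch of assembling the filter string from the reported values (the values shown are examples; substitute what your camera actually supports):

```shell
# Assemble a GStreamer caps filter from the pixel format, frame size and
# frame rate that v4l2-ctl reported for the camera (example values shown)
FORMAT=UYVY; WIDTH=1280; HEIGHT=720; FPS=30
CAPS="video/x-raw,format=${FORMAT},width=${WIDTH},height=${HEIGHT},framerate=${FPS}/1"
echo "${CAPS}"
```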
While regular GStreamer plugins usually use x-raw-yuv and NVIDIA's GStreamer-wrapped OpenMAX plugins usually expect x-nvrm-yuv, there is a third color format representation called x-nv-yuv, which is what nv_omx_videomixer requires. Some examples:
root@colibri_t20:~# gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nv-yuv, width=(int)640, height=(int)480, format=(fourcc)I420' ! nv_omx_videomixer ! nv_gl_eglimagesink
root@colibri_t20:~# gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nvxvimagesink
root@colibri_t20:~# gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! nvxvimagesink
The following example shows how to play back video through GStreamer using a Colibri T20 module. Please note that the two numbers at the end specify which ALSA card and device to use for audio (e.g. alsasink device=hw:1,0 for SPDIF through HDMI and alsasink device=hw:2,0 for WM9715L AC97 through headphone). Further note that, since the example is performed with a Tegra module, it utilizes NVIDIA GStreamer elements in the pipeline. Use of another module may require alternate elements.
root@colibri_t20:~# gst-launch filesrc location=/home/root/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:1,0
[ 2388.624904] unknown ioctl code
Setting pipeline to PAUSED ...
[ 2388.677056] unknown ioctl code
Pipeline is PREROLLING ...
[ 2388.840327] avp_init: read firmware from 'nvrm_avp.bin' (36528 bytes)
[ 2388.846957] avp_init: Loading AVP kernel at vaddr=d8c00000 paddr=1ff00000
[ 2388.854848] avp_reset: Resetting AVP: reset_addr=100000
[ 2388.878221] avp_init: avp init done
[ 2388.905824] avp_svc_thread: got remote peer
[ 2388.910754] avp_lib: Successfully loaded library nvmm_manager.axf (lib_id=115da8)
[ 2388.953837] avp_lib: Successfully loaded library nvmm_service.axf (lib_id=117bb0)
[ 2389.017941] avp_lib: Successfully loaded library nvmm_h264dec.axf (lib_id=119a20)
[ 2389.036379] tegra_dvfs: rate 721500000 too high for dvfs on emc
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
Got EOS from element "pipeline0".
Execution ended after 105287328998 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
[ 2494.724074] avp_lib: Successfully unloaded 'nvmm_h264dec.axf'
[ 2494.764073] avp_lib: Successfully unloaded 'nvmm_service.axf'
Setting pipeline to NULL ...
Freeing pipeline ...
[ 2494.804076] avp_lib: Successfully unloaded 'nvmm_manager.axf'
[ 2494.809853] avp_svc_thread: couldn't receive msg
[ 2494.814710] avp_svc_thread: done
[ 2494.817958] avp_uninit: avp teardown done
The following example shows how to play back video through GStreamer using a Colibri T20 module. Please note that the two numbers at the end specify which ALSA card and device to use for audio (e.g. alsasink device=hw:0,0 for WM9715L AC97 through headphone and alsasink device=hw:1,0 for SPDIF through HDMI). Further note that, since the example is performed with a Tegra module, it utilizes NVIDIA GStreamer elements in the pipeline. Use of another module may require an alternate pipeline.
Note: In case you experience banding issues, this is likely due to the 16-bit color depth in our default image. To enable 24-bit color depth, consult the Framebuffer (Linux) and the X-Server (Linux) articles.
root@colibri-t20:~# gst-launch filesrc location=/home/root/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_eglimagesink demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:0,0
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
omx_setup error while setting FilterTimestamp
[32309.790818] avp_init: Using AVP MMU to relocate AVP kernel
[32309.821369] avp_init: Reading firmware from 'nvrm_avp.bin' (46612 bytes)
[32309.828541] avp_init: Loading AVP kernel at vaddr=d7a00000 paddr=18100000
[32309.852155] avp_reset: Resetting AVP: reset_addr=100000
[32309.869479] avp_node_try_connect: trying connect from RPC_AVP_PORT
[32309.875957] process_connect_locked: got connect (111794)
[32309.881278] avp_svc_thread: got remote peer
[32309.885664] [AVP]: AVP kernel (Nov 2 2012 16:47:31)
[32309.896950] avp_node_try_connect: got conn ack 'RPC_AVP_PORT' (cc10dd80 <-> 111758)
[32309.904691] avp_init: avp init done
[32309.908175] avp_lib: loading library 'nvmm_manager.axf'
[32309.928749] avp_lib: Successfully loaded library nvmm_manager.axf (lib_id=118960)
[32309.936494] avp_node_try_connect: trying connect from NVMM_MANAGER_SRV
[32309.951837] avp_node_try_connect: got conn ack 'NVMM_MANAGER_SRV' (cc10db00 <-> 119c00)
[32309.961486] avp_lib: loading library 'nvmm_service.axf'
[32309.983984] avp_lib: Successfully loaded library nvmm_service.axf (lib_id=11a770)
[32309.991654] avp_node_try_connect: trying connect from nbaaaaaa+
[32310.006858] avp_node_try_connect: got conn ack 'nbaaaaaa+' (cc10d200 <-> 11bab8)
[32310.025884] avp_lib: loading library 'nvmm_h264dec.axf'
[32310.140791] avp_lib: Successfully loaded library nvmm_h264dec.axf (lib_id=11c628)
[32310.149179] avp_node_try_connect: trying connect from obaaaaaa+
[32310.161838] avp_node_try_connect: got conn ack 'obaaaaaa+' (cd049540 <-> 11cbe0)
Allocating new output: 1280x720 (x 9)
[32310.234392] tegra20_ac97_hw_params(): dai->id=0, play
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
[32310.491623] tegra20_ac97_trigger()
[32310.495032] tegra20_ac97_start_playback()
[32310.499038] ac97_fifo_set_attention_level()
[32310.503216] ac97_slot_enable()
Got EOS from element "pipeline0".
Execution ended after 114565608000 ns.
Setting pipeline to PAUSED ...
[32425.070926] tegra20_ac97_trigger()
[32425.074368] tegra20_ac97_stop_playback()
[32425.078311] ac97_fifo_set_attention_level()
[32425.083534] ac97_slot_enable()
Setting pipeline to READY ...
[32425.129970] avp_trpc_close: closing 'obaaaaaa+' (11cbe0)
[32425.146843] _send_disconnect: sent disconnect msg for 11cbe0
[32425.182387] avp_lib: Successfully unloaded 'nvmm_h264dec.axf'
[32425.188896] avp_lib: unloaded 'nvmm_h264dec.axf'
[32425.212583] avp_trpc_close: closing 'nbaaaaaa+' (11bab8)
[32425.227385] _send_disconnect: sent disconnect msg for 11bab8
[32425.261971] avp_lib: Successfully unloaded 'nvmm_service.axf'
[32425.267764] avp_lib: unloaded 'nvmm_service.axf'
[32425.273879] process_disconnect_locked: got disconnect (cc10db00)
[32425.311926] avp_lib: Successfully unloaded 'nvmm_manager.axf'
[32425.317732] avp_lib: unloaded 'nvmm_manager.axf'
Setting pipeline to NULL ...
Freeing pipeline ...
[32425.405907] avp_svc_thread: AVP seems to be down; wait for kthread_stop
[32425.413156] avp_svc_thread: exiting
[32425.416771] avp_uninit: avp teardown done
Our T20 BSP V2.x supports video encoding as well.
Note the required '-e' option to gst-launch for MP4 containers. This sends an end-of-stream event through the pipeline and ensures a correctly written MP4 file. The tests were done with a Logitech C920 webcam. Available resolutions etc. depend on the webcam used.
Resolutions other than 640x480 require the GStreamer plugins from NVIDIA's L4T R16.3, which are only part of our images newer than June 2013.
VGA V4L2 source in YUV 4:2:2 format encoded to H.264 and stored to a file:
root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
gst-launch v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! video/x-h264 ! avimux ! filesink location=temp.avi
800x448 V4L2 source in YUV 4:2:2 format encoded to H.264 and stored to a file:
root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)800, height=(int)448, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
720p V4L2 source in YUV 4:2:2 format encoded to H.264 and stored to a file:
root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)1280, height=(int)720, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
1280x1024 videotestsrc source in YUV 4:2:2 format encoded to H.264 and stored to a file:
root@colibri_t20:~# gst-launch -e videotestsrc ! 'video/x-raw-yuv, width=(int)1280, height=(int)1024, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
videotestsrc source in YUV 4:2:0 format encoded to H.264 and stored to a file:
root@colibri_t20:~# gst-launch -e videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)I420' ! nvvidconv ! 'video/x-nvrm-yuv' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
Display a video from a VGA V4L2 source and concurrently store it H.264 encoded to a file:
root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! tee ! nvxvimagesink tee0. ! queue2 ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
Display a video on Apalis T30 from an OV5640 CSI-2 full HD V4L2 source and concurrently store it H.264 encoded to a file:
root@apalis-t30:~# gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv, framerate=15/1, width=1920, height=1088, format=(fourcc)I420' ! tee ! nv_omx_videomixer ! nv_gl_eglimagesink tee0. ! queue2 ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4 -e
Note: The encoder is limited to resolutions divisible by 16 (e.g. 1920x1088 instead of 1920x1080).
Note 2: On T30, the achievable frame rate for full HD is around 15 FPS.
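The divisible-by-16 constraint can be handled with plain shell arithmetic; a minimal sketch that rounds a height up to the next multiple of 16:

```shell
# 1080 is not divisible by 16; round it up to the next multiple (1088)
HEIGHT=1080
PADDED=$(( (HEIGHT + 15) / 16 * 16 ))
echo "${PADDED}"    # 1088
```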
# Test run to find out the video capabilities used
root@colibri-t20:~# gst-launch -v filesrc location=temp.mp4 ! qtdemux name=demux demux.video_00 ! queue ! rtph264pay pt=96 ! udpsink host=localhost port=5000 demux.audio_00 ! queue ! rtpmp4apay pt=97 ! udpsink host=localhost port=5001
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 311926741
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 59376
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstQTDemux:demux.GstPad:video_00: caps = NULL
Freeing pipeline ...
# use the displayed video capabilities
VCAPS="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376"
# start the receiver with the found video capabilities
root@colibri_t20:~# gst-launch udpsrc port=5000 ! $VCAPS ! rtph264depay ! nv_omx_h264dec ! nv_gl_eglimagesink
# launch the sender again
root@colibri_t20:~# gst-launch-0.10 -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! rtph264pay pt=96 ! udpsink host=localhost port=5000
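Copying the `caps = application/x-rtp, ...` line by hand from the verbose log is error-prone; it can also be extracted with a small sed expression. A sketch, using one (shortened) log line from the test run above as input:

```shell
# One shortened line of "gst-launch -v" output from the sender side:
log='/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, payload=(int)96'
# Keep only the caps value after "caps = " for use in the receiver pipeline:
VCAPS=$(echo "$log" | sed -n 's/.* caps = \(application\/x-rtp.*\)/\1/p')
echo "$VCAPS"
# → application/x-rtp, media=(string)video, clock-rate=(int)90000, payload=(int)96
```

In a real setup, pipe the full `gst-launch -v` output through the same sed expression (plus `head -n1`) instead of pasting the line manually.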
nv_omx_hdmioverlaysink
=> full screen via the HDMI (DVI-D) interface
nv_omx_lvdsoverlaysink
=> full screen via the TFT LCD (LVDS), VGA, or DVI-A interface
Note: So far this only works when using a display manager limited to one single resolution (e.g. libnvodm_disp.so.vgaonly). NVIDIA is currently investigating this.
nv_omx_hdmi_videosink
=> full screen via the HDMI-1 (DVI-D) interface
nv_omx_videosink
=> full screen via the LVDS-1 (parallel RGB), VGA, or DVI-A interface
root@colibri_t20:~# gst-launch filesrc location=/home/root/media/MatrixXQ.avi ! avidemux name=demux demux.video_00 ! nv_omx_mpeg4dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_mp3dec ! alsasink device=hw:2,0
root@colibri_t20:~# gst-launch filesrc location=/home/root/media/Avatar_-_Featurette_HD_1080p.mov ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:2,0
root@colibri_t20:~# gst-launch filesrc location=/home/root/media/bourne_ultimatum_trailer.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:2,0
The easiest way is to use NVIDIA's proprietary nvgstplayer application as follows:
nvgstplayer -i /media/sda1/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 --svs nv_omx_videosink
The audio output can be chosen via ~/.asoundrc (equivalent to e.g. alsasink device=hw:0,1) as follows:
pcm.!default {
    type hw
    card 0
    device 1
}
ctl.!default {
    type hw
    card 0
    device 1
}
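The card and device numbers for ~/.asoundrc can be read from the output of `aplay -l`. The snippet below pulls them out of one such line; the line itself is a hypothetical example, so run `aplay -l` on the module to get the real values:

```shell
# Hypothetical "aplay -l" output line; replace with your own:
line='card 0: tegrawm9715 [tegra-wm9715], device 1: WM9715 AC97 PCM'
card=$(echo "$line" | sed -n 's/^card \([0-9]*\):.*/\1/p')
dev=$(echo "$line" | sed -n 's/.*device \([0-9]*\):.*/\1/p')
# The matching ALSA device string for alsasink or .asoundrc:
echo "hw:${card},${dev}"
# → hw:0,1
```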
Alternatively, one can explicitly specify a GStreamer pipeline as follows:
root@colibri_t20:~# gst-launch filesrc location=/home/root/media/MatrixXQ.avi ! avidemux name=demux demux.video_00 ! nv_omx_mpeg4dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_mp3dec ! alsasink device=hw:2,0
root@colibri_t20:~# gst-launch filesrc location=/media/sda1/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nvxvimagesink demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:0,0
If you are looking for information on how to use the CSI Camera Module 5MP OV5640, also have a look at the article CSI Camera Module 5MP OV5640 (Linux).
If you want to use a webcam, read the article Webcam (Linux) in addition to the current article.
If you want to stream video over the network, read the article Audio/Video over RTP With GStreamer (Linux).
If you are looking for information on how to use cameras with containers in TorizonCore, refer to the article How to use Cameras on Torizon.