[GStreamer Development] GStreamer Pipelines on TI Embedded Processors

Abstract:
Example GStreamer pipelines from the Texas Instruments Embedded Processors Wiki.

Example GStreamer Pipelines

From Texas Instruments Embedded Processors Wiki


Contents

1 Purpose
2 Testing
3 Media files
  3.1 Creating an AVI file
4 Supported Platforms
5 DM355
  5.1 Environment Requirements
  5.2 Loopback: Video
  5.3 Loopback: Audio
  5.4 Loopback: Audio + Video
  5.5 Decode Video Files
  5.6 Decode Audio Files
  5.7 Decode .AVI Files
  5.8 Encode Video Files
  5.9 Encode Audio Files
  5.10 Image Encode
  5.11 Image Decode
  5.12 Resize
  5.13 Network Streaming
6 DM357
  6.1 Environment Requirements
  6.2 Loopback: Video
  6.3 Loopback: Audio
  6.4 Loopback: Audio + Video
  6.5 Decode Video Files
  6.6 Decode Audio Files
  6.7 Decode .MP4 Files
  6.8 Decode .AVI Files
  6.9 Encode Video Files
  6.10 Encode Video in Container
  6.11 Image Encode
  6.12 Image Decode
  6.13 Resize
  6.14 Network Streaming
7 DM644x
  7.1 Environment Requirements
  7.2 Loopback: Video
  7.3 Loopback: Audio
  7.4 Loopback: Audio + Video
  7.5 Decode Video Files
  7.6 Decode Audio Files
  7.7 Decode .MP4 Files
  7.8 Decode .AVI Files
  7.9 Decode .TS Files
  7.10 Encode Video Files
  7.11 Encode Video in Container
  7.12 Resize
  7.13 Network Streaming
8 DM365
  8.1 Environment Requirements
  8.2 Loopback: Video
  8.3 Loopback: Audio
  8.4 Loopback: Audio + Video
  8.5 Decode Video Files
  8.6 Decode Audio Files
  8.7 Decode Container Files
  8.8 Encode Video Files
  8.9 Encode Audio Files
  8.10 Encode Video in Container
  8.11 Image Encode
  8.12 Image Decode
  8.13 Resize
  8.14 Network Streaming
9 DM6467
  9.1 Environment Requirements
  9.2 Loopback: Video
  9.3 Loopback: Audio
  9.4 Loopback: Audio + Video
  9.5 Decode Video Files
  9.6 Decode Audio Files
  9.7 Decode .MP4 Files
  9.8 Decode .AVI Files
  9.9 Decode .TS Files
  9.10 Encode Video Files
  9.11 Encode Video in Container
  9.12 Encode Audio Files
  9.13 Resize
  9.14 Network Streaming
10 DM6467T
  10.1 Environment Requirements
  10.2 Loopback: Video
  10.3 Loopback: Audio
  10.4 Loopback: Audio + Video
  10.5 Decode Video Files
  10.6 Decode Audio Files
  10.7 Decode Container Files
  10.8 Encode Video Files
  10.9 Encode Video in Container
  10.10 Encode Audio Files
  10.11 Resize
  10.12 Network Streaming
11 OMAP35x
  11.1 Environment Requirements
  11.2 Loopback: Video
  11.3 Loopback: Audio
  11.4 Loopback: Audio+Video
  11.5 Decode Video files
  11.6 Decode Audio Files
  11.7 Decode .MP4 Files
  11.8 Decode .AVI Files
  11.9 Decode .TS Files
  11.10 Encode Video Files
  11.11 Encode Video in Container
  11.12 Image Encode
  11.13 Image Decode
  11.14 Resize
  11.15 Network Streaming
    11.15.1 Audio RTP Streaming
    11.15.2 H.264 RTP Streaming
12 All
  12.1 Debugging
    12.1.1 Verbose output
    12.1.2 Element debug output
  12.2 Audio pipelines
    12.2.1 Controlling the sample rate and bit depth
    12.2.2 Generic network audio streaming example

Purpose

This page provides example pipelines that can be copied to the command line to demonstrate various GStreamer operations. Some of the pipelines may need modification for things such as file names, IP addresses, etc.

It is our hope that people using this page will add new and interesting pipelines that they themselves are using. For example, if you are decoding 1080p video on DM6467 and outputting to component, please include your pipeline for others to use as a reference.

Refer to the GStreamer article on this wiki for more information on downloading and building the TI GStreamer elements. The project is hosted at http://gstreamer.ti.com. If you are interested in the design details, watch the video presentation at http://software-dl.ti.com/sdo/sdo_apps_public_sw/GStreamer_On_TI/FLV1/GStreamer_On_TI.htm

Testing

Currently these pipelines have not undergone any extensive testing. If you find an error in a pipeline please correct it.

Media files

You should be able to use any audio and video media file that conforms to the appropriate standard.

Creating an AVI file

The following ffmpeg command takes a .mov file (say, from the Apple movie trailers site) and makes an AVI file. Run the command on your host computer.

ffmpeg -i tropic_thunder-tlr1a_720p.mov -r 60 -b 6000000 -vcodec mpeg2video -ab 48000000 -acodec libmp3lame -s 1280x544 tropic.avi

Supported Platforms

The following is a list of supported platforms; each entry links directly to the pipeline examples for that platform.

DM355

DM357

DM644x

DM365

DM6467

DM6467T

OMAP35x

All (commonly requested examples)

DM355

This section covers pipelines for common use cases for the DM355 processor.

Environment Requirements

Before executing the pipelines you need to set a couple of environment variables, load the kernel modules, and activate the video planes as follows:

cd /opt/gstreamer_demo/dm355/

./loadmodules.sh

export GST_REGISTRY=/tmp/gst_registry.bin

export LD_LIBRARY_PATH=/opt/gstreamer/lib

export GST_PLUGIN_PATH=/opt/gstreamer/lib/gstreamer-0.10

export PATH=/opt/gstreamer/bin:$PATH

cat /dev/zero > /dev/fb2 2> /dev/null

The above commands assume that GStreamer is installed in the /opt/gstreamer directory.
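
If you run many pipelines in one session, it can be handy to wrap these steps in a small script and source it once per boot. The following is an untested sketch that only repeats the commands above; the file name setup-gst-dm355.sh is arbitrary.

#!/bin/sh
# setup-gst-dm355.sh -- untested helper wrapping the setup steps above.
# Source it once per boot:  . ./setup-gst-dm355.sh
cd /opt/gstreamer_demo/dm355/
./loadmodules.sh                                   # load the required kernel modules
export GST_REGISTRY=/tmp/gst_registry.bin          # keep the plugin registry on a writable filesystem
export LD_LIBRARY_PATH=/opt/gstreamer/lib
export GST_PLUGIN_PATH=/opt/gstreamer/lib/gstreamer-0.10
export PATH=/opt/gstreamer/bin:$PATH
cat /dev/zero > /dev/fb2 2> /dev/null              # activate the video plane, as described above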

Loopback: Video

v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite contiguousInputFrame=TRUE sync=false

videotestsrc (generated video test-bars):

gst-launch -v videotestsrc ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite accelFrameCopy=FALSE sync=false

Loopback: Audio

No pipelines here yet. Please feel free to add your own.
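
As a starting point, the plain ALSA loopback pipelines shown in the DM365 section further down this page use no accelerated elements, so they should also apply here (untested on DM355):

gst-launch audiotestsrc num-buffers=1000 ! alsasink

gst-launch -v alsasrc ! alsasink sync=false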

Loopback: Audio + Video

No pipelines here yet. Please feel free to add your own.
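
One possible (untested) sketch is to run the video loopback above and an ALSA audio loopback as two independent branches of a single gst-launch pipeline; audio/video synchronization is not addressed here:

gst-launch -v v4l2src always-copy=FALSE ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite contiguousInputFrame=TRUE sync=false alsasrc ! alsasink sync=false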

Decode Video Files

NTSC:

gst-launch -v filesrc location=sample.m4v ! TIViddec2 codecName=mpeg4dec engineName=codecServer ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

PAL:

gst-launch -v filesrc location=sample.m4v ! TIViddec2 codecName=mpeg4dec engineName=codecServer ! TIDmaiVideoSink videoStd=D1_PAL videoOutput=composite sync=false

Decode Audio Files

This platform does not have an accelerated audio decoder element. You can use the ARM-based audio decoders "mad", "ffdec_mp3", or "faad".

MP3 pipelines:

gst-launch filesrc location=sample.mp3 ! mad ! alsasink

gst-launch filesrc location=sample.mp3 ! mp3parse ! ffdec_mp3 ! alsasink

AAC pipeline:

gst-launch filesrc location=sample.aac ! faad ! alsasink

Decode .AVI Files

The following pipeline assumes you have an AVI file with MPEG-4 Video and MP1L2 or MP3 Audio. Note that not all MPEG-4 video streams can be played using the DM355 MPEG-4 decoder -- make sure the MPEG-4 stream was encoded with the DM355 MPEG-4 encoder or another compatible encoder.

gst-launch -v filesrc location=sample.avi ! avidemux name=demux demux.audio_00 ! queue max-size-buffers=8000 max-size-time=0 max-size-bytes=0 ! mad ! alsasink demux.video_00 ! queue ! TIViddec2 ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite

Encode Video Files

videotestsrc (generated video test-bars):

gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc1 codecName=mpeg4enc engineName=codecServer ! filesink location=output.m4v

v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE num-buffers=2000 ! TIVidenc1 codecName=mpeg4enc engineName=codecServer contiguousInputFrame=TRUE ! filesink location=output.m4v

Encode Audio Files

This platform does not have an accelerated audio encoder element. You can use the ARM-based audio encoders "lame" or "faac".

No pipelines here yet. Please feel free to add your own.
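
As an untested ARM-side sketch, a capture-and-encode pipeline along these lines should work if the "lame" plugin is present in your GStreamer build (swap in "faac" and an .aac file name for AAC):

gst-launch alsasrc num-buffers=1000 ! audioconvert ! lame ! filesink location=output.mp3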

Image Encode

A simple pipeline that converts a UYVY image into JPEG format.

gst-launch -v filesrc location=sample.yuv ! TIImgenc1 resolution=720x480 iColorSpace=UYVY ColorSpace=YUV420P qValue=75 ! filesink location=output.jpg
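
If you need a test input, a 720x480 UYVY frame can be generated on the host with ffmpeg from any still image; this is an untested sketch and sample.png is just a placeholder name:

ffmpeg -i sample.png -s 720x480 -pix_fmt uyvy422 -f rawvideo sample.yuv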

Image Decode

A simple pipeline that converts a JPEG image into UYVY format.

gst-launch -v filesrc location=sample.jpg ! TIImgdec1 resolution=720x480 ! filesink location=sample.yuv

Resize

A simple pipeline capturing from v4l2src and resizing to CIF.

gst-launch v4l2src always-copy=FALSE ! TIVidResize contiguousInputFrame=TRUE ! 'video/x-raw-yuv,width=352,height=288' ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

Network Streaming

This section gives examples where the EVM acts as a streaming server that captures, encodes, and transmits video over UDP. A host PC can be used as the client to decode.

MPEG-4 Encode/Stream/Decode:

A simple RTP server to encode and transmit MPEG-4

gst-launch -v v4l2src always-copy=FALSE ! TIVidenc1 codecName=mpeg4enc engineName=encode contiguousInputFrame=TRUE ! rtpmp4vpay pt=96 ! udpsink host=<HOST IP ADDRESS> port=5000

When the pipeline starts to run, you'll see something that looks like this:

/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001010000012000845d4c28b421e0a21f, payload=(int)96, ssrc=(guint)3412089386, clock-base=(guint)945410414, seqnum-base=(guint)27711

Make a note of the caps="application/x-rtp, media=(string)video ..." string and pass it to the client below.

A simple RTP client to decode the MPEG-4 stream and display it on the host machine:

gst-launch -v udpsrc port=5000 caps="<PASS_CAPS_FROM_SERVER>" ! rtpmp4vdepay ! ffdec_mpeg4 ! xvimagesink
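
Copying the caps string by hand is error-prone. A rough, untested way to automate it is to log the server's verbose output and extract the first application/x-rtp caps line on the host; server.log is just a placeholder file name:

# On the EVM: the same server pipeline as above, with its verbose output logged somewhere the host can read (e.g. over NFS)
gst-launch -v v4l2src always-copy=FALSE ! TIVidenc1 codecName=mpeg4enc engineName=encode contiguousInputFrame=TRUE ! rtpmp4vpay pt=96 ! udpsink host=<HOST IP ADDRESS> port=5000 | tee server.log

# On the host: pull the caps out of the log and start the client
CAPS=$(grep -m1 -o 'application/x-rtp[^;]*' server.log)
gst-launch -v udpsrc port=5000 caps="$CAPS" ! rtpmp4vdepay ! ffdec_mpeg4 ! xvimagesink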

For sending and receiving MPEG-4 with the DM355 EVM you can use these two pipelines:

RTP server side:

gst-launch -v videotestsrc ! TIVidenc1 codecName=mpeg4enc engineName=encode ! rtpmp4vpay send-config=true ! udpsink host=<HOST IP ADDRESS> port=5000

Don't forget to set the "send-config" property to true

RTP client side:

gst-launch -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1, config=(string)000001010000012000845d4c285020f0a21f, payload=(int)96, ssrc=(guint)3319524202, clock-base=(guint)4012564513, seqnum-base=(guint)25833" ! rtpmp4vdepay ! TIViddec2 codecName=mpeg4dec engineName=decode ! TIDmaiVideoSink videoStd=D1_PAL videoOutput=composite sync=false

Note that the client should be started before the server.

DM357

This section covers pipelines for common use cases for the DM357 processor.

Environment Requirements

Before executing the pipelines you need to set a couple of environment variables, load the kernel modules, and activate the video planes as follows:

cd /opt/gstreamer_demo/dm357/

./loadmodules.sh

export GST_REGISTRY=/tmp/gst_registry.bin

export LD_LIBRARY_PATH=/opt/gstreamer/lib

export GST_PLUGIN_PATH=/opt/gstreamer/lib/gstreamer-0.10

export PATH=/opt/gstreamer/bin:$PATH

cat /dev/zero > /dev/fb2 2> /dev/null

The above commands assume that GStreamer is installed in the /opt/gstreamer directory.

Notes on DM357 performance: There is a known issue on the DM357 where intermittent freezes occur in video and audio playback in some cases. If you experience this, nicing your gst-launch command to 15 as follows may resolve the issue:

nice -n 15 gst-launch .... (rest of gst-launch command)

Loopback: Video

v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite contiguousInputFrame=TRUE sync=false

videotestsrc (generated video test-bars):

gst-launch -v videotestsrc ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite accelFrameCopy=FALSE sync=false

Loopback: Audio

No pipelines here yet. Please feel free to add your own.

Loopback: Audio + Video

No pipelines here yet. Please feel free to add your own.

Decode Video Files

H.264/NTSC:

gst-launch -v filesrc location=sample.264 ! TIViddec codecName=h264dec engineName=hmjcp ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

MPEG-4/NTSC:

gst-launch -v filesrc location=sample.m4v ! TIViddec codecName=mpeg4dec engineName=hmjcp ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

Decode Audio Files

This platform does not have an accelerated audio decoder element. You can use the ARM-based audio decoders "mad" or "faad".

No pipelines here yet. Please feel free to add your own.
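
Until accelerated pipelines are added, the ARM-based decode pipelines from the DM355 section above should also work here (untested on DM357):

gst-launch filesrc location=sample.mp3 ! mad ! alsasink

gst-launch filesrc location=sample.aac ! faad ! alsasink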

Decode .MP4 Files

No pipelines here yet. Please feel free to add your own.
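
As an untested starting point, combining the qtdemux pattern from the DM644x section with the DM357 H.264 decoder used in "Decode Video Files" above gives a video-only pipeline (assuming the .mp4 contains H.264):

gst-launch filesrc location=sample.mp4 ! qtdemux name=demux demux.video_00 ! queue ! TIViddec codecName=h264dec engineName=hmjcp ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false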

Decode .AVI Files

The following pipeline assumes you have an AVI file with MPEG-2 or MPEG-4 Video and MP1L2 or MP3 Audio.

gst-launch -v filesrc location=sample.avi ! avidemux name=demux demux.audio_00 ! queue max-size-buffers=8000 max-size-time=0 max-size-bytes=0 ! mad ! alsasink demux.video_00 ! queue ! TIViddec ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite

Encode Video Files

H.264/v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE num-buffers=2000 ! TIVidenc codecName=h264enc engineName=hmjcp contiguousInputFrame=TRUE ! filesink location=sample.264

MPEG-4/v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE num-buffers=2000 ! TIVidenc codecName=mpeg4enc engineName=hmjcp contiguousInputFrame=TRUE ! filesink location=sample.m4v

H.264/videotestsrc (generated video test-bars):

gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc codecName=h264enc engineName=hmjcp ! filesink location=sample.264

MPEG-4/videotestsrc (generated video test-bars):

gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc codecName=mpeg4enc engineName=hmjcp ! filesink location=sample.m4v

Encode Video in Container

Encode H.264 in quicktime container (Capture):

gst-launch -v v4l2src always-copy=FALSE num-buffers=2000 ! TIVidenc codecName=h264enc engineName=hmjcp contiguousInputFrame=TRUE byteStream=FALSE ! qtmux ! filesink location=sample.mp4

gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc codecName=h264enc engineName=hmjcp byteStream=FALSE ! qtmux ! filesink location=sample.mp4

Image Encode

A simple pipeline that converts a UYVY image into JPEG format.

gst-launch filesrc location=sample.yuv ! TIImgenc resolution=720x480 iColorSpace=UYVY ColorSpace=YUV422P qValue=75 ! filesink location=sample.jpg

Image Decode

A simple pipeline that converts a JPEG image into UYVY format.

gst-launch filesrc location=sample.jpg ! TIImgdec resolution=720x480 ! filesink location=sample.yuv

Resize

A simple pipeline capturing from v4l2src and resizing to CIF.

gst-launch v4l2src always-copy=FALSE ! TIVidResize contiguousInputFrame=TRUE ! 'video/x-raw-yuv,width=352,height=288' ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite accelFrameCopy=FALSE sync=false

Network Streaming

This section gives examples where the EVM acts as a streaming server that captures, encodes, and transmits video over UDP. A host PC can be used as the client to decode.

H.264 Encode/Stream/Decode:

A simple RTP server to be run on EVM.

gst-launch -v v4l2src always-copy=FALSE ! TIVidenc codecName=h264enc engineName=hmjcp contiguousInputFrame=TRUE ! rtph264pay pt=96 ! udpsink host=<HOST_PC_IP> port=5000

When the pipeline starts to run, you'll see something that looks like this:

/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, profile-level-id=(string)42801e, sprop-parameter-sets=(string)"Z0KAHtoC0PRA\,aM48gA\=\=", payload=(int)96, ssrc=(guint)895989858, clock-base=(guint)3971488929, seqnum-base=(guint)34821

Make a note of the caps="application/x-rtp, media=(string)video ..." string and pass it to the client below.

A simple RTP client to be run on Host PC (Linux).

gst-launch -v udpsrc port=5000 caps="<CAPS_FROM_SERVER>" ! rtph264depay ! ffdec_h264 ! xvimagesink

DM644x

This section covers pipelines for common use cases for the DM644x processor.

Environment Requirements

Before executing the pipelines you need to set a couple of environment variables, load the kernel modules, and activate the video planes as follows:

cd /opt/gstreamer_demo/dm6446/

./loadmodules.sh

export GST_REGISTRY=/tmp/gst_registry.bin

export LD_LIBRARY_PATH=/opt/gstreamer/lib

export GST_PLUGIN_PATH=/opt/gstreamer/lib/gstreamer-0.10

export PATH=/opt/gstreamer/bin:$PATH

cat /dev/zero > /dev/fb2 2> /dev/null

The above commands assume that GStreamer is installed in the /opt/gstreamer directory.

Loopback: Video

v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite contiguousInputFrame=TRUE sync=false

videotestsrc (generated video test-bars):

gst-launch -v videotestsrc ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite accelFrameCopy=FALSE sync=false

Loopback: Audio

No pipelines here yet. Please feel free to add your own.

Loopback: Audio + Video

No pipelines here yet. Please feel free to add your own.

Decode Video Files

H.264/NTSC:

gst-launch -v filesrc location=sample.264 ! TIViddec2 codecName=h264dec engineName=decode ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

MPEG-2/NTSC:

gst-launch -v filesrc location=sample.m2v ! TIViddec2 codecName=mpeg2dec engineName=decode ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

MPEG-4/NTSC:

gst-launch -v filesrc location=sample.m4v ! TIViddec2 codecName=mpeg4dec engineName=decode ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

Decode Audio Files

AAC:

gst-launch -v filesrc location=sample.aac ! TIAuddec1 codecName=aachedec engineName=decode ! alsasink sync=false

Decode .MP4 Files

The following pipeline assumes you have an .MP4 file with H.264 Video and AAC Audio.

gst-launch -v filesrc location=sample.mp4 ! qtdemux name=demux demux.audio_00 ! queue max-size-buffers=8000 max-size-time=0 max-size-bytes=0 ! TIAuddec1 ! alsasink demux.video_00 ! queue ! TIViddec2 ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite

Decode .AVI Files

The following pipeline assumes you have an .AVI file with MPEG-2 or MPEG-4 Video and MP1L2 or MP3 Audio.

gst-launch -v filesrc location=sample.avi ! avidemux name=demux demux.audio_00 ! queue max-size-buffers=1200 max-size-time=0 max-size-bytes=0 ! mad ! alsasink demux.video_00 ! queue ! TIViddec2 ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite

Decode .TS Files

The following pipeline assumes you have a transport stream file with H.264 Video and MP1L2 or MP3 Audio.

gst-launch filesrc location=sample.ts ! typefind ! mpegtsdemux name=demux demux. ! queue max-size-buffers=1200 max-size-time=0 max-size-bytes=0 ! typefind ! mad ! alsasink demux. ! typefind ! TIViddec2 ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=COMPOSITE

Encode Video Files

H.264/v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE num-buffers=2000 ! TIVidenc1 codecName=h264enc engineName=encode contiguousInputFrame=TRUE ! filesink location=sample.264

MPEG-4/v4l2src (Capture):

gst-launch -v v4l2src always-copy=FALSE num-buffers=2000 ! TIVidenc1 codecName=mpeg4enc engineName=encode contiguousInputFrame=TRUE ! filesink location=sample.m4v

H.264/videotestsrc (generated video test-bars):

gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc1 codecName=h264enc engineName=encode ! filesink location=sample.264

MPEG-4/videotestsrc (generated video test-bars):

gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc1 codecName=mpeg4enc engineName=encode ! filesink location=sample.m4v

Encode Video in Container

Encode H.264 in quicktime container (Capture):

gst-launch -v v4l2src always-copy=FALSE num-buffers=2000 ! TIVidenc1 codecName=h264enc engineName=encode contiguousInputFrame=TRUE byteStream=FALSE ! qtmux ! filesink location=sample.mp4

gst-launch -v videotestsrc num-buffers=2000 ! TIVidenc1 codecName=h264enc engineName=encode byteStream=FALSE ! qtmux ! filesink location=sample.mp4

Resize

A simple pipeline capturing from v4l2src and resizing to CIF.

gst-launch v4l2src always-copy=FALSE ! TIVidResize contiguousInputFrame=TRUE ! 'video/x-raw-yuv,width=352,height=288' ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false accelFrameCopy=FALSE

Network Streaming

This section gives examples where the EVM acts as a streaming server that captures, encodes, and transmits video over UDP. A host PC can be used as the client to decode.

H.264 Encode/Stream/Decode:

A simple RTP server to be run on EVM.

gst-launch -v v4l2src always-copy=FALSE ! TIVidenc1 codecName=h264enc engineName=encode contiguousInputFrame=TRUE ! rtph264pay pt=96 ! udpsink host=<HOST_PC_IP> port=5000

When the pipeline starts to run, you'll see something that looks like this:

/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, profile-level-id=(string)42801e, sprop-parameter-sets=(string)"Z0KAHtoC0PRA\,aM48gA\=\=", payload=(int)96, ssrc=(guint)895989858, clock-base=(guint)3971488929, seqnum-base=(guint)34821

Make a note of the caps="application/x-rtp, media=(string)video ..." string and pass it to the client below.

A simple RTP client to be run on Host PC (Linux).

gst-launch -v udpsrc port=5000 caps="<PASS_CAPS_FROM_SERVER>" ! rtph264depay ! ffdec_h264 ! xvimagesink

MPEG-4 Encode/Stream/Decode:

A simple RTP server to encode and transmit MPEG-4

gst-launch -v v4l2src always-copy=FALSE ! TIVidenc1 codecName=mpeg4enc engineName=encode contiguousInputFrame=TRUE ! rtpmp4vpay pt=96 ! udpsink host=128.247.105.80 port=5000

When the pipeline starts to run, you'll see something that looks like this:

/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)5, config=(string)000001b005000001b509000001000000012000847a9828b421e0a31f, payload=(int)96, ssrc=(guint)302303174, clock-base=(guint)347576712, seqnum-base=(guint)48616

Make a note of the caps="application/x-rtp, media=(string)video ..." string and pass it to the client below.

A simple RTP client to decode the MPEG-4 stream and display it on the host machine:

gst-launch -v udpsrc port=5000 caps="<PASS_CAPS_FROM_SERVER>" ! rtpmp4vdepay ! ffdec_mpeg4 ! xvimagesink

MPEG-4 Receive/Decode/Display:

This section gives an example where the EVM acts as the RTP client: it receives the encoded stream over UDP, then decodes and displays the output. A host PC can be used as the server to transmit the encoded stream.

A simple RTP server (run on the host PC) which encodes and transmits MPEG-4 to the DM6446 EVM:

gst-launch-0.10 videotestsrc ! 'video/x-raw-yuv,width=720,height=480' ! ffenc_mpeg4 ! rtpmp4vpay ! udpsink host=<EVM_IP_ADDR> port=5000 -v

Make a note of the caps="application/x-rtp, media=(string)video ..." string and pass it to the client below.

A simple RTP client (run on the EVM) to receive and decode the MPEG-4 encoded stream:

gst-launch -v udpsrc port=5000 caps="<CAPS_FROM_SERVER>" ! rtpmp4vdepay ! TIViddec2 ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite sync=false

DM365

This section covers pipelines for common use cases for the DM365 processor.

Environment Requirements

Before executing the pipelines you need to set a couple of environment variables, load the kernel modules, and activate the video planes as follows:

cd /opt/gstreamer_demo/dm365/

./loadmodules.sh

export GST_REGISTRY=/tmp/gst_registry.bin

export LD_LIBRARY_PATH=/opt/gstreamer/lib

export GST_PLUGIN_PATH=/opt/gstreamer/lib/gstreamer-0.10

export PATH=/opt/gstreamer/bin:$PATH

cat /dev/zero > /dev/fb2 2> /dev/null

The above commands assume that GStreamer is installed in the /opt/gstreamer directory.

Please see the special notes on playing 720P clips (linked from the original wiki article).

Loopback: Video

Generated_D1, composite out (videotestsrc):

gst-launch -v videotestsrc ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite accelFrameCopy=FALSE sync=false

Generated_720p60, component out (videotestsrc):

gst-launch -v videotestsrc ! video/x-raw-yuv,width=1280,height=720 ! TIDmaiVideoSink videoStd=720P_60 videoOutput=component accelFrameCopy=FALSE sync=false

Capture_D1 (v4l2src):

gst-launch -v v4l2src always-copy=FALSE input-src=composite ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=composite contiguousInputFrame=TRUE sync=false

Capture_720P (v4l2src):

gst-launch -v v4l2src always-copy=FALSE ! TIDmaiVideoSink videoStd=720P_60 videoOutput=component contiguousInputFrame=TRUE sync=false

Loopback: Audio

Generated audio tone (audiotestsrc):

gst-launch audiotestsrc num-buffers=1000 ! alsasink

Captured audio (alsasrc):

gst-launch -v alsasrc ! alsasink sync=false

Loopback: Audio + Video

No pipelines here yet. Please feel free to add your own.

Decode Video Files

MPEG-4 -> NTSC_D1:

gst-launch -v filesrc location=sample_ntsc_D1.mpeg4 ! TIViddec2 engineName=codecServer codecName=mpeg4dec ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=COMPOSITE sync=FALSE

MPEG-4 -> PAL_D1:

gst-launch -v filesrc location=sample_pal_D1.mpeg4 ! TIViddec2 engineName=codecServer codecName=mpeg4dec ! TIDmaiVideoSink videoStd=D1_PAL videoOutput=COMPOSITE sync=FALSE

MPEG-4 -> NTSC_720P:

gst-launch -v filesrc location=sample_720p.mpeg4 ! TIViddec2 codecName=mpeg4dec engineName=codecServer ! queue max-size-buffers=2 max-size-time=0 max-size-bytes=0 ! TIDmaiVideoSink videoStd=720P_60 videoOutput=COMPONENT sync=FALSE hideOSD=TRUE

H.264 -> NTSC_D1:

gst-launch -v filesrc location=sample_D1.264 ! TIViddec2 engineName=codecServer codecName=h264dec ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=COMPOSITE sync=FALSE

H.264 -> NTSC_720P:

gst-launch -v filesrc location=sample_720P.264 ! TIViddec2 codecName=h264dec engineName=codecServer ! queue max-size-buffers=2 max-size-time=0 max-size-bytes=0 ! TIDmaiVideoSink videoStd=720P_60 videoOutput=component sync=false hideOSD=TRUE

MPEG-2 -> NTSC_D1:

gst-launch filesrc location=sample_D1.m2v ! TIViddec2 engineName=codecServer codecName=mpeg2dec ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=COMPOSITE sync=FALSE

MPEG-2 -> NTSC_720P:

gst-launch filesrc location=sample_720p.m2v ! TIViddec2 engineName=codecServer codecName=mpeg2dec ! queue max-size-buffers=2 max-size-time=0 max-size-bytes=0 ! TIDmaiVideoSink videoStd=720P_60 videoOutput=component sync=false hideOSD=TRUE

Decode Audio Files

AAC:

gst-launch -v filesrc location=sample.aac ! TIAuddec1 codecName=aacdec engineName=codecServer ! alsasink sync=false

Decode Container Files

AVI (MPEG-4 / MP3):

gst-launch filesrc location=sample.avi ! avidemux name=demux demux.audio_00 ! queue max-size-buffers=1200 max-size-time=0 max-size-bytes=0 ! mad ! alsasink demux.video_00 ! TIViddec2 engineName=codecServer codecName=mpeg4dec ! TIDmaiVideoSink videoStd=D1_NTSC videoOutput=COMPOSITE sync=FALSE

Encode Video Files

Generated_D1 (videotestsrc) -> MPEG-4:

gst-launch videotestsrc num-buffers=1000 ! 'video/x-raw-yuv, format=(fourcc)NV12' ! TIVidenc1 codecName=mpeg4enc engineName=codecServer ! filesink location=output_gen_D1.m4v

YUV_D1 -> MPEG-4:

gst-launch filesrc location=sample_nv12.yuv ! 'video/x-raw-yuv, format=(fourcc)NV12, width=320, height=240, framerate=(fraction)30/1' ! TIVidenc1 codecName=mpeg4enc engineName=codecServer ! filesink location=output_yuv_D1.m4v

Captured_D1 (v4l2src) -> MPEG-4:

gst-launch -v v4l2src always-copy=FALSE num-buffers=800 input-src=composite ! 'video/x-raw-yuv,format=(fourcc)NV12,width=720,height=480' ! TIVidenc1 codecName=mpeg4enc engineName=codecServer ! filesink location=output_cap_D1.m4v

Captured_720P (v4l2src) -> MPEG-4:

gst-launch -v v4l2src always-copy=FALSE num-buffers=800 input-src=COMPONENT ! 'video/x-raw-yuv,format=(fourcc)NV12,width=1280,height=720' ! TIVidenc1 codecName=mpeg4enc engineName=codecServer contiguousInputFrame=TRUE ! filesink location=output_cap_720P.m4v
