Introduction
This article demonstrates the use of a camera under embedded Linux on an ARM embedded module system. The module used is the Toradex VF61, a very cost-effective module that does not include hardware video codec acceleration. Its processor is the NXP/Freescale Vybrid, a heterogeneous dual-core architecture combining a Cortex-A5 and a Cortex-M4.
1). Currently, more and more embedded systems include camera applications, mainly in the following areas:
Remote monitoring: for example, closed-circuit television systems, where operators use cameras to remotely watch a specific area, from a single residential community up to a municipal public space.
Surveillance recording: some surveillance systems are not watched by operators at all times; instead they record the surveillance video so that the relevant footage can be retrieved for review when needed.
Embedded vision systems: these process video images to extract higher-level information, as in radar and urban intelligent transportation applications.
Video sensors: for example, clinical diagnostic equipment analyzes captured video images for diagnosis, and smart retail devices analyze customer characteristics from video images for targeted promotion and sales.
2). Environment configuration
- ARM embedded module system: Toradex VF61 with Colibri Evaluation Board; the detailed configuration guide can be found here
- Cameras:
Logitech HD 720p USB webcam
D-Link DCS-930L IP camera
- Software:
Toradex standard embedded Linux distribution V2.4 (pre-installed), see here for details
GStreamer: this framework is widely used in the development of multimedia applications such as video editing, media streaming, and media playback. Through its many plug-ins (input and output elements, filters, codecs, and so on), GStreamer supports a wide range of media libraries and formats, such as MP3 and FFmpeg. Install the required packages as follows:
$ opkg update
$ opkg install gst-plugins-base-meta gst-plugins-good-meta gst-ffmpeg
List the currently installed plugins and elements:
$ gst-inspect
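Since the full list is long, it can be filtered to check for a specific plugin; for example (assuming grep is available on the target):
$ gst-inspect | grep ffmpeg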
Introduction to GStreamer elements and pipelines
According to chapter 3 of the GStreamer Application Development Manual, elements are the most important class of objects in GStreamer: one element reads a file, another decodes it, and a third displays it. A pipeline is a chain of elements linked together to perform a specific task, such as video playback or capture. GStreamer ships with a large set of elements by default, which makes it straightforward to develop a wide variety of multimedia applications. In this article we use several pipelines to demonstrate some of these elements.
The following figure shows a basic pipeline for Ogg playback, using a demuxer and two branches, one for audio and one for video. Note that some elements have only source pads, while others have only sink pads or both.
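To make the figure concrete, the same Ogg playback pipeline can be expressed with gst-launch roughly as follows (a sketch, assuming a local Theora/Vorbis file named video.ogg):
$ gst-launch filesrc location=video.ogg ! oggdemux name=demux demux. ! queue ! theoradec ! ffmpegcolorspace ! autovideosink demux. ! queue ! vorbisdec ! audioconvert ! autoaudiosink
Here oggdemux splits the container, the two demux. references pick up its video and audio pads, and each branch decodes and renders its own stream.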
Before assembling a pipeline, we should use the "gst-inspect" command to check that the required elements are available and what capabilities they offer. The following example inspects the ffmpegcolorspace element.
$ gst-inspect ffmpegcolorspace
Basic information:
-----------------------------------------------------------
Factory Details:
  Long name:   FFMPEG Colorspace converter
  Class:       Filter/Converter/Video
  Description: Converts video from one colorspace to another
  Author(s):   GStreamer maintainers <gstreamer-devel@lists.sourceforge.net>
-----------------------------------------------------------
Src and sink pad descriptions:
-----------------------------------------------------------
SRC template: 'src'
  Availability: Always
  Capabilities:
    video/x-raw-yuv
    video/x-raw-rgb
    video/x-raw-gray

SINK template: 'sink'
  Availability: Always
  Capabilities:
    video/x-raw-yuv
    video/x-raw-rgb
    video/x-raw-gray
-----------------------------------------------------------
Another example is the v4l2src element, which has only a src pad and can therefore feed a video stream into another element, while the ximagesink element has only a sink pad accepting RGB video. For more details on pads, see the GStreamer documentation referenced above.
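You can verify the pad layouts yourself: the "Pad Templates" section of gst-inspect's output shows that v4l2src defines only a SRC template, while ximagesink defines only a SINK template.
$ gst-inspect v4l2src
$ gst-inspect ximagesink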
Display a video test pattern
Use the following pipeline to display a video test pattern
$ gst-launch videotestsrc ! autovideosink
The autovideosink element automatically detects the video output, and the videotestsrc element's "pattern" property selects among various test patterns, such as the snow pattern below.
$ gst-launch videotestsrc pattern=snow ! autovideosink
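The full list of supported patterns can be read from the element description itself; the "pattern" property in the output enumerates values such as smpte, snow, and black:
$ gst-inspect videotestsrc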
USB Camera
1). Display video from USB camera
After a camera is connected to the system, a corresponding device node videox appears under the /dev directory, where x can be 0, 1, 2, and so on, depending on the number of connected cameras.
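A quick way to see which device nodes are present before building a pipeline:
$ ls /dev/video*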
Please use the following pipeline to display the corresponding camera video in full screen
$ gst-launch v4l2src device=/dev/videox ! ffmpegcolorspace ! ximagesink
// Video4Linux2 is an API and driver framework for capturing and playing back video, and it supports a wide variety of USB cameras and other devices. The v4l2src element belongs to the Video4Linux2 plugin and reads video frames from a Video4Linux2 device, in this case the USB camera. The ffmpegcolorspace element is a filter that converts between color formats: camera devices usually deliver YUV, while displays usually expect RGB. The ximagesink element is the standard video sink for the X desktop.
In this case, the "top" command shows a CPU usage of 77.9%.
You can also constrain properties such as frame size and frame rate with a caps filter. For example, the following pipeline limits the video to 320x240, and the CPU usage drops to 28.2%.
$ gst-launch v4l2src device=/dev/videox ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink
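If you are not sure which sizes and frame rates the camera supports, adding the -v flag makes gst-launch print the caps negotiated on each pad, which helps in choosing the width, height, and framerate values (a diagnostic sketch, assuming the camera is /dev/video0):
$ gst-launch -v v4l2src device=/dev/video0 ! ffmpegcolorspace ! ximagesink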
2). Display two USB cameras at the same time
Use the following pipeline to display two cameras at the same time. Here we use a Logitech HD 720P camera and an ordinary MJPEG camera; in this case, the CPU usage is 64.8%.
$ gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink v4l2src device=/dev/video1 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink
3). Record USB camera video
Use the following pipeline to record camera video in MP4 format
$ gst-launch --eos-on-shutdown v4l2src device=/dev/videox ! ffenc_mjpeg ! ffmux_mp4 ! filesink location=video.mp4
// The --eos-on-shutdown option sends an end-of-stream event when the pipeline is interrupted (e.g. with Ctrl+C) so that the file is closed correctly. The ffenc_mjpeg element is an MJPEG encoder, and ffmux_mp4 is an MP4 muxer. The filesink element stores the data from v4l2src in a file instead of displaying it through ximagesink; the file location can be any path.
In this case, the CPU usage is about 8% when recording camera video.
4). Video playback
Use the following pipeline to play the video recorded above
$ gst-launch filesrc location=video.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_mjpeg ! ffmpegcolorspace ! ximagesink
// The filesrc element declares that the video source data comes from a file rather than a video device such as a camera. The ffdec_mjpeg element is an MJPEG decoder.
In this case, since the recorded video is at the highest resolution of the camera, the CPU usage is around 95%.
5). Play video via HTTP
Use the following pipeline to play a specific URL video
$ gst-launch souphttpsrc location=http://upload.wikimedia.org/wikipedia/commons/4/4b/MS_Diana_genom_Bergs_slussar_16_maj_2014.webm ! matroskademux name=demux demux.video_00 ! queue ! ffdec_vp8 ! ffmpegcolorspace ! ximagesink
// The souphttpsrc element receives network data over HTTP. Unlike local playback, the location parameter is given the network address of the video file. The ffdec_vp8 element decodes the VP8 video carried in the WebM container.
In this case, the CPU usage is around 40%.
6). Stream camera video via TCP
Here is the configuration to stream the VF61 camera video to another host running Ubuntu Linux
VF61 IP = 192.168.0.8
Ubuntu IP = 192.168.0.7
Run the following pipeline on VF61
$ gst-launch v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! ffenc_mjpeg ! tcpserversink host=192.168.0.8 port=5000
// Note that tcpserversink listens on the VF61's own address (192.168.0.8); the Ubuntu client then connects to it.
Then run the following pipeline on Ubuntu to view the video stream
$ gst-launch tcpclientsrc host=192.168.0.8 port=5000 ! jpegdec ! autovideosink
A Logitech HD 720P camera is used here, and the CPU usage is around 65%.
Using D-Link IP Camera on VF61
1). Display camera video
Here we use a D-Link DCS-930L camera with its video stream set to average-quality JPEG, 320x240 resolution, and a 15 fps frame rate; IP = 192.168.0.200
Use the following pipeline to display camera video
$ gst-launch -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ximagesink
2). Video Recording
Use the following pipeline to record the video
$ gst-launch --eos-on-shutdown -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ffenc_mjpeg ! ffmux_mp4 ! filesink location=stream.mp4
In this case, the CPU usage is around 40%.
3). Stream video to another IP address via TCP
Here the IP camera video is streamed to the VF61 and then forwarded to another host running Ubuntu Linux.
Ubuntu IP = 192.168.0.12
Run the following pipeline on VF61
$ gst-launch --eos-on-shutdown -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ffenc_mjpeg ! tcpserversink host=192.168.0.8 port=5000
// As before, tcpserversink listens on the VF61's own address.
Then run the following pipeline on Ubuntu to view the video stream
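The original text stops short of the client-side command; mirroring the earlier USB-camera TCP example, a pipeline along these lines should display the stream (assuming the VF61 still has IP 192.168.0.8):
$ gst-launch tcpclientsrc host=192.168.0.8 port=5000 ! jpegdec ! autovideosink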