How to use camera in embedded Linux system

Publisher: PositiveVibes | Last updated: 2020-10-10 | Source: elecfans | Keywords: Embedded

  Introduction

  This article walks through an example of using a camera on an embedded Linux system built around an ARM computer-on-module. The module used is the Toradex VF61, a very cost-effective module without hardware video codec acceleration. Its core processor is the NXP/Freescale Vybrid, a heterogeneous dual-core architecture combining a Cortex-A5 and a Cortex-M4.


  1). More and more embedded systems now include camera applications, mainly in the following forms:

  Remote monitoring: in closed-circuit television systems, operators use cameras to watch a specific area remotely, from a single residential community up to municipal public spaces.

  Surveillance recording: some surveillance systems are not watched by an operator at all times; instead they record video so that the relevant footage can be retrieved for review when needed.

  Embedded vision systems: these process video images to extract higher-level information, for example in radar and urban intelligent transportation applications.

  Video sensors: for example, clinical diagnostic equipment analyzes captured video to aid diagnosis, and smart retail devices analyze shopper characteristics from video for targeted promotion and sales.


  2). Environment configuration

  ./ ARM embedded module: Toradex VF61 on a Colibri Evaluation Board; the detailed configuration manual can be found here

  ./ Cameras:

  Logitech HD 720p USB webcam

  D-Link DCS-930L IP camera

  ./ Software:

  Toradex standard embedded Linux distribution V2.4 (pre-installed), see here for details

  The GStreamer framework is widely used for developing multimedia applications such as video editing, media streaming, and media playback. Through its plug-ins (input and output elements, filters, codecs, and so on), it supports many different media libraries and formats, such as MP3 and FFmpeg. Install the required packages as follows:

  $ opkg update

  $ opkg install gst-plugins-base-meta gst-plugins-good-meta gst-ffmpeg

  To list the currently installed plugins and elements:

  $ gst-inspect

  Introduction to GStreamer elements and pipelines

  According to chapter 3 of the GStreamer Application Development Manual, the element is the most important class of object in GStreamer: elements read data, decode it, display it, and so on. A pipeline is a chain of elements linked together to perform a specific task, such as video playback or capture. By default GStreamer ships with a large collection of elements, which makes it easy to develop a wide variety of multimedia applications. In this article we use several pipelines to demonstrate some of these elements.


  The following figure shows an example of a basic pipeline for Ogg playback, using a demuxer and two branches, one for audio and one for video. Note that some elements have only source pads, while others have only sink pads, or both.
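  On the command line, such a pipeline can be built with gst-launch using a named demuxer and two branches. The following is only a sketch, assuming a local Ogg file named music.ogg that contains Theora video and Vorbis audio (the file name is just an example):

  $ gst-launch filesrc location=music.ogg ! oggdemux name=demux demux. ! queue ! vorbisdec ! audioconvert ! autoaudiosink demux. ! queue ! theoradec ! ffmpegcolorspace ! autovideosink

  Each "demux." reference requests a new pad from the oggdemux element, and the queue elements decouple the audio and video branches.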

  

  Before constructing a pipeline, we also need to use the "gst-inspect" command to check that the required plugins are available and what capabilities they offer. The following example inspects the ffmpegcolorspace element.

  $ gst-inspect ffmpegcolorspace

  Basic information description

  -----------------------------------------------------------

  Factory Details:

  Long name: FFMPEG Colorspace converter

  Class: Filter/Converter/Video

  Description : Converts video from one colorspace to another

  Author(s): GStreamer maintainers gstreamer-devel@lists.sourceforge.net

  -----------------------------------------------------------

  Source and sink pad descriptions

  -----------------------------------------------------------

  SRC template: 'src'

  Availability: Always

  Capabilities:

  video/x-raw-yuv

  video/x-raw-rgb

  video/x-raw-gray

  SINK template: 'sink'

  Availability: Always

  Capabilities:

  video/x-raw-yuv

  video/x-raw-rgb

  video/x-raw-gray

  -----------------------------------------------------------

  Two more examples: the v4l2src element has only a source pad, so it can feed a video stream into another element, while the ximagesink element has only a sink pad, which accepts RGB video. For more details about this topic, please see here.
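  This can be checked directly with gst-inspect; the "Pad Templates" section of the output lists the pads each element provides and the formats they accept, for example:

  $ gst-inspect v4l2src

  $ gst-inspect ximagesink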

  Display a video test pattern

  Use the following pipeline to display a video test pattern

  $ gst-launch videotestsrc ! autovideosink

  

  The autovideosink element automatically detects a suitable video output, and the videotestsrc element has a "pattern" property that selects between various test patterns, such as the snow pattern below.

  $ gst-launch videotestsrc pattern=snow ! autovideosink
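  The full list of available patterns is printed in the "pattern" property description of the element:

  $ gst-inspect videotestsrc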

  

  USB Camera

  1). Display video from USB camera

  After a camera is connected to the system, a corresponding device node /dev/videox appears, where x is 0, 1, 2, and so on depending on how many cameras are connected.
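  To see which device nodes are present, list them directly; if the v4l-utils package is available in the feed and installed (it is not part of the packages above), v4l2-ctl can also print the name of each device:

  $ ls /dev/video*

  $ v4l2-ctl --list-devices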

  Please use the following pipeline to display the corresponding camera video in full screen

  $ gst-launch v4l2src device=/dev/videox ! ffmpegcolorspace ! ximagesink

  

  // The Video4Linux2 plugin provides an API and driver framework for capturing and playing back video, and it supports a wide range of USB cameras and other devices. The v4l2src element belongs to this plugin and reads video frames from a Video4Linux2 device, in this case the USB camera. The ffmpegcolorspace element is a filter that converts between color formats: camera devices usually deliver YUV data, while the display normally uses RGB. The ximagesink element is the standard video sink for the X desktop.

  In this case, the "top" command shows a CPU usage of about 77.9%.

  In addition, caps such as size and frame rate can be set to control the display. For example, the following pipeline limits the video to 320x240, and the CPU usage drops to about 28.2%.

  $ gst-launch v4l2src device=/dev/videox ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink
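  Not every camera supports every combination of size and frame rate. To confirm which caps were actually negotiated, the same pipeline can be run with the -v option, which prints the negotiated caps of every pad:

  $ gst-launch -v v4l2src device=/dev/videox ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink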

  2). Display two USB cameras at the same time

  Use the following pipeline to display two cameras at the same time. Here we use a Logitech HD 720P camera and another ordinary MJPEG camera. In this case, the CPU usage is 64.8%.

  $ gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink v4l2src device=/dev/video1 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink

  

  3). Record USB camera video

  Use the following pipeline to record camera video in MP4 format

  $ gst-launch --eos-on-shutdown v4l2src device=/dev/videox ! ffenc_mjpeg ! ffmux_mp4 ! filesink location=video.mp4

  // The --eos-on-shutdown option makes gst-launch send an end-of-stream event on shutdown so that the file is closed correctly. The ffenc_mjpeg element is an MJPEG encoder and ffmux_mp4 is an MP4 muxer. The filesink element stores the data coming from v4l2src in a file instead of displaying it with ximagesink; the file location can be chosen freely.

  In this case, the CPU usage is about 8% when recording camera video.
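  As an alternative to stopping the recording by hand, the num-buffers property of v4l2src can be used to capture a fixed number of frames and then end the stream cleanly. A sketch recording roughly 10 seconds at 30 fps into an example file name:

  $ gst-launch v4l2src device=/dev/videox num-buffers=300 ! ffenc_mjpeg ! ffmux_mp4 ! filesink location=video_10s.mp4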

  4). Video playback

  Use the following pipeline to play the video recorded above

  $ gst-launch filesrc location=video.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_mjpeg ! ffmpegcolorspace ! ximagesink

  // The filesrc element reads the video data from a file rather than from a video device such as a camera. The ffdec_mjpeg element is an MJPEG decoder.

  In this case, since the recorded video is at the highest resolution of the camera, the CPU usage is around 95%.
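  For simple playback, the playbin2 element from gst-plugins-base can also be used; it builds the demuxing and decoding chain automatically from a URI. A minimal sketch, assuming the recording was saved as /home/root/video.mp4 (the path is only an example):

  $ gst-launch playbin2 uri=file:///home/root/video.mp4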

  5). Play video via HTTP

  Use the following pipeline to play a video from a specific URL

  $ gst-launch souphttpsrc location=http://upload.wikimedia.org/wikipedia/commons/4/4b/MS_Diana_genom_Bergs_slussar_16_maj_2014.webm ! matroskademux name=demux demux.video_00 ! queue ! ffdec_vp8 ! ffmpegcolorspace ! ximagesink

  // The souphttpsrc element receives data over HTTP. Unlike local playback, the location property is set to the network address of the video file. The ffdec_vp8 element decodes the VP8 video inside the WebM container.

  In this case, the CPU usage is around 40%.

  6). Stream camera video via TCP

  Here is the configuration to stream the VF61 camera video to another host running Ubuntu Linux

  VF61 IP = 192.168.0.8

  Ubuntu IP = 192.168.0.7

  Run the following pipeline on VF61

  $ gst-launch v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! ffenc_mjpeg ! tcpserversink host=192.168.0.8 port=5000

  Then run the following pipeline on Ubuntu to view the video stream

  $ gst-launch tcpclientsrc host=192.168.0.8 port=5000 ! jpegdec ! autovideosink

  A Logitech HD 720P camera is used here, and the CPU usage is around 65%.
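  As an alternative to TCP, the video can also be sent as an RTP/JPEG stream over UDP. The following is only a sketch of that option, using the jpegenc, rtpjpegpay/rtpjpegdepay and udpsrc/udpsink elements from gst-plugins-good and the same addresses as above; note that with udpsink the host parameter is the destination address. On the VF61:

  $ gst-launch v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! jpegenc ! rtpjpegpay ! udpsink host=192.168.0.7 port=5000

  And on the Ubuntu host:

  $ gst-launch udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG,payload=26" ! rtpjpegdepay ! jpegdec ! autovideosink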

  Using D-Link IP Camera on VF61

  1). Display camera video

  Here we use a D-Link DCS-930L camera with the video stream set to average-quality JPEG, 320x240 resolution and a 15 fps frame rate, IP = 192.168.0.200

  Use the following pipeline to display camera video

  $ gst-launch -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ximagesink

  

  2). Video Recording

  Use the following pipeline to record the video

  $ gst-launch --eos-on-shutdown -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ffenc_mjpeg ! ffmux_mp4 ! filesink location=stream.mp4

  In this case, the CPU usage is around 40%.

  3). Stream video to another IP address via TCP

  Here the IP camera video is received on the VF61 and then streamed on to another host running Ubuntu Linux

  Ubuntu IP = 192.168.0.12

  Run the following pipeline on VF61

  $ gst-launch --eos-on-shutdown -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ffenc_mjpeg ! tcpserversink host=192.168.0.8 port=5000

  Then run the following pipeline on Ubuntu to view the video stream
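  By analogy with the USB camera TCP example above, the client pipeline on the Ubuntu host would look roughly like the following sketch (assuming the VF61 still has the address 192.168.0.8 and the stream is MJPEG-encoded, as above):

  $ gst-launch tcpclientsrc host=192.168.0.8 port=5000 ! jpegdec ! autovideosink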
