Video Acquisition and Transmission System Based on ARM9

Introduction

With the development of multimedia technology and broadband network transmission, video acquisition and transmission systems, a core technology in remote video monitoring, video conferencing and industrial automation, have advanced rapidly in recent years. The system proposed in this paper is built on a new-generation embedded development platform based on the ARM926EJ-S microprocessor core, and combines the widely adopted MPEG-4 compression codec with streaming media transmission technology. The resulting system offers strong real-time performance, interactivity and portability.

1 System Development Hardware Platform

The development platform used in this design is the MC9328MX21 from the Freescale i.MX family. As the core microprocessor of the entire system, its ARM926EJ-S core provides accelerated Java support and highly integrated on-chip function modules, including an image acceleration module, an LCD controller, a USB controller, a CMOS sensor interface (CSI) and synchronous serial ports, giving developers a rich set of peripheral interfaces for multimedia application development. The core board integrates 64 MB of SDRAM and 16 MB of Flash memory, and the baseboard provides four 4-wire RS-232 serial ports, one 10/100 Mbit/s adaptive Ethernet interface, and audio and video acquisition hardware, which facilitates cross-compilation and multimedia data processing during system development.

The key acquisition peripheral of the system is the CMOS image sensor OV9640, which has the advantages of low power consumption, small size and high integration compared with traditional CCD image sensors. The OV9640 supports multiple resolutions such as VGA, QVGA and CIF, and three data formats, YCrCb 4:2:2, GRB 4:2:2 and raw RGB, at an image frame rate of up to 30 frames/s. The sensor's image data is read in through the CSI module and then transferred over a dedicated bus to the PRP (eMMA pre-processor), where the image is resized and converted into a suitable color space. The PRP output is split into two channels: channel 1 outputs RGB565 data for the LCD display, and channel 2 outputs YUV420 data for MPEG or JPEG encoding.

In the connection between the sensor and the processor, the ports serve the following functions: the CSI port transfers image data, the I2C port configures the sensor, and GPIO lines control the sensor.

The CSI module of the MC9328MX21 has an 8-bit input port; if the sensor outputs samples wider than 8 bits, the data must be transferred over several cycles. The image sensor is controlled as an I2C slave: the low-level protocol is I2C, while the higher-level register protocol is defined by the sensor. Here, the master clock of the image sensor is provided by the MC9328MX21. A minimal sketch of configuring a sensor register from user space follows.
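
The following sketch illustrates this control path using the standard Linux i2c-dev interface. The bus device, the 7-bit slave address (0x30 is the usual SCCB address of OmniVision sensors) and the register/value pair are assumptions for illustration only:

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/i2c-dev.h>

    int configure_sensor(void)
    {
        int fd = open("/dev/i2c-0", O_RDWR);    /* I2C bus the sensor hangs on */
        if (fd < 0)
            return -1;
        ioctl(fd, I2C_SLAVE, 0x30);             /* address the sensor as a slave */
        unsigned char cmd[2] = { 0x12, 0x80 };  /* hypothetical: COM7 register, software reset */
        int ret = (write(fd, cmd, 2) == 2) ? 0 : -1;
        close(fd);
        return ret;
    }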

2 System Software Design

2.1 Building a Cross-Compilation Environment

Since the embedded system uses the open-source ARM-Linux kernel on an ARM microprocessor, and the development board does not have enough resources to run development and debugging tools, a cross-compilation and debugging environment must be set up first. First, binutils is compiled with target=arm-linux to produce the binary utilities for the ARM platform, including tools such as ld, ar and as for generating and processing binary files. Next, GCC (GNU Compiler Collection) is compiled, which supports several high-level languages such as C and C++. Note that building GCC requires the ARM-Linux kernel header files, so the kernel must first be configured with #make menuconfig ARCH=arm to generate the headers for the ARM kernel; the required headers can then be passed to the GCC build through the --with-headers configure option. Finally, the C library glibc, which every dynamically linked user-level application needs, is compiled; it is important to enable the --enable-add-ons option, which builds glibc's add-on packages, because linuxthreads is needed. With these three steps, a cross-compilation environment for embedded ARM-Linux is in place; a sketch follows.
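
A minimal sketch of the three-step bootstrap described above, assuming an install prefix of /usr/local/arm; the package versions and paths are illustrative:

    # 1) binutils: ld, as, ar, ... for the arm-linux target
    ../binutils-2.x/configure --target=arm-linux --prefix=/usr/local/arm
    make && make install

    # 2) gcc, pointing --with-headers at the ARM-configured kernel headers
    ../gcc-2.x/configure --target=arm-linux --prefix=/usr/local/arm \
        --enable-languages=c,c++ --with-headers=/path/to/arm-linux-kernel/include
    make && make install

    # 3) glibc, cross-built with the new compiler, with the linuxthreads add-on
    CC=arm-linux-gcc ../glibc-2.x/configure --host=arm-linux \
        --prefix=/usr/local/arm/arm-linux --enable-add-ons=linuxthreads
    make && make install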

2.2 Compiling the Kernel and Creating the File System

Kernel options are configured through make menuconfig. Key settings include selecting ARM926T under System Type and turning on the I-Cache and D-Cache, and, because the application software relies on frame buffer technology, enabling Frame-buffer support under Console drivers. Then make boot compiles the customized kernel image file Image. The prepared kernel and file system are burned into the Flash memory of the development board through the host's TFTP service. This completes an operating system that runs independently on the board; the commands are sketched below.
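
An illustrative command sequence for these steps (the exact build target and download procedure depend on the board support package):

    make menuconfig ARCH=arm   # System Type: ARM926T, I-Cache/D-Cache on;
                               # Console drivers: Frame-buffer support on
    make boot                  # builds the customized kernel image (Image)
    # then download Image and the root file system into the board's Flash
    # over the host's TFTP service, using the board's boot loader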

2.3 Implementation of Video Data Acquisition, Encoding and Transmission

This part of the work is the core of the entire design. MPEG-4 officially became an international standard in early 1999. Compared with previous standards, it pays more attention to the interactivity and flexibility of multimedia systems, and mainly targets multimedia applications such as very-low-bit-rate coding for video conferencing and videophones. At present, MPEG-4 encoding and decoding in embedded systems are mostly implemented with dedicated chips, in a way similar to the hardware implementations of MPEG-1 and MPEG-2: the encoding algorithm is fixed in the chip's hardware circuits. This brings the following disadvantages:

a) Low cost-performance ratio. Because MPEG-4 encoding technology was still evolving and no mature algorithm supported it, the MPEG-4 encoding chips on the market are modified and simplified versions of the standard. They have no obvious performance advantage over H.263 and other encoding chips, so their cost-performance ratio is not high.

b) Poor portability. Since each manufacturer adds its own improvements and optimizations to the encoding algorithm when it is fixed in silicon, a matching dedicated decoder must be used at the decoding end, which leads to compatibility problems.

c) Poor scalability. As research on the MPEG-4 codec standard continues, many new algorithms and improvements to existing ones are bound to be proposed. However, existing MPEG-4 encoding chips freeze the current algorithms in hardware, so the algorithms cannot easily be modified or extended on the chip.

Therefore, this system implements encoding and decoding mainly in software. Software codecs in embedded systems can make up for many of the deficiencies of hardware codecs and make it convenient to study and improve the algorithms themselves. However, several issues must be considered: first, since the MPEG-4 encoding algorithm is computationally complex and the resources of an embedded system are limited, the computing power of the chosen microprocessor must be taken into account; second, the interface between the encoding software and the acquisition hardware must target different acquisition devices, so a fair amount of assembly-level optimization is needed.

FFMPEG is a complete solution for the collection, recording, encoding and streaming of audio and video data. The project includes the following components:

a) ffmpeg is a command-line tool for converting audio and video file formats; it also supports real-time capture and encoding from a TV card.

b) ffserver can stream multimedia data via HTTP/RTSP.

c) ffplay is a simple player based on the FFMPEG libraries and SDL.

d) libavcodec contains all of FFMPEG's audio and video codecs, and libavformat contains the parsers and generators for all supported audio/video container formats.

The FFMPEG library supports a wide range of encoding and decoding formats at high speed; it can capture from a specified audio or video device, process the data source in real time and save it. Command-line parameters select the video codec and set the frame rate, frame size and bit rate of the conversion, as well as the size of the rate-control buffer. Advanced video options further control the encoding, including forcing intra-frame coding, setting the video quantization scale, setting the number of P-frames, the QP factor and offset between B- and I-frames, choosing the motion-estimation and DCT/IDCT algorithms, using B-frames and motion vectors, and enabling interlaced encoding. The capture device can also be chosen through a parameter, for example /dev/video0 or a dedicated DV1394 channel. A representative invocation is sketched below.
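
For example, a command line of roughly the following shape captures from a Video4Linux device and encodes MPEG-4 at CIF size, 20 frames/s and 500 kbit/s. Option names and bit-rate units vary between FFMPEG versions (older builds selected the grab device with -vd), so this is only an illustration:

    ./ffmpeg -f video4linux -s 352x288 -r 20 -i /dev/video0 \
             -vcodec mpeg4 -b 500k -bf 2 out.avi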

The FFMPEG library runs on multiple platforms, including Linux, Windows and Mac OS. For the embedded side, ARM-Linux is chosen because embedded Linux is fully open source, highly portable and has good network support, and it supports the ARM9-architecture CPU selected here. FFMPEG, however, is developed primarily for the x86 architecture of general-purpose PCs, so it must be ported to the ARM9 system: first, it has to be cross-compiled into a library that runs under ARM-Linux. The specific steps are as follows.

Unpack the latest FFMPEG source package, which creates the FFMPEG directory. Then adapt the configure script to the cross-toolchain of the target system and run it to generate the Makefile; a possible invocation is sketched below.
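
With reasonably recent FFMPEG releases, the cross-toolchain can be passed to configure directly; the exact option names differ between versions, so the following is a sketch under assumed paths rather than a definitive command:

    ./configure --enable-cross-compile --cross-prefix=arm-linux- \
                --arch=arm --target-os=linux --prefix=/usr/local/arm/ffmpeg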

Then run make, which reads the generated Makefile and builds the FFMPEG library files and binary executables that can run on the ARM development board. After a successful build, the results can be installed onto the development board through the host's NFS service, after which you can change to the relevant directory and test whether the cross-compiled FFMPEG works correctly:
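
A representative test invocation, assuming a raw YUV file named cat.yuv and an explicit frame-size flag (raw YUV input needs one); both are illustrative:

    ./ffmpeg -i cat.wav -s 352x288 -i cat.yuv cat.mpg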

The audio file cat.wav and the raw YUV video file are encoded to produce cat.mpg; if no input data files are given, ffmpeg falls back to the audio and video capture devices. Either result shows that the cross-compiled FFMPEG library runs correctly.

2.4 Key Technologies of the Video Acquisition and Encoding Program

The video capture program mainly uses two FFMPEG libraries, libavformat and libavcodec. Most video file formats only define how encoded audio and video streams are combined into a single file, without mandating particular codecs. The libavformat library parses the container syntax of a video file and separates out the raw audio and video streams; the libavcodec library then encodes or decodes those raw streams according to their stream formats.

When using the libavformat/libavcodec functions to process video files, the library is first initialized by calling av_register_all(). This function registers all file formats and codecs the library supports, so when a file is read, the matching format and codec are selected automatically. The video file is then opened through the av_open_input_file() function:
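
A sketch of this sequence with the old (pre-0.5) libavformat API named in the text; filename is an assumed variable standing for the input file or capture device:

    AVFormatContext *pFormatCtx;

    av_register_all();                       /* register all formats and codecs */

    /* last three arguments: forced format, buffer size, format parameters */
    if (av_open_input_file(&pFormatCtx, filename, NULL, 0, NULL) != 0)
        return -1;                           /* could not open the input */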

The last three parameters of this function specify the file format, the buffer size and the format parameters; here NULL and 0 are passed so that libavformat auto-detects the format and uses the default buffer size. Next, the stream information is read:
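
In the old API this is presumably a single call:

    if (av_find_stream_info(pFormatCtx) < 0)
        return -1;                           /* fills in pFormatCtx->streams[] */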

This call fills in the stream array of the AVFormatContext. The first video stream is then located with a loop:
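
A sketch of that loop (the codec field of AVStream changed from an embedded struct to a pointer across FFMPEG versions; the pointer form is shown):

    int i, videoStream = -1;
    for (i = 0; i < pFormatCtx->nb_streams; i++)
        if (pFormatCtx->streams[i]->codec->codec_type == CODEC_TYPE_VIDEO) {
            videoStream = i;                 /* remember the first video stream */
            break;
        }
    if (videoStream == -1)
        return -1;                           /* no video stream found */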

With the stream found, its codec context can be read to identify the required codec, which is then looked up and opened:
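
Again a sketch with the old API (avcodec_open() and avcodec_alloc_frame() were later replaced by avcodec_open2() and av_frame_alloc()):

    AVCodecContext *pCodecCtx = pFormatCtx->streams[videoStream]->codec;
    AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
    if (pCodec == NULL)
        return -1;                           /* codec not compiled in */

    if (pCodec->capabilities & CODEC_CAP_TRUNCATED)
        pCodecCtx->flags |= CODEC_FLAG_TRUNCATED;   /* see the note below */

    if (avcodec_open(pCodecCtx, pCodec) < 0)
        return -1;

    AVFrame *pFrame = avcodec_alloc_frame(); /* buffer for decoded frames */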

CODEC_CAP_TRUNCATED indicates that the codec can handle truncated bitstreams, in which the amount of data per frame varies and frame boundaries do not necessarily coincide with packet boundaries. Setting the corresponding CODEC_FLAG_TRUNCATED flag tells the codec to expect such packets. Finally, the avcodec_alloc_frame() function is called to allocate the frame buffer.

The program then uses the libavformat functions to read data packets and filter out those that do not belong to the video stream, and repeatedly calls a helper built on the libavcodec API, GetNextFrame(AVFormatContext *pFormatCtx, AVCodecContext *pCodecCtx, int videoStream, AVFrame *pFrame), to obtain each frame of data for encoding and decoding.
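
GetNextFrame() is a helper rather than a library function; a condensed sketch, using the old decoding call avcodec_decode_video() (later avcodec_decode_video2()):

    static int GetNextFrame(AVFormatContext *pFormatCtx, AVCodecContext *pCodecCtx,
                            int videoStream, AVFrame *pFrame)
    {
        AVPacket packet;
        int frameFinished = 0;

        while (av_read_frame(pFormatCtx, &packet) >= 0) {
            if (packet.stream_index == videoStream) {   /* skip non-video packets */
                avcodec_decode_video(pCodecCtx, pFrame, &frameFinished,
                                     packet.data, packet.size);
                if (frameFinished) {
                    av_free_packet(&packet);
                    return 1;                           /* one complete frame decoded */
                }
            }
            av_free_packet(&packet);
        }
        return 0;                                       /* end of stream */
    }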

The video capture end uses a Video4Linux video device as the frame source. Video4Linux is the Linux API for audio and video capture; it exists in two versions, v4l and v4l2, and v4l is used here for programming. Under Linux, all external devices are treated as special files, called device files, so video images can be obtained through the v4l API with ordinary file operations such as open and ioctl, which initialize the hardware, set its properties and invoke its interrupts. After the capture device is opened, the ioctl(vd->fd, VIDIOCGCAP, &(vd->capability)) call retrieves the maximum image size the device can deliver and the number of signal-source channels, and ioctl(vd->fd, VIDIOCGPICT, &(vd->picture)) retrieves image properties such as brightness and contrast. Video4Linux offers two ways of getting video images, overlay and mmap; the mmap method is used here. It maps the device memory directly into the address space of the user process, so the process can control the device by reading and writing that memory directly, as sketched below. When the libavformat/libavcodec libraries capture frames from the Video4Linux device source, the av_open_input_file() function is called, so the device attribute configuration in that call must be adjusted to match the selected device.
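
A minimal sketch of the v4l (V4L1) capture path with the mmap method; the device path, frame size and pixel format are illustrative:

    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/videodev.h>          /* V4L1 API of kernels of that era */

    struct video_capability cap;
    struct video_picture    pic;
    struct video_mbuf       mbuf;
    struct video_mmap       vm;
    int frame = 0;

    int fd = open("/dev/video0", O_RDWR);
    ioctl(fd, VIDIOCGCAP,  &cap);        /* max frame size, channel count   */
    ioctl(fd, VIDIOCGPICT, &pic);        /* brightness, contrast, palette   */
    ioctl(fd, VIDIOCGMBUF, &mbuf);       /* size of the mmap-able area      */

    unsigned char *buf = mmap(NULL, mbuf.size, PROT_READ | PROT_WRITE,
                              MAP_SHARED, fd, 0);

    vm.frame  = frame;                   /* capture into buffer 0           */
    vm.width  = 320;
    vm.height = 240;
    vm.format = VIDEO_PALETTE_YUV420P;   /* matches the encoder's input     */

    ioctl(fd, VIDIOCMCAPTURE, &vm);      /* start grabbing one frame        */
    ioctl(fd, VIDIOCSYNC, &frame);       /* wait; image at buf + mbuf.offsets[0] */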

3 Test Results and Outlook

Here, the ffserver streaming media server component is used for transmission. First, the port number, transmission bandwidth, delay, stream file attributes and other server parameters are configured in the ffserver.conf file (a fragment is sketched below). Then ffserver is started and reads the configuration file, and the receiving end can open the server URL in WMP (Windows Media Player) to watch the captured video in real time. In tests, the embedded streaming media server reached a frame rate of 20 frames/s when transmitting MPEG-4 video, and the receiving end observed smooth, clear images.
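
An illustrative ffserver.conf fragment; the port, bandwidth and stream parameters are assumptions, not the configuration used in the tests:

    Port 8090                  # HTTP port clients connect to
    BindAddress 0.0.0.0
    MaxClients 10
    MaxBandwidth 1000          # total bandwidth, kbit/s

    <Feed feed1.ffm>           # feed filled by the capture-side ffmpeg
        File /tmp/feed1.ffm
        FileMaxSize 5M
    </Feed>

    <Stream test.mpg>          # clients open http://server:8090/test.mpg
        Feed feed1.ffm
        Format mpeg
        VideoSize 352x288
        VideoFrameRate 20
        VideoBitRate 500
        NoAudio
    </Stream>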

This paper presents an MPEG-4 streaming video acquisition and transmission system built on the MC9328MX21, an embedded platform with an ARM9-architecture core. By porting the libavformat/libavcodec libraries to the ARM-Linux operating system, the good portability of these libraries is combined with Video4Linux to capture and encode local video images and send the streaming media packets onto the network. The system offers good real-time performance, strong portability, low power consumption and suitability for remote mobile control; since its main functions are implemented in software, secondary development and upgrading are easy. Its range of applications and prospects are very broad.
