With the development of multimedia technology and broadband network transmission, video acquisition and transmission systems, a core technology in remote video surveillance, video conferencing and industrial automation, have developed rapidly in recent years. The system proposed in this paper is built on a new-generation embedded development platform based on the ARM926EJ-S microprocessor core and combines the internationally popular MPEG-4 compression codec with streaming media transmission technology. The system offers strong real-time performance, interactivity and portability.
1 System Development Hardware Platform
The development platform used in this design is the MC9328MX21 from the Freescale i.MX family. Its ARM926EJ-S core, the core microprocessor of the entire system, provides accelerated Java support and highly integrated on-chip function modules, including an image acceleration module, an LCD controller, a USB controller, a CMOS sensor interface (CSI) and synchronous serial ports, giving developers a rich set of peripheral interfaces for multimedia application development. The core board integrates 64 MB of SDRAM and 16 MB of Flash memory; the baseboard provides four four-wire RS-232 serial ports, one 10/100 Mbit/s adaptive Ethernet interface and audio/video acquisition devices, which facilitates cross-compilation and multimedia data processing during system design.
The key peripheral for video data acquisition is the OV9640 CMOS image sensor, which offers low power consumption, small size and high integration compared with traditional CCD image sensors. The OV9640 supports multiple resolutions such as VGA, QVGA and CIF, supports the YCrCb 4:2:2, GRB 4:2:2 and raw RGB data formats, and delivers an image frame rate of up to 30 frame/s. The sensor's image data is acquired through the CSI module and passed over a dedicated bus to the eMMA pre-processor (PRP), where the image is resized and converted to a suitable color space. The PRP output is split into two channels: channel 1 outputs RGB565 data for the LCD display, and channel 2 outputs YUV420 data for MPEG or JPEG encoding.
The CSI module of the MC9328MX21 has an 8-bit input port; when the sensor transmits more than 8 bits of data, the image sensor is usually controlled as a slave through the I2C port. The low-level protocol is I2C, while the higher-level register protocol is determined by the sensor. Here, the master clock of the image sensor is provided by the MC9328MX21.
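For illustration, a minimal user-space sketch of such an I2C register write follows. The adapter node /dev/i2c-0, the 7-bit slave address 0x30 and the register/value pair are assumptions for the OV9640, not values taken from this design.

```c
/* Hypothetical sketch: writing one sensor register over I2C from user space.
 * Device node, slave address and register values are illustrative. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    int fd = open("/dev/i2c-0", O_RDWR);   /* I2C adapter on the board */
    if (fd < 0) { perror("open"); return 1; }

    if (ioctl(fd, I2C_SLAVE, 0x30) < 0) {  /* assumed 7-bit sensor address */
        perror("ioctl");
        return 1;
    }

    unsigned char buf[2] = { 0x12, 0x80 }; /* register, value (illustrative) */
    if (write(fd, buf, 2) != 2)            /* one register write per transfer */
        perror("write");

    close(fd);
    return 0;
}
```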
2 System Software Design
2.1 Building a cross-compilation environment
Because the embedded system runs the open-source ARM-Linux kernel on an ARM microprocessor, and the development board lacks the resources to host development and debugging tools, a cross-compilation and debugging environment must be set up first. First, compile binutils with target=arm-linux to produce the binary utilities for the ARM platform, including tools such as ld, ar and as for generating and processing binary files. Next, compile GCC (GNU Compiler Collection), which supports several high-level languages such as C and C++. Note that building GCC requires the ARM-Linux kernel header files, so first configure the kernel with make menuconfig ARCH=arm to generate the ARM-specific headers; the required headers can then be passed to the GCC build through the --with-headers configure option. Finally, compile glibc, the C library that all dynamically linked user-level applications need. When configuring it, be sure to enable the --enable-add-ons option, which builds glibc's add-on packages; this is needed because linuxthreads is used. With that, a cross-compilation environment for embedded ARM-Linux has been built.
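As a sketch of this sequence, the main configure invocations might look as follows; version directories, the installation prefix and the header path are placeholders, not the exact ones used in this design.

```sh
# Illustrative arm-linux toolchain build (paths and versions assumed)
cd binutils-src
./configure --target=arm-linux --prefix=/usr/local/arm
make && make install

cd ../linux-src
make menuconfig ARCH=arm            # generate ARM-specific kernel headers

cd ../gcc-src
./configure --target=arm-linux --prefix=/usr/local/arm \
            --with-headers=/path/to/linux-src/include \
            --enable-languages=c,c++
make && make install

cd ../glibc-build
CC=arm-linux-gcc ../glibc-src/configure arm-linux \
            --prefix=/usr/local/arm/arm-linux --enable-add-ons
make && make install
```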
2.2 Compiling the kernel and creating the file system
Configure the kernel options with make menuconfig. Key settings include selecting the ARM926T CPU under System Type and turning on CPU idle, I-cache and D-cache. Because the application software relies on frame buffer support in the ARM-Linux kernel, Frame-buffer support must also be enabled under Console drivers. Then make boot compiles a customized kernel image file, Image. The prepared system kernel and file system are burned into the development board's Flash memory through the host's TFTP service, completing an operating system that runs independently on the board. The main steps are sketched below.
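A condensed form of these steps might look as follows; the TFTP command is issued from the board's bootloader, and its exact syntax and load address depend on the bootloader, so they are assumptions here.

```sh
make menuconfig ARCH=arm   # System Type: ARM926T, I-cache/D-cache on;
                           # Console drivers: Frame-buffer support on
make boot                  # produces the kernel image file "Image"

# on the board's bootloader console (load address illustrative):
tftp 0x08008000 Image      # fetch the image from the host's TFTP server
```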
2.3 Implementation of video data acquisition, encoding and transmission
This part of the work is the core of the entire design. MPEG-4 became an international standard in early 1999. Compared with earlier standards, it pays more attention to the interactivity and flexibility of multimedia systems, targeting applications such as very-low-bit-rate coding for video conferencing and videophones. At present, embedded systems mostly implement MPEG-4 encoding and decoding with dedicated chips, in much the same way MPEG-1 and MPEG-2 were implemented in hardware: the encoding algorithm is fixed in the chip's circuitry. This approach has the following disadvantages:
a) Low cost-performance ratio. Because MPEG-4 encoding technology is still evolving and no mature algorithm has settled, the MPEG-4 encoder chips on the market implement modified and simplified versions of the standard. They show no clear performance advantage over H.263 and other encoder chips, so their cost-performance ratio is not high.
b) Poor portability. Each manufacturer adds its own improvements and optimizations to the encoding algorithm when fixing it in silicon, so a matching dedicated decoder must be used at the decoding end, which leads to compatibility issues.
c) No scalability. As research on the MPEG-4 codec standard continues, many new algorithms and improvements to existing ones are bound to be proposed, but an encoder chip freezes the current algorithm in its hardware circuitry, so the algorithm cannot conveniently be modified or extended on the chip.
Therefore, this design implements encoding and decoding mainly in software. Software codecs on embedded systems avoid many of the shortcomings of hardware codecs and make it convenient to study and improve the algorithms themselves. Two issues must be considered, however. First, the MPEG-4 encoding algorithm is computationally demanding while embedded resources are limited, so the computing power of the platform's microprocessor must be taken into account. Second, the interface between the encoding software and the capture hardware must be adapted to different acquisition devices, which calls for extensive assembly-level optimization.
FFMPEG is a complete solution for capturing, recording, encoding and streaming audio and video data. The project includes the following components:
a) The ffmpeg command-line tool converts audio and video file formats and also supports real-time capture and encoding from a TV card.
b) ffserver streams multimedia data over HTTP/RTSP.
c) ffplay is a player built on the FFMPEG libraries and SDL.
d) libavcodec contains all of FFMPEG's audio and video codecs, and libavformat contains the parsing and generation libraries for all supported audio/video container formats.
The FFMPEG library supports a wide range of encoding and decoding formats, and its codecs are very fast. It can capture from a specified audio/video device, process the data source in real time and save it. Through command-line parameters, FFMPEG can select the video codec and, for format conversion, the frame rate, frame size, bit rate and the size of the rate-control buffer. Advanced video options give further control over the codec, including intra-frame coding, the video quantization scale, the QP factor and offset between P, B and I frames, motion estimation and DCT/IDCT algorithm selection, B-frames and motion vectors, and interlaced encoding. The capture device can also be chosen through parameters, for example /dev/video0 or a dedicated DV1394 channel.
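As an illustration, a capture-and-encode invocation of this kind might look like the following; the frame size, rate and bit rate are example values, and the option set varies with the FFMPEG version.

```sh
# grab from /dev/video0 and encode to MPEG-4 at 20 frame/s (values illustrative)
./ffmpeg -f video4linux -s 352x288 -r 20 -i /dev/video0 \
         -vcodec mpeg4 -b 500k out.avi
```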
The FFMPEG library runs on multiple platforms, including Linux, Windows and Mac OS. For the embedded system, ARM-Linux was chosen because embedded Linux is fully open source, highly portable and has good network support, and it supports the ARM9 CPU selected here. FFMPEG, however, is developed primarily for the x86 CPUs of general-purpose PCs, so it must be ported to the ARM9 architecture: first, it has to be cross-compiled into a library that runs under ARM-Linux. The specific steps are as follows.
Unpack the latest FFMPEG source package, which creates the FFMPEG directory. Then adapt the configure script to the cross-compilation toolchain of this system and run it to generate the Makefile.
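For example, the configuration might be invoked along these lines; option names differ between FFMPEG versions and the prefix is a placeholder, so treat this as a sketch.

```sh
./configure --prefix=/usr/local/ffmpeg-arm \
            --cross-prefix=arm-linux- --cc=arm-linux-gcc \
            --cpu=armv4l --enable-shared --disable-mmx
```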
Next, run make, which reads the generated Makefile and builds the FFMPEG library files and the binary executables that run on the ARM development board. After a successful build, they can be installed onto the development board through the host's NFS service, so you can change to the relevant directory and test whether the cross-compiled FFMPEG works normally:
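The original test command is not reproduced in the source; a plausible form, with the raw-video input options assumed, is:

```sh
# encode an audio file and a raw YUV sequence into one MPEG file (options assumed)
./ffmpeg -i cat.wav -f rawvideo -s 352x288 -i cat.yuv cat.mpg
```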
This encodes the audio file cat.wav and the raw YUV video file into cat.mpg. If no input data file is given, the audio and video capture devices are used instead; a correct run shows that the cross-compiled FFMPEG library works properly.
2.4 Key technologies of video acquisition and encoding program
The video capture program mainly uses two FFMPEG libraries, libavformat and libavcodec. Many video file formats only define how the audio and video streams are packed into a single file without specifying the codecs used. The libavformat library parses the container syntax of a video file and separates the raw audio and video streams from it; the libavcodec library handles the encoding and decoding of those raw streams according to their stream format.
When using the libavformat/libavcodec functions to process a video file, first initialize the libraries by calling av_register_all(). This function registers all file formats and codecs the libraries support, so the matching format or codec library is selected automatically when a file is read. The video file is then opened through the av_open_input_file function:
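In the legacy FFMPEG API used here, these two calls take roughly the following form (filename is a placeholder):

```c
AVFormatContext *pFormatCtx;

av_register_all();  /* register all supported formats and codecs */

/* open the file; NULL/0 let libavformat autodetect format and buffer size */
if (av_open_input_file(&pFormatCtx, filename, NULL, 0, NULL) != 0)
    return -1;  /* could not open the file */
```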
The last three parameters of this function specify the file format, buffer size and format parameters; passing NULL and 0 lets the libavformat library autodetect the format and use the default buffer size. Then the stream information of the file can be read:
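With the legacy API this is a single call:

```c
if (av_find_stream_info(pFormatCtx) < 0)
    return -1;  /* could not determine the stream information */
```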
This fills the stream fields of the AVFormatContext. The first video stream is then located with a loop:
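A loop of roughly this shape serves; note that in some FFMPEG versions codec is an embedded struct rather than a pointer, so the member access differs.

```c
int i, videoStream = -1;
for (i = 0; i < pFormatCtx->nb_streams; i++)
    if (pFormatCtx->streams[i]->codec->codec_type == CODEC_TYPE_VIDEO) {
        videoStream = i;  /* remember the first video stream */
        break;
    }
if (videoStream == -1)
    return -1;  /* no video stream found */
```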
Knowing the video stream, we can look up the codec it uses and open it:
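In sketch form, again against the legacy API:

```c
AVCodecContext *pCodecCtx = pFormatCtx->streams[videoStream]->codec;

AVCodec *pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
if (pCodec == NULL)
    return -1;  /* codec not found */

/* inform the codec that we may feed truncated packets (see below) */
if (pCodec->capabilities & CODEC_CAP_TRUNCATED)
    pCodecCtx->flags |= CODEC_FLAG_TRUNCATED;

if (avcodec_open(pCodecCtx, pCodec) < 0)
    return -1;  /* could not open codec */

AVFrame *pFrame = avcodec_alloc_frame();  /* allocate the frame buffer */
```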
CODEC_CAP_TRUNCATED indicates that the codec can handle packets whose boundaries do not coincide with frame boundaries, which happens when a video stream is split into packets of varying size; setting the corresponding CODEC_FLAG_TRUNCATED flag tells the codec to expect such input. Finally, the avcodec_alloc_frame() function is called to allocate the frame buffer.
At the encoding end, the libavformat functions read these data packets, unneeded non-video data is filtered out, and the libavcodec helper GetNextFrame(AVFormatContext *pFormatCtx, AVCodecContext *pCodecCtx, int videoStream, AVFrame *pFrame) is called in a loop to process each frame for encoding and decoding.
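This tutorial-style GetNextFrame() can be sketched as follows, again against the legacy API, with error handling trimmed:

```c
#include <stdbool.h>

/* Decode packets until one complete video frame is available in pFrame. */
static bool GetNextFrame(AVFormatContext *pFormatCtx, AVCodecContext *pCodecCtx,
                         int videoStream, AVFrame *pFrame)
{
    AVPacket packet;
    int frameFinished = 0;

    while (av_read_frame(pFormatCtx, &packet) >= 0) {
        if (packet.stream_index == videoStream) {   /* skip non-video packets */
            avcodec_decode_video(pCodecCtx, pFrame, &frameFinished,
                                 packet.data, packet.size);
            if (frameFinished) {                    /* a whole frame is ready */
                av_free_packet(&packet);
                return true;
            }
        }
        av_free_packet(&packet);
    }
    return false;  /* end of stream */
}
```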
The video capture end uses a Video4Linux device as the video source. Video4Linux is the API for audio and video capture under Linux; it exists in two versions, v4l and v4l2, and v4l is used for programming here. Under Linux, every peripheral is treated as a special file, called a device file, so the v4l API obtains video images through calls such as open and ioctl, initializing the hardware, setting its properties and invoking it like an ordinary file. After the capture device is opened, the VIDIOCGCAP control command, ioctl(vd->fd, VIDIOCGCAP, &(vd->capability)), retrieves the maximum image size the device can deliver and the number of signal-source channels, and VIDIOCGPICT, ioctl(vd->fd, VIDIOCGPICT, &(vd->picture)), retrieves image properties such as brightness and contrast.

Video4Linux provides two ways to obtain video images, overlay and mmap; the mmap method is used here. It maps the device memory directly into the address space of the user process, so the process can read and write that memory directly to control the device. When the libavformat/libavcodec libraries capture video frames from a Video4Linux device, the av_open_input_file() function must be called, and the device attribute configuration in this call has to be modified to match the selected device.
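A minimal v4l (v4l1) mmap capture sketch is shown below; the device path, frame size and pixel palette are illustrative assumptions.

```c
/* Minimal v4l1 mmap capture sketch; values illustrative. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev.h>  /* legacy v4l1 API */

int main(void)
{
    struct video_capability capability;
    struct video_picture picture;
    struct video_mbuf mbuf;
    struct video_mmap vmmap;
    unsigned char *map;

    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    ioctl(fd, VIDIOCGCAP, &capability);  /* max image size, channel count */
    ioctl(fd, VIDIOCGPICT, &picture);    /* brightness, contrast, palette */
    ioctl(fd, VIDIOCGMBUF, &mbuf);       /* size/offsets of the mmap buffer */

    map = mmap(0, mbuf.size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);

    vmmap.frame  = 0;
    vmmap.width  = 352;                  /* CIF, illustrative */
    vmmap.height = 288;
    vmmap.format = VIDEO_PALETTE_YUV420P;

    ioctl(fd, VIDIOCMCAPTURE, &vmmap);   /* start capturing frame 0 */
    ioctl(fd, VIDIOCSYNC, &vmmap.frame); /* block until the frame is filled */

    /* the frame data now starts at map + mbuf.offsets[0] */
    munmap(map, mbuf.size);
    close(fd);
    return 0;
}
```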
3 Test Results and Outlook
Here, the ffserver streaming server component implements the streaming media transmission. First, configure the server host's port number, transmission bandwidth, delay, stream file attributes and related information in the ffserver.conf file. Then start ffserver so that it reads the configuration file; on the receiving end, entering the server URL in WMP (Windows Media Player) displays the video captured in real time. In tests, the embedded streaming media server reached a frame rate of 20 frame/s when transmitting MPEG-4 video, and the receiving end observed smooth, clear images.
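For reference, a minimal ffserver.conf of the kind described might look like this; the port, bandwidth and stream parameters are example values, not the ones used in the test.

```
Port 8090
BindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 1000            # kbit/s

<Feed feed1.ffm>             # buffer fed by the capturing ffmpeg instance
    File /tmp/feed1.ffm
    FileMaxSize 5M
</Feed>

<Stream test.mpg>            # URL clients open, e.g. http://host:8090/test.mpg
    Feed feed1.ffm
    Format mpeg
    VideoFrameRate 20
    VideoSize 352x288
    NoAudio
</Stream>
```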
This paper presents an MPEG-4 streaming video acquisition and transmission system based on an embedded platform built around the ARM9-architecture MC9328MX21. By porting the libavformat/libavcodec libraries to the ARM-Linux operating system, the design exploits the portability of these libraries together with Video4Linux to capture and encode local video images and send streaming media packets onto the network. The system offers good real-time performance, strong portability, low power consumption and the remote mobile control characteristic of embedded systems; since the main functions are implemented in software, secondary development and upgrades are straightforward. Its range of applications and prospects are broad.