Design of Embedded Network Video Server Based on ADSP-BF532

Publisher: 心愿成真 | Published: 2009-08-11 | Source: 电子技术应用 (Application of Electronic Technique)

Audio and video technology is now widely used in many areas of work and daily life, which in turn demands efficient transmission platforms and flexible access and processing methods. Multimedia monitoring systems have gradually become one of the important technical means of modern management, detection and control, because they can reflect the state of the monitored objects in real time and vividly. One of the main hot spots in network video technology is the embedded digital video monitoring system.

In the research of embedded remote video monitoring systems, foreign work started earlier and is at the leading level: there are mature embedded network video products that use MPEG or wavelet compression, and their performance is generally good, but they are expensive. Domestic research in this area is still in its infancy. With the development of digital technology, the improvement of image compression coding techniques and standards, and the continuous decline in chip costs, more and more organizations are taking up this research.

This system uses the ADSP-BF532 DSP, launched by ADI (USA) in April 2004, to implement an embedded network video server for building a high-reliability video monitoring system. With ADI's new MSA (Micro Signal Architecture), the ADSP-BF532 overcomes the inability of traditional DSP and RISC controllers, based on conventional architectures, to meet the breadth, flexibility and diversity of video applications, and fully satisfies the requirements of real-time multimedia digital signal processing. Compared with traditional multimedia monitoring systems, the embedded network video server designed in this paper is small, low-cost, highly stable and offers good real-time performance, giving it strong practical application value.


1 Principle of Embedded Network Video Server

An embedded video server is an embedded device that provides network video transmission and sharing. It adopts an integrated embedded structure and a real-time software platform, and combines functions such as multi-channel video acquisition and network transmission. It collects, compresses and multiplexes the video and audio signals, converts them into IP packets, and uses suitable network protocols to transmit the compressed audio and video streams over the network in real time, so that users can obtain live image and sound from the monitored site wherever they are. In addition, through the network server embedded in the video server, remote configuration of the device, as well as remote control and status acquisition of other auxiliary equipment, can be achieved.

Since the video compression and network functions are concentrated in one small device, it can be connected directly to the local area network, achieving plug-and-play, eliminating complex cabling and making installation easy (only an IP address needs to be set). Users do not need to install any hardware; they can simply watch with a browser, or write a control program on the host and browse through its human-machine interface.

2 System Hardware Design

The main hardware functional modules of the embedded network video server based on the ADSP-BF532 are: the video input module, the ADSP-BF532 core processor, the video output module, the external memory module, the emulation/debug (JTAG) interface module and the power module.

Figure 1 is a structural diagram of the system. The design is as follows. Front-end video acquisition is implemented by the ADV7183; the video data is captured in YUV 4:2:2 planar format and stored directly in SDRAM through the PPI interface of the ADSP-BF532. The ADV7183 sends data to SDRAM through the PPI automatically under its own clock control; when a field of data has been acquired, the DMA controller generates an interrupt, and the video processing configured for the application is completed in the DMA interrupt service routine. Audio acquisition is implemented by the AD1836 and its peripheral circuits: the format is mono, 8,000 samples/s with 16-bit quantization per sample, and the collected data is likewise stored directly in SDRAM through the ADSP-BF532's PPI interface; when the buffer set aside for audio data is full, the DMA controller generates an interrupt and the audio data is processed in the corresponding interrupt service routine. The ADSP-BF532 sends the collected video and audio data to the MPC860 through the SPI interface for processing; the MPC860 connects to the physical network through its MII interface and forwards the data to a PC for real-time monitoring and listening. At the same time, the collected audio and video data is processed in real time, compressed and encoded, then sent to the PC and stored on its hard disk. The system also provides a JTAG interface for use during debugging. This paper mainly studies the processing of video data in the embedded video server and does not cover the audio data.

3. System software design and optimization

The system is implemented in VisualDSP++ 3.1, the software development environment for the Blackfin family, and the software is divided mainly into video peripheral programming, video encoding, video network transmission and system optimization.

3.1 System peripheral software design

Video input device ADV7183: the ADSP-BF532 configures the ADV7183's image brightness, contrast, chroma and saturation over the I2C bus; the ADV7183's internal control registers are accessed through its I2C interface.
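As an illustration only, a minimal configuration sketch is given below. It assumes a GPIO-based software I2C helper i2c_write_reg() (the BF532 has no on-chip TWI controller), a 7-bit slave address of 0x21 and the register offsets shown in the defines; the address depends on the ALSB pin strapping, and every value should be checked against the ADV7183 datasheet.

/* Sketch of ADV7183 configuration over I2C.  The device address, register
 * offsets and values below are illustrative assumptions, not taken from the
 * article or verified against a specific board. */
#define ADV7183_I2C_ADDR   0x21   /* assumed 7-bit slave address (ALSB-dependent) */
#define ADV7183_REG_CONTR  0x08   /* assumed contrast register                    */
#define ADV7183_REG_BRIGHT 0x0A   /* assumed brightness register                  */
#define ADV7183_REG_HUE    0x0B   /* assumed hue/chroma register                  */

/* Assumed helper: writes one byte to one sub-address over a GPIO-based
 * software I2C bus. */
extern void i2c_write_reg(unsigned char dev, unsigned char reg, unsigned char val);

static void adv7183_setup(void)
{
    i2c_write_reg(ADV7183_I2C_ADDR, ADV7183_REG_BRIGHT, 0x00); /* default brightness */
    i2c_write_reg(ADV7183_I2C_ADDR, ADV7183_REG_CONTR,  0x80); /* mid-range contrast */
    i2c_write_reg(ADV7183_I2C_ADDR, ADV7183_REG_HUE,    0x00); /* neutral hue        */
}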


Video input PPI interface: the PPI is half-duplex with a data width of up to 16 bits. Input uses the two-dimensional DMA method, and each frame of image is processed once the transfer completes. The 2D DMA settings are as follows:

This program uses the standard ITU-656 receiving mode to receive video data byte stream:

X_COUNT=360;Y_COUNT=288;

X_MODIFY=4;Y_MODIFY=4;

After executing 2D DMA according to the above settings, the memory data is arranged starting from the first address:

0, 4, 8, …, 356
360, 360+4, 360+8, …, 360+356
2×360, 2×360+4, 2×360+8, …, 2×360+356
…
284×360, 284×360+4, 284×360+8, …, 284×360+356
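For reference, a register-level sketch of this 2D DMA setup is shown below, using the memory-mapped register names provided by cdefBF532.h in VisualDSP++. DMA channel 0 serves the PPI receive path on the BF532; the buffer size, the transfer word size and the enable order are assumptions that must match the PPI packing mode and the actual board support code.

#include <cdefBF532.h>   /* memory-mapped register names used by VisualDSP++ */
#include <ccblkfn.h>     /* ssync() */

static unsigned char g_field_buf[288 * 1440];   /* one field of active video (assumed size) */

/* Register-level sketch of the 2D DMA settings quoted above; values other
 * than the four count/modify registers are illustrative assumptions. */
static void ppi_dma_setup(void)
{
    *pDMA0_START_ADDR = g_field_buf;
    *pDMA0_X_COUNT    = 360;     /* inner-loop transfer count (per line)       */
    *pDMA0_X_MODIFY   = 4;       /* address increment per transfer, in bytes   */
    *pDMA0_Y_COUNT    = 288;     /* outer-loop count: lines per field          */
    *pDMA0_Y_MODIFY   = 4;       /* address increment between lines, in bytes  */
    *pDMA0_CONFIG     = DMA2D | WDSIZE_16 | WNR | DI_EN | DMAEN;
                                 /* 2D write to memory, interrupt when the
                                  * whole field has been transferred           */
    *pPPI_CONTROL    |= PORT_EN; /* finally enable the PPI (ITU-656 input)     */
    ssync();
}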

Video output and UART interface: the UART port provided by the ADSP-BF532 is used to speed up the debugging process.

Programming of the serial SPI and the network interface device MPC860: the SPI interface is used with the ADSP-BF532 as the master device and the MPC860 as the slave device; the ADSP-BF532 transmits the video data to the MPC860, which processes it and provides the network interface. The process of receiving data through the network port and the SPI interface is shown in Figure 2.
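A minimal polled SPI-master sketch follows, again only as an illustration: the slave-select line (PF1 here), the baud divisor and byte-wide transfers are assumptions not specified in the article, and a real implementation would normally use DMA rather than core writes.

#include <cdefBF532.h>

/* Polled SPI-master transmit sketch; all numeric values are assumptions. */
static void spi_master_init(void)
{
    *pSPI_BAUD = 4;                       /* SCK = SCLK / (2 * 4), assumed     */
    *pSPI_FLG  = FLS1;                    /* PF1 drives the slave select       */
    *pSPI_CTL  = SPE | MSTR | TDBR_CORE;  /* enable, master, core writes TDBR  */
}

static void spi_send(const unsigned char *buf, int len)
{
    for (int i = 0; i < len; i++) {
        while (*pSPI_STAT & TXS)          /* wait until the Tx buffer is empty */
            ;
        *pSPI_TDBR = buf[i];              /* write the next byte to transmit   */
    }
    while (!(*pSPI_STAT & SPIF))          /* wait for the final transfer       */
        ;
}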

3.2 Video Coding

Considering compression efficiency and bit rate, the core of the video server adopts the MPEG-4 Simple Profile standard for video compression encoding. It performs only intra-frame encoding (I frames) and inter-frame predictive encoding (P frames), not bidirectional predictive encoding (B frames), and is suitable for rectangular video object encoding.

The core algorithms include the DCT and IDCT. A two-dimensional 8×8 DCT and a circular buffer are used, making full use of the advantages of the Blackfin DSP and reducing the number of instructions executed in the loop body. MPEG-4 offers two ways to determine the quantization step size: the TM5 rate control method and the rate control model defined in MPEG-4; both adjust the quantization parameter according to the bit rate and image quality requirements. DC coefficients in MPEG-4 use nonlinear quantization, while AC coefficients can use either H.263-style or MPEG-style quantization. Here the H.263 mode is used.
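The H.263-style quantization chosen here follows the standard TMN formulas; the sketch below illustrates them and is not the article's actual encoder code.

/* Sketch of H.263-style scalar quantization of one 8x8 DCT block (standard
 * TMN formulas, shown for illustration only).  QP is the quantization
 * parameter (1..31); the intra DC coefficient uses a fixed step size of 8. */
static void quantize_h263(const short coef[64], short level[64], int QP, int intra)
{
    for (int i = 0; i < 64; i++) {
        int c = coef[i];
        int a = (c < 0) ? -c : c;                     /* |coef|                 */
        int q;
        if (intra && i == 0) {
            level[i] = (short)(c / 8);                /* intra DC: step size 8  */
            continue;
        }
        q = intra ? a / (2 * QP)                      /* intra AC               */
                  : (a - QP / 2) / (2 * QP);          /* inter: dead zone       */
        if (q < 0) q = 0;
        level[i] = (short)((c < 0) ? -q : q);
    }
}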


Motion prediction searches for the best matching macroblock using the minimum SAD (the sum of absolute differences between the current block and the candidate block in the reference image). The Blackfin DSP provides the video-specific SAA instruction, which greatly improves the search speed. Finally, boundary padding is used to reduce motion-estimation errors at macroblock boundaries.
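For reference, the cost function that the SAA instruction accelerates can be sketched in plain C as follows; the hand-optimized inner loop used on the Blackfin (built-ins or assembly) is not shown in the article.

/* Plain C sketch of the 16x16 SAD cost used in the motion search; on the
 * Blackfin the inner loop maps onto the SAA instruction, typically via
 * hand-written assembly or compiler built-ins not shown here. */
static unsigned int sad_16x16(const unsigned char *cur, const unsigned char *ref, int stride)
{
    unsigned int sad = 0;
    for (int y = 0; y < 16; y++) {
        for (int x = 0; x < 16; x++) {
            int d = cur[x] - ref[x];
            sad += (d < 0) ? -d : d;      /* absolute difference                */
        }
        cur += stride;                    /* next row of the current macroblock */
        ref += stride;                    /* next row of the reference block    */
    }
    return sad;
}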

3.3 Implementation of video network transmission

After acquisition and compression, the video data is transmitted over the network. At the transport layer, the TCP protocol is used for the operation control commands, whose packets are very small, while the UDP protocol is used for the video image data. Because TCP is a connection-oriented protocol that guarantees in-order, error-free delivery to the client's application layer, it is suitable for the control commands in network monitoring and ensures that the server and client receive the operation commands correctly. UDP provides no flow control or handling of packet loss and errors; with advanced compression algorithms such as MPEG-4 that use inter-frame compression, the loss of a single packet can affect several consecutive frames of video. Applications based on UDP must therefore address reliability in their own design.
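One common way to handle this at the application level, shown here only as an illustrative sketch rather than the article's actual protocol, is to prepend a small header carrying a sequence number and frame information to every UDP payload, so the receiver can detect loss and resynchronize at the next I frame.

#include <cstring>

/* Illustrative UDP payload header (not the article's protocol): the sequence
 * number lets the receiver detect lost packets, and the frame fields let it
 * discard the remainder of a damaged P-frame chain and wait for the next
 * I frame. */
struct VideoPacketHeader {
    unsigned long  seq;        /* incremented for every UDP packet sent        */
    unsigned long  frameNo;    /* index of the encoded frame this packet holds */
    unsigned char  isIFrame;   /* 1 if the packet belongs to an I frame        */
    unsigned char  isLast;     /* 1 if this is the last packet of the frame    */
};

/* Builds one datagram: header followed by a slice of the encoded frame. */
static int BuildPacket(unsigned char* out, const VideoPacketHeader& hdr,
                       const unsigned char* payload, int payloadLen)
{
    std::memcpy(out, &hdr, sizeof(hdr));
    std::memcpy(out + sizeof(hdr), payload, payloadLen);
    return (int)(sizeof(hdr) + payloadLen);
}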

Research and Implementation of Embedded Network Video Server Based on ADSP-BF532

The upper-level software provides a friendly user interface and is implemented with Visual C++ 6.0. According to user needs, it receives video data from the remote terminal over the network, and includes the MPEG-4 decoding process, network transmission control and network command encoding. It can be written in two ways: the first is to write dedicated server/client software to send and receive the video image data; the second is to use the server/browser mode, i.e. to package the client software as a control embedded in a web page, giving a Web-based digital video server. This article adopts the first way.

The network communication module uses multicast technology to improve efficiency. The IP network data communication flow of the system is shown in Figure 3. A class CMulticast was written specifically to implement the multicast service; it is encapsulated for video transmission and is used on both the server and the client. Its public interface is shown below.

class CMulticast : public CObject
{
public:
    void Close();                          // Close the created socket
    BOOL IsConnect();                      // Whether the socket has joined a group
    static void Uninitialize();            // Terminate Winsock
    static void Initialize();              // Initialize Winsock
    int Send(char* lpData, int size);      // Send data to the destination address
    BOOL Create(CString lpstrAddr, unsigned short port, HWND hWnd);
                                           // Create and bind the send or receive socket,
                                           // join the multicast group and set up the
                                           // message callback mechanism
    CMulticast();
    virtual ~CMulticast();

public:
    HWND m_hWnd;                           // Window handle for message notification
    SOCKET m_hSocket;                      // Send or receive socket
    BOOL m_bConnected;                     // Whether the socket has joined a group
    SOCKADDR_IN addr;                      // Send socket address
    SOCKADDR_IN srcaddr;                   // Receive or send destination address
};
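A minimal usage sketch on the sending side, assuming Create() succeeds and that Send() takes the encoded buffer and its length as declared above; the group address, port, window handle and buffer variables are placeholders.

CMulticast::Initialize();                        // start Winsock once per process

CMulticast tx;
if (tx.Create(_T("234.5.6.7"), 5000, hWnd))      // placeholder group address, port and window handle
{
    tx.Send((char*)encodedFrame, encodedLen);    // push one compressed frame to the group
    tx.Close();
}

CMulticast::Uninitialize();                      // shut Winsock down when finished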

3.4 System-level optimization

The compilation system of the Blackfin software development platform supports ANSI C and C++, and the instruction set provides a large number of multimedia operation instructions. When developing the system software, the compilation of the code and the processing of specific tasks can be optimized based on the hardware characteristics and the experience gained during development and debugging; the upper-level software can also be optimized to shorten the execution time of the display module.

4 Experimental Results

The video sequence captured by the ADV7183 (CIF format, 30 frames/s) was used for testing. The results show that: (1) the DSP can transmit the encoded video data in real time through the SPI interface without data loss or error; (2) the fidelity of the reconstructed image is close to 1 in the MPEG-4 video compression encoding test; (3) in a LAN simulation experiment, CIF-sized video images can be transmitted in real time with the transmission delay kept within 0.5 s; (4) the UART interface can complete local video output. The design of the embedded network video server based on the ADSP-BF532 is therefore feasible, but some technical indicators still fall short of practical application requirements: the video compression results need further improvement, and a large-capacity hard disk storage system should be added to store the video data.
