Design and implementation of video security monitoring terminal based on ARM

Publisher: TranquilOasis | Last updated: 2021-05-21 | Source: eefocus | Keywords: ARM

Video surveillance systems are widely used in industrial, military, and civil settings, where they play an important role in security and environmental monitoring. These systems are gradually moving from analog to digital. With the rapid development of semiconductor technology and the growing maturity of multimedia video codecs, running high-performance, complex video-stream compression algorithms on embedded systems has become practical. Most current surveillance systems pair a dedicated processor or a RISC embedded processor with a DSP; this article discusses an implementation that combines an ARM processor with software compression.


Overall design of video surveillance system


The first step is an overall system plan: divide the system into functional modules and determine how each is implemented. The video surveillance system adopts a client/server (C/S) structure with two main parts. The server side comprises the acquisition, compression, and transmission programs running on the S3C2410 platform; the client is the receiving, decompression, and playback program running on a PC. The video surveillance terminal captures real-time video from the on-site camera, compresses it, and transmits it over Ethernet to the video surveillance server.


As shown in the system structure diagram (Figure 1), video image acquisition and packaging and sending are completed on the server side, and image reception, unpacking and playback are completed on the client side.

System hardware design


The system adopts a modular design scheme, which mainly includes the following modules: main controller module, storage circuit module, peripheral interface circuit module, power supply and reset circuit, as shown in Figure 2.

S3C2410 master controller module


The main controller module is the core of the system. The S3C2410 is a Samsung 16/32-bit microcontroller built around the ARM920T core, with a maximum operating frequency of 203 MHz. Its low power consumption, compact packaging, and fully static design make it well suited to cost- and power-sensitive applications. The S3C2410 provides rich on-chip resources and supports Linux, making it a suitable choice for this system: it schedules the entire system, configures the function registers of all working chips at power-on, performs the video-stream encoding, and drives the physical-layer chip to send the video stream through the Ethernet controller.


System storage circuit module


The main controller also needs peripheral storage: NAND Flash and SDRAM. The NAND Flash holds the Linux bootloader, kernel, file system, application programs, environment variables, and system configuration files. The SDRAM, with its fast read/write speed, serves as working memory at run time. This design uses 64 MB of NAND Flash and 64 MB of SDRAM.


Peripheral circuit module


The peripherals used in this design include USB interface, network card interface, RS232 interface and JTAG interface.


The USB host controller module of the video surveillance terminal connects to multiple USB cameras through a dedicated USB hub. During real-time monitoring, the image data captured by each camera travels through the hub to the USB host controller module, which hands it to the S3C2410 processor for centralized processing. The S3C2410 encodes and compresses the captured images in real time, and the encoded bitstream goes directly to the send buffer to await transmission.


This design uses the CS8900A, a 16-bit Ethernet controller from CIRRUS LOGIC, to provide the network interface; it adapts to different application environments through its internal register settings. The S3C2410 controls and communicates with the CS8900A through the address, data, and control lines plus a chip-select signal. The connection between the two is shown in Figure 3: the CS8900A is selected by the S3C2410's nGCS3 signal; its INTRQ0 pin supplies the interrupt signal to the S3C2410; its 16-bit data bus connects to the S3C2410 data bus; and the address lines use A[24:0].

The CS8900A transfers data through a DMA channel. First the transmit-control and transmit-address registers are configured; data are then read in turn from the specified memory area into the chip's internal transmit buffer, where the MAC encapsulates and sends them. After a group of data has been sent, a DMA interrupt is requested and handled by the S3C2410.


The RS-232 interface connects to the PC serial port so that the embedded system's information can be displayed and controlled from the PC. The JTAG interface is used mainly to debug the system and to burn programs into Flash.


System software design


The software design of the video surveillance terminal covers two main tasks:


(1) Build the software platform on the hardware. Setting up the embedded Linux development platform involves porting U-Boot, porting the embedded Linux kernel, and developing device drivers for it.


(2) Develop the system applications on that platform. Using the cross-compilation tools, develop the acquisition, compression, and transmission programs that run on the video surveillance terminal.


Building a Linux platform based on S3C2410


Linux has many advantages: it is open source; its kernel supports multiple users, threads, and processes with good real-time behavior and proven stability; its size and feature set are customizable; and it supports many architectures.


To build the embedded Linux development platform, first set up a cross-compilation environment, as shown in Figure 4. A complete cross-compilation environment comprises a host and a target machine. Here the host is a PC running Red Hat's Fedora Core 2, and the target is the S3C2410-based video surveillance terminal. The cross compiler is GCC 3.3.4 for ARM, and the embedded Linux kernel source package is version 2.6.8RC.

The 2.6.8RC kernel source package contains every functional module, but the system uses only a subset. Before compiling, therefore, the kernel must be configured and the unneeded modules trimmed away so that the customized kernel matches the system design. The specific steps are as follows:


(1) Type make menuconfig to configure the kernel: select the YAFFS file system and NFS-boot support. Because the system uses a USB camera, enable the USB device support modules, including USB device file support and the USB host controller driver. The camera is also a video device, so enable the Video4Linux module so that applications can access it.


(2) Use the make dep command to generate dependencies between kernel programs.


(3) The make zImage command generates the kernel image file.


(4) The make modules and make modules_install commands generate system loadable modules.


These steps generate the zImage kernel image, which is then downloaded to the Flash of the target platform.
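The menuconfig selections in step (1) correspond to entries in the kernel's .config file. The excerpt below is illustrative, using 2.6-era option names; the exact set depends on the kernel tree and any YAFFS patch applied.

```
# Illustrative .config excerpt (assumed 2.6-era option names)
CONFIG_YAFFS_FS=y          # YAFFS file system for NAND Flash
CONFIG_ROOT_NFS=y          # NFS boot during development
CONFIG_USB=y               # USB device support
CONFIG_USB_DEVICEFS=y      # USB device file support
CONFIG_USB_OHCI_HCD=y      # on-chip USB host controller driver
CONFIG_VIDEO_DEV=y         # Video4Linux
```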


This design uses an external USB camera, whose driver is configured as a loadable kernel module. The driver must implement the basic I/O interface functions (open, read, write, close), interrupt handling, memory mapping, and the ioctl control interface for the I/O channels, registering them in a struct file_operations. When an application issues open, close, read, or write system calls on the device file, the embedded Linux kernel reaches the driver's functions through this file_operations structure. The USB driver is then compiled into a dynamically loadable module, after which the camera works normally.


Design of Video Monitoring Terminal Software


The video surveillance terminal software is divided into three parts according to its functions: video acquisition, compression, and transmission. The development of this software is based on the previously configured embedded kernel.


(1) Video acquisition part


The Video4Linux interface functions are used to access the USB camera and capture the real-time video stream. First, the v4l_struct data structure is filled in: basic device information, image attributes, and the attributes of each signal source. The acquisition module collects images from the USB cameras through the USB hub and starts several acquisition threads that listen on different ports. When a connection request arrives, the corresponding acquisition thread immediately reads video-stream data from the device buffer into the video processing buffer for the next stage.


(2) Compression of video data


A video surveillance system pushes a large amount of data across the network, so the video must be encoded and compressed before transmission to preserve quality and real-time behavior while reducing the data volume. This design uses the MPEG-4 standard, with the open-source xvidcore library serving as the core compression algorithm. Xvidcore is an efficient, highly portable multimedia encoder; it is cross-compiled on the PC and the resulting files are copied to the target system.
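The cross-compilation step typically follows the standard autoconf pattern. The fragment below is a sketch: the arm-linux- toolchain prefix is assumed from the GCC 3.3.4-for-ARM toolchain mentioned earlier, and the exact paths and flags depend on the xvidcore release used.

```sh
# Assumed autoconf cross-build of xvidcore for the ARM target
cd xvidcore/build/generic
./configure --host=arm-linux CC=arm-linux-gcc
make
# copy the resulting libxvidcore.* files to the target system
```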


(3) Video data transmission part


The transmission module delivers the compressed video stream to the remote PC client. Transport is based on the TCP/IP protocol suite, with the standard RTP protocol used for the video: RTP is currently the dominant solution for real-time streaming. Real-time streaming programming on Linux uses open-source RTP libraries such as LIBRTP and JRTPLIB. A relatively simple handshake protocol is defined: the client program on the PC repeatedly sends request packets to the acquisition terminal, and the terminal packages the captured images and returns them to the host. Each RTP packet is encapsulated in a UDP segment, which in turn is encapsulated in an IP datagram and sent; the receiver reassembles the incoming frames back into video data.
