1 System Introduction
With the spread of networked multimedia applications, stand-alone embedded multimedia communication terminals have become a focus of current research thanks to their low cost and good performance, and the study of embedded video encoders with network communication capability has become the core of terminal system design. An embedded network video encoder works by digitizing the analog video signal, compressing and encoding it according to international standards, wrapping it with network protocols and sending it onto the network; a client receives the video data from the network, decodes it and plays it back in real time. The embedded video encoder is a self-contained device in which a high-performance processor and an operating system are tightly coupled, with dedicated functions and a purpose-built design. Unlike card-based (plug-in board) systems, it is not affected by other software and hardware in a general-purpose computer, so it is more stable and reliable, lends itself to a modular system design, and is easy to install, manage and maintain.
TM1300 is a high-performance multimedia processor that runs the pSOS embedded real-time operating system and comes with a fairly complete set of on-line debugging tools. Designers can use these tools to exercise the various TriMedia resources and debug applications on the pSOS platform, and ultimately realize the entire system. This paper presents an embedded video encoder based on TM1300 and suitable for IP networks, and discusses the hardware and software design of the whole system in detail. The encoder has been applied in a surveillance system, where it achieves real-time video transmission with good image quality. The main functions of the network video encoder are: A/D conversion of the video signal, H.263 video compression coding, H.323 network protocol processing, camera control and transparent data transmission.
2 Hardware Design
2.1 TM1300 Introduction
TM1300, the core of the video encoder, is a high-performance DSP launched by Philips for multimedia applications; it can process high-quality video and audio. The powerful compiler and software development environment provided with TriMedia allow developers to write applications in C or C++ instead of assembly language.
The core of TM1300 is a 32-bit processor with 32-bit linear addressing and an addressing range of up to 4GB. It adopts a VLIW (very long instruction word) architecture that can issue five operations in every clock cycle. TM1300 has a 16KB dual-ported data cache and a 32KB instruction cache, and it integrates a PCI bus interface, so it can act as a slave CPU in a PC environment or as the master CPU in an embedded system. Unlike a general-purpose DSP, TM1300 also contains dedicated units such as a video interface, an audio interface, an image coprocessor and a variable-length decoder. The image coprocessor is mainly used for filtering and scaling images to raise processing speed, while the variable-length decoder assists the core in Huffman decoding.
2.2 Overall hardware structure
The overall hardware structure of the network video encoder is shown in Figure 1. The analog video signal from the camera is converted by the A/D conversion chip SAA7111A into a digital video signal in YUV format, which TM1300 (1) compresses according to the H.263 protocol into image data streams of various rates and transfers over the PCI bus to TM1300 (2), which is responsible for protocol processing. After the compressed video data has been encapsulated there, it is passed, again over the PCI bus, to the Ethernet interface unit built around the Ethernet controller RTL8139C(L) and sent onto the IP network. The peripheral expansion module is built around a W77E58 microcontroller and uses two serial ports to control the camera and to send and receive transparent data. The CPLD mainly performs address decoding, PCI bus arbitration and similar functions. The compiled and linked application is written into FLASH; after the encoder is powered on and reset, the bootloader in the EEPROM moves the program from FLASH into SDRAM and the system starts running. Based on these functions, the hardware of the network video encoder can be divided into four functional units: (1) the video encoding unit; (2) the protocol processing unit; (3) the network interface unit; (4) the peripheral expansion unit.
Figure 1 Overall hardware structure of a network video encoder
2.3 Video Coding Unit
The video encoding unit is built around TM1300 (1); its peripherals include an EEPROM, 16MB of SDRAM, the video A/D chip SAA7111A, a dual-port RAM and 16MB of FLASH. SAA7111A is Philips' enhanced video input processor (EVIP). The analog video input can be CVBS (PAL, NTSC, etc.) or S-Video (Y/C); after A/D conversion it outputs a YUV 4:2:2 digital video signal that conforms to CCIR-656. The video input schematic is shown in Figure 2. The YUV digital video output port of SAA7111A is connected to the video input port (VI) of TM1300, and the working mode of SAA7111A is configured by TM1300 over the I2C bus (a configuration sketch is given after Figure 2).
Figure 2 Video input schematic diagram
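To make the I2C configuration step concrete, the following sketch shows how TM1300 (1) might program a few SAA7111A registers over the I2C bus. The register subaddresses and values are only illustrative placeholders, and i2c_write_reg() stands in for whatever I2C write routine the TriMedia board support software actually provides; a real configuration must follow the SAA7111A datasheet.

#include <stdint.h>

/* Hypothetical I2C write primitive; replace with the I2C driver call
   provided by the TriMedia software environment. */
extern int i2c_write_reg(uint8_t dev_addr, uint8_t subaddr, uint8_t value);

#define SAA7111A_I2C_ADDR 0x48 /* 7-bit slave address; depends on address pin strapping */

/* Illustrative register setup: select a CVBS input and enable
   YUV 4:2:2 (CCIR-656 style) output. Values are examples only. */
static const struct { uint8_t subaddr, value; } saa7111a_init[] = {
    { 0x02, 0xC0 }, /* analog input control: composite input (example) */
    { 0x08, 0x88 }, /* sync control (example) */
    { 0x10, 0x48 }, /* output format control: YUV 4:2:2 (example) */
};

int saa7111a_configure(void)
{
    unsigned i;
    for (i = 0; i < sizeof(saa7111a_init) / sizeof(saa7111a_init[0]); i++) {
        if (i2c_write_reg(SAA7111A_I2C_ADDR,
                          saa7111a_init[i].subaddr,
                          saa7111a_init[i].value) != 0)
            return -1; /* propagate I2C error */
    }
    return 0;
}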
The SDRAM is synchronous dynamic RAM with burst access and holds the application, the raw digital video data and intermediate processing results. TM1300 (1) uses two external SDRAM chips forming a 32-bit interface, each with a capacity of 4 × 1M × 16 bits, for a total of 16MB.
After power-on reset, TM1300 (1) reads its startup information from the EEPROM over the I2C bus, configures the clock-divider and SDRAM registers, then copies the boot program from the EEPROM into SDRAM starting at DRAM_BASE and begins executing it. After the boot program has moved the corresponding application programs from FLASH into the SDRAM of TM1300 (1) and TM1300 (2), both processors begin normal operation.
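The second stage of this start-up sequence amounts to a copy-and-jump. The fragment below is a minimal sketch of that stage, assuming the FLASH is memory-mapped and that each application image is stored behind a small header giving its load address and length; the base address and header layout are assumptions for illustration, not the actual bootloader format.

#include <stdint.h>
#include <string.h>

/* Assumed image layout in FLASH: a small header followed by the image.
   The real bootloader's format and addresses will differ. */
struct image_header {
    uint32_t load_addr; /* destination address in SDRAM */
    uint32_t length;    /* image size in bytes */
};

#define FLASH_APP_BASE 0x20000000u /* example memory-mapped FLASH address */

static void boot_copy_application(void)
{
    const struct image_header *hdr = (const struct image_header *)FLASH_APP_BASE;
    const uint8_t *src = (const uint8_t *)(FLASH_APP_BASE + sizeof(*hdr));

    /* Move the application image from FLASH into SDRAM ... */
    memcpy((void *)hdr->load_addr, src, hdr->length);

    /* ... then jump to its entry point. */
    ((void (*)(void))hdr->load_addr)();
}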
2.4 Protocol Processing Unit
The protocol processing unit is built around TM1300 (2), extended with an EEPROM and 16MB of SDRAM; its memory interface and startup circuitry are similar to those of the video encoding unit. TM1300 (2) works in slave mode. After power-on reset it reads its startup information from the serial EEPROM over the I2C bus and configures the clock-divider and SDRAM registers, then waits for TM1300 (1) to complete the rest of the system startup, including configuration of the MMIO and DRAM spaces, and to move the corresponding application program from FLASH into the SDRAM of TM1300 (2). TM1300 (2) then begins normal operation.
2.5 Network Interface Unit
The schematic of the network interface unit is shown in Figure 3. It is built around REALTEK's RTL8139C(L) Ethernet controller and connects to the local area network over twisted pair through an ST6118T Ethernet transformer and an RJ-45 socket. The RTL8139C(L) interface is fully compliant with the PCI 2.1 specification and can easily be attached to TriMedia's PCI bus. The Ethernet interface packs the encoded and protocol-processed data into Ethernet frames and transmits them onto the Ethernet; at the same time it monitors the receive side, unpacks incoming frames and passes the data to TM1300 (2).
Figure 3 Network interface unit schematic diagram
2.6 Peripheral expansion unit
The schematic of the peripheral expansion unit is shown in Figure 4. Its core is WINBOND's W77E58 microcontroller, which has two serial ports, extended with an IDT7130 dual-port RAM, a MAX232 and a MAX485. The W77E58 controls the camera and pan/tilt head via serial port 0 using the RS-485 interface standard, and transfers RS-232 transparent data via serial port 1. The W77E58 and TM1300 (1) access the dual-port RAM through a simple self-defined protocol, as sketched after Figure 4, and thereby pass control information and data to each other. The functions of the peripheral expansion unit are implemented by programming the W77E58; to improve the reliability of program operation, a watchdog timer is also used in the microcontroller firmware.
Figure 4 Schematic diagram of peripheral expansion unit
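The simple self-defined protocol over the IDT7130 can be pictured as a mailbox: one side writes a command and sets a flag, the other side polls the flag, consumes the message and clears it. The sketch below shows the W77E58 side under an assumed mailbox layout; the actual offsets, flag semantics and command codes of the protocol are not given in this article.

#include <stdint.h>

/* Assumed mailbox layout inside the IDT7130 dual-port RAM.
   Base address, offsets and codes are illustrative only. */
#define DPRAM_BASE ((volatile uint8_t *)0x8000) /* example external-data mapping */
#define MBOX_FLAG  0x00 /* 0 = empty, 1 = message pending */
#define MBOX_CMD   0x01 /* command code, e.g. pan/tilt or zoom */
#define MBOX_LEN   0x02 /* payload length */
#define MBOX_DATA  0x03 /* payload bytes */

/* Called from the W77E58 main loop: if TM1300 (1) has posted a command,
   copy it out and clear the flag so the next message can be written. */
int mailbox_poll(uint8_t *cmd, uint8_t *buf, uint8_t *len)
{
    volatile uint8_t *m = DPRAM_BASE;
    uint8_t i;

    if (m[MBOX_FLAG] != 1)
        return 0; /* nothing pending */

    *cmd = m[MBOX_CMD];
    *len = m[MBOX_LEN];
    for (i = 0; i < *len; i++)
        buf[i] = m[MBOX_DATA + i];

    m[MBOX_FLAG] = 0; /* acknowledge: release the mailbox */
    return 1;
}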
3 Software Structure
3.1 Encoder Software Architecture TSSA
TSSA (TriMedia Software Streaming Architecture) is a streaming architecture based on packet exchange. It consists of an application-layer module (Application) and several functional modules (Components). Following the idea of component interaction in COM technology, it defines a component object model for processing multimedia data streams, and the application-layer module is responsible for system initialization. Using this software streaming architecture greatly simplifies the development of multimedia applications. The TSSA system is divided into the following layers (see Table 1):
The application-layer module creates, starts, closes and changes the state of each functional module, and receives the responses returned by the functional modules. Each functional module in turn consists of an OL layer, an operating-system abstraction layer, an AL layer and a device-library layer. Functional modules exchange data with each other and with the application-layer module in the form of data packets carried in message queues; the message queues connect the functional modules to one another and to the application-layer module.
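To make the packet-in-queue model concrete, the sketch below shows how two functional modules might hand packet pointers to each other through pSOS+ message queues, with a "full" queue carrying encoded data toward the consumer (a matching "empty" queue would return the buffers). The queue names, the packet type and the use of plain pSOS+ calls are assumptions for illustration; the actual TSSA component API is not reproduced here.

#include <psos.h> /* pSOS+ kernel services: q_create, q_send, q_receive */

/* Illustrative packet descriptor passed between modules by pointer. */
typedef struct {
    unsigned char *data;   /* encoded H.263 bitstream buffer */
    unsigned long  length; /* number of valid bytes in the buffer */
} packet_t;

static unsigned long full_q;  /* encoder -> protocol module: filled packets */
static unsigned long empty_q; /* protocol -> encoder module: recycled buffers */

void create_stream_queues(void)
{
    /* FIFO queues with no length limit; pSOS queue names are four characters. */
    q_create("FULQ", 0, Q_FIFO | Q_NOLIMIT, &full_q);
    q_create("EMPQ", 0, Q_FIFO | Q_NOLIMIT, &empty_q);
}

/* Producer side: hand a filled packet to the next module. */
void send_full_packet(packet_t *pkt)
{
    unsigned long msg[4] = { (unsigned long)pkt, 0, 0, 0 };
    q_send(full_q, msg);
}

/* Consumer side: block until a packet arrives. */
packet_t *receive_full_packet(void)
{
    unsigned long msg[4];
    q_receive(full_q, Q_WAIT, 0, msg); /* timeout 0 = wait indefinitely */
    return (packet_t *)msg[0];
}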
3.2 Encoder software structure
The structure of the encoder software is shown in Figure 5 (the MCU firmware is not included). The main control module corresponds to the application-layer module in TSSA; it creates and runs the video encoding module, the protocol processing module and the channel interface module, and forwards control and status messages between the functional modules through the response queue and the command queue. The video encoding module performs video compression coding according to the H.263 protocol; the protocol processing module implements the H.323 protocol stack; the channel interface module encapsulates datagrams into MAC frames and sends them onto the specific physical channel. Data is exchanged between the functional modules through bidirectional message queues.
Figure 5 Encoder software structure
The main control module and the functional modules run as independent tasks under the pSOS operating system. Task state transitions are driven entirely by whether the resources they request (through pSOS kernel calls) can be satisfied, and pSOS switches between tasks dynamically according to their priorities to guarantee the real-time behaviour of the system. To improve the responsiveness of the main control module to status messages, its priority is set higher than that of the functional modules. The video encoding module, as the core of the encoder, occupies most of the system resources and therefore determines the encoder's performance. Given the real-time requirements of a multimedia communication system, several optimization strategies were adopted in the implementation of the encoding module: besides structural and local optimization of the encoding code and the choice of a fast motion-estimation algorithm, loops were unrolled for parallel processing to match the VLIW architecture of TM1300, and the multimedia instructions and compiler optimizations provided by TM1300 were fully exploited. In practice, these optimizations more than doubled the encoder's image coding frame rate.
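As an example of the loop-unrolling strategy mentioned above, the fragment below computes the sum of absolute differences (SAD) used in block motion estimation with the inner loop unrolled by four, which exposes more independent operations for the compiler to pack into TM1300's five-issue VLIW slots. This is a generic C sketch rather than the actual encoder code, and it does not use TM1300's multimedia custom operations.

#include <stdlib.h>

/* SAD over a 16x16 block; the inner loop is unrolled by four so that
   more independent operations are visible to the VLIW scheduler.
   'stride' is the line pitch of both the current and reference frames. */
unsigned int sad_16x16(const unsigned char *cur,
                       const unsigned char *ref,
                       int stride)
{
    unsigned int sad = 0;
    int x, y;

    for (y = 0; y < 16; y++) {
        for (x = 0; x < 16; x += 4) { /* unrolled by 4 */
            sad += abs(cur[x + 0] - ref[x + 0]);
            sad += abs(cur[x + 1] - ref[x + 1]);
            sad += abs(cur[x + 2] - ref[x + 2]);
            sad += abs(cur[x + 3] - ref[x + 3]);
        }
        cur += stride;
        ref += stride;
    }
    return sad;
}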
4 Conclusion
The embedded network video encoder based on the TM1300 chip has the advantages of low cost and good reliability. It also achieves good performance because the design fully exploits TriMedia's hardware architecture and the TSSA software framework created specifically for multimedia processing. The encoder follows the ITU-T H.263 protocol and can interoperate with H.323-compatible terminals, providing real-time video transmission with clear, smooth motion images. The bit rate can be selected between 64kbps and 1920kbps to trade off frame rate against image quality. At 64kbps the decoded QCIF and CIF frame rates exceed 15fps, and with a modest increase in bit rate both QCIF and CIF can reach 25fps.