0 Introduction
The machine vision system is a non-contact optical sensing system. It integrates software and hardware and can automatically extract information from captured images or generate control actions. Since its inception some 15 years ago, machine vision has passed through three main stages: systems built from digital circuits, systems composed of a PC and output devices, and embedded systems. Among these, the embedded machine vision system relies on dedicated computer technology: it has a real-time multi-tasking operating system, an efficient compression chip, and a powerful microprocessor, so it can compress, transmit, and process video on the chip. After on-chip processing, it can connect directly to Ethernet or a wide area network to perform real-time remote monitoring, and it is one of the current research hotspots.
In domestic and foreign research, there are three main ways to implement embedded machine vision systems:
(1) A system based on a standard bus that uses a DSP as the computation and control processor. Although a DSP chip can process large amounts of information at high speed, its I/O interfaces are limited and hard to expand, and its control capability is weak, so this approach still has certain limitations.
(2) A machine vision system based on DSP+FPGA. Combining an FPGA with a DSP enables wideband signal processing and greatly increases processing speed. However, the FPGA is programmed in a hardware description language, which makes algorithm development difficult; functions are realized in hardware, and the system is strongly affected by the environment.
(3) A machine vision system built around an ARM microprocessor or an ARM+DSP combination. This approach offers powerful human-computer interaction, high integration, good real-time performance, and multi-tasking support. However, data exchange between the ARM and the DSP still relies on external circuit connections, which adds instability to the system.
Combining the advantages and disadvantages of the above technical solutions, this paper proposes a new machine vision system to achieve high-speed acquisition and storage of image information.
Its core chip is a recent dual-core embedded device produced by TI, in which an ARM processor and a DSP processor are integrated on one chip and coordinated entirely through software. A machine vision processing system built on this chip, combining the excellent control performance of the ARM processor running the Linux system with the powerful computing capability of the DSP, ensures good real-time performance and stability and provides a solid video acquisition and processing hardware platform for machine vision research and applications.
1 System Function
This system is a high-speed image data acquisition and storage system. Through combined software and hardware design it achieves real-time, uncompressed storage of two channels with a resolution of 640×480 at a frame rate of 60 f/s and 12 b/pixel, plus one channel with a resolution of 1 024×1 024 at 60 f/s and 12 b/pixel.
As shown in Figure 1, the system controls the image sensors through the serial port so that the three channels of image data, clocks, and synchronization signals are input as required. The system then performs image signal acquisition, data processing, and storage in sequence. Through its own interfaces, the system provides further functions such as display, host-computer communication, and keyboard control, enabling friendly human-computer dialogue.
2 Hardware Design
This system uses the TMS320DM8168 chip from TI's DaVinci series. The chip integrates a 1 GHz ARM Cortex-A8, a 1 GHz TI C674x floating-point DSP, several second-generation programmable high-definition video coprocessors, an innovative high-definition video processing subsystem (HDVPSS), and integrated codecs supporting H.264, MPEG-4, and VC-1 at high-definition resolution. It also offers interfaces such as Gigabit Ethernet, PCI Express, SATA2, DDR2, DDR3, USB 2.0, MMC/SD, HDMI, and DVI, which support further functional expansion and complex applications. The chip is used to design and implement the acquisition, processing, and display of two or three channels of image signals with different resolutions. The hardware schematic is shown in Figure 2. The hardware modules involved in the design are: the image acquisition interface module, the image acquisition module, the image storage module, and the peripheral interface module.
2.1 Image Acquisition Interface Module
As the connection between the image sensors and the high-speed acquisition system, this module can acquire images from, and control, cameras with either a USB interface or a Camera Link interface. The USB connection is straightforward: since the system already has a USB peripheral interface, a USB camera can be connected according to the standard USB protocol. The Camera Link interface is an open interface protocol that allows products from different manufacturers to remain distinct yet interoperable. The image acquisition interface module in this system therefore adopts the Camera Link protocol. The module uses the DS90CR288A, DS90LV049, and DS90LV047 to control the image sensor, acquire image information, and provide two-way communication between the image sensor and the acquisition system.
2.2 Image Acquisition Module
The HDVPSS (HD Video Processing Subsystem) of TMS320DM8168 provides video input and video output interfaces. The video input interface provides access to external image devices (such as image sensors, video decoders, etc.).
HDVPSS can support up to three 1080p channels at 60 f/s, and can simultaneously perform H.264 high-definition/D1 encoding and decoding of 8 D1 or 16 CIF data streams. It provides two independent video capture input ports, each of which supports scaling and pixel format conversion. Each video capture port can operate either as one 16 b input channel (with separate Y and Cb/Cr inputs) or as two clock-independent 8 b input channels (with interleaved Y/C data). The first video input port can also operate in 24 b mode to support RGB capture. All capture modes support pixel clocks of up to 165 MHz, meeting the high-speed acquisition requirement.
The high-definition video processing subsystem (HDVPSS) has two independent video capture input ports, VIP0 and VIP1. VIP0 can be configured as 24 b, 16 b, or two independent 8 b ports, and VIP1 as 16 b or two independent 8 b ports. Given the capture clock and the various configuration modes, there are several ways to implement the different data flows. To simplify the storage design, this solution configures VIP0 as a 24 b port for acquisition. In this mode the maximum data flow is 165 M × 24/8 = 495 MB/s, which meets the throughput requirement. From the highest capture clock, the acquisition interval is 1/(165 MHz), about 6.1 ns. After calculation, and for convenience of design, three Base-configuration Camera Link cameras with a frame rate of 200 f/s are used, with the frame rate controlled entirely by external triggering. Each Camera Link camera outputs two pixels at a time, 12 b per pixel, i.e. 2 × 12 b, which exactly matches the 24 b acquisition width of VIP0. Taking time-shared acquisition of the three signals as an example (Figure 3), the three cameras acquire in turn, i.e. each camera acquires one frame per cycle, which requires a timing signal for three-way time-shared acquisition. The timer generates a pulse of width 1/200 s, and the high level of the frame-rate signal is sent to the three cameras in turn through a delay chain; the timing relationship of the three acquisition signals is that one camera has no delay, one camera is delayed by 1/200 s, and the last camera is delayed by 2/200 s.
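As a quick check of these numbers, the short C sketch below recomputes the peak VIP0 bandwidth, the minimum sample interval, and the trigger delay of each camera under the time-shared scheme. The constants simply restate the figures above; they are illustrative values, not taken from any device driver.

```c
#include <stdio.h>

int main(void)
{
    const double pixclk_hz = 165e6;   /* maximum VIP0 capture clock */
    const double bus_bits  = 24.0;    /* VIP0 configured as a 24 b port */
    const double frame_hz  = 200.0;   /* externally triggered frame rate per camera */

    /* Peak port bandwidth: 165 MHz x 24 b / 8 = 495 MB/s */
    printf("peak VIP0 bandwidth: %.0f MB/s\n", pixclk_hz * bus_bits / 8.0 / 1e6);

    /* Minimum sample interval: 1 / 165 MHz, about 6.1 ns */
    printf("sample interval: %.2f ns\n", 1e9 / pixclk_hz);

    /* Time-shared triggering: camera k is delayed by k / 200 s */
    for (int k = 0; k < 3; k++)
        printf("camera %d trigger delay: %.1f ms\n", k, 1e3 * k / frame_hz);

    return 0;
}
```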
After receiving the trigger command through the DS90LV047A, the camera serializes the captured image data into 4 LVDS data signals and 1 LVDS clock signal and transmits them to the DS90CR288A through the MDR26 interface connector; the DS90CR288A converts the serial data into 28 parallel signals and 1 clock signal, which are delivered to the TMS320DM8168 video capture port VIP0 for acquisition.
2.3 Image Storage Module
From the above design, the required system storage rate is about 160 MB/s, a large data volume. A large-capacity, high-speed solid-state drive is therefore selected and written through its SATA2 interface.
After the data acquisition is completed, the data is sent to VPDMA by configuring the HDVPSS subsystem, and finally transferred to the DDR memory. When the data volume of the DDR memory reaches the set data volume, an interrupt is generated. After the interrupt occurs, the DMA transfer between the memory and the solid-state hard disk is started according to the storage address, and the collected image is stored on the SSD through the SATA2 interface to realize data storage.
Then the timer is started to generate the next frame rate pulse to start the next cycle of data acquisition.
The external expansion memory is the DDR3 (1 600) memory supported by the system. With the storage controller's 32 b bit width, the memory rate can reach 32/8 × 1 600 M = 6.4 GB/s. In this mode acquisition and storage can proceed in parallel: the buffered capture data are moved into the DDR3 memory at a rate far higher than the amount of data collected per second at the port. Because this scheme acquires the cameras frame by frame in turn, and the data within a frame are already arranged compactly in sequence, data rearrangement is reduced to a minimum and only some auxiliary data need to be removed. The acquisition system packs the other related signals into frame and line form and keeps the camera clock signal consistent with the clock signal of the system capture port. A small amount of auxiliary data precedes the image signal and is simply skipped by setting the DMA start address after it. Therefore, when the system is running almost no other programs, the solid-state drive can hold the DMA control right for at least 80% of the time to store the image data from memory. With the selected drive's sustained write rate of 250 MB/s, 250 × 0.8 = 200 MB/s, which is greater than 160 MB/s, so the data collected each second can be stored in real time. After the data have been uploaded, the original data can be cleared to free up drive space.
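To illustrate the storage side, the following minimal user-space sketch writes frame-sized buffers sequentially to a file on the SSD. The mount point, file name, and frame size (one 1 024×1 024 frame at 12 b/pixel) are assumptions for illustration; in the real system the buffers would be filled by the VPDMA/DDR3 path before being written.

```c
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

/* One 1 024 x 1 024 frame at 12 b/pixel = 1 572 864 bytes */
#define FRAME_BYTES (1024 * 1024 * 12 / 8)

int main(void)
{
    /* Assumed mount point of the SATA2 solid-state drive */
    int fd = open("/mnt/ssd/capture.raw", O_WRONLY | O_CREAT | O_APPEND, 0644);
    if (fd < 0) { perror("open"); return 1; }

    void *frame = NULL;
    /* Page-aligned buffer so the same code could also be used with O_DIRECT */
    if (posix_memalign(&frame, 4096, FRAME_BYTES)) { close(fd); return 1; }
    memset(frame, 0, FRAME_BYTES);   /* placeholder: filled by the capture path in the real system */

    for (int i = 0; i < 60; i++) {   /* one second of 60 f/s data from the high-resolution channel */
        if (write(fd, frame, FRAME_BYTES) != FRAME_BYTES) { perror("write"); break; }
    }

    free(frame);
    close(fd);
    return 0;
}
```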
2.4 Peripheral Interface Module
Based on the rich peripheral interfaces of the TMS320DM8168, this system can flexibly design external interfaces to control peripheral devices and communicate with external processors. The interfaces available for selection include: two Gigabit Ethernet MACs (10 Mb/s, 100 Mb/s, 1 000 Mb/s) with GMII and MDIO interfaces; two USB ports with integrated 2.0 PHY; and dual DDR2/3 SDRAM interfaces (see Figure 2).
The two USB ports of the TMS320DM8168 can connect a keyboard and mouse while the collected image data are uploaded to the host computer, and the LCD and VGA interfaces can display the images directly. The serial port can also be used to communicate with the host computer and to control the Camera Link cameras used in this design. The Gigabit Ethernet interface, with its very high rate, supports high-speed transmission of the image data.
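As one example of using the Gigabit Ethernet interface, the sketch below sends a captured image buffer to the host computer over a plain TCP socket. The host IP address and port are placeholders, and the framing protocol between the board and the host is not specified by this design, so only the raw transfer loop is shown.

```c
#include <arpa/inet.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

/* Sends one image buffer to the host computer over TCP.
   The host IP address and port below are placeholders. */
int upload_frame(const void *buf, size_t len)
{
    int s = socket(AF_INET, SOCK_STREAM, 0);
    if (s < 0)
        return -1;

    struct sockaddr_in host = {0};
    host.sin_family = AF_INET;
    host.sin_port   = htons(5000);                          /* assumed host port */
    inet_pton(AF_INET, "192.168.1.100", &host.sin_addr);    /* assumed host IP */

    if (connect(s, (struct sockaddr *)&host, sizeof(host)) < 0) {
        close(s);
        return -1;
    }

    const char *p = buf;
    while (len > 0) {               /* keep sending until the whole buffer has gone out */
        ssize_t n = send(s, p, len, 0);
        if (n <= 0) {
            close(s);
            return -1;
        }
        p   += n;
        len -= (size_t)n;
    }

    close(s);
    return 0;
}
```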
The implementation of the above technology is mainly achieved by driving the peripheral interface through software programming. For specific solutions, see software design.
3 Software Design
This system uses the Linux operating system with a friendly interface, making operation flexible and allowing multiple tasks to run. The interface can be used to control the cameras and to capture, stop, display, and upload images. The software development consists of two parts: ported programs and self-developed programs. The software design is shown in Figure 4.
3.1 Ported Programs
The ported programs include Linux kernel, network card driver, USB 2.0 driver, LCD driver, serial port driver, VGA driver, and SATA2 driver. In this regard, TI provides good support. There is a Linux operating system specifically for DM8168, version Linux 2.6.37, which can be developed through the Linux EZ software development kit (EZ SDK) provided by TI.
3.2 Self-Developed Programs
3.2.1 Drivers
To operate in a standardized way under the Linux operating system, the image acquisition circuitry needs driver support for the image acquisition application. The acquisition circuitry can be divided into several functional modules for driver development: a camera acquisition driver (handling the data after it enters through VIP0) and a control driver (handling control of the timer); if the camera working state is to be changed according to the external environment, driver support for that is also required. The acquisition driver implements the open and close methods; the control part implements the open, close, and ioctl methods; adaptive rate adjustment requires the open, close, ioctl, and read methods. Device nodes are created in the /dev directory, and the application then operates on these device nodes.
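As a sketch of what such a self-developed acquisition/control driver might look like, the following minimal Linux character (misc) driver registers a /dev node and implements the open, release, and ioctl methods mentioned above. The device name, ioctl numbers, and the empty hardware hooks are assumptions for illustration, not the actual DM8168 driver code.

```c
#include <linux/module.h>
#include <linux/fs.h>
#include <linux/miscdevice.h>
#include <linux/ioctl.h>

/* Hypothetical ioctl commands for the control driver (values are illustrative) */
#define CAP_IOC_START _IO('c', 0)
#define CAP_IOC_STOP  _IO('c', 1)

static int cap_open(struct inode *inode, struct file *filp)
{
	/* Configure VIP0 and related capture hardware here (register setup omitted) */
	return 0;
}

static int cap_release(struct inode *inode, struct file *filp)
{
	/* Stop acquisition and release hardware resources */
	return 0;
}

static long cap_ioctl(struct file *filp, unsigned int cmd, unsigned long arg)
{
	switch (cmd) {
	case CAP_IOC_START:
		/* Start the frame-rate timer that triggers the cameras */
		return 0;
	case CAP_IOC_STOP:
		/* Stop the timer */
		return 0;
	default:
		return -ENOTTY;
	}
}

static const struct file_operations cap_fops = {
	.owner          = THIS_MODULE,
	.open           = cap_open,
	.release        = cap_release,
	.unlocked_ioctl = cap_ioctl,
};

/* Registers the device so that /dev/cam_capture appears (with udev) */
static struct miscdevice cap_dev = {
	.minor = MISC_DYNAMIC_MINOR,
	.name  = "cam_capture",
	.fops  = &cap_fops,
};

static int __init cap_init(void)
{
	return misc_register(&cap_dev);
}

static void __exit cap_exit(void)
{
	misc_deregister(&cap_dev);
}

module_init(cap_init);
module_exit(cap_exit);
MODULE_LICENSE("GPL");
```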
3.2.2 Application Program
The application is developed with the Qt development tools and is designed as a multi-threaded program with a main thread and an adaptive parameter-adjustment thread. It mainly implements the acquisition, stop, display, configuration, and upload functions, each corresponding to a button on the interface.
The program behind the acquisition button calls the open method of the device node, configures the corresponding hardware in the open method, registers the interrupt handler, and starts the timer to begin acquisition. The process is shown in Figure 5.
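A minimal user-space sketch of what the acquisition-button handler could do is shown below; the device node name and ioctl number are the illustrative ones used in the driver sketch above, not values from the actual system.

```c
#include <fcntl.h>
#include <linux/ioctl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

/* Illustrative ioctl number; it must match the value defined by the control driver */
#define CAP_IOC_START _IO('c', 0)

/* Sketch of the capture button's handler: open the device node, let the driver's
   open() configure the hardware, then start the trigger timer via ioctl. */
int start_capture(void)
{
    int fd = open("/dev/cam_capture", O_RDWR);   /* device node created by the driver */
    if (fd < 0) {
        perror("open /dev/cam_capture");
        return -1;
    }
    if (ioctl(fd, CAP_IOC_START) < 0) {          /* start the frame-rate timer */
        perror("ioctl CAP_IOC_START");
        close(fd);
        return -1;
    }
    return fd;   /* kept open while acquisition is running */
}
```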
Because the system already has a serial port driver, the configuration program can program the serial port directly. The adaptive rate-adjustment program starts a new thread from the main interface program; this thread reads data through the corresponding device node to determine whether adjustment is needed and, if so, resets the camera through the serial port device node or the control device node described above.
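Since the serial port driver is already ported, the configuration program can set up the Camera Link camera through standard termios calls. The sketch below opens an assumed serial device and puts it into raw 8N1 mode at an assumed 9600 baud; the actual device path, baud rate, and camera command protocol depend on the camera used.

```c
#include <fcntl.h>
#include <string.h>
#include <termios.h>
#include <unistd.h>

/* Opens the serial device used to configure the Camera Link camera and puts it
   into raw 8N1 mode; path and baud rate are assumptions for illustration. */
int open_camera_serial(const char *path)
{
    int fd = open(path, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;

    struct termios tio;
    memset(&tio, 0, sizeof(tio));
    tio.c_cflag = CS8 | CLOCAL | CREAD;   /* 8 data bits, ignore modem lines, enable receiver */
    cfsetispeed(&tio, B9600);
    cfsetospeed(&tio, B9600);
    tio.c_cc[VMIN]  = 1;                  /* blocking read returns after at least 1 byte */
    tio.c_cc[VTIME] = 0;

    tcflush(fd, TCIFLUSH);
    if (tcsetattr(fd, TCSANOW, &tio) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}
```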
4 Conclusion
The machine vision system constructed in this paper is an independent, controllable, small multifunctional system with its own operating system, implemented through combined hardware and software design. Its functional modules include video image acquisition and processing, video image storage, video image communication, and video image display. Using an advanced dual-core embedded processor, the video signals obtained from multiple image sensors are acquired in high-speed parallel, with lossless image compression and image fusion performed as needed. The data can be stored in real time at large capacity and communicated to the host computer through multiple interfaces. The system has a friendly human-computer interface and can drive multiple displays to perform functions such as high-definition display and information playback.
Because the platform runs a Linux operating system, system parameters can be set and functions selected without a host computer. The system can provide the high-definition target information required by airborne, missile-borne, and vehicle-mounted optoelectronic systems for high-speed scanning, rapid detection, active identification, and precision tracking, and is expected to find application in fields such as safe cities, the security industry, industrial control, medical education, logistics management, power-grid operation, smart homes, smart cars, and food safety.