Today, our lives depend heavily on sensors. As an extension of the human "five senses," sensors perceive the world around us and can even capture details the human body cannot, an ability the intelligent society of the future will depend on.
However, no matter how capable a single sensor is, it cannot meet the requirements of many scenarios on its own. For example, an expensive automotive lidar can determine from its point cloud that there is an obstacle ahead, but to know exactly what that obstacle is, the on-board camera is needed to "see" it; and to sense how the object is moving, millimeter-wave radar may be needed as well.
This process resembles the familiar parable of "the blind men and the elephant." Each sensor, given its own characteristics and strengths, sees only one aspect of the object being measured; only by combining all of that information can a more complete and accurate picture be formed. This method of integrating multiple sensors is called "sensor fusion."
A more rigorous definition of sensor fusion: the information-processing process in which computer technology is used to automatically analyze and synthesize information and data from multiple sensors or sources, under certain criteria, in order to complete the required decision-making and estimation. The sensors serving as data sources may be of the same type (homogeneous) or of different types (heterogeneous), but they are not simply stacked together; their data are deeply fused.
In fact, examples of sensor fusion are already common in our daily lives. Broadly speaking, sensor fusion technology serves three main purposes:
• Gain a global understanding. A single sensor may offer only one function or insufficient performance; only a combination of sensors can complete a higher-level task. For example, the familiar 9-axis MEMS motion sensing unit is actually a combination of a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis electronic compass (geomagnetic sensor). Only through such sensor fusion can accurate motion data be obtained, giving users a convincing immersive experience in high-end VR and similar applications.
• Refine detection granularity. In geolocation, for example, satellite positioning technologies such as GPS offer an accuracy of roughly ten meters and do not work indoors. Combining them with local positioning technologies such as Wi-Fi, Bluetooth, and UWB, or adding a MEMS inertial unit, can improve indoor positioning and motion-tracking accuracy by orders of magnitude.
• Achieve safety redundancy. Autonomous driving is the most typical example: the information obtained by each on-board sensor must be cross-checked and backed up by the others to ensure true safety. When the level of autonomy reaches L3 or above, millimeter-wave radar is introduced alongside the on-board camera; at L4 and L5, lidar is essentially standard equipment, and even data collected over V2X vehicle networking may be brought into the fusion.
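The 9-axis motion unit mentioned above illustrates fusion in miniature. Below is a minimal, simplified sketch of a complementary filter, one common way to blend gyroscope and accelerometer readings into a single pitch estimate; the sensor values and the blending weight `alpha` are hypothetical tuning choices, not values from any particular device.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate (degrees).

    The gyroscope is accurate short-term but drifts over time; the
    accelerometer is noisy but drift-free. Blending the two keeps the
    strengths of both.
    """
    # Integrate angular rate for the short-term (drift-prone) estimate
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Derive an absolute (drift-free) pitch from the gravity direction
    pitch_accel = math.degrees(math.atan2(accel_y, accel_z))
    # Weighted blend: trust the gyro short-term, the accelerometer long-term
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel

# Example: a stationary sensor tilted about 30 degrees
pitch = 0.0
for _ in range(300):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_y=0.5, accel_z=0.866, dt=0.01)
# pitch converges toward roughly 30 degrees
```

Neither sensor alone could produce this result: the gyroscope estimate would drift without bound, and the raw accelerometer angle would jitter with every vibration.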
In short, sensor fusion technology acts like a "coach" that assembles sensors with different strengths into a team and gets them working together to win the game.
After selecting the sensors to be fused, the next step is deciding how to fuse them. Sensor fusion architectures fall into three types according to where the fusion takes place:
• Centralized: each sensor sends its raw data directly to a central processing unit, which performs all fusion processing. The advantages are high accuracy and algorithmic flexibility; however, the large volume of data to be processed demands high computing power from the central processor, and data-transmission latency must also be considered, which makes this approach difficult to implement.
• Distributed: each sensor's raw data is first processed close to the sensor itself, and only the results are sent to the central processor, which fuses them into the final result. This method needs little communication bandwidth and offers fast computation and good reliability, but because the raw data is filtered and pre-processed, some information is lost, so in principle the final accuracy is lower than with the centralized approach.
• Hybrid: as the name implies, a combination of the two, with some sensors fused centrally and others fused in a distributed fashion. Because it combines the advantages of both, a hybrid fusion framework is highly adaptable and stable, but the overall system structure is more complex and incurs additional cost in data communication and computation.
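The distributed architecture can be sketched with a toy example: suppose each sensor node has already reduced its raw data to a local estimate plus a variance, and the central processor fuses these results by inverse-variance weighting, one common textbook approach. The numbers below are hypothetical.

```python
def fuse_estimates(estimates):
    """Central fusion step of a distributed architecture.

    Each node reports an (estimate, variance) pair; the center combines
    them with inverse-variance weighting, so more reliable nodes (lower
    variance) get proportionally more weight.
    """
    total_weight = sum(1.0 / var for _, var in estimates)
    fused = sum(est / var for est, var in estimates) / total_weight
    fused_var = 1.0 / total_weight  # fused result is tighter than any input
    return fused, fused_var

# Two nodes measuring the same distance; node A is noisier than node B
local_results = [(10.4, 4.0), (10.0, 1.0)]  # (meters, variance)
fused, fused_var = fuse_estimates(local_results)
# fused lies closer to the more reliable node B's reading of 10.0
```

Note that only two numbers per node cross the network, which is exactly why the distributed architecture needs so little communication bandwidth compared with shipping raw samples to the center.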
Sensor fusion solutions can also be classified by the stage of data processing at which fusion takes place. Generally speaking, data processing passes through three levels: data acquisition, feature extraction, and recognition and decision-making. Fusion at different levels involves different strategies and application scenarios and produces different results.
According to this idea, sensor fusion can be divided into data-level fusion, feature-level fusion and decision-level fusion.
• Data-level fusion: the raw data collected by multiple sensors is fused directly. However, data-level fusion can only be applied to data collected by sensors of the same type; it cannot handle heterogeneous data from different kinds of sensors.
• Feature-level fusion: feature vectors that reflect the attributes of the monitored object are extracted from the collected data, and fusion is performed on those features. This is feasible because a handful of key features can stand in for the full raw data.
• Decision-level fusion: building on feature extraction, each branch performs discrimination, classification, and simple logical operations to reach its own identification judgment; these judgments are then fused according to the needs of the application to make a higher-level decision. Decision-level fusion is generally application-oriented.
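The contrast between the lowest and highest levels can be shown with a toy sketch. The readings and labels below are hypothetical; data-level fusion averages raw samples from identical sensors, while decision-level fusion combines the independent verdicts of heterogeneous pipelines, here by a simple majority vote.

```python
from collections import Counter

def data_level_fusion(readings):
    """Data-level: directly combine raw samples from homogeneous sensors
    (here, by simple averaging)."""
    return sum(readings) / len(readings)

def decision_level_fusion(decisions):
    """Decision-level: each sensor pipeline classifies independently;
    the final verdict is the majority vote of their labels."""
    return Counter(decisions).most_common(1)[0][0]

# Three identical thermometers -> fuse the raw readings themselves
avg_temp = data_level_fusion([21.9, 22.1, 22.0])
# Camera, lidar, and radar pipelines each output a label -> fuse the decisions
label = decision_level_fusion(["pedestrian", "pedestrian", "cyclist"])
```

The voting step works even though the three pipelines process completely different raw data, which is precisely why decision-level fusion copes with heterogeneous sensors where data-level fusion cannot.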
There is no fixed rule for choosing a sensor fusion strategy and architecture; the choice depends on the specific application, weighed against factors such as computing power, communication, security, and cost.
Whichever architecture is used, you will find that sensor fusion is largely a software undertaking, and the main focus and difficulty lie in the algorithms. Developing efficient algorithms suited to the actual application has therefore become the top priority of sensor fusion development.
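The classic workhorse among these algorithms is the Kalman filter, which fuses an uncertain prediction with an uncertain measurement. A minimal one-dimensional measurement update, with hypothetical numbers, looks like this:

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman measurement update.

    Fuses a prior estimate x (with variance p) with a new sensor
    measurement z (with variance r). The Kalman gain k decides how
    much of the discrepancy (z - x) to accept.
    """
    k = p / (p + r)          # gain: high when the prior is uncertain
    x_new = x + k * (z - x)  # corrected estimate, pulled toward z
    p_new = (1 - k) * p      # uncertainty shrinks after fusing
    return x_new, p_new

# Fuse a prediction (x=5.0, variance 2.0) with a sensor reading (z=6.0, variance 1.0)
x, p = kalman_update(5.0, 2.0, z=6.0, r=1.0)
# the fused estimate lands between 5.0 and 6.0, closer to the more certain source
```

The same update, applied recursively as each sensor reports in, is what keeps an estimate's uncertainty shrinking over time, and it is this iterative structure that makes the Kalman filter a natural fit for both centralized and distributed architectures.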
On the algorithm side, the introduction of artificial intelligence is a clear trend in sensor fusion. Artificial neural networks can imitate the judgment and decision-making processes of the human brain and can continuously learn and evolve at scale, which undoubtedly accelerates the development of sensor fusion.
Although software is critical, hardware also has a role to play in the fusion process. If all sensor fusion processing runs on the main processor, its load becomes very heavy. A popular approach in recent years is therefore to introduce a sensor hub, which processes sensor data independently, without involving the main processor. This reduces the main processor's load and, by shortening its active time, lowers system power consumption, which is essential in power-sensitive applications such as wearables and the Internet of Things.
Market research data show that demand for sensor fusion systems will grow from $2.62 billion in 2017 to $7.58 billion in 2023, a compound annual growth rate of roughly 19.4%. Two clear trends can be expected in the future development of sensor fusion technology and its applications:
• Driven by autonomous driving, the automotive market will be the most important arena for sensor fusion technology and will spawn more new technologies and solutions.
• The diversification of applications will also accelerate: beyond the established high-performance, safety-critical uses, sensor fusion technology will find huge room to grow in consumer electronics.
In short, sensor fusion gives us a more effective way to gain insight into the world, sparing us the predicament of the blind men and the elephant and, on the strength of that insight, helping shape a smarter future.
Latest update time: 2024-11-16 12:00