
A new approach to reducing lidar costs

Latest update time:2024-09-12


Source: Compiled from semiengineering.

Using light to move data over short distances is becoming more common because there is more data to move and photons are faster and cooler than electrons.


Using fiber for mission-critical communications is well established. For decades, it has been the PHY of choice for long-distance communications because it does not suffer from the attenuation losses of copper wire. It has also become the dominant way to move data back and forth between servers and storage because it is virtually immune to interruptions, is extremely fast, and consumes less energy than copper wire. Today, all major foundries produce photonics integrated circuits (PICs), and these devices are now being pushed hard to reduce the cost of LiDAR.


PICs bring optical functions together on a single chip, performing everything from splitting light by wavelength to optical-to-electrical conversion. They enable higher bandwidth and lower signal loss while providing a cooler operating environment. Designers choose the wavelength of the PIC based on the intended application, which can be achieved by selecting from a range of materials, including Si/SiO2, silicon nitride, and indium phosphide.


“Photonic chips are becoming cheaper and more available because of data communications driving the foundries,” said Gilles Lamant, distinguished engineer at Cadence. “If large foundries start to have production lines dedicated to photonics, they will want to keep them full, which will drive down the price of photonic chips and will help other markets significantly.”



New photonics applications range from encoding and networking in quantum computers to ubiquitous sensing devices, from factory inspection to medical equipment and ADAS.


“Most people know photonic integrated circuits for transceivers in data centers, which convert high-speed electrical signals into optical signals,” said Twan Korthorst, executive director of Synopsys Photonic Solutions. “Once you go into the optical domain, you can transmit very long distances for almost nothing. This is something that everyone is excited about — especially now with the boom in artificial intelligence, where everyone is investing in bigger data centers, connecting more computing power and memory to train and infer AI models. To do this, people are accelerating research in optical transceivers using integrated photonic chips. This is just a starting point, but if you can make a chip that can transmit, process and manipulate light, you can also use it in other places besides optical transceivers.”


Using photonics to reduce the cost of lidar


A common use case for photonics is LiDAR (Light Detection and Ranging), which is the basis for state-of-the-art ADAS sensing, but it can also be used for factory floor management, drone detection for counterintelligence, and seafloor mapping (aka bathymetry). It’s as integral to NOAA’s data collection as it is to studying repetitive motion on an assembly line.


Both radar and lidar exploit the ability of electromagnetic waves to bounce off surfaces, allowing them to detect and analyze objects and topological features in detail. While the physics behind such systems is complex, the principles are easy to grasp.


"Essentially, lidar is a range-finding device used to measure the distance to an object," the Synopsys website states. "The distance is measured by sending short laser pulses and recording the time interval between the emitted light pulse and the detection of the reflected (backscattered) light pulse."
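The round-trip timing described above reduces to a one-line formula: distance is the speed of light multiplied by the round-trip time, divided by two (the pulse travels out and back). A minimal sketch of that calculation, with a function name of our own choosing rather than anything from the article:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_s: float) -> float:
    """Distance to a target from a pulse's round-trip time of flight.

    The pulse covers the distance twice (out and back), hence the
    division by two.
    """
    return C * round_trip_s / 2.0

# A pulse returning after ~667 ns corresponds to a target roughly 100 m away.
```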


The result is a point cloud that looks like a terrain-style 3D image, where the wavelength of the light and the number of laser pulses per second determine the fineness of detail. LiDAR operates at optical frequencies of roughly 194THz to 750THz, corresponding to wavelengths in the near-infrared (NIR, 800nm to 1,550nm) and visible (400nm to 700nm) bands, with some systems reaching into the mid-infrared (Mid-IR, above 2,000nm). The choice of wavelength depends on the application, required range, resolution, and environmental conditions. Because longer wavelengths scatter less, conventional wisdom has held that 1,550nm is better for penetrating fog. However, one study concluded that "the difference between the extinction coefficients of 905nm and 1,550nm is less than 10% when transmitting the same power in fog."


However, one could argue that even that small a difference is critical from a brake-actuation perspective. “There are a few different implementations of lidar using photonics,” said Tom Daspit, product marketing manager at Siemens EDA. “In the Bay Area, what we often see is a spinning disk on top of the car. It’s a laser that spins, and there’s a receiver that spins with it. The lidar devices being developed will be buried in the rearview mirror, headlights, or somewhere else on the car. Tesla doesn’t use lidar, but it uses optics. It looks at the pictures and tries to process them. Some other self-driving cars will also use lidar. It depends on how they intend to achieve their goals. To make this happen in a car, reliability has to be improved, and cost has to be reduced dramatically.”


High costs have dampened enthusiasm for lidar. “The biggest debate about lidar is price,” said Cadence’s Lamant. “To put it on cars, it has to be cheaper than it is now. Right now, it’s too expensive. Most lidar companies are making good progress, but they still have problems with price.”


Others agree. Elon Musk has refused to use lidar in Tesla cars, saying it is too expensive. Instead, Tesla uses 2D computer vision. Lidar suppliers have objected, saying 2D imaging doesn’t capture the world in its entirety and can’t ensure road safety.


“A camera might miss an object because of reflections from the sun or oncoming headlights, but lidar can eliminate that reflection and detect a person in the middle of the road,” wrote Sudip Nag, then vice president of software and AI products at Xilinx. (Nag is now vice president of AMD’s AI group.) Cameras shouldn’t be ruled out, though. Last year, NVIDIA researchers published a paper showing how a camera-based system could handle 3D perception.


Currently, the main lidar hardware is a mix of compact solid-state and bulky mechanical technologies. MEMS-based scanning is challenging mechanical lidar, with RoboSense the leader, according to Yole Research. MEMS mirrors enable smaller devices that could eventually replace the large spinning units on vehicles and help drive down prices, an evolution similar to backyard satellite dishes giving way to compact flat antennas. However, there are concerns that MEMS mirrors are too small for long-range road detection. MEMS lidar suppliers such as Germany's Blickfeld believe this problem can be solved by scaling up the mirror and fine-tuning the spatial filtering.


Despite the concerns, some still expect lidar to be the technology of the future. Yole predicts that the global automotive lidar market will grow from $538 million in 2023 to $3.6 billion in 2029, a compound annual growth rate of 38%. The lidar market is currently dominated by Chinese companies RoboSense and Hesai, and most of the global growth is expected to continue to be driven by Chinese OEMs, which will launch 128 lidar-equipped models this year and next.


According to a recent paper in the journal Nature Communications, the current cost of lidar is estimated to be between $500 and $1,000. “This downward trend is expected to continue, potentially reaching around $100 in the next few years,” the authors said. “Global lidar penetration is currently around 0.5% of all passenger cars sold. As lidar prices approach $100, we expect this figure to surge to more than 10%.”


One answer to lowering manufacturing costs is silicon photonics, since such chips can be made using CMOS processes. “Lidar is the natural evolution of everything that has been done in data centers to date,” said Synopsys’ Korthorst. “If you Google solid-state lidar or silicon photonic lidar, you’ll find a ton of research, startups, successes and failures. Silicon photonics is very much alive. You can build a solid-state lidar today, using the same manufacturing methods and the same design tools as an optical transceiver. On the other hand, you also need to consider that because you can’t see it, you need proper eye protection.”


The industry’s answer to the eye-safety issue is that lidar products adhere to the Class 1 eye-safety standard (IEC 60825-1:2014), balancing laser power against wavelength (typically 1,550nm) to stay within eye-safe limits. However, Blickfeld said there could be special cases where the power is amplified beyond those limits, although this is unlikely.


Currently, photonics still plays a small role, although it is expected to grow. “Optical technologies, including FMCW (frequency-modulated continuous wave), are not expected to be deployed before 2028, and then only in small volumes,” Yole said. “This technology is still in its emerging phase and must offer a better cost-to-performance ratio than hybrid solid-state.”


In fact, much of the cost of LiDAR comes from the high computational requirements of data processing, which FMCW vendors are trying to reduce through more integration, while FPGA vendors are also integrating DSP.


FMCW can be used for both LiDAR and radar. Although the first patent for the technology was issued in 1928, it has recently been developed as a possible solution to LiDAR's cost issues. The industry's classic technology is Time of Flight (ToF).


“Time of flight is basically like echolocation. You send out a pulse of light and see how long it takes for the reflection to come back. From the delay, you can determine the distance of the object,” said Mehdi Asghari, CEO of FMCW lidar startup SiLC. “95% of the market uses time of flight. The radar industry also started with time of flight and has now completely moved to FMCW. The main difference between lidar and radar is the frequency of the electromagnetic waves. This, in turn, forces many changes in the technical implementation of both.”


Asghari said that despite its popularity, ToF also brings performance drawbacks to radar and lidar, which are usually related to accuracy. "For example, when you use a pulse system to measure the light reflected back, the system does not distinguish between your pulse and every pulse from other users. You may measure the pulse of other systems and think that it is your pulse that is reflected back. This is called multi-user interference. ToF systems are also susceptible to performance degradation due to background light, such as direct sunlight."


FMCW solves this problem by sending a highly coherent beam of light at a constant amplitude, varying only the frequency linearly with time. “You receive reflected light that is delayed compared to the light you sent out, and you mix them together coherently,” he said. “The beam that comes back is the beam you sent out earlier, maybe by just a few microseconds, so it has a different frequency than the beam you’re currently generating. When you beat those two beams together in a coherent mixer, you get a beat-frequency signal that is proportional to the frequency offset between the transmitted and returned beams. When you detect this in a coherent receiver, you measure the beat frequency, and since that frequency offset is proportional to the round-trip delay, and hence to distance, you can get depth information by measuring the frequency of the light.”
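The beat-frequency arithmetic Asghari describes can be sketched in a few lines. Assuming a linear chirp sweeping a bandwidth B over a chirp time T (these parameter names are ours, not from the article), a target at distance d delays the return by 2d/c and produces a beat frequency of (B/T) * 2d/c, which inverts to a range estimate:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def fmcw_range(f_beat_hz: float, bandwidth_hz: float, chirp_time_s: float) -> float:
    """Range from the beat frequency of a linear FMCW chirp.

    The chirp sweeps `bandwidth_hz` over `chirp_time_s`, so its slope is
    S = B/T in Hz/s.  A target at distance d delays the return by 2d/c,
    giving f_beat = S * 2d/c; solving for d gives the line below.
    """
    slope = bandwidth_hz / chirp_time_s
    return C * f_beat_hz / (2.0 * slope)

# Example: a 1 GHz sweep over 10 us puts a 100 m target at a beat
# frequency of roughly 67 MHz.
```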


In other words, FMCW measures the change in frequency of light, rather than measuring the time delay of a light pulse like ToF. The ability to measure tiny changes in the frequency of light also allows you to directly measure the speed or motion of an object.


"When you send light toward an object moving toward you, the frequency of the returning light is shifted up. If the object moves away from you, the frequency of the returning light is shifted down. If you can measure the Doppler shift of the light's frequency, you can directly measure distance and speed at the same time," Asghari said.
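For a monostatic lidar, the Doppler relation above is f_d = 2v/λ, where v is the target's radial velocity and λ the laser wavelength. A minimal sketch of the inversion, under that standard assumption (the function name is ours):

```python
def doppler_velocity(doppler_shift_hz: float, wavelength_m: float) -> float:
    """Radial velocity of a target from the Doppler shift of its return.

    Monostatic geometry assumed: f_d = 2*v / wavelength, so a positive
    shift means the target is approaching.
    """
    return doppler_shift_hz * wavelength_m / 2.0

# At 1,550 nm, a car approaching at 30 m/s shifts the return by ~39 MHz.
```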


That information is harder to get with a ToF system. “You have to take a continuous measurement and then, based on the change in distance of the object, determine if they are coming toward you or moving away, and by how much, then determine if there is velocity,” Asghari said. “If you don’t measure a point accurately, or if there are missing points or other inaccuracies, then your velocity measurement is going to be problematic.”


This is also why ToF systems are expensive. To compensate, they need extremely high refresh rates so they can measure quickly and then average to get an accurate speed estimate. The problem is that this generates a lot of data, which then needs to be processed. FMCW avoids this: it generates less data, yet is very accurate and requires less computation.
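The contrast can be made concrete. A ToF system infers velocity by differencing successive range measurements, which divides any range noise by the frame interval and so amplifies it. A sketch of that estimate (names and figures are ours, for illustration):

```python
def velocity_from_ranges(r_prev_m: float, r_curr_m: float, dt_s: float) -> float:
    """Radial velocity estimated by differencing two ToF range readings.

    Any noise in the two range measurements is divided by dt, so short
    frame intervals amplify range noise into large velocity errors,
    which is why ToF systems lean on high refresh rates and averaging.
    """
    return (r_curr_m - r_prev_m) / dt_s

# With 2 cm of range noise per reading and a 10 ms frame interval, the
# differenced estimate carries on the order of 2-3 m/s of velocity noise.
```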


A major benefit of this immediate and direct speed measurement is that it enables fast and predictable system reactions. For example, it allows the system to detect a child chasing a ball on the road. Even a few milliseconds of warning or a quicker reaction can mean the difference between life and death.



Conclusion


Looking ahead, Cadence's Lamant believes positive changes are coming because data communications are forcing foundries to increase production, which will reduce chip prices. "This will bring opportunities as prices drop significantly, because not everyone can afford these systems right now. This will drive an explosion in applications."


Still, some challenges are often overlooked. “EDA can play a role in solving these challenges,” Lamant said. “Photonics by itself is not useful. You can’t do anything with photonics. You need to have electronics, and that’s one of the biggest challenges. Most of the startups I see fail because of integration issues, not because they don’t have good photonics equipment. They fail because they can’t build a system and produce it at scale. It’s not that we’ve run out of ideas in photonics. It’s that the smart people are so smart in one area, they forget that they need another. I have a barometer. When I see a startup where the CEO, CTO, and most of the other people are photonic engineers, I strongly suspect they’re not going to succeed. Everyone needs to remember that whether a component is part of data communications or a sensor, that component needs to talk to the whole system. That’s where EDA can help. We have the whole ecosystem, and photonics is just one part of it.”


Reference Links

https://semiengineering.com/photonics-could-reduce-the-cost-of-lidar/








