Sony develops depth sensor to help improve lidar reliability

Publisher: shtlsw | Last updated: 2021-02-22 | Keywords: Sony

Sony is developing a silicon-based vision sensor for autonomous vehicles as it looks to enter its next big electronics market: automotive. The company has extensive experience in vision processing and hopes to bring costs down.



Sony has announced the development of a stacked direct time-of-flight (dToF) depth sensor using single-photon avalanche diode (SPAD) pixels for automotive lidar.
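The dToF principle behind such a sensor is straightforward: a laser pulse is emitted, a SPAD pixel registers the returning photon, and the distance follows from half the round-trip time at the speed of light. A minimal sketch of that calculation (illustrative only, not Sony's implementation; the function name and the 2 µs example are assumptions):

```python
# Illustrative sketch of direct time-of-flight (dToF) ranging:
# distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def dtof_distance_m(round_trip_s: float) -> float:
    """Distance to the target, given the measured photon round-trip time."""
    return C * round_trip_s / 2.0

# A photon returning after 2 microseconds corresponds to roughly 300 m,
# on the order of the long ranges targeted by automotive lidar.
print(round(dtof_distance_m(2e-6)))  # ~300
```

In practice a dToF lidar builds a histogram of many such photon arrival times per pixel and takes the peak, which is what makes single-photon detectors like SPADs attractive for this architecture.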



A SPAD is a pixel structure that uses avalanche multiplication to amplify the charge from a single incident photon into a cascade of electrons, allowing even very weak light to be detected.


Sony's proprietary SPAD pixel structure maintains stable photon detection efficiency and response speed even under harsh conditions from -40°C to 125°C, helping to improve LiDAR reliability.



It is understood that Sony will have invested more than US$6.5 billion in its image sensor business by March 2021 and is building a new factory in Japan.

