
One Hour with Amnon Shashua: A Detailed Look at Mobileye's Advanced Autonomous Driving Journey

Latest update time: 2019-02-08

Text | Misty

Report from Leiphone.com (leiphone-sz)

Leifeng.com New Intelligent Driving note: At CES each year, Mobileye not only reviews the past year and previews the one ahead, but also offers an in-depth analysis of where the industry stands and where it is going. CEO Amnon Shashua is Mobileye's soul and its technology evangelist, and his keynote speeches are always a highlight.

In this article, you will learn about:

1. Mobileye autonomous driving solutions and strategies;

2. Maps and REM;

3. Global trends and evolution of ADAS.

This article was compiled by Leifeng.com New Intelligent Driving from Amnon Shashua's speech (lightly edited).


Four key figures in 2018: 28, 20, 7, 56

2018 was a fruitful year for Mobileye.

We secured 28 new design wins that went into production, involving 24 automakers and 8 Tier 1 suppliers. Sixteen of those customers are from China, which has become one of Mobileye's key strategic regions.

At the same time, we launched 20 programs covering 78 vehicle models. We are not only preparing for the future but also contributing to the industry as it is today.

Seven of these programs use our latest chip, the EyeQ4, a very powerful device that has already shipped in BMW's X5.

The next number is 56. We have 56 automotive products with advanced driver assistance features, bringing users value beyond safety.

Let's look at the next slide: EyeQ-series chip sales from 2014 to 2018.

We chose 2014 because that was the year of our IPO. From 2014 to 2018 we grew at a 46% compound annual rate. To date, we have sold 32 million EyeQ chips, which means 32 million cars are powered by Mobileye technology.
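
As a back-of-the-envelope illustration (not a figure from the talk), four years of 46% compound annual growth multiplies annual volume by roughly 4.5x:

$$ V_{2018} = V_{2014}\,(1 + 0.46)^{4} \approx 4.54\,V_{2014} $$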

Among Euro NCAP's five-star cars, 13 are equipped with Mobileye technology, which shows that our leadership in the automotive field keeps growing stronger.

This is the BMW X5; its three front cameras are powered by EyeQ4, which strengthens recognition across the board, including red-light warning, detection of surrounding cars and cyclists, and emergency braking.

At the same time, we are working with the Volkswagen Group on an L2+ system that combines a front camera with Roadbook technology.

Coming back to the car itself: how do we make the mapping system add more value to vehicle development?

For example, when a car drives on a city road where lane markings are unclear or missing altogether, the mapping system can still identify the road. Because many cars travel the same road, the system can collect their data and aggregate it in the cloud, so the road can be recognized even where no markings exist.

Beyond traffic lights, the mapping system can also improve ACC (adaptive cruise control) and LKA (lane-keeping assist). We expect to release Volkswagen cars equipped with EyeQ4 and the advanced map system in 2019 or 2020.

We have also signed an agreement with China's Great Wall Motors. Over the next three to five years, Great Wall will integrate Mobileye-based L0–L2+ ADAS systems into a series of models. To adapt to China's distinctive road conditions, the two companies will also jointly develop L3 and higher-level driving systems alongside the ADAS work.


Global Development Trends of ADAS

Next, let’s take a look at the global development trend of ADAS.

According to Euro NCAP data, ADAS coverage in cars was 12.8% in 2016 and 17.6% in 2018, and is expected to reach 20.3% in 2020 and 26.5% in 2022.

We can see that the momentum of ADAS is encouraging: assisted-driving technology has not stood still, it keeps evolving, and its penetration in cars keeps growing.

This slide is very important: it shows our overall strategy.

Our strategy is generally divided into two categories: autonomous driving and assisted driving. On the one hand, assisted driving can promote autonomous driving. On the other hand, autonomous driving technology can also promote the development of assisted driving technology.

We have long been thinking about how to transfer each capability of autonomous driving into assisted driving to push assisted driving forward. In assisted driving we face one very clear constraint: cost. Assisted driving is highly cost-sensitive, whereas for autonomous driving cost is much less of a problem.

So, how can we use autonomous driving to promote the development of assisted driving?

We have thought a lot about this question. I would like to share one of our thoughts with you. In the field of autonomous driving, we have two revolutions, but people like to confuse them.

One of them is the transformation of transportation, which is deeply affected by autonomous driving. With the development of autonomous driving, transportation has undergone major changes, which are manifested in countless forms: smarter cities, fewer parking lots, and so on.

Another change is a life-saving one: if every car on the road is a self-driving car, no one will die in traffic accidents.

Assisted driving has already begun this life-saving revolution. It has not completed it yet, but through assisted driving we can build the right technology (I will explain it in detail at the end of this talk) to create a "Vision Zero" future.

"Vision Zero" means that if every car is equipped with an autonomous driving system, the rate of vehicle accidents can be negligible or even zero. The life-saving revolution comes from ADAS, and the traffic revolution comes from autonomous driving.

We separate the two changes so that we know what technologies are needed for each.


Detailed explanation of Mobileye's autonomous driving solution

Next, let's talk about our autonomous driving solutions. In general, our solutions are divided into four categories:

  • Visual perception and sensor fusion

  • Compute platform

  • Driving policy and RSS

  • Dynamic mapping

Visual perception and sensor fusion concern sensors and data: data collected by cameras, radars, and lidars enters the computing system to build a 360-degree environment model covering roads, traffic lights, road signs, and so on.
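
As a rough illustration only (hypothetical field names; Mobileye's internal data structures are not public), such a 360-degree environment model might be organized like this:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TrackedObject:
    kind: str                          # "car", "cyclist", "pedestrian", ...
    position_m: Tuple[float, float]    # (x, y) in the vehicle frame, meters
    velocity_mps: Tuple[float, float]  # (vx, vy), meters per second

@dataclass
class EnvironmentModel:
    """One fused 360-degree snapshot built from cameras, radar, and lidar."""
    timestamp_s: float
    objects: List[TrackedObject] = field(default_factory=list)
    lane_markings: List[List[Tuple[float, float]]] = field(default_factory=list)
    traffic_lights: List[dict] = field(default_factory=list)  # state + position
    road_signs: List[dict] = field(default_factory=list)
```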

At the same time, we need a computing platform that can support this volume of computation. It must be very powerful, because the workload is enormous; and given cost constraints, it must also be very efficient.

In addition, we also need driving strategies and RSS to ensure driving safety while achieving a balance between safety and legality.

Finally, we need dynamic maps, a technology we created three years ago; I will come back to how we have upgraded it later.

Now let’s talk about the five areas that Mobileye is currently working on:

  • Open EyeQ5 (open-architecture EyeQ5): Intel has its own silicon photonics production line that can produce the chips needed for lidar. The EyeQ5 also has an open architecture, so customers can write their own code on the chip and do the sensor fusion themselves.

  • Closed EyeQ5: this line includes not only the EyeQ5 but also our previous-generation chips, the EyeQ4 and EyeQ3, which we currently use for assisted driving.

  • Surround Vision: It is not only used in assisted driving, but also in autonomous driving.

  • AV Series (Autonomous Driving Vehicle Series): Includes 360-degree vision, maps, driving strategies, sensors, and more.

  • AV Series + MaaS platform (autonomous driving series plus MaaS platform): in addition to the AV Series, this includes the MaaS (Mobility-as-a-Service) software system.

Please remember that everything we have related to autonomous driving is related to these five aspects.

Next, let’s talk about visual perception.

Visual perception is a complex subject. For now I only want to discuss its basic principle, which I call "true redundancy". Our current focus is the camera: cameras play a very important role in self-driving vehicles, and our goal is to use them to achieve self-driving.

However, achieving autonomous driving with cameras alone is quite difficult, because cameras do not provide direct 3D information; like our eyes, what they deliver is limited in that respect.

To realize autonomous driving we need 3D information. The camera has high resolution, but in a developer's eyes it provides a "lazy" kind of vision. So we also need other sensors, such as radar and lidar, to give us direct 3D information.

But this raises a problem: piling on sensors creates redundancy. Our answer is to make the camera powerful enough to operate as a complete, end-to-end system on its own, so that the other sensors become icing on the cake and the redundancy becomes true redundancy.

This is our philosophy. We are not claiming cameras solve every problem, nor do we deny the need for sensors such as radar and lidar.
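
The arithmetic behind true redundancy (an illustration with made-up failure rates, not numbers from the talk): if the camera channel and the radar/lidar channel each build a complete environment model and fail independently, the probability of both failing at once is the product of their individual failure probabilities:

$$ P(\text{system failure}) = p_{\text{camera}} \times p_{\text{radar/lidar}}, \qquad \text{e.g. } 10^{-4} \times 10^{-4} = 10^{-8} $$

Fusing raw sensor data into a single perception chain, by contrast, yields one channel whose failures are shared, which is why sensors added on top of a complete camera system are described above as icing on the cake.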

So we're doing two things.

First, we need to find the right way to achieve real rather than unnecessary redundancy.

In addition, we are also doing something more important, which is to migrate autonomous driving technology to assisted driving to reduce the cost of assisted driving.

Today's sensor suites cost tens of thousands of dollars, and optimistically they may drop to a few thousand dollars in the future. That is still too expensive for large-scale use. So how do we cut costs and achieve autonomous driving at scale?

The answer is cameras. Radar and lidar are both quite expensive, but cameras are cheap. They are the cheapest sensor you can imagine. You can buy a good quality camera for $20.

So if we want to reduce costs, we must focus on cameras. And if we want to drive the transformation of assisted driving, we have to take the harder route first: using cameras to achieve autonomous driving.

After this path is established, we will let it influence the development of assisted driving. This is a strategic way of thinking. We first use real redundancy to achieve autonomous driving, and then let autonomous driving promote assisted driving.


Mobileye’s autonomous driving strategy

Let’s talk about autonomous driving strategies.

If you go to our booth, you can experience our self-driving car in VR. There are 12 cameras in total (3 cameras in the front, 2 cameras in the corners facing forward, 2 facing backward, 2 facing the side, and 3 for parking), no other sensors, no GPS.

The top of the picture shows what the camera sees, and the right side shows a 3D map of the road conditions. Let's focus on the right side. The blue car represents the self-driving car. We can see that it crosses an intersection, gives way to a red car, and stops to wait for a passerby who suddenly enters the road. This 3D map is also achieved by the camera.


The Computing Platform

First, let's review our chips. The EyeQ4 was released in 2018, and the EyeQ5 followed just this past December; it is 10 times more powerful than the EyeQ4.

The EyeQ5 has already received orders, and we will start mass production in March 2021.

Overall, the EyeQ5 is a very powerful, low-power chip, and it is offered silicon-only as an "open" chip that lets third parties run their own code on it, which benefits not only autonomous driving but assisted driving as well.

We are working with Aptiv to build BMW's 2021 self-driving car for mass production, which is also equipped with the EyeQ5 chip.


Driving Strategies and RSS

Safety can be divided into functional safety and nominal safety. Most people focus on functional safety; we focus on nominal safety, that is, designing so that accidents are avoided in the first place. From the very start of system design, we must ensure the system poses no latent hazard to society and achieves safe driving.

Last year, working with regulators, we proposed the RSS (Responsibility-Sensitive Safety) model: a set of mathematical formulas that translates human ideas and intuitions about safe driving into formulas and calculation methods, defining what kind of driving behavior counts as safe.
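
To give a flavor of those formulas: the published RSS paper ("On a Formal Model of Safe and Scalable Self-driving Cars", Shalev-Shwartz, Shammah, and Shashua, 2017) defines the minimum safe longitudinal distance between a rear vehicle at speed $v_r$ and a front vehicle at speed $v_f$ as

$$ d_{\min} = \left[ v_r\,\rho + \tfrac{1}{2}\,a_{\max,\mathrm{accel}}\,\rho^2 + \frac{\left(v_r + \rho\,a_{\max,\mathrm{accel}}\right)^2}{2\,a_{\min,\mathrm{brake}}} - \frac{v_f^2}{2\,a_{\max,\mathrm{brake}}} \right]_+ $$

where $\rho$ is the response time, $a_{\max,\mathrm{accel}}$ is the worst-case acceleration of the rear car during that response time, $a_{\min,\mathrm{brake}}$ is the gentlest braking the rear car is then assumed to apply, $a_{\max,\mathrm{brake}}$ is the hardest braking the front car might apply, and $[x]_+ = \max(x, 0)$. As long as the actual gap stays above $d_{\min}$, the rear car can stop in time even in the worst case.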

The RSS model proposes that safe driving requires the following three points:

1. Reasonableness. The definition should match human understanding and judgment of "driving with due care", rather than being defined arbitrarily.

2. Validity. A reasonable definition may still be completely useless in practice.

For example, a definition that sounds good is: when a car changes lanes, vehicles in other lanes must not have to change speed, and should not be affected by the lane change at all. But this "cautious"-sounding definition fails in many real situations, for instance with aggressive drivers, or simply on busy California roads, where other vehicles have to slow down to open a gap for the lane-changing car. So a definition of safe driving must be not only reasonable but also valid.

3. Verifiability, that is, the definition can be checked. It must be possible to verify, together with the machine, that the definition is correct and effective, and to prove that there is no butterfly effect.

By "butterfly effect" I mean a small, unintentional action at the start that, amplified by other actions in the system, eventually leads to an accident.

Next, let’s take a look at the driving strategy under the RSS model framework.

We divide driving strategies into four categories: strategy, tactics, path planning, and control.

For example, a strategic decision is "I want to change lanes". The tactical layer then works out the consequences: having decided to change lanes, I must choose which car to yield to and which car I expect to yield to me; the gap between those two cars is the space into which I will merge.

This kind of decision changes in real time. I may have decided which car should let me in, but if that car does not give way and I insist on my plan, an accident may follow, so I change my mind. A tactical decision is a "momentary" decision that shifts as the situation shifts.

Machine learning plays a very important role in both of the above decisions.

Next comes path planning, which is where RSS enters: it plans the vehicle's trajectory to execute the tactical decision, and that trajectory must be safe. What counts as a safe trajectory? That is exactly what RSS defines.

Finally, there is control. Once the trajectory is planned, the car must be controlled to follow it, deciding, for example, when to brake.

All of these strategies are aimed at achieving safe driving.
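
A toy sketch of how the tactical layer might apply the RSS gap formula above to decide whether a lane change is currently safe (hypothetical parameter values and function names, not Mobileye's implementation):

```python
from dataclasses import dataclass

@dataclass
class Car:
    pos: float    # longitudinal position along the target lane, meters
    speed: float  # meters per second

def rss_min_gap(v_rear, v_front, rho=0.5,
                a_accel=3.0, a_brake_min=4.0, a_brake_max=8.0):
    """Minimum safe longitudinal gap per the RSS formula shown earlier."""
    d = (v_rear * rho
         + 0.5 * a_accel * rho ** 2
         + (v_rear + rho * a_accel) ** 2 / (2 * a_brake_min)
         - v_front ** 2 / (2 * a_brake_max))
    return max(d, 0.0)

def lane_change_gap_ok(ego, lead, follower):
    """Tactics: merge only if the gap between the car we yield to (lead)
    and the car we expect to yield to us (follower) leaves an RSS-safe
    margin on both sides. Re-checked every cycle, so if the follower
    closes the gap, the decision is reversed."""
    front_ok = lead.pos - ego.pos >= rss_min_gap(ego.speed, lead.speed)
    rear_ok = ego.pos - follower.pos >= rss_min_gap(follower.speed, ego.speed)
    return front_ok and rear_ok

ego = Car(pos=0.0, speed=20.0)
lead = Car(pos=50.0, speed=20.0)       # the car we give way to
follower = Car(pos=-40.0, speed=18.0)  # the car asked to give way to us
print(lane_change_gap_ok(ego, lead, follower))  # True with these numbers
```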

Below are some examples. In every example I show, the in-car view is produced by cameras alone, with no other sensors; the bird's-eye view, of course, is shot from a drone. In this scene, a car is parked in the middle of the road, so the cars on the road begin changing lanes. The two blue cars carrying our logo are the self-driving cars. Their driving behavior is very similar to a human's, and they complete the lane change successfully.

The above is the view from inside the car. The blue car with the logo on the right is the self-driving car; the red car ahead is the one it decided to yield to, and the green car is the one it decided to cut in ahead of. That is a tactical decision, made in a split second.

The lane-change gap is exactly the gap between the red and green vehicles. If the green vehicle does not let the autonomous car in, the car changes its decision.

It wasn’t an easy maneuver, but the self-driving car handled it remarkably well.

The following is a similar road setting; the only difference is an added pedestrian. Although conditions are more complicated, the blue self-driving car avoids both vehicles and the pedestrian and completes the lane change.

The road setting below is a city road. We can see that the self-driving car is moving forward, and there are cars moving and stopped around it.

When it arrived at an intersection, a pedestrian was crossing the road. At this time, the self-driving car stopped and waited for the pedestrian to cross the road. After the pedestrian crossed the road, the self-driving car continued to drive. Next, the self-driving car gave way to the car coming from the right, and then successfully turned left.

This is the view from inside the car.

Let's look at how the self-driving car handles a partially blocked road. A stopped truck blocks the left side of the road, and the self-driving car smoothly drives around it.

This was a difficult decision, because the car had to determine whether it was facing a traffic jam or merely a truck parked at the roadside. Judging from how the other vehicles on the road behaved, the self-driving car made the right call.

The last example: The self-driving car goes straight through the intersection, and a bus on the left turns left into the road ahead. The self-driving car gives way to the bus, and at the same time, another car coming from the left gives way to the self-driving car.

In short, our self-driving car handles these situations much as a human would, and it drives safely within the RSS model framework.

We have built successful collaborations with many parties, including major ones such as Valeo, Baidu, and China ITS.

Valeo is our most recent Tier 1 partner. Under our agreement, Valeo will add RSS to its future autonomous driving R&D projects, use it alongside other industry standards, and work with us to develop industry standards.

A few months ago, we signed a cooperation agreement with Baidu. Baidu announced at the beginning of this year that it plans to deploy the RSS model in its open-source Apollo project and its commercial Apollo Pilot project; Apollo is the first open-source application of the RSS model.

Our biggest collaboration is with China ITS (the China Intelligent Transportation Industry and Service Alliance, a standards group under China's Ministry of Transport), which has proposed the RSS model as the framework for its upcoming autonomous driving safety specifications. Beyond regulators, we also work with many Chinese technology companies, including NIO, AutoNavi, and Huawei.

Meanwhile, our circle of partners keeps expanding, and the China ITS collaboration is a prime example of working with regulatory authorities.

In fact, these collaborations are independent of Mobileye's own technology. Partnering with us does not mean a company must install Mobileye cameras to develop self-driving cars, nor that Mobileye's driving policies are required to achieve self-driving. It means that, under the RSS safety framework, safe autonomous driving can be achieved.


MAP & REM

Let’s talk about maps and REM (Road Experience Management) technology.

REM provides high-precision map services for self-driving cars through crowdsourcing, commonly known as the "Global Roadbook".

First, let's talk about REM data processing. The participating cars are all equipped with front cameras, and most of those cameras are driven by Mobileye chips.

If a vehicle is equipped with the EyeQ4 chip, the first step is collection: the chip-driven camera gathers road and road-sign information.

In the second step, the collected information is anonymized and encrypted.

In the third step, the encrypted information is stored in the cloud to build the autonomous-driving Roadbook.

In the fourth step, the map information is distributed back to the self-driving cars.

Finally comes localization: the vehicle positions itself within the Roadbook to an accuracy of 10 centimeters.
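
A minimal end-to-end sketch of those five steps (toy logic and hypothetical names; the real pipeline's formats and matching are far more sophisticated):

```python
import hashlib
from collections import Counter

CLOUD = []  # stand-in for the REM cloud backend

def harvest(detections):
    """Step 1: the EyeQ-driven camera emits compact landmarks
    (kind + position along the road), not raw video."""
    return [(kind, round(pos, 1)) for kind, pos in detections]

def anonymize_and_upload(landmarks, vin):
    """Steps 2-3: strip vehicle identity (keeping only a one-way hash
    for de-duplication) and store the segment in the cloud."""
    src = hashlib.sha256(vin.encode()).hexdigest()[:8]
    CLOUD.append({"src": src, "landmarks": landmarks})

def build_roadbook():
    """Step 4 (cloud side): aggregate segments from many cars, keeping
    landmarks confirmed by at least two independent vehicles."""
    counts = Counter(lm for seg in CLOUD for lm in seg["landmarks"])
    return sorted(lm for lm, n in counts.items() if n >= 2)

def localize(roadbook, observed):
    """Step 5 (after maps are distributed back to cars): match live
    detections against Roadbook landmarks to place the car on the map.
    A real system solves a pose optimization to ~10 cm accuracy."""
    hits = sum(1 for lm in observed if lm in roadbook)
    return hits / max(len(observed), 1)

# Two cars drive the same road and report the same sign and lane edge.
anonymize_and_upload(harvest([("sign", 120.04), ("lane_edge", 121.12)]), "VIN-A")
anonymize_and_upload(harvest([("sign", 120.02), ("lane_edge", 121.08)]), "VIN-B")
roadbook = build_roadbook()
print(localize(roadbook, harvest([("sign", 119.99)])))  # 1.0: fully matched
```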

Today we work with automakers such as BMW and Nissan to collect information as their cars drive and transmit it to the cloud, making the Roadbook ever more capable.

In this video, the yellow rectangles on the left mark road signs, while the white marks the road edge, lane markings, and lane centers. The camera collects this information and transmits it to the cloud, and the car then localizes itself with high precision in real time. On the right, the same data is overlaid on Google Maps.

Currently we transmit about 1,000 bytes per 1,000 meters driven, and that matters: uploading to the cloud has to be cheap, and today it costs us only about $1 per car per year.

If the cost were too high, no automaker would adopt the system; likewise, pulling gigabytes of map data from the cloud down to the car would cost too much.

So ours is, by design, a low-cost system.

Together with Nissan and others, we have already completed high-precision map collection across Japan, and we plan to release L3 self-driving cars with Nissan in the near future. Generating the map automatically within 24 hours, at the push of a button, is a huge improvement over today's manually built maps.

The data is compressed to 10 KB per kilometer, and the final map compresses to 400 MB. Each purple map tile represents 1 square kilometer, and the average tile is only 30 KB. Accuracy is better than 10 centimeters, and more than 1.1 million map features were collected in total, covering 320,000 signs, 300,000 traffic lights, 250,000 lane lines, 190,000 roadblocks, and more.
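
Those figures hang together, as a back-of-the-envelope consistency check shows (the arithmetic below is illustrative, not from the talk): 400 MB of map at 10 KB per kilometer corresponds to roughly 40,000 km of mapped road, or about 28 of the 1.1 million features per kilometer. And at the roughly 1 KB-per-kilometer upload rate quoted earlier, a car driving an assumed 15,000 km a year sends only about 15 MB to the cloud, which is what makes a cost of $1 per car per year plausible.

$$ \frac{400\ \text{MB}}{10\ \text{KB/km}} \approx 40{,}000\ \text{km}, \qquad \frac{1.1 \times 10^{6}\ \text{features}}{40{,}000\ \text{km}} \approx 28\ \text{features/km} $$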

We are also working with BMW. Most of BMW's self-driving cars are equipped with EyeQ4 chips, which will collect data and transmit it to the cloud.

REM's commercial applications fall into three areas: first, maps for autonomous driving; second, L2+/L3/L4 automated driving, which is also where autonomous driving feeds back into assisted driving; and third, applications outside the vehicle domain.

REM has also been applied in the aftermarket. First, we signed three map agreements with governments; second, we launched three smart-city projects. We have also established autonomous driving fleets in Europe and the United States, covering 20,000 vehicles.

Our cooperation covers the countries, cities, companies and projects shown above.

One of the most noteworthy is that we have reached a cooperation agreement with the UK's national mapping agency, Ordnance Survey, to provide high-precision location data to UK organizations and businesses. Using maps to improve collaboration between businesses and cities will help make cities smarter and make urban roads safer.

The UK Ordnance Survey’s leading geospatial and technology expertise will be combined with Mobileye’s automotive camera-based mapping capabilities to provide new, accurate and customizable location information services to customers in the energy, infrastructure and other sectors.

Using our technology, mapping vehicles will collect vast amounts of location data about the road network and roadside infrastructure. This data will then be cross-referenced with existing geospatial databases to create precise maps of the UK’s roads and surrounding areas with incredible detail and accuracy.

Through this partnership, Ordnance Survey UK can work with us to deliver customized solutions based on location intelligence data, enabling companies in existing and developing industries to run smarter, more connected businesses. For example, utility companies can use this service to obtain the exact location of their assets above ground, such as manhole covers, lamp posts, power poles, etc. By improving their understanding of above-ground and underground assets, these companies can more effectively plan and manage maintenance needs and support other necessary work.

In addition to future self-driving cars, mapping innovations can also be applied to other areas. This collaboration between the two parties shows that Mobileye's unique mapping capabilities can extend the value of location data to new market segments, including smart cities. In addition, the key to this collaboration is to provide such data to enterprises and governments in an anonymous manner to protect privacy, and Mobileye's unique mapping method can achieve this demand.

We are also working with Volkswagen and Champion Motors to launch the first autonomous ride-hailing service, dividing the work as follows: Champion Motors operates the fleet, Volkswagen provides the vehicles, and Mobileye provides the self-driving system. The project will begin running in early 2019 and move gradually toward full commercialization in 2022.

At the same time, we are cooperating with Beijing Public Transport Group, the world's largest urban public transport company, whose business spans ground public transport passenger service, transport investment, financing and asset management, and automotive service trade.

This three-way collaboration will deliver commercially viable autonomous driving solutions for developing and deploying public transportation in China, combining Beijing Public Transport Group's deep operational experience with Mobileye's autonomous driving kit, a complete self-driving system with Level 4 driverless capability.


The Evolution of ADAS

Mobileye has reworked the RSS model for ADAS products, proposing it as a preventive complement to and enhancement of AEB technology.

The technology is currently codenamed "APB", short for "Automatic Preventative Braking". Because the RSS formulas can determine when a vehicle has entered a dangerous situation, APB can proactively apply slight, preventive braking to guide the vehicle back to a safe position.

In other words, APB is an enhanced form of automatic emergency braking: a formula determines when the car is in a dangerous situation, and when potential danger arises the car slows down and, if needed, comes to a gradual stop, preventing the collision.

APB thus prevents collisions with barely noticeable preventive braking rather than sudden braking, returning the vehicle to a safer position. Provided no obstacle actually blocks the flow of traffic, APB adjusts the vehicle's speed only as much as safety requires, improving safety without disrupting traffic.
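
A toy sketch of the APB idea (hypothetical gains and comfort limits; Mobileye's actual controller is not public): monitor the RSS-safe gap continuously and, instead of waiting until only emergency braking can help, bleed off speed gently as soon as the gap drifts below the safe threshold.

```python
def rss_min_gap(v_rear, v_front, rho=0.5,
                a_accel=3.0, a_brake_min=4.0, a_brake_max=8.0):
    """Same RSS minimum-gap formula as in the earlier sketches."""
    d = (v_rear * rho
         + 0.5 * a_accel * rho ** 2
         + (v_rear + rho * a_accel) ** 2 / (2 * a_brake_min)
         - v_front ** 2 / (2 * a_brake_max))
    return max(d, 0.0)

def apb_decel(gap_m, v_ego, v_lead, comfort_decel=1.5):
    """Preventive braking: a barely noticeable deceleration (m/s^2) that
    grows the further the gap drifts inside the RSS-unsafe zone, capped
    at a comfortable level -- unlike AEB, which brakes hard only once a
    collision is imminent."""
    safe_gap = rss_min_gap(v_ego, v_lead)
    if gap_m >= safe_gap:
        return 0.0  # gap is RSS-safe: no intervention
    deficit = (safe_gap - gap_m) / safe_gap  # 0 (just unsafe) .. 1 (gap gone)
    return min(comfort_decel, 2.0 * comfort_decel * deficit)

# Ego at 20 m/s closing on a 15 m/s lead only 30 m ahead:
print(round(apb_decel(30.0, 20.0, 15.0), 2))  # ~1.34 m/s^2, gentle braking
```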

If APB were applied to every car equipped with a front camera, it could significantly reduce rear-end collisions caused by bad driving decisions. Once those vehicles gain surround perception and REM maps are added to the model, APB could cover far more scenarios, meaning that almost all collisions caused by improper decisions could be nipped in the bud.

Indeed, an APB system with surround camera sensing and map-based location awareness could eliminate "almost all" rear-end collisions. We hope that adopting such technologies will bring the number of road casualties caused by wrong driving decisions close to zero.

We recently published a paper titled “Vision Zero: Can roadway accidents be eliminated without sacrificing traffic throughput?”, in which we proposed that technologies such as APB are critical to the realization of Mobileye’s “Vision Zero”.

Large-scale deployment of APB can cut road-accident casualties. It can also reduce spending on road infrastructure such as speed bumps, because APB proactively adjusts vehicle speed when safety requires it, without disrupting normal traffic flow or causing congestion. "Vision Zero" is therefore achievable, and in our paper we use data to demonstrate its feasibility.

In autonomous driving, we can drive two kinds of change. The first is the transportation change, which is obvious and valuable. The other is the road-safety change, the one that saves lives. It can be achieved with technology that is actually not expensive: not tens of thousands of dollars, only a few hundred, yet it saves lives. That is why the road-safety revolution is within our reach.

This is the main message of my one-hour speech.






 