Leifeng.com reported that on May 7, at the Microsoft Build 2019 developer conference, Microsoft announced the launch of its first autonomous systems platform built on Microsoft AI. The platform is built on the core capabilities of Bonsai, a company Microsoft previously acquired, and helps developers use Microsoft AI and the corresponding Azure tools to train models for systems that can run autonomously.
The platform relies mainly on Microsoft's machine teaching and simulation technologies, which recreate real-world environments for model and system training.
Microsoft's AI autonomous system: machine teaching plus simulation technology
Through this platform, developers can also combine Microsoft's Azure IoT, ROS for Windows and other services to build intelligent robotic systems in the cloud or on devices.
Before releasing the first preview of the platform, Microsoft had already worked with Toyota Material Handling, a Toyota Industries subsidiary, and with robotics company Sarcos, applying the system to make their automated forklifts and remote visual inspection robots, respectively, more intelligent.
Take Sarcos' remote visual inspection robot, the Guardian S, as an example. This snake-like robot can be sent deep into earthquake sites, crawl through collapsed debris, and help locate survivors. Until now, however, the Guardian S had to be remotely controlled by a human operator to guide it through narrow spaces and complex terrain. According to Microsoft, by using the new platform, engineers were able to develop a control system that lets the snake-like robot autonomously avoid obstacles and even climb stairs and walls on its own.
As mentioned earlier, one of the key technologies behind this platform is "machine teaching."
The current mainstream approach to autonomous robot control is deep learning, but it still struggles in dynamic environments. Microsoft's platform uses machine teaching to achieve autonomous control of robots in complex environments.
Microsoft believes the next stage of AI will be to incorporate human expertise when training machine learning models, an approach it calls "machine teaching." Machine teaching aims to acquire knowledge from domain experts rather than extracting it solely from data.
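To make the idea concrete, the sketch below shows, in plain Python, one common way expert knowledge can be injected into training: the expert decomposes a task into named concepts and an ordered curriculum of lessons, and the learner must pass each lesson before moving on. All class and function names here are illustrative assumptions, not Microsoft's machine teaching API.

```python
# A minimal, hypothetical sketch of the machine-teaching idea: a domain expert
# decomposes a task into named concepts and an ordered curriculum of lessons,
# and the learner must pass each lesson before moving on. The class and function
# names are illustrative assumptions, not Microsoft's machine teaching API.
import random
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Lesson:
    name: str
    difficulty: float                              # expert-chosen scenario difficulty
    success: Callable[[Dict[str, int]], bool]      # expert-defined pass criterion

@dataclass
class Concept:
    name: str
    lessons: List[Lesson]                          # ordered: easy scenarios first

def train(concept: Concept, run_episode: Callable[[float], Dict[str, int]]) -> None:
    """Walk the curriculum: require 5 consecutive passes before advancing."""
    for lesson in concept.lessons:
        streak = 0
        while streak < 5:
            outcome = run_episode(lesson.difficulty)
            streak = streak + 1 if lesson.success(outcome) else 0
        print(f"{concept.name}/{lesson.name}: passed")

# The expert encodes domain knowledge as a curriculum instead of labeling raw data.
avoid_obstacles = Concept(
    name="avoid_obstacles",
    lessons=[
        Lesson("empty_corridor", 0.1, lambda o: o["collisions"] == 0),
        Lesson("cluttered_corridor", 0.8, lambda o: o["collisions"] == 0),
    ],
)

def dummy_episode(difficulty: float) -> Dict[str, int]:
    # Stand-in for a real simulator: harder lessons collide more often.
    return {"collisions": 1 if random.random() < difficulty * 0.5 else 0}

train(avoid_obstacles, dummy_episode)
```

The point of the sketch is the structure, not the learner: the expert's knowledge lives in the decomposition and pass criteria rather than in labeled data.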
Microsoft has said its researchers began exploring the principles of machine teaching about ten years ago, and the company is now gradually applying these concepts to products and systems for robotics and automated production.
According to Leifeng.com, Language Understanding (LUIS) was one of Microsoft's earliest applications to adopt the concept of machine teaching. It is a tool in Azure Cognitive Services that identifies intents and key concepts in short text. According to official information, it has been used by companies such as UPS and Progressive Insurance to build intelligent customer-service bots.
This time, at Microsoft Build 2019, Microsoft once again applied machine teaching, now to build its autonomous systems platform for robots.
The other key technology of the platform is simulation, using either Microsoft's own AirSim simulator or third-party simulators.
Once an algorithm or system is built, it needs to be tested in a simulated environment before being deployed in the real world. AirSim is an open-source simulation platform that Microsoft announced in February 2017; it is mainly used to build simulation environments for drones, self-driving cars, and robots.
Microsoft researchers said, "By creating a simulator, we can provide a more realistic view of the environment. The simulator for the robotic platform can accurately render subtle environmental features such as shadows and reflections, which can have a significant impact on computer vision algorithms."
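As an illustration of how a simulator like this is typically driven, here is a minimal sketch using AirSim's open-source Python client (the `airsim` pip package). It assumes a car environment is already running locally and simply logs camera frames and collision flags as raw training data; the data-collection loop itself is an assumption for illustration, not part of Microsoft's platform.

```python
# A minimal sketch with AirSim's open-source Python client ("pip install airsim"):
# drive a simulated vehicle and log camera frames plus collision flags as raw
# training data for a vision-based controller. Assumes an AirSim (Unreal) car
# environment is already running on this machine; the loop itself is illustrative.
import time

import numpy as np
import airsim

client = airsim.CarClient()      # connect to the locally running simulation
client.confirmConnection()
client.enableApiControl(True)

controls = airsim.CarControls()
samples = []                     # (image, collided) pairs

for step in range(100):
    controls.throttle = 0.5
    controls.steering = 0.0
    client.setCarControls(controls)
    time.sleep(0.1)              # let the simulation advance

    # Request an uncompressed scene image from the front camera ("0").
    response = client.simGetImages([
        airsim.ImageRequest("0", airsim.ImageType.Scene, False, False)
    ])[0]
    frame = np.frombuffer(response.image_data_uint8, dtype=np.uint8)
    # Recent AirSim releases return 3-channel BGR; older ones returned 4-channel BGRA.
    frame = frame.reshape(response.height, response.width, -1)

    collided = client.simGetCollisionInfo().has_collided
    samples.append((frame, collided))

client.enableApiControl(False)
```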
Earlier, Toyota Material Handling, the Toyota Industries subsidiary mentioned above, used AirSim to simulate a warehouse environment and train smart forklifts to operate autonomously while identifying and avoiding obstacles.
In fact, this autonomous systems platform is not limited to robotics; its target applications also span construction, energy, industry and many other fields.
Bonsai CEO Mark Hammond also said, "We are working hard to provide customers who want to build AI autonomous systems with a comprehensive platform that covers development, operations and end-to-end lifecycle management."
Bonsai, regarded by Microsoft as the "brain" of AI autonomous systems
In June 2018, Microsoft announced the acquisition of Bonsai, an artificial intelligence startup it called the "brain" of its autonomous systems.
According to Leifeng.com, Bonsai was founded in 2014. The company positioned itself as "the world's deep reinforcement learning platform for enterprises" and focused on designing deep learning tools for enterprise customers, mainly in robotics, energy, industry, and autonomous driving. Bonsai's tools are built on the open-source machine learning library TensorFlow and enable engineers to develop and train autonomous systems.
Bonsai's deep learning tools offer services such as automatic model generation and management, an API for simulator integration, and a software development kit (SDK). Notably, in 2017 the company demonstrated a technique for programming industrial control systems that trained 45 times faster than comparable methods, including those from Google's DeepMind.
According to official information, Bonsai's end-to-end platform provides a complete toolchain. It can incorporate expert knowledge into machine learning models through machine teaching, and it automatically selects the most appropriate deep reinforcement learning algorithm for a given model, configuring the neural networks and tuning the hyperparameters.
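The sketch below illustrates, with purely hypothetical names, the simulator-integration pattern such a platform typically relies on: the simulator exposes reset and step hooks plus an expert-shaped reward, and the platform's training service drives episodes through that interface while choosing the algorithm and hyperparameters itself. None of this is Bonsai's or Azure's actual SDK.

```python
# A hypothetical sketch of the simulator-integration pattern behind platforms
# like this: the simulator exposes reset/step hooks plus an expert-shaped reward,
# and the training service drives episodes through that interface while selecting
# the RL algorithm and hyperparameters itself. Names here are illustrative only,
# not Bonsai's or Azure's actual SDK.
import random
from typing import Callable, Dict, Tuple

class ForkliftSim:
    """Toy 1-D 'warehouse' simulator exposing the usual episode hooks."""

    def reset(self, config: Dict[str, float]) -> Dict[str, float]:
        self.position = 0.0
        self.goal = config.get("goal", 10.0)
        return {"position": self.position, "goal": self.goal}

    def step(self, action: float) -> Tuple[Dict[str, float], float, bool]:
        self.position += action + random.uniform(-0.1, 0.1)   # noisy actuation
        done = abs(self.position - self.goal) < 0.5
        reward = 1.0 if done else -0.01                        # expert-shaped reward
        return {"position": self.position, "goal": self.goal}, reward, done

def run_episode(sim: ForkliftSim, policy: Callable[[Dict[str, float]], float]) -> float:
    """What the platform's training service would do for each episode."""
    state = sim.reset({"goal": 10.0})
    total = 0.0
    for _ in range(200):
        state, reward, done = sim.step(policy(state))
        total += reward
        if done:
            break
    return total

# A trivial hand-written policy stands in for the learned controller.
go_forward = lambda s: 0.5 if s["position"] < s["goal"] else -0.5
print(run_episode(ForkliftSim(), go_forward))
```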
Gurdeep Pall, corporate vice president of Microsoft, previously said, “Bonsai’s platform and rich simulation tools combined with Microsoft’s reinforcement learning work have become the simplest and richest AI tool chain for building any autonomous system for control and calibration tasks. This tool chain will be combined with the Azure machine learning portfolio with GPUs and Brainwave running on the Azure cloud, and the models built with it will be deployed and managed in Azure IoT, providing Microsoft with an end-to-end solution for autonomous systems.”