A scientist who continues to live as a cyborg

Intel Fellow and Director of Intel's Anticipatory Computing Lab, Lama Nachman, is working to help Dr. Peter Scott-Morgan communicate. Previously, Nachman helped physicist Stephen Hawking "speak." Nachman and her team developed the Assistive Context-Aware Toolkit (ACAT), software that helps people with severe disabilities communicate through keyboard simulation, word prediction, and speech synthesis.

"I will continue to evolve. As a human, I am dead. In the future, I will continue to live as a cyborg."

Those are the words of British robotics scientist Dr. Peter Scott-Morgan, who was diagnosed with motor neurone disease (MND), also known as amyotrophic lateral sclerosis or Lou Gehrig's disease, in 2017. MND attacks the brain and nerves and eventually paralyzes all muscles, including those used for breathing and swallowing.

Doctors told the 62-year-old scientist that he might only live until the end of 2019, but Dr. Peter Scott-Morgan had other plans: he wanted to replace all his organs with mechanical ones and become "the world's first full cyborg." Peter began his "transformation" at the end of last year, undergoing a series of operations to use technology to extend his life.

Peter now relies on synthetic speech and has developed a lifelike avatar to communicate more effectively with others. He announced publicly after his surgery late last year: "Peter 2.0 is now online, and this is my attitude toward MND."

British robotics scientist Dr. Peter Scott-Morgan, who suffers from motor neurone disease, began undergoing a series of operations in 2019 in the hope of using technology to extend his life.

Lama Nachman, Intel Fellow and director of Intel’s Anticipatory Computing Lab, is part of the technical team working with Peter.

Nachman, who helped renowned physicist Stephen Hawking "speak," and her team are now continuing that work with Peter.

For nearly eight years, Nachman helped Hawking share his remarkable intellect with the world through an open-source platform she and her team developed, the Assistive Context-Aware Toolkit (ACAT). The software helps severely disabled people communicate through keyboard simulation, word prediction, and speech synthesis. For Hawking, a twitch of a small muscle in his cheek activated a sensor on his glasses that interacted with the computer to type out the sentences he wanted to say. In Peter's case, Nachman's team installed an eye-tracking system that lets him form sentences by gazing at letters on a computer screen, again aided by word prediction.
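As an illustration only, the dwell-based eye typing and word prediction described above might look like the following highly simplified Python sketch. ACAT itself is a far more involved open-source project; the dwell time, the tiny prediction table, and the class names here are invented for the example.

```python
DWELL_TIME = 1.0  # seconds of sustained gaze needed to "press" a key (assumed value)

# Toy word-prediction table; a real system would use a statistical language model.
PREDICTIONS = {
    "h": ["hello", "how", "help"],
    "he": ["hello", "help", "hear"],
    "hel": ["hello", "help"],
}

def predict(prefix, table=PREDICTIONS):
    """Return candidate completions for the typed prefix."""
    return table.get(prefix.lower(), [])

class DwellSelector:
    """Select a key once the gaze has rested on it long enough."""

    def __init__(self, dwell=DWELL_TIME):
        self.dwell = dwell
        self.current = None  # key currently under the gaze
        self.since = 0.0     # timestamp when the gaze landed on it

    def update(self, key, now):
        """Feed one gaze sample; return the key if the dwell completed, else None."""
        if key != self.current:
            self.current, self.since = key, now
            return None
        if now - self.since >= self.dwell:
            self.since = now  # re-arm so the same key can repeat
            return key
        return None

# Simulate a gaze stream resting on 'h' long enough to select it.
sel = DwellSelector()
typed = ""
for t in [0.0, 0.5, 1.0]:
    hit = sel.update("h", t)
    if hit:
        typed += hit
print(typed, predict(typed))  # -> h ['hello', 'how', 'help']
```

The point of the dwell timer is the same trade-off the article describes: selection must be slow enough to reject accidental glances, but every extra millisecond widens the gap in conversation.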

“How can technology empower people? That’s been my life’s work.”

Nachman, a Palestinian who grew up in Kuwait, remembers neighbors asking her to fix broken electronic devices. “I’ve always had a keen interest in the latest and greatest technology, and I loved tinkering with it, often taking it apart and fixing it,” Nachman said.

Today, Nachman's team works on context-aware computing and human artificial intelligence collaboration technologies that can help not only the elderly at home, but also students who may not be able to successfully learn in a standard classroom environment, and technicians in manufacturing facilities. "I have always felt that technology can empower the most marginalized groups," Nachman said. "It can create a level playing field and bring more equality to society, and this is especially true for the disabled community."

Hawking wanted more control over his conversations, Nachman said: "Peter is open to bigger experiments and to the idea of 'learning together with the machine.' So we have been looking at how to build a response-generation capability: the system listens to the conversation and automatically recommends answers, so that he can quickly choose one of the options or push the conversation in another direction."

Intel's Anticipatory Computing Lab team, which developed the Assistant Context-Aware Toolkit, includes (from left) Alex Nguyen, Sangita Sharma, Max Pinaroc, Sai Prasad, Lama Nachman and Pete Denman. In addition to the members pictured, the team includes Bruna Girvent, Saurav Sahay and Shachi Kumar.

Nachman says that while this approach does not offer the precise control Hawking preferred, Peter is willing to give up full control of the conversation in exchange for intuitive collaboration with an AI-driven communication interface, because it offers him greater speed.

"My ventilator is much quieter than Darth Vader's mask."

Dr. Peter Scott-Morgan is known for his quick wit and self-deprecating humor, which he hopes to preserve in his artificial voice. In addition to reducing the time delay between him and the person he is talking to (the so-called "silence gap"), Nachman's team is also working on helping Peter express emotion. In a normal conversation we pay attention to multiple cues, such as facial expressions and tone of voice, not just words. For Peter, the team is building an AI system that listens to the ongoing conversation and then suggests responses in different tones to suit the situation.
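A toy sketch of that response-suggestion idea follows. The candidate replies, tone labels, and function names are invented for illustration; a real system would generate suggestions with a dialogue model rather than a canned lookup table.

```python
# Map a heard utterance to a few candidate replies, each tagged with a tone,
# so the user can pick one with a single selection instead of typing it out.
CANDIDATES = {
    "how are you feeling today?": [
        ("Better than I sound, thank you!", "humorous"),
        ("Quite well today, thank you.", "neutral"),
        ("A bit tired, to be honest.", "candid"),
    ],
}

def suggest_replies(heard, k=3):
    """Return up to k (reply, tone) suggestions for a heard utterance."""
    return CANDIDATES.get(heard.strip().lower(), [])[:k]

def choose(suggestions, index):
    """The user selects one suggestion (e.g., with a single gaze dwell)."""
    reply, tone = suggestions[index]
    return f"[{tone}] {reply}"

opts = suggest_replies("How are you feeling today?")
for i, (reply, tone) in enumerate(opts):
    print(i, tone, "->", reply)
print(choose(opts, 0))  # -> [humorous] Better than I sound, thank you!
```

Offering several tones per suggestion is one way to keep the speaker's personality in the loop: the AI proposes, but the human still decides how the reply should land.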

One day in the future, disabled people like Peter will have the opportunity to use brain waves to control their voices.

According to Nachman, part of her team's research focuses on people with disabilities who cannot move at all, not even to twitch a cheek or an eye. For this group, she said, the answer is a brain-computer interface (BCI): a cap fitted with electrodes that monitor brain waves, much like in an EEG test. Nachman and her team are looking for ways to add BCI support to ACAT so that it can serve people with any form of disability.

As AI technology continues to mature, Nachman is particularly interested in exploring how to give AI systems greater power while retaining human control so that these “two different roles” can work together to achieve better results.

This post is from Talking

Latest reply

By definition, a cyborg presupposes a neural interface, machine implantation, and functional replacement/enhancement. Machine implantation has long been a reality, and functional replacement/enhancement is also emerging, but a true neural interface has not yet been achieved. Elon Musk's team implanted brain-computer interfaces into several pigs a few days ago, which is a step closer, but it still cannot be called a real neural interface. Once the neural interface is realized, human society will undergo profound changes. Published on 2020-9-1 14:03