
The most explosive combination in the history of science and technology: "Metaverse + Brain-Computer Interface", how far are we from it?

Last updated: 2021-10-12


The technical bar is high, the capital market keeps its distance, and ethical questions remain entangled. How many "VR years" away are brain-computer interfaces from the metaverse?


Author | Dong Zibo

Editors | Zhao Qinghui, Cen Feng

What will the ultimate metaverse look like? Ask many science fiction fans, and the unanimous answer will be the brain-computer interface.

Popular science fiction works such as "Ghost in the Shell" and "The Matrix" have shown people a possible future of the brain-computer interface metaverse. Through the connection between the brain and the computer, people can freely obtain information, socialize, and even experience multiple senses such as taste and touch in the virtual world.

Brain-computer interface technology as imagined in the 1995 animated film "Ghost in the Shell"

Compared with "traditional" media such as PCs and mobile phones that can only provide audio and video, or even VR and AR that are still being explored, the experience that brain-computer interfaces can bring to the metaverse will be revolutionary.

In current games, most actions of player characters are preset (such as attacking, jumping, grabbing items, etc.). Players use buttons to trigger these preset actions to interact with the game. However, no matter how the player operates, the preset actions remain unchanged.

A brain-computer interface would let players control games by thought, opening the door to much freer operation. In the metaverse, players could move every part of their body at will, and the software would no longer need rigid preset actions: players could interact with the virtual world however they please.

Interaction would not only shed the "shackles" of preset actions; through two-way transmission of brain signals, feedback across multiple senses would also become possible.

As mentioned in an earlier article in the "Metaverse: Decameron" series, VR games can make players dizzy. This is mainly because virtual objects have no physical substance, so interacting with in-game items creates a disconnect between vision and touch.

The two-way signal transmission in the brain-computer interface also allows the metaverse to overcome this problem. In the metaverse of the brain-computer interface, when you touch a stone, you can feel the texture and temperature of the stone; when you pick it up, you can even feel its weight.

In this way, the brain-computer interface will completely break the barrier between reality and virtuality, and people can "see" with their own eyes, "touch" with their own hands, and "hear" with their own ears. On that day, it will also be possible for humans to truly live in the virtual world.

Sounds a bit sci-fi? Yes, but some people are turning what is considered sci-fi into real science.

1


The most difficult part of brain-computer interface + metaverse is "mind reading"

To this day, the human brain remains one of the most difficult areas for human science to conquer.

"Brain-Computer Interface" (BCI) is a significant progress in brain science research in recent years. By encoding and decoding brain signals during brain activity, the BCI can establish a direct communication and control channel between the brain and external devices, thereby playing a role in restoring and enhancing human functions.

With a brain-computer interface, users operate by consciousness alone (playing games, typing, and so on), relying on the signals their brain sends out. Only by accurately identifying and interpreting those signals can all of a player's actions in the metaverse become possible.

Identifying brain signals and locating the brain's functional areas, a kind of "mind reading", is the foundation of brain-computer interface technology. Professor He Huiguang, who researches brain-computer interfaces at the Institute of Automation, Chinese Academy of Sciences, told Leifeng.com that, for ease of understanding, the technique can be summarized as a process of "subtraction":

"When a person is at rest, there is a basic signal A in his brain. If he sees something, the brain receives visual stimulation and a dedicated area responds intuitively, generating signal B. By subtracting signal A from signal B, we can find out which area is responsible for visual work."

By analyzing the signals sent by the user's brain when receiving different types of stimulation, we can locate the functional area. Then, with the help of the brain-computer interface decoding algorithm based on artificial intelligence, the device can try to read the brain's "thoughts". The user can then use "thoughts" to complete operations in the "metaverse", thereby achieving control over external devices.
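To make the "subtraction" idea concrete, here is a minimal, purely illustrative Python sketch. All data, channel indices, and numbers are made up; real localization pipelines average many evoked trials and apply statistical tests, but the contrast-against-baseline logic is the same.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical EEG recordings, shape (n_channels, n_samples), same montage for both.
rest = rng.standard_normal((64, 5000))       # signal "A": resting baseline
stimulus = rng.standard_normal((64, 5000))   # signal "B": recorded during visual stimulation
stimulus[20:30] *= 2.0                       # pretend a group of occipital channels responds

# "B minus A": compare per-channel power and keep the channels with the largest increase.
rest_power = rest.var(axis=1)
stim_power = stimulus.var(axis=1)
responsive = np.argsort(stim_power - rest_power)[-10:]
print("channels most responsive to the visual stimulus:", sorted(responsive.tolist()))
```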

Improving decoding efficiency is a major difficulty in brain-computer research

Here, a brain-computer interface aimed at the "metaverse" runs into a very "hard" threshold: decoding speed.

The more brain signals a device can correctly interpret per unit of time, the more precisely the user can control the computer. This year, Elon Musk's Neuralink released a video showing a monkey controlling an on-screen cursor through a brain-computer interface. The monkey, Pager, earned sips of banana milkshake as a reward for moving the cursor onto the highlighted area.

Neuralink's experimental monkey "Pager"

Within the field of brain-computer interfaces, this counts as a major breakthrough at the cutting edge; for the metaverse, however, Musk's attempt is still only an early step.

If brain-computer interfaces are to become the ultimate entrance to the metaverse, they will need to do far more than this. Users will not settle for playing "Pac-Man" or "Pong" in a brain-computer metaverse; they will expect more advanced and refined operations such as shooting, creating, and even programming.

Such metaverse experiences depend largely on the decoding efficiency of the brain-computer interface. Professor He explained that, classified by experimental paradigm, brain-computer interfaces include motor-imagery BCIs, P300-based BCIs, and visual-evoked-potential BCIs. Of these, visual evoked potentials offer the highest decoding efficiency: commands are recognized from the brain's distinct responses to flashes at different frequencies. The steady-state visual evoked potential (SSVEP) approach was pioneered at Tsinghua University, whose work in this area remains at the world's leading level.
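The SSVEP principle is easy to illustrate: each on-screen target flickers at its own frequency, and the occipital EEG shows a matching spectral peak when the user attends to one of them. Below is a minimal, hedged Python sketch using a plain FFT; real SSVEP systems typically use stronger decoders (for example, canonical correlation analysis), and every name, frequency, and parameter here is a placeholder.

```python
import numpy as np

def detect_ssvep_target(eeg, fs, target_freqs):
    """Return the flicker frequency whose spectral power (fundamental + 2nd harmonic)
    is largest in a single occipital EEG channel."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    scores = []
    for f in target_freqs:
        idx_f = np.argmin(np.abs(freqs - f))
        idx_2f = np.argmin(np.abs(freqs - 2 * f))
        scores.append(spectrum[idx_f] + spectrum[idx_2f])
    return target_freqs[int(np.argmax(scores))]

# Hypothetical usage: 2 s of data at 250 Hz, four targets flickering at 8/10/12/15 Hz.
fs = 250.0
t = np.arange(0, 2.0, 1.0 / fs)
fake_eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(t.size)
print(detect_ssvep_target(fake_eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # expected: 12.0
```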

Professor He Huiguang said he recently undertook a Beijing Science and Technology Plan brain-computer interface project, "Sub-second non-invasive brain-computer interface technology and general system development", together with Professor Gao Xiaorong of Tsinghua University and Professor Wang Yijun of the Institute of Semiconductors. The latest technology can control 100 targets while pushing decoding down to the sub-second level, a new breakthrough at the academic frontier of brain-computer decoding.

"Sub-second non-invasive brain-computer interface technology and general system development" project report demonstration site

Whereas evoked-potential paradigms establish the brain-computer connection by analyzing signals induced by external stimuli, "motor imagery" analyzes signals the brain generates on its own. Simply put, even without actually moving any part of the body, a person can make the brain emit distinctive signals merely by imagining the movement. Using this principle, a machine can "read the mind" by identifying the signals produced when different movements are imagined.
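As a rough illustration of how a motor-imagery decoder could work (a sketch only, with synthetic data and hypothetical names; real systems use richer features such as common spatial patterns), one well-known cue is that the 8-12 Hz mu rhythm over the motor cortex weakens when a movement is imagined, so per-channel band power can feed a simple classifier:

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

def mu_band_power(trial, fs):
    """Per-channel power in the 8-12 Hz mu band; motor imagery typically suppresses
    it over the motor cortex contralateral to the imagined hand."""
    freqs, psd = welch(trial, fs=fs, nperseg=int(fs))
    band = (freqs >= 8) & (freqs <= 12)
    return psd[:, band].mean(axis=1)

# Synthetic training data: 40 trials of shape (n_channels, n_samples),
# labels 0 = imagined left hand, 1 = imagined right hand.
fs = 250
rng = np.random.default_rng(0)
trials = [rng.standard_normal((8, 1000)) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.stack([mu_band_power(trial, fs) for trial in trials])
clf = LogisticRegression().fit(X, labels)
print("predicted class for a new trial:", int(clf.predict(X[:1])[0]))
```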

The road to brain-computer "mind reading" is full of difficulties

Brain-signal decoding is still a long way from the metaverse. At present, decoding speeds reach only about 200 bits per minute; even for the most efficiently decoded users, that is roughly the speed of typing on a phone with one hand, far short of what a metaverse experience would require.
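For context on what "200 bits per minute" means, BCI decoding speed is usually reported as an information transfer rate. Here is a minimal sketch using the standard Wolpaw formula; the target count, accuracy, and trial length below are illustrative numbers, not figures from the projects mentioned in this article.

```python
import math

def wolpaw_itr(n_targets, accuracy, trial_seconds):
    """Information transfer rate in bits per minute (standard Wolpaw formula)."""
    n, p = n_targets, accuracy
    if p >= 1.0:
        bits_per_trial = math.log2(n)
    elif p <= 1.0 / n:
        bits_per_trial = 0.0
    else:
        bits_per_trial = (math.log2(n)
                          + p * math.log2(p)
                          + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits_per_trial * (60.0 / trial_seconds)

# e.g. 40 flicker targets, 90% accuracy, one selection per second
print(f"{wolpaw_itr(40, 0.90, 1.0):.0f} bits/min")  # prints about 259 bits/min
```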

He Huiguang told Leifeng.com: "People have high expectations for this. When it comes to brain-computer interfaces, people always think it's simple, and all they think of is plugging in an electrode and the signal will come out naturally. But there is still a long way to go."

Can AI make brain-computer devices "understand people better" through machine learning? He Huiguang believes that machine learning's role in decoding human brain signals remains at the level of rote memorization, still far from practical use in the metaverse: "Machine learning is confined to a limited space with limited rules, and its inductive reasoning ability is still relatively poor. The human brain learns differently: people acquire knowledge and experience from very small samples."

When it comes to the brain, everyone is different. Exposed to the same visual stimulus (seeing a cup, say), different people may react differently; even the same person may react very differently to the same object at different times and in different situations. "Individual differences and high dynamics are major challenges for applying brain-computer interfaces to the metaverse."

2


Capital is still on the sidelines, but national strategy has already tilted

Scientific research is a protracted war that requires a lot of investment. If brain-computer interface is to become the ultimate entrance to the metaverse, scientific research capital is a variable that must be taken into consideration.

"If I have 1 billion, it's okay to invest 10 or 20 million. I want to contribute to the society and humanity." This is what Zhou Wenjing, partner of Chunni Capital, said when discussing investing in brain-computer interface projects.

Zhou Wenjing has handled four metaverse deals during the recent boom in the "metaverse" concept, and she has seen first-hand how mixed the metaverse investment and financing market is. "Too many speculators are slapping the metaverse concept onto anything," she said. "Because bad money drives out good, I have to call out the bad money in the market."

Financing for hard-core innovation is cold

Investment in brain-computer interface + metaverse remains close to a vacuum. "Too hardcore," "too cutting-edge," and "incomprehensible" are the refrains of many investors. Meanwhile, brain-computer research is concentrated in universities and other research institutions; private enterprises barely participate, leaving venture capital no obvious way in.

What also makes investors hesitate is that the institutional and regulatory framework is still unclear. Because the brain-computer interface is a medical device, the supporting rules and regulatory policies are still being drafted, and capital that rushes in now must bear considerable risk.

By comparison, consumer-facing metaverse applications carry lower investment risk, a lower barrier to understanding, and faster returns. It is no surprise, then, that hard-core research has been left out in the cold by the capital market.

Even so, a number of far-sighted investors in China have begun to pay attention to and learn about brain-computer interface + metaverse. "Hardcore work actually still holds many opportunities," Zhou Wenjing said. "Hardcore work has relatively higher barriers, but very few people in China dare to attempt hardcore innovation."

Foreign giants are still struggling, and Chinese players are joining in

Even with Musk personally fronting the company, most of Neuralink's financing has been bankrolled by Musk himself. After he led roughly $150 million of investment across 2017 and 2019, Neuralink finally raised $205 million in its Series C round, five years after its founding.

Facebook, which publicly announced its push into the "metaverse" this year, also began researching brain-computer interface technology as early as 2017. "Project Steno", a brain-computer project run jointly by Reality Labs, the unit behind its metaverse push, and the University of California, San Francisco, was highly anticipated: a head-mounted brain-computer device intended to let users type with their thoughts.

At the beginning of the project, Facebook's goal was to allow users to type at a speed of 100 words per minute, but at the end of the experiment, the decoding speed of "Project Steno" was only 12.5 words per minute.

In July this year, Facebook officially announced that it would stop developing head-mounted optical brain-computer interface technology and turn its attention to wrist-based input devices, arguing in public materials that wrist-based devices could reach the market faster. This is seen as one signal that Facebook is shifting its attention from brain-computer interfaces to other entrances to the metaverse.

As the foreign giants struggle, Chinese players are stepping in, with equally high expectations for the future of brain-computer interface + metaverse.

In March this year, miHoYo announced a strategic partnership with Ruijin Hospital, which is affiliated with the Shanghai Jiao Tong University School of Medicine, to jointly establish the "miHoYo Joint Laboratory of the Ruijin Hospital Brain Disease Center". The laboratory is run mainly by miHoYo's "Anti-Entropy" studio, which focuses on AI and virtual-human research and is widely seen as one of the key pillars of miHoYo's metaverse strategy.

Photos from the signing ceremony of the miHoYo Joint Laboratory of the Ruijin Hospital Brain Disease Center

Creating a "virtual world where one billion people live" will become an important strategic goal of miHoYo in the future, and brain-computer interface will be the last piece of the puzzle to achieve this goal.

miHoYo's entry into brain-computer interfaces may be only the beginning. As more Chinese players move into the metaverse, brain-computer interfaces are likely to become one of the key areas they explore in depth.

A tilt in national medium- and long-term strategy

Despite the lack of private capital, brain-computer interfaces have drawn the attention of the state and are regarded as an important strategic direction for the future. From China's 11th Five-Year Plan through the 14th Five-Year Plan, brain and cognitive science and brain-computer interfaces have been listed among the key directions for development.

In 2017, the State Council issued the 13th Five-Year National Science and Technology Innovation Plan, which lists brain-inspired computing and brain-computer intelligence among the "Sci-Tech Innovation 2030" major projects. In other words, under the national long-term science and technology strategy running to 2030, research related to brain-computer interfaces will receive stronger state support.

3


Brain-computer interface + metaverse must first pass the test of ethics

"Not only in the metaverse, I firmly oppose the application of brain-computer interfaces in non-clinical fields," said Guan Xiaoying, deputy chief physician of the neurology department of a provincial hospital. "This thing is like Pandora's box. Once it is opened, humans will be destroyed by themselves in the future."

Director Guan's words are not just her personal opinion; they reflect the concerns of a large part of society.

The daunting "brain intubation"

As a cutting-edge technology, the brain-computer interface feels remote to the public. Add dangerous-sounding methods such as "chip implantation" and "brain catheterization", and society is full of resistance to the technology.

Professor He Huiguang, who has devoted a lot of effort to brain-computer research, also admitted that the promotion of brain-computer interface + metaverse faces a lot of ethical pressure: "People's concerns are normal, and we researchers must ultimately answer people's questions." In the face of these pressures, brain-computer interfaces must choose their own way of survival.

There are two main types of brain-computer connection methods: non-invasive and invasive. The former uses peripherals to read brain signals through the scalp and skull; the latter directly implants electrodes into the brain for more efficient and accurate brain signal reading. Due to ethical pressure, most brain-computer experiments on humans are currently conducted with non-invasive equipment, while invasive experiments are more often used on animals (such as monkeys, pigs, mice, etc.).

The "brain tube" conceived in the 1999 movie "The Matrix"

Professor He offered an analogy: "Before, we listened to sounds through a wall; now that science has advanced, we press a glass against the wall to hear a little more clearly. But neither is as good as simply chiseling a small hole in the wall." Yet even such a small hole is separated from us by countless safety and ethical controversies and risks.

Is the cognitive revolution human evolution or Pandora's box?

Although non-invasive brain-computer interfaces sound more benign, Guan Xiaoying believes that "the essential difference between invasive and non-invasive methods is not that big. Both methods affect brain function. Abusing brain-computer interface technology is not the 'evolution' of mankind. Once this loophole is opened, the consequences will be disastrous."

Indeed, brain-computer interfaces face a long list of such questions:

When using a brain-computer interface to enter the metaverse, the user's brain and the Internet are connected, and signals can be transmitted in both directions. In this case, is it the human who is controlling the machine, or is it the machine that is controlling the human? Will human free will be eliminated?

Putting aside the philosophical discussion, the security issue of users entering the metaverse through brain-computer interfaces seems more urgent: If the brain-computer device is hacked, how can the user's privacy be protected? Will the user's personality be affected or tampered with? Who is responsible for storing the user's brain information data? Who will supervise the behavior of brain-computer service providers? How to prevent data and market monopoly formed by brain-computer service providers?

The problem of minors' game addiction has attracted much attention from all walks of life recently. If brain-computer interfaces break the boundary between the real and virtual worlds, will they cause players (especially minors) to become overly addicted, leading to more serious social problems?

In the future, not everyone will have access to brain-computer interface technology. The rich may complete a cognitive upgrade by entering the virtual world of the metaverse, while the poor are shut out of the fruits of that development. Once some people become "superhuman", will this deepen the divisions at every level of society?

A cognitive revolution and human progress do not, by themselves, justify the ethics of a technology. He Jiankui's notorious case is still fresh in people's minds, and it should serve as a warning to other researchers.

In 2018, He Jiankui, a former associate professor at the Southern University of Science and Technology, announced a shocking scientific research result - a pair of twin babies who were genetically edited and naturally resistant to AIDS. As soon as the news came out, it caused an uproar in all walks of life. Although supporters said that the experiment was for the progress of mankind, He Jiankui was convicted of illegal medical practice by the Nanshan District People's Court of Shenzhen in 2019 and was held criminally responsible.

To ease the pressure of public opinion and regulation, most brain-computer interface research now states that it is aimed at clinical patients, such as those with paralysis or autism, offering them a path back to normal life. Director Guan Xiaoying believes that using brain-computer interfaces to diagnose and treat patients for clinical purposes has a certain legitimacy, because technology is being used to restore functions a patient ought to have as a person. But if brain function is modified for entertainment or for the metaverse, whether invasively or non-invasively, then the brain-computer interface goes beyond the scope of a normal person and cannot be accepted by medical ethics.

How is ethical oversight carried out?

Faced with ethical disputes and public doubt, brain-computer interface + metaverse must keep to an ethical standard as its boundary, and that boundary is still unclear today.

Guan Xiaoying explained that when a new medical technology is to be introduced, or a major operation (usually a transplant) is to be performed, the hospital convenes an ethics committee to conduct a review. The committee is made up of senior doctors of long standing and good conduct, all of whom receive special training and are directly responsible for the committee's resolutions. The resolutions must not only stand up to legal scrutiny but also satisfy the public.

"The doctor-patient relationship has been tense in recent years, and the country has become increasingly strict in this regard. The ethics committee will also do their work more and more carefully, and it will not be passed if there is any problem at all," said Director Guan Xiaoying.

Although there are currently no national or industry-wide ethical regulations for brain-computer interfaces, Professor He Huiguang suggests paying attention to the "Ethical Norms for the New Generation of Artificial Intelligence" released by the National New Generation Artificial Intelligence Governance Professional Committee on September 25 this year.

The document emphasizes users' privacy, data security, and right to independent decision-making in the context of artificial intelligence, and it calls on the AI industry to embrace supervision and strengthen self-examination so as to ensure legitimacy and inclusiveness. It is a valuable reference for the direction of brain-computer interface and metaverse technology.

Before the technology is implemented, ethical issues related to brain-computer interface + metaverse must be answered first. Fortunately, there is still a lot of time for people to think about these issues.

4


In closing

At the far end of the metaverse's future, brain-computer interface technology will be there, waiting for it.

On that day, the boundaries between reality and illusion, offline and online, waking life and dreams will no longer be clear. The metaverse will no longer be merely a product, a space, or even a lifestyle. It will compete with the physical world, advance alongside it, and even intertwine with it, becoming a blended "new reality".

On that day, people can freely enter and exit the metaverse world and build a new civilization in the new world. The new civilization that people build will be more magnificent and more prosperous than the current one.

Finally, in the words of the four-line gatha from the Diamond Sutra:

"All conditioned phenomena are like dreams, illusions, bubbles, and shadows; like dew and lightning, they should be viewed in this way."

This article was originally produced by Leifeng.com; the author is Dong Zibo. Reply "reprint" to apply for authorization. Reprinting without authorization is prohibited.



END
