The Development of Electronic Systems - The Integration of Humans and Embedded Systems

Publisher: 平和的心情 | Last updated: 2014-12-05 | Source: dzsc

  The development of wearable electronic systems, whether for biometrics, communications, or virtual reality, extends the concept of embedded systems into new and uncharted territory. Putting sensors and output devices on the operator has created a new term, the cyborg: the combination of human and embedded system.

  Wearable systems open up new vistas for practical applications and demand a new vision of embedded architecture. Sensor clusters worn as adhesive patches or swallowed in ingestible form are completely cut off from traditional power, ground, and I/O connections. Yet to reach tiny size and near-zero power consumption, these small sensor clusters must still provide local signal processing, storage, and wireless connectivity, and often more. This is the dilemma designers must resolve.

  Partitioning the System

  One way to approach the wearable-system challenge is to start from a traditional embedded design, with its sensors, actuators, and displays, attached to the user's body. Driven by the demands of mobility, comfort, and concealment, the system then has to be pulled apart. Consider how the architecture changes when sensors, output devices, and computing resources are physically separated from one another.

  As an example, consider the design of smart glasses. To avoid a cliché, we will not discuss the familiar consumer products, but will instead look at glasses designed by industrial equipment supplier XOEye. These glasses are used for activities such as parts inspection, inventory handling, and field maintenance. The system features stereo-mounted 720p video cameras, voice input, and LED and voice output, and is designed to interactively help people complete certain predefined tasks.

  XOEye CTO Jon Sharp explained that the glasses capture and analyze the stereoscopic images the user sees, enhancing the ability to distinguish components, measure size and shape without physical contact or measuring tools, and interact with technicians during repairs ("adjust the screw on the left first") or warn of potential safety hazards with a flashing red LED ("Don't go there!").

  The traditional approach to this type of design would mount the camera and microphone on the glasses, then handle video processing, object recognition, and the wireless communication link from a backpack and battery worn on the back. The traditional user response to such a design is to take one look at the backpack and politely decline to use the system.

  This is where the idea of wearability really comes in. XOEye's approach is to make the glasses fully self-contained. That goal obviously imposes tight space and power constraints. There is no magic available, and those constraints force some of the computation to be done remotely, usually in the cloud. But partitioning the computational load brings new design challenges of its own.

  Build Links

  Moving heavy computing tasks to the cloud is not a new concept in the Internet of Things (IoT). Chakra Parvathaneni, senior director of business development at Imagination Technologies, noted that the split varies by application. “A home thermostat has a lot of local processing, but Apple’s Siri is almost all in the cloud,” he said.

  In the case of XOEye, moving the task to the cloud means either having enough bandwidth to send both video streams in raw format, or doing the video compression in real time in the glasses. The latter is possible with existing media processing chips, but requires a battery of suitable size. However, there is another problem.
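  A rough back-of-the-envelope check shows why this choice matters. The sketch below uses assumed figures (720p at 30 frames/s, 4:2:0 sampling, two camera streams, and a nominal 50:1 codec ratio), not XOEye's actual numbers.

```c
/* Rough link-budget check for the two glasses-mounted camera streams.
 * Assumes 1280x720 at 30 fps, 4:2:0 subsampling (12 bits/pixel) and a
 * ~50:1 H.264-class compression ratio -- illustrative numbers only. */
#include <stdio.h>

int main(void)
{
    const double width = 1280, height = 720, fps = 30;
    const double bits_per_pixel = 12;      /* YUV 4:2:0 */
    const double streams = 2;              /* stereo pair */
    const double compression = 50;         /* assumed codec ratio */

    double raw_bps = width * height * bits_per_pixel * fps * streams;
    double compressed_bps = raw_bps / compression;

    printf("Raw stereo video: %.1f Mbit/s\n", raw_bps / 1e6);
    printf("Compressed (assumed %.0f:1): %.1f Mbit/s\n",
           compression, compressed_bps / 1e6);
    return 0;
}
```

  With these assumptions, the raw stereo pair needs well over 600 Mbit/s, while the compressed streams fit comfortably in a WiFi link; the price is the encoding power burned on the glasses.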

  “You have to maintain the human interface and certain functions even when there’s no connection,” Sharp cautions. “For example, when you lose WiFi connectivity, you have to be able to identify safety issues in real time.” Some functions will require continuous real-time response, something that cloud computing at the far end of the Internet cannot guarantee.
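  In firmware terms this usually becomes a split between a safety path that never leaves the device and a best-effort path that does. The sketch below is a minimal illustration of that split; every function in it (link_is_up, local_hazard_check, and so on) is a hypothetical stand-in, not XOEye's actual code.

```c
/* Minimal sketch: the safety-critical check always runs locally, and the
 * heavier analysis is offloaded only while the link is up. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { int id; } frame_t;   /* placeholder for a captured stereo frame */

static bool link_is_up(void)                     { return false; }         /* stub: WiFi lost */
static bool local_hazard_check(const frame_t *f) { return f->id % 2 == 0; }/* stub detector */
static void flash_warning_led(void)              { puts("WARNING LED"); }
static void enqueue_for_cloud(const frame_t *f)  { printf("offload frame %d\n", f->id); }

static void process_frame(const frame_t *f)
{
    /* Safety path: always local, must meet its deadline with no network. */
    if (local_hazard_check(f))
        flash_warning_led();

    /* Best-effort path: object recognition and the like go remote when possible. */
    if (link_is_up())
        enqueue_for_cloud(f);
}

int main(void)
{
    frame_t f = { .id = 42 };
    process_frame(&f);
    return 0;
}
```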

  These issues require local processing, which conflicts with the size, weight, and power constraints of glasses. XOEye originally hoped to solve the problem with an OMAP-class SoC that pairs CPU cores with media accelerators. The OMAP device can handle conventional media processing tasks, but, Sharp lamented, "it is impossible to achieve real-time stereo ranging." So XOEye turned to a CPU-plus-FPGA approach, which let them build energy-efficient local accelerators for whatever tasks the application required.
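  The arithmetic behind stereo ranging itself is simple; the expensive part is the per-pixel disparity search, which is what the FPGA fabric accelerates. Once a disparity is known, depth follows from depth = focal length * baseline / disparity. The focal length and baseline below are illustrative assumptions, not XOEye's optics.

```c
/* Depth-from-disparity, the arithmetic at the heart of stereo ranging. */
#include <stdio.h>

static double depth_m(double focal_px, double baseline_m, double disparity_px)
{
    if (disparity_px <= 0.0)
        return -1.0;                       /* no valid match */
    return focal_px * baseline_m / disparity_px;
}

int main(void)
{
    double f = 800.0;     /* focal length in pixels (assumed) */
    double b = 0.065;     /* camera baseline in metres (assumed) */

    for (double d = 4; d <= 64; d *= 2)
        printf("disparity %5.1f px -> depth %.2f m\n", d, depth_m(f, b, d));
    return 0;
}
```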

  Smart Hub

  Even if operating conditions allow for local connectivity to a wireless hub, the round-trip link from the hub through the Internet to the cloud still introduces unacceptable uncertainty. This is one of the architectural challenges facing the IoT. Given these circumstances, if some computing tasks are to be done outside of the wearable device, then they can be placed on the local wireless hub rather than in the cloud (Figure 1). Of course, this cannot be done using only commercial WiFi hubs.

  Figure 1. The wearable embedded system becomes a wireless network built around a smart hub

  Integrating compute nodes into the WiFi hub greatly increases system design flexibility. The hub is generally not constrained in space or power, so compute and storage resources can be placed there. The short-range WiFi link provides a reliable, broadband, predictable-latency connection, allowing the hub to participate in critical control or human-machine interface loops where unexpected latency would be a problem. In addition, the hub's multitasking CPUs and accompanying accelerators can take over many of the processing tasks from the remote wearable devices.
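  One way to picture the resulting three-tier split is as a latency-budget rule: anything whose deadline is tighter than the WiFi round trip stays on the wearable, anything tighter than the Internet round trip runs on the hub, and the rest can go to the cloud. The sketch below uses assumed round-trip times purely for illustration.

```c
/* Sketch of a latency-driven placement rule for wearable / hub / cloud. */
#include <stdio.h>

typedef enum { RUN_ON_WEARABLE, RUN_ON_HUB, RUN_IN_CLOUD } place_t;

static place_t place_task(double deadline_ms, double wlan_rtt_ms, double wan_rtt_ms)
{
    if (deadline_ms < wlan_rtt_ms)  return RUN_ON_WEARABLE;  /* even WiFi is too slow */
    if (deadline_ms < wan_rtt_ms)   return RUN_ON_HUB;       /* WiFi ok, Internet not */
    return RUN_IN_CLOUD;                                     /* latency-tolerant */
}

int main(void)
{
    const char *names[] = { "wearable", "hub", "cloud" };
    double wlan_rtt = 5.0, wan_rtt = 120.0;   /* assumed round-trip times, ms */

    printf("hazard warning (2 ms):  %s\n", names[place_task(2,    wlan_rtt, wan_rtt)]);
    printf("gesture UI (50 ms):     %s\n", names[place_task(50,   wlan_rtt, wan_rtt)]);
    printf("object catalog (2 s):   %s\n", names[place_task(2000, wlan_rtt, wan_rtt)]);
    return 0;
}
```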

  Smart RF

  What happens when the wearable is much smaller than a pair of glasses, such as a wristband, a shoe insert, or something the size of a large pill? There is no room for a sizable battery, so there is not enough power to keep WiFi always on. The wireless link moves to Bluetooth or an even lower-power short-range scheme. The hub is now a wearable device itself, worn on a belt or carried in a pocket, within a meter of the sensors, or even closer if only near-field links are supported. And the task-partitioning problem changes in interesting ways.

  These constraints define the device. At a minimum there have to be sensors, a controller to poll those sensors, and a wireless interface. With careful tuning of the duty cycle, a tiny battery can support these loads in a low-power device. But where does the computation go now?
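  The duty-cycle arithmetic is worth making explicit: average current is the active fraction times the active current, plus the sleep current. The capacity, currents, and timing below are assumed values for a generic coin-cell sensor node, not figures from any specific product.

```c
/* Back-of-the-envelope battery life for a duty-cycled sensor node. */
#include <stdio.h>

int main(void)
{
    double capacity_mAh = 40.0;    /* small coin cell, assumed */
    double active_mA    = 5.0;     /* sense + radio burst, assumed */
    double sleep_uA     = 2.0;     /* deep sleep, assumed */
    double active_ms    = 10.0;    /* burst length */
    double period_s     = 10.0;    /* one burst every 10 s */

    double duty   = (active_ms / 1000.0) / period_s;      /* fraction of time active */
    double avg_mA = duty * active_mA + sleep_uA / 1000.0;
    double hours  = capacity_mAh / avg_mA;

    printf("average current: %.3f mA\n", avg_mA);
    printf("estimated life : %.0f hours (~%.0f days)\n", hours, hours / 24.0);
    return 0;
}
```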

  The bandwidth between the sensor and the first stage of sensor processing becomes the central question. Can the wireless link carry the raw data stream from the sensor in real time? If not, is it better to spend energy widening the link or on local processing at the sensor? And if the system's usage model changes, does the answer change with it?
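  A quick feasibility check frames the question. The figures below (a 3-axis, 16-bit motion sensor sampled at 1 kHz and a low-power link with roughly 250 kbit/s of usable throughput) are illustrative assumptions, not measurements.

```c
/* Does the raw sensor stream even fit the low-power link? */
#include <stdio.h>

int main(void)
{
    double sample_hz   = 1000.0;    /* 1 kHz motion sensor */
    double bits_sample = 16.0 * 3;  /* 16-bit x/y/z */
    double link_kbps   = 250.0;     /* usable low-power link throughput, assumed */

    double raw_kbps = sample_hz * bits_sample / 1000.0;

    printf("raw sensor stream: %.0f kbit/s\n", raw_kbps);
    if (raw_kbps <= link_kbps)
        puts("fits the link: first-level processing can stay at the hub");
    else
        puts("does not fit: reduce, compress, or process at the sensor");
    return 0;
}
```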

  One way to attack this problem is to rethink the radio. Designers tend to treat the radio interface, baseband processor included, as an untouchable black box. However, Imagination's Parvathaneni recommends looking inside. Imagination Technologies, for example, offers a line of radio processing unit (RPU) baseband subsystems that give system designers more freedom in two ways.

  Internally, the Ensigma RPU (Figure 2) includes a general-purpose MIPS CPU core supported by a set of specialized accelerators, Parvathaneni said. Functionality is therefore software-defined, and users can change the RF air interface by modifying code. That is one dimension of freedom: the power consumed by the baseband can be tuned to match the bandwidth and range requirements of a specific wireless link. Parvathaneni added, "In many cases, the air interface leaves headroom on the MIPS core for the main task." So system designers can choose an air-interface standard and then load additional processing tasks into the RPU, all without changing the hardware design, ready to respond to changes in the wearable system's operating mode. In some cases this flexibility removes the need for a separate MCU or compression engine at the sensor.
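  The practical meaning of "software-defined" here is that the same baseband silicon is reconfigured by loading a different firmware profile, so link bandwidth can be traded against power at run time. The sketch below only illustrates that selection idea; the profile table, its numbers, and the pick_profile() helper are hypothetical and are not Imagination's actual RPU API.

```c
/* Illustration of trading link power for bandwidth in software. */
#include <stdio.h>

typedef struct {
    const char *name;
    double peak_mbps;       /* usable bandwidth */
    double tx_power_mw;     /* rough radio power budget */
} air_profile_t;

static const air_profile_t profiles[] = {
    { "wifi-11n", 65.0, 300.0 },
    { "ble",       1.0,  10.0 },
    { "nfc-like",  0.4,   1.0 },
};

/* Pick the lowest-power profile that still carries the required data rate. */
static const air_profile_t *pick_profile(double needed_mbps)
{
    const air_profile_t *best = NULL;
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (profiles[i].peak_mbps >= needed_mbps &&
            (!best || profiles[i].tx_power_mw < best->tx_power_mw))
            best = &profiles[i];
    return best;
}

int main(void)
{
    const air_profile_t *p = pick_profile(0.2);   /* a 0.2 Mbit/s sensor stream */
    if (p)
        printf("0.2 Mbit/s stream -> %s at about %.0f mW\n", p->name, p->tx_power_mw);
    return 0;
}
```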

  Figure 2. The Ensigma Whisper RPU from Imagination Technologies implements a programmable baseband processor with a MIPS core and a set of low-power accelerators.

  Wireless beyond RF

  As wearable sensors become smaller, lighter, and almost disposable, the hardware needed to support an air interface and the energy it consumes become increasingly important. In response, IP startup Epic Semiconductor makes an intriguing suggestion: do not use radio frequency for the wireless link at all. According to Epic CTO Wolf Richter, the answer is to use electric fields.

  Epic has developed a technology that uses external electrodes—small sheets or foils of conductor—for three different purposes. First, the circuit harvests energy from the surrounding electric field. In three to five seconds, the device can gather enough energy to power a 5 mW load for a short burst. Richter's examples include running a task on an ARM Cortex-M0 at 3 MHz, rewriting a 15 V nonvolatile e-ink display, or briefly activating a 30 V printed circuit.
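  The harvest-then-burst budget is easy to sanity-check. The harvesting rate below is an assumption chosen to be consistent with the few seconds of charging per burst described above; the real figure depends on the ambient field and electrode size.

```c
/* Rough energy budget for harvest-then-burst operation. */
#include <stdio.h>

int main(void)
{
    double harvest_uw = 150.0;   /* assumed average harvested power, uW */
    double harvest_s  = 4.0;     /* charge time between bursts */
    double load_mw    = 5.0;     /* e.g. Cortex-M0 burst at 3 MHz plus peripherals */

    double stored_uj = harvest_uw * harvest_s;                 /* energy per cycle */
    double burst_ms  = stored_uj / (load_mw * 1000.0) * 1000.0;

    printf("energy stored per charge cycle: %.0f uJ\n", stored_uj);
    printf("supports a %.1f mW load for about %.0f ms\n", load_mw, burst_ms);
    return 0;
}
```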

  Second, the Epic intellectual property (IP) is able to detect any physical phenomenon that modulates an electric field, just like a typical capacitive sensor. For example, Richter said, the device could detect the presence of a person about three meters away. Other more mundane applications include measuring the dielectric constant of nearby surfaces—from which the system can infer the body's temperature, pulse, and muscle activity. Or, in a completely different context, the sensor could infer the degree of spoilage on the surface of packaged meat from changes in the dielectric constant.

  Finally, this technology can use the same electrodes for bidirectional signaling, enabling RF-free near-field communication by monitoring and modulating the electric field on the electrodes. In this way, a 0.25 mm² silicon chip can provide power, sensing, and connectivity for smart surfaces, patches, or similar media.
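  One plausible way to signal over such a link is load modulation, much as passive NFC tags do: the chip switches the load it presents on the electrode, and a nearby receiver detects the resulting field changes. The sketch below is purely illustrative; set_electrode_load() is a hypothetical hardware hook, and Manchester framing is just one reasonable encoding choice, not Epic's documented scheme.

```c
/* Illustrative load-modulation signalling over a single electrode. */
#include <stdbool.h>
#include <stdio.h>

static void set_electrode_load(bool heavy)
{
    /* Stand-in for the hardware hook that switches the electrode load. */
    printf("%d", heavy ? 1 : 0);
}

/* Manchester-encode one byte: each bit becomes a low->high or high->low edge. */
static void send_byte(unsigned char b)
{
    for (int i = 7; i >= 0; i--) {
        bool bit = (b >> i) & 1;
        set_electrode_load(!bit);   /* first half-symbol */
        set_electrode_load(bit);    /* second half-symbol: the edge carries the bit */
    }
    printf("\n");
}

int main(void)
{
    send_byte(0xA5);   /* example payload */
    return 0;
}
```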
