How TI's ARM+DSP dual-core processors achieve collaborative work

Given the complexity of today's applications, SoC chips are well suited to multimedia workloads: they integrate many peripheral interfaces, use an ARM core as the application processor for diverse applications and user interfaces, and use a DSP core for algorithm acceleration, especially media encode/decode. This keeps the algorithms flexible while providing strong processing power. After the first DaVinci series, the DM644x, Texas Instruments (TI) has successively launched a series of ARM+DSP or ARM+video-coprocessor multimedia processor platforms such as the DM643x, DM35x/36x, DM6467, OMAP35x and OMAP-L1x. Many engineers with solid DSP or application-processor backgrounds have turned to DaVinci or OMAP platforms to develop products such as video surveillance, video conferencing and portable multimedia terminals.

Given an ARM+DSP chip architecture, how do you develop the embedded application you want? A traditional chip has a single processor core, either a general-purpose processor such as an ARM, or a DSP. Control and user interfaces are generally implemented on the general-purpose processor, while algorithm or media processing relies on a DSP or dedicated hardware, so many systems are dual-chip designs. The development model is correspondingly simple: ARM chips have ARM emulation tools for OS-based application development, and DSPs have their own tools, such as TI's CCS with XDS510 or XDS560 emulators, for porting, optimizing, tracing and debugging algorithms. The experience required is also relatively narrow.

Faced with an ARM+DSP dual-core architecture, many engineers do not know where to start and raise many questions. ARM engineers are confused about how to use the DSP resources: How is data exchanged? How are the two cores kept synchronized? DSP engineers ask: How is the ARM debugged? How is the DSP started? If the DSP does media acceleration, how does it operate peripherals to receive or send data? With different backgrounds, ARM engineers and DSP engineers look at an SoC from completely different angles, and neither knows how to get started when the chip lands on their desk. Here I will share my experience.

First of all, an ARM+DSP chip is a dual-core chip, with separate instruction sets and compilers for the ARM and the DSP. The SoC can be regarded as two single chips combined, requiring two different sets of development tools. CCS 3.3 can perform chip-level debugging and emulation, but different target platforms must be selected for the ARM and the DSP. Generally the ARM runs an operating system such as Linux or WinCE. Apart from the bootloader, development on the ARM side is basically OS-based: drivers, kernel tailoring and upper-level applications. Debugging relies mainly on logs or the debugger provided with the OS, such as KGDB or Platform Builder. Development on the DSP core is the same as for a traditional single-core DSP: CCS plus an emulator is used for development and debugging.

Secondly, both the ARM core and the DSP core can access the chip's peripherals. The typical arrangement is that the ARM controls all peripherals and manages them through drivers on the OS.
This part is similar to a traditional ARM chip. The DSP mainly accelerates algorithms and only deals with memory. To keep chip resource management consistent, try to avoid having the DSP access peripherals. Of course, depending on the application, the DSP can also drive a peripheral interface to send and receive data, but then the system must be carefully managed to avoid conflicts between the two cores.

Regarding memory, non-volatile storage such as NAND and NOR flash is basically accessed by the ARM. The DSP algorithm code exists as a file in the ARM-side OS file system; the ARM application downloads the DSP program and controls the DSP core. The external RAM, i.e. the DDR space, is shared by the ARM and the DSP, but when designing the system the physical address ranges used by each side must be strictly separated, with part of the space reserved for interaction. Generally the ARM uses the low addresses, the DSP is assigned high addresses through its linker command (CMD) file, and some space in between is reserved for data exchange. For example, in the Linux DVSDK for OMAP3, a 128MB DDR space is divided into three parts: the 88MB from the low address 0x80000000 up to 0x85800000 is used by the Linux kernel; the 16MB from 0x85800000 up to 0x86800000 is used by the CMEM driver for large-block data exchange between ARM and DSP; and the 24MB from 0x86800000 up to 0x88000000 holds the DSP's code and data (a sketch of this partition is given below, after the DVSDK overview).

Chip startup also needs consideration. Normally the ARM boots first and, like a traditional single-core ARM, supports different boot sources such as NAND, NOR, UART, SPI, USB and PCI. The DSP is held in reset by default; only after the ARM application downloads the DSP code and releases the reset can the DSP run. In some scenarios the DSP needs to boot automatically at power-on, and some chips support this mode as well.

Finally, communication and synchronization between the two cores is the problem that bothers most engineers. To make development easier, TI provides the DVSDK development kit with DSPLINK and Codec Engine; based on the DVSDK, ARM+DSP application development becomes very convenient. The following is a brief introduction to the DVSDK software architecture and the function of each module.

The DVSDK is an integration of multiple software modules: pure DSP-side modules, ARM-side modules, and dual-core interaction modules. The DVSDK packages are all built as Real-Time Software Components (RTSC), so you also need to install the RTSC tool XDC, an open-source tool from TI that supports cross-platform development and maximizes code reuse. For pure ARM development you also need the ARM compilation tools and a Linux kernel or WinCE BSP. To develop DSP algorithms or generate DSP-side executables you also need the DSP compiler (cgtools) and DSP/BIOS. To make it easier to configure and build the DSP-side executable, a wizard can generate the codec's RTSC package and executable code, and ceutils and cg_xml can be installed as well. The core of the DVSDK is Codec Engine; all the other software modules are essentially centered around it.
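Before going into Codec Engine, here is the promised sketch of the OMAP3 memory partition described above. Only the three address ranges come from that description; the kernel boot arguments, the CMEM pool layout and the section list are placeholders I made up for illustration, not taken from any particular DVSDK release.

    # (1) U-Boot: restrict the Linux kernel to the low 88MB
    #     (console/root settings are placeholders)
    setenv bootargs 'console=ttyS2,115200n8 root=/dev/nfs mem=88M'

    # (2) Load the CMEM driver over the reserved 16MB window;
    #     the pool layout (count x size) is purely illustrative
    insmod cmemk.ko phys_start=0x85800000 phys_end=0x86800000 \
           pools=8x4096,4x1048576,2x4147200

    /* (3) DSP linker command file (.cmd): place code and data
     *     in the top 24MB                                       */
    MEMORY
    {
        DDR2: origin = 0x86800000, length = 0x01800000
    }
    SECTIONS
    {
        .text  > DDR2
        .const > DDR2
        .cinit > DDR2
        .bss   > DDR2
        .far   > DDR2
        .stack > DDR2
    }

The key point is simply that the three ranges never overlap: the kernel stops where the CMEM window begins, and the DSP image is linked entirely above the CMEM window.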
Codec Engine is the bridge between the ARM and the DSP. It is a software layer between the application layer (the ARM-side application) and the signal-processing layer (the DSP-side algorithms), and both the DSP-side executable and the ARM-side application are built against it. Codec Engine consists of two main parts: an ARM-side application adaptation layer, which provides a streamlined API and the corresponding libraries for applications to use, and a DSP-side algorithm call layer, which defines the interface packaging specification for DSP algorithms so that any compliant algorithm can be built into the DSP executable through simple configuration. The final application downloads the DSP code, invokes the packaged DSP-side algorithms and communicates between ARM and DSP through the Codec Engine API. For an introduction to Codec Engine, please refer to "Helping You Get Started with Codec Engine Quickly".

Underneath Codec Engine, communication between the ARM and the DSP is based on DSP/BIOS Link, the software module that actually implements the interaction between the two cores. Since DSP/BIOS Link is cross-platform, it also has an ARM part and a DSP part. On the ARM side it includes an OS-based driver and library files for applications to call; on the DSP side, DSP/BIOS must be used and the DSP executable must link in the DSP/BIOS Link libraries. The most commonly used DSP/BIOS Link modules are PROC, which controls the DSP core (starting and stopping it, downloading DSP executable code, directly reading and writing DSP memory space, and so on), and MSGQ, on which ARM-DSP communication is based. MSGQ supports polling or interrupt-driven waiting, and the messages themselves are carried in a shared memory pool. Codec Engine exchanges key data through MSGQ, such as control information and the address pointers of large data blocks, while bulk data exchange itself is implemented through CMEM.

On the ARM side, the software modules used together with Codec Engine are LinuxUtils or WinceUtils, which include CMEM, SDMA and others. CMEM allocates physically contiguous memory outside the OS's management and converts between physical and virtual addresses. To avoid multiple data copies, a shared data region that both the ARM and the DSP can access directly must be set aside, and this region is managed through CMEM. For the ARM, CMEM is a driver on the OS, and memory allocation and address conversion are performed through IOCTLs. Since the DSP can access any physical address, any pointer passed from the ARM to the DSP must be a physical address.

To fit the interfaces of common players, the DVSDK also provides DMAI (Digital Media Application Interface). DMAI offers a more streamlined media interface plus OS-based audio/video capture and playback interfaces; the GStreamer plugin under Linux and the DirectShow filter under WinCE are built on DMAI. DMAI also provides basic test applications that can easily be modified for your own tests. If you only call ready-made or third-party algorithm libraries, you only need to understand the ARM-side software modules. Codec Engine and DMAI already provide a rich application interface; the DSP can then be regarded simply as a media accelerator, and the ARM+DSP chip used like an ASIC.
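As an illustration of this ARM-side-only usage, below is a minimal sketch of the flow with the Codec Engine VISA API (here the VIDDEC2 video-decoder class). The engine name "decode", the codec name "h264dec" and the buffer size are placeholders that would come from your own engine configuration; most error handling and the per-frame buffer descriptors are omitted. Engine_open() takes care of downloading and starting the DSP image over DSP/BIOS Link, and Memory_contigAlloc() returns CMEM-backed, physically contiguous buffers that the DSP can access directly.

    #include <stdio.h>
    #include <xdc/std.h>
    #include <ti/sdo/ce/CERuntime.h>
    #include <ti/sdo/ce/Engine.h>
    #include <ti/sdo/ce/video2/viddec2.h>
    #include <ti/sdo/ce/osal/Memory.h>

    #define ENGINE_NAME "decode"        /* placeholder: engine name from the .cfg    */
    #define CODEC_NAME  "h264dec"       /* placeholder: codec registered in the .cfg */
    #define BUF_SIZE    (720 * 576 * 2) /* placeholder working-buffer size           */

    int main(void)
    {
        Engine_Handle  hEngine;
        VIDDEC2_Handle hDecode;
        Engine_Error   ec;
        XDAS_Int8     *inBuf, *outBuf;

        CERuntime_init();               /* initialize the Codec Engine runtime */

        /* Opening the engine loads the DSP executable named in the engine
         * configuration and releases the DSP from reset.                     */
        hEngine = Engine_open(ENGINE_NAME, NULL, &ec);
        if (hEngine == NULL) {
            printf("Engine_open failed: %d\n", (int)ec);
            return -1;
        }

        /* Create a remote instance of the decoder; NULL means "use the codec's
         * default creation parameters" (a real app fills a VIDDEC2_Params).   */
        hDecode = VIDDEC2_create(hEngine, CODEC_NAME, NULL);

        /* Buffers handed to the DSP must be physically contiguous; these come
         * from the CMEM pools reserved outside the Linux-managed memory.      */
        inBuf  = Memory_contigAlloc(BUF_SIZE, 128);
        outBuf = Memory_contigAlloc(BUF_SIZE, 128);

        /* ... per frame: fill inBuf, set up the XDM buffer descriptors and
         * VIDDEC2_InArgs/OutArgs, then call VIDDEC2_process(hDecode, ...);
         * the call is marshalled to the DSP-side algorithm over DSP/BIOS Link */

        Memory_contigFree(inBuf, BUF_SIZE);
        Memory_contigFree(outBuf, BUF_SIZE);
        VIDDEC2_delete(hDecode);
        Engine_close(hEngine);
        return 0;
    }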
If you want to exploit the full performance of the DSP, you need to do DSP-side development. Codec Engine only standardizes the DSP algorithm interface so that the algorithm can be built into the DSP executable together with Codec Engine. For engineers developing DSP algorithms, the work is similar to the traditional single-core DSP development model: operate only the DSP core, develop and optimize the algorithm in CCS, and finally wrap it in the xDM interface.
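To give a feel for what "wrapping it in the xDM interface" means, here is a heavily simplified sketch of the pattern for a video decoder (the IVIDDEC2 class): the algorithm exposes a function table combining the XDAIS IALG lifecycle functions with the xDM process/control pair, and Codec Engine reaches the algorithm only through that table. The MYALG_* names are placeholders, the optional IALG functions are left unused, and a real project would normally generate this skeleton with the Codec Engine codec-package wizard; the exact table type and signatures depend on the codec class and should be checked against the headers under ti/xdais/dm/.

    #include <xdc/std.h>
    #include <ti/xdais/ialg.h>
    #include <ti/xdais/dm/ividdec2.h>

    typedef struct MYALG_Obj {
        struct IALG_Fxns *fxns;   /* first field must point to the v-table */
        /* ... algorithm-private state ... */
    } MYALG_Obj;

    /* --- IALG (XDAIS) memory/lifecycle functions --- */
    static Int MYALG_numAlloc(Void)
    {
        return 1;                 /* we request a single memory record */
    }

    static Int MYALG_alloc(const IALG_Params *params,
                           IALG_Fxns **parentFxns, IALG_MemRec memTab[])
    {
        memTab[0].size      = sizeof(MYALG_Obj);
        memTab[0].alignment = 8;
        memTab[0].space     = IALG_EXTERNAL;
        memTab[0].attrs     = IALG_PERSIST;
        return 1;
    }

    static Int MYALG_init(IALG_Handle handle, const IALG_MemRec memTab[],
                          IALG_Handle parent, const IALG_Params *params)
    {
        return IALG_EOK;          /* set up private state here */
    }

    static Int MYALG_free(IALG_Handle handle, IALG_MemRec memTab[])
    {
        memTab[0].size = sizeof(MYALG_Obj);
        return 1;
    }

    /* --- xDM pair: this is what the ARM-side VIDDEC2_process() reaches --- */
    static XDAS_Int32 MYALG_process(IVIDDEC2_Handle h, XDM1_BufDesc *inBufs,
                                    XDM_BufDesc *outBufs,
                                    IVIDDEC2_InArgs *inArgs,
                                    IVIDDEC2_OutArgs *outArgs)
    {
        /* the actual DSP decoding kernel runs here */
        return IVIDDEC2_EOK;
    }

    static XDAS_Int32 MYALG_control(IVIDDEC2_Handle h, IVIDDEC2_Cmd cmd,
                                    IVIDDEC2_DynamicParams *dynParams,
                                    IVIDDEC2_Status *status)
    {
        return IVIDDEC2_EOK;
    }

    /* Exported v-table: IALG functions followed by the xDM pair */
    IVIDDEC2_Fxns MYALG_IVIDDEC2 = {
        {                       /* IALG_Fxns                         */
            &MYALG_IVIDDEC2,    /* implementationId                  */
            NULL,               /* algActivate   (optional, unused)  */
            MYALG_alloc,
            NULL,               /* algControl    (optional, unused)  */
            NULL,               /* algDeactivate (optional, unused)  */
            MYALG_free,
            MYALG_init,
            NULL,               /* algMoved      (optional, unused)  */
            MYALG_numAlloc
        },
        MYALG_process,
        MYALG_control
    };

Once this table is registered in a codec package and linked into the DSP executable, Codec Engine's stub/skeleton layer marshals the ARM-side VIDDEC2_process() call across DSP/BIOS Link and ends up invoking MYALG_process() on the DSP, which is exactly the division of labor described above.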
