ARM and Intel, which one can replace DSP?

Author: Elvis Liu Neng

(I) Why did Freescale and ADI "give up" DSP? In my view, Freescale and ADI stopped developing DSPs simply because they could not compete with TI. As commercial companies, they had to weigh carefully whether to keep pouring money into a high-investment, low-return product line; after all, few are willing to run a business at a loss.

(II) Don't ignore the soft power of DSP! DSP covers almost every field: control, audio, video and image processing, pattern recognition, and more. The hardware advantage is only one side of the story. Texas Instruments also provides a wealth of high-performance algorithms; you might find it hard to believe how many there are, especially for video and audio processing. Every year a large share of R&D spending flows into algorithm research, and licensing those algorithms commands a high fee. This ecosystem also rests on the fact that DSP supports both C and assembly and ships with a good compiler. In this way Texas Instruments has put its own stamp on DSP: it is used throughout the computing world, from control to digital video, digital imaging, audio, and artificial intelligence. I believe this soft power, as much as the silicon, underpins DSP's stable position.

(III) The pain of FPGA software. FPGAs from vendors such as Xilinx have ample performance and a native parallel processing mechanism, which would seem to threaten DSP's position. But they carry an inherent drawback: hardware description languages such as Verilog and VHDL are far less flexible than C. For high-end, complex algorithms they are "strong but with nowhere to apply that strength". Even when such algorithms do get implemented, development cost is high, the manpower and resources required are staggering, and portability is poor. FPGA companies have long coveted DSP's position, but what gives them a headache is how incomplete their algorithm coverage is across application fields.

Looking at FPGA from DSP's vantage point, the road ahead is long: how to overcome the inflexibility of HDLs; how to express the mathematical algorithms of each engineering field; how to optimize them; how to build a rich algorithm library; how to solve porting; how to marry computation with the parallel fabric; how to surpass DSP in computational efficiency. Even if we grant the assumption that FPGA vendors find an excellent answer to algorithm flexibility, how long would it take to build out algorithm libraries in every field? And once built, how efficient are they, what is the user experience, is there a large user base, and can the investment be earned back? DSP did not get from its birth to today easily. It took time to accumulate and settle, and countless people reading the market, gambling, and thinking. The common algorithms have been exercised by countless users, optimized, fed back, optimized again, and fed back again. Must FPGA not go through all of this too? And will DSP stop developing in the meantime, sitting there waiting for FPGA to catch up so they can advance together? Of course not; DSP keeps moving as well.

So DSP holds real advantages in both hardware and soft power.
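To make the flexibility gap concrete, consider the kind of kernel at issue. A 4-tap FIR filter is a handful of lines of C, and DSP toolchains map its multiply-accumulate loop straight onto hardware MAC units; expressing and verifying the same filter in an HDL takes far more effort. (A minimal sketch: the taps here are an arbitrary moving average, not from any TI library.)

```c
#include <stddef.h>

#define NTAPS 4
/* Illustrative taps: a simple 4-point moving average. */
static const float h[NTAPS] = {0.25f, 0.25f, 0.25f, 0.25f};

/* y[i] = sum over k of h[k] * x[i-k], treating x[j] = 0 for j < 0.
   The inner loop is one multiply-accumulate (MAC) per tap -- the
   operation DSP instruction sets execute in a single cycle. */
void fir(const float *x, size_t n, float *y) {
    for (size_t i = 0; i < n; ++i) {
        float acc = 0.0f;
        for (size_t k = 0; k < NTAPS; ++k)
            if (i >= k)
                acc += h[k] * x[i - k];
        y[i] = acc;
    }
}
```

In C, changing the filter means editing one table and recompiling; in an HDL the same change may ripple through pipelining and timing closure.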

(IV) DSP's role as an auxiliary. Intel's Core i7 processors include integrated graphics, yet computers on the market usually carry a discrete graphics card anyway. Intel could certainly beef up the integrated graphics until it replaced discrete cards, but doing so would raise both the CPU's power consumption and its cost. Some users need that capability and some do not; for those who do not, it would be pure waste, and even those who do need it use it only occasionally, so it would idle the rest of the time. Besides, people like important problems handled by a specialist: for gaming or video rendering, a discrete card shows its skill when called on and sleeps when it is not. I think Texas Instruments takes the same attitude toward DSP: professional hardware for the critical problem, super-intensive computation only at specific moments. DSP's positioning, then, resembles the discrete graphics card: a good auxiliary.

(V) How should DSP be positioned and developed? Since around 2008, TI has stopped developing and innovating standalone DSPs, and DSP now usually appears as a coprocessor inside high-end application processors. The famous DaVinci platform uses an ARM+DSP architecture, and some of TI's Sitara ARM parts integrate C6000-series DSPs. The AM5728, for example, pairs two Cortex-A15 cores with two C66x DSP cores and specializes in video and audio processing. You might say the A15 core is so powerful that it will replace the DSP, but the facts say otherwise: TI has "combined the two" into something new. This is not replacement but cooperation. My reading is that TI has been repositioning DSP. The trend is not toward being independent and mighty; TI sensed long ago that a lone DSP is not competitive, just as a strong soul needs a strong body. DSP will not stand alone the way ARM and FPGA do; it is becoming less a standalone processor and more a capability, an integrated core.
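The division of labor on such an ARM+DSP part can be sketched in plain C. All names below are illustrative, not TI APIs: on a real AM5728 the kernel would be dispatched to a C66x core through TI's IPC or OpenCL runtime, while here both halves run on the host so the structure can be followed and tested.

```c
#include <stddef.h>

/* "DSP side": the dense arithmetic -- mean signal energy of one frame.
   This stands in for a kernel that would really run on a C66x core. */
float dsp_frame_energy(const float *frame, size_t n) {
    float acc = 0.0f;
    for (size_t i = 0; i < n; ++i)
        acc += frame[i] * frame[i];   /* the MAC loop a DSP excels at */
    return acc / (float)n;
}

/* "ARM side": control logic -- decide whether a frame is active,
   delegating the heavy math to the DSP-side kernel. */
int arm_is_active(const float *frame, size_t n, float threshold) {
    return dsp_frame_energy(frame, n) > threshold;
}
```

The point of the sketch is the split itself: the ARM core owns buffering and decisions, the DSP core owns the arithmetic, and neither replaces the other.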

(VI) What are the advantages of FPGA? The FPGA traces back to the foundation of every digital chip: a fabric of countless logic gates. I have asked people who design chips: even today's ARM, DSP, and Intel processors, setting aside their integrated analog functions, are built by first designing a logic "mother chip" and then layering on peripherals, interfaces, and so on until the device can be programmed and instructions compiled and executed. FPGA skips that process; it is, in effect, programming the "mother chip" directly. The project is completed at the logic-design stage, which is a bit like creating the chip's function yourself. ARM and DSP each come with a fixed development framework, but FPGA is unconstrained. In other words, an FPGA can break through the bottlenecks of the ARM and DSP parts on the market. Suppose you need a very demanding video decoder: even a 2.0 GHz DSP cannot meet the real-time requirement, yet an FPGA running at a few hundred MHz may solve it, because the parallel fabric can widen the datapath almost without limit. Resource consumption is severe, of course, but then again, no matter the price, it means something in this world can solve your problem. Intel's latest cloud-computing plans to adopt FPGAs as compute coprocessors are an endorsement of FPGA. By the look of the trend, FPGA will take on the glorious mission of high-end computing, but that does not mean DSP is declining. For the general-purpose algorithms that DSP already handles well within real-time limits, with its mature protocols and interfaces, why would I choose an FPGA? In short, each has its own positioning and advantages.
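The "parallel width" point can be modeled in a few lines. Sequentially, a dot product of n elements takes n multiply-accumulate steps; an FPGA that instantiates W hardware MAC units retires W per clock, so time scales as roughly n/W. The C below only simulates this with W independent accumulators (a sketch; WIDTH is an arbitrary choice, and real fabric can go far wider), but the independence of those accumulators is exactly what lets hardware run them in the same cycle.

```c
#include <stddef.h>

#define WIDTH 4   /* hypothetical number of parallel MAC units */

/* Dot product with WIDTH independent accumulators. On a CPU or DSP this
   is mere loop unrolling; in FPGA fabric each accumulator would be a
   physical multiplier-adder, all firing in the same clock cycle, so the
   step count drops from n to about n / WIDTH. */
float dot(const float *a, const float *b, size_t n) {
    float acc[WIDTH] = {0};
    size_t i = 0;
    for (; i + WIDTH <= n; i += WIDTH)
        for (size_t j = 0; j < WIDTH; ++j)  /* WIDTH independent MACs */
            acc[j] += a[i + j] * b[i + j];
    for (; i < n; ++i)                      /* leftover elements */
        acc[0] += a[i] * b[i];
    float sum = 0.0f;
    for (size_t j = 0; j < WIDTH; ++j)      /* final reduction */
        sum += acc[j];
    return sum;
}
```

This is also why the resource cost is severe: every extra unit of width is another physical multiplier consumed on the die.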

(VII) Personal thoughts. As embedded engineers, we should hope the tools we work and solve problems with become sharper, easier to use, and more efficient, rather than gloat over the demise of an industry. The right attitude is to watch calmly and let things take their course. We cannot steer the direction of an information product; such products follow their own objective laws, and their fate rests in the hands of the giant companies. A single failed decision may doom an entire product line; a well-judged one may make it flourish. Nothing is absolute. If FPGA is the better tool, we use FPGA; if DSP is, we use DSP. We cannot refuse to try other tools just because one is comfortable, nor cling to our "old ways" while tools advance. The times develop and science progresses. When Empress Dowager Cixi first saw a train, she went pale with fright; that looks stupid and funny today, but what about in that era?

This post is from Microcontroller MCU
 
