Why is data collaboration so important for chip manufacturing?
Author | Wu You
Data is critical to understanding the lifespan of semiconductors, and data collection is key to staying competitive beyond Moore’s Law.
Today, the early stages of the chip design cycle increasingly draw on multiple data sources, including data fed back from the manufacturing process. While this holistic approach seems logical enough, the semiconductor industry has never been particularly holistic. Expertise has grown up around specific steps in the process, and practitioners specialize ever more narrowly within those steps. But as time to market shrinks, manufacturing complexity rises, and unique architectures become critical to serving increasingly fragmented market segments, these siloed steps no longer suit data-driven design and manufacturing. The biggest challenge now is making all of this data work together. That will be the source of the next order of magnitude of learning in chips.
Currently, the establishment of relevant standards is helping to integrate this data. In addition, third-party analysis and monitoring companies are connecting the various data sources. By extracting, organizing, and storing data in an accessible database format, these companies make it easier for customers to view their data, realizing the kind of chip-wide visibility that generations of semiconductor customers and design, manufacturing, and test engineers have long wanted.
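To illustrate what "extracting and organizing data into an accessible database format" can look like in practice, here is a minimal Python sketch that merges test-floor results from a CSV file and fab process records from a JSON file into one SQLite table keyed by lot ID. The file names, fields, and schema are all hypothetical illustrations, not from the article.

```python
import csv
import json
import sqlite3

# Hypothetical inputs: wafer-sort results as CSV, fab process data as JSON.
# Both are assumed to share a common "lot_id" key.
conn = sqlite3.connect("chip_data.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS lot_data (
        lot_id    TEXT PRIMARY KEY,
        yield_pct REAL,   -- from wafer sort
        fab_tool  TEXT,   -- from fab equipment logs
        run_date  TEXT
    )
""")

with open("wafer_sort.csv", newline="") as f:
    sort_rows = {row["lot_id"]: row for row in csv.DictReader(f)}

with open("fab_runs.json") as f:
    fab_rows = {rec["lot_id"]: rec for rec in json.load(f)}

# Join the two sources on lot_id so downstream queries see one record per lot.
for lot_id, sort in sort_rows.items():
    fab = fab_rows.get(lot_id, {})
    conn.execute(
        "INSERT OR REPLACE INTO lot_data VALUES (?, ?, ?, ?)",
        (lot_id, float(sort["yield_pct"]), fab.get("tool"), fab.get("run_date")),
    )
conn.commit()
```

The point is less the specific schema than the shared key: once disparate sources agree on an identifier such as a lot ID, correlation becomes an ordinary database join.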
1. Understanding data in the context of analog and digital circuits
One of the challenges facing the design process today is understanding data in the context of analog and digital circuits. The results differ depending on whether most or all of the analog circuitry was developed at an older node, and on whether the circuits are digital designs with analog components. Most designers recognize that analog quantities drift and are susceptible to noise even at mature process nodes, but at 5nm and below, digital circuits begin to behave more like analog circuits and are subject to more noise interference. In addition, these nodes have less margin to buffer the effects of fine wires and thin dielectrics.
How this data fits with other data on the digital side is still being worked out. However, the amount of in-circuit monitoring is increasing, for several reasons.
First, this data may not be accessible through conventional testing. Carl Moore, a yield management expert at yieldHUB, a software company that provides yield management and data analysis for semiconductor companies, noted that a PMIC (Power Management IC) contains many nodes that cannot be tested at all. When designing these chips, the first task is to find the testable nodes. A PMIC may have hundreds of them; if you can find and test them, you can characterize those nodes against their specifications or determine whether a signal needs to be amplified to a measurable level.
The second reason has to do with the increasing emphasis on chip reliability in markets such as automotive, where chips are used in safety-critical applications and can have an expected lifespan of up to 20 years.
Carl Moore added that analog circuits and MEMS are harder to test because the signals from sensors are weak and dense. For designers developing these chips, what surrounds the chip and how it is used matter most, and over time more such factors must be taken into account in the design. It is therefore not enough to ensure that the chip functions properly; the data also needs to be in a format that is accessible and can be correlated with data from other systems and devices.
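What such an accessible, correlatable format might look like is easy to sketch. The snippet below shows a hypothetical record for a single in-field, on-chip monitor reading; the field names are invented for illustration, but the idea is that identifiers and a timestamp are what allow the reading to be correlated with other systems later.

```python
import json
import time

# Hypothetical record format for an in-field, on-chip monitor reading.
# Without device/lot identifiers and a timestamp, the reading cannot
# be tied back to the chip's design and manufacturing history.
reading = {
    "device_id": "SN-000123",     # ties the reading to one physical chip
    "lot_id": "LOT-A17",          # ties the chip back to its fab lot
    "monitor": "ring_osc_delay",  # which on-chip sensor produced the value
    "value_ns": 1.82,
    "temp_c": 71.5,
    "timestamp": time.time(),
}
print(json.dumps(reading, indent=2))
```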
2. Data correlation in equipment manufacturing
Semiconductor manufacturing equipment has long been isolated from much of the data used to analyze chip functionality, and work to correlate data across the supply chain is in its early stages. There are several reasons for this.
First, much of the data collected is considered proprietary and competitive in nature, so sharing across the supply chain is limited. Second, equipment vendors primarily interact with their customers, not with each other. Finally, equipment vendors have been offering sensors as add-on products for some time, because no one wants to sell advanced equipment without data analytics and machine learning. As a result, not all data formats are consistent, and not all data values have a clear meaning.
Although the chip industry tends to improve reliability through on-chip sensors and manufacturing processes, data collection and analysis need to permeate the entire supply chain. Foundries, test and packaging houses, and system companies are therefore concerned not only with temperature and voltage measurements, but increasingly with where each technology comes from, when and at what scale it was produced, and how it was made.
Dave Huntley, head of business development at PDF Solutions, a US multinational software and engineering services company, said that if you want equipment to be traceable, you must be able to read its data. Traceability back to material suppliers also requires positioning that is precise enough, and it involves all of the data from all of the factories.
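To make the traceability idea concrete, here is a minimal sketch assuming a hypothetical genealogy table in which each manufacturing step records the ID of its input; none of these record names come from PDF Solutions. Walking the chain from a packaged part back to the raw material batch is then a simple lookup loop.

```python
# Hypothetical parent-of relationships recorded at each manufacturing step:
# packaged part -> tested die -> wafer -> fab lot -> raw material batch.
genealogy = {
    "PKG-9001": "DIE-4410",
    "DIE-4410": "WAF-0733",
    "WAF-0733": "LOT-A17",
    "LOT-A17":  "MAT-55B",
}

def trace_back(unit_id: str) -> list[str]:
    """Return the full provenance chain for a unit, newest first."""
    chain = [unit_id]
    while unit_id in genealogy:
        unit_id = genealogy[unit_id]
        chain.append(unit_id)
    return chain

print(trace_back("PKG-9001"))
# ['PKG-9001', 'DIE-4410', 'WAF-0733', 'LOT-A17', 'MAT-55B']
```

A chain like this is also the basis for the counterfeit screening mentioned below: a part whose ID cannot be walked back to a known lot is immediately suspect.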
The data can also be used to screen out counterfeit chips and components, which is crucial for security.
3. Evolution of data formats in testing
The need for connected test data stems from the increasing complexity within chips, and from the need for sufficient coverage to ensure that a chip performs as expected over its entire lifecycle.
Chip testing is not just a density problem. Large currents flow in and out of local areas of the chip, and the tester must exercise the device at different voltages and different frequencies. In some cases, these chips have 12 different power domains, spread across different areas of the die. The test process therefore generates a great deal of data, which must be supplemented with and correlated against the large volumes of data produced by other processes in the factory. Extracting good data and turning it into usable data takes a lot of work.
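As a minimal sketch of that correlation step (with made-up column names and values, not from the article), the pattern is typically a join on shared keys such as lot and wafer IDs:

```python
import pandas as pd

# Hypothetical final-test results, one row per tested die.
test = pd.DataFrame({
    "lot_id":      ["A17", "A17", "A18"],
    "wafer_id":    [3, 3, 1],
    "die_x":       [10, 11, 4],
    "passed":      [True, False, True],
    "vdd_idle_ma": [12.1, 19.7, 12.4],
})

# Hypothetical fab-side process data, one row per wafer.
process = pd.DataFrame({
    "lot_id":    ["A17", "A18"],
    "wafer_id":  [3, 1],
    "etch_tool": ["ET-02", "ET-05"],
    "film_nm":   [48.2, 51.0],
})

# Correlate test outcomes with the process history of each wafer.
merged = test.merge(process, on=["lot_id", "wafer_id"], how="left")
print(merged.groupby("etch_tool")["passed"].mean())  # yield by etch tool
```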
Test equipment from different manufacturers once ran different operating systems and presented data in different, often incompatible formats. With the advent of electronic data interchange (EDI) and the IEEE's Standard Test Interface Language (STIL), test data has had to become more compatible.
To solve this problem, Teradyne, a manufacturer of automatic test equipment, worked with other suppliers to develop the Standard Test Data Format (STDF). STDF is a binary format that is independent of database architecture and operating system. Because it was developed for automatic test equipment (ATE), it can connect ATE from competing manufacturers and makes data easier to port and store.
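The structure STDF defines is simple enough to illustrate directly. Below is a minimal Python sketch, not a production reader, that walks a file record by record using the 4-byte header defined in the STDF V4 spec. It assumes a little-endian file; a real reader must check the CPU_TYPE field of the leading FAR record to determine byte order.

```python
import struct
from collections import Counter

def count_stdf_records(path: str) -> Counter:
    """Tally the records in an STDF V4 file by (type, subtype).

    Each STDF record starts with a 4-byte header: REC_LEN (2 bytes,
    length of the record body), REC_TYP (1 byte), REC_SUB (1 byte).
    This sketch assumes little-endian byte order.
    """
    counts = Counter()
    with open(path, "rb") as f:
        while header := f.read(4):
            if len(header) < 4:
                break  # truncated file
            rec_len, rec_typ, rec_sub = struct.unpack("<HBB", header)
            counts[(rec_typ, rec_sub)] += 1
            f.seek(rec_len, 1)  # skip the record body
    return counts

# A few well-known (type, subtype) pairs from the STDF V4 spec:
NAMES = {(0, 10): "FAR", (1, 10): "MIR", (1, 20): "MRR",
         (5, 10): "PIR", (5, 20): "PRR",
         (15, 10): "PTR", (15, 20): "FTR"}
```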
The format includes standard record types and global metadata, and covers parametric (pass/fail and multi-pin) as well as functional tests. The binary format can be converted to a more human-readable ASCII format (ATDF, the ASCII version of STDF) for use in a database or Excel spreadsheet. During that conversion, however, lot IDs or headers can be lost, and there are many opportunities for data to go wrong.
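Given how easy it is to drop metadata in that binary-to-ASCII step, a defensive script can catch the most common loss. The sketch below is a hypothetical sanity check, assuming the converted ATDF file stores the lot ID as the first pipe-delimited field of its MIR record; verify that field order against your converter's actual output before relying on it.

```python
def atdf_lot_id(path: str) -> str | None:
    """Pull the lot ID out of an ATDF file's MIR record, if present.

    Assumption (verify against your converter): the lot ID is the
    first '|'-delimited field after the 'MIR:' record prefix.
    """
    with open(path) as f:
        for line in f:
            if line.startswith("MIR:"):
                first_field = line[4:].split("|", 1)[0].strip()
                return first_field or None
    return None

expected = "LOT-A17"  # lot ID known from the test floor's host system
found = atdf_lot_id("converted.atdf")
if found != expected:
    raise ValueError(f"lot ID lost or changed in conversion: {found!r}")
```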
Although STDF is the most widely used format in test, it has limitations. For example, it cannot directly support newer use models in today's test environment, and the format itself is not precise enough. Its likely successor is the Rich Interactive Test Database (RITdb) standard under development by the Collaborative Alliance for Semiconductor Test (CAST) special interest group within SEMI.
4. Summary
Generating better data, collecting data from more sources, and being able to correlate all of it opens up huge possibilities for the entire chip industry, from improved reliability to predictive analytics, where equipment can be repaired or replaced as needed rather than failing without warning.
The chip industry is in the early stages of this shift, but over the next decade it is likely to define every step from design to manufacturing, improving everything from yield to reliability and safety.
This article is translated from https://semiengineering.com/data-becomes-key-for-next-gen-chips/