Big Data Analytics for Smart Manufacturing

Publisher: chinapxf | Last updated: 2019-12-17 | Source: eefocus

Over the past few years, Applied Materials has been at the forefront of exploring big data analytics for semiconductor manufacturing. In addition to sponsoring major technical conferences such as the Advanced Process Control (APC) conferences in the United States, Asia, and Europe[1], Applied Materials' work in this area is reflected in papers published in peer-reviewed journals such as the IEEE Transactions on Semiconductor Manufacturing[2]. One such article, published in the Swiss MDPI open-access journal Processes, won the Best Paper Award in 2016 and 2017[3]. The article explores the trends and opportunities for big data analytics in semiconductor manufacturing and provides a roadmap for how analytics can support applications ranging from fault detection to predictive maintenance. This article summarizes the highlights of that paper[4].

 

Shaping the Landscape of Smart Manufacturing Analytics

The term smart manufacturing (SM) is often used to describe the direction in which manufacturing is evolving: integrating upstream and downstream supply chains, merging physical functions with online functions, and using advanced information to improve flexibility and adaptability. Smart manufacturing takes full advantage of data's volume, velocity, variety, and veracity (i.e., data quality), applying so-called "big data" techniques both to improve existing analytics capabilities and to provide new ones, such as predictive analytics.

 

These improvements and new capabilities, summarized in Figure 1, fall under an expanded definition of advanced process control (APC).

 

 

Figure 1. Definition of APC and APC expansion capabilities.

 

The emergence and development of equipment and process analytics in semiconductor manufacturing is, to some extent, a response to three major challenges in the industry. These challenges have existed for decades and, while not specific to the smart manufacturing era or the big data revolution, are particularly acute in semiconductor manufacturing. They are: (1) the complexity of equipment and processes, (2) the dynamic and context-rich nature of processes, and (3) poor data quality in terms of accuracy and availability.

 

These challenges have led to the realization that analytical solutions for the semiconductor industry cannot be purely data-driven. Subject matter expertise (SME) in the tool, process, and analytics domains is also a key component of most fab analytical solutions, and this must always be kept in mind when designing and applying process analytics for semiconductor manufacturing. In practice, the mechanisms for applying SME are often formally defined across data collection, data processing, parameter selection, model building, model and threshold optimization, and solution deployment and maintenance.
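As a minimal illustration of how such SME mechanisms can be made explicit (the structure, parameter names, and limits below are assumptions for illustration, not from the paper), engineering knowledge can be captured as a declarative specification that drives parameter selection and limit checking:

```python
# Minimal sketch: SME knowledge captured as a declarative spec that drives
# parameter selection and limit checking. All names and values are illustrative.

sme_spec = {
    "parameters": ["chamber_pressure", "rf_power"],   # SME-selected sensors
    "exclude_steps": ["pump_down"],                   # steps SME deems irrelevant
    "limits": {"chamber_pressure": (1.8, 2.2),        # SME-set control windows
               "rf_power": (290, 310)},
}

def check_reading(step, reading, spec=sme_spec):
    """Return the (parameter, value) pairs that violate SME-defined limits."""
    if step in spec["exclude_steps"]:
        return []                                     # SME: ignore this step
    violations = []
    for name in spec["parameters"]:
        lo, hi = spec["limits"][name]
        value = reading[name]
        if not (lo <= value <= hi):
            violations.append((name, value))
    return violations

print(check_reading("etch", {"chamber_pressure": 2.5, "rf_power": 300}))
# → [('chamber_pressure', 2.5)]
```

The point of the sketch is that the expertise lives in data, not code: the spec can be reviewed, versioned, and maintained by process engineers independently of the analytics that consume it.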

 

Understanding the Composition of Analytical Technologies in Semiconductor Manufacturing
Over the past decade there has been an explosion of analytical methods, many of which leverage big data. These methods need to be identified and categorized; one way to do this is to define the dimensions of analytical capability and then map specific capabilities onto those dimensions. Figure 2 breaks down the dimensions associated with analytics in semiconductor manufacturing.

 

Figure 2. Dimensions of analytical capabilities, with APC solutions commonly used in semiconductor manufacturing mapped to these dimensions. (Phenomenological models are physical models that embody process knowledge; they are adjusted or tuned using statistical data.)

 

With these dimensions, an analytical application or technique can be defined by its capabilities along each dimension. For example, principal component analysis (PCA), often used in multivariate analysis (MVA), fault detection (FD), and equipment health monitoring (EHM), is an unsupervised, reactive technique. Multivariate analysis is usually static and stateless and does not formally incorporate SME. On the application side, fault detection in today's fabs is largely unsupervised, reactive, univariate, stateless, and statistics-based, with SME incorporated during model development. Using these and other dimensions to characterize analytical techniques and applications provides a framework for identifying capability gaps, opportunities for progress, and a long-term improvement roadmap.
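Unsupervised PCA-based fault detection of the kind described above can be sketched in a few lines. This is an illustrative toy, assuming synthetic sensor data and a Hotelling's T² statistic with an empirically set limit; real FD deployments choose the component count and limits with SME input:

```python
import numpy as np

# Sketch of unsupervised PCA-based fault detection (illustrative only).
rng = np.random.default_rng(0)
baseline = rng.normal(size=(200, 5))            # healthy-tool sensor readings

mean = baseline.mean(axis=0)
X = baseline - mean
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                           # retained principal components
P = Vt[:k].T                                    # loadings (5 x k)
var = (s[:k] ** 2) / (len(X) - 1)               # variance of each component

def t_squared(sample):
    """Hotelling's T^2 statistic of one sample in the PCA subspace."""
    t = (sample - mean) @ P                     # scores
    return float(np.sum(t ** 2 / var))

# Empirical 99% limit from the healthy baseline; a grossly shifted
# reading (pushed along the first principal direction) exceeds it.
limit = np.quantile([t_squared(x) for x in baseline], 0.99)
faulty = t_squared(mean + 10.0 * Vt[0])
print(faulty > limit)                           # → True
```

Note how the technique fits the dimensions in Figure 2: no labels are needed (unsupervised), and the statistic only flags a fault after it occurs (reactive).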

 

Recent developments in APC applications in semiconductor manufacturing reflect a shift from reactive to predictive and even proactive factory control[5]. This is largely due to the explosion of big data, which supports larger and longer-term data archiving, enabling predictive solutions that untangle complex multivariate parameter interactions, characterize system dynamics, suppress disturbances, and filter out data quality issues.
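A toy contrast between reactive and predictive control might look like the following (the linear-drift model, limits, and data are illustrative assumptions, not from the paper):

```python
import numpy as np

# Reactive vs. predictive monitoring of a slowly degrading health metric.
drift = 0.02
readings = 1.0 + drift * np.arange(50)          # drifting wear indicator
LIMIT = 3.0

def reactive_alarm(history, limit=LIMIT):
    """Reactive: alarm only once the limit has already been crossed."""
    return history[-1] >= limit

def predicted_cycles_to_limit(history, limit=LIMIT):
    """Predictive: fit a linear trend and extrapolate cycles remaining."""
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, history, 1)
    if slope <= 0:
        return None                             # no degradation trend detected
    return (limit - history[-1]) / slope

print(reactive_alarm(readings))                 # → False (not failed yet)
print(round(predicted_cycles_to_limit(readings)))  # cycles until the limit
```

The reactive check stays silent until the damage is done, while the predictive estimate gives maintenance planners lead time; this is the essence of the reactive-to-predictive shift described above.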

 

In many cases, the algorithms in these solutions must be rewritten to exploit the parallel computing capabilities of big data platforms so that data can be processed in a timely manner. New algorithms better suited to big data can also be developed. For example, early predictive solutions relied on single-core CPUs and serial processing, but with the advent of the big data era, algorithms such as partial least squares (PLS) and support vector machines (SVM) can run in parallel on server farms. Unsupervised data-exploration techniques such as self-organizing maps (SOM) and generative topographic mapping (GTM) have likewise been rewritten to handle large data volumes, letting users obtain useful analytical results quickly, and computationally expensive statistical techniques such as hidden Markov models (HMM) and particle swarm optimization can be rewritten for far greater efficiency[6].
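The map-reduce pattern underlying such parallelized algorithms can be sketched briefly: the Gram matrix XᵀX at the core of PLS and PCA decomposes into a sum of per-chunk partial products, so chunks can be processed independently and then reduced. (Threads in one process stand in here for the server farm the text describes; this is an illustration of the pattern, not a production implementation.)

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# X^T X is a sum of per-chunk partial products, so the chunks can be
# mapped out in parallel and the partial results reduced by summation.
rng = np.random.default_rng(1)
X = rng.normal(size=(10_000, 8))
chunks = np.array_split(X, 8)                   # partition the rows

def partial_gram(chunk):
    return chunk.T @ chunk                      # per-chunk partial product

with ThreadPoolExecutor(max_workers=4) as pool:
    gram = sum(pool.map(partial_gram, chunks))  # reduce step

print(np.allclose(gram, X.T @ X))               # → True (matches serial result)
```

Because each partial product is independent, the same decomposition scales from threads to processes to a cluster without changing the mathematics.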

 

However, having many techniques and a lot of data does not necessarily lead to more useful analytical results or stronger predictive capability. No single method, or combination of methods, is suitable for every situation; the methods used must be tailored to the specific application and the data at hand. Regardless, we believe that SME will continue to play a leading role in the development and maintenance of these solutions.

 

The Rise of Artificial Intelligence and New Big-Data-Friendly Analytics Techniques

The term artificial intelligence (AI) can describe any device or analytical technique that perceives its environment and takes appropriate actions to achieve a goal. Today, the term usually refers to devices or techniques that mimic functions of the human brain, such as those used in self-driving cars[7]. Artificial neural networks (ANNs) are one example: they have been around for decades and are experiencing a resurgence in the big data era. Deep learning, for instance, is closely related to structured ANNs and uses layered abstraction to improve the quality and speed of large-scale data analysis.

 

Deep learning can address high-dimensional problems in big data analysis, including extracting complex patterns from two-dimensional images (e.g., wafer maps). Deep learning benefits from ever-larger data sets and uses data-driven supervised learning to discover relationships in the data. Its main drawback is that it is relatively difficult to incorporate SME during model development and maintenance[8]. The resulting models are often not directly interpretable and are therefore difficult to evaluate, while the context-richness and dynamism of semiconductor manufacturing limit deep learning's ability to exploit large amounts of consistent data. Recent research has focused on combining SME with AI techniques, an approach that may reach the production floor in the future[9].
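One way to see how SME can complement purely data-driven learning is a wafer-map example: rather than training a deep network, an engineer-defined feature (edge-zone versus center-zone defect density) can separate an edge-ring signature from random defects. The zone definitions, threshold, and synthetic map below are illustrative assumptions:

```python
import numpy as np

def ring_features(wafer_map):
    """SME-defined features: defect density in the center and edge zones
    of a square binary wafer map (1 = defective die, 0 = good die)."""
    n = wafer_map.shape[0]
    yy, xx = np.mgrid[0:n, 0:n]
    r = np.hypot(yy - (n - 1) / 2, xx - (n - 1) / 2) / ((n - 1) / 2)
    center = r <= 0.5
    edge = (r > 0.5) & (r <= 1.0)
    return wafer_map[center].mean(), wafer_map[edge].mean()

# Synthetic edge-ring failure pattern: defects only near the wafer edge.
n = 64
yy, xx = np.mgrid[0:n, 0:n]
r = np.hypot(yy - (n - 1) / 2, xx - (n - 1) / 2) / ((n - 1) / 2)
edge_ring = ((r > 0.85) & (r <= 1.0)).astype(float)

c, e = ring_features(edge_ring)
print("edge-ring signature" if e > 2 * c else "random defects")
```

Two interpretable numbers encode what a process engineer already knows about spatial failure signatures, which is precisely the kind of knowledge that is hard to inject into an end-to-end deep model.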

 

Another big data analytics capability gaining attention is background analysis using solutions commonly referred to as "crawlers"[10]. These applications mine data in the background, looking for relevant patterns or insights, such as a component approaching a failure state, and then asynchronously notify applications such as factory control systems so that appropriate action can be taken. This approach can also improve diagnostic and predictive capabilities.
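A minimal sketch of this asynchronous pattern follows; the rolling-mean degradation rule stands in for real pattern mining, and all names are illustrative:

```python
import queue
import threading

# Background "crawler": mines a data stream on its own thread and posts
# findings to a queue that a consumer (e.g., the factory control system)
# reads asynchronously.
alerts = queue.Queue()

def crawler(stream, window=3, limit=0.9):
    """Alert when the rolling mean of a wear indicator nears failure."""
    for i in range(window, len(stream) + 1):
        mean = sum(stream[i - window:i]) / window
        if mean >= limit:
            alerts.put(("component_near_failure", i - 1, mean))

stream = [0.2, 0.3, 0.5, 0.8, 0.95, 0.97]       # rising wear indicator
worker = threading.Thread(target=crawler, args=(stream,), daemon=True)
worker.start()
worker.join()        # in practice the crawler keeps running in the background

alert = alerts.get_nowait()                     # asynchronous notification
print(alert)
```

The consumer never polls the raw data; it only reacts when the crawler surfaces a finding, which is what lets this style of analysis run continuously without loading the control path.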

 

Looking to the Future: An Analytics Technology Roadmap

As we move toward smart manufacturing, analytics will clearly play an ever larger role in maximizing throughput and reducing cost while maintaining high yield. Advances in big data will continue to drive these analytics technologies forward, and the progress made so far has produced some important findings that will help maximize their impact.

 

The first key finding is that many of the analytical solutions the industry is looking to develop or enhance can leverage the same model-development ("static data") and model-execution/maintenance ("dynamic data") structures. For example, the six-step model development process for predictive maintenance (PdM), summarized in Figures 3a and 3b, can also be used for virtual metrology and even yield prediction. A common approach not only saves time and effort in advancing these techniques, but also lets manufacturers cross-leverage ongoing advances in analytical methods.
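The shared static/dynamic structure can be sketched as one pipeline skeleton reused across applications. The class, method names, and toy data below are illustrative assumptions, not the paper's six-step process itself:

```python
# One skeleton: model development from archived ("static") data, then
# model execution on streaming ("dynamic") data. The same structure is
# reused for different applications (here, toy PdM and virtual metrology).

class AnalyticsPipeline:
    def fit(self, X, y):
        """Model development: least-squares line fit on archived data."""
        xm, ym = sum(X) / len(X), sum(y) / len(y)
        self.slope = (sum((a - xm) * (b - ym) for a, b in zip(X, y))
                      / sum((a - xm) ** 2 for a in X))
        self.intercept = ym - self.slope * xm
        return self

    def execute(self, x):
        """Model execution on a new (streaming) input."""
        return self.slope * x + self.intercept

# Same skeleton, two applications:
pdm = AnalyticsPipeline().fit([0, 1, 2, 3], [100, 90, 80, 70])  # health vs. cycles
vm = AnalyticsPipeline().fit([1.0, 2.0, 3.0], [2.1, 4.1, 6.1])  # sensor vs. thickness

print(pdm.execute(4))    # extrapolated tool health at cycle 4
print(vm.execute(2.5))   # predicted metrology value
```

Only the data and the fitted coefficients differ between the two uses; the development and execution phases, and therefore the surrounding deployment and maintenance machinery, are shared, which is the cross-leveraging the paper describes.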

 

The second key finding is that smart manufacturing will expand the application of these analytical techniques. For example, extending diagnostics, control, and prediction from within the fab out to the supply chain will help manufacturers better understand customer needs and enhance their ability to address issues such as field yield.
