Proceedings Volume 6155

Data Analysis and Modeling for Process Control III

View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 10 March 2006
Contents: 5 Sessions, 21 Papers, 0 Presentations
Conference: SPIE 31st International Symposium on Advanced Lithography 2006
Volume Number: 6155

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Modeling
  • APC
  • CD Control
  • CD and Overlay Control
  • Poster Session
Modeling
First look at across-chip performance and process noise using non-contact performance-based metrology
Majid Babazadeh, Bertrand Borot, Wim Doedel, et al.
We report on the first non-contact, non-destructive performance measurements of embedded ring oscillators. Measurements are made inside the die active area as early as Metal 1. A 90nm logic CMOS technology was used for this work. We have measured residual across-field performance process noise, as well as variation that is separate from, and of opposite sense to, the wafer-level uniformity. This effect cannot be extrapolated from scribe measurements.
Process influence study on optical model generation during model-based OPC development
ChinTeong Lim, Vlad Temchenko, Woong-Jae Chung, et al.
Optical and process models are used in conjunction with Mentor's Calibre OPC tool to predict the behavior of a lithography process. Resist models rely exclusively on empirical measurement data, while optical models are calibrated partly from the user's knowledge of tool settings and partly by fitting unknown parameters to empirical measurements. The final OPC model combines the optical and process (resist and other) behavior predictions and must meet the ever-increasing accuracy demands of advanced lithography nodes such as 90nm and below. This reliance of optical model creation on empirical measurement data raises the question of how well the derived diffraction model describes the way light energy is distributed inside the resist. Past work has addressed the effect of defocus on the final model outcome and introduced methodologies for better defocus prediction and simulation quality; in this work, the existing strategy for decoupling resist and optical model effects is investigated in further detail.
Dry-etch proximity function for model-based OPC beyond 65-nm node
Dry-etch two-dimensional (2D) model functions have been investigated via 2D SEM image analyses. To evaluate dry-etch bias with respect to its 2D geometry, critical 2D pattern shapes before and after the dry-etch process were compared. From the geometrical evaluation results we have confirmed that dry-etch biases can be expressed by a linear function of 2D pattern/space densities, for which the integration should be taken only inside the nearest-neighbor pattern edges. It is presumed that these specific densities are required to estimate the thickness of the passivation polymer films deposited on the trench sidewalls during etching, which is assumed to be a critical factor for etch bias variations. Good correlations between etch bias and inside-edge pattern/space density have been obtained, with correlation coefficients of 0.95 and 0.94 for the SiO2 and Si trench etching processes, respectively. Optimum kernel radii for these processes were about 600 nm - 800 nm in our experiment; these distances indicate the scope of the micro-loading effect. If device pattern complexity approaches these sizes, dry-etch bias correction will require a 2D model function in place of the current rule-based correction.
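A minimal sketch of the kind of density-driven etch-bias model the abstract describes: local pattern density is computed inside a circular kernel (here 700 nm, within the reported 600-800 nm optimum) and mapped to an etch bias through a linear function. The raster pitch, the toy layout, and the fit coefficients a and b are hypothetical, and the simplification to a plain density convolution (rather than integrating only inside nearest-neighbor edges) is ours.

    # Hypothetical density-kernel etch-bias sketch (not the paper's model).
    import numpy as np
    from scipy.signal import fftconvolve

    GRID_NM = 10.0           # raster pitch in nm per pixel (assumed)
    KERNEL_RADIUS_NM = 700   # inside the 600-800 nm range reported above

    def disk_kernel(radius_nm, grid_nm):
        r = int(round(radius_nm / grid_nm))
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        k = (x * x + y * y <= r * r).astype(float)
        return k / k.sum()   # normalized so the output is a density in [0, 1]

    def etch_bias_map(pattern, a=2.5, b=-8.0):
        """pattern: binary array, 1 = pattern, 0 = space.
        Returns per-pixel etch bias in nm; a and b are invented fit terms."""
        density = fftconvolve(pattern.astype(float),
                              disk_kernel(KERNEL_RADIUS_NM, GRID_NM), mode="same")
        return a + b * density   # linear model: bias = a + b * local density

    # toy layout: a 1 um wide line in a 4 um x 4 um clear field
    layout = np.zeros((400, 400))
    layout[:, 150:250] = 1.0
    print(etch_bias_map(layout).mean())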
Hyper-NA model validation for the 45-nm node
Over the years, process development engineers have found creative ways to extend the capabilities of existing imaging techniques to enable production of the next technology node. For the 45nm node, immersion technology is being prepared for production, along with other resolution enhancement techniques such as illuminator polarization. In parallel with the development of these tools, modeling techniques are being developed, which are needed in order to establish the design flows and to set up Optical Proximity Correction (OPC) and mask data preparation. There is a clear need to validate these models and verify them at an early stage. With the equipment not yet available, other methods such as Maxwell simulators and special test equipment are used for such validations. In this paper, initial verification and validation work is presented for hyper-NA models developed for the 45nm technology node. Models with different illuminator settings are used and compared with Maxwell simulators and with experimental measurements obtained with an Exitech MS-193i immersion micro-exposure tool.
Multivariate visualization techniques in statistical process monitoring and their applications to semiconductor manufacturing
In today's semiconductor industry, massive amounts of data are readily available from sensor-equipped, computer-controlled processes, but at the same time the visualization of such high-dimensional data has been difficult. This work gives an overview of the most commonly used multivariate techniques for the visualization of high-dimensional data. The visualization of process dynamics in the original variable space is discussed. This work also presents visualization of data clusters in a transformed space using projection methods such as principal component analysis (PCA), class preserving projection (CPP), and methods proposed in this work based on Fisher discriminant analysis (FDA) and support vector machines (SVM), together with a discussion of visualizing process dynamics in the transformed low-dimensional space. A rapid thermal annealing process data set is used throughout the paper as the example data set for comparing the various visualization techniques.
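As an illustration of the projection methods listed above, the sketch below projects a synthetic high-dimensional "process" data set to low dimension with PCA and with Fisher discriminant analysis (via scikit-learn's LDA). The data and class labels are invented stand-ins for the rapid thermal annealing data set used in the paper.

    # Sketch: 2D visualization of high-dimensional process data via PCA and FDA/LDA.
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    # two synthetic "process states", 50 runs each, 20 sensor channels (invented)
    X = np.vstack([rng.normal(0.0, 1.0, (50, 20)),
                   rng.normal(0.8, 1.2, (50, 20))])
    y = np.repeat([0, 1], 50)

    Z_pca = PCA(n_components=2).fit_transform(X)
    Z_fda = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
    ax1.scatter(Z_pca[:, 0], Z_pca[:, 1], c=y); ax1.set_title("PCA projection")
    ax2.scatter(Z_fda[:, 0], np.zeros_like(Z_fda[:, 0]), c=y); ax2.set_title("FDA/LDA projection")
    plt.show()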
APC
Design and use of multivariate approach error analysis APC system
Jean de Caunes, Joost van Herk, Scott Warrick, et al.
Ongoing complex integration schemes and developments in processes present significant challenges to lithography in manufacturing advanced semiconductor integrated circuits. Although APC solutions are in place to assist in achieving robust CD control and overlay, there is a great need to increase the 'knowledge' of the system with respect to other contributors impacting the process. The problem becomes more complex in the case of an ASIC prototyping fab, where there is no high-volume 'big runner' concept. This leads to a requirement for product-effect management (product layout and reticle impact). For this reason, we developed the multivariate R2R controller. This paper discusses the multivariate methodology and results of a new R2R regulation algorithm in a 65nm node process. Specifically, parameters such as linear combinations of terms, alignment variation for overlay modeled parameters (inter-field/intra-field), and CD impacts (reticle, process, tool, STI stack, etc.) are studied. New solutions for future technology nodes are presented in this paper. For each contributor, the approach includes a multivariate method to assess vector responses and noise contributions, applied to CD and overlay measurement feedback. For each source of variation (or "contributor"), the multivariate controller provides the estimated level of compensation required to meet the target and the level of noise induced on lot processing. The multivariate R2R controller currently runs in production, so a real evaluation of the existing sources of variation and noise is possible and is demonstrated. The result is a significant improvement in regulation performance.
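A minimal sketch of a contributor-keyed run-to-run update, illustrating the general idea of estimating a separate compensation per source of variation (reticle, tool, etc.). The context keys, the EWMA weight, and the equal splitting of the residual error across contributors are our assumptions, not the paper's algorithm.

    # Sketch: per-contributor EWMA run-to-run state, summed into one correction.
    from collections import defaultdict

    class MultivariateR2R:
        def __init__(self, lam=0.3):
            self.lam = lam
            self.offsets = defaultdict(float)   # one state per contributor value

        def correction(self, context):
            # context: dict such as {"reticle": "R123", "tool": "scanner05"} (hypothetical keys)
            return sum(self.offsets[(k, v)] for k, v in context.items())

        def update(self, context, measured_error):
            # split the residual error equally over the contributors (a simplification)
            share = measured_error / len(context)
            for k, v in context.items():
                key = (k, v)
                self.offsets[key] = (1 - self.lam) * self.offsets[key] + self.lam * share

    ctrl = MultivariateR2R()
    ctx = {"reticle": "R123", "tool": "scanner05"}
    ctrl.update(ctx, measured_error=3.0)    # e.g. nm of overlay error fed back
    print(ctrl.correction(ctx))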
Application of integrated scatterometry measurements for a wafer-level litho feedback loop in a high-volume 300 mm DRAM production environment
With critical dimensions in microelectronics devices shrinking to 70nm and below, CD metrology is becoming more and more critical, and additional measurement information will be needed, especially for sidewall profiles and profile height. Integrated scatterometry is, on the one hand, giving the needed measurement precision, and on the other hand, it enables more measurements than stand-alone metrology. Both high precision and large sampling are needed for future technology nodes. This paper shows results from several full volume DRAM applications of state-of-the-art technology nodes on 300 mm wafers. These applications include critical line/space (L/S) layers as 2D applications and contact-hole (CH) layers consisting of elliptical CH-like structures as critical 3D applications. The selected applications are significantly more challenging with respect to scatterometry model generation than the applications presented in previous papers [1, 2]. Simultaneously, they belong to the most critical lithography steps in DRAM manufacturing. In the experiments, the influences of both pre-processes and the litho cluster on Critical Dimension Uniformity (CDU) have been investigated. Possible impacts on Run-to-Run systems like Feed-back and Feed-forward loops will also be discussed. We show that using integrated scatterometry can significantly increase the productivity of lithography clusters.
Real-time spatial control of steady-state wafer temperature during thermal processing in microlithography
Arthur Tay, Weng-Khuen Ho, Ni Hu, et al.
An in-situ method to control steady-state wafer temperature uniformity during thermal processing in microlithography is presented. Based on first-principles thermal modeling of the system, the temperature of the wafer can be estimated and controlled in real time by monitoring the bake-plate temperature profile. This is useful because production wafers usually do not have embedded temperature sensors; bake-plates are instead calibrated using test wafers with embedded sensors. However, since processes are subject to drifts, disturbances and wafer warpage, real-time correction of the bake-plate temperatures to achieve uniform wafer temperature at steady state is not possible in current baking systems. Any correction is done with run-to-run control techniques, which depend on the sampling frequency of the wafers. Our approach operates in real time and can correct for any variation in the desired steady-state wafer temperature. Experimental results demonstrate the feasibility of the approach.
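A minimal sketch of estimating wafer temperature from a measured bake-plate trace with a first-order lumped thermal model, dTw/dt = (Tp - Tw)/tau. The time constant and the plate trace are illustrative; the paper's first-principles model is more detailed than this.

    # Sketch: first-order lumped estimate of wafer temperature from the plate trace.
    import numpy as np

    def estimate_wafer_temperature(t, plate_temp, tau=4.0, t_wafer0=23.0):
        """t: time stamps (s), plate_temp: measured bake-plate temperature (degC).
        tau is an assumed wafer-plate thermal time constant."""
        tw = np.empty_like(plate_temp)
        tw[0] = t_wafer0
        for i in range(1, len(t)):
            dt = t[i] - t[i - 1]
            tw[i] = tw[i - 1] + dt * (plate_temp[i - 1] - tw[i - 1]) / tau
        return tw

    t = np.linspace(0, 60, 601)                      # 60 s bake, 0.1 s sampling
    plate = 90.0 - 5.0 * np.exp(-t / 3.0)            # plate dips then recovers (synthetic)
    print(estimate_wafer_temperature(t, plate)[-1])  # approximate steady-state wafer temp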
Advanced process control of poly-silicon gate critical dimensions
The ability to control critical dimensions of structures on semiconductor devices is paramount to improving die yield and device performance. Historical methods of controlling critical dimensions include tool matching, restricting processes to a limited tool set, and extensive monitoring. These methods have proven reasonably effective, but they are labor and resource intensive. The next level of factory performance is to automate corrections and drive critical dimensions to target. Use of this type of control, commonly referred to as Advanced Process Control (APC), has been a trend that is increasingly becoming the norm in our industry. This paper outlines the implementation of a controller that effectively targets Final Inspection Critical Dimensions (FICDs). This is accomplished by feeding Develop Inspection Critical Dimensions (DICD), FICD, chip code and chamber information into a model, which the controller uses to recommend recipe parameters. These recipe parameters are transferred to the tool automatically using an XML protocol. Offsets and disturbances are effectively adjusted for. This controller has been implemented in a production facility and has resulted in a 70% improvement in Cpk performance.
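A minimal sketch of the two ingredients described above: a feed-forward of the measured DICD into an etch recipe parameter so the FICD lands on target, and the Cpk metric used to judge the controller. The linear gain (nm of FICD removed per second of trim), the base recipe value, and the spec limits are invented.

    # Sketch: DICD -> recipe feed-forward plus the Cpk capability metric.
    import numpy as np

    def recommend_trim_time(dicd, ficd_target, base_trim_s=10.0, nm_per_s=1.5):
        """Extra trim time to remove the predicted (DICD - target) excess (hypothetical gain)."""
        excess_nm = dicd - ficd_target
        return base_trim_s + excess_nm / nm_per_s

    def cpk(x, lsl, usl):
        mu, sigma = np.mean(x), np.std(x, ddof=1)
        return min(usl - mu, mu - lsl) / (3 * sigma)

    print(recommend_trim_time(dicd=95.0, ficd_target=90.0))
    print(cpk(np.random.default_rng(1).normal(90, 1.2, 200), lsl=84, usl=96))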
Spatial modeling of micron-scale gate length variation
Systematic gate length variations caused by microlithographic processing [1] have a significant impact on the variability of circuit performance. In this work, rigorous simulation shows the importance of proper spatial modeling. More specifically, the Monte Carlo framework used in [2] has been recast into an analytical macromodel-based Matlab framework. With accuracy in the 2% range, this analytical model allows for much faster critical path variability analysis. The analytical framework has been used to evaluate the assumptions made about the structure of spatial variation in [2]. It has been shown that a more rigorously-defined nested variance model of spatial variation yields substantially different circuit performance variability results. To further establish the nature of spatial variation in the sub-mm regime, a new set of electrical linewidth metrology (ELM) test structures is proposed. These ELM structures enable the measurement of critical dimensions of neighboring polysilicon lines packed at maximum density. Dummy lines may also be inserted between the measurable polysilicon lines, allowing for measurement of near-neighbor lines and thereby increasing the total measurable range. With the fine granularity and wide range of these test structures, spatial variation and correlation in the separation range of 0.2μm to 1.0mm can be measured.
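A minimal sketch contrasting a nested variance model with a single lumped sigma: gate length is sampled as a nominal value plus wafer-level, die-level, and within-die terms. The sigma values are illustrative, not the paper's measured components.

    # Sketch: nested (wafer / die / local) variance model of gate length.
    import numpy as np

    rng = np.random.default_rng(0)
    L_NOM, S_WAFER, S_DIE, S_LOCAL = 45.0, 1.0, 0.8, 0.5   # nm, assumed values

    def sample_gate_lengths(n_wafers=5, n_dies=20, n_gates=100):
        out = np.empty((n_wafers, n_dies, n_gates))
        for w in range(n_wafers):
            dw = rng.normal(0, S_WAFER)                 # wafer-level offset
            for d in range(n_dies):
                dd = rng.normal(0, S_DIE)               # die-level offset
                out[w, d] = L_NOM + dw + dd + rng.normal(0, S_LOCAL, n_gates)
        return out

    L = sample_gate_lengths()
    # within-die spread is much smaller than the total spread across wafers
    print(L.std(), L.std(axis=2).mean())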
CD Control
Full-field exposure control implications of the mask error function
This report considers a detailed method for rapid and accurate experimental calculation of the Mask Error Enhancement Function (MEF or MEEF) using localized CD variation across the exposure field. MEEF is defined as the non-constant response of wafer-image replication to small changes in the reticle image. The method for extracting the MEEF response of a reticle to its process environment is shown to contain a way of measuring the robustness of the OPC design structures on the reticle and their ability to compensate wafer-image replication across the scope of production-process perturbations. This study demonstrates that a MEEF response can be characterized by a regressive comparison of reticle and wafer image sizes for any reticle OPC structure. Expanding the analysis to a focus-dose matrix that approximates normal production variations allows the MEEF response sensitivities to be deconvolved into their component contributions to critical feature variation across the wafer. Intra-field dependencies, such as sensitivity to the scan direction and thus to reticle-stage drive loading, are investigated and their contributions presented at the end of the report. Process-induced perturbations such as focus and dose can also change the MEEF; their response is characterized and shown to be a significant contributing factor. An algorithm is then used to extract the full-wafer systematic sensitivity of MEEF to slowly changing perturbations such as thickness changes in the anti-reflective coating (ARC) and photoresist films. Correlation of the MEEF response to film thickness is discussed and shown to be significant for some films. A budget summary of the systematic perturbations inherent in these MEEF factors is compared against the needs of sub-90 nm nodes, with consideration of the necessity of process-specific OPC design for critical layers.
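A minimal sketch of the regressive extraction described above: MEEF is taken as the fitted slope of wafer CD against reticle CD expressed at wafer scale (reticle CD divided by the 4x reduction). The CD arrays below are placeholders, not measured data.

    # Sketch: MEEF as the regressed slope of wafer CD vs. mask CD at wafer scale.
    import numpy as np

    def meef(reticle_cd_nm, wafer_cd_nm, reduction=4.0):
        mask_cd_at_wafer = np.asarray(reticle_cd_nm) / reduction
        slope, _ = np.polyfit(mask_cd_at_wafer, wafer_cd_nm, 1)
        return slope

    reticle = [352, 356, 360, 364, 368]        # nm on the reticle (4x), placeholder
    wafer = [86.5, 88.1, 90.0, 92.2, 94.6]     # measured wafer CDs in nm, placeholder
    print(meef(reticle, wafer))                # > 1 indicates mask error amplification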
CD and Overlay Control
Layout optimization for multilayer overlay targets
L. A. Binns, N. P. Smith, C. P. Ausschnitt, et al.
A novel overlay target developed by IBM and Accent Optical Technologies, Blossom, allows simultaneous overlay measurements of multiple layers (currently up to 28) with a single target. This is achieved by a rotationally symmetric arrangement of small (4 micron) targets in a 50 micron square area, described more fully in a separate paper. In this paper, we examine the lessons learned in developing and testing the Blossom design. We start by examining proximity effects; the spacing of adjacent targets means that both the precision-like Total Measurement Uncertainty (TMU) and the accuracy of a measurement can be affected by the proximity of features. We use a mixture of real and modelled data to illustrate this problem, and find that the layout of Blossom reduces the proximity-induced bias. However, we do find that in certain cases proximity effects can increase the TMU of a particular measurement. The solution is to ensure that parts of the target that interact detrimentally are maximally separated. We present a solution to this, viewing the problem as a constrained Travelling Salesman Problem. We have imposed some global constraints, for example printing front-end and back-end layers on separate targets, and consistency with the overlay measurement strategy. Initially, we assume that pairwise measurements are either critical or non-critical, and optimize the layout so that neither layer of a critical pair is placed adjacent to any prior- or intermediate-layer features. We then build upon this structure to consider the effect of low-energy implants (which cannot be seen once processed) and site re-use possibilities. Beyond this, we also investigate the impact of more strategic optimizations, for example tuning the number of features on each layer. In each case, we present on-product performance data and modelled data on some additional target variants and extreme cases.
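A toy sketch of the adjacency optimization viewed as a small travelling-salesman-style problem: layer sub-targets are ordered around the target so that pairs flagged as interacting badly are kept apart. The layer names and penalty values are invented, and brute force over permutations stands in for a real constrained solver.

    # Sketch: order layer sub-targets on a ring to avoid penalized adjacencies.
    from itertools import permutations

    def adjacency_cost(order, penalty):
        # sum penalties for layers that end up next to each other on the ring
        n = len(order)
        return sum(penalty.get(frozenset((order[i], order[(i + 1) % n])), 0)
                   for i in range(n))

    layers = ["M1", "M2", "V1", "STI", "POLY", "CT"]          # hypothetical layers
    penalty = {frozenset(("M1", "M2")): 5,                    # pairs that hurt TMU if adjacent
               frozenset(("POLY", "CT")): 3}

    best = min(permutations(layers), key=lambda o: adjacency_cost(o, penalty))
    print(best, adjacency_cost(best, penalty))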
Alignment performance monitoring for ASML systems
Woong-Jae Chung, Vlad Temchenko, Tarja Hauck, et al.
In today's semiconductor industry, downscaling of IC designs puts stringent requirements on pattern overlay control. Tighter overlay requirements lead to higher rework rates, meaning additional manufacturing costs. Better alignment control has therefore become a target of engineering efforts to decrease the rework rate for high-end technologies. Overlay performance is influenced by known parameters such as shift, scaling and rotation, and by unknown parameters defined as "process-induced variation", which are difficult to control by means of a process automation system. In practice, this process-induced variation leads to strong wafer-to-wafer or lot-to-lot variation, which is not easy to detect in a mass-production environment that samples overlay measurements on only a few wafers per lot. The engineering task of finding and correcting the root cause of process-induced variations in overlay performance would be greatly simplified if the unknown parameters could be tracked for each wafer. This paper introduces an alignment performance monitoring method based on analysis of the automatically generated "AWE" files of ASML scanner systems. Because "AWE" files include alignment results for each aligned wafer, they can be used for monitoring, controlling and correcting the causes of "process-induced" overlay variation without requiring extra measurement time. Since "AWE" files include alignment information for different alignment marks, it is also possible to select and optimize the best alignment recipe for each alignment strategy. Several case studies in our paper demonstrate how AWE file analysis can assist engineers in interpreting pattern alignment data. Since implementing our alignment data monitoring method, we have achieved significant improvement of alignment and overlay performance without additional overlay measurement time. We have also observed that the alignment-related rework rate went down and stabilized at a quite satisfactory level.
Poster Session
Advanced exposure and focus control by proximity profile signature matching
Wenzhan Zhou, Alex See, Jin Yu
As advanced IC device processes shrink to deep sub-micron dimensions (90 nm, 65 nm and beyond), the process window (Depth of Focus, DOF) decreases to as low as < 0.2 μm. The impact of defocus on final CD results can no longer be ignored, nor can it be compensated by adjusting exposure energy without side effects, because the optical behavior of focus is completely different from that of exposure energy. We have developed an advanced control system to detect and correct variations in both focus and exposure energy. A library of focus-exposure matrix (FEM) models is first set up with pattern profiles of different pitches. The inline photoresist profiles of features of various pitches are then fitted to this FEM model database, and the exposure and focus with which those features were processed can be estimated. This approach utilizes both resist-profile and proximity-through-focus behavior and therefore gives a more accurate estimate of defocus than using profile or proximity alone. The approach can also be used to distinguish process drifts caused by exposure and focus from those caused by other process parameters such as PEB temperature, develop parameters and illumination.
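A minimal sketch of the library-matching step: measured profile/proximity signatures are compared against a focus-exposure-matrix library by nearest-neighbor search to estimate the dose and focus actually used. The toy signature model and grids below are invented, not the paper's FEM data.

    # Sketch: nearest-neighbor lookup of a measured signature in an FEM library.
    import numpy as np

    def build_fem_library(model, doses, focuses):
        """model(dose, focus) -> signature vector (e.g. CDs at several pitches,
        sidewall angle). Returns the condition grid and the stacked signatures."""
        grid = [(d, f) for d in doses for f in focuses]
        sigs = np.array([model(d, f) for d, f in grid])
        return np.array(grid), sigs

    def estimate_dose_focus(measured, grid, sigs):
        i = np.argmin(np.linalg.norm(sigs - measured, axis=1))
        return grid[i]

    def toy_signature(d, f):          # invented stand-in for scatterometry profiles
        return np.array([90 - 2 * (d - 30) + 40 * f**2,
                         130 - 2.5 * (d - 30) + 80 * f**2,
                         88 - 60 * abs(f)])

    grid, sigs = build_fem_library(toy_signature, np.arange(28, 33, 0.5),
                                   np.arange(-0.15, 0.16, 0.03))
    print(estimate_dose_focus(toy_signature(30.5, 0.06), grid, sigs))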
Matching poly-layer ADI and AEI process windows by using ADI index
Wenzhan Zhou, Zheng Zou, Alex See
As the critical-dimension of IC devices shrinks to below 90nm, it becomes very important to find an approach to control AEI CD more efficiently. This is especially so for poly gate patterning where not only photoresist dimension but also the photoresist profile plays an important role in defining final CD due to the impact of the photoresist profile on the etch process. In this paper, we will introduce an ADI index, which includes both CD and profile information collected by scatterometry system (KLA-Tencor SCD). With this ADI index, we can match the ADI process window with the AEI process window (exposure and focus window), and final AEI CD can be accurately predicted. This approach can also be effectively used in feed-forward Advanced Process Control (APC) poly patterning.
Statistical shape analysis applied to microlithography
Alessandra Micheletti, Ermes Severgnini, Filippo Terragni, et al.
In this paper we introduce recent mathematical tools for shape description called size functions. Some features of these descriptors such as robustness with respect to noise are pointed out. A first attempt to join the theory of size functions with randomness and to develop the related statistical analysis is then presented. The resulting procedure is applied to some specific problems which arise in microlithography of electronic devices.
Evaluation of an advanced process control solution to detect wafer positioning issues within the hot and cold plate modules of a lithography track
To run the various steps of the lithography process, production wafers undergo multiple robot arm transfers into and out of the Hot and Cold Plate modules of the track, and these modules directly influence the critical dimensions of the wafers. Wafer positioning inside these modules was found to be one of the key parameters for obtaining the best critical dimension uniformity across the wafer. With the track monitoring currently in place and conventional Statistical Process Control (SPC), potential process drifts or errors within these modules can only be detected from wafers measured during post-process control of product parameters. To catch all potentially non-conforming production wafers directly at the tool, minimize equipment downtime and identify the root cause of maintenance issues, real-time control of tool and process parameters is required. This paper presents the results of the evaluation of an Advanced Process Control (APC) solution used to detect, in real time, wafer positioning issues within the Hot and Cold Plate modules of a lithography track, based on monitoring the plate temperature profile during wafer processing. After an explanation of the methodology used to collect the data from the tool, an initial phase of analysis of the temperature profiles of the different Hot Plate modules was carried out. The temperature range was identified as the key monitoring parameter for detecting wafer positioning issues; the temperature profile itself depends on the number of resistive heating elements, the temperature settings and the process conditions of the Hot Plate. Wafer tilt was simulated to compare the resulting temperature profile with standard process conditions and, in turn, determine the detection capability. For the Cold Plate module, it was necessary to know the time between the end of the hot step and the start of the following cold step in order to detect a real tilt issue.
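A minimal sketch of the temperature-range check described above: the range of the hot-plate temperature trace during a bake is compared with a limit learned from known-good wafers, and an excursion flags a possible wafer-positioning (tilt) issue. The traces and limits are invented.

    # Sketch: flag a wafer-positioning fault from the hot-plate temperature range.
    import numpy as np

    def temperature_range(trace):
        return float(np.max(trace) - np.min(trace))

    def is_positioning_fault(trace, good_ranges, k=4.0):
        mu, sigma = np.mean(good_ranges), np.std(good_ranges, ddof=1)
        return temperature_range(trace) > mu + k * sigma

    good = [1.8, 1.9, 2.0, 1.7, 1.9]                  # degC ranges for nominal wafers (synthetic)
    tilted = np.concatenate([np.full(50, 89.0), np.full(50, 93.5)])  # exaggerated tilt signature
    print(is_positioning_fault(tilted, good))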
Optical anisotropy approach in spectroscopic ellipsometry to determine the CD of contact hole patterns
Jaisun Kyoung, Hyuknyeong Cheon, Sangbin Noh, et al.
We applied spectroscopic ellipsometry to the determination of contact hole size. Contact hole patterns were fabricated in a two-dimensional array to increase sensitivity, with hole sizes varying from 80 to 150 nm. Variations of the hole diameter of a few nm were distinguishable by comparing features in the ellipsometry parameters Δ and Ψ or in the degree-of-polarization spectra. These features could be used to estimate the size of contact holes once they were calibrated against other techniques. When the photoresist film with the dense contact hole pattern was treated as a uniaxial material, the ellipsometry spectra could be analyzed with an anisotropic optical model. In the process, the average contact hole size could be estimated from the anisotropy of the film.
Improvement of OPC accuracy for 65nm node contact using KIF
Te Hung Wu, C. L. Lin, Ming Jui Chen, et al.
Decreasing k1 factors require improved empirical models for the most critical challenges at the 65nm node, contact holes especially. These requirements are reflected in the need for increasingly accurate lithography contour simulations. One of the major contributors to final OPC accuracy is the quality of the optical model. In this study, a new approach to optical model calibration using KIF is proposed, based on the real illumination distributions of the scanners and steppers, and implemented in the OPC kernel.
Predictive modeling for the management of consumable optics in a lithographic system
High Volume Manufacturing (HVM) environments require that equipment have high availability and low cost of ownership, and that events that can cause product defects be minimized. With a scheduled proactive maintenance program for consumables, the impact on manufacturing schedules is minimized while maintaining the ability of the tool to produce good product, and down time due to troubleshooting is reduced. In photolithography, consumable optics provide a large opportunity for instituting this type of program, since optical degradation is very well understood, having known causes such as chemical contamination, manufacturing defects and physical effects. These effects can either be reversed or compensated for via optics cleaning, parts replacement or counter-measures such as optical adjustments, optical filters, or tool modifications. Traditionally, the lifetime of optics is described in terms of the number of pulses through the optic. This pulse limit is based on modeling and reflects the usage of the machine. There are several disadvantages to this method, including unnecessarily changing parts prior to failure or, alternately, premature parts failure. Performance-based modeling can solve most issues associated with usage-based models; however, there are also challenges related to performance-based modeling that must be overcome to make the model useful. These include forecasting usage rates, characterizing degradation-mode signatures, and determining thresholds to allow for efficient parts ordering. Nikon has developed a preliminary model to automate performance-based forecasting that takes all of these challenges into account; with further optimization this model can be implemented successfully, providing a reduction in consumables-related Cost of Ownership.
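A minimal sketch of performance-based forecasting as opposed to a fixed pulse count: a degradation trend (here, a hypothetical transmission metric) is fitted against pulse count and extrapolated to the replacement threshold, giving a forecast of when the optic should be ordered and swapped. The data, slope and threshold are illustrative.

    # Sketch: extrapolate a fitted degradation trend to the replacement threshold.
    import numpy as np

    def forecast_crossing(pulse_counts, transmission, threshold=0.92):
        slope, intercept = np.polyfit(pulse_counts, transmission, 1)
        if slope >= 0:
            return None                          # no degradation trend observed
        return (threshold - intercept) / slope   # pulse count at which the limit is hit

    pulses = np.array([0, 1e9, 2e9, 3e9, 4e9])                 # synthetic history
    trans = np.array([0.980, 0.972, 0.965, 0.957, 0.950])
    print(f"{forecast_crossing(pulses, trans):.2e} pulses")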
Combined use of x-ray reflectometry and spectroscopic ellipsometry for characterization of thin film optical properties
Jason P. Cain, Stephen Robie, Qiaolin Zhang, et al.
Accurate characterization of the optical properties of thin film materials used in semiconductor manufacturing is essential for many metrology applications in the fab, including film thickness measurement and scatterometry. The most common method for measuring these optical properties is spectroscopic ellipsometry. In this work X-ray reflectometry is used as a means of independently determining the thickness of a film to be characterized. This information is then used in the conventional analysis of spectroscopic ellipsometry data to extract the optical properties. In addition, the use of a Cauchy dispersion model fitted to the transparent region of the spectrum (if it exists) was used to determine the film thickness. Once the thickness was determined, a point-by-point regression was performed on the ellipsometry data to extract the optical properties. The results from these techniques were compared with each other and with conventional analysis of the ellipsometry data using common dispersion models.
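A minimal sketch of the Cauchy-dispersion step mentioned above: n(λ) = A + B/λ² + C/λ⁴ is fitted to the transparent part of the spectrum, after which the film thickness (and then a point-by-point n,k extraction) can be anchored. The refractive index data and coefficients below are synthetic.

    # Sketch: fit a Cauchy dispersion model to the transparent spectral region.
    import numpy as np
    from scipy.optimize import curve_fit

    def cauchy(lam_nm, A, B, C):
        lam_um = lam_nm / 1000.0
        return A + B / lam_um**2 + C / lam_um**4

    lam = np.linspace(400, 800, 50)                       # transparent region, nm
    n_meas = cauchy(lam, 1.46, 0.004, 0.0) + np.random.default_rng(0).normal(0, 1e-4, 50)
    popt, _ = curve_fit(cauchy, lam, n_meas, p0=[1.5, 0.01, 0.0])
    print(popt)   # recovered Cauchy coefficients A, B, C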