In this paper, we present, for the first time, the application of a genetic algorithm (GA) to process model calibration. We propose a distributed GA-based calibration technique combined with a traditional local optimization algorithm, which reduces calibration time considerably. Experimental results show that calibration of 144 parameters can be completed in a few minutes, whereas it typically takes a human expert a few days. Our algorithm can easily be implemented on a coarse-grained parallel computer such as a PC cluster or a multiprocessor workstation. GA can thus be a practical and robust tool for process/device calibration.
Journal: TechConnect Briefs
Volume: Technical Proceedings of the 2003 Nanotechnology Conference and Trade Show, Volume 2
Published: February 23, 2003
Pages: 60 - 63
Industry sector: Sensors, MEMS, Electronics
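The hybrid scheme the abstract describes (a GA for global search, polished by a traditional local optimizer) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy sum-of-squares misfit, the 8-parameter encoding (standing in for the paper's 144 parameters), and the coordinate-descent refinement step are all assumptions made for the example.

```python
import random

# Hypothetical sketch of GA + local-optimizer calibration (a memetic scheme).
# The real process-model objective is not given in the abstract; we minimize
# a toy sum-of-squares misfit against assumed "measured" target values.

N_PARAMS = 8               # stand-in for the paper's 144 calibration parameters
TARGET = [0.5] * N_PARAMS  # hypothetical measured behavior the model should match

def misfit(params):
    """Sum-of-squares error between simulated and measured behavior (toy stand-in)."""
    return sum((p - t) ** 2 for p, t in zip(params, TARGET))

def local_refine(params, step=0.05, iters=20):
    """Simple coordinate-descent polish, playing the role of the traditional
    local optimization algorithm the abstract combines with the GA."""
    best = list(params)
    for _ in range(iters):
        for i in range(len(best)):
            for delta in (-step, step):
                cand = list(best)
                cand[i] += delta
                if misfit(cand) < misfit(best):
                    best = cand
    return best

def calibrate(pop_size=30, generations=40, seed=1):
    """GA global search followed by local refinement of the best candidate."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 1) for _ in range(N_PARAMS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=misfit)               # rank by calibration error
        survivors = pop[: pop_size // 2]   # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            # Uniform crossover mixing averaged and parent-a genes.
            child = [(x + y) / 2 if rng.random() < 0.5 else x
                     for x, y in zip(a, b)]
            if rng.random() < 0.3:         # occasional Gaussian mutation
                i = rng.randrange(N_PARAMS)
                child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = survivors + children
    # Hybrid step: hand the GA's best individual to the local optimizer.
    return local_refine(min(pop, key=misfit))

if __name__ == "__main__":
    best = calibrate()
    print(round(misfit(best), 6))
```

In a distributed setting such as the PC cluster the abstract mentions, the expensive `misfit` evaluations of each generation would be farmed out across nodes, since individuals can be evaluated independently; that parallelism is what makes the coarse-grained implementation straightforward.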