Wafer fabs lose millions from gas calibration errors

August 24, 2018 // By Nick Flaherty
A study from the US National Institute of Standards and Technology (NIST) has uncovered a source of error in an industry-standard calibration method that could lead chip makers to lose a million dollars or more in a single wafer run.

The error occurs when measuring very small flows of exotic gas mixtures. Such flows arise during chemical vapour deposition (CVD), which builds complex 3D structures by depositing successive layers of atoms or molecules, and during plasma etching, which produces tiny features on the surface of semiconducting materials by removing small amounts of silicon.

The exact amount of gas injected into the chamber is critically important to these processes and is regulated by a mass flow controller (MFC). 

"Flow inaccuracies cause nonuniformities in critical features in wafers, directly causing yield reduction," said Mohamed Saleem, Chief Technology Officer at Brooks Instrument, a US MFC maker. "Factoring in the cost of running cleanrooms, the loss on a batch of wafers scrapped due to flow irregularities can run around $500,000 to $1,000,000. Add to that cost the process tool downtime required for troubleshooting, and it becomes prohibitively expensive."

Modern fabs rely on accurate gas flows controlled by MFCs, which are typically calibrated using the "rate of rise" (RoR) method: a series of pressure and temperature measurements taken over time as gas fills a collection tank through the MFC.
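
For illustration, here is a minimal sketch of the RoR calculation under the ideal-gas assumption: the amount of gas in the tank is inferred from pressure and temperature via n = PV/RT, and the flow is the rate at which that inventory rises. The tank volume and sample readings below are hypothetical, not values from the NIST study.

```python
# Minimal rate-of-rise (RoR) flow sketch: molar flow is inferred from
# how fast the gas inventory n = PV/(RT) rises in a tank of known volume.
# Tank volume and sample readings are hypothetical.

R = 8.314   # universal gas constant, J/(mol*K)
V = 0.001   # collection tank volume, m^3 (1 litre, assumed)

def moles(p_pa: float, t_k: float) -> float:
    """Moles of ideal gas in the tank at pressure p_pa and temperature t_k."""
    return p_pa * V / (R * t_k)

# (time s, pressure Pa, temperature K) readings during the fill, hypothetical
samples = [(0.0, 10_000.0, 293.15),
           (10.0, 12_500.0, 293.15)]

(t0, p0, T0), (t1, p1, T1) = samples
flow_mol_per_s = (moles(p1, T1) - moles(p0, T0)) / (t1 - t0)
print(f"average molar flow: {flow_mol_per_s:.3e} mol/s")
```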

"Concerns about the accuracy of that technique came to our attention recently when a major manufacturer of chip-fabrication equipment found that they were getting inconsistent results for flow rate from their instruments when they were calibrated on different RoR systems," said John Wright of NIST's Fluid Metrology Group which conducted the error analysis.

Wright was particularly interested because for many years he had seen that RoR readings didn't agree with results obtained with NIST's "gold standard" pressure/volume/temperature/time system. He and colleagues developed a mathematical model of the RoR process and conducted detailed experiments. The study found that conventional RoR flow measurements can have significant errors because of erroneous temperature values. "The gas is heated by flow work as it is compressed in the collection tank, but that is not easily accounted for."
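
The sensitivity is easy to see from the ideal-gas relation above: the inferred inventory scales as 1/T, so if flow work has warmed the gas above the wall temperature used in the calculation, the RoR system overstates the flow by roughly the temperature ratio. A hypothetical sketch (the 2 K offset is illustrative, not a figure from the study):

```python
# Sketch of the temperature-error effect in RoR: since n = PV/(RT),
# using the cooler tank-wall temperature while flow work has heated
# the gas overstates the inferred inventory, and hence the flow.
# Temperatures below are illustrative, not values from the study.

T_wall = 293.15   # temperature assumed in the calculation, K
T_gas = 295.15    # actual gas temperature after flow-work heating, K

# relative flow error from using T_wall in place of T_gas
error = T_gas / T_wall - 1
print(f"relative overestimate: {error:.2%}")  # about 0.68% for a 2 K offset
```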

