I've reposted this, as I still haven't figured this one out. Can anybody out there shed some light on this cockroach?
I have the following problem to contend with and would appreciate any insight on the matter.
We have numerous thin-film processes in which we employ mass flow controllers (MFCs) to regulate the flow of process gases into the vacuum chambers. We use only argon and oxygen. The MFCs are the conventional thermal mass sensor type (MKS brand), and they display flow in SCCM. The instruments include a correction for the thermal conductivity of the measured gas versus the calibration gas (usually nitrogen). I have a Hastings brand bubble flow meter, which I'm attempting to use to verify/calibrate these flow controllers. If you are not familiar with these instruments, it's simply a graduated cylinder with a means to introduce gas and provide a film of soap, which is forced up the cylinder by the gas flow. Timing the meniscus's rise past the graduation lines gives volume vs. time.
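One thing worth being explicit about in the comparison (my sketch, not from the post): SCCM is defined at a fixed reference condition, while a bubble meter reads actual volume at ambient temperature and pressure, and the wet soap film adds water vapor to the gas column. The function below converts a timed bubble-meter reading to standard cc/min under ideal-gas assumptions. The reference conditions (0 °C, 760 Torr) and the water-vapor correction are my assumptions; different MFC vendors reference 0 °C or 25 °C, so check the manual.

```python
# Convert a bubble-flow-meter reading to standard cc/min (sccm).
# Assumptions (mine, not from the post): ideal-gas behavior,
# standard conditions of 0 C / 760 Torr, and a saturated soap film
# that adds water vapor partial pressure to the measured column.

T_STD_K = 273.15      # assumed standard temperature, K (some MFCs use 25 C)
P_STD_TORR = 760.0    # assumed standard pressure, Torr

def bubble_meter_sccm(volume_cc, time_s, t_amb_c, p_amb_torr, p_h2o_torr=0.0):
    """Standard flow (sccm) from a timed volume at ambient conditions.

    p_h2o_torr: water vapor partial pressure contributed by the wet
    soap film; pass 0.0 to skip the humidity correction.
    """
    actual_ccm = volume_cc / time_s * 60.0           # actual cc/min at ambient
    dry_p_torr = p_amb_torr - p_h2o_torr             # dry-gas partial pressure
    return actual_ccm * (dry_p_torr / P_STD_TORR) * (T_STD_K / (t_amb_c + 273.15))

# Example: 100 cc timed over 60 s at 22 C and 760 Torr, with roughly
# 20 Torr of water vapor at that temperature. The temperature and
# humidity corrections together reduce the raw reading by roughly 10%,
# which is in the range of the 6-17% discrepancy described above.
```

If the raw cc/min from the cylinder is compared directly against the SCCM setpoint without this correction, the bubble meter will always read high at room temperature, in the same direction as the observed error.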
My problem is that every MFC is measuring high, meaning that it is flowing more gas than indicated on the display/setpoint. The deviation ranges from 6 to 17% high.
If a graduated cylinder were not precise, that would induce a consistent offset, but I'm using two different graduated cylinders depending on the flow range tested. I checked them anyhow against class 'A' pipettes; they are both good. Believe it or not, I even checked my stopwatch.
What am I missing here? Is there something fundamentally wrong with this approach, perhaps something to do with viscous flow versus turbulent flow or something?
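On the viscous-versus-turbulent question, a quick Reynolds-number estimate can rule the flow regime in or out. The sketch below is mine: the tube diameter and the argon properties are assumed values for illustration, not measurements from the setup.

```python
import math

def reynolds_number(flow_ccm, tube_diam_cm, density_kg_m3, viscosity_pa_s):
    """Re = rho * v * D / mu for gas moving up a cylindrical tube."""
    diam_m = tube_diam_cm / 100.0
    area_m2 = math.pi * (diam_m / 2.0) ** 2
    vel_m_s = (flow_ccm * 1e-6 / 60.0) / area_m2   # cc/min -> m^3/s -> m/s
    return density_kg_m3 * vel_m_s * diam_m / viscosity_pa_s

# Assumed numbers: 100 cc/min through a 1 cm bore, with argon at
# room conditions (rho ~ 1.66 kg/m^3, mu ~ 2.2e-5 Pa*s).
re = reynolds_number(100.0, 1.0, 1.66, 2.2e-5)
# Re comes out well below the usual ~2300 laminar/turbulent threshold,
# so flow in a bubble-meter tube at these rates is deeply laminar.
```

At bubble-meter flow rates the regime is laminar by a wide margin, which suggests the discrepancy is more likely a reference-condition issue (temperature, pressure, humidity) than a flow-regime effect.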