One thing I wonder about, especially seeing the numbers during take-off: how well can an Aranet 4 sensor deal with changes in pressure and temperature?
As I understand it, these sensors require (sometimes lengthy) calibration to remain accurate.
For automatic calibration, the device needs to be in fresh air for 30 minutes, no closer than 1 meter from the nearest person. There is also a manual calibration method. I don't know how often calibration is necessary, but I also think this use case may not be what the sensor was designed to do.
Many cheaper CO2 sensors require frequent calibration. Often they just assume the weekly minimum reading is fresh air (~420 ppm), or something like that, which is extremely imprecise.
I bought the Aranet4 because it claims not to need recalibration except in exceptional circumstances. They suggest keeping the factory calibration unless you can guarantee a proper controlled environment. I don't have the equipment to test their claims, but the readings have stayed plausible for months with no calibration.
Edit: Note that if you're using your CO2 sensor as a proxy for the rate at which the air in an occupied room is replaced with fresh air, the calibration imprecision doesn't matter. You're probably eyeballing the second derivative over the course of minutes-to-hours.
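(For anyone who wants more than eyeballing, here's a rough sketch of backing an air-exchange rate out of the CO2 decay after everyone leaves the room. The well-mixed single-zone model and the 420 ppm outdoor baseline are my assumptions, and the example numbers are made up.)

    import math

    def air_changes_per_hour(c_start, c_end, hours, c_outdoor=420.0):
        """Estimate ACH from CO2 decay in an empty, well-mixed room.

        Model: C(t) = C_out + (C_0 - C_out) * exp(-ACH * t)
        =>     ACH  = ln((C_0 - C_out) / (C(t) - C_out)) / t
        """
        return math.log((c_start - c_outdoor) / (c_end - c_outdoor)) / hours

    # e.g. 1200 ppm falling to 700 ppm over two unoccupied hours
    print(air_changes_per_hour(1200, 700, 2.0))   # ~0.5 air changes per hour

If you take the outdoor baseline from the same sensor's own fresh-air reading, a fixed calibration offset cancels out of the ratio entirely, which is the point of the note above.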
I use my CO2 sensor for that, but I also care about the absolute level. There's some evidence that normal CO2 levels in modern buildings make your brain foggy. I'm trying to figure out if I program better when I use various interventions to increase fresh air.
Easy way to test pressure changes: have a small room with a well-fitting door. Ventilate it well, then put the sensor in there. Open and close the door fast (mainly to/from the almost-fully-closed position) to get pressure changes. The pressure will only change briefly, but if it didn't keep up with the airplane, it definitely won't keep up with this. Depending on how frequent the readings are, at some point while opening/closing you'll get readings during a change and an outlier if it's sensitive to this. (You can also fashion something with e.g. cling wrap and stuff, but this seems simpler and you don't use throwaway plastic or anything.)
I've used a barometer on a plane before. When they turn on the pressurization system, it's very clearly noticeable, but not a huge change. (A train in a tunnel is worse.) I'd expect this would be a similar effect.
Pressurization has gotten better over the years. Modern planes generally pressurize (or depressurize, more accurately) sooner and slower. Imagine starting to transition to the target pressure, usually equivalent to 6,000-8,000 ft, as soon as the door is closed for pushback, instead of only adjusting during the climb.
Yes, the ABC (automatic baseline calibration) is often an issue and can lead to an underestimation of the CO2 values.
This often happens in non-ventilated and relatively airtight rooms.
We collect air quality data from many schools and see this happening in some classrooms: even over the unoccupied hours during the night, the CO2 does not return to ambient levels by the next morning.
What do you think about the Air Quality Egg product [1]? How does it compare to the Aranet4? How do these two compare to the AirGradient ONE? Which chemicals are most important to monitor in a typical 40-year-old house in Florida: CO2, CO, NO2, SO2, O3?
I do not own an Air Quality Egg or an Aranet, so I cannot really comment on either.
In general most of the low-cost monitors use similar sensor modules, e.g. Plantower for PM, Sensirion for TVOCs, etc. What is important is to ensure that the CO2 sensor is NDIR or photoacoustic, and not estimating CO2 via a TVOC sensor, which is very inaccurate.
I would start measuring PM, CO2 and TVOCs in the house to first understand more about how well the HVAC system works. Then optimize it for low CO2 and PM. This should then flush out most of the internal pollution sources as well.
I think you might have it the wrong way round, as the product information mentions fresh air as a requirement for manual calibration, not automatic.
"If a drift of the CO2 measurements occurs, calibration feature of the device should be used. Auto calibration mode is utilizing ABC algorithm whereas
Manual calibration mode demands sensor to be exposed to fresh air."
"CO2 sensor of the device is calibrated at standard atmospheric pressure. CO2 readings are pressure compensated and comply with the
specifications down to 750 hPa. If the device has to be used at high altitude for a prolonged period of time, manual calibration of the unit should be
performed for optimal performance. It is not intended to use the device higher than 4000 m (13 000 ft) above the sea level."
From our experience, normal temperature fluctuations do not impact the accuracy. However, many NDIR sensors do not like vibrations and become inaccurate after being moved around. Then you need to wait for the automatic baseline calibration to kick in or do a manual calibration.
The Aranet 4 uses the Senseair Sunrise, which is more resistant to vibrations.
However I could imagine that sudden air pressure changes could have an impact on the measurement accuracy.
Yeah, I recently started monitoring my home's CO2 levels with some DIY sensors. You need to enter your elevation above sea level or have a pressure sensor hooked up.
So if the OP's sensor assumed a static pressure, I'd expect that to affect the readings a lot.
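(To give a feel for the size of that effect, a rough sketch: the elevation-to-pressure formula is the standard barometric approximation, and the correction is the simple ideal-gas scaling. It's not necessarily what any particular sensor's firmware actually does.)

    def pressure_hpa_at_elevation(elev_m):
        """Standard-atmosphere barometric approximation (valid up to ~11 km)."""
        return 1013.25 * (1 - 2.25577e-5 * elev_m) ** 5.25588

    def compensate_ppm(ppm_raw, p_actual_hpa, p_cal_hpa=1013.25):
        """Simple ideal-gas scaling: fewer molecules in the optical path at
        lower pressure, so the raw reading under-reports the true ppm."""
        return ppm_raw * p_cal_hpa / p_actual_hpa

    p_1600m = pressure_hpa_at_elevation(1600)   # ~835 hPa
    print(compensate_ppm(500, p_1600m))         # ~607 ppm -- not a small effect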
Calibration is also just the sensor looking for the CO2 floor (i.e. the lowest level possible). You need to do this weekly or monthly, maybe?
Auto calibration just sets the lowest level it has recently seen as the floor.
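(Roughly, something like this toy sketch of the idea; it's not any vendor's actual ABC implementation, and the 420 ppm floor and one-week window are assumptions.)

    from collections import deque

    ABC_BASELINE_PPM = 420.0    # assumed "fresh air" floor
    WINDOW = 7 * 24 * 60        # assume one reading per minute, one-week window

    readings = deque(maxlen=WINDOW)

    def abc_correct(raw_ppm):
        """Naive automatic baseline correction: assume the lowest reading
        seen in the window was really fresh air, and shift everything so
        that minimum maps to ABC_BASELINE_PPM."""
        readings.append(raw_ppm)
        offset = ABC_BASELINE_PPM - min(readings)
        return raw_ppm + offset

The failure mode described upthread (classrooms that never reach ambient overnight) falls straight out of this: if the true minimum over the window is, say, 600 ppm, the algorithm still pins it to 420 and every reading comes out roughly 180 ppm low.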
What was your experience with the DIY monitor that you built?
I am asking because we maintain an open source / open hardware indoor air quality project [1] that we continuously update and I am interested to learn about missing features we might have.
There's more to this than just the ideal gas law in a nonideal world.
Let's assume a perfect instrument.
Take, for instance, a sensor for a target gas that puts out 0 to 10 volts over a range of 0 to 10,000 reference ppm. If the response over that range were perfectly linear, and all replacement sensors were perfectly identical, then no mathematical device correction would need to be made. This is the most fundamental analog heart of the device.
So what is calibration? It compensates for imperfection.
The detector (sensor) is sensitive to the number of molecules it is exposed to at the time; this number of molecules is then mathematically converted to a concentration in PPM, which for the gas phase is PPM by volume. IOW, for every million molecules you breathe in, how many are CO2?
Counting molecules are us.
If you used an analog dial meter to show you the real-time voltage output from the sensor, all you had to do was remember that 1 volt equals 1000 PPM, and you would know that when the needle points to 0.400 volts, it means 400 PPM. At this point you still have a completely analog instrument. The analog circuitry is doing the math for you, and the math is very simple because the device is so perfect.
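(In code, the whole "instrument" at this point is a single multiply; the 1 V = 1000 PPM scale factor is the hypothetical one from the example above.)

    PPM_PER_VOLT = 1000.0   # hypothetical perfectly linear sensor: 0-10 V -> 0-10000 PPM

    def volts_to_ppm(volts):
        return volts * PPM_PER_VOLT

    print(volts_to_ppm(0.400))   # 400.0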
Even the dial-indicating voltmeter is perfect; otherwise there would be a need for its own mechanical correction, like turning the little screw(s) to adjust the needle.
When you replace the analog voltmeter with a digital voltmeter instead, it's still the same analog instrument at the heart, so no difference yet except you can probably read 0.400 volts more repeatably on the numerical display rather than the analog dial. Unless you had a very large diameter dial. And you would still need the voltmeter itself calibrated to make sure the numbers were true voltage otherwise the PPM shown would not be right. This can be a little more complex, and more subject to error, than turning a little screw on an analog meter but it accomplishes the same thing. Digital voltmeters usually had more than just one screw to adjust. And many more (cheap) electronic components inside subject to drift compared to the few mechanical changes capable of occurring over time in the analog meter.
So digital voltmeters themselves can be very flexible in their features, whereas analog meters would often simply have a custom-printed scale to read out directly in the desired units.
Either way with a perfect voltmeter the perfect sensor still isn't good enough simply because of the task at hand.
When the air is thinner for any reason, whether due to temperature or pressure (altitude), there will naturally be fewer molecules of CO2 detected by the sensor, even though it's the exact same reference air sample with the exact same PPM. So the meter would then show an unrealistically low reading.
So in order for it to be "reference PPM", you would calibrate under "standard conditions" of temperature & pressure.
Then as conditions change across the working ranges of temperature & pressure, compensations need to be made so the recorded value remains realistic in PPM.
The basic equation is PV = nRT where the pressure times the volume equals the number of molecules times a constant times the temperature. The pressure is relative to absolute vacuum and the temperature relative to absolute zero.
With our perfect voltage output we have turned the voltage into a unitless number which we can directly interpret in PPM. And for gases PPM is n/V, or molecules per (million molecule) volume.
Rearranging to P/RT = n/V, we see that for n/V to remain constant, P/RT must also remain constant as P & T change (and since R is a constant, that means P/T must stay fixed).
We already knew intuitively that pressure & temperature push the reading in opposite directions, so the operator of this perfectly simple analog instrument could make a few calculations from their voltmeter reading, depending on the temperature in their lab and the reading on their barometer. Then they could always record the real CO2 concentration even though the meter reading was only correct under standard conditions.
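(As a sketch of that hand calculation, assuming the simple ideal-gas correction just described; the 25 C / 1013.25 hPa calibration conditions and the example numbers are mine, not any particular instrument's.)

    def corrected_ppm(meter_ppm, p_hpa, t_celsius,
                      p_cal_hpa=1013.25, t_cal_celsius=25.0):
        """The sensor responds to molecules per volume, n/V = P/(R*T) per unit
        of concentration, so a reading taken at (P, T) is off by a factor of
        (P / P_cal) * (T_cal / T) relative to the calibration conditions.
        R cancels out; temperatures must be absolute (kelvin)."""
        t_k = t_celsius + 273.15
        t_cal_k = t_cal_celsius + 273.15
        return meter_ppm * (p_cal_hpa / p_hpa) * (t_k / t_cal_k)

    # Lab at ~900 hPa on a 30 C day; the uncorrected meter shows 400 PPM
    print(corrected_ppm(400, 900, 30))   # ~458 PPM after the P & T correction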
A more advanced analog instrument would instead self-correct its voltmeter readings, so that when each part of the instrument is fully calibrated, it always shows true PPM regardless of temperature or pressure changes. By adding a temperature and a pressure sensor to the circuit, the operator would no longer need to take their own T & P readings in the lab. The voltages (according to their slopes) from these additional sensors would then adjust the main gas-sensor voltage to produce the corrected reading. In a less-than-perfect world, each of these sensors also needs its own calibration to known reference values under ideal conditions, to compensate for the variability of the electrical components, not just of the sensors themselves. So there would be a number of variable-resistor calibration adjustments on the circuit board, some being knobs on a control panel, others miniature setscrews inside for factory adjustments not routinely needed.
Alternatively, without a controlled environment for calibration, at least use a well-documented ambient temperature and pressure. Calculated compensations are then not based on degrees F or C, nor PSIG; instead you use absolute temperature in kelvin and absolute pressure in PSIA. These are the mathematical variables that can be completely eliminated from the equation if an actual "standard" controlled environment is available.
And that's for the analog instrument to do all the math itself.
These same three analog sensors (and voltmeter), with all their component variability and interactive imperfections, can either have all this support circuitry to allow full (sometimes tedious) analog calibration like this, or you can forget about having any analog circuitry doing any math and just convert each sensor's raw analog output to digital and have software do the math (and the calibration) and the display from there.
This moves the tediousness to the data system electronics instead of the instrument electronics.
Or anything in between these two extremes.
Regardless, you only get results as good as your dedication to the electronic tediousness, you just have to put it in the right place either way.