Field Report 2015-12-10: Sensor Precision vs Accuracy vs Drift

Testing multiple flow meter deployment configurations.

We compared different fin position configurations & three build generations on this deployment.

With drip sensor service under way, we started our usual round of dives to exchange the flow sensors. We often use our favorite coastal outflow as an equipment shakedown dive because it’s fairly shallow, and we know the system so well at this point that it feels more like the project workshop than an actual cave. We monitor tide levels here with a pressure sensor, and we also had the 25cm DS18B20 temperature string in the system this time round (but I will report on those results later).

So close to the ocean, this system delivers tidal signals like ‘Old Faithful’, and we have another gorgeous data set (in triplicate!) to keep my favorite karst hydrologist happy:

Uncorrected raw tilt angle in degrees (as a proxy for water flow velocity) at one of our coastal outflow monitoring sites.

Those high flows line up nicely with the large rain events recorded at Rio Secreto, and with peak displacements above eighty degrees, I suspect that the drag fins are bumping the ceiling of the cave and clipping some of our high end. But that’s still a beautiful time series, and it reminds me that I really do have to get my hands on a logging ADCP so I can calibrate the 2″ housings to point velocity. Unfortunately, past experience has already shown us that acoustics often don’t like being in caves with low ceilings, so I would either have to do that testing in some other system or get my hands on a concentrating beam ADV logger to avoid reflection issues.
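
For anyone wondering what “tilt angle as a proxy for flow velocity” boils down to, here is a minimal sketch of the trigonometry, assuming a generic 3-axis accelerometer inside the housing; the function names and the 80-degree clipping threshold are illustrative placeholders, not the loggers’ actual firmware.

    #include <math.h>

    // Hypothetical helper: angle between the gravity vector and the logger's
    // long (Z) axis. 0 degrees = hanging straight down (no flow); values pinned
    // near 80-90 degrees suggest the drag fins have hit the cave ceiling and
    // the angle no longer tracks velocity.
    float tiltAngleDegrees(float ax, float ay, float az) {
      float horizontal = sqrt(ax * ax + ay * ay);   // off-axis component of gravity
      return atan2(horizontal, az) * 180.0 / M_PI;  // 0..180 degrees from vertical
    }

    bool likelyClipped(float degrees) {
      return degrees > 80.0;   // assumed threshold, taken from the plot above
    }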

The other gorilla in the room is that age-old question of accuracy vs. precision. It shows up most clearly in the two temperature records from our pressure sensor, which sports a 12-bit Adafruit MCP9808 and a 24-bit MS5803-05BA (which also records temperature for its internal corrections) located right beside each other:

Temperature (°C): MCP9808 (top) vs MS5803 (bottom)

That pesky pressure logger…

The bit depth limitation of the 9808 shows up pretty clearly against the 5803’s beefy ADC, but my dilemma is that the factory calibration on the Adafruit sensor is ±0.25°C, while all the lovely data from the MS sensor comes with a quid pro quo of ±2.5°C.  I’d be happy to cherry-pick diurnals out of the high-rez record, but even without trend lines overlaid it’s obvious that the two sensors have diverging behavior (though both sensors claim great drift stability?). This is the kind of thing that drives a builder like me nuts, because it hints that we might have another creeping problem like the TMP102 pressure sensitivity that nearly took out a whole generation of loggers. I think I will have to start recording the RTC temp register (which is protected inside the housing) so I have another data set to compare to these two surface-mounted sensors.
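
For reference, assuming the RTC in question is a DS3231, reading its temperature registers over I2C might look roughly like this sketch (register addresses per the DS3231 datasheet; the function name is made up, and the 0.25°C resolution is coarse, but the die sits safely inside the housing):

    #include <Wire.h>

    // Illustrative only: register 0x11 holds signed whole degrees C and the
    // top two bits of 0x12 hold 0.25 C steps on a DS3231 at I2C address 0x68.
    float readRTCtemp() {
      Wire.beginTransmission(0x68);
      Wire.write(0x11);                 // point at the temperature MSB register
      Wire.endTransmission();
      Wire.requestFrom(0x68, 2);
      int8_t  msb = Wire.read();        // signed whole degrees C
      uint8_t lsb = Wire.read();        // fractional part in the top two bits
      return msb + 0.25 * (lsb >> 6);   // simple form; fine for above-zero cave water
    }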

What I would really like to know is whether there is some way I could squeeze higher precision out of the 9808.  I keep finding off-hand references in the forums suggesting that if you average ~16 readings, you get another decimal place of resolution out of your sensor. But with temperature-sensing ICs already doing a great deal of this internally when they produce their 12-bit readings, they might already have reached the point of diminishing returns. This question is also relevant to the ubiquitous DS18B20s we’re using, because they are so stable that 16 readings in a row usually just gives me the same number 16 times. Does this mean that averaging has already taken any bit-depth-enhancing noise right out of the signal?
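
For context, the forum trick is usually described as oversampling and decimation: sum 4^n readings and shift the sum right by n to gain n extra bits, but only if the raw signal carries roughly an LSB of random noise to dither the quantization steps. A generic sketch, with readRaw standing in for whatever raw-count read a given sensor library provides:

    #include <stdint.h>

    // Classic oversampling & decimation: to gain n extra bits, sum 4^n samples
    // and shift the sum right by n. With n = 2 this is the "average 16 readings"
    // recipe from the forums.
    uint32_t oversample(uint16_t (*readRaw)(), uint8_t extraBits) {
      uint16_t samples = 1U << (2 * extraBits);   // 4^n samples
      uint32_t sum = 0;
      for (uint16_t i = 0; i < samples; i++) {
        sum += readRaw();
      }
      return sum >> extraBits;                    // decimated result with n extra bits
    }

    // Caveat: this only buys anything if the readings carry ~1 LSB of random
    // noise. A DS18B20 that returns the same number 16 times in a row has
    // nothing left to average, so the extra bits are just padding.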
