path: root/libdivecomputer.c
author	Linus Torvalds <torvalds@linux-foundation.org>	2013-01-14 08:26:40 -0800
committer	Dirk Hohndel <dirk@hohndel.org>	2013-01-14 11:13:33 -0800
commit	2e53a415257677fe6e5da7fcce373722269a9082 (patch)
tree	3229556ab80d73c8329e21c7b3612d6e16af9032 /libdivecomputer.c
parent	e0b53a5d319bfdd0411348c8a6907a881df2fc85 (diff)
download	subsurface-2e53a415257677fe6e5da7fcce373722269a9082.tar.gz
Fix odd calculated deco "ripples"
Previously we calculated the ceiling at every single second, using the
interpolated depth, but then only *saved* the ceiling at the points where
we had a profile event.  (The whole deco_allowed_depth() function doesn't
change any state, so we can just drop it entirely at the points that we
aren't going to save.)

Why is the old behavior incorrect?  I'll try to walk through my
understanding of it, by switching things around a bit.

 - the whole "minimum tissue tolerance" thing could equally well be
   rewritten to be about "maximum ceiling".  And that's easier to think
   about (since it's what we actually show), so let's do that.

 - so turning "min_pressure" into "max_ceiling", what doing the whole
   comparison inside the loop means is that we are calculating the
   maximum ceiling value for the duration of the last sample.  And then
   instead of visualizing the ceiling AT THE TIME OF MAXIMUM CEILING, we
   visualize that maximal ceiling value AT THE TIME OF THE SAMPLE.

End result: we visualize the ceiling at the wrong time.  We visualize what
was *a* ceiling somewhere in between that sample and the previous one, but
we then assign that value to the time of the sample itself.  So it ends up
having random odd effects.

And that also explains why you only see the effect during the ascent.
During the descent, the max ceiling will be at the end of our
linearization of the sampling, which is - surprise surprise - the position
of the sample itself.  So we end up seeing the right ceiling at the right
time while descending, and the visualization matches the math.

But during desaturation, the maximum ceiling is not at the end of the
sample period, it's at the beginning.  So the whole "max ceiling" thing
has basically turned what should be a smooth graph into something that
approaches being a step-wise graph at each sample.  Ergo: a ripple.

And doing the "max_ceiling during the sample interval" thing may sound
like the safe thing to do, but the thing is, that really *is* a false
sense of safety.  The ceiling value is *not* what we compute.  The ceiling
value is just a visualization of what we computed.  Playing games with it
can only make the visualization of the real data worse, not better.

Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
Signed-off-by: Dirk Hohndel <dirk@hohndel.org>
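To make the difference concrete, here is a minimal C sketch of the two
approaches described in the message above.  It is not the actual Subsurface
code: every helper name and signature below is a hypothetical stand-in,
including the simplified deco_allowed_depth() prototype (the real function
of that name takes different arguments).

/* Hypothetical helpers, for illustration only. */
double interpolate_depth(int t0, double d0, int t1, double d1, int t);
double update_tissues(double depth_mm);       /* advances tissue state, returns tolerance */
double deco_allowed_depth(double tolerance);  /* ceiling for a given tolerance (stateless) */
void   save_ceiling(int time, double ceiling_mm);

/* Old behaviour: remember the deepest ceiling seen anywhere in the
 * interval between two samples, then attribute it to the time of the
 * later sample.  During an ascent the maximum sits at the *start* of
 * the interval, so saving it at the sample time shifts it forward and
 * produces the step-wise "ripple". */
static void ceiling_old_way(int t0, double d0, int t1, double d1)
{
	double max_ceiling = 0.0;

	for (int t = t0 + 1; t <= t1; t++) {
		double tol = update_tissues(interpolate_depth(t0, d0, t1, d1, t));
		double ceiling = deco_allowed_depth(tol);

		if (ceiling > max_ceiling)
			max_ceiling = ceiling;
	}
	save_ceiling(t1, max_ceiling);	/* a value from somewhere in the interval, wrong time */
}

/* New behaviour: still feed the tissue model every second (that is the
 * part that carries state), but compute and save the ceiling only at
 * the point that is actually going to be plotted. */
static void ceiling_new_way(int t0, double d0, int t1, double d1)
{
	double tol = 0.0;

	for (int t = t0 + 1; t <= t1; t++)
		tol = update_tissues(interpolate_depth(t0, d0, t1, d1, t));
	save_ceiling(t1, deco_allowed_depth(tol));
}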
Diffstat (limited to 'libdivecomputer.c')
0 files changed, 0 insertions, 0 deletions