author     Linus Torvalds <torvalds@linux-foundation.org>  2013-01-14 08:26:40 -0800
committer  Dirk Hohndel <dirk@hohndel.org>                 2013-01-14 11:13:33 -0800
commit     2e53a415257677fe6e5da7fcce373722269a9082 (patch)
tree       3229556ab80d73c8329e21c7b3612d6e16af9032 /profile.c
parent     e0b53a5d319bfdd0411348c8a6907a881df2fc85 (diff)
download   subsurface-2e53a415257677fe6e5da7fcce373722269a9082.tar.gz
Fix odd calculated deco "ripples"
Previously we calculated the ceiling at every single second, using the interpolated depth, but then only *saved* the ceiling at the points where we have a profile event; the value we saved was the most conservative one (the minimum tissue tolerance, i.e. the deepest ceiling) seen anywhere in that interval, not the value at the time of the sample itself. (The whole deco_allowed_depth() function doesn't change any state, so we can just drop it entirely at points that we aren't going to save.)

Why is it incorrect? I'll try to walk through my understanding of it, by switching things around a bit:

 - the whole "minimum tissue tolerance" thing could equally well be rewritten to be about "maximum ceiling". And that's easier to think about (since it's what we actually show), so let's do that.

 - so turning "min_pressure" into "max_ceiling", doing the whole comparison inside the loop means that we are calculating the maximum ceiling value for the duration of the last sample. And then instead of visualizing the ceiling AT THE TIME OF MAXIMUM CEILING, we visualize that maximal ceiling value AT THE TIME OF THE SAMPLE.

End result: we visualize the ceiling at the wrong time. We visualize what was *a* ceiling somewhere in between that sample and the previous one, but we then assign that value to the time of the sample itself. So it ends up having random odd effects.

And that also explains why you only see the effect during the ascent. During the descent, the max ceiling will be at the end of our linearization of the sampling, which is - surprise, surprise - the position of the sample itself. So we end up seeing the right ceiling at the right time while descending, and the visualization matches the math. But during desaturation the maximum ceiling is not at the end of the sample period, it's at the beginning, so the whole "max ceiling" thing has basically turned what should be a smooth graph into something that approaches a step-wise graph at each sample. Ergo: a ripple.

And doing the "max ceiling during the sample interval" thing may sound like the safe thing to do, but it really just gives a false sense of safety. The ceiling value is *not* what we compute; it is just a visualization of what we computed. Playing games with it can only make the visualization of the real data worse, not better.

Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
Signed-off-by: Dirk Hohndel <dirk@hohndel.org>
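To make the ascent case concrete, here is a minimal standalone sketch (not Subsurface code; ceiling_at() is a made-up stand-in for "feed the interpolated depth to the deco model and read back the ceiling") comparing the value the old code stored at each sample with the instantaneous value the new code stores:

/*
 * Minimal standalone sketch (not Subsurface code).  ceiling_at() is a
 * hypothetical stand-in for the deco model; here the ceiling simply
 * shallows by 0.1 m per second, as it would while off-gassing on ascent.
 */
#include <stdio.h>

static double ceiling_at(int t)
{
	return 30.0 - 0.1 * t;	/* hypothetical ceiling in metres at time t */
}

int main(void)
{
	int sample_times[] = { 0, 10, 20, 30 };
	int n = sizeof(sample_times) / sizeof(sample_times[0]);

	for (int i = 1; i < n; i++) {
		int t0 = sample_times[i - 1], t1 = sample_times[i];
		double max_ceiling = 0.0, last_ceiling = 0.0;

		for (int j = t0 + 1; j <= t1; j++) {
			double c = ceiling_at(j);
			if (c > max_ceiling)
				max_ceiling = c;	/* old behaviour: deepest ceiling in the interval */
			last_ceiling = c;		/* new behaviour: ceiling at the sample itself */
		}
		/*
		 * While off-gassing the maximum sits right after the previous
		 * sample, but the old code drew it at t1, i.e. it plotted a
		 * ceiling from earlier in the interval at the wrong time.
		 */
		printf("t=%2d  old (max over interval): %.1f m   new (at sample): %.1f m\n",
		       t1, max_ceiling, last_ceiling);
	}
	return 0;
}

In this toy example the "old" value is about a metre deeper than the instantaneous ceiling at every sample time, which is exactly the step-wise offset described above.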
Diffstat (limited to 'profile.c')
-rw-r--r--  profile.c  |  5
1 file changed, 2 insertions(+), 3 deletions(-)
diff --git a/profile.c b/profile.c
index 7839f84ff..8fc99f2e4 100644
--- a/profile.c
+++ b/profile.c
@@ -1873,12 +1873,11 @@ static void calculate_deco_information(struct dive *dive, struct divecomputer *d
t0 = (entry - 1)->sec;
t1 = entry->sec;
tissue_tolerance = 0;
- for (j = t0; j < t1; j++) {
+ for (j = t0+1; j <= t1; j++) {
int depth = interpolate(entry[-1].depth, entry[0].depth, j - t0, t1 - t0);
double min_pressure = add_segment(depth_to_mbar(depth, dive) / 1000.0,
&dive->cylinder[cylinderindex].gasmix, 1, entry->po2, dive);
- if (min_pressure > tissue_tolerance)
- tissue_tolerance = min_pressure;
+ tissue_tolerance = min_pressure;
}
if (t0 == t1)
entry->ceiling = (entry - 1)->ceiling;
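For reference, a tiny sketch (again not part of the patch) of which depths the two loop bounds actually feed to add_segment(); it assumes interpolate() is the plain linear blend used elsewhere in Subsurface.

#include <stdio.h>

/* assumed to match Subsurface's linear-interpolation helper */
static int interpolate(int a, int b, int part, int whole)
{
	return (a * (whole - part) + b * part) / whole;
}

int main(void)
{
	int t0 = 0, t1 = 5;
	int d0 = 10000, d1 = 15000;	/* depths in mm at the two plot entries */

	printf("old bounds (j = t0; j < t1):    ");
	for (int j = t0; j < t1; j++)
		printf("%d ", interpolate(d0, d1, j - t0, t1 - t0));

	printf("\nnew bounds (j = t0+1; j <= t1): ");
	for (int j = t0 + 1; j <= t1; j++)
		printf("%d ", interpolate(d0, d1, j - t0, t1 - t0));
	printf("\n");

	/*
	 * The old bounds re-feed the previous entry's depth (d0) and stop one
	 * second short of d1; the new bounds end exactly at the sample depth,
	 * so the tissue state left in tissue_tolerance corresponds to the
	 * time of the entry whose ceiling actually gets saved.
	 */
	return 0;
}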