| author | Linus Torvalds <torvalds@linux-foundation.org> | 2013-01-11 12:29:42 -0800 |
|---|---|---|
| committer | Dirk Hohndel <dirk@hohndel.org> | 2013-01-11 12:39:54 -0800 |
| commit | 954290c70ba525cab4fefe83a3e82384ef01ba8c (patch) | |
| tree | eba4c224cefdcf04fb566656ce3ce14cc9035d02 /linux.c | |
| parent | c44755878c9dea2349d9a88939652047be8f539a (diff) | |
| download | subsurface-954290c70ba525cab4fefe83a3e82384ef01ba8c.tar.gz | |
Fix default gradient factor setting
Testing the new "don't even bother saving default values" logic showed that the
default values for the deco gradient factors were undefined.
Or rather, they were over-defined.
We had defaults for the UI (30 and 75 for GFlow/GFhigh respectively - the
config ones are in percent), *and* we had defaults in deco.c for the deco
code itself (0.35 and 0.75 respectively - in deco.c they are represented
as fractions, not percent).
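For illustration, the mismatch looked roughly like this; these are hypothetical, paraphrased declarations, not the exact Subsurface code:

```c
/* Hypothetical sketch of the two independent defaults (illustrative
 * names only, not the actual Subsurface identifiers). */

/* UI/config side: gradient factors kept as percent. */
static short ui_default_gflow = 30;	/* percent */
static short ui_default_gfhigh = 75;	/* percent */

/* deco.c side: its own built-in defaults, kept as fractions. */
static double gf_low = 0.35;	/* i.e. 35% -- disagrees with the UI's 30 */
static double gf_high = 0.75;	/* i.e. 75% -- happens to agree */
```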
And if the config entries had never been written, and were assumed to be
the defaults, the UI code thought the defaults were 30/75, but they had
never been *set* to those defaults, so actual default calculations
silently used the 35/75 in deco.c, which is very confusing (you could go
to the preferences page, see the 30/75 there, and it would not actually
match the values used for computation).
Of course, with an old config file that saves even default entries, you'd
never see that if you ever changed anything in the preferences, because
you'd always have explicit gflow/high values. But now it's much easier to
see the conflicting default values.
Fix it by just always using the UI defaults (or set values) to set the
actual deco values.
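As a minimal sketch of that approach, with hypothetical names throughout (set_gf(), prefs, and the hard-coded fallbacks are illustrative, not necessarily the real Subsurface API): whether the values come from the config file or fall back to the UI defaults, they are always pushed into the deco state through a single percent-to-fraction conversion point:

```c
#include <stdio.h>

/* All names here are illustrative, not the exact Subsurface code. */

static double gf_low, gf_high;	/* deco-side state, as fractions */

struct preferences {
	short gflow;	/* percent */
	short gfhigh;	/* percent */
};
static struct preferences prefs;

/* The single conversion point from UI percent to deco fractions. */
static void set_gf(short gflow_pct, short gfhigh_pct)
{
	gf_low = gflow_pct / 100.0;
	gf_high = gfhigh_pct / 100.0;
}

int main(void)
{
	/* Even when no config entry was ever written, the UI defaults are
	 * explicitly pushed into the deco code, so the preferences page and
	 * the computation can no longer disagree. */
	prefs.gflow = 30;	/* UI default, percent */
	prefs.gfhigh = 75;	/* UI default, percent */
	set_gf(prefs.gflow, prefs.gfhigh);

	printf("deco now uses GFlow=%.2f GFhigh=%.2f\n", gf_low, gf_high);
	return 0;
}
```

Routing every update through one conversion function also removes the percent-versus-fraction ambiguity: nothing outside it ever assigns gf_low/gf_high directly.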
Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
Signed-off-by: Dirk Hohndel <dirk@hohndel.org>