molgrips – Member
I reckon it’d be harder to implement logic like that than just stick a normal meter in there.
Worked on various instrument clusters a few years ago – if a cluster has a “gauge” now, it will be driven by a microcontroller, not an analog movement. As such it needs software and a transfer function to define where the pointer should sit in relation to the temperature – which will most likely be received from the engine control unit via network comms on the car.
As others have mentioned, the transfer functions are generally non-linear, as that gives an impression of more stable coolant temperature. What’s actually implemented is a ramp up from cold, followed by a large dead band around “normal” – typically from about 75degC to 110degC[1] – with a further ramp when temps start to increase beyond that.
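To make that concrete, here’s a rough sketch of what such a transfer function looks like in code. The breakpoints and the fixed centre position are illustrative assumptions on my part, not a real cluster calibration – only the 75–110degC dead band comes from the figures above:

```python
def pointer_position(coolant_temp_c):
    """Map coolant temperature (degC) to a pointer position (0.0 = cold
    stop, 1.0 = hot stop). Breakpoints are illustrative, not a real
    vehicle calibration."""
    COLD = 40.0          # below this the pointer sits on the cold stop (assumed)
    NORMAL_LOW = 75.0    # start of the dead band
    NORMAL_HIGH = 110.0  # end of the dead band
    MAX_TEMP = 130.0     # pointer pegged at the hot stop (assumed)
    CENTRE = 0.5         # position held throughout the dead band

    if coolant_temp_c <= COLD:
        return 0.0
    if coolant_temp_c < NORMAL_LOW:
        # ramp up from cold towards the centre position
        return CENTRE * (coolant_temp_c - COLD) / (NORMAL_LOW - COLD)
    if coolant_temp_c <= NORMAL_HIGH:
        # dead band: pointer parked dead centre regardless of actual temp
        return CENTRE
    if coolant_temp_c < MAX_TEMP:
        # further ramp once temps climb beyond the dead band
        return CENTRE + (1.0 - CENTRE) * (
            (coolant_temp_c - NORMAL_HIGH) / (MAX_TEMP - NORMAL_HIGH)
        )
    return 1.0
```

The point being: anywhere between 75 and 110degC the driver sees exactly the same pointer position, which is why the gauge looks rock steady.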
A tri-colour “lamp” could achieve the same effect – blue-warming up, green-just fine, red-getting a bit warm down the noisy end chaps. Though to be honest, I don’t see why most people even need anything other than the over temp warning.
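The lamp version is even simpler – just thresholds instead of a continuous pointer. Thresholds here are assumptions borrowed from the dead band figures above, not from any real vehicle:

```python
def lamp_colour(coolant_temp_c):
    """Tri-colour coolant lamp: blue = warming up, green = just fine,
    red = getting a bit warm. Thresholds are illustrative assumptions."""
    if coolant_temp_c < 75.0:
        return "blue"       # still warming up
    if coolant_temp_c <= 110.0:
        return "green"      # the "just fine" dead band
    return "red"            # over temp, back off
```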
One consequence of this is that if you do have a “gauge”, modern cars appear to warm up faster than older ones with a more traditional analog gauge.
[1] – Engine coolant happily runs at well over 100degC – one, because it’s pressurised, and two, because the coolant additive increases the boiling point (as well as other benefits). Back in the early/mid nineties, systems I worked on were designed to “boil” at 131degC at sea level with the correct coolant mix, which equated to about 127degC at high European mountain pass altitudes.