I've googled, but no clearer.
I have a Stovax temp gauge on the vitreous pipe. It shows an optimum temperature range, plus a "too hot" zone that starts at 480+.
How does that work then? In my head, I could have a teeny stove with a 4 kW max output, or some massive bastard that kicks out 15 kW without breaking a sweat, so is this judgement of "too hot" just based on an average stove?
Or is there no relationship between the two measurements?
Mine has a max output of 7 kW, and when hitting the DANGER 480+ mark, I'm pretty sure it [i]feels[/i] like I could load it up more.
I'm pretty sure the temperature is relatively constant no matter the stove. It's the amount of fuel it can take that is the main factor. More fuel means more heat, but spread over a bigger stove surface area, hence the same temp?
I'm only guessing here really. We have a little 5 kW and a big 8 kW. We run both at about 250 to 300°C. The big one heats the room better.
Edit: I think the danger part of the gauge is more about physical damage to the stove.
think the danger part of the gauge is more about physical damage to the stove.
That and a nice chimney fire.
think the danger part of the gauge is more about physical damage to the stove.
Putting aside any chimney fire hysteria, that's what I haven't understood yet.
If I've got a 20ft wide burner, I'd guess it can handle a touch more on the temperature front than a small stove designed to fit in a 2ft opening.
...or do they send the same temps up the flue?
Temps are equal; firebox size and external surface area increase with increased heat output.
It's pretty simple!
2-3 bone-dry logs no bigger than 100 mm diameter are all that's needed on this stove; any more is just wasting fuel.
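To put rough numbers on the "same temp, bigger surface" point (purely illustrative - the surface areas, emissivity and convection coefficient below are assumptions, not figures from any stove maker): at the same surface temperature, heat into the room scales with surface area, so a small stove and a big stove can both sit at ~300°C on the gauge yet give out very different amounts of heat.

[code]
# Back-of-envelope sketch: radiated + convected heat from the stove body.
# All numbers here are assumed, for illustration only.
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/m^2/K^4
EMISSIVITY = 0.9       # assumed for painted steel / cast iron
H_CONV = 10.0          # assumed natural-convection coefficient, W/m^2/K
T_ROOM_K = 20 + 273.15

def stove_output_kw(surface_area_m2, surface_temp_c):
    """Very rough heat output from the stove body, in kW."""
    t_k = surface_temp_c + 273.15
    radiation = EMISSIVITY * SIGMA * surface_area_m2 * (t_k**4 - T_ROOM_K**4)
    convection = H_CONV * surface_area_m2 * (t_k - T_ROOM_K)
    return (radiation + convection) / 1000.0

# Same ~300 C reading, different sizes: output roughly doubles with area.
print(f"small stove (~0.8 m^2): {stove_output_kw(0.8, 300):.1f} kW")
print(f"big stove   (~1.6 m^2): {stove_output_kw(1.6, 300):.1f} kW")
[/code]

Which would be why the gauge can show the same "too hot" mark on every stove: it's warning about the flue and the stove body, not telling you how many kW you're getting.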
This still doesn't add up in my head, this 'generic' flue temperature range business.
Anyhow, turns out it's bobbins - it just needs to stay alight?
http://woodheat.org/thermometers.html
