    Turning off your PC overnight

    porterclough
    Free Member

    The radiator in our office is turned off all year, the computers keep it warm enough in winter.

    In summer it can be anything up to thirty degrees when you first walk in in the morning before you open the windows.

    And checking in code every night is a sign you need better version control software…

    GrahamS
    Full Member

    Too_Punk_To_Funk: sorry, I just noticed I missed your reply. Yes, I have a fairly good idea what is running on my Vista box. But some of it is obviously magic.

    Feel free to explain why Gentoo is so much better and how you know exactly what everything is doing because you personally read and approved every single line of source code before compiling it and you disassemble every update for every app before allowing it on your system.

    I’ll tell you in the morning how little I care. 😉

    coffeeking
    Free Member

    I think you need to sit down and think about it for a moment. If 100% of electrical energy eventually becomes heat in every electrical device then why would you ever need something specifically designed as a heater? You could just stick the telly on to get warm 🙄

    What exactly do you think happens to it? Assuming we’re not talking about high-power lasers and super-mental sound systems whose energy output can leave the room and dissipate elsewhere (and even then that energy will be a small amount in comparison with the heat generated at the source), all electrical energy spent in the appliance becomes heat within that room – it’s the laws of physics.

    I personally CAN stick my telly on and get warm in my small bedroom, but it takes a LONG time due to the very low power output (in comparison with the heat required to warm a room appreciably).

    GrahamS
    Full Member

    Personally I can hear my telly from other rooms in the house. And I can even see its flickering light from the street outside. Is my house outside the laws of physics then?

    coffeeking
    Free Member

    Personally I can hear my telly from other rooms in the house. And I can even see its flickering light from the street outside. Is my house outside the laws of physics then?

    No, but the quantity of power required to project your TV’s light and sound across the road is TINY in comparison with the heat it is producing to create that light/sound. Think about it: you can easily see the light of a 100mW LED from a quarter of a mile away, and they’re only in the region of 30% efficient at turning power into light, so you’re easily seeing about 30mW of “light energy” from far away – the energy you “see” from the street will be fairly low. Your TV uses ~150W in total, the light and sound output will be in single-figure percentages of that, and the rest is heat. And even some of the light/sound that escapes just heats the air/walls.

    In much the same way, ~90% of the energy from a normal light bulb is heat rather than light. The majority of that light is trapped in the room and eventually becomes heat too (if it didn’t, you’d only need to turn the light on for a fraction of a second and the light would bounce around indefinitely!).
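
    (A rough sketch of those sums in Python. The 150W telly and the ~30%-efficient 100mW LED are the figures quoted above; the 5% light/sound split is just an assumed stand-in for “single-figure percentages”.)

        # Back-of-envelope version of the post above. The TV wattage and LED
        # efficiency are the figures quoted there; the 5% light/sound split is
        # an assumption standing in for "single-figure percentages".

        led_power_w = 0.1            # 100 mW LED
        led_efficiency = 0.30        # ~30% of its power leaves as light
        print(f"Visible LED output: {led_power_w * led_efficiency * 1000:.0f} mW")

        tv_power_w = 150.0           # rough total draw of a telly
        light_sound_fraction = 0.05  # assumed: single-figure % leaves as light/sound

        escapes_w = tv_power_w * light_sound_fraction
        print(f"Leaves the TV as light/sound: {escapes_w:.0f} W")
        print(f"Ends up as heat: {tv_power_w - escapes_w:.0f} W "
              f"({1 - light_sound_fraction:.0%} of the total)")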

    aracer
    Free Member

    Personally I can hear my telly from other rooms in the house. And I can even see its flickering light from the street outside. Is my house outside the laws of physics then?

    No, but we were originally talking about PCs – I very much doubt you can see or hear your PC from outside the room it’s in.

    Erm…. that’s how they move?? Electrical -> Magnetic -> Kinetic
    And that kinetic only generates a tiny amount of heat (decent PC fans use fluid bearings so friction is minimal).

    In which case they also only use a tiny amount of energy. If you had a totally frictionless mechanical part it wouldn’t need any energy input at all. If that energy isn’t being dissipated in heat, where is it going given conservation of energy? Of course a fan also moves air around, but that air also dissipates the energy it is given as heat…

    If 100% of electrical energy eventually becomes heat in every electrical device then why would you ever need something specifically designed as a heater? You could just stick the telly on to get warm

    Because you’d need an awful lot of tellies to heat a house with, and they’d be rather more expensive than the heaters you could use otherwise. Not only that, but they’re not designed to distribute the heat round a room. They are very, very good, though, at turning the electricity they do use into heat.

    GrahamS
    Full Member

    That light and sound that “escapes” though is an example of energy that isn’t heat escaping from the system. No matter how small. So you can’t argue that all the energy becomes heat within the room as that is demonstrably untrue.

    That’s not to say that a large proportion doesn’t become heat. I’ve worked in enough hot offices to know that. But given a cold room and the option of a 1000 watt heater or a 1000 watt PC to warm me up then I’d choose the heater!

    (or I’d go for the PC and simply knock one off to get warm)

    GrahamS
    Full Member

    aracer: yep, I can hear my PC from the next room. The noise of the fans and hard drives is audible (even though I have low-noise fans), plus that very faint hum that lets you know a transformer is working somewhere nearby.

    It’s also visible from outside (even ignoring the monitor) as it has four bright blue LEDs on the case.

    aracer
    Free Member

    Strangely my work PC doesn’t have bright blue LEDs on the case. Meanwhile we’re talking here about rooms with doors sealed shut and windows closed – can you really hear your PC through those?

    Though to come back to your original question – “Guess which one is considerably more efficient at heating a room?” – even if you allow for some sound escaping from the room, the difference in efficiency between the two is minuscule. The difference in energy used between heating a room with computers and heating it by more conventional means is far less than the difference made by opening a window or by poor insulation.

    coffeeking
    Free Member

    So you can’t argue that all the energy becomes heat within the room as that is demonstrably untrue.

    To the extent that you could measure with a thermometer in the room, it does. As I said from the start, a small minority may *escape* the room as sound or light, but it is a TINY fraction of the total. The majority of the sound and light emitted by the object will turn into heat and join the heat created as a waste byproduct. To all intents and purposes a 100W lightbulb puts as much heat into the room as a 100W heater; the heater just has a lower surface temperature over a larger area, which creates more dispersed convection and MAY feel warmer in the near vicinity. At a point far from the source with no direct airflow, though, you won’t notice any difference at all.

    The reason we have heaters and PCs, instead of just PCs, is that PCs aren’t a nice large surface area, or thermostatically controlled.

    To relate this back to the argument: yes, you could use the heat to reduce the heating bill if the heating is well controlled (demonstrated by the room my server is kept in – it draws about 150W continuously and the thermostatically controlled radiator rarely comes on, whereas it used to be on with the rest of the rooms before the server was there), but often the heating isn’t well controlled, so people just open windows and waste that heat.
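
    (For a sense of scale, a quick Python sketch using the 150W figure quoted above – whether the radiator actually backs off depends on the heating controls, as the post says.)

        # How much heat a continuously-running ~150W server puts into its room,
        # using the figure quoted in the post above.

        server_power_w = 150.0
        hours_per_day = 24

        heat_kwh_per_day = server_power_w * hours_per_day / 1000
        print(f"Heat delivered to the room: {heat_kwh_per_day:.1f} kWh per day")
        # That is heat a thermostatically-controlled radiator doesn't have to
        # supply, provided the heating controls genuinely respond to it.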

    aracer
    Free Member

    To relate this back to the argument

    Spoilsport!

    konabunny
    Free Member

    You lot obviously work at places which don’t care that much about security.

    I used to work at a place where half the people would log out and then leave the login screen displaying for fifteen hours before they came back. Facepalm!

    coffeeking
    Free Member

    Shame I can’t demonstrate good spelling or punctuation at this time of night! Look at that last post of mine, what a mess!

    Signing out, 4:15 is a stupid time to be posting here!

    GrahamS
    Full Member

    Right… my understanding of the “Conservation of Energy” is that it only applies within a closed, isolated system – the kind that physics is fond of, but that doesn’t actually exist in everyday life.

    I’ve already pointed out that noise and light energy can leave the room. There are some others of course: radio (both wi-fi signal and RFI), data (electrical signals being sent back out of the room on the network), static, vibration.

    Any of these leaving the room should be sufficient to disprove aracer’s pedantic argument that heat is the only possible output from a PC and no other forms of energy can leave the room.

    Getting back to Epicyclo’s original (less pedantic, but equally silly) argument that turning off PCs in heated offices is a “big con”:

    Even if PCs were perfect little heaters, you still shouldn’t leave them running to keep an empty office warm at night: it uses less power to let a thermostatically-controlled heating system kick in on a timer and warm the place up, if required, before everyone arrives in the morning.
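
    (A crude comparison of the two strategies in Python. Every number here – office size, PC count, heater rating, pre-heat time – is made up purely for illustration; the point is the shape of the sums, not the values.)

        # Crude comparison of "leave the PCs on as heaters" versus "let the
        # heating warm the office up on a timer". All figures are assumptions.

        num_pcs = 10
        pc_draw_w = 200.0        # rough draw of an idle-ish desktop left on
        empty_hours = 14         # office empty overnight

        heater_power_w = 3000.0  # thermostatically-controlled heating
        preheat_hours = 1.5      # timer warms the room just before people arrive

        pcs_kwh = num_pcs * pc_draw_w * empty_hours / 1000
        heater_kwh = heater_power_w * preheat_hours / 1000

        print(f"PCs left on all night: {pcs_kwh:.1f} kWh")
        print(f"Timed pre-heat:        {heater_kwh:.1f} kWh")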

    kiwijohn
    Full Member

    Erm…? My laptop has been on for 50 days, 15 hours & 10 minutes. Does that count?

    matt_outandabout
    Full Member

    Most offices spend more energy and cash on cooling than on heating. Internal gains from computers are big, even compared to people/lights/most other things.
    Switching off a PC will save a huge amount of energy compared to most other steps you can take.
    As ever, faffing and huffing about and splitting hairs over this distracts from the basic fact that we use and waste far too much energy, and that efficiency is king.
    If I was the office manager, I would shut down all the PCs myself daily, and hand anyone who didn’t the bill for the wasted power for the 12-14 hours they are not using the PC.

    AndyP
    Free Member

    ****…PCs sound utter gash. Glad I have a Mac.

    GrahamS
    Full Member

    ****…PCs sound utter gash. Glad I have a Mac.

    Yep, fortunately Macs don’t use any electrical power at all as they run solely on the owner’s misplaced smugness.

    coffeeking
    Free Member

    Right… my understanding of the “Conservation of Energy” is that it only applies within a closed and isolated system, of the kind that physics is fond of, but that don’t actually exist in everyday life.

    Well no, actually it’s pretty universal; you just have to understand what you’re treating as the system, etc. However, as I’ve said a few times now, it depends what you’re neglecting. The pedant’s argument that everything turns into heat in the room is technically incorrect due to the various wireless/light etc emissions that escape, but is practically fairly accurate (and you like practical things, as they exist in real life) as the percentage lost through those other forms is minimal, meaning it is generally safe to claim all power goes to heating the air in the room and its internal surfaces.

    Even if PCs were perfect little heaters, you still shouldn’t leave them running to keep an empty office warm at night: it uses less power to let a thermostatically-controlled heating system kick in on a timer and warm the place up, if required, before everyone arrives in the morning.

    Agreed – you’ll use less power overall if you allow the whole room to cool overnight and then warm it back up again than if you maintain its temp.

    matt_outandabout
    Full Member

    you’ll use less power overall if you allow the whole room to cool overnight and then warm it back up again than if you maintain its temp.

    And therein lies another issue – the buildings these things are housed in are so damn inefficient 90% of the time…

    coffeeking
    Free Member

    Suppose it helps reduce the need for cooling/AC with all the PCs creating heat, so that’s a plus.

    AndyP
    Free Member

    Yep, fortunately Macs don’t use any electrical power at all as they run solely on the owner’s misplaced smugness.

    Bizarre. Mine certainly needs electricity.

    coffeeking
    Free Member

    Indeed, a problem with the PC market is that the scramble by all manufacturers to match each other’s products and specs means a huge variety of features and operating states to support, whereas Apple have so few products they get time to nail certain bits well. Which is nice – got to hand it to them on that note.

    GrahamS
    Full Member

    The pedant’s argument that everything turns into heat in the room is technically incorrect…

    As a pedant arguing against a fellow pedant (aracer) I take “technically incorrect” as a pedantic victory 🙂

    …but is practically fairly accurate (and you like practical things, as they exist in real life) as the percentage lost through those other forms is minimal,

    Show me some figures. I demand figures. And possibly a chart. 😀

    ooOOoo
    Free Member

    Like most electrical items, PCs have been designed with the assumption that it’s OK to waste energy.
    Does my head in – 2.5GHz processors… 4GB of RAM… cleverer than a very clever thing… and yet you can’t even switch the thing off & on reliably.

    GrahamS
    Full Member

    Actually PCs put more effort than most electrical items into using less energy – hence ACPI, sleep states, SpeedStep, Cool’n’Quiet, etc.

    And no, most people don’t struggle to turn them off and on reliably.

    ooOOoo
    Free Member

    They’re all great but they don’t work reliably, hence the many complaints on this thread. Macs on the other hand….

    If they’re so great how on earth have we got to 1000W power supplies? I can check my emails on my phone, that’s around 2W.

    Fridges, cookers, microwaves… they are nowhere near as inefficient as a typical PC.

    coffeeking
    Free Member

    Graham – go do the calculations yourself, I can’t be bothered 😆 Start with the fact that my server runs at about 90W actual power use; the two processors are rated at ~20W thermal output each, the two drives both sit at around 10W each, and the graphics card is IIRC rated around 2W – that’s 62W in specified thermal output alone. The PSU works at about 75% efficiency, so even if nothing else in the computer produced heat you’d be up at about 82W at the wall (62 ÷ 0.75 ≈ 82), leaving about 8W in “other” output. Seeing as this is a server it has no speakers or wireless, but some minute power will go down the ethernet to the router; the rest is lost as sound and friction in the fans and in the motherboard components. Since it’s shut in a cupboard, not much of either leaves the room…

    Admittedly the power factor of the PSU may put the meter reading out, so all those figures could be only mildly indicative 😆
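
    (The same power budget walked through in Python – all the values are the rated/quoted ones from the post above, so, as it says, only mildly indicative.)

        # Walking through the server power budget in the post above. All values
        # are the rated/quoted ones, so treat the result as indicative only.

        wall_draw_w = 90.0                    # approximate draw measured at the wall
        components_w = 2 * 20 + 2 * 10 + 2    # two CPUs + two drives + graphics = 62 W
        psu_efficiency = 0.75

        # Wall power needed just to feed those components through the PSU:
        wall_for_components_w = components_w / psu_efficiency   # ~82.7 W

        other_w = wall_draw_w - wall_for_components_w            # ~7 W
        print(f"Wall draw accounted for by listed components: {wall_for_components_w:.0f} W")
        print(f"Left over for fans, sound, ethernet and the rest: {other_w:.0f} W")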

    coffeeking
    Free Member

    ooOOoo – Not sure you can judge the efficiency of a cooker; it’s designed to produce heat, so its “waste” is not really waste – it only needs insulating to help it keep that heat in. A microwave – not sure how you’d measure the efficiency of that without seriously sophisticated kit. And fridges don’t do hundreds of millions of instructions per second and display them on a screen; it’s a whole other level of complexity and design. It’s like asking why cars are not as efficient as bikes.

    In all honesty, most PC hardware will sleep/hibernate happily until you load on software that hasn’t been tested with each possible OS/hardware combo.

    aracer
    Free Member

    As a pedant arguing against a fellow pedant (aracer) I take “technically incorrect” as a pedantic victory

    No – I’m an engineer (am I allowed to say that in this context without getting beaten to death?), so far more interested in “practically fairly accurate”.

    There are some others of course: radio (both wi-fi signal and RFI)

    You’re doing better – my gut feeling is that RF leakage from the room is the biggest source of non-heat energy coming from a PC.

    Glad somebody else is covering why it’s a rubbish argument to suggest it’s fine to heat your room with a PC, thus leaving me to the pedantry.

    GrahamS
    Full Member

    coffeeking: I looked at your figures, but all I could think was “He keeps his server in an unventilated closet – man it must be cooking in there.”

    If they’re so great how on earth have we got to 1000W power supplies?

    That is the maximum load of the power supply, not what it is actually drawing the whole time it is running.

    Somewhere around 200W is more typical:
    http://www.tomshardware.com/reviews/truth-pc-power-consumption,1707.html

    ooOOoo
    Free Member

    It’s simple for a microwave:
    Input power – 1230W
    Output power – 800W
    Therefore efficiency = 65%

    All of that will go into your food.

    If you have your PC at 200W… it may be working flat out compressing a file… or it may be sat idling for 30 mins while you go for lunch. If it’s the second then it’s not doing anything of any use, so the efficiency is 0%.
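
    (The same sums in Python – the microwave figures and the 200W/30-minute idle example are the ones quoted above.)

        # The microwave sum from the post above, plus the idle-PC point.

        microwave_in_w = 1230.0
        microwave_out_w = 800.0
        print(f"Microwave efficiency: {microwave_out_w / microwave_in_w:.0%}")  # ~65%

        # A PC drawing 200 W while it sits idle over a 30-minute lunch break
        # does no useful work, so on this view its "efficiency" for that half
        # hour is 0%.
        idle_kwh = 200.0 * 0.5 / 1000
        print(f"Energy spent idling over lunch: {idle_kwh:.2f} kWh")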

    GrahamS
    Full Member

    No – I’m an engineer

    That’s just a pedant with a degree 🙂

    Too_Punk_To_Funk
    Free Member

    Feel free to explain why Gentoo is so much better and how you know exactly what everything is doing because you personally read and approved every single line of source code before compiling it and you disassemble every update for every app before allowing it on your system.

    I’ll tell you in the morning how little I care. 😉

    Lol.

    Sadly some of us are devs who build this stuff for you all to consume, for free if you like, which probably means that we have little life 🙂 (Even less now I teach too…)

    Just amuses me to see people sing the praises of something like Vista, yet put up with so much crap. People have such low expectations from computers these days 😯

    Bloody jinxed myself, as I came home to find the power off; guess it had been windy again… Damn fickle power lines.

    GrahamS
    Full Member

    Dev here too, so I’m well aware that even the *nix devs don’t know everything that is going on in their PC.

    coffeeking
    Free Member

    “He keeps his server in an unventilated closet – man it must be cooking in there.”

    It’s got a bit of ventilation at the front and rear in a through-flow arrangement, but to all intents and purposes it’s cupboarded. It does obviously get warmer than it otherwise would, but it’s my home server and I get to monitor temps and make sure none get excessive, so silence and invisibility are more important 🙂

    ooOOoo – not sure about the design of microwave heating systems, but going from the basic figures on the label, yes, those are basic assumptions that make sense. I am just unaware of whether all the energy is absorbed by the contents or whether some is dissipated internally. Obviously none is lost to the environment 🙂 But again, how do you measure the efficiency of a PC – what % of its energy use is “useful”? Surely they’re 100% useless when considered thermodynamically, as at the end of the day they’re designed for manipulating 1s and 0s on magnetic media, which should take very little power in itself. What amount of energy is “useful” when moving things on the screen? If my monitor works at 50W, and we know it loses 20W as immediate heat, how much of the 30W remaining is unwasted? Not sure I can answer that, having not eaten since lunch.

    ooOOoo
    Free Member

    Well if you ain’t looking at it, that 30W is completely wasted, surely?! (hope you had a good lunch BTW) 🙂 If my PC is so clever then I want it to use just enough energy to do the job… then use no energy when I don’t want it to do anything. Come on geeks, surely it’s simples?

    GrahamS
    Full Member

    What do you suggest?

    I guess the PC could use the webcam to turn off the monitor when it reckons there is no one sat there looking at it – but what if you’re just a bit further away or looking from the side?

    Plus how much power does the webcam need?

    Likewise the PC itself can’t “use no energy” but still instantly respond when you start typing. It has to tick over like an idling car.

    But modern PCs already effectively “idle” and draw less power by slowing the processor down (“throttling”) while it isn’t doing any work.
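
    (If you want to watch that throttling happen, here’s a minimal Python sketch for Linux. It assumes a cpufreq-capable kernel exposing the usual sysfs path; on Windows a tool like RightMark shows the same thing.)

        # Minimal sketch: watch the current CPU frequency on Linux via the
        # cpufreq sysfs interface. Assumes a cpufreq-capable kernel; the path
        # below is the standard location but may differ on some systems.

        from pathlib import Path
        import time

        cur_freq = Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq")

        for _ in range(5):
            khz = int(cur_freq.read_text())     # value is reported in kHz
            print(f"cpu0 running at {khz / 1000:.0f} MHz")
            time.sleep(1)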

    ooOOoo
    Free Member

    Nothing that complex, perhaps just a switch like many other devices have, you could label it….I dunno….”on/off” 🙂

    Yes they throttle – using RightMark, mine is at 800MHz 90% of the time. But it’s still using 80 watts… what is it doing?!

    I want it to tick over like an electric car, not a petrol car 8)

    Surfr
    Free Member

    Right, well I too am a developer and understand the concept of context which the other devs mention. I typically have 5-10 shells on various machines tailing logs, editing files (Vim), running commands, committing and checking out code etc., a browser with 10+ tabs, mail, calendar, Yammer, Twhirl, Adium, Word, Excel and then maybe some personal stuff like Spotify.

    Turning off would mean probably 20 minutes lost per day, but I have never really tried hibernate, so from tonight I’ve set it to hibernate after 1 hour of idle and will see how my shells react in the morning. Just needed a kick in the right direction, so thanks to this thread 🙂
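
    (A rough estimate in Python of what that hibernate-after-an-hour policy might save. The idle and hibernate draws are assumptions, not measurements of Surfr’s machine.)

        # Rough estimate of the saving from hibernating instead of idling
        # overnight. The idle and hibernate draws are assumed figures.

        idle_draw_w = 120.0       # assumed draw of a desktop left idling
        hibernate_draw_w = 2.0    # assumed draw while hibernated
        hours_hibernated = 13     # ~14 hours away minus the 1-hour idle timeout

        saved_kwh_per_night = (idle_draw_w - hibernate_draw_w) * hours_hibernated / 1000
        print(f"Saved per night: {saved_kwh_per_night:.1f} kWh")
        print(f"Saved per working year (~230 nights): {saved_kwh_per_night * 230:.0f} kWh")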
