Leaving a computer on 24/7

Gayble.com

New Member
A lot of people have told me that leaving your computer on 24/7 saves more electricity than turning it off and on each day, but some of my other friends say the opposite: that leaving your computer on 24/7 uses more electricity.

So, which is true? And how?

They say that when you turn a computer on and off regularly (3 times a day or so) it uses more electricity than just leaving it on. Well, what I want to do is leave my computer on 24/7, but I'm worried that my mum will have a heart attack over the electricity bills, so I need to make sure. Can anyone shed some light on this issue? :)

Here are my computer specs:

AMD Sempron 2600+ Skt 754
512MB DDR400 RAM
80GB 7200RPM Hard Drive
Geforce FX5200 128MB AGP
Pioneer 16x (DL) DVD Writer
400W Deluxe ATX Case
ViewSonic 17" Flat LCD Monitor


I've heard that the cooling system you use (fan, etc.) can also help decrease the electricity the computer uses...

Any computer pros, experts and the like, please help! :D
 
The more fans you use, the more electricity you will use, though the less heat you will have.

I don't think it would matter if you left your PC on all the time, as long as you had it in "sleep mode" when you weren't using it. As far as electricity goes, I believe it's determined by what you are doing: if you are playing a demanding game it will probably take more electricity, while sitting idle or in sleep mode should take minimal power, but I'm only about 90% sure.

The people telling you it takes more electricity to start it up and shut it down must be relating it to a car's engine: much more wear and gas is used on startup, otherwise people would probably shut their cars off at red lights, eh?
 
No, I don't play games; I only have Mozilla Firefox, Outlook Express and MSN Messenger open most of the time. I'll take your advice and try going into standby when I'm not using it.
 
Well, your monitor and hard drive(s) typically use the most electricity... that is why Windows offers energy-saving options... you can set a time for your monitor to automatically turn off and your hard disks to stop spinning... there is also standby mode and hibernation mode (mostly for laptops)...


I read (it might not be true any more, but it was in the past) that the most "wear and tear" on machines occurs during the boot and shutdown of the computer. That is why I leave my machines on 24/7.
 
Yeah, for solid-state devices, most of the wear is due to thermal stress. Shutting it down lets the parts cool and contract; turning it on makes them heat up and expand. The more cycles you do, the more wear. It's better to keep the components at a constant temperature by keeping the computer on.

For hard drives, it's best to let the OS turn them off since keeping them spinning will wear out the platter motors eventually.

Computers really don't use more energy during startup and shutdown than while running, unless you're using SpeedStep and/or hardware-controlled fan speed.
 
OK, I've set my computer to turn the monitor off after 2 minutes, turn the hard disks off after 10 minutes, and go into standby after 30 minutes. I'll try leaving the computer on 24/7 for one month as you guys have advised ;)

Well, I'll post this on as many forums as I can (tech, computer-related, etc.). I have to make sure that what I do won't have a big effect on the electricity bill; my mum's life is on the line :D and I don't want her to faint or have a heart attack next to the mailbox :(
 
The only harm in keeping your computer on all the time is using a lot of electricity and wearing out the components. For example, a power supply may only be good for 100,000 hours, but of course you will probably get a new one within 11 years, lol.

So it's pretty much just the electricity aspect.
 
My computer has been on since August 1st when I moved here. Before that it had been running for about 3 years straight, except for periodic power failures and shutting it down to move it or change hardware. It's running like a champ.
 
Around here a kilowatt costs around 6 cents. Think of a kilowatt as using a 100-watt light bulb for 10 hours. Let's say an average computer uses around 50 watts an hour, so every day it uses 1.44 kilowatts... so that is 8.64 cents a day, which will be around 259.2 cents every 30 days. So, every month keeping your computer on 24/7 will cost you 2 dollars and 59 cents (USD).

That's a lot of money!


As for burning down a house.... I would be more concerned about leaving an overhead fan on....

geoff5093, as I said earlier, the most wear and tear happens when turning on and shutting off your computer. And... after 11 years, if my PSU were the only thing malfunctioning in any of my computers, I would be very happy!
 
SFR said:
Around here a kilowatt costs around 6 cents. Think of a kilowatt as using a 100-watt light bulb for 10 hours. Let's say an average computer uses around 50 watts an hour, so every day it uses 1.44 kilowatts... so that is 8.64 cents a day, which will be around 259.2 cents every 30 days. So, every month keeping your computer on 24/7 will cost you 2 dollars and 59 cents (USD).

That's a lot of money!

$2.59 USD is a lot of money? :rolleyes: That's extremely cheap IMO...
 
I've heard that the cooling system you use (fan, etc.) can also help decrease the electricity the computer uses...
Probably not. One, you're using more electricity to power the fan, and two, I doubt the PSU efficiency will increase much, if any, due to additional cooling, which would most likely need to be applied directly to the PSU.

As for using more power on startup, there is the inrush current, but unless you're turning the computer on and off every minute or so, that shouldn't be an issue.

Around here a kilowatt costs around 6 cents
Wow, that's pretty cheap. Here the average is a little under $0.12/kWh (even more for peak hours). And technically electricity usage is measured in kilowatt-hours (a watt = joule/sec, so it's already a rate).

Let's say an average computer uses around 50 watts
That's probably a pretty conservative value considering some processors can use in excess of 100 watts at full load. Throw in the fans, hard drive(s), and a power hungry graphics card and you could use over a couple hundred watts. So in conclusion, you could be (but not necessarily) paying around $10 a month (or more), or $120 a year just to keep your computer on.

As for burning down a house.... I would be more concerned about leaving an overhead fan on....
Or forgetting to turn the stove off :o ... well, I didn't actually burn the house down, but I have forgotten to turn it off.
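To make those rough figures concrete, here is a minimal sketch (Python) of the watts-to-dollars arithmetic, assuming the $0.12/kWh rate mentioned above and a couple of illustrative average draws (none of these wattages are measurements, just example values):

```python
# Monthly cost of running a PC 24/7: watts -> kWh -> dollars.
def monthly_cost(avg_watts, rate_per_kwh, hours_per_day=24, days=30):
    kwh = avg_watts / 1000 * hours_per_day * days   # energy used over the month
    return kwh * rate_per_kwh

RATE = 0.12  # $/kWh, the (assumed) average quoted above
for watts in (100, 200):  # illustrative average draws, not measurements
    print(f"{watts} W average -> ${monthly_cost(watts, RATE):.2f}/month")
```

At 100 W that comes to about $8.64 a month, and at 200 W about $17.28, which is roughly the "$10 a month (or more)" ballpark described above.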
 
Yeti said:
And technically electricity usage is measured in kilowatt-hours (a watt = joule/sec, so it's already a rate).

That's probably a pretty conservative value considering some processors can use in excess of 100 watts at full load. Throw in the fans, hard drive(s), and a power hungry graphics card and you could use over a couple hundred watts. So in conclusion, you could be (but not necessarily) paying around $10 a month (or more), or $120 a year just to keep your computer on.

If you read my post again you will notice multiple places where I speak of watts and cost based on hours...

What applications/games are you running that utilize 100% of your CPU? And if your CPU usage really goes to 100%... for how long?

Anyway, the reason the word AVERAGE was in my statement is that no computer's CPU is going to be processing at 100%, 24 hrs a day....

I would venture a guess that most of the day your CPU usage is near 0%.

So, in conclusion, I stand by my previous post :P
 
SFR said:
If you read my post again you will notice multiple places where I speak of watts and cost based on hours...

What applications/games are you running that utilize 100% of your CPU? And if your CPU usage really goes to 100%... for how long?

Anyway, the reason the word AVERAGE was in my statement is that no computer's CPU is going to be processing at 100%, 24 hrs a day....

I would venture a guess that most of the day your CPU usage is near 0%.

So, in conclusion, I stand by my previous post

I wasn't trying to attack your calculations, just saying that they are probably more of a baseline value that could well be higher.

Also, don’t mean to be a prick but:
Think of a kilowatt as using a 100-watt light bulb for 10 hours
That would be a kilowatt-hour, not a kilowatt. A kilowatt is a unit of power; a kilowatt-hour is a unit of energy (it's actually a joule-hour/second, not quite as bad as the Btu/kWh heat-rate rating, but still fairly stupid :)).
Let's say an average computer uses around 50 watts an hour
It would be using 50 watts for one hour.
so every day it uses 1.44 kilowatts.
It uses 1.44 kilowatt-hours. Well, actually I think it would be 1.2 kWh (24 hr × 50 W × 1 kW/1000 W)

As for the power consumption calculations:
1) 0% CPU usage does not mean no (or even very low) power consumption. I’d say 20-30 W would be a conservative no load range.
2) My fans (case and HSF) alone take between 7.3 and 18.4 W depending on what my speed settings are (usually on the higher end).
3) Idle hard drives, I have 4, total a bit over 20 W
4) I use my computer for a lot of encoding (really the only time the computer is on with me away from it is for encoding) which takes almost 100% of the CPU power.

There are other things that would consume additional power (gfx card, PSU, mobo, optical drives, etc.). I know that not everyone uses their computer like I do, or has the number of components that I do, but I was trying to show that your value was on the low end. Not trying to attack you, just trying to clarify :).
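Just to pull those example numbers together, here's a minimal sketch (Python) that adds up idle draws of the kind listed above and turns them into a monthly figure; every wattage in it is an illustrative assumption taken from or inspired by this post, not a measurement of any particular machine:

```python
# Rough idle-power estimate from illustrative component draws (all assumptions).
idle_draw_watts = {
    "cpu_no_load": 25,       # "20-30 W would be a conservative no load range"
    "fans": 13,              # case fans + HSF, roughly 7-18 W per the post above
    "hard_drives_x4": 20,    # four idle drives, "a bit over 20 W"
    "gfx_mobo_psu_misc": 30, # everything else: a pure guess for illustration
}

total_watts = sum(idle_draw_watts.values())
kwh_per_month = total_watts / 1000 * 24 * 30   # watts -> kWh over 30 days
print(f"~{total_watts} W idle, ~{kwh_per_month:.0f} kWh/month, "
      f"~${kwh_per_month * 0.12:.2f}/month at $0.12/kWh")
```

Run as-is it prints roughly "~88 W idle, ~63 kWh/month, ~$7.60/month at $0.12/kWh", which is only meant to show that even a mostly idle box adds up over a month.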
 
Turn your computer off or leave it on?

I keep hearing different stories about how you should keep your computer on at all times, but then I read an article that stated you should turn your computer off when it's not in use. So which is it? Should it be turned off when it's not in use, or should it be left on?
 
Personal preference. It doesn't theoretically use that much electricity; I mean, a kettle uses more... The only thing is that it might get "tired", as in, cache and temp files get massive because they never get cleared out on boot, and you might not detect some problems if it's never turned off, as they only show themselves on bootup.

That said, I leave my system on almost 24/7; it gets shut down and restarted at weekends when I do maintenance. It is a web server, though.
 
They may not waste a lot of electricity, but they do waste some, especially if it's on 24/7. The big thing is the life of your components: since each component only lasts a certain number of hours, leaving it on all the time wears them down. I would just put it in standby mode when you're not using it; that way all you have to do is click and it comes back on instantly, and it uses hardly any power.
 
I've had my computer turned on 24/7 since I got my broadband connection 2 months ago, and I don't have any problems. I have modified the registry so it clears my page file on every boot (by default it doesn't), and it clears my cache as well. My computer is right next to my bed, about 2 feet from my head when I sleep, and I can barely hear it. About the component life: I have an old Duron 800 Morgan-core CPU, it's turned on around 20 hours a day and it still works. Amazing CPU, though.
 
I've got my PC just half a foot from my bed, so me, I gotta turn it off :/ or sleep on the couch, 'cause I'm too lazy to go for some noiseless fans.
 