Most of the answerers above claimed that a computer in use is like having a 40-watt bulb switched on. Well, if your 40-watt bulb draws about 150-200 watts, I'd seriously ask for my money back. I've checked a number of the PCs I've owned with a power-consumption meter, and they tend to draw between 150 and 200 watts. Energy is measured in watt-hours, so a computer like this left on for an hour would use 200 watt-hours of electricity (the arithmetic is in the first sketch below). These PCs have had processors in the 1.5GHz-2GHz range. If you're using a very fast processor and you do a lot of gaming with a fancy graphics card fitted, you can generally add another 100-150 watts on top of that. A CRT monitor uses a fair amount of power too - my old 15-inch one averages about 80 watts. LCD monitors tend to be much less power hungry than CRTs.

Leaving PCs on when they're not in use isn't just a waste of money but also an unnecessary contribution to global warming. Another point to bear in mind is that electronic components have limited lifespans, usually measured in tens or hundreds of thousands of hours. Capacitors (you'll find plenty of these in your power supply and on your motherboard) have shorter lifespans than many other components, and significantly shorter ones if they're run in warm or hot environments (the second sketch below shows the usual rule of thumb).
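To put some numbers on the cost side, here's a minimal sketch of the watt-hours arithmetic. The wattage figures are the measurements from this answer; the electricity price is a made-up example, so substitute your own tariff.

```python
# Rough energy-cost estimate for leaving a PC on.
# Wattages are the measured figures from the answer above;
# PRICE_PER_KWH is a hypothetical example tariff - use your own.

PC_WATTS = 200          # typical measured draw (my PCs ran 150-200 W)
CRT_WATTS = 80          # my old 15-inch CRT monitor
PRICE_PER_KWH = 0.15    # example price per kilowatt-hour, not a real tariff

def energy_kwh(watts: float, hours: float) -> float:
    """Energy in kilowatt-hours: power (W) x time (h) / 1000."""
    return watts * hours / 1000.0

# A PC plus CRT left on 24 hours a day for a 30-day month:
hours = 24 * 30
kwh = energy_kwh(PC_WATTS + CRT_WATTS, hours)
print(f"{kwh:.0f} kWh per month, costing about {kwh * PRICE_PER_KWH:.2f}")
```

That works out to roughly 200 kWh a month for a machine that's never switched off, which is why the "it's only a light bulb" comparison is so misleading.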
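On the capacitor point: a widely quoted rule of thumb for aluminium electrolytic capacitors (a standard industry approximation, not something measured here) is that rated life roughly doubles for every 10°C you run below the rated temperature, and halves for every 10°C above it. A sketch using a hypothetical 2000-hour, 105°C rating:

```python
# Rule-of-thumb electrolytic capacitor life vs. temperature.
# The 10-degree doubling rule is a common industry approximation;
# the rated figures below are hypothetical example values.

RATED_LIFE_HOURS = 2000.0   # example: a capacitor rated 2000 h at 105 C
RATED_TEMP_C = 105.0

def estimated_life(actual_temp_c: float) -> float:
    """Life doubles for every 10 C below the rated temperature."""
    return RATED_LIFE_HOURS * 2 ** ((RATED_TEMP_C - actual_temp_c) / 10.0)

for temp in (45, 65, 85):
    print(f"{temp} C: ~{estimated_life(temp):,.0f} hours")
```

The same part that might last 128,000 hours at 45°C drops to about 8,000 hours at 85°C, which is the difference between a PC with good airflow and one cooking in a hot case.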