# How much does a 3,755 watts per hour central AC cost \$\$ wise per hour?

Trying to figure out how much our central AC unit costs to run per hour. Can someone help me with this? :)

Thanks.

Your cost to operate this A/C unit can range from about \$0.20 per hour to as high as \$0.60 per hour. Calculations and assumptions are below. Your electric billing rate is also critical to the calculations.

The "3,755 watts per hour" actually means a power draw of 3,755 watts, which translates to 3.755 kilowatt-hours, or more commonly about 3.8 kWh, of energy for every hour of usage. This is about the power requirement of a 3 ton A/C unit, so yours is likely a 3 ton unit.

So it is about 3.8 kWh of electric energy usage every hour. But the A/C compressor has a "duty cycle". It comes on for 10 or 15 minutes, and then cycles off for 10 minutes, and then back on for 10 or 15 minutes. The actual "duty cycle" depends on how warm it is outside, and how hard the A/C needs to work to keep the house cool. Let's assume that it cycles on for 10 minutes, and then off for 10 minutes. So for 1 hour --- it is on for 30 minutes, and off for 30 minutes.

Your electric company (Seattle City Light, Pacific Gas & Electric, Florida Power & Light, etc.) charges a rate based on what it costs to obtain the electric power, so rates differ from utility to utility.

It could be \$0.06 per kWh in Seattle, or \$0.08 in Florida, or as much as \$0.15 to \$0.20 in San Francisco.

COST:

So the cost to run your 3 Ton A/C unit is calculated as follows:

At \$0.10 per kWh and a 50% duty cycle (on for 30 minutes of each hour, off for the other 30):

3.8 kWh x \$0.10 per kWh x a duty cycle of 50% = About \$0.19 per hour

If your electric rate (on your electric bill) is much higher --- say \$0.20 per kWh --- or if it is much warmer, like the middle of August, and the A/C unit cycles "on" more often --- maybe 75% or more, then the cost could be as high as:

3.8 kWh x \$0.20 per kWh x 0.75 = about \$0.57, call it \$0.60 per hour
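The arithmetic above can be sketched as a small Python function. This is only a rough estimate; the 3,755 W draw, the rates, and the duty cycles are the assumed figures from this answer, and using the exact 3.755 kW gives a hot-day cost slightly below the rounded \$0.60.

```python
def hourly_cost(watts, rate_per_kwh, duty_cycle):
    """Estimated cost to run an appliance for one hour.

    watts:        rated power draw while the compressor is running
    rate_per_kwh: your utility's rate in $/kWh (check your bill)
    duty_cycle:   fraction of the hour the compressor is actually on
    """
    kwh = watts / 1000.0  # watts -> kilowatts, i.e. kWh per hour of runtime
    return kwh * rate_per_kwh * duty_cycle

# The two scenarios above:
print(round(hourly_cost(3755, 0.10, 0.50), 2))  # mild day  -> 0.19
print(round(hourly_cost(3755, 0.20, 0.75), 2))  # hot day   -> 0.56
```

Plugging in your own rate and an observed duty cycle gives a better number than any of the generic figures here.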

So Bottom Line:

On a mild day in June, it may run \$0.20 to \$0.30 per hour to operate the A/C.

On a much warmer day in mid-August, it could run \$0.30 to \$0.60 per hour to operate.

NOTE: Your "electric rate" of \$0.05, \$0.10, or \$0.20 per kWh is critical in these calculations -- check your bill for the actual figure.

If you are in California -- you have a 5-tiered rate, so you could be paying one of 5 different electric rates, ranging from \$0.15 per kWh to as high as \$0.35 per kWh -- depending on your normal electric usage.

Source(s): Electrical Engineer
• At \$0.10 per kWh, 1,800 watts costs \$0.18 for 1 hour. A computer is about 200 watts, roughly \$0.02 per hour. For lighting, add up the bulbs: 3 fluorescent bulbs are about 75 watts. A TV is 100 watts to 300 watts.

• "Watts per hour" is pretty meaningless; watts is already a rate, joules per second.

Assuming you mean 3755 watts, period.

You need your cost per kW-hour, which can be anywhere from 4¢ to 20¢. Assuming 10¢...

3.755 kW x 1 hour = 3.755 kW-hours

3.755 kW-hours x 10¢ per kW-hour ≈ 38¢

But AC's do not run continuously, they turn off and on, controlled by the thermostat. The duty cycle depends on many factors, like insulation, area, windows, etc. You can get an idea by noting the times the compressor is off or on.
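One way to turn those observed on/off times into a cost estimate, sketched in Python (the 15-minutes-on / 10-minutes-off timing is just an assumed example, not a measured figure):

```python
def duty_cycle(on_minutes, off_minutes):
    # Fraction of the time the compressor actually runs,
    # estimated from one observed on/off cycle.
    return on_minutes / (on_minutes + off_minutes)

def cost_per_hour(kw, dollars_per_kwh, on_minutes, off_minutes):
    # Average energy drawn over an hour, times the billing rate.
    return kw * dollars_per_kwh * duty_cycle(on_minutes, off_minutes)

# Example: compressor observed on for 15 min, off for 10 min (60% duty cycle)
print(round(cost_per_hour(3.755, 0.10, 15, 10), 2))  # -> 0.23
```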

• Most power companies charge you based on the kW-hrs used per month. Take your last bill, look at the total kW-hrs used, and divide the total dollar value of the bill by that number; the result is the cost you paid per kW-hr. (This includes the generation cost, transmission cost, distribution cost, taxes, etc.) It will likely be between 13 and 18 cents per kW-hr. So now let's look at your question.

3,755 watts is the same thing as 3.755 kilowatts, so if you run the AC for one hour you will have used 3.755 kW-hrs of energy.

Now take this value and multiply it by the cost per kWatt-Hour to get the cost per hour.

3.755 kW-hrs x \$0.18 per kW-hr = \$0.68.
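That procedure, reading a blended rate off the bill and applying it, can be sketched in Python. The \$135 bill for 750 kWh is a made-up example that happens to give the \$0.18/kWh rate used above.

```python
def effective_rate(bill_total_dollars, kwh_used):
    # Blended $/kWh from a monthly bill: generation, transmission,
    # distribution, and taxes are all rolled into this one number.
    return bill_total_dollars / kwh_used

def ac_cost_per_hour(kw, dollars_per_kwh):
    # Cost for one hour of continuous running (no duty-cycle discount).
    return kw * dollars_per_kwh

rate = effective_rate(135.0, 750)               # hypothetical bill -> $0.18/kWh
print(round(ac_cost_per_hour(3.755, rate), 2))  # -> 0.68
```

Note this is the cost with the compressor running the full hour; scale it down by the duty cycle for a real-world figure.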

Hope this helps,

Newton1Law