
Determining the number of lightbulbs that can be used on a circuit


Thad
Jan 28, 2006, 09:46 AM
Could someone tell me how many light fixtures containing 60 watt bulbs can safely be placed on one 20 amp circuit?
Thank you.

tkrussell
Jan 28, 2006, 10:00 AM
A 20 Amp, 120 Volt circuit must be derated to 80%, which is 16 Amps. Wattage is calculated by multiplying voltage by amperage.

Power formula: P = E * I, where P is power (Watts), E is voltage, and I is amperage
P = 120 Volts * 16 Amps
P = 1920 Watts

1920 Watts / 60 Watts per lamp = 32 lamps of 60 Watts each per 20 Amp circuit.
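
If you want to script the same arithmetic, here is a minimal Python sketch; the 120 Volt supply, the 80% derating, and the 60 Watt lamps are just the figures from this post, not fixed constants:

def max_lamps(breaker_amps=20, volts=120, derating=0.80, lamp_watts=60):
    # Usable power after derating: P = E * I, with I held to 80% of the breaker rating
    usable_watts = breaker_amps * derating * volts
    return int(usable_watts // lamp_watts)

print(max_lamps())  # 32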

Any switches or other devices on this circuit must be rated for at least 20 Amps at 120 Volts.

Thad
Jan 29, 2006, 09:48 PM
Thanks tkrussell, that is easy enough. Thad

patcolamp
Feb 18, 2009, 07:32 PM
Thad

When determining how much load you can put on a circuit, the easiest way to calculate it without all the electrician formulas is to divide the watts used by the input voltage, which gives you amps. Example:

60 watts / 120 volts = 0.5, or 1/2 amp

For breaker derating, the load should be 80% of the breaker rating. On a 20 amp breaker you should max it out at 16 amps.

So at 1/2 amp per 60 watt lightbulb, using a max of 16 amps, you can safely put 32 lightbulbs on the circuit.
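
The same number falls out if you work in amps the whole way; here is a quick Python check of the figures above (same assumed 120 volt supply and 80% derating):

amps_per_bulb = 60 / 120      # a 60 watt bulb on 120 volts draws 0.5 amp
derated_budget = 20 * 0.80    # 20 amp breaker held to 80% = 16 amps
print(int(derated_budget // amps_per_bulb))  # 32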

I hope this helps you out.

stanfortyman
Feb 18, 2009, 07:46 PM
For breaker derating, the load should be 80% of the breaker rating. On a 20 amp breaker you should max it out at 16 amps.

Just so you know, the 80% rule does NOT apply across the board. Sure, it is required for certain installations, such as continuous loads or electric heat, but not for general-use circuits.
It DOES apply to individual cord and plug connected loads.

A 20A circuit CAN be loaded to 20 amps. (At the full 20 amps that is 2400 watts, or 40 of the 60 watt lamps.)
See NEC Table 210.24.

Also, this thread is THREE(!) years old, and Thad has not been on the site for a year and a half.