#1
12-08-16, 01:31 PM
Member
Join Date: Dec 2016
Location: UK
Posts: 3

Hi
If a strip of LED lights requires 10W,
does the adapter used to power them need to have an exact output of 10W?

Or can one with a lower or higher wattage output also work?

If, for example, the power adapter has a wattage output of only 3.6W,
will it therefore be unable to power the lights?
Or, if it has a higher wattage output, for example 10.8W,
could this potentially overload the lights and even be dangerous?

Thanks

#2
12-08-16, 01:35 PM
Member
Join Date: Dec 2016
Location: UK
Posts: 3
Wattage and VA

Is it possible to work out the wattage output of a power adapter from other information, such as its VA, voltage, etc.?

Thx

#3
12-08-16, 01:55 PM
Member
Join Date: Dec 2016
Location: UK
Posts: 3
Working out the mA of a Power Adapter

Is it possible to calculate the mA of an adapter with the following information:

Input: 220-240V ~ 50-60Hz
Output: 24V (DC) 8.0W

Many Thanks

#4
12-08-16, 02:13 PM
Group Moderator
Join Date: Oct 2004
Location: WI/MN
Posts: 19,614
Output amps, right?

8 watts ÷ 24 volts = 1/3 amp.

Unless being DC somehow affects the equation....
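That arithmetic can be checked with a couple of lines of Python (just a sketch; the function name is mine):

```python
def output_current_ma(power_w: float, voltage_v: float) -> float:
    """Output current of a DC supply in milliamps, from I = P / V."""
    amps = power_w / voltage_v  # 8 W / 24 V = 1/3 A
    return amps * 1000          # convert amps to milliamps

print(round(output_current_ma(8.0, 24.0)))  # 333 mA
```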

#5
12-08-16, 02:58 PM
Member
Join Date: Mar 2006
Location: USA
Posts: 33,582

#6
12-08-16, 02:58 PM
Member
Join Date: May 2015
Location: USA
Posts: 3,138
For purely resistive loads, VA = watts. Some loads are close to purely resistive, like incandescent lamps, electric space heaters, and toasters. Things with motors and power supplies, like most electric gadgets, will not be purely resistive. Without getting into power factor, which tells you how close a load is to purely resistive, for the things you are likely to power with a power adapter you can assume watts = VA +/- 20% and not be far wrong.

#7
12-08-16, 02:59 PM
Member
Join Date: May 2015
Location: USA
Posts: 3,138
And 1/3 amp = 333 mA of course....

#8
12-08-16, 03:01 PM
Forum Topic Moderator
Join Date: Feb 2005
Location: Near Lansing, Michigan
Posts: 10,943
You need to know the power factor to accurately convert watts and voltamperes (VA), but in most common devices they are roughly equal within about 15%.

See stickshift's post for calculating the maximum output current.

#9
12-08-16, 03:02 PM
Member
Join Date: Mar 2006
Location: USA
Posts: 33,582
It is better to use one with a slightly higher amperage rating. For example, a common 60-watt incandescent bulb draws less than one amp but is routinely used on a 15-amp circuit.

#10
12-08-16, 03:03 PM
Forum Topic Moderator
Join Date: Feb 2005
Location: Near Lansing, Michigan
Posts: 10,943
Your power supply needs to supply at least what the lights require, plus some buffer to prevent overheating. Usually 20% is a fair figure, so in this case you would want a power supply rated for at least 12W at the appropriate voltage.
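As a quick sketch in Python (the helper name is mine; the 20% figure is the rule of thumb above):

```python
def min_supply_watts(load_w: float, headroom: float = 0.20) -> float:
    """Minimum recommended supply rating: the load plus a safety buffer."""
    return load_w * (1 + headroom)

print(min_supply_watts(10.0))  # 12.0 -> a 12W (or larger) supply for a 10W strip
```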

#11
12-08-16, 03:09 PM
Member
Join Date: Mar 2006
Location: USA
Posts: 33,582

#12
12-08-16, 03:23 PM
Member
Join Date: Mar 2010
Location: USA
Posts: 4,296
"Unless being DC somehow affects the equation ... "

The equation (volts times amperes equals watts) is correct for DC.

The variations apply to AC when a load (such as a motor) that is not purely resistive is being powered. That is where the power factor comes into play. To be exact, watts equals volts times amperes times power factor.

When powering a load with a power factor less than 1 (not purely resistive), an AC-output power adapter will still deliver only up to its rated current, so the maximum number of watts you can draw is reduced.

The power factor of a given AC motor remains unknown until someone tells you what it is or you measure it with test equipment, so you will have to do your calculations only roughly, as suggested previously.
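The exact relation above can be sketched in Python (the function name and the 0.8 example figure are mine, for illustration only):

```python
def real_power_w(volts: float, amps: float, power_factor: float = 1.0) -> float:
    """Real power of an AC load: W = V * A * PF (PF = 1 for pure resistance)."""
    return volts * amps * power_factor

print(real_power_w(230, 0.5))       # 115.0 W: resistive load, watts equal VA
print(real_power_w(230, 0.5, 0.8))  # 92.0 W: same VA, a motor-like load yields fewer watts
```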

Last edited by AllanJ; 12-08-16 at 03:39 PM.