How do I reduce DC output from 400 mA to 50 mA?


  #1  
Old 11-02-05, 06:42 AM
pappy77
Visiting Guest
Posts: n/a
How do I reduce DC output from 400 mA to 50 mA?

I have rigged up a system where I can tell which way my antenna is pointed for my HDTV, using an old transformer from a PhoneMate answering machine and some indicator lights I bought from Radio Shack. Problem is, the lights are rated 60 milliamps and the transformer puts out 400 milliamps. The lights are therefore WAY too bright. I tried to put a resistor inline but got nothing. I tried a 1 kilohm carbon resistor. I don't know anything about reducing amperage and hope someone here does.
Thanks
 
  #2  
Old 11-02-05, 08:19 AM
Member
Join Date: Nov 2004
Location: CA
Posts: 2,041
You are mixing apples and oranges. When you quote a current rating for a bulb, it is AT a certain supply voltage. Double the volts and you will double the amps; the light will be very bright and burn out very soon.

A power supply is rated to put out a certain voltage, at a MAXIMUM of, in your case, 400 milliamps.


It sounds like you have a 6 volt bulb on a 12 volt power supply, or something like that.

In my example, a 6 volt bulb which draws 60 mA is consuming about 0.36 watts, and the bulb has a resistance of about 100 ohms. You would use these figures, in conjunction with the voltage output of the power supply, to determine what resistance to put in series. Keep in mind that you must pay attention to the wattage rating of the resistor. When you use resistors as current limiters, you can easily end up needing a 1 watt or 2 watt resistor.

Remember Ohm's law. You must use all three figures (ohms, amps, volts) to determine what is happening.
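
A minimal Python sketch of that series-resistor math, using the 6 V / 60 mA bulb from my example and the 12 V supply guessed at above (example figures, not measurements from this thread):

[code]
# Sizing a series dropping resistor with Ohm's law.
# Example figures only: 6 V / 60 mA bulb, assumed 12 V supply.
bulb_volts = 6.0
bulb_amps = 0.060
supply_volts = 12.0

bulb_watts = bulb_volts * bulb_amps       # P = V * I  -> 0.36 W
bulb_ohms = bulb_volts / bulb_amps        # R = V / I  -> 100 ohms

drop_volts = supply_volts - bulb_volts    # the resistor must drop 6 V
resistor_ohms = drop_volts / bulb_amps    # same 60 mA flows through both -> 100 ohms
resistor_watts = drop_volts * bulb_amps   # 0.36 W dissipated in the resistor

print(f"bulb: {bulb_watts:.2f} W, {bulb_ohms:.0f} ohms")
print(f"series resistor: {resistor_ohms:.0f} ohms, rated over {resistor_watts:.2f} W")
[/code]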
 
  #3  
Old 11-02-05, 08:26 AM
pappy77
Visiting Guest
Posts: n/a
The lamp is 12 volts. What kind of device could I use, like a dimmer switch on a light? A potentiometer or rheostat?
 
  #4  
Old 11-02-05, 08:34 AM
Member
Join Date: Jul 2005
Location: Central Indiana
Posts: 192
If the voltage rating of the light is 12 volts and the output voltage of your transformer is also 12 volts, and you want to limit the current, you would need to put a resistor in parallel with the light to divide the current. (By the way, milliamps is written mA; a capital M would mean mega.) To get 60 mA through the light out of the 400 mA total incoming, the parallel resistor has to carry the remaining 340 mA, which means it needs roughly 1/5.67 of your light's resistance, since parallel currents divide inversely to resistance. So I think you need to measure the resistance across the light when lit.
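
Here is that current-divider arithmetic as a quick Python sketch, using the 400 mA and 60 mA figures above (note the replies below explain why a parallel resistor on a true 12 V source won't actually reduce the bulb's current):

[code]
# Current-divider arithmetic for the parallel-resistor idea.
# In parallel branches, current splits inversely to resistance:
#   R_shunt / R_bulb = I_bulb / I_shunt
total_ma = 400.0
bulb_ma = 60.0
shunt_ma = total_ma - bulb_ma   # 340 mA through the parallel resistor

ratio = bulb_ma / shunt_ma      # about 0.176, i.e. roughly 1/5.67
print(f"shunt resistance = bulb resistance x {ratio:.3f}")
[/code]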

Man, this is bringing back memories of breadboards.

Dave
 
  #5  
Old 11-02-05, 08:42 AM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
If you have a 12-volt bulb, you really need to find a 12-volt transformer. Alternatively, you could, for example, use a 24-volt transformer connected to two 12-volt bulbs in series.

As 594tough said, the transformer does not put out 400 mA any more than a car with a speedometer that goes up to 120 MPH is always traveling at 120 MPH. The car's speed is a function of how far you push down the accelerator. The transformer's amperage is a function of what load you connect to it. The 400 mA is just a maximum.

Putting a resistor in parallel would do nothing except waste energy.

Forget amps. Think volts.
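
A quick Python check of the two-bulbs-in-series idea, assuming two of the 12 V / 60 mA (about 200 ohm) bulbs discussed in this thread:

[code]
# Two identical 12 V bulbs in series on a 24 V supply.
# Assumes 200 ohm bulbs (12 V / 0.06 A); real filament resistance
# varies with temperature, so this is only a rough check.
supply_volts = 24.0
bulb_ohms = 200.0

amps = supply_volts / (2 * bulb_ohms)   # 24 / 400 = 60 mA through both bulbs
v_each = amps * bulb_ohms               # 12 V across each bulb

print(f"{amps * 1000:.0f} mA, {v_each:.0f} V per bulb")
[/code]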
 
  #6  
Old 11-02-05, 10:08 AM
pappy77
Visiting Guest
Posts: n/a
I do have a 12 volt transformer/power supply. Output is 12 V DC, 500 mA.

I tried a wire-wound resistor and saw no difference in the brightness of the light. I tried a potentiometer and got nothing. I guess next I'll try a 120 V AC dimmer switch... then I'll try a strip of dark tinted plastic over the lights.
 
  #7  
Old 11-02-05, 12:40 PM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
I'd just buy lower-wattage 12-volt bulbs. What wattage are the bulbs you have now?
 
  #8  
Old 11-02-05, 01:27 PM
Member
Join Date: Oct 2004
Location: USA
Posts: 719
First, if you had a true 12 V @ 500 mA power supply, you would not have a problem with a 12 volt @ 0.06 A bulb.
More than likely your power pack is putting out closer to 18 volts.
-----------
Most, but not all, power packs are rated at that voltage with the required load.

I have checked some power packs labeled 12 volts DC at 500 mA; without a load, the voltage was about 18 volts.
This has to do with the AC peak voltage converted to DC.
That 500 mA rating was the load required to bring the voltage down to 12 volts.

People replacing power packs may burn up items by getting a power pack with a higher current rating.
It's best to match the exact rating when replacing a power pack, and that can be iffy.
--------------------------------------------------------

Assuming 18 volts and a 12 volt @ 0.06 A bulb,
you need to create a voltage drop of 6 volts: 18 volts - 12 volts = 6 volts.

The voltage across the resistor will measure 6 volts,
and the voltage across the bulb will measure 12 volts,
totaling 18 volts.

6 V / 0.06 A = 100 ohms

6 V x 0.06 A = 0.36 watts

You need a 100 ohm resistor rated to handle over 0.36 watts.

Try a 100 ohm, 1 watt resistor.
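
The same numbers as a Python sketch, assuming the 18 V unloaded supply described above:

[code]
# Series dropping resistor for a 12 V / 60 mA bulb on an assumed 18 V supply.
supply_volts = 18.0   # unloaded "12 V" power pack, per the measurement above
bulb_volts = 12.0
bulb_amps = 0.060

drop = supply_volts - bulb_volts   # 18 - 12 = 6 V across the resistor
ohms = drop / bulb_amps            # 6 / 0.06 = 100 ohms
watts = drop * bulb_amps           # 6 x 0.06 = 0.36 W

print(f"{ohms:.0f} ohm resistor dissipating {watts:.2f} W -> use a 1 W part")
[/code]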
 
  #9  
Old 11-03-05, 09:48 AM
pappy77
Visiting Guest
Posts: n/a
GWIZ, yer smart! I put a 10 ohm 2 watt (it's all I had on hand) in PARALLEL with the light bulb and it knocked it down to a glimmer! Perfect! I was trying to run the resistor in series before, and now I understand why that won't work. The resistor is stealing current from the bulb via a shortcut back to home base. I thought it acted like a speed bump slowing down the juice. That's obviously not the correct theory of how a resistor works. Thanks for the lesson! I really appreciate it. One more question: am I going to burn down my house, seeing as this thing is in the attic and that damn resistor left a mark on my finger, it got so damn hot!!
 
  #10  
Old 11-03-05, 11:43 AM
Member
Join Date: Dec 2000
Posts: 510
You are likely going to burn out either the resistor or the transformer, or both. If we make the same assumptions GWIZ did (18 V open-circuit voltage, 500 mA rated current at 12 V), the series resistance of the transformer is (18 V - 12 V) / 0.5 A = 12 ohms. With your 10 ohm resistor, the transformer is delivering 18 V / (12 ohms + 10 ohms) = about 800 mA. 800 mA through a 10 ohm resistor gives a dissipation of 6.4 W in the 10 ohm resistor, well over its rated dissipation of 2 W. That's why it burned your finger...

If this 10 ohm resistor is the same one you used in series with the light, then it is far too small to reduce the current much (in series mode). Your bulb has a resistance of about 200 ohms (12 V / 60 mA). You want at least a 200 ohm resistor in series to reduce the current to about the same level you got with the 10 ohm parallel resistor. You could probably get by with a 1/2 W resistor, but a 1 W resistor would be safer. Make the resistor larger if you want the light even dimmer, smaller to make it brighter.
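
A Python sketch of those estimates, under the same assumptions (18 V open circuit, 500 mA rated at 12 V, bulb ignored next to the 10 ohm shunt; the post above rounds 818 mA to 800 mA, hence 6.4 W):

[code]
# Why the 10 ohm parallel resistor ran hot, per the estimate above.
open_circuit_v = 18.0
rated_v = 12.0
rated_a = 0.500

xfmr_ohms = (open_circuit_v - rated_v) / rated_a   # (18 - 12) / 0.5 = 12 ohms internal
shunt_ohms = 10.0

amps = open_circuit_v / (xfmr_ohms + shunt_ohms)   # 18 / 22, about 0.8 A
watts = amps ** 2 * shunt_ohms                     # I^2 * R, about 6.7 W

print(f"~{amps * 1000:.0f} mA; shunt dissipates ~{watts:.1f} W (far over a 2 W rating)")
[/code]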
 
  #11  
Old 11-03-05, 04:01 PM
Member
Join Date: Oct 2004
Location: USA
Posts: 719
The figures should be close.

By placing your 10 ohm resistor ACROSS the transformer, it's like adding about 20 more light bulbs.
If you multiply the current of one bulb (0.06 amps) by 20, your 10 ohm resistor is pulling 1.2 amps.
Twenty bulbs' worth of current gets hot.

One 12 V @ 0.06 A light bulb has a resistance of 200 ohms (as stated by mike in post #10):
12 V / 0.06 A = 200 ohms.

A value of 10 ohms is going to overload the transformer.
The transformer has a capacity of 500 mA.
The approximate draw of the 10 ohm resistor on a stable 12 V supply would be:
12 V / 10 ohms = 1.2 amps, roughly.
The load wants 1.2 amps, but your xformer can only supply 500 mA, so the output voltage drops below 12 V because you are overloading the xformer.

As stated in post #10, you will burn something up.
-------------------------------------------------
In your first post you stated "inline" (series) with your light. That's why I did not highlight "series".
You were on the right track; you just did not have the right value resistor.

Get a 100 ohm resistor or so, 1 watt, and put that "inline" (in series) with the bulb.
Or 200 ohms, as stated in post #10 (read his statement).

-----------------------------------------------
Everything has some resistance: a light switch, an extension cord, a TV...

I think I'm wording this right:
in a series circuit, a resistor limits current, which in turn creates a voltage drop across each resistance in proportion to its value.

With resistances in series on a stable 18 volt supply,
the voltage across the resistor (100 ohms) will measure 6 volts,
and the voltage across the bulb (200 ohms) will measure 12 volts,
totaling 18 volts.
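
That series divider, sketched in Python with the same 18 V / 100 ohm / 200 ohm figures:

[code]
# Series voltage divider: 100 ohm resistor + 200 ohm bulb on a stable 18 V supply.
supply_volts = 18.0
resistor_ohms = 100.0
bulb_ohms = 200.0   # 12 V / 0.06 A

amps = supply_volts / (resistor_ohms + bulb_ohms)   # 18 / 300 = 60 mA
v_resistor = amps * resistor_ohms                   # 6 V
v_bulb = amps * bulb_ohms                           # 12 V

print(f"{amps * 1000:.0f} mA: {v_resistor:.0f} V + {v_bulb:.0f} V = {supply_volts:.0f} V")
[/code]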
 
