DIY 50w LED Lighting


#1
03-07-16, 07:20 AM
L (Member, Thread Starter)

Hello, I'm not sure which section is right for this particular question of mine, since I'm new here, so I'll just say "excuse me, n00b here."
Also, I'm not a native English speaker, so thanks for bearing with my broken English.


Anyway, straight to the problem.


So I decided to make a DIY LED lighting system for my filming project. I've seen some DIY builds out there, but I'm not sure about my own particular build, since I don't want to burn anything out.

Basically, what I have now is a 50 W LED. I also have a DC-DC voltage booster (and regulator). The problem is, I'm planning to use a lead-acid battery (from my motorbike). It puts out about 12 V, which can easily be boosted to 35 V, but it could also deliver something like 100-150 A without a current-limiting resistor. That is, if I stupidly connected the battery straight to the LED, then BOOM! Bye-bye 10 bucks.

I guess this is where I would need a current-limiting resistor. The ones I have are some 1/4 W resistors. Then I remembered the 50 W LED. The hell, does that mean my resistor would just explode if I used only one resistor for the whole system? Afraid of that, I checked 50 W resistors on the web and they cost a fortune. I really want to avoid spending more money than I have to. The question is (tl;dr): can I use paralleled 1/4 W resistors for a 50 W LED build?
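Here's the back-of-the-envelope math I tried (the forward voltage and current are just my guesses, taken from similar 50 W modules I found online, so please correct me if they're off):

    # my guesses: the module drops about 34 V and wants about 1.5 A (= roughly 50 W)
    v_supply, v_led, i_led = 35.0, 34.0, 1.5
    r_series   = (v_supply - v_led) / i_led   # series resistance needed, about 0.67 ohm
    p_resistor = (v_supply - v_led) * i_led   # power burned in the resistor, about 1.5 W
    n_quarter  = p_resistor / 0.25            # minimum number of 1/4 W resistors, about 6
    print(r_series, p_resistor, n_quarter)

So if those numbers are close, the resistor only has to handle a watt or two, not the whole 50 W, but six 1/4 W parts in parallel would be sitting right at their limit with zero margin.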

Also, I want the LED to have different light levels; for example, I may want to use 1/10 of the power, 2/10, 5/10, and so on. I'm thinking about using multiple switches to control each paralleled resistor (Ohm's law basics~), so there would be 10 switches. Will that work? I don't want to control the brightness from the voltage regulator; it's just not efficient and may cause unnecessary heat.
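This is how I imagined the switch steps working, at least on paper (same guessed numbers as above; I know a real LED doesn't keep a constant forward voltage, so the actual steps won't be this neat):

    # very simplified check of the 10-switch idea, assuming a constant 34 V drop
    v_supply, v_led, i_full, n_branches = 35.0, 34.0, 1.5, 10
    r_branch = (v_supply - v_led) / i_full * n_branches   # about 6.7 ohm per switched branch
    for n in range(1, n_branches + 1):
        i = n * (v_supply - v_led) / r_branch             # current with n switches closed
        print(n, "switches ->", round(i, 2), "A, about", round(v_led * i, 1), "W")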

By the way, I know the LED will get hot; I'm planning on using a CPU heatsink for it. So, no worries.

Thank you in advance!
 
#2
03-07-16, 09:19 AM
P (Group Moderator)
The LED module will only draw the amperage/wattage that it needs. As long as you have the voltage correct, you do not have to worry about too big a battery or too much available supply amperage.

I would use an LED dimmer module to control the brightness. Most use pulse-width modulation (PWM) to provide dimming control.
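As a rough illustration of why PWM dims the light (assuming the module really draws about 50 W when fully on):

    # average power delivered at different PWM duty cycles, assuming ~50 W when on
    full_power = 50.0
    for duty in (0.1, 0.2, 0.5, 1.0):
        print(int(duty * 100), "% duty ->", full_power * duty, "W average")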
 
#3
03-07-16, 05:44 PM
L (Member, Thread Starter)
Thanks for answering!

So even though I don't add any current-limiting resistor, it won't burn out the LED?
Because yesterday I accidentally put about 105 W (35 V * 3 A) into the LED, and the LED made a flickering noise, not to mention the sudden light intensity burned my eyes for a while. I was surprised, so I disconnected it after about 1/10 of a second, but I was afraid that if I left it any longer, like 1 second or 10 seconds, it would damage the LED.
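Doing the math afterwards (assuming the module is really only meant for about 50 W), I think I see why it scared me:

    rated_power = 50.0          # W, what the module is sold as
    v_out = 35.0                # V, what I had the booster set to
    print(rated_power / v_out)  # about 1.4 A, so the 3 A I fed it was roughly double the rating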

(tl;dr, I'm still not sure about what you said...)

As for PWM for the LED dimmer, yes, I've thought of that. But I don't think I can use it for this particular project because of the flickering effect it may cause in my filming (for example, if I wanted to shoot a slow-mo capture). So I need a steady, continuous light.

This is my LED btw.
[Attached image: myLED.jpg]
 
#4
03-08-16, 05:34 AM
P (Group Moderator)
Talk about reinventing the wheel. Variable-intensity video lights are commonly available and cheap.

When you plug a lamp into the wall socket, what limits the current going to the light bulb? It works with both a 40 W and a 100 W bulb. The device will only draw/consume the wattage/amperage that it needs to operate.

You can hook up a 12 V LED light to a stack of 8 AA batteries producing 12 volts, or to a huge 12 V truck battery. The LED won't know how big the battery bank is. All it needs is sufficient watts and voltage. The LED will run longer on the larger battery, but that's the only difference.
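For example, a rough runtime estimate (guessing a typical 8 Ah motorcycle battery and ignoring converter losses):

    # how long an assumed 8 Ah, 12 V battery runs a roughly 50 W load
    battery_ah, battery_v, led_power = 8.0, 12.0, 50.0
    print(battery_ah / (led_power / battery_v), "hours")   # about 1.9 h; a truck battery just lasts longer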

You can use a resistor to drop the voltage, which will reduce the brightness of the LED. It's not the best way; it will certainly be expensive for the resistors and will waste a lot of your battery power as heat.
 
#5
03-08-16, 06:36 AM
L (Member, Thread Starter)
Well, reinventing the wheel, I know, but here in my country those lights cost about $30 or more (which is pricey for a student like me), and the power isn't sufficient for my needs either. It's just that I have plenty of time to do some DIY, so I thought to myself, why not? Let's do it! I love electrical things, and that's why I want to learn a lot from this (supposedly) simple DIY project.

Alright, back to the problem. OK then, now I understand that what my LED needs is sufficient watts and voltage; got that. But I guess I've just run into a new problem...



I think my auto buck-boost voltage converter just got busted... It always acts like there's a short circuit even though the + and - are literally separate. The battery connector just sparks when I try to connect it to the input of the voltage regulator.

Actually, it worked for a while (using my 12 V battery, boosting it to 35 V, then to the LED). The regulator got very HOT after 20 seconds and made a buzzing noise. But the second time I used it, it gave me a tiny fireworks show.

This is the regulator that I think got busted.
[Attached image: 281039_34a05ddd-fb60-4867-b2ab-fc2f68523f6b.jpg]
It says it can handle 3 A, but, well, 35 V * 3 A = 105 W and my LED only uses 50 W max, so why does it look like the regulator can't even handle half the power it's supposed to??!

Is there any way to boost my 12 V battery to 35 V for the LED? I tried connecting 12 V directly to the LED, but it won't light up despite all the amperage it could draw.

Btw, thank you for keeping up with me!
 
#6
03-08-16, 04:28 PM
P (Group Moderator)
Are you sure it's rated for 3 A at 35 volts? It could be rated for 3 A only at 12 volts. Also, some imported and inexpensive electronics have wildly overrated specs.
 
#7
03-08-16, 05:16 PM
L (Member, Thread Starter)
Too bad there are no technical specs about that. It only says the chip has a 3 A (or even 4 A, but not recommended) max amperage, without saying at what voltage. But my quick guess is that the chip was pulling 50 W from a 12 V input, which means it drew about 4.2 A at 12 V on the input side. So yeah, the chip would have eventually burnt out.
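Here's that math again with a guessed converter efficiency thrown in (85% is just my guess for a cheap boost module):

    # input current the little boost module had to supply from the battery
    led_power, v_in, efficiency = 50.0, 12.0, 0.85
    print(led_power / (v_in * efficiency))   # about 4.9 A, well over the chip's 3-4 A rating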

So I think I'm going to buy a more advanced (more expensive) voltage booster that has an integrated heatsink. I have a CPU heatsink on my LED, and it didn't get any warmer even without turning on the fan (lol). So when I have the new voltage booster, I'll attach its transistor to the heatsink too.
 
 
