
# Calculating necessary resistance


#1
01-31-07, 06:06 PM
Member
Join Date: Jan 2007
Posts: 4
Calculating necessary resistance

In my car I have an alternator with a voltage regulator set to 14.6 V. Unfortunately, after sitting out all night in the cold it outputs 14.8 V. That's unfortunate because I have an 800 watt inverter wired directly to the battery, and it shuts off above 14.7 V. I want to avoid waiting until my car warms up to turn on the inverter (cuz I want to put a small hair dryer on it to defrost my windshield lol, among other devices of course). And before you say that's impossible, it can output 7 amps and has successfully powered a space heater.

Anyway, I want to put a high wattage resistor on the line coming from the battery to drop the voltage about 0.3 V. The problem is, I don't know what the current on the DC line coming in from the battery is, and I don't want to measure it cuz it might fry my ammeter. So far one theory is that since the inverter outputs 7 amps at 115 V, it must draw 56 amps from the battery/alternator at 14.6 V. Another theory is that it draws about 8 amps, cuz dissipating something like 40 amps would light the thing on fire, so it must not do that. What we have determined, though, is that inverters run on magic and don't make any sense. I obviously need to know the current in order to calculate how many ohms the resistor should be to drop the voltage by 0.3 V, so what do you guys think about how much current the inverter is drawing?
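(Editor's note: the two "theories" above can actually be reconciled with a quick back-of-the-envelope sketch. The battery-side current scales with the AC load actually drawn, not the inverter's rating. The 85% efficiency figure below is an assumed typical value, not something measured in this thread.)

```python
# Rough sketch, assuming a near-ideal inverter: DC input current depends
# on the AC load being drawn at that moment, not on the 800 W rating.

def dc_input_current(ac_watts, dc_volts=14.6, efficiency=0.85):
    """Estimate battery-side current for a given AC load.
    efficiency=0.85 is an assumed typical figure, not a measured one."""
    return ac_watts / efficiency / dc_volts

print(round(dc_input_current(800), 1))  # ~64.5 A: inverter maxed out
print(round(dc_input_current(100), 1))  # ~8.1 A: a light accessory load
```

So the "56 amps" theory is roughly right at full load and the "8 amps" theory is roughly right at light load; the draw swings between them as devices switch on and off.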

#2
01-31-07, 06:27 PM
Member
Join Date: Nov 2004
Location: CA
Posts: 1,913
This is not the right way to do it, because the current drawn by the inverter varies with the load put on it. But to put some numbers on paper:

Ignoring losses, watts out = watts in, so 120 volts X 7 amps out = 840 watts.

840 watts into the inverter from a 14.8 volt source is about 57 amps.

Resistor to drop 0.3 volts: 0.3 divided by 57, roughly 0.005 ohms. The resistor would dissipate about 17 watts (0.3 V × 57 A), so it would have to be very large.

Series resistance is not a good way to establish a voltage level in a circuit where current would vary widely from no load to full load.
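(Editor's note: the point above can be made concrete with a short sketch using the thread's own numbers. The resistor is sized for full load, so at any lighter load the drop collapses.)

```python
# Why a fixed series resistor can't hold a 0.3 V drop: the drop is I*R,
# and I swings widely between light load and full load.

def drop_across(resistance_ohms, current_amps):
    """Voltage drop across a series resistor (Ohm's law)."""
    return resistance_ohms * current_amps

R = 0.3 / 56  # ~0.0054 ohms, sized for the ~56 A full-load case

print(round(drop_across(R, 56), 2))  # 0.3 V at full load, as intended
print(round(drop_across(R, 8), 2))   # only ~0.04 V at a light 8 A load
print(round(0.3 * 56, 1))            # ~16.8 W dissipated at full load
```

At light load the inverter would still see nearly the full 14.8 V, so the resistor solves nothing except when the inverter happens to be maxed out.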

#3
01-31-07, 06:39 PM
Member
Join Date: Jan 2007
Posts: 4
I also have multiple subwoofers on the inverter (DJ GEAR AND LOGITECH RAWKS!), plus of course some computer power supplies with neons and LEDs and EL wire. As for why the hell I did that... PC stuff is cheaper than car stuff lol. Plus, my speakers can output louder, clearer, better sound than most door/rear speaker combinations at 3x the price.

Anyway, subs tend to draw more when they're playing than when they're not, of course, so the dynamic draw makes the problem extra bad. If there is no set resistance that would guarantee the same drop every time, my only choice is one of those way too expensive voltage regulators, and that would be the same as getting the one in my alternator replaced with a 14.4 V unit. The people who would install it recommended against it: it keeps the voltage at that level by opening and closing the circuit rapidly, and to hold it that low it may switch slowly enough that I'd see a flicker in my lights. And I don't want to put a capacitor on the alternator's output line to fix that problem, so I don't really want to risk it considering the cost. Are there any other solutions?

#4
02-01-07, 05:25 AM
Member
Join Date: Jan 2007
Location: Piedmont
Posts: 91
It seems to me that your inverter is either maxed out or overloaded (i.e., 840 watts on an 800 watt inverter). This may not trip the internal breaker (mine are 20 amps), but it very well may burn up your inverter, battery, alternator, or all three. Prolonged use of the inverter in this setup could cost you $$$. Adding more resistance will only produce more heat build-up.

#5
02-01-07, 08:43 AM
Member
Join Date: Jan 2007
Posts: 4
Naw, it's only at about 300-400 watts total with the normal stuff on it, which is like 3-4 amps. And trust me, with how cold it is right now, it won't burn up. The high this weekend is 3 :P

#6
02-01-07, 09:05 AM
Member
Join Date: Jan 2007
Location: Piedmont
Posts: 91
With the "normal" setup (car audio and such), I'm sure you're right. It's the space heater or hair dryer that could burn it up. I've had several space heaters, and all of them were dual-setting 750 or 1500 watts. If you run ONLY the space heater at, say, 750 watts and nothing else, you might be okay. Turn on the radio with subs and speakers, and you're over the limit. That's all I'm suggesting. Yes, cold temps may delay the inverter burning up, but if you overload it, the internals will burn up eventually, or damage the rest of your charging system, or both.
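(Editor's note: the headroom argument above is simple addition, sketched here with the thread's own figures. The 400 W audio number is the poster's upper estimate.)

```python
# Quick headroom check: a 750 W heater alone fits the 800 W rating,
# but heater plus the "normal" 300-400 W of audio gear does not.

INVERTER_RATING_W = 800

def overloaded(*loads_w):
    """True if the combined AC loads exceed the inverter's rating."""
    return sum(loads_w) > INVERTER_RATING_W

print(overloaded(750))       # False: heater alone is under the limit
print(overloaded(750, 400))  # True: heater plus audio is over the limit
```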

#7
02-01-07, 09:21 AM
Member
Join Date: Jan 2007
Posts: 4
I was just running them on it to see if they would :-P It's not like I keep them in my car at all times. I was going to try to find a 500-600 watt hair dryer that wouldn't even come close to maxing out the inverter, just to defrost the windshield, but haven't found one yet. The space heater on low mode would make the low-voltage or high-amperage alarm sound on the inverter every time I stopped at a stop sign lol. Now THAT would burn up the inverter real fast lol. So yeah, I don't plan on putting that in there permanently.

#8
02-01-07, 04:22 PM
Member
Join Date: Jan 2007
Location: Piedmont
Posts: 91
lol....Okay, just don't want to see you make an expensive decision......I know sometimes you have to do what you have to do. Wish I could help you out w/ the voltage regulation part, but somebody in the automotive boards could probably help you there.....good luck!