Ohm's Law


  #1  
Old 01-11-05, 09:01 AM
akant
Visiting Guest
Posts: n/a
Ohm's Law

Hello everyone,

I am very inexperienced when it comes to electricity. However, I am fascinated by the technology and have a great respect for those who are able to work with it. I have replaced fuses, assisted my friend with adding breakers, and have done some small wiring projects around my home.

I would like to understand Ohm's law better. Could some of you provide some explanations? I have a good idea what AMPS are, however I get lost with voltage. The only thing I can think is that voltage is like water pressure in a pipe?? Is this close? How do WATTS work? Isn't the formula something like AMPS x VOLTS = WATTS?


Comments? Suggestions? Thanks!
 
  #2  
Old 01-11-05, 09:39 AM
Member
Join Date: Sep 2000
Location: Fayetteville, NY, USA
Posts: 1,052
Good topic. You are correct, voltage is pressure. That is how people in the business talk about voltage. The more pressure you put on a wire, the easier the electrons transfer down it. Increasing the voltage decreases the resistance. E = I/R, where E is voltage, I is amps, and R is resistance. Wire of a certain gauge has a certain resistance at a certain voltage. The longer the wire, the more resistance. You need more pressure. Utilities use high voltage for long distance transmission because they can use smaller gauge wire for the same number of amps than they could at lower voltages.

Watts is energy used. P = I * E, where P is power, or in this case watts. (It is also measured in volt-amps, or VA.)

Using these two equations you can find out a lot of stuff.

If you use a 100 amp light bulb for one hour, you've used 1,000 watts over that time. That is one kilowatt of power. That is how the utility determines how to charge you. You pay by the kilowatt-hour.

If you wanted to know how many amps that 100 watt light bulb takes you would use algebraic manipulation to get I = P/E, solving for I, because P and E are known. 100 watts / 120 volts = 0.83 amps.

If you wanted to install a brand new lighting circuit and you wanted to know how many watts of lighting you can hang on that circuit you would use P = I * E, solving for P. If you are going to install a 15 amp circuit the NEC says you size it based on 80% of the load, maximum. 15 amps * 0.8 = 12.0 amps. 12 amps * 120 volts = 1440 watts.
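Here is a small Python sketch of those two calculations; the 120 volt figure and the 80% rule of thumb are simply the numbers used above (the 80% claim is questioned later in the thread):

    # Current drawn by a 100 watt bulb at 120 volts: I = P / E
    bulb_watts = 100
    volts = 120
    print(bulb_watts / volts)         # about 0.83 amps

    # Max lighting load on a 15 amp circuit, using the 80% figure stated above
    breaker_amps = 15
    usable_amps = breaker_amps * 0.8  # 12 amps
    print(usable_amps * volts)        # 1440 watts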

Gee, that was fun. Any questions?

Juice
 
  #3  
Old 01-11-05, 11:29 AM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
You titled your post "Ohm's law", but your actual question went in a different direction. Nevertheless, I thought that since the title is "Ohm's law" that Ohm's law itself should actually make it into this thread at some point. Ohm's law is:

E = I * R, or
Voltage = current times resistance.
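As a one-liner in Python, just to make the algebra concrete:

    # Ohm's law: E = I * R (volts = amps * ohms)
    def voltage(current_amps, resistance_ohms):
        return current_amps * resistance_ohms

    print(voltage(0.5, 240))   # 0.5 amps through 240 ohms drops 120 volts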
 
  #4  
Old 01-11-05, 02:10 PM
Member
Join Date: Oct 2004
Location: USA
Posts: 719
Originally Posted by JuiceHead
Increasing the voltage decreases the resistance.

Juice
The temperature of the wire will change the resistance a small amount.
Inductance is another subject. So just the basics.

The resistance in wire stays the same, with the exception noted above; it does not change with voltage in any way.

Resistance opposes Current (amperage)
Or you can say resistance limits amperage.

I will try to add more later.
 
  #5  
Old 01-11-05, 11:14 PM
Member
Join Date: Apr 2003
Posts: 32
If you use a 100 amp light bulb for one hour, you've used 1,000 watts over that time. That is one kilowatt of power



I'm confused... if you use a 100 amp light bulb for an hour, how do you come up with 1,000 watts? That doesn't seem to make sense.
 
  #6  
Old 01-12-05, 04:34 AM
Member
Join Date: Sep 2003
Location: Central New York State
Posts: 13,973
100 amps at 120 volts for one hour = 12000 volt-amps per hour or 12000 watts per hour or 12 kilowatts.

That's a very large light bulb.
 
  #7  
Old 01-12-05, 11:49 AM
Member
Join Date: Sep 2000
Location: Fayetteville, NY, USA
Posts: 1,052
OK, everybody, I stuck my foot in it. I'm off to go have a big crow dinner...

Ten 100 watt light bulbs are of course 1,000 watts. Run all ten for an hour and you've used one kilowatt-hour.

John Nelson busted me. I mistakenly wrote E = I/R. It is, of course, E = I * R as he said.

I also goofed on resistance. Instead of the resistance of the wire decreasing with increasing voltage, I was really thinking about voltage drop. Over long runs, the resistance of a given diameter wire causes this. The voltage drop decreases as you (a) increase the wire diameter, or (b), increase the voltage.

Sorry for any confusion.

Juice
 
  #8  
Old 01-12-05, 12:12 PM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
Everybody be careful here. If we're not careful, we'll spread disinformation.

100 amps at 120 volts for one hour = 12000 volt-amps per hour or 12000 watts per hour or 12 kilowatts
The term "watts per hour" is meaningless. Watts is already a rate. You cannot meaningfully add "per hour" to the end of it.

If you use a 100 amp light bulb for one hour, you've used 1,000 watts over that time. That is one kilowatt of power.
This is also meaningless. 100 watts used for an hour consumes one tenth of a kilowatt-hour of energy. Used over two hours consumes two tenths of a kilowatt-hour, but is still only 100 watts. Don't confuse power with energy. You pay for energy (measured in kilowatt-hours), not power (measured in kilowatts).
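A quick Python sketch of the power/energy distinction being made here:

    # Energy (kWh) = power (kW) * time (hours); the power itself never changes.
    power_watts = 100
    for hours in (1, 2):
        energy_kwh = power_watts / 1000 * hours
        print(power_watts, "watts for", hours, "hour(s) =", energy_kwh, "kWh")

    # 100 watts for 1 hour(s) = 0.1 kWh
    # 100 watts for 2 hour(s) = 0.2 kWh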

Watts is energy used.
No, watts is not energy. As stated above, energy is measured in kilowatt-hours, not kilowatts or watts. Watts is a measurement of power, not energy.

If you wanted to install a brand new lighting circuit ... the NEC says you size it based on 80% of the load, maximum
The NEC says no such thing.

The voltage drop decreases as you (a) increase the wire diameter, or (b), increase the voltage.
Yes, voltage drop through the wire does indeed decrease as you increase the wire diameter, but no, voltage drop does not decrease as you increase the voltage. It increases.
 
  #9  
Old 01-12-05, 12:47 PM
Member
Join Date: Sep 2000
Location: Fayetteville, NY, USA
Posts: 1,052
OK, watts is power, I must agree. 1,000 watts is one kilowatt. Kilowatt-hours is energy, the unit the utility uses to quantify the amount of "product" you have used so they can charge you. If you use 1,000 watts for one hour, they will charge you for one kilowatt-hour. I think John Nelson clarified this succinctly.

When discussing the 80% requirement of the NEC to size a circuit, I may have misspoken. Working in an industrial engineering environment, we apply a demand factor of 1.25 to all continuous loads. These are loads that will run for a minimum of three hours at a time. That keeps the circuit at 80% of its rating. I did not mean to say that you size a circuit based on 80% of its load; I meant 80% of the circuit's rating. I am less familiar with residential lighting loads specifically.

Now to voltage drop. If you shove 50 amps down a #1 AWG wire over 500 feet at 120 volts/single phase, the drop is 7.7 volts, or 6.4%. That percentage is unacceptable. If you shove 50 amps down a #1 AWG wire over 500 feet at 240 volts/single phase, the voltage drop is 7.7 volts, or 3.2%. Better, but in my business we like to keep it below 3%. If you shove 50 amps down a #1 AWG wire over 500 feet at 480 volts/single phase, the voltage drop is 7.7 volts, or 1.6%. That's more like it. I contend that the voltage drop does not increase with an increase in voltage, as John Nelson said. The point is that as voltage increases, the percentage voltage drop decreases.
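A minimal Python sketch of that arithmetic; the 7.7 volt absolute drop for 50 amps over 500 feet of #1 AWG is the figure from the paragraph above, not something computed here from wire tables:

    # The absolute drop for a given current and wire run is the same at any
    # supply voltage; only the percentage changes.
    absolute_drop = 7.7   # volts (figure from the example above)
    for supply in (120, 240, 480):
        percent = absolute_drop / supply * 100
        print(supply, "volts:", absolute_drop, "volt drop =", round(percent, 1), "%")

    # 120 volts: 7.7 volt drop = 6.4 %
    # 240 volts: 7.7 volt drop = 3.2 %
    # 480 volts: 7.7 volt drop = 1.6 %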

I sure hope I haven't stuck my foot in it again. I truly appreciate John's concern for not spreading disinformation. He's exactly right. That's important. Some folks who read our posts might think that since we're "the experts", everything we say must always be correct. We do our best, but none of us knows everything and everybody, even electrical geeks, makes mistakes. That's why this forum is so cool. No flaming egos, and every now and then a friendly correction is offered and no offense is intended nor taken.

Juice
 
  #10  
Old 01-12-05, 02:57 PM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
Juice, you and I are both right because we made different assumptions. The fact that neither of us stated our assumptions leads to the confusion.

When we said "increase the voltage", we were being ambiguous, because neither of us provided more information.
  • I based my comments on increasing the voltage without changing the load. This in turn will increase the current which will increase the absolute voltage drop (but leave the percentage voltage drop the same).
  • You on the other hand talked about increasing the voltage, but you were assuming a constant current. This is only possible of course if you also change the load. You were also referring to percentage voltage drop where I was talking about absolute voltage drop.
These discussions always go around in circles if the conditions are not clearly spelled out.
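A short numerical sketch of the two assumptions, with made-up wire and load values purely for illustration:

    # Fixed wire resistance; two readings of "increase the voltage".
    wire_ohms = 0.2    # assumed line resistance

    # (a) Constant load: current rises with voltage, so the absolute drop rises
    #     (the percentage drop stays the same).
    load_ohms = 10.0   # assumed load
    for supply in (120, 240):
        current = supply / (wire_ohms + load_ohms)
        drop = current * wire_ohms
        print("constant load:   ", supply, "V ->", round(drop, 2), "V drop,",
              round(drop / supply * 100, 2), "%")

    # (b) Constant current (the load is changed to keep it fixed): the absolute
    #     drop stays the same, so the percentage drop falls as voltage rises.
    current = 50.0     # assumed amps
    for supply in (120, 240):
        drop = current * wire_ohms
        print("constant current:", supply, "V ->", round(drop, 2), "V drop,",
              round(drop / supply * 100, 2), "%")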
 
  #11  
Old 01-12-05, 08:24 PM
Member
Join Date: Nov 2004
Location: CA
Posts: 2,041
Getting back to Ohm's law, it is basic algebra. Voltage (E) is an electromotive force which will cause current (amps) to flow if a complete path (circuit) exists. The amount of current which flows is given by the formula I = E/R. (You can state this in any algebraically equivalent form: R = E/I, E = I x R.) Knowing any two of these values, you can solve for the third.

Power (watts): P = I x E, P = I[squared] x R, P = E[squared]/R

This is an instantaneous measurement, so when we are talking about the electric bill, we talk about kilowatt-hours. 100 volts across a 10 ohm resistance will produce a current of 10 amps. Using any of the power formulas you get 1,000 watts (1 kilowatt). If this device runs for an hour, you will have consumed 1 kWh.
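The same worked example as a few lines of Python:

    # 100 volts across a 10 ohm resistance
    voltage = 100.0
    resistance = 10.0

    current = voltage / resistance     # I = E / R  -> 10 amps
    power = voltage * current          # P = E * I  -> 1000 watts (1 kW)
    energy_kwh = power / 1000 * 1      # running for 1 hour -> 1 kWh

    print(current, power, energy_kwh)  # 10.0 1000.0 1.0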



Note: I included the "squared" notation in my original post, but for some reason an auto-editor on the board deleted it. I had used the symbols "< >" instead of the brackets, and that caused the word squared to disappear. I was an electronics technician for 25 years, Navy and civilian, and feel bad that I seemed to be so unknowing! Sorry!
 

Last edited by 594tough; 01-15-05 at 10:15 PM. Reason: fix the formulas
  #12  
Old 01-12-05, 09:09 PM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
Power ( watts) P = I x E , P = I x R , P = E/R
Well, P=IxE is right. But the other two formulas aren't quite right. They should be:

P = I x I x R
P = E x E / R

A common erroneous conclusion when seeing P = I x E is to conclude that as E goes up, I goes down. This is faulty reasoning because it makes the implicit assumption that P is a constant (but of course it is not). Given constant resistance (approximately true most of the time), as E goes up, I goes up proportionally and P goes up proportional to the square of the increase.
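A small Python sketch of that point, using an assumed fixed resistance:

    # With resistance held constant, doubling E doubles I and quadruples P.
    resistance = 10.0   # ohms, assumed for illustration
    for voltage in (120, 240):
        current = voltage / resistance   # I = E / R
        power = voltage * current        # P = E * I (equivalently E * E / R)
        print(voltage, "V ->", current, "A,", power, "W")

    # 120 V -> 12.0 A, 1440.0 W
    # 240 V -> 24.0 A, 5760.0 W  (four times the power)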

I encourage people interested in all this stuff to take an electrical course at their local community college.
 
  #13  
Old 01-13-05, 04:41 AM
HandyRon's Avatar
Member
Join Date: Dec 2001
Location: New York
Posts: 1,365
John,
I'm not sure I'm following...
"Given constant resistance (approximately true most of the time), as E goes up, I goes up proportionally and P goes up proportional to the square of the increase."
A 120V 1800W heater draws 15A. A 240V 1800W heater draws 7.5A. So for a resistive load, as voltage goes up, current goes down.
This theory does not carry over to motors/inductive loads.
 
  #14  
Old 01-13-05, 07:15 AM
Member
Join Date: Dec 2000
Posts: 510
Originally Posted by HandyRon
John,
I'm not sure I'm following...
"Given constant resistance (approximately true most of the time), as E goes up, I goes up proportionally and P goes up proportional to the square of the increase."
A 120V 1800W heater draws 15A. A 240V 1800W heater draws 7.5A. So for a resistive load, as voltage goes up, current goes down.
This theory does not carry over to motors/inductive loads.
Your 120V 1800W heater has a lower resistance (8 ohms) than your 240V 1800W heater (32 ohms), so resistance is not constant and John's statement does not apply. You are looking at two different heaters designed to put out the same power at different voltages.

Ohm's law does indeed still apply to inductive and capacitive loads, but the values become complex numbers (real/imaginary or magnitude/phase) - the resistance (R) is more correctly called impedance (Z). The simple answer is that the phases of the voltage and current are no longer identical. A bit too complicated to discuss in this forum (I'd put everyone to sleep pretty quick)...
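A quick Python check of those resistance figures, using R = E x E / P:

    # Resistance of a heater designed for a given voltage and power
    def heater_resistance(volts, watts):
        return volts * volts / watts

    print(heater_resistance(120, 1800))   # 8.0 ohms
    print(heater_resistance(240, 1800))   # 32.0 ohms -- four times the resistance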
 
  #15  
Old 01-13-05, 08:04 AM
Member
Join Date: Dec 2004
Location: Canada
Posts: 160
  #16  
Old 01-13-05, 12:38 PM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
Again, it's unstated assumptions. My fault.

When I said "as E goes up", I was referring to a circuit in which you did nothing but increase E. Of course if, as you increase the voltage, you swap out one heater and install a different one, anything might happen to the current.
 
  #17  
Old 01-13-05, 11:29 PM
frhrwa's Avatar
Banned. Rule And/Or Policy Violation
Join Date: Jan 2005
Location: Deer Park WA
Posts: 521
holy bats crap man, you guys are confusing me... just when I thought I had it all fingered out...
 
  #18  
Old 01-14-05, 04:09 AM
Richard D
Visiting Guest
Posts: n/a
Great discussion. What is most interesting is that Ohm's law is the most basic of electrical equations, and even so it causes so much confusion. I guess that's why electrical engineering is so tough, since it's so abstract. Thanks to everyone who participated in the discussion. Sometimes, through mistakes or false assumptions, you can learn a great deal and clear up erroneous conclusions you may have had.

Major props to John and mikewu. I know I learned a bunch.
 
  #19  
Old 01-14-05, 04:11 AM
Richard D
Visiting Guest
Posts: n/a
thanks for the link buzz.
 
  #20  
Old 01-14-05, 07:23 AM
NAVIGATOR
Visiting Guest
Posts: n/a
Handyron is correct with his 1800watt 120volt heater example.
The resistance does not change if you go from 120volts to 240volts.

An electric heater is just a wire which has constant resistance (except for changes due to temp) If it is an 1800watt heater it will use 1800watts at 120volts or 240volts.

The applied voltage does not change. It is either 120volts or 240volts in a residential application.

What will change is the current (amps). If you double the voltage, you divide the current by 2. The answer is still 1800 watts.

120volts X 15amps = 1800watts
240volts X 7.5amps = 1800watts

The wire used in a circuit is sized for the current. That is why we use 240volts.
The appliance will draw half the current for the same power. Therefore you can use a smaller wire. For example -

2400watts / 120volts = 20amps. You need 12ga wire and 20amp breaker.
2400watts / 240volts = 10amps. You can use 14ga wire and a 15amp breaker.
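The same current calculation in a couple of lines of Python (the breaker and wire-gauge pairings above are just the post's example, not computed here):

    # Current drawn by a 2400 watt load: I = P / E
    load_watts = 2400
    for volts in (120, 240):
        print(load_watts, "W /", volts, "V =", load_watts / volts, "A")

    # 2400 W / 120 V = 20.0 A
    # 2400 W / 240 V = 10.0 A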

Just FYI, the wire is insulated for the voltage. Most wire you will find is insulated for 600 volts, so it is good for 120v or 240v.
 
  #21  
Old 01-14-05, 07:44 AM
Member
Join Date: Dec 2000
Posts: 510
Originally Posted by NAVIGATOR
An electric heater is just a wire which has constant resistance (except for changes due to temp) If it is an 1800watt heater it will use 1800watts at 120volts or 240volts.
Absolutely and positively incorrect. If I have a constant resistance and double the voltage the current will double and the power will increase by a factor of 4.
 
  #22  
Old 01-14-05, 08:21 AM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
current will double and the power will increase by a factor of 4
Quite true. At least for a little while until it catches fire.

Textbooks are carefully crafted over a long period of time and thoroughly reviewed. They walk the reader through the concepts one step at a time. In contrast, all of us here are just shooting from the hip with but a moment's thought. It's no wonder that we make so many errors and cause so much confusion.

I think what we've demonstrated quite convincingly here is that a forum such as this is not a great vehicle to provide tutorials in electrical theory.
 
  #23  
Old 01-14-05, 09:27 AM
NAVIGATOR
Visiting Guest
Posts: n/a
OOPS, sorry about that.
I see I was wrong about the constant resistance. Sticking to the same heater example, the 240v heater would have half the resistance (half the coil length) of a 120v to have the same output of 1800w. I didn't quite think that one all the way through.
You are also correct about the shooting from the hip comment.
 
  #24  
Old 01-14-05, 10:07 AM
Member
Join Date: Sep 2000
Location: United States
Posts: 18,497
Sticking to the same heater example, the 240v heater would have half the resistance (half the coil length) of a 120v to have the same output of 1800w
Proves my point again. This is not even close. A 240v heater would have not half, but four times as much resistance as a 120v heater with the same wattage output.
 