how do I read amps on this multimeter?
#1
Hi,
I took a photo of the multimeter, hoping one of you can interpret the multimeter switch positions and the readout for me.
http://www.phoons.com/john/download/multimeter.jpg
I'm expecting my 12V DC circuit to have 0 current flow (as I investigate a battery-powered circuit).
My first simple test was to put a little 14V lamp in series. It gives a faint glow that disappears within half a second (so I'm guessing there's a little bit of capacitance in the system--do I have that right?). But since the bulb stops shining, I don't know if that means 0 flow or just not enough flow to light the bulb.
So, I concluded that the flow was low enough for me to put my multimeter in series in place of the little lamp. Now I see a constant "0.3" or "0.4" in the multimeter window...but I don't know if it means 0.3 A or 0.3 times some multiplier...and thus this post to ask you folks.
In your answer, could you explain the "200m" part? I don't know what to make of it except to conclude it means 200 milliamps (.2 amps), but I would want to interpret that as "this switch setting is good for up to .2 amps." That doesn't make sense to me since (1) I see 0.3 in the readout and (2) the 10A hole I plugged into is somehow supposed to cover a range from 200 mA to 10A.
So, how many amps is my multimeter claiming are flowing through the circuit?
[[Okay, and bonus points if you have a solid explanation for this odd one: the readout increases when I move the device closer to a 60 watt bulb. I'll take a guess: the LCD is a circuit, and heating it results in current and thus a higher value (I saw as high as 8.0 in the readout). What's up with that?]]
Thanks!
John
#2
Alright, I'll take a shot.
The dial labels indicate the maximum current allowable for that setting. On the 200m DC setting, when your probes are connected correctly, you can measure up to 200 milliamps. For that setting, the probes go into Com and mA, not the 10ADC jack where you have them. Also, be sure you are placing the meter in series with the circuit when measuring DC amperage.
Since the probes are not connected correctly, the 60W bulb problem could be just about anything.
The 10A probe input is only to be used if you plan to measure in excess of 200mA, and then only with the red probe connected to the 10A position.
#5
When I use the 10A switch setting and the 10A red cord connection, I see 0.03, which I am guessing means 0.03 amps...and therefore, from your last post, I'd be okay (and would get a more accurate reading) if I switched to the 200mA switch setting and the 200mA cord connection, right?
What did the 0.3 mean when my switch was on 200mA? (Or was it garbage data because I had the wrong combo of 10A cord connection and 200mA switch setting?)
Thanks,
John
#6
Your measurement appears to be approximately 30mA, so using the lower 200mA setting (and the matching mA probe connection) will give a more accurate measurement.
The 0.3 measurement when the wire connection was mismatched with the dial selection was probably garbage.
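Spelling out the arithmetic: on the 10A range the display is in amps, so 0.03 = 0.03 A = 30 mA, and the last digit shown is worth 10 mA, so the true current could be anywhere from roughly 25 to 35 mA. On the 200mA range the display is in milliamps, so the same current should show up as something like 30.0, where the last digit is worth only 0.1 mA. (The exact digit resolution depends on your particular meter, so treat those figures as typical rather than guaranteed.)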
#8
Be sure you understand what it means to connect a meter into a circuit in series. If you connect a meter set for current measurement in parallel with a voltage source, sparks will fly!
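A quick back-of-the-envelope example of why: an ammeter is close to a short circuit. If its internal resistance is, say, 0.1 ohm (a ballpark guess, not your meter's actual spec), connecting it straight across a 12 V battery tries to push I = V/R = 12/0.1 = 120 A through it, far beyond the 10 A rating, so the fuse blows at best.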