An ammeter measures the electric current in a circuit. This may be described more formally as the rate at which electrical charge passes through a given point in the circuit. The standard unit of electrical current is the ampere, although the milliamp is used more often for home experiments. A basic analog ammeter uses the magnetic field generated by the current to move a needle in proportion to that current. Modern ammeters measure current with a digital display.
Examine the structure of a simple circuit. The simplest possible circuit may be shown with a battery and light bulb. The negative terminal of the battery is connected to the negative terminal of the light bulb with a lead. Similarly, the positive terminal of the battery is connected to the positive terminal of the light bulb with the other lead.
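To get a feel for how much current such a circuit carries, you can estimate it with Ohm's law (I = V / R). The sketch below uses assumed example values for the battery voltage and the bulb's resistance, not measurements from any particular circuit:

```python
# Estimate current in the simple battery-and-bulb circuit with Ohm's law.
# Both values below are hypothetical, chosen only for illustration.
battery_voltage = 1.5   # volts; e.g. a single AA cell (assumed)
bulb_resistance = 10.0  # ohms; hypothetical filament resistance

current_amps = battery_voltage / bulb_resistance
current_milliamps = current_amps * 1000  # 1 A = 1000 mA

print(f"Estimated current: {current_amps:.2f} A ({current_milliamps:.0f} mA)")
```

A result in the hundreds of milliamps is typical for small flashlight bulbs, which is why the milliamp is the more convenient unit for home experiments.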
Observe the inputs for an ammeter. A very basic ammeter might have one input and one output. However, a commercial multimeter should have a specific input for measuring current (typically marked “A” for amperage). The output is commonly marked “COM” for common ground.
Turn the ammeter on and set the selector to detect direct current (DC) amperage. A simple ammeter may only be able to detect amperage, but a multimeter can detect various electrical quantities and will need to be “told” which quantity to measure. If the ammeter has a selector for the range of current to display, select the highest available setting.
Disconnect the positive lead from the light bulb and touch the probe from the ammeter’s input (A) to the positive lead from the battery. Touch the probe from the ammeter’s negative terminal (COM) to the positive terminal of the light bulb. This places the ammeter in series with the circuit, so all of the circuit’s current flows through the meter.
Select progressively lower current ranges until you get a measurable result. If your ammeter has this option, you'll want to “scale down” rather than “scale up”. This avoids damaging the ammeter by subjecting it to a level of current higher than it is designed to measure.
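The scale-down procedure can be sketched as a simple loop: starting from the highest range, keep stepping down as long as the current still fits within the next range. The range values and the example current below are hypothetical, chosen only to illustrate the logic:

```python
# "Scale down" range selection: start at the highest range and step down
# until the next range would be exceeded. Values are hypothetical.
ranges_amps = [10.0, 1.0, 0.1, 0.01]  # highest to lowest range (assumed)
actual_current = 0.15                 # amps; the unknown being measured

selected = ranges_amps[0]  # always start at the highest range
for r in ranges_amps:
    if actual_current <= r:
        selected = r       # safe: the current fits within this range
    else:
        break              # the next range down would overload the meter

print(f"Use the {selected} A range")
```

For a current of 0.15 A, the loop settles on the 1 A range: the 0.1 A range would be overloaded, while the 10 A range would waste most of the display's resolution.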