I am new to using multimeters. How do I read and understand the accuracy ratings for digital multimeters? For example, AC Current +/-5.5%, +/-3?
I bought a Radio Shack model #22-174B, and the way the manual tells you to read the accuracies is beyond my comprehension: "Accuracy is shown as % of full scale, plus the number of digits between 5% of range and full scale." Can anyone make sense of that?
Are accuracies on all multimeters read the same way?
I am aware that multimeters with True RMS are supposed to be more accurate. The Radio Shack meter has True RMS, yet its percentage figures are much worse than those of most of the other meters I've looked at. Why is that?
The accuracy of a few ranges on the Radio Shack meter:
DC Volts (under 400 V): +/-1%, +/-2
AC Volts: +/-3%, +/-3
DC Current (at 4 A): +/-5.5%, +/-5
AC Current (at 40 mA): +/-6.5%, +/-3
AC Current (at 10 A): +/-5.5%, +/-3
Capacitance (at 400 uF): +/-10%, +/-3
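Here is my best guess at what a "+/-pct%, +/-digits" spec means, written out as a quick Python sketch so someone can tell me which interpretation is right. The function name and both interpretations are my own assumptions, as is the 0.1 V resolution (I'm guessing the 400 V range on a 4000-count display):

```python
def error_band(reading, full_scale, resolution, pct, digits, of_full_scale=True):
    """Worst-case error for a reading under a '+/-pct%, +/-digits' spec.

    If of_full_scale is True, the percentage applies to the range's
    full-scale value (as the 22-174B manual seems to say); otherwise it
    applies to the reading itself (which I've seen other meters use).
    The 'digits' term adds that many counts of the least significant digit.
    """
    base = full_scale if of_full_scale else reading
    return base * pct / 100.0 + digits * resolution

# Example: reading 120.0 V AC on the 400 V range, spec +/-3%, +/-3
err_fs = error_band(120.0, 400.0, 0.1, 3, 3, of_full_scale=True)   # 12.3 V
err_rd = error_band(120.0, 400.0, 0.1, 3, 3, of_full_scale=False)  # 3.9 V
```

If the "% of full scale" wording is literal, the two interpretations give very different error bands at the low end of a range, which may be why the manual only claims the spec between 5% of range and full scale.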