What is the difference between '% reading' and '% full scale reading'? (FAQ - Force)
The measurement uncertainties associated with specifications for force-measuring devices are often expressed as a percentage of full-scale reading. Sometimes, however, they are expressed as a percentage of reading instead, and the difference between the two can be very significant, particularly when the force being measured is small compared with the instrument's capacity.
For instance, if an instrument with a maximum force capacity of 5 kN is specified to have an uncertainty of ±0.3 % of full-scale output, then the user can reasonably expect that, if the instrument is used correctly, the value of force it indicates will be correct to within ±15 N. But if the force being measured is, say, 500 N, that same ±15 N corresponds to ±3 % of the reading, ten times higher. Similarly, at 50 N the uncertainty expressed as a percentage of the reading would be one hundred times greater, or ±30 %.
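The arithmetic above can be sketched in a few lines of Python (the function names and the 5 kN capacity are taken from the example; they are illustrative, not part of any standard):

```python
def uncertainty_of_reading(force_n, pct):
    """Uncertainty expressed as a percentage of the indicated force, in newtons."""
    return force_n * pct / 100.0

def uncertainty_of_full_scale(capacity_n, pct):
    """Uncertainty expressed as a percentage of the instrument's capacity, in newtons."""
    return capacity_n * pct / 100.0

capacity = 5000.0  # N, the 5 kN instrument from the example

for force in (5000.0, 500.0, 50.0):
    u_fs = uncertainty_of_full_scale(capacity, 0.3)  # fixed +/-15 N at 0.3 % of full scale
    rel = 100.0 * u_fs / force                       # the same +/-15 N as a % of the reading
    print(f"{force:6.0f} N: +/-{u_fs:.0f} N = +/-{rel:.1f} % of reading")
```

Running this reproduces the figures in the text: ±0.3 % of the reading at full capacity, ±3 % at 500 N, and ±30 % at 50 N, because the absolute uncertainty stays fixed at ±15 N while the measured force shrinks.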
The table below shows the uncertainty in a force measurement expressed first as 1 % of reading and then as 1 % of full-scale reading, to illustrate the difference; at the lowest forces, a device performing to 1 % of full-scale reading is unlikely to make a meaningful measurement.
Different meanings of '1 % uncertainty'
| Force | Percent of reading (1 %) | Percent of full-scale reading (1 %) |
|---|---|---|
| 50 N | ±0.5 N | ±50 N |
| 500 N | ±5 N | ±50 N |
| 5 000 N | ±50 N | ±50 N |

Example comparison of two common methods of expressing uncertainty, showing different meanings of '1 % uncertainty' (illustrative values for a 5 kN capacity instrument, as in the example above).