The answer to this question deserves some deeper explanation, as many in the industry tend to use a "less than completely honest" means of specifying accuracy.
Accuracy is often specified as +/- a given percentage. However, that percentage is frequently measured against the truck's CAPACITY rather than the actual LOAD, which can be very misleading.
Any non-sensor scale will have some degree of variance, and the LT-100's average variance is +/- 35 lbs.
If you are consistently weighing loads of 90,000 - 100,000 lbs, that +/- 35 lbs amounts to roughly 0.04% of the load.
However, if your average loads are 800 lbs, the same +/- 35 lbs works out to roughly +/- 4.4%.
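The relationship above is simple division: a fixed absolute variance becomes a larger percentage as the load shrinks. A minimal sketch (the +/- 35 lbs figure comes from the text; the function name is ours):

```python
def relative_accuracy_pct(load_lbs: float, variance_lbs: float = 35.0) -> float:
    """Return the fixed variance expressed as a percentage of the load weighed."""
    return variance_lbs / load_lbs * 100.0

# The same +/- 35 lb variance is negligible on heavy loads, large on light ones.
for load in (100_000, 90_000, 800):
    print(f"{load:>7} lb load -> +/- {relative_accuracy_pct(load):.2f}%")
```

This is why a spec quoted against full capacity can look far better than the accuracy you will actually see on a light load.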
We've also found accuracy to be strongly affected by two factors:
- Consistency of the operator.
- Consistency of the load.
If you are weighing very similar weight loads and calibrate your scale using that average load, your accuracy will be much improved.
Similarly, consistency in operators and operator processes will improve accuracy, since variation in the weighing process shows up directly as variation in the measured load.
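The calibration advice above can be sketched as a simple one-point calibration at your typical load. This assumes the scale's reading grows linearly with weight (our assumption for illustration, not a claim about the LT-100's internals); the reference-load numbers are hypothetical:

```python
def calibration_factor(known_load_lbs: float, scale_reading_lbs: float) -> float:
    """Factor that maps a raw reading to true weight at the calibration point."""
    return known_load_lbs / scale_reading_lbs

def corrected_weight(raw_reading_lbs: float, factor: float) -> float:
    """Apply the calibration factor to a raw scale reading."""
    return raw_reading_lbs * factor

# Calibrate against a known 10,000 lb reference load that read 10,050 lbs raw.
f = calibration_factor(10_000, 10_050)
print(f"{corrected_weight(10_050, f):.0f} lbs")
```

Because the correction is exact only at the calibration point, calibrating at a weight close to your typical load keeps the residual error small where it matters most.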