
Reducing digital scale drift

Rocketvapor

Gold $$ Contributor
I would like to present a method to greatly reduce drift errors when using digital scales for reloading, specifically load cell (strain gage) scales that always seem to drift, requiring recalibration and frequent re-zeroing.

Drift drives the reloader to purchase scales with better accuracy and resolution than 0.1 grain. Beam scale users can be happy with long-term 0.1 grain accuracy, but digital scale users want at least 0.02 grain accuracy and seek out scales with an extra digit of resolution in the hope of getting it. The performance metric often quoted is a single kernel of Varget.
Past experience with cheap scales and poor charge consistency must mean that the best scales are needed to produce downrange bugholes. Is $2000 enough to spend on a scale? $200? What does the extra money get you over a cheaper model?
Pictured: the load cell of a 300g x 0.001g scale purchased in 2019. Chinese origin, of course.
pic-1.jpg
pic-3.jpg

Let's start with ROLLOVER error.
The strain gage scales used by many start off with +/- 0.1 grain resolution. Any reading, even on a perfect digital scale, can correspond to a true value anywhere from one half count below the displayed value to one half count above it. Half your desired +/- 0.1 grain requirement is gone just because the scale is digital. If you consider that the zero can be off 1/2 count AND the charge can be off 1/2 count, some charges can be off by as much as 0.1 grain before scale error even comes into play. A scale with better resolution, say 0.02 grain, still has this rollover issue, but it is smaller.
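A quick way to see this quantization effect is to work in integer display counts. A minimal sketch (the charge values are hypothetical, not from any particular scale):

```python
RES_GR = 0.1  # one display count, in grains

def counts(true_gr, res=RES_GR):
    """Integer count the display would show for a true weight."""
    return round(true_gr / res)

# Any true weight from roughly 24.35 up to 24.45 grains displays as
# 244 counts, i.e. "24.4" -- the reading alone can't tell them apart.
print(counts(24.351), counts(24.449))  # both 244
```

Stack a half-count error in the zero on top of this and two charges that both display "24.4" can differ by nearly 0.2 grain in the absolute worst case.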

Calibration error. Note the shift from grains to grams. Your +/- 0.1 grain requirement is equal to about 0.00648 grams, that is, + and - 6.48 milligrams.
Calibration, often performed at a value close to FULL SCALE (50 grams on a 50 gram scale), has two primary sources of error:
the calibration weight itself and the stored digital calibration value. Full-scale errors are typically percentage based and often insignificant at small loads on the scale. A relatively large full-scale error of +/- 0.01 grams @ 50 grams is +/- 0.02%, and that percentage-of-F.S. error is proportional throughout the range: 0.02% @ 5 grams is +/- 0.001 grams (+/- 0.015 grains) and won't be seen on most cheaper scales. The full-scale value can drift and is sensitive to level and temperature, but it isn't the most significant source of error with a digital scale. Linearity error is related to F.S. error; some scales have a programmed 3-point calibration that reduces linearity error by shortening the span between zero and the calibration points.
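To make the percentage arithmetic concrete, here is a small sketch of how a percentage-of-full-scale error shrinks with the load (the 0.02% figure is the example above, not any scale's spec):

```python
GRAINS_PER_GRAM = 15.4324  # standard unit conversion

def fs_error_grams(load_g, fs_error_pct=0.02):
    """Absolute error, in grams, from a percentage-of-reading FS error."""
    return load_g * fs_error_pct / 100.0

print(fs_error_grams(50))                   # 0.01 g at full scale
print(fs_error_grams(5) * GRAINS_PER_GRAM)  # ~0.015 gr at a 5 g load
```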

Now the big one, ZERO ERROR.
Just because you have a ZERO indication doesn't always mean the scale is truly at zero. Most analog-to-digital converters (the electronics that produce the displayed values) have an 'Auto Zero' that will capture a few counts and display ZERO. This can change each time the scale indicates a value close to zero. The intent is to auto-correct small errors in repeatability and drift. At large loads, 20 to 40 grams on a 50 gram scale, this creates a negligible percent-of-reading error. At small loads, 2 to 5 grams, it is a much larger percentage. Auto Zero might be applied over and over, accumulating several counts of error. Still looks like a zero, but it ain't.
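As an illustration only (a toy model, not any scale's actual firmware), here is how an auto-zero window can silently absorb drift while the display keeps showing 0.000:

```python
import random

# Toy model: the load cell drifts slowly; whenever the empty-pan reading
# is within a few counts of zero, auto-zero re-tares, so the display
# keeps showing 0.000 while the true tare point wanders away.
random.seed(42)
drift = 0.0   # true load-cell drift, in counts
tare = 0.0    # value the scale currently calls "zero"

for minute in range(60):
    drift += random.uniform(-0.5, 0.5)   # slow thermal wander
    if abs(drift - tare) <= 3:           # within the auto-zero window
        tare = drift                     # display snaps back to 0.000

# The display still says zero, but any weight placed on the pan is now
# read against the wandered tare point.
print(f"display shows 0.000, hidden tare offset = {tare:.1f} counts")
```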

Still reading?

The uncorrected error of my 300 gram, 0.001 gram scale makes it only marginally useful for weighing charges. If zero drift could be monitored and corrected (tared) when greater than one or two counts, then charge weights would be more accurate. How can you tell if the zero is wrong?

Start by letting the scale warm up BEFORE calibrating. Let the scale sit for a couple minutes after calibration so the load cell can relax after the large calibration loads, then Tare/Zero. Some battery-powered scales will shut off after 2 or 3 minutes, making it difficult to properly warm up and stabilize. The scale shown also has a rechargeable 6V, 2.5Ah battery.
Now put a good 10 gram weight on the scale, outside your powder pan. A 10.000 gram indication should not interfere with reading charge weights. This is your new FAKE zero. Just watch it each time the empty pan is placed on the scale.

You should check this with a couple of known-good check weights: 2 grams will look like 12.000, 5 grams will look like 15.000. This prevents the scale from getting close to zero and the Auto Zero from trying to alter the reading.
Referring to the picture below: zero drift, the drift tared out, the 10 gram fake zero, 2g added to the FAKE zero, then 5g, and 5 grams plus 5mg.
Pic-5.jpg

If the 10.000 Fake zero changes by more than a couple counts, remove the 10 gram weight and tare the pan again.
Watching a 0.000 reading tells you nothing. Watching the 10.000 reading gives you a known starting point you can watch.
You might be surprised how stable your scale is if you don't let it zero between powder charges.
Also works with 0.01 gram resolution scales.
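The fake-zero bookkeeping above is simple enough to sketch in code. The 10 g reference comes from the method described; the drift tolerance and function names are my own assumptions:

```python
FAKE_ZERO_G = 10.000
DRIFT_LIMIT_G = 0.002   # "a couple counts" on a 1 mg scale (assumed)

def charge_grams(reading_g, fake_zero_g=FAKE_ZERO_G):
    """Net charge weight given a reading taken on top of the fake zero."""
    return reading_g - fake_zero_g

def needs_retare(check_reading_g, fake_zero_g=FAKE_ZERO_G,
                 limit_g=DRIFT_LIMIT_G):
    """True if the fake-zero check has drifted more than a couple counts."""
    return abs(check_reading_g - fake_zero_g) > limit_g

print(charge_grams(12.000))   # the 2 g check weight reads 12.000 -> 2.0
print(needs_retare(10.001))   # within a couple counts -> False
print(needs_retare(10.004))   # drifted 4 counts -> True
```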
 
That strain gauge looks like the one in a Charge Master. What we don't know is the excitation voltage, which determines the output voltage of the strain gauge, or the resolution of the A/D converter, which defines the number of digits of resolution available to the software that compares the weighed result to the target for control of the dispensing. If the software averages several readings, the net effect is a numerical resolution finer than the raw resolution of a single reading. A lot of unknowns!
 
Most good strain gage systems use a common precision reference for both load cell excitation and as the A/D reference. This sets up a ratio of output from the bridge to A/D full scale.
The scale pictured isn't associated with a dispenser.
Can a dispenser cause scale instability while running?
Digital display is 300,000 counts. 3.3ppm resolution for $60.
+/- 6.5 digits (in gram mode) equal to +/- 0.1 grain
It's not that good over the entire range but standardized near the target weight and using a Fake zero, +/- 2 counts is easy peasy. :)
I don't know what the A/D reading rate is (maybe 10 to 15 per second) or if software averaging is used.
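The count arithmetic quoted above checks out; a quick sketch (GRAINS_PER_GRAM is the standard conversion factor):

```python
GRAINS_PER_GRAM = 15.4324

counts_full_scale = 300 / 0.001              # 300,000 display counts
ppm_per_count = 1e6 / counts_full_scale      # ~3.3 ppm of full scale per count
counts_per_tenth_grain = 0.1 / GRAINS_PER_GRAM / 0.001   # ~6.5 counts

print(round(counts_full_scale), round(ppm_per_count, 2),
      round(counts_per_tenth_grain, 2))
```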

Really good digital voltmeters 6 1/2, 7 1/2 digits, have rollover calibrated to .45 to .55 counts.
The problem I've seen with digital scales is not errors at large loads, but drift at the zero end.

Load cell specs here
I have the 3kg model but hadn't opened it yet. Just opened it:
3000g x 0.01g, same 300,000 counts. 3kg-load-cell.jpg 3KG-calibration.jpg
and YES, my cal weights are that good :)
 
Ran a drift test on the 300g/1mg scale most of the day.
Made a crappy little video, time lapse over several minutes with 5 grams, 5g + 5mg, then 111 grams.
Seems to be a problem with playback now.
 
By weighing a heavy load, 100g, you can check full-scale factor drift.
It should be a very small percentage.
By weighing 1, 2, 3, 4, 5 g without allowing the scale to auto zero, you eliminate MOST zero drift.
If the scale holds weight over several minutes with a real weight displayed, the added charge weight will be more believable. For most powder charges a 10 gram fake zero (see first post) makes reading easier.
The 5mg sensitivity weight (less than 0.1 grain) shows +/- a count or two.
Should have used Varget :)
 
Some are direct from Rice Lake and Troemner ($$$)

5.005g reading has now drifted to 5.004 g (about 4 hours)
Look close for the 5mg weight on the pan.
HVAC now running at overnight temperature (68F).
Will let the scale run overnight and check in the morning
5004.jpg
 
Just a thought....

Ever think about getting a Kestrel Drop?

It logs the environment and you would have a record of the temps.

Depending on the model, you can also get temp, humidity, and pressure. All useful to reloading and external ballistics.
 
I monitor temperature, with a stable temperature being the most useful for me.
I just do a heavy load test to check scale factor, recal if needed.
My Fake Zero takes care of the low end.

Pressure and humidity I can't control well.
If I needed really high precision I could correct charge weights for buoyancy.
Humidity will change moisture content of powder if exposed for long periods but anything between 30% and 70% is probably OK.
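For the curious, here is a hedged sketch of the conventional air-buoyancy correction mentioned above. The densities are rough assumptions: ~8.0 g/cm^3 for stainless cal weights, ~1.6 g/cm^3 for smokeless powder, ~0.0012 g/cm^3 for air.

```python
RHO_AIR = 0.0012     # g/cm^3, assumed
RHO_CAL = 8.0        # g/cm^3, typical stainless calibration weight
RHO_POWDER = 1.6     # g/cm^3, rough value for smokeless powder

def buoyancy_corrected(reading_g, rho_sample=RHO_POWDER,
                       rho_cal=RHO_CAL, rho_air=RHO_AIR):
    """True mass from a reading on a scale calibrated with dense weights."""
    return reading_g * (1 - rho_air / rho_cal) / (1 - rho_air / rho_sample)

r = 2.0  # a roughly 31 gr charge, in grams
print(buoyancy_corrected(r) - r)  # correction is on the order of a milligram
```

At a ~2 g charge the correction works out to roughly a milligram (about 0.02 grain), which is why it only matters at very high precision.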
 
Forgive me for not reading every post in the thread, but if you're worried about drift at the 3rd decimal place.....

Throw the darn thing out the window at 75 mph.

Seriously, it's just a bunch of noise.
How many kernels to a grain??
You ain't gonna shoot the difference.
I could get banned for this, so be it.

No I haven't been drinkin....
Well maybe a lil bit.
 
Good Morning :)
@ 6AM this morning, temperature in the kitchen was 67F.
Scale had "drifted" to 5.003. Added 100 grams and got 104.992 grams (with 105.005 grams loaded on the scale).
Zero seems to have drifted down another count and Full Scale Calibration down about 13 counts or about 0.01% .
More than I expected. Heater is on, will check again once house warms up.
EDIT: 10AM
House now at a toasty 72F, 71F under scale base.
5.007 (+2 counts) / 105.001 (-4 counts, -6 counts including zero shift)
1PM
Full scale hasn't come back. Still 6 counts low with 105.005 grams on the pan, 5.005 grams 2 counts high at 5.007. Did a three point calibration @ 100, 200, 300. Starting off with just the 100 gram weight. Getting 99.999. Added 1 gram, 100.999. Another cold (cold for us) night. Conditions not right for reloading, might as well let the scale sit.
 
Did another temperature drift run today.
Cold weather keeps me inside, was in the low 20s.
Cold for south Louisiana.
So, here's what I did. Left scale on over night.
Measured temperature of steel base plate, bottom of scale. Overnight temperature in Kitchen=65F
Zeroed scale, added 1 gram, then 10 grams, then 100 grams for a total of 111.000.
Repeated this as kitchen warmed up during the morning. Then turned on oven and warmed up scale to a little over 80F.
Turned off oven let cool down.
Calibration factor @111g vs temperature recorded. Counts are 1 milligram each.
Zeroed for each reading.

Temp(F)  Counts  % error
65        -18    -0.016%
66        -14    -0.0126%
67        -10    -0.009%
68         -8    -0.0072%
69         -6    -0.0054%
70         -5    -0.0045%
71         -4    -0.0036%
72         -2    -0.0018%
73          0     0
-- Oven on --
74         +2    +0.0018%
75         +4    +0.0036%
76         +6    +0.0054%
77         +9    +0.0081%
78        +14    +0.0126%
80        +20    +0.018%
-- Oven off --
82        +25    +0.0225%
85        +30    +0.027%
82        +22    +0.0189%
80        +17    +0.0153%
(Still waiting for scale to cool off.)
78        +14    +0.0126%
76         +7    +0.0063%
75         +5    +0.0045%
Taking forever to drop now.
75.4F
75F-111g.jpg
73.8F Getting closer :)
74-111g.jpg
OK, all done :) 73F
73-Degrees.jpg

Didn't feel like making a 12 hour video :)
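A simple least-squares line through the warm-up leg of the tabulated data gives a rough temperature coefficient (counts are 1 mg each at the 111 g load; the fit is my own estimate from the numbers above):

```python
# Warm-up leg of the drift run: 65F through 80F, counts of error at 111 g.
temps = [65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 80]
errs =  [-18, -14, -10, -8, -6, -5, -4, -2, 0, 2, 4, 6, 9, 14, 20]

n = len(temps)
mt, me = sum(temps) / n, sum(errs) / n
slope = (sum((t - mt) * (e - me) for t, e in zip(temps, errs))
         / sum((t - mt) ** 2 for t in temps))     # counts per degF

# 1 count = 1 mg = 0.001 g; express as a percentage of the 111 g load:
tempco_pct = slope * 0.001 / 111 * 100
print(f"~{slope:.1f} counts/degF, ~{tempco_pct:.4f} %/degF")
```

That works out to a couple of counts per degree, which matches the eyeball read of the table.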
 
One mistake some scale users make is not fully understanding how well or how poorly their scale is performing.
Half a kernel of Varget is only a sensitivity test, NOT an accuracy test.
Even in a standards laboratory environment, performance at the target load must be verified before performing measurements. Using a product to evaluate scale performance adds the (in)stability of that product to the test.
Variations in the water content of a hygroscopic material can introduce day-to-day variations in measured quantities. The stability of stainless steel calibration weights, properly cared for, is many times better than required.
Take this line from a 10 gram ASTM Class 1 certificate.
10-gram-cert.jpg
Using this to calibrate/standardize a measurement would have to include Mass Value correction, Uncertainty of the 10 Gram standard AND the uncertainty of the measurement process.
This is where KNOWING the performance of your scale (not advertised specifications) is important.


Calibration at a large value, typical of cheaper scales, sets the output of the weighing sensor to the digital value displayed for "Full Scale". Tare/Zero sets the low end of the range and is usually only valid for a short weighing session.
Linearity is typically a function of scale design, stability of the calibration end points, and the distance between calibration points. Scales that use a 3-point scale factor calibration help with display linearity, especially if one of the cal points is near your target value. Think about it: which would you rather use to weigh 2-3 grams, a 50 gram scale calibrated only at 50 grams, or one with a 3-point cal @ 2, 10 and 50 grams? Better yet, run an actual performance verification, like the 1 gram, 11 gram and 111 gram test I did on my cheapo scale.

I could post a linearity test of my cheapo milligram scale to find the error between cal points, but I challenge scale users to do their own test to find fixed and variable errors at their most often used loads.
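To illustrate why a cal point near the target helps, here is a minimal piecewise-linear sketch; the raw counts and cal loads are invented for illustration, not taken from any real scale:

```python
import bisect

# Hypothetical calibration table: (raw A/D counts, grams)
CAL = [(0, 0.0), (12000, 2.0), (60000, 10.0), (300000, 50.0)]

def to_grams(raw_counts):
    """Piecewise-linear conversion through the calibration points."""
    pts = [c for c, _ in CAL]
    i = max(1, min(bisect.bisect_right(pts, raw_counts), len(CAL) - 1))
    (c0, g0), (c1, g1) = CAL[i - 1], CAL[i]
    return g0 + (raw_counts - c0) * (g1 - g0) / (c1 - c0)

print(to_grams(15000))  # interpolated on the 2-10 g segment -> 2.5
```

With a single 50 g cal point, any curvature in the sensor response lands unchecked on the 2-3 g readings; the extra points pin the curve down where you actually weigh.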
Oh, and BTW, don't believe a zero reading,
because it ain't :)
 
There is a lot of misunderstanding floating around concerning precision/analytical scales and their use.

The facts are that all scales drift due to temperature, and all scales have a temperature effect on calibration; it is a function of the physics. Zero drift is the most significant source of error in an electronic/digital scale and also the easiest to control, so it should never be allowed to become a source of error. The primary source of drift error is the differential temperature response of the measuring element and the electronics. Years ago in lab work, users were taught to zero the scale prior to each measurement. Today, high-end scales have the processing capability for auto zero/tare. Every scale should be zeroed prior to every measurement.

Calibration drift is typically less of an issue because the nonlinearities of the weighing element do not change significantly. For proper calibration, all scales should be calibrated prior to each weighing session.

Repeatability of the scale is typically influenced by mechanical issues such as friction and hysteresis of the measuring element. It is poor practice to leave a weight on the scale longer than necessary, as this can contribute to calibration drift and hysteresis.

Proper handling of calibration weights is also required. Nitrile gloves should be worn at all times when handling calibration weights. This serves two purposes: primarily it prevents the collection of oils and dirt from the user's hands, which affects the weight, and it also prevents long-term corrosion of the weight.

Gloves should also be worn when weighing to eliminate the possibility of oils and dirt from adding to the weight of the powder pan.

Good practice is also to remove the sample, then reweigh it after dispensing the material.
 
