
Powder scale accuracy range

Could the accuracy of a scale be measured on a standard deviation bell curve? Example: a scale can measure up to 1000 grains. Is there a sweet spot, say halfway at 500 gr? Also, on the same scale, would it measure 1 gr going from zero to 1 accurately, or 500 to 501 more accurately? Yeah, I know: beam, strain, magnetic. Just trying to understand operation in general. I've done some reading on this but it's way not layman enough for my puny brain. Mike
 
If you take the beam scale, the mass on the pivot, say at zero, is the counterpoise weights, the beam, the pan, etc.
Friction in the pivot increases as the pan is filled and the counterpoise is moved out to balance.
Friction cuts into sensitivity, even if it is small. Larger two-pan balances go to extremes to reduce friction.
Weighing also relies on equalization of the pendulum by measuring 3 consecutive turning points and calculating the center, rather than magnetically damping the response early. The magnet/vane is only there to save time.
The positions of the notches in the beam determine incremental accuracy. I think they are positioned to the microinch.
The beam on both sides of the pivot has to be at thermal equilibrium.
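That turning-point calculation can be sketched in a few lines. The rest point of a freely swinging beam is taken from three consecutive turning points, two on one side of the swing and one on the other, which averages out the slowly decaying swing. The swing readings below are invented:

```python
# Sketch of the turning-point method: estimate the rest point of a swinging
# beam from three consecutive turning points, read in scale divisions.
# t1 and t3 are on one side of the swing, t2 on the other.

def rest_point(t1, t2, t3):
    """Average the two same-side turning points, then average with the other side."""
    return ((t1 + t3) / 2 + t2) / 2

# Invented example swing readings, in scale divisions:
print(rest_point(4.0, 16.0, 6.0))   # 10.5
```

Averaging the two same-side points first compensates for the swing dying down between readings; a plain three-way average would be biased toward that side.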

The force balance scale uses precise internal electronic references to generate a counter force to the weight on the pan.
Some use a frequency modulated or pulse width modulated current source to reduce hysteresis when reaching the final weight.
When calibrated at full scale with a known mass, linearity can be as good as the electronics inside.
Cost usually determines how good the electronics are. Cal labs use force balance scales to compare reference standards.
Most two-pan balances have been replaced with force balance. I liked the old stuff, but it took forever to get an accurate result :(

Load cell scales also have a range of performance. The load cell itself can be compensated for temperature and zero drift by using static strain gages on the beam.
These are included in the bridge circuit to try to cancel error in the active gages.
Strain gages exhibit a permanent drift in zero and scale value the more they are flexed. That's why frequent calibration is needed.
They also exhibit hysteresis, producing a different final reading when the load is approached from below versus from above the target weight.
By using a load cell of appropriate size, the mechanical drift of too small a load cell and the hysteresis of too large a load cell can be balanced.
The electronics used in most load cell applications use the excitation voltage, 5 volts or 10 volts, as the reference voltage for the measuring electronics.
That means a 5.1 volt excitation used to excite the strain gages is also the reference for the measuring electronics, canceling out the voltage error.
They work by measuring the ratio of the excitation to the strain gage bridge output.
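That ratiometric cancellation can be sketched numerically. The 2 mV/V bridge sensitivity and 100 g capacity below are invented example values, not any particular load cell's datasheet:

```python
# Sketch of ratiometric load-cell readout (invented numbers).
# A strain-gage bridge with 2 mV/V sensitivity puts out 2 mV per volt of
# excitation at full load. Because the ADC uses the same excitation as its
# reference, only the RATIO matters, so excitation error cancels.

def weight_from_bridge(bridge_mv, excitation_v, sensitivity_mv_per_v=2.0,
                       capacity_g=100.0):
    """Convert bridge output to grams using the ratiometric principle."""
    ratio = bridge_mv / excitation_v          # mV per volt of excitation
    return ratio / sensitivity_mv_per_v * capacity_g

# Nominal 5.0 V excitation, half load -> 5.0 mV out:
print(weight_from_bridge(5.0, 5.0))    # 50.0 g

# Excitation drifts to 5.1 V; bridge output scales with it (5.1 mV),
# but the ratio, and therefore the indicated weight, is unchanged:
print(weight_from_bridge(5.1, 5.1))    # still 50.0 g
```

The second call shows the point of the 5.1 volt example above: the excitation error appears in both numerator and denominator and drops out.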

Test your scale, whichever type, with a known value check weight near your target.
Some digital scales use two calibration points: full scale to set the span, and a midscale point to fit the linearity to an S curve.
Proper calibration would use more points to create a software table in the scale electronics, one of them being around the 10% point.
Pay attention to gravity and your local vertical.
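The multi-point software table described above can be sketched as a list of (raw counts, true weight) pairs with linear interpolation between them. All the counts and weights below are invented for illustration:

```python
# Sketch of a multi-point calibration table, as scale firmware might store it.
# Raw ADC counts are mapped to grams by linear interpolation between
# calibration points (zero, ~10% point, midscale, full scale). Invented data.

import bisect

CAL_TABLE = [(0, 0.0), (1040, 10.0), (5010, 50.0), (9980, 100.0)]

def counts_to_grams(raw):
    """Piecewise-linear lookup through the calibration table."""
    xs = [c for c, _ in CAL_TABLE]
    i = max(1, min(bisect.bisect_left(xs, raw), len(xs) - 1))
    (x0, y0), (x1, y1) = CAL_TABLE[i - 1], CAL_TABLE[i]
    return y0 + (raw - x0) * (y1 - y0) / (x1 - x0)

print(counts_to_grams(1040))   # 10.0, exactly at a cal point
print(counts_to_grams(3025))   # interpolated between the 10% point and midscale
```

A two-point calibration is the same idea with only the endpoints, which is why it cannot correct any bow in the middle of the range.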

If using an electronic scale with "Auto Zero" keep the scale away from zero.
It will suck up a count without you knowing it.

Try the Nickel sorting experiment I mentioned elsewhere in the forum.
You can learn just how well your scale performs for just 55 cents :)
 
Well, the beam scale seems fairly easy to understand. More weight on the pivot, more friction; more friction, less sensitive/accurate. Yeah?
 
Here's an experiment most of you poor folks can perform with your scale.
You could buy a set of very precise mass standards, but creating a 'working set' will protect
your investment. The less you actually use the standards, the less they will change.

So, grab a roll of dimes. Sort them out with your scale, selecting a bunch that indicate about the same weight, maybe 35 grains.
What you want is most of them as close to the same as you can get and one maybe 0.1 or 0.2 grains light, and one about 0.1 or 0.2 grains heavy.
But you want most of them really close together.
(a little work to the back with some 600 grit might help :) )
Pick one from the middle and mark it with a black marker, the light one with a red marker, the heavy one with a green marker.
Double check the light, center, and heavy dimes.
ScaleGame_Step_1.jpg

Now, weigh each of them to find those that are just a tiny bit heavier or lighter than your temporary standard (Black Dot).
There's really no way they all weigh the same as the standard. Keep going through the process until you just plain give up :)
When you are confident you have found ALL the heavies and ALL the lighties, go ahead and mark them Red, or Green.
Those that are the same as the Black one, DON'T MARK.
My little scale counts by 0.005 carats, which is equal to 0.001 grams, which is equal to 0.0154 grains.
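Those count-size conversions can be double-checked in a couple of lines. The two constants are the defined values of the metric carat (exactly 0.2 g) and the grain (exactly 0.06479891 g):

```python
# Checking the count-size conversion: 0.005 carat counts, in grams and grains.

CARAT_TO_GRAM = 0.2              # 1 metric carat is defined as exactly 0.2 g
GRAM_TO_GRAIN = 1 / 0.06479891   # 1 grain is defined as exactly 0.06479891 g

count_ct = 0.005
count_g  = count_ct * CARAT_TO_GRAM
count_gr = count_g * GRAM_TO_GRAIN

print(count_g)                # 0.001 g, one milligram per count
print(round(count_gr, 4))     # about 0.0154 grains per count
```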

ScaleGame_Step_2.jpg

I ended up with my Temporary Standard showing 0.005 carats drift. Dang, a whole milligram.
Maybe one more pass through the group just to be sure.
When done, you still won't know the true weight of the dimes, just which ones are a little heavy or a little light.

I'll come back later and assign a best guess weight to the Black Dot dime.
Ran through them again and have 3 light and 5 heavy.
Couple more times and I'll be sure :)
 
Could the accuracy of a scale be measured on a standard deviation bell curve? Example: a scale can measure up to 1000 grains. Is there a sweet spot, say halfway at 500 gr? Also, on the same scale, would it measure 1 gr going from zero to 1 accurately, or 500 to 501 more accurately? Yeah, I know: beam, strain, magnetic. Just trying to understand operation in general. I've done some reading on this but it's way not layman enough for my puny brain. Mike

Q1 - Absolutely. Certified laboratory scales are calibrated, and accuracy is defined by an expression of measurement uncertainty: a deviation from a known mass, plus or minus two or three standard deviations. The uncertainty of the calibration standard is included among all the other contributors to the final expression.

Q2 - Yes, there can be a range within the entire range at which point the measurement is most accurate. It isn't necessarily in the middle.

Q3 - It is not unusual for the degree of accuracy to be different at different points along the entire range of the scale.

Among the inputs to a measurement uncertainty budget are: calibration standard (weights), temperature, atmospheric pressure, repeatability (inherent variation of the scale), reproducibility (operator/method induced variation), and readability (how many digits your output has, if digital).
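Those contributors are typically combined by root-sum-of-squares and then expanded by a coverage factor. A sketch of that budget arithmetic, with invented milligram values (the readability entry uses the common rule that a resolution d contributes roughly d divided by 2 times the square root of 3):

```python
# Sketch of combining uncertainty contributors by root-sum-of-squares, then
# expanding by a coverage factor k=2, the usual pattern for an uncertainty
# budget. The milligram values are invented for illustration.

import math

contributors_mg = {
    "calibration standard": 0.10,
    "repeatability":        0.25,
    "reproducibility":      0.15,
    "readability":          0.29,
}

combined = math.sqrt(sum(u**2 for u in contributors_mg.values()))
expanded = 2 * combined          # k = 2, roughly 95% coverage
print(f"U = +/- {expanded:.2f} mg (k=2)")
```

Note how the biggest contributors dominate: squaring before summing means halving a small term barely moves the total, which is why budgets focus effort on the largest lines.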

Way too deep to get into? I agree, but I got this dumped in my lap on the very first day in a new position at a company I had worked at for years. Apparently my predecessor had completely screwed the pooch on this (among several other things). It took the better part of a year to get my head around it.

For your purposes - assuming your chosen hobby is reloading for accuracy and not statistics and metrology - as long as your scale weighs your check weights consistently, just load and fire. Don't dwell too deeply on it.
 
The scales and balances at primary standards labs, used to produce the standards for calibrating scales, are used to COMPARE the reference standard to the working standards. Those working standards are then used to calibrate test equipment (scales).

Check NIST mass comparator techniques, transposition weighing, or single-pan substitution. These methods eliminate most scale errors by directly comparing the reference to the test article. If the scale reads 10 counts high for the standard, you would want the scale to read 10 counts high for the test weight as well. Often the process is repeated. You would then proceed to test sensitivity at that load. I spent many a night swinging weights, recording turning points, and calculating measurement uncertainty for Working Standards and Test Weights sent out to calibrate scales.
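The single-pan substitution idea reduces the scale to a comparator: any span or offset error common to both readings drops out of the difference. A sketch with invented readings:

```python
# Sketch of single-pan substitution weighing. The scale is used only as a
# comparator, so errors common to both readings cancel. Readings invented.

def substitution_value(reading_std, reading_test, standard_true):
    """Value of the test weight = standard + (test reading - standard reading)."""
    return standard_true + (reading_test - reading_std)

# Suppose the scale reads 10 counts (0.010 g) high in this region:
std_reading  = 100.010   # known 100.000 g standard indicates 100.010
test_reading = 100.012   # unknown weight indicates 100.012

print(round(substitution_value(std_reading, test_reading, 100.000), 3))  # 100.002 g
```

The 0.010 g scale error never enters the result; only the 0.002 g difference between the two readings does, which is exactly the point of the method.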

https://www.nist.gov/pml/weights-and-measures/laboratory-metrology/standard-operating-procedures


Now, for the purpose of checking various error components of a scale used for weighing charges, bullets, and cases: a full scale calibration weight at the value programmed into the hardware of a digital balance, plus a check weight near your desired target, is the minimum needed.
The nickel test (or the dime test :) ) can produce 10 nearly equal steps to measure linearity and repeatability of the scale. The weights are not precisely known, just that they approach the ability of the scale to discern a difference in indicated weight
(find light and heavy) throughout its range. It also gives you practice using the scale.

With most digital scales rolling over to the next count somewhere in the middle of a count, you will get an occasional (if the scale is stable and repeatable) reading one count high or one count low. Whenever you have the spare time, you can repeat the weighing series and statistically improve your guesses for light and heavy.
You can even work them (remember the 600 grit) to a narrower spread in weights, leaving at least one slightly heavier and one slightly lighter. I'm gonna check the piggy bank for a few more dimes, as the light ones I picked are several counts low.
I want at least one dime that is one or two counts low and one that is one or two counts high.
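The repeat-and-average idea can be sketched: simulate ten indicated differences between a dime and the temporary standard that flip by one display count, and let the mean make the light/same/heavy call. All the numbers are invented:

```python
# Sketch: repeat weighings of a dime against the temporary standard and use
# the mean difference to call it light, same, or heavy. With a scale that
# rolls over mid-count, single readings flip by one count; averaging many
# readings sharpens the call. Simulated readings, for illustration only.

from statistics import mean, stdev

count = 0.0154   # one display count in grains (a 0.001 g scale)

# Ten indicated differences (dime minus standard), flipping between counts:
diffs = [0.0, count, 0.0, count, count, 0.0, count, 0.0, count, count]

avg = mean(diffs)
print(f"mean diff = {avg:.4f} gr, sd = {stdev(diffs):.4f} gr")

verdict = "heavy" if avg > count / 2 else "light" if avg < -count / 2 else "same"
print(verdict)
```

Here the dime reads high on 6 of 10 tries, so the mean sits above half a count and the call is "heavy", even though no single reading could resolve that.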

Sensitivity at low and high loads still needs to be checked.

Of course,
You would be better served by BUYING a 1,2,2,5,10 - - set of appropriate accuracy, and taking very good care of them. They will not be cheap, and they don't come from China. If stored properly, handled gently, and used infrequently, they will last the careful user for years.
Google weight tolerance charts.
 
@Howland,
I don't want to sound like I am contradicting your scale calibration statements.
Traceable calibrations are NEEDED for some applications.
The primary concern for reloaders is SAMENESS.
The load you wrote down last year will perform the same this year.
A CHECK weight does that for you. Not a full scale calibration.
Even with certified mass standards, you rely on sameness, from the time it was measured to the time you use it. Mass values, at specified environmental conditions, and relative to the density of the standard yield an Apparent Mass value.
Were you in the lab when NIST changed the reference from App Mass Vs Brass to App Mass Vs Stainless?

The typical reloader trusts a full scale calibration way too much. In fact, my cheap digital in the pics, 50 gram full scale, is not really good at weighing powder charges when calibrated with the manufacturer's method. I have to FUDGE it to get it to read accurately at minor loads like 30 grains.
Instead of calibrating at Zero and 50 grams, I calibrate at about 0.1 grams and 50 grams: a 49.9 gram full scale range.
That brings the 2 to 5 gram readings a little closer to the true value, but makes the higher readings, above about 30 grams, start to show errors.
So what :) I'm never going to throw a charge that high.
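A toy model of why moving the low calibration point can help: assume, purely for illustration, a scale that loses a constant 3 mg to stiction near zero. Calibrating from 0.1 g to full scale spans only the well-behaved region, so low charges come out closer to the truth than with a 0-to-50 g calibration. (In this toy the fudged calibration wins everywhere; on a real, nonlinear scale you get the trade-off described above, with errors reappearing at the high end.)

```python
# Toy illustration of 'fudging' the low calibration point. Invented scale:
# the mechanism loses a constant 3 mg to stiction near zero.

def indicated_raw(true_g):
    return max(0.0, true_g - 0.003)      # 3 mg lost near zero (invented)

def make_cal(lo_true, hi_true):
    """Two-point calibration: a straight line through the two cal points."""
    lo_raw, hi_raw = indicated_raw(lo_true), indicated_raw(hi_true)
    slope = (hi_true - lo_true) / (hi_raw - lo_raw)
    return lambda r: lo_true + (r - lo_raw) * slope

cal_factory = make_cal(0.0, 50.0)    # zero and full scale
cal_fudged  = make_cal(0.1, 50.0)    # 0.1 g and full scale

for true in (2.0, 30.0):
    r = indicated_raw(true)
    print(true, round(cal_factory(r), 4), round(cal_fudged(r), 4))
```

With the factory-style calibration the 2 g reading comes out about 3 mg low, because the zero point was taken inside the sticky region; the fudged calibration's low point sits above it.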

The dimes (cost less than a weight set :) ) can check repeatability, drift, temperature sensitivity, linearity.
They also give some practical experience weighing objects and detecting process errors.
But, they won't determine the true weight of a reading without a known standard.

Tomorrow I'll see what I can show about zero drift and how to defeat it. The cheap scales will throw you a curve with the auto zero capture.

Sensitivity, you know, that last little kernel of powder. How people talk about their scale seeing one kernel of powder is exactly how you determine if your scale is sensitive enough. It only takes a few kernels to test it.
(or some primers :) )
 

Which is mostly why I wrote that. If OP or anyone else reads the first five paragraphs as yada, yada, yada, but pays attention to paragraph six, he's all set to go.

I got lucky in that a friend who drove a garbage truck found a check weight set of 0.01 to 100 grams and gave them to me while I worked in a laboratory doing calibration. If I dug them up with the calibration certificate I wrote and compared them to the grade tolerances, I think they are Grade 4. The 100 gram weight is 1 mg over 100 grams, with an uncertainty of about 1 mg at k=3. I used an analytical scale with a set of calibration weights certified to a degree of accuracy far finer than even our laboratory really needed. Nice find, but they stay on the back of the shelf. I rarely ever pull them out.

I got unlucky in that I'm pretty high on the OCD scale, but then again if I weren't, I would probably just buy premium ammo instead of reloading. I think it's actually a curse.
 
More-Dimes.jpg
OCD? Never heard of it :)
I also have a set I checked myself :)
Might ??? have compared them to E1 (Class S at the time) standards several years ago :)

Raided the piggy bank and found some more dimes. Got rid of the light ones that were 4 or 5 counts light.
Can't do anything but a rough sort this morning.
It's cool outside, and when the HVAC comes on it ramps the temp up, then it drifts back down.
Later on today, maybe. This is costing more than I planned. Up to $1.70 already.
 
Here's something to try with your cheapo digital scale.
Sensitivity at Zero Load (pan only).
Zero the scale, add 1, 2 or 3 kernels of Varget.
Everyone has Varget :)
See if your scale indicates something reasonable.
Try it a few times. Sometimes mine does, sometimes not.
Empty pan and zero. Now weigh something.
A dime, a bullet, a couple primers.
Add 1,2,3 kernels of Varget.
Should show the extra weight.
Try removing the dime, leaving the kernels, lift the pan and set it back down.
My guess is the scale will Auto-Zero with the kernels.

Now check: exactly how sensitive is the scale at different percentages of load?
1 Kernel?, Good enough.
Play with the hidden Auto-Zero function until you understand it.

Zero-Capture.jpg

How can you defeat the Auto-Zero? Don't let the scale go to zero.
 
Everyone has figured out how to fool the scale's auto-zero function, Right?
Here's one way.
Zero scale with your pan. Mine weighs close to 60 grains.
Gently remove the pan and hide a 10 grain weight under the pan.
Now the scale will indicate 10 grains for zero and never go all the way to zero.
You can watch for drift each time you throw a charge. The 10.00 grains will be displayed each time you start a charge.
Or, you could hide a dime under the pan :)
The issue with this might be forgetting your target weight (powder charge) will indicate HIGH.
If you have been loading 24.9 grains of Varget for years, it may seem odd to see 34.9 grains on the scale.
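The bookkeeping for the hidden-weight trick is just a constant offset; a two-line sketch with hypothetical numbers:

```python
# Sketch of the bookkeeping when a hidden weight keeps the scale off zero.
# Everything indicates high by exactly the hidden weight (invented numbers).

HIDDEN_GR = 10.0        # 10 grain weight hidden under the pan

def target_indication(charge_gr):
    """What the display should show for a given true powder charge."""
    return charge_gr + HIDDEN_GR

# Loading 24.9 gr of powder: trickle until the display reads this value.
print(round(target_indication(24.9), 1))   # 34.9
```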

PlayHideTheDime.jpg
Think this one out.

Ok, it's 'bout nap time :)
 
Why don't you just zero your scale with the pan off?
If the pan is of known weight (mine is 174 gr),
the scale reads the pan at all times and never goes to zero; just add the charge weight to the pan total.
For instance, 45 gr gives a total of 219 gr. Re-zero the scale with the full pan.
Now when you dump the charge and replace the pan, the pan still registers, never letting the scale zero.
It's working in reverse, but I have found it helps to keep drift down.
@BoydAllen taught me this trick, and would be able to explain it better.
All that said, I only use my digital to adjust my powder thrower; I weigh every charge on a beam scale.
Got sick and tired of drift.
C O N S I S T E N C Y is key
 
Up from my nap but the wife has occupied the KLAS, so no scale work for a while with the stove heating the place up.
Maybe another small nap?
 
Still have some more to do with the dimes. Maybe tomorrow.
But for slightly heavier jobs, I found a 300,000 count 300 gram scale, and a 3 kilogram digital, for cheap a year or so ago.
This is the rechargeable model, and the static from the plastic shield they supplied was a shocker no matter what I tried.
Unless you are loading 50 cal or 20mm, not for reloading :)
I use them to adjust test weights and ratio up to heavier weights.
You can pick the calibration point.
Here is a quick test of a 100 gram (my good weight) and a pair of 50 gram weights.
I get a count of noise, but a milligram or two at 100 grams is pretty good for $46 each, shipped.
300000milligramScale.jpg



and just for size comparison,
checking two scales at once :)
Two-at-once.jpg
 
I'll post this just for reference.
Weight class tolerances.
MetricWeightTolerance-Image.jpeg
 
Doesn't seem to be a lot of interest in the cheap scales so I'll just play around a little.
Here is that 300 gram scale sitting with 10 grams for a while to see if it drifts a lot.
Left it on with the charger hooked up.
House temperature was pretty stable, heater did not come on all day.
Stability-Test_1.jpg
Kept the 10 gram weight on the scale and weighed those two 50 gram weights from yesterday.

Stability-Test_1b.jpg

Never allowed scale to cross zero after the first Tare in the morning.
 
If you have tested your scale for drift, off zero, with a stable minor load, check for tilt error.
Any movement of the scale or bench can invalidate your calibration.
Best to check accuracy with a Check Weight each time you turn it on,
and recalibrate if you move it.
A good solid platform helps: an 18"x30"x6" granite slab, on a vibration isolation table, out of the way of drafts.
Yup, I got this :)
The picture below shows what can happen with a very little tilt.
Like a playing card thickness on one of those smaller scales.
See if you can detect that with the cute little level on your scale.
The card in the pic is 0.030" thick

Scale-position.jpg

Got up to go pee around 4 AM; the scale was indicating 10.001, and while I was drinking a little water it clicked up to 10.002. Ran to get the camera, but it had already gone back to 10.001, then settled back to 10.000.
SHhhh.jpg
Maybe I created a draft walking past.
Darn, missed it. You'll just have to take my word for it.
It's 10.001 again. Camera in hand, SHhhh, I'm huntin for wabbits :)
 
I said earlier that this 300 gram scale wouldn't be good for normal powder charges.
Great for comparing weights, but for a very slow trickle it doesn't have the sensitivity of a scale with a smaller full scale like 20 g, 30 g, or 50 g.
Most of the time a single kernel of Varget shows up, kicking the display up 1 or sometimes 2 counts.
Sometimes NOT.
Let it sit and it usually will settle to a good value. But that is SLOW.

The four pics below show 2 counts for 1 kernel, 3 counts for three kernels, and then what sometimes happens when the scale gets close to zero:
the auto zero sucks up the 3 counts, gone, and the next weighing will show a new value. Which may or may not be a good thing.

AutoZeroSucks.jpg
 