Is there a point of diminishing returns when it comes to powder weighing?
The smallest increment is the weight of one granule. Granule weights are not identical, so by definition powder charges are not identical even if the scale says they are.
To what degree is it necessary to control powder weight for a particular caliber in a precision discipline like 1000 yard benchrest?
I still don't have a clear understanding of how variation in the weight of primer compound - not the weight of the primer itself - is able to skew results even when powder is dispensed to 1/100 of a grain, but I do believe it contributes to variance even when powder charges are identical.
When do environmentals have a bigger effect than powder charge variation?
I would consider cutting kernels past the point of diminishing returns, LOL. In reality, if one is careful and observant while weighing out the first 5 or 10 charge weights, it is not too difficult to determine where each powder throw "wants" to be with respect to the desired/calculated numerical charge weight. Typically, the difference between where most of the charges "want" to end up and the calculated value will be equal to or less than the average weight of a single kernel, assuming the weighing apparatus is capable of measuring to that level. Because we're often throwing a couple thousand or more kernels per charge, we can therefore let the statistics work in our favor in the following way.
To do this, I weigh out charges for my sighters/foulers first, using a Mettler Toledo analytical balance set to .0001 g readability, paying close attention to the range (hi/lo) and approximate median charge weight. I do this to begin making a decision about how tight I can hold the charge weight tolerances. The balance I use can actually be set to .00001 g readability, but trying to weigh powder down to that fine a precision would be ridiculous, maddening, and would probably end up causing me to take a hammer to my wonderful balance. That would be less than 1/100th of a kernel readability. I really like my balance, so I don't want to do that. Anyhow, after watching charge weights carefully while preparing sighters/foulers, it is really not too difficult to keep the overall range to as low as +/- half a kernel or so, certainly within +/- one kernel once I start weighing powder for my test rounds, or rounds to be used for record shots in a match. I've worked in a laboratory all of my adult life, weighing milligram quantities typically many times per day, so weighing powder this way is really second nature for me at this point.
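To put rough numbers on the kernel-counting arithmetic above, here's a quick back-of-the-envelope sketch in Python. The charge weight and per-kernel weight are illustrative assumptions (typical of an extruded powder), not a specific load:

```python
# Illustrative arithmetic only - charge and kernel weights are assumptions.
GRAINS_PER_GRAM = 15.4324  # 1 gram is approximately 15.4324 grains

charge_gr = 42.0   # hypothetical charge weight, in grains
kernel_gr = 0.02   # rough weight of one extruded kernel, in grains

# A couple thousand kernels per charge, as described above
kernels_per_charge = charge_gr / kernel_gr
print(f"kernels per charge: ~{kernels_per_charge:.0f}")

# Balance readability of 0.00001 g, expressed as a fraction of one kernel
readability_gr = 0.00001 * GRAINS_PER_GRAM
print(f"readability: {readability_gr:.5f} gr = "
      f"{readability_gr / kernel_gr:.4f} of a kernel")
```

With these assumed numbers, a 42 gr charge is about 2100 kernels, and 0.00001 g readability works out to well under 1/100th of a kernel, which is why weighing to that setting would be pointless.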
The most important question is whether this is actually necessary for what I do. I would argue that it is not. We're talking about a charge weight variance that calculates out to a theoretical velocity variance of well under 1 fps, so no, I can't shoot that, not even close. Nonetheless, weighing powder to that level of precision isn't that difficult or time-consuming with the right setup, so it's not as though doing it this way makes the process far more painful for me than it would be if I settled for a much larger precision increment. To each their own. What I know is that I never, ever, ever have to worry about charge weight variance in my loaded rounds at a match, and when I see unacceptable velocity variance during a load workup, I know it isn't due to excessive charge weight variance. In other words, charge weight effectively ceases to be a "variable". Those things are worth a lot to me, so I choose to do it that way. Only the individual can decide for themselves whether doing any step in the reloading process to some particular level is worth the amount of time and effort it requires. But making an informed decision about whether some particular method is worth the time/cost usually requires some kind of estimate, such as correlating a certain amount of charge weight variance to a certain amount of velocity variance. These kinds of estimates are probably routine to anyone who works in a research laboratory, but they may not be so obvious to others. So I try to provide these detailed [and probably excruciating to some, LOL] explanations of how I do things, to possibly help others decide for themselves what route they may wish to take.
In general, I would think that many reloaders would be perfectly fine weighing powder to about +/- 0.1 gr. Depending on the cartridge/powder, that would likely keep the predicted velocity variance to somewhere around +/- 10 fps, or slightly less. Further, it's neither difficult nor expensive to do. That kind of precision increment could easily be achieved with a number of different weighing options - most importantly, options that don't cost $600-$700 or more.
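The charge-to-velocity estimate is just a linear scaling. Here's a minimal sketch; the sensitivity figure (fps per grain) is an assumption you'd actually measure with a chronograph during load workup, and ~100 fps/gr is merely plausible for many mid-size cartridges:

```python
# Sensitivity is an assumed value - measure your own slope from chronograph
# data taken across a range of charge weights during load development.
sensitivity_fps_per_gr = 100.0  # velocity change per grain of powder

def predicted_velocity_spread(charge_tolerance_gr: float) -> float:
    """Predicted +/- velocity spread (fps) for a +/- charge tolerance (grains)."""
    return sensitivity_fps_per_gr * charge_tolerance_gr

# +/- 0.1 gr tolerance -> roughly +/- 10 fps
print(f"{predicted_velocity_spread(0.1):.1f} fps")
# +/- half a kernel (~0.01 gr) -> on the order of 1 fps or less
print(f"{predicted_velocity_spread(0.01):.1f} fps")
```

The same one-line calculation is how one correlates a given charge weight variance to a predicted velocity variance before deciding whether tighter weighing is worth the effort.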
If one wanted to go to the next level of precision in powder weighing, something like, let's say, a +/- .02 gr charge weight range, there may be a few less expensive weighing options that simply can't reliably hold such tolerances. So it may cost a bit more, but you might also be getting better reliability, features, and/or ease of use, as well as the tighter tolerance.
If one really wants to take it to the limit, such as weighing powder to +/- half a kernel or better, we're really talking about a limited number of balances that can reliably achieve that, and they're not going to be cheap. This is where you're really going to want a magnetic force restoration (MFR) analytical laboratory balance, which will cost anywhere from maybe $600-$700 to well over $1000, depending on the brand and features selected. The increase in cost to go to this level means one had better think pretty carefully about what they're going to achieve by doing it, and whether it is worth it to them. Finally, it is not difficult to find good information online that explains what many of the various features and tolerances associated with precision weighing equipment really mean in practical terms; for example, readability, accuracy, linearity, etc. If someone is going to spend their hard-earned money on weighing equipment, it only makes sense that they take a little bit of time to learn something about that equipment.
As to where any of these tolerances fit into the grand scheme of things, I'm going to let people figure that out for themselves. It's simply not that difficult. For example, moderate to medium wind conditions in a typical F-Class match at 600 yd might be worth as much as 2 to 3 MOA of deflection, or perhaps even a bit more. Wind is usually by far the largest source of error. So one needs to figure out how wind deflection might compare to something like the difference between a load that reliably shoots 0.25-0.3 MOA at 100 yd, and one that reliably shoots 0.15-0.2 MOA. Over the long strings of fire we shoot in F-Class, the fact is that you're probably not going to notice any difference between those two loads in terms of score when the wind comes up. In other words, the wind is likely to be a far greater source of error than piddling little differences in group size during load development at short distance. Along the same line, one needs to have some estimate of the level to which they can reliably make wind calls. All these little estimates and fair/reasonable assessments of our skill level and shooting equipment are essential to determine with some degree of certainty, or at the very least to make an educated guess, as to what our limiting sources of error really are. In order to minimize a limiting source of error, one must identify it first. So having a good toolkit for estimating and comparing potential sources of error is something every shooter should be thinking about.
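One standard way to do that comparison is to treat independent error sources as adding in quadrature (root-sum-square). Here's a small sketch using the illustrative wind and group-size figures from the paragraphs above; the specific numbers are for demonstration, not measured data:

```python
import math

MOA_INCH_PER_100YD = 1.047  # 1 MOA subtends about 1.047 inches per 100 yards

def moa_to_inches(moa: float, yards: float) -> float:
    """Convert an angular error in MOA to inches at a given distance."""
    return moa * MOA_INCH_PER_100YD * yards / 100.0

def combined_error_moa(*sources_moa: float) -> float:
    """Root-sum-square of independent error sources, in MOA."""
    return math.sqrt(sum(s * s for s in sources_moa))

wind_moa = 2.5  # assumed mid-range wind error at 600 yd, per the text
# Compare the two hypothetical loads: ~0.275 MOA vs ~0.175 MOA groups
for group_moa in (0.275, 0.175):
    total = combined_error_moa(wind_moa, group_moa)
    print(f"group {group_moa} MOA -> total ~{total:.3f} MOA "
          f"(~{moa_to_inches(total, 600):.1f} in at 600 yd)")
```

Run with these assumed numbers, the two totals differ by only a few hundredths of an MOA, which is the quantitative version of the point above: when wind is the dominant term, shrinking an already-small group-size term barely moves the overall error budget.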