A PhD materials engineer I knew once worked on a problem like that. He wound up with an 11x11 matrix for his designed experiment and could not solve the partial differential equation that resulted. No one in the company had that background, so they hired a PhD mathematician, who took about a month to solve it. Once it was solved, the top 4 variables in the system were adjusted to permit sintering a cobalt-chrome alloy to a titanium alloy. The consensus of the industry at the time was that the process was not possible. The result was that if you spent enough time, money, and brainpower, you could make it work. The process was so valuable that it was never patented; it remained a secret proprietary process known to only about 4 people.
Yep, at the heart of the issue is the problem of stacked variability. If you have a variable (case volume) that in isolation could cause a deviation of 3-4 fps, and you add that variability to a broader data set that already has 8-9 fps of deviation from all other factors (primer, powder, bullet, neck tension, etc.), you don't get a resulting SD of 11-13 fps. You only see an increase in SD of maybe 0.5 fps. Independent sources of variation add in quadrature (root-sum-of-squares), not linearly; that's the fundamental math behind stacked variables.
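To put numbers on that, here's a quick sketch of the root-sum-of-squares math using the lower ends of the figures above (8 fps baseline, 3 fps from case volume; both are the illustrative numbers from the post, not measured data):

```python
import math

# Independent error sources add in quadrature (root-sum-of-squares),
# not linearly. Illustrative figures from the discussion above:
sd_other = 8.0  # fps, combined SD from primer, powder, bullet, neck tension...
sd_case = 3.0   # fps, SD attributable to case volume in isolation

sd_total = math.hypot(sd_other, sd_case)  # sqrt(8^2 + 3^2) ~ 8.54 fps
increase = sd_total - sd_other            # ~0.54 fps, not 3 fps

print(f"combined SD: {sd_total:.2f} fps")
print(f"increase over baseline: {increase:.2f} fps")
```

So a 3 fps source stacked onto an 8 fps baseline only moves the total SD by about half an fps, which is why the smaller contributors get swamped.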
You'd need a very large data set and very methodical procedures to demonstrate that 0.5 fps SD improvement, and in the real world that just isn't readily observable or measurable.
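A rough Monte Carlo sketch shows why (the muzzle velocity, SD, and string length below are assumed numbers for illustration): the shot-to-shot noise in an SD estimate from a typical string is several times larger than the 0.5 fps effect you'd be trying to detect.

```python
import random
import statistics

random.seed(1)  # fixed seed so the sketch is repeatable

true_sd = 8.5  # fps, assumed baseline SD
shots = 10     # typical chronograph string length
trials = 1000

# Estimate the SD from many simulated 10-shot strings and see how
# much the estimates themselves bounce around.
estimates = [
    statistics.stdev(random.gauss(2800, true_sd) for _ in range(shots))
    for _ in range(trials)
]

spread = statistics.stdev(estimates)
print(f"SD of the SD estimates across {shots}-shot strings: {spread:.2f} fps")
```

The spread of the estimates comes out around 2 fps, so a real 0.5 fps difference is buried in sampling noise unless you fire far longer strings than anyone normally does.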