I was the one who asked what the difference is between the two. Hell, I don’t know; that’s why I’m asking.
Standard deviation is a statistic that measures one important characteristic of a normal distribution, which is also known as a “bell-shaped curve.” Strictly speaking, an sd can be calculated for any set of numbers, but the assumption that we’re dealing with a normal distribution is what makes the sd interpretable in the ways described below, and velocity data do seem to follow a normal distribution. I’ve tested a number of sets of data from chronographs which did in fact approximate a normal curve (i.e., were not significantly different from normal), so the sd a chronograph reports can reasonably be read on that assumption. (Lest someone jump in to point it out, I’m aware that velocity is a vector quantity, so what we’re really talking about is speed, which is a scalar, but velocity seems to be the term in common use, so that’s what I’m using.)
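For anyone who wants to run that same kind of check on their own chronograph strings, here is a minimal sketch in Python using SciPy’s Shapiro-Wilk normality test. The velocities are invented for illustration, SciPy is assumed to be installed, and this is just one common way to test normality, not necessarily the method I used.

```python
from scipy import stats

# Invented chronograph string (fps), for illustration only.
velocities = [2998, 3012, 3005, 2987, 3021, 3003, 2994, 3010,
              3001, 2990, 3015, 2999, 3008, 2985, 3006]

# Shapiro-Wilk test: the null hypothesis is that the data are normal.
w_stat, p_value = stats.shapiro(velocities)
print(f"W = {w_stat:.3f}, p = {p_value:.3f}")

# A p-value above the usual 0.05 cutoff means the string is NOT
# significantly different from normal.
print("approximately normal" if p_value > 0.05 else "departs from normality")
```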
So the chronograph uses the individual velocity readings to calculate the mean (average) velocity, then looks at how far each individual data point deviates from that mean to calculate the standard deviation, using a formula that is not terribly complicated but which we don’t need to get into. A small standard deviation indicates that the data points are clustered closely around the mean, while a larger sd indicates more spread, i.e., more variability. Because consistency in velocity from shot to shot of a particular load recipe is [presumably] conducive to greater accuracy (precision, really), a smaller sd is always (I think) better. But it’s also important to consider the size of the mean velocity in question: an sd of 20 for a set of low-power pistol loads with a mean velocity of 700 fps actually indicates greater relative variability than an sd of 30 for high-speed rifle loads averaging 3500 fps. The statistic that takes that relationship into account is the “coefficient of variation” (abbreviated CV, and also known as Relative Standard Deviation, RSD), which is simply the standard deviation divided by the mean. In the first (pistol) case the CV is 20/700 = 0.029 (usually given as a percentage, so 2.9%), while in the second (rifle) example it’s 30/3500 = 0.009 (0.9%), so the rifle data are actually more consistent relative to the larger mean.
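To make that arithmetic concrete, here is a short Python sketch (standard library only) that computes the mean, sample sd, and CV for a shot string; the readings are invented for illustration.

```python
from statistics import mean, stdev

# Invented pistol string (fps), for illustration only.
shots = [698, 705, 712, 689, 701, 695, 708, 699, 703, 690]

m = mean(shots)      # mean velocity
sd = stdev(shots)    # sample sd (divides by n - 1)
cv = sd / m          # coefficient of variation

print(f"mean = {m:.1f} fps, sd = {sd:.1f} fps, CV = {cv:.2%}")

# The pistol vs. rifle comparison from the text:
print(f"pistol CV = {20/700:.1%}, rifle CV = {30/3500:.1%}")
```

Note that `stdev` divides by n - 1 (the “sample” sd), which is appropriate when a string of shots is treated as a sample of the load’s behavior rather than the whole population.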
The sd can also be used to determine the percentage of your shots that you can expect to fall within a given velocity range. For any normal distribution, the mean +/- one standard deviation will include about two-thirds of the data points (68%, in fact), +/- two sd will include about 95%, and +/- three sd over 99%. Using the rifle example, with a mean of 3500 fps and an sd of 30, 68% of all shots taken with that load (all other factors being equal) would be expected to fall between 3470 and 3530 fps, 95% between 3440 and 3560, and over 99% between 3410 and 3590. Another way to read the same numbers: only 5% of the shots would be expected to be either slower than 3440 or faster than 3560 fps.
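Those ranges are easy to generate for any mean/sd pair; a quick sketch using the rifle numbers above:

```python
# Rifle example from the text: mean 3500 fps, sd 30 fps.
mean_v, sd = 3500, 30

# The empirical (68-95-99.7) rule for a normal distribution.
for k, pct in [(1, "68%"), (2, "95%"), (3, "99.7%")]:
    print(f"+/- {k} sd ({pct} of shots): {mean_v - k*sd} to {mean_v + k*sd} fps")
```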
The question of how many shots in a data set are necessary for calculation of a “meaningful” sd can be approached by comparing Student’s t-distribution to the normal distribution. The t-distribution is, in effect, a normal distribution that is mathematically adjusted to account for a small sample size (the adjustment depends on the degrees of freedom, which is the number of shots minus one). The size of the adjustment for a given sample size is a measure of the uncertainty relative to having lots of data. In a normal distribution, based on an infinitely large sample, +/- 1.96 sd includes exactly 95% of the data points. If you had only 20 data points (e.g., a string of 20 shots), the t-distribution says you would have to widen that to +/- 2.093 sd to be confident that 95% of the data are included; the range is greater because of the uncertainty arising from the small sample size. That amounts to a difference of (2.093 - 1.96)/1.96 = about 6.8%, which is probably close enough for the average person evaluating loads but probably not, as steveno points out, good enough for a manufacturer. By the time you get down to a 10-shot string, the difference grows to about 15%, and for the typical 5-shot string it is a good bit larger, nearly 42%, which I think makes a pretty good argument that an sd based on a 5-shot string is a suspect value that should be used only with caution, or not at all.
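For anyone who wants to see that inflation at other string lengths, here is a sketch that computes the t multipliers (SciPy assumed):

```python
from scipy import stats

# Two-tailed 95% multiplier for the normal distribution (~1.96).
z = stats.norm.ppf(0.975)

# Student's t multiplier for a string of n shots (df = n - 1) and
# how much wider it is than the normal value.
for n in (5, 10, 20, 100):
    t = stats.t.ppf(0.975, df=n - 1)
    print(f"{n:3d} shots: t = {t:.3f}, wider by {(t - z) / z:.1%}")
```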