Websurfer
Silver $$ Contributor
Hello all,
Can someone help interpret what is happening with my cartridge velocities?
Some background: I'm a short-range benchrest competitor. Just got a Garmin table-top chronograph, set it up at a match, and was surprised at some of the data from it.
My average velocities increased significantly (well, it seems significant) from the start of the match. Here are the data:
1st target: 3043 FPS average (mean)
2nd target: 3053 FPS
3rd target: 3079 FPS
4th target: 3083 FPS
5th target: 3080 FPS
The lowest velocity recorded was my very first shot, a fouler, at 3013 FPS. The highest velocity was during the 4th target, at 3090 FPS.
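For anyone who wants to poke at the numbers, here's a quick Python sketch using only the figures posted above (the per-target averages plus the low and high shots). It's just a rough look at the drift and spread, not real statistics; that would take the shot-by-shot strings.

    # Per-target averages from the match, in FPS (targets 1 through 5)
    target_avgs = [3043, 3053, 3079, 3083, 3080]

    # Lowest shot (the first-round fouler) and highest shot (during target 4)
    low, high = 3013, 3090

    # Drift in the target averages from first to last target
    print(f"Drift in target averages: {target_avgs[-1] - target_avgs[0]} FPS")   # 37 FPS

    # Extreme spread across all recorded shots
    print(f"Extreme spread, all shots: {high - low} FPS")                        # 77 FPS

    # Mean of the five target averages
    print(f"Mean of target averages: {sum(target_avgs) / len(target_avgs):.1f} FPS")  # 3067.6 FPS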
Temperatures ranged from a low of 74 degrees to a high of 79 degrees. We shoot from covered benches, so there was full-time shade.
All rounds were loaded at the same time: LT30 powder, thrown from a Harrells measure, then weighed on an electronic jeweler's scale, with powder trickled in as needed. Visual review before seating indicated all rounds had equal amounts of powder. Primers are Fed Match, same lot, same box.
Thoughts?
Many years ago, Jackie Schmidt posted a thread in which he expressed surprise at the velocity differences in his rifle (I believe a 6PPC), which shot more or less in the .1's-.2's. He wondered whether the velocity differences made any actual difference on target; if you're getting .1's or .2's, maybe they don't.
Just because there is a measured difference along one line of data doesn't mean that difference is relevant.
Anyway, any thoughts on the seemingly large velocity range my little Garmin recorded? Any ideas why there was such a difference, and why the velocities increased so much over time?
THANKS!