Please clarify what is meant by the statement “The issue associated with trimming bullets all to the same length before pointing is that they will not have the same ogive radius”
If the bullets are formed in the same die, how could they have a different ogive radius (beyond normal manufacturing variation)? The formed bullets' OAL can vary based on the length of the incoming jackets.
In my hands, comparing the extreme longest and shortest bullets within a given Lot#, the primary length difference occurs in the nose region. That means the ogive radii likely are not all the same, because if they were, the meplats of the longest bullets would have to be noticeably smaller in diameter than those of the shortest bullets. In other words, if we were only talking about a difference in arc length rather than ogive radius, the longer bullets would be expected to have a longer, "pointier" nose, with the arc simply extended as compared to the shorter bullets. Except that they don't. This concept is easiest to visualize when comparing the very longest and shortest bullets from a given Lot#. It suggests to me that the ogive radii of the longer bullets in a batch aren't necessarily exactly the same as those of the shorter bullets.
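To make the geometry concrete, here is a minimal sketch (Python) of an idealized tangent ogive. The dimensions are made-up but plausible for a .30-cal bullet, and the 7-caliber ogive radius is an assumption, not any particular maker's spec. It shows that if the ogive radius really were fixed, every bit of extra nose length would have to come out of the meplat diameter:

```python
import math

def meplat_radius(R, r, nose_len):
    """Profile radius of a tangent ogive at axial distance nose_len
    from the top of the bearing surface. R = ogive radius,
    r = bullet radius (caliber / 2). All units in inches."""
    return math.sqrt(R**2 - nose_len**2) - (R - r)

r = 0.308 / 2   # .30-cal bullet radius
R = 7 * 0.308   # assumed 7-caliber tangent ogive radius

for nose in (0.715, 0.720, 0.725):   # short vs. long noses, same ogive radius
    m = meplat_radius(R, r, nose)
    print(f"nose {nose:.3f} in -> meplat dia {2*m:.4f} in")
```

In this toy example, each extra 0.005 in of nose length shaves roughly 0.003 to 0.004 in off the meplat diameter. Since the meplats of the longer bullets in a lot don't visibly shrink like that, the ogive radius itself must be varying.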
This notion is also supported by the observation that when I seat bullets from the shortest versus longest length groups within a single Lot# of bullets, I have to change the seating die micrometer to achieve the same CBTO measurement, sometimes by as much as a few thousandths. Further, the caliper insert tool doesn't appear to seat in exactly the same spot relative to the top of the bearing surface with bullets from the longest and shortest length groups. This suggests variance in the diameter of the bullet nose at different points along the ogive for bullets of different nose lengths. One could argue that the primary difference is nose length variance between the critical contact points (i.e. where the seating die stem and caliper insert each contact the bullet ogive) rather than ogive radius/diameter at a given point on the ogive, but I believe both change as bullet nose length changes. I thought this notion worth mentioning because the topic of this thread is seating depth and OAL.
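A companion sketch (same idealized tangent-ogive assumptions, with a hypothetical stem/insert contact diameter of 0.260 in) shows why a small change in ogive radius moves the point where a seating stem or caliper insert contacts the ogive:

```python
import math

def contact_station(R, r, contact_dia):
    """Axial distance from the top of the bearing surface to where a
    tangent ogive of radius R reaches the given contact diameter.
    r = bullet radius; all units in inches."""
    y = contact_dia / 2
    return math.sqrt(R**2 - (R - r + y)**2)

r = 0.308 / 2
for R in (2.156, 2.200):              # two slightly different ogive radii
    x = contact_station(R, r, 0.260)  # hypothetical contact diameter
    print(f"ogive radius {R:.3f} in -> contact at {x:.4f} in")
```

Here a roughly 2% change in ogive radius moves the contact point by about three thousandths, which lines up with the seating die micrometer adjustments described above.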
As far as I can tell, these kinds of bullet length variance within a single Lot# of bullets are due to "normal" manufacturing variance and tolerances. If the manufacturers' machines and dies operated perfectly, there would be little to no bullet dimensional variance at all. Yet we know this is not the case. In any event, these kinds of length variance can affect seating depth as measured by CBTO, and therefore will need to be dealt with if one's reloading OCD happens to extend that far (mine does), which brings me back to the original topic of seating depth and OAL. When attempting to maintain the most consistent seating depth possible, even little things that affect seating depth by a thousandth or two can become additive. Thus, trimming a really long bullet to the same length as a really short bullet before pointing them both is not the best approach, IMO. I don't think the points end up as uniform, and as I've already explained, I believe doing so unnecessarily introduces nose length and/or ogive radius variance. Both of these issues can readily be "tamed" by simply sorting bullets into length groups prior to trimming/pointing.
Now if you want to debate whether these differences are large enough that anyone can easily shoot the difference, that's another discussion entirely. As reloaders, many of us do things routinely where it is questionable whether we can reliably shoot the difference, or at least reliably
quantify the difference. Nonetheless, we do them anyhow, because we
believe they help, and they give us more confidence behind the rifle on the firing line, which is never a bad thing. I am also of the mind that the simplest approach is usually the best; at the very least, it typically requires only minimal effort to determine whether it is effective. For anyone struggling to generate uniform seating depth in their loaded rounds, sorting bullets by OAL
prior to any other modification is a very simple approach that can be tested to determine whether it has a beneficial effect on maintaining uniform seating depth. If it doesn't work in a particular case, one can move to other possible solutions with a minimum of expended time and effort.
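For anyone who wants to try it, sorting by OAL is trivial to script. Here is a minimal sketch (Python) of binning measured bullet lengths into groups before trimming/pointing; the caliper readings and the 0.002 in group width are made-up examples, not a recommendation:

```python
from collections import defaultdict

def sort_by_length(lengths, group_width=0.002):
    """Bin measured bullet OALs (inches) into length groups of the
    given width. Returns {group floor: [lengths in that group]}."""
    groups = defaultdict(list)
    for oal in lengths:
        floor = round(oal // group_width * group_width, 4)
        groups[floor].append(oal)
    return dict(groups)

# Hypothetical caliper readings from one Lot#
measured = [1.2305, 1.2318, 1.2331, 1.2309, 1.2342, 1.2322]
for floor, members in sorted(sort_by_length(measured).items()):
    print(f"{floor:.4f}-{floor + 0.002:.4f} in: {members}")
```

The group width is a judgment call: tighter groups should mean more consistent CBTO, at the cost of more bins to manage.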
Edited to add: for the purpose of this discussion, I just tried a simple experiment. I lined up two bullets from the shortest and longest length groups of a particular Lot#, with the shorter of the two in front so I could visually compare the ogives. The ogive radii were clearly not the same, and it wasn't difficult to see the difference. To my eye, the noticeable difference started about halfway between the top of the bearing surface and the meplat, which would put it in a region that could affect seating depth via the seating die stem contact point. In fairness, a sample size of one isn't very meaningful, but I don't have the motivation right now to compare several bullets from each group. Still, what I observed from this pair of bullets fits with what I have observed many times in the past regarding having to change the seating die micrometer to achieve the same seating depth with bullets from different length groups. YMMV.