Let's do an easy high school physics problem: How long does it take for a bullet to drop 1.5 inches?
From a stationary start, d = (1/2)at^2, where
d = distance traveled,
a = acceleration (in this case, the acceleration due to gravity on Earth, g), and
t = time in seconds.
a = g = 32 ft/sec^2 = 384 in/sec^2
Doing a little algebra to solve for t:
t = sqrt(2d/a) = sqrt(3 in / 384 in/sec^2) = 0.088 sec
It takes 88 msec for a bullet (or anything else on Earth) to drop 1.5" from a static start (neglecting friction with the air, a safe assumption given that the vertical component of the bullet's velocity is only a few feet per second).
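As a sanity check, here is the same arithmetic as a short Python sketch, using the same round figure of g = 32 ft/sec^2 = 384 in/sec^2:

    import math

    g = 384.0   # acceleration due to gravity, in/sec^2 (32 ft/sec^2 x 12)
    d = 1.5     # drop from rest, inches

    t = math.sqrt(2 * d / g)                      # t = sqrt(2d/g)
    print(f"time to drop {d} in: {t:.4f} sec")    # ~0.0884 sec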
So, how fast does a bullet have to go to cover the 200 yards (= 600 feet = 7200 inches) between the 100-yard zero and the 300-yard target? It must have an
average speed of 7200 inches / 0.088 sec = 81,818 in/sec = 6,818 ft/sec. Uh oh.
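The same division in Python, using the rounded drop time of 0.088 sec from above:

    distance_in = 200 * 36    # 200 yards = 7200 inches
    t = 0.088                 # drop time from the previous step, sec

    v_avg = distance_in / t   # required average speed, in/sec
    print(f"{v_avg:,.0f} in/sec = {v_avg / 12:,.0f} ft/sec")   # ~81,818 in/sec, ~6,818 ft/sec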
In real life, the bullet already has some downward velocity when it passes through the 100-yard zero range, so it takes even less than 0.088 sec to fall another 1.5 inches, but you get the idea.
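To see how much that initial downward velocity matters, solve the quadratic d = v0*t + (1/2)g*t^2 for t. The sketch below uses an assumed, purely illustrative downward velocity of 1 ft/sec at the 100-yard mark; the real value depends on the load and the time of flight to 100 yards.

    import math

    def drop_time(d, v0, g=384.0):
        # Time to fall d inches starting with downward velocity v0 (in/sec):
        # positive root of (1/2)*g*t^2 + v0*t - d = 0
        return (-v0 + math.sqrt(v0**2 + 2 * g * d)) / g

    print(drop_time(1.5, 0.0))    # 0.0884 sec from rest (matches the figure above)
    print(drop_time(1.5, 12.0))   # 0.0625 sec with an assumed 12 in/sec (1 ft/sec) already downward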
Of course, if the acceleration due to gravity and the atmospheric density are different on your home planet, then maybe the bullet really does drop just 1.5" between 100 and 300 yards.