I found this comparison interesting....
On a straight-in shot, the allowable margin for error when aiming at the exact ob contact point corresponds directly to the ob's margin of error going into the pocket. For example, an ob on the foot spot of a 7ft Diamond bar box has a 4.5° window for going cleanly into the corner pocket, a +/- 2.25° margin of error from center pocket.
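For anyone who wants to check that window, here's a quick Python sketch. The ~27.6" spot-to-pocket distance and the ~2.2" "clean entry" target width are my own assumptions chosen to reverse-engineer the 4.5° figure, not measured specs:

```python
import math

# Assumed geometry: on a 7 ft table the foot spot sits about 19.5 in from
# both cushions at the foot end, so the straight-line distance to the
# corner pocket is roughly 19.5 * sqrt(2) inches.
distance_in = 19.5 * math.sqrt(2)   # ~27.6 in, ob to pocket

# Treating "cleanly into the pocket" as a ~2.2 in wide target reproduces
# the 4.5 degree window; the true effective target depends on the table
# and on how much rail contact you allow.
target_width_in = 2.2

window_deg = 2 * math.degrees(math.atan((target_width_in / 2) / distance_in))
print(f"angular window: {window_deg:.1f} deg (+/- {window_deg / 2:.2f} deg)")
# -> angular window: 4.6 deg (+/- 2.28 deg)
```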
When measured across the surface of the ob (where the contact point is located), this 4.5° window spans a 2.2mm arc. If the cb strikes the ob anywhere within this arc, the ob will go into the pocket. The same 2.2mm arc applies to cut shots too, but as the cut gets progressively thinner the arc becomes foreshortened: the change in perspective, as viewed from the cb, makes it look smaller. Straight in it looks like 2.2mm, but from a 1/2 ball hit perspective it looks like 1.9mm, and from a thinner 1/4 ball hit perspective it's only 1.5mm. Eventually, as the cut angle approaches 90°, the 2.2mm arc shrinks to nothing from the cb's perspective.
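Those numbers follow from basic sphere geometry: arc = radius × angle, and the apparent width shrinks by roughly cos(cut angle). A sketch, using the standard fraction-to-cut-angle mapping asin(1 - f) from ghost-ball geometry (throw ignored):

```python
import math

R_MM = 57.15 / 2                 # ball radius: 2.25 in = 57.15 mm diameter
window_rad = math.radians(4.5)   # pocketing window from above

# The contact point sits on the sphere's surface, so a 4.5 deg swing in
# the ob's direction sweeps an arc of R * theta across that surface.
arc_mm = R_MM * window_rad
print(f"contact-point arc: {arc_mm:.1f} mm")   # -> 2.2 mm

# Seen from the cb, the arc is foreshortened by about cos(cut angle).
for label, fraction in [("full (straight)", 1.0), ("1/2 ball", 0.5), ("1/4 ball", 0.25)]:
    cut = math.asin(1 - fraction)          # cut angle for that ball fraction
    apparent = arc_mm * math.cos(cut)
    print(f"{label:15s} cut {math.degrees(cut):4.1f} deg -> arc looks like {apparent:.1f} mm")
# full: 2.2 mm, 1/2 ball (30 deg): 1.9 mm, 1/4 ball (48.6 deg): 1.5 mm
```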
So, when trying to reference the contact point, the margin for error shrinks as the cut angle increases.
Compare this to fractional aiming, where the width of the ball (2.25") is used to partition it into quarter, eighth, or sixteenth aiming references. This 2.25" never changes, regardless of cut angle or shot perspective. It's a constant. By ignoring the physical surface of the ball (a sphere) you can simply focus on the diameter of the ball as a flat circle. Doing this doubles your margin of error when it comes to aiming references.
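To see why the reference stays constant: each eighth-ball step moves the aim line a fixed ~7.1mm across the 2.25" face no matter how thin the cut is. A quick sketch of that mapping (again using asin for the cut angle, throw ignored):

```python
import math

D_MM = 57.15   # ball diameter, 2.25 in: the fixed ruler fractional aiming uses

# Each eighth-ball step shifts the cb-center aim line by a constant D/8,
# about 7.1 mm, regardless of cut angle; 8/8 is a full (straight) hit.
for eighths in range(0, 8):
    offset_mm = D_MM * eighths / 8                   # aim offset from ob center
    cut_deg = math.degrees(math.asin(eighths / 8))   # resulting cut angle
    print(f"{8 - eighths}/8 ball hit: offset {offset_mm:5.1f} mm, cut {cut_deg:4.1f} deg")
```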
Fractional aim points have twice the allowable margin for error compared to contact points. A contact-point arc of 2mm means you can be off by no more than 1mm left or right of perfect. The same shot using a fractional aim point allows 4mm, meaning you can be off by as much as 2mm left or right of the needed fractional aim point and still pocket the ball.
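The 2x factor drops out of ghost-ball geometry: the contact point sits one ball radius from the ob center, while the aim point (the cb center at impact) sits a full ball diameter away, so the same angular window buys twice the linear tolerance. A quick check (unrounded; the 2mm/4mm above are these values rounded down):

```python
import math

R_MM = 57.15 / 2                # ball radius in mm
window_rad = math.radians(4.5)  # pocketing window from above

contact_margin = R_MM * window_rad        # tolerance at 1 radius from ob center
aim_margin     = 2 * R_MM * window_rad    # tolerance at 1 diameter (ghost-ball center)

print(f"contact point: {contact_margin:.1f} mm total (+/- {contact_margin / 2:.1f} mm)")
print(f"aim point:     {aim_margin:.1f} mm total (+/- {aim_margin / 2:.1f} mm)")
# -> 2.2 mm vs 4.5 mm: the fractional reference has twice the room
```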
Has anyone seen any information like this in any book or online resource? Just curious.