On a fairly regular basis, I am sent riflescopes for testing and evaluation. Testing objective factors such as adjustment accuracy, durability, and eye relief is relatively straightforward. Testing optical quality, however, is difficult without sophisticated equipment (which I don't have).
How do you objectively compare and evaluate the subjective qualities of a riflescope? Until recently, the only way for me to evaluate factors such as optical clarity and light transmission was to mount the scope on a rifle and take it (along with a few other rifle/scope combinations) outside in various lighting conditions. At dusk, for example, I could take turns looking through the scopes at a 1951 Air Force resolution target and see which one gave me the best performance in the diminished light. The problem was that by the time I switched rifles and scopes, my brain had "forgotten" exactly what the previous image looked like. Not a perfect system, to say the least.
A few months ago, a box showed up at my door, sent by a well-known gunsmith friend. Inside was a steel bar fitted with four sets of scope rings mounted side by side. The bar allows up to four scopes to be mounted and adjusted so that they all share the same point of aim. The entire package mounts on a camera tripod so that the evaluator (me) can sit comfortably behind the glass for long periods. By moving my head an inch or so in either direction, I can compare scopes on the same plane, focused on the same target, in the same conditions. As dusk falls, I can quickly move back and forth between images, comparing apples to apples. Simple, but very effective.
Nothing groundbreaking here, but it offers a little window into the methods we use to evaluate products.