About my Optic Reviews (Read this First)

Updated: Mar 28



Just about all of the optics reviews I have read (and I have read a lot) include pretty subjective assessments of optical quality. This applies to reviews of riflescopes, binoculars, spotting scopes, and thermals. Our eyes are still the most sophisticated optical instruments on the planet, and probably will be for years to come. The dynamic range of the human eye is about 20 stops (roughly 120 decibels), which means the darkest tones we can perceive at any one time are about one million times darker than the brightest tones in the same scene. Compare this to the most modern and sophisticated camera sensors, like the one in the Nikon Z8, and our eyes still hold roughly a six-stop (64 times) advantage in dynamic range.
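
For the arithmetic-minded, the numbers above are just powers of two: each stop doubles the brightness ratio. A quick Python sketch (the 20-stop and 14-stop figures are rough, commonly cited estimates, not my measurements):

```python
import math

def stops_to_ratio(stops: float) -> float:
    # Each photographic stop doubles the brightest-to-darkest contrast ratio.
    return 2.0 ** stops

def ratio_to_db(ratio: float) -> float:
    # Dynamic range in decibels using the 20*log10 amplitude convention.
    return 20.0 * math.log10(ratio)

eye_stops = 20     # rough estimate for the human eye
sensor_stops = 14  # rough estimate for a modern full-frame sensor

print(f"{stops_to_ratio(eye_stops):,.0f}:1")               # about one million to one
print(f"{ratio_to_db(stops_to_ratio(eye_stops)):.0f} dB")  # about 120 dB
print(f"{stops_to_ratio(eye_stops - sensor_stops):.0f}x")  # the 64x advantage
```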


Our eyes are just as variable as we humans are as individuals. In my opinion, you will find the best optic reviews in birding forums. These guys know their stuff. When newcomers ask what optic they should buy, the common advice in those forums is to "go look through it yourself to see if it is a good fit." The variability of our eyes is exactly why they make this recommendation. As an engineer and data analyst, I cringe when I see optics reviews where the tester uses a panel of people to assess optical quality. Your eyes are different from theirs, and you will see scenes differently when looking through the same optic. Sure, the average person will generally see better through higher-quality optics, but that approach also assumes the panel represents average eyesight. To the credit of hunting reviews, though, objectively assessing optical quality isn't straightforward and requires specialized equipment to do properly. Professional equipment and quality assessment software can run into the tens of thousands of dollars, so I totally get why these reviews are conducted the way they are.


But I do want to provide objective optic reviews, because objectivity allows apples-to-apples comparisons and gives you the data to make informed decisions about your gear. You can then weight the optical quality factors that matter most to you. For instance, I hate distortion and chromatic aberration in my optics, so I might buy an optic that performs better in those areas even if it has a more sensitive eye box. A few major optical and physical assessment criteria cover just about all of the quality metrics we value in our hunting optics:


Optical Quality Assessment Criteria (Detailed descriptions can be found here):

  1. Light Transmission and Phase Shifting

  2. Field Curvature

  3. Distortion

  4. Sharpness

  5. Tonal Response and Color Accuracy

  6. Chromatic Aberration

  7. Coma

  8. Eye Box Position Sensitivity


Physical Quality Assessment Criteria:

  1. Turret Tracking and the Ability to Hold Zero (riflescopes)

  2. Diopter Adjustment

  3. Weather Sealing

  4. Field of View as a function of magnification
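
To make the weighting idea concrete, here is a minimal Python sketch of combining per-criterion scores into one number. The criterion names come from the lists above, but the weights and the 0-10 scores are invented purely for illustration:

```python
# Hypothetical weights reflecting my personal dislike of distortion and
# chromatic aberration; adjust these to match your own priorities.
weights = {
    "distortion": 0.30,
    "chromatic_aberration": 0.30,
    "sharpness": 0.25,
    "eye_box_sensitivity": 0.15,
}

def weighted_score(scores: dict, weights: dict) -> float:
    # Weighted sum of 0-10 scores; weights should sum to 1.0.
    return sum(weights[k] * scores[k] for k in weights)

# Made-up scores for two imaginary optics.
optic_a = {"distortion": 9, "chromatic_aberration": 8, "sharpness": 7, "eye_box_sensitivity": 5}
optic_b = {"distortion": 6, "chromatic_aberration": 6, "sharpness": 9, "eye_box_sensitivity": 9}

print(weighted_score(optic_a, weights))  # optic A scores higher under these weights
print(weighted_score(optic_b, weights))
```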


The physical quality assessment criteria are actually pretty easy to measure objectively. It is the optical quality of our hunting gear that is more difficult to assess. My reviews will include actual measurements of the criteria above. Through some ingenuity, I was able to develop a test setup that wasn't super expensive and that leverages some equipment I already had on hand. Let me tell you a little bit about my process so you understand all the data you will see.


Optical Quality Testing


I needed a few pieces of specialized equipment and software in order to get this done:

  1. Universal Digiscoping Adapter (Baader Microstage II)

  2. Camera (Sony RX100 VII) (This camera is one of the best point-and-shoots on the market. I actually bought it for its high-frame-rate mode, shooting up to 1000 fps)

  3. Off-Axis Parabolic Mirror (OAP Aluminum 90 Degree)

  4. LED Source (Custom Made by Me, I design printed circuit boards)

  5. Laser Cut Apertures (Custom Made by Me)

  6. Imatest Spatial Frequency Response Chart (ISO 12233 eSFR)

  7. Image Assessment Software (MATLAB)


Luckily, I already had the software and camera covered because I needed those for work anyway. I have a full license of the Image Processing Toolbox published by Mathworks (MATLAB). This is a very robust software package which allows you to perform a wide variety of mathematical operations and transforms on images you import into the workspace.


Remember, digital images are just an electronic approximation of the light we see in a scene. Camera sensors typically use an array of red-, green-, and blue-filtered pixels that are excited by the photons striking them. Those excitations (the photoelectric response) are recorded and post-processed into data that can reconstruct an image from the level of excitation (intensity/brightness) in each of the red, green, and blue channels (RGB channels). Optics can distort the incoming light before it reaches those arrays, or our eyes, so what we see through an optic doesn't exactly match what we would see without one. These distortions usually manifest in one or more of the Optical Quality Assessment Criteria I mentioned above. The goal is to understand how the optic distorts the incoming light and by how much. "How much?" is really the focus of my reviews, because it lets you quantitatively compare the optics you are considering for your gear system while weighing optical performance against cost. I said this before in my CPW license allocation article, but Good Data Doesn't Lie. Other reviews you'll surely find on the internet might say something like, "I see a bit of blur and chromatic aberration around the edges of the lens." Sounds pretty qualitative to me. I would probably get a blank stare if I asked, "How much do you see?"
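
To make the RGB-channel idea concrete, here is a tiny Python/NumPy sketch (not the MATLAB workflow I actually use) of an image as an array of per-channel intensities; the pixel values are arbitrary:

```python
import numpy as np

# A tiny 2x2 "image": each pixel stores (R, G, B) intensities from 0 to 255.
img = np.array([
    [[255, 0, 0], [0, 255, 0]],       # a red pixel and a green pixel
    [[0, 0, 255], [128, 128, 128]],   # a blue pixel and a gray pixel
], dtype=np.uint8)

# Splitting out the channels gives three intensity maps.
red, green, blue = img[..., 0], img[..., 1], img[..., 2]

# A simple per-pixel brightness: the mean across the three channels.
brightness = img.mean(axis=2)
print(brightness)
```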


There are two basic test setups to achieve my goal (NOTE: I am leaving out a lot of mundane detail that you probably won't care about, like alignment tolerances, reflectance curves, source emissions, camera settings, and the specific toolbox functions I will use):

  1. The off-axis parabolic (OAP) mirror creates a collimated beam of light (like a flashlight, but better) directed toward the optic and camera. I place the laser-cut apertures near the OAP, after the light from the LED source has been reflected. I take one image with the camera without any test optic and use it as a reference. I then install the test optic, align it with the camera using the digiscope adapter, and take a second photograph with the same field of view as the reference image. This isolates the test optic as the only system variable when I compare aspects of the two images in software. This setup lets me assess light transmission and any wavelength shifts created by the glass or coatings.

  2. The second test setup and procedure are pretty much the same as setup 1 above, but I replace the OAP and light source with the Imatest eSFR chart. MATLAB has a nice set of functions that analyze regions of interest within the eSFR chart. I align the images in software, perform what's called image subtraction, and then use the MATLAB toolbox functions to generate quality metrics for optical criteria 2-7. By the way, this eSFR chart follows an ISO standard, so this is the same type of test performed by professional optical laboratories.
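
As a rough sketch of the comparison in setup 1 (using Python/NumPy here instead of MATLAB, with a made-up 10% light loss standing in for a real optic):

```python
import numpy as np

# Synthetic stand-ins: a reference frame and a "through the optic" frame
# that transmits only 90% of the light (a made-up figure).
reference = np.random.default_rng(0).uniform(0.0, 1.0, (100, 100))
test = reference * 0.9

# Image subtraction isolates what the optic changed.
diff = reference - test
print(f"largest per-pixel loss: {diff.max():.3f}")

# Total light through the optic relative to the reference.
transmission = test.sum() / reference.sum()
print(f"estimated light transmission: {transmission:.0%}")
```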


Here are a few technical bits about the test setup:

  • The OAP mirror I am using is "Enhanced Aluminum" and has a surface roughness of less than 50 Angstroms (0.000000197 inches = pretty damn smooth). More importantly, Edmund Optics was nice enough to characterize its reflectance curve as a function of light wavelength from the source. This OAP mirror does pretty well in the visible spectrum and extends into UV.


  • The LED light source I am using is a "cool" white with a 5700 K color temperature rating. The manufacturer was also nice enough to provide the emission spectrum, so I can map the RGB values of the digital images to actual wavelengths of emitted light.

  • The Imatest edge Spatial Frequency Response chart is the current ISO standard. An image of it is provided below with the areas of interest highlighted within the MATLAB software. This test chart is super high resolution, and I will illuminate it uniformly using some high-speed lighting kits I procured for work (I do a lot of high-speed video work). I'll spare you all the details on why the chart looks the way it does, but it is engineered to give very specific and meaningful results. Most importantly, these results are quantifiable and not subjective.

Results You Can Expect (and Depend On)


I can do all kinds of cool, useful things with these images after I upload them into my MATLAB workspace. I can calculate things like mean-squared error (MSE), which measures the average squared difference between pixels in my test image and pixels in the reference image. Signal-to-noise ratio variations can indicate an intensity loss, meaning the glass in the test optic is reflecting light away from the camera sensor and producing a dimmer image.
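
Both metrics are only a few lines in any language. A Python/NumPy sketch (MATLAB's immse() and psnr() do the equivalent; the arrays here are toy data):

```python
import numpy as np

def mse(reference: np.ndarray, test: np.ndarray) -> float:
    # Average squared per-pixel difference between the two images.
    return float(np.mean((reference.astype(float) - test.astype(float)) ** 2))

def psnr(reference: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    # Peak signal-to-noise ratio in dB; higher means closer to the reference.
    m = mse(reference, test)
    return float("inf") if m == 0 else 10.0 * np.log10(max_val ** 2 / m)

# Toy example: a flat gray frame versus the same frame 10 counts brighter.
reference = np.full((4, 4), 100.0)
shifted = reference + 10.0
print(mse(reference, shifted))              # 100.0
print(round(psnr(reference, shifted), 1))   # about 28 dB
```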


A really useful metric is the Structural Similarity Index (SSIM). Our eyes are really good at perceiving structure in what we see. This tool combines local image structure, luminance, and contrast into a single local quality score. The structures are patterns of pixel intensities, especially among neighboring pixels, after normalizing for luminance and contrast. I will be able to tell if the test optic alters the resulting image locally (at a specific place within the image) and provide a global (entire image) score for how well the test optic allowed light to reach the camera sensor undistorted. You may be familiar with the phenomenon where objects around the periphery of your riflescope glass are a little blurrier than those in the center. This tool will allow me to characterize how much blurrier they are. For instance, look at the two images below:


The image on the right is a blurred version of the image on the left. They differ most along sharp, high-contrast regions such as the edges of the background trellis. Feeding this through the SSIM tool creates the following image, where dark regions indicate a noticeable difference between the blurred image and the reference image. Notice most of the dark regions are local to object outlines within the image, which we expect because the test image was blurred.
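
For the curious, the heart of SSIM fits in a few lines. This is a simplified global version of the standard formula applied to whole grayscale images (MATLAB's ssim() instead computes it locally over a sliding window, which is what produces a map of dark regions); the constants are the defaults from the original SSIM paper:

```python
import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray, L: float = 255.0) -> float:
    # Standard SSIM stabilizing constants (Wang et al. defaults).
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    x, y = x.astype(float), y.astype(float)
    mx, my = x.mean(), y.mean()            # luminance terms
    vx, vy = x.var(), y.var()              # contrast terms
    cov = ((x - mx) * (y - my)).mean()     # structure term
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2)
    )

rng = np.random.default_rng(1)
image = rng.uniform(0.0, 255.0, (32, 32))
print(global_ssim(image, image))          # identical images score 1.0
print(global_ssim(image, image * 0.5))    # a degraded copy scores lower
```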

This type of analysis is very useful for aberrations like distortion, field curvature, sharpness, and coma. There are other analyses I will run, though, to provide a fuller understanding of optical performance. MATLAB allows me to calculate sharpness, chromatic aberration, color accuracy, illumination, and noise directly from the eSFR chart using dedicated functions for each. For instance, I can measure chromatic aberration, which I personally hate seeing in my lenses, for each of the red, green, and blue channels within the image and generate a plot like the following:



Notice the blue channel has a higher intensity than the red and green channels between pixels 100 and 150, and a lower intensity immediately after the roll-off (after pixel 200). This intensity difference contributes to measured chromatic aberration, even though the image used to generate this plot did not have a noticeable amount of color tint at object edges. The aberration is present in the data even when it is not readily discernible to our eye. I can break out the RGB channels in the other optical criteria as well, so we can explain why certain optics make green trees look more vibrant, for example.
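
One simple way to turn a plot like that into a single number is to find where each channel's edge transition sits and report the offsets between channels, which is essentially what a lateral chromatic aberration measurement does. A Python sketch on synthetic edge profiles (the 3-pixel and 2-pixel shifts are invented for illustration):

```python
import numpy as np

def edge_position(profile: np.ndarray) -> float:
    # Index of the first pixel at or above the half-intensity point.
    half = (profile.min() + profile.max()) / 2.0
    return float(np.argmax(profile >= half))

# Synthetic dark-to-bright edge profiles for each channel, one row of pixels.
x = np.arange(300)
green = np.clip((x - 150) * 5, 0, 255)   # reference channel
blue = np.clip((x - 147) * 5, 0, 255)    # blue transitions 3 px early (invented)
red = np.clip((x - 152) * 5, 0, 255)     # red transitions 2 px late (invented)

# Channel misregistration, in pixels, relative to green.
shift_blue = edge_position(blue) - edge_position(green)
shift_red = edge_position(red) - edge_position(green)
print(shift_blue, shift_red)
```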


Concluding Thoughts


I don't want you to worry if all of that sounded a bit too technical. My job is to distill everything I measure into an easily readable format and explain what it means to you. In other words, I will include all the data for those of you who are interested in it, along with what that data means at a high level, so you can make informed decisions without having to wade through the technical jargon.


Upcoming Optical Tests


I own a few optics I consider to be pretty good. I am excited to test them using the setups described above to see exactly how well they stack up against other optics on the market. Upcoming reviews include:

  1. Nightforce NX8 1-8 x 24 F1

  2. Kowa TSN-88A Prominar Spotting Scope

  3. Swarovski EL 10x42 HD Binoculars

  4. Meopta MeoPro Optika 10x42 LR Binoculars

  5. Savage Model 0433B 4X Riflescope (I included this one because it is old (1967) and will probably show significant performance limitations compared to the rest)


I have my work cut out for me, but I hope you find the upcoming reviews of these optics useful when selecting your gear. I also have a few friends willing to loan me their gear to test so we can see how they all stack up. Who knows, maybe I can even get a manufacturer or two to send me some units to review, so long as they understand I will conform to the "good data doesn't lie" principle.


Much more to follow. Many blessings.


Craig.
