Digital Night Vision Has the Same Problem As AI

Every year, we get a little closer to the 'analog night vision' apocalypse - the mythical tipping point where digital night vision surpasses the analog technology we've relied on for the past 30+ years. But is this tipping point a myth? Or is it nearly on our doorstep?

First and foremost, if you're not already familiar with the distinction between "digital" night vision and "analog" night vision, here's a quick review. 

Analog night vision refers to systems built around image intensifiers, which amplify light by converting incoming photons into electrons at a photocathode, multiplying those electrons through a microchannel plate, and converting them back into visible light on a phosphor screen. Analog night vision is the current 'gold standard' for military and professional use, and has been throughout the history of NVGs. 

Digital night vision refers to any system that uses a digital image sensor, not unlike those found in today's digital cameras, tuned to be sensitive to a wider spectrum of light and rendering the image digitally on a separate display. Historically, these devices have had far less sensitivity and struggled to produce high-quality images in the dark, relying much more heavily on supplemental infrared light sources to enhance their capabilities. 

Recently, the crew over at T.Rex Arms created a YouTube video comparing some of the latest civilian-available 'high-end' digital night vision devices to current Gen 3 analog night vision across a variety of factors. 

In the video, we can see a couple of examples where digital night vision looks quite comparable to analog in terms of image quality, and even offers some benefits over analog, such as the ability to render images in color instead of a monochromatic palette. 

At one point in the video, host Isaac Botkin brings up the power consumption of digital night vision. 

This point reminded us of an interesting parallel between the digital-analog night vision universe and the world of artificial intelligence. One of the biggest barriers to much more serious use of artificial intelligence is supplying it with enough energy to carry out major tasks. A telling comparison: while the human brain can carry out complex cognitive tasks using only about 20 watts of power, it is said that AI systems could need billions of watts to do comparable analytical work. So in the battle of artificial intelligence versus humans, the current champion (the human brain) holds the title in large part because of its insane energy efficiency. 
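To put that gap in perspective, here's a quick back-of-envelope sketch using the 20-watt brain figure above and an assumed 1-gigawatt draw for a large AI data center. The data-center number is an illustrative order of magnitude, not a measurement:

```python
# Rough power-efficiency comparison (the 20 W brain figure is from the
# paragraph above; the 1 GW data-center figure is an assumed order of
# magnitude for illustration, not a measurement).

BRAIN_POWER_W = 20              # human brain, commonly cited estimate
AI_DATACENTER_POWER_W = 1e9     # ~1 gigawatt, assumed scale for large AI systems

ratio = AI_DATACENTER_POWER_W / BRAIN_POWER_W
print(f"Brain runs on roughly 1/{ratio:,.0f} of the power")
# -> Brain runs on roughly 1/50,000,000 of the power
```

Under those assumed figures, the brain is doing its work on something like one fifty-millionth of the power.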

The same dynamic is true with digital and analog night vision: both approaches could, theoretically, provide excellent night vision capabilities (though digital sensors still aren't quite on the same level for sensitivity, they continue to develop). However, the power required for digital to match what analog can already do on a single common AA battery is staggering. 
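For a sense of scale, here's a rough sketch of what that single AA battery buys each approach. The power-draw figures are illustrative assumptions (an intensifier tube draws on the order of tens of milliwatts; a digital sensor, processor, and display together draw on the order of watts), not manufacturer specifications:

```python
# Back-of-envelope runtime on a single AA battery. All power-draw figures
# are illustrative assumptions, not manufacturer specifications.

AA_CAPACITY_WH = 2.5 * 1.5      # ~2500 mAh alkaline AA at nominal 1.5 V ≈ 3.75 Wh

ANALOG_DRAW_W = 0.08            # assumed: intensifier tube draws tens of milliwatts
DIGITAL_DRAW_W = 2.0            # assumed: sensor + processing + display draw watts

def runtime_hours(capacity_wh: float, draw_w: float) -> float:
    """Idealized runtime, ignoring conversion losses and low-voltage cutoff."""
    return capacity_wh / draw_w

print(f"Analog:  ~{runtime_hours(AA_CAPACITY_WH, ANALOG_DRAW_W):.0f} hours")   # ~47 hours
print(f"Digital: ~{runtime_hours(AA_CAPACITY_WH, DIGITAL_DRAW_W):.1f} hours")  # ~1.9 hours
```

That rough math tracks with the real world: analog monoculars commonly run for dozens of hours on one AA, while digital units tend to measure runtime in single-digit hours or lean on larger battery packs.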

Although there are many comparisons and valid pros and cons in the discussion of digital versus analog night vision, energy efficiency is one of the most significant barriers to digital dominance. The incumbent analog image intensifier requires so little energy to deliver what is currently a superior image that the adoption challenge for digital is especially steep. 

What does the future hold for night vision? 

 

