An infrared camera, also known as a thermal imaging camera, detects infrared radiation (heat) emitted by objects and converts it into an electronic signal to produce a thermal image. Unlike visible light cameras, infrared cameras can capture images in complete darkness and through smoke, fog, or other obscurants.
Infrared cameras operate on the principle that all objects above absolute zero emit infrared radiation as a function of their temperature. The camera's sensor, typically made of materials such as indium antimonide (InSb) or vanadium oxide (VOx), detects this radiation. The sensor is sensitive to wavelengths in the infrared spectrum, usually between 3 and 14 micrometers.
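The temperature dependence described above can be sketched with the Stefan-Boltzmann law, which gives the total power a surface radiates per unit area. This is a minimal illustration of the physics, not camera firmware; the emissivity value and example temperatures are assumptions chosen for illustration.

```python
# Stefan-Boltzmann law: radiant exitance M = emissivity * sigma * T^4.
# Hotter surfaces radiate steeply more power, which is the contrast a
# thermal sensor detects. Emissivity 0.95 is an illustrative assumption.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiant_exitance(temp_kelvin: float, emissivity: float = 0.95) -> float:
    """Power radiated per unit area (W/m^2) by a surface at temp_kelvin."""
    return emissivity * SIGMA * temp_kelvin ** 4

# A surface near skin temperature radiates noticeably more than one at
# 0 degrees C, even though both are far below "glowing hot".
print(radiant_exitance(310.15))  # ~skin temperature
print(radiant_exitance(273.15))  # 0 degrees C
```

Because exitance scales with the fourth power of temperature, even small temperature differences produce measurable contrast at the sensor.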
The camera's lens focuses the infrared radiation onto the sensor, which converts it into an electrical signal. This signal is processed into a visual representation, known as a thermogram, in which different temperatures are rendered as varying colors or shades of gray. In a typical false-color palette, warmer areas appear red, orange, or yellow, while cooler areas appear blue or purple.
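The processing step above can be sketched as a simple normalization: each pixel's temperature is scaled across the frame and mapped to an 8-bit gray level, hotter pixels appearing brighter. The 2x2 frame is made-up data, and real cameras use calibrated radiometric conversion rather than this bare min-max scaling.

```python
# Minimal sketch of thermogram rendering: normalize a frame of pixel
# temperatures to the 0-255 range (hottest pixel -> 255, coolest -> 0).

def to_grayscale(frame):
    """Map a 2-D list of temperatures (deg C) to 8-bit gray levels."""
    flat = [t for row in frame for t in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0  # avoid division by zero on a uniform frame
    return [[round(255 * (t - lo) / span) for t in row] for row in frame]

frame = [[20.0, 25.0],
         [30.0, 40.0]]
print(to_grayscale(frame))  # hottest pixel maps to 255, coolest to 0
```

A false-color palette works the same way, except each normalized value indexes into a color lookup table (e.g. blue through red) instead of a gray ramp.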
Infrared cameras are used in various applications, including building inspections, electrical maintenance, medical diagnostics, and military operations. They help identify heat leaks, electrical faults, inflammation in the human body, and hidden objects or people in low-visibility conditions.
Some advanced infrared cameras also feature image fusion, combining thermal images with visible light images for enhanced detail and context. They may include features like adjustable emissivity settings, temperature measurement tools, and connectivity options for data transfer and analysis.
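An adjustable emissivity setting matters because a camera that assumes a perfect blackbody will under-read the temperature of low-emissivity surfaces such as bare metal. The sketch below shows one simplified form of the correction, derived from the Stefan-Boltzmann law; it deliberately ignores reflected ambient radiation and atmospheric losses, which real cameras also compensate for, so the function name and numbers are illustrative assumptions.

```python
# Simplified emissivity correction: if the camera reports an apparent
# temperature assuming emissivity 1 (blackbody), the true surface
# temperature follows from M = emissivity * sigma * T^4, giving
# T_true = T_apparent / emissivity**0.25. Reflected ambient radiation
# and atmospheric transmission are ignored in this sketch.

def corrected_temp(apparent_kelvin: float, emissivity: float) -> float:
    """Approximate true surface temperature (K) for a given emissivity."""
    return apparent_kelvin / emissivity ** 0.25

# A low-emissivity surface reads cooler than it actually is:
print(corrected_temp(300.0, 0.6))   # higher than the apparent 300 K
print(corrected_temp(300.0, 1.0))   # blackbody: no correction needed
```

This is why building inspectors tape a strip of known-emissivity material onto shiny surfaces before measuring them: it sidesteps the correction entirely.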