A new app from Fraunhofer development engineers looks directly inside objects and displays specific constituents. It has numerous uses: For instance, apples can be scanned for pesticide residues. Applications will be added successively following the Wikipedia principle.
Such scans usually require a special hyperspectral camera: it adjusts to differently colored light each time and ascertains how much of each color’s light is reflected by an object, thus generating a complete spectral fingerprint of the object. The development engineers use a mathematical model to extract just about any information on an object, e.g. its constituents, from its spectral fingerprint. “Since hyperspectral cameras aren’t integrated in smartphones, we simply reversed this principle,” explains Seiffert. “The camera gives us a broadband three-channel sensor – that is, one that scans every wavelength – and the display illuminates the object with differently colored light.” This means that, instead of the camera measuring luminous intensity in different colors, the display successively illuminates the object with a series of different colors for fractions of a second each. Thus, if the display casts only red light on the object, the object can only reflect red light – and the camera can only measure red light. Intelligent analysis algorithms enable the app to compensate for a smartphone’s limited computing power as well as the limited performance of its camera and display.
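The reversed principle described above can be sketched numerically: each display color acts as a known illumination, the broadband camera records one brightness value per color, and the object’s spectral fingerprint is recovered from those few measurements. The illumination spectra and reflectance values below are purely illustrative assumptions, not Fraunhofer’s actual model.

```python
import numpy as np

# Hypothetical sketch of the "reversed" hyperspectral measurement:
# the display emits a sequence of known colors, and a broadband camera
# records a single brightness value for each illumination frame.

# Assumed illumination spectra of the display colors (rows), sampled at
# three coarse wavelength bands (columns). Values are illustrative.
illumination = np.array([
    [1.0, 0.1, 0.0],  # "red" frame: mostly long wavelengths
    [0.1, 1.0, 0.1],  # "green" frame
    [0.0, 0.1, 1.0],  # "blue" frame
    [0.5, 0.5, 0.0],  # "yellow" frame adds an extra constraint
])

# Unknown per-band reflectance of the object (what we want to recover).
true_reflectance = np.array([0.8, 0.3, 0.1])

# The broadband sensor integrates the reflected light over all bands,
# yielding one scalar measurement per illumination frame.
measurements = illumination @ true_reflectance

# Recover the spectral fingerprint by least squares: find the reflectance
# vector that best explains the measured brightness values.
estimate, *_ = np.linalg.lstsq(illumination, measurements, rcond=None)

print(np.round(estimate, 3))
```

In this idealized, noise-free setting the least-squares fit recovers the assumed reflectance exactly; a real app would additionally have to model camera noise, ambient light, and the display’s actual emission spectra.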