Engineers from University Alliance Ruhr have developed novel signal processing methods for radar-based imaging and material characterization. They plan to combine this technique with radar-based localization of objects, with the vision of a flying platform capable of generating a three-dimensional representation of its surroundings. The technology has a number of use cases: for example, it could help firefighters find out what awaits them behind clouds of smoke in a burning building.
Researchers from Ruhr-Universität Bochum (RUB) and the University of Duisburg-Essen (UDE) have joined forces with other institutions to work on developing this technology.
In principle, the same measurement technique should be suitable for material characterization as well as localization; however, it is not yet used for both simultaneously. The measurement principle is as follows: a radar emits electromagnetic waves that are reflected by objects. Broadly speaking, the distance to an object can be calculated from the delay between the transmitted and returning signals.
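To make the delay principle concrete, the following Python sketch converts an assumed round-trip delay into a range estimate. The numbers are illustrative only and are not taken from the researchers' system.

```python
# Minimal sketch: estimating an object's range from a radar echo's round-trip delay.

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_delay(round_trip_delay_s: float) -> float:
    """Distance to a reflecting object, given the echo's round-trip delay.

    The wave travels to the object and back, so the one-way distance
    is half of the total path length covered during the delay.
    """
    return C * round_trip_delay_s / 2.0

if __name__ == "__main__":
    delay = 66.7e-9  # assumed round-trip delay of roughly 66.7 ns
    print(f"Estimated range: {range_from_delay(delay):.2f} m")  # about 10 m
```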
Additionally, the returning waves carry even more information. The strength of the reflected signal is determined by the size of the object, its shape, and its material properties. A material parameter, the so-called relative permittivity, describes a material's response to an electromagnetic field. By calculating the relative permittivity, researchers can thus deduce what kind of material the object is made of.
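The article does not spell out how the permittivity is extracted. As a simplified illustration, the sketch below assumes normal incidence on a thick, lossless dielectric, where the Fresnel reflection coefficient depends only on the relative permittivity and can therefore be inverted; the calibration and inversion in a real radar system are considerably more involved.

```python
import math

def reflection_coefficient(eps_r: float) -> float:
    """Normal-incidence reflection coefficient of a thick, lossless dielectric."""
    n = math.sqrt(eps_r)  # refractive index of the material
    return (1.0 - n) / (1.0 + n)

def permittivity_from_reflection(gamma: float) -> float:
    """Invert the relation above: estimate relative permittivity from the reflection."""
    return ((1.0 - gamma) / (1.0 + gamma)) ** 2

if __name__ == "__main__":
    eps = 2.5  # assumed value, roughly typical for a plastic
    g = reflection_coefficient(eps)
    print(f"Reflection coefficient: {g:.3f}")
    print(f"Recovered permittivity: {permittivity_from_reflection(g):.2f}")  # ~2.50
```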
Converting a radar signal into an informative image incurs considerable computational cost. The recorded data can be compared to those of a camera that lacks a lens for focusing: the focusing is carried out afterwards on a computer. Dr. Jan Barowski developed algorithms for this process during his PhD research at the Institute of Microwave Systems in Bochum, headed by Prof. Dr. Ilona Rolfes. Not only do Barowski's algorithms take care of the focusing, they also eliminate systematic measurement errors from the data. Under controlled lab conditions, the current system can determine the position of an object and establish that it is made of a different material than, for example, the surface on which it lies.
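Barowski's actual algorithms are not described in the article. As an assumed stand-in, the sketch below shows a basic delay-and-sum (back-projection) focusing step, one common way to "focus" radar data in software after recording: each image point accumulates, over all antenna positions, the echo sample whose delay matches that point's round-trip time. The cost grows with the product of image points and antenna positions, which hints at why such processing is computationally expensive.

```python
import numpy as np

C = 299_792_458.0  # speed of light in vacuum, m/s

def backproject(echoes, t_axis, antenna_xs, image_points):
    """Delay-and-sum focusing of radar echoes onto a set of image points.

    echoes:       2D array, one time-domain echo per antenna position
    t_axis:       1D array of sample times belonging to each echo
    antenna_xs:   1D array of antenna x-positions along the aperture (at z = 0)
    image_points: list of (x, z) coordinates to focus on, in metres
    Returns one focused amplitude per image point.
    """
    image = np.zeros(len(image_points))
    for i, (x, z) in enumerate(image_points):
        for k, ax in enumerate(antenna_xs):
            # Round-trip delay from antenna k to this image point and back.
            delay = 2.0 * np.hypot(x - ax, z) / C
            # Pick the echo sample corresponding to that delay and accumulate it.
            image[i] += np.interp(delay, t_axis, echoes[k])
    return image
```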
In the next step, the engineers intend to enable the system to recognize what an object actually is. They can already determine the relative permittivity of synthetic materials, even though they cannot yet distinguish between different synthetics. The project partners intend to gradually optimize the system for use under realistic conditions.
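To illustrate what distinguishing between synthetics might eventually involve, here is a hypothetical lookup that matches a measured relative permittivity against approximate literature values for a few common plastics. The material names and values are rough assumptions for illustration only and are not part of the researchers' system.

```python
# Hypothetical illustration: approximate relative permittivities of common plastics.
CANDIDATE_MATERIALS = {
    "PTFE (Teflon)": 2.1,
    "Polyethylene": 2.3,
    "Polycarbonate": 2.9,
    "PVC": 3.0,
}

def closest_material(measured_eps_r: float) -> str:
    """Return the candidate whose permittivity is closest to the measured value."""
    return min(CANDIDATE_MATERIALS,
               key=lambda m: abs(CANDIDATE_MATERIALS[m] - measured_eps_r))

if __name__ == "__main__":
    print(closest_material(2.25))  # -> "Polyethylene"
```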