A fundamental problem in the analysis of data gathered across many scientific and engineering applications is to separate "true" signals, which carry important information, from random noise. Related tasks include estimating how many distinct signals or sources the observed data actually contain, and estimating their properties.
As collected data grow increasingly complex, these tasks become more challenging, rendering earlier solutions highly sub-optimal or even inapplicable.
In 2008, Weizmann scientists developed a new approach to these problems, based on a careful analysis of how signal strength affects the statistical properties of the observed data.
This approach, in turn, led both to a new understanding of which signals can be distinguished from noise and to novel, state-of-the-art algorithms for doing so. These have found use in a variety of image analysis, communication systems, and signal processing applications.
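One common way such signal-versus-noise questions are studied is through the spectrum of the sample covariance matrix: under pure noise, its largest eigenvalue concentrates near a known bulk edge, while a sufficiently strong planted signal pushes an eigenvalue well beyond it. The following sketch is only an illustration of that general phenomenon, not the authors' specific algorithm; the dimensions, signal strength, and rank-one signal model here are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

p, n = 100, 200     # dimension and number of samples (illustrative choices)
gamma = p / n       # aspect ratio

# Pure-noise data: each column is an i.i.d. N(0, I_p) sample.
noise = rng.standard_normal((p, n))

# Data with a planted rank-one signal of (assumed) strength 4:
# column covariance becomes I + strength * u u^T.
strength = 4.0
u = rng.standard_normal(p)
u /= np.linalg.norm(u)
coeffs = rng.standard_normal(n)
signal_data = noise + np.sqrt(strength) * np.outer(u, coeffs)

def top_eigenvalue(X):
    """Largest eigenvalue of the sample covariance matrix X X^T / n."""
    cov = X @ X.T / X.shape[1]
    return np.linalg.eigvalsh(cov)[-1]

# Marchenko-Pastur bulk edge: under pure noise, the top eigenvalue
# concentrates near (1 + sqrt(gamma))^2.
mp_edge = (1 + np.sqrt(gamma)) ** 2

print("noise top eigenvalue: ", top_eigenvalue(noise))        # near mp_edge
print("signal top eigenvalue:", top_eigenvalue(signal_data))  # well above it
print("bulk edge:            ", mp_edge)
```

A signal weaker than a critical strength produces a top eigenvalue indistinguishable from the noise edge, which is the kind of detectability threshold the text alludes to.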