Rana Basheer

Extreme Events and Signal Detection

An idea has been simmering in my mind for some time regarding the statistical nature of signal detection in a wireless receiver. Essentially, I have been wondering whether the noise in the signal strength values measured by a wireless receiver could be modeled as an extreme value distribution.

The genesis of this idea can be traced to the moment I was reading Nassim Nicholas Taleb's famous book The Black Swan, in which Taleb mentions option pricing using extreme value distributions under market disturbance conditions. Before I explain the details of extreme value distributions, let me give you a glimpse into a frustrating period of my student life. From the start of my PhD research, one of my tasks as a graduate research assistant was to run short localization demos for visitors to our lab. If these visitors were sufficiently impressed, they would open their wallets for more research grants, or even a plush job for me after graduation. During these demos I would either be running a localization algorithm that my fellow students developed or my "next greatest method that is bound to revolutionize the field" .. not enough hyperbole? After working days and nights tuning the system to perfection while dreaming of ticker tape parades and accolades, I would be brought back to earth rudely and violently, in an Icarus sort of tragedy, when my delicate tapestry failed spectacularly in front of these visitors. All of a sudden, I would be scrambling to explain why my tracked target, overlaid on an aerial map of our lab, was now moving through walls as if it were the reincarnation of Agent Smith out to nab Neo in the cult movie The Matrix.

I figured that the problem was with the way the signal strength variations were modeled. Essentially, a localization algorithm is attempting to classify a signal strength variation as either a legitimate movement of the transmitter or a wireless propagation effect called signal fading. The best way to do this is to use the Probability Density Function (PDF) of the signal variation under fading. Knowing the PDF of a noise source such as fading lets you attach a probability to each signal strength fluctuation. So if you observe a sudden dip in signal strength and your fading PDF tells you that such a variation is quite likely under fading alone, then you do not update the transmitter's position on the map.
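
Here is a minimal sketch of that gating idea, assuming, purely for illustration, that the profiled fading PDF at one spot is Gaussian on the dB scale; the mean, spread, and 5% cut-off are made-up numbers, not measured ones.

```python
from scipy.stats import norm

# Hypothetical profiled fading model for one spot: signal strength in dBm
# fluctuates around a mean with some spread (values are illustrative only).
MEAN_DBM, SPREAD_DB = -60.0, 6.0

def explained_by_fading(rss_dbm, p_cut=0.05):
    """True if the observed reading is plausible under the fading PDF alone,
    in which case the tracker should NOT move the target on the map."""
    # Two-sided tail probability of this reading under the assumed fading PDF.
    tail = 2.0 * min(norm.cdf(rss_dbm, MEAN_DBM, SPREAD_DB),
                     norm.sf(rss_dbm, MEAN_DBM, SPREAD_DB))
    return tail > p_cut

print(explained_by_fading(-68.0))   # a dip well within fading -> hold position
print(explained_by_fading(-85.0))   # far too unlikely -> treat as real movement
```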

Though this sounds simple and straightforward, the reality is quite complicated. Skim through any introductory statistical signal analysis textbook and it is replete with discussions of Rayleigh, Rician, Nakagami, and a host of other hard-to-pronounce names used as PDFs for wireless signal strength. On reading the fine print you will realize that each of these PDFs works only in a specific environment. For the Rayleigh distribution, the wireless signal has to be completely diffused, i.e., there should be no clear line-of-sight path between receiver and transmitter; for the Rician distribution you need a well-defined line-of-sight component; the Nakagami distribution requires the Signal-to-Noise Ratio (SNR) of the wireless signal to be above some threshold; and so on. Unless you have done extensive ray tracing of your target localization environment, you have no prior idea of the fading environment that your receiver and transmitter are going to experience. Additionally, if people are moving around, then all bets are off about the fading environment you painstakingly figured out using ray tracing. In short, you are at the mercy of the environmental requirements set by the PDF you decided to use for your localization algorithm. This is when every localization algorithm takes out its proverbial "Procrustean Bed": the infamous heuristic fudge factors that massage values or throw away samples that don't satisfy the arbitrary standards set by your pet PDF.

Procrustean bed of wireless algorithms
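
As a quick illustration of the Rayleigh/Rician distinction mentioned above, the snippet below synthesizes a purely diffuse (Rayleigh) envelope and then adds a fixed line-of-sight path to make it Rician; all amplitudes are arbitrary illustration values, not measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Diffuse multipath: the sum of many scattered paths behaves like a complex
# Gaussian, whose envelope is Rayleigh distributed (no line-of-sight path).
scatter = (rng.normal(0, 1, N) + 1j * rng.normal(0, 1, N)) / np.sqrt(2)
rayleigh_env = np.abs(scatter)

# The same scattered field plus a fixed line-of-sight path of amplitude A
# gives a Rician envelope; A controls the Rician K-factor.
A = 2.0
rician_env = np.abs(A + scatter)

print(rayleigh_env.mean(), rician_env.mean())
```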

A typical modern-day digital spread spectrum radio receiver, such as those in GPS, WiFi, or ZigBee radios, has a down-converter that translates the high-frequency carrier to a low-frequency baseband signal, which is then converted to digital samples using a high-speed Analog-to-Digital Converter (ADC). To detect a signal of interest in these digital samples, these devices use a correlation process whereby the signal pattern of interest is time shifted and multiplied with the incoming digital samples. When an exact match occurs, the result is much larger than for non-matched samples (the gain depends on the cross-correlation properties of the signal pattern of interest). So, to separate noise from a detection event, these receivers employ a threshold: any correlation result above the threshold is treated as a detection event, and anything below it is treated as noise. My reasoning for treating wireless signal strength as an extreme value distribution stems from this use of thresholds for detection. Now let me give a brief introduction to extreme value distributions.
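
The snippet below is an illustrative sketch of that slide-correlate-and-threshold detection, not any particular receiver's implementation; the chip sequence, signal offset, and the 4-sigma threshold are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known spreading code (the "signal pattern of interest"): a random +/-1 chip
# sequence standing in for a real PN code.
code = rng.choice([-1.0, 1.0], size=64)

# Received baseband samples: mostly noise, with one copy of the code buried
# at offset 300.
rx = rng.normal(0.0, 1.0, size=1000)
rx[300:300 + code.size] += code

# Slide the code across the samples and sum the products; a matching
# alignment adds up coherently and towers over the noise-only offsets.
corr = np.correlate(rx, code, mode="valid")

# Threshold: declare a detection wherever the correlator output exceeds a
# multiple of its overall standard deviation (an arbitrary 4x here).
threshold = 4.0 * corr.std()
detections = np.flatnonzero(corr > threshold)
print(detections)   # expected to report offset 300
```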

Extreme value theory is based on the Fisher–Tippett–Gnedenko theorem, which states that if we take independent samples from essentially any underlying distribution and keep only the maximum (or, symmetrically, the minimum) of each block of samples, then the distribution of those extremes, if it converges at all, converges to one of three families: the Type I Gumbel distribution, the Type II Fréchet distribution, or the Type III Weibull distribution. All three can be grouped under a single family called the Generalized Extreme Value (GEV) distribution, and the closely related peaks-over-threshold view, which keeps only the samples that exceed a threshold (exactly what a detection threshold produces), leads to the same shape parameter.

The GEV has three parameters: the location parameter μ, the scale parameter σ, and the shape parameter ξ. The first two (μ, σ) are akin to the mean and variance of the signal strength at a location in the lab, whereas the third (ξ) is related to an indicator I developed to flag the line-of-sight condition through the kurtosis of the GEV, whose moments are built from terms of the form Γ(1 − nξ). Here (x, y) are the 2D Cartesian coordinates of the transmitter location at which μ, σ, and ξ are profiled, and Γ(·) is the Gamma function. The details about the moments of the GEV can be found in:

Stedinger, J. R., R. M. Vogel, and E. Foufoula-Georgiou, Frequency analysis of extreme events, in Handbook of Applied Hydrology, edited by D. A. Maidment, chap. 18, pp. 18-1–18-66, McGraw-Hill, New York, 1993.
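
Since the shape parameter enters through the kurtosis, a quick way to see the connection is the textbook expression for the GEV's excess kurtosis in terms of g_n = Γ(1 − nξ); this is the standard moment result, not the line-of-sight indicator itself.

```python
from math import gamma

def gev_excess_kurtosis(xi):
    """Excess kurtosis of the GEV, finite only for shape xi < 1/4."""
    if xi == 0.0:
        return 2.4  # Gumbel limit
    # g_n = Gamma(1 - n*xi), the building block of the GEV moments.
    g1, g2, g3, g4 = (gamma(1 - n * xi) for n in (1, 2, 3, 4))
    return (g4 - 4 * g3 * g1 + 6 * g2 * g1**2 - 3 * g1**4) / (g2 - g1**2) ** 2 - 3

for xi in (-0.2, -0.1, 0.1, 0.2):
    print(f"xi = {xi:+.2f} -> excess kurtosis = {gev_excess_kurtosis(xi):.2f}")
```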

The advantage of treating signal strength as an extreme value distribution is that the GEV is agnostic to whatever underlying distribution produces the noise in the signal strength values. Additionally, by capturing just these three parameters (μ, σ, ξ) at each point in my lab during radio profiling, the mean, variance, and line-of-sight condition of the area are all captured. The localization algorithm no longer has to worry about picking the best distribution to account for the line-of-sight condition; it just has to concentrate on the best way to estimate these GEV parameters in real time. I hope I can put this method to use in my next paper on localization.
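
As a closing sketch, here is one hedged way the per-location profiling could look: fit the three GEV parameters to the peak correlator outputs collected at a single spot using scipy's genextreme (whose shape convention is c = −ξ). The burst data below are synthetic stand-ins, not real measurements.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)

# Hypothetical profiling data for one (x, y) spot in the lab: keep only the
# peak correlator output of each short burst, i.e. one extreme per burst.
bursts = rng.normal(0.0, 1.0, size=(500, 128))   # synthetic stand-in samples
peaks = bursts.max(axis=1)

# Fit the three GEV parameters for this spot (scipy's shape c equals -xi).
c, loc, scale = genextreme.fit(peaks)
print(f"location = {loc:.3f}, scale = {scale:.3f}, shape xi = {-c:.3f}")
```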
