
Talk:Additive white Gaussian noise


Untitled


Just a quick note: I think it would be better to use a letter other than n for the noise power, especially since n is used for the number of bits further down in the text; maybe N or σ^2 would be more suitable. --82.130.81.116 (talk) —Preceding undated comment was added at 13:21, 24 July 2008 (UTC)

So, what is i.i.d.? And why is there no introduction? The explanation looks like it could be rewritten to be the introduction. Zylorian (talk) —Preceding undated comment added 21:36, 3 November 2009 (UTC).

It seems like this entire page has been lifted almost word for word from Elements of Information Theory by Cover & Thomas, Chapter 9. Can you add a citation to an entire article? Jeejaw (talk) 06:02, 8 December 2010 (UTC)

I think there should be a description in words of what means before the term is used in the article. 89.242.200.18 (talk) 01:44, 17 December 2015 (UTC)

Relation to Gaussian noise


Question: Is this completely different from Gaussian noise, or just a specific example of the latter? I am no expert; I tried to learn about Gaussian noise and was confused when I realized that there are these two articles with no mention of each other... --Jordan1976 (talk) 17:04, 28 February 2017 (UTC)

It's decidedly different. For example, most kinds of non-white/colored noise, especially when derived from (pretty much any kind of) white noise by linear, time-invariant (LTI) filtering -- the workhorse of digital signal processing -- tend towards Gaussian statistics because of the central limit theorem. Gaussian statistics are the norm, whereas whiteness and even additivity are really rather special cases. As such, the AWGN assumption is decidedly stronger than just the distribution of a signal being Gaussian, as is explained in the article you linked.
Furthermore, there is an unstated assumption in AWGN besides that, which is rarely fully appreciated: AWGN not only states that the noise is spectrally uniform, but also that it is point-to-point fully *independent*. That is a *much* stronger condition in principle than it would at first seem; for instance, pretty much every symmetric cryptographic primitive can be made to produce signals which are white in the spectral sense, yet most if not all of them are, in the strong statistical sense, mostly if not fully dependent from one sample to the next: they have no indeterminism at all, just nice-looking descriptive statistics. If you could crack the primitive, whatever pseudorandom sequence you derived from it would be *fully* sequentially dependent, instead of merely seeming white under a lower-order, less developed analysis. Decoy (talk) 22:16, 10 March 2020 (UTC)
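
A minimal sketch of the coloring point above, assuming Python with NumPy; the filter length, seed, and sample count are arbitrary illustrative choices:

    import numpy as np

    rng = np.random.default_rng(0)

    # Start from *uniform* white noise: independent samples, but not Gaussian.
    white = rng.uniform(-1.0, 1.0, size=100_000)

    # Color it with a crude LTI filter (32-tap moving average).
    kernel = np.ones(32) / np.sqrt(32)
    colored = np.convolve(white, kernel, mode="same")

    def excess_kurtosis(x):
        x = x - x.mean()
        return (x**4).mean() / (x**2).mean() ** 2 - 3.0

    # The output is strongly correlated sample-to-sample (colored), and by
    # the central limit theorem its distribution is close to Gaussian.
    print(np.corrcoef(colored[:-1], colored[1:])[0, 1])  # clearly nonzero
    print(excess_kurtosis(white))    # about -1.2 (uniform distribution)
    print(excess_kurtosis(colored))  # about 0 (near-Gaussian)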

A couple of additions in need of references


I'd very much like to add a couple of notable viewpoints to the article, but I can't really do so because I don't have any references to give for them. Maybe y'all could help?

First, the reason for additivity in AWGN really is that you can use linear algebra to analyze the problem. It's also a sensible assumption in a wide variety of real-world settings, such as telegraph lines and space communication, which obey linear wave equations. (OTOH, in high-power optical and ionospheric propagation applications the linearity does not hold, so it's not just a tautology either.)
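
The additive model is a one-liner to simulate; a minimal sketch assuming Python with NumPy, where the BPSK symbols and noise level are arbitrary illustrative choices:

    import numpy as np

    rng = np.random.default_rng(1)

    # y = x + n: the received signal is the transmitted signal plus noise.
    # Additivity means superposition holds, which is exactly what lets
    # linear-algebra tools be applied directly.
    x = rng.choice([-1.0, 1.0], size=10)           # BPSK symbols
    sigma = 0.5                                    # noise standard deviation
    y = x + sigma * rng.standard_normal(x.shape)

    print(x)
    print(np.sign(y))  # hard decisions; errors become frequent as sigma grows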

Second, the central limit theorem and a couple of connected results, such as Gaussian functions being eigenfunctions of the Fourier transform operator, make Gaussians special. They not only analyze well under the easiest, squared Euclidean norm, but also extremize a number of fundamental results, such as Shannon's noisy-channel coding theorem. On one end, Gaussian noise is easy to analyze; on the other, you can most easily show that it's the hardest kind of noise to combat, decibel for decibel.
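
The "hardest noise" claim can at least be stated concretely through the AWGN capacity formula C = 0.5 log2(1 + S/N); a quick sketch, assuming Python with NumPy and arbitrary SNR values:

    import numpy as np

    # Capacity of the real AWGN channel in bits per channel use:
    #   C = 0.5 * log2(1 + S/N)
    # Among all additive noise distributions of a given power, Gaussian
    # noise minimizes this capacity: the worst case, decibel for decibel.
    def awgn_capacity(snr_linear):
        return 0.5 * np.log2(1.0 + snr_linear)

    for snr_db in (0, 10, 20):
        snr = 10.0 ** (snr_db / 10.0)
        print(f"{snr_db:2d} dB -> {awgn_capacity(snr):.3f} bits/use")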

And third, that whiteness thing. It really does contain the idea of full independence. That should be pointed out, because the abbreviation sure does not. If I'm not wrong, this has also led to a number of nasty homework problems in information theory courses, because it's not too difficult to derive white-yet-serially-dependent processes, which will trip you up if you don't pay attention to the formal mechanics. In fact, those are easiest to construct using Gaussian distributions: just take a two-dimensional Gaussian slanted at 45 degrees and project onto the axes. Suddenly you have two correlated variables which individually are Gaussian and white. Repeat those two in a row, and you have a non-stationary white process -- and one which models, say, any differential coding rather well to a first approximation. So it's not just theory even here, but ubiquitous practice.
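
That construction is easy to reproduce; a sketch assuming Python with NumPy, where the correlation 0.9 and the sample count are arbitrary:

    import numpy as np

    rng = np.random.default_rng(2)
    pairs = 100_000

    # A 2-D Gaussian slanted along the diagonal: both axis projections
    # are exactly Gaussian, but they are strongly dependent on each other.
    rho = 0.9
    cov = [[1.0, rho], [rho, 1.0]]
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=pairs)

    # Interleave the pairs into one sequence. Every sample is marginally
    # Gaussian and correlation across pair boundaries is zero, yet the
    # samples within each pair are strongly dependent: a non-stationary
    # process of the kind described above.
    z = xy.reshape(-1)

    print(np.corrcoef(xy[:, 0], xy[:, 1])[0, 1])  # ~0.9 within a pair
    print(np.corrcoef(z[0:-2:2], z[2::2])[0, 1])  # ~0 across pairs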

And so on. The problem is, *everybody* in the coding and information theory field understands this, from one angle or another. But I and many others just can't find the proper references to substantiate the claims. It's all just so self-evident; if not folk theorems, then just "the general gist of it all". Maybe some of you could dig a bit deeper and incorporate those points, then? Decoy (talk) 22:40, 10 March 2020 (UTC)

Just replying to say I completely agree. Because the article relies on a single reference, it's also skewed towards radio and telecommunication applications, making terrestrial use cases seem unintended, when AWGN is actually a useful model for NAND flash technology. Metzkorn (talk) 21:25, 9 March 2022 (UTC)

"Uniform Emission" in White's description


Maybe this deserves a jargon tag? I cannot find anything on "uniform emissions". I think the name "white noise" describes the fact that the noise affects all frequency ranges, much as "white" light is radiation summed over all visible frequencies, but "uniform emission" just sounds like technobabble.

Metzkorn (talk) 21:13, 9 March 2022 (UTC)
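
On the colour analogy: "white" here means the average power spectrum is flat, i.e. every frequency band carries about the same power, just as white light contains all visible frequencies. A small sketch, assuming Python with NumPy; the band count and sample size are arbitrary:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1 << 16

    x = rng.standard_normal(n)              # white Gaussian noise
    psd = np.abs(np.fft.rfft(x)) ** 2 / n   # periodogram

    # Averaging the noisy periodogram over four coarse frequency bands
    # gives roughly equal power in each: no preferred frequency range.
    bands = psd[: psd.size // 4 * 4].reshape(4, -1).mean(axis=1)
    print(bands)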