Bandwidth and Uncertainty

Back to Nick Herbert’s “Quantum Reality”: he touches on bandwidth, measuring attributes of a quantum particle, and the uncertainty principle. This seems a little different from my experience with Fourier series/transforms. He wasn’t talking so much about a time series as about an entire quantum attribute expressed as a wave function. This wave function can then be expressed as a combination of waves from some waveform family (for example, sines and cosines in the Fourier series expansion). I suppose mathematically it works out the same as a time series, albeit in more than one dimension most of the time. Anyway, take the wave function, find the expression for some attribute, and then express it in terms of a combination of waves from some waveform family. When you do this, some number of specific waveforms will be needed to express the function you’re transforming; this is what Herbert calls the “bandwidth.”

Each waveform family, in abstract waveform-family space, has another waveform family that is as different from it as possible: its conjugate family. If you take some function or attribute of interest and express it in one family of waves and in the conjugate family, the product of the two bandwidths must be greater than or equal to 1. This is implied to be a loose explanation of the Heisenberg uncertainty principle. (Actually, I’m sure I’m missing something in Herbert’s reasoning, but I think a review of quantum mechanics would be necessary to spot it.) Anyway, the Heisenberg uncertainty principle: position and momentum are conjugate attributes, so one can see that, treated as waveform families, the product of their bandwidths would be subject to a lower bound of magnitude 1 in its own space. Objection: these are the conjugate attributes being measured, not the conjugate waveform families that one may attempt to express them in. Hence I don’t really see the connection yet, but again, a review of QM would probably help.
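To make the bandwidth product concrete, here is a minimal numerical sketch in Python. It assumes RMS spread as the definition of “bandwidth” and takes the time/frequency pair as the conjugate families; the exact lower bound (1/2 here, versus Herbert’s ≥ 1) depends on how bandwidth is normalized. A Gaussian pulse is the classic case that saturates the bound:

```python
import numpy as np

# Conjugate waveform families: the time basis vs. the frequency basis.
# For a Gaussian pulse the time-bandwidth product saturates its lower
# bound (1/2 with RMS widths and angular frequency; a different
# normalization of "bandwidth" gives Herbert's ">= 1").

N = 2**14
t = np.linspace(-50, 50, N)
dt = t[1] - t[0]
sigma = 2.0

# Gaussian pulse expressed in the time family.
f = np.exp(-t**2 / (2 * sigma**2))

# The same pulse expressed in the conjugate (frequency) family.
F = np.fft.fftshift(np.fft.fft(f))
omega = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dt))

def rms_width(x, amplitude):
    """RMS spread of |amplitude|^2 about its mean -- one way to define bandwidth."""
    p = np.abs(amplitude)**2
    p /= p.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean)**2 * p).sum())

dt_width = rms_width(t, f)        # spread in the time family, ~ sigma/sqrt(2) ~ 1.414
dw_width = rms_width(omega, F)    # spread in the conjugate family, ~ 1/(sigma*sqrt(2)) ~ 0.354

print(f"delta_t     = {dt_width:.4f}")
print(f"delta_omega = {dw_width:.4f}")
print(f"product     = {dt_width * dw_width:.4f}")  # ~ 0.5, the Gaussian lower bound
```

Making the pulse narrower in time (smaller sigma) spreads it out in frequency, and vice versa; the product stays pinned at the bound, which is the whole content of the bandwidth version of the uncertainty relation.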

This also raised a question about Langan’s views on under-determinism. In interviews Langan has used the uncertainty principle as a way of justifying the existence of secondary telors. This, combined with his apparent explanation of the arrow of time using conspansion, light cones, and constructive-filtrative duality, raises a question for me. Sure, there’s some element of under-determinism, but that element seems to be a limitation on the present condition. You could say that the question “where was particle x at time t?” is answered not just by the measurement at time t, but by measurements at times t+1, t+2, and so on. With the exception of the second law of thermodynamics (entropy), the fundamental laws of physics so far are all time-symmetric. So what thwarts the picture in which the present, as limited by Heisenberg, is the most well-defined, well-determined state, while the past and future blur more and more the further one looks in either direction of time? I’m sure there’s some answer in the CTMU; I’ve had enough times where I thought I’d proved it wrong only to realize he was light-years ahead of me and I just had to catch up. We’ll see, more learning is required.

An odd observation I had while going through linear algebra the other day: the row picture and the column picture are complementary and appear to be very much like medium and content. Whichever perspective you choose, one is the medium and one is the content, while the whole picture is a sort of meta-object, which should be intriguing to anyone who’s read the CTMU.
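For concreteness, here is a small Python sketch of the two complementary readings of the same system Ax = b (the 2x2 system is my own illustrative example, not from the text): the row picture, where each row is an equation defining a line and the solution is their intersection, and the column picture, where b is a weighted combination of A’s columns:

```python
import numpy as np

# An illustrative 2x2 system: 2x + y = 5, x + 3y = 10.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)  # x = [1, 3]

# Row picture: each row is a line in the plane;
# the solution is the point where the lines intersect.
for row, rhs in zip(A, b):
    assert np.isclose(row @ x, rhs)  # the point x lies on every row's line

# Column picture: b is a linear combination of A's columns,
# with the entries of x as the weights.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(combo, b)

print("solution:", x)  # the same x satisfies both pictures at once
```

The same array of numbers carries both readings: fix the rows as the “medium” and the solution point is the “content,” or fix the columns as the medium and the weights are the content, while the matrix itself sits above either choice.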
