To say that something is random is to say that it conveys no information. That is the true nothing.
We know from physics that vacuums, as the complete absence of any matter or energy, don't exist, because the uncertainty principle allows particle-antiparticle pairs to appear spontaneously provided they vanish again quickly enough. This is what creates the 'vacuum energy' and is measurable.
So you might argue that there's no such thing as nothing, but the presence or absence of matter or energy is of no significance. What is important is information. In the data-information model, information cannot be extracted from completely random data. Or, to put it another way, completely random data is no data at all (is nothing).
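One way to make "completely random data is no data at all" concrete is the algorithmic-information view: data with no extractable structure is incompressible, while data carrying pattern shrinks dramatically. A minimal sketch of this idea, using Python's standard `zlib` compressor (my choice of illustration, not something the post itself proposes):

```python
import os
import zlib

# Random bytes: no structure for the compressor to exploit
random_bytes = os.urandom(10_000)

# Highly structured bytes: one 11-byte pattern repeated
patterned = b"information" * 1_000

# The random data barely compresses at all; the patterned data
# collapses to a tiny fraction of its size.
print(len(zlib.compress(random_bytes)))  # close to 10,000
print(len(zlib.compress(patterned)))     # far smaller
```

The compressed size is a rough proxy for how much non-random structure, and hence extractable information, the data contains.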
However, randomness is relative. If I listen to someone speaking a Chinese language, the words are random to me because I cannot extract meaning from them, but of course they are not random to a Chinese speaker. So nothingness is relative. Unless, that is, there exists randomness that is absolutely random. Is this the randomness of quantum mechanics? Is this 'God playing dice' (in the famous phrase of Albert Einstein)?
Sunday, 16 July 2017
2 comments:
As a psychologist, I like the perspective of "what is important is information" because I am interested in people's actions and that interest is based on a belief that they are too.
I was also surprised when you encouraged an intuition that randomness does not carry information.
One of the milestones in grasping the professional knowledge of psychology is understanding that something modelled as random has a mean value tending to zero and is unrelated to some other variable. That does not mean it is unrelated to everything. (And there certainly is data, otherwise we couldn't check, and we wouldn't have cared in the first place!)
So I was glad you ended where you did.
Having said all of this, my physics is fairly rubbish and I am not sure how well I grasped the physics bits.
Thanks for the comment - it raises an interesting point.
The fact that something is random can carry information, provided it can be compared with, set alongside, something that is not random. The information is in the difference between being random and not being random: the 'difference that makes the difference'. That difference can carry information, provided it is not itself random. So, differences are needed to carry information, but differences will not carry information if they are random.
So, let's try to (semi-) formalise it. There is a variable, n, and System A (person, whatever) which repeatedly measures it. The value of n is not related to anything System A knows about: it is completely random with respect to anything in System A's narratives. System A therefore learns nothing from each measurement: measurements of n carry no information.
I think, Jo, what you are saying is that the fact that n is random tells you something: carries the information that n is random, if you like? That is true, but only if there exists (or there is the possibility that it could exist) some other variable, m, which is not random. The information is in the difference between variables n and m. That is not a random difference: it comes about because n is unrelated to anything System A knows about but m is correlated with something System A knows about.
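The n/m distinction above can be sketched numerically. Below, a hypothetical variable k stands in for something System A already knows; n is drawn independently of k, while m tracks k. Correlation is used here as a crude stand-in for "carries information about", which is my simplification, not part of the original argument:

```python
import random

random.seed(0)
SAMPLES = 10_000

# k: a variable System A knows about (hypothetical)
k = [random.gauss(0, 1) for _ in range(SAMPLES)]

# n: completely random with respect to k
n = [random.gauss(0, 1) for _ in range(SAMPLES)]

# m: correlated with k, so not random from System A's point of view
m = [ki + 0.1 * random.gauss(0, 1) for ki in k]

def corr(x, y):
    """Pearson correlation: a crude proxy for shared information."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

print(round(corr(k, n), 2))  # near 0: measuring n tells System A nothing
print(round(corr(k, m), 2))  # near 1: m is informative about what A knows
```

The information is in the difference between the two correlations: n looks random relative to System A's narratives, m does not.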
Hope that makes some sort of sense!