
Conifold Theory Basics, Part 2: Defining Information

  • Writer: conifoldtheory
  • Dec 13, 2022
  • 3 min read

Updated: Jul 6

In our everyday lives, ‘information’ is a meaningful statement about reality. But ‘information’ has a formal mathematical definition too. Oddly, this mathematical definition is the very opposite of the colloquial one – it is the amount of disorder or randomness in a dataset. So we can think about parsing information as a process of extracting meaningful patterns from a large, disordered dataset.


In this view, our ability to identify meaningful patterns in a messy dataset is our ability to do computational work. This computational work is a process of extracting a signal from the noise.



But before we talk about this process – and the relationship between neural coding, meaning, and qualitative perceptual content – it’s worth digging into the definition of information a bit further.


A formal description of 'information'


The mathematical definition of information was discovered by Claude Shannon, who developed information theory at Bell Labs at the dawn of the digital age, in the 1940s. He showed that how far a dataset can be compressed depends on its statistical structure – the more predictable the patterns in a dataset, the fewer bits are needed to store or transmit it – and that this limit constrains the bandwidth of any communication channel.


In classical computing architecture, all data is ultimately encoded as a series of 1s and 0s. Each transistor in the computer chip can be in a ‘0’ or ‘1’ state. Any patterns – such as a long run of 0s or 1s – can be compressed. So we can think of each transistor as having a ‘0’ or ‘1’ microstate, and the collection of all these microstates gives us the macrostate of the whole computer chip. As it happens, this is the formal mathematical definition of the amount of ‘information’ held by a system – a sum over all possible component microstates, weighted by their probability of occurrence.
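To make this concrete, here is a minimal Python sketch (not from the original post – the shannon_entropy helper is illustrative) computing the first-order Shannon entropy of a bitstring. A long run of identical bits is perfectly predictable and maximally compressible, while a string of fair coin flips approaches one bit per symbol and barely compresses at all.

```python
import math
import random
from collections import Counter

def shannon_entropy(bits: str) -> float:
    """First-order Shannon entropy in bits per symbol: H = sum(p * log2(1/p))."""
    counts = Counter(bits)
    n = len(bits)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A long run of identical symbols is perfectly predictable: 0 bits per symbol.
print(shannon_entropy("0" * 64))  # 0.0

# Fair coin flips are close to maximally uncertain: ~1 bit per symbol.
random.seed(0)
flips = "".join(random.choice("01") for _ in range(64))
print(shannon_entropy(flips))  # ~1.0
```

Compression tools like gzip exploit exactly this kind of statistical regularity.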


In modern quantum computing architecture, each computational unit (or ‘qubit’) exists in a superposition, with some probability of being measured as ‘0’ or ‘1’. The macrostate of the system is again a sum over all these component microstates, weighted by their probability of occurrence. And again, the distribution over all possible microstates determines the amount of ‘information’ held by the system. But with far more uncertainty in each microstate, far more information can be encoded by the system – one reason quantum computers can outperform classical computers on certain problems.
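As a rough sketch of that claim (the binary_entropy helper below is hypothetical, and measurement is assumed to be in the computational basis): the uncertainty of a single qubit’s measurement outcome peaks at one bit when the two outcomes are equally likely, and the number of basis microstates grows exponentially with the number of qubits.

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a two-outcome measurement with P('0') = p."""
    if p in (0.0, 1.0):
        return 0.0  # no uncertainty, so no information gained by measuring
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A qubit heavily biased toward '0' carries little measurement uncertainty...
print(binary_entropy(0.99))  # ~0.08 bits
# ...while an equal superposition carries the maximum: one full bit.
print(binary_entropy(0.5))   # 1.0

# A register of n qubits spans 2**n basis microstates, so the state space
# grows exponentially with system size.
print(2 ** 30)  # 30 qubits already index over a billion microstates
```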


Computation as a physical process


Interestingly, there is a parallel in physics to the mathematical concept of information. ‘Entropy’ is a sum over all possible microstates of a thermodynamic system of particles, weighted by their probability of occurrence. Entropy also quantifies the energy that is unavailable to do work in that physical system. So ‘entropy’ has the exact same formal definition as ‘information’ – but it is also a thermodynamic quantity.
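The correspondence is exact up to a constant factor. Here is a minimal sketch, assuming a toy system of four equally likely microstates: the Shannon entropy in bits and the Gibbs entropy in joules per kelvin describe the same distribution, differing only by the factor k_B·ln 2.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI revision)

def shannon_bits(probs) -> float:
    """Shannon information of a microstate distribution, in bits."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def gibbs_entropy(probs) -> float:
    """Gibbs entropy of the same distribution, in J/K: S = -k_B * sum(p * ln p)."""
    return K_B * sum(p * math.log(1 / p) for p in probs if p > 0)

probs = [0.25] * 4                              # four equally likely microstates
print(shannon_bits(probs))                      # 2.0 bits
print(gibbs_entropy(probs))                     # ~1.914e-23 J/K
print(K_B * math.log(2) * shannon_bits(probs))  # identical: S = k_B * ln(2) * H
```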


This is a critical concept – energy must be expended to encode information. In classical computing architectures, much of this energy is wasted as heat. In modern quantum computing architectures, heat must be kept out, because it destroys the computation. It’s worth noting that the human brain is different from both systems – our bodies actively trap heat energy to drive computational work. This computational work involves extracting meaning from a large, messy dataset.
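The standard formalization of this energy cost is Landauer’s principle (not named in the post, but it states exactly this point): erasing one bit of information must dissipate at least k_B·T·ln 2 of heat. A quick Python sketch of that floor at a few temperatures:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum heat (in joules) dissipated to erase one bit at temperature T."""
    return K_B * temp_kelvin * math.log(2)

for label, temp in [("dilution-fridge qubit (~15 mK)", 0.015),
                    ("room temperature (300 K)", 300.0),
                    ("human body (310 K)", 310.0)]:
    print(f"{label}: {landauer_limit(temp):.3e} J per bit")
```

Real hardware dissipates many orders of magnitude more than this theoretical floor, which is why heat management dominates classical chip design.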


This is a fundamental but underappreciated concept – information is a thermodynamic quantity as well as a computational quantity. And this concept turns out to be critical for understanding how the brain processes information.



