Wednesday, February 21, 2024

Energy and Information

For a while now I have struggled with how and why energy is increased when relationships are added.  Binding energy increases the energy in the system.  Yet ask a physicist and they will say that the number of potential states is reduced when binding energy introduces dependencies.  So now I think I have it:

Given a system of particles - a graph with nodes.

I can measure the Shannon information, the number of potential states:

When everything is bound to each other there are very few potential states - no differential, low Entropy, low Information as measured from inside.  No work can be done.
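
A rough sketch of what I mean by counting potential states, assuming (just for illustration) that all states are equally likely; the state counts here are placeholders:

```python
import math

def shannon_entropy_bits(num_states: int) -> float:
    """Shannon entropy, in bits, of a system whose potential states
    are all equally likely: H = log2(number of states)."""
    return math.log2(num_states)

# Unbound: many potential states, high internal entropy.
print(shannon_entropy_bits(128))  # 7.0 bits

# Bound: the dependencies leave only a few reachable states.
print(shannon_entropy_bits(3))    # ~1.58 bits
```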

But if I ask from **outside** the system I get a different answer.  Suppose the first system wants to communicate with a second system, so it transmits bits.  The more binding energy in the first system, the fewer bits are needed to communicate a long message.  The system has fewer messages it can produce, but they are compressed, so each message carries a lot of Information.

System A

(a) (b) (c) (d) (o) (g) (t)

I can communicate many different messages: cat and dog, but also cod and dab.  A single transmitted token does not, on its own, resolve the entropy.
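
To put numbers on that (my own back-of-the-envelope arithmetic, assuming every token is equally likely and that messages are three distinct tokens):

```python
import math
from itertools import permutations

tokens = ['a', 'b', 'c', 'd', 'o', 'g', 't']

bits_per_token = math.log2(len(tokens))     # ~2.81 bits per transmitted token
bits_to_spell_cat = 3 * bits_per_token      # ~8.42 bits before 'cat' is resolved

# Any ordered triple of distinct tokens is a possible message in System A.
possible_messages = len(list(permutations(tokens, 3)))   # 210

print(bits_per_token, bits_to_spell_cat, possible_messages)
```

Lots of possible messages, but the receiver learns nothing definite until the last token arrives.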

System A′

(c)->(a)->(t) (d)->(o)->(g) (b)

A limited number of messages and less potential work, but a single token communicates a full word: transmit (c) and I receive 'cat'.
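
The same back-of-the-envelope count for the bound system, again assuming the three remaining units are equally likely:

```python
import math

# After binding, System A′ can only emit the heads of its chains (plus the free (b)).
chains = {'c': 'cat', 'd': 'dog', 'b': 'b'}

bits_per_token = math.log2(len(chains))    # ~1.58 bits per transmitted token
print(chains['c'], bits_per_token)         # transmit (c) and the receiver has 'cat'
```

Fewer bits per token, and far fewer possible messages, yet one token now decompresses into a whole word.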

So a phase transition occurs when we add the binding energy: the first system sees a drop in Energy, but the meaning that can be derived by the second system increases, hence the Information transmitted increases.
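
A small sketch of that last claim, comparing how much uncertainty about the word is left after the first transmitted token in each system.  The vocabularies are the toy ones above (I leave the stray (b) out of System A′ since it is not a word), every word is assumed equally likely, and `residual_entropy` is just my own helper name:

```python
import math
from collections import defaultdict

def residual_entropy(words):
    """Average uncertainty (in bits) about which word was meant,
    after seeing only the first token, with all words equally likely."""
    groups = defaultdict(list)
    for w in words:
        groups[w[0]].append(w)
    p_word = 1 / len(words)
    h = 0.0
    for group in groups.values():
        p_group = len(group) * p_word
        h += p_group * math.log2(len(group))   # uniform uncertainty within each group
    return h

system_a = ['cat', 'dog', 'cod', 'dab']    # free tokens: the first letter only narrows things down
system_a_prime = ['cat', 'dog']            # bound chains: the head token is the whole word

print(residual_entropy(system_a))        # 1.0 bit still unresolved after one token
print(residual_entropy(system_a_prime))  # 0.0 bits: one token carries the full meaning
```

The bound system has fewer states, less energy and less potential work, but each transmitted token resolves more of the receiver's uncertainty, which is the sense in which the Information transmitted goes up.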