Entropy - Seeking a Simple Explanation

Hi everyone, can anyone explain entropy to me, please? In simple terms that are readily understood. Kindly note that it is not always helpful to explain a basic concept in terms of further concepts that are equally obscure. The following passage is taken from an article posted on Nassim Haramein’s Facebook page tonight:

“The physics that describes the micro-scale ordering of a system is known as entropy, so obviously this is where we must turn. When we talk about entropy, we have to remember that the idea of entropy was derived assuming an isolated or closed system where an increase of entropy is essentially an increase in the amount of “lost” energy. Thermodynamically speaking, the system will be tending towards a state of thermal equilibrium – a state where no more work can be done on the system, unless from external sources – this is the state of maximal entropy.”

I have also recently read a small book by Carlo Rovelli “The Order of Time” where again, the fundamental principles of the book are all based around “Entropy”.

Of course I may be lacking in some basic knowledge of physics but still it does seem to me that a lot of new ideas are arising from the term Entropy - and yet I have to confess I am still not at all sure what exactly Entropy is. Over to you, and thank you.


Hi John,

This certainly is no scientific explanation, but the simple way I think of entropy is the following:
It describes and measures the energetic changes that happen within a system or an object as it adjusts to its surroundings. In terms of heat, for example, the hotter object cools down to the temperature of its environment, heating up the environment in the process, until they both stop changing and rest at a uniform temperature. There is also the question of what happens at the molecular or atomic level in this process: when heating up, molecules and atoms move faster; when cooling down, they move slower. These movements are often considered irregular or chaotic (i.e. not fully understood) or, stated differently, they move in disharmony. A lot of what we learn in these courses is about the importance of the underlying laws of nature, which are in fact quite harmonic and orderly (so the opposite of this rudimentary understanding of entropy, with all objects striving toward a uniform state of sameness via chaotic state changes).
So you could say that the opposite of entropy (i.e. fractals repeating in orderly and harmonious patterns from the infinitely small to the infinitely large, micro to macro) describes the patterns and forms of life:
Entropy leads to a disorderly loss of energy and ultimately death, whereas negentropy (its opposite) is about the purity of harmonious frequencies describing living patterns that exist and survive to infinity.
I hope this helps and I would certainly welcome others to add and improve on this “lay(wo)man’s” non-scientific description.




Nassim channels Bucky Fuller. Entropy appears in the second law of thermodynamics, which states that heat flows from hotter to colder bodies. Nassim objects that this is only true in a closed system, which doesn’t truly exist, but the idea gave us steam engines, the industrial age and catastrophic climate change.


Here is an interesting article about energy harvesting using graphene, a single-atom-thick layer of graphite: https://phys.org/news/2020-10-physicists-circuit-limitless-power-graphene.html
I would love to better understand it in terms of how to view this from the unified physics perspective.
It does seem to focus on the more regular movements of electrons in a massless form of matter (graphene, versus graphite, which has heavy mass), suggesting an energy that harnesses the vacuum??? (I’m really reaching here and may well have to be corrected!) I just like the reference in this earlier 2014 article, “Self-Organized Platinum Nanoparticles on Freestanding Graphene”: https://phys.org/news/2014-02-platinum-nanoparticles-specific-patterns-bonded.html

Thoughts with respect to entropy and/or Unified Physics in this context?




This is 5 years more current:


Very cool, thanks Darryl; this article contains better explanations.
I wonder whether they used this in the circuit they created, which they announced yesterday and claim could be incorporated into chips (the first link in my post).


I’m unsure. I drifted into reading science news after drowning in COVID-19 political news. My understanding is they’ve used it in several circuits already, because it comes at no cost to students and can be tested in an hour, as opposed to $10,000 and half a year or more of effort to do the same with cuprates. They did discuss using it for room-temperature quantum computing, but these scientists are more into “pure” research than applied research. My take, for what it’s worth.


Dear John, one of our mistakes in physics is to think that an atom is a mechanical phenomenon and that an atom follows entropy. This perception is totally wrong, therefore I reject this ideology. An atom puts thermodynamics into motion (the electron) and functions without friction; each atom has its own characteristic behavior, and atoms can or cannot interact with each other based on their nature. But the entropy manifesto, in stating that this action must have a mechanical character, is wrong.
Atoms are the smallest intelligent complete units of the universe. This is an unprecedented statement, but it is a fact. Read my paper or books in this regard. Best wishes.


Thanks for your contribution, Javadf. However, I’d still like somebody to “define” the term entropy for me, especially as its meaning seems to depend on the point of view of whoever is using it.

The ancient texts describe entropy as chaos. It is disorder. It is the opposition to order and light and categorization. It is space-time. In the way that I understand the romantic version of entropy, it is the space created by choice. Choice extends space-time via entropy. When we make a choice, we create space and opportunity for the effects of that choice to play out. The choice is the action, the space-time is the environment. Almost all ancient cultures recorded that the universe was born from chaos. That light emerged from darkness. That is a lovely way to explain entropy in my opinion.



I think that at the most fundamental level, entropy is information.

Entropy is a measure of the complexity of a system; it is a number without (physical) dimension,
like a probability.

For example, the more free particles (degrees of freedom) there are in a system, the higher its entropy.

A system can also contain a great number of particles that are linked together, as in a crystal, where the entropy is low because little information is needed to describe it: the description of the primitive cell and the macroscopic form and dimensions of the crystal (assuming it is a perfect one: no dislocations, no holes, …).

But if you destroy the crystal by increasing its temperature, melting it and vaporizing it, each atom becomes free. We have lost the order and increased the entropy, because one must now know the position and speed of every atom to know exactly the state of the system.

When a system is closed (no exchange of energy or matter with its environment), the order of the system cannot increase. It can either stay as it is (constant entropy) or its order can decay (increasing entropy). This is the second principle of thermodynamics.

If you reduce the temperature of the system, the links between the parts (the particles) of the system (our crystal) can form again and the entropy decreases. At absolute zero (0 kelvin)
there is no more movement of the particles, so no speed information needs to be supplied to describe the state of the system; it is simpler. This is the third principle of thermodynamics:
entropy decreases as the temperature decreases.

These results come from thermodynamics and thermal physics, i.e. statistical physics.

Other forms of entropy have been discovered:

  • Claude Shannon: entropy in communication
  • Jacob Bekenstein and Stephen Hawking: entropy of black holes.

This last form of entropy led Gerard ’t Hooft to formulate the holographic principle, which Nassim uses in his model, but with the quantum structure of space that he discovered.
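The idea above, that entropy counts the information needed to specify a system's state, can be made concrete with Shannon's formula. Here is a minimal Python sketch (my own illustration, not from the thread; the function name `shannon_entropy` is an assumption):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# "Crystal-like" system: one certain microstate -> nothing left to describe
print(shannon_entropy([1.0]))      # 0.0 bits
# "Gas-like" system: 8 equally likely microstates -> maximal uncertainty
print(shannon_entropy([1/8] * 8))  # 3.0 bits
```

The ordered system needs zero bits (we already know its state), while the disordered one needs three bits, exactly enough to single out one of its eight microstates.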


Hi, thanks for your input on this discussion. It’s a difficult subject I think, but I am starting to get an understanding. Thank you.


I wrote two articles on entropy on my website, but I should translate them into English:



I like the negentropy bit (purity of harmonious frequencies describing living patterns that exist/survive to infinity). Is it attainable?


The answer to your question is yes. But the real question is: is it recognizable to our sensory perception?


Below is Google Translate’s rendering of the French text from fxp’s website post:



Entropy characterizes the complexity of a system, i.e. the amount of information that is necessary to describe it.

At the outset, we consider a closed system, that is to say a system which exchanges neither energy (mechanical, thermal, radiation) nor matter with its environment.

Entropy is linked to the notion of disorder: the more information there is necessary to describe a system, the more it can appear to us to be disordered.

In fact, entropy is a concept of physics linked to the precise measurement of the notions of order and disorder. Entropy also measures the capacity of a closed system to evolve when it has a certain amount of usable energy (because it has low entropy).

If you have a team in which the horses are going in all directions you cannot move forward because there is no order.

If you discipline your horses then their ordered energy becomes usable to move the team forward.

This example shows that energy (the force that horses can mobilize to do a certain job) is not sufficient and that there is also a notion of order which is very important. It is the concept of entropy that is linked to this order or this disorder.

Energy is what allows evolution, change. If a car’s gas tank is empty, it cannot move forward unless it uses the gravitational potential energy it has, or the kinetic energy it has accumulated on a slope, which will allow you, if you are lucky, to get to the next gas pump.

If the entropy is low (ordered system) the system can evolve.

As it evolves, entropy increases, i.e. energy degrades (its order decreases) and the system becomes less able to evolve.

For example, horses will consume the chemical energy contained in the food they have eaten, using oxygen from the air through respiration to produce molecules of ATP (adenosine triphosphate), the fuel of physiology. By pulling the hitch they will consume this fuel to operate their muscles (the engine).
When the energy of all the food has been consumed, along with the reserves stored in their muscles, their liver and their fat, they will be tired and will have to rest; if they do not eat again (closed system) they will not be able to continue pulling the team for long, eventually reaching exhaustion of their energy reserves.

“Consuming” energy is in fact increasing entropy, because we never truly consume energy, owing to the law of conservation of energy, which is the first principle of thermodynamics: in a closed system, energy is constant.

When energy is dissipated or degraded, its entropy, and therefore its disorder, increases (this is the second principle of thermodynamics).

In the example of the horses, the horse dung resulting from digestion is less ordered than the grass from which it is derived.
It is the low-entropy energy of the sun that, through photosynthesis, makes the grass regrow, using the organic matter in the horse manure.

Entropy is a notion initially introduced in macroscopic thermodynamics by Rudolf Clausius, and its deep meaning in terms of information was clarified much later in statistical mechanics by Boltzmann.

The second principle of thermodynamics says that in a closed system, entropy can only increase or, at the limit, remain constant.
Order and disorder are of fundamental importance in the physics that deals with the operating laws of physical systems composed of a very large number of entities (a gas formed by all of its molecules, for example). This physics is called thermodynamics.

Large numbers reveal new properties, new concepts, new realities and experiences.

Entropy was then redefined by Shannon as part of information theory, where entropy is identified with the amount of information.

(Information theory is the basis of computer science, so entropy must play an important role in computing.)

Entropy and information

Entropy and information are strongly related concepts and can be considered the same in statistical mechanics.

In fact, the more complex a system, the greater its entropy and the more information is needed to describe it.

For example, the same quantity of matter in the form of a gas or in the form of a crystal is not described with the same quantity of information. If the crystal is perfect (without gaps, dislocations, etc.) then it suffices to specify the position of one atom of the crystal and the structure of the crystal lattice to know where all the atoms of the crystal are located. So very little information is needed to describe the system.
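The crystal-versus-gas comparison above can be illustrated with a small compression experiment: a general-purpose compressor approximates the description length of a sequence, so an ordered "crystal" should compress far better than a randomized "gas" of the same atoms. A minimal Python sketch (my own illustration under that analogy; the byte strings stand in for the two systems):

```python
import random
import zlib

random.seed(0)  # reproducible "gas"

n = 10_000
# "Crystal": one repeating unit cell, so the whole thing is highly ordered
crystal = b"ABCD" * (n // 4)
# "Gas": the same four kinds of atoms, but in random order
gas = bytes(random.choice(b"ABCD") for _ in range(n))

# Compressed size ~ information needed to describe the arrangement
print(len(zlib.compress(crystal)))  # small: a short description suffices
print(len(zlib.compress(gas)))      # much larger: each atom must be specified
```

The compressed crystal is a few dozen bytes (essentially "repeat ABCD"), while the compressed gas stays close to the roughly 2 bits per atom that random choices among four symbols require.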



Thank you so much for such an elaborate definition. What do you think causes the most entropy? I know it is caused by energy, but that would make it an extension or expression of energy, yes? So what causes the most energy? My opinion is emotion. On a macro scale, energy is action and entropy is information, so entropy would be the data through which the energy is both expressed and manifested. In the Emerald Tablets, Thoth is told to “quell all the chaos of emotion,” because only then can he find balance and have the power of “the word,” which is vibration and magnetism. This came to mind while reading your reply. You said, “If the entropy is low (ordered system) the system can evolve,” and that immediately reminded me of the Emerald Tablets describing how to achieve the balanced state of consciousness capable of ascension. It is amazing to me that we are just now understanding the concepts of Old Kingdom Egypt. But there is nothing new under the sun, right? (open system)