Entropy

Entropy is usually equated with disorder, but this is a shaky definition, because it depends on how disorder itself is defined.

Consider a deck of cards. We would picture high entropy as the cards scattered everywhere, which is why we call that state disordered. But in reality, "order" here just means whatever we have decided counts as order for a deck of cards.

Consider that when you hold the deck neatly stacked in your hand, there are far fewer ways for the cards to be arranged than there are ways for them to be strewn all over the floor.

Entropy is therefore not about disorder; it is about how many ways something can happen

  • The more ways something can happen, the more entropy there is
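One way to make "number of ways" concrete is Boltzmann's formula S = k_B ln W, where W counts the ways (microstates). A minimal Python sketch applying it to the card analogy; treating "neatly sorted" as exactly one arrangement is an assumption made purely for illustration:

```python
import math

# Boltzmann's entropy formula: S = k_B * ln(W),
# where W is the number of ways (microstates) a state can be realized.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(ways: int) -> float:
    """Entropy of a state that can be realized in `ways` ways."""
    return k_B * math.log(ways)

# Deck-of-cards analogy (illustrative only, not literal thermodynamics):
# exactly 1 arrangement counts as "perfectly sorted", while a deck
# strewn about can be in any of 52! orderings.
ways_sorted = 1
ways_scattered = math.factorial(52)   # ~8.1e67 arrangements

print(entropy(ways_sorted))      # 0.0  -> fewer ways, lower entropy
print(entropy(ways_scattered))   # > 0  -> more ways, higher entropy
```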

Whenever we increase the volume of something, there are more states that the particles within that volume can be in, which increases its entropy

  • ex. Imagine two containers of different sizes, each with the same amount of water vapour inside. The bigger container allows the water particles to exist in more possible states, so it has higher entropy (see the sketch below).
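A hedged sketch of how this is usually quantified, assuming the vapour behaves like an ideal gas expanding at constant temperature, where the entropy change is ΔS = nR ln(V2/V1):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_S_isothermal(n_moles: float, V1: float, V2: float) -> float:
    """Entropy change of an ideal gas expanding isothermally from V1 to V2."""
    return n_moles * R * math.log(V2 / V1)

# Same amount of gas, but the second container has twice the volume:
print(delta_S_isothermal(1.0, 1.0, 2.0))  # ~ +5.76 J/K: entropy increases
```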

The entropy of the universe is always increasing (this is the second law of thermodynamics); as the universe expands, its contents have ever more states available to them.

Entropy predicts that certain processes are irreversible or impossible, even when they would not violate the conservation of energy.

Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.

The entropy of the universe must always increase

  • This is the second law of thermodynamics
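As a toy illustration (not from the note itself): simulate particles hopping randomly between the two halves of a box, starting from the unlikely "all on one side" state, and watch the entropy of the macrostate drift up toward its maximum at equilibrium.

```python
import math
import random

# Toy model: N gas particles in a box; each step one randomly chosen
# particle hops to the other half. The entropy of the macrostate
# "n_left particles on the left" is S = ln(C(N, n_left)), in units where k_B = 1.
N = 100
n_left = N          # start with all particles on the left: a very unlikely state

def S(n_left: int) -> float:
    return math.log(math.comb(N, n_left))

random.seed(0)
for step in range(2001):
    if step % 500 == 0:
        print(f"step {step:5d}: n_left = {n_left:3d}, S = {S(n_left):.2f}")
    # pick a particle uniformly at random; it moves to the other side
    if random.randrange(N) < n_left:
        n_left -= 1
    else:
        n_left += 1
# n_left drifts toward N/2, where C(N, n_left) (and hence S) is largest:
# the system spontaneously approaches the equilibrium, maximum-entropy state.
```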

Entropy is the internal property of a system that changes as heat moves into, out of, or around it

Entropy is a measure of how evenly the system's energy is spread out. The less concentrated the energy, the less useful it is for doing work.
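A minimal sketch using the Clausius relation ΔS = Q/T for a reservoir at fixed temperature (the numbers below are made up): when heat leaks from a hot body to a cold one, the cold side gains more entropy than the hot side loses, so the total entropy rises as the energy spreads out.

```python
# Clausius relation for a reservoir at fixed temperature T: dS = Q / T.
# Illustration: heat Q leaks from a hot reservoir to a cold one.
Q = 100.0       # joules of heat transferred
T_hot = 400.0   # K
T_cold = 300.0  # K

dS_hot = -Q / T_hot    # -0.25 J/K  (hot reservoir loses entropy)
dS_cold = +Q / T_cold  # +0.33 J/K  (cold reservoir gains more entropy)
dS_total = dS_hot + dS_cold

print(dS_total)  # ~ +0.083 J/K > 0: total entropy rises as energy spreads out
```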

Entropy is a direct measure of each energy configuration's probability. Energy that is spread throughout a whole system has higher entropy. Low entropy means the energy is concentrated; high entropy means it is spread out.
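A small counting example (an Einstein-solid toy model, chosen here as an illustration rather than taken from the note): two blocks share a fixed number of energy quanta, and the number of microstates behind each way of splitting the energy, and hence its probability, can be counted directly. The even split dwarfs the "all energy in one block" configuration.

```python
from math import comb

# Toy "Einstein solid" counting example: two blocks, each with N oscillators,
# share q_total energy quanta. The multiplicity of the split
# (q_A quanta in block A, the rest in block B) is
#   Omega(q_A) = C(q_A + N - 1, q_A) * C(q_B + N - 1, q_B)
N = 100        # oscillators per block
q_total = 100  # total energy quanta

def multiplicity(q_A: int) -> int:
    q_B = q_total - q_A
    return comb(q_A + N - 1, q_A) * comb(q_B + N - 1, q_B)

total = sum(multiplicity(q) for q in range(q_total + 1))

# All the energy concentrated in block A vs. energy shared evenly:
print(multiplicity(100) / total)  # tiny probability  (concentrated -> low entropy)
print(multiplicity(50) / total)   # far larger        (spread out   -> high entropy)
```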
