The Second Law of Thermodynamics
A Quick Look
by Albert Hines
A good understanding of this topic requires a thorough knowledge of
statistical analysis, quantum theory, and thermodynamics. However, we
may look at a few simplified concepts which will help to illustrate the
property called "entropy" with which the second law of thermodynamics deals.
The second law of thermodynamics in essence states this: "There is a property
called entropy which, in an isolated environment, can only increase (or, at
best, stay constant). Equivalently, in an open system this property, entropy,
can decrease only by as much as, or by less than, the entropy of the
surroundings of that system increases."
So what is this stuff called entropy? Entropy was developed from statistical
analysis of molecules in order to put a value upon the probability of a
substance being in a particular state. This is the crucial point of entropy:
it measures the likelihood that a group of molecules, each of which may have
a different speed, rotation, energy level, etc., will show a particular overall
average character (temperature, pressure, enthalpy, specific volume,
viscosity, conductivity, etc.). The second law of thermodynamics dictates that
the average properties (pressure, temperature, etc.) of a "lump" of matter
will be those of the most probable state. Time for an example:
Example 1:
A casino operator has one hundred slot machines. The probability of a payoff
on each slot machine is about one in one thousand. A machine costs a quarter
to play and pays off $100, so on average the house keeps about fifteen cents
per play. Each machine may pay off at random times and at a slightly different
rate, but the average property, the operator's profit, is sure in the long
run. He may lose money on any given day, but over many days his profit is
certain.
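Here is a short sketch in Python of the casino above, for readers who would
like to watch the law of averages at work. The prices and odds are those given
in the text; the number of plays per day is an assumption made only for
illustration.

import random

def daily_profit(machines=100, plays_per_machine=500,
                 price=0.25, payout=100.0, p_win=0.001):
    # Simulate one day's takings for the operator.
    profit = 0.0
    for _ in range(machines * plays_per_machine):
        profit += price                 # every play costs a quarter
        if random.random() < p_win:     # about one play in a thousand pays off
            profit -= payout
    return profit

# On average each play nets 0.25 - 0.001 * 100 = $0.15 for the house.
days = [daily_profit() for _ in range(30)]
print("average daily profit:  $", round(sum(days) / len(days), 2))
print("expected daily profit: $", 100 * 500 * 0.15)

Any single day may stray from the expected figure, but the average over many
days does not wander far from it.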
The concept of entropy is very similar. Every molecule is flying, spinning,
and vibrating differently, but the overall properties of the billions of
molecules appear, as far as we can measure, constant. It is statistically
possible that all the molecules in a bowl of soup bump into one of your chunks
of potato at once and send it flying out into space! But the probability is
so remote that it has never been observed experimentally. Entropy is a game
of chance, but with matter consisting of so vastly many molecules, the odds
win out almost every time. When the odds don't win out, we call it a
"miracle," something which is not supposed to happen scientifically. In the
course of nature, the odds are so overwhelming that what actually happens is
whatever is statistically most likely.
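To get a feel for how fast those odds collapse, here is a tiny sketch in
Python. The model is an assumption made purely for illustration: each of N
molecules is given an independent one-in-two chance of pushing in the "right"
direction, so the chance that all N conspire at once is (1/2) to the power N.

import math

for n in (10, 100, 1_000, 10_000):
    log10_p = n * math.log10(0.5)       # log of (1/2)**n
    print(f"N = {n:>6}: probability that all N conspire ~ 10^{log10_p:.0f}")

Even a thousand molecules give odds of roughly one in 10^301, and a bowl of
soup contains on the order of 10^25 molecules.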
"Yes, but what is entropy? What does it feel like? Explain it to me in
everyday terminology, give me fifty cents worth of entropy!" says the reader.
Well, unfortunately no one can answer that question exactly. But if there is
any consolation, no one can tell you what "energy" is either. Because we use
the term so frequently, we feel familiar with it. We see a fast car or a hot
stove and say "that sure has energy!" But what is energy? We see some of its
results, but cannot describe the stuff itself. The same is true of entropy.
If we began to use the word "entropy" just as frequently (every time a
firecracker explodes or we turn on an electrical heater, saying "Well, that
surely created a bucket load of entropy!"), then we would be familiar with
entropy as well. But the most accurate way this author knows to describe it
is as a measure of the chance (probability) that a given collection of
molecules exists in a certain state.
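For the mathematically curious, here is a minimal sketch of that statement in
Python, using the Boltzmann relation S = k ln W, where W counts the number of
molecular arrangements ("microstates") consistent with what we measure. The
toy system, an assumption chosen only for illustration, is N particles split
between the two halves of a box.

import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(n_total, n_left):
    # Entropy of the macrostate "n_left of n_total particles in the left half".
    W = math.comb(n_total, n_left)      # number of microstates
    return k_B * math.log(W)

N = 100
for n_left in (0, 25, 50):
    print(f"{n_left:>3} of {N} on the left:  S = {entropy(N, n_left):.3e} J/K")

The even 50/50 split has by far the most microstates, so it is at once the
most probable macrostate and the one with the highest entropy.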
While from a thermodynamic point of view we can study entropy without
considering probability or statistics at all, much insight is gained by
looking into the true nature of the beast. When we consider the second law
without the background behind it, we tend to fall back on one basic result:
entropy must increase, therefore such and such will happen. That is not
invalid in itself, but it can be used to generate false deductions.
Many great men have helped to build up the concept of entropy - Einstein,
Boltzmann, Planck, Bose, Clausius, Kelvin, Carnot, Maxwell, de Broglie, Tribus,
Shannon - literally too many contributors to name. But in the first third of
the twentieth century, the probability of molecular states was firmly turned
into a mathematical function describing this stuff called entropy. When this
new property - entropy - was analyzed, the following were among the results
(some were theorized earlier, but were put on a solid theoretical basis only
after a second law analysis):
-> The property called 'entropy' in any system can, under ideal conditions,
remain constant. In practice, any change made in the system results in an
increase in entropy. Alternatively, the entropy of a system can decrease, but
only by an amount no larger than the increase in the entropy of something in
contact with the system (i.e. entropy is 'squirted' into the surroundings).
Once entropy is generated, no known process can cause it to be destroyed.
This is known as the concept of 'irreversibility', since the overall 'entropy
level' can never be restored to its original condition (this has been
described colloquially as "you can't unscramble an egg").
-> Without work input, thermal energy transfers (heat flows) only from a
'hot' region to a 'cold' region, never from cold to hot.
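A short sketch in Python of the bookkeeping behind that statement: moving a
small amount of heat Q from a body at T_hot to one at T_cold changes the total
entropy by Q/T_cold - Q/T_hot, which is positive only when T_hot is greater
than T_cold. The numbers below are arbitrary.

def entropy_change(q, t_from, t_to):
    # Total entropy change (J/K) when heat q (J) leaves a body at t_from (K)
    # and enters a body at t_to (K).
    return q / t_to - q / t_from

print(entropy_change(1000.0, 400.0, 300.0))   # hot -> cold: +0.83 J/K, allowed
print(entropy_change(1000.0, 300.0, 400.0))   # cold -> hot: -0.83 J/K, forbidden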
-> Any cyclic device (one which returns to the same state after performing a
task) which either produces or consumes work must operate across a temperature
difference.
Example:
An internal combustion engine must have a combustion temperature above that of
the immediate surroundings in order to produce work. A steam power cycle must
have both a heat source (boiler) and a heat sink (condenser) to produce work.
-> Due to the huge number of molecules in the macroscopic world, whenever the
opportunity arises, matter will always go from a state of lower probability to
a state of higher probability (not 'order', but 'probability'). Any exception
would be considered a "miracle." It is important to stress the difference
between probability and order, because we normally hear entropy referred to
as that "curse" which makes things go from a state of order to a state of
disorder. This is dead wrong: 'order' cannot be measured, and an unmeasurable
word can be used to "prove" falsehoods based upon a flawed definition of order.
-> There is an upper limit to the efficiency of any device. This limit is
equal to the efficiency of an ideal device that accomplishes the same task
without increasing entropy. Any actual device will be less efficient
(require more energy) in accomplishing the task. This excess energy expended
will always eventually be converted into a rise in temperature of something
(the surroundings).
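For a heat engine this limit has a famously simple form: a device working
between a hot and a cold temperature, and creating no entropy, converts at
most the fraction 1 - T_cold/T_hot of the heat it takes in into work. Here is
a sketch in Python, with temperatures chosen only for illustration.

def max_efficiency(t_hot, t_cold):
    # Best possible (entropy-free) efficiency between two temperatures in kelvin.
    return 1.0 - t_cold / t_hot

# e.g. combustion gases at ~1800 K rejecting heat to surroundings at ~300 K:
print(f"{max_efficiency(1800.0, 300.0):.0%}")   # about 83%; real engines do worse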
-> Work energy is more 'precious' than heat energy. Work is harder to obtain,
easier to lose, and more productive than heat. Work may easily be converted
into heat, but heat is difficult to convert into work.
-> There is a limit to the amount of work obtainable from any given situation
(less than the total energy E = mc^2). This limit is equal to the amount of work
done in a manner which does not generate (create) entropy. Once some of this
'available work' is lost or destroyed, it is gone forever. This is called
'lost work' or 'irreversibility.'
-> Some of the factors which contribute to lost work are: friction,
unrestrained expansions (such as releasing a high pressure gas into the air),
mixing of dissimilar substances, hysteresis (losses due to materials not
recovering perfectly), electrical resistance heating, contact of two materials
at different temperatures, combustion, shock waves, non-equilibrium, and
permanent bending or deformation. Any time one or more of these effects are
involved in a process, entropy is produced, work is lost, and efficiency drops.
-> The amount of lost work caused by any process is directly proportional to
the amount of entropy created (the constant of proportionality is the
temperature of the surroundings). The more entropy is produced, the more work
is lost. While the terms cannot be used interchangeably, they are inseparably
coupled.
Example:
We may create some entropy by mixing water into a bowl of syrup. But when we
do, we forever lose the opportunity to gain some work by letting a semi-
permeable membrane raise the level of the syrup through osmotic diffusion of
the water into the sugar solution. We can only restore the separation by
putting more work into the system (centrifugal separation, for example), and
the work input required is always at least as great as the work that could
have been gained.
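A rough sketch in Python of the bookkeeping for this example, assuming
ideal-solution mixing (entropy of mixing = -nR times the sum of x ln x) and
taking the lost work as the surroundings temperature times the entropy
created. The amounts are arbitrary.

import math

R = 8.314      # gas constant, J/(mol K)
T0 = 298.15    # temperature of the surroundings, K

def mixing_entropy(moles_a, moles_b):
    # Ideal entropy of mixing for two substances, in J/K.
    n = moles_a + moles_b
    xa, xb = moles_a / n, moles_b / n
    return -n * R * (xa * math.log(xa) + xb * math.log(xb))

dS = mixing_entropy(1.0, 1.0)          # one mole of water into one mole of "syrup"
print(f"entropy created: {dS:.2f} J/K")        # about 11.5 J/K
print(f"work lost:       {T0 * dS:.0f} J")     # about 3.4 kJ, gone for good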
-> There is a temperature below which it is impossible to go. This
temperature is referred to as 'absolute zero.' Absolute zero has some
very interesting and strange characteristics that I won't go into. If this
temperature could ever be obtained, it could not be sustained.
-> The potential for extracting work from a given situation (let us call it
a given 'system') can either decrease or remain constant, but never increase
without energy input. From observations, it is found that the more the system
is disturbed (changed in any way from its original state) the more likely it
is to lose its work potential (and thus generate entropy). There are a few
processes which do not lose measurable work potential; however, these are the
exception rather than the norm. (The term 'work potential' should not be
confused with 'potential energy', which refers to gravitational storage of
work and has little to do with the present discussion.)
Example:
If we consider a pile of rocks, we could lower every rock carefully to the
ground using a lever, thereby raising a weight (thus storing energy). But if
we kick over the pile, the rocks will fall without doing this work, and the
lost work can never be recovered (i.e. we will have to spend at least as much
work to restore the pile to its original condition).
Example:
It is easy to arrange rocks so that they appear in a very orderly geometric
pattern on the floor. However, a jumbled, apparently disorderly pile of
rocks would have less entropy, because work could be extracted by lowering
them to the ground. Thus, orderliness is a subjective and fallible measure
of entropy.
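To put the rock examples in numbers, here is one last sketch in Python. The
mass, drop height, and ambient temperature are assumptions chosen only for
illustration: lowering the rocks gently could recover up to m*g*h of work,
while kicking them over turns that work into heat and creates roughly
m*g*h/T0 of entropy.

g = 9.81        # gravitational acceleration, m/s^2
m = 200.0       # mass of the rock pile, kg
h = 1.5         # average height the rocks drop, m
T0 = 293.0      # temperature of the surroundings, K

work_potential = m * g * h              # J, recoverable in the ideal case
entropy_created = work_potential / T0   # J/K, if the rocks simply fall

print(f"work potential: {work_potential:.0f} J")
print(f"entropy created by kicking the pile over: {entropy_created:.1f} J/K")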
* Light My Path Publications *