Principle of Entropy Increase

Entropy is a physical quantity that can be interpreted as a measure of the thermodynamic disorder of a physical system. Entropy has the unique property that its total value in an isolated system can never decrease; this statement is the second law of thermodynamics. Because entropy must increase in natural processes, it introduces the concept of irreversibility and defines a unique direction for the flow of time.

On a fundamental level, entropy is related to the number of possible physical states of a system, S = k log Γ, where S is the entropy, k is Boltzmann's constant, and Γ is the number of accessible states of the system. A useful example to illustrate the principle of entropy increase is a closed box containing an ideal gas with a fixed number of molecules. If the energy of the gas in the box is increased, the number of states increases, because there are many ways the molecules can share the added energy: one molecule could carry the entire increase, or two or more molecules could divide it among themselves.
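
To make the counting concrete, the short Python sketch below uses a simplified toy model (an assumption introduced here, not part of the article): q indistinguishable energy quanta shared among N distinguishable molecules, counted with the stars-and-bars formula, with the entropy then computed as S = k log Γ. The exact ideal-gas state count is more involved, but the sketch shows the key point that adding energy multiplies the number of available states.

    from math import comb, log

    K_BOLTZMANN = 1.380649e-23  # Boltzmann's constant, J/K

    def microstates(n_molecules: int, quanta: int) -> int:
        """Number of ways to distribute `quanta` identical energy units
        among `n_molecules` distinguishable molecules (stars and bars)."""
        return comb(quanta + n_molecules - 1, n_molecules - 1)

    def entropy(n_molecules: int, quanta: int) -> float:
        """Boltzmann entropy S = k log(Gamma), in joules per kelvin."""
        return K_BOLTZMANN * log(microstates(n_molecules, quanta))

    if __name__ == "__main__":
        n = 100
        for q in (10, 20, 40):  # add more and more energy quanta
            print(f"quanta={q:3d}  states={microstates(n, q):.3e}  S={entropy(n, q):.3e} J/K")

In this toy model, each doubling of the energy multiplies the state count enormously, so the entropy rises with every addition of energy.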

Although the entropy of a system can be reduced by removing energy from it (for example, by having it do work on, or give up heat to, its surroundings), the entropy of the surroundings increases by at least as much. Any process that involves heat transfer between systems at different temperatures therefore increases the total entropy. When two systems are in thermal equilibrium, however, their temperatures are equal, no net heat transfer takes place, and the total entropy does not change. The combined entropy of a system and its surroundings is therefore greatest when they are in thermal equilibrium.
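
A small numerical sketch can make this bookkeeping explicit. The Python snippet below (an illustration added here; the function name and the numbers are chosen only for the example) treats both bodies as reservoirs whose temperatures barely change while a small amount of heat Q flows between them: the hot body loses entropy Q/T_hot, the cold body gains Q/T_cold, and the total change is positive whenever the temperatures differ and zero when they are equal.

    def entropy_change_of_transfer(q_joules: float, t_hot: float, t_cold: float) -> float:
        """Net entropy change (J/K) when heat q_joules flows from a reservoir
        at t_hot to one at t_cold (both temperatures in kelvin)."""
        delta_s_hot = -q_joules / t_hot    # the hot body loses entropy
        delta_s_cold = q_joules / t_cold   # the cold body gains more than was lost
        return delta_s_hot + delta_s_cold

    if __name__ == "__main__":
        print(entropy_change_of_transfer(100.0, 400.0, 300.0))  # positive: heat flow across a temperature difference
        print(entropy_change_of_transfer(100.0, 350.0, 350.0))  # zero: thermal equilibrium, no net change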

A process in which the net entropy change (of a system together with its surroundings) is positive is called irreversible, because no process with a negative net entropy change exists that could undo it. A process with zero net entropy change, by contrast, is reversible, because it can be undone by another process with zero net entropy change. Perfectly reversible processes are an idealization, however, so entropy increases in all real processes.
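
The contrast can be illustrated with an ideal gas doubling its volume at constant temperature. The Python sketch below (an added illustration; the function names and the choice of a reversible isothermal expansion versus a free expansion are assumptions made for the example) shows that the gas gains the same entropy, nR log 2, in both cases, but only the reversible path removes an equal amount of entropy from the surroundings, leaving the net change zero.

    from math import log

    R = 8.314  # gas constant, J/(mol*K)

    def gas_entropy_change(n_moles: float, volume_ratio: float) -> float:
        """Entropy change of an ideal gas expanding isothermally by volume_ratio."""
        return n_moles * R * log(volume_ratio)

    def net_change_reversible(n_moles: float, volume_ratio: float) -> float:
        """Reversible isothermal expansion: heat drawn from the surroundings
        lowers their entropy by exactly the amount the gas gains."""
        ds_gas = gas_entropy_change(n_moles, volume_ratio)
        ds_surroundings = -ds_gas
        return ds_gas + ds_surroundings   # exactly zero

    def net_change_free_expansion(n_moles: float, volume_ratio: float) -> float:
        """Free expansion into a vacuum: no heat is exchanged, so the
        surroundings are unchanged and the net change is positive."""
        ds_gas = gas_entropy_change(n_moles, volume_ratio)
        return ds_gas

    if __name__ == "__main__":
        print(net_change_reversible(1.0, 2.0))       # 0.0 J/K: reversible
        print(net_change_free_expansion(1.0, 2.0))   # about 5.76 J/K: irreversible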

One interpretation of the principle of entropy increase is that it defines a unique direction for the flow of time. If all processes were reversible, it would be impossible to distinguish the forward direction of time from the backward direction; a shattered glass, for example, might spontaneously reassemble itself. Increasing entropy sets the direction of the arrow of time.