ENTROPY

   In its strict meaning, "entropy" is a thermodynamics term, coined in 1865 by the German physicist Rudolf Clausius (1822-1888), who had formulated the underlying principle in 1850, to describe the quantity of heat, per unit of temperature, that must be put into a closed system to bring it to a given state. The Second Law of Thermodynamics - often stated in terms of work as "it is impossible to produce work by transferring heat from a cold body to a hot body in any self-sustaining process" - can alternatively be rendered: "Entropy always increases in any closed system not in equilibrium, and remains constant for a system that is in equilibrium."
To put it less technically: whenever there is a flow of energy, some is always lost as low-grade heat. For example, in a steam engine the friction of the piston is manifested as non-useful heat, and hence some of the energy put into the engine is not turned into work. There is no such thing as a friction-free system, and for that reason no such thing as a perfect machine. Entropy is a measure of this loss. In a broader sense we can refer to entropy as a measure of the order of a system: the higher the entropy, the lower the order. There is more energy, for example, tied up in complex molecules than in simple ones (they are more "ordered"); the Second Law can therefore be loosely rephrased as "systems tend to become less complex". Heat flows, so ultimately everything will tend to stabilize at the same temperature. When this happens to literally everything - in what is often called the heat-death of the Universe - entropy will have reached its maximum, with no order left, total randomness, no life, the end. (There is, however, an argument about whether the concept of entropy can properly be applied to the Universe as a whole.) Of course, the amount of usable energy in the Universe, primarily supplied by the stars, is unimaginably huge, and the heat-death of the Universe is billions of years away. Isaac ASIMOV's amusing "The Last Question" (1956) has a supercomputer, which for aeons has been worrying about the heat-death, reversing entropy at the last possible moment. The scientist Freeman DYSON, in "Time Without End: Physics and Biology in an Open Universe" (Reviews of Modern Physics, July 1979), confronts the same question with a similar optimism and, one must assume, rather better mathematics. Local images of entropy, like the huge red Sun at the end of H.G. WELLS's THE TIME MACHINE (1895), long antedate the general use of the word; indeed, dying-Earth stories generally (END OF THE WORLD) can be seen as entropy stories, both literally and metaphorically.
Although "entropy" has been a technical term for a long time, it is only since the early 1960s that it has, in its extended meaning, become a fashionable concept (although the word sometimes popped up in sf earlier, as in House of Entropy (1953) by H.J. CAMPBELL writing as Roy SHELDON). Since the 1960s, to the annoyance of some scientifically minded people, the extended concept of increasing entropy has included holes wearing in socks, refrigerators breaking down, coalminers going on strike, and death. These are indeed all examples of increasing disorder in a technical though not necessarily a moral sense. Life itself is a highly ordered state, and in its very existence is an example of negative entropy (negentropy). It is as if, though the Universe is running down, there are whirlpools of local activity where things are winding up. All forms of information, whether in the form of the DNA code or the contents of this encyclopedia, can be seen as examples of negentropy.
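A minimal formal restatement may help here; the notation below is standard textbook thermodynamics and information theory, not anything used in the entry itself. Clausius defined the change in entropy S of a system absorbing heat reversibly at absolute temperature T as
   dS = \delta Q_{\mathrm{rev}} / T,
and the Second Law as rendered above becomes
   \Delta S \ge 0 \quad \text{(for any isolated system)},
with equality only at equilibrium. The looser "measure of disorder" reading rests on Ludwig Boltzmann's statistical formulation, S = k_B \ln W, where W counts the microscopic arrangements compatible with the system's macroscopic state; and the closing remark linking information to negentropy echoes Claude Shannon's formally identical quantity, H = -\sum_i p_i \log_2 p_i, the information content of a message whose symbols occur with probabilities p_i.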
It is natural, then, that a popular variant on the entropy story is the DEVOLUTION story.
Entropy has become a potent metaphor. It is uncertain who first introduced the term into sf, but it is likely that Philip K. DICK, who makes much of the concept in nearly all his work, was the first to popularize it. He spells it out in DO ANDROIDS DREAM OF ELECTRIC SHEEP? (1968), where entropy, or increasing disorder, is imaged as "kipple": "Kipple is useless objects, like junk mail or match folders after you use the last match or gum wrappers or yesterday's homeopape. When nobody's around, kipple reproduces itself . . . the entire universe is moving towards a final state of total, absolute kippleization."
It was, however, in NEW-WAVE writing, especially that associated with the magazine NEW WORLDS, that the concept of entropy made its greatest inroads into sf. J.G. BALLARD has used it a great deal, and did so as early as "The Voices of Time" (1960), in which a count-down to the end of the Universe is accompanied by more localized entropic happenings, including the increasing sleepiness of the protagonist. Pamela ZOLINE's "The Heat Death of the Universe" (1967), about the life of a housewife, is often quoted as an example of the metaphoric use of entropy. Another example is "Running Down" (1975) by M. John HARRISON, whose protagonist, a shabby man who perishes in earthquake and storm, "carried his own entropy around with him". The concept appears in the work of Thomas M. DISCH, Barry N. MALZBERG, Robert SILVERBERG, Norman SPINRAD and James TIPTREE Jr as a leitmotiv, and also in nearly all the work of Brian W. ALDISS, which typically displays a tension between entropy and negentropy, between fecundity and life on the one hand and stasis, decay and death on the other. Outside GENRE SF, Thomas PYNCHON has used images of entropy many times, especially in GRAVITY'S RAINBOW (1973). George Alec EFFINGER's What Entropy Means to Me (1972) is not in fact a hardcore entropy story at all (apart from a tendency for things to go wrong), but Robert Silverberg's "In Entropy's Jaws" (1971) is a real entropy story and a fine one, exploring the metaphysics of the subject with care. Although it was in the 1960s and 1970s that the entropy story peaked, the image is still used, as in Dan SIMMONS's Entropy's Bed at Midnight (1990 chap).
Colin GREENLAND once wrote a critical book called The Entropy Exhibition: Michael Moorcock and the UK "New Wave" (1983), and it is indeed Moorcock who has perhaps made more complex use of entropy and negentropy than any other sf writer, and not just in The Entropy Tango (fixup 1981); the two concepts run right through his Dancers at the End of Time and Jerry Cornelius sequences. Jerry Cornelius seems for a long time proof against entropy, and keeps slipping into alternate realities as if in hope of finding one whose vitality outlives its decay, but like a Typhoid Mary he carries the plague of entropy with him, and ultimately, especially after the death of his formidably vital and vulgar mother, succumbs to it himself, becoming touchingly more human, though diminished.
In all of these works, entropy is a symbol or metaphor through which the fate of the macrocosm, the Universe, can be linked to the fate of societies and of the individual - a very proper subject for sf. Negentropy versus entropy is usually seen as an unequal battle, David against Goliath, but sickness, sorrow, rusting, cooling and death contrive to be held at bay, locally and occasionally, by passion and movement and love.
Looked at from this perspective, entropy is one of the oldest themes in literature, the central concern, for example, of Shakespeare, Donne, Milton and - especially - Charles DICKENS.
   PN

Science Fiction and Fantasy Encyclopedia, 2011.
