I always find Wikipedia a good starting place: that famous repository of seemingly all the knowledge of learned minds, yet notorious at the same time because seemingly anyone can change its contents (I've never even tried). I do know that when it comes to subjects I know something about, I've always found it both agreeable and further educational. So I looked up the Second Law on it, and found this: http://en.wikipedia.org/wiki/Second_law_of_thermodynamics
________________________________________
Second law of thermodynamics
From Wikipedia, the free encyclopedia
The second law of thermodynamics states that the entropy of an isolated system never decreases, because isolated systems always evolve toward thermodynamic equilibrium, a state with maximum entropy.
The second law is an empirically validated postulate of thermodynamics. In classical thermodynamics, the second law is a basic postulate defining the concept of thermodynamic entropy, applicable to any system involving measurable heat transfer. In statistical thermodynamics, the second law is a consequence of unitarity in quantum mechanics. In statistical mechanics information entropy is defined from information theory, known as the Shannon entropy. In the language of statistical mechanics, entropy is a measure of the number of alternative microscopic configurations corresponding to a single macroscopic state.
The second law refers to increases in entropy that can be analyzed into two varieties, due to dissipation of energy and due to dispersion of matter. One may consider a compound thermodynamic system that initially has interior walls that restrict transfers within it. The second law refers to events over time after a thermodynamic operation on the system, that allows internal heat transfers, removes or weakens the constraints imposed by its interior walls, and isolates it from the surroundings. As for dissipation of energy, the temperature becomes spatially homogeneous, regardless of the presence or absence of an externally imposed unchanging external force field. As for dispersion of matter, in the absence of an externally imposed force field, the chemical concentrations also become as spatially homogeneous as is allowed by the permeabilities of the interior walls. Such homogeneity is one of the characteristics of the state of internal thermodynamic equilibrium of a thermodynamic system.
________________________________________
There is more, much more, and please read it all, for it is good. To begin, it immediately takes us to the concept of entropy, which is a measure of disorder in an (isolated, closed) system.
Yet, this has always struck me as bizarre and counter-intuitive. Why don't we speak of the order in a system, in positive terms? In science, as in everyday life, we are accustomed to measuring how much of something a thing has, not how much non-something it possesses. So why isn't entropy the same, a measurement of what's there, not what's lacking? It's as if we defined matter in terms of the space surrounding it.
The entropy of the cosmos is always increasing, we are also told, as another invocation of the Second Law. Information content is always decreasing. Efficiencies are always less than 100%. We're always losing. Growing older, and dying. Death and decay -- what could be a better metaphor for a process that also describes a Carnot Engine? How do all these ends tie together anyway?
________________________________________
Yet they do, in a very mathematical, and, yes, intuitive, way. The mathematics I speak of here is that branch called Probability and Statistics.
Stop. Don't run and hide for cover. I'm not a mathematician, or even a physicist. I'm just a plain old chemist, without even his PhD. I'm not even going to look anything up, or present any strange-looking equations or charts. I'm going to try to talk about it in the same down-to-earth language that I used in convincing myself of the validity of the Second Law years ago.
Think of a deck of cards. Better yet, if you have one handy, go grab it. Riffle through it, in your mind or in your hands (or in your mental hands, if you've got nimble ones). What do you notice? First, that all the cards are different -- ah, if this isn't the case, you aren't holding a proper deck. If it is, do like me and count them. Fifty-two of them, all spread before you, number and face cards, black and red, tops and bottoms.
Now shuffle them as randomly as you can (if you find this difficult, let your dog or a small child do it for you). Drop them on the floor, kick them around for a while, then walk about, picking them here and there, at whim, until they're all in your hands again. The only thing I ask you to do while doing this is to keep the faces (all different) down and the tops (all the same, I think) up. Pick them all up. Nudge every corner, every side, every edge, into place, so that the deck is neatly piled.
Now guess the first card. A protest? "I've only a one in fifty-two chance of being right," you exclaim in dismay. If you did, that's good, for we're already making progress. You have some sense of what a probability means. One in fifty-two is not a very good chance. You certainly wouldn't bet any money on it (unless you're a compulsive gambler, in which case Chance help you).
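If you'd like to watch that one-in-fifty-two chance emerge on its own, here is a small Python sketch of the guessing game -- my own illustration, not anything from the physics, with the trial count and random seed chosen arbitrarily:

```python
import random

def guess_top_card(trials=100_000, seed=1):
    """Shuffle a 52-card deck over and over, always guessing the same
    card for the top position, and report the fraction of hits."""
    rng = random.Random(seed)
    deck = list(range(52))  # 52 distinct cards, labeled 0..51
    hits = 0
    for _ in range(trials):
        rng.shuffle(deck)
        if deck[0] == 0:    # our fixed guess: card number 0
            hits += 1
    return hits / trials

rate = guess_top_card()     # hovers around 1/52, roughly 0.019
```

However many trials you run, the hit rate settles near 1/52 -- about a 1.9% chance, which is exactly why you wouldn't bet money on it.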
Another way of stating your predicament is that you haven't sufficient information to make a good guess. If you could only know whether it was a red card or a black card, a face card or a number card, or something, anything like this, you could start thinking about your pocketbook before making your guess. But you know nothing, nothing! Well, except that it has to be one of the fifty-two cards comprising the deck.
Hold on, because I'm going to make things worse. What if I asked you to guess, not just the first card, but to guess every card in the deck? If punching me in the mouth isn't your answer, you might just hunker down and wonder how to determine what your chances were of accomplishing such an amazing feat. How could you do this?
This is where a little mathematics comes in. Create a mental deck of cards in your head. Choose the first card at random -- say, the seven of spades. That could have been any one of fifty-two cards, but you placed it first. Then the second card -- what is it? How many remaining cards could you have chosen? Why, fifty-two minus one, equaling fifty-one. Now the third card. Fifty-one minus one, equaling fifty. And so on, and on, and on, until we come to the last card.
So in the placement of the cards you have 52 X 51 X 50 X ... all the way down to the last card, or X 1. Mathematicians have a nice way of expressing a product like this: it's called a factorial, and it's represented by the "!" symbol. In this case, it would be fifty-two factorial, or 52!.
It's one thing to state it. Actually carrying out the calculation, even with a calculator or on your computer, isn't very easy. Fortunately, all those years ago I already did it, so I will present you with the approximate answer I recall (approximate because my calculator couldn't handle that many digits). That answer is "8 X 10 E67". This is another mathematical shorthand, meaning in this case "eight times ten multiplied by itself sixty-seven times". Or, if you prefer, because this is a very, very large number -- approximately eight followed by sixty-seven zeroes -- we can take its base-10 logarithm, around 67.9. Well, round that off to 68, and you're there as good as not.
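These days you don't have to trust a decades-old calculator: Python's integers have unlimited precision, so the exact value is one line away. A minimal sketch (the variable names are my own):

```python
import math

perms = math.factorial(52)           # the exact value of 52!
print(len(str(perms)))               # -> 68: a sixty-eight-digit number
print(f"{float(perms):.3e}")         # -> 8.066e+67, the "8 X 10 E67" shorthand
print(round(math.log10(perms), 1))   # -> 67.9, the base-10 logarithm
```

So the remembered answer holds up: about eight followed by sixty-seven zeroes.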
A number like that is so large (though it's only a trillionth of the number of atoms in the universe) that you wouldn't bet the tiniest tip of one leg of the louse living off one of your hairs on it. It might as well be infinite, as far as you're concerned. But it's not infinite, not even the tiniest bit close, all of which brings me back to the subject of thermodynamics.
________________________________________
I want to make this crystal clear. I am not relating the cards to the individual atoms, but to the quantity of kinetic energy each particular atom has. We can even quantize the energy in integer units, from one unit all the way up to -- well, as high as you wish. If there are a trillion atoms of gas in this globe, then let's say that there are anywhere from one to a trillion energy levels available to each atom. The precise number doesn't really matter -- I am using trillions here, but any number large enough to be unimaginable will do; and there isn't actually any relationship between the number of atoms and the number of energy levels. All of this is just simplification for the purpose of explanation.
Very well. Consider this glass globe full of gas atoms. There are two ways we can go about measuring its properties. The easiest way is to measure its macroscopic properties. These are properties such as volume, pressure, temperature, the number of atoms (in conveniently large units like the mole, or almost a trillion times a trillion), the mass, and so on. They're convenient because we have devices like thermometers, scales, barometers, etc., that we can use to do this.
But there is another way to measure the properties of the gas: the microscopic way. In this, we take into account each atom and its quantity of kinetic energy, or some measure of its motion, one by one, and sum the whole thing up. I'm sure you'll agree that this would be a very tedious and, in practice, absurd and impossible way to make the measurements -- for one thing, even if we could do it at all (a very dubious if, to say the least), it would take nearly forever to get anywhere with any measurement at all. Fortunately, however, there is a correspondence between these macroscopic and microscopic properties, or states as I shall now call them. That correspondence is via entropy, the heart of the Second Law.
Recall the statement from Wiki: "Entropy is a measure of the number of alternative microscopic configurations corresponding to a single macroscopic state." That card deck of energy units assigned to each gas atom, like the cards in an actual deck, can be arranged in many, many ways: about a number whose base-ten logarithm is 67.9, recall. Excuse me, that's for the 52 cards in a deck; for the trillions of possible energy states among a trillion X trillion atoms, the number would be astronomically larger; so large that even its logarithm could not be expressed in a format like this, possibly not in any format available in the universe (if anyone could calculate it at all). From a microscopic view, the probability of any particular state might as well be zero, for all our ability to calculate it. Like a well-shuffled deck of cards, there's just no useful information in it. Another way of saying this, returning to our randomly moving globe of gas atoms, is that there is no way of doing any useful work with it.
That's for a well-shuffled deck of cards / highly randomized energy distribution among atoms. What about a highly ordered deck or distribution? First, we have to specify what we mean by "ordered." For a deck, this might mean ordered by suit (spades, hearts, diamonds, and clubs, say) and by value (ace, king, queen, jack, ten ... two), while for our gas it could mean that one atom owns all the units of energy and all the others none. I hope you can see that, defined this way, there is only one such distribution; and once we disturb it, either by shuffling the deck or by allowing the gas atoms to bump into each other, releasing or gaining units of energy, the distributions become progressively less and less ordered, eventually (though this may take an enormous amount of time) becoming highly randomized. The overall macroscopic properties, such as temperature or pressure, don't change, but the number of ways those properties can be achieved increases dramatically. This is why we talk about the "number of alternative microscopic configurations corresponding to a single macroscopic state", or entropy, and why we say that, in the cosmos as a whole, entropy is always increasing. It is why the ordered state contains a great deal of information and can do a great deal of work, while increasing disorder, or entropy, means less of both.
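The counting behind this can be made concrete with a toy model. Below is a minimal Python sketch -- my own illustration, with arbitrarily small numbers, not anything from the text -- that counts the ways to share indivisible energy units among a handful of atoms, using the classic stars-and-bars formula:

```python
import math

def microstates(quanta, atoms):
    """Number of ways to distribute identical energy quanta among
    distinguishable atoms: the stars-and-bars count C(q + n - 1, n - 1)."""
    return math.comb(quanta + atoms - 1, atoms - 1)

# A tiny "gas": 10 units of energy shared by 4 atoms.
total = microstates(10, 4)      # 286 distinct distributions
# The highly ordered case -- one named atom holding all 10 units --
# is exactly 1 of those 286, so its probability is 1/286.
entropy = math.log(total)       # Boltzmann-style S = ln W (setting k = 1)
```

Scale the toy numbers up toward a trillion atoms and the count explodes beyond anything writable, which is the whole point of taking the logarithm.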
Now if you've ever worked with decks of cards, you've noticed something quite obvious in retrospect: you rarely go from perfect order to complete disorder in one shuffle. There are many, many (also almost innumerable) in-between states of the deck that still have some order in some places and disorder in others. In fact, even a completely randomized deck will have, by pure chance, some small pockets of order, which can still be exploited as information or for work. The same is true in nature, of course, which is why the Second Law is really a statement of probabilities, not absolutes. Or, to quote the late Jacob Bronowski in his famous book The Ascent of Man: "It is not true that orderly states constantly run down to disorder. It is a statistical law, which says that order will tend to vanish. But statistics do not say 'always'. Statistics allow order to be built up in some islands of the universe (here on earth, in you, in me, in the stars, in all kinds of places) while disorder takes over in others."
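Those pockets of order are easy to demonstrate. The Python sketch below -- again my own illustration, with an arbitrary seed and trial count -- counts adjacent same-suit cards, first in a perfectly ordered deck and then across thousands of random shuffles:

```python
import random

def same_suit_pairs(deck):
    """Count adjacent cards sharing a suit -- tiny 'pockets of order'.
    With cards labeled 0..51, card // 13 gives the suit (0..3)."""
    return sum(deck[i] // 13 == deck[i + 1] // 13 for i in range(len(deck) - 1))

ordered = list(range(52))
print(same_suit_pairs(ordered))       # -> 48: near-perfect order

rng = random.Random(7)
deck = list(range(52))
counts = []
for _ in range(10_000):
    rng.shuffle(deck)
    counts.append(same_suit_pairs(deck))

average = sum(counts) / len(counts)   # hovers around 12: order is diluted, not erased
```

Even a thoroughly shuffled deck averages about a dozen same-suit adjacencies purely by chance; randomness dilutes order, but it never scrubs it to zero.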
Information, order, the capacity for work: these are things that a universe always has to some degree, however incompletely. Indeed, by our current understanding of cosmic evolution, our universe started off in a very high state of order, a perhaps highly improbable state of affairs, but quite permissible by the deeply understood laws of thermodynamics. This initial high degree of order has allowed all the galaxies and stars, and atoms, and of course you and me and all other living things in this universe, to come into existence; and in the same way, will see all these things we regard as precious to us now pass out of existence. But do not despair. Order can never run down to absolutely zero; or, from the opposite perspective, disorder, or entropy, can never increase to infinity, however great it becomes; and because of this simultaneously subtle but obvious observation, life in some quantity -- organized consciousness in some form is maybe the better phrase -- doesn't have to completely vanish. In fact, if reality really is infinite, as I suspect it is -- consisting of an infinite number of universes in an infinite space-time, all subtly different but all obeying the same fundamental laws of logic -- then we never have to worry about the light of mind being utterly snuffed out everywhere, for all times and places, at any point in any future. That light certainly shone long before life on Earth began to organize, and will continue to shine, somehow, long after our solar system, and even our entire universe, has burnt out into a heatless cinder.
________________________________________
What I've put forth here is a necessarily very limited explanation of the Second Law of Thermodynamics, of order, disorder, entropy, information, and work. There are many more explanations, and whatever you have gained from mine -- I presume more questions than answers -- you should seek deeper comprehension in others' explanations, and in your own mental work on the subject. There are many topics I've overlooked, or hit upon only in sketchy form, concepts you may need to fully explore and gain clarity on before grasping the Great Second Law. If it is any consolation -- assuming you need any -- I am no doubt in much the same situation, perhaps even more so. If so, I wish you prosperity in your quest for full comprehension, as I have had in my own. Thank you for attending to these words.