
Wednesday, August 15, 2012

Fire on high!

 
The deeper I go into Thermodynamics, the more it begins to look like a wonderland.
In the past, I've had mechanics explain the principles of physics to me using engines as an example. They could do this without using math or any advanced physics-speak, and they still explained it perfectly. We can learn physics without math, especially if we spend 10,000 hours working with the laws in the physical world. The math is a doorway to a different world, however; a Narnia or Wonderland where we can learn how to control what those laws can do. Math is the language we use to share the workings of the world, and we can use it to create and discover new things.
Math can be a terrifying sight to the newcomer though. It's full of things like imaginary numbers, logarithms, and differential equations. Statistical Mechanics is no different really. I'm focusing today on Boltzmann's entropy equation, which is the product of a constant and the natural logarithm of probability. Feel free to scratch your head.
Ludwig Boltzmann was one of the first people to give real weight to the idea of atoms. In his writings on statistical mechanics, he discusses how large, complex mechanisms are made up of smaller individual processes. This led to the idea that something like entropy was the result of the random movements of individual molecules. He derived his equation, entropy equals k(lnW), to explain the motion of these particles. k is what is now called Boltzmann's constant, W is the number of possible arrangements of the particles, and ln is the natural logarithm. Now everybody in the room say it with me: WTF? It's a logarithm of probability? Seriously? Shit.
This is that scary thing I talked about earlier. After staring at this for a couple of weeks now, I can say that logarithms are not hard if you understand exponents. The base-10 log of 10 is equal to 1. Another way to say this is that 10 to the first power is equal to 10. ln is a natural log, so instead of base 10, its base is e (about 2.718).
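If you want to poke at that yourself, here's a quick sketch in Python (my tool of choice here, nothing to do with Boltzmann) using the standard math module:

    import math

    # The base-10 log of 10 is 1, because 10^1 = 10.
    print(math.log10(10))    # 1.0
    print(math.log10(100))   # 2.0, because 10^2 = 100

    # The natural log uses e (about 2.718) as its base.
    print(math.log(math.e))  # 1.0, because e^1 = e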
Now that you have a very basic rundown on logarithms, it's kind of easy to see why we need one for entropy. As you pump energy into a system, the number of ways its molecules can arrange themselves grows exponentially. That number counts potential arrangements, because we have no way of predicting the exact movements of the molecules as the energy increases. It's that drunken walk in probability. What I'm trying to get at is this: W explodes exponentially due to the crazy movements of countless individual molecules. We tame that explosion by taking the log of W, which pulls out the exponent and leaves us with a number we can actually work with.
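To watch that explosion happen, here's a little Python sketch using a toy coin-flip model of arrangements (my own made-up example, not something out of Boltzmann's papers):

    import math

    # Toy model: N particles, each of which can sit in one of two states
    # (say, the left or right half of a box). The number of possible
    # arrangements W doubles with every particle you add.
    for n in (10, 20, 40):
        w = 2 ** n                # W grows exponentially with system size...
        print(n, w, math.log(w))  # ...but ln(W) grows only linearly

    # That's the point of the logarithm: W explodes, ln(W) stays tame.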
OK, deep breath. The total entropy comes when you multiply this logarithm by Boltzmann's constant. The constant is the relationship between absolute temperature and kinetic energy in a molecule of a perfect gas. But really, it's 1.3807 times 10 to the negative 23 power joules per kelvin. Isn't that simple? If you just combine all these factors, you can chart the growth of entropy of an isolated system.
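For the curious, the whole calculation fits in a few lines of Python. This is a rough sketch with a completely made-up W, just to show the pieces coming together:

    import math

    k = 1.3807e-23         # Boltzmann's constant, in joules per kelvin
    W = 2 ** 40            # number of possible arrangements (toy value)

    S = k * math.log(W)    # Boltzmann's entropy equation: S = k ln(W)
    print(S)               # about 3.8e-22 J/K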
Back up a moment, isolated system? Yeah, that's right, isolated system. The reason why you should be cursing Boltzmann and physics right now is because a truly isolated system is theoretical. Boltzmann even says that in his paper. There is always another system that heat transfer is going on with; usually it's the air. And that, ladies and gentlemen, is why physics frustrates me sometimes. The equations that are meant to be used with real things are based on theoretical things. I'm gonna stop and lie down before my brain explodes.
I can't stay mad though. This hobby helps my understanding of the world. I wonder if these writings will help anyone else.
