
Your Daily Equation
Your Daily Equation #32: Entropy and the Arrow of Time

Join us for #YourDailyEquation with Brian Greene. Every Mon - Fri at 3pm EDT, Brian Greene will offer brief and breezy discussions of pivotal equations. Even if your math is a bit rusty, tune in for accessible and exciting stories of nature and numbers that will allow you to see the universe in a new way. Episode 32 #YourDailyEquation: Einstein referred to entropy and the second law of thermodynamics as the only insights into the workings of the world that would never be overthrown. Join Brian Greene as he explores how these concepts illuminate the difference between past and future--why glasses break but don't unbreak, why candles burn but don't unburn, why we age but don't un-age. Why, that is, there is an arrow to time. Surprisingly, the discussion brings in the Big Bang itself.

Brian Greene:

Hey, everyone, and welcome to this next episode of Your Daily Equation. Today I am going to focus upon a deep issue, one that we could spend many episodes talking about, many hours talking about, but I’m just going to really try to scratch the surface here on the deep issue of the arrow of time and its relationship to entropy and the second law of thermodynamics. So, let me just jump right in. What’s the puzzle? The puzzle is this. There are a gazillion processes in the world that we only ever witness taking place in one temporal order. They unfold in one temporal direction, and we never see the reverse of those processes.

I mean, we’re all familiar with how the wind blows, and the petals of a flower can be blown away, but if I showed you a film in which the flower reassembles itself, you’d know that I’m showing you a reverse-run film. You never actually see that kind of reconstruction taking place in the real world around you. For another example, we’re all familiar with how someone can jump off the side of a pool, do a somersault, and then land in the pool. But if I showed you a film in which someone jumps out of the water, the water all coalesces to a nice, flat surface, and the person lands back on the side of the pool, you know that I’m showing you a reverse-run film.

You’ve never seen that actual process take place in the world around you. And perhaps the third and canonical example, we’re all familiar with. It’s happened to me. You’re holding a nice glass of wine. Maybe it’s a Riedel glass, a real nice one. It slips out of your hand, and it smashes on the floor. Awful mess. But you and I have never seen the shards of glass on the floor all jump off the floor and come back together in just the right way to reassemble a pristine glass filled with wine. We never see such processes. The puzzle, or the issue, to frame it that way, is to explain this asymmetry. Why do we see events unfold in one temporal order, but never see those events in reverse?

Now, one answer, the quickest answer, would be: well, maybe the laws of physics allow glasses to smash, allow people to jump into pools where the water becomes all agitated, allow the wind to blow the petals of a flower off, but they simply don’t allow the reverse processes to take place. That would be a great answer. We only see the things allowed by the laws of physics. The laws of physics don’t allow those reverse processes to take place. You can film them in the right order and play them in reverse, but you can’t actually see them take place in the real world in the reverse order, period, end of story. That would be great.

Problem is that explanation fails. As I’ll show you in a moment, any motion allowed by the laws of physics, the reverse motion is also allowed by the laws of physics. The reverse process is allowed by the laws of physics. So, we’re back to square one in trying to find an explanation, and the actual explanation that we’ll actually finally be led to is not to try to say that the reverse processes can’t happen, but rather, to say that they can happen, it’s just that they are extraordinarily unlikely, extraordinarily unlikely, and they’re so unlikely that they effectively never happen in the real world. That is the basic idea.

Trying to make that a little bit more precise, we’ll bring in the concept of entropy and the second law of thermodynamics. At the end, though, we’ll find a little bit of a twist. To really finish this top-level argument, we’re not going to dig into the nitty-gritty details, which are fascinating and rich, but they would take us down a rabbit hole. The twist, however, is that we will need to bring in properties of the universe near the Big Bang to actually make these ideas complete, or at least to make this approach complete. Not everyone agrees that this approach is the correct one.

Okay. So, that’s the basic issue at hand, and let’s just quickly jump in. So, the subject that we’re talking about is entropy and the arrow of time, the fact that there seems to be a built-in orientation to the order in which events unfold, and that’s what we mean by the arrow of time, that time has an orientation associated with it, which space does not seem to have. You can do anything in space, but time seems to have this asymmetric quality. Where does it come from? So, I really just want to quickly spell out two things.

Number one, I just want to convince you quickly that the laws of physics really do allow reverse-run processes to take place, so reverse processes can happen, and then the second thing that we will take a look at once we’re convinced that there’s an issue… If the reverse processes can happen, there’s no issue. Once there is an issue, we will talk about entropy, order, disorder, and the second law of thermodynamics, which we will see talks about the overwhelming tendency of order to degrade into disorder, from an orderly glass to degrade into a shattered, disorderly glass. That’s where we’re going.

All right. So, let’s begin with point number one, and I’ll do this in a specific example, but it’s easily generalizable. I just want you to get a feel for how we argue that once the laws of physics allow one trajectory, one kind of motion, they necessarily allow the reverse trajectory. How do we do this? I’m going to do a specific example. Let’s imagine that we have a baseball. You know I like baseball. I’ll admit it. I like the Yankees. Don’t turn off the video, okay? But anyway, there isn’t any baseball these days, in any event.

So, imagine you have a baseball, and it’s hit from home plate. It soars into, say, the bleachers in the outfield, and let’s say the trajectory is called X of T, and let’s say we set it up so that T equal to zero is when the ball’s at home plate, T equals one when it lands in the bleachers. Clearly, if that was in units of one second, that would be a monster shot. I don’t think anyone has ever hit a ball in one second from home plate to the bleachers, but just let one be whatever unit it needs to be so that I can just keep the math looking simple, from T equals zero to T equal to one.

Now, the question is, what about the reverse trajectory? That would look like, say, starting in the bleachers and heading back toward home plate, and that trajectory, let’s call it X tilde of T. In terms of its functional form, that could be written as X of one minus T. As you see, X tilde at times zero would be X of one, which is this location. Here is X of one, and X tilde of one would be X of zero over here, as that is X of zero. So, that is the reverse-run trajectory, and what we want to make clear is that if X of T satisfies the equations of motion, so does X tilde of T. I’m doing this purely classically. I’ll mention the generalization in a moment.

Now, what are the equations of motion? Well, it’s just the force of gravity, which is equal to the mass of the ball times the acceleration due to gravity, equal to m times d²x/dt². That is the equation satisfied by x(t), and now we want to see if that equation is satisfied by x̃(t). That’s not hard to work out. Let’s consider… I’m going to keep the colors semi-consistent. So, let’s consider dx̃/dt, which is the same as d of x(1 − t), dt, and that, of course, can be written as the derivative of x(1 − t) with respect to 1 − t, which is now a dummy variable that I can replace with anything, as I will in a moment, times d(1 − t)/dt, using the chain rule.

Now, this fellow over here, the derivative of 1 − t with respect to t: the derivative of the one part just gives you zero, so you get the derivative of negative t with respect to t, which is just minus one. That’s all that that is. So, we have a factor of minus one coming in, and then this term over here, as I said, 1 − t is now a dummy variable, which is hopefully not confusing. It’s just the derivative of a function evaluated at a particular argument, and the argument is called 1 − t in my equation. I’m going to now call it t just for simplicity. Sorry, please excuse the phone ringing. But where were we?

So, here we have this expression: minus dx/dt. That makes good sense, because the velocity of the ball in the purple trajectory, starting in the bleachers, is heading out in this direction, whereas the red one, as it was reaching the bleachers, was heading in this direction. Clearly, this purple is the opposite of this red, so that makes perfect sense. But now let’s take the second derivative in order to analyze the question of whether x̃(t) satisfies Newton’s second law. I barely have to do anything at all, because look at what happened when I took the first derivative.

So, what do we have here? We have this minus sign that comes from the first derivative. If I take a second derivative, it’ll just bring in another minus sign. Minus sign times minus sign is plus sign, and therefore… let me just switch back over to here. So, if I have d²x̃/dt², that will just be d²x/dt². The minus signs cancel each other, and you just have to choose the argument correctly, because I did replace that 1 − t by t, just to keep the functional form simple. But the point is, once you get here, you’re done, because that’s the only thing that comes into Newton’s second law. So, bottom line: if this trajectory satisfies the equations of motion, then so does the reverse one.
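This is easy to confirm numerically as well. Here is a small sketch of my own (not from the episode): for a projectile whose height is x(t) = v₀t − gt²/2, we check by finite differences that both x(t) and the reversed trajectory x̃(t) = x(1 − t) have the same acceleration, −g, so both satisfy the same equation of motion.

```python
# Numerical sketch: the time-reversed trajectory obeys the same equation
# of motion, d^2x/dt^2 = -g, as the forward one.
g, v0 = 9.8, 10.0

def x(t):
    """Forward trajectory: height of a ball thrown up at speed v0."""
    return v0 * t - 0.5 * g * t**2

def x_rev(t):
    """Time-reversed trajectory, x~(t) = x(1 - t)."""
    return x(1.0 - t)

def accel(f, t, h=1e-4):
    """Central second finite difference, approximating d^2f/dt^2."""
    return (f(t + h) - 2.0 * f(t) + f(t - h)) / h**2

for t in (0.2, 0.5, 0.8):
    # both trajectories have acceleration -g at every moment
    print(t, accel(x, t), accel(x_rev, t))
```

The same check works for any force law; only the specific function `x(t)` would change.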

Now, look. This is for a single ball, which I’m viewing as a single particle traveling under just the force of gravity, and we’ve shown that for any motion, the reverse motion also satisfies the equations of motion. You can completely generalize this to any number of particles acted on by any collection of forces. Even quantum mechanically, this is true. It gets a little bit more technically involved. In Schrödinger’s equation, not Newton’s second law, if you want the wave function to evolve in the reverse temporal order, you need to bring a complex conjugation into Schrödinger’s equation. You’ve got the i in there, the square root of minus one, and it goes to minus i, so you have to take the complex conjugate of the wave function to make it all work out.

Bottom line, though, you get exactly the same answer. For any evolution of the wave function forward in time, the reverse-run film, if you will, of the wave function going in the reverse temporal order will also satisfy the equations of motion. So, I’ve done the simple case, but it totally generalizes. So, that is point one. There really is an issue, as I said at the outset. Anything that happens in one order can happen in reverse. Unfortunately, we cannot simply argue that the laws of physics prevent those kinds of phenomena from happening, and that that’s why we don’t see them. That is not how our universe works.
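For completeness, the quantum statement sketched above can be written out in symbols (an aside of mine, assuming a time-independent Hamiltonian H with real position-space matrix elements): if ψ(x, t) solves the Schrödinger equation, then so does the complex-conjugated, time-reversed wave function:

```latex
i\hbar \,\frac{\partial \psi(x,t)}{\partial t} = H\,\psi(x,t)
\quad\Longrightarrow\quad
i\hbar \,\frac{\partial}{\partial t}\Big[\psi^{*}(x,-t)\Big] = H\,\psi^{*}(x,-t)
```

Conjugating flips i to −i, and evaluating at −t flips the sign of the time derivative, so the two minus signs cancel, just as they did for the baseball.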

Okay. So, we want to go on then to the potential answer, where we don’t try to rule out these reverse-run processes, but rather, we want to argue that they’re incredibly unlikely. Now, intuitively, it’s not hard to get to that conclusion. In fact, let me show you the smashing wine glass. How would you get it to reassemble? In our show some years ago, Fabric of the Cosmos, we did a sort of playful example that I’ll show you here. So, here’s that wine glass again, and it’s in my hand, and I drop it, smashes on the table. You get all the shards.

Now, if I want this to reassemble, what would I need to do? Let’s hold it still. I would need to run around, changing the velocity of each and every particle, reversing it. Just like the baseball going to the bleachers: in reverse, it’s coming out of the bleachers. I need to reverse the velocity of every single particle making up the glass, the air in the room, the wine, whatever. Everything involved, I need to reverse its velocity, and then if I allow it all to evolve forward in time with those new reversed velocities, it all comes back together into the pristine glass.

So, it can happen, and that’s how you do it, but look how incredibly difficult it is to do it. You need to run around and change all of those motions in a completely precise and exact way in order for it to all stitch back together. So, there you get a sense of how incredibly difficult it would be for that physical process to be set up to unfold in the usual orientation of time going forward, shards of glass going forward in time, reassembling the glass. But now let’s see how we find the mathematical version that describes how unlikely this is, and that is what brings in this idea of entropy.

Entropy is a word that I think many people are familiar with in everyday discourse. You can think of it as a measure that’s not perfect by any means, and many people balk at this description, but it’s really not bad, especially on a first-pass approach, as we’re doing in this episode. Entropy is a measure of disorder, and roughly, we want to quantify the idea that the pristine glass is ordered compared to the completely disordered state of the shattered wine glass. How will we get a measure of that? The answer to that really comes from this guy over here, Ludwig Boltzmann, and you see on his tombstone there’s an equation, S equals K log W, and that formula embodies Boltzmann’s definition of entropy, and the way in which it can be used to quantify disorder.

What is the basic idea? I’ll say it in words first, and then I’ll write down the equation. The idea is this. If a system is very disordered, then there are many rearrangements of its ingredients that leave it looking very disordered. The canonical example that people use, just as an analogy, if your desk is completely disordered, you got the paperclips all over, papers in random arrangements, coffee cups, whatever, total mess, and if I then come in, you’re not even looking, and I rearrange that disordered mess, you walk back in the room, you don’t even notice that I rearranged it. It was a disordered mess when you left the room. It’s a disordered mess when you came back in the room.

So, there are many, many rearrangements of a disordered system that go completely unnoticed, that leave the system looking pretty much the same. If you have an ordered desk, where the paperclips are all in their appropriate spot, the pages are all in a nice, neat stack, the books are all alphabetically ordered on the back of your desk, whatever, almost any rearrangement you will notice because the paperclips won’t be where they’re supposed to be. The books won’t be in the precise alphabetical order, or the pages won’t be in that nice, neat stack. So, there are very few rearrangements of the constituents of an ordered desk that leave it looking pretty much the same, and there are a huge number of rearrangements of the ingredients making up a disordered desk that leave it looking the same.

So, the way to quantify order versus disorder is to count the number of rearrangements of the ingredients that leave a system looking pretty much the same. That is, in essence, what Boltzmann said, and that’s really what his formula is. So, in some sense then, S is a measure of the number of rearrangements that leave the overall properties of a system unchanged, and when Boltzmann writes down this formula, he writes in terms of the logarithm. That’s what that log means on his tombstone. He uses logarithm. It’s a very important mathematical detail. I don’t want to get bogged down in the mathematical details here, but basically, the W on his tombstone is counting the number of rearrangements, obviously not for a disordered desk, but for a system made up of particles.

So, as an example, if I consider the air in this room, there are many rearrangements of the particles of air in this room that are unnoticed. I’m rearranging them right now. Okay, I’m doing a lot of rearrangement. Feels the same. Temperature’s pretty much the same. I’m breathing the same air. The macroscopic properties of the air in this room do not change under an enormous number of rearrangements of the air molecules in this room. On the other hand, if I had a different circumstance here, what if the air was all clustered in a tiny region over here? I might be gasping for breath.

Put that to the side, but if all the air was right here, then I’m severely limited in the number of rearrangements that make that configuration look the same, because if I move those particles outside that little cluster, then I can notice it. It’s only if I keep them tightly clustered, and therefore, a limited number of rearrangements will keep that configuration of air molecules unchanged. So, if the air is in a nice, tiny, orderly package, very low entropy. If it’s widely dispersed and moving this way and that in my room, in this office here, then it has higher entropy. It is more disordered.

The basic idea of the second law of thermodynamics is that there is a natural tendency for systems to evolve from order toward disorder, or in terms of entropy now, from low entropy to high or higher entropy. The reason for that is, again, quite straightforward. For low entropy, there are very few configurations available. For high entropy, by definition, there are many more configurations of the constituents available, and if the constituents are randomly moving about, thermally jiggling this way and that, then just by the law of numbers, it’s much more likely that they’re going to find themselves in a higher-entropy configuration since there are so many configurations that fit that bill, and quite unlikely that they’ll find themselves in a low-entropy configuration, because there are very few of those.

So, if I had the gas clustered in a small, little region here, as the gas randomly jiggles about, it’s going to ultimately fill the room. It will go from the ordered low entropy to the high entropy, and that is the natural course of events, simply by the logic of numbers and the logic of probabilities. I’d like to give you a little more of a feel for that using another analogy, a concrete example that I find particularly useful. Imagine you have 100 pennies, and imagine those hundred pennies are on my desk here, and they’re all heads-up. Now, that is a very orderly configuration. If you think about the degree of freedom to simply change a head to a tail or a tail to a head, there’s only one configuration that has all heads. You can’t change the disposition of any coin and keep it all heads.

If I then were to have the pennies subject to thermal jostling, let’s say I start to kick the table, making the pennies bounce around, some of them will then flip over from heads to tails, and if I keep on going, some of the tails will go back to heads, but many more heads will turn into tails. So, over time, the jiggling pennies will go from the ordered configuration of all heads to a far more disordered configuration, which has more of a mixture of heads and tails, simply because there are many more such configurations, and I want to just make that quantitative for you in half a second and do a fun little simulation on that.

So, take those hundred pennies as our system, and ask: if I’m looking at a configuration that has all heads, how many rearrangements of that configuration maintain all heads? By rearranging them, I’m just talking about changing heads to tails and tails to heads, not rearranging them in terms of their locations, just whether each is heads or tails. And there’s only one arrangement, or rearrangement: every single penny has to have heads, period, end of story. There are no other possibilities that meet the stipulation that you have all heads, and because of that, it’s a very unlikely configuration for the pennies to land in if, say, you drop them.

But what if I had not all heads, but 99 heads and one tail? How many configurations have one tail? Well, now there are a bunch of rearrangements. Let’s say the first coin is tails and the other 99 are heads. You can rearrange that, make it the second coin, tails, and the first is back to heads. That still has 99 heads, or the fifth coin is tails, or the lone tail is the 17th coin, or the lone tail is the 99th coin. You see, there are 100 possibilities. There are 100 states, if you will, that meet the stipulation of having 99 heads, and so if you’re randomly throwing coins on the table, it’s 100 times more likely that you’ll get 99 heads than you will get 100 heads, and therefore, much more likely that you’ll have at least one tail, but you can keep on going.

What if you have 98 heads? Well, now think about it. The two tails could be coins one and two, or coins one and three, or coins two and three, or four and five, or six and 77. There are, in fact, 100 choose two possibilities, if you know a little combinatorics, for the number of configurations that have two tails, and 100 choose two is 100 times 99 divided by two, so it’s 50 times 99. That’s 4,950. So, it’s almost 5,000 times more likely that you’ll have two tails than no tails. Keep on going. What if you have 97 heads? If you work that one out, again, with three tails, it could be coins one, two, and three, or coins one, two, and four, or coins one, two, and five, or coins two, five, and seven. It just keeps on going.

How many are there? There are 100 choose three, which is 161,700 possibilities in that particular case, an interestingly large factor by which you’ll be more likely to have that number of heads compared to the configuration I started out with, which had no tails at all. What about 96 heads? I don’t know that one off the top of my head, so I’m going to just estimate it. I believe it’s about four million. Sorry I can’t give you the exact number there. If you’re interested, it’s easy to work out, just 100 choose four, and this keeps on going.
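The counts quoted above are exact binomial coefficients, and are easy to check (my own aside; the episode just quotes the numbers):

```python
# Number of 100-coin configurations with a given number of tails:
# "100 choose k" counts the ways to pick which k coins show tails.
from math import comb

print(comb(100, 0))   # all heads: just 1 configuration
print(comb(100, 2))   # 2 tails: 4,950
print(comb(100, 3))   # 3 tails: 161,700
print(comb(100, 4))   # 4 tails: 3,921,225 -- the "about four million"
print(comb(100, 50))  # the 50/50 split: an astronomically larger count
```

The 50/50 count is roughly 10²⁹, which is the huge number brought up on screen a moment later.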

The point is, I don’t care about these exact numbers that we have here. I just care about the trend that is being illustrated. The trend is that if you have ever more tails, it’s ever more likely that if you randomly drop the coins, you’ll have that number of tails. In fact, this keeps on going until you get to 50 heads and 50 tails, meaning an equal split, and that one is the number I was trying to bring up. I do want to show you that number, if I can find it. I guess I have it here, so I’m going to bring it up on the screen. There it is. I don’t know how to pronounce that number, so I won’t try, but it’s a big number. It’s a big number.

So, if you think of the entropy of a state as, roughly, the log of these numbers, you see that the count is one for all heads, and it’s this huge number for 50 heads and 50 tails, which means if I had these coins on my table, and I banged them and jostled them around, kicked the table, as I described, you would expect that over time you would approach 50 heads and 50 tails, or something very close to that, because there are so many ways to realize that state, and so few ways to realize the low-entropy state of all heads. I want to show you this drive from order toward disorder, from, say, the configuration with all heads to one that has a 50/50 mixture.
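Plugging the counts into Boltzmann's formula makes this concrete. A sketch of my own, taking k = 1 for simplicity, so S = log W with W the number of rearrangements:

```python
# Boltzmann entropy S = k log W for the coin macrostates (take k = 1).
from math import comb, log

S_ordered = log(comb(100, 0))   # all heads: W = 1, so S = log 1 = 0
S_mixed = log(comb(100, 50))    # 50/50 split: W is about 1e29

print(S_ordered, S_mixed)       # the ordered state has zero entropy;
                                # the mixed state's entropy is far larger
```

The logarithm is what makes entropy additive: doubling the system roughly doubles S rather than squaring W.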

I have a little simulation that a guy at the World Science Fest, a really smart guy, Danny Swift, made for me. I asked him, “Hey, I want a little computer simulation to show the folks on Your Daily Equation how the coins, the 100 pennies evolve over time,” and he quickly came up with this, which is really cool. So, let me see if I know how to make it work. There it is. All right. So, it asks me, “How many coins are there?” Let me actually do 1,000, make it even more dramatic, not 100, just because the numbers are even more extreme. Then it says, “How many heads are up?” I’m going to start in a completely ordered state of, say… I’m going to do all tails. What difference does it make? Let me start with all tails, so no heads up. Then it says, “How many would you like to flip?”

So, this is saying, I’m kicking the table, “How hard are you kicking the table? On average, how many coins will flip over?” I’m going to say, roughly speaking, 25 coins on each kick are going to change their disposition, either from tail to head or head to tail, and they’re going to be randomly chosen by the simulation, and, “How many times do you want to kick the table?” Well, it’s a computer after all, so I don’t have to kick literally, so let me just say, I don’t know, 2,000 times that happens, and then, “How many tosses per frame?” That’s when I’m plotting it out, how quickly the plot will go. I don’t know. Let me go four per frame. I don’t even know if that’s a good number or not. I know that the graph is going to come out here, so I’m going to quickly try to bring it over once that starts to go.

There it is. There’s my graph. Notice that I started in the ordered state down here. Over time, we’re going closer and closer to the 50/50 split, which here it would be 500 heads and 500 tails. We’ve now gone from order all the way up, more entropy, more disorder, more disorder, and once we hit the 50/50 split, then we pretty much stay in that range. There’ll be some fluctuations when I kick the table, and I get a little more heads than tails, or I don’t know what… Yeah, a little more heads than tails, a little more tails than heads over here, but for the most part, once we reach that maximal entropy state, we’ve pretty much just meandered there.

It’s not that we can’t go back to an ordered state, like all heads or all tails, it’s just so fantastically unlikely. How unlikely? There’s only one state that has all heads, whereas we have this huge number that I showed you before, that 100… I don’t even know what it is. What is it, 100 billion, billion, or whatever it is? I don’t know. I’d have to count, or maybe it’s 100 billion, billion, billion. I’d have to count the number of digits. But there are so many states that have this roughly 50/50 split, and that’s why we’re meandering around them. Some heads go to tails, some tails go to heads, but on average, we’re pretty much staying in the 50/50 split, in this case, 500 heads and 500 tails. For us to go down to here would be an incredibly unlikely move. We’d have to have the coins all just flipping the right way, that singular way, to yield all heads or all tails, which is highly unlikely to happen.
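The table-kicking simulation described above is easy to reproduce. Here is a minimal sketch of my own (the function name and parameters are mine, not taken from Danny Swift's program): start with 1,000 coins all tails, flip 25 randomly chosen coins per kick, and track the head count over 2,000 kicks.

```python
import random

def simulate(n_coins=1000, start_heads=0, flips_per_kick=25,
             kicks=2000, seed=0):
    """Kick the table `kicks` times; each kick flips a random subset of coins.

    Returns the head count after each kick.
    """
    rng = random.Random(seed)
    coins = [True] * start_heads + [False] * (n_coins - start_heads)
    history = [sum(coins)]
    for _ in range(kicks):
        for i in rng.sample(range(n_coins), flips_per_kick):
            coins[i] = not coins[i]   # head -> tail or tail -> head
        history.append(sum(coins))
    return history

history = simulate()
# starts fully ordered (0 heads), then drifts up to the 50/50 split
# of about 500 heads and hovers there with small fluctuations
print(history[0], history[-1])
```

Plotting `history` reproduces the on-screen graph: a rise from order to the maximal-entropy plateau, then meandering around it.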

This is a nice example to illustrate the move from low entropy to higher entropy, from order to disorder, and how unlikely the reverse process is. So, if you think about, for instance, this ordered state as our wine glass, very few rearrangements of the molecules of the wine glass will leave it intact, compared to the number of rearrangements of the shards that leave the shards in a disordered mess. You start moving the molecules of the wine glass around, it breaks, it deforms, it warps, whatever, it doesn’t look the same any longer. But you start moving around the molecules of the shards of glass and the splattered wine, and it’s like the messy desk. Pretty much looks like a disordered mess before, pretty much looks like a disordered mess after.

So, there are so many ways for the molecules of that glass to be disordered, that high-entropy state, and so few ways for the molecules to be ordered in that beautiful Riedel glass, that once you have the progression from order to disorder, it is very unlikely for the reverse process to happen, and we saw how difficult the reverse process is. You have to change the velocities of all the shards in just the right way for them to come back together. For that to randomly happen in the real world is incredibly unlikely. In the real world, there aren’t people running around changing the velocities of molecules and atoms. In the real world, it’s just thermal motion banging things around, and for the random thermal motion to happen to be just right to make all the molecules in all the shards of glass do what I showed you in that film is extraordinarily unlikely.

So, there’s our arrow, if you will, of time: the natural progression from order toward disorder, from low entropy to high entropy. Let me just make, well, three statements. Number one, this is the second law of thermodynamics, the natural tendency to go from order to disorder, and you see that it doesn’t really require… I mean, if you take this in a statistical mechanics course, it will be laid out with more rigor, and more formalism will be developed, but at the end of the day, it’s nothing but logical reasoning with numbers. There are very few ways to be ordered, and a huge number of ways to be disordered, and things are randomly sampling the possibilities, so it’s more likely that they will find themselves in disordered states compared to orderly ones. Nothing to it, in some sense.

That’s why Einstein, I believe that’s why Einstein describes these kinds of ideas as the only ones that he was confident would never be overthrown. He knew that his general theory of relativity and special relativity, they had to be just approximate descriptions of the world. He was realistic about that, and he imagined that one day they might be and would be superseded. But when it came to these kinds of ideas, he didn’t think that they’d ever be superseded because they don’t rely upon anything but kind of logic and numbers. Okay? That’s sort of point number one.

Point number two, the second law of thermodynamics is not a law in the conventional sense. It’s only, as we’ve seen, a statistical likelihood. In fact, if you don’t mind, I’m going to do one other example, if I can bring this up on the screen, just to show you what I mean. It’s not that entropy can’t go down. It’s just that it’s unlikely to go down. So, let me do a simple system. Let me not do so many coins. Let me do 10 coins, and let me imagine that… I don’t know. Let’s start with five up. So, we’re going to start completely disordered, five heads, five tails, and let’s say we flip, I don’t know, three each time we jiggle it around. Maybe that’s okay. Whatever.

How many times are you going to do it? Let’s do it many times. Let’s do it, I don’t know, 10,000 times. And how many tosses per frame? I better do a lot, or we’re going to be here forever; the computer’s going to take forever. So, let me do, I don’t know, 100 tosses per frame as we plot this. Let me bring it over here so I can bring that graph over quickly. Okay, there it is. Oh, and look at that. You see, we started right here in the middle, at the 50/50 split, but notice that over time we’re fluctuating to highly-ordered states, where we have all heads or all tails. The reason is that we’ve only got 10 coins, but my point is, here’s an example that accentuates the likelihood of that rare fluctuation from disorder to order, the reverse of what we are used to.
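The small-system version can be sketched the same way (again my own illustration, not Danny Swift's code): with only 10 coins, the head count wanders all the way back to near-ordered states over and over.

```python
import random

rng = random.Random(1)
coins = [True] * 5 + [False] * 5     # start disordered: 5 heads, 5 tails
counts = []
for _ in range(10000):
    for i in rng.sample(range(10), 3):   # each "kick" flips 3 random coins
        coins[i] = not coins[i]
    counts.append(sum(coins))

# over 10,000 kicks, this tiny system repeatedly fluctuates to
# nearly all heads or nearly all tails -- entropy goes down, often
print(min(counts), max(counts))
```

With 1,000 coins such excursions essentially never happen; with 10, they happen constantly. That is the whole point: the second law's grip tightens as the particle count grows.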

That’s kind of beautiful because we see that the second law is just a statistical tendency. It is not a law. I mean, Newton’s second law is a law. It is not a tendency. Schrödinger’s equation is meant to be a law, not a likelihood. The second law of thermodynamics is, however, a tendency, an overwhelming tendency, but a tendency nevertheless. Entropy can go down. It is just unlikely for that to happen.

Okay. What is my third point? My third point is this. Have we answered the question of the arrow of time? You might think we have, because now we understand why glasses shatter and we don’t ever see them un-shatter. We now understand why, say, a candle burns, but we never see it un-burn. We never see all the fumes and the aroma come back together to recreate the candle. We never see that, because those would be entropically decreasing processes, which can happen, as I just showed you. They’re just unlikely, and they become ever more unlikely as the number of particles involved grows larger. That’s why I used only 10 pennies in the example where I wanted you to see fluctuations to lower entropy.
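To make “ever more unlikely” quantitative in the coin picture: with N coins there are 2^N equally likely arrangements, the fully ordered all-heads macrostate corresponds to just one of them, while the disordered 50/50 macrostate corresponds to C(N, N/2) of them. A quick comparison of N = 10 with N = 100 (the particle counts here are just illustrative):

```python
from math import comb

for n in (10, 100):
    total = 2 ** n                  # all possible head/tail arrangements
    ordered = 1                     # exactly one all-heads microstate
    disordered = comb(n, n // 2)    # arrangements with a 50/50 split
    print(f"N={n}: P(all heads) = {ordered / total:.3g}, "
          f"P(50/50 split) = {disordered / total:.3g}")
```

With 10 coins the all-heads fluctuation is about a one-in-a-thousand event per arrangement, common enough to show up on screen; with 100 coins its probability falls below 10^-30, which is why the demo uses only 10 pennies, and why macroscopic objects with ~10^23 particles never visibly fluctuate to lower entropy.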

But have we fully answered the question? Not really, because you still want to ask yourself, if the high-entropy states are the more likely ones, why do we ever have any order at all? Why do we have a pristine wine glass? Where did its order come from? If it’s unlikely to have ordered states and more likely to have disordered states, why aren’t we just always disordered? Where did the order come from? To try to answer that question, it’s natural to think temporally. You can say, “Okay. Today, the universe has a certain amount of entropy.” If the second law holds, you would think that yesterday you’d have less entropy, and the day before, less entropy, and if you follow this all the way back, then you’re led to the Big Bang, and you’re led to imagine that the Big Bang was a highly-ordered, low-entropy state, the lowest entropy that the universe has ever had.

We don’t have an argument that establishes that the Big Bang was highly ordered, that it had low entropy. We posit it. It’s a hypothesis, usually called the past hypothesis; I believe David Albert gave this hypothesis that name. It’s not that everybody believes this is the right way to go, but I am describing one chain of reasoning which at least holds together, albeit under the assumptions that it makes, and the assumption is that, for whatever reason, the early universe had extraordinarily low entropy, extraordinarily high order. To give a better picture, let me go back over here for a second so I can have a nice graph to show this with. So if we do our 1,000, zero, 25, say, let’s do it 1,000 times, and let’s make this go really fast, because I don’t have time to wait, and the graph is happening here.

Whoa, you can sort of see it pretty fast. But the point is the beginning of the universe, highly-ordered. Over time, the entropy has been increasing, and we’re sort of here. We’re in a part of the unfolding, and an interesting question is, does the universe have a maximal entropy? I don’t need to go all the way up here, but we’re over here, say, where the universe still has some residual order from the Big Bang. So, the idea is the reason why there can be ordered structures, like wine glasses and candles and flowers and planets and people and stars, the reason why there can be ordered structures in the universe is because the Big Bang was so fantastically ordered that en route to ever-greater disorder, en route on that journey, there is still residual order from the Big Bang along the way.

So, I like to say, when you drop a wine glass and it smashes, or an egg splatters on the floor, you are actually witnessing something that’s deeply connected to the Big Bang itself, because the very existence of the wine glass, the very existence of the egg relies upon the orderly Big Bang for there to be any order today which can be embodied in, say, a wine glass or an egg or a planet or a person, or any of the structures that are ordered in the world around us. So, this is key to the arrow of time. It’s not just that entropy increases over time. It’s not just that order degrades into disorder. It’s also that there’s an anchor.

You have to explain why there’s any order at all, or else there wouldn’t be any opportunity for order to degrade into disorder. To explain why there’s any order at all, we are led right back to the Big Bang and the assumption that the Big Bang was a highly ordered, low-entropy state. With that assumption, the past hypothesis that the beginning was highly ordered, together with the second law of thermodynamics and the overwhelming tendency of entropy to increase over time, we get a natural orientation of time, a natural notion of what it means to head toward the future.

So, the laws of physics are agnostic about past and future, as we started out: for any trajectory that can unfold toward the future, the reverse trajectory also solves the equations of motion. But with the past hypothesis, the low-entropy beginning, and the second law of thermodynamics, an orientation to time emerges. Is that the end of the story? No. I mean, we want to understand: can we give an explanation for why the Big Bang was highly ordered? Can we give some deeper principle that explains how that order came to be, or do we need to simply accept that we have, say, one universe, and that’s how it began, period, end of story? Or, as some have suggested, maybe there’s a “before.” Maybe there’s another side to time. Maybe time doesn’t begin at the Big Bang, and maybe, as some have suggested, if you go through the Big Bang, entropy increases in a symmetric way.

So, maybe you start with a very high-entropy infinite past that comes down to our Big Bang, and then from there it heads back toward very high entropy again. That would be completely symmetric. That’s a possibility people talk about too. But in any event, the account that, at least at the moment, I find most convincing is the past hypothesis plus the second law of thermodynamics: entropy tending to increase from the highly ordered beginning, and that is where the asymmetry in our experience comes from. That’s why we never see those things that make us laugh in reverse-run films, those things that look absurd. They’re not absurd based upon the laws of physics. They’re absurd based upon our assumption about the ordered state of the Big Bang and our understanding, from entropy and the second law of thermodynamics, of this overwhelming tendency to head from order toward disorder.

Okay. That’s all I wanted to say today. Maybe a natural next step at some point soon will be to apply these ideas of entropy and relate them to information; something on Shannon’s information theory would be good. But also to relate them to the physics of black holes, where entropy and these ideas really flower in unexpected and deep ways. Well, anyway, that’s for the future. As for today: arrow of time, entropy, Big Bang, past hypothesis, that’s all I wanted to say. Until next time, this has been Your Daily Equation. Take care.
