"What happens if I'm traveling at the speed of light, and I try to look at myself in a mirror?"
All high school experiences have one thing in common: there are always a handful of students (the cool kids) who feel the insatiable need to mock everything and everyone around them. This is why we like to think of ourselves as the cool kids of physics, if such a thing could be said to exist. We'll give you an example. We spent part of the introduction making fun of textbook authors who need to use examples involving cataclysmic natural events, sports, or monster trucks to "make physics come alive." We aren't backpedaling, but some of those goofy examples have a tiny bit of merit.
That, and we know in our heart of hearts that we'll never get this physics party started unless we set off some fireworks. If you've ever been to the local Chamber of Commerce Independence Day celebration and decided to get a little physics in, you'll have noted that there's a time delay between the rockets' red glare and the sounds of bombs bursting in air. You see the explosion several seconds before you hear the sound. You've probably experienced the same thing if you've ever had back-of-the-theater tickets at a concert: the sound lags noticeably behind the musicians you see on stage. Sound moves fast, but light moves faster.
In 1638, Galileo of Pisa (one of the original cool kids of physics) devised a scheme to figure out the speed of light. The experiment went like this: Galileo parked himself on a hill with a lantern, while his assistant, armed with his own lantern, walked off to a distant hill. The two signaled each other. Each time Galileo saw his assistant's lantern open or close, he would toggle his own, and vice versa. By performing the experiment on more and more distant hills, Galileo hoped to measure the speed of light. The precision wasn't really there, but no one can blame him for taking a crack at it, and he did come to a pretty interesting conclusion.
If it isn't infinite, the speed of light is pretty darn fast.
Over the next few centuries, physicists made ever more precise measurements, but we won't bother you with the design specs for the intricate instrumentation. Suffice it to say that as time went on, scientists grew more and more determined to shed light on light.
The modern value of the speed of light is 299,792,458 meters per second. Rather than rattle off all of the digits, we'll simply call it c, for the Latin celeritas, meaning "swift." This is not the kind of number you get with a ruler and an egg timer. To measure c this precisely, you have to use an atomic clock powered by cesium-133 atoms. The scientific community defines the second as the time it takes the light emitted by the "hyperfine transition" of cesium-133 to complete exactly 9,192,631,770 cycles. This may sound unnecessarily confusing, but it actually simplifies things a great deal. The second, like your hat size, becomes something that we define in terms of something real; a bunch of physicists could build cesium clocks, and since all cesium acts the same, everyone tells the same time.
We've come up with a creative way of defining the second, but how does that help us measure the speed of light? Speeds are ratios of distance over time, such as miles per hour, and defining the second gives us some leverage. The only thing left to do is determine the length of a meter. This may seem pretty obvious since a meter is exactly one meter long. Just get out a meter stick and you're all set. But how long is that?
From 1889 until 1960, if you wanted to know how tall you were, you'd have to go to the International Bureau of Weights and Measures in Sèvres, France, go into their vault, and take out their platinum meter stick to measure yourself. Not only was this cumbersome (and illegal, if you didn't ask nicely to use it first), it was also pretty inaccurate. Most materials, including platinum, expand when heated. Under the old system, a meter was slightly longer on hot days than on cool ones.
So instead of using an actual meter stick, we have a clock capable of measuring a second, and we define a meter as 1/299,792,458 of the distance that light travels in 1 second. To make this blindingly obvious, what we've done is say, "We know the speed of light exactly. Meters, on the other hand, have a tiny uncertainty." All this hard work means that we can standardize the second and the meter, and everyone uses the same measurement system.
Keep in mind, though, that the crux of it all is that light doesn't move infinitely fast. Not impressed? Brace yourself for a philosophical bombshell: because light moves at a finite speed, we are forever gazing into the past. As you read this book from a foot away, you're seeing it as it was about a billionth of a second earlier. The light from the Sun takes about eight minutes to reach Earth, so our star could well have burned out five minutes ago and we'd have no way of knowing it. When we look at stars in our Galaxy, the light takes hundreds or even thousands of years to reach us, and so it is a very real possibility that some of the stars we see in the sky are no longer around.
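For readers who like to check the arithmetic, here's a quick back-of-the-envelope sketch. The one-foot and Earth-Sun distances are standard values we've supplied, not figures from the text:

```python
# A numerical check of the light-travel delays quoted above. The distances
# (one foot, the mean Earth-Sun distance) are standard reference values.
C = 299_792_458            # speed of light, meters per second (exact)
FOOT = 0.3048              # meters in one foot
EARTH_SUN = 1.496e11       # mean Earth-Sun distance, meters

book_delay = FOOT / C      # time for light to cross one foot
sun_delay = EARTH_SUN / C  # time for sunlight to reach Earth

print(f"One foot away: {book_delay * 1e9:.1f} nanoseconds")  # ~1 ns
print(f"Sun to Earth:  {sun_delay / 60:.1f} minutes")        # ~8.3 minutes
```

A foot really does work out to about a billionth of a second, which is why "a foot is a light-nanosecond" is a favorite physicists' party trick.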
Why can't you tell how fast a ship is moving through fog?
No experiment has ever produced a particle traveling faster than the speed of light. The speed limit of the universe seems to be something we can't brush off even if we wanted to, and the constant speed of light is just the first of two ingredients in what will turn out to be one of the finest physics dishes ever cooked. For the second, we need to think about what it even means to be moving at all.
Allow us to introduce you to Rusty, a physicist-hobo riding the rails, ostracized by society for the unique standards of hygiene common to his lot. Rusty has managed to "borrow" the platinum meter stick from the International Bureau of Weights and Measures (which, while not perfect, is still pretty good by hobo standards), and he has a bunch of cesium atoms to build an atomic clock.
He passes his day by throwing his bindle across the train. Each time he throws it, he measures the distance it travels and the time it takes to cover that distance. Since speed is the ratio of distance traveled to the time it takes to cover it (think miles per hour), Rusty is able to calculate the speed of his bindle with high accuracy.
After a tiring day of bindle-tossing, Rusty nods off to sleep, and he awakes in his own private freight car. Since freight cars don't have any windows, and the train is moving on smooth track, he finds himself somewhat disoriented when he slides open the door and finds that he is moving. You may have noticed that even in cars, you sometimes can't tell that you're moving without looking out the window.
You also may not have noticed that if you're standing on the equator, you're spinning around Earth's axis at more than 1,000 mph. Faster still, Earth is moving at about 67,000 mph around the Sun. And the Sun is moving at close to 500,000 mph around the center of our Milky Way Galaxy, which, in turn, is traveling through space at well over 1 million mph.
The point is that you (or Rusty) don't notice the train (or Earth, or the Sun, or the Galaxy) moving, regardless of how fast it's moving, as long as it does so smoothly and in a straight line.
Galileo used this argument in favor of Earth going around the Sun. Most people at the time assumed that you'd be able to somehow feel Earth's motion if it really were flying around the Sun; since we feel nothing, they reasoned, we must be standing still.
"Nonsense!" said Galileo. Not having a ready supply of either hobos or trains, he compared the motion of Earth to a ship moving on a calm sea. It's impossible for a sailor to tell under those circumstances whether he's moving or standing still. This principle has come to be known as "Galilean relativity" (not to be confused with Albert Einstein's special relativity, which we will encounter shortly).
According to Galileo (and Isaac Newton, and ultimately Einstein) there is quite literally no experiment you can do on a smoothly moving train that will give a different result than if you were sitting still. Think back to trips with your family in which you threw mustard packets at your little brother until your parents threatened to "turn this car around this minute, young man!" Even though the car was moving at 60 mph or more, you threw the packets exactly as you would have if the car were sitting still. Like it or not, all of that tormenting was nothing more than a simple physics experiment. On the other hand, this is only true if the speed and direction of the car/train/planet/galaxy are exactly (or really, really close to) constant. You definitely felt it if your parents actually made good on their threat and slammed on the brakes.
So when he awakes from his blissful hobo slumber to return to his bindle-tossing experiments, Rusty might be quite unaware that the train has started steadily moving at about 15 mph. After arranging himself at one end of the train car, he tosses his bindle and measures the speed at, say, 5 mph. Patches, a fellow hobo-physicist, stands outside the moving train but also decides to participate. Using special hobo X-ray goggles to see through the train's walls, he also measures the speed of the bindle as Rusty throws it. Patches, from his vantage point outside the train, finds the bindle to move at about 20 mph (the 15 mph that Rusty's train is moving plus the 5 mph of the bindle).
So who's right? Is the bindle moving at 5 mph or 20 mph? Well, both are correct. We'd say that it's moving at 20 mph with respect to Patches and 5 mph with respect to Rusty.
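Rusty and Patches' disagreement is easy to put in code: in everyday (Galilean) relativity, the two speeds simply add. This little sketch just restates the arithmetic of the example, with all speeds in mph:

```python
# Galilean velocity addition: the speed an observer on the ground measures
# is the speed inside the moving frame plus the speed of the frame itself.
def ground_speed(speed_in_car, car_speed):
    """Speed seen from the ground for an object thrown inside a moving car."""
    return speed_in_car + car_speed

bindle_per_rusty = 5     # mph, bindle relative to the train car
train_per_patches = 15   # mph, train relative to the ground

print(ground_speed(bindle_per_rusty, train_per_patches))  # 20 mph, per Patches
```

Both numbers are correct; they're just measured with respect to different observers.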
Now imagine that our train has a high-tech lab equipped with lasers (which, being made of light, naturally travel at c). At the back of the train sits the laser, manned by Rusty. At the front of the train sits an open can of baked beans. If Rusty turned on the laser for a short pulse (to heat his baked beans, naturally) and measured the time for the beans to start cooking, he could compute the speed of the light pulse, and he'd find it to be c.
What about Patches? He will, presumably, measure the same amount of time for the light pulse to reach the beans. According to him, though, the light has to travel farther to get there, since the front of the car moves forward while the pulse is in flight, so he should measure the speed of the pulse to be faster than c. In fact, common sense tells us that he should measure the pulse to be moving at c + 15 mph. Earlier we said that Einstein assumed that the speed of light is constant for all observers, but by our reasoning the speed of the beam doesn't appear to be constant at all! Could the great Einstein be wrong?
Fifteen pages into the book, and we've already broken the laws of physics. We couldn't be any more embarrassed if we showed up to a party wearing the same dress as the hostess. It looks like we just blew it. If only there were some obsessive scientist we could look to, some concrete example to revalidate the concept of c as a constant.
We just so happen to have such a scientist. His name was Albert Michelson, and he loved light in a way that today might be characterized as "obsessive" or "unhealthy." His scientific career began in 1881, after he left the Navy to pursue science. He measured light independently for a while, doing gigs in Berlin, Potsdam, and Canada, until he met Edward Morley. They worked together to produce ever more elaborate devices for measuring the speed of light, eventually reaching number 1 with "Bridge over Troubled Water," which stayed at the top of the charts for six straight weeks.
The devices they constructed worked on the following basic premise: since Earth travels around the Sun once a year, their lab should travel at different speeds and in different directions at different times of year. Michelson's "interferometer" was designed to measure whether the speed of light was different when measured along different directions of that motion. Your basic intuition should tell you that as Earth changes its direction of motion over the year, the measured value of c should change.
Your intuition is wrong. In experiment after experiment, Michelson and Morley showed that no matter what the direction of motion, the speed of light was the same everywhere.
As of 1887, this was a pretty big conundrum, and it defied the senses, because nothing else behaves this way. If you found yourself on a bike, face-to-face with an angry cow, it would make all the difference in the world whether you rode toward or away from the charging animal. Whether you run toward or away from a light source, on the other hand, c is c.
Putting it even more bluntly (on the off chance that the strangeness of this still isn't clear), if you were to shine a laser pointer at a high-tech measuring device, then you would measure the photons (light particles) coming out of the laser pointer at about 300 million meters per second. If you were in a glass spaceship traveling away from a laser at half the speed of light (150 million meters per second) and someone fired the laser beam through your ship to a detector, you would still measure the beam to be traveling at the speed of light.
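Peeking slightly ahead of the story: special relativity replaces the simple addition of speeds with the combination formula (u + v) / (1 + uv/c²). That formula comes from relativity itself, not from anything derived above, but it shows why the glass-spaceship experiment comes out the way it does. Plug in a light beam and any ship speed, and the answer is always exactly c:

```python
# Relativistic velocity addition: at everyday speeds this is almost exactly
# u + v, but a light beam combined with any speed always comes out to c.
C = 299_792_458.0   # speed of light, m/s

def relativistic_add(u, v):
    """Combine a speed u (measured in a moving frame) with the frame speed v."""
    return (u + v) / (1 + u * v / C**2)

ship_speed = C / 2                       # glass spaceship at half light speed
print(relativistic_add(C, -ship_speed))  # 299792458.0 -- still exactly c
```

Notice that the formula quietly reduces to ordinary Galilean addition when u and v are tiny compared to c, which is why Rusty's bindle behaves so sensibly.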
How is that even remotely possible?
To explain this, we need to take a closer look at a hero of physics, the "Light"-Weight Champion* of the World: Albert Einstein.
How fast does a light beam go if you're running beside it?
When Einstein first proposed his principle of special relativity in 1905, he made two very simple assumptions:
1. Just like Galileo, he assumed that if you were traveling at constant speed and direction, you could do any experiment you like and the results would be indistinguishable from doing the same experiment in a stationary position. (Well, sort of. Our lawyers advise us to point out that gravity accelerates things, and special relativity relies on there being no accelerations at all. There are corrections that will take gravity into account, but we can safely ignore them in this case. The correction required for the force of gravity on Earth is very, very small compared to the correction near the edge of a black hole.)
2. Unlike Newton, Einstein assumed that all observers measure the same speed of light through empty space, regardless of whether they are moving.
In our hobo example, Rusty threw his bindle and measured the speed by dividing the length of the car by the time the bindle took to hit the side. Patches sat by the side of the tracks and watched the train and bindle speed by, and therefore saw the bindle move farther (across the car and across the ground the car covered) in the same amount of time. He saw the bindle move faster than Rusty did.
But now consider the same case with a laser pointer. If Einstein was right (and Michelson and Morley's experiment demonstrated, almost two decades earlier, that he was) then Rusty should measure the laser moving at c and Patches should measure the same exact speed.
Most physicists accept that c is a constant without batting an eyelash, and use it to their collective advantage. One way they exploit it is to express distances in terms of the distance light can travel in a particular amount of time. For example, a "light-second" is approximately 186,000 miles, about three-quarters of the distance to the Moon. Naturally, it takes light 1 second to travel 1 light-second. Astronomers more commonly use the unit "light-year," which is about 6 trillion miles, about a quarter the distance to the nearest star outside our solar system.
So let's make our previous example a little weirder and give our hobo-physicist an intergalactic freight car. It's 1 light-second long, and while Rusty has more space than he will ever need to stretch out and nap, he has the perfect amount of space to run his laser experiment again. He fires off the laser from the back of the train and, by his reckoning, the laser takes 1 second to traverse the train. It must, after all, because light travels at the speed of light (duh!).
(Continues...)
Excerpted from A User's Guide to the Universe by Dave Goldberg and Jeff Blomquist. Copyright © 2010 by John Wiley & Sons, Ltd. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.