edit 4/2/2012: This result has now, almost certainly, been ruled out, after the discovery of a loose fiber optic connection and rigorous testing that attributed the timing anomaly to this error. I’ve written more about this story, and the role mistakes play in the scientific process, <a href="http://wp.me/pWGC2-1lR">here</a>.

On Friday, one of my students walked in and said something along the lines of “Hey Mr. Burk, did you hear about that neutrino discovery that proves all of physics wrong?”

Cue the record scratch. It was time to take a detour to explore what the awesome neutrino discovery really means, and how it is both a remarkable example of the power of telling the full story of a number and a beautiful example of the power of science.

First, I thought it was pretty awesome that thanks to twitter and blogging, I not only knew exactly what the student was talking about when he mentioned a discovery that was less than a day old, I’d also had a chance to read a few blog posts about it and even skim the original paper on arXiv.

But let’s back up. Maybe you’ve been living under a rock and don’t know at all what I’m talking about (or maybe you just don’t hang out with physics geeks like me much). On Thursday, scientists from the Gran Sasso National Laboratory in Italy released findings that they had measured the speed of neutrinos created in experiments at CERN and determined it to be very slightly faster than the speed of light.

Here’s how the NYT reported it:

According to scientists familiar with the paper, the neutrinos raced from a particle accelerator at CERN outside Geneva, where they were created, to a cavern underneath Gran Sasso in Italy, a distance of about 450 miles, about 60 nanoseconds faster than it would take a light beam. That amounts to a speed greater than light by about 0.0025 percent (2.5 parts in a hundred thousand).

And here’s a writeup in CNET with a great graphic and some data we can use:

As part of the OPERA experiment, physicists tracked how long it takes for neutrinos generated at CERN to reach a detector 730km away in Italy.
(Credit: National Institute of Nuclear Physics (INFN) in Italy)
European physicists have measured tiny particles called neutrinos moving just faster than the speed of light–only a smidgen faster, but enough to raise a serious possibility that Einstein’s physics need a major overhaul.
The scientists sent a beam of neutrinos from CERN, on the Swiss-French border near Geneva, to the INFN (Istituto Nazionale di Fisica Nucleare) Gran Sasso Laboratory in central Italy, 730 kilometers (454 miles) away, in a research project called OPERA…But over the last three years, the OPERA experiment has gathered high-precision data on exactly how long it took for the neutrinos to make a journey that should last about 2.4 thousandths of a second.

Ok, here’s where the news media misses a chance to tell the complete story of a number. Both of the measurements reported here are wildly less precise than the values actually reported in the paper. Let’s do the speed calculation and see what we get.

$v_\nu=\frac{\Delta x}{\Delta t}=\frac{7.3\times10^5\;\textrm{m}}{2.4\times10^{-3}\;\textrm{s}}=3.0\times10^8\;\frac{\textrm{m}}{\textrm{s}}$

But that’s basically the value for the speed of light, so what’s the big deal? The sig fig police would tell you that number is impossible to distinguish from $2.99792458\times10^8\;\frac{\textrm{m}}{\textrm{s}}$, the defined value for c.
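Just to check, here’s that back-of-the-envelope calculation as a quick Python sketch (my own, using the rounded numbers from the news reports):

```python
# Naive speed estimate from the rounded numbers in the news reports.
distance = 7.3e5      # m, roughly 730 km from CERN to Gran Sasso
time = 2.4e-3         # s, the "2.4 thousandths of a second" travel time
c = 2.99792458e8      # m/s, the defined speed of light

v = distance / time
print(v)              # ~3.04e8 m/s
print(abs(v - c) / c) # ~0.015 -- the rounding slop dwarfs a 0.0025% effect
```

With numbers this coarse, the result can’t possibly distinguish the neutrinos from light; that’s exactly the sig-fig point above.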

Good thing the scientists who did this work weren’t nearly so lackadaisical with their measurements as the journalists were in reporting them. The actual distance between the CERN lab where the neutrinos were created and Italy’s National Institute for Nuclear Physics, where they were detected, was measured to be $7.312780\times 10^5 \pm 0.2\; \textrm{m}$. That’s right: scientists measured the distance between two points in completely different countries to plus or minus 20 centimeters, less than the length of a standard ruler. This is an amazing measurement, and it is far more meaningful than when my students write down the velocity of a buggy that travels 2 meters in 6 seconds as $0.33333333333333 \frac{\textrm{m}}{\textrm{s}}$. How did scientists do this? It wasn’t by laying meter sticks end to end. To make this measurement, scientists used very precise differential GPS, and their measurement was so precise that they could track continental drift between the two locations; a 2009 earthquake that produced a 7 cm shift in the earth’s crust shows up as clear as day.

How’s that for the story of a number? Modern science can measure the distance between two things 450 miles apart so precisely that it can actually measure that they are separating at a rate of 1 cm/year.

The next thing I told my students was that they needed to understand a bit about neutrinos. Neutrinos have almost no mass (something we only discovered long after they were first detected), no charge, and interact only very weakly with matter, making them incredibly difficult to detect, even though 50 billion of them are passing through you every second.

So this is a big problem. If you can barely detect neutrinos, how do you measure the time it takes them to travel from CERN to OPERA? It’s not like we can tag ’em when they leave CERN and then spot them when they pass OPERA a few thousandths of a second later. In the entire 3-year timespan of the OPERA experiment, scientists detected only 16,000 neutrinos from CERN (out of $10^{20}$ total proton collisions with the graphite target that produces the neutrinos). But the protons are directly related to the production of the neutrinos. Protons are also massive and charged, and therefore easy to detect, making them a great proxy for the creation of the neutrinos. Since CERN’s main job is all about colliding beams of protons, physicists know a lot about the shape and characteristics of these proton beams.

This gives us a hint about how to find the time of flight for the neutrinos. If we know the time that the protons hit the target, and we know the time that neutrinos are detected at the OPERA detector, we can find the speed of the neutrinos. Still, there is the problem that we can’t match an individual neutrino detection to an individual proton collision. Instead, we need to turn to statistical analysis and look at the characteristics of the entire pulse of protons, which has a very well-known shape.
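Here’s a toy sketch of how such a statistical time-shift fit works (entirely my own illustration, not OPERA’s actual analysis, and with a made-up Gaussian pulse shape): draw “detected event” times from a known pulse shape shifted by some unknown offset, then scan candidate shifts and keep the one that best fits all the events.

```python
# Toy time-shift fit (my own illustration, not OPERA's analysis).
# We pretend the pulse shape is a Gaussian, draw event times from a
# shifted copy of it, then recover the shift by maximum likelihood.
import math
import random

random.seed(42)

SIGMA = 1000.0  # ns, width of the assumed pulse shape

def pulse_pdf(t):
    """Probability density of the known pulse shape, centered at t = 0."""
    return math.exp(-t**2 / (2 * SIGMA**2)) / (SIGMA * math.sqrt(2 * math.pi))

true_shift = 60.0  # ns, the offset we want to recover
events = [random.gauss(true_shift, SIGMA) for _ in range(5000)]

def neg_log_likelihood(shift):
    # Smaller is better: how badly a trial shift fits the whole event sample.
    return -sum(math.log(pulse_pdf(t - shift)) for t in events)

# Scan trial shifts (in 2 ns steps) and keep the best-fitting one.
best_shift = min(range(-200, 201, 2), key=neg_log_likelihood)
print(best_shift)  # lands within a couple of standard errors of 60
```

No single event tells you anything; only the whole distribution, fit against the known pulse shape, pins down the offset, which is the spirit of what OPERA did with its proton waveform.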

If you plot the distribution of neutrino events, you get this graph (neutrinos are the dots in black). The red line represents the original proton distribution.

For a while, I had a very hard time understanding this graph. Why are the neutrinos plotted starting at $t=0$? Thanks to some help from @jeffhellman and his friends at CERN, I was able to figure out that this graph has already removed the time it takes for light to travel from the source to the detector, using the previously mentioned precisely measured distance between CERN and OPERA. So if the neutrinos were to travel at exactly the speed of light, they would arrive with exactly the same time distribution as the protons (scaled down by a factor of $10^{18}$ or so), and we would see the two distributions overlap. Again, this is assuming you’ve got some ridiculously awesome way to synchronize the clocks at CERN and OPERA, which they did, using an ultra-precise atomic clock to get agreement within 2 nanoseconds ($10^{-9}\; \textrm{s}$).

Why don’t the proton-beam and neutrino distributions overlap? None of the detectors in this experiment detect particles instantaneously; all of them have some small systematic delay. So the OPERA scientists performed a blind statistical analysis, fitting the shape of the proton-beam signal to the neutrino data by shifting the proton pulse along the time axis. Doing this tells you the delay, $\delta t$, between the two signals. That blind analysis produced these graphs:

The measured delay between the signals is a whopping 1048 nanoseconds. How much time is that? It’s an almost inconceivably short duration to our minds (but not to modern electronics—your computer can likely do 2-3 thousand calculations in that time). But now comes the really hard part. How much of that delay is due to the systematic delays due to the measurement equipment? This is where scientists have to measure all the delays and their uncertainties down to the nanosecond, and then subtract this from the $\delta t = 1048 \;\textrm{ns}$ measurement to see if there is any delay left over that isn’t accounted for in this work, and therefore might be taken as evidence that neutrinos are traveling faster than light. And that’s exactly what the scientists did, in this table:

So the total systematic delay they found was 988 ns. But then you need to also measure the uncertainty in each of those measurements.

Yes, that’s 7.4 ns of uncertainty there. In 7.4 ns, light can travel 2.2m. This is an amazing measurement, and a truly beautiful story of a number.
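To get a feel for that 7.4 ns figure, here’s a one-line check (my own, not from the paper) of how far light travels in that time:

```python
c = 2.99792458e8   # m/s, the defined speed of light
sigma_t = 7.4e-9   # s, the combined 7.4 ns uncertainty

print(c * sigma_t) # ~2.2 m -- how far light moves in 7.4 ns
```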

With all these measurements, you can now calculate how much delay is left over once you subtract out the systematic delay.

$\textrm{TOF}_c-\textrm{TOF}_\nu=1048.5\;\textrm{ns}-987.8\;\textrm{ns}=(60.7\pm 6.9)\;\textrm{ns}$

Note: I had the hardest time understanding the above result in the OPERA paper until Jeff explained to me that the numbers 1048 and 987 are not the actual times of flight for light and the neutrinos, respectively. Instead, they are the leftover delays when you subtract off the known time of flight for light.

There you have it—60ns of unaccounted for delay, indicating that the neutrinos are arriving at the detector 60 ns earlier than light would. Just for fun, let me write out what these calculations would look like if we were to carry them out in physics class.

$v_{\nu}=\frac{\Delta x}{\Delta t}=\frac{\Delta x}{{\delta t}_{light}-\left((\delta t)_{measured}-(\delta t)_{error}\right)}$

${\delta t}_{light}$  is just the measured distance divided by c:

${\delta t}_{light}=\frac{730534.61\;\textrm{m}}{2.99792458\times10^8\;\frac{\textrm{m}}{\textrm{s}}}=0.002436801\;\textrm{s}=2.436801\times10^{-3}\;\textrm{s}$

And putting that into the expression for $v_\nu$ gives us
$\begin{array}{rcl} {v_\nu}&=&\frac{\Delta x}{{\delta t}_{light}-\left((\delta t)_{measured}-(\delta t)_{error}\right)}=\frac{730534.61\;\textrm{m}}{{2.436801\times10^{-3}\;\textrm{s}}-\left((1.048\times10^{-6}\;\textrm{s})-(9.878\times10^{-7}\;\textrm{s})\right)}\\&=&2.997998843\times10^8\;\frac{\textrm{m}}{\textrm{s}} \end{array}$

and if we form the ratio $\frac{v_\nu}{c}$, we get
$\frac{v_\nu}{c}=\frac{2.997998843\times10^8\;\frac{\textrm{m}}{\textrm{s}}}{2.99792458\times10^8\;\frac{\textrm{m}}{\textrm{s}}}=1.0000247$.
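The whole chain above can be reproduced in a few lines of Python (my own sketch, using the same numbers as the worked calculation):

```python
c = 2.99792458e8          # m/s, defined speed of light
baseline = 730534.61      # m, the measured CERN-to-OPERA distance

tof_light = baseline / c             # time light needs: ~2.4368e-3 s
residual = 1.048e-6 - 9.878e-7       # measured minus systematic delay, s

v_nu = baseline / (tof_light - residual)
print(v_nu)      # ~2.998e8 m/s
print(v_nu / c)  # ~1.0000247, the ratio quoted above
```

A roughly 60 ns residual over a 2.4 ms flight time is what turns into that tiny 2.5-parts-in-a-hundred-thousand excess over c.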

And that is why I get angry when I see students listing the frequency of a pendulum that makes 1 swing in 3 seconds as 0.333333333 Hz, because somewhere in Italy, a scientist spent 3 years of his or her life tracking down an uncertainty of a few nanoseconds just to make this discovery.

There is so much more to this discovery and the story of science. The best analysis I’ve found is at Starts with a Bang, Are we fooling ourselves with faster than light neutrinos?, which gives a wonderfully detailed summary of the work, as well as this great summary of what all this means:

1. There’s something wrong with the 988 nanosecond prediction, like they mis-measured the distance from the source to the detector, failed to correctly account for one (or more) stage(s) of the delay, etc.
2. This is a very, very unlikely statistical fluke, and if they run the experiment for even longer, they’ll find that 988 nanoseconds is a better fit than the heretofore measured 1048.5 nanoseconds.
3. The neutrinos that you detect are biased in some way; in other words, the neutrinos that you detect aren’t supposed to have the same distribution as the protons you started with.
4. Or, there’s new physics afoot.

And, if there is new physics afoot, you can be assured that far from it spelling the end for physics or Einstein, it will bring on a flurry of excitement, since whatever explanation we come up with will not only explain all of the many well tested predictions and applications of special relativity, like GPS, but also this new finding as well.

This is why I think perhaps the best thing I’ve read about all of this comes from Nanoscale views: Superluminal neutrinos – a case study in how good science is done. Why is this good science? Doug explains:

The collaboration spent three years looking hard at their data, analyzing it many different ways, checking and cross-checking. They are keenly aware that a claim of FTL neutrinos would be the very definition of “extraordinary” in the scientific sense, and would therefore require extraordinary evidence. Unable to find the (highly likely) flaw in their analysis and data, they are showing everything publicly, and asking for more investigation.

Another thing that makes this a hallmark of modern science is its collaborative nature. With 150 scientists participating in this discovery, it is on the small end of global collaborations, but still very impressive. Here’s the list of authors.

And I will point out one last reason this is good science: the response the paper has gotten, and how quickly other scientists have pitched in their own ideas and explanations. Here’s just one more example.

On the day after the discovery came out, another PhD physicist, John Costella, published a quick response and analysis detailing why he thought the OPERA discovery was wrong. Two days later, when a simple analysis by another physicist showed him to be wrong, he published this 5-page explanation of his mistake, which included this quote1:

Today, I received an email from David Palmer that completely dissolved the fog. His analysis is elegant and simple. It pinpoints the erroneous assumption that I made….

The “blunder”, the “embarrassing gaffe”, is mine and mine alone. I am happy to wear that ignominy: the OPERA result—if its estimates for systematic errors withstand scrutiny, and if it is subsequently confirmed in future experiments—would arguably be the most important discovery in physics in almost a century. Looking stupid is a small price to pay if it brings us closer to determining whether such a monumental outcome is real or imaginary.

If only our politicians could be so quick and thorough in admitting their own mistakes.

But the amazing thing about all this is that it really isn’t all that amazing: ultra-precise measurements like these are taking place in scientific laboratories all around the world, scientists are collaborating with one another across the globe, sharing data, checking each other’s results, asking questions, and pointing out and admitting mistakes. It’s truly a virtuous cycle that enables ever more amazing discoveries and powerful applications of those discoveries.

One other thing came up in our class discussion. As I was describing some of the incredible lengths scientists went to in order to make this measurement, two questions arose: What good is this discovery? And how much did it cost? I’m unable to find the cost of the OPERA lab. However, CERN, a much bigger research lab (they run the Large Hadron Collider), has a budget of just under \$1 billion (for comparison, NASA’s budget is about \$18 billion). While this number might seem big, you should also know that the cost of one B-2 stealth bomber is just over \$1 billion. Put in those terms, I think scientific endeavors like NASA, the LHC and OPERA are a true bargain. Still, what are we getting for that bargain? It’s very hard to say what the results of discoveries like this may be, but efforts to improve timekeeping by creating atomic clocks accurate to the nanosecond are directly responsible for the GPS that allows you to land an airplane with no visibility or navigate your car through an unfamiliar city. And just over 20 years ago, a computer scientist at CERN wrote a little proposal for a way to link documents called “hypertext,” mainly as a way for physicists to share data; just a few years later this would become the World Wide Web, which you’re using right now to read this post.

And that’s the end of this most beautiful story.

1: I should point out that Dr. Costella isn’t so quick to apologize for all his scientific blunders. Among his publications is a five-part series on the so-called Climategate emails, which have now been thoroughly debunked as a “manufactured controversy.” Climate science represents a collaboration among thousands of scientists making measurements as precisely as the ones I described above, and its findings of anthropogenic climate change are incredibly well evidenced. Dr. Costella seems to have jumped on the bandwagon of using these emails to impugn the reputation of climate scientists and their work. Hopefully this latest mistake will help others to see his tendency to jump to rash conclusions.

1. September 27, 2011 12:39 am

And if there’s new physics afoot, you can be sure the climate deniers will be crowing about how the consensus can be wrong.

2. September 27, 2011 9:25 am

This is a really nice write up and a great classroom topic! Thanks for sharing.

I do think it’s worth pointing out that the scientists measured and categorized a significant amount of electrical delay and other sorts of systematic error to obtain the actual time of flight for the neutrinos. Compensating for these errors is really difficult- it looks like they did a great job, but it’s going to take the great process of replication and peer review to verify this finding.

Also, if the systematic error they calculated/measured was off by as little as 6%, the neutrinos no longer travel at faster-than-light speeds. 4% drops the significance of the finding, well, significantly.

I can’t take credit for this information- a friend who works at CERN (referenced above) filled me in.

3. September 27, 2011 11:25 am

Although I love the physics and math in your story, to me the main theme is the power of social media to connect so many minds to work on a problem. Which leads me to this question: why did it take BP so long to successfully cap the Deepwater Horizon well? This was an epic, global problem, but what I recall is that offers of assistance from other nations were rebuffed. I may have it wrong, but I can’t help but wonder: if the world’s creative problem-solving energy had been harnessed through transparent crowd-sourcing, perhaps it would not have taken 91 days to solve this problem.

• September 27, 2011 8:36 pm

This is a very interesting and thought provoking question. I do think BP’s secrecy and mishandling of the whole thing definitely slowed down the capping process, but my understanding was that Energy Secretary Steven Chu (another physicist) was brought on board along with a host of scientific consultants as soon as things got clearly out of hand. I’d also say you’re looking at the difference between an unexpected emergency blowout that caught most everyone off guard, and a planned, carefully orchestrated experiment.

4. September 27, 2011 5:29 pm

Thanks, that’s the first piece about this research I have read that made me take it seriously!… I didn’t even know their article was such a large collaboration, and that it was available on arXiv.

But I still don’t understand why people fail to remember that the current theory does support tachyons, we just can’t make one by accelerating something, and we also had never seen one before…

• September 27, 2011 6:28 pm

You’re also right that tachyons could exist, and they would be traveling backward in time, which is what Neil Tyson tweeted yesterday:

5. September 27, 2011 6:00 pm

BTW, you said 1000 ns is an “inconceivably short” duration, but modern electronics has lots of microchips that run at 1GHz clocks. So a nanosecond is quite approachable if you work with electronics, not to say a thousand of them. Of course, if you mean to have an “intuitive idea” that’s another problem, but it’s not a “crazy” quantity.

• September 27, 2011 6:26 pm

You’re right. I was meaning inconceivable to our perceptions, but 1000 ns is really an eternity to a modern computer. I made a modification to that sentence.