Particle smasher gets a super-brain
SOMETIME in 2007, physicists are going to come closest to seeing what the universe was like a split-second after the big bang. Inside a 27-kilometre-long circular tunnel that straddles the border of France and Switzerland 100 metres underground, the Large Hadron Collider will push protons to almost the speed of light and smash them head-on at energies never before created on Earth.
But it will be a messy business. The torrent of information gushing forth from the LHC each year will be enough to fill a stack of CDs three times as high as Mount Everest. To make sense of it will require some 100,000 of today's most powerful PCs, so it is little wonder that CERN, the European centre for particle physics near Geneva that is building the collider, is co-opting a worldwide "grid" of computers to help store and analyse the data. Physicists hope that this collective computing power will help them spot exotic new particles, including the elusive Higgs boson, and validate theories that aim to unite three of the four fundamental forces of nature.
"It will be very exciting to have a new facility searching for new particles in a completely unknown regime of energy," says Peter Watkins, head of particle physics at the University of Birmingham in the UK and a member of GridPP, the collaboration organising the UK's contribution to the LHC computing grid. "Many people have been working on the design for a significant part of their lives, and they're very excited about the day the LHC switches on."
In April, computer scientists at CERN got a taste of what they will be up against when the LHC powers up in 2007. Computers in Geneva sustained a continuous data flow of 600 megabytes per second to seven sites in Europe and the US over a period of 10 days. They transmitted a total of 500 terabytes, which would take 250 years to download on a typical domestic broadband link. "People were very happy with that," says Francois Grey, an IT spokesman for CERN. He adds that it was a challenge to get everyone to cooperate. "Computer centres tend to guard their particular way of doing things, and they are uncomfortable for security reasons about opening up to the world," he says. This cooperation will be tested further in September, when CERN hopes to transmit data to many other computing centres and keep the stream going for three months, allowing scientists to test their software for analysing the data. "Then we'll really start to see how robust the system is, to understand its strengths and limitations," says Grey. "Everybody is quite excited, but also I think a little nervous about how that's going to go." The ultimate goal is to sustain triple the data flow achieved in April by the time the LHC gets down to business.
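The figures from the April test can be checked with some back-of-the-envelope arithmetic. The sketch below is not from CERN; the 512 kilobit-per-second "typical domestic broadband" rate is an assumption chosen to match the era, not a figure given in the article.

```python
# Sanity-check of the April 2005 data-transfer figures reported by CERN.
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY

# 600 megabytes per second, sustained for 10 days
sustained_rate = 600e6                          # bytes per second
total_bytes = sustained_rate * 10 * SECONDS_PER_DAY
print(f"10-day transfer: {total_bytes / 1e12:.0f} TB")  # ~518 TB, i.e. roughly 500 TB

# How long would 500 TB take on an assumed 512 kbit/s domestic link?
broadband_rate = 512e3 / 8                      # bytes per second
years = 500e12 / broadband_rate / SECONDS_PER_YEAR
print(f"Broadband download time: {years:.0f} years")    # ~248 years
```

Both numbers land close to the article's figures, which suggests the "250 years" comparison assumes a half-megabit connection, typical of home broadband at the time.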
And CERN will definitely need that capacity and more when the LHC goes full throttle. Nearly 5000 superconducting magnets, weighing up to 35 tonnes each and chilled to -271 °C, will accelerate counter-rotating bunches of protons to nearly the speed of light and smash them into each other 40 million times a second. The proton bunches will collide with an energy of 14 teraelectronvolts, about seven times what any previous accelerator has achieved.
At each collision vast amounts of energy will be squeezed into a microscopic volume, reproducing the conditions inside the hot fireball that filled the universe just a million-millionth of a second after the big bang. Hundreds or thousands of particles will spray out from each collision, and a large fraction of these will have to be tracked and identified.
Detecting and sifting through nearly a billion trajectories each second will be the job of four main detectors. The two largest, ATLAS and CMS, will search for the Higgs particle, also known as the "God particle" as it is believed to be what endows other particles with mass.
The detectors will also look for so-called "supersymmetric" particles. Glimpsing these would be a boost for theories of supersymmetry, which go beyond the standard model of particle physics. Although these theories help to unify the strong, weak and electromagnetic forces of nature, to do so they require the existence of a whole host of partners for all the known particles. The LHC will be the first particle accelerator to even come close to the energies thought necessary to create these supersymmetric partners.
The third detector, called the LHC "Beauty" experiment, will look for evidence of asymmetry in the production of particles called B mesons, to explain why today's universe is dominated by matter rather than antimatter. The fourth detector, ALICE, will be looking for an exotic and extremely dense state of matter called the quark-gluon plasma, which is thought to have existed a fraction of a second after the big bang. For ALICE, the LHC will collide heavy ions such as lead, rather than protons.
These four experiments will generate a staggering 15 million gigabytes of data each year. CERN could have built a massive centralised computing centre to analyse this data, but instead it has opted for the distributed grid approach. CERN will store all raw data in huge tape silos in a local facility, but it will also relay the data to a dozen storage sites in Europe, North America and Asia. From there it will filter down to about 100 smaller sites in 40 countries, then on to individual institutes and universities.
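That 15 million gigabytes a year is also where the earlier Mount Everest comparison comes from. A rough check, using assumed figures for CD capacity (~700 MB), disc thickness (~1.2 mm) and the height of Everest (~8850 m), none of which are stated in the article:

```python
# Back-of-the-envelope check of the CD-stack comparison.
ANNUAL_DATA_BYTES = 15e6 * 1e9   # 15 million gigabytes per year (from the article)
CD_CAPACITY_BYTES = 700e6        # assumed capacity of a standard CD-ROM
CD_THICKNESS_M = 1.2e-3          # assumed thickness of a bare disc
EVEREST_HEIGHT_M = 8850          # assumed height of Mount Everest

num_cds = ANNUAL_DATA_BYTES / CD_CAPACITY_BYTES
stack_height_m = num_cds * CD_THICKNESS_M
print(f"{num_cds:.1e} CDs, a stack {stack_height_m / 1000:.1f} km high, "
      f"about {stack_height_m / EVEREST_HEIGHT_M:.1f} Everests")
```

The result comes out at roughly 21 million CDs in a stack about 26 km high, close to three Everests, so the comparison holds up.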
A worldwide grid of computers will allow the 6000-plus physicists working on the experiments to log on to local PCs and request a data analysis. To complicate things further, the data gathered over the 10 to 15 years the LHC is expected to operate will have to remain accessible to physicists at any time.
The LHC's worldwide grid achieved a significant milestone on its way to this goal on 15 March, when more than 100 computing centres in 31 countries were connected together, making it the largest scientific grid in history. But even this mammoth network is just the beginning, wielding only 5 per cent of the estimated processing power that the LHC will eventually need.
CERN will need to ensure that no one team or institute hogs the grid. Physicists will probably barter computer time for now, but the system could later work on a pay-as-you-go basis. "Industry is very interested to see how we handle this," says Grey. "A large grid could be very exciting for commercial business, but they need to know what the business model would be."
By late next year the LHC grid should be running continuously, allowing experiments to start. Although the accelerator will still be silent, physicists will switch on the detectors to see how they respond to cosmic ray particles from space. And when the LHC fires up in earnest in the summer of 2007, everyone will be on tenterhooks. Watkins says that although there is no guarantee that the LHC will spawn exotic new particles, there is compelling theoretical evidence that it will: "We have no idea what new physics we're going to see. It wouldn't be fun otherwise," he says.
Rick Gaitskell, a physicist at Brown University in Providence, Rhode Island, agrees: "Apart from the people who are perennially misery-guts, I don't know anyone who doesn't think we're going to learn something from the LHC." Many feel the future of particle physics is resting on the LHC: the failure of this multibillion-dollar project to kick-start new physics could be devastating. But physicists are an optimistic bunch, and are already planning another major accelerator, a linear collider that would smash electrons into positrons.
Because these are elementary particles, unlike the LHC's protons, which contain quarks, the collisions will be less messy and easier to analyse. Gaitskell believes a top-notch linear collider could be the perfect complement to the LHC. "Even if, god forbid, the LHC fails to find any new particles, I think people will still put their weight behind a new linear collider."