Sunday, March 24, 2013

"Hacking the Cosmos" --Crunching Planck Data of Big-Bang Afterglow Reveals Daunting New Puzzles




[Image: Planck satellite]



Written in light shortly after the big bang, the cosmic microwave background (CMB) is a faint glow that permeates the cosmos. Studying it can help us understand how our universe was born, its nature, composition and eventual fate. "Encoded in its fluctuations are the parameters of all cosmology, numbers that describe the universe in its entirety," says Julian Borrill, a Planck collaborator and cosmologist in the Computational Research Division at Berkeley Lab. The Planck collaboration has released preliminary results based on the observatory's first 15 months of data. Using supercomputers at the U.S. Department of Energy's (DOE) National Energy Research Scientific Computing Center (NERSC), Planck scientists have created the most detailed and accurate maps yet of the relic radiation from the big bang, revealing that the universe is about 100 million years older than previously thought, with more matter and less dark energy.

 "These maps are proving to be a goldmine containing stunning confirmations and new puzzles," says Martin White, a Planck scientist and physicist with University of California Berkeley and at Lawrence Berkeley National Laboratory (Berkeley Lab). "This data will form the cornerstone of our cosmological model for decades to come and spur new directions in research." 
 
Planck is a European Space Agency mission with collaboration from NASA. It was launched in 2009 to a point almost a million miles from Earth where it can look into deep space and map tiny differences in the cosmic microwave background, the faint glow of radiation left over from just after the big bang. The Planck observatory has produced the most detailed map to date of mass distribution in the universe.

For the first 370,000 years of the universe’s existence, light was trapped inside a hot plasma, unable to travel far without bouncing off electrons. Eventually the plasma cooled enough for light particles (photons) to escape, creating the patterns of the cosmic microwave background. The patterns of light represent the seeds of galaxies and clusters of galaxies we see around us today.

Then these photons traveled through space for billions of years, making their way past stars and galaxies, before falling into Planck’s detectors. The gravitational pull of both galaxies and clumps of dark matter pulls photons onto new courses, an effect called “gravitational lensing.”



[Image: Timeline of the universe]



“Our microwave background maps are now sufficiently sensitive that we can use them to infer a map of the dark matter that has gravitationally lensed the microwave photons,” said Lloyd Knox, a physics professor at UC Davis and the leader of the U.S. team determining the universe’s ingredients from the Planck data. “This is the first all-sky map of the large-scale mass distribution in the Universe.”

These new data from Planck have allowed scientists to test and improve the accuracy of the standard model of cosmology, which describes the age and contents of our universe. Based on the new map, the Planck team estimates that the expansion rate of the universe, known as Hubble’s constant, is 67.15 plus or minus 1.2 kilometers/second/megaparsec. (A megaparsec is roughly 3 million light-years.) That’s less than prior estimates derived from space telescopes, such as NASA’s Spitzer and Hubble.
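
As a rough illustration of what a Hubble constant of this size means, Hubble's law relates a galaxy's recession speed to its distance, v = H0 × d. The short sketch below plugs in the Planck value quoted above; the 100-megaparsec example distance and the helper function are purely illustrative, not anything from the Planck analysis.

```python
# A minimal sketch of Hubble's law, v = H0 * d, using the Planck 2013
# value quoted above. The example distance is illustrative only.

H0 = 67.15            # km/s per megaparsec (Planck 2013 estimate)
LY_PER_MPC = 3.26e6   # one megaparsec is about 3.26 million light-years

def recession_speed_km_s(distance_mpc):
    """Recession speed of a distant galaxy under Hubble's law."""
    return H0 * distance_mpc

# A galaxy 100 megaparsecs (~326 million light-years) away recedes
# at roughly 6,700 km/s.
print(recession_speed_km_s(100))
```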

The new estimate of dark matter content in the universe is 26.8 percent, up from 24 percent, while dark energy falls to 68.3 percent, down from 71.4 percent. Normal matter is now 4.9 percent, up from 4.6 percent. At the same time, some curious features are observed that don’t quite fit with the current model. For example, the model assumes the sky is the same everywhere, but the light patterns are asymmetrical on the two halves of the sky, and there is a larger-than-expected cold spot extending over a patch of sky.
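
As a quick arithmetic check on the revised ingredient list above, the three fractions should account for essentially the entire energy budget of the universe; the snippet below simply adds them up.

```python
# The revised Planck fractions quoted above should sum to the whole
# cosmic energy budget (to the quoted precision).

dark_matter   = 26.8   # percent
dark_energy   = 68.3   # percent
normal_matter = 4.9    # percent

print(dark_matter + dark_energy + normal_matter)  # 100.0
```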

“On one hand, we have a simple model that fits our observations extremely well, but on the other hand, we see some strange features which force us to rethink some of our basic assumptions,” said Jan Tauber, the European Space Agency’s Planck project scientist based in the Netherlands.

Scientists can also use the new map to test theories about cosmic inflation, a dramatic expansion of the universe that occurred immediately after its birth. In far less time than it takes to blink an eye, the universe blew up by 100 trillion trillion times in size. The new map, by showing that matter seems to be distributed randomly, suggests that random processes were at play in the very early universe on minute “quantum” scales. This allows scientists to rule out many complex inflation theories in favor of simple ones.

The Planck CMB surveys are complex and subtle undertakings. Even with the most sophisticated detectors, scientists still need supercomputing to sift the CMB's faint signal out of a noisy universe and decode its meaning. Hundreds of scientists from around the world study the CMB using supercomputers at NERSC, a DOE user facility based at Berkeley Lab. "NERSC supports the entire international Planck effort," says Borrill. A co-founder of the Computational Cosmology Center (C3) at the lab, Borrill has been developing supercomputing tools for CMB experiments for over a decade.

The Planck observatory, a mission of the European Space Agency with significant participation from NASA, is the most challenging yet. Parked in an artificial orbit about 800,000 miles away from Earth, Planck's 72 detectors complete a full scan of the sky once every six months or so. Observing at nine different frequencies, Planck gathers about 10,000 samples every second, or a trillion samples in total for the 15 months of data included in this first release. In fact, Planck generates so much data that, unlike earlier CMB experiments, it's impossible to analyze exactly, even with NERSC's powerful supercomputers.

Instead, CMB scientists employ clever workarounds. Approximate methods make the Planck data volume tractable, but the team must then understand the uncertainties and biases those approximations leave in the results.

One particularly challenging bias comes from the instrument itself. The position and orientation of the observatory in its orbit, the particular shapes and sizes of its detectors (which vary), and even the overlap in Planck's scanning pattern all affect the data.

To account for such biases and uncertainties, researchers generate a thousand synthetic (or simulated) copies of the Planck data and apply the same analysis to these. Measuring how the approximations affect this simulated data allows the Planck team to account for their impact on the real data. With each generation of NERSC supercomputers, the Planck team has adapted its software to run on more and more processors, pushing the limits of successive systems while reducing the time it takes to run a greater number of complex calculations.
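
The procedure described above is a Monte Carlo approach: analyze many simulated data sets whose true answer is known, and use the resulting offset and scatter to interpret the real measurement. The sketch below is a deliberately simplified stand-in, not Planck's actual pipeline; the toy signal model and the `approximate_pipeline` function are assumptions made purely for illustration.

```python
# Toy Monte Carlo sketch of the bias-and-uncertainty estimation idea
# described above. This is NOT the Planck pipeline; the signal model
# and "analysis" below are placeholders.

import random
import statistics

TRUE_VALUE = 1.0       # known input baked into every synthetic data set
NOISE_LEVEL = 0.05     # toy instrument noise
N_SIMULATIONS = 1000   # Planck used on the order of a thousand copies

def make_synthetic_data(n_samples=10_000):
    """A synthetic data set: the known signal plus random noise."""
    return [TRUE_VALUE + random.gauss(0.0, NOISE_LEVEL) for _ in range(n_samples)]

def approximate_pipeline(data):
    """Stand-in for an approximate analysis; here just the sample mean."""
    return sum(data) / len(data)

# Run the identical analysis on every synthetic copy.
estimates = [approximate_pipeline(make_synthetic_data()) for _ in range(N_SIMULATIONS)]

bias = statistics.mean(estimates) - TRUE_VALUE   # systematic offset of the method
scatter = statistics.stdev(estimates)            # uncertainty across realizations
print(f"bias: {bias:+.2e}, uncertainty: {scatter:.2e}")
```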

"By scaling up to tens of thousands of processors, we've reduced the time it takes to run these calculations from an impossible 1,000 years down to a few weeks," says Ted Kisner, a C3 member at Berkeley Lab and Planck scientist. In fact, the team's codes are so demanding that they're often called on to push the limits of new NERSC systems.

Access to the NERSC Global Filesystem and vast online and offline storage has also been key. "CMB data over the last 15 years have grown with Moore's Law, so we expect a two-order-of-magnitude increase in data in the coming 15 years, too," says Borrill.
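
The growth Borrill describes follows from simple doubling arithmetic; the sketch below assumes the data volume doubles roughly every two years (the doubling period is an assumption; the article only says the growth has tracked Moore's Law).

```python
# Growth arithmetic behind a two-order-of-magnitude increase over
# 15 years, assuming a doubling time of about two years (an assumption).

doubling_time_years = 2.0
horizon_years = 15

growth = 2 ** (horizon_years / doubling_time_years)
print(f"{growth:.0f}x")  # ~181x, i.e. about two orders of magnitude
```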

In 2007 NASA and DOE negotiated a formal interagency agreement that guaranteed Planck access to NERSC for the duration of its mission. "Without the exemplary interagency cooperation between NASA and DOE, Planck would not be doing the science it's doing today," says Charles Lawrence of NASA's Jet Propulsion Laboratory (JPL). A Planck project scientist, Lawrence leads the U.S. team for NASA.
