Saturday, September 17, 2011

Earth Sciences - Modeling the local impact of global climate change
From a recent study of the Catalina Eddy by Kanamitsu. The figure shows the eddy's 3-hourly evolution over two days. Kanamitsu found that the eddy disappears at 00Z and 03Z, behavior that had never been reported before because high time-resolution observations were lacking; this kind of analysis is only possible with a dynamically downscaled analysis. Credit: Courtesy of Masao Kanamitsu, Scripps Institution of Oceanography

"You don't need a weatherman to know which way the wind blows," Bob Dylan famously sang. But if you want to know how it will blow tomorrow, odds are you're going to check the forecast.
Atmospheric prediction has improved immeasurably in the 45 years since Dylan sang "Subterranean Homesick Blues." Whether you're interested in tomorrow's high or the global heat index a decade from now, forecasters can now make those predictions with far greater accuracy.
The rise of powerful high-performance computers plays a large part in those improvements. Scientists isolate the factors that influence the weather--heat, radiation, the rotation of the Earth--transform them into mathematical formulas, and use supercomputers to forecast the atmosphere in all its complexity.
And yet, those forecasts are still painted with a fairly large brush. The global climate models--upon which all official predictions are based--have a resolution on the order of 62 miles (100 kilometers) per grid point. At that level of detail, storms appear as undifferentiated blobs, and towns in the mountains and the valley seem to experience identical weather.
"It's difficult to accurately examine how river flows have changed over the last 50 years, because one grid point may contain many rivers," said Masao Kanamitsu, a veteran of the atmospheric modeling world and a leading researcher at Scripps Institution of Oceanography.
Making a weatherman
Kanamitsu knew he wanted to be a computational weather forecaster from the time he was a teenager in Japan in the 1960s. He worked his way through the world's most advanced weather prediction centers, first in Japan, then in Europe, and most recently in the United States.
In the early to mid-1990s, Kanamitsu used Cray systems and Japanese supercomputers to run global climate models. Today, he uses the Ranger supercomputer at the Texas Advanced Computing Center, the second-largest supercomputer in XSEDE, which is supported by NSF's eXtreme Digital program.


A demonstration of what dynamical downscaling can achieve. The center panel is the coarse-resolution analysis used to force the high-resolution model. The left panel is the output of Kanamitsu's downscaling, which reproduces an eddy, or circulating current, famous in Southern California for the very cloudy, cold weather it brings during May and June. The right panel is the regional-scale analysis performed by the National Weather Service, which incorporates local observations. Credit: Courtesy of Masao Kanamitsu, Scripps Institution of Oceanography

Kanamitsu and his colleagues in the atmospheric community use a method called "downscaling" to improve regional predictions. The technique takes output from a global climate model and adds information--at scales smaller than the grid spacing--to resolve important features like clouds and mountains. 
"You're given large-scale, coarse-resolution data, and you have to find a way to get the small-scale detail," Kanamitsu said.
Modeling California
Recently, Kanamitsu has been focusing on creating improved regional models for California, where small-scale weather patterns play a large role in the state's many microclimates. By integrating detailed information about the topography, vegetation, river flow and other factors into the subgrid of California, Kanamitsu has been able to achieve a resolution of 6 miles (10 kilometers) per grid point--a huge improvement over the standard 62 miles (100 kilometers) per grid point.
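Some back-of-the-envelope arithmetic (purely illustrative; real model cost also depends on physics packages, vertical levels and I/O) hints at why such a refinement demands a supercomputer:

```
# Rough cost of refining a grid from 100 km to 10 km spacing.
coarse_km, fine_km = 100.0, 10.0
refinement = coarse_km / fine_km       # 10x in each horizontal direction
points = refinement ** 2               # ~100x more grid points per area
# Explicit time steps must shrink with the grid spacing (CFL condition),
# so total work grows roughly with the cube of the refinement.
work = refinement ** 3
print(f"~{points:.0f}x more points, ~{work:.0f}x more compute")
```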
Kanamitsu is also tackling the problem of connecting atmospheric conditions with ocean dynamics.
"Along the coast of California, there's a cold ocean that interacts with the atmosphere at very small scales," Kanamitsu said. "We're simulating the ocean currents and temperature in a high-resolution ocean model, coupled with a high-resolution atmospheric model, to find out the impact of these small-scale ocean states."
Combining all of those factors and getting an answer in a short period of time requires very powerful, tightly connected supercomputers like Ranger. Kanamitsu's simulation results improved upon the regional analyses currently in use by the National Weather Service.
Other applications
Other researchers in the community have already begun applying the downscaling results to fish population studies, river-flow changes and wind-energy applications.
"Kanamitsu's model simulations have enabled a much better resolved picture of the processes affecting wind flow and precipitation in the contemporary, historical period in California," said Scripps hydrometeorologist Daniel Cayan.
Over the course of his long career, Kanamitsu has clearly seen how improved computer modeling has changed his field--and the world.
"Thirty years ago, I was one of the forecasters," he said. "Every day, we took our computer model results to the meeting, but the forecaster in charge normally didn't look at or believe in our results. Now, forecasters believe in the models so much that some people think they're losing their skill."
As scientists seek to determine the local impact of climate change and address those changes, accurate historical records and sophisticated regional forecasts like those facilitated by Kanamitsu's work are becoming increasingly crucial.
Kanamitsu's research on the NSF-supported Ranger supercomputer is funded by NOAA and by the California Energy Commission.
National Science Foundation
Provided by PhysOrg.com
