

Science | DOI:10.1145/2641225

By Samuel Greengard

Weathering a New Era of Big Data

Increased computing power, combined with new and more advanced models, is changing weather forecasting.

Throughout history, mankind has attempted to gain a better understanding of weather and forecast it more accurately. From ancient observations about wind direction, cloud formations, and barometric pressure to more recent attempts to accumulate data from satellites, sensors, and other sources, weather forecasting has both fascinated and infuriated everyone from picnickers to farmers and emergency responders. It is, in a word, unpredictable.

Yet over the last two decades, thanks to increasingly powerful computers, big data, and more sophisticated modeling and simulations, weather forecasting has been steadily moving forward. Amid growing concerns about global warming and more volatile weather and climate patterns, researchers are attempting to develop better algorithms and systems. Says Cliff Mass, professor of atmospheric science at the University of Washington, "Numerical data is the core technology of weather prediction. Everything is dependent upon it."

Moreover, the stakes continue to grow. There is mounting concern that not all weather models are created equal. In some cases, European and American forecasting methods lead to significantly different predictions, and noticeably different results. This includes predicting the impact of snowstorms in the northeast U.S. in the winter of 2013–2014, and the effects of Hurricane Sandy on that region in 2012.

Meanwhile, private companies such as IBM are entering the picture and introducing entirely different tools and methods for forecasting weather-related events.

Says Lloyd Treinish, an IBM Distinguished Engineer at the IBM Thomas J. Watson Research Center, "The history of weather forecasting and the history of computing have been very tightly coupled. Since the 1940s, revolutions in computing have been very closely tied to weather forecasting and building better equations and models. Over the last couple of decades, we have seen steady improvements in computing, sensor technology, an understanding of the atmosphere, and the overall science. Over time, we are learning how to put all the pieces to work."

A visualization of data from the NASA Center for Climate Simulation, a state-of-the-art supercomputing facility in Greenbelt, MD, that runs complex models to help scientists better understand global climate. This visualization depicts atmospheric humidity during the Great Mississippi and Missouri Rivers Flood of 1993. (Visualization by Trent Schindler, NASA/Goddard/UMBC.)

A Clearer View

The basis for understanding weather in a more systematic way dates back more than a century, to a time when scientists were beginning to examine the physics of the atmosphere and were first applying numerical methods to understand extraordinarily complex physical processes. By the 1950s, forecasters had begun using mainframe computers to build weather models, moving beyond field observations and telegraph reports. By the 1960s, satellites and sensors from automatic stations, aircraft, ships, weather balloons, and drifting ocean buoys entered the picture; they created entirely new and powerful ways to collect data, making it possible to better understand the mechanisms associated with weather and climate.

Along the way, advances in technology and modeling have led to remarkable improvements, although, as Mass notes, "We are constantly limited by computer power." It is a concern echoed by Ben Kyger, director of central operations for the National Centers for Environmental Prediction at the U.S. National Oceanic and Atmospheric Administration (NOAA). Scientists increase the grid resolution to take advantage of the available processing power; higher resolutions mean they can develop more accurate forecasts that extend further out in time.
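To see why resolution is so expensive, consider the standard back-of-the-envelope scaling argument (an illustration assumed here, not a figure from NOAA): halving the grid spacing doubles the number of points in each horizontal dimension and, by the usual stability constraint, roughly halves the timestep as well, so each doubling of resolution costs on the order of eight times the compute.

```python
# Rough cost scaling for refining a weather model's grid.
# Assumption (illustrative only): halving the grid spacing doubles the
# point count in each of two horizontal dimensions and, via the timestep
# stability constraint, roughly halves the timestep, so a 2x refinement
# costs about 2 * 2 * 2 = 8x the compute.

def relative_cost(refinement_factor: float) -> float:
    """Compute-cost multiplier for refining the horizontal grid."""
    # two horizontal dimensions, plus a proportionally shorter timestep
    return refinement_factor ** 3

for factor in (2, 4, 8):
    print(f"{factor}x finer grid -> ~{relative_cost(factor):,.0f}x the compute")
```

This cubic blow-up is why each hardware upgrade buys only a modest bump in usable resolution.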

Today, the most powerful weather computers rely on hundreds of thousands of processors and, in many cases, millions of data points. The National Weather Service (NWS) in the U.S. currently relies on a pair of supercomputers with over 200 teraflops (a teraflop equals one trillion floating-point operations per second) of capacity. By way of comparison, China's Tianhe-2 (Milky Way 2) supercomputer, which topped the June 2014 Top500 supercomputer rankings, delivers performance of up to 33.86 petaflops; since a petaflop is equal to 1,000 teraflops, Tianhe-2 provides nearly 170 times more raw processing power than the NWS has available to it.
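The comparison is simple arithmetic; a quick sketch using only the figures quoted above:

```python
# Back-of-the-envelope check of the capacity figures quoted in the text.
nws_teraflops = 200            # NWS supercomputer pair
tianhe2_petaflops = 33.86      # Tianhe-2 peak, June 2014 Top500 list

tianhe2_teraflops = tianhe2_petaflops * 1_000   # 1 petaflop = 1,000 teraflops
ratio = tianhe2_teraflops / nws_teraflops
print(f"Tianhe-2 has ~{ratio:.0f}x the NWS capacity")  # ~169x, "nearly 170 times"
```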

Meanwhile, the Korea Meteorological Administration in South Korea is expanding its computing storage capacity to 9.3 petabytes in order to better predict weather events, including typhoons. The European Centre for Medium-Range Weather Forecasts (ECMWF) processes 300 million observations on a daily basis, producing about 65 terabytes of forecasts every day, with peaks of 100 terabytes. ECMWF's archive holds 65 petabytes of data, and it is growing at a rate of approximately 50% annually, says software strategist Baudouin Raoult.
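Compounding at 50% a year, that archive grows quickly. The sketch below simply extrapolates the quoted figures; the projection is an illustration, not an ECMWF forecast of its own storage needs:

```python
# Extrapolate ECMWF's 65 PB archive at the quoted ~50% annual growth rate.
# Illustrative only: assumes the growth rate holds steady.
archive_pb = 65.0
for year in range(1, 6):
    archive_pb *= 1.5
    print(f"year {year}: ~{archive_pb:,.0f} PB")   # ~98, 146, 219, 329, 494 PB
```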

Interestingly, weather forecasting organizations worldwide rely on much of the same data, derived from many of the same sources, Mass points out. Differences in forecasts typically revolve around the ways in which mathematicians and researchers approach statistical processing, and how they average and round off numbers. "In addition, Web (forecasting services) obtain weather data from various sources and apply it in different ways," he explains. "The result is different forecasts, but the underlying modeling is much less variable."

Kyger says current NWS models are more than 60% accurate beyond five days, generally considered a skillful forecast benchmark. Yet, because the physics and dynamics of the atmosphere are not directly proportional to expanding a grid resolution, it is not possible to rely on a static model or linear equation to extrapolate data. In fact, with every hardware upgrade, it can take up to a year to fine-tune a new model to the point where it outperforms an existing model. At one point, a skillful forecast extended only a day or two. "The improvements are very gradual year over year, but they add up to the point where significant improvements take place," he explains.

Peter Bauer, head of ECMWF's Model Division, says predictive skills have improved to the point where researchers are witnessing about a one-day-per-decade improvement rate in forecasting. This means today's six-day forecasts are about on par with the accuracy of five-day forecasts a decade ago. In addition to extending the range and accuracy of large-scale forecasts, the techniques for predicting regional and local weather parameters such as precipitation, surface temperature, and wind have dramatically improved, he points out.
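Read as a linear trend, Bauer's rule of thumb is easy to state precisely. The sketch below extrapolates it, with the caveat that the linearity (and any projection beyond the article's timeframe) is an assumption made here for illustration:

```python
# Bauer's rule of thumb: roughly one extra day of useful forecast lead
# time per decade. Baseline figures are from the text; treating the
# trend as linear is an assumption for illustration.
BASELINE_YEAR = 2014
BASELINE_LEAD_DAYS = 6.0      # today's 6-day forecast ~= a 5-day one in 2004
GAIN_PER_DECADE = 1.0

def skillful_lead_days(year: int) -> float:
    """Useful forecast lead time, in days, at comparable skill."""
    return BASELINE_LEAD_DAYS + GAIN_PER_DECADE * (year - BASELINE_YEAR) / 10.0

for year in (2004, 2014, 2024):
    print(year, skillful_lead_days(year))   # 5.0, 6.0, 7.0
```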

The Sky's the Limit

In practical terms, even a small improvement in forecasting quality can produce enormous benefits for individuals, businesses, and society, from providing warnings for short-term events such as tornados and floods to informing long-term issues such as how to construct buildings and design infrastructure. For instance, before Hurricane Sandy slammed the northeastern U.S. in October 2012, the ECMWF had successfully predicted the storm's track and intensity five days out, while NWS models lagged by about a day. The deviation in modeling focused attention on the perceived deficiencies of the NWS.

Kyger acknowledges the episode was a disappointment for the NWS. It led, in May 2013, to the U.S. Congress approving $23.7 million in supplemental funding to upgrade NWS systems from 90 teraflops to upward of 200 teraflops, as well as to address other issues. However, U.S. forecasting technology continues to generate concerns. "There have been a number of important forecasts where U.S. prediction systems performed in an inferior way," Mass says. A recent blog post by Mass stated that the U.S. had slipped into fourth place in global weather prediction, behind facilities in continental Europe, the U.K., and Canada.

"A major reason why the U.S. is falling behind is that the other centers are using far more advanced data assimilation or higher resolution, both of which require very substantial computer power, which the U.S. National Weather Service has been lacking," Mass explains. Over the last decade, Congress has not provided adequate funding to keep up with the fast-moving computing and data environment; for a very modest cost, he says, the United States could radically improve weather prediction.

The upgrades to the NOAA supercomputers completed in August 2013 were part of the first phase of a two-step plan to increase available processing power. Early results show that, in some cases, a 15% forecasting improvement has resulted. The computing power will increase to 1,950 teraflops in 2015, if current funding stays in place. NOAA operates the systems as a scalable private cloud, uses these resources across agencies and tasks, and utilizes capacity in the 90%-plus range. Kyger says a cluster or grid approach that extends beyond NOAA is not feasible, for financial and practical reasons.

Meanwhile, the ECMWF is continuing to refine and improve its forecasting model. Moving forward, Bauer says, the Centre is attempting to focus on the environmental system in a far more comprehensive way, in order to gain a better understanding of key factors impacting weather, including greenhouse gasses, ocean temperatures, and sea ice. "The better the observations and the more critical data points we have, the better the mathematical methods," he explains. "More important in the future will be the prediction of extremes, which places a greater emphasis on predicting the right probabilities of events and doing so in a longer time range."

IBM's Deep Thunder initiative is further redefining the space. It has predicted snowfall accumulations in New York City and rainfall levels in Rio de Janeiro with upward of 90% accuracy by taking a somewhat unconventional approach. "We are not looking to use the traditional method of covering a large area with as high a resolution as possible using uniform information," Treinish says. "We are putting bounds on the computing problem by creating targeted forecasts for particular areas." As part of the initiative, IBM plugs in additional types of data sources, including agricultural measurements and wind farm data, and manipulates existing sources in different ways.

In fact, as the Internet of Things (IoT) takes hold, new types of sensors and crowdsourcing techniques will appear, and will further redefine weather forecasting. Kyger says the NWS has already started to experiment with crowdsourcing and other social media input, including data from hyperlocal Twitter accounts. Treinish believes smartphones and other devices could provide insights into everything from temperature and wind conditions to barometric pressure and humidity on a block-by-block level. The challenge, he says, is that the massive amount of data "can be really noisy and not of real high quality."
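One common way to tame that kind of noise, shown here as a minimal sketch rather than anything the NWS or IBM describes, is robust aggregation: take the median of the readings in each block, so a few bad sensors cannot skew the estimate.

```python
# Minimal sketch of robust aggregation for noisy crowdsourced readings.
# The data and block IDs are hypothetical; a per-block median resists
# outliers (a few bad sensors) far better than a plain mean would.
from collections import defaultdict
from statistics import median

readings = [  # (block_id, barometric pressure in hPa) from phone sensors
    ("blk-1", 1013.2), ("blk-1", 1013.5), ("blk-1", 962.0),  # 962.0 is junk
    ("blk-2", 1009.8), ("blk-2", 1010.1),
]

by_block = defaultdict(list)
for block, hpa in readings:
    by_block[block].append(hpa)

for block, values in sorted(by_block.items()):
    print(block, f"median pressure: {median(values):.1f} hPa")
```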

    Adding to the challenge, the IoT will collect far more data, but at the same time will further tax existing and already constrained supercomputers.

In the end, the quest for more accurate forecasts leads back to the need for more computing power and the development of better algorithms; that, in turn, drives the need for even more powerful computers. There is an ongoing need for adequate funding and additional IT resources; it also is critical to continually upgrade models using an assortment of statistical and regression analysis techniques, combined with human analysis and judgment.

The goal, Kyger says, is to "continually look for things that we can do better. It's a closed-loop cycle that never ends."

    Further Reading

Voyant, C., Notton, G., Paoli, C., Nivet, M.L., Muselli, M., Dahmani, K. Numerical weather prediction or stochastic modeling: an objective criterion of choice for the global radiation forecasting. International Journal of Energy Technology and Policy (2014); http://arxiv.org/abs/1401.6002

Krishnappa, D.K., Irwin, D., Lyons, E., Zink, M. CloudCast: Cloud Computing for Short-Term Weather Forecasts. Comput. Sci. Eng. 15, 30 (2013); http://dx.doi.org/10.1109/MCSE.2013.43

Sawale, G.J., Gupta, S.R. Use of Artificial Neural Network in Data Mining For Weather Forecasting. International Journal of Computer Science and Applications 6, 2 (Apr. 2013); http://www.researchpublications.org/IJCSA/NCAICN-13/244.pdf

Bainbridge, L. Ironies of automation. In New Technology and Human Error, J. Rasmussen, K. Duncan, J. Leplat (Eds.). Wiley, Chichester, U.K., 1987, 271–283.

Roulstone, I., Norbury, J. Computing Superstorm Sandy. Scientific American 309, 22 (2013); http://www.nature.com/scientificamerican/journal/v309/n2/full/scientificamerican0813-22.html

    Samuel Greengard is an author and journalist based in West Linn, OR.

© 2014 ACM 0001-0782/14/09 $15.00
