India Needs Improved Supercomputing For Better Climate And Weather Modelling
The Indian Institute of Science in Bangalore prides itself on the fastest supercomputer in the country, a Cray XC40 system with a peak computing power of 1.25 petaflops, yet it is ranked only 96th on the TOP500 list.
Supercomputing for weather and climate research had an inspiring start to 2016, with two major capacity augmentation announcements made globally in January.
The National Center for Atmospheric Research (NCAR) announced a new supercomputer for its facility in Cheyenne, Wyoming, USA. The 5.34-petaflops system, to be built by Silicon Graphics International Corp (SGI), will be named Cheyenne after the city in which it will be located. It will replace Yellowstone, the existing IBM iDataPlex supercomputer rated at 1.6 petaflops.
Petaflops, what? Well, for the uninitiated, petaflops is a measure of the computing speed of a system. FLOPS stands for floating point operations per second, while peta denotes a quadrillion (a one followed by 15 zeroes!). Thus a system rated at one petaflops can complete a quadrillion floating point operations every second.
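For readers who like to play with these numbers, here is a back-of-the-envelope sketch. The workload figure is made up purely for illustration; only the machine speeds come from this article.

```python
# Back-of-the-envelope: how long would a fixed workload take on
# machines of different speeds? (Illustrative arithmetic only.)

PETA = 10**15  # peta = one quadrillion

def seconds_for(operations, petaflops):
    """Seconds needed to finish `operations` floating point
    operations on a machine sustaining `petaflops` PFLOPS."""
    return operations / (petaflops * PETA)

# A hypothetical simulation needing 10^18 floating point operations:
work = 10**18
print(seconds_for(work, 1.25))  # at IISc's 1.25 petaflops peak: 800 seconds
print(seconds_for(work, 5.34))  # at Cheyenne's 5.34 petaflops: ~187 seconds
```

The point of the exercise: a fourfold jump in petaflops cuts the turnaround of the same simulation fourfold, which is exactly why weather centres chase these upgrades.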
The second announcement came from the European Centre for Medium-Range Weather Forecasts (ECMWF), which said it will upgrade its existing Cray XC30 supercomputers to the latest XC40 systems. The $36 million deal also includes enhanced Cray Sonexion storage to stockpile more weather data.
Cray is a Seattle-headquartered, NASDAQ-listed supercomputer manufacturer that reportedly sells to more than half of the world's weather centres.
Interestingly, the company is also building the UK's brand new supercomputing facility, to be completed by 2017 and rated at 16 petaflops.
The first computing device in India was installed at the Indian Statistical Institute, Kolkata, in 1955; India's growth in this domain, however, has been trammelled. IBM left the country in 1978 when asked to dilute its equity and manufacture in India, while the technology embargo imposed on India in 1974 (in response to Smiling Buddha, India's first nuclear test) held it back from importing Cray computers. Notably, the CDC 3600-160A, designed under Seymour Cray at Control Data Corporation, was the first of his machines to be installed in India, at the Tata Institute of Fundamental Research (TIFR) in 1964 (History of Computing in India by V Rajaraman).
Seymour Cray, known as the father of supercomputing, had been the protagonist of innovation at CDC (Control Data Corporation) until he quit the Bloomington, Minnesota-based company to launch Cray Research in 1972.
Today, the Bangalore-based Indian Institute of Science hosts the fastest supercomputer in the country, a Cray XC40 system with a peak computing power of 1.25 petaflops, yet it ranks only 96th on the TOP500 list.
India's détente with the United States may be at its peak, but its share of global supercomputing power remains a dismal 2%, while China stands second with more than 12% of the computing strength. The United States, of course, takes the cake with more than 50% of installed capacity.
There is some headway, though. In June 2015, union minister Harsh Vardhan unveiled the supercomputer 'Bhaskara'. The newest addition to India's supercomputing capacity for weather and climate research supports the Earth System Science Organisation-National Centre for Medium Range Weather Forecasting (ESSO-NCMRWF) in providing very high resolution 10-day deterministic weather forecasts as well as probabilistic forecasts.
Bhaskara runs a 44-member ensemble prediction system for this purpose. Commissioned as an IBM iDataPlex 0.35-petaflops system, it takes the ESSO High Performance Computing (HPC) facility to a peak computing power of 1.14 petaflops.
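What does an ensemble prediction system actually do? In essence: run the same model many times from slightly perturbed starting conditions and read the spread of outcomes as forecast uncertainty. The toy sketch below illustrates only that idea; the stand-in "model", the perturbation size and all numbers are invented for illustration and bear no relation to NCMRWF's actual system.

```python
import random

random.seed(42)  # reproducible toy run

def toy_model(initial_temp):
    # Stand-in for a weather model: a trivial deterministic update.
    return initial_temp + 0.5

def ensemble_forecast(initial_temp, members=44, perturbation=0.2):
    """Run `members` copies of the model from perturbed starting
    temperatures; return the ensemble mean and the spread (min, max)."""
    outcomes = [
        toy_model(initial_temp + random.uniform(-perturbation, perturbation))
        for _ in range(members)
    ]
    mean = sum(outcomes) / len(outcomes)
    return mean, (min(outcomes), max(outcomes))

mean, spread = ensemble_forecast(30.0)
print(round(mean, 2), spread)  # mean near 30.5, spread within [30.3, 30.7]
```

The ensemble mean gives the deterministic headline forecast, while the spread of the 44 members is what turns a single number into a probabilistic forecast.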
This achievement, however, pales before the leading players in the field. For a country that boasts of a Mars mission, a combined supercomputing capability of 5.25 petaflops is underwhelming (the new NCAR system alone will be more powerful).
The nation needs to rapidly ramp up its weather and climate computing power. Against the backdrop of climate change, it is more imperative than ever before to simulate its repercussions on weather, climate and, more importantly, glacial and coastal inundation. That would need countless quadrillion floating point operations per second. Countless.