Why do we need exascale?
It’s time to shift to a new computing scale
High Performance Computing, or HPC, has gradually become part of our daily lives, even if we are not always aware of it. It is in our medicines, our investments, our cellphones, in the films we watch at the cinema and the equipment of our favorite athletes, in the cars we drive and the petrol they run on. It has a direct impact on our quality of life, making our world a safer place with ever more accurate and precise weather, climate and seismic forecasts, and, thanks to researchers, a world we can more easily understand.
A never-ending need for more compute capacity
All sectors, in industry and in the academic and scientific community, demand ever more powerful computing systems, involving ever growing volumes of data. Generating finer-grained weather forecasts, designing cleaner aircraft engines, leveraging genomics to implement personalized medicine… all of these innovations require more computing power than is currently available, and will advance considerably thanks to exascale systems.
Imagining smart cities for high-quality urban services
“Without a serious shift in urban thinking, the consequences of escalating climate change, pollution and resource depletion pose an ever more serious threat to the resilience of cities right around the world.” Jon Lovell, Deloitte
Cities are faced with the following challenges:
- Enhance quality and performance of urban services
- Reduce costs and resource consumption
Smart city projects can address issues such as:
- data-driven real-estate valuation,
- parking analysis for urban planning (capacity planning, variable pricing…),
- monitoring of the movement of citizens around the city,
- monitoring of disease spread,
- city-level traffic management.
Smart city projects leverage emerging developments in the Internet of Things and Big Data. They gather large-scale data and transform this data into knowledge that helps address real-world challenges in an urban context.
Relying on fine-grain weather forecasts to anticipate severe phenomena
Without supercomputers, weather forecasting as we know it today would not be possible. And as the computing power available to meteorological agencies increases, weather forecasts improve in many ways. Between 1992, when Météo-France invested in their first supercomputer, and today, their compute capacity has increased by a factor of 500,000 – and Météo-France expects this trend to continue. Weather forecasting agencies worldwide need to:
- issue forecasts every hour;
- use a finer mesh size for finer and more reliable predictions;
- predict the exact location and timing of severe weather phenomena.
These objectives require increased model resolution and the incorporation of a greater quantity of data and observations in the forecasting process. This means more computing resources and the capacity to handle massive data efficiently.
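To give a sense of why finer mesh sizes are so demanding, here is an illustrative back-of-the-envelope estimate (this rule of thumb is not from the article itself): halving the horizontal mesh size of a forecast model doubles the number of grid points in each horizontal direction, and the numerical stability (CFL) condition then also roughly halves the allowed time step, so compute cost grows by about a factor of 8 per halving.

```python
# Hypothetical sketch of mesh-refinement cost scaling for a weather
# model: each halving of the horizontal mesh size multiplies the work
# by roughly 2 (x points) * 2 (y points) * 2 (time steps) = 8.

def relative_cost(refinement_halvings: int) -> int:
    """Approximate cost multiplier after halving the mesh size n times."""
    return 8 ** refinement_halvings

# Refining a 10 km mesh down to 1.25 km (three halvings) costs
# roughly 512 times more compute.
print(relative_cost(3))  # 512
```

Even this simplified model, which ignores extra vertical levels and the larger volumes of observational data to assimilate, shows why forecast improvements quickly outpace hardware generations.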
Designing more environment-friendly engines
The aeronautical industry is relying on simulation to reduce the quantity of pollutants emitted by aircraft engines, the noise they make, and the quantity of fuel they consume. Aircraft designers must tackle the following challenges:
- Reduce CO2 emissions by a factor of 2 by 2020, and by a factor of 4 by 2050
- Reduce fuel consumption by 15%
- Reduce noise
To achieve these ambitious goals, finer and more complex models are required, in particular to simulate combustion chamber performance with greater precision.
Aircraft engineers estimate that they will need their high performance computing resources to grow by 70% to 100% each year to reach their objectives.
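The compound effect of that growth rate is worth spelling out. The short sketch below (an illustration, not a figure from the article) shows what 70% to 100% annual growth means over a decade: somewhere between roughly 200x and 1000x today's capacity.

```python
# Illustrative compound-growth calculation for the annual HPC capacity
# increase quoted above (70%-100% per year).

def capacity_after(years: int, annual_growth: float) -> float:
    """Capacity multiplier after compounding annual growth for `years`."""
    return (1 + annual_growth) ** years

print(round(capacity_after(10, 0.70)))  # ~202x after 10 years at 70%/year
print(capacity_after(10, 1.00))         # 1024x after 10 years at 100%/year
```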
Leveraging genomics for better diagnosis and treatment
Genome sequencing and analysis are complex tasks that demand powerful analytics platforms. The compute time needed for sequencing has been reduced considerably in recent years, making it possible to drastically increase the amount of genomic data collected on large study populations. This opens the way to a new genomic-based healthcare service, leveraging in-depth and comprehensive genomic analyses for a predictive and personalized medicine. The challenge is to achieve:
- Better and predictive diagnosis
- More efficient treatments
- Customized dosing
To implement such a promising project, sequence analysis must be available on an industrial scale, and complex analytics must be supported. This requires computational power on an unprecedented scale.
Re-inventing agriculture to meet 21st century demand
With global populations rising rapidly, worldwide agriculture faces the challenge of producing enough food to meet increasing demand in conditions of changing climate and scarce natural resources. A new agricultural revolution based on a strong scientific foundation is needed to tackle the challenge of increasing production while also meeting environmental, economic, and social goals:
- Feed a world population that will reach 8.5 billion by 2030 and 9 billion by 2050
- Reduce pesticide use by 30%
- Take into account climate change, soil degradation and plant behavior
The solution: precision agriculture, which leverages the wide variety of data obtained from sensors to build fine-grained ground-mesh models, leading to better use of fertilizers and other agricultural inputs.
This requires massive computing and Big Data resources.