Meeting Future Software Challenges in High-Energy Physics

22.05.2019 11:15 – 12:15

The computing hardware of today is vastly different from what was available when the LHC experiments began writing their software, almost 20 years ago. Single CPU cores with ever-faster clocks have given way to multi-core chips with wide vector registers for parallel processing. At the same time, Graphics Processing Units (GPUs) have become ever more powerful and ever more popular, offering thousands of cores and raw floating-point performance far beyond what CPUs can deliver. This presents two huge challenges to HEP software. The first is concurrency: adapting data processing to perform many tasks in parallel. The second is heterogeneity: instead of a monoculture of x86_64 processors, HEP must adapt to a mixture of different CPU architectures, GPUs and even more exotic processors, such as FPGAs. These challenges from modern hardware arrive just as planning for the High-Luminosity LHC foresees a tenfold increase in rate for ATLAS and CMS and a tremendous jump in event complexity, with pile-up perhaps reaching 200.
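
As a toy illustration of the concurrency challenge (a sketch of my own, not material from the seminar), standard C++17 already lets independent events be processed across all available cores; the Event structure and the dummy per-event computation below are hypothetical stand-ins for real reconstruction work:

    // Minimal sketch of event-level parallelism in C++17.
    // Each event is independent, so a parallel algorithm can
    // distribute the per-event work over every available core.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <execution>
    #include <vector>

    struct Event { double energy = 0.0; double result = 0.0; };  // hypothetical event record

    int main() {
        std::vector<Event> events(1'000'000);
        for (std::size_t i = 0; i < events.size(); ++i)
            events[i].energy = 0.001 * static_cast<double>(i);

        // std::execution::par asks the runtime to run the lambda
        // concurrently on multiple threads, one event at a time.
        std::for_each(std::execution::par, events.begin(), events.end(),
                      [](Event& e) { e.result = std::sqrt(e.energy) * std::log1p(e.energy); });

        std::printf("last event result = %f\n", events.back().result);
        return 0;
    }

Production frameworks go further, using fine-grained task scheduling rather than one thread per event, but the principle is the same: independent units of work spread over many cores.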

To face these challenges, computing and software experts from multiple experiments and institutions came together in 2015 to form a bottom-up organisation to tackle the problems of the coming years, and to avoid each experiment having to solve them alone. The HEP Software Foundation (HSF) undertook to organise the community to write a Community White Paper (CWP) roadmap, in which more than 300 physicists and computing experts came together to map out a plan for progress from event generation through to final analysis. Solution areas, such as machine learning, were put forward as major pieces of that strategy for the future. Since the CWP's publication, many HSF working groups have begun tackling the pressing issues we face, always with an emphasis on cross-experiment work and common solutions. In this seminar I will describe the genesis of the HSF and the process that led to the Community White Paper. I will review key areas where the HSF is helping to find effective solutions to problems in HEP computing. Finally, I will end with the question of how we can argue for more resources to fund software in the future, and how we can best work together as a community with other data-intensive sciences and with industry.

Venue

Building: Ecole de Physique

Quai Ernest-Ansermet 24
1211 Genève 4
Grand Auditoire A

Organised by

Faculté des sciences
Section de physique
Département de physique nucléaire et corpusculaire

Speaker

Dr Graeme Stewart, CERN

Free admission

Classification

Category: Seminar

Keywords: Software, High-Energy Physics

More information

Contact: missing email