
Understanding the computing for the Large Hadron Collider at CERN

A session at Crunch Conference 2016

The physics community at CERN has been analysing large volumes of physics data for many decades.
More recently, statistical methods and machine learning have also been applied to computing infrastructure metrics, to better understand and optimise the complex, distributed computing systems used for the Large Hadron Collider.
This presentation will give an overview of established and new techniques and tools that support these analysis activities.

About the speaker

Dirk Duellmann

Dirk Duellmann leads the Analytics and Development section of CERN's Storage group. He is responsible for the design and evolution of CERN's high-performance disk pools for physics data analysis, and he chairs the working group for Infrastructure Analytics of CERN's IT department. Previously he led the Worldwide LHC Computing Grid (WLCG) projects for persistency framework development and for distributed database deployment.

Dirk joined CERN in 1995 after receiving a PhD in High Energy Physics from the University of Hamburg. Before that, he worked at several software companies on the development of database management systems and applications.



Short URL

lanyrd.com/sfdtff

Official event site

crunchconf.com


