CERN Swaps Databases To Tackle Its Petabyte-A-Day Habit

by janrinok from SoylentNews on (#6EXTH)

Arthur T Knackerbracket has processed the following story:

Europe's particle accelerator at CERN spews out around a petabyte of data daily, which means monitoring the computing infrastructure that processes the data is crucial.

CERN's main activities are based on the Large Hadron Collider (LHC), which propels subatomic particles around a 27km circuit, 100 meters underground, then smashes them into each other for eight distinct experiments to observe. One of those is the CMS experiment, which aims, among other goals, to spot the particles responsible for dark matter.

Like the other experiments, CMS shut down for upgrades from 2018 to 2022 and restarted in July last year for the three-year Run 3 period, during which scientists will increase the beam energy and sample physics data at a higher rate.

In preparation, the four big LHC experiments performed major upgrades to their data readout and selection systems, adding new detector systems and computing infrastructure. The changes will allow them to collect significantly larger, higher-quality data samples than in previous runs.

But Brij Kishor Jashal, a scientist in the CMS collaboration, told The Register that his team was currently aggregating 30 terabytes of monitoring data over a 30-day period to track the performance of its computing infrastructure.

"Entering the new era for our Run 3 operation, we will see more and more scaling of the storage as well as the data. One of our main jobs is to ensure that we are able to meet all this demand and cater to the requirements of users and manage the storage," he said.

Read more of this story at SoylentNews.
