CERN: EU Data Protection Laws Hindering Cloud Adoption
The cloud could solve CERN's big data problem, but legislation is delaying progress
Researchers at the European Organisation for Nuclear Research (CERN) in Geneva are being held back from adopting cloud computing on any significant scale due to the delay in establishing a European regulatory framework for data protection.
Speaking at the Cloud Computing World Forum in London this week, Bob Jones, head of CERN openlab, said that the European Commission's failure to push through clear guidelines for data protection in the cloud was hindering uptake within the scientific community.
The benefits of cloud computing are not lost on CERN. The organisation's existing European data centres currently manage up to 15 petabytes of data a year over 100,000 CPUs, but that only represents 20% of the total data generated by its Large Hadron Collider (LHC) accelerator.
In reality, the LHC's four major experiments - Atlas, LHCb, ALICE (A Large Ion Collider Experiment) and the Compact Muon Solenoid - generate around a petabyte of raw data per second, but only about one percent of that is stored, Jones said.
CERN is keen to explore how the cloud can help it deal with its big data problem, hence its involvement in "Helix Nebula - the Science Cloud," launched earlier this year.
Helix Nebula will give CERN access to more computing power to process data from its Atlas experiment, which is designed to observe phenomena that involve massive particles that might shed light on new theories of particle physics.
"CERN's computing capacity needs to keep up with the data coming from the Large Hadron Collider and we see Helix Nebula - the Science Cloud as a great way of working with industry to meet this challenge," said Frdric Hemmer, head of CERN's IT department, back in March.
Jones said that CERN now plans to embark on a two-year pilot phase, which will involve moving data between commercial cloud data centres and its own publicly funded data centres. He emphasised that CERN has no plans to give up its existing systems, but wants to create a hybrid cloud ecosystem marrying the two together.
During the pilot phase, flagship projects will be deployed in order to analyse functionality and performance. Jones said that, with CERN currently consuming 150,000 CPUs continuously and simultaneously, it was not clear whether the cloud would be able to scale to its needs.
Also speaking at the conference, Megan Richards, Deputy Director General of Information Society and Media for the European Commission, said that new data protection legislation is currently passing through the European Parliament. The proposals will be finalised within the next year and come into effect within the next two and a half years.