From the Grid to the Cloud: Computing for Big (and Small) Science

Saturday, 14 February 2015: 8:00 AM-9:30 AM
Room LL20C (San Jose Convention Center)
Research facilities across the physical and life sciences are generating ever-increasing quantities of data that must be accessed and analyzed by larger, more international scientific collaborations. Particle physicists took the first steps toward tackling this problem in the 1990s, when a collaboration of 8,000 physicists on five continents needed to access and analyze 15 petabytes of data every year from the then-under-construction Large Hadron Collider. The result was the Worldwide LHC Computing Grid: a shared distributed computing infrastructure now used by scientific disciplines from biology to civil engineering. Today the life sciences community is tackling its own unique cross-border data challenge, and successful solutions promise to enable breakthroughs in agriculture, food, energy, and health care. The model of shared distributed computing has also leapt from the research sector to the private sector and back again through the development of cloud “computing on demand.” But with the next generation of facilities promising even more massive increases in data, can we keep up? And what technological, financial, and legal issues must the research community consider in doing so? This session will examine the basic concepts of distributed computing, their application in both the physical and life sciences, and the challenges facing the next generation of distributed computing.
Vincenzo Napolano, National Institute for Nuclear Physics
Terry O'Connor, Science and Technology Facilities Council
Matt T. Goode, Biotechnology and Biological Sciences Research Council
Frank Wuerthwein, University of California, San Diego
Shared Computing Driving Discovery: From the Large Hadron Collider to Virus Hunting
Davide Salomoni, National Institute for Nuclear Physics
The Future of Distributed Computing