Friday, February 19, 2010: 3:30 PM-5:00 PM
Room 10 (San Diego Convention Center)

The amount of data generated each year is likely to double every 2 years for the next decade as the cost of computing and networking plunges and the number of people and data-generating instruments connected to the Internet soars. George Gilder referred to this phenomenon as the "exaflood." Some research projects routinely generate terabytes and even petabytes of data; many others result in smaller, heterogeneous collections with valuable attributes. Realizing the full benefit and value of these diverse and voluminous data requires effective data management techniques, institutional arrangements, and policies. How can research organizations ensure that data are properly archived and made available to all the researchers who might find them useful? How can the origin and accuracy of data be ensured and properly documented? What new approaches are needed to make the preservation of, and access to, data of great scientific and social value a priority? Many organizations and advisory groups are now confronting these challenges. The symposium will bring together leading research and policy experts who will discuss the value proposition of the exaflood, provide compelling examples of applications that expand the boundaries of what is possible, and discuss some of the policy and management issues that must be resolved.
Bonnie C. Carroll, Information International Associates Inc.
Paul F. Uhlir, National Research Council
Michael R. Nelson, Georgetown University
Christopher L. Greer, Networking and Information Technology Research and Development