BLOOMINGTON, Ind. – The world’s leading high-performance computing researchers gathered on the Indiana University Bloomington campus for the second Big Data and Extreme-scale Computing workshop, or BDEC2, Nov. 28 to 30 in the Cyberinfrastructure Building.
Professors Geoffrey Fox and Judy Qiu, both in the Department of Intelligent Systems Engineering in the IU School of Informatics, Computing and Engineering, collaborated to bring BDEC2 to IU. The workshop was funded by the U.S. National Science Foundation and Intel Corp.
“BDEC2 brings together the brightest minds in computer science to design the future of the HPC platform for scientific research,” Fox said. “Because Indiana University has a long and distinguished history of supporting the development of HPC tools and research – thanks in great part to IU President Michael McRobbie’s research interests in HPC – I thought it was appropriate for IU to host the kickoff meeting for the second BDEC series.”
Attendees from around the world – including national and university labs and supercomputing centers in Japan, China, Saudi Arabia, Spain and the United States – brainstormed strategies to converge Big Data and the high-performance computing tools necessary to make sense of massive data sets.
Just how big is Big Data? In 2020, when the volume of digital data is expected to reach 40 zettabytes, the number of network-connected devices – sensors, actuators, instruments, computers and data stores – is projected to reach 50 billion, roughly five times the planet's expected population that year.
“Machine learning and Big Data, combined with edge computing, have been a disruptive force in HPC,” said Pete Beckman, co-organizer of BDEC2 and director of the Northwestern-Argonne Institute for Science and Engineering at Argonne National Laboratory. “Now people want to hook up millions of smart devices – aka the Internet of Things – and fuse multiple data sets from satellites, medical databases and more and have them run continuously while also using supercomputers. This creates a challenge.”
With science becoming increasingly international, Beckman said that whatever solution is developed must be global. "For example, climate modeling is not done in just one country; it takes data from around the world and uses HPC tools in many different countries to make sense of the data."
Satoshi Matsuoka echoed Beckman's sentiments. He is director of the RIKEN Center for Computational Science, the largest supercomputing center in Japan, where he oversees the K supercomputer and its upcoming exascale successor, the Post-K supercomputer.
“Finding a solution to the problem of converging Big Data and high-performance computing will not be achieved right away, but we will solve it together with collaboration on an international scale,” he said. “It is very important that we create infrastructure platforms and software pieces to bring this all together.”
Matsuoka said that in Japan, a new concept is emerging: Society 5.0. In this fifth stage of human society, knowledge will be gleaned from data via artificial intelligence to solve humanity's greatest problems, like hunger, climate change and disease.
Jack Dongarra, of the University of Tennessee and Oak Ridge National Laboratory, is a BDEC2 co-organizer. He said the workshop is the ideal way to solve a complex problem and advance scientific discovery.
“By bringing together a number of communities, the workshop allows us to focus on ideas that will move Big Data and extreme-scale computing forward, in a way that will not duplicate things,” he said. “BDEC2 exposes ideas to other groups of people, enabling collaboration across universities, countries and institutions in a way that benefits science overall.”