NSF grant to fund advanced deep learning and visualization computing platform
The University of Illinois at Chicago has received a three-year, $1 million grant from the National Science Foundation to build a state-of-the-art computing platform that will incorporate multiple graphics processing units and enable faculty and students to execute deep learning and visualization codes faster, apply more sophisticated models to large-scale problems, gain greater insights, accelerate discovery and open new avenues of research.
“The new system will allow researchers to create and utilize an in-demand computing platform that can rapidly learn to identify anomalies in large data sets and produce visualizations or extract features of interest from images, which will help them home in on answers to research questions, and even tailor the questions themselves,” said Maxine Brown, director of the Electronic Visualization Laboratory at UIC and principal investigator on the grant.
The grant will support the development of a next-generation composable infrastructure computing system called COMPaaS DLV, short for Composable Platform as a Service Instrument for Deep Learning & Visualization. It will be developed and maintained by the Electronic Visualization Laboratory and will initially be made available to UIC College of Engineering faculty, many of whom provided use cases for the grant proposal explaining how they would use the system for research and research education.
“COMPaaS DLV will enable faculty to perform complex deep learning and visualization computations on large-scale big data that cannot be executed efficiently on existing computing systems,” said Andrew Johnson, Electronic Visualization Laboratory director of research, associate professor of computer science, and co-principal investigator on the grant. “The system is flexible and scalable, meaning it can be configured to quickly move, process and store data with few to no bottlenecks.”
COMPaaS DLV will be connected to UIC’s existing computing and network resources, managed by the UIC Academic Computing and Communications Center, as well as to regional and national clusters, storage, supercomputing and cloud facilities.
Currently, UIC deep learning researchers use small-scale clusters or clouds equipped with graphics processing units. These platforms, while sufficient for moderately sized training jobs, limit the size of data sets and the complexity of the neural network codes used for training, and require days or weeks to train sufficiently interesting models.
The composable computing platform will enable researchers to “compose” their own computers, so to speak, creating a temporary, on-demand computer system out of independent components – computing processors, graphics processing units, storage and networking – customized to optimally execute their application codes. Traditional computers, by contrast, allocate a fixed “slice” of the system to each application, regardless of whether the code needs more or less of each type of component to run efficiently.
“Given it’s a more flexible system, researchers can more quickly analyze big data problems and possibly identify specific subsets of the data they want to focus on or abandon. A traditional computer is kind of like a box lunch: you are stuck with what’s inside that box,” said Johnson. “A composable computer is more like a buffet, where you take as much of what you like and leave what you don’t want behind.”
“With composable computer architecture, the computer’s components form a fluid pool of resources so that different applications with different workflows can run simultaneously, with each configuring the resources it requires almost instantaneously at any time, assuming they are available,” Brown said. “Given the composable infrastructure’s inherent scalability and agility, it is more beneficial than traditional clouds and clusters that are rigid, over-provisioned and expensive.”
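To make the idea concrete, here is a minimal sketch in Python of how a shared pool of disaggregated components might be carved up and returned by different workflows. It is purely illustrative: the names (Pool, compose, release) and the resource counts are hypothetical and do not describe COMPaaS DLV's actual software, which the article does not detail.

from dataclasses import dataclass, field

@dataclass
class Pool:
    # A hypothetical shared pool of components: GPUs, CPU cores, storage (TB).
    free: dict = field(default_factory=lambda: {"gpus": 16, "cpus": 128, "storage_tb": 200})

    def compose(self, **needs):
        # Carve out a temporary "computer" if the requested parts are available.
        if any(self.free[k] < v for k, v in needs.items()):
            return None  # not enough components free right now
        for k, v in needs.items():
            self.free[k] -= v
        return dict(needs)  # the composed system, held until released

    def release(self, system):
        # Return the components to the pool for other workflows to use.
        for k, v in system.items():
            self.free[k] += v

pool = Pool()
# A deep learning training job and a visualization job compose different mixes
# of components from the same pool and run side by side.
training = pool.compose(gpus=8, cpus=32, storage_tb=50)
viz = pool.compose(gpus=2, cpus=16, storage_tb=10)
print(pool.free)        # components still available to other users
pool.release(training)  # training finishes; its parts go back into the pool

In this toy model, as in the composable architecture Brown describes, nothing is permanently assigned to any one application: each workflow takes only the mix of parts it needs, and those parts flow back into the pool the moment the work is done.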
One of the applications that will benefit from COMPaaS DLV is SENSEI Panama, a UIC collaborative project with evolutionary anthropologist Meg Crofoot at the University of California, Davis, to enable anthropologists to use virtual reality to “walk inside their data” and study animal behavior. Crofoot and UIC computer scientist Tanya Berger-Wolf, along with Johnson and Brown, are applying deep learning and visualization to process data from aerial photography, terrain maps, GPS collars, and accelerometers in order to study how coatis, kinkajous, spider monkeys, and capuchin monkeys on Barro Colorado Island in Panama forage for food.
“Composable architecture is new and not yet readily available from multiple vendors,” Johnson said. “But, in the field of computer science, these systems are believed to be the future of computing, analyzing and visualizing Big Data sets.”
Part of the grant will go toward the development of a software system to help users interface with the composable platform. Robert Kenyon and G. Elisabeta Marai at UIC are co-principal investigators on the grant.