UMD Researchers Awarded $3.4M DARPA Grant to Develop Automated Software-Analysis Tools

Wed May 13, 2015

A multi-institutional team led by the University of Maryland was awarded a $3.4 million grant from the Defense Advanced Research Projects Agency (DARPA) to develop automated software-analysis tools able to detect anomalies in critical operating systems.

When completed, the technology will help analysts quickly identify software vulnerabilities in large-scale systems that are used by the U.S. government or military, or that are directly tied to critical infrastructure.

Michael Hicks, a professor of computer science with appointments in the University of Maryland Institute for Advanced Computer Studies (UMIACS) and the Maryland Cybersecurity Center (MC2), is the principal investigator of the grant.

Hicks is joined on the project by assistant professors David Van Horn and Elaine Shi—also in computer science, UMIACS, and MC2—as well as researchers from Yale University and the University of California, Berkeley.

The research team is tasked with examining two distinct classes of software vulnerabilities and developing tools that can identify them quickly and accurately.

The first vulnerability involves “side channels” in software, which potentially allow hackers to learn secret information by measuring the behavior of running software, e.g., how long the program runs or how much memory it uses.
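To make the idea concrete, here is a minimal, hypothetical sketch in Python (not code from the UMD project): a token check that compares characters one at a time and stops at the first mismatch runs slightly longer the more leading characters an attacker has guessed correctly, so measuring response times can leak the secret piece by piece. A constant-time comparison such as Python's hmac.compare_digest closes that timing channel.

    import hmac

    SECRET = "hunter2"  # hypothetical secret, for illustration only

    def naive_check(guess: str) -> bool:
        # Leaky: returns at the first mismatch, so running time grows with
        # the number of leading characters the attacker has guessed correctly.
        if len(guess) != len(SECRET):
            return False
        for a, b in zip(guess, SECRET):
            if a != b:
                return False
        return True

    def constant_time_check(guess: str) -> bool:
        # hmac.compare_digest examines its inputs in a way that does not
        # depend on where they differ, suppressing the timing side channel.
        return hmac.compare_digest(guess, SECRET)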

The second problem involves attacks in which cybercriminals send a maliciously crafted input to software, causing it to consume far more time or memory than normal and thereby perform poorly or become unresponsive.
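A standard textbook example of this second class, offered here only as an illustrative sketch and not as one of the STAC challenge problems, is "catastrophic backtracking" in a regular-expression matcher: a short, carefully chosen input forces the matcher to try exponentially many ways of splitting the string before failing, so a few dozen characters can stall the program.

    import re
    import time

    # Nested quantifiers make the matcher explore exponentially many ways to
    # partition a run of 'a' characters when the overall match must fail.
    vulnerable = re.compile(r"^(a+)+$")

    # The trailing 'b' guarantees failure and forces exhaustive backtracking;
    # roughly 30 characters of input can keep the matcher busy for minutes.
    malicious = "a" * 30 + "b"

    start = time.perf_counter()
    vulnerable.match(malicious)
    print(f"elapsed: {time.perf_counter() - start:.1f}s")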

“It turns out these two problems are related, because both depend on how inputs to the program can influence the program’s subsequent execution behavior,” says Hicks. “We want to develop analysis tools that, when handed a piece of software, can look for whether that particular software might be vulnerable to either of these problems.”

Hicks says that the UMD-led team plans to create an infrastructure that combines different styles of automated analysis, while including input from human analysts as part of the process.

“Right now, developers test their code, they look at their code, they explain it to a colleague—so all of that would continue, but we would develop a way in which knowledge gained from that normal process would be treated as input into our tool suite,” he says.

All the research will be conducted under DARPA’s Space/Time Analysis for Cybersecurity (STAC) program, which has provided funding to almost a dozen other universities and organizations in addition to the UMD-led team.

DARPA is the federal agency charged with developing emerging technology for the U.S. military. It is perhaps best known for developing technology that later evolved into the commercial Internet.

A kick-off event for the STAC program was recently held in Dulles, Virginia.

“DARPA has this set up almost as a competition,” Hicks says: some of the participating teams are tasked with creating challenge problems, software that deliberately exhibits these vulnerabilities, while the rest work on developing the analysis tools to find them.

“The overall goal is that during the course of the grant period, these engagements will push each team’s technology to steadily improve, hopefully greatly outdistancing the state of the art,” he says. “Indirectly, the teams compete against each other, but there are no losers. The program will be viewed as successful if new techniques are developed that work—for reasons we understand—regardless of which team’s approach ends up ‘winning’ a particular engagement.”