Stopping cyber attacks and terrorism has long been a focus – and frustration – of law enforcement professionals, who use surveillance, informants and phone taps, among other means, to thwart unlawful activities. Now data analysis may be a valuable new tool in this fight, according to Associate Professor of Information Systems Sam Ransbotham.
“If some vulnerability in software is found, it’s not magically and suddenly adopted by every bad guy across the planet at the same time,” says Ransbotham. “They go through processes as well: They pick one vulnerability versus another, and they figure out how to use some faster than others.
“The solution is to manipulate information and incentives so we make our countermeasures as fast as possible, while making their exploitation as slow as possible.”
Ransbotham’s work in this area has earned him the prestigious National Science Foundation CAREER Award, which recognizes a junior faculty member “who exemplifies the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organizations.” The five-year, $402,000 award will fund his project “Using Analytics on Security Data to Understand Negative Innovations.”
“I’m pretty excited about it,” says Ransbotham, who joined the Carroll School of Management faculty in 2008. “There’s a signal from this award that this research is the kind of investigation we need to support.”
Ransbotham says one of his projects will involve analyzing detailed logs from security tools such as firewalls, intrusion detection systems and virus protection software.
“What I’m doing in these projects is taking that data and trying to understand how the bad guys are thinking and working,” says Ransbotham. “Think about it: How do we learn about what the good guys do? Well, we ask them questions, we interview them, we have them take a survey. We just can’t do that sort of thing with bad guys. We have to use the data that they’re leaving to try and better understand their behavior.”
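The article doesn’t describe Ransbotham’s actual pipeline, but the general idea – mining intrusion-detection logs for patterns in how attackers adopt a given exploit – can be sketched in a few lines. The log fields, signature names and the “distinct attacking sources per vulnerability” summary below are illustrative assumptions, not his method.

```python
# Illustrative sketch only: summarize how each exploit signature "diffuses"
# across attackers, using hypothetical intrusion-detection log records.
# Field names and the metric are assumptions, not Ransbotham's actual pipeline.
from collections import defaultdict
from datetime import datetime

# Each record: (timestamp, source address, exploit signature) exported from an IDS.
raw_alerts = [
    ("2015-03-01 04:12:00", "203.0.113.5",  "CVE-2015-0001"),
    ("2015-03-01 09:47:00", "198.51.100.7", "CVE-2015-0001"),
    ("2015-03-04 22:03:00", "192.0.2.44",   "CVE-2015-0002"),
    ("2015-03-09 13:30:00", "203.0.113.9",  "CVE-2015-0001"),
]

def diffusion_summary(alerts):
    """For each signature, report when it was first observed and how many
    distinct attacking sources have used it."""
    first_seen = {}
    sources = defaultdict(set)
    for ts, src, sig in alerts:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        first_seen[sig] = min(t, first_seen.get(sig, t))
        sources[sig].add(src)
    return {
        sig: {
            "first_seen": first_seen[sig].date().isoformat(),
            "distinct_sources": len(sources[sig]),
        }
        for sig in first_seen
    }

if __name__ == "__main__":
    for sig, stats in diffusion_summary(raw_alerts).items():
        print(sig, stats)
```

In practice, the interesting quantity is the shape of that adoption curve over time – how quickly new attackers pick up a particular exploit – rather than a single count at the end.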
Recent years have seen a proliferation of detailed data, he notes, driven by the popularity of mobile devices, the growing number of appliances with embedded computers, and the rising number of users.
“We are putting computers into everything: cars, refrigerators, toaster ovens, LEGO toys – I mean, everything has a computer in it,” says Ransbotham. “We have an increased need to secure those as best we can. This becomes particularly important when you consider the number of devices that need to be corrected when there is a problem – and problems are inevitable. How do we respond?
“There are wonderful programs that try to get computing into more and more hands by offering low-cost devices and computers. But seldom do I see people think about what we’re going to do if we have all these people coding and using computers – rarely do people think about the associated security and privacy implications.
“No one ever adopts a product because they think, ‘Man, it has awesome security’; they think, ‘Man, it has awesome features.’ That’s perfectly rational, but this attitude often leads to security as an afterthought in the market.”
With security concerns taking a back seat, he says, the vast treasure trove of data serves as both friend and foe.
“How do we increase the ‘friend-ness’ and decrease the ‘foe-ness’ of it? That is what’s really important for society right now,” says Ransbotham. “When we talk about driverless cars or drones delivering packages, these things are feature-driven rather than security-driven. We know there’s going to be a day that someone does something bad with the code that operates a driverless car. It’s inevitable. The question is, how do we get as much of the good from an innovation like that while minimizing the bad?”
As a possible answer, Ransbotham points to the diffusion of innovation – understanding how knowledge spreads.
“How can we disclose vulnerabilities and weaknesses in a way that accelerates the diffusion of countermeasures,” asks Ransbotham, “but restricts the ability of the bad guys to take advantage of it? Should we tell when we know something bad? Should scientists who develop a new way of creating a new strain of bird flu or a new virus publish those results?
“The tension is, if you tell people, then bad guys learn. If you don’t tell people, then the good guys can’t protect themselves.”
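The article stops at that framing, but the tension can be made concrete with a toy diffusion model. Everything below – the logistic adoption curves, the rate and timing parameters, and the notion of “net exposure” as the gap between exploit adoption and patching – is an illustrative assumption, not a model from Ransbotham’s project.

```python
# Toy illustration of the disclosure trade-off: two logistic adoption curves,
# one for defenders deploying a countermeasure and one for attackers adopting
# the exploit. All parameters are invented for illustration.
import math

def logistic_adoption(t, rate, midpoint):
    """Fraction of a population that has adopted by day t."""
    return 1.0 / (1.0 + math.exp(-rate * (t - midpoint)))

def net_exposure(days, patch_rate, patch_midpoint, exploit_rate, exploit_midpoint):
    """Crude exposure score: how much exploit adoption outruns patching,
    summed over the observation window."""
    total = 0.0
    for t in range(days):
        attackers = logistic_adoption(t, exploit_rate, exploit_midpoint)
        defenders = logistic_adoption(t, patch_rate, patch_midpoint)
        total += max(0.0, attackers - defenders)
    return total

if __name__ == "__main__":
    # Scenario A: full public disclosure, and attackers move faster than patching.
    a = net_exposure(60, patch_rate=0.25, patch_midpoint=25,
                     exploit_rate=0.25, exploit_midpoint=15)
    # Scenario B: coordinated disclosure gives defenders a head start.
    b = net_exposure(60, patch_rate=0.25, patch_midpoint=12,
                     exploit_rate=0.25, exploit_midpoint=30)
    print(f"Cumulative exposure, attackers learn first: {a:.1f}")
    print(f"Cumulative exposure, defenders get a head start: {b:.1f}")
```

The point of the sketch is simply that the same disclosure decision shifts both curves; the policy question Ransbotham raises is how to shift the defenders’ curve earlier without doing the same for the attackers’.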