MSUToday’s “Ask the expert” articles provide information and insights from MSU scientists, researchers and scholars about national and global issues, complex research and general-interest subjects based on their areas of academic expertise and study. They may feature historical information, background, research findings or offer tips.
With so much of our lives being stored online and in digital databases, it’s reassuring to know there are researchers out there like Borzoo Bonakdarpour of Michigan State University.
Bonakdarpour, an associate professor in the Department of Computer Science and Engineering, works to keep information that people want to remain private from leaking out to the public.
Talk of data breaches probably calls to mind one (or several) of the well-publicized examples of people being hacked through phishing scams or lax security practices. But Bonakdarpour, a recipient of a 2023 Withrow Teaching Award, focuses on a more subtle aspect of data privacy that carries the same high stakes.
“We sort of trust that computer programmers don’t make mistakes,” Bonakdarpour said. “But they’re still human, right? They make mistakes all the time. This can introduce bugs that are accidental, but they can still result in massive security breaches.”
Bonakdarpour and his team recently won a grant from the National Science Foundation to develop what he calls “enforcers,” programs that can automatically spot and remedy those digital gaffes before they do harm.
MSUToday sat down with Bonakdarpour to chat about cybersecurity and learn more about his work.
Where does your work fit into the big picture of cybersecurity?
The project we’re talking about is just one aspect of cybersecurity. Outside of that, there’s database security, network security — there are a ton of different types of security.
My focus is on what’s called information-flow security and on developing algorithms that can verify the correctness of computer programs with respect to information flow.
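To make that concrete, here is a minimal illustrative sketch, not the team's actual tooling: one classic way to think about an information-flow check is to run the same program twice with identical public inputs but different secrets and ask whether an outside observer could tell the two runs apart. The function names below are hypothetical.

```python
def leaks_secret(program, public_input, secret_a, secret_b):
    """Illustrative information-flow check: run the program twice with the
    same public input but two different secrets. If the publicly observable
    output differs, the secret is influencing what an observer can see."""
    observable_a = program(public_input, secret_a)
    observable_b = program(public_input, secret_b)
    return observable_a != observable_b
```

Real verification algorithms reason about all possible pairs of executions rather than testing a handful, but comparing runs that differ only in their secrets is the core idea.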
What is information-flow security?
Information-flow security is about how secrets can leak into observable public channels. Let me give you one small example.
When I first started studying this, I was writing papers for a conference with my students, and I could log into the conference management portal that showed the status of all our submissions in table form.
The status was color coded: one “accepted” submission shown in green, one “rejected” submission in orange, and two pending submissions in yellow.
Each entry also included a “Session” column. For the green accepted paper, that column said “not yet assigned.”
Looking at that column for the two yellow entries, I saw that one was blank but the other said “not yet assigned.” From that, I could guess that this paper was probably internally marked as accepted.
But this information was supposed to be confidential. I should not have been able to guess anything about what was happening internally while it was pending. We took a screenshot of that table and put it in the introduction of one of our papers.
So while this example isn’t overly sensitive, it demonstrates how information can easily leak from a private channel to a public channel.
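As a rough illustration of how such a leak can creep into code (hypothetical Python, not the portal’s actual implementation), the text in a public column ends up depending on a confidential internal decision:

```python
def session_column(status_shown_to_author, internal_decision):
    """Builds the public 'Session' text for a submission. The bug: for
    pending papers, the wording depends on the confidential internal decision."""
    if status_shown_to_author == "accepted":
        return "not yet assigned"
    if status_shown_to_author == "pending" and internal_decision == "accept":
        return "not yet assigned"   # accidentally reveals the internal decision
    return ""                       # blank otherwise

# Two pending papers look identical to the author, so the column should too:
print(session_column("pending", "accept"))   # -> "not yet assigned"
print(session_column("pending", "reject"))   # -> ""  (an observer can tell them apart)
print(session_column("pending", "accept") != session_column("pending", "reject"))  # -> True: a leak
```

The repair is to make the public text independent of the secret while a paper is pending, for example by leaving the column blank for every pending submission; an automatic enforcer of the kind Bonakdarpour describes would aim to catch and mask exactly this sort of observable difference.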