Applying analytics to personal data can produce immense value for society—but how that data is used is key to preventing the privacy and cybersecurity violations and even discrimination that can unintentionally result.
Dennis Hirsch, JD, was recruited to Ohio State in January by TDA and the Moritz College of Law to spearhead the university’s new Program on Data and Governance, which studies ways to govern big data analytics and other uses of personal data in order to enhance their contributions to society and prevent harmful impact. “Society can govern through law and policy,” says Hirsch, “but it can also govern through ethics, business management, technology and social norms. Our program looks at all of these forms of governance as applied to data analytics.”
In addition to hosting conferences and speaker events, the program conducts non-partisan, interdisciplinary research. The program encourages faculty from other colleges and departments to collaborate and so build the interdisciplinary lens required to think most effectively about how to govern big data use. Drawing on the insights from its research and engagement, the program also advises policymakers, researchers and businesses on governing strategies, and analyzes the increasing role that data analytics plays in national security, immigration, and other areas of American government and democracy.
“Data analytics produces tremendous social and economic benefits, and as the practice has grown so too has an awareness that data analytics can also produce problematic impacts,” Hirsch says. “The idea behind the program is that it is important to protect against unfairness and discrimination, and it’s important for data scientists to address privacy and other implications so that data analytics can be fully supported and achieve all the great things it’s capable of achieving.”
A data collection project by a company called inBloom, funded with $100 million from the Gates Foundation, is considered the ultimate data governance cautionary tale. The nonprofit planned to gather hundreds of data points on millions of school children that data scientists could then use to develop individualized instruction plans for students.
“It was a noble and good idea, but parents got extremely nervous,” says Hirsch. “Some of the data fields had to do with history of violence in the family, divorces, student underachievement, and other personal information that students and their families might prefer to keep private. Parents were concerned about who would see the information and whether a security breach might inadvertently release it. They worried that this prejudicial data might stay with a child for the rest of their life. It was a perfect storm of concerns about privacy, security, fairness, and students being characterized in ways that may not reflect who they are.”
The result: The nine states that partnered with inBloom stopped sharing their information, and several began passing laws regulating the sharing of student data.
“Data analytics can do wonderful things like personalize education and make it more effective, but practitioners need to proactively and intentionally see and address the privacy, cybersecurity, and fairness issues,” says Hirsch. “We see it as really important to the future growth of data analytics that researchers face those more problematic impacts, not sweep them under the rug but really look at the issues thoughtfully and figure out how to mitigate the risks.”
It boils down to thinking through and solving for possible problems before they arise.
“Our program is working on ways to identify issues an analytics project might raise before it is rolled out, and so avoid the kind of reaction inBloom saw,” says Hirsch. “They may be legal concerns, they may be ethical. We are studying these issues and how to address them on the front end.”
Big data use and its impact, Hirsch has found, have much in common with environmental law, another area in which he has specialized and written extensively.
“About a dozen years ago, I was struck by the fact that just as the smokestack industry creates unwanted byproducts of its beneficial activities, data activity does too. Privacy injuries are analogous to environmental injuries: They’re pervasive, they affect a lot of people, they are externalities—costs that the business doesn’t have to bear and can pass on to others. Then I thought, if that’s true, could we use the 50 years of thinking about environmental governance and apply it to privacy?”
Hirsch was drawn to Ohio State by the enthusiasm within Moritz for exploring governance issues, as well as the breadth of related expertise throughout the university and initiatives such as TDA that are designed to drive collaboration.
“Ohio State is making a huge investment in data analytics, and there are many colleagues working in this field, not just in law, who are knowledgeable,” he says. “The opportunity to collaborate and think hard about the best ways to address this set of issues—that’s exactly the kind of work I want to be doing.”