When it comes to decision-making, big data can be a big help. Or it can be a Pandora's box of potential bias. If a public safety organization could fully tap into the strengths of big data, paramedics could reach the injured more quickly, and police officers would be better positioned to respond to crime. But how can organizations be sure the data they rely on doesn't carry inherent bias, or that it isn't being used to unfairly profile people?

Recognition of both big data's potential and the struggle to understand and apply it led a multidisciplinary Boise State University research team to land the university's first grant at the intersection of "big data" science and public policy last spring.

Big data refers to vast amounts of data that are collected and interpreted by machine algorithms – written by data scientists – in an attempt to find local and national patterns in specific fields. The team is using the $97,000 exploratory grant to investigate how big data is perceived, and how it can most effectively be used, by criminal justice agencies in the western United States.

“What we’re doing has been recognized by the National Science Foundation (NSF) as unique because we are talking to the end users, not just Microsoft and those who are pushing that data out. We are starting at the other end, learning what their challenges are, what they know about big data and what problems we could potentially help solve.” – Dr. Eric Lindquist

The NSF has charged the group with building relationships over the life of the one-year planning grant. Lindquist’s team has collaborated with data experts in computer science at Boise State, and met with a number of public safety agencies. Meetings in Boise included policing agencies, emergency response teams and the ACLU of Idaho.

As part of the West Big Data Hub, the team also participated in the first conference of the American Society for Evidence Based Policing in Phoenix, Arizona, as well as meetings in Washington, D.C., Austin, Texas, and Boulder, Colorado, where they spoke with agencies tasked with implementing evidence-based policing. Throughout the region, some police departments are using predictive analytics to determine where crime is most likely to happen, then deploying resources to head it off. But community groups worry that big data may not be objective.
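At its simplest, the place-based analysis these departments use amounts to ranking areas of a map by their history of reported incidents. The sketch below is a toy illustration of that idea, with entirely hypothetical incident data – it is not any department's actual model:

```python
# Toy sketch (hypothetical data) of the idea behind place-based predictive
# policing: rank map-grid cells by historical incident counts and direct
# patrols toward the highest-count cells.
from collections import Counter

# Hypothetical past incidents, each tagged with a (row, col) grid cell.
incidents = [(2, 3), (2, 3), (0, 1), (2, 3), (4, 4), (0, 1), (2, 2)]

counts = Counter(incidents)

# Cells ranked by how often incidents were reported there.
hotspots = counts.most_common(3)
print(hotspots)
```

The community concern quoted above is visible even in this toy: the ranking only reflects where incidents were *reported* in the past, so any bias in historical reporting or enforcement is reproduced in the "prediction."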

“We are finding across the board that departments lack funding, as well as skilled personnel to work with big data,” said Kimberly Gardner, a doctoral candidate in the School of Public Service’s Public Policy and Administration program and a member of the research team.

“There is often a lack of critical thinking around the broader context of how big data is utilized. Parts of society are ignored. For example, if you build a data set from Twitter, a huge part of the population is ignored. We are tasked with figuring out how you make it representative.”
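The representativeness problem Gardner describes can be made concrete with a small numerical sketch. All figures below are hypothetical, chosen only to show the mechanism: if one group is far more active on a platform like Twitter, a naive average over platform users drifts away from the population average:

```python
# Illustrative sketch (hypothetical numbers): a platform-derived sample
# over-represents heavy users of the platform, skewing naive estimates.

# Hypothetical population shares, platform-use rates, and an average
# opinion score for two age groups.
groups = {
    #           (pop_share, platform_rate, avg_score)
    "under_40": (0.45,      0.60,          0.70),
    "over_40":  (0.55,      0.15,          0.30),
}

# True population average of the score.
true_avg = sum(share * score for share, _, score in groups.values())

# Naive platform-sample average: each group contributes in proportion to
# how many of its members use the platform, not to its population share.
sample_mass = {g: share * rate for g, (share, rate, _) in groups.items()}
total_mass = sum(sample_mass.values())
naive_avg = sum(sample_mass[g] / total_mass * groups[g][2] for g in groups)

print(f"true: {true_avg:.3f}  naive platform estimate: {naive_avg:.3f}")
```

With these made-up numbers the naive estimate lands near 0.61 against a true average of 0.48, because the under-40 group is four times as likely to be on the platform. Reweighting each group back to its population share (post-stratification) is one standard correction, though it requires knowing the group shares in the first place.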

The goal of all of these conferences is to connect the NSF big data hub network across the West. Solving these challenges will require thinking from a variety of perspectives. The interdisciplinary nature of the issue is reflected in the research team, which includes Frances Lippitt, a master of public administration student, and Gabe Turner, an undergraduate double majoring in philosophy and engineering.

“Gabe is the ideal undergraduate for us because he has an engineering focus but he’s also really aware of the philosophical underpinnings of ethics. Those two don’t always coincide, so it’s been great to have him to discuss that disconnect,” Lippitt said.

From left: Kimberly Gardner, Frances Lippitt and Eric Lindquist pore over big data.

Lindquist will involve more students in the spring with a new graduate course built around big data and public decision-making. He will also create a multidisciplinary, long-term student research team project, known as a Vertically Integrated Project, in the College of Innovation and Design.

“We’re building this team to mirror how interdisciplinary this technology really is,” Gardner said.

The NSF grant is part of $11 million in grants dedicated to the Big Data Hubs and Spokes Projects and associated planning activities this year. The foundation also plans to invest more than $110 million in Big Data research in Fiscal Year 2017, and the Boise State research team has submitted additional proposals around the questions they’ve identified.

“Our team is focused on the overarching question of the use of science in decision-making,” Lindquist said. “Big data is supposed to help us make good decisions, but if you don’t have good trust in the data, you don’t use it.”