Tackling the ethical challenges of big data

December 15, 2016

An authority on social data, Susan Etlinger argues we need to apply critical thinking and exercise caution as we enter the age of "data ubiquity."

Coming tech disruptions and revolutions have at times been predicted to fix all manner of societal and environmental ills, so at first glance Susan Etlinger's call for caution and restraint can seem odd. But while data may be propelling advances in many fields, she argues that caution and critical engagement with the information we collect are essential if we are to avoid error and protect our privacy.

"At this point in our history, as we've heard many times over, we can process exabytes of data at lightning speed, and we have the potential to make bad decisions far more quickly, efficiently, and with far greater impact than we did in the past," she said in a 2014 Ted talk.

Applying these ideas to the world of business, Etlinger advises companies on using social data and is a member of the Big Boulder Initiative, an industry organisation that promotes the successful and ethical use of social data.

We asked her about how cities and companies can safely and effectively handle information as we enter the age of data ubiquity.

Why is critical thinking so important to handling big data?

Here we are, with more data than we know what to do with. And human beings have this tendency to give a lot of respect to technology. It's funny, if you look at charts and graphs, if you look at studies that come out, people tend to trust charts and graphs quite a bit. What's interesting is underneath that chart or that graph might be terrible data, and it actually might be showing something that's untrue. Or it might not account for something important.

In the age of nearing data ubiquity, are we keeping pace with our critical thinking and processing of this information?

It depends on the kind of data. If you think about, for example, something like weather prediction, which has become so good, the set of data you need and the possible outcomes are more or less constrained.

Then when you get to things we call human data—human expression, text, speech, audio, any of that—interpreting meaning, and then even translating, and then interpreting meaning again, you get into some real challenges in terms of understanding what people actually mean.

For example, on Twitter, you could see something like, "Oh great, I dropped my phone and broke it." And most technologies will classify that as a positive statement. So things like sarcasm, things like where certain groups might use veiled language because they might be politically active under administrations that frown upon that. Even simple things like the language teenagers use, which changes all the time, can be missed.
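The failure Etlinger describes is easy to reproduce. A minimal sketch, assuming a toy lexicon-based classifier (the word lists and function name here are illustrative, not from any production system), shows how a sarcastic tweet gets scored as positive simply because it contains the word "great":

```python
# Toy lexicon-based sentiment scorer: it counts matches against small
# positive and negative word lists. The lists are illustrative only.
POSITIVE = {"great", "love", "happy", "awesome"}
NEGATIVE = {"terrible", "hate", "sad", "awful"}

def naive_sentiment(text: str) -> str:
    # Normalise: lowercase and strip trailing punctuation from each word.
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# The sarcastic tweet contains "great" and no lexicon negatives
# ("dropped" and "broke" aren't in the list), so the classifier
# labels it positive -- exactly the misreading described above.
print(naive_sentiment("Oh great, I dropped my phone and broke it."))
# -> positive
```

Real systems are more sophisticated than a word-count lexicon, but the underlying problem is the same: sarcasm, coded language, and fast-shifting slang carry meaning that isn't visible in the surface words the model matches on.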

Could you summarise some of the applications and ethical concerns regarding image recognition (the ability of computers to read and interpret images accurately) and emotional detection technology (which aims to analyse and interpret a person's mood either through speech patterns, facial movement or other cues)?

Some of the potential uses are really interesting. So you could imagine that if you had a photographic record of a city over time you could understand a little bit about population patterns, you could understand where people live during different times, where people work, what commute patterns look like. You could understand sentiment and whether people seem happier or sadder or more worried than they were before. You could look at things like interests, sports or purchasing patterns, what people eat, anything. The question is, given that you can do that, should you do that?

I think there are some ways in which this technology can help us understand our history better. I think there are ways in which it can help us understand others better.

[Regarding emotion detection] there are tremendous applications for social good (...) but at the same time, those same technologies can also be used for mass surveillance, for other political purposes. They can be used for scary reasons, too.

It's a lot of power we potentially have at our fingertips, now. So I'm arguing that we need to take a breath. And not stop it, because innovation will happen no matter what we do. But to really think about the ways we incorporate it into our businesses and into our society.

You've discussed many cases of well-intentioned big data initiatives leading to unforeseen breaches of privacy, such as the charity Samaritans' controversial social media monitoring "suicide watch" app. How can we guard against these events?

Well, this is difficult. When you think about artificial intelligence, one of the hallmarks of AI is that it's really difficult to understand how an algorithm works. An algorithm is sort of like a recipe, in the sense that it tells you what ingredients to use and in what proportions to get your outcome, and the outcome could be a fantastic cake or something completely inedible, depending on what you do. And people don't want to share [their algorithms] because of competitive advantage.

One thing I really like about computer vision in particular is that you see the images come back. The beautiful thing about data science is you can then go back and work on your algorithm, on your data model, to better reflect the world we want to live in rather than the world we actually live in.

I like looking at images because they show you right away. They show you right away when you google the phrase "three black teenagers" versus the phrase "three white teenagers." And you see images of three white teenagers having fun at picnics and images of three black teenagers being booked into police departments.

The data that we have encodes the biases we have. And I really think that, as painful as it is, that gives us an opportunity to stop and be better.
