Prejudiced AI Is Changing American Lives. What Can We Do About It?

Ailsa Johnson / © Culture Trip

Tech & Entrepreneurship Editor

Imagine a world where artificially intelligent algorithms make decisions that affect your everyday life. Now, imagine they’re prejudiced.

This is the world we’re already living in, says data scientist, Harvard PhD and author Cathy O’Neil. (Read part one of our discussion with Dr O’Neil here). We sat down with the National Book Award nominee to find out what we can do about prejudice in the era of big data.

CT: Is AI prejudiced?

CO: Every algorithm that hasn’t been explicitly made fair should be assumed to be prejudiced. Because as people, we are prejudiced. If we acknowledge that, and we are creating these algorithms with our values and our data, then we shouldn’t assume anything has magically happened to make things fair. There’s no magic there.

CT: Where do algorithms get their data?

CO: It depends on the algorithm. Sometimes it's social media, for things like political targeting, advertising, for-profit colleges and predatory lending – but a lot of the data isn't being collected on social media, or even online.

Data collection is increasingly tied to real life, like getting a job, working at your job, going to college or going to prison. Those aren't things we can circumvent with privacy laws. They're issues of power, where the people who are targeted by the algorithms have no power, and the people who are collecting the information and building and deploying the algorithms have all the power. You don't have any privacy rights if you're a criminal defendant, you don't have any privacy rights at your job, and you don't have much in the way of privacy rights if you're applying for a job, because if you don't answer the questions your future employer asks, you likely won't get the job.

We should think less about privacy and more about power when it comes to algorithms and the harm [they can cause].

CT: What can we do to make it better?

CO: We can acknowledge that these algorithms are not inherently perfect, and test them for their flaws. We should have ongoing audits and monitoring – especially for important decisions like hiring, criminal sentencing or assessing people at their jobs – to make sure that the algorithms are acting the way we want them to, not in some sort of discriminatory or unfair way.
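What might such an audit look like in practice? O'Neil doesn't prescribe a specific test, but one minimal sketch, in Python, compares a hiring model's selection rates across demographic groups using the 'four-fifths rule' that US regulators apply as a rough screen for disparate impact. The data, group labels and threshold here are illustrative assumptions, not anything O'Neil specified.

```python
# Minimal sketch of an ongoing algorithmic audit: compare a hiring
# model's selection rates across demographic groups, using the
# "four-fifths rule" that US regulators apply as a rough screen for
# disparate impact. All data here is illustrative; a real audit would
# run on logged production decisions, repeatedly, not once.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs -> rate per group."""
    totals, chosen = defaultdict(int), defaultdict(int)
    for group, selected in decisions:
        totals[group] += 1
        chosen[group] += int(selected)
    return {g: chosen[g] / totals[g] for g in totals}

def four_fifths_check(decisions, threshold=0.8):
    """Flag any group whose rate falls below threshold x the top rate."""
    rates = selection_rates(decisions)
    top = max(rates.values())
    return {g: (rate, rate / top >= threshold) for g, rate in rates.items()}

# Hypothetical decision log from a hiring algorithm.
log = ([("A", True)] * 60 + [("A", False)] * 40
       + [("B", True)] * 35 + [("B", False)] * 65)

for group, (rate, ok) in four_fifths_check(log).items():
    status = "ok" if ok else "FLAG: possible disparate impact"
    print(f"group {group}: selection rate {rate:.2f} – {status}")
```

Here group B's selection rate (0.35) is only about 58% of group A's (0.60), below the 80% threshold, so the audit flags it for human review.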


CT: What are the best and worst case scenarios for the data-driven future?

CO: The worst case scenario is what we have now – that we all blindly expect algorithms to be perfect, even though we should know better by now. We propagate past injustices and unfairness, and we continue ignoring the flaws of these algorithms.

The best case scenario is that we acknowledge these algorithms aren't inherently better than humans. We decide what we want as humans, what we're striving for and what we want society to look like, and we teach the algorithms those values. If we do that successfully, these algorithms could be better than humans.

CT: What role can everyday people play?

CO: The most important role an individual can play is to not implicitly trust any algorithm – to have an enormous amount of scepticism. If you're being evaluated by an algorithm, ask: 'How do I know it's fair? How do I know it's helpful? How do I know it's accurate? What's the error rate? For whom does this algorithm fail? Does it fail women or minorities?' Ask that kind of question.
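Her question 'for whom does this algorithm fail?' has a concrete computational form: instead of one aggregate error rate, break the errors down by group. A minimal sketch, using invented labels and predictions, shows how a headline accuracy number can hide exactly the failures she describes.

```python
# Sketch: a single aggregate error rate can hide who an algorithm
# fails. Break errors down by group. The labels and predictions
# below are invented purely for illustration.
def error_rates_by_group(records):
    """records: list of (group, y_true, y_pred) -> error rate per group."""
    errors, counts = {}, {}
    for group, y_true, y_pred in records:
        counts[group] = counts.get(group, 0) + 1
        errors[group] = errors.get(group, 0) + int(y_true != y_pred)
    return {g: errors[g] / counts[g] for g in counts}

# 90% accurate overall – but the mistakes fall almost entirely on group B.
records = ([("A", 1, 1)] * 85 + [("A", 1, 0)] * 5
           + [("B", 1, 1)] * 5 + [("B", 1, 0)] * 5)

overall = sum(t != p for _, t, p in records) / len(records)
print(f"overall error rate: {overall:.2f}")         # 0.10
for group, rate in error_rates_by_group(records).items():
    print(f"group {group}: error rate {rate:.2f}")  # A: 0.06, B: 0.50
```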

The second thing, beyond scepticism, is to organise with other people if you think an algorithm is being unfair to you or to them. A recent example is teachers. The statistical value-added models for teachers are terrible – almost random number generators. But they were being used to decide which teachers should get tenure and which should be fired, all over the US.

My suggestion was for teachers to get their unions to push back. And this did happen in some places. But it's surprising how little resistance there was, because of the mathematical nature of the scoring system.
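The 'almost random number generators' charge is itself checkable: if a value-added score really measured something stable about a teacher, the same teacher's scores should correlate from one year to the next. A minimal simulation – using made-up scores with a small signal and large noise, not real value-added data – shows how a mostly-noise score produces a year-over-year correlation near zero.

```python
# Sketch: if a score measures a stable quality, the same teacher's
# scores should correlate strongly year over year. Here the scores
# are simulated as a small signal plus large noise – made-up numbers,
# not real value-added data.
import random

random.seed(0)
quality = [random.gauss(0, 1) for _ in range(500)]  # true teacher quality
year1 = [q * 0.3 + random.gauss(0, 1) for q in quality]
year2 = [q * 0.3 + random.gauss(0, 1) for q in quality]

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# With signal 0.3 vs noise 1.0, expect roughly 0.08 – close to random.
print(f"year-over-year correlation: {pearson(year1, year2):.2f}")
```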

CT: How did you get into 'big data'?

CO: I worked on Wall Street and witnessed the financial crisis from the inside. I was disgusted by the way mathematics was used either to take advantage of people or to fool them. I saw the kind of damage that can come from mathematical lies – what I call 'the weaponisation of mathematics'.

I decided to get away from it, so I joined Occupy Wall Street and started working as a data scientist. I slowly realised that the same flawed and misleading hype around data algorithms was happening outside Wall Street as well, and that it was going to lead to a lot of damage. The difference was that while people all over the world noticed the financial crisis, I didn't think people would notice the failures of these big data algorithms, because they usually happen at the individual level.

Read part one of our discussion with Dr O'Neil here. Dr Cathy O'Neil's book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, is available now.
