Big Tech's Racial Injustice Action Misses How It's Fed Into The Issue

Critics say that while donations from big tech can help, they don't address how these companies' products may feed into racial injustice.

Even in today's world, the old phrase "actions speak louder than words" seems to ring true. While tech and social media companies are responding to the recent protests by issuing messages and donations in solidarity with those fighting for racial justice, critics say more needs to be done.

“I think it's great they're making these types of commitments now, but I think it's going to take more than just money,” says Ashley Nelson, a social media expert at Tulane University.

Twitter used its diversity account to call attention to "injustices faced by Black and Brown people on a daily basis," while Amazon issued a similar message calling for the end of the inequitable and brutal treatment of Black people. Facebook CEO Mark Zuckerberg pledged $10 million to groups working on racial justice, and YouTube made a similar $1 million commitment.

But critics say these gestures don't sufficiently address racial justice issues, or how the companies' own products feed into these problems.

"Companies are going to have to sit back and say, 'OK, we're in solidarity, we support it, what does that mean?'" Nelson said. "Because right now, that's words on a post. It doesn't have any meaning. It doesn't have any value."

Zuckerberg did note that "$10 million can't fix this," and said, "It's clear Facebook also has more work to do to keep people safe and ensure our systems don't amplify bias." Notably, Facebook has recently been accused of putting protesters in danger by allowing posts from President Donald Trump that said, "When the looting starts, the shooting starts."

Amazon's tweet drew a response from the ACLU, which called on the company to "stop selling face recognition surveillance technology that supercharges police abuse." That tech has also been shown to be racially biased: An ACLU test found Amazon's system incorrectly matched 28 members of Congress with publicly available mugshots of people who had been arrested, and a disproportionate number of false matches were people of color.

"So what you're going to get is, for example, a system that's trained to do facial recognition is likely to have more of a false positive on minority groups, which means more people are going to be caught up in dragnets for no reason because the system flags them as a false positive," said Suresh Venkatasubramanian, a computer science professor at the University of Utah.
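The disparity Venkatasubramanian describes can be made concrete: if a matching system's false positive rate is computed separately for each demographic group, unequal rates mean one group is wrongly flagged more often. The sketch below uses entirely invented data and a hypothetical two-group split; it is an illustration of the metric, not a model of Amazon's system or the ACLU's test.

```python
# Illustrative sketch: per-group false positive rate for a hypothetical
# face-matching system. All records below are invented for demonstration.

def false_positive_rate(predictions, labels):
    """Fraction of true non-matches (label 0) wrongly flagged as matches."""
    flags_on_non_matches = [p for p, y in zip(predictions, labels) if y == 0]
    if not flags_on_non_matches:
        return 0.0
    return sum(flags_on_non_matches) / len(flags_on_non_matches)

# Each record: (system said "match", person truly in the database, group).
# Every person here is a true non-match, so any "match" is a false positive.
records = [
    (1, 0, "group_a"), (0, 0, "group_a"), (0, 0, "group_a"), (0, 0, "group_a"),
    (1, 0, "group_b"), (1, 0, "group_b"), (0, 0, "group_b"), (0, 0, "group_b"),
]

for group in ("group_a", "group_b"):
    preds = [p for p, y, g in records if g == group]
    labels = [y for p, y, g in records if g == group]
    print(f"{group}: false positive rate = {false_positive_rate(preds, labels):.0%}")
```

With this toy data the system flags 25% of group_a and 50% of group_b, the kind of gap that, applied at police scale, pulls more members of one group into dragnets for no reason.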

Nelson told Newsy that in order to actually make a change, these platforms need to prove that they're investing in minority communities through employment and training opportunities.

"I think it's going to be like, 'Pony up, show us, what can you do to actually help?'" Nelson said. "What percentage of the people that they have working at these companies are minorities? It's extremely low. Are there training opportunities that you can come into poor neighborhoods and do that?"