MAHK ZUCKENBERGER!!! You got some ‘splainin’ to do, my guy.
The New York Times reports that Facebook has apologized after users who watched a video posted on the site by The Daily Mail in 2020 saw automated prompts asking if they wanted to “keep seeing videos about Primates.”
There was nary a primate, Most Valuable or otherwise, in the video. Instead, it was a clip of a Black man interacting with officers after a white man called the police on him at a marina. According to the Times, Facebook says the prompt was an error caused by an artificial intelligence feature on the site.
From the Times:
Darci Groves, a former content design manager at Facebook, said a friend had recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called it “unacceptable” and said the company was “looking into the root cause.”
Ms. Groves said the prompt was “horrifying and egregious.”
Dani Lever, a Facebook spokeswoman, said in a statement: “As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Here we have another example of how artificial intelligence programs routinely demonstrate biases against people of color. We’ve seen stories on how this Minority Report-ass facial recognition software has led to innocent Black people being arrested or discriminated against due to computer errors.
Then there’s the racial bias found in voice recognition programs and Twitter cropping Black faces out of photos at a higher rate than it does white faces.
So yeah, it’s safe to say this problem ain’t nothing new. The question is, what are Big Tech companies like Facebook and the others going to do about it?
Based on what Groves told the Times, so far the answer is “not enough.”
Ms. Groves, who left Facebook over the summer after four years, said in an interview that a series of missteps at the company suggested that dealing with racial problems wasn’t a priority for its leaders.
“Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,’” she said.