The short video shows more than a dozen California Highway Patrol officers surrounding a single protester, who is already on the ground. They are shouting instructions. While it’s hard to make out what they are saying, the person on the ground doesn’t appear to be resisting. Then, amidst the throng of cops, at least one officer appears to drop his knees onto the protester’s body.
On June 7, Khanstoshea Zingapan posted the video, sent to her by someone who witnessed the confrontation, on the Facebook page of her documentary video charity, Black Zebra Productions. “This is Sacramento, CA. This is a CHP officer with a knee on a neck! Why is this still happening?” she wrote. Zingapan released the video at the height of protests over the death of George Floyd after a Minneapolis police officer knelt on his neck for nearly nine minutes. Over the course of the next day, Zingapan says, the video continued to garner views and comments. But when she woke up on June 9, Black Zebra Productions’ Facebook page had been taken down. An email from the company indicated it had been removed for “posting fraudulent or misleading content.”
Four years ago, Zingapan began releasing documentary footage of her community in Sacramento on Facebook and other social media channels under her artist name, Black Zebra, “showing my world through the eyes of a black woman in America.” In one series last year she documented local police’s evacuation of a long-standing homeless encampment. When Floyd’s killing sparked protests, Black Zebra Productions became a resource for the Sacramento activists and residents, with Facebook serving as the platform and warehouse for its livestreams. Zingapan worked out a freelance arrangement with the Sacramento Bee that she hoped would prevent police harassment as she and a crew of volunteers live-streamed protests, were hit by rubber bullets, and choked on tear gas. Then, in a flash, all their work disappeared.
Despite Facebook’s recent proclamations and donations designed to indicate backing for America’s swelling anti-racist movement, the company and its CEO Mark Zuckerberg have been targeted by such activists largely because of the platform’s appeasement of President Donald Trump, even as he posts misinformation about voting and exhortations to violence. Meanwhile Facebook is also home to a chorus of Black people who use the site to fight racism but whose own posts and pages are often penalized for calling out bigotry, even as vitriol against them remains on the platform.
What happened to Zingapan, a Black person’s Facebook page being taken down for mysterious reasons, is actually common. In Zingapan’s case, her followers had sprung into action within moments of the page disappearing, reaching out to local politicians, the Northern California chapter of the ACLU, and Facebook itself. That afternoon, the company restored the page and all its footage. Zingapan later met with the heads of multiple Facebook departments, accompanied by a lawyer from the ACLU. “I did feel as though they did give me the time to hear my story,” she said last week. “But at the same time, I don’t know where that goes. I don’t have the reassurance that this won’t happen again. I don’t even know what happened.”
While Zingapan’s own interactions with the company left her in the dark for weeks about what triggered the takedown, Facebook told Mother Jones this week that the page had been incorrectly flagged for intellectual property violations. “We mistakenly unpublished the Black Zebra page for violating IP policies,” a Facebook spokesperson said. “Once we realized there was no violation, the page was immediately restored. We take these mistakes seriously and work with our teams to minimize the chance of it happening again.” While Facebook uses artificial intelligence to detect intellectual property issues like a video including copyrighted music, users can also report intellectual property violations to Facebook. To protect Black Zebra’s page from future unjustified sanctions, Facebook said some of its content will receive additional review if it is flagged for IP violations in the future.
This explanation is unsatisfactory to Zingapan’s attorney, Abre’ Conner of the ACLU of Northern California. Copyright violations are an unlikely problem for an outlet like Black Zebra, which live-streams newsworthy events and produces its own documentary footage. Facebook’s own processes suggest that only content presenting potential intellectual property violations should be taken down, warning that broader removals of entire pages should be reserved for repeat offenders.
“Although Facebook has recently given us that [intellectual property] response, it’s unclear whether we have all of the information,” Conner said. “We asked basic questions to get a better sense of what that meant. We are continuing to have conversations, but for now, we are left with more questions than answers.” After learning of Facebook’s explanation, Zingapan stressed that she still is not sure the company offers a “safe place and platform” for Black users like herself. “We’ve outlined the problem. And now it’s time for Facebook to actually state some solutions.”
Zingapan isn’t the first activist or Facebook user granted a meeting with the company after feeling abused and underserved by the platform. Nearly a decade ago, representatives of the Jewish community and women’s groups complained to Facebook about harassment on the platform and the spread of hate, making the first of many journeys to Menlo Park to warn Facebook about the growing problem of hate speech on its platform. And in February 2016, company representatives met with five activists representing the Black Lives Matter movement and racial justice organizations. While Facebook had become a valuable organizing and messaging tool for the movement, the organizers complained about constant hate on the platform—violent threats and images and racial epithets. For years, activists have warned that hate speech leads to doxxing, stalking, and other real life harms. Making matters worse, Facebook sometimes blocked BLM activists’ and groups’ pages for no apparent reason or for minor infractions of the company’s then-opaque content policies. (Facebook did not release its community standards enforcement policies until 2018.)
“It felt like an obligatory meeting where Facebook told us, in very nice words, that they really couldn’t do anything to help us,” Shanelle Matthews, who attended the 2016 meeting as the then-communications director for the Black Lives Matter Global Network, told Mother Jones last year.
Despite the intervening years, what Zingapan shared with Facebook last month sounds familiar. “You’re blocking us, but on our posts,” she recalled explaining, “white supremacists…are coming after us, like physically coming after us, trying to find out where our locations are.” Zingapan and supporters who follow her page constantly flag racist comments on Black Zebra’s videos and livestreams. “I’m having to reimagine my work and how I do things because the platform that I’m on is not providing me the safety that I need,” she says.
Racial justice groups have spent years and countless meetings with Facebook trying to get it to take Black Lives Matter and the rights of minority groups seriously. They pushed for a civil rights audit that has led to some new policies, including on voter suppression, but those have so far proven insufficient in curbing disinformation and hate. The final audit report, released Wednesday, made clear that while Facebook has made policy improvements around hate, its enforcement is uneven and over-reliant on predictive technology—even as the company has struggled to fully understand the biases in its own algorithms.
After 2017’s deadly white supremacist rally in Charlottesville, Virginia, Facebook removed many hate groups and racist accounts, though others remain. The white nationalist “boogaloo” movement that found a home on Facebook is only now, after two members allegedly murdered at least one law enforcement officer in an attempt to start a race war, being scrutinized by the platform. Historically, the best way to get Facebook to take action against discrimination has been to take the company to court. Now, as the Black Lives Matter movement gains momentum, activists want more from Facebook than donations to racial justice groups and words of support. While Facebook says its myriad meetings on hate speech have persuaded the company to do more on the issue, including piloting programs to improve hate speech moderation, the initiatives have come too slowly for activists who, given the amount of hate on the platform, want to see faster and more dramatic action.
In June, several civil rights groups formed the Stop Hate for Profit Coalition, urging advertisers to cease spending money on Facebook for the month of July to pressure the platform to take further steps to address hate and misinformation. More than 1,000 brands have signed on, including Verizon, Coca-Cola, Hershey’s, and Unilever. While Zuckerberg announced modest changes to Facebook’s content policies on Friday shortly after the boycott began, the advocates were not placated. “We won’t settle for less than real steps to address white supremacy on their platform,” tweeted Rashad Robinson, the president of Color of Change, a racial justice group involved in the effort, after Pepsi signed on. Among the coalition’s 10 demands for Facebook: “Enable individuals facing severe hate and harassment to connect with a live Facebook employee. In no other sector does a company not have a way for victims of their product to seek help.”
Despite this sustained pressure from racial justice groups, activists on the ground still experience the platform as a battlefield. As many Black people on Facebook are all too aware, the company’s algorithms are very good at taking down content that may discriminate against white people. In one example, the sentence “White men are so fragile” was immediately taken down as hate speech, according to one woman who told her story to USA Today in 2019. In fact, saying “white” exposes posts or whole pages to the threat of a takedown, which has led to people misspelling the word to avoid detection. The situation means that people of color struggle to discuss racism on a platform whose owner touts it as a haven for free speech and a marketplace of ideas.
Instead, the ideals of free speech mostly serve to protect hatred against people of color. Tanya Faison, the founder of Black Lives Matter Sacramento, attended the 2016 meeting with Facebook. “It’s been years now and nothing has happened,” she told Mother Jones in 2019. “It seems that white folks and men are protected more than black people and women…A lot of racist posts say whatever they want to us.”
“Their algorithms are off,” says Conner of the ACLU. “People are risking their lives in order to ensure that there is equality for Black people. And Facebook can really be a part of ensuring that people see that. But with the algorithms, the way that it is now with proactive tech…it has not been in favor of Black activists and organizers.”
“People get blocked all the time for just saying the word ‘white,'” Faison recently reiterated. “It’s all still the same.” Meanwhile, Facebook has allowed threats against Faison and other BLM activists to remain up. In early June, one man whose page appears dedicated to harassing BLM activists posted a video of Faison with an all-caps message: “I WOULD LOVE TO WAKE UP TO THE NEWS THAT SOMEONE OFFED THIS EVIL BITCH.” As one commenter on the BLM Sacramento Facebook page wrote, “Facebook is so ridiculous. I reported his page and profile and the crazy posts hes made and everything came back as meeting community standards. HOW? What he said about Tanya is basically a death threat.” It’s not clear if Facebook ultimately took any action, but at some point the harasser changed “offed” to “offended.” The post remains up, and in context, the threat remains clear. Another post last month by the same harasser read, “I’d like to give a shout out to all the good people that have been making looters disappear and never be seen again.” The message, which was followed by an emoji of two beers clinking together, was set against a skull-themed backdrop. It is still up.
Such threats, coupled with activists’ own pages and posts coming down or being deleted, “hinders the work” of racial justice organizers, Faison says. “If they’re not able to go on social media, they can’t go live when something’s happening. It impacts what we’re doing. It impacts it in a huge way because we use social media platforms to inform people.” The number of videos of police brutality that went viral during the first weeks of the protests makes it clear that access to social media is, indeed, crucial to such activists.
“As black folks are continuing to demand justice, and transparency and accountability, Facebook also has a responsibility to not perpetuate a system that has already been really harmful towards black activists, organizers, journalists and media,” says Conner. She is helping to lead “conversations with Facebook,” she says, “trying to get more to the bottom of this and hopefully there will be some positive changes.”
But given the company’s track record it’s no mystery why some thought a boycott would be a good idea. After the action’s leaders met with Zuckerberg on Tuesday, they came away sounding much like Zingapan. “#StopHateForProfit didn’t hear anything today to convince us that Zuckerberg and his colleagues are taking action,” attendee Jessica J. González, co-CEO of the advocacy group Free Press, said in a statement. “Instead of committing to a timeline to root out hate and disinformation on Facebook, the company’s leaders delivered the same old talking points to try to placate us without meeting our demands.” The boycott continues.
“What would this moment look like right now for Black communities in which they’re not having to deal with white supremacists” on social media, asks Steven Renderos, executive director at the group MediaJustice, which has spent years engaging with Facebook over protecting Black users. Instead, as Renderos spoke, the country was dealing with the aftermath of Zuckerberg’s decision to leave up a post from Trump glorifying violence against protesters. “We would be in a different place in terms of being able to protect and value Black lives in this moment if Facebook took those concerns seriously. But I think we’re at this moment now where we’re yet again realizing that they don’t care.”