Facebook will not remove deepfakes of Mark Zuckerberg, Kim Kardashian and others from Instagram
The work, featured in a site-specific installation in the UK as well as circulating in video online, was the first test of Facebook’s content review policies since the company’s decision not to remove a manipulated video of House Speaker Nancy Pelosi received withering criticism from Democratic political leadership.
“We have said all along, poor Facebook, they were unwittingly exploited by the Russians,” Pelosi said in an interview with radio station KQED, quoted by The New York Times. “I think they have proven — by not taking down something they know is false — that they were willing enablers of the Russian interference in our election.”
After the late May incident, Facebook’s Neil Potts testified before a smorgasbord of international regulators in Ottawa about deepfakes, saying the company would not remove a similar video of Mark Zuckerberg. This appears to be the first instance testing the company’s resolve.
“We will treat this content the same way we treat all misinformation on Instagram. If third-party fact-checkers mark it as false, we will filter it from Instagram’s recommendation surfaces like Explore and hashtag pages,” said an Instagram spokesperson in an email to TechCrunch.
The videos appear not to violate any Facebook policies, which means that they will be subject to the treatment any video containing misinformation gets on any of Facebook’s platforms. So the videos will be blocked from appearing in the Explore feature and hashtags won’t work with the offending material.
Facebook already uses image detection technology to find content on Instagram that has been debunked by its third-party fact-checking program. When misinformation is present only on Instagram, the company is testing the ability to link into the fact-checking product on Facebook.
“Spectre interrogates and reveals many of the common tactics and methods that are used by corporate or political actors to influence people’s behaviours and decision making,” said Posters in an artist’s statement about the project. “In response to the recent global scandals concerning data, democracy, privacy and digital surveillance, we wanted to tear open the ‘black box’ of the digital influence industry and reveal to others what it is really like.”
Facebook’s consistent decisions not to remove offending content stand in contrast with YouTube, which has taken the opposite approach in dealing with manipulated videos and other material that violates its policies.
YouTube removed the Pelosi video and recently took steps to demonetize and remove videos from the platform that violated its policies on hate speech — including a wholesale purge of content about Nazism.
These issues take on greater significance as the U.S. heads into the next Presidential election in 2020.
“In 2016 and 2017, the UK, US and Europe witnessed massive political shocks as new forms of computational propaganda employed by social media platforms, the ad industry, and political consultancies like Cambridge Analytica were exposed by journalists and digital rights advocates,” said Howe, in a statement about his Spectre project. “We wanted to provide a personalized experience that allows users to feel what is at stake when the data taken from us in countless everyday actions is used in unexpected and potentially dangerous ways.”
Perhaps the incident will be a lesson to Facebook in what’s potentially at stake as well.