The platform’s former head of global public policy claimed that Facebook encountered election interference content as far back as 2006, about a decade before Mark Zuckerberg first acknowledged the problem.
Speaking at Sky News’ Big Ideas Live event, where experts and industry leaders discuss the biggest scientific and technological issues of our time, Paul Kelly said staff had to deal with it “all the time”.
“Back in 2006 and 2008, we saw the first stages of a misinformation campaign around elections,” Mr. Kelly revealed on a panel about the future of big tech companies.
“We were actually doing a lot of projects at the time trying to increase civic engagement on the platform. We certainly saw people trying to use misinformation early on in that phase to influence elections.”
Facebook founder Mark Zuckerberg admitted in 2017 that he should have taken concerns about fake news more seriously ahead of the 2016 presidential election, which Donald Trump won.
He had dismissed the idea as “crazy,” but later wrote in a public post in September 2017: “Calling it crazy is dismissive and I regret it.
“This issue is too important to dismiss.”
Mr. Kelly was responding to a question from an audience member about the connection between social media and growing divisions in American politics and elsewhere.
Sky News tech reporter Rowland Manthorpe questioned the gap between Facebook’s early encounters with misinformation and Zuckerberg’s acknowledgment of the problem, with Mr. Kelly saying “the scale has changed”.
“By then I was gone,” he stressed.
“But we’ve definitely seen some attempts to spread election misinformation early in the race.”
A spokesperson for Meta, Facebook’s parent company, said it had “developed a comprehensive approach to how elections play out on our platform” — “reflecting years of work” and “billions of dollars in investment.”
They added that they have “dedicated teams working on elections,” including this month’s U.S. midterm elections.
“Meta has hundreds of people working across more than 40 teams to combat election and voter interference, combat misinformation, and discover and remove offending content and accounts,” they said.
“We’ve also put in place stronger policies to stop claims of delegitimization or fraud on our services.”