Facebook's dilemma: balancing the campaign against disinformation with securing users' privacy

Scholars and researchers are accusing Facebook of failing to honor its promise to publicly release data on disinformation campaigns spreading across its platform so that academics can study and analyze them ahead of the 2020 elections. Facebook's explanation: it is struggling to balance the fight against misinformation with safeguarding its users' privacy.

In April 2018, Facebook promised to launch a program that would share large volumes of posts, links, and other user data with researchers around the world. The program's goal was to let scholars study and flag disinformation campaigns on the site.

“Our goal is to focus on both providing ideas for preventing interference in 2018 and beyond, and also for holding us accountable,” Facebook founder Mark Zuckerberg said last year in testimony during the highly publicized congressional inquiry into Russian interference in the 2016 elections. He added that the program would hopefully produce its “first results” by the end of 2018.

However, almost two years after Zuckerberg made that promise, much of the data remains unavailable to researchers. Facebook argues that it is difficult to share the information needed to study disinformation campaigns on its platform without compromising users' privacy and online security.

Furthermore, the data Facebook has already released falls well short of the ambitious commitments Zuckerberg made in April 2018, limiting its usefulness for studying disinformation.

Seven nonprofit groups that have helped finance the research effort, including the Knight Foundation and the Charles Koch Foundation, have threatened to withdraw their support for the program because of Facebook's lagging fulfillment of its promises.

With most of the promised data still unavailable, scholars worry that, as the 2020 election approaches, they will know little more about disinformation campaigns on the world's largest social network than they did before the 2016 election.

Nonetheless, Facebook maintains that it has done more to fight misinformation than any other tech company. Elliot Schrage, Facebook's vice president of special projects, who oversees the initiative, defended the company's efforts, saying it is doing its best.

“The whole reason Mark announced this program in the first place is he believes that the most productive and instructive debates are driven by data and independent analysis,” Schrage said in an interview. “I know of no private company that has invested more to build tools and technologies to make private data publicly available for public research.”

Scholars and academics, on the other hand, argue that Facebook has a moral obligation to lead the fight against fake news and misinformation on its platform.

“Silicon Valley has a moral obligation to do all it can to protect the American political process,” said Dipayan Ghosh, a fellow at the Shorenstein Center at Harvard and former privacy and public policy adviser at Facebook. “We need researchers to have access to study what went wrong.”

While Zuckerberg's testimony suggests he is eager to fight misinformation on Facebook, the company is in a difficult position: its reputation for protecting users' privacy has already suffered.

Facebook was recently hit with a record fine by the Federal Trade Commission over the Cambridge Analytica scandal, in which the British political consulting firm harvested the data of millions of users to manipulate public opinion and influence the 2016 presidential election that put Trump in office.

“At one level, it’s difficult as there’s a large amount of data and Facebook has concerns around privacy,” said Tom Glaisyer, chairman of the group of seven nonprofits supporting the research efforts. “But frankly, our digital public square doesn’t appear to be serving our democracy.”
