In 2011, as Facebook inched ever closer to 1 billion monthly active users, it faced a vexing crisis: uproar over a facial recognition algorithm that tagged people in photos without their consent.
Six years later, and as Facebook nears the 2-billion-user milestone, that complaint almost seems quaint.
Consider the problems facing the world’s biggest social network today. The Menlo Park, Calif., company is taking fire for spreading propaganda and misinformation, potentially influencing the outcome of elections in the United States and abroad. It’s being criticized for allowing hate and terror groups to fester on its platform. And it’s scrambling to stamp out horrific videos of suicides and murders streamed live.
That these setbacks come at a time when Facebook now reaches a quarter of the globe’s population only underscores how much the stakes have grown for the company. It connects people in ways never experienced before, providing the only outlet for free speech in some countries. Facebook is one of the few companies in Silicon Valley that can proclaim without a hint of irony that it has changed the world.
Armed with enormous power, the nation’s fifth-largest company by market capitalization has to take into account both the needs of shareholders and the communities it serves. How Facebook navigates its journey toward its next billion users could shape not just the financial health of the company but also the health of the societies increasingly influenced by its products.
“The more mature their tools become, the more profound the challenges they have,” said Mike Hoefflinger, a former Facebook marketing employee who wrote “Becoming Facebook,” a book about the company’s evolution into a behemoth valued at $430 billion.
“My sense is that they’re absolutely maturing into their role in the world,” Hoefflinger continued. “But the history of Silicon Valley has taught us that no company, no matter how great, can dominate forever.”
That vulnerability — and a willingness to adapt — has rarely been more evident than since the 2016 presidential election. Against the backdrop of a bitterly divided country, Facebook, which did not respond to a request for comment, has provided valuable clues as to how it will behave in the new political era.
Initially, Facebook founder and Chief Executive Mark Zuckerberg dismissed the notion that his platform helped spread propaganda and partisan clickbait that some say helped Donald Trump win the election. Zuckerberg even went so far as to call the idea “crazy.” But as criticism mounted, the 32-year-old executive began addressing the issue more forcefully and accepting his platform’s role.
Facebook partnered with independent fact-checkers to help vet content, restricted ads from fake news sites and tried to educate users about how to spot hoaxes in several countries holding elections, including Britain, Germany and France. To combat criticism that Facebook only fortifies the echo chamber effect online, the company is testing a “Related Articles” feature that will give users more perspectives on the news.
Zuckerberg was also compelled to take action over the rise of live-streamed violence on Facebook’s popular video broadcasting platform.
Last month, Zuckerberg had to kick off the company’s annual developers conference by offering condolences for the murder of 74-year-old Robert Godwin Sr. in Cleveland. Video of the crime was uploaded onto Facebook by the gunman, 37-year-old Steve Stephens, who later bragged about the killing on Facebook Live.
Last week, Zuckerberg announced plans to hire 3,000 more moderators to screen content for disturbing material.
The efforts to crack down on fake news and objectionable video also come months after Facebook, Microsoft, Twitter and YouTube agreed to share information to weed out terrorists and their content. That’s being done even as Facebook and others deflect allegations that technology companies are complicit in terrorist attacks because the perpetrators use their platforms.
Zuckerberg’s attempts to quell the company’s problems with fake news and the like are all steps in the right direction, experts say. But the question invariably remains: How can Facebook ever have enough moderators, fact-checkers or sufficiently sophisticated artificial intelligence to back up its lofty intentions?
“Two billion users is a challenge no company has ever had,” said David Kirkpatrick, chief executive of media company Techonomy and author of “The Facebook Effect.”