Facebook Is Us

Image credit: iStock

Facebook has had a tough month. Really, they’ve had a tough last five years, which have been a nonstop parade of bad news—everything from spreading misinformation in the 2016 election, to compromising user data in the Cambridge Analytica scandal, to helping instigate the Capitol Riot. But since early September, the sixth largest company in the world by market cap has seen one of its steepest stock price declines ever—plunging by about 12%, around $120 billion in value.

The latest wave of setbacks started with the release of a trove of internal documents, dubbed the Facebook Files and broken largely through a series of articles in the Wall Street Journal, showing that Facebook’s executives knew about and largely chose to ignore many ill effects of their platform, from polarizing our politics to triggering teen suicide. That cheery revelation was followed by the whistleblower behind the leak, Frances Haugen, going public, in everything from a 60 Minutes interview to Congressional testimony, about what she saw inside the company. Then, of course, there was the massive service outage (not only of Facebook but of Instagram and WhatsApp as well), rare and suspicious enough in its timing to prompt speculation that it was a deliberate distraction tactic, or even a chance for Facebook engineers to expunge compromising code.

Most mere mortal businesses wouldn’t have survived one of these scandals, let alone a seemingly endless series of them. The mainstream media has ruthlessly portrayed Facebook as an evil empire, a monopolist manipulating our society and exploiting our children. The company is one of the few topics that liberals and conservatives agree on—both sides hate them and relish eviscerating their executives every time they’re dragged to Capitol Hill. It doesn’t help that Mark Zuckerberg looks like an unhinged sociopath who hasn’t seen sunlight in years.

As cathartic as it is to blame a nefarious villain for our societal problems, my perspective is that Facebook is less a deliberate bad actor and more an example of extreme groupthink. As Haugen herself said, “No one at Facebook is malevolent.” Arguably, they’ve done exactly what companies are supposed to do: maximize growth, revenue, and profits for their shareholders. What makes the business so unusual, though, is that their “product” is us. We go there to share and consume information about ourselves and our friends. And Facebook is extremely adept at giving us exactly what we want: whatever dopamine hit will hook us so they can serve us up to advertisers. The real moral question of Facebook and its ancillary brands is whether a company should be able to exploit human behaviors for profit.

To really understand Facebook’s current state and the case for regulating it, it’s informative to look back at the origin of social networking. I happen to have been running marketing for one of the world’s largest social networks, hi5, when social networking businesses were first erupting. Though the company is largely forgotten now, along with contemporaries like Orkut, Six Degrees, Bebo, and dozens more, when I was there from 2008 to 2010, hi5 was a top-15 website in the world. It was not obvious at the time that Facebook would become the behemoth it is today. I recall telling a colleague who worked at Facebook that they were foolish to pass on Yahoo’s billion-dollar offer to buy the company. My bad. While modern social networking was invented by Friendster and scaled by MySpace, it was perfected by Facebook. Their reward was to become a trillion-dollar company.

But back in those Darwinian days, Facebook was in a fight for its life. With the benefit of hindsight, the company did several things right that were critical to its ultimate success. Likely due to its humble origins on college campuses, Facebook instilled a behavioral norm that users provide their real names and pictures on their profiles. Most other social networking sites were full of random, anonymous profiles, making it much harder to find your real friends. Facebook also invented the News Feed. Most other social networking sites got boring after a while, but Facebook fed you an endless diet of the next thing to look at, click on, or react to. And Facebook launched Facebook Platform, which enlisted thousands of software developers to build addictive widgets, from hotness meters to FarmVille, that gave Facebook’s users even more to do: more clicks, more ads, more revenue.

What all these innovations had in common is that they increased user engagement. Turns out connecting with real friends is much more engaging than connecting with random strangers. Turns out if you stick an endless stream of information in the middle of the screen, users will scroll and click on it until they’re left drooling with their eyes crossed. Turns out if you turn millions of rabid developers loose on your platform, some of them will invent apps that compel users to constantly check on the status of pretend crops. Who woulda thought. Really, what made Facebook a trillion-dollar company was one central success factor that they were better at than anyone else: discovering and amplifying what we human beings find engaging.

Imagine how deeply ingrained that gets into the DNA of an organization. What a foundational reward system it becomes. An entire generation of Facebook employees—from executives to engineers, designers to data scientists—became really, really, really good at giving us exactly what we want. On a daily basis for the last 17 years, they have continuously tuned their UI, their messaging, their interactions, their algorithms to keep us engaged, entertained, addicted. We are the product. And every day we blithely hand over, via the tell-all truth-sayer of our click stream, exactly what activates us.

What Facebook has harnessed is two million years of human evolution. Giving us what we are biologically hardwired to want. As any social scientist can tell you, human beings seek mates. Click. Human beings like being part of a group, forming tribes for our self-defense. Click. Human beings like to share information for the benefit of the tribe. Click. Human beings care what other human beings think about them. Click. Human beings like the validation of other human beings agreeing with us. Click. Human beings like to be entertained. Click. Whatever it is that gets us to click, tap, swipe, view, like, comment, share or otherwise engage is dopamine for us and ad dollars for Facebook. Symbiosis.

Every company in the history of companies has tried to perfectly serve the needs of its customers. “Give the customer what they want.” “The customer is always right.” When that company-to-customer connection is mild, it’s called “Brand Loyalty.” When it’s strong, it’s called addiction. Which brings us back to the fundamental difference between Facebook and other companies that created addictive products, whether it was tobacco, oil, or OxyContin. Facebook’s product is us.

The credit that Facebook rarely gets is that they almost certainly do much more to tamp down our basest instincts than to exploit them. I saw these behaviors firsthand at hi5. Left unmoderated, social networks spawn hate groups, child pornography, and prostitution rings as efficiently as they do baby pictures, high school reunions, or the Arab Spring. Again, that is the inherent nature of the platform: it gives us what we want. For some of us, those wants are really bad things. And because those wants are public, Facebook creates its own moral obligation to regulate them. Of course, they can't permit activity that's illegal, or unethical, or exploitative, or . . . Merely unpleasant? Mildly objectionable? Possibly offensive? Where do we draw the line? What comments, actions, or behaviors do we tolerate? For the first time, humanity’s proclivities, from our darkest obsessions to our mildest prejudices, are up for discussion in a public forum.

And herein lies the undeniable imperative that Facebook needs to be regulated. We cannot permit a handful of unelected private-sector executives, personally enriched by their company’s stock price, to decide what is acceptable and what is not while hiding behind Section 230 of the Communications Decency Act. As decision-makers they are fundamentally conflicted. Though they haven’t cynically exploited the most pernicious behaviors that would transpire on their platform if it were left completely unmoderated, their ultimate loyalty is to their shareholders, not the greater societal good. We should not expect, nor want, them to be those arbiters.

That is why, as Haugen testified, every dilemma between Facebook’s interests and the well-being of its users is decided in favor of Facebook. Just as many Facebook users are addicted to the service, Facebook itself is addicted to growth, revenue, and profit. It cannot deny its sole purpose as a corporate entity, its fiduciary obligation to itself. Even with the best intentions of its well-meaning employees, it can’t self-regulate. Because we are the product, not even Zuckerberg himself fully controls it; there are just too many of us, two billion every day, generating content. Even if Facebook’s executives were saints, they couldn’t stop us from subjecting ourselves to our own worst instincts. The company, like many of its users, is a drug-addled pusher: rationalizing its behavior, in denial, on a bender, just one more bump. We need an intervention.

Michael Trigg