Michael Baugh CDBC CPDT-KSA
“The problem with Facebook is Facebook”
– Siva Vaidhyanathan – Antisocial Media: How Facebook Disconnects us and Undermines Democracy.
I’ve known about the inherent problems with Facebook for quite a while. I’m a student of behavior, human behavior especially. Social media, primarily Facebook and Twitter, have been implicated in ginning up conflict in a number of very significant incidents ranging from clashes in Sri Lanka to genocide in Myanmar to our own elections. I was never active on Twitter. My decision to leave Facebook has been more than a year in the making.
I already know that some of you (maybe a lot) will get all riled up that I’m posting this. Stay in your lane. Right? We don’t want anything that even hints at politics on a dog trainer’s blog. I get it. Stay with me, though. There’s a training connection here.
Social media platforms use artificial intelligence (AI) to build, maintain, and update their algorithms. Algorithms are the complicated programs that control what we see and in what order on sites like Facebook, Twitter, and YouTube. They are designed to keep us engaged, to keep us on the site longer. Facebook and YouTube have the most robust algorithms and the most advanced AI. Keeping us humans engaged in social media is interestingly similar to how we keep our dogs engaged in learning. Reinforcement. We all know dogs who will bring us the ball endlessly so long as we keep throwing it (some of us have those dogs). Many are so hooked they will retrieve until they fall over from exhaustion. Ball throwing reinforces the persistent behavior even to the dog’s potential detriment.
What reinforces our behavior on social media? Many of us check our phones as frequently as a half dozen times an hour. And we spend hours “doom scrolling.” Hours. What reinforces that persistent (and potentially detrimental) behavior? It would be hard for us humans to figure out on our own. But artificial intelligence can do it very quickly. With billions of data points available, AI can adjust social media algorithms on the fly to feed us exactly the content that will keep us engaged, keep us watching, keep us posting (and sometimes viewing their ads).
Folks tell me all the time that their dogs like this treat or that one. My reply is almost always the same. I don’t want to know what your dog likes. I want to know what he will work for. I like dog training videos. I like inspirational memes (really, I’m that guy). Heck, I like some of your vacation photos. But, what will we humans work for? What keeps us commenting and replying to comments over and over? What engages social media users and keeps them coming back for more? The research points to one thing conclusively: anger. “Studies of Twitter and Facebook have repeatedly found the same,” writes Max Fisher in The Chaos Machine, “though researchers have narrowed the effects from anger in general to moral outrage specifically.”
Humans evolved to operate in groups of usually 150 or fewer. Moral outrage, the indignation over a group member’s perceived wrongdoing, kept us moving forward. We would rally as a community fueled by moral outrage and, presumably, the errant group member would fall back into line. It was a cohesive adaptation. The problem is that moral outrage becomes fracturing, even violent, when it’s cranked up in thousands or even millions of people at once (well beyond the 150 we are built for). Moral outrage is what feeds social media and then social media feeds it back to us. It keeps us turned on. It keeps them in business.
Did social media executives figure out this is how we tick? Maybe. But, it’s more likely they just told the AI to get them more users and more engagement. The algorithm did the rest. Our behavior trained the AI and then it trained us. It’s like when we ask “Am I training my dog or is he training me?” The answer is yes. That’s how learning works. It’s cooked into the system, which is why Siva Vaidhyanathan wrote “The problem with Facebook is Facebook.” This isn’t something social media does. This is what social media is.
Moral outrage is not new for us dog trainers. It’s an old joke. The one thing two dog trainers can agree on is that the third one is wrong. It’s worse than that, though. Those of us who use positive reinforcement have vilified those who use punishment, and vice versa. It’s played out on social media in various forms over the years. Admittedly I’ve participated, posting dog behavior blogs and pithy memes of my own. We take a jab at them. They take a jab at us. And, the algorithms notice. Over time the posts we see most frequently on social media are about how terrible they are. Researchers have found that posts with keywords associated with moral outrage perform the best. So moral outrage is what we see and moral outrage is what we feel.
Fisher: “This creates powerful incentives for what philosophers Justin Tosi and Brandon Warmke have termed moral grandstanding – showing off that you are more outraged, and therefore more moral, than everyone else.” One well known trainer went so far as to proclaim there is a crisis in dog training. Others rallied around him in a vibrant display of virtue posturing. “The effect scales,” Fisher writes, “people express more outrage, and demonstrate more willingness to punish the undeserving, when they think their audience is bigger. And there is no bigger audience on earth than Twitter and Facebook.”
Another well known trainer, in perhaps an unrelated move, decided to reach out to a renowned colleague who uses punishment. The idea seemed reasonable enough. Have a conversation, record it, post it on social media. But, the problem with social media is social media. Though a majority of people (trainers are people) will say they support open dialogue and cooperation, social media algorithms don’t amplify their voices. They amplify moral outrage. When this trainer crossed the line to publicly engage a colleague on the other side, the outrage was swift and vociferous. At best, from a distance, it was annoying. Up close, for those involved, it was ugly. Livelihoods were threatened. Careers were tarnished. This hit close to home. I know these people. I like them. Social media companies banked a dollar or two over it (maybe), a drop in the bucket. No one in Silicon Valley noticed, of course. It all played out on server farms.
We trainers talk a fair amount about how dogs are now in environments that don’t match their evolution. We can make a good case for that. Dogs bred to sprint and scent and scavenge are cooped up in suburban homes. Trainers and behavior consultants can help with that. But what about us? Aren’t we also now in an environment beyond our evolutionary boundaries? Big social media is less than 20 years old. We don’t adapt that quickly. What if we are hardwired to have 150 connections? I have more than 5,000 on Facebook, most of whom I will never meet. Moral outrage has its function. But what do we do when it’s manipulated in the pursuit of money, when we are manipulated, when we become the product to boost the corporate bottom line, our suffering be damned?
People in Myanmar were given free access to the internet under one condition. They had to access it through Facebook. Facebook picked up the tab, just a cost of doing business in a new market. After the genocide Facebook pointed to their trope about the greater good, connecting humanity. We have to wonder, though; connecting whom, and how, and to what end?
A few of my friends and colleagues have said they’ll miss me on Facebook. They emailed. Some texted. So far they number fewer than 150. I assured them we are still connected. No one else commented. There was no cause for outrage. It was nice. It was just us humans connecting. Humanity at last.
Michael teaches aggressive dog training in Houston, TX and is a constant student of behavior. He’s reachable by email: email@example.com