I have a rather eclectic group of friends on Facebook. On any given day, I will see posts for Donald Trump, posts for Hillary Clinton, posts for Bernie Sanders, posts by Christians, posts by atheists, posts by conservatives, posts by liberals, even posts by a few bomb-throwing types (figuratively speaking).
I may be in the minority, and not for any reason of my own.
The algorithm made me do it.
When Gizmodo published a story claiming Facebook was biased against conservatives in how it managed Trending Topics, I thought, “And this is news?” As I read the story, though, I realized it was more serious than I had believed. Facebook could have almost singlehandedly turned Black Lives Matter into a major story when it wasn’t one to begin with.
The major newspapers and television networks know this. Facebook is the largest driver of online traffic to their news sites. And in the kind of crazy election year we’ve been experiencing, we really don’t know how much of what is happening is affected by algorithms used by Facebook and other social channels like Twitter, and how much is real.
At first glance, using an algorithm to determine what you do and don’t see in social media sounds technologically unbiased. Except that algorithms track what you read and like, then serve you more of it, because that’s what keeps you clicking, that’s what keeps you on the site, and that’s what the social channels sell to advertisers.
So we all like cute puppy and kitten videos, and almost 140 million of us liked Candace Payne putting on a Chewbacca mask. Those are funny and fun, and we’ll keep seeing more of these show up because of that trusty algorithm.
But there’s a darker side. Do you know what kind of news and posts those algorithms really like?
Conspiracy theories abound, and thrive, on social channels. They’re virtually impossible to kill because they keep showing up, and they keep recycling, and if you keep seeing them (with links!), they must be true.
Vani Hari, aka the Food Babe, discovered this all on her own. She was trying to draw attention to what she saw as the negative health effects of butter and margarine, and was getting nowhere.
Until she called it Monsanto butter. That tapped into a baker’s dozen of conspiracy theories on social media, and she went from largely ignored to food celebrity almost overnight. It doesn’t matter that nutritionists and scientists see her as misleading at best and downright dangerous at worst; she now has an audience and they will follow her over the cliff, disregarding anything contrary to their (and her) beliefs.
The problem is that we all do this. Given a choice to click on a link that supports what we already believe, and one that contradicts or disagrees with what we believe, guess which one we are more likely to choose?
And the algorithms are watching, and will serve up more of it.
Have you ever been surprised, after looking at a book on Amazon, to discover that same book showing up as a promoted post on your Facebook news feed?
You shouldn’t be. Yes, it’s that algorithm again, supplemented by a little friendly cooperation with advertisers.
It’s one thing if it’s a book. But it’s quite another when it’s an issue, a controversy, or a political belief system.
None of us is abandoning Facebook anytime soon. But we can be aware of what’s happening, seek out alternative viewpoints, and promise ourselves not to accept something as truth simply because “all of our Facebook friends” are saying so.