Facebook has been much in the news recently. You may have followed the story: the tech site Gizmodo quoted unnamed former Facebook contractors who said they routinely suppressed conservative viewpoints in the “trending stories” news feed. This caused many (mainly conservatives, as you might expect) to raise a hue and cry about alleged liberal bias. Facebook has officially denied any such hidden agenda.
Now, I could use this blog to question why anyone gets their news from Facebook in the first place, but that would be futile. The fact is they do. According to a 2015 Pew Research Center study, 63% of Facebook users get news from the site, and 40% of users agree that it is “an important way to get the news.”
Or I could compare liberal and conservative thinking when it comes to suspected biases in the media and society at large. (“Why is it usually the conservatives who see these conspiracies at work?” I might ask, but I won’t!)
I might also reiterate some of what the Wall Street Journal concluded: that “…using human editors to curate trending topics inevitably introduces biases, both conscious and unconscious.” The Journal (no hidden liberal bias there!) said that in this regard Facebook operated just like any other newsroom.
But in a larger sense, in my opinion, we’re chasing the wrong bogeyman. Facebook is not to blame; we are. The fact is that we reveal our own prejudices and preferences with every click we make, and the internet is designed to reflect that back to us. As Frank Bruni wrote in the New York Times, in a column called “How Facebook Warps Our Worlds,” the internet is not rigged to give us a conservative or liberal bias until we rig it that way ourselves. It is, he said, “designed to give us more of the same.”
Every time we click “like,” follow a link, or join a page or a group, we are telling the internet what we want, and the internet will give us more just like it. Just look at all the ads you see now, chosen just for you. Every click you make determines what you’ll see next.
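To make that feedback loop concrete, here is a toy sketch in Python. It is purely illustrative, my own guess at the general shape of the mechanism; the class, the stories, and the scoring rule are all invented and bear no relation to any real platform’s code:

```python
from collections import Counter

# Toy model of click-driven personalization. Every click on a topic
# raises that topic's weight, and the feed is re-ranked accordingly,
# so the more you click, the narrower the feed becomes.
class ToyFeed:
    def __init__(self, stories):
        self.stories = stories    # list of (headline, topic) pairs
        self.clicks = Counter()   # topic -> number of clicks so far

    def click(self, topic):
        """Record a click; a 'like' works the same way in this toy."""
        self.clicks[topic] += 1

    def ranked(self):
        """Stories sorted by how often you've clicked their topic."""
        return sorted(self.stories,
                      key=lambda s: self.clicks[s[1]],
                      reverse=True)

stories = [("Tax bill advances", "politics"),
           ("New pasta recipes", "cooking"),
           ("Grandmaster stunned", "chess")]
feed = ToyFeed(stories)
feed.click("politics")
feed.click("politics")
print([headline for headline, _ in feed.ranked()])
# politics now floats to the top; cooking and chess sink
```

Two clicks on politics and the feed is already tilted, and notice that nothing in the loop ever pushes the chess column back up.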
Google does this every time you search. It records the link you clicked on, indeed all the links you’ve ever clicked on, and uses that knowledge to serve up results next time that more closely match your profile. It gives you what it has calculated you want. That’s part of Google’s secret sauce.
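In the same hedged spirit, here is a naive sketch of personalized re-ranking, again an assumption-laden stand-in rather than anything resembling Google’s actual algorithm; the 0.1 boost per prior click and the domain names are inventions for illustration:

```python
from collections import Counter

# Naive personalized re-ranking: results from domains you've clicked
# before get a boost proportional to your click history. A guess at
# the general shape of the idea, not any real search engine's code.
click_history = Counter()   # domain -> prior clicks

def record_click(domain):
    click_history[domain] += 1

def rerank(results, base_scores):
    """results: list of domains; base_scores: relevance to the query.
    Personalization adds a small bonus per prior click on a domain."""
    def score(i):
        return base_scores[i] + 0.1 * click_history[results[i]]
    return sorted(range(len(results)), key=score, reverse=True)

results = ["news-a.com", "news-b.com", "recipes.com"]
base = [1.0, 1.0, 0.9]
record_click("news-b.com")
record_click("news-b.com")
print([results[i] for i in rerank(results, base)])
# news-b.com now outranks the equally relevant news-a.com
```

The point of the toy is the asymmetry: two equally relevant results stop being equal the moment your history enters the score.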
Eli Pariser wrote an eye-opening book in 2011 called “The Filter Bubble,” which described all the ways that the web’s hidden gatekeepers now build a bubble around us, all in the name of customization. We get the web experience just the way we want it, without even asking for it. More and more we’re living in a house of mirrors, in which everything we are is reflected back at us.
(By the way, I highly recommend Pariser’s TED talk about the filter bubble. It’s been viewed more than three and a half million times, and it’s worth nine minutes of your time too.)
So doesn’t each of us win when our likes and dislikes rule the web experience? I don’t think so. The danger is real that our minds will become increasingly narrowed by reinforcement of our own opinions. The web will just continue to prove us right in whatever we believe, from rigid political dogma to nutty conspiracy theories to prejudices of all kinds; you name it. It can’t be good to be shown only content that agrees with you.
The fact is that we need to have our beliefs challenged by differing viewpoints. And we need to be open when some serendipitous idea or story comes our way. We have to be there to see it.
I used to buy the New York Times every morning and “read” it from first page to last. I didn’t read everything, but at least I looked at it, bumping by chance into all kinds of stories I never would have sought out: cooking, travel, the chess column, whatever. Now my news consumption is quite different. I subscribe to the Times web site and click only on what I want to see. No serendipity there, folks.
So don’t blame Facebook or some other online source for serving up a biased agenda. Look in the mirror.