Monday, June 13, 2016

Who's really to blame for the web's hidden bias?


Facebook has been much in the news recently. You may have followed the story: the tech site Gizmodo quoted unnamed former Facebook contractors who said they routinely suppressed conservative viewpoints in the “trending stories” news feed. This caused many (mainly conservatives, as you might expect) to raise a hue and cry about the alleged liberal bias. Facebook has officially denied any such hidden agenda.

Now I could use this blog to question why anyone gets their news from Facebook in the first place, but that would be futile. The fact is they do. According to a 2015 Pew Research study, 63% of Facebook users use it to get the news, and 40% of users agree that it is “an important way to get the news.”

Or I could compare liberal and conservative thinking when it comes to suspected biases in the media and society at large. (“Why is it usually the conservatives who see these conspiracies at work?” I might ask – but I won’t!)

I might also reiterate what the Wall Street Journal concluded: “…using human editors to curate trending topics inevitably introduces biases, both conscious and unconscious.” The Journal (no hidden liberal bias here!) said that in this regard Facebook operates just like any other newsroom.

But in a larger sense, in my opinion, we’re chasing the wrong bogeyman. Facebook is not to blame; we are. The fact is that we reveal our own prejudices and preferences with every click we make, and the internet is designed to reflect them back to us. As Frank Bruni wrote in the New York Times, in a column called “How Facebook Warps Our Worlds,” the internet is not rigged to give us a conservative or liberal bias unless we rig it that way ourselves. It is, he said, “designed to give us more of the same.”

Every time we click “like,” follow a link, or join a page or group, we are telling the internet what we want, and the internet will give us more just like it. Just look at all the ads you see now, all chosen just for you. Every click you make determines what you’ll see next.

Google does this every time you search. It records the link you clicked on – all the links you’ve ever clicked on – and uses that knowledge to serve up results that more closely match your profile the next time. It gives you what it has calculated you want. That’s part of Google’s secret sauce.
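To make that feedback loop concrete, here is a toy sketch in Python – purely illustrative, with made-up site names, and in no way Google’s actual ranking code – showing how a simple record of past clicks can push familiar sources to the top of what you see next:

# Toy illustration only (not any real search engine's algorithm):
# re-rank results so sources the user has clicked on before float to the top.
from collections import Counter

# Hypothetical click history: how often this user has clicked each source.
click_history = Counter({"times.com": 42, "journal.com": 3})

def personalize(results, history):
    # Sort so the most-clicked sources come first; never-clicked sources count as zero.
    return sorted(results, key=lambda r: history[r["source"]], reverse=True)

results = [
    {"title": "Editorial: the other side", "source": "journal.com"},
    {"title": "A story you already agree with", "source": "times.com"},
    {"title": "Chess column", "source": "chessweekly.com"},
]

for r in personalize(results, click_history):
    print(r["source"], "-", r["title"])

Run it and the source you’ve clicked most often comes out on top, while the chess column you’ve never clicked sinks to the bottom – the filter bubble in miniature.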

Eli Pariser wrote an eye-opening book in 2011 called “The Filter Bubble,” which described all the ways the web’s hidden gatekeepers now build a bubble around us, all in the name of customization. We get the web experience just the way we want it, without even asking for it. More and more we’re living in a house of mirrors, in which everything we are is reflected back at us.

(By the way, I highly recommend Pariser’s TED talk about the filter bubble. It’s been viewed more than three and a half million times, and it’s worth nine minutes of your time too.)

So doesn’t each of us win when our likes and dislikes rule the web experience? I don’t think so. The danger is real that our minds will become increasingly narrowed by the reinforcement of our own opinions. The web will just keep proving us right in whatever we believe – rigid political dogma, nutty conspiracy theories, prejudices of all kinds – you name it. It can’t be good to be shown only content that agrees with you.

The fact is that we need to have our beliefs challenged by differing viewpoints. And we need to be available when some serendipitous idea or story comes our way. We have to be there to see it.

I used to buy the New York Times every morning and “read” it from first page to last. I didn’t read everything, but at least I looked at it, bumping by chance into all kinds of stories I never would have sought out: cooking, travel, the chess column, whatever. Now my news consumption is quite different. I subscribe to the Times website and click only on what I want to see. No serendipity here, folks.

So don’t blame Facebook or some other online source for serving up a biased agenda. Look in the mirror.

3 comments:

Anonymous said...

Interesting perspective, Howard. Let's face it -- we're all 'biased' in that we tend to click on and read stories that confirm what we already believe (and ignore disconfirming reports and narratives). This is known as confirmation bias. That, combined with the tendency of the web to serve us more and more of the kinds of things we already click on, reinforces the kind of tunnel vision that you describe.

A somewhat radical approach might be to occasionally wade deliberately into opposing-viewpoint territory. For NY Times readers, that could, for instance, entail glancing at the Wall Street Journal editorial page. I try that occasionally, but it usually just reinforces that my viewpoints are more 'correct' than theirs. A second, less radical way is to occasionally read stories that tend not to support what we believe and then ask ourselves, "What 10% of that story can I agree with?"

Anonymous said...

Excellent insights, Howard! I'm old enough to remember when what most people knew about the world was what they got from Walter Cronkite between 6:30-6:45 PM on weeknights. And we believed him. But today there is constant reporting from biased and even inaccurate sources, all driven by a commercial motive. It plays to human nature to filter in what we want to hear and filter out what we don't want to hear. You are right – it's about us. The only solution is to realize it through articles such as yours, and then we all need to add diversity of thinking and culture into our lives... Maybe turn off the computer and TV, pick up a camera, and take a look at the world as it really is! Thanks Howard, Ted!

Scott Bond said...

Well done, Howard. Beginning from the time we're toddlers, we are exposed to "bias" from sources including parents and family and their social circles, teachers, friends, and available media. And we all know that things were simpler back then. Heck, here in the Philadelphia area, we had Channels 3, 6, 10, and 12, and if you had the right antenna, you could pick up UHF channels 17, 29, and 48. Wow!

These days, the choice is almost mind-boggling, and you can choose the options that stroke you or provoke you. So what is it tonight: CNN, MSNBC, or FOX - or a little BBC for international flavor?

Keep the pieces coming, Howard.