In my search to understand how we decide what to believe, I ran across a phenomenon called "confirmation bias."

This is a behavior we all exhibit when we are presented with new data about a subject. We subconsciously seek out the aspects of the data that confirm our prior beliefs about the topic. Those that fit we embrace, eagerly point to and, if so inclined, retweet; those that do not, we reject, ignore or denigrate.

So if, for example, we believe that red wine is good for us (and who does not, sheesh), then when we see the latest news story about it, we lock on to the points that confirm our belief (the winemakers always look so healthy and cheerful) and reject or ignore those that do not (pesky inconclusive scientific data).

Intrigued by this phenomenon, which was the subject of a TED video, "Filter Bubbles" (http://tinyurl.com/4xkj8hm), I decided to conduct an experiment of my own to confirm my bias, er, deepen my understanding of confirmation bias.

In a nutshell, the TED speaker, Eli Pariser, was saying that sites like Facebook and search engines like Google know what we like and are interested in, based on our prior activity as tracked by their super-secret formulas. So when we do a search or hit a link, the results they give us, filtered through those formulas, are the ones we are more likely to want to see. So far so good.
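
To make the idea concrete, here is a toy sketch of how such a filter might work. This is purely my own illustration; the topic tags, scores and boost rule are invented, and the real formulas are, of course, the super-secret part.

```python
# Toy personalization filter: pages on topics the user has clicked before
# get a boost, so pages that challenge the user's interests drift down.

user_click_history = {"wine": 5, "cycling": 3}  # invented tracked interests

pages = [
    ("Red wine is great for you", {"wine"}),
    ("Study finds wine benefits inconclusive", {"science"}),
    ("Best Oakland bike trails", {"cycling"}),
]

def personalized_score(topics):
    # Base relevance of 1.0, plus a boost for each topic the user already likes.
    return 1.0 + sum(user_click_history.get(t, 0) for t in topics)

# Rank the pages; results the user "wants" float to the top.
for title, topics in sorted(pages, key=lambda p: personalized_score(p[1]), reverse=True):
    print(f"{personalized_score(topics):4.1f}  {title}")
```

Run it and the cheerful wine story outranks the inconclusive study every time, which is the whole worry in miniature.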

The problem arises if, over time, certain results are not presented to us, or are pushed so far down the listing that we see only things we already agree with. The "filter bubble" would therefore seem to feed our confirmation biases: we see things that conform to our beliefs, and are not exposed to those that do not. Sounds dicey.

I wanted to see if this was really true, so I asked 10 old friends of mine to join me in an experiment. We would all search three terms in Google at the same time and see whether the results differed. I was pumped! I had a glass of red wine to celebrate my cool idea.

My buddies and I share many characteristics -- age, athletic ability and modest good looks (sic). Growing up together in Oakland, we went to the same schools and are generally similar by most measures. Today we differ primarily in location and, this is the important one, political outlook.

We cover the spectrum from progressive left to conservative right and all points in between. My hypothesis, based on the TED talk, was that these differences would result in differing search results. Our search topics were "global warming," "Oakland" and "mountain biking."

We agreed on a date and time, did our searches, and everyone sent me their links. I was eagerly anticipating confirmation that my conservative friends were getting only links to climate-denial sites while my liberal friends were being sent to the Sierra Club, and that one group would get links to breathless reports on crime in Oakland while another got the latest Uptown restaurant reviews. Not so.

The search results were generally similar, with only modest differences in the order of information and the type of paid advertising. In fact, "mountain biking" seemed to generate the most varied results. My confirmation-bias filter bubble had been burst. I actually felt quite disappointed, which I think is the real lesson of the experiment: the results did not confirm my bias.
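
For the curious, "generally similar" can even be made quantitative. A minimal sketch, assuming each searcher's links are boiled down to a simple list of URLs (the addresses below are invented placeholders): the more two lists overlap, the more alike the bubbles.

```python
# Compare two friends' result lists by the share of URLs they have in common.

def jaccard(results_a, results_b):
    """Fraction of URLs common to both lists (1.0 means identical sets)."""
    a, b = set(results_a), set(results_b)
    return len(a & b) / len(a | b)

# Hypothetical first-page results for "mountain biking" from two searchers.
friend_1 = ["trails.example/mtb", "wiki.example/Mountain_biking", "shop.example/bikes"]
friend_2 = ["wiki.example/Mountain_biking", "trails.example/mtb", "news.example/mtb"]

print(f"Overlap: {jaccard(friend_1, friend_2):.0%}")  # prints: Overlap: 50%
```

A measure like this ignores the order of the links, which is exactly where our modest differences showed up, but it is enough to tell whether two people are living in different bubbles.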

I continue to believe that the notion of filter bubbles and their impact on confirmation bias is a valid one. We select whom to follow on Twitter, subscribe to blogs we agree with and watch TV channels that align with our beliefs. However, my experiment showed me that, even though I was conscious of the risks of confirmation bias, I am by no means immune to them. So, what do you believe?

Alexander Zwissler is the Executive Director and CEO of Chabot Space & Science Center in Oakland. Visit the center at 10000 Skyline Blvd. Go to www.chabotspace.org. Follow him at Twitter.com/alexzwissler. Contact him at azwissler@chabotspace.org or 510-336-7383.