How Google’s Algorithm Silences Minority Opinions
by Carson Ward | May 6, 2014 | Technology
Google’s influence on society is both unparalleled and poorly understood. At last count in September of 2013, the company served over 3 billion searches per day. Most of us use Google, and we’re far more likely to click the top few results. If Google is determining what we’re more likely to read, and if reading can change belief, then Google’s algorithm can change millions of beliefs daily.
I myself have changed beliefs – and in one case my entire worldview – based on what I found through Google. I’m happier for it, but I can’t help thinking a few different top results may have changed the course of my life forever.
Most people either don’t know or don’t care how Google chooses results, as long as they find what they’re looking for. That might be fine when you’re looking for an address or a cake recipe, but are we comfortable with an unknown algorithm subtly influencing which political and philosophical opinions are most relevant and silencing opposing voices?
Google and the Spiral of Silence
A “spiral of silence” occurs when a system pushes one opinion into obscurity.
The popular social sharing site reddit is an example of a spiral of silence. When users post a new link or comment, other users vote on whether they like the content. Reddit sorts content by popularity: voting a comment up gives it more visibility and thus a higher chance of being voted up further, while a comment voted down quickly will never be seen by most people. Despite “reddiquette” encouraging users not to downvote out of mere disagreement, users’ inherent confirmation bias means the most visible content appeals to reddit’s overall demographic and to the demographics of its sub-sections. Downvotes push unpopular arguments so far down the page that casual readers rarely see opposing viewpoints.
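This feedback loop can be sketched in a few lines of Python. To be clear, this is a toy model, not reddit’s actual ranking formula (which also weighs factors like submission time), and the visibility bonuses are invented numbers:

```python
# A toy model of the voting spiral: visible comments attract more votes,
# so early downvotes bury a comment before most readers ever see it.

def rank(comments):
    """Sort comments by net score, highest first."""
    return sorted(comments, key=lambda c: c["ups"] - c["downs"], reverse=True)

def round_of_voting(comments):
    """Readers mostly see, and therefore mostly vote on, the top results."""
    visibility_bonus = [3, 1]  # assumed: top slot draws far more votes than slot 2
    for bonus, c in zip(visibility_bonus, rank(comments)):
        c["ups"] += bonus

comments = [
    {"text": "majority view", "ups": 10, "downs": 2},
    {"text": "neutral question", "ups": 6, "downs": 2},
    {"text": "minority view", "ups": 3, "downs": 8},
]

for _ in range(5):
    round_of_voting(comments)

# After a few rounds the gap widens: the top comment keeps gaining votes,
# while the downvoted one never recovers visibility.
```

Run it and the minority view never moves off the bottom: lacking visibility, it receives no further votes at all.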
Google’s algorithm works the same way, albeit more subtly. Links are like votes. Because unpopular opinions have fewer sites and fewer proponents, they get fewer votes in Google’s eyes, while popular opinions end up with more votes and thus more exposure. Sites with unpopular opinions already have less visibility because they’re shared less, and the search engine compounds that lack of visibility by ranking them lower.
The rich get richer, the popular get more popular
Google has always ranked sites by their popularity. One of Google’s primary advantages over early spam-infested rivals was its use of PageRank – an algorithm developed by Larry Page – to determine which sites and pages were being linked to most often. Sites with more links pointing to them tend to rank higher, and their outgoing links count for more than those of less-popular sites. The logic for using popularity metrics to rank results is simple: if other legitimate sites like a page enough to link to it, it’s probably not spam. If hundreds of sites are linking to a page, it’s more likely to appeal to you, too.
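The core idea behind PageRank can be sketched with a few lines of power iteration. This is a minimal illustration only – it uses the 0.85 damping factor from the original paper but omits the many refinements Google has layered on since, and it assumes every page in the toy graph has at least one outgoing link:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank. links maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # each page splits its own rank evenly among its outgoing links
                new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

# "a" is linked to by both "c" and "d"; nothing links to "d" at all.
graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": ["a"]}
scores = pagerank(graph)
```

The unlinked page “d” ends up with the lowest score – exactly the dynamic the article describes: no links, no votes, no visibility.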
Google has become more and more sophisticated in the way it ranks sites, but it has never stopped relying on popularity metrics. A recent Moz study found that popularity metrics – links, shares, etc. – remain the factors most strongly correlated with earning a top spot in Google’s results. Google representatives have said they tried excluding links as a ranking factor, but doing so made search results far worse.
Google’s algorithm remains fundamentally biased towards the majority view: the less popular your viewpoint, the less likely it is to show up. The less visibility you receive, the less likely you are to get links or shares, and the spiral continues downward.
Consider a Google.com search for “does god exist?” – 4 of the top 5 results argue or imply that god does exist, and the fifth is a neutral Wikipedia article. Of page 1, 70% argues for the existence of god, 20% against, and 10% is neutral. Google’s output isn’t far from the American view on religion according to Pew.
Not every search will reflect majority views, of course. The words you use can easily skew a search towards what you’re looking for. “Is abortion wrong,” for example, yields far more pro-life results, while “is abortion right” yields more pro-choice results. Google will also personalize and localize results it thinks you’ll want: the search above may look different to you depending on your location, search history, and friends. Eli Pariser calls the latter the “filter bubble” – a related and equally disturbing trend. The filter bubble reinforces the spiral of silence, making differing opinions even less likely to appear.
Google justifies showing you popular and personal results because the results are more relevant: these are things you are more likely to be interested in. Google says, essentially, “We’re just giving the people what they want.” I understand the business need to increase personalization, but I also worry that feeding the majority their own view on important social issues could lead to a culture with increasingly stagnant opinions.
Google certainly isn’t the only site silencing minority views as an algorithmic side-effect. Nearly all search engines prioritize results by popularity. Facebook and Twitter are more likely to show you well-liked posts. Among sites intended for information discovery, all of the most popular use popularity in some form to rank what you see. This realization should be a little frightening, yet search engines have so far escaped the scrutiny of social observers.
The trend today is toward more popularity data and integration resulting in more clicks. Marketers realized long ago that personalizing what you see to match your current interests and beliefs means you’ll be more likely to click, read, and return. This is why Google gathers data about you and Facebook personalizes your news feed. If showing you information you already agree with is as profitable as it seems, we’re headed for a world where our most-used online services are afraid to offend us by disagreeing.
Potential technical solutions
Search engines like Google don’t want to stop using popularity metrics because they would be less able to filter out low-quality pages, but existing and developing technology may offer a way to show diverse opinions without cutting links out of the equation.
Google owns a number of patents designed to understand how we feel about a subject. One machine learning patent even handles the same word in different contexts: “For example the word ‘small’ usually indicates positive sentiment when describing a portable electronic device, but can indicate negative sentiment when used to describe the size of a portion served by a restaurant.” Google also offers tools to create a sentiment analysis model.
Google could combine sentiment analysis with an existing feature (“query deserves diversity,” or QDD) that shows topically diverse results when the algorithm detects ambiguity. Google may eventually be able to detect controversy in the same way and deliberately surface a diversity of opinions.
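A hypothetical sketch of what such a re-ranking could look like, assuming a classifier has already labeled each result’s stance. The stance labels and round-robin mixing here are my own illustration, not anything Google has described:

```python
from collections import defaultdict

def diversify(results):
    """Round-robin across stances so each viewpoint surfaces near the top.
    Within a stance, the original popularity order is preserved."""
    buckets = defaultdict(list)
    for r in results:  # results arrive in popularity order
        buckets[r["stance"]].append(r)
    mixed = []
    while any(buckets.values()):
        for stance in list(buckets):
            if buckets[stance]:
                mixed.append(buckets[stance].pop(0))
    return mixed

# Popularity-ordered results for a controversial query; "stance" is
# assumed to come from a sentiment/stance classifier upstream.
popular_first = [
    {"url": "pro-1", "stance": "pro"},
    {"url": "pro-2", "stance": "pro"},
    {"url": "pro-3", "stance": "pro"},
    {"url": "neutral-1", "stance": "neutral"},
    {"url": "anti-1", "stance": "anti"},
]
reranked = diversify(popular_first)
```

After re-ranking, the top three slots carry one result from each stance instead of three from the majority – popularity still decides ordering within a viewpoint, but no viewpoint is buried.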
Responsible web search
Limiting our exposure to conflicting views is dangerous. When journalism was booming in the 1920s, many realized that what newspapers wrote and published had great influence over society and culture. It was easy for papers to write what their readers wanted to read – playing to existing biases and beliefs. Many journalists worried that papers were neglecting less-common beliefs and avoiding difficult criticism.
Movements like civic journalism grew out of the idea that journalists and editors had a responsibility to do more than report the facts. Socially responsible journalists actually tried to improve public discussion, and in many cases they succeeded. Their coverage of the civil rights movement was highly influential, and led many to reconsider long-held biases.
The responsibility is ultimately ours to honestly and openly challenge our own beliefs, but maybe our search engines have a social responsibility, too.
Illustration by Kurt Michelson
Find Carson on Google+