The Blurred Lines and Closed Loops of Google Search

Seemingly small design tweaks to the search results interface may change how and where people find information online.
An ad wears a disguise. Illustration: WIRED Staff; Getty Images

January 13 was a fairly eventful day, at least for pre-pandemic times. Cory Booker dropped out of the presidential race. LSU trounced Clemson in the college football national championship game. Attorney General William Barr asked Apple to unlock an iPhone. And Google pushed out a seemingly tiny tweak to how it displays search ads on desktop computers.

Previously, the search engine had marked paid results with the word “Ad” in a green box, tucked beneath the headline next to a matching green display URL. Now, all of a sudden, the “Ad” and the URL shifted above the headline, and both were rendered in discreet black; the box disappeared. The organic search results underwent a similar makeover, only with a new favicon next to the URL instead of the word “Ad.” The result was a general smoothing: Ads looked like not-ads. Not-ads looked like ads.

This was not Google's first time fiddling with the search results interface. In fact, it had done so quite regularly over the past 13 years, as handily laid out in a timeline from the news site Search Engine Land. Each iteration whittled away the distinction between paid and unpaid content that much more. Most changes went relatively unnoticed, internet residents accepting the creep like the apocryphal frog in a slowly boiling pot.

But in January, amid rising antitrust drumbeats and general exhaustion with Big Tech, people noticed. Interface designers, marketers, and Google users alike decried the change, saying it made paid results practically indistinguishable from those that Google’s search algorithm served up organically. The phrase that came up most often: “dark pattern,” a blanket term coined by UX specialist Harry Brignull to describe manipulative design elements that benefit companies over their users.

“We conduct hundreds of thousands of quality tests and experiments each year to ensure that every product change makes Search more helpful and improves the user experience," a Google spokesperson said in a statement to WIRED. "Google is an industry leader when it comes to providing unambiguous ad labeling, guided by extensive research that shows that these labels help people clearly distinguish between paid and organic content."

That a small design tweak could inspire so much backlash speaks to the profound influence Google and other ubiquitous platforms have—and the responsibility that status confers on them. “Google and Facebook shape realities,” says Kat Zhou, a product designer who has created a framework and tool kit to help promote ethical design. “Students and professors turn to Google for their research. Folks turn to Facebook for political news. Communities turn to Google for Covid-19 updates. In some sense, Google and Facebook have become arbiters of the truth. That’s particularly scary when you factor in their business models, which often incentivize blurring the line between news and advertisements.”

Google’s not the only search engine to blur this line. If anything, Bing is even more opaque, sneaking the “Ad” disclosure under the header, with only a faint outline to draw attention. Here’s what a Bing search for DoorDash gets you:

Screenshot: Bing

Tricksy! You'll notice the knowledge box on the right-hand side, too. But Google has around 92 percent of global search market share. It effectively is online search.

Dark patterns are all too common online in general, and January wasn’t the first time people accused Google of deploying them. In June 2018, a blistering report from the Norwegian Consumer Council found that Google and Facebook both used specific interface choices to strip away user privacy at almost every turn. The study details how both platforms implemented the least privacy-friendly options by default and consistently “nudged” users toward giving away more of their data, among other tactics. It paints a portrait of a system designed to befuddle users into complacency.

That confusion reached its apex a few months later, when an Associated Press investigation found that disabling Location History on your smartphone did not, in fact, stop Google from collecting your location in all instances. Shutting off that data spigot altogether required digging through the settings on an Android smartphone. The control took eight taps to reach, assuming you knew exactly where to go—and Google didn’t exactly provide road signs. In May of this year, Arizona Attorney General Mark Brnovich sued Google under the state’s Consumer Fraud Act, alleging “widespread and systemic use of deceptive and unfair business practices to obtain information about the location of its users.” Even a privacy-focused Google software engineer didn’t understand how location controls worked, according to recently unsealed court documents from the case, first reported by the Arizona Mirror. “Speaking as a user, WTF?” reads the chat log.

"The attorney general filing this lawsuit appears to have mischaracterized our services," another Google spokesperson, Jose Castaneda, said. "We have always built privacy features into our products and provided robust controls for location data. We look forward to setting the record straight." Castaneda also called the employee communications surfaced in the court documents "cherry-picked published extracts," which "state clearly that the team's goal was to 'Reduce confusion around Location History Settings.'"

Google has taken steps in recent years to give users more control over how long it keeps the data that it collects. A feature added in 2019 let you set your “Web & App Activity” to delete automatically after three or 18 months, and this summer Google made auto-deletion the default for even more categories of data on new accounts. It has also made it easier to adjust your privacy settings directly from within search, meaning you have to dig less to find them, and introduced Incognito Mode to YouTube and Google Maps.

"We are unequivocally committed to providing prominent, transparent and clear privacy controls, and we continue to raise the bar, with improvements like making auto-delete the default for our core activity settings," Google said in its statement.

Critics say that the company has not gone far enough. “We are aware that Google has made a number of minor improvements,” says Gro Mette Moen, acting digital policy director of the Norwegian Consumer Council. “However, as far as we have seen, none of these changes address the main issue: Consumers are still led to accept a large amount of tracking.”

They’re also led to accept a large amount of, well, Google. A detailed investigation by the Markup last month found that, across the 15,000 queries examined, nearly half of the first page of mobile search results was designed to keep the user on Google rather than directing them to another website. That real estate consisted of both Google’s own properties and the “direct answers,” the snippets Google pulls from outside sites to display right in the results. Google has called the Markup's methodology "flawed and misleading," arguing that it pertains to a "non-representative" set of samples. "Providing feedback links, helping people reformulate queries or explore topics, and presenting quick facts is not designed to preference Google," the company said in its statement. "These features are fundamentally in the interest of users, which we validate through a rigorous testing process."

It's true that not having to click saves you time, and Google says it approaches queries like weather or sports scores differently from those with answers that are better served by going to a website. It notes that it drives "billions of visits to sites across the web every day." But critics of the company claim that expediency is a self-serving rationale that ignores wider harms to the internet as a whole. Not only does the practice stifle the growth of the non-Google sites that it pulls from, they say, it also further cements Google’s position as the predominant end point of knowledge rather than a conduit.

“I’m not convinced that Google’s dedicating a huge portion of the search results to Google is ‘convenient’ for users at all, seeing as it often obscures relevant information and definitely doesn’t contribute to the overall health of the web,” Zhou says. “Google’s choice to surface its own content first has serious implications on whether users are exposed to the most valid and relevant results.”

The term “dark pattern” is inherently squishy. A confusing menu could be the product of malign intent or just a function of a labyrinthine legacy operating system. Turning your internet springboard into a one-stop shop might save users time, or it might limit their worldview to the confines of your algorithms. Or both. That ambiguity also means that dark patterns, calculated or not, are everywhere.

“Although companies have the responsibility to not manipulate or deceive consumers, there is no doubt that every internet user will encounter dark patterns online on a daily basis,” Moen says. “The best way to avoid being tricked by dark patterns is by being aware that you are seeing a dark pattern.”

There are signs of that awareness flickering more broadly. Go back to January 13, and that seemingly small change. The pushback was loud and sustained enough that Google partially rolled it back; the favicons disappeared from organic results, making it slightly easier to spot the ads for what they were. Here's what that same search for DoorDash looks like on Google today:

Screenshot: Google 

“The adverse reaction to Google’s revamped search results could very well be the result of rising digital literacy in some segments of society,” says Zhou. Emphasis on “some.” “Digital literacy is a byproduct of privilege—the privilege to be exposed to such curricula at all, the privilege to have consistent access to a computer or smartphone, the privilege to have reliable internet.”

Without that privilege—and, too often, even with it—manipulative design elements can stack the deck in favor of the platforms powering so much of the internet today, be it social media or retail or banking or search. A healthier internet will require staying alert and informed—and helping others do so as well.

