6-8 September 2017, McGrath Centre, St. Catharine's College

Designing for truth: search in the age of fake news

A 45-minute Case Study by:

Dustin Coates


About this Case Study

As an engineer on the customer solutions team at Algolia, I wrestle all the time with questions of how to build the best search. Sometimes it's not just a question of commerce, but one of ethics as well.

This case study will take knowledge gained from working with thousands of search engine implementations, plus examples from industry leaders, to examine the following questions:

- When do we prioritise finding versus discovery?
- Are algorithms the answer to biased content?
- What are the ethical considerations each of us needs to consider when building search?

Often we think of search engines as tools to help us find what we're looking for. And while that's certainly part of it, we can think of search as balancing two competing interests: finding and discovery.

With finding, we get exactly what we're looking for. Think "I'm Feeling Lucky" or structured results in Google and Bing.

With discovery, we get things that we never knew we wanted. Think recommendation systems or similar results.

In the past, this was a fairly academic discussion. However, with Brexit and the 2016 US presidential election, it has come much more to the forefront.

The recurring questions after these outcomes were, "How were the predictions so wrong?" and "These two sides don't speak to one another." How much did prioritising finding over discovery contribute to this? Personalisation and hyper-targeting can lead to an echo chamber where people only see views from their own circles. Do we have an ethical imperative to introduce people to differing viewpoints, or do we give them just what they want?

On the flip side, not all viewpoints are equal. When someone searches for medical advice, should search engines treat all viewpoints the same, irrespective of their authority? Is fake news just as valid if it gets shared more often than verifiable facts?

In the wake of these problems, many people have suggested turning to algorithms, on the grounds that they can be neutral where people can't. But algorithms will never be neutral: they are built by people, who introduce their own biases. Do we favour newer content, or content that has been shared more? Is some content considered more authoritative? How?
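To make that concrete, here is a minimal, purely illustrative sketch (not Algolia's actual ranking, and every name and weight here is hypothetical) of a scoring function that blends text relevance, shares, authority and recency. The point is that the weights are human choices: tilt them towards shares and viral content can outrank a vetted source.

```python
# Toy ranking sketch: hand-picked weights show how human choices encode bias.
# All field names and weight values are hypothetical, for illustration only.
from dataclasses import dataclass
import math


@dataclass
class Document:
    text_match: float   # 0..1, how well the text matches the query
    shares: int         # how often the content was shared
    authority: float    # 0..1, editorial trust score
    age_days: float     # days since publication


def score(doc: Document,
          w_match: float = 0.5,
          w_shares: float = 0.2,
          w_authority: float = 0.2,
          w_recency: float = 0.1) -> float:
    # Each weight is a human judgement: raising w_shares favours viral
    # content; raising w_authority favours established sources.
    recency = math.exp(-doc.age_days / 30.0)       # decays over roughly a month
    popularity = math.log1p(doc.shares) / 10.0     # dampen runaway virality
    return (w_match * doc.text_match
            + w_shares * popularity
            + w_authority * doc.authority
            + w_recency * recency)


viral_rumour = Document(text_match=0.8, shares=50_000, authority=0.1, age_days=1)
vetted_report = Document(text_match=0.8, shares=300, authority=0.9, age_days=10)

# Same two documents, different weights, different winner.
print(score(vetted_report) > score(viral_rumour))   # defaults favour authority → True
print(score(viral_rumour, w_shares=0.3, w_authority=0.1)
      > score(vetted_report, w_shares=0.3, w_authority=0.1))  # shares win → True
```

Neither weighting is "neutral": each one is a decision about what kind of content deserves to be seen.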

None of these questions have easy answers, but we can have guiding principles:

  • First, do no harm.
  • Be upfront with users about why they're seeing something.
  • Be fair, not neutral.

These questions, and this discussion, are tailored to anyone who builds search or content discovery. The takeaways are applicable in all verticals, from eCommerce to online news.

About the Speaker

Dustin Coates is an engineer on the Solutions Team at Algolia, which means he works with customers and the product team to ensure that customers get the best search experience through hands-on help, improved products and more. Previously, he was a software engineer at General Assembly and a web development instructor.


Tickets are available now


See the full programme