
Lund University Publications

LUND UNIVERSITY LIBRARIES

In Search of the Climate Change Filter Bubble : A Content-based Method for Studying Ideological Segregation in Google

Genot, Emmanuel LU ; Jiborn, Magnus LU ; Hahn, Ulrike ; Volzhanin, Igor ; Olsson, Erik J LU and von Gerber, Ylva LU (2020)
Abstract
A popular belief is that the process whereby search engines tailor their search results to individual users, so-called personalization, leads to filter bubbles in the sense of ideologically segregated search results that would tend to reinforce the user’s prior view (filter bubble hypothesis). Since filter bubbles are thought to be detrimental to society, there have been calls for further legal regulation of search engines beyond the so-called Right to be Forgotten Act (EU, C-131/12, 2014). However, the scientific evidence for the filter bubble hypothesis is surprisingly limited. Previous studies of personalization have focused on the extent to which different users get different results lists, without taking the content of the webpages into account. Such methods are unsuitable for detecting filter bubbles as such. In this paper, we propose a methodology that takes content differences between webpages into account. In particular, the method involves studying the extent to which users with strong opposing views on an issue receive search results that are correlated content-wise with their personal view. Will users with a strong prior opinion that X is true on average have a larger share of (top) search results that are in favor of X than users with a strong prior opinion that X is false? We illustrate our methodology at work, but also the non-trivial challenges it faces, by a small-scale study of the extent to which Google Search leads to ideological segregation on the issue of man-made climate change.
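The comparison the abstract describes can be sketched in a few lines: code each top-k result as pro-X or not, compute the mean pro-X share per group of users, and compare the two group means. The data and function names below are invented for illustration; the paper's actual coding of result pages is content-based and manual, not automated.

```python
# Toy sketch of the content-based segregation measure described above:
# compare the mean share of pro-X results in the top-k list between
# users with opposing prior views. All codings here are invented.

def pro_share(labels, k=10):
    """Share of the top-k search results coded as favoring X.
    labels: list of 1 (pro-X) / 0 (not pro-X), in rank order."""
    top = labels[:k]
    return sum(top) / len(top)

def mean_share(users, k=10):
    """Average pro-X share across one group of users."""
    return sum(pro_share(u, k) for u in users) / len(users)

# Hypothetical result-page codings for two groups of users.
believers = [[1, 1, 0, 1, 1, 0, 1, 1, 1, 0],
             [1, 0, 1, 1, 1, 1, 0, 1, 1, 1]]
skeptics  = [[1, 1, 0, 1, 1, 0, 1, 0, 1, 1],
             [0, 1, 1, 1, 0, 1, 1, 1, 0, 1]]

# A gap near zero would indicate no content-wise segregation
# on this (toy) sample; a large positive gap would support the
# filter bubble hypothesis.
gap = mean_share(believers) - mean_share(skeptics)
print(round(gap, 3))  # → 0.05
```

On this invented sample both groups see mostly pro-X pages (shares 0.75 vs 0.70), so the gap is small; the hypothesis predicts a substantially larger gap for personalized results.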
author
Genot, Emmanuel; Jiborn, Magnus; Hahn, Ulrike; Volzhanin, Igor; Olsson, Erik J and von Gerber, Ylva
organization
publishing date
2020
type
Other contribution
publication status
submitted
subject
project
Filterbubblor och ideologisk segregering online: behövs reglering av sökmaskiner?
language
English
LU publication?
yes
id
e8b3f22b-c94c-47b9-9d9e-f593daeb1bc9
date added to LUP
2020-08-19 08:36:11
date last changed
2023-05-24 11:00:24
@misc{e8b3f22b-c94c-47b9-9d9e-f593daeb1bc9,
  abstract     = {{A popular belief is that the process whereby search engines tailor their search results to individual users, so-called personalization, leads to filter bubbles in the sense of ideologically segregated search results that would tend to reinforce the user’s prior view (filter bubble hypothesis). Since filter bubbles are thought to be detrimental to society, there have been calls for further legal regulation of search engines beyond the so-called Right to be Forgotten Act (EU, C-131/12, 2014). However, the scientific evidence for the filter bubble hypothesis is surprisingly limited. Previous studies of personalization have focused on the extent to which different users get different results lists, without taking the content of the webpages into account. Such methods are unsuitable for detecting filter bubbles as such. In this paper, we propose a methodology that takes content differences between webpages into account. In particular, the method involves studying the extent to which users with strong opposing views on an issue receive search results that are correlated content-wise with their personal view. Will users with a strong prior opinion that X is true on average have a larger share of (top) search results that are in favor of X than users with a strong prior opinion that X is false? We illustrate our methodology at work, but also the non-trivial challenges it faces, by a small-scale study of the extent to which Google Search leads to ideological segregation on the issue of man-made climate change.}},
  author       = {{Genot, Emmanuel and Jiborn, Magnus and Hahn, Ulrike and Volzhanin, Igor and Olsson, Erik J and von Gerber, Ylva}},
  language     = {{eng}},
  title        = {{In Search of the Climate Change Filter Bubble : A Content-based Method for Studying Ideological Segregation in Google}},
  url          = {{https://lup.lub.lu.se/search/files/82883912/In_Search_of_the_Climate_Change_Filter_Bubble.pdf}},
  year         = {{2020}},
}