Lund University Publications

LUND UNIVERSITY LIBRARIES

Self-imposed Filter Bubbles: Selective Attention and Exposure in Online Search

Ekström, Axel; Niehorster, Diederick C and Olsson, Erik J (2022) In Computers in Human Behavior Reports 7.
Abstract
It is commonly assumed that algorithmic curation of search results creates filter bubbles, in which users’ beliefs are continually reinforced and opposing views are suppressed. Yet empirical evidence has failed to support this hypothesis. It has instead been suggested that filter bubbles may result from individuals engaging selectively with information in search engine results pages, but this “self-imposed filter bubble hypothesis” has remained empirically untested. In this study, we find support for the hypothesis using eye-tracking technology and link selection data. We presented partisan participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Participants spent more time viewing own-side links than other links (p = .037). In our sample, participants who identified as right-wing exhibited this bias more strongly than those who identified as left-wing (p < .001). In addition, both liberals and conservatives tended to select own-side links (p < .001). Finally, there was a significant effect of trust: links associated with less trusted sources were attended to less and selected less often by liberals and conservatives alike (p < .001). Our study challenges the efficacy of policies that aim to combat filter bubbles by presenting users with an ideologically diverse set of search results.
author
Ekström, Axel; Niehorster, Diederick C and Olsson, Erik J
organization
publishing date
2022
type
Contribution to journal
publication status
published
subject
keywords
Filter bubble, Online search, Selective exposure, Ingroup bias, Eye tracking, Trust
in
Computers in Human Behavior Reports
volume
7
article number
100226
pages
10 pages
publisher
Elsevier
external identifiers
  • scopus:85135800361
ISSN
2451-9588
DOI
10.1016/j.chbr.2022.100226
project
Filterbubblor och ideologisk segregering online: behövs reglering av sökmaskiner? (Filter bubbles and ideological segregation online: is regulation of search engines needed?)
language
English
LU publication?
yes
id
d0221edb-d148-4a00-988d-38aaa6fe1f56
date added to LUP
2022-07-31 11:33:25
date last changed
2023-05-22 14:02:53
@article{d0221edb-d148-4a00-988d-38aaa6fe1f56,
  abstract     = {{It is commonly assumed that algorithmic curation of search results creates filter bubbles, in which users’ beliefs are continually reinforced and opposing views are suppressed. Yet empirical evidence has failed to support this hypothesis. It has instead been suggested that filter bubbles may result from individuals engaging selectively with information in search engine results pages, but this “self-imposed filter bubble hypothesis” has remained empirically untested. In this study, we find support for the hypothesis using eye-tracking technology and link selection data. We presented partisan participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Participants spent more time viewing own-side links than other links (p = .037). In our sample, participants who identified as right-wing exhibited this bias more strongly than those who identified as left-wing (p < .001). In addition, both liberals and conservatives tended to select own-side links (p < .001). Finally, there was a significant effect of trust: links associated with less trusted sources were attended to less and selected less often by liberals and conservatives alike (p < .001). Our study challenges the efficacy of policies that aim to combat filter bubbles by presenting users with an ideologically diverse set of search results.}},
  author       = {{Ekström, Axel and Niehorster, Diederick C and Olsson, Erik J}},
  issn         = {{2451-9588}},
  keywords     = {{Filter bubble; Online search; Selective exposure; Ingroup bias; Eye tracking; Trust}},
  language     = {{eng}},
  publisher    = {{Elsevier}},
  series       = {{Computers in Human Behavior Reports}},
  title        = {{Self-imposed Filter Bubbles: Selective Attention and Exposure in Online Search}},
  url          = {{http://dx.doi.org/10.1016/j.chbr.2022.100226}},
  doi          = {{10.1016/j.chbr.2022.100226}},
  volume       = {{7}},
  year         = {{2022}},
}
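
For reference, a minimal sketch of how this BibTeX entry could be cited from a LaTeX document, assuming the entry above is saved in a file named refs.bib (the filename and surrounding document are illustrative, not part of the record):

\documentclass{article}
\begin{document}
% Cite the record by its key, copied verbatim from the entry above.
Selective engagement with search results can produce self-imposed
filter bubbles~\cite{d0221edb-d148-4a00-988d-38aaa6fe1f56}.

\bibliographystyle{plain}  % any standard bibliography style works here
\bibliography{refs}        % assumes the entry is stored in refs.bib
\end{document}

Compiling with the usual pdflatex, bibtex, pdflatex, pdflatex sequence resolves the citation.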