The self-imposed filter bubble hypothesis
(2021) KOGM20 20211 — Cognitive Science
- Abstract
- It is commonly assumed that algorithmic curation of search results creates filter bubbles, where users’ beliefs are continually reinforced and opposing views are suppressed. However, empirical evidence has failed to support this hypothesis. Instead, we suggest that filter bubbles may result from individuals acting selectively on information made available by search engines. When presented with search engine results pages, links and sources that validate users’ beliefs should be attended more than other links. This prediction is testable using eye-tracking technology. Here, we presented biased participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Results indicate that, on average, politically Liberal participants spend more time viewing own-side links than other links, while political Conservatives do not. However, both Liberals and Conservatives tend to select same-side links. Further, there is a significant effect of trust, such that links associated with less trusted sources are attended less and selected less often. Implications, study limitations, and directions for further study are also discussed.
Please use this url to cite or link to this publication:
http://lup.lub.lu.se/student-papers/record/9055864
- author
- Ekström, Axel
- supervisor
- organization
- course
- KOGM20 20211
- year
- 2021
- type
- H2 - Master's Degree (Two Years)
- subject
- keywords
- Filter bubble, Search engine, Ideology, Attention, Eye tracking, Trust
- language
- English
- id
- 9055864
- date added to LUP
- 2021-07-06 10:26:15
- date last changed
- 2021-07-06 10:26:15
@misc{9055864,
  abstract = {{It is commonly assumed that algorithmic curation of search results creates filter bubbles, where users’ beliefs are continually reinforced and opposing views are suppressed. However, empirical evidence has failed to support this hypothesis. Instead, we suggest that filter bubbles may result from individuals acting selectively on information made available by search engines. When presented with search engine results pages, links and sources that validate users’ beliefs should be attended more than other links. This prediction is testable using eye-tracking technology. Here, we presented biased participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Results indicate that, on average, politically Liberal participants spend more time viewing own-side links than other links, while political Conservatives do not. However, both Liberals and Conservatives tend to select same-side links. Further, there is a significant effect of trust, such that links associated with less trusted sources are attended less and selected less often. Implications, study limitations, and directions for further study are also discussed.}},
  author   = {{Ekström, Axel}},
  language = {{eng}},
  note     = {{Student Paper}},
  title    = {{The self-imposed filter bubble hypothesis}},
  year     = {{2021}},
}