On whom would I want to depend: humans or computers?
(2019) In Journal of Economic Psychology 72. p. 219-228
- Abstract
We study in a laboratory experiment whether humans prefer to depend on decisions of others (Human-Driven Uncertainty) or states generated by a computer (Computerized Uncertainty). The experimental design introduced in this paper is unique in that it introduces Human-Driven Uncertainty such that it does not derive from a strategic context. In our experiment, Human-Driven Uncertainty derives from decisions that were taken in a morally neutral context and in ignorance of the externalities that those decisions may have on others. Our results indicate that even without strategic interaction and moral elements, humans prefer Computerized to Human-Driven Uncertainty. This holds even when the distribution of outcomes under both types of uncertainty is identical. From a methodological point of view, the findings shed critical light on behavioral research in which it is common practice to control for strategic uncertainty by comparing interaction with an artificial agent with a known strategy to interaction with humans. Outside the laboratory, our results suggest that whenever dependence on humans is replaced by dependence on computers and other kinds of “artificial” decision makers, preferences with regard to these dependencies may change too.
- author
- Farjam, Mike LU
- publishing date
- 2019
- type
- Contribution to journal
- publication status
- published
- subject
- keywords
- Ambiguity aversion, Experiment, Human uncertainty, Risk
- in
- Journal of Economic Psychology
- volume
- 72
- pages
- 10 pages
- publisher
- Elsevier
- external identifiers
- scopus:85064590214
- ISSN
- 1872-7719
- DOI
- 10.1016/j.joep.2019.04.002
- language
- English
- LU publication?
- no
- additional info
- Funding Information: We thank the Max Planck Society for financial support through the International Max Planck Research School on Adapting Behavior in a Fundamentally Uncertain World. Special thanks also to Alexia Gaudeul, Oliver Kirchkamp, Anna Merkel and the participants of the 2015 IMPRS Uncertainty Summer School in Jena for their feedback and ideas on designing this experiment. Publisher Copyright: © 2019 Elsevier B.V. All rights reserved.
- id
- b71caeeb-a980-4d8c-98f3-15f536256b85
- date added to LUP
- 2021-01-20 16:45:42
- date last changed
- 2022-04-26 23:50:48
@article{b71caeeb-a980-4d8c-98f3-15f536256b85,
  abstract  = {{<p>We study in a laboratory experiment whether humans prefer to depend on decisions of others (Human-Driven Uncertainty) or states generated by a computer (Computerized Uncertainty). The experimental design introduced in this paper is unique in that it introduces Human-Driven Uncertainty such that it does not derive from a strategic context. In our experiment, Human-Driven Uncertainty derives from decisions, which were taken in a morally neutral context and in ignorance of externalities that the decisions may have on others. Our results indicate that even without strategic interaction and moral elements humans prefer Computerized to Human-Driven Uncertainty. This holds even when the distribution of outcomes under both types of uncertainty is identical. From a methodological point of view, the findings shed a critical light on behavioral research in which it is common practice to control for strategic uncertainty by comparing interaction with an artificial agent with a known strategy to interaction with humans. Outside the laboratory, our results suggest that whenever dependence on humans is changed to dependence on computers and other kinds of “artificial” decision makers, preferences with regard to these dependencies may change too.</p>}},
  author    = {{Farjam, Mike}},
  issn      = {{1872-7719}},
  keywords  = {{Ambiguity aversion; Experiment; Human uncertainty; Risk}},
  language  = {{eng}},
  pages     = {{219--228}},
  publisher = {{Elsevier}},
  series    = {{Journal of Economic Psychology}},
  title     = {{On whom would I want to depend : humans or computers?}},
  url       = {{http://dx.doi.org/10.1016/j.joep.2019.04.002}},
  doi       = {{10.1016/j.joep.2019.04.002}},
  volume    = {{72}},
  year      = {{2019}},
}