Linguistic stability and change under small-scale egalitarian language contact: a mixture model approach
(2020) 42nd Annual Meeting of the Cognitive Science Society: Developing a Mind: Learning in Humans, Animals, and Machines, CogSci 2020, pp. 3109-3115
- Abstract
This paper investigates the outcomes of small-scale egalitarian language contact in an attempt to address whether different linguistic domains exhibit different degrees of stability and resistance to convergence among cohabitant speakers of Jahai and Jedek, two closely related Aslian (Austroasiatic) language varieties spoken in northern Peninsular Malaysia. Using nonparametric Bayesian mixture models, we find that basic vocabulary items show a signal that strongly matches the linguistic identity of individuals, while data from other domains do not. This result is in agreement with other findings from the study of language contact: basic vocabulary is said to be a domain where distinctions in linguistic identity are often emphasized and maintained, while other parts of the vocabulary may be less salient for the purposes of indexing speaker identity, and are thus more prone to the effects of convergence. We demonstrate that this finding is an artifact of neither data coverage nor model choice; at the same time, we are able to identify variation in basic vocabulary items across linguistic groups which is suppressed by the model we use, and outline alternative methods for analyzing data of this sort.
- author
- Cathcart, Chundra Aroor and Yager, Joanne
- publishing date
- 2020
- type
- Contribution to conference
- publication status
- published
- keywords
- Bayesian modeling, Language change, Language contact, Linguistics
- pages
- 7 pages
- conference name
- 42nd Annual Meeting of the Cognitive Science Society: Developing a Mind: Learning in Humans, Animals, and Machines, CogSci 2020
- conference location
- Virtual, Online
- conference dates
- 2020-07-29 - 2020-08-01
- external identifiers
- scopus:85139511985
- language
- English
- LU publication?
- yes
- id
- d3ec26c9-2bf0-49ae-9d0a-cc7f9231ff6a
- date added to LUP
- 2023-01-10 10:04:53
- date last changed
- 2023-09-25 08:41:57
@misc{d3ec26c9-2bf0-49ae-9d0a-cc7f9231ff6a,
  abstract = {{This paper investigates the outcomes of small-scale egalitarian language contact in an attempt to address whether different linguistic domains exhibit different degrees of stability and resistance to convergence among cohabitant speakers of Jahai and Jedek, two closely related Aslian (Austroasiatic) language varieties spoken in northern Peninsular Malaysia. Using nonparametric Bayesian mixture models, we find that basic vocabulary items show a signal that strongly matches the linguistic identity of individuals, while data from other domains do not. This result is in agreement with other findings from the study of language contact: basic vocabulary is said to be a domain where distinctions in linguistic identity are often emphasized and maintained, while other parts of the vocabulary may be less salient for the purposes of indexing speaker identity, and are thus more prone to the effects of convergence. We demonstrate that this finding is an artifact of neither data coverage nor model choice; at the same time, we are able to identify variation in basic vocabulary items across linguistic groups which is suppressed by the model we use, and outline alternative methods for analyzing data of this sort.}},
  author = {{Cathcart, Chundra Aroor and Yager, Joanne}},
  keywords = {{Bayesian modeling; Language change; Language contact; Linguistics}},
  language = {{eng}},
  pages = {{3109--3115}},
  title = {{Linguistic stability and change under small-scale egalitarian language contact: a mixture model approach}},
  year = {{2020}},
}