Abstract
Voice is becoming one of the mainstream modes of human-computer interaction, yet voice assistants are not necessarily gender-neutral: they may embody gender stereotypes that exert a subtle influence on society. Drawing on the report "I'd Blush If I Could" released by UNESCO, this paper examines the types of gender stereotypes presented by mainstream voice assistants on the Chinese market and finds that these assistants are constructed as female personas and exhibit four types of gender bias. The gendered construction of voice assistants is yet another diminishment of the female image within a long tradition of gender prejudice, and the anthropomorphic character of this technology brings new moral dilemmas.
Keywords: voice assistant; gender stereotype; gender bias