Posted on 2022-02-08, 16:57. Authored by Valentina Pitardi, Hannah Marriott
With the development of deep connections between humans and Artificial Intelligence voice‐based assistants (VAs), human and machine relationships have transformed. For these relationships to work, trust must be established. Although the capabilities of VAs offer retailers and consumers enhanced opportunities, building trust with machines is inherently challenging. In this paper, we propose integrating Human–Computer Interaction theories and Para‐Social Relationship theory to develop insight into how trust in, and attitudes toward, VAs are established. Adopting a mixed‐method approach, we first quantitatively examine the proposed model using Covariance‐Based Structural Equation Modeling on 466 respondents; building on the findings of this study, a second, qualitative study reveals four main themes. Findings show that while functional elements drive users' attitudes toward using VAs, the social attributes, namely social presence and social cognition, are the unique antecedents for developing trust. Additionally, the research illustrates a peculiar dynamic between privacy and trust, showing how users distinguish two different sources of trustworthiness in their interactions with VAs and identify the brand producer as the data collector. Taken together, these results reinforce the idea that individuals interact with VAs by treating them as social entities and applying human social rules, thus supporting the adoption of a para‐social perspective.
Pitardi, V. and Marriott, H.R. (2020) 'Alexa, she's not human but… Unveiling the drivers of consumers' trust in voice‐based artificial intelligence', Psychology & Marketing. https://doi.org/10.1002/mar.21457