Research Article
HUMAN-ROBOT INTERACTION

Human-like behavioral variability blurs the distinction between a human and a machine in a nonverbal Turing test

Science Robotics
27 Jul 2022
Vol 7, Issue 68

Abstract

Variability is a property of biological systems, and in animals (including humans), behavioral variability is characterized by certain features, such as its range and the shape of its distribution. Nevertheless, only a few studies have investigated whether and how these features of variability contribute to the ascription of humanness to robots in a human-robot interaction setting. Here, we tested whether two aspects of behavioral variability, namely, the standard deviation and the shape of the distribution of reaction times, affect the ascription of humanness to robots during a joint action scenario. We designed an interactive task in which pairs of participants performed a joint Simon task with an iCub robot placed by their side. The iCub either performed the task in a preprogrammed manner, or its button presses were teleoperated by the other member of the pair, who was seated in a separate room. In the preprogrammed condition, the iCub pressed buttons with reaction times falling within the range of human variability, but the distribution of those reaction times did not resemble a human-like shape. Participants were sensitive to humanness: they correctly detected the human agent above chance level. When the iCub was controlled by the computer program, it passed our variation of a nonverbal Turing test. Together, our results suggest that hints of humanness, such as the range of behavioral variability, might be used by observers to ascribe humanness to a humanoid robot.
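The key manipulation, matching the range of human reaction time (RT) variability while departing from its distribution shape, can be sketched in a few lines of simulation. The block below is an illustrative sketch, not the authors' analysis code: human-like RTs are drawn from a right-skewed ex-Gaussian distribution, the preprogrammed condition is mimicked by sampling uniformly within the same range, and a binomial test of the kind one might use to check above-chance detection of the human agent is included. All parameter values and trial counts are hypothetical.

```python
# Illustrative sketch (not the authors' code): reaction times (RTs) that share
# the human range but not the human-like distribution shape.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
n_trials = 200  # hypothetical number of button presses

# Human-like RTs are commonly modeled as ex-Gaussian: Gaussian plus an
# exponential tail, producing the typical right skew (values in seconds).
mu, sigma, tau = 0.45, 0.05, 0.10  # hypothetical parameters
human_like = rng.normal(mu, sigma, n_trials) + rng.exponential(tau, n_trials)

# Preprogrammed robot: sample uniformly within the human range, so the
# range of variability matches but the shape of the distribution does not.
preprogrammed = rng.uniform(human_like.min(), human_like.max(), n_trials)

print("skewness, human-like RTs:   ", stats.skew(human_like))
print("skewness, preprogrammed RTs:", stats.skew(preprogrammed))

# Above-chance detection of the human agent, e.g., 70 correct judgments out
# of 100 against a 50% chance level (counts are hypothetical).
print(stats.binomtest(70, n=100, p=0.5, alternative="greater"))
```

The sketch only makes the range-versus-shape distinction concrete; the actual humanness judgments and the robot's response schedule followed the procedure reported in the article.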


Supplementary Materials

This PDF file includes:

Supplementary Methods
Table S1
Fig. S1
Reference (59)

Other Supplementary Material for this manuscript includes the following:

MDAR Reproducibility Checklist

REFERENCES AND NOTES

1
A. M. Turing, I.—Computing machinery and intelligence. Mind LIX, 433–460 (1950).
2
J. H. Moor, The status and future of the Turing test. Minds Mach. 11, 77–93 (2001).
3
U. J. Pfeiffer, B. Timmermans, G. Bente, K. Vogeley, L. Schilbach, A non-verbal Turing test: Differentiating mind from machine in gaze-based social interaction. PLOS ONE 6, e27591 (2011).
4
C. Willemse, S. Marchesi, A. Wykowska, Robot faces that follow gaze facilitate attentional engagement and increase their likeability. Front. Psychol. 9, 70 (2018).
5
M. Rebol, C. Güti, K. Pietroszek, Passing a non-verbal Turing test: Evaluating gesture animations generated from speech, in 2021 IEEE Virtual Reality and 3D User Interfaces (VR) (IEEE, 2021), pp. 573–581.
6
J. Ventrella, M. Seif El-Nasr, B. Aghabeigi, R. Overington, Gestural Turing test: A motion-capture experiment for exploring believability in artificial nonverbal communication, in AAMAS 2010 International Workshop on Interacting with ECAs as Virtual Characters (2010).
7
M. Polceanu, MirrorBot: Using human-inspired mirroring behavior to pass a Turing test, in 2013 IEEE Conference on Computational Intelligence in Games (CIG) (IEEE, 2013), pp. 1–8.
8
T. Gurion, P. G. Healey, J. Hough, Real-time testing of non-verbal interaction: An experimental method and platform, in Proceedings of the 22nd Workshop on the Semantics and Pragmatics of Dialogue-Poster Abstracts, SEMDIAL (2018), pp. 1–4.
9
C. Becchio, L. Sartori, M. Bulgheroni, U. Castiello, Both your intention and mine are reflected in the kinematics of my reach-to-grasp movement. Cognition 106, 894–912 (2008).
10
F. Ciardo, I. Campanini, A. Merlo, S. Rubichi, C. Iani, The role of perspective in discriminating between social and non-social intentions from reach-to-grasp kinematics. Psychol. Res. 82, 915–928 (2018).
11
I. M. Thornton, Q. C. Vuong, Incidental processing of biological motion. Curr. Biol. 14, 1084–1089 (2004).
12
E. D. Grossman, R. Blake, Brain areas active during visual perception of biological motion. Neuron 35, 1167–1175 (2002).
13
A. P. Atkinson, W. H. Dittrich, A. J. Gemmell, A. W. V. Young, Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33, 717–746 (2004).
14
T. J. Clarke, M. F. Bradshaw, D. T. Field, S. E. Hampson, D. Rose, The perception of emotion from body movement in point-light displays of interpersonal dialogue. Perception 34, 1171–1180 (2005).
15
N. Stergiou, L. M. Decker, Human movement variability, nonlinear dynamics, and pathology: Is there a connection? Hum. Mov. Sci. 30, 869–888 (2011).
16
A. Wykowska, J. Kajopoulos, M. Obando-Leitón, S. S. Chauhan, J. J. Cabibihan, G. Cheng, Humans are well tuned to detecting agents among non-agents: Examining the sensitivity of human perception to behavioral characteristics of intentional systems. Int. J. Soc. Robot. 7, 767–781 (2015).
17
A. Wykowska, J. Kajopoulos, K. Ramirez-Amaro, G. Cheng, Autistic traits and sensitivity to human-like features of robot behavior. Interact. Stud. 16, 219–248 (2015).
18
R. H. Baayen, P. Milin, Analyzing reaction times. Int. J. Psychol. Res. 3, 12–28 (2010).
19
R. M. Yerkes, Variability of reaction-time. Psychol. Bull. 1, 137–146 (1904).
20
D. G. Tervo, M. Proskurin, M. Manakov, M. Kabra, A. Vollmer, K. Branson, A. Y. Karpova, Behavioural variability through stochastic choice and its gating by anterior cingulate cortex. Cell 159, 21–32 (2014).
21
M. D. Fox, A. Z. Snyder, J. L. Vincent, M. E. Raichle, Intrinsic fluctuations within cortical systems account for intertrial variability in human behavior. Neuron 56, 171–184 (2007).
22
C. Vesper, R. P. Van Der Wel, G. Knoblich, N. Sebanz, Making oneself predictable: Reduced temporal variability facilitates joint action coordination. Exp. Brain Res. 211, 517–530 (2011).
23
L. Schilbach, B. Timmermans, V. Reddy, A. Costall, G. Bente, T. Schlicht, K. Vogeley, Toward a second-person neuroscience. Behav. Brain Sci. 36, 393–414 (2013).
24
N. Sebanz, H. Bekkering, G. Knoblich, Joint action: Bodies and minds moving together. Trends Cogn. Sci. 10, 70–76 (2006).
25
S. Atmaca, N. Sebanz, W. Prinz, G. Knoblich, Action co-representation: The joint SNARC effect. Soc. Neurosci. 3, 410–420 (2008).
26
N. Milanese, C. Iani, S. Rubichi, Shared learning shapes human performance: Transfer effects in task sharing. Cognition 116, 15–22 (2010).
27
N. Sebanz, G. Knoblich, W. Prinz, Representing others’ actions: Just like one’s own? Cognition 88, B11–B21 (2003).
28
N. Sebanz, G. Knoblich, W. Prinz, How two share a task: Corepresenting stimulus-response mappings. J. Exp. Psychol. Hum. Percept. Perform. 31, 1234–1246 (2005).
29
J. R. Simon, A. P. Rudell, Auditory S-R compatibility: The effect of an irrelevant cue on information processing. J. Appl. Psychol. 51, 300–304 (1967).
30
R. W. Proctor, K. P. L. Vu, Stimulus-Response Compatibility Principles: Data, Theory, and Application (CRC Press, ed. 1, 2006).
31
M. Tagliabue, M. Zorzi, C. Umiltà, F. Bassignani, The role of long-term-memory and short-term-memory links in the Simon effect. J. Exp. Psychol. Hum. Percept. Perform. 26, 648–670 (2000).
32
T. Dolk, B. Hommel, L. S. Colzato, S. Schütz-Bosbach, W. Prinz, R. Liepelt, How “social” is the social Simon effect? Front. Psychol. 2, 84 (2011).
33
T. Dolk, B. Hommel, W. Prinz, R. Liepelt, The (not so) social Simon effect: A referential coding account. J. Exp. Psychol. Hum. Percept. Perform. 39, 1248–1260 (2013).
34
M. Yamaguchi, H. J. Wall, B. Hommel, Sharing tasks or sharing actions? Evidence from the joint Simon task. Psychol. Res. 82, 385–394 (2018).
35
K. Dittrich, A. Rothe, K. C. Klauer, Increased spatial salience in the social Simon task: A response coding account of spatial compatibility effects. Atten. Percept. Psychophys. 74, 911–929 (2012).
36
C. Iani, F. Ciardo, S. Panajoli, L. Lugli, S. Rubichi, The role of the co-actor’s response reachability in the joint Simon effect: Remapping of working space by tool use. Psychol. Res. 85, 521–532 (2021).
37
F. Ciardo, L. Lugli, R. Nicoletti, S. Rubichi, C. Iani, Action-space coding in social contexts. Sci. Rep. 6, 22673 (2016).
38
A. Sahaï, A. Desantis, O. Grynszpan, E. Pacherie, B. Berberian, Action co-representation and the sense of agency during a joint Simon task: Comparing human and machine co-agents. Conscious. Cogn. 67, 44–55 (2019).
39
T. Wen, S. Hsieh, Neuroimaging of the joint Simon effect with believed biological and non-biological co-actors. Front. Hum. Neurosci. 9, 483 (2015).
40
C. C. Tsai, C. W. J. Kuo, D. L. Hung, O. J. Tzeng, Action co-representation is tuned to other humans. J. Cogn. Neurosci. 20, 2015–2024 (2008).
41
A. Stenzel, E. Chinellato, M. A. T. Bou, A. P. Del Pobil, M. Lappe, R. Liepelt, When humanoid robots become human-like interaction partners: Corepresentation of robotic actions. J. Exp. Psychol. Hum. Percept. Perform. 38, 1073–1077 (2012).
42
M. Strait, F. Lier, J. Bernotat, S. Wachsmuth, F. Eyssel, R. Goldstone, S. Šabanović, A three-site reproduction of the joint Simon effect with the NAO robot, in Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (IEEE, 2020), pp. 103–111.
43
A. Sahaï, “Joint agency in human-machine interactions: How to design more cooperative agents?,” thesis, PSL Research University (2019).
44
K. L. Marsh, M. J. Richardson, R. C. Schmidt, Social connection through joint action and interpersonal coordination. Top. Cogn. Sci. 1, 320–339 (2009).
45
P. E. Keller, G. Novembre, M. J. Hove, Rhythm in joint action: Psychological and neurophysiological mechanisms for real-time interpersonal coordination. Philos. Trans. R. Soc. Lond. B Biol. Sci. 369, 20130394 (2014).
46
M. Malone, R. D. Castillo, H. Kloos, J. G. Holden, M. J. Richardson, Dynamic structure of joint-action stimulus-response activity. PLOS ONE 9, e89032 (2014).
47
G. Metta, L. Natale, F. Nori, G. Sandini, D. Vernon, L. Fadiga, L. Montesano, The iCub humanoid robot: An open-systems platform for research in cognitive development. Neural Netw. 23, 1125–1134 (2010).
48
F. Ciardo, A. Wykowska, Response coordination emerges in cooperative but not competitive joint task. Front. Psychol. 9, 1919 (2018).
49
A. V. Barbosa, H. C. Yehia, E. Vatikiotis-Bateson, Algorithm for computing spatiotemporal coordination, in Auditory and Visual Speech Processing, S. Luccy, Ed. (Moreton Island: Casual Productions, 2008), pp. 131–136.
50
P. Stone, R. Brooks, E. Brynjolfsson, R. Calo, O. Etzioni, G. Hager, J. Hirschberg, S. Kalyanakrishnan, E. Kamar, S. Kraus, K. Leyton-Brown, D. Parkes, W. Press, A. L. Saxenian, J. Shah, M. Tambe, A. Teller, “Artificial intelligence and life in 2030: One hundred year study on artificial intelligence” (Report of the 2015–2016 study panel, Stanford University, 2016).
51
S. Harnad, Other bodies, other minds: A machine incarnation of an old philosophical problem. Minds Mach. 1, 43–54 (1991).
52
S. Marchesi, D. Ghiglino, F. Ciardo, J. Perez-Osorio, E. Baykara, A. Wykowska, Do we adopt the intentional stance toward humanoid robots? Front. Psychol. 10, 450 (2019).
53
F. Ciardo, D. De Tommaso, A. Wykowska, Effects of erring behavior in a human-robot joint musical task on adopting Intentional Stance toward the iCub robot, in 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN) (IEEE, 2021), pp. 698–703.
54
A. Aron, E. N. Aron, D. Smollan, Inclusion of other in the self scale and the structure of interpersonal closeness. J. Pers. Soc. Psychol. 63, 596–612 (1992).
55
A. Waytz, K. Gray, N. Epley, D. M. Wegner, Causes and consequences of mind perception. Trends Cogn. Sci. 14, 383–388 (2010).
56
D. Bates, R. Kliegl, S. Vasishth, H. Baayen, Parsimonious mixed models. arXiv:1506.04967 [stat.ME] (16 June 2015).
57
A. Kuznetsova, P. B. Brockhoff, R. H. B. Christensen, Package ‘lmerTest’. R package version 2, 734 (2015).
58
B. Efron, R. J. Tibshirani, An Introduction to the Bootstrap (CRC Press, 1994).
59
N. Spatola, A. Wykowska, The personality of anthropomorphism: How the need for cognition and the need for closure define attitudes and anthropomorphic attributions toward robots. Comput. Hum. Behav. 122, 106841 (2021).

Information & Authors


Published In

Science Robotics
Volume 7 | Issue 68
July 2022

Submission history

Received: 18 January 2022
Accepted: 29 June 2022


Acknowledgments

This work has received support from the European Research Council under the European Union’s Horizon 2020 research and innovation program, ERC starting grant, G.A. number ERC-2016-StG-715058, awarded to A.W. F.C. was partly supported by H2020 Marie Skłodowska-Curie grant agreement no. 893960. The content of this paper is the sole responsibility of the authors. The European Commission or its services cannot be held responsible for any use that may be made of the information it contains.
Author contributions: F.C. designed and performed all experiments, analyzed the data, and wrote the manuscript. D.D.T. programmed the robot and revised the manuscript. A.W. designed the experiments and wrote the manuscript.
Competing interests: The authors declare that they have no competing interests.
Data and materials availability: All data, code, and materials used in the analysis are available at https://osf.io/vyj73/.

Authors

Affiliations

Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genoa, Italy.
Roles: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Validation, Visualization, Writing - original draft, and Writing - review & editing.
Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genoa, Italy.
Roles: Conceptualization, Data curation, Methodology, and Software.
Social Cognition in Human-Robot Interaction, Italian Institute of Technology, Genoa, Italy.
Roles: Conceptualization, Funding acquisition, Methodology, Project administration, Resources, Supervision, Validation, Writing - original draft, and Writing - review & editing.


Notes

*Corresponding author. Email: [email protected]

