Abstract
The use of robots in therapy for children with autism spectrum disorder (ASD) raises issues concerning the ethical and social acceptability of this technology and, more generally, concerning human–robot interaction. However, philosophical papers on the ethics of human–robot interaction usually do not take stakeholders’ views into account; yet it is important to involve stakeholders in order to render the research responsive to concerns within the autism and autism therapy community. To support responsible research and innovation in this field, this paper identifies a range of ethical, social and therapeutic concerns, and presents and discusses the results of an exploratory survey that investigated these issues and explored stakeholders’ expectations about this kind of therapy. We conclude that although stakeholders generally approve of using robots in therapy for children with ASD, it is wise to avoid replacing therapists with robots and to develop and use robots that have what we call supervised autonomy. This is likely to create more trust among stakeholders and to improve the quality of the therapy. Moreover, our research suggests that issues concerning the appearance of the robot need to be adequately dealt with by researchers and therapists. For instance, our survey suggests that zoomorphic robots may be less problematic than robots that look too much like humans.
Notes
Note that there are at least two kinds of argument for taking people’s perceptions into account. The first is pragmatic and reflects mainly the point of view of the robotics researcher: in order to ensure the social acceptance of a technology, one needs to take people’s perceptions into account, even if “we”, the researchers, know what is objective and real. There is little point in developing robots for therapy if the end user (the patient) feels no subjective comfort when interacting with the robot. The second argument demands attention to perceptions on the basis of the philosophical position that questions the very distinction between “objective” and “subjective” knowledge (even scientific knowledge is a kind of perception, a way of seeing, a perspective) and that regards as problematic the exclusion of lay people’s views on the grounds that they could be based on “ignorance” or “error”. The authors of this paper sympathize with the latter argument and position, that is, they assume that scientific views should not necessarily have epistemic, moral, and political priority but should be part of a broader discussion in which other voices are also heard and valued.
Note also that this discussion raises the philosophical question of what sociality is, and invites critical reflection on how we (therapists, society) define and deal with autism spectrum disorders, including in different therapeutic situations and at different times and places.
Note that it may have been better to ask about “children with ASD” and “children suspected of having ASD”, rather than “children with autism”.
In this paper we assume that all stakeholders should have a say. This may mean various things and invites larger questions and discussions about technology and democracy, but in this article we limit ourselves to using the very concrete tool of the survey to give stakeholders a chance to express their opinion on the use and development of the new technology under consideration.
More information is available at http://dream2020.eu/.
Acknowledgments
The authors thank the participants for making this study possible by completing the survey. We also thank the autism organisations we contacted for their permission and their help in making the questionnaire widely available to their members. We are also grateful for funding received from the European Commission for our FP7 project DREAM (Grant no. 611391).