Like many other digital devices, SARs may collect data through their embedded sensors to enhance their performance. Although national and supranational legislation has adapted to protect citizens' privacy (for example, in the European Union, national laws were amended to comply with the General Data Protection Regulation, adopted in 2016 and enforced from 2018), some grey areas remain about the use of personal data. SARs also raise new questions about their responsibility before the law.
The spread of social robots in society also sparks debates about the place of robots in human environments, both from a practical angle (for example, the use of robots in work environments) and from a philosophical angle (questioning the evolving boundaries between humans and robots). Although some of these issues were identified long before robots actually existed, the presence of SARs is a fairly recent phenomenon that has yet to be fully addressed. In healthcare settings, the use of SARs raises specific challenges, as their current technical limitations do not allow them to provide spontaneous and useful assistance to patients or staff members, or to engage in any convincing dialogue. Therefore, the opportunity to introduce today’s SARs into healthcare can be discussed not only from a technical point of view, but also from an ethical perspective.
More specifically, in eldercare settings, the use of SARs raises additional issues such as:
- the rate of acceptance of social robots among older adults;
- the possibility of deception and manipulation, especially for human-like and animal-like robots;
- privacy issues; and
- considerations about the future of the relationship between care providers and patients.
Ethical challenges of the use of SARs in geriatric settings can thus be summarised as follows:
- Ensure the privacy of users, both regarding the management of their personal data and the boundaries between private and public spaces.
- Mitigate the psychological effects of the robot on vulnerable patients.
- Limit the negative impacts, both organisational and psychological, of the robot’s presence on the care professionals’ daily work.
The main objective of SPRING is to develop a socially assistive robot able to perform in noisy and crowded environments. Eventually, SPRING’s robot will provide daily assistance and entertainment in geriatric settings. The project is at the forefront of healthcare and technological innovation and, although the medical risk for participants is low, it raises significant ethical issues regarding fairness, dignity and privacy. AP-HP, as the ethics guarantor for the project, has therefore set up a comprehensive strategy for the consortium to protect participants’ rights (such as the right to decline to participate without any consequences) and to mitigate the adverse effects that the experiments may generate (for example, by ensuring that they do not affect the quality of routine care).
The following general principles will guide the actions of the consortium regarding these critical issues:
- SPRING members will ensure that each experiment is conducted in accordance with the ethical policy defined in a framework document (D10.3), under the supervision of the relevant ethical committees that have been identified.
- Participants will be recruited according to their ability to give informed consent. The investigators will assist all participants, provide all the necessary information, and ensure that their interests always prevail.
- SPRING will comply fully with GDPR requirements regarding the collection, storage, transfer and processing of personal data. Adequate technical and organisational measures will be implemented to protect the rights of participants.
- The consortium, as a whole, intends to apply the guidelines of Horizon 2020, specifically on the matters of “Health, demographic change and well-being” and “Inclusive, innovative and reflective societies”.
Throughout the project, experiments will be conducted to collect data used to build, enhance and evaluate the capabilities of the robot or of some of its modules. Most importantly, the robot must analyse the behaviour and understand the intentions of the humans around it in order to respond accordingly. This implies that human subjects must be involved in many of these experiments, at different stages of the project. The first experiments will be conducted in each partner’s laboratory and will involve volunteers with various socio-demographic profiles (students, researchers, etc.), but no patients or vulnerable subjects. These experiments will serve to create and validate the first versions of the robot’s modules, as well as the first fully integrated prototype, before the main experiment begins.
The main experiment will take place in Broca’s DCH and will involve actual users of the DCH (patients, caregivers, professionals). Although individuals with severe cognitive impairment will not be included in the experiment, a significant portion of the participants is expected to present mild to moderate cognitive decline. Hence, the project raises significant concerns about the consent, privacy and dignity of vulnerable participants, and about the collection and management of their personal data.
As such, the SPRING Consortium commits to:
- Ensure the participants’ right to be informed and to freely consent or decline to participate, and protect their dignity and access to proper care regardless of their attitude towards the experiment.
- Adapt the study design to ensure maximal privacy for the participants, as well as full privacy for those who do not wish to participate.
- Modulate the course of the experiments according to the vulnerability of the participants.
- Minimize the amount of data collected, and anonymize or pseudonymize these data whenever possible.
- Set up secure storage and transfer protocols for the personal data collected.
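To illustrate the pseudonymization commitment above, one common approach is to replace direct identifiers with a keyed hash before storage, so that records from the same participant can still be linked across sessions without retaining the real identifier. The sketch below is only a hypothetical illustration of that idea, not the mechanism SPRING specifies; the key name and record fields are assumptions for the example.

```python
import hmac
import hashlib

# Hypothetical example: the secret key would be generated once and stored
# separately from the dataset, so pseudonyms cannot be reversed without it.
SECRET_KEY = b"placeholder-key-kept-outside-the-dataset"

def pseudonymize(participant_id: str) -> str:
    """Return a stable pseudonym for a participant identifier (HMAC-SHA256)."""
    digest = hmac.new(SECRET_KEY, participant_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability

# The stored record keeps only the pseudonym, never the raw identifier.
record = {"participant": pseudonymize("patient-0042"), "session": 3}

# The same input always yields the same pseudonym, allowing linkage
# across sessions while the raw identifier is never stored.
assert record["participant"] == pseudonymize("patient-0042")
assert record["participant"] != "patient-0042"
```

Because the hash is keyed, destroying the key after the study effectively turns the pseudonymized data into anonymized data, which aligns with the minimization principle stated above.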
A full description of the measures set forth within the SPRING project to ensure scrutiny of the ethical and privacy aspects of its experimental protocol is provided in deliverable D10.3: Privacy and Ethics Guidelines.