Voice commands are now widely used, and they are being deployed in new settings: connected speakers, audio headsets, vehicle interiors, and even other everyday objects, renewing the ways we can interact with them.
Julia Velkovska and Moustafa Zouinar, who have been studying the actual uses of voice assistants since 2015, nonetheless observe "a real gap between the promotional discourse praising the conversational capabilities of assistants and the reality of their use."
In an article published in the CNRS journal, Justine Cassell and Catherine Pelachaud also express their skepticism: the "disembodied" voices currently found on Home, Alexa, or Siri "are far from what these assistants could be in the future: virtual beings with a body and a face to better convey their message, able to read our mood and build relationships with us to better meet our needs."
According to Justine Cassell and Catherine Pelachaud, "creating such beings requires long-term research if we want to move beyond the 'I'm sorry, I don't understand you' or 'I'm sorry, I don't know how to help you with that' that today's assistants so regularly give us. Communication is much more than an exchange of information. It is not a simple series of questions and answers."
"Despite ever-increasing knowledge of the mechanisms underlying human communication, it remains illusory to claim that we can create today a virtual being able to respond to every situation. However, it is becoming possible to design virtual beings capable of interacting in specific contexts. We can thus imagine virtual tutors for learning (a language, algebra...), or virtual companions capable of helping, for example, an elderly person adopt the healthy habits prescribed by their doctors."

Ethical issues
The two researchers explain that there are also "ethical questions that researchers, but also society as a whole, will have to answer in the near future. To what extent is it in our interest to create a virtual being capable of building relationships and communicating in an entirely natural way with human beings? Should we not, on the contrary, keep some imperfection in them, to prevent humans from becoming too attached to them, or from being manipulated by a virtual interlocutor that has become too skilled?"

Another issue is the question of data storage: "to meet its user's expectations, a virtual being will first have to finely analyze their emotions, moods, and needs, and store this information. What will the company marketing the assistant in question do with it? What will happen to this information if it is hacked or falls into the wrong hands? The recent case of Alexa recordings used by Amazon, without users' knowledge, to improve its service gives food for thought."