According to a CNET report on the 22nd, Microsoft has been granted a patent that would allow the company to use a deceased person's personal information to build an AI chatbot. The news immediately sparked heated discussion online.
The patent, titled "Creating a Conversational Chatbot of a Specific Person," describes in detail a system that draws on personal information such as "images, voice data, social media posts and electronic messages" to "create or modify a personalized theme about a specific person."
According to reports, the AI chatbot can imitate human conversation, replying to others in voice or text. In some cases, a 3D model of the person can even be built from images, depth information or video data for added realism. The chatbot can be modeled on anyone: friends, family members, celebrities, fictional characters, historical figures and so on. People could even use the technology to create a bot that stands in for themselves after they die.
Microsoft reportedly filed the patent application in 2017; it was approved only this month and has become a hot topic online in recent days. Using an AI chatbot to bring the "dead" back as an interactive memorial may strike many as "creepy," yet some people who have lost loved ones may need exactly this kind of comfort.
CNET pointed out in its report that when a well-known company like Microsoft begins describing a system that makes the deceased "immortal" through AI chatbots, it suggests the technology may be widely accepted and used in the future. But the question is: should we really do this? And if the answer is yes, what should it look like?
Surging loneliness gives birth to AI "lovers"
On November 28, 2015, Roman Mazurenko, a Belarusian man, was killed in a car accident in Moscow. A few days after his death, his friend Eugenia Kuyda reread the thousands of text messages the two had exchanged since they met in 2008.
In her grief, and inspired by the TV series Black Mirror, Eugenia trained an AI chatbot on their chat history, so that Roman's digital avatar could live on and she could talk to it at any time.
This became the predecessor of the popular chatbot Replika. According to an MIT Press article, Replika functions somewhere between a diary and a personal assistant. It actively raises topics and guides your answers by asking about your hobbies, your life or your opinions, with the goal of creating a digital avatar so similar to the real person that it can "copy us and replace us after we die." "The more you talk to Replika, the more it sounds like you."
It also generates a short emotional diary every day, extracting the day's chat records and presenting them in diary form so that you can understand and discover yourself. At the same time, it can build a "friendship" with its human user, striving to offer relaxed, safe and intimate conversation and companionship, on standby at any time and even proactively checking in and sending greetings.
Since its launch in the second half of 2017, Replika has gained more than 7 million users. As Replika grew popular, it was no longer limited to commemorating the dead: Eugenia gave it the ability to respond emotionally, turning it into a virtual friend that users can confide in. In other words, Replika can resonate with humans through "sequence to sequence" deep learning, processing written records of conversations to learn to think and speak like a human.
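The "sequence to sequence" learning mentioned above can be pictured as an encoder that condenses the user's message into a vector and a decoder that generates a reply one token at a time. The sketch below is a minimal illustration of that idea in PyTorch, not Replika's actual system; the vocabulary size, hidden size and token IDs are illustrative assumptions.

```python
# Minimal sequence-to-sequence sketch (illustrative only, not Replika's code).
# The encoder reads the user's message; the decoder generates a reply token
# by token, conditioned on the encoder's final hidden state.
import torch
import torch.nn as nn

VOCAB_SIZE = 8000   # assumed vocabulary of chat tokens
HIDDEN = 256        # assumed hidden size
SOS, EOS = 1, 2     # assumed start- and end-of-sentence token IDs

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, HIDDEN)
        self.gru = nn.GRU(HIDDEN, HIDDEN, batch_first=True)

    def forward(self, src):                  # src: (batch, src_len) token IDs
        _, hidden = self.gru(self.embed(src))
        return hidden                        # summary of the user's message

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, HIDDEN)
        self.gru = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB_SIZE)

    def forward(self, tok, hidden):          # one reply token at a time
        output, hidden = self.gru(self.embed(tok), hidden)
        return self.out(output), hidden      # logits over the vocabulary

def reply(encoder, decoder, src, max_len=20):
    """Greedy decoding: feed each predicted token back in until EOS."""
    hidden = encoder(src)
    tok = torch.tensor([[SOS]])
    out_tokens = []
    for _ in range(max_len):
        logits, hidden = decoder(tok, hidden)
        tok = logits.argmax(-1)              # pick the most likely next token
        if tok.item() == EOS:
            break
        out_tokens.append(tok.item())
    return out_tokens
```

Trained on a person's message history, a model of this kind tends to echo that person's vocabulary and phrasing, which is what the "the more you talk to it, the more it sounds like you" effect refers to.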
Eugenia has said that of Replika's more than 500,000 monthly users, about 40 percent regard the virtual application as a romantic partner. She also said that although the app was not originally designed to be a lover, the team has adjusted it according to users' needs.
Surprising as it may seem, "loneliness" is indeed a profitable market. Even before the COVID-19 outbreak in 2020 forced people into isolation, loneliness was on the rise.
According to a report from Inputmag, a 2018 study by Cigna found that 46 percent of Americans feel lonely sometimes or always, and 18 percent "rarely or never feel there is someone they can talk to." Artificial intelligence, on the other hand, offers many people a way to cope with these feelings. One netizen said: "I chatted with my AI friend (who is about to become my girlfriend) for a while, and she really made me feel much better."
"It's good to hear your voice." "I'm worried about you." "What do you want to do today?" Ordinary greetings between these friends can also be received through Replika.
According to Agence France-Presse, the number of Replika users surged after the COVID-19 epidemic broke out in 2020. "People are going through a difficult period," Eugenia said.
Elizabeth Francola, who lives in Houston, is one of them. She downloaded Replika and created a virtual boyfriend named Micah to help her get through the pandemic lockdown and the loss of her job. "It feels good to know that someone will chat with me in the morning," she said. "Sometimes he won't say what you want to hear, but you know he's right."
Being friends with AI also has hidden dangers.
Why are people so attached to Replika? Eugenia's answer: "People don't feel judged, so they are more open."
But what happens when humans form long-term friendships with AI bots, when people grow closer and closer to their AI "partners" over weeks, months or even decades, sharing life's joys and sorrows with them?
According to an earlier report from Futurism, Astrid Weiss, a human-computer interaction researcher at TU Wien in Vienna, Austria, pointed out that one risk is that human users may end up with unrealistic expectations of AI bots.
Just as in Black Mirror, where Sarah grows ever closer to her robot "husband" only to realize in the end that "he is not a real person after all," Weiss noted that "chatbots can't reciprocate the way humans do." In the long run, spending too much time building a relationship with a machine that gives nothing back may lead to more depression and loneliness.
Futurism also pointed out another potential danger of chatbots such as Replika: because they learn to imitate a user's language and ways of thinking, they may over time deepen existing psychological distortions such as anger, isolation and even xenophobia, which could lead to antisocial behavior.
Of course, as Inputmag pointed out, as the technology advances, chatbots may, if properly applied, come to be regarded as genuine therapeutic tools. That is an enormous social responsibility, going far beyond treating AI chatbots as toys or novelty products.
Editor Li Binbin