A decade ago, Joaquin Phoenix fell in love with his AI companion Samantha in the Spike Jonze movie Her.
Cut to the present: chatbots like ChatGPT and Bard, which are getting ever better at mimicking human conversation, have opened the door for individuals to design their perfect AI partners.
"Have you ever dreamed about the best girlfriend ever? Almost for sure! Now she can be at your fingertips," reads the slogan for AI girlfriend app Romantic AI.
Meanwhile, Silicon Valley venture capital firm Andreessen Horowitz (a16z) has uploaded a tutorial to GitHub outlining how to create customisable "AI companions" with configurable personalities and backstories.
With jurisdictions such as China and the European Union still devising detailed regulations to govern artificial intelligence (AI), some are concerned that virtual girlfriend apps might create unrealistic expectations for human relationships.
On another AI girlfriend app, Eva AI, a user is given options like “hot, funny, bold”, “shy, modest, considerate” or “smart, strict, rational” to create the “perfect partner”. The app also asks if one wants to opt in to sending explicit messages and photos.
"Creating a perfect partner that you control and meets your every need is really frightening," Tara Hunter, the acting CEO for Full Stop Australia, which supports victims of domestic or family violence, was quoted as saying in a report by The Guardian.
Hunter added that since the drivers of gender-based violence are ingrained in cultural beliefs that men can control women, it is "really problematic".
Meanwhile, Eva AI’s head of brand, Karina Saifulina, told Guardian Australia the company had full-time psychologists to help with the mental health of users. "Together with psychologists, we control the data that is used for dialogue with AI," she said.
Saifulina added that the company surveys loyal users every two to three months to ensure the application is not harming their mental health. She further said the app has guardrails to avoid discussion of topics like domestic violence or paedophilia.
Dr Belinda Barnet, a senior lecturer in media at Swinburne University, told the publication that the apps cater to a need but that more regulation is required. "With respect to relationship apps and AI, you can see that it fits a really profound social need [but] I think we need more regulation, particularly around how these systems are trained," Barnet said.
According to an analysis by venture capital firm a16z, quoted by the publication, the next generation of AI relationship apps will be even more realistic.
The a16z analysts said AI relationship apps are "just the beginning of a seismic shift in human-computer interactions that will require us to re-examine what it means to have a relationship with someone".
"We’re entering a new world that will be a lot weirder, wilder, and more wonderful than we can even imagine."