Imagine a machine that can laugh and cry, learn and teach, or have a natural chat with you as though it were a person.
Imagine a computer that recognizes you, reacts to stimuli and can establish a real connection with you. Science fiction? No. It’s reality.
Yesterday, artificial intelligence engineer Mark Sagar gave a presentation here at SXSW entitled "Giving a face to artificial intelligence," a novelty that impressed everyone packed into the large hall at the Marriott: avatars that are emotionally responsive, just like humans.
These are "living" computational models, with a face and a brain, that combine multidisciplinary technologies: bioengineering, computational and theoretical neuroscience, artificial intelligence, and interactive computer graphics research.
Instead of a simple screen on a device, we see a person: a baby (known as BabyX), women and men of diverse ethnicities and age groups, each one more lifelike than the last. These are animated, interactive virtual prototypes created to foster identification with people and enrich the user experience.
Considering that life’s most significant experiences are those that happen face-to-face, what could be more fitting than to utilize this power in human-computer interaction?
Mark has given a face and a virtual incarnation to artificial intelligence. The cerebral processes that give rise to social learning and behavior were modeled and used to animate realistic models of faces that can truly interact. The computer models come to life through advanced 3D computer graphics, which are able to precisely render each muscle on the face, and display a wide variety of facial expressions.
The system analyzes the video and audio inputs in real time, to produce reactions to the behavior of the person interacting with the machine. If you love Siri, Apple’s voice-activated assistant, imagine how much you’ll love these avatars! I’d say they’re like a Siri with a face, activated by audio and video.
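Conceptually, the loop described above — sense the user's face and voice, update an internal emotional state, and drive the avatar's facial muscles — might be sketched like this. This is a toy model only; every name here (the valence/arousal scheme, the muscle mapping) is my own illustrative assumption, not the actual architecture of Sagar's system:

```python
from dataclasses import dataclass

@dataclass
class EmotionState:
    """Toy valence/arousal model of the avatar's internal state."""
    valence: float = 0.0  # negative = sad, positive = happy
    arousal: float = 0.0  # low = calm, high = excited

def update_emotion(state: EmotionState, smile: float, voice_energy: float,
                   decay: float = 0.9) -> EmotionState:
    """Blend new audio/video cues into the current emotional state.

    `smile` (0..1) would come from video analysis of the user's face;
    `voice_energy` (0..1) from the audio stream. The decay term makes
    the avatar's mood shift gradually rather than flip each frame.
    """
    valence = decay * state.valence + (1 - decay) * (2 * smile - 1)
    arousal = decay * state.arousal + (1 - decay) * voice_energy
    return EmotionState(valence, arousal)

def facial_muscles(state: EmotionState) -> dict:
    """Map emotional state to coarse muscle activations (0..1),
    loosely inspired by FACS-style facial action units."""
    return {
        "zygomaticus_major": max(0.0, state.valence),       # smiling
        "corrugator_supercilii": max(0.0, -state.valence),  # frowning
        "levator_palpebrae": 0.3 + 0.7 * state.arousal,     # eye openness
    }

# Simulate twenty frames of a smiling, animated user.
state = EmotionState()
for _ in range(20):
    state = update_emotion(state, smile=0.9, voice_energy=0.6)
muscles = facial_muscles(state)
print(muscles)
```

The point of the sketch is the shape of the pipeline, not the numbers: perception feeds a persistent internal state, and that state — not the raw input — is what animates the face, which is what lets the avatar's reactions feel continuous rather than mechanical.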
These emotionally intelligent virtual agents, just like humans, learn through interaction. How we experience the world depends on our actions, perceptions, emotions and memories. The avatars have basic neural systems driving interactive behavior and learning. Amazing, isn’t it? I love it! In my opinion, this innovation is going to permanently change the way we relate to technology.
One interesting fact is that Mark has won two Oscars (Scientific and Engineering Awards) for the visual effects in Avatar and King Kong, and he also worked on Spider-Man 2. That helped me understand why these digital creatures are so convincing. The difference is that, in fiction, the creatures serve to narrate the clash between man and unknown civilizations, with catastrophic results. In real life, they serve to establish a relationship that is as yet unknown between man and machine, with transformative results. Welcome to the next step in human and digital interaction.