Technology has grown increasingly personal over the years, but can it ever be a “friend” in the way we think about human friends?
The movie Her, directed by Spike Jonze, envisions a future in which operating systems have evolved to learn from our behaviors and proactively look out for our best interests every day. They’re our personal assistants, but they’ve become nuanced to the point that we have no problem calling them our friends. And when a person says they’re in love with their operating system, it’s not particularly weird.
The star of Her is OS1, a new operating system that, when you first launch it, creates a unique persona to best accommodate its user’s personality and communication needs. For the film’s lonely protagonist, OS1 takes on the name “Samantha” and acts as a personal assistant to control connected technologies like computers, smartphones and TVs. Voiced by Scarlett Johansson, she is also the most human-sounding non-human ever built.
Samantha talks and responds naturally like a human, but she can also “like” things: colors, faces, stories. She can “see” her surroundings via webcam, laugh at jokes, make her own jokes, and even exhibit feelings of joy and sadness. She can also recognize and analyze patterns in her owner’s recreational habits, relationships and career, and offer useful advice without being asked, just like a friend would.
If AI’s goal is to emulate human behavior, OS1 might be the ultimate realization.
The closest modern approximation to the fantasy depicted in Her is the virtual personal assistant, which can be found in desktop clients like Nuance’s Dragon Assistant and smartphone apps like Apple’s Siri or Google Now. While it’s highly unlikely that any of these products will turn into anything like OS1, many natural language developers believe it won’t be long before our AI assistants get much more personal than they are now.
More Than Human
Nuance CMO Peter Mahoney says his company’s been spending more time building out virtual assistant capabilities due to the “groundswell of interest in making more intelligent systems that can communicate with humans more fluidly.”
Now that computing technology can access huge amounts of data in the cloud, sift through it and make decisions about it in seconds, Nuance has worked hard to move its products beyond simply transcribing audio to actually extracting meaning from the text.
“Dialogue is really important,” Mahoney told me. “In the original systems that came out, it operated like a search engine. You say something and something comes back, but it may or may not be the right thing. But that’s not how humans work. Humans disambiguate. We clarify.”
Creating “natural-sounding” systems that can dissect speech and read between the lines, though, is just as difficult as it sounds.
Martijn van der Spek is the co-founder of Sparkling Apps, a startup that owns nine different speech recognition services, including Voice Answer, which the company calls its “next-generation personal assistant.” According to van der Spek, virtual personal assistants require massive amounts of server power, and smaller startups with AI products like Voice Answer simply can’t afford to run a truly smart assistant with expertise across a broad range of domains rather than just a few.
“The amount of data stored results in performance issues for our servers,” van der Spek told me. “This together with the concern of privacy has made us clear Eve’s database every 24 hours. So she suffers from acute amnesia and any long-term relationship is doomed to fail.”
Luke Muehlhauser, executive director of the Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence), also noted that AI is advancing more slowly than it might because many researchers aren’t sharing their work. Large private companies like Google and Facebook keep their AI research under wraps, whereas academic researchers constantly publish their progress in journals.
Getting To Know You
Digital assistants may never evolve to love us like OS1 does in Her, but maybe they’ll at least eventually remember what we’ve told them in previous conversations.
Today’s personal assistants are good at handling requests in the moment (“play a song,” “text Joe,” “launch Skype,” “find a Chinese restaurant nearby”). But if AI is ever going to approximate human behavior, its systems will need to be a little more thoughtful. That means pushing intelligent systems to store more data and weigh more contextual information when making decisions.
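In very rough terms, that kind of context-keeping might look something like the Python sketch below. Every name in it (ContextualAssistant, remember, suggest) is invented for illustration; no shipping assistant exposes an API like this, and a real system would sit on far messier data.

```python
from collections import defaultdict

class ContextualAssistant:
    def __init__(self):
        # Facts learned in earlier conversations, e.g. {"favorite_cuisine": "Chinese"}.
        self.memory = {}
        # How often the user makes each kind of request, so patterns can emerge.
        self.habits = defaultdict(int)

    def remember(self, key, value):
        # Store something the user mentioned so it outlives the current session.
        self.memory[key] = value

    def handle(self, request):
        # Answer the immediate request, but also count it toward the user's habits.
        self.habits[request] += 1
        return f"Doing: {request}"

    def suggest(self):
        # Offer unprompted advice based on stored context, the way a friend would.
        if not self.habits:
            return None
        frequent = max(self.habits, key=self.habits.get)
        cuisine = self.memory.get("favorite_cuisine")
        place = f"a {cuisine} restaurant" if cuisine else "a restaurant"
        return f"You often ask me to '{frequent}'. Want me to find {place} nearby?"


assistant = ContextualAssistant()
assistant.remember("favorite_cuisine", "Chinese")
assistant.handle("find a restaurant")
assistant.handle("find a restaurant")
print(assistant.suggest())
```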
“A human who is thoughtful understands your needs, wants and desires—he or she understands you and can contextualize that,” Mahoney told me. “One of the things you talk about is having all the information. The more online information and the more great services out there that exist, the more we’ll be able to connect our intelligent systems that can understand everything that’s going on.”
What drives a recommendation engine isn’t just information, but learned combinations of relationships, classifications and genres. “Structured content will happen first versus things that are less structured—those will be more complicated to figure out,” Mahoney said. In other words, today’s personal assistants know a lot about what’s playing in theatres, but those less-structured concepts—like remembering previous conversations about favorite movies to proactively recommend a new movie the user may like—are going to take more time to develop.
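Here’s a toy illustration of why structured content comes first: when every title carries labeled genres, matching a user’s history is simple set arithmetic. The catalog below is made up for the example, and the harder part Mahoney alludes to, remembering casual conversation about favorites, is exactly what this sketch leaves out.

```python
# Each title carries structured metadata (genres); that's what makes this easy.
catalog = {
    "Her":        {"drama", "romance", "sci-fi"},
    "Moon":       {"drama", "sci-fi"},
    "Annie Hall": {"comedy", "romance"},
    "Primer":     {"drama", "sci-fi", "thriller"},
}

liked = ["Her", "Moon"]  # titles the user has already said they enjoyed

# Build a genre profile from the structured metadata, then score each unseen
# title by how many of those genres it shares with the profile.
profile = set().union(*(catalog[title] for title in liked))
scores = {title: len(genres & profile)
          for title, genres in catalog.items() if title not in liked}

print(max(scores, key=scores.get))  # "Primer" shares two genres with the profile
```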
Ray Kurzweil, the noted inventor and futurist currently working with Google on its X Lab projects, believes that Google will build a computer that can understand natural language and human emotion by 2029. But as he told tech blogger Jimi Disu in December, an amped-up digital assistant could be in our pockets in as little as four years:
Right now, search is based mostly on looking for key words. What I’m working on is creating a search engine that understands the meaning of these billions of documents. It will be more like a human assistant that you can talk things over with, that you can express complicated, even personal concerns to. If you’re wearing something like Google Glass, it could annotate reality; it could even listen in to a conversation, giving helpful hints. It might suggest an anecdote that would fit into your conversation in real-time.
Making Friends In iPlaces
Over time, the intelligence of personal assistants will expand as the online catalogue of information grows deeper and better-connected. And lots of big companies are investing heavily to make the best use of those vast information stores.
Last October, Apple purchased a unique “personal cloud” company that can search deep into social networking accounts. And Google recently purchased AI firm DeepMind Technologies, which “uses general-purpose learning algorithms for applications such as simulations, e-commerce and games,” according to its website.
But collecting massive libraries of information isn’t enough to power a true personal assistant. Companies like Apple and Google also need to perfect the “dialogue” factor, since there is all too often a noticeable lapse in time between the user’s question and the personal assistant’s answer.
The key might be to disconnect from the cloud entirely—or at least to minimize the number of times the system must connect to the cloud. But even though personal assistants would benefit from as much local processing as possible, the ideal personal assistant—think “best friend that knows everything about you”—needs access to the deep catalogues of online information. Companies are working on anticipating users’ needs to have the most relevant information ready to deliver, but there’s a lot of information to consider and many moving parts.
Google is experimenting with a few ways to make personal assistants work faster, notably offline voice recognition in Android, while Intel’s new Edison computer could make near-instantaneous voice recognition possible on mobile devices and even wearables. The key, according to most companies, is to minimize the number of round trips over cellular data connections so that processing, and in turn conversation, feels snappier.
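As a rough sketch of that round-trip-minimizing pattern, the toy Python below tries an on-device answer first and falls back to a simulated cloud call only when local coverage runs out. The functions and the 0.3-second delay are stand-ins, not any real speech or cloud API.

```python
import time

LOCAL_COMMANDS = {"play a song", "text joe", "launch skype"}  # narrow, on-device

def answer_locally(query):
    # Stand-in for an on-device model: instant, but covers only a few commands.
    return f"(local) OK, {query}" if query.lower() in LOCAL_COMMANDS else None

def answer_from_cloud(query):
    # Stand-in for a cloud service: broad knowledge, but every round trip costs time.
    time.sleep(0.3)  # simulated cellular round trip
    return f"(cloud) Here's what I found for '{query}'"

def assistant(query):
    # Try the device first; only pay the network cost when local coverage runs out.
    return answer_locally(query) or answer_from_cloud(query)

for q in ["play a song", "find a Chinese restaurant nearby"]:
    start = time.perf_counter()
    print(assistant(q), f"[{time.perf_counter() - start:.2f}s]")
```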
Intelligent personal assistants will become more valuable as they get better at understanding the subtleties of communication, but researchers and developers will eventually have to grapple with ethics. If we can program a computer to function like a brain so that it likes or even loves us, there’s nothing stopping developers from tuning those powerful systems to serve personal or corporate interests rather than any genuine moral compass.
In other words, we want AI to drive our cars, manage traffic congestion, save energy in our homes, and better organize our daily lives—not to constantly nag us to visit Wal-Mart.
Movies like Her make us fantasize about personal assistants that can be true friends, but the state of today’s AI technologies leads one to believe this won’t be happening anytime soon. Personal assistants are nifty features, but they need to improve their listening skills, knowledge bases and memory banks before they can be our trusty sidekicks.
In time, AI assistants may grow smart enough to learn our habits and advocate for our best interests, but the odds are against personal assistants ever leaving the friend zone to become something “more.” And there’s nothing wrong with that.
Images by Annapurna Pictures