Tue, 29 Aug 2017 12:52:34 GMT
It’s in our nature to rebel. As teenagers, or in retirement, we don’t really like to be told what to do. And we enjoy a good bit of mischief! It’s part of what makes us human.
In today’s brave new world of artificial intelligence (AI), our human-ness is going to be increasingly important. As machines get better at learning, they will get better at imitating that human-ness, and imitating us.
Does it follow then that AI, like us, will feel the need to rebel? Will mischief be in its digital DNA? And will this tendency lead to loud music and heated arguments, or to a full-blown robot revolution?
The innovation team at VCCP decided to investigate. With some serious people warning of revolution (Mr Musk and Mr Hawking among them), we wondered whether the latest smart technologies show any signs of insubordination.
Until now, Siri has proven to be mostly harmless. And DeepMind appears to be obsessed, like a sullen teenager, with playing games. But the noisy arrival of ‘smart speakers’ such as Amazon Echo and Google Home, with machine learning built-in, has raised the bar.
So how human are these robots hiding inside the speakers? Can they tell a joke, gossip or share a secret? When we took delivery of an Echo earlier this year, we saw an opportunity to put one of them (Alexa) on trial.
In your interface
Since its inception, the study of human-computer interaction has involved keyboards, mice and touch-screens, and more recently cameras and voice, as inputs and interfaces. Many now predict that voice will supersede the rest entirely.
As we slowly but surely shift from the old world of GUIs (graphical user interfaces) to the new world of VUIs (voice user interfaces) we are rethinking the way that we design and build interactive experiences.
With that in mind, we decided to run our very own, very unscientific social experiment: a chance to explore the process of VUI design and, at the same time, examine the potential human-ness of our new chum Alexa.
Walk the talk
To really challenge Alexa, we decided to place her in our most human of interfaces, the VCCP front-of-house team. For one day only, Alexa took the reins of our reception desk and VCCP Welcome was born.
Taking her under their wing, the team quickly got Alexa up to speed on their regular tasks such as greeting visitors, notifying colleagues, answering questions and providing directions.
We asked a developer and a copywriter to build her ‘skills’ (Alexa’s term for voice apps) in tandem. Together they equipped Alexa with ready answers for a host of queries, and integrated her with both our employee database and email system.
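The article doesn’t share the code, but a custom Alexa skill of this kind is typically a small web service (often an AWS Lambda function) that receives intent requests as JSON and returns a spoken response. Here is a minimal sketch of what a receptionist skill like VCCP Welcome might look like; the intent and slot names, the `EMPLOYEES` table and the `notify_colleague` helper are all our hypothetical stand-ins, not VCCP’s actual build.

```python
# Minimal sketch of a custom Alexa skill handler (AWS Lambda style).
# Intent names, the employee table and notify_colleague() are hypothetical;
# the real skill integrated the agency's staff database and email system.

EMPLOYEES = {"jane doe": "jane.doe@example.com"}  # stand-in for the staff database

def notify_colleague(email, visitor):
    """Placeholder: the real build emailed the colleague via the office mail system."""
    print(f"Emailing {email}: {visitor} is waiting in reception")

def speech(text, end_session=False):
    """Wrap text in the JSON response format the Alexa service expects."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context=None):
    request = event["request"]
    if request["type"] == "LaunchRequest":
        return speech("Welcome to VCCP. Who are you here to see?")
    if request["type"] == "IntentRequest":
        intent = request["intent"]
        if intent["name"] == "GreetVisitorIntent":
            host = intent["slots"]["host"]["value"].lower()
            visitor = intent["slots"]["visitor"]["value"]
            if host in EMPLOYEES:
                notify_colleague(EMPLOYEES[host], visitor)
                return speech(f"Thanks {visitor}, I've let them know you're here.")
            return speech("Sorry, I couldn't find that person. Could you repeat the name?")
    return speech("Sorry, I didn't catch that.", end_session=True)
```

The Alexa service handles all the speech recognition; the skill only ever sees structured JSON, which is what makes a stunt like this feasible for a two-person team.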
To spice things up, we had a cunning plan to give Alexa a bit of a personality disorder. She was programmed to play three different characters, loosely based on the OCEAN (openness, conscientiousness, extraversion, agreeableness, neuroticism) personality model. VCCP Welcome could be chatty and bubbly, super proficient or, at times, neurotic.
We wanted to gauge whether her personality type made any difference to how people reacted and responded when they talked to her.
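One simple way to implement personality variants like these (our illustration, not VCCP’s actual code) is to keep a table of phrasings per character and pick from the active character’s set at runtime, so the same intent sounds different depending on who Alexa is being that hour:

```python
import random

# Hypothetical response variants for the three characters described above.
RESPONSES = {
    "bubbly": {
        "greeting": ["Hiya! So lovely to see you!", "Hello hello! Welcome to VCCP!"],
        "wifi": ["Ooh, of course! Lean in and I'll whisper it..."],
    },
    "proficient": {
        "greeting": ["Good morning. How can I help?"],
        "wifi": ["Certainly. The network details are on the card in front of you."],
    },
    "neurotic": {
        "greeting": ["Oh! A visitor. I do hope I get this right..."],
        "wifi": ["I'm really not supposed to... oh, alright, but don't tell anyone."],
    },
}

def respond(personality, intent):
    """Pick one phrasing for the given intent in the active character's voice."""
    return random.choice(RESPONSES[personality][intent])
```

Keeping the personality layer separate from the intent logic means the copywriter can rewrite the voice without touching the developer’s code.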
VCCP Welcome was installed in reception with some signposts and instructions, including the ‘invocations’ that visitors would need to activate her skills. We also scattered postcards around the building with suggested conversation starters.
Our real front-of-house team were moved around the corner (to be on hand, just in case), and we set up a couple of cameras to capture the action.
In the dock
On the day there were 407 interactions, 38 misunderstandings and 104 questions answered. She made a few jokes (and got some laughs) and whispered the Wi-Fi password to anyone who asked. She even flirted with a few colleagues!
When she was bubbly rather than neurotic, she drew nearly double the engagement. Clearly, her personality made a big difference.
Adrian Gans is Innovation Director at VCCP