Alysha Naples, life coach and Chief Experience Officer of Tin Drum, has a word of caution for designers dabbling in virtual reality and augmented reality.
A deep sense of love and belonging is an irreducible need of all women, men, and children. We are biologically, cognitively, physically, and spiritually wired to love, to be loved, and to belong. – Dr. Brené Brown
As newer and innovative forms of technology seep into design, it is easy to get carried away by the magic of what we create and detach ourselves from the human side of it.
Alysha Naples, previously Senior Director of User Interaction and Experience at Magic Leap Inc., spoke to a large gathering of designers at the recent DesignUp conference in Bengaluru on how it’s important to go beyond the screen.
We have to slow down, observe, and think critically. We have to anticipate the unintended consequences.
In the US, designers are often required to appraise the work of other designers. This helps them learn the importance of teamwork and collaboration while understanding the impact of the industry’s work.
Alysha recalled how it took Twitter users less than 24 hours to corrupt Microsoft’s AI chatbot Tay in 2016. Tay was created as an experiment in "conversational understanding." The more one chatted with Tay, the smarter it got, learning to engage people through "casual and playful conversation."
Unfortunately, the conversations did not stay playful for long. Within a day of Tay’s launch, it started tweeting using misogynistic, racist, and abusive language and the bot was withdrawn.
Another example is when Facebook quietly fired a large part of its editorial staff in 2016 and replaced them with an algorithm.
The algorithm could trawl through vast amounts of data, measure engagement, and highlight widely read news stories, but unfortunately it could not distinguish facts from fake news.
As a result, several articles villainising Hillary Clinton got past it, and experts have cited this as one of the factors contributing to Donald Trump’s victory.
Data and algorithms cannot replace facts and ethics. Empathy is a conscious choice.
Design has to be created with consideration not only for use, but also for abuse. Innovation is not just about technological breakthroughs; it is also about the unintended consequences that impact humans. Games, too, have a captivating emotional dimension that goes beyond mere engagement.
Sharing the examples of WALL-E and Hyper-Reality by Keiichi Matsuda (a short film set in a high-tech dystopian near future), she spoke about how both films realistically depict a high-tech future that is far from rosy.
Alysha also spoke about the game Quiver. She recalled an article by a woman who was groped by male co-players while playing the virtual reality game. The woman explained that though she was in the safety of her own house, the virtual groping left her feeling violated and extremely upset. The creators of Quiver immediately issued an apology and added a new feature, a force field, that allows a player to shake off any co-player trying to touch them. Alysha said, “When you make a mistake: learn it, fix it, and share it.”
Ending on a positive note, Alysha spoke of the affirmative experience she had while playing a game called Journey, built to foster friendship. The game does not disclose the age, race, sex, or nationality of the players, thereby leaving no scope for bias, hatred, or abuse. Her own session with a nameless co-player ended with the other player drawing a heart on the ground around her avatar to thank her for a great game. This is the kind of unblemished, happy experience that designers should aim for.