Immersive ethics tool helps developers avoid Internet of Things dystopia

Credit: Insight Publishers

Google's infamous "Don't be evil" motto laid out the core values of its founders in the simplest terms (whether or not you think they have stuck to their promise is another conversation). But is being tech-ethical as simple as just stating your values, or does it require further introspection throughout the process of development?

The imminent rollout of the 5G network has been described as the herald of the fourth industrial revolution, bringing unprecedented connectivity between devices which will transform the way we live. But as more devices in our homes become connected and join the growing Internet of Things (IoT), developers need to start asking themselves more questions about the ethical implications of their creations.

It's become a familiar story lately. A popular "smart" device is found to be embarrassingly insecure or easily hackable, leading to a leak of highly sensitive data. IoT developers are learning fast that security cannot be overlooked, and privacy issues regarding the protection of user identity are equally important. When we buy products like Amazon's Alexa, which are intimately involved in our daily lives, seeing and hearing everything we do, we need to know that the personal data they collect about us is not being misused.

The VIRTeu project, coordinated by Irina Shklovski of the IT University of Copenhagen, is creating tools and activities that help IoT developers bring ethics into their own conversations. Their latest creation, Bear & Co, is an immersive experience that plunges participants into the world of a fictitious IoT start-up. Inspired by the real-life CloudPets, a "smart" teddy bear company whose product famously recorded and stored millions of easily hacked conversations between parents and children online, Bear & Co invites participants to become an "employee" of the company and see how seemingly innocuous decisions can lead to ethical difficulties.

Participants are first asked to state their values: what they will bring to the company and care most about. Then, their values are tested through different scenarios and problems. After they finish, their decisions are compared with their initial set of values, often revealing a misalignment between the two.

"Bear & Co is designed to make people think more deeply about the decisions that have to be made when developing an IoT product," says project coordinator Irina Shklovski. "On the surface, it is easy to think of these decisions as purely technical, but often there are these underlying repercussions which may conflict with the values of the developers.

"We've run events where we've asked the founders of start-ups to think about a world in which their product is in every household. The vast majority of the time, they end up with some wildly dystopian future because they've never considered what would happen at this scale. There are no easy answers to these issues, but it's important for people to think about them properly, even though it can be quite uncomfortable for them."

Bear & Co was created by the VIRTeu partner CIID Research in collaboration with Irina Shklovski from the IT University of Copenhagen and Javier Ruiz from the Open Rights Group. The CIID Research design team is Annelie Berner, Monika Seyfried, Calle Nordenskjöld and Peter Kuhberg (indsigt design). CIID Research is a future-facing research group working at the intersections of interaction design, art, science and technology, from within the Copenhagen Institute of Interaction Design.

Provided by CORDIS

Citation: Immersive ethics tool helps developers avoid Internet of Things dystopia (2019, May 28) retrieved 24 April 2024 from https://phys.org/news/2019-05-immersive-ethics-tool-internet-dystopia.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
