Future of Work 2020: Researching for the ‘Next Billion Users’ will require a code of ethics

At the third edition of YourStory's Future of Work conference, Padmini Ray Murray of Design Beku shed light on the need for a code of ethics to help regulate design research and ensure that users are not 'harmed'.


Sunday March 08, 2020, 6 min Read

“The next billion users are under the microscope in a way that they never have been before,” said Padmini Ray Murray, while speaking at the third edition of YourStory’s flagship event Future of Work 2020 in Bengaluru.


Padmini is the co-founder of Design Beku, a collective that co-designs responses to complex socio-technical challenges as a community. The collective explores how technology and design can be decolonial, local, and ethical.


Padmini Ray Murray of Design Beku at YourStory's Future of Work 2020.

During her 20-minute talk at the product-tech-design conference, Padmini shed light on the need for a code of ethics within the industry to help regulate design research and ensure that users are not harmed.


“When we talk about design, we are talking about technology. The two are inseparable. And obviously, the ethical conundrums as a consequence are also now inextricably intertwined,” she said. 


The problem, she said, arose at the very beginning: the kind of focus that is given to design research is not accorded to design research ethics.



Why design ethics and why now?

Speaking to a room full of product designers, engineers, and people from the industry, she talked about the ethical conundrums that arise from various user research and design research projects. For example, she spoke about the ethical dilemma that arises out of the micro-project ‘Faces’ available on journalist P. Sainath’s People’s Archive of Rural India.


The project is a digital archive of people from each district in the country. Unknown to them, these photographs can be used by facial recognition software that is increasingly being used for surveillance around the world.


These are the kinds of consequences and questions that designers need to be aware of when collecting user data. The intersection of technology and design makes it difficult to continue with old ways of conducting research.


“Designers have to keep in mind how their research and what they put out can harm their users.”


At a time when everybody is working with the idea of the ‘Next Billion Users’ and designing products that can essentially transform lives, it is critical that ethical criteria be in place. 


Padmini believes that one way in which the conversation can be changed and moved towards a better understanding of ethical needs is to understand current best practices in academic research and see how they can be modified to suit commercial user research.



What do we need?

She said the need of the hour was a code or framework of ethics that the design community and tech community could be held accountable to. 


“We want to be able to come up with a working pattern or language of ethical codes of researching with, and for the next billion users. When we say with, we mean conversations about what is happening, what is the need, etc. with the end user.”  


She outlined three parameters that would play a key role in designing ethics for the industry:

Informed and granular consent

“This is a huge problem not only for the NBU, it’s for anybody who uses technology all over the world,” she said.


With sweeping, generalised consent being the norm for letting users onto a platform, user data is being mishandled and exploited. Take, for example, Cambridge Analytica and filter bubbles, she said; the lack of informed and granular consent lay behind both.
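
To make "granular" concrete, here is a minimal sketch in Python of what a per-purpose consent record could look like; the purpose names, class, and fields are hypothetical illustrations, not anything proposed in the talk:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical purposes a user can accept or refuse individually,
# instead of through one blanket "I agree" checkbox.
PURPOSES = ("account_creation", "analytics", "ad_targeting", "research")

@dataclass
class ConsentRecord:
    user_id: str
    grants: dict = field(default_factory=dict)  # purpose -> timestamp granted

    def grant(self, purpose: str) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.grants[purpose] = datetime.now()

    def is_allowed(self, purpose: str) -> bool:
        return purpose in self.grants

record = ConsentRecord(user_id="u-123")
record.grant("account_creation")               # the user opted in to this alone
assert not record.is_allowed("ad_targeting")   # everything else stays off by default
```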

Reciprocity and transparency 

When researchers are interacting with users and creating products with the users, there has to be constant communication between the designers and the community that they are trying to serve, she said. 


“We are radically different from the community that we are seeing as our future users, and it actually makes business sense to be constantly in touch with them and to understand how the end user is thinking,” she said.


In terms of transparency, Padmini pointed out the lack of such practices in current industry standards. None of the user licences and terms and conditions that we come across on a daily basis are written in simple language, multilingual, or illustrated. Such practices could help users understand what they are actually signing up for.


“I’m not talking about somebody in a village, I am talking about my mother, or somebody who is just not tuned in to tech. Everybody is just signing away everything without actually understanding any of it.”


This is where designers can intervene and provide ways to help make these practices transparent for everyone. 



Privacy and data protection 

People usually think that they don’t need privacy because they are not doing anything wrong. However, Padmini reiterated that privacy is not about hiding wrongdoing; it is about having the freedom to live your life the way you wish to.


There is information to share and information not to share. Companies have to know the difference between collecting high-resolution data and low-resolution data. 


“High resolution means that they’re collecting everything. And low resolution means that data collected is impressionistic. We can still get a very good sense of what we need to know, but we don’t need to know about everything the user is doing,” she said.
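
As a rough illustration of that distinction, this hypothetical Python snippet contrasts raw, per-user event logs (high resolution) with an aggregate count that drops identities and timestamps (low resolution) yet still answers the design question; the event fields are invented for the example:

```python
from collections import Counter

# High resolution: every raw event, tied to an identifiable user.
raw_events = [
    {"user": "u-123", "screen": "home",   "ts": "2020-03-08T10:00:01"},
    {"user": "u-123", "screen": "search", "ts": "2020-03-08T10:00:09"},
    {"user": "u-456", "screen": "home",   "ts": "2020-03-08T10:00:12"},
]

# Low resolution: discard who and when, keep only aggregate counts --
# an "impressionistic" view that still tells us which screens get used.
screen_counts = Counter(event["screen"] for event in raw_events)
print(screen_counts)  # Counter({'home': 2, 'search': 1})
```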


With privacy and consent, one big problem the industry seems to have missed is the mental models of users. Every user has a different mental model; not everybody will understand consent the way the company frames it. So, the need to clearly define consent also becomes important.


Lastly, she brought attention to the Personal Data Protection Bill, 2019, which has been tabled in Parliament and is being reviewed by a Joint Parliamentary Committee. If passed, the bill will have a profound impact not only on businesses but also on individual users.


Padmini examined the ‘consent manager’ clause that the bill seeks to introduce. The consent manager would be a portal between businesses and the user, where the user can see at each step where consent has been taken. If they find that consent has been violated, they can withdraw it immediately.
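
A toy sketch of that idea might keep a ledger the user can audit and withdraw from at any point; the class and method names below are hypothetical and are not drawn from the bill's text:

```python
class ConsentManager:
    """Toy ledger sitting between businesses and a user: the user can
    review every point where consent was taken and withdraw any of them."""

    def __init__(self):
        self._ledger = []  # one entry per consent event

    def record(self, business: str, purpose: str) -> int:
        """A business registers that it took consent; returns an entry id."""
        self._ledger.append({"business": business, "purpose": purpose, "active": True})
        return len(self._ledger) - 1

    def audit(self) -> list:
        """The user sees, at each step, where consent has been taken."""
        return list(self._ledger)

    def withdraw(self, entry_id: int) -> None:
        """On spotting a violation, the user revokes consent immediately."""
        self._ledger[entry_id]["active"] = False

cm = ConsentManager()
entry = cm.record("ExampleApp", "location tracking")
print(cm.audit())    # the user reviews every consent taken so far
cm.withdraw(entry)   # and revokes one on the spot
```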


The bigger challenges, she said, were consent fatigue and mental models of consent. 


(Edited by Javed Gaihlot)


A big shout out to our Future of Work 2020 Sponsors: Alibaba Cloud, Larksuite, Vodafone Idea Limited, Gojek, Adobe, Udaan, Pocket Aces, Junglee Games, Sharechat, Open, VestaSpace Technology, Maharashtra State Innovation Society, Kristal.AI and GetToWork; and our Knowledge Partner: Ascend Harvard Business Review.
