
How this Class 11 student aims to tackle gender bias in textbooks using AI and ML

Ananya Gupta, a self-taught AI and ML student, has built Grit Parity, an online toolkit that lets anyone upload a textbook and find out whether its representation of gender is neutral.


Thursday May 21, 2020 , 7 min Read

As the co-founder of the GirlUP chapter (a global UN initiative to promote gender equality and uplift women) at The International School, Bangalore (TISB), Ananya Gupta has been visiting a number of schools for lower-income groups to encourage young girls to take up STEM.


The Class 11 student is also a machine learning (ML) enthusiast, exploring ways in which ML and artificial intelligence (AI) can be used for social good.


In the course of her constant interactions with students from these schools, Ananya discovered the intrinsic gender biases that played on young minds.


“To understand the biases and why they existed I conducted surveys in a number of schools for lower-income groups, and government schools in Bengaluru, to find out what young students feel about them. The survey threw up a lot of interesting results, mainly centred on how children view gender roles - who is a business person, a nurse, an assistant and so on and why are they assigned to a specific gender,” says Ananya.

The results also revealed the inherent discrimination and prejudice that exists in society.


During the course of the survey, she also found that gender biases existed in textbooks, a startling discovery that suggested our education system needed a thorough overhaul.


Ananya Gupta

Questions the books don't answer

Ananya believes young students are at an impressionable age, and the biases they form about gender roles early on can affect their life choices and what they want to be in the future.


She raises some pertinent questions. "It’s important to ensure boys and girls see themselves in myriad roles - it shouldn’t always be that Ramu is the shopkeeper. Girls should not see themselves in the future confined to cooking or cleaning chores, as is often depicted in textbooks. Why are almost all the stories about a “him”? Why is mum always cooking and dad going to work? This issue works the other way too - boys don’t get to see themselves in different roles either. Why is the nurse always female? Why are no boys shown in the kitchen? Why can’t I be a stay-at-home parent?” 

Discovering biases and finding solutions

This discovery inspired Ananya to build a machine learning-driven product that automatically quantifies gender representation in textbooks. All a user has to do is upload a textbook to Grit Parity, the online toolkit Ananya developed, to find out whether its representation of gender is neutral.


“By having statistics outputted from this system that are hard, undeniable facts, we stand in a credible position to say we need change,” she adds. 


While the system is driven by machine learning, Ananya also used deep learning and image classification in conjunction with coreference resolution, a natural language processing technique, to build it.


The process is simple. When a textbook is uploaded in PDF format, the toolkit analyses it in three ways: it compares the number of stories with primarily male and primarily female representation, extracts the careers in which each gender is cast, and compares the number of images of men and women used throughout the book.
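The article does not publish Grit Parity's code, but the first of these steps can be illustrated with a much-simplified sketch: where the real toolkit uses coreference resolution to attribute stories to characters, the hypothetical function below just counts explicitly gendered words in a story's text and reports the resulting representation split.

```python
import re
from collections import Counter

# Simplified, hypothetical stand-in for the toolkit's text-analysis step.
# A real system (as described in the article) would use coreference
# resolution; this sketch only tallies explicitly gendered words.
MALE_TERMS = {"he", "him", "his", "boy", "boys", "man", "men", "father", "dad"}
FEMALE_TERMS = {"she", "her", "hers", "girl", "girls", "woman", "women", "mother", "mum"}

def representation_counts(text: str) -> dict:
    """Count gendered words in `text` and return counts and percentages."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    male = sum(counts[w] for w in MALE_TERMS)
    female = sum(counts[w] for w in FEMALE_TERMS)
    total = male + female
    return {
        "male": male,
        "female": female,
        "male_pct": round(100 * male / total, 1) if total else 0.0,
        "female_pct": round(100 * female / total, 1) if total else 0.0,
    }

story = "Ramu is a shopkeeper. He opens his shop early while his mother cooks."
print(representation_counts(story))
# → {'male': 3, 'female': 1, 'male_pct': 75.0, 'female_pct': 25.0}
```

Aggregating such per-story counts across every chapter of a PDF would yield the kind of "hard, undeniable" book-level statistics Ananya describes.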


While the toolkit is entirely her idea, Ananya says she received a lot of help from various mentors in the industry.


As it is the first ML toolkit to detect and quantify the representation of men and women in school textbooks, Ananya had to build the training dataset for the image classification model (the images on which the model would be trained) from scratch, picking out and labelling each image one by one.


“Initially, the model was performing at an accuracy of only about 75 percent, which was not acceptable if it were to be used in the real world. I was constantly enhancing my dataset and testing different methods to boost its accuracy. This challenge was, in fact, a veiled opportunity to learn. I delved into the field of transfer learning, another area of machine learning, which allowed me to increase accuracy dramatically to about 99 percent,” says Ananya.
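The transfer-learning pattern she describes (reusing a model trained on a large dataset and fine-tuning only a small part of it on her hand-labelled images) can be sketched as follows. This is an illustrative PyTorch example, not her actual model: a tiny stand-in network plays the role of the pretrained backbone, its weights are frozen, and only a new classification head is trained.

```python
import torch
import torch.nn as nn

# Illustrative sketch of transfer learning, not Grit Parity's actual code.
# A small stand-in network plays the role of a large pretrained backbone.
pretrained_backbone = nn.Sequential(nn.Linear(64, 32), nn.ReLU())
for param in pretrained_backbone.parameters():
    param.requires_grad = False  # freeze: these weights are not updated

# New head trained on the small hand-labelled dataset (2 classes).
head = nn.Linear(32, 2)
model = nn.Sequential(pretrained_backbone, head)

# Only parameters that still require gradients go to the optimiser.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-3)

x = torch.randn(8, 64)          # a small batch of image features
y = torch.randint(0, 2, (8,))   # labels from the hand-built dataset
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()                 # gradients flow only into the head
optimizer.step()
print(len(trainable))  # → 2 (the head's weight and bias)
```

Because only the small head is learned from scratch, a modest hand-labelled dataset can go much further than it would when training a full model, which is how accuracy jumps of the kind she reports become possible.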

Textbooks need to change 

As a student, Ananya understood that what separated her from others was not an age gap, but a gap in opportunity and in the availability of choices.


“Many young girls like me are unable to see themselves as scientists, due to the lack of representation and the pervasive nature of discrimination in our society. Their intelligence and potential are no less, and yet they seem to not have the confidence to visualise themselves in different roles.” 


“There’s a fine line between girls receiving education, and quality education that liberalises and educates them to have a choice. They should be able to see themselves as an engineer or even a stay-at-home parent, but that choice and that thought should be theirs - it shouldn't be because of societal pressure. We can change the minds of a whole generation by tweaking the education they receive,” she says.

Looking deeper into the causes of the biases, Ananya found that for children, what they learn at school becomes the primary source of knowledge, inspiration, and role models. During the course of her research, she found that teachers rely on textbooks as a core means of teaching for 70-95 percent of classroom time.


“Thinking along the lines that it could potentially be the textbooks that are not gender neutral, I began conducting research through a quantitative analysis of the representation of males and females in the elementary Karnataka board textbooks to see if I get concrete evidence for this. I found that, on average, there was about 75 percent representation of males and a mere 25 percent of girls,” says Ananya.


Based on her results, and after delving into similar studies being conducted in countries like Australia, Singapore, Pakistan, Egypt, and even other regions within India, it became clear that young girls and women are heavily underrepresented in textbooks and curricula globally. 


While there is concrete proof of the need for change, there still isn’t much awareness of the issue, and no easy way for the general public to discern whether their textbooks are biased. Grit Parity helps bring this issue to the forefront.



How it helps

The issue of unequal representation affects almost every industry, from the lack of women in STEM or politics to the dearth of males in the nursing industry.  


Ananya points out that there are hardly any stories where the protagonist is female - girls aren’t seeing themselves at the forefront, only in passive or supporting roles. This has a negative psychological effect: discriminatory gender norms and practices conveyed in and through textbooks can lower girls’ engagement in the classroom and limit their expectations in education and in life, as a UNESCO Global Education Monitoring (GEM) report notes.


“Even issues faced by women such as the imposter syndrome could be avoided by ensuring they have a strong sense of belonging and self-confidence, imposed by boosting the representation of females in textbooks,” she says.

She hopes to use the toolkit to spread awareness about this issue, by prompting others to check the bias scores of their textbooks and advocate for change. 


“I plan to work with the Karnataka Government and the Ministry of Education to try and bring reform in the textbooks to reflect an equal representation of males and females. Then, maybe our future community would be able to achieve the level of gender equality we strive for,” Ananya says.