Can a chatbot help women who have experienced racism?

The chatbot provides options for women to choose how they want to handle experiences of racism. (Supplied: Maya Cares)

Ayan* was fresh out of high school in Perth when she went through a racist incident that stuck with her.

She applied for a job at a grocery store in a well-off area of the West Australian capital.

"I got a call saying: 'Look the manager really likes you, but we were wondering if you could remove your headscarf or headwrap?'"

Ayan told the interviewer her headscarf was a religious obligation. What he said next unnerved her.

"He said that, in this area, there are a lot of white people here who might feel offended by it."

Ayan was then told that there was a "mistake" with her application and that she did not get the job.

"I felt that the reason I didn't get the callback is because I refused to take my headscarf off," she said.

Because it was a phone call, she had no evidence of the interaction. Even if she had, she wouldn't have known what to do.

It's a situation hundreds of thousands of people regularly find themselves in, and Priyanka Ashraf, the director and founder of The Creative Cooperative, set out to find a solution.

Her solution is Maya Cares, a chatbot that's described as "your older sister who has the answers".

It's the first chatbot made for Aboriginal and Torres Strait Islander women and other women of colour, designed to help them identify whether they have experienced a racist incident.

"It became really clear that there are so many challenges when you experience racism," Ms Ashraf said.

"Do you understand that what just happened is racism and, if you don't put a name to what it is, how can you possibly do something about it?"

Ms Ashraf was inspired to create Maya Cares after she was told to "take her COVID virus back to where she came from" at a supermarket during the height of the pandemic.

She realised she didn't know what her options were if she wanted to act.

"It never even occurred to me that I should report it, and it was only much later when I spoke to a friend who asked me, 'Why didn't you report it?'" she said.

"I used to be a practising lawyer, with access to information and knowledge...

"If I didn't know how to report it, then clearly people who have less access to that information [face] even more barriers?"

Creating a database

Maya Cares is a chatbot and platform that seeks to help women report racism and seek culturally suitable counselling. (Supplied: Maya Cares)

Maya Cares is designed to replicate a casual conversation with a friend or loved one.

It asks questions about the incident, then offers options to report it, seek counselling, find resources or learn more.

According to Wendy Qi Zhang — the service designer who created Maya Cares — another crucial function of the chatbot is collecting data, which can be anonymous.

"There are big missing patches of data of what racism really looks like and that constantly feeds into this narrative that racism doesn't really happen," Ms Zhang said.

"What we're really hoping for is, like, a big surge of undeniable database and [an] evidence base to give to different policy-makers."

As more data was collected about different types of racism, their frequency and even location, Ms Ashraf said, the chatbot could provide more-specific advice.

"If I had experienced racism on a bus, compared to if I'm at a medical practice, the processes that govern how a complaint is made or is addressed are arguably two very different processes," Ms Zhang said.

"As we start getting this information, there is the power to become a lot more personalised and, therefore, active in the responses and interventions."

University of Melbourne senior lecturer in computing and information systems Suelette Dreyfus sees many positive outcomes for this type of technology.

"IT is not good or bad here. It's a powerful tool that humans can use for either," she said.

"AI has a history of embedding bias, so it's important that the way we use the technology is fit for purpose."

Offering culturally appropriate solutions

Racism can cause mental and physical health to deteriorate in people from culturally and linguistically diverse backgrounds.

There's an economic cost as well, with structural racism in Australia costing the country more than 3 per cent of its annual GDP, equating to $37 billion per annum.

Another function of Maya Cares is to offer referrals to culturally appropriate mental health professionals and resources.

"I was trying to educate the professional who is meant to be alleviating or supporting me on my mental health journey," Ms Ashraf said.

"To get someone up to speed on the cultural nuance, it's going to take them three hours, four hours, before we start to really have that common ground and shared understanding."

Dr Dreyfus agreed it was important that the chatbot be programmed to respond appropriately.

"[The] chatbot … obviously needs to include sensitivity to the state of mind of the victim, and a non-judgemental framework about how the event rolled out, as well as its impact when described."

She said it was important that the chatbot was designed by women with the right expertise.

"Incorporating that sort of knowledge in the creation of things like a support chatbot can make all the difference between the technology failing or succeeding in its purpose."

Ayan has been beta-testing Maya Cares in the lead-up to its launch and thinks it will help many young people who have doubts about identifying racism.

"It's giving the user the option to decide how they want to deal with it," she said.

"Some people might want to see a counsellor or psychologist, other [people] might want to talk to others about it and get some validation … and some want to take more action."

*Ayan is not her real name.
