Making trustworthy chatbots to support women plagued by violence and abuse
Gender-based violence (GBV) is a global pandemic that affects one in three women in their lifetime. Kwanele Bringing Women Justice, one of the winning startups at Mozilla's Responsible AI Challenge, wants to help survivors of abuse in South Africa by giving women and children an easy-to-use tool to report and successfully prosecute crimes involving gender-based violence.
Founder Leonora Tima understands first-hand the impact of gender-based violence and the importance of safe spaces and support. She also understands the risks associated with AI and its outsized impact on the people who use it. Leonora is passionate about the issue and is committed to creating a tool that gives a voice to people in need of help. We spoke to Leonora about her work, how she got started on Kwanele, and her vision for the future of AI.
From social work to AI
Having spent years in the nonprofit sector, and deeply exposed to the violence in South Africa, Leonora said a "career junction" was her gateway to AI. She touched on her family's experience with gender-based violence in 2020, when her husband's family lost a young woman to violence. She also reflected on the impact that raising daughters has had on her and her commitment to fighting for women's rights. Tech was never an area she thought she would find herself in, but she said, "Some of the words I use now, I'm like, did that just come out of my mouth!"
Starting Kwanele
Leonora always enjoyed tech and spent some time working for a startup, where she saw the power of technology, especially for low-income communities. She started researching what was available: how people in South Africa were reporting rapes, why they weren't, the issues with reporting violence, and the barriers to justice. "We did a lot of work in the community... but over and over we found that people were struggling to access information," she said.
Access and ethical AI
"What was most horrifying for me was hearing from high schoolers that Facebook was their main source of information on the legal system, and then we had people that would go on to post to Facebook and had defamation cases against them," Leonora said.
In most communities, violence-reporting apps can greatly enhance the support options available to survivors. However, without care, they can reinforce existing problems in how communities respond to violence. Leonora explained how Kwanele initially explored a Zendesk-style model, in which the team would populate much of the data themselves, but realized over time that it was extremely limited.
"A lot of the [large language models] are designed for English first-language speakers... whereas in South Africa you have 11 main languages and a huge amount of cultural nuances," she said. It was at this point that she began exploring ethical AI with the support of Mozilla. "Mozilla was the catalyst in my AI journey," she added.
Leonora knows it's important to start having conversations about ethical AI: "People aren't even aware of the problem; you can't start talking about a solution when you didn't even know people are stealing your data."
Security is extremely important when it comes to gender-based violence. "We take security very seriously, we do a lot of due diligence, we encrypt, and we check all of the information that comes through," said Leonora.
Chatbots and accessibility
There are so many people who need access to resources. Leonora recognizes that, done correctly, AI and chatbots could give people access to information and resources at a reasonably low cost. "The nonprofit sector hasn't grasped the enormity of the power of it, especially if leveraged correctly," she said.
"Nonprofits can have better access to funding and resources, and so can kids in communities that have limited access to resources. Racial bias is a huge problem, but with the speed at which AI is moving, hopefully things will change quickly for specific environments," Leonora said.
The future of Kwanele
AI and chatbots have gained a lot of traction and criticism over the past couple of years, but could the right use of this technology lead to change for good? Only time will tell, but for the time being, Leonora is betting on it.
"What I'd like is that we'll be able to guide other organizations in the nonprofit sector on how to use this technology to their advantage and teach them how to leverage it, and I hope we'll be able to leverage the bot to grow access to justice," Leonora said.