AI chatbot allows women battling workplace discrimination to determine what behaviour is inappropriate

Two women have developed an AI chatbot that supports and advises women if they feel they might be experiencing workplace discrimination.

In the workplace, things can sometimes get a bit blurry when it comes to knowing what's acceptable and what's not. But where do you turn if you're afraid to speak up directly?

Here's where a brand new AI chatbot makes it easier to ask the not-so-easy questions.

Two women have created 'SaferSpace', an AI chatbot designed to help individuals identify inappropriate behaviours in the workplace.

The chatbot - devised by PR expert Ruth Sparkes and business entrepreneur Sunita Gordon - originated from alarming statistics and personal stories about workplace harassment across the UK.

Ruth stated: 'We wanted to create something that not only gave victims a voice but also helped institutions create safer environments.'

Trained on UK legislation, the chatbot lets individuals ask personally curated questions anonymously, making it easier to raise issues without worrying about the consequences.

AI chatbot helps women with workplace discrimination / Vertigo3D/Giselleflissak/Getty Images

It gives people the confidence to speak up, especially when they're unsure whether what's happening is wrong or are worried about being judged.

Plus, it helps companies understand and tackle these issues better.

Sunita explained: 'It's not about just making a complaint; it's about providing organisations with the data and tools they need to make real change.

'By understanding the frequency and nature of issues, organisations can develop effective interventions. Lots of organisations have robust policies in place, but policies don't equal protection.'

When women feel uncomfortable - perhaps a boss keeps inviting them to dinner, or things start to feel less than professional - they can turn to the chatbot for advice on their personal situation.

The AI model can then give factual answers and support backed up by official UK legislation, such as the Equality Act 2010.

The chatbot's responses can help assess whether a particular - even unique - scenario qualifies as sexual harassment, and it encourages the individual to consider seeking further help.

Women can ask the chatbot personally curated questions and get a factual response / Vertigo3D/Giselleflissak/Getty Images

If in any doubt, the chatbot tries to level with the potential victim, offering advice like: 'It depends on the context and your comfort level. If you feel uncomfortable or threatened, it is best to talk to your boss about how you feel and ask them to clarify their intentions.'

On the chatbot's design, Ruth added, 'While developing SaferSpace, Sunita and I often reflected on our own experiences. It's a stark reminder that this isn't just a business endeavour for us; it's deeply personal.'

In a country where 58% of women have experienced sexual harassment at work, and 68% of students on campuses have faced harassment, the need for such a tool is urgent - and the developers recognised this.

The two women think that this tool could be beneficial especially considering the recent instances of sexual harassment and misconduct reported at McDonald’s, the Met police, and the NHS.

Ruth explained: 'We need to encourage wider reporting. Many victims lack information on how to report an incident, they fear that they won't be taken seriously, or find the whole process of reporting too intimidating.

'The most common reason for people not reporting is not knowing whether what they've experienced actually warrants a complaint.'

Featured Image Credit: Vertigo3D/Giselleflissak/Getty Images