AI chatbot allows women battling workplace discrimination to determine what behaviour is inappropriate


Published 16:26 16 Jan 2024 GMT


Two women have developed an AI chatbot that supports and advises women if they feel they might be experiencing workplace discrimination.

Rebekah Jordan

In the workplace, the line between what's acceptable and what's not can sometimes blur. But where do you turn if you're afraid to speak up directly?

Here's where a brand new AI chatbot makes it easier to ask the not-so-easy questions.

Two women have created 'SaferSpace', an AI chatbot designed to help individuals identify inappropriate behaviours in the workplace.

The chatbot - devised by PR expert Ruth Sparkes and business entrepreneur Sunita Gordon - originated from alarming statistics and personal stories about workplace harassment across the UK.


Ruth stated: 'We wanted to create something that not only gave victims a voice but also helped institutions create safer environments.'

Trained on UK legislation, the chatbot lets individuals ask personally curated questions anonymously - making it easier to raise issues without worrying about the consequences.

AI chatbot helps women with workplace discrimination / Vertigo3D/Giselleflissak/Getty Images

It gives people the confidence to speak up, especially when they're not sure whether what's happening is wrong, or when they're worried about being judged.


Plus, it helps companies understand and tackle these issues better.

Sunita explained: 'It's not about just making a complaint; it's about providing organisations with the data and tools they need to make real change.

'By understanding the frequency and nature of issues, organisations can develop effective interventions. Lots of organisations have robust policies in place, but policies don't equal protection.'

When women feel uncomfortable - perhaps their boss keeps inviting them to dinner, or things start to feel less than professional - they can turn to the chatbot for advice on their personal situation.


The AI model can then give factual answers and support backed by official UK legislation, such as the Equality Act 2010.

The chatbot's responses can help assess whether a particular - even unique - scenario qualifies as sexual harassment, and it encourages the individual to consider seeking further help.

Women can ask the chatbot personally curated questions and receive a factual response / Vertigo3D/Giselleflissak/Getty Images

When in doubt, the chatbot tries to level with the potential victim, offering advice like: 'It depends on the context and your comfort level. If you feel uncomfortable or threatened, it is best to talk to your boss about how you feel and ask them to clarify their intentions.'


On the chatbot's design, Ruth added, 'While developing SaferSpace, Sunita and I often reflected on our own experiences. It's a stark reminder that this isn't just a business endeavour for us; it's deeply personal.'

In a country where 58% of women have experienced sexual harassment at work, and 68% of students on campuses have faced harassment, the need for such a tool is urgent - and the developers recognised this.

The two women think that this tool could be especially beneficial considering the recent instances of sexual harassment and misconduct reported at McDonald's, the Met Police, and the NHS.

Ruth explained: 'We need to encourage wider reporting. Many victims lack information on how to report an incident, they fear that they won't be taken seriously, or find the whole process of reporting too intimidating.


'The most common reason for people not reporting is not knowing whether what they've experienced actually warrants a complaint.'

Featured Image Credit: Vertigo3D/Giselleflissak/Getty Images