## AI Tools in Social Work: Gender Bias Concerns

A recent study by the London School of Economics (LSE) reveals a concerning trend: AI tools used by more than half of England's councils may be introducing **gender bias** into care decisions. These tools, often employed to help social workers manage heavy workloads, are used to summarize case notes. However, the LSE research indicates that summaries generated with tools such as Google's "Gemma" are **downplaying women's health issues**.

### Key Findings

* The study suggests the AI tools are more likely to use language that minimizes the severity of women's physical and mental health concerns.
* Phrases like "disabled," "unable," and "complex" were used more frequently, potentially leading to a misrepresentation of women's needs.
* This bias could result in unfair or inadequate care decisions.

The research highlights the urgent need to address potential biases in AI systems used in social work and to ensure that these tools do not inadvertently disadvantage women.