Algorithm Assured Her Safety, But Her Husband Took Her Life

A domestic dispute in a small apartment near Madrid turned violent on January 11, 2022, when Lobna Hemid’s husband attacked her with a broken piece of a wooden shoe rack. The attack, witnessed by their four children, escalated as he hurled insults at her.

Following the altercation, Ms. Hemid reported the abuse to the police, who used an algorithm called VioGén to assess the risk of further violence. Despite the software indicating low risk, tragedy struck seven weeks later when her husband fatally stabbed her before taking his own life.

Spain’s reliance on VioGén to address gender violence has had mixed results, with cases of miscalculated risk leading to devastating outcomes. While the system has helped protect some women, others have fallen victim to repeated attacks due to inaccurate risk assessments.

The statistics reveal a concerning trend, with a significant number of women classified as low risk by VioGén experiencing subsequent harm. This highlights the algorithm’s shortcomings and the need for a more nuanced approach to addressing domestic violence.

Examining Risk Levels and Outcomes

[Chart: VioGén risk classifications (Extreme, High, Medium, Low, Negligible) and case outcomes. Source: Spanish General Council of the Judiciary. Note: Data from 2010 to 2022; data from 2016 to 2018 is unavailable. By Alice Fang]

Despite its limitations, Spanish law enforcement continues to rely on VioGén for assessing and managing cases of gender violence, highlighting the complex interplay between technology and human decision-making.

Original from www.nytimes.com
