Lina was killed at her home in the picturesque Spanish coastal town of Benalmádena on February 9, 2025. Her death has drawn significant attention to the efficacy of domestic violence risk assessment tools, particularly Spain's algorithmic system, VioGén.

In January, Lina made the courageous decision to approach the police after experiencing threats from her ex-partner. She reported an incident in which he had raised his hand as if to strike her. Her cousin, Daniel, recalls that Lina had endured several violent episodes that left her deeply frightened for her safety.

At the police station, Lina was interviewed and her case was entered into VioGén, a digital tool designed to evaluate the likelihood of a woman experiencing further violence from her partner. The system poses 35 questions regarding the nature and severity of the abuse, the aggressor's access to firearms, his mental health, and whether the woman has left, or is contemplating leaving, the relationship. Based on the responses, it classifies the threat level as 'negligible,' 'low,' 'medium,' 'high,' or 'extreme.' Lina's situation was assessed as 'medium' risk.
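
The interior ministry has not published VioGén's actual questions, weights, or decision rules, so any reconstruction is guesswork. Purely as an illustration of how a questionnaire-based classifier of this kind can work, here is a minimal Python sketch in which each yes/no answer carries a weight and the total score is mapped onto the five tiers named above; every indicator name, weight, and threshold is invented.

```python
# Illustrative sketch only: VioGen's real questions, weights, and
# thresholds are not public. This shows the general shape of a
# questionnaire-based risk classifier, nothing more.

# Hypothetical indicators and weights (invented for illustration)
WEIGHTS = {
    "escalating_violence": 3,
    "access_to_firearms": 4,
    "threats_to_kill": 4,
    "victim_leaving_relationship": 2,
    "aggressor_mental_health_issues": 2,
}

# Hypothetical score cutoffs for the five tiers VioGen reports
THRESHOLDS = [
    (12, "extreme"),
    (9, "high"),
    (5, "medium"),
    (2, "low"),
]

def classify(answers: dict) -> str:
    """Map yes/no questionnaire answers to a risk tier."""
    score = sum(w for key, w in WEIGHTS.items() if answers.get(key))
    for cutoff, tier in THRESHOLDS:
        if score >= cutoff:
            return tier
    return "negligible"

# Example: two positive indicators score 5, landing in 'medium'
print(classify({"escalating_violence": True, "victim_leaving_relationship": True}))
```

The real system is presumably more sophisticated; the point is only that a fixed scoring scheme can place a genuinely dangerous case in a middle tier if the indicators that happen to be answered carry low weights.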

Seeking to protect herself and her children, Lina applied for a restraining order at a specialized gender violence court in Malaga to prevent her ex-partner from contacting her or being in her home. Regrettably, her request was denied.

According to her cousin, "Lina wanted to change the locks at her home so that she could live peacefully with her children." Tragically, just three weeks after her visit to the police, Lina was found dead. Her ex-partner allegedly entered her apartment with his key, setting the home ablaze. While her children, mother, and even her ex-partner managed to escape, Lina did not survive. Her 11-year-old son bravely informed the authorities that it was his father who had killed his mother, leading to the arrest of her ex-partner. This harrowing incident has sparked urgent discussions about the effectiveness of VioGén and its capacity to ensure the safety of women in Spain.

The VioGén system's failure to accurately assess the threat against Lina raises critical questions. Because Lina was classified as 'medium' risk, the protocol dictated that a designated police officer follow up with her within 30 days. Tragically, she was murdered before any follow-up could occur. Had she been categorized as 'high' risk, the police would have been required to check in with her within a week. This raises the question: could a timely intervention have altered the tragic outcome for Lina?
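
The follow-up rules described here reduce to a mapping from risk tier to a maximum contact interval. A minimal sketch, using only the two windows reported in this article (30 days for 'medium,' seven for 'high') and a hypothetical report date; intervals for the other tiers are not stated, so they are left undefined rather than guessed:

```python
from datetime import date, timedelta

# Follow-up windows reported in the article; the other tiers'
# intervals are not stated, so they are deliberately omitted.
FOLLOW_UP_DAYS = {
    "high": 7,
    "medium": 30,
}

def next_follow_up(tier, reported_on):
    """Latest date by which police should check in, if a window is defined."""
    days = FOLLOW_UP_DAYS.get(tier)
    return reported_on + timedelta(days=days) if days is not None else None

# Hypothetical report date: a 'medium' classification schedules contact
# a month out, where 'high' would have required contact within a week.
print(next_follow_up("medium", date(2025, 1, 15)))  # 2025-02-14
print(next_follow_up("high", date(2025, 1, 15)))    # 2025-01-22
```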

Risk assessment tools similar to VioGén are utilized across North America and various European nations. In the UK, police forces often apply DARA (Domestic Abuse Risk Assessment), which functions as a checklist, while DASH (Domestic Abuse, Stalking, Harassment and Honour-based Violence Assessment) may be employed by police or social workers to evaluate the risk of further attacks. Spain, however, stands out for how deeply it has integrated algorithmic assessment into police operations. VioGén was developed collaboratively by Spanish police and academic experts, and its use is now widespread, excluding only the Basque Country and Catalonia, which operate their own systems while still cooperating with national protocols.

Ch Insp Isabel Espejo, who leads the National Police's family and women's unit in Malaga, speaks highly of VioGén, calling it a crucial resource in monitoring each victim's circumstances. Her unit handles an average of 10 gender violence reports daily, and VioGén typically identifies nine or ten women each month as being at 'extreme' risk of repeated victimization. Cases deemed 'extreme' necessitate substantial police resources, including round-the-clock protection until the situation improves. A significant 2014 study found that police officers accepted VioGén's risk evaluations 95% of the time, a statistic that underlines the system's perceived reliability.

However, critics argue that reliance on an algorithm to assess the safety of women may lead to a dangerous abdication of responsibility by the police. Ch Insp Espejo acknowledges that while the algorithm is usually reliable, Lina's assessment did not reflect the severity of her situation. "I'm not going to say VioGén doesn't fail; it does. But this wasn't the trigger that led to this woman's murder. The only guilty party is the person who killed Lina. Total security just doesn't exist," she stated emphatically. Yet being classified as 'medium' risk meant that Lina did not receive immediate police attention, and it raises the question of whether her VioGén assessment influenced the court's denial of her restraining order request.

Judge Maria del Carmen Gutiérrez, who presides over gender violence cases in Malaga, spoke to us about the criteria for issuing restraining orders. Although we were unable to meet the judge who denied Lina's request, Judge Gutiérrez explained that such orders require two key elements: clear evidence of a crime and a serious threat to the victim's safety. "VioGén is one element I use to assess that danger, but it's far from the only one," she clarified. Judges sometimes issue restraining orders even when VioGén indicates a low risk and, conversely, sometimes decline to do so in cases labeled as high risk.

Dr. Juan Jose Medina, a criminologist at the University of Seville, highlighted a concerning disparity in the way restraining orders are granted across different regions of Spain, describing a "postcode lottery" for women seeking protection. Despite the widespread use of VioGén, there is no systematic understanding of how the algorithm influences court decisions and police actions, as no comprehensive studies have examined the question. "How are police officers and other stakeholders using this tool, and how is it informing their decision-making? We don't have good answers," he noted.

Access to VioGén data has historically been limited by Spain's interior ministry, and no independent audits of the algorithm have been carried out to date. Gemma Galdon, founder of Eticas, an organization focused on the social and ethical implications of technology, argues that without proper audits it is impossible to know whether these systems are effectively protecting the women they are meant to serve. Numerous instances of algorithmic bias have been documented globally, including a notable 2016 analysis in the US which revealed that Black defendants were often misclassified as higher risk for recidivism, while white defendants were more frequently misidentified as low risk.

In 2018, the Spanish interior ministry declined a proposal from Eticas to conduct a confidential internal audit, prompting Galdon and her team to pursue an external audit of VioGén. They gathered data through interviews with survivors of domestic violence and from public information, including judicial records of women like Lina who were murdered by their partners or ex-partners. Their findings revealed that between 2003 and 2021, 71 women who had reported domestic abuse to the police were subsequently killed, many of whom had been categorized by VioGén as facing 'negligible' or 'medium' risk. "What we'd like to know is: were those error rates that cannot be mitigated in any way? Or could we have done something to improve how these systems assign risk and protect those women better?" Galdon said.

Lina's case is a stark reminder of the complexities of addressing domestic violence effectively, and of the difficult balance between algorithmic assessment and human judgment in keeping vulnerable women safe.