Tech Abuse & Domestic Violence: Safety & Defense
Tech Companies Address Their Role in Domestic Abuse Prevention
Updated June 19, 2025
As technology advances, so does technology-facilitated abuse (TFA). Recognizing the need for intervention, advocates like Cindy Southworth, founder of the National Network to End Domestic Violence’s Safety Net Project, have worked since 2000 to equip victims and hold abusers accountable. The project provides resources, including toolkits with guidance on creating strong passwords and security questions. “When you’re in a relationship with someone,” says Audace Garnett, Safety Net’s director, “they may know your mother’s maiden name.”
Southworth later advised tech companies on user protection. Joining Meta (formerly Facebook) in 2020 as head of women’s safety, she focused on intimate image abuse, noting Meta’s early “sextortion” policies from 2012. Her current work involves “reactive hashing,” which adds digital fingerprints to nonconsensual images so survivors can report them once and have them blocked across platforms.
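The fingerprint-and-block idea can be sketched in a few lines. This is a simplified illustration, not Meta's implementation: production systems (such as StopNCII's on-device matching) use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash below only matches byte-identical files. The `blocklist`, `report`, and `is_blocked` names are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's 'digital fingerprint'.

    SHA-256 is a simplified stand-in; real matching systems use
    perceptual hashes so altered copies of the image still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical shared blocklist: stores only fingerprints, never the images.
blocklist: set[str] = set()

def report(image_bytes: bytes) -> None:
    """A survivor reports an image once; only its fingerprint is retained."""
    blocklist.add(fingerprint(image_bytes))

def is_blocked(upload_bytes: bytes) -> bool:
    """Any participating platform checks new uploads against the blocklist."""
    return fingerprint(upload_bytes) in blocklist
```

The key privacy property is that platforms exchange fingerprints rather than the images themselves, so a single report can propagate without the content being re-shared.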
Meta has also addressed “cyberflashing” on Instagram by restricting unsolicited image, video, and voice note sharing between accounts. While Meta removes online threats that violate its bullying policies or promote offline violence, policy changes earlier this year have allowed users to refer to women as “household objects” and to post previously banned transphobic and homophobic comments, according to CNN.
The dual-use nature of technology presents a challenge. Tracking functions, dangerous in the hands of abusers, can also help victims monitor stalkers. Experts acknowledge the difficulty in mitigating TFA, citing the misuse of parental controls to monitor adults. Garnett suggests designing technology with safety in mind from the outset, but notes this is often too late for established products. Computer scientists point to Apple’s security measures as effective, but acknowledge that no measures are foolproof.
Over the past decade, major U.S. tech companies, including Google, Meta, Airbnb, Apple, and Amazon, have formed safety advisory boards. Uber’s board members provide feedback on potential blind spots, influencing the progress of customizable safety tools, according to Liz Dank, who leads work on women’s and personal safety at the company. This collaboration led to Uber’s PIN verification feature, ensuring riders enter the correct vehicle.
Apple offers a 140-page “Personal Safety User Guide” for those considering leaving unsafe relationships, with detailed steps for blocking contacts, collecting evidence, and responding to unwanted tracking alerts.
Still, abusers circumvent these precautions. Elizabeth, who asked that only her first name be used, discovered an AirTag her ex had hidden in her car’s wheel well. After receiving numerous reports of unwanted tracking, Apple introduced a security measure that lets alerted users locate the device by playing a sound. “That’s why he’d wrapped it in duct tape,” Elizabeth said, “to muffle the sound.”
Laws Play Catch-Up
Lisa Fontes, a psychologist and expert on coercive control, notes inconsistent law enforcement responses. “I’ve seen police say to a victim, ‘You shouldn’t have given him the picture,’” she says of nonconsensual image sharing. She also cites instances where police dismissed hidden “nanny cam” cases for lack of proof.
What’s Next
The ongoing challenge lies in proactively addressing technology-facilitated abuse, balancing innovation with user safety, and ensuring consistent legal responses to protect victims.
