Navigating AI in Victim Services: A Response to NNEDV's Vital Guidance
- Parasol Editorial Team
- Oct 25
- 4 min read
The Parasol Cooperative applauds the National Network to End Domestic Violence (NNEDV) for its recently released guide, Artificial Intelligence and Victim Services: A Comprehensive Guide for Advocates. This vital analysis is a timely contribution to the field, and its focus on safeguarding survivors reflects a shared objective central to our own mission.
NNEDV’s guidance clearly outlines the significant risks of using general-purpose AI in victim services. The risks of inadequate data privacy, inappropriate responses, and the lack of a survivor-centered approach are precisely the challenges we sought to address in designing Ruth, our trauma-informed AI chatbot. Ruth was built on trauma response principles to help individuals identify and navigate crises, and to complement, not replace, human advocate care. Our response outlines how Ruth’s design philosophy and operational protocols directly reflect the priorities NNEDV shared.
NNEDV’s Guidance and Ruth’s Alignment
On Avoiding General-Purpose AI
• NNEDV Guidance: Advises against using general-purpose chatbots for survivor support.
• Ruth Difference: Ruth is not a general-purpose AI but a purpose-built tool developed specifically for crisis support. Its design is informed by the same principles used to train human advocates. Ruth's core function is to help people identify and navigate crises, connect them to local resources, and consistently recommend connection with a human advocate.
On Data Privacy and Security
• NNEDV Guidance: Highlights the critical need to protect sensitive data and prevent third-party access.
• Ruth Difference: Ruth is built on a secure Amazon Web Services (AWS) Bedrock infrastructure engineered to protect user privacy. Unlike general-purpose platforms such as ChatGPT, which retain and learn from user data, Ruth does not store user conversations or use them to train AI models.
In cases where Ruth is integrated into a victim service organization's system, conversations may be temporarily stored (for up to seven days) for quality assurance before being permanently deleted. This is in line with the protocol for reviewing hotline calls, and the reviews are conducted solely by professionals already bound by strict confidentiality obligations under laws like the Violence Against Women Act (VAWA) and the Family Violence Prevention and Services Act (FVPSA). The data collected for quality assurance aligns with the reporting and confidentiality standards that organizations already maintain for reports to funders like the Office for Victims of Crime (OVC). For full details, please see Ruth’s Privacy Policy and Terms of Use.
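The seven-day quality-assurance window described above can be sketched as a simple retention check. This is an illustrative sketch only, not Ruth's actual implementation; the record IDs and helper functions are hypothetical:

```python
from datetime import datetime, timedelta, timezone

# Quality-assurance window from the policy above: conversations stored in a
# partner organization's system may be kept at most this long.
RETENTION_DAYS = 7

def is_expired(stored_at: datetime, now: datetime) -> bool:
    """True once a stored conversation has passed the retention window
    and must be permanently deleted."""
    return now - stored_at >= timedelta(days=RETENTION_DAYS)

def purge(conversations: dict[str, datetime], now: datetime) -> dict[str, datetime]:
    """Keep only conversations still inside the retention window.

    `conversations` maps an opaque record ID (no user identifiers) to the
    UTC timestamp at which the record was stored.
    """
    return {rid: ts for rid, ts in conversations.items()
            if not is_expired(ts, now)}
```

In practice a managed store (for example, an object-store lifecycle rule that expires objects after seven days) would enforce the same policy without application code; the sketch only makes the rule explicit.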
On Legal Access and Data Retention
• NNEDV Guidance: Warns that AI conversations can be subpoenaed, citing a case where courts ordered an AI company to preserve user data, including chats users believed had been deleted.
• Ruth Difference: Ruth’s system is designed so that there is nothing to subpoena. While AI prompts can be subject to legal requests, that applies only when data is stored or linked to identifying information. Ruth prevents this by design: it requires no account login, collects no IP or device data, and runs on AWS Bedrock in zero-data-retention mode, meaning conversations are deleted immediately after use and are never shared with any other party or used to train the underlying AI model.
When Ruth is integrated into a service provider’s system, conversations may be temporarily stored for up to seven days for quality assurance, and then permanently deleted. These measures ensure that survivor conversations cannot be retrieved or linked back to an individual, even if a subpoena were issued. Because there are no stored messages, user identifiers, or data transfers to external systems, there is nothing for a court to compel or recover.
On Preserving Survivor Agency
• NNEDV Guidance: Cautions against AI making decisions for survivors.
• Ruth Difference: Ruth is designed to equip survivors with information, not to make decisions for them. It does not assess risk, determine eligibility, or direct a course of action. All agency remains with the individual. In this way, Ruth functions like a smart resource directory, always encouraging users to connect with a human advocate.
On Human-Centered, Survivor-Focused Approaches
• NNEDV Guidance: Stresses that technology must be trauma-informed and human-centered.
• Ruth Difference: This principle is the foundation of Ruth's design. Ruth was built by survivors and advocates, is guided by trauma-informed principles, and operates under safety-first protocols that general-purpose AI tools cannot replicate.
On Environmental Impact
• NNEDV Guidance: Raises essential questions about the environmental cost of AI.
• Ruth Difference: We agree that the environmental impact of technology deserves careful consideration. When evaluating this for essential safety services, however, we believe the context is critical.
• Negligible Individual Use: Current AI queries use an estimated 0.24-0.34 watt-hours of energy, roughly equivalent to running a standard microwave for one second. This is far less than the energy consumed by many everyday digital activities.
• High-Value Application: Unlike AI used for casual tasks, Ruth provides critical safety information. This focused utility can actually reduce a survivor's overall energy footprint by preventing the need for multiple online searches or unnecessary travel.
• Responsible Infrastructure: We are committed to mitigating our impact. Ruth is hosted on AWS, which has committed to powering its operations with 100% renewable energy by 2025 and reaching net-zero carbon emissions by 2040.
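The microwave comparison above can be checked with one line of arithmetic. Treating the cited per-query figures as watt-hours and assuming a typical 1,000-watt microwave (both the figures and the appliance wattage are published estimates and assumptions, not measurements of Ruth itself):

```python
# Published per-query energy estimates for current AI models (watt-hours);
# industry-wide figures, not measurements of Ruth.
QUERY_WH_LOW, QUERY_WH_HIGH = 0.24, 0.34

MICROWAVE_WATTS = 1000  # assumed power draw of a standard microwave

def microwave_seconds(energy_wh: float, watts: float = MICROWAVE_WATTS) -> float:
    """Seconds a microwave of the given wattage runs on `energy_wh` watt-hours."""
    return energy_wh * 3600 / watts

# One query lands in roughly the one-second range:
# microwave_seconds(0.24) -> 0.864 s, microwave_seconds(0.34) -> 1.224 s
```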
Conclusion
NNEDV's guidance provides a crucial framework for all stakeholders: a call to embrace technology's potential while steadfastly avoiding its pitfalls. Their analysis doesn't discourage innovation; it demands responsible innovation.
We are grateful for NNEDV's leadership in defining a clear set of risks and standards. The alignment between their guidance and Ruth’s design demonstrates what's possible when technology is built from the start to meet the exacting safety and ethical needs of survivors.
Working together with advocates and partners, we can ensure that technology truly serves survivors, upholding the highest standards of safety, ethics, and care.
You can read NNEDV's guidance here.
