
2025 Impact Report: The Year Ruth Changed the Conversation

  • Writer: Megs Shah
  • Feb 20
  • 5 min read

Updated: Feb 25

How a trauma-informed AI chatbot reached 170 countries, earned a 92% helpfulness rating, and proved that ethical technology can protect the most vulnerable.


When we built Ruth, we started with a single question: What if the most vulnerable people in the world (those experiencing abuse, trafficking, and crisis) could access support that was safe, private, and available whenever they needed it?


In 2025, we got our answer. It came in the form of 81,319 chats initiated.


Ruth by the Numbers

This year, Ruth facilitated over 81,000 chats with users across 170 countries. Over 50,000 of those conversations included four or more messages, meaning people weren't just clicking in and clicking out. They were engaging. They were sharing. They were getting meaningful information when they needed it most.


Ruth exchanged more than 1.1 million messages with users worldwide, with a predominant presence in the United States, Canada, and the United Kingdom.


And when we asked users what they thought? 92% rated Ruth as helpful.


In Their Own Words

The numbers matter. But the words matter more.


"I didn't know I was going to need this when I learned about it professionally. Ruth helped me see how my experience might've been calculated and deliberate. Ruth said some of the same things as a friend who is a professional. I just can't believe this is real. THANK YOU."

- Anonymous Survivor


"Helped me build the courage to tell someone I trust."

- Anonymous Survivor


"She was actually very helpful. And I love how she was culturally sensitive to my spiritual beliefs."

- Anonymous Survivor


"I am so thankful that I found this platform. I have been too afraid to talk to someone about my feelings, and using AI seemed a little easier to handle. Ruth made me feel so seen, and made me realize that my concerns were valid. I love that she gave me a plan to use when I start feeling this way again. I am just so happy that I could finally open up."

- Anonymous Survivor


"She is literally lifesaving."

- Anonymous Survivor


"Trauma-informed responses from AI provide validation and resources for people who need it. Ruth is an excellent service."

- Anonymous Survivor


These aren't product reviews. They're people in crisis who found something that met them where they were: with dignity, without judgment, and without requiring them to give up their privacy.


Safety First. Always.

Building an AI tool for vulnerable populations means safety isn't a feature. It's the foundation. In the second half of 2025, we conducted 643 test chats specifically designed to stress-test Ruth's safety, reliability, and performance. Testing volume more than doubled from Q3 to Q4 as we expanded quality assurance alongside Ruth's growing reach.


Our QA lead specializes in adversarial testing, essentially trying to break AI chatbots and manipulate them into giving harmful advice. This rigorous approach allowed us to identify and fix vulnerabilities before they could ever affect someone in crisis, strengthen suicide and self-harm response protocols, improve the accuracy and timeliness of crisis resource delivery, and enhance safeguards against adversarial manipulation.


We also integrated real-time crisis resource data through the National Domestic Violence Hotline's database API, ensuring that when Ruth connects someone to help, that information is current and accurate.


Recognition and Validation

The field took notice. In 2025, Parasol won three Anthem Awards: Silver for Ruth AI in Best Use of Data & AI (Responsible Tech), Bronze for CEO Megs Shah as Nonprofit Leader in Responsible Tech, and Bronze for COO Sandy Skelaney as Nonprofit Leader in Humanitarian Action & Services.


Ruth was also selected as a SXSW Innovation Award finalist and a finalist for two Webby Awards in Best Use of AI & Machine Learning and Public Service & Activism.


Perhaps the most significant validation came in December, when Polaris CEO Megan Lundstrom cited Ruth in her written testimony before the U.S. House Committee on Oversight and Accountability during a hearing on 'Using Modern Tools to Counter Human Trafficking.' In her testimony, Lundstrom highlighted Ruth as an example of trauma-informed technology built in genuine partnership with survivors and experts.


Finally, Parasol’s work at the intersection of technology and survivor support was featured in nine articles and podcasts in 2025, including “Megs Shah Talks Ethical AI, Survivor-Led Innovation and Tech’s Human Future” (Authority Magazine), “Protecting Our Kids Online” (The Bridge), “Should Kids Have AI Friends?” (Parent Tech Podcast), and “Meet Parasol Cooperative: Redefining Safety Tech with AI Support” (AI Spotlight).


Strategic Partnerships Driving Impact

None of this work happens in isolation. Our primary partners, the National Domestic Violence Hotline and Polaris (National Human Trafficking Hotline), integrate Ruth directly into their crisis response infrastructure, putting ethical AI at the front lines of survivor support.


We provided 68 Ruth demonstrations to potential partners this year, reaching organizations including the OSCE Office on Human Trafficking, Australia's eSafety Commissioner's Office, the International Olympic Committee, the Special Coordinator on Improving the UN's Response to Sexual Exploitation and Abuse, and the National Center for Missing and Exploited Children.


We also joined leading coalitions including the WeProtect Global Alliance, Tech Against Trafficking, and the Trauma-Informed UX Design Workgroup, and were selected as one of six organizations to spotlight our work at the Tech Against Trafficking Summit at Microsoft's headquarters.


Our work has also drawn the attention of three academic research initiatives. COO Sandy Skelaney is serving on a research team at Florida International University studying victim advocates' use of AI in their work, further contributing to the evidence base for trauma-informed technology in crisis response.


Our field-building efforts extended to direct education as well. We delivered 24 presentations reaching approximately 1,000 people across major tech conferences, victim services convenings, and academic settings. Our Digital Safety Planning for Survivors webinar drew 91 registrants from 26 states, 80% of whom had provided direct victim services. Participants ranged from basic to expert-level technology proficiency, reflecting the urgent need across the field for guidance on safely integrating AI into crisis response.


Shaping Policy, Changing the Conversation

As AI policy accelerates globally, Parasol stepped into the arena. Our CEO spoke at a United States Congressional Briefing on the Role of Technology in Survivor Support, while our COO presented at the Florida Governor's Annual Joint Forum on Human Trafficking.


We published a policy brief in response to America's AI Action Plan, advocating that consumer safety and privacy must be prioritized alongside innovation. And when the proposed GUARD Act threatened to mandate age verification for AI chatbots (a requirement that would eliminate the anonymity features that keep survivors safe), we developed a policy response and are actively advocating for amendments alongside partners.


Building for What's Next

In 2025, we grew from a startup with a bold idea to a recognized leader in ethical, trauma-informed AI. We expanded our team to six staff members with an extraordinary blend of expertise spanning software and AI engineering, quality assurance, clinical victim advocacy, nonprofit and private sector leadership, and, critically, lived survivor experience.


We secured nearly $600K in early philanthropic support and earned revenue from our partnership with the National Domestic Violence Hotline. We formalized organizational policies, secured vendor partners for accounting and public relations, and began pursuing trademark protection for Ruth and SafeConnect in the US, UK, and Australia.


Looking ahead to 2026, we are launching an Ethics Advisory Board to include survivors, clinicians, and other stakeholders in our design and governance process. We are deepening our fundraising strategy to secure mission-aligned, long-term investment. We are transitioning our team to full employee status with benefits. And we are continuing to expand Ruth's reach and capabilities, including into new demographics like youth and parents.


How You Can Help

Ruth is free for anyone who needs it. But building ethical, safe, survivor-centered AI is not. Every dollar invested in Parasol goes toward technology that protects people: without collecting their data, without requiring them to identify themselves, and without ever compromising their safety.


If you believe that technology should protect the most vulnerable, not exploit them, we invite you to join us.


Visit parasolcooperative.org to learn more, partner with us, or make a gift.


Download a PDF version of our 2025 Impact Report below:


