SonderMind

What if AI didn’t replace care, but redefined how we deliver it? Explore how SonderMind is engineering technology that scales empathy, accountability, and outcomes in behavioral health.

When people talk about artificial intelligence, they often describe it in terms of what it can replace: the tasks it can automate, the jobs it can perform, the efficiencies it can unlock. But in behavioral and mental health care, the most meaningful work that changes lives can’t be replaced. It’s the human connection between a provider and a patient.

At SonderMind, that truth sits at the center of every line of code we write. Our goal isn’t to create AI that replaces a therapist or psychiatry provider, but to build AI that understands and strengthens the therapeutic relationship—helping people feel better, faster.

This fall, we launched the SonderMind AI Suite, a set of clinically backed tools designed to empower providers and patients alike, reinforcing and extending care beyond the therapy session into the moments when it's needed most. These tools represent years of technical development and clinical partnership, grounded in privacy, ethics, and the real-world needs of both therapists and patients.

The Other 167 Hours

A therapy session lasts about an hour. The rest of the week—167 hours—is where the real work of healing happens. It’s in those moments that patients process insights, practice new skills, and face challenges that can either reinforce or derail progress.

Our AI tools are designed to meet patients in those moments. Through secure, always-on features like guided journaling, session recaps, brain training, and goal tracking, patients can stay connected to their therapeutic process even between sessions. The goal isn't to add more technology into their lives; it's to make the care they receive more continuous, personalized, and effective. These always-on tools also enable continuous measurement, which matters because in mental health it's difficult to capture progress through traditional point-in-time assessments alone, like the PHQ-9 or GAD-7.
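To make "traditional assessments" concrete: the PHQ-9 is a standard nine-item depression screener in which each item is rated 0 (not at all) to 3 (nearly every day), and the total maps to published severity bands. The sketch below scores it generically; it is an illustration of the instrument itself, not SonderMind's implementation.

```python
# Generic PHQ-9 scoring: nine items, each rated 0-3, summed and
# mapped to the published severity bands. Illustrative only.

SEVERITY_BANDS = [
    (0, 4, "minimal"),
    (5, 9, "mild"),
    (10, 14, "moderate"),
    (15, 19, "moderately severe"),
    (20, 27, "severe"),
]

def score_phq9(item_scores: list[int]) -> tuple[int, str]:
    """Sum the nine item scores and map the total to a severity band."""
    if len(item_scores) != 9 or any(s not in (0, 1, 2, 3) for s in item_scores):
        raise ValueError("PHQ-9 requires nine item scores, each 0-3")
    total = sum(item_scores)
    for low, high, label in SEVERITY_BANDS:
        if low <= total <= high:
            return total, label

print(score_phq9([1, 1, 2, 0, 1, 1, 0, 2, 1]))  # (9, 'mild')
```

A score like this is a single snapshot taken at assessment time, which is exactly the limitation continuous, between-session signals are meant to complement.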

Over time, we’ll be able to infer mental health states from the inputs and outputs of these digital touchpoints—journaling, recaps, notes, and transcripts—creating a richer, more dynamic understanding of each individual’s journey.

Building for Providers, with Providers

As an engineer, I’m always thinking about the system as a whole: not just what we can build, but how it fits into the real lives of the people who use it. For mental health professionals, one of the biggest challenges is administrative burden. Documentation, treatment plans, and progress notes are essential parts of care, but they often eat into the time and energy clinicians would rather spend focused on patients in session or taking on additional patients.

With the SonderMind AI Suite, we’re reducing that burden. Features like AI-generated notes, treatment plans, and patient takeaways help clinicians save as much as 7-8 hours each week and focus on connecting with patients (not taking notes), without compromising quality or compliance. Every AI-driven feature is co-developed with licensed clinicians, tested in real workflows, released only after rigorous clinical and privacy review, and continuously supervised.

We aspire for AI to become a true copilot to the provider. One example is our AI-Assisted Treatment Plans: more therapists than ever are now setting treatment plans with clients, and since launch, 40% of all treatment plans have been AI-generated with providers in the loop.

That last point is key: these tools are opt-in and provider-controlled. A therapist decides when and how to use them, and only with patient consent. AI serves the human, not the other way around.

Ethics & Accountability Are Design Decisions

Responsible AI isn't something you can retrofit after the code is written; it has to be designed in from the start. That's why every feature in our AI ecosystem is guided by our SonderMind AI Constitution.

This framework codifies how we design, deploy, and govern AI in our care ecosystem. It sets standards for clinician oversight, data privacy, model and AI agent testing, and accountability. It’s reviewed by our AI Governance Council, which includes clinicians, compliance experts, data scientists, and product leaders. We document the intended use and risks of every feature, conduct annual bias and safety reviews, and maintain full audit logs.

All of this is codified through evaluations, tests, and runtime guardrails developed in close collaboration with our legal, clinical, product, and user experience teams.

In healthcare, there is little room for error: clinicians need reliability, patients need safety, and regulators need transparency. That’s why we’ve built our infrastructure to meet HIPAA and ISO 27001 standards, with dual encryption and feature-specific deletion protocols. Audio data, for instance, is deleted after 30 days, and all patient-facing features require explicit opt-in.
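Policies like these are easiest to enforce when they live in code rather than in a runbook. The sketch below shows one way a feature-specific retention rule might be expressed; the names (`AudioRecord`, `AUDIO_RETENTION`, `should_delete`) are hypothetical, and the only facts taken from the text are the 30-day audio window and the explicit opt-in requirement.

```python
# Hypothetical sketch of a feature-specific retention check.
# Assumes the two rules stated above: audio is deleted after 30 days,
# and patient-facing features require explicit opt-in.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

AUDIO_RETENTION = timedelta(days=30)  # audio is purged after 30 days

@dataclass
class AudioRecord:
    record_id: str
    created_at: datetime
    patient_opted_in: bool  # explicit opt-in is required

def should_delete(record: AudioRecord, now: datetime) -> bool:
    """Purge a record once it passes retention, or if consent is absent."""
    expired = now - record.created_at >= AUDIO_RETENTION
    return expired or not record.patient_opted_in

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
old = AudioRecord("a1", now - timedelta(days=31), patient_opted_in=True)
fresh = AudioRecord("a2", now - timedelta(days=2), patient_opted_in=True)
print(should_delete(old, now), should_delete(fresh, now))  # True False
```

Encoding the rule this way makes it testable and auditable, which is the point of treating accountability as an engineering practice rather than an afterthought.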

But beyond compliance, we see accountability as part of our engineering culture. Our teams document decisions, test for unintended outcomes, and invite external review. It’s not enough to say that a model performs well. You have to understand why, and for whom, it performs well.

The Next Era of Care Delivery

The healthcare industry has long talked about “value-based care”—a system where outcomes, not volume, drive success. AI can finally make that real in mental health.

By connecting clinical outcomes with engagement data, our tools help providers and health systems measure what matters: whether people are actually getting better. Early data show that providers using our AI-enabled tools are submitting documentation 55 hours faster than industry benchmarks and are able to see up to 1.5 more patients per week. That’s more time for care, less time on paperwork, and measurable progress toward better outcomes.

For payors and health systems, this means accountable, coordinated care that drives measurable results. Patients in AI-supported care pathways experience 67% higher engagement and 3-4x better symptom reduction, improving outcomes and value-based performance. For patients, it means faster progress, more personalized support, and a stronger therapeutic alliance.

Human Connection at Scale

We’re living through a fascinating moment in AI development. Every week brings a new headline about what’s possible—but also new questions about responsible use. In that landscape, our north star remains clear: technology should enhance the humanity of behavioral and mental healthcare, not erode it.

When AI is used responsibly, it can make therapy more accessible. It can help providers focus on what matters most. It can bridge the gap between sessions and turn insights into sustained change.

At SonderMind, we're not building AI to replace the therapist—we're building AI to make the therapist's work more effective, more sustainable, and more human. We're building AI to make the patient's time both within and between sessions more engaging, more continuous, and more effective.

That’s the future we’re engineering toward: one where technology amplifies human-led empathy, and care never stops at the session door.