How Satsang is different from ChatGPT
ChatGPT is a brilliant generalist. Satsang is built for one thing — the moments with your kids and your partner that you would never type into a public chatbot. We are not training AI on your family. We are not selling your data. And there is a real therapist behind the voice you talk with here.
Where the two genuinely differ
Built for one thing: parenting
ChatGPT is a generalist. It will help you draft a memo, plan a trip, or explain quantum physics. Satsang is built for the moments with your kids and your partner that you would never type into a public chatbot — the bedtime that fell apart, the sentence you wish you had not said, the worry that wakes you up at 2am.
Technical detail: Our model is steered by a curriculum and a system prompt designed around family dynamics, child development, and parental regulation. Responses follow a "steady the parent first, then think with them" pattern reviewed by our resident therapist. The assistant has guardrails that detect crisis cues and route you to human support resources, rather than improvising clinical advice.
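As a simplified, purely illustrative sketch of the routing idea (all names here are hypothetical, and the production guardrails use far richer signals than keyword matching):

```python
# Illustrative crisis-cue guardrail (hypothetical names; a real system
# uses richer classifiers than simple keyword matching).

CRISIS_CUES = ("hurt myself", "hurt my child", "can't go on", "suicidal")

def route_message(text: str) -> str:
    """Return 'crisis' to hand off to human support resources, else 'coach'."""
    lowered = text.lower()
    if any(cue in lowered for cue in CRISIS_CUES):
        return "crisis"
    return "coach"

print(route_message("Bedtime fell apart again tonight"))    # coach
print(route_message("Some days I feel like I can't go on"))  # crisis
```

The point of the pattern is that crisis handling is a separate, deterministic path: the model never gets a chance to improvise clinical advice for those messages.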
We never sell your family’s story
No ads. No data brokers. No ranked feed of "things parents like you bought." What you tell Satsang about your kids is not a product we package for someone else. It is your story, and it stays yours.
Technical detail: We do not sell, rent, or share personal data with advertisers, analytics resellers, or data brokers. Our subprocessors are bound by Data Processing Agreements that prohibit advertising profiles, resale, and any use of your data outside delivering the contracted service. Product analytics (PostHog) capture pseudonymous events only — page views and feature usage, never conversation content.
Your conversations are not training data
When you type into ChatGPT’s consumer app, your conversation can be used to train future versions of the model unless you opt out. With Satsang, the opposite is true by design: nothing you share teaches an AI, ever. We do not train models on your words, and our model provider has agreed in writing not to either.
Technical detail: Our agreements with Anthropic opt out of training on customer data and prohibit retention of inputs or outputs for model improvement. Where supported, provider-side zero-data-retention controls are enabled. We retain only short, redacted summaries you can view and delete from your privacy settings. Full encryption details and the list of subprocessors live on our security page.
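A minimal sketch of what "redacted summary" means in practice (illustrative only; the function, placeholders, and coverage here are hypothetical, and production redaction handles many more identifier types):

```python
import re

def redact(summary: str, names: list[str]) -> str:
    """Replace family names with neutral placeholders before storage.

    Illustrative sketch: real redaction also covers emails, locations,
    and other identifiers, not just names the user has shared.
    """
    out = summary
    for i, name in enumerate(names, start=1):
        out = re.sub(rf"\b{re.escape(name)}\b", f"[child-{i}]", out)
    return out

print(redact("Maya's bedtime went better after the new routine.", ["Maya"]))
# [child-1]'s bedtime went better after the new routine.
```

Only the redacted form is stored; the raw conversation text is not retained for model improvement.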
A real therapist behind the words
ChatGPT speaks with the average tone of the internet. Satsang speaks with the voice and judgment of Mirra Wicker, a therapist with 15 years of clinical work in families and child development. Her training, language, and frameworks shape what you read here.
Technical detail: Mirra reviews the assistant’s clinical voice, escalation paths, and care patterns. Responses lean on attachment-informed and nervous-system–aware frameworks rather than generic life-coach advice. When a moment is beyond what coaching can hold — trauma, harm risk, acute mental health — the assistant says so and points you to a human professional.
Memory that picks up where you left off
A general chatbot starts fresh every conversation. Satsang remembers the names you have shared, what you tried last week, and what you are working on — only because you told it on purpose, and only because that is what makes the help useful. You can read what we remember and delete any of it, anytime.
Technical detail: Memory is built from short, redacted summaries of past conversations, stored encrypted at rest in our managed Postgres database. You can review, edit, or delete each summary from privacy settings; deletions propagate to backups within 30 days. We never use these summaries to train models; they exist only to give the next conversation a place to begin.
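The review/delete lifecycle can be sketched with an in-memory stand-in for the encrypted Postgres table (all names hypothetical; this shows the flow, not our schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SummaryStore:
    """In-memory stand-in for the encrypted summaries table (illustrative)."""
    _rows: dict[int, dict] = field(default_factory=dict)
    _next_id: int = 1

    def add(self, user_id: str, text: str) -> int:
        sid = self._next_id
        self._rows[sid] = {
            "user_id": user_id,
            "text": text,
            "created_at": datetime.now(timezone.utc),
        }
        self._next_id += 1
        return sid

    def review(self, user_id: str) -> list[str]:
        """Everything the user can read in privacy settings."""
        return [r["text"] for r in self._rows.values() if r["user_id"] == user_id]

    def delete(self, sid: int) -> None:
        """User-initiated deletion; backups would follow within 30 days."""
        self._rows.pop(sid, None)

store = SummaryStore()
sid = store.add("parent-123", "Tried a calmer bedtime handoff; it helped.")
store.delete(sid)  # after deletion, review() returns nothing for this user
```

The summaries exist solely to seed the next conversation; nothing in this path feeds a training pipeline.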
Side by side, at a glance
- What it is built for
  - ChatGPT: Everything. Code, recipes, research, marketing copy, school essays.
  - Satsang: The parenting moments you would never type into a public chatbot.
- What happens to your words
  - ChatGPT: On consumer accounts, your conversations may be used to train future models unless you opt out in settings.
  - Satsang: Never used to train any model. Only short summaries you can view and delete are kept.
- Who is behind the voice
  - ChatGPT: Tone learned from the open internet.
  - Satsang: Voice and clinical judgment shaped by Mirra Wicker, a therapist with 15 years in families and child development.
- Memory between sessions
  - ChatGPT: Optional, general-purpose memory designed to be useful across every topic you ask about.
  - Satsang: Memory designed for parenting continuity. Visible to you, editable, and deletable on demand.
- How your data is used commercially
  - ChatGPT: Subject to OpenAI’s consumer terms, which include broad operational uses.
  - Satsang: Never sold, rented, or used to build advertising profiles. Subprocessors are bound by DPAs that prohibit it.
When ChatGPT is the better choice
We are not trying to replace a great general-purpose tool. If you need to:
- Draft a work email, a contract, or a school essay.
- Debug code, brainstorm a marketing line, or research a recipe.
- Ask a public question that you would happily say out loud.
ChatGPT will likely serve you well. Satsang exists for the conversations you would not put into a generalist tool — the ones about your child, your partner, your patience at the end of a long day.
The promises behind all of this
- We will never sell your family’s data, full stop.
- We will never train AI on what you share without your explicit consent.
- You can delete any conversation, any stored summary, or your whole account at any time.
- You can email team@satsangcoach.com for a portable export of everything we hold about you.
For the technical detail on how your data is encrypted, who processes it, and what we keep, see our security page. For the legal text covering CCPA, CPRA, GDPR, and COPPA, see the privacy policy. Questions? Write to team@satsangcoach.com.