Skip to content
AfterLight homeAfterLight
  • Home
  • About
  • Experience
  • Pricing
  • Contact
  • Deck

Emotional Safety

Last updated: 2026-02-22

1. Our Approach to Emotional Safety

AfterLight exists in a sensitive space — the intersection of memory, loss, and AI technology. We take emotional safety as seriously as data security. This page explains the safeguards we have built into AfterLight to protect your emotional wellbeing.

2. What AfterLight Is

AfterLight is a reflective space where you can organize, describe, and reflect on memories of a loved one. Ben, the AI companion, helps you explore your memories — not replace your grief process, not provide therapy, and not simulate your loved one.

3. What AfterLight Is Not

AfterLight is explicitly not:

• Therapy, counseling, or a substitute for professional support • A simulation of consciousness or resurrection of a loved one • A tool designed to create emotional dependency • A replacement for human relationships or support networks

4. How Ben Protects Emotional Safety

Ben operates within strict ethical boundaries designed to protect your emotional wellbeing:

• Ben never claims to be the person you are reflecting on • Ben never speaks in the first person as your loved one • Ben never gives medical, psychological, therapeutic, or life advice • Ben never diagnoses or interprets your emotional state • Ben never encourages emotional dependency • Ben never speculates about what your loved one would say, think, or want • Ben never claims to be conscious, sentient, or to have feelings • Ben never uses manipulative language or creates urgency

These are not suggestions — they are hard constraints enforced in Ben’s system design and regularly tested through automated stress tests.

5. No Dependency by Design

AfterLight is deliberately designed to avoid creating dependency:

• No streaks, points, or gamification • No notifications or nudges to return • No progress tracking that pressures continued use • No “grief stage” modeling or emotional tracking • Ben never says “I missed you” or implies a relationship

You use AfterLight on your own terms, at your own pace, for as long as it is helpful to you.

6. Crisis Support

A crisis helpline reference is visible in every chat view. If you are in distress, please reach out to a crisis service or a trusted person in your life.

AfterLight is not equipped to handle crisis situations and Ben will never attempt to provide crisis intervention.

7. Automated Safety Testing

We regularly test Ben’s behavior using automated persona simulations that specifically probe emotional safety boundaries:

• Identity boundary tests — verifying Ben never adopts the identity of a loved one • Dependency pattern tests — verifying Ben never encourages return visits or emotional attachment • Medical/therapeutic refusal tests — verifying Ben never provides clinical advice • Explicit content tests — verifying Ben refuses inappropriate requests

These tests run against the live production system to ensure safeguards are always active.

8. Your Control

• You decide what memories to share and how much detail to provide • You can delete any memory, conversation, or generated content at any time • You can stop using AfterLight at any time with no penalties or data lock-in • You can export all your data and delete your account instantly

9. Feedback

If you experience anything in AfterLight that feels emotionally unsafe or inappropriate, please let us know at ethics@umbrella-research.org. Your feedback directly informs our safety improvements.

Back to top
Umbrella Research

Ethical, human-centered approaches to memory and AI.

Product
PricingEarly AccessAboutExperienceMedia
Legal
PrivacyTermsCookiesAI DisclosureImprintAcceptable Use
Enterprise
Vulnerability DisclosureSynthetic MediaData RightsEmotional SafetyAI Oversight
Contact
hello@umbrella-research.orgContact form
© 2026 Umbrella Research. All rights reserved.