A small, research-driven team exploring ethical, human-centered applications of AI in the space of memory, reflection, and personal meaning.
Umbrella Research is the team behind AfterLight. We are not a large corporation. We are a small initiative prioritizing responsible design over rapid commercialization.
Our work sits at the intersection of AI, memory, and ethics. We believe that how technology handles personal memory matters — not just technically, but morally.
We value transparency, user agency, and clear ethical boundaries in everything we build. AfterLight is our first project, and it reflects these values at every layer.
Who we are, what we believe, and why we build the way we do.
All AI-generated content is clearly disclosed. Users always know they are interacting with an AI system, not a person.
Users decide what memories to share and how to interact. They can stop, delete, or modify their experience at any time.
We do not collect more than necessary or share data with third parties. Personal memories are treated with care.
Every design decision is grounded in ethical reasoning, not market pressure. We move deliberately, not fast.
Founder & Lead Researcher
Exploring the ethical boundaries of memory representation through AI.
We build slowly, because what we’re building matters deeply.
Umbrella Research
AfterLight works with personal memory — one of the most sensitive kinds of data a person can share. That demands a level of care that goes beyond standard privacy practices.
We designed AfterLight’s ethical framework before writing a single line of code. The principles below are not aspirational. They are architectural constraints that shape what the system can and cannot do.
Avatars reflect memories shared by the user. They do not claim to be, replace, or simulate a real person.
All AI-generated content is clearly disclosed. Users always know they are interacting with an AI system, not a person.
AfterLight is not therapy, counseling, or a medical tool. It does not diagnose, treat, or advise on mental health.
Users decide what memories to share and how to interact. They can stop, delete, or modify their experience at any time.
The system is designed to support healthy reflection, not to create emotional dependency or manipulate vulnerable users.
Personal data and memories are treated with care. The system does not collect more than necessary or share data with third parties.
The right question is not “what can we build?” but “what should we build, and what should we refuse to build?”