
Envisioning is an emerging technology research institute and advisory.


2011 — 2026


Avatar Consent Governance

Frameworks governing the creation and use of AI replicas of deceased individuals
Hub: Eclipse

The emergence of AI-powered digital avatars capable of replicating deceased individuals' voices, mannerisms, and conversational patterns has created an urgent need for robust governance frameworks. Avatar Consent Governance addresses the ethical and legal vacuum surrounding posthumous digital replicas—often called griefbots or memorial chatbots—by establishing binding control mechanisms that honor the wishes of the deceased while protecting the interests of surviving family members. At its technical core, this governance layer operates through a combination of smart contracts, cryptographic consent tokens, and policy enforcement engines that sit between the AI model and its deployment interfaces. These systems verify that each interaction with a digital avatar complies with pre-established parameters, checking against a consent ledger that may specify permitted contexts (private family use versus public memorial), temporal boundaries (active only during specific anniversaries or grief periods), and content restrictions (avoiding certain topics or relationships). The architecture typically includes multi-signature authorization requirements, ensuring that no single party can unilaterally modify or deploy an avatar without consensus from designated stakeholders.
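The policy-engine check described above can be sketched in a few lines. This is a minimal illustration, not a reference to any real system: the `ConsentRecord` fields, the stakeholder names, and the `may_interact` function are all hypothetical, standing in for the consent ledger, temporal boundaries, content restrictions, and multi-signature gate the paragraph describes.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical consent-ledger entry; field names are illustrative, not a standard.
@dataclass
class ConsentRecord:
    permitted_contexts: set            # e.g. {"private_family", "public_memorial"}
    active_windows: list               # (start, end) date ranges when the avatar may run
    restricted_topics: set             # topics the avatar must refuse to discuss
    required_signers: set              # stakeholders whose approval is required
    approvals: set = field(default_factory=set)

def may_interact(record: ConsentRecord, context: str, today: date, topic: str) -> bool:
    """Policy-engine check run before each avatar interaction."""
    if context not in record.permitted_contexts:
        return False                   # context restriction (private vs. public)
    if not any(start <= today <= end for start, end in record.active_windows):
        return False                   # temporal boundary
    if topic in record.restricted_topics:
        return False                   # content restriction
    # Multi-signature gate: every designated stakeholder must have approved.
    return record.required_signers <= record.approvals

record = ConsentRecord(
    permitted_contexts={"private_family"},
    active_windows=[(date(2025, 1, 1), date(2030, 1, 1))],
    restricted_topics={"finances"},
    required_signers={"spouse", "executor"},
    approvals={"spouse", "executor"},
)
print(may_interact(record, "private_family", date(2026, 5, 1), "childhood"))  # True
```

In a production system the ledger and approvals would live on tamper-evident storage (the smart contracts and consent tokens mentioned above) rather than in-memory objects, but the gating logic would follow the same shape.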

The absence of such governance creates profound risks for both technology providers and grieving families. Companies developing memorial AI face potential litigation over unauthorized use of personality rights, while families may encounter distressing scenarios where a loved one's digital likeness is deployed in contexts they would have rejected, or where family members disagree about appropriate usage. Avatar Consent Governance solves these challenges by codifying decision-making authority before conflicts arise, establishing clear chains of custody for digital remains, and providing mechanisms for evolving consent as social norms and family circumstances change. This framework enables new business models in the death tech industry, allowing companies to offer memorial AI services with legal clarity and ethical safeguards. It also addresses the temporal dimension of grief, recognizing that what feels comforting immediately after a loss may become unhealthy or unwanted years later, by building in sunset clauses and periodic review requirements.
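The sunset clauses and periodic review requirements mentioned above could be enforced mechanically. The sketch below is an assumption about how such a rule might look; the function name, the fixed-term sunset, and the annual review interval are illustrative choices, not a documented scheme.

```python
from datetime import date

# Illustrative sunset-clause check: a deployment lapses after a fixed term,
# and is suspended whenever a steward's periodic review is overdue.
def avatar_active(deployed: date, today: date,
                  sunset_years: int, last_review: date,
                  review_interval_days: int = 365) -> bool:
    if today >= deployed.replace(year=deployed.year + sunset_years):
        return False                     # hard sunset: the avatar retires
    if (today - last_review).days > review_interval_days:
        return False                     # review overdue: suspend until renewed
    return True

# Deployed in 2024 with a five-year sunset, last reviewed in early 2025:
active = avatar_active(date(2024, 1, 1), date(2025, 6, 1),
                       sunset_years=5, last_review=date(2025, 1, 1))
```

Separating the hard sunset from the softer review requirement mirrors the point in the text: comfort immediately after a loss is not a permanent mandate, so consent must be re-confirmed over time rather than granted once.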

Early implementations of these governance systems are emerging within digital estate planning platforms and specialized end-of-life technology providers, though standardization remains limited. Some services now offer consent dashboards where individuals can pre-authorize specific uses of their data for posthumous AI creation, designate family members as stewards with varying levels of control, and establish automatic expiration dates for their digital presence. Research in digital ethics and thanatechnology suggests that as AI-generated memorial content becomes more sophisticated and widespread, formal governance structures will transition from optional features to regulatory requirements, similar to how organ donation consent evolved into standardized legal frameworks. The technology intersects with broader trends in digital legacy management, data sovereignty, and the emerging concept of "informational self-determination" that extends beyond biological death, positioning Avatar Consent Governance as a critical infrastructure for navigating the increasingly blurred boundary between remembrance and resurrection in the digital age.
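The "stewards with varying levels of control" that these dashboards expose suggest a tiered permission model. The role names and abilities below are hypothetical, sketched only to make the idea concrete; no particular platform's scheme is implied.

```python
from enum import Enum
from datetime import date

# Hypothetical steward-permission tiers a consent dashboard might offer.
class StewardRole(Enum):
    VIEWER = 1       # may converse with the avatar
    MODERATOR = 2    # may also pause it or flag content
    EXECUTOR = 3     # may also amend consent terms or retire the avatar

def can_modify_terms(role: StewardRole, expires: date, today: date) -> bool:
    """Only an executor may change terms, and never past the expiration date."""
    return role is StewardRole.EXECUTOR and today < expires
```

The automatic expiration date acts as a final backstop: once it passes, even the highest-tier steward loses the ability to extend the avatar's life, which is what distinguishes expiration from mere deactivation.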

TRL: 4/9 (Formative)
Impact: 5/5
Investment: 3/5
Category: Ethics Security

Related Organizations

HereAfter AI · United States · Startup · Developer (95%)
An app that records personal stories and uses AI to let loved ones ask questions about those memories later.

StoryFile · United States · Company · Developer (95%)
Creates conversational video AI that allows people to record their life stories for future generations to interact with.

DeepBrain AI · South Korea · Company · Developer (90%)
AI human synthesis company.

Eternos · United States · Startup · Developer (85%)
A company developing AI-driven interactive avatars that allow users to 'train' their digital selves before death.

Microsoft · United States · Company · Developer (85%)
Through Copilot and the 'Recall' feature in Windows, Microsoft is integrating persistent memory and agentic capabilities directly into the operating system.

OpenAI · United States · Company · Developer (85%)
Creator of GPT-4o, a natively multimodal model capable of reasoning across audio, vision, and text in real time.

Somnium Space · United Kingdom · Startup · Developer (80%)
An open VR world that natively supports external NFT assets and avatars.

Replika · United States · Company · Deployer (75%)
An AI companion app that has faced scrutiny regarding the emotional dependence of its users.

Institute for the Future of Work · United Kingdom · Research Lab · Researcher (70%)
Research institute exploring the impacts of AI on work and identity, including rights over digital twins.

Supporting Evidence

Evidence data is not available for this technology yet.

Same technology in other hubs

Eros: Digital Afterlife Governance

Legal and ethical frameworks for managing posthumous data, digital avatars, and AI representations

Connections

Posthumous Data Privacy (Ethics Security)
Legal and ethical frameworks governing digital information after death
TRL 4/9 · Impact 5/5 · Investment 2/5

Posthumous Biometrics Guardrails (Ethics Security)
Protocols governing the use of voice, face, and DNA data after death
TRL 3/9 · Impact 5/5 · Investment 3/5

Generative Griefbots (Software)
AI chatbots trained on a deceased person's messages, emails, and posts to simulate conversation
TRL 7/9 · Impact 5/5 · Investment 4/5

Digital Executor Authentication (Ethics Security)
Cryptographic systems enabling verified executors to access deceased users' digital accounts
TRL 6/9 · Impact 4/5 · Investment 3/5

Digital Legacy Creation (Applications)
Curated digital archives that preserve personal histories through photos, videos, and interactive media
TRL 7/9 · Impact 4/5 · Investment 4/5

Memorial Safety & Moderation Systems (Ethics Security)
Specialized content moderation protecting grieving users in online memorial platforms
TRL 6/9 · Impact 4/5 · Investment 3/5
