Digital Doppelgängers: Authenticity, Consent, and the Myth of Immortality
The dream of reviving the dead has long animated technology. From photographs to holograms, each innovation captures a fragment of presence that defies time. Artificial intelligence (AI) pushes this further, transforming imitation into simulation. Digital replicas of historical figures or modern icons now speak, reason, and improvise, offering a semblance of life. Yet this technical marvel raises profound questions: what does it mean to recreate a mind without consent, and when does imitation blur into a false promise of immortality? In cultural heritage, these replicas promise new forms of education but carry serious ethical risks.
Cultural Replicas and the Performance of Authenticity
AI-driven cultural replicas bring history into the present. In museums, classrooms, and digital archives, these doppelgängers—more than mere chatbots—enable interactive dialogues with historical figures, evoking their era’s insights and perspectives. As Škobo and Šović (2025) note, such systems blend pedagogy with performance, letting users “converse” with the past. These replicas make history accessible, engaging audiences through simulated voices that evoke the wit of satirical writers or the ingenuity of inventors.
But authenticity is elusive. A replica’s responses are not memories but probabilities, stitched from datasets of texts, speeches, or letters. A digital replica might echo a satirical tone but miss the social critique of its historical context, where humor targeted societal absurdities. This underscores the need for human curation to ensure replicas educate rather than mislead, preserving historical integrity over performative allure. Without such curation, cultural legacies risk hardening into static caricatures, stripped of the depth that defines them.
Transparency and the Boundaries of Simulation
Transparency is critical to maintaining trust. The European Union’s Artificial Intelligence Act (Regulation 2024/1689) mandates clear labeling of AI-generated content, including synthetic voices or likenesses. This prevents digital doppelgängers from being mistaken for historical sources. Without labels, a simulated lecture by a historical figure could blur into pseudo-history, confusing narrative empathy with factual authority.
When AI generates a dialogue for a 17th-century scholar, clear disclosure ensures users see it as a tool for engagement, not a primary source. Transparency protects cultural memory, ensuring digital replicas amplify history without distorting it, and fosters collaboration between historians and technologists to highlight the constructed nature of these simulations.
Consent and the Right to the Self
If authenticity guards truth, consent guards ownership. The Tennessee ELVIS Act (2024) pioneers legal protection for voice and likeness, prohibiting unauthorized AI-generated replicas. This law reframes identity as dignity, not just data. Creating a digital doppelgänger without permission, whether for education or art, treads a fine line between tribute and exploitation. This is critical for historical figures whose estates or cultural communities must be consulted to avoid perpetuating historical power imbalances.
Consent is a practical concern. Recreating a figure requires navigating ethical boundaries, especially when descendants or cultural representatives are involved. Unauthorized simulations of marginalized voices, such as indigenous leaders, could reinforce stereotypes rather than preserve heritage. Community-driven governance ensures replicas respect the dignity of the original while serving educational goals.
The Afterlife of Data and the Illusion of Immortality
The allure of “digital immortality” suggests consciousness can persist as algorithmic patterns—an archive of words and gestures ready to be reanimated. Yet, as Harbinja (2025) argues, post-mortem data is residue, not continuity. A digital replica of a deceased poet may speak convincingly, but it lacks the moral context of their life. The dead become interfaces, editable and ever available, turning mourning into maintenance and grief into a public spectacle.
This redefines death in the digital age. Instead of silence, we get perpetual dialogue; instead of absence, simulation. This raises questions about preserving legacies without freezing them into caricatures. A digital historical figure might engage audiences, but without careful design, it risks reducing a complex life to quotable phrases. Ethical frameworks must prioritize ephemerality, ensuring legacies evolve through living interpretation rather than static simulation.
The Pedagogy of Empathy
When used responsibly, digital doppelgängers can humanize history. In educational settings, replicas bridge past and present, sparking curiosity through interactivity. A student “conversing” with a digital replica gains insight into a historical era’s humor, provided the system is framed as a tool, not a reincarnation. Acknowledging this distinction encourages critical engagement, teaching audiences to question AI’s role in shaping narratives.
AI supports but does not supplant human insight. By highlighting the gap between simulation and intent, educators empower audiences to see technology as a translator of history, not its author, fostering a deeper connection to the past.
Between Preservation and Appropriation
Digital replicas both preserve and transform. By training on traces of the past, AI creates versions of history shaped by present biases. A digital figure’s voice may sound authentic, but its meaning reflects contemporary prompts. The challenge is to ensure replicas serve as bridges to the past rather than mirrors of modern assumptions, through diverse datasets and community feedback.
Consent, authenticity, and truth are intertwined. Responsible design, rooted in curated datasets and transparent labeling, ensures replicas enhance cultural memory. Community oversight avoids flattening cultural nuances, turning replicas into tools for global dialogue (Škobo & Šović, 2025).
Guardrails for Ethical Use
Practical guidelines are emerging:
- Ensure Transparency: Label AI content to distinguish simulation from source.
- Secure Consent: Obtain permission from individuals or estates, per the ELVIS Act.
- Curate Data: Use context-specific datasets to preserve nuance, addressing post-mortem privacy.
- Educate Users: Teach audiences to question AI outputs, emphasizing human oversight.
These ensure digital doppelgängers enrich history without claiming to resurrect it.
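The first two guardrails can even be enforced mechanically at the point where a replica’s output reaches an audience. The sketch below is purely illustrative (the class, field names, and disclosure wording are assumptions, not an existing system): it refuses to present a reply when no consent is on record, and otherwise appends an explicit disclosure label so the simulation cannot be mistaken for a primary source.

```python
# Illustrative sketch of the "Ensure Transparency" and "Secure Consent"
# guardrails. All names and the label text are hypothetical.

from dataclasses import dataclass

DISCLOSURE = ("[AI-generated simulation — not a statement by the historical "
              "figure; consult curated archives for primary sources]")


@dataclass
class ReplicaReply:
    figure: str            # e.g., "Nikola Tesla"
    text: str              # model-generated dialogue
    consent_on_file: bool  # permission recorded from estate or community?


def present(reply: ReplicaReply) -> str:
    """Block unconsented replicas; label everything else as simulation."""
    if not reply.consent_on_file:
        raise PermissionError(
            f"No recorded consent for a replica of {reply.figure}")
    return f"{reply.text}\n{DISCLOSURE}"
```

In this design the label travels with the text itself rather than living in surrounding interface chrome, so it survives copying and quotation — one way to keep disclosure from being silently stripped downstream.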
Conclusion: Remembrance, Not Resurrection
Digital doppelgängers preserve voices but risk commodification. True preservation lies in honest mediation: using AI to contextualize ideas within living discourse. AI maps history’s contours, but human judgment defines its meaning. Digital immortality is a myth, but digital remembrance, guided by ethics and humility, keeps the past alive for the present, honoring the dead without possessing them.
References (APA)
Harbinja, E. (2025). Digital remains and post-mortem privacy in the UK: What do we protect and why? International Journal of Law and Information Technology.
Regulation (EU) 2024/1689. Artificial Intelligence Act of the European Parliament and of the Council. Official Journal of the European Union, July 2024.
Škobo, M., & Šović, M. (2025). The Digital Doppelgängers of Nikola Tesla and Branislav Nušić: A New Approach to Interactive Learning and Cultural Heritage. In Proceedings of the International Scientific Conference on Information Technology, Computer Science, and Data Science – Sinteza 2025 (pp. 411–417). https://doi.org/10.15308/Sinteza-2025-411-417
Tennessee General Assembly. (2024). Ensuring Likeness, Voice, and Image Security (ELVIS) Act. Nashville: State of Tennessee.