AI Rebirth: Turning Digital Footprints Into Living Legacies

Summary: What happens when artificial intelligence makes the dead speak again? A comprehensive new study explores the “unsettling frontier” of digital resurrection. By analyzing over 50 real-world cases, researchers reveal how generative AI is turning the voices, faces, and life histories of the deceased into reusable “spectral labor.”

This practice, ranging from hologram concerts to grief-tech chatbots, is redrawing the boundary between life and death—creating a “postmortal society” where the dead are increasingly exploited for profit, politics, and comfort without their consent.

Key Facts

  • Spectral Labor: The study introduces this term to describe how the dead are “compelled to work” by serving the emotional, political, or commercial desires of the living through their data.
  • Three Modes of Resurrection:
    1. Spectacularization: AI-generated performances by icons like Whitney Houston or Freddie Mercury for entertainment.
    2. Sociopoliticization: Reanimating victims of injustice to testify or protest posthumously.
    3. Mundanization: Everyday people using chatbots to maintain “daily interaction” with deceased loved ones.
  • Consent Vacuum: Most AI resurrections occur without clear ownership rules, accountability, or the deceased’s prior consent.
  • Postmortal Society: We are entering a social stage where immortality is sought through algorithms and “digital afterlives” rather than religion.
  • Ideological Weaponization: AI allows political figures or ideologues to continue circulating their message and influence indefinitely after death.

Source: Hebrew University of Jerusalem

A new study shows that generative AI is already being used to “bring back” the dead: as entertainment icons, as political witnesses, and as everyday companions for grieving families.

Tracing cases of AI “resurrections,” the study claims this practice isn’t just emotionally powerful; it’s ethically explosive because it turns a person’s voice, face, and life history into reusable raw material.

AI resurrections turn the digital remains of the deceased into active data, creating a “digital afterlife” where the dead are compelled to serve the needs of the living. Credit: Neuroscience News

AI resurrections matter because they can occur with little or no consent, no clear ownership rules, and no accountability. This creates a new kind of exploitation the authors call “spectral labor”: the dead become an involuntary source of data and profit, while the living are left to navigate blurred lines between memory and manipulation, comfort and coercion, tribute and abuse.

What does it mean when artificial intelligence makes the dead speak again?

From hologram concerts of long-deceased pop stars to chatbots trained on the texts of lost loved ones, generative AI is rapidly redrawing the boundary between life and death.

A new study by Tom Divon, an internet and technology researcher at the Hebrew University of Jerusalem, and Prof. Christian Pentzold of Leipzig University in Germany offers one of the most comprehensive looks yet at this unsettling frontier, raising urgent questions about consent, exploitation, and power in a world where the dead can be digitally revived.

In their article, “Artificially Alive: An Exploration of AI Resurrections and Spectral Labor Modes in a Postmortal Society,” the researchers analyze more than 50 real-world cases from the United States, Europe, the Middle East, and East Asia in which AI technologies are used to recreate deceased people’s voices, faces, and personalities.

What sets this study apart is its scope and clarity. Rather than focusing on a single technology or viral example, the researchers examined dozens of cases from across continents to show that AI “resurrections” are already forming a recognizable social pattern.

They identify three distinct ways the dead are being digitally reintroduced into society, from celebrity spectacles to political testimony to intimate conversations with lost loved ones, and reveal a shared underlying dynamic: the growing use of the dead as a source of data, voice, and likeness that can be reused and monetized, often without consent.

This broad view shows how quickly experimental uses of AI are becoming normalized and why the ethical stakes are no longer theoretical.

Three ways AI brings back the dead

The study identifies three dominant ways AI is being used to “re-presence” the deceased:

  • Spectacularization – the digital re-staging of famous figures for entertainment. Fans can now watch “new” performances by Whitney Houston or Freddie Mercury, generated by AI and staged as immersive spectacles.
  • Sociopoliticization – the reanimation of victims of violence or injustice for political or commemorative purposes. In some cases, AI-generated personas of the dead are made to testify, protest, or tell their own stories posthumously.
  • Mundanization – the most intimate and fast-growing mode, in which everyday people use chatbots or synthetic media to “talk” with deceased parents, partners, or children, keeping relationships alive through daily digital interaction.

The rise of “spectral labor”

Across all three modes, the dead are not simply remembered; they are made to work.

Divon and Pentzold introduce the concept of spectral labor to describe what is happening beneath the surface. AI systems are trained on the digital remains of the dead: photos, videos, voice recordings, and social media posts. Without consent, these data are extracted, repackaged, and monetized, with immense potential for weaponization.
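The study itself publishes no code, but part of its point is how little engineering the “mundanization” mode requires. The minimal Python sketch below is a hypothetical illustration, not the researchers’ method: it shows how an ordinary exported message archive could be turned into a persona prompt for an off-the-shelf chatbot. The file name, the archive format, and the downstream language-model call are all assumptions for the sake of the example.

```python
# Hypothetical illustration only: assembling a "grief-tech" chatbot persona
# from a deceased person's message archive. Not taken from the study.
import json
from pathlib import Path


def build_persona_prompt(archive_path: str, person_name: str, max_samples: int = 50) -> str:
    """Turn an exported message archive (a JSON list of {"sender", "text"} objects)
    into a system prompt asking a language model to imitate the named person."""
    messages = json.loads(Path(archive_path).read_text(encoding="utf-8"))
    samples = [m["text"] for m in messages if m.get("sender") == person_name][:max_samples]
    style_examples = "\n".join(f"- {s}" for s in samples)
    return (
        f"You are role-playing as {person_name}, based on their past messages.\n"
        f"Match their tone, vocabulary, and typical concerns.\n"
        f"Examples of how {person_name} wrote:\n{style_examples}"
    )


if __name__ == "__main__":
    # "messages.json" and the person's name are placeholders; the resulting prompt
    # would simply be handed to any chat model. The point is how directly ordinary
    # digital remains become reusable material, with no consent step anywhere.
    prompt = build_persona_prompt("messages.json", "Grandma Rosa")
    print(prompt[:500])
```

That the entire pipeline fits in a few dozen lines, with no technical or legal checkpoint where the deceased’s wishes enter, is precisely the gap the authors describe as a consent vacuum.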

What happens when a figure like Charlie Kirk is resurrected to continue circulating his ideology, speaking to new audiences after his death, without accountability, context, or the possibility of refusal? Or when the likeness of a victim is reanimated to repeatedly relive trauma for political, commercial, or instructional ends?

In these cases, AI resurrection becomes a tool for extending power, ideology, and influence beyond the limits of life itself.

“The dead are compelled to haunt the present,” the authors argue, serving the emotional, political, or commercial desires of the living.

This raises difficult questions: Who owns a voice after death? Can a digital likeness be exploited? And who gets to decide how, when, and why the dead are brought back?

Living in a “postmortal society”

The study situates AI resurrections within what sociologists call a postmortal society, one that does not deny death, but increasingly seeks to overcome it technologically. In this world, immortality is no longer promised through religion alone, but through data, algorithms, and platforms offering “digital afterlives.”

Yet the authors are clear: AI does not conquer death. Instead, it keeps people suspended in an uneasy in-between state, neither fully alive nor fully gone.

As generative AI accelerates, Divon and Pentzold warn that society must confront the ethical and legal implications now, before digital resurrection becomes normalized and unregulated.

“Thinking seriously about what AI does to our relationship with the dead,” they write, “is essential to understanding what it is doing to the living.”

Key Questions Answered:

Q: Is it really “resurrection” if it’s just code?

A: Technically, no, but the study argues it creates a state where people are “suspended in-between”—neither fully alive nor fully gone. For those interacting with these AI personas, the emotional impact is very real, effectively “re-presencing” the dead in our daily lives.

Q: Can anyone turn a deceased relative into an AI chatbot?

A: Currently, the technology exists (mundanization), but the study warns that this is happening in a legal and ethical “Wild West.” There are no universal rules about who owns your digital likeness or voice after you pass away.

Q: What is the biggest risk of “Spectral Labor”?

A: Exploitation. The dead can’t refuse to work. Whether it’s a pop star being forced into a new world tour or a political victim being made to relive their trauma for a campaign, the deceased become an involuntary source of data and profit.

Editorial Notes:

  • This article was edited by a Neuroscience News editor.
  • Journal paper reviewed in full.
  • Additional context added by our staff.

About this AI and neuroethics research news

Author: Press Office
Source: Hebrew University of Jerusalem
Contact: Press Office – Hebrew University of Jerusalem
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Artificially alive: An exploration of AI resurrections and spectral labor modes in a postmortal society” by Tom Divon and Christian Pentzold. New Media & Society
DOI: 10.1177/14614448251397518


Abstract

Artificially alive: An exploration of AI resurrections and spectral labor modes in a postmortal society

Generative Artificial Intelligence (GenAI) is widely regarded as a transformative force, reshaping our understanding of both life and death. One experimental frontier is its ability to recreate deceased human beings.

Our article explores this nascent GenAI application situated at the threshold of existence. We analyze 50 cases from the United States, Europe, the Near East, and East Asia and distill three principal modes of AI resurrections: (1) Spectacularization, the public re-staging of iconic deceased cultural figures via immersive recreations for entertainment spectacle; (2) Sociopoliticization, the re-invoking of victims of violence in political or commemorative contexts, often as posthumous testimonies; and (3) Mundanization, the everyday revival of loved ones, allowing users to interact with the deceased through chatbots or synthetic media.

To engage with the underlying industry that capitalizes on digital remains, we introduce the notion of spectral labor, in which the dead become involuntary sources of data, likeness, and affect that are extracted, circulated, and monetized without their consent.

We argue that the use of GenAI to animate the deceased raises urgent legal and ethical questions around posthumous appropriation, ownership, work, and control.
