In Russia, a new form of digital mourning is emerging — one that blurs the line between memory and machine. Artificial intelligence is now being used to digitally recreate soldiers killed in the war in Ukraine, allowing grieving widows and families to “see” and “speak” with virtual versions of their loved ones.
These digital resurrections — hyperrealistic avatars trained on old videos, social media posts, and voice recordings — are being offered by private tech companies as a kind of closure. But they raise profound ethical, psychological, and political questions: Is this therapy or exploitation? Consolation or control?

The Technology Behind the “Resurrection”
Several Russian startups, reportedly supported or tolerated by state-linked institutions, are using generative AI to reconstruct digital likenesses of fallen soldiers.
Here’s how it works:
- Data Collection: Families upload photos, videos, voice messages, and social media archives of the deceased.
- Model Training: AI systems analyze this data to synthesize a 3D avatar, capable of mimicking facial movements, gestures, and speech patterns.
- Voice and Interaction Layer: A fine-tuned chatbot or voice synthesis model (similar to ChatGPT with a cloned voice) allows the AI version to hold short conversations in the person’s tone and mannerisms.
- Deployment: The digital version is rendered through video calls, holographic displays, or VR environments.
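The four steps above can be sketched as a simple program structure. Everything below is an illustrative assumption on my part — the class names, the keyword-overlap "reply" heuristic, and the stub training step stand in for the proprietary image-synthesis, voice-cloning, and language models such a company would actually use.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the avatar pipeline described above.
# Real systems would train generative models at each stage; this
# stub only memorizes and retrieves archived text.

@dataclass
class PersonArchive:
    """Step 1: data uploaded by the family."""
    photos: list
    voice_clips: list
    messages: list  # e.g. social media posts

@dataclass
class AvatarModel:
    """Steps 2-3: stands in for the trained face/voice/chat models."""
    style_phrases: list = field(default_factory=list)

    @classmethod
    def train(cls, archive: PersonArchive) -> "AvatarModel":
        # A real pipeline would fine-tune image, voice, and language
        # models here; this placeholder just stores the text archive.
        return cls(style_phrases=list(archive.messages))

    def reply(self, prompt: str) -> str:
        # Placeholder for conversational generation: return the
        # archived phrase sharing the most words with the prompt.
        prompt_words = set(prompt.lower().split())
        return max(
            self.style_phrases,
            key=lambda p: len(set(p.lower().split()) & prompt_words),
        )

# Step 4: deployment would render the avatar in video calls or VR;
# here we simply run a text exchange.
archive = PersonArchive(
    photos=["photo1.jpg"],
    voice_clips=["clip1.ogg"],
    messages=["Miss you all", "Proud of my family", "See you soon"],
)
avatar = AvatarModel.train(archive)
print(avatar.reply("are you proud of us?"))  # echoes the closest archived phrase
```

The point of the sketch is structural: the "avatar" never generates anything new here, it only recombines what was uploaded — which is why critics worry about the step where a real deployment inserts messages the deceased never recorded.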
In promotional videos, widows are shown speaking to their “returned” husbands, who smile, speak their names, and deliver carefully crafted farewell messages — sometimes expressing pride, reassurance, or patriotism.
A New Frontier in Grief — or State Propaganda?
Psychological Comfort and Closure
For some, these AI avatars offer comfort. Psychologists have long studied “continuing bonds” — the idea that maintaining a symbolic connection with the deceased can aid healing. AI resurrection can make those bonds tangible, helping families cope with sudden loss.
In interviews, participants describe feeling as though they received “a final goodbye.” Others say it eased their nightmares or helped children remember their fathers more vividly.
Manipulation and Narrative Control
But the darker side looms large. Many of the AI-generated soldiers deliver patriotic messages — thanking their families for their sacrifice, affirming loyalty to Russia, and urging continued faith in the cause.
This suggests a subtle intertwining of grief counseling and propaganda. Critics argue that these digital resurrections are being used to sanitize the human cost of war, transforming individual tragedy into state-sanctioned messaging.
Some analysts compare it to Soviet-era hero memorialization — but powered by algorithms instead of statues.
Consent and Ethics
The dead cannot give consent. While family members may authorize the reconstruction, the moral and legal questions remain unsettled:
- Does a person’s digital likeness belong to them, their family, or the state?
- What if the avatar is made to say things the real person never believed?
- How long before such “digital ghosts” are used for political or commercial purposes?
In Western contexts, AI posthumous reconstructions (like virtual celebrities or hologram concerts) already spark controversy. In Russia, the stakes are higher — entwined with nationalism, grief, and state control.
The Cultural Context: Russian Attitudes Toward Death & Memory
In Russian culture, remembrance of the dead is sacred. Practices like the “Day of Remembrance and Sorrow” or the “Immortal Regiment” parade reflect a national ethos that venerates sacrifice.
AI resurrection taps into that tradition — offering families not just remembrance, but simulated presence. The technology thus rides on deep cultural currents: the desire to preserve memory, sanctify heroism, and endure loss with pride.
But that cultural resonance also makes it a potent instrument for state narratives, reinforcing themes of duty, honor, and eternal legacy — sometimes at the expense of truth and emotional autonomy.
A Global Trend: Digital Afterlife Technologies
Russia’s experiment is part of a growing global trend toward “digital immortality.”
- In China, AI memorial platforms let users converse with virtual reconstructions of deceased loved ones.
- In the U.S., startups like HereAfter AI and StoryFile offer similar experiences — marketed as “legacy preservation tools.”
- In South Korea, a 2020 VR documentary (“Meeting You”) showed a mother reuniting with her deceased daughter through VR, sparking international debate.
The Russian twist is its integration with war, politics, and national identity — transforming a deeply personal technology into a public tool of collective storytelling.
Ethical and Legal Questions Still Unanswered
- Digital Personhood: Should AI versions of people have legal standing, or are they simply data simulations?
- Emotional Manipulation: Is it ethical to use grief as a market or propaganda tool?
- Data Ownership: Who controls the personal data used to resurrect the dead?
- Authenticity: At what point does a simulation become a distortion of memory?
- Therapeutic Oversight: Should psychologists or ethicists be involved before families use such technology?
Frequently Asked Questions (FAQs)
| Question | Answer |
|---|---|
| 1. What technology powers these digital soldiers? | They use generative AI — combining image synthesis, voice cloning, and conversational models similar to ChatGPT or DeepBrain to create interactive avatars. |
| 2. Are these “resurrected” soldiers conscious or self-aware? | No. They simulate responses based on prior data — sophisticated puppets, not sentient beings. |
| 3. How do families react? | Mixed. Some find solace and closure; others feel disturbed or manipulated by what feels like an artificial intrusion into grief. |
| 4. Is this legal? | There are few legal frameworks in Russia or globally that regulate AI reconstructions of the deceased. Consent and data use remain gray areas. |
| 5. Could the state use this for propaganda? | Yes — and evidence suggests that patriotic messaging is already being embedded in some AI-generated soldier interactions. |
| 6. Is this happening outside Russia? | Yes. Similar technologies are emerging globally in memorial and entertainment industries, though usually for civilian purposes. |
| 7. What are the psychological risks? | Prolonged interaction with AI versions of the dead may hinder grief resolution, foster dependency, or distort memory. |
| 8. What might come next? | Deeply personalized “digital ancestors” that persist for generations, raising new questions about identity and legacy. |
Final Thoughts
The AI resurrection of Russian soldiers is a mirror reflecting both human longing and technological hubris. It blurs mourning and manipulation, comfort and control.
For grieving families, these virtual reunions can feel like a miracle — a momentary reprieve from loss. For the state, they are a powerful narrative tool. For the world, they are a warning: as AI crosses into the sacred terrain of death, our definitions of love, loss, and truth are being rewritten.
If technology gives us the power to speak to the dead, we must decide not just if we should — but who gets to write their final words.

Source: The Washington Post


