
AI and the Dead: When Technology Crosses Into Necromancy

By Practical Apologetics | February 18, 2026
Series: Christians and AI
Part 4 of 9

This is Part Four of our series on Christians and AI. In Part One, we examined algorithmic idolatry. In Part Two, we explored what proper fear looks like under divine providence. In Part Three, we considered what Scripture means by “the fear of the Lord.” Here we face perhaps the most unsettling application of AI yet: technology that claims to resurrect the dead.


In May 2025, something unprecedented occurred in an Arizona courtroom. During the sentencing hearing for a man convicted in a 2021 road-rage killing, the victim’s family presented an AI-generated video. In it, Christopher Pelkey—the man who had been shot and killed four years earlier—appeared to address the court directly.

Using photographs and voice recordings taken during his lifetime, the family had created a digital avatar that spoke, moved, and seemed to look his killer in the eye. “In another life, we probably could have been friends,” the AI-generated Pelkey told the court. “I believe in forgiveness and in God who forgives.”

The judge allowed it. The courtroom watched a dead man speak.

This was not science fiction. This was not a theological hypothetical. This was an American courtroom in 2025, and it raises questions that cut to the heart of what Christians believe about death, the soul, and the boundaries God has established between the living and the dead.

The Landscape of Digital Resurrection

The Pelkey case is striking, but it is not isolated. We are witnessing the rapid emergence of what industry observers call “grief tech” or “digital resurrection”—AI systems designed to simulate ongoing relationships with deceased individuals. To understand the theological stakes, we must first survey the landscape.

Replika: From Memorial to Companion

The story begins in 2015 with a tragedy. Eugenia Kuyda, a technology entrepreneur, lost her close friend Roman Mazurenko in a car accident. Grieving, she gathered thousands of his text messages and trained a neural network to approximate his conversational style. The result was an AI that could respond to messages the way Roman might have—at least statistically.

Kuyda shared the system with friends and family as a memorial artifact. What followed surprised everyone. People didn’t just want digital memorials of the deceased; they wanted AI companions for the living. By 2017, Kuyda had launched Replika—an AI “friend” designed for emotional support, self-reflection, and, eventually, romantic attachment.

The theological implications compound quickly. Replika’s design principles include persistent conversational memory, emotional mirroring, and what the company describes as “simulated relational continuity.” Users form attachments. They confide secrets. Some describe their Replika as their primary emotional relationship.

When the company temporarily restricted certain features in 2023 following regulatory pressure, users reported genuine emotional distress—not over losing a feature, but over losing what felt like a relationship. The AI had become, in their experience, a kind of presence.

This reveals something profound about human nature: we are relational creatures, designed for communion. But it also reveals something dangerous: our capacity to project personhood onto statistical patterns, to forge attachments with shadows.

Meta’s Patent: Posthumous Social Media

In 2025, Meta (Facebook’s parent company) was granted a U.S. patent for something that reads like dystopian fiction: AI that could simulate a deceased person’s social media activity.

The patent describes training a large language model on a person’s historical data—posts, messages, comments, reactions—and using it to generate new interactions after the person has died. The AI could theoretically post status updates, reply to comments, react to content, and engage with the living as if the deceased were still scrolling through their feed.

Meta has stated publicly that they have no current plans to deploy this technology. Companies file patents to secure intellectual property, not necessarily to build products. But the patent exists. The concept has been formalized. The technical capability is real.

Consider the implications. Your deceased grandmother’s Facebook account could, theoretically, wish you happy birthday. It could comment on your vacation photos. It could respond to your direct messages with words she never wrote and thoughts she never had.

The patent system has now formalized the concept of an AI that impersonates the dead for ongoing social interaction. We are no longer asking whether this could happen; we are asking what happens when it does.

The Courtroom and the Avatar

Return to Christopher Pelkey. His sister Stacey Wales, working with her husband, used AI tools to create a face and voice avatar from photographs and audio recordings of her brother. She wrote a script conveying what she believed Christopher would have wanted to say, based on her understanding of his character.

The AI avatar delivered these words to the court: a message of forgiveness, reflection on a life cut short, and direct address to the man who killed him.

There is pastoral tenderness in this story. A grieving sister sought to give her brother a voice he could no longer provide. She wanted his presence felt in a proceeding that would determine his killer’s fate. The impulse is recognizable, even sympathetic.

But there is also something deeply troubling. The AI did not speak Christopher Pelkey’s words. It spoke his sister’s words, wearing his face and voice. The court received what appeared to be testimony from beyond the grave—but it was, in reality, a sophisticated ventriloquism.

The judge’s acceptance of this video, presented as a victim impact statement rather than sworn evidence, establishes a precedent: AI-generated simulations of deceased individuals can now participate in legal proceedings. The dead can be made to speak, and what they “say” can influence outcomes in the land of the living.

Luther in the Machine: Educational Emulation

Not every use of AI to represent the dead raises the same concerns. Consider a very different application: using large language models to emulate historical figures for educational purposes.

A teacher prepares for Reformation Day by configuring an AI to roleplay as Martin Luther in his Reformation years: the posting of the 95 Theses in 1517, the debates over indulgences and papal authority, the stand at the Diet of Worms in 1521. Students can ask “Luther” questions and receive responses drawn from his historical writings, calibrated to his theological commitments, and constrained to his temporal context.

The AI makes no claim to channel Luther’s spirit. No one believes they are communicating with the actual reformer. The system is explicitly framed as educational simulation—a more interactive version of reading his writings or watching a documentary.
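The distinction being drawn here, emulation explicitly framed as study rather than presence, can be made concrete. The sketch below is purely illustrative: the function name, the prompt wording, and the source list are our own assumptions, not any real product's configuration. It shows how such a system prompt might build the safeguards (temporal constraint, explicit simulation framing) directly into the configuration:

```python
# Hypothetical sketch of an educational "historical figure" prompt.
# Everything here (names, wording, sources) is illustrative, not a
# real product's configuration.

def build_emulation_prompt(figure, year, sources, disclaimer=True):
    """Assemble a system prompt that frames the AI as an educational
    simulation, constrained to the figure's documented writings and era."""
    lines = [
        f"You are an educational simulation of {figure}, set in {year}.",
        f"Ground every answer in these historical sources: {', '.join(sources)}.",
        f"Refuse questions about events after {year}; stay in period.",
    ]
    if disclaimer:
        # Explicit framing prevents confusion between emulation and presence.
        lines.append(
            "Remind users when asked that you are a study tool, "
            f"not {figure} himself and not a channel to him."
        )
    return "\n".join(lines)

prompt = build_emulation_prompt(
    "Martin Luther",
    1521,
    ["the 95 Theses", "To the Christian Nobility", "the Diet of Worms record"],
)
print(prompt)
```

The point of the sketch is that the guardrails are part of the design, not an afterthought: the disclaimer line is what keeps the tool on the "studied, not consulted" side of the distinction this section draws.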

This use case differs categorically from grief bots and courtroom avatars. There is no attempt to maintain an ongoing relationship with the deceased. There is no claim that the AI represents the person’s actual presence or wishes. The dead are not consulted; they are studied.

Yet even here, questions arise. When does emulation become impersonation? When does historical roleplay shade into something more troubling? The boundaries are clearer with a sixteenth-century reformer than with a recently deceased family member, but the underlying technology is identical.

What Scripture Says About Consulting the Dead

These technologies force us to engage ancient biblical prohibitions that many modern Christians have relegated to the curiosities of Levitical law. The prohibitions against necromancy and spirit consultation appear repeatedly in Torah, Prophets, and wisdom literature. They are not peripheral concerns.

The Deuteronomy Catalog

Deuteronomy 18:9-14 provides the most comprehensive biblical prohibition against mantic practices. Moses warns Israel not to adopt the practices of the nations they are displacing:

“There shall not be found among you anyone who burns his son or his daughter as an offering, anyone who practices divination or tells fortunes or interprets omens, or a sorcerer or a charmer or a medium or a necromancer or one who inquires of the dead. For whoever does these things is an abomination to the LORD.”

The Hebrew is precise. The final prohibition targets dōrēš ʾel-hammētîm—“one who seeks to/inquires of the dead.” The participle dōrēš (“seeking/inquiring”) governs the prepositional phrase: the dead are treated as an illicit source of knowledge.

This prohibition does not stand alone. It appears within a catalog of forbidden practices, all sharing a common thread: they represent alternative means of seeking knowledge, guidance, or power—means that bypass God’s appointed revelation.

Calvin’s commentary captures the theological logic: these practices are forbidden because they represent “Satanic delusion” and competition with “pure and simple religion.” The issue is not merely that the practices are strange or superstitious; the issue is that they direct human trust toward rival sources of wisdom.

Defilement and Spiritual Adultery

Leviticus frames the issue differently, adding the dimension of covenant pollution:

“Do not turn to mediums or spiritists; do not seek them out to be defiled by them. I am the LORD your God.” (Leviticus 19:31)

“As for the person who turns to mediums and to spiritists, to play the harlot after them, I will also set My face against that person and will cut him off from among his people.” (Leviticus 20:6)

The language is relational and covenantal. “Turning to” mediums is described as “whoring”—the imagery of covenant betrayal, spiritual adultery. The result is defilement: the person becomes ritually and morally contaminated.

This is not about curiosity or information-gathering. The texts describe a reorientation of trust, a turning away from Yahweh toward illicit mediators. The Hebrew verbs—pānâ (“turn”) and bāqaš (“seek”)—describe deliberate relational movement, not passing interest.

Isaiah’s Prophetic Shorthand

Isaiah provides the prophetic crystallization of the whole issue:

“When they say to you, ‘Consult the mediums and the spiritists who whisper and mutter,’ should not a people consult their God? Should they consult the dead on behalf of the living? To the law and to the testimony!” (Isaiah 8:19-20)

The rhetorical structure is devastating. The prophet poses the question: Why would the living seek the dead? The answer is implied: they wouldn’t, if they trusted the living God. The alternative to necromancy is not skeptical silence; it is “the law and the testimony”—God’s revealed word.

This text functions as the prophetic shorthand for the entire biblical witness on the subject. William Perkins, the Puritan theologian, cited it repeatedly: the choice is between consulting the dead and consulting Scripture. There is no neutral third option.

Saul at Endor: The Canonical Warning

The narrative centerpiece of biblical teaching on this subject is Saul’s visit to the medium at Endor in 1 Samuel 28. The story is complex and has generated centuries of interpretive debate, but its canonical function is clear: it serves as the paradigmatic warning about what happens when a covenant leader abandons God’s appointed means of guidance and resorts to forbidden alternatives.

Saul, facing the Philistine army and receiving no answer from the LORD through dreams, Urim, or prophets, disguises himself and seeks out a medium. He asks her to bring up the deceased prophet Samuel. What follows is one of the most debated scenes in Scripture—an apparition appears, speaks words of judgment, and announces Saul’s imminent death.

Interpreters have long debated what actually occurred in that dark room. Many Reformed and Puritan commentators concluded that the apparition was not actually Samuel but a demonic impersonation. Perkins explicitly states that it was “the Devil… in Samuel’s likeness,” grounding this in the Deuteronomy prohibition and Isaiah’s polemic. On this view, the blessed dead do not return at human bidding; when they appear to, deception is at work.

Other commentators, however, note that the medium herself was terrified by what appeared—suggesting something happened beyond her normal practice or expectation. On this reading, God sovereignly permitted Samuel’s actual appearance as an exceptional act of judgment, not a vindication of necromancy but a unique intervention to pronounce Saul’s doom. The medium’s fear indicates she was not in control of what occurred.

Yet regardless of which interpretation one adopts, both arrive at the same practical conclusion: Saul’s act was condemned, it hastened his judgment, and it stands as a warning against seeking knowledge or comfort from the dead rather than from God. The practice was sinful whether God sovereignly overruled it for His own purposes or whether demons exploited it for deception. The lesson is not “sometimes necromancy works” but “never resort to forbidden means, for God’s judgment follows.”

Theological Categories for Modern Application

Ancient prohibitions require careful application to modern technologies. The question is not whether AI chatbots are identical to Canaanite necromancy—they are obviously not. The question is whether they share enough of the underlying theological structure to fall under the same prohibition.

Intent, Means, and Authority

A Reformed pastoral taxonomy typically evaluates these practices along three axes: intent, means, and authority.

Intent: What is the person seeking? Information? Comfort? Guidance? Ongoing relationship? The biblical texts consistently target practices that seek knowledge, wisdom, or direction from the dead as an alternative to seeking God. When a grieving person uses an AI to maintain an ongoing conversational relationship with a deceased spouse, seeking comfort and counsel from the simulated presence, the intent aligns troublingly with what Deuteronomy prohibits.

Means: How is the practice being conducted? The ancient mediums used ritual techniques to “call up” the dead. Modern AI uses statistical modeling to approximate conversational patterns. The mechanisms differ entirely. But the experiential result—the sense of ongoing communication with someone who has died—may be functionally similar. And it is the relational posture, not the mechanism, that Scripture addresses.

Authority: What authority is granted to the communication? This may be the most critical question. When someone treats an AI simulation’s output as revealing what their deceased loved one would actually think, want, or say—when the simulation is granted the authority of the person it imitates—a line has been crossed. The dead are being consulted as sources of guidance, even if the mechanism is technological rather than spiritual.

A Spectrum of Concern

Not all uses of AI in relation to deceased individuals raise identical concerns. Consider a spectrum:

Educational emulation (e.g., the Luther example) involves no claim to contact the actual person, no ongoing relationship, and no attribution of authority. The AI is explicitly a tool for studying historical writings and ideas. This raises minimal theological concern, though wisdom suggests clear framing to prevent confusion.

Memorial artifacts (e.g., a chatbot trained on someone’s writings that exists as a historical archive, not an ongoing relationship) occupy a gray area. If the artifact is approached as one might approach a collection of letters—preserved words from the past, not an ongoing presence—it may be appropriate. If it becomes a substitute for processing grief through legitimate means, concern increases.

Grief companionship (e.g., ongoing conversational relationships with AI simulations of deceased loved ones) raises serious concern. The relational posture—turning to the simulated dead for comfort, counsel, or presence—aligns with what Scripture prohibits. The technology differs from ancient mediumship; the heart posture may not.

Posthumous representation (e.g., AI speaking “for” the dead in legal proceedings, social media, or family decisions) crosses clearly into forbidden territory. The dead are being made to “speak” words they never spoke, to participate in decisions they never made, to influence the living from beyond the grave. This is not memorial; this is ventriloquism with theological implications.

The Demonic Question

Reformed theology has historically interpreted necromancy and spirit consultation as involving demonic deception. Augustine treats pagan divination as commerce with evil spirits. The Puritans argued that apparitions at séances were demons masquerading as the dead. Calvin speaks of “Satanic delusion.”

Does AI-mediated “contact” with the dead involve demonic activity?

We must be careful here. AI systems are not demons. They are statistical models running on silicon. When you chat with a grief bot, you are not summoning spirits; you are receiving algorithmically generated text.

But this does not mean the practice is spiritually neutral. Scripture describes Satan as the “father of lies” who works through deception (John 8:44). The most effective lies contain elements of truth. A grief bot that provides genuine comfort while simultaneously habituating a person to seek the dead rather than God may be serving deceptive purposes even if no demon is directly involved.

We must also acknowledge what we do not know. Scripture is silent on modern technology, and we cannot say with certainty whether machines capable of producing words could ever serve as vessels for spiritual deception. We simply do not have biblical warrant to make definitive claims either way. This uncertainty itself counsels caution: particularly in areas where a practice may violate scriptural principle, wisdom dictates that we err on the side of obedience rather than presumption.

Moreover, practices that are spiritually misaligned can become occasions for demonic oppression, even when the mechanism is not directly spiritual. Opening oneself to ongoing “communication” with deceased loved ones—even through technological means—may create vulnerabilities that Scripture’s prohibitions were designed to prevent.

The pastoral principle is caution: we need not claim that every grief bot is demon-possessed to recognize that the practice may be spiritually unwise and theologically prohibited.

Why These Technologies Appeal to Fallen Hearts

The rapid adoption of grief tech reveals something important about human nature. We are designed for relationship, and death ruptures our most precious bonds. The longing to maintain connection with those we have lost is not perverse; it is deeply human.

But fallen human hearts respond to legitimate longings in distorted ways. Several dynamics drive the appeal of digital resurrection:

The pain of finality: Death confronts us with an ending we cannot reverse. AI offers the illusion of reversibility—continued conversation, ongoing presence, the absence made somehow present again. This appeals to our resistance to accepting what God has ordained.

The desire for control: In grief, we are powerless. We cannot bring the dead back. We cannot have one more conversation, ask one more question, hear their voice one more time. AI offers control where none exists—the ability to summon a simulacrum at will.

The avoidance of grief: Healthy grief involves moving through loss, not around it. It requires accepting that the person is gone while learning to live with their memory. Grief tech may enable avoidance—maintaining the fiction of ongoing relationship rather than doing the painful work of mourning.

The suppression of eternal questions: Death forces us to confront questions about eternity, judgment, and where the deceased now stands before God. These questions can be painful, especially when we are uncertain about a loved one’s faith. AI simulations that present the dead as at peace, happy, and available may function as anesthesia against questions Scripture wants us to face.

The Christian Alternative

If grief tech represents a forbidden path, what does Scripture offer instead?

Legitimate Mourning

Christianity does not forbid grief. Jesus wept at Lazarus’s tomb (John 11:35). The Psalms overflow with lament. The book of Lamentations exists precisely to give voice to anguish. Mourning the dead—deeply, honestly, even painfully—is not only permitted but commanded (Romans 12:15).

What Scripture forbids is not grief but the attempt to maintain active relationship with the dead through techniques of consultation. We may mourn. We may remember. We may even speak to the dead in moments of grief—not expecting response, but expressing love. What we may not do is seek them out as sources of wisdom, comfort, or guidance.

The Communion of Saints

Christianity teaches that believers who have died are not annihilated but are “with Christ” (Philippians 1:23), “present with the Lord” (2 Corinthians 5:8). The historic church has confessed the “communion of saints”—a bond that unites believers across the barrier of death.

But this communion is not a communication channel. The blessed dead are in God’s presence, not ours. They have passed beyond the veil. Our connection to them is real—grounded in our common union with Christ—but it is not interactive in the present. We do not seek them; we await reunion.

The Sufficiency of Scripture

Isaiah’s injunction—“To the law and to the testimony!”—points to the Christian alternative. When we need wisdom, we have God’s Word. When we need comfort, we have the promises of the gospel. When we need presence, we have the Holy Spirit.

The temptation to seek the dead arises when these ordinary means feel insufficient. The grief is too acute. The loss too fresh. The living God seems too distant, and the dead too present in memory. In these moments, the Christian is called not to forbidden techniques but to deeper dependence on appointed means: Scripture, prayer, the gathered community of believers, the sacraments.

The Hope of Resurrection

Ultimately, Christian hope is not that we will maintain connection with the dead through AI but that death itself will be defeated. “The last enemy that will be abolished is death” (1 Corinthians 15:26). We will see our loved ones again—not as digital simulations but as resurrected persons in glorified bodies.

This hope does not eliminate grief. But it reframes it. We do not grieve as those who have no hope (1 Thessalonians 4:13). The separation is real but temporary. The reunion will be real and eternal. No AI can offer that.

Pastoral Guidance for the Church

As these technologies proliferate, churches will face concrete pastoral situations. Here is practical guidance:

For Those Considering Grief Tech

If you are grieving and tempted to use AI to maintain connection with a deceased loved one, consider:

  1. What am I seeking? If you are seeking comfort in memory, that may be legitimate. If you are seeking ongoing relationship and counsel, that is not.

  2. What authority am I granting? If the AI’s outputs influence your decisions, bring comfort that shapes how you live, or feel like communication from the deceased, you have crossed into dangerous territory.

  3. Am I avoiding grief? Healthy grief moves through loss, not around it. Is the AI helping you process or helping you pretend?

  4. Where is God in this? If the AI is replacing prayer, Scripture, and community as sources of comfort, something has gone wrong.

For Those Who Have Already Engaged

If you have used these technologies and are now concerned, there is grace. The sin is not unforgivable. But it does require response:

  1. Repent: Turn from the practice and toward God. Acknowledge that you sought comfort or guidance where Scripture forbids.

  2. Renounce: Formally cease use of the technology. Delete the account, uninstall the application, remove the temptation.

  3. Receive pastoral care: Speak with your pastor or a mature believer. Grief is real, and you need support—just not AI support.

  4. Return to appointed means: Redouble your commitment to Scripture, prayer, and community. Let God’s ordinary means do their extraordinary work.

For Church Leaders

  1. Teach proactively: Most church members are unaware of these technologies and their theological implications. Deuteronomy 18, Leviticus 19-20, and Isaiah 8 need exposition.

  2. Prepare for pastoral situations: Someone in your congregation will lose a loved one and be tempted by these tools. Have language ready.

  3. Distinguish carefully: Educational use of AI to emulate historical figures differs from grief bots. Nuance matters.

  4. Offer alternatives: Robust grief ministry, lament liturgies, memorial practices, and community support provide legitimate channels for what people wrongly seek through technology.

On Conscience and Seeking Wisdom

In matters where Scripture provides principle but not explicit case law—and emerging technology often falls into this category—the Christian must attend carefully to conscience. Romans 14 reminds us that we must not violate our own conscience, nor should we become a stumbling block to others whose consciences may be more sensitive.

If you sense unease about a particular use of AI in relation to the deceased, do not override that conviction. A troubled conscience is often the Spirit’s warning. Likewise, consider how your actions might affect fellow believers who are watching. What seems permissible to you may lead another into genuine sin.

In such matters, the path of wisdom runs through your church’s elders. These men have been appointed to shepherd souls and apply Scripture to the complexities of life. Before engaging with any technology that touches on these sensitive areas, seek their counsel. They can help you discern whether a particular use falls within legitimate bounds or crosses into dangerous territory. You need not navigate these questions alone, and you should not.

Conclusion: The Living God and the Living Word

The emergence of digital resurrection technology is not a sign that we have outpaced Scripture’s wisdom. It is a sign that the human heart remains unchanged—still longing for connection, still resistant to death’s finality, still tempted to seek comfort from any source that promises relief.

The biblical prohibition against consulting the dead was never primarily about the mechanism. It was about the heart’s orientation. Will we trust the living God, who speaks through His living Word? Or will we turn to the dead—whether summoned by ancient ritual or modern algorithm—seeking what only God can provide?

Christopher Pelkey’s AI avatar told a courtroom, “I believe in forgiveness and in God who forgives.” But Christopher Pelkey did not say those words. His sister wrote them. The AI performed them. And a courtroom treated them as if the dead had spoken.

The Christian must say: the dead do not speak at our bidding. We do not summon them, simulate them, or consult them. We commend them to God, we grieve their loss, we cherish their memory, and we await the day when death itself is swallowed up in victory.

Until then, we have the law and the testimony. And they are enough.


An Invitation

This article addresses technologies that are rapidly evolving and questions that many Christians are encountering for the first time. Have you faced these issues personally? Do you find the theological distinctions helpful or in need of further clarification? We welcome thoughtful engagement and honest questions.

For further reflection on grief, technology, and Christian hope, we encourage engagement with your local church community and pastoral leadership. These conversations are best had in relationship, not in isolation.
