When someone dies, you don’t expect to hear from them again. But that’s exactly what seemed to happen in the case of Christopher Pelkey. No, it wasn’t dark magic. It was something a bit more uncanny: artificial intelligence. The Arizona man was shot and killed at a red light several years ago. When his killer finally faced sentencing, Pelkey’s family used AI to deliver a message in court through his recreated voice and image. They combined old videos, voice recordings, and personal memories to digitally bring him back, if only for a moment.
In the AI-generated video, Pelkey appeared to address his killer directly. He spoke of forgiveness, even saying that in another life, the two might have been friends. One powerful moment featured a digitally aged photo of him, symbolizing the life he never got to live and the years taken away by the shooter’s actions.
The family chose to create the AI video after everyone who knew Christopher Pelkey agreed it was the right thing to do. They wanted to show the world how Pelkey might have responded if he had lived to hear his killer’s sentencing. While some may find the idea unsettling, the Arizona judge was visibly moved by the video and ultimately sentenced the shooter to 10.5 years in prison instead of the expected 9.5.
Though the family created the video as a way to cope with their grief, their decision made waves in the U.S. justice system. Arizona Chief Justice Ann Timmer weighed in on the use of AI, describing it as both very useful and potentially dangerous. She also confirmed that the court has formed an AI committee to study and evaluate how the technology might be used in future legal proceedings.
Reddit users who read about the AI-generated video being used in court had mixed reactions, with many expressing discomfort and concern over its use. “Keep AI out of the justice system or we are all screwed,” one Redditor wrote. Others criticized the decision to recreate a deceased person’s face and voice, calling it inappropriate and unsettling. “Very strange. I get it, but I’d absolutely hate it if my family created an AI of me posthumously, saying things I never actually said,” another user commented.
Others even remarked that people might want to start including clauses in their wills to prevent family members from using their face or voice in AI-generated content. Some felt that the use of the video bordered on blatant emotional manipulation, arguing that bringing a dead person “back” to speak in court crosses a moral line, regardless of the intention.