For two years, Stacey Wales kept a running list of everything she would say at the sentencing hearing for the man who killed her brother in a road rage incident in Chandler, Ariz.
But when she finally sat down to write her statement, Wales was stuck. She struggled to find the right words, but one voice was clear: her brother’s.
“I couldn’t help hear his voice in my head of what he would say,” Wales told NPR.
That’s when the idea came to her: to use artificial intelligence to generate a video of how her late brother, Christopher Pelkey, would address the courtroom and specifically the man who fatally shot him at a red light in 2021.
Death must have been a sweet release for this poor dude. His family couldn’t be assed to say a few genuine words over his death and outsourced the job to AI.
There are whole swaths of population that are outsourcing their basic human functions to AI.
That a judge allowed this travesty in an actual courtroom is even more of a travesty.
Derek Chauvin’s lawyers have a chance to do the funniest thing.
NPR is propaganda central, though. It's worth thinking through the implications of NPR pushing AI outsourcing of personal communications generally, and of courtroom statements in particular.
That’s not a good direction.
…and let’s not forget using AI to resurrect simulacra of dead people. I don’t even want to think about where that’s going, given that I quit NPR over their nonstop, totally relentless push to normalize, and desensitize the public to, every sexual fetish imaginable.
Next stop: AI necrophilia.
For the noble purpose of ~~checks notes~~ forgiving black murderers of white people.
yeah, that would be so NPR.
“That’s not a good direction”
I can imagine worse….
AI trials: Evidence is fed into AI which determines the verdict.
oof.
I hate how plausible that sounds.
“AI trials: Evidence is fed into AI which determines the verdict.”
How is this worse than “AI determines the disease and the course of treatment”?
Or “AI determines the remaining years of life and the cost of health insurance, if one is permitted to buy it at all”?
“How is this worse”
They’re all terrible, and probably all coming, because most people are too stupid or lazy to prevent it….
Disease is biological and objective. Justice is human, subjective, and very culturally and historically circumscribed. A heart attack or a tumor is always that, at any time in history and in any culture. But what’s considered a crime, and what mitigates it, varies dramatically.
As an example, killing under the influence of alcohol was considered an aggravating factor in the USSR. But it’s considered a mitigating factor in the US. I was really stunned when I found out because that’s so different from what I always knew.
“Justice is human, subjective and very culturally and historically circumscribed. … But what’s considered crime and what mitigates it varies dramatically.”
First of all, the main aggravating and mitigating factors can be easily fed into AI.
Even more importantly, people will argue that AI is the only entity capable of dispensing true justice, impartially enforcing laws free from human emotions and prejudices. For the first time in human history, justice will be blind.
One could hope woke people would help you fight this trend after seeing the disproportionate effect of AI verdicts on people of color, yet I am not optimistic, both because AI can be programmed to treat minority status as a mitigating factor and because it’s much cheaper than human judges.
It’s already a reality btw:
Or this article:
https://dl.acm.org/doi/full/10.1145/3696319
AI is hardly free from prejudice. It’s a lefty propaganda machine. Every time I use it, I have to battle to keep it from giving me lists of propaganda points.
There’s another issue, much more important than accuracy. AI in court decisions and all other avenues of life is a technocratic libtard dream. And here’s why:
As for Grok, I’ll put some evidence in the new post. If you trust that piece of crap about anything, I don’t know what to say.
It’s strange, and maybe dehumanizing, to make an AI statement that puts words in the victim’s mouth at the victim impact hearing for the guy who murdered him, and then to have this AI mouth “I’d forgive you,” even if his family is right that he’d say that. The judge loved it and added another year and a half to the murderer’s sentence over what the prosecution asked for.
Very strange. I don’t understand why this was allowed.
People need to start rewriting their wills, prohibiting relatives from making AI versions of them.
There’s a very interesting lawsuit filed against Meta/Facebook which I hope is successful. Using AI to destroy your political enemies is so effective because there’s always plausible deniability. “Sorry, it was the algorithm.”
And that needs to stop.
This is really shocking and vile. I hope he trashes them in court.