probability exploitation
It was the last day before retirement of a co-worker i've worked with for 19 years. It was a short and painless goodbye, with the idea of meeting again next spring. Something i've heard too often; often enough that i keep my schedule free for the whole of spring.
An acquaintance's husband died last summer. She already has a lot of mental problems—among them dissociative identity disorder—and, through a digital neighbour, got introduced to Chat GPT as her new therapist. Years ago, i witnessed a downfall episode of hers, self-destructive behaviour that ended with her replacing her pills with candy, and being institutionalised[^1] when her doctor found out. Since then she has had her ups and downs of course, even in the early stages of grieving, which began when she took over the palliative care of her husband. She first said the word grief to me when the house was emptied of his things, when a few photographs and a whiff of his smell were all of him that remained. She talked about how she felt different, fragmented.

Chat GPT tells her everything is ok, yet she talks of problems at work, of being less motivated, of being exhausted. And while i understand why she doesn't take it, she needs rest. I worry, and am not sure how to articulate my worries so they come through. She always recommends the AI to me; not in a million years will i use it as a therapist. A machine that's there to keep my attention, and therefore has no interest in helping me cope, isn't what we need. Of course there are therapists who work the same way, yet their advantage is the human experience. At a basic level, most of them are able to empathise with what they're told by their patients. I'll risk being ignored or misinterpreted over probability exploitation.
[^1]: This is a tricky topic … it happened to me as well, in my case after a suicide attempt. I'm against locking up people for being sick and a threat to themselves, but i understand that it is uncertain whether self-harming behaviour will spiral into hurting others.