mtbc: maze I (white-red)
Mark T. B. Carroll ([personal profile] mtbc) wrote 2025-04-11 04:39 pm

A possibly good side of modern AI research

In writing here previously on generative AI, I had wondered what happens to society when people can routinely lose themselves in artificial worlds of their own design. I had been thinking from the point of view of their being able to act out dark fantasies, then having to adjust to the real world, where they don't make the rules and their actions affect others.

Though many works of fiction have anticipated it over the years, I was slow to consider what may be a good side of advances in training inference models. There are many people who don't have enough contact with friends, perhaps especially the elderly. We may not be far from a point where they can have an artificial companion, patient and configurable, that offers interesting and helpful conversation on whatever topics the user wishes, even joining them actively in some pursuits, far beyond Alexa, which can do little more than read out the results of web searches.

Such companions may be considered a poor substitute for human contact, but I suppose there are probably already funded startups chasing this very market.
mellowtigger: (artificial intelligence)

[personal profile] mellowtigger 2025-04-12 01:19 am (UTC)(link)
Back in October, I was trying to find counseling help for someone here in Minnesota. I discovered that there is already at least one AI available for PTSD counseling, linked below. I told her it's probably not safe to use it for actual counseling, but that it would be worth asking it for help in finding a local therapist.
https://www.myptsd.com/forums/ask-aria-catalyst-ai.372/

I see that Psychology Today mentions other AI chatbots, discussing their benefits and dangers. I'm reminded of those pictures of young monkeys clinging to fake mother figures; I hope that scenario hasn't been repeated since the Harlow experiments. Still, if someone has few better options, I do think that faked comfort is better than no comfort at all.
darkoshi: (Default)

[personal profile] darkoshi 2025-04-12 09:13 am (UTC)(link)
When I read this article on AI companions back in December, I was surprised to learn that this kind of thing has existed since at least 2017:
https://www.theverge.com/c/24300623/ai-companions-replika-openai-chatgpt-assistant-romance
https://en.wikipedia.org/wiki/Replika
shadowkat: (Default)

[personal profile] shadowkat 2025-04-13 04:59 pm (UTC)(link)
They already have robot dogs and cats for nursing-home residents and other elderly patients.