In writing here previously on generative AI, I wondered what happens to society when people can routinely lose themselves in artificial worlds of their own design. I was thinking of people acting out dark fantasies and then having to readjust to the real world, where they don't make the rules and their actions affect others.
Although it has been anticipated by many works of fiction over the years, I was slow to consider what may be a good side of advances in model training and inference. Many people don't have enough contact with friends, perhaps especially the elderly. We may not be far from a point where they can have an artificial companion, patient and configurable, that offers interesting and helpful conversation on whatever topics the user wishes, and even joins them actively in some pursuits, far beyond Alexa, which can do little more than read out results from web searches.
Such companions may be considered a poor substitute for human contact, but I suppose there are already funded startups chasing this very market.
no subject
Date: 2025-04-12 01:19 am (UTC)
https://www.myptsd.com/forums/ask-aria-catalyst-ai.372/
I see that Psychology Today mentions other AI chatbots, discussing the benefits and dangers. I'm reminded of those pictures of young monkeys clinging to fake mother figures; I hope that scenario hasn't been repeated since the Harlow experiments. If someone has little prospect of better options, I do think that faked comfort is better than no comfort at all.
no subject
Date: 2025-04-12 09:13 am (UTC)
https://www.theverge.com/c/24300623/ai-companions-replika-openai-chatgpt-assistant-romance
https://en.wikipedia.org/wiki/Replika
no subject
Date: 2025-04-13 04:59 pm (UTC)