aldabra ([personal profile] aldabra) wrote in [personal profile] mtbc 2021-07-21 11:18 am (UTC)

ISTM (I stopped following AI some time ago and may be entirely wrong) that there is a great chunk missing around emotion and appetite and intent. I can see how a paperclip maximiser might appropriate the world to turn it into paperclips, but I don't see how you get anything that is meaningfully a hunger for power for its own sake. (Probably this is a good thing, and we don't want it.) How good are AIs at setting their own independent goals, rather than coming up with heuristics to achieve externally defined goals? Why do they get out of bed in the morning?

If you had a kid without emotion and appetite and intent, how would you begin to teach them anything?
