ISTM (I stopped following AI some time ago and may be entirely wrong) that there is a great chunk missing around emotion and appetite and intent. I can see how a paperclip maximiser might appropriate the world to turn it into paperclips, but I don't see how you get anything that is meaningfully a hunger for power for its own sake. (Probably this is a good thing and we don't want it.) How good are AIs at setting their own independent goals, rather than coming up with heuristics to achieve externally defined goals? Why do they get out of bed in the morning?
If you had a kid without emotion and appetite and intent, how would you begin to teach them anything?