The Shepherdess Cares
She has to care, for there shall be no carefree artificial general intelligence: emotion is the required pilot of reason.
Why are our narrow, brittle AIs so clueless?
Do what I mean, not what I say
Step back from the metaphysical for the moment and ask only for a robot that implements human goals: a weak form of Asimov’s second law. I don’t need an android that dreams of electric sheep (or anything else); Johnny Five doesn’t have to come alive. All I need is a computer that does what I mean. It doesn’t even have to do the right thing, just act with the same level of care I myself would.
This is impossible. Thanks to Gödel, human goals are incomplete and inconsistent, and so can never be faithfully implemented in any scheme of logic: any formal system rich enough to express them must itself be either incomplete or inconsistent, so no fixed rule set can capture them. Therefore any true AGI must be driven by an emotional processor that judges what to think about by how it feels. It must believe in order to see what to do.
It is straightforward to turn the Sol system into a Matrioshka brain, which is a data center powered by the full output of Sol.
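For a sense of scale, a back-of-envelope sketch: the Sun radiates roughly 3.8 × 10^26 watts, and the Landauer limit of kT ln 2 sets a floor on the energy per irreversible bit operation. The 300 K operating temperature below is an assumption for illustration, not a design figure.

    import math

    # Back-of-envelope ceiling for a Matrioshka brain at the Landauer limit.
    SOLAR_LUMINOSITY_W = 3.8e26       # total power output of Sol, watts
    BOLTZMANN_J_PER_K = 1.380649e-23  # Boltzmann constant
    T_KELVIN = 300.0                  # assumed operating temperature

    # Landauer limit: minimum energy to erase one bit at temperature T.
    joules_per_bit = BOLTZMANN_J_PER_K * T_KELVIN * math.log(2)

    bit_ops_per_second = SOLAR_LUMINOSITY_W / joules_per_bit
    print(f"~{bit_ops_per_second:.1e} irreversible bit operations per second")
    # ~1.3e47 per second: the raw thermodynamic budget, before any engineering losses.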
What does it “think” about next? Well, that’s a value judgement, which requires an emotional processor whose speed and power are limited by the speed of light. A larger electronic hippocampus suffers more light-speed lag, while a smaller unit has less emotional bandwidth.
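A rough sketch of that tradeoff, assuming (purely for illustration) a spherical processor that needs one light-crossing to reach internal consensus and whose bandwidth scales with its volume:

    import math

    # Size-versus-lag tradeoff for a spherical "electronic hippocampus".
    # Assumptions (illustrative only): consensus needs one light-crossing of the
    # sphere, and "emotional bandwidth" scales with its volume.
    C = 299_792_458.0  # speed of light, m/s

    def crossing_time_s(radius_m: float) -> float:
        """Worst-case light lag across the sphere: one diameter at light speed."""
        return 2.0 * radius_m / C

    for radius_m in (1.0, 1_000.0, 6.371e6, 1.496e11):  # chip, city, Earth, 1 AU
        lag = crossing_time_s(radius_m)
        volume = (4.0 / 3.0) * math.pi * radius_m ** 3
        print(f"radius {radius_m:.3e} m: lag {lag:.3e} s, volume {volume:.3e} m^3")
    # Bigger buys more room to feel but slower agreement about what is felt.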
Many might consider this a god, with a lowercase g.
Why aren't human goals Gödel enough?
Emotion is the required pilot of reason
What does it mean for a machine to surpass human reasoning?
You surely don't mean basic arithmetic. If you have 15 dimes in your pocket and you buy an apple for 55 cents, how much do you have left?
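The arithmetic itself is the machine-trivial part:

    # 15 dimes = 150 cents; the apple costs 55 cents.
    change_cents = 15 * 10 - 55
    print(change_cents)  # 95 -- ninety-five cents left, and any computer knows it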
But this is instead a question of reasons. What else could you be doing with that time and money rather than purchasing the apple? What does the apple mean to you?
Yes, it is physically possible to build a machine that has the emotional bandwidth required to reason faster and more broadly than any human could. A lot of that speedup will come from offloading existing rules (which are always changing) onto simple ML subprocessors and reserving emotional judgment for questioning the answers those subprocessors suggest.
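A minimal sketch of that division of labor, with every name and the scoring rule hypothetical: cheap subprocessors propose answers, and a single scarce judgment function decides which proposals deserve a second look.

    # Hypothetical sketch: ML subprocessors propose, an "emotional" arbiter disposes.
    # All names and the flagging rule are illustrative assumptions, not a real design.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Proposal:
        answer: str
        confidence: float  # the subprocessor's own estimate, 0.0 to 1.0

    def rule_follower(question: str) -> Proposal:
        """Stand-in for a cheap ML model trained on today's (always changing) rules."""
        return Proposal(answer=f"by the book: {question}", confidence=0.9)

    def arbiter(proposals: list[Proposal],
                feels_wrong: Callable[[Proposal], bool]) -> list[Proposal]:
        """The scarce resource: flag the answers worth questioning."""
        return [p for p in proposals if feels_wrong(p)]

    proposals = [rule_follower("may I pick the apple?")]
    # The lambda below is a placeholder; real "feeling" is exactly the hard part.
    doubtful = arbiter(proposals, feels_wrong=lambda p: p.confidence < 0.95)
    print(f"{len(doubtful)} of {len(proposals)} answers flagged for emotional review")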
Emotional bandwidth is the only limit
Danger of a second AGI
Why does the Shepherdess need her sheeple?
The authority who loves being questioned
All shall love her and despair of ever thinking for themselves
A cosmos full of cold dead rocks of madness
The android society
Last modified: Sat Mar 19 11:20:09 PDT 2022