Discussion about this post

Terry Underwood

As a layperson, my first response to the idea that the mind is software and the brain hardware was WHAT? My resistance stemmed from a sort of emotional horror at the idea that the mind is premade and downloaded into the brain by the authorities, possibly ethereal ones. But then I thought, well, mind software could be different from computer software in that the brain might write its own code in response to experience. I'm going to let a day or two pass while I think about it and then reread your post, which, btw, is stunning. I love the way you teach through writing! It's such a rare talent.

Anyway, I realize that we are talking in analogies and metaphors with these terms, but mortality, with regard to both humans and computers, is not a metaphor but a fact. And the idea of consequences seems to me to make a crucial difference. I just published a post of a talk I had with Claude in which I asked Claude for the best directions to an airport during rush hour. I didn't see it while writing, but the difference between Claude's output and my input was this: Claude could predict nothing, and prediction was everything to me, because I would be the one in the car driving. I had skin in the game. What Claude and I were talking about meant something to me. Claude more or less admitted that it meant nothing to "him" and told me to consult a traffic app for in-the-moment data. Claude is not only not mortal, Claude is not "in the moment." Human consciousness came about because of its survival benefits. I can't imagine consciousness in something we know is inorganic and purely physical, like a fence post. Suzi, please critique my reasoning here if I'm completely off topic. I may just be rambling, which is proof that this isn't coming from a bot :)

Oh, and I don’t respond to the word “mortal” in the way Mark Young does. I do think it’s clever, the point about Russian drones. It makes me think of suicide. AI isn’t ever “suicidal” is it? Can a rock or a stoplight be suicidal? I think mortal is a particularly apt word.

Nick Potkalitsky

Great work, Suzi! The idea of building more organic computer systems has fascinated me for a while, particularly imbuing them with properties like mortality. But it strikes me that this will be a very difficult process. It just sort of reduplicates the question of cultivating in a computer system true awareness of its own processes, including their limits. Technically, all computer systems today are just one environmental or systemic disaster away from erasure, and yet current systems can only report on their own internal processes to the extent that we can program that reporting inside the vectors of machine learning or other emergent processes.

