I’m not my body and I’m not my mind. I am the ethical soul, the decision-making process. If the replacement makes all the same decisions I would, it IS me.
The thought experiment assumes a complete and perfect clone of every aspect of you, both the ones we understand and the ones we don’t. The reason the clone is not you is that if I do something to the clone, it does not affect you.
Like if you clone a water bottle: drinking from one does not empty the other. Therefore they must be two separate things.
What if something like ChatGPT is trained on a dataset of your life and uses that to make the same decisions you would? It doesn’t have a mind, memories, emotions, or even a phenomenal experience of the world. It’s just a large language dataset built from your life, with algorithms to sort out decisions; it’s not even a person.
Is that you?
No, because not all my decisions are language-based. As gotchas go, this one’s particularly lazy.
I’m having a hard time imagining a decision that can’t be language based.
You come to a fork in the road and choose to go right. Obviously no language was involved in that decision, but the decision can certainly be expressed in language, and so a large language model can make it.
But I don’t make all my decisions linguistically. A model that did would never act as I do.
It doesn’t matter how it comes to make a decision as long as the outcome is the same.
Sorry, this is beside the point. Forget ChatGPT.
What I meant was a set of algorithms that produce the same outputs as your own choices, even though it doesn’t involve any thoughts or feelings or experiences. Not a true intelligence, just an NPC that acts exactly like you act. Imagine this thing exists. Are you saying that this is indistinguishable from you?
“Is something that acts exactly like you act indistinguishable from you?”
Well, yes.