Is it possible computers could become conscious one day?
I dunno. What is consciousness?
--
Anonymous Post Author
8 years ago
Self-awareness.
--
dom180
8 years ago
Good answer, but what threshold of self-awareness do they have to cross? How self-aware must they be? What test would we use to determine whether they've reached it?
And do humans even have consciousness by that definition? We're not completely self-aware all the time. Most of us can't coherently express our place in the world around us. We have existential crises. We struggle with what it even means to "be our self".
It seems to me that self-awareness is an abstract sliding scale, and if we take your definition then consciousness must be an abstract sliding scale too. It isn't something we either have or don't have, but something we have in degrees. Consider a hypothetical being with far greater self-awareness than ours, simply because it has a more efficient brain able to process vast amounts of information at a rate we could never imagine. A being that would find a question like yours laughably easy. It would have such great self-awareness that it would understand what consciousness means at a level we never could, and might find human experience so meagre in comparison that it would never judge us as conscious, in the same way that we would probably never say a fly or a mosquito is conscious. Judgements about which beings have consciousness can only be made relative to what the judges (in this case, human beings) have. It's not objective.
--
reminiscent
8 years ago
They usually measure an animal's self-awareness by placing a mirror in front of the animal and seeing whether it can recognise the reflection as itself and not another animal.
I suppose you would do the same for an artificial intelligence... :P
Ok, that's very interesting.
But just because it may be impossible for us to measure a definite point at which a computer becomes aware of itself doesn't mean, of course, that it couldn't become aware of itself; your point seems to be that we would have trouble measuring when that happens, because self-awareness is on a scale. It reminds me of the sorites (heap) paradox: when does a heap become a heap? How many grains of sand do you need to make a heap, and if you have a heap and take a grain away, is it no longer a heap? The point is that there is a blurry section between two distinct outer parts. That is interesting in terms of consciousness. Children are not really self-aware until a certain age, and then there are dreams, drugs, and other altered states. Some also argue that animals have a degree of self-awareness, especially primates and perhaps dolphins? Not sure about dolphins.
So, like many things, self-awareness and consciousness are slippery to pinpoint and define. I just read that Bertrand Russell extended the grain-of-sand paradox to words like "tall", "rich", "old", "blue", and "bald".
I think theoretically we could judge when a computer, an animal, or some other being gains a level of consciousness similar to our own. We might not see it creeping up on us (as in the heap paradox), and we will only be able to accept and judge it as consciousness if/when it manifests in a way we can recognise. But I don't think those limitations mean we could never recognise that a computer had gained self-awareness. I think we could recognise it, that is, if it could occur in the first place.