The perception thing is tricky. How do you even know another human perceives things as you do? AI teaches us some truths through its failures and successes.
It's a fallacy to think that we only think with our "mind". We think when we "try something out"; we think with our "gut"; we "take a walk"; we "sleep on it". We toss rocks, fiddle with rosaries, bow to Mecca. All of these engage other parts of our body, even parts of our environment. (How do people make friends with a river or the wind? How do they perceive seasons? What is the epistemological basis of "Saturnine"? Why do we name some inanimate objects, in particular ships; and why feminine?) By my lights, if you carry your smartphone all day, it's a part of you; I avoid making it a habit for this very reason.
I've had serious astigmatism my whole life, and I managed to function uncorrected into my forties. My "vision" was extremely good, especially in a natural environment. Astigmatism means everything isn't in focus at one place at once, and there are interference patterns; so I'm hard-wired to see certain patterns. Now imagine a fly's eyes. The individual units don't focus, to the best of my understanding. Think of them as individually manufactured, where the quality control may not be what you'd expect: maybe it's expected that some focus near and some focus far; maybe some even suffer from, e.g., astigmatic conditions. If an AI can classify images straight from raw JPEG data, it's reasonable to suspect it can integrate sensor data from multiple heterogeneous units incorporating light sensors.
So at some level it builds a model. The question has to be how much introspection the control unit can bring to bear on that model, and whether that helps it interact with its environment more efficiently or effectively.
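To put the fusion claim on slightly firmer ground, here's a toy sketch of my own (everything in it is made up for illustration: the unit counts, the blur model, the ridge-regression readout): a "compound eye" of heterogeneous, individually out-of-focus light sensors, plus a simple learned readout that still recovers the scene from the raw sensor vector.

```python
import numpy as np

rng = np.random.default_rng(0)

SCENE_LEN = 64    # samples in the 1-D "world" the eye looks at
N_UNITS   = 200   # photoreceptor units, none individually in focus
N_TRAIN   = 2000  # training scenes for the learned readout

def make_unit_kernels():
    """Each unit gets a random pointing direction, a random blur width
    (some 'focus near', some 'far'), and a random gain (sloppy quality
    control) -- no two units see the world the same way."""
    x = np.arange(SCENE_LEN)
    kernels = np.zeros((N_UNITS, SCENE_LEN))
    for i in range(N_UNITS):
        center = rng.uniform(0, SCENE_LEN)
        width  = rng.uniform(1.0, 8.0)
        gain   = rng.uniform(0.5, 1.5)
        k = np.exp(-0.5 * ((x - center) / width) ** 2)
        kernels[i] = gain * k / k.sum()
    return kernels

KERNELS = make_unit_kernels()

def sense(scene):
    """Raw sensor vector: each unit integrates the scene through its
    own kernel, plus a bit of read noise."""
    return KERNELS @ scene + 0.01 * rng.standard_normal(N_UNITS)

def make_scene():
    """A smooth random scene: a few random low-frequency components."""
    x = np.linspace(0, 2 * np.pi, SCENE_LEN)
    s = sum(rng.normal() * np.sin(k * x + rng.uniform(0, 2 * np.pi))
            for k in range(1, 5))
    return (s - s.min()) / (s.max() - s.min() + 1e-9)

# The "model" the control unit builds: a ridge-regression readout that
# maps raw heterogeneous readings back to the scene. It never needs
# per-unit calibration data -- the mapping is learned end to end.
scenes   = np.stack([make_scene() for _ in range(N_TRAIN)])
readings = np.stack([sense(s) for s in scenes])

lam = 1e-3
W = np.linalg.solve(readings.T @ readings + lam * np.eye(N_UNITS),
                    readings.T @ scenes)

# Reconstruct an unseen scene from the raw sensor vector alone.
test_scene = make_scene()
recon = sense(test_scene) @ W
print("mean abs reconstruction error:",
      round(float(np.abs(recon - test_scene).mean()), 3))
```

None of the units is in focus and none is calibrated; the readout compensates for all of it at once, which is roughly what I suspect my own visual system did with the astigmatism.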