You could define it that way. I think it could be more abstract than that, personally, because
a. Is the nervous system in animals the only neural network in nature? I’ve heard discussion on whether some types of fungus are conscious, based on how they send chemical signals to other parts of the fungus. The signaling is slow, but does it count? And then there’s the collective consciousness of ant colonies and beehives. That’s a level above, where each bug’s nervous system is itself a node in a larger neural network.
b. I think that consciousness is more than just the nervous system. In another comment under this post I argued that a neural network (in an abstract sense) can only “think” in terms of the sensors it has access to. What does the lab-grown brain think about? It’s never seen things, it’s never heard sounds or words, can it feel touch? (I’m not an anatomy guy.) My hunch is it’s just static, essentially an “untrained” neural network. Does that count as conscious? Maybe those senses are considered a part of the nervous system; again, I’m not an anatomy guy.
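The “untrained network is just static” hunch can be sketched in a few lines. This is a toy illustration, not a model of a real brain: a tiny two-layer net with random weights maps any input to arbitrary activations, and a blank input (no senses at all) produces no activity at all.

```python
import math
import random

random.seed(0)

N_IN, N_HID, N_OUT = 8, 4, 2

# Random weights stand in for a brain that has never been shaped by senses.
w1 = [[random.gauss(0, 1) for _ in range(N_HID)] for _ in range(N_IN)]
w2 = [[random.gauss(0, 1) for _ in range(N_OUT)] for _ in range(N_HID)]

def layer(x, w):
    """One fully connected layer with a tanh squash."""
    return [math.tanh(sum(xi * w[i][j] for i, xi in enumerate(x)))
            for j in range(len(w[0]))]

def forward(x):
    return layer(layer(x, w1), w2)

blank = [0.0] * N_IN                            # "no senses": nothing coming in
noise = [random.gauss(0, 1) for _ in range(N_IN)]

print(forward(blank))   # [0.0, 0.0] -- no input, no activity at all
print(forward(noise))   # arbitrary values: structureless "static"
```

With nothing coming in, every tanh unit sits at zero; feed it noise and you get noise back out. Either way, nothing resembling a trained response.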
But then how do the “chemical computations” like hormones and gut bacteria come into play? Are they just indirectly sensed by the nervous system?
I’m really not exactly sure what qualifies, but an emergent system still has to be there. Does fungus communication give rise to a system that can build some kind of memory and refer to it to develop more complex behavior? If not, then it’s lacking the level of complexity to be considered consciousness. (But that’s just where I personally draw the line.)
Eusociality has its own context. It’s possible for a hive to show complex organized behavior, but so would an infinite paperclip machine if it were to consist of a swarm of collector drones. A myriad of units with a set of pre-determined instructions can organize in complex ways, which still wouldn’t qualify as consciousness.
Now, the brain scenario would definitely count since it consists of the necessary “hardware” to start generating its own abstract contextual model of its experiences.
A myriad of units with a set of pre-determined instructions
Like neurons? My argument was that in an abstract sense, a single ant could be considered a neuron. It senses the environment and other ants for inputs, and it interacts with the environment and other ants for output. A network of ants is capable of complex behavior. By this logic of course, just about any entity could be considered a neuron, and any collection of entities a neural network, which I think is what the original article is getting at. Now is the ant colony conscious? I don’t know. Am I conscious? I think so, it seems like it. Are you conscious? You seem a lot like me, and I think I probably am, so I think you probably are too. Basically what I’m saying is I haven’t heard of a definition of consciousness that doesn’t wind up encapsulating everything or nothing, or that isn’t human-centric.
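The ant-as-neuron abstraction can be made concrete with a toy sketch. The recruitment rule here is invented for illustration, not real ant behavior: each ant is a threshold unit whose input is what the other ants are doing, and the colony as a whole settles into a collective state that no single ant computes on its own.

```python
def ant_step(signals, threshold=0.5):
    """Each ant starts recruiting if the fraction of active ants it senses
    (here: the whole colony, for simplicity) exceeds a threshold -- the same
    shape of rule as a threshold neuron."""
    active_fraction = sum(signals) / len(signals)
    return [1 if active_fraction > threshold else 0 for _ in signals]

# Start with a slim majority of ants already recruiting.
colony = [1, 1, 1, 1, 0, 0, 0]
for _ in range(3):
    colony = ant_step(colony)

print(colony)  # the majority signal takes over the whole colony
```

The point of the sketch is only that threshold units wired to each other produce colony-level behavior, which is exactly why the “anything is a neuron” framing ends up encapsulating so much.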
Now, the brain scenario would definitely count since it consists of the necessary “hardware” to start generating its own abstract contextual model of its experiences.
So, you’re saying that you don’t need experience to be conscious, just the potential to experience? I’m not sure I agree with that. Yeah, there are diminishing returns; I don’t think an old person is significantly more self-aware than a kid in the grand scheme of things. But pretty much every thought I’ve ever had, that I realized I had anyway, was in terms of a sense I had, or at least derived from the senses. Even a newborn has been feeling and hearing since it was an embryo. Now there is instinct to consider: instinct evolved, and while it can influence and direct consciousness, I don’t think acting on instinct is a conscious act itself. What I’m asking is, can a brain in a jar with no contact with the world, that’s never had contact with the world at any point, be aware of itself? What is self without environment?