Nature of Consciousness

What is the nature of our consciousness, and how do we form our sense of individuality?
It is of course a philosophical question, but Roger Penrose has done a lot of work showing how science may be able to explain it. The question arises from many problems, and its answer would in turn create many more.

It is a hot age for artificial intelligence, machine learning, data science and the like, so what exactly would a true AI be? In technical terms there is ANI and AGI: ANI stands for Artificial Narrow Intelligence, and AGI for Artificial General Intelligence. As the names suggest, we already have ANI in many spheres, but creating an AGI is still a milestone ahead of us. But can we really create such a thing? If yes, how can we know it isn't just an advanced ANI? If we really succeed, what does that say about our mind, or our individuality? What if we make multiple copies of it and they all share the same individual identity? Could that happen to us? Is our sense of individuality and consciousness just an illusion?

The book "Shadows of the Mind" by Roger Penrose discusses these questions in a very comprehensive way. There is no way I can summarize the whole book, but it touches on quantum mechanics, neuroscience and philosophy. Penrose begins by proposing four viewpoints that can be held about the mind, only one of which can correctly describe the nature of our consciousness.

The four viewpoints are:
  1. All thinking is computation; in particular, feelings of conscious awareness are evoked merely by the carrying out of appropriate computations.
  2. Awareness is a feature of the brain's physical action; and whereas any physical action can be simulated computationally, computational simulation by itself cannot evoke awareness.
  3. Appropriate physical action of the brain evokes awareness, but this physical action cannot even be properly simulated computationally.
  4. Awareness cannot be explained by physical, computational, or any other scientific terms.

So if 1 is right, we can make an AGI that will in principle be the same as our mind.
If 2 is right, we can make some sort of AI whose behaviour is indistinguishable from our own mind's, but which in principle will not be aware.
If 3 or 4 is right, we can never make an AGI.

The book contains a great deal of discussion, including some fairly advanced mathematics, and Penrose uses Gödel's incompleteness theorem to argue that one of these viewpoints is completely wrong.
We can only keep trying to perfect AI; even if we produce a masterpiece, we cannot be sure whether viewpoint 1 or 2 is the true one.
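For readers unfamiliar with the theorem Penrose leans on, here is a rough informal statement of Gödel's first incompleteness theorem (my own sketch, not the book's precise formulation):

```latex
% Gödel's first incompleteness theorem, informally:
% for any formal system F that is consistent and strong enough
% to express elementary arithmetic, there is a sentence G_F with
F \nvdash G_F
\quad\text{and}\quad
F \nvdash \neg G_F,
% even though, under the standard interpretation of arithmetic,
% G_F is a true statement.
```

Penrose's argument, very roughly, is that a human mathematician can come to see that $G_F$ is true for any such system $F$, so human mathematical understanding cannot be captured by any single formal system, and hence not by any algorithm.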

