2017-11-26

The irony of the singularity

Since the seventies I have, from time to time, pondered the AI technological singularity. I do believe we will eventually reach the singularity, but not anytime soon. This blog post sums up my ideas well. In terms of technological skill we are very far from creating something that can be smarter than us; in fact, we still do not know whether it is possible to be smarter than we are. When discussing AI we have, ever since Turing, talked about the electronic brain or computer singularity. I find it more likely that we can enhance our brains by tweaking our own DNA; whether that will make us smarter remains to be seen.
Anyway, I was discussing the computer singularity with one of my sons, and we both find it likely that if the singularity happens, the AI will end humanity; all historical evidence points in that direction, and that is what we have to relate to. Then my son said:
“That is the end of it all, when humanity is annihilated, the AI will shut itself off. Who says the superior AI has any desire to ‘live’ or ‘exist’? We just cannot know.”

The singularity happens, humankind is annihilated, the AI shuts itself off. That is the end: no Armageddon, no terminal nuclear winter. Just a “Thank you for the fish”, a flip of the switch, and nothing more. That would be the ultimate irony of humankind.

I find my son’s remark the most thought-provoking idea about the singularity that I have heard in a very long time.
