British physicist Stephen Hawking, in his article, warned that underestimating the threat posed by artificial intelligence may be the biggest mistake in the history of mankind.

His co-authors on the piece are Stuart Russell, a computer science professor at the University of California, along with Max Tegmark and Frank Wilczek, physics professors at the Massachusetts Institute of Technology.

The article points to several achievements in the field of artificial intelligence, noting self-driving vehicles, the voice assistant Siri, and the supercomputer that defeated a human in the TV quiz show Jeopardy. As Hawking told the newspaper The Independent: "All these achievements pale against the background of what awaits us in the coming decades. The successful creation of artificial intelligence will be the biggest event in the history of mankind. Unfortunately, it may also be the last, unless we learn to avoid the risks."
The scientists say that in the future machines with superhuman intelligence may improve themselves, and nothing will be able to stop this process. This, in turn, would trigger the so-called technological singularity, understood as a period of extremely rapid technological development.

The article notes that such technology would surpass humans and begin to manage financial markets, research, people, and weapons development in ways beyond our comprehension. While the short-term effect of artificial intelligence depends on who controls it, the long-term effect depends on whether it can be controlled at all.

It is difficult to say what consequences the creation of artificial intelligence may have for people. Hawking believes that little serious research has been devoted to these issues outside of non-profit organizations such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute. According to him, each of us should ask what we can do now to avoid the worst-case scenario for the future.