Artificial intelligence could be humanity's last invention: according to a study by the RAND Corporation, it could destroy humankind by sparking a nuclear war, and researchers believe this could happen as early as 2040 as technology advances.
The RAND Corporation, a not-for-profit security think tank in the United States, believes significant advances in machine intelligence could lead to AI systems manipulating world leaders into launching their nuclear arsenals by mistake. All-out war was prevented during the Cold War by "mutually assured destruction": if one side launched a nuclear strike, the other side would retaliate, meaning both would lose.
But RAND argues that more data and more capable AI could convince world leaders that an opponent's nuclear arsenal is weaker than their own, leading them to take drastic action. The findings appear in a report titled Project 2040, for which the company interviewed unnamed experts in security, nuclear weapons and artificial intelligence.
Andrew Lohn, an engineer at RAND and co-author of the paper, said: “This isn’t just a movie scenario. Things that are relatively simple can raise tensions and lead us to some dangerous places if we are not careful. Some experts fear that an increased reliance on artificial intelligence can lead to new types of catastrophic mistakes. There may be pressure to use AI before it is technologically mature, or it may be susceptible to adversarial subversion.
“Therefore, maintaining strategic stability in coming decades may prove extremely difficult and all nuclear powers must participate in the cultivation of institutions to help limit nuclear risk.”
Edward Geist, co-author of the paper and associate policy researcher at RAND, noted that the links between war and AI are nothing new. He said: “The connection between nuclear war and artificial intelligence is not new; in fact, the two have an intertwined history. Much of the early development of AI was done in support of military efforts or with military objectives in mind.”