There was a time when Western Europe had guns and everybody else didn’t, and it was not a good time to be the “everybody else”. The conquests that occurred during this time were both complete and catastrophic, and that era in many ways is responsible for the shape of the world’s power structure today.
I think we’re heading into another time where it’s bad to be “everybody else”. Instead of guns, this time it will be Artificial Intelligence enabled weapons.
Take this article from Popular Science: “A pilot A.I. developed by a doctoral graduate from the University of Cincinnati has shown that it can not only beat other A.I.s, but also a professional fighter pilot with decades of experience. In a series of flight combat simulations, the A.I. successfully evaded retired U.S. Air Force Colonel Gene ‘Geno’ Lee, and shot him down every time.” Lee called it “the most aggressive, responsive, dynamic and credible A.I. I’ve seen to date.”
In the research paper the article is based on, the authors also note that “…given an average human visual reaction time of 0.15 to 0.30 seconds, and an even longer time to think of optimal plans and coordinate them with friendly forces, there is a huge window of improvement that an Artificial Intelligence (AI) can capitalize upon.”
Think about that for a second: today, in 2016, an artificial intelligence built by a graduate student can already beat professional fighter pilots, and there is still a huge window of improvement.
The difference between countries that have the technical expertise to create these weaponized A.I.s and everybody else will ultimately result in a one-sided battlefield, with everybody else militarily at the total mercy of the AI. This isn’t “I have tanks and you only have cannons”, where one side has a solid advantage. This is “I have a sword and you have a piece of wet spaghetti”. There is no fair fight against these machines.
The one saving grace here – and it’s a major one – is that the cultural and political norms today point towards nonviolence and anti-imperialism. Obviously, this was not the case when guns first came on the scene. These norms will likely be strong enough to prevent any kind of military doomsday scenario, so I wouldn’t expect to see AI fighter jets strafing the streets of any third-world country anytime soon.
However, norms can, and do, change, so relying only on norms seems a risky strategy.
Instead, we as a global society ought to establish clear rules of warfare for how AI can be used on the battlefield. It will take a near-Herculean effort on the part of the countries not named America, Russia, or China (since for the major powers AI warfare is clearly advantageous), but the next generations of humanity will thank us for our foresight.