Above Photo: Poster for The Tomorrow War (Amazon Prime Video)
Chris Pratt of Parks and Recreation and Guardians of the Galaxy fame has a new film out. In The Tomorrow War, Pratt uses time travel to save Earth from hordes of ravenous aliens. The film is ultimately an allegory for climate change, so kudos to Pratt and Hollywood for a movie demonstrating that climate change will bring surprising, inevitable, and deadly consequences for all of us.
By no means is The Tomorrow War a masterpiece; I would give it 5 stars out of 10. It is what you would expect from a summer action-adventure blockbuster. However, one thing that stuck with me about this film, in which humans fight aliens 30 years in the future, is how little drone warfare it depicts. In only a couple of scenes do we see drones fighting the aliens. Drones are absent because Hollywood makes money off its stars, not robots. The reality, though, is that based upon where we are today with robotic killing machines and the likely course of technological progress and adaptation, 30 years from now humans will not be present on the battlefront. The fictional aliens of The Tomorrow War would likely not stand a chance against the automated warfare of the present, let alone the future. What needs to be asked is: what chance do we, as non-fictional humans, have?
The idea that machines may kill on their own is older than I am. Science fiction writers and futurists crafted laws in their novels and predictions under which humans would program robots with constitutional instructions not to harm humans. When I was a boy in the 1980s, Arnold Schwarzenegger shot to stardom playing the assassin robot in The Terminator. At about the same time, Matthew Broderick starred in WarGames, a movie about the consequences of putting the decision to kill in the hands of computers. Frighteningly, what was once considered grist for science fiction novels and movies now exists.
It has been more than a year since the first known autonomous drone conducted a kill mission on its own. In early 2020, a Turkish-built drone carried out a successful autonomous kill mission in Libya. As reported by the United Nations, this drone conducted its entire mission, takeoff, targeting, attack, and return, without the assistance of a human. The machine found and killed whom it wanted without a human hand or mind involved. Yes, humans instructed the machine whom to look for, but once that information was provided, the machine could operate and kill independently of any additional input. Go out and look for someone who says certain words on a cellphone, wears a particular style of “uniform,” or meets a broad demographic category: that is what the drone is programmed to do, and once it has that input, it can kill on its own.
We now readily know machines can learn, and killer drones can incorporate that learning so that their initial target inputs are updated and adapted to allow the drones to search for expanded targets without human assistance. Drones can also be resupplied and refueled by other drones so that a drone that is hunting people needs never stop its hunt until it is successful. This near-dystopian fear of machines operating on their own, finding and killing humans, is a decades-old worry that is now true.
Drones are operating effectively and efficiently throughout war zones. In last year’s quick but bloody war between Azerbaijan and Armenia, Azerbaijan, again with Turkish drones, impressively defeated Armenia. Armenia lost hundreds of tanks, armored vehicles, and artillery pieces. The success of drones in combat has been noticed and is causing changes. After more than 100 years of tanks on the battlefield, the US Marine Corps recognizes their vulnerability, as tanks are easy targets for cheap, autonomous drones to find and destroy. The US Marine Corps no longer has tanks. The Marines decided to discard their tanks before Azerbaijan’s use of drones against Armenia, but that decisive victory by Azerbaijan’s drones surely cleared any doubts Marine leaders may have held about the decision.
One thing necessary to understand about warfare is that it is the most competitive of all human activities. When I was a Marine officer in Iraq, I was responsible for counter-improvised explosive device (IED) operations for my regiment. My Marines and sailors went out on the roads looking for those roadside bombs. After returning home, I worked for the Joint IED Defeat Organization, trying to get technology to US forces in Iraq and Afghanistan to protect them from IEDs. What I learned, and what we experienced, was that within 30 to 60 days of whatever technology or tactic we put into the field to protect our soldiers from IEDs, the insurgents had come up with a counter to our countermeasure. We would then produce an upgrade, a change in tactics, or another piece of equipment, and the insurgents would find a way to counter that countermeasure as well. On and on it went, and, since these wars continue, on it goes. This degree of extreme competition is why the human race has often seen its swiftest and most extraordinary technological progress during warfare.
Militaries recognize the danger of drones and are trying to find ways to protect their troops from them. One such advancement that may have horrifying effects on civilians, particularly in a country like the US, where mass shootings are a daily occurrence, is a rifle that automatically tracks and fires on the target at which the shooter aims. Ostensibly created to combat drones, this rifle can be used against anything or anyone. The shooter aims at the person they want to shoot, and the computerized rifle sight follows the person selected. The shooter pulls the trigger; however, the rifle does not fire until the computer tells it to, ensuring a hit. You can buy one of these rifles for $6,000. How long until one of these is used at a church, a school, a concert…? If Chris Pratt and his friends in The Tomorrow War had the weapons available today, let alone 30 years from now, that movie would not have lasted fifteen minutes.
Movies, and all art, speak to and reflect our society’s dreams, fears, obsessions, and values. They record our progress and attempt to tell us where we are going. Gerard Butler and Morgan Freeman in 2019’s Angel Has Fallen show quite dramatically how adversaries will use drones to overwhelm defenses and assassinate VIPs. In real life, we have seen Venezuela’s president survive a drone assassination attempt. Houthi insurgents have skillfully utilized drones to punish Saudi Arabia for its war crimes against Yemenis, and militias in Iraq and Syria, both Sunni and Shia, appear to be using more drones to attack occupying US forces. Thus the age-old question: does art imitate life, or does life imitate art?
On my TV, I watched Chris Pratt heroically battle aliens 30 years in the future. However, such a war would be fought almost entirely by robots. The idea of robots fighting aliens is no longer purely speculative, as the robots already exist. Autonomous robots that utilize artificial intelligence, machine learning, computerized fire control systems, and amazingly sensitive sensors are machines that do not seem to miss and never hesitate to pull the trigger. It is clear the aliens Chris Pratt fights in the future would not stand a chance against today’s robots. That is Hollywood, though. The question for us, outside the movie theater and away from our TVs, is: what chance do we as human beings stand?