We’ve seen the sci-fi version of artificial intelligence (AI) in movies like Terminator, where Skynet, an AI-driven computer system, takes over the world and attempts to wipe out the human race. Then there are the movies featuring a cute robot with human-like feelings, like WALL-E. In reality, right now in our homes, we have AI-driven personal assistants like Siri and Alexa that are considered to have a low level of competence for complex tasks, but that will change: they will improve quickly. Imagine a day when a cute robot like WALL-E is not only your personal in-home assistant but is weaponized for security purposes.
Artificial intelligence is progressing rapidly, from Google’s search algorithms to Xiaoyi, an AI-powered robot in China that recently became the first robot to pass the medical licensing examination, all the way to autonomous weapons: military robots that can independently search for and engage targets based on programmed constraints and directions. Data scientists, roboticists, and IT experts are using AI to help intralogistics systems learn and evolve on their own, optimizing the flow of material goods within fulfillment centers, warehouses, and distribution centers. The race for AI dominance is on.
Autonomous weapons programmed to kill
According to Wikipedia, lethal autonomous weapons (LAWs) are a type of autonomous military robot that can independently search for and engage targets based on programmed constraints and descriptions. LAWs are also called lethal autonomous weapon systems (LAWS), lethal autonomous robots (LARs), robotic weapons, or killer robots. LAWs may operate in the air, on land, on water, under water, or in space. As of 2018, the autonomy of current systems is restricted in the sense that a human gives the final command to attack, though there are exceptions with certain “defensive” systems.
Stephen Hawking warned that AI could offer “incalculable benefits and risks,” such as “technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand.”
Would a superintelligent computer system be driven by greed and the desire for power, like humans are? Not likely. AI doesn’t have to think like a human to decide to cull the human population. It just needs a goal, like creating an equilibrium among all species on earth or reducing greenhouse gas emissions, and to decide that human existence at its current population does not help achieve that objective.
Ever thought the human race would be subject to a come-to-Jesus reckoning of its own creation? Sure, Mother Nature has done her duty keeping us in check through the course of history (the Black Death comes to mind, among others), but what about a culling due to our own creations?
Culling is the conscious decision to remove or kill surplus animals from a given population. It may be done because the animal is an invasive species threatening a native habitat, because of uncontrolled population growth, or because the animal poses a threat to our way of life. Some argue that culls are sometimes necessary; others say they are not, and that humans are probably the biggest threat to the earth’s ecosystem.
As it stands now, the world’s population is over 7.3 billion. According to the United Nations’ projections, it could reach 9.7 billion people by 2050 and over 11 billion by 2100. There’s only so much space, and we seem to be doing a great job bulldozing over it for our five-bedroom, two-car-garage, one-acre property needs and our consumption demands.
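For a sense of scale, those UN figures imply fairly modest compound growth rates. A quick sketch (assuming, as my own baseline, that the 7.3 billion figure corresponds to roughly 2015):

```python
# Rough sanity check of the UN projection figures quoted above.
# Assumption (mine, not the UN's): the 7.3 billion baseline is circa 2015.

def annual_growth_rate(pop_start, pop_end, years):
    """Implied constant annual growth rate between two population figures."""
    return (pop_end / pop_start) ** (1 / years) - 1

rate_2015_2050 = annual_growth_rate(7.3e9, 9.7e9, 2050 - 2015)
rate_2050_2100 = annual_growth_rate(9.7e9, 11.0e9, 2100 - 2050)

print(f"2015-2050: {rate_2015_2050:.2%} per year")  # roughly 0.8% per year
print(f"2050-2100: {rate_2050_2100:.2%} per year")  # roughly 0.25% per year
```

In other words, even slowing growth of under one percent per year is enough to add billions of people within a lifetime.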
There is no precedent for how large a human population the earth can sustain. Can the planet hold 11 billion people? We don’t know for certain, but the evidence seems to indicate that it cannot.
Scenarios abound about what may end our domination of the planet and cause a population correction: a major war between nations, a massive natural disaster, an asteroid plowing into a populated area. But a cute little weaponized WALL-E? It wouldn’t be because a switch flipped WALL-E’s good-intentioned nature from helping you choose a pizza joint, or letting you know to expect rain, to suddenly becoming a gun-wielding psycho robot ready to eliminate you from the earth. It would simply be due to an algorithm, a mathematical equation.
It would be nothing personal — it would just be doing the math.
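To make the “just doing the math” point concrete, here is a deliberately crude toy sketch, not a real system; every number and name in it is hypothetical. An optimizer handed only an emissions goal treats population as just another variable to solve for:

```python
# A deliberately simplistic toy, NOT a real system: all inputs are hypothetical.
# An optimizer given only an emissions goal treats population as just another
# variable in the equation -- nothing personal, just arithmetic.

def population_satisfying_target(total_emissions_cap, per_capita_emissions):
    """Largest population whose combined emissions stay under the cap."""
    return total_emissions_cap / per_capita_emissions

# Hypothetical inputs: a 20-gigatonne CO2 cap and 4.8 tonnes per person per year.
cap_tonnes = 20e9
per_capita_tonnes = 4.8

target = population_satisfying_target(cap_tonnes, per_capita_tonnes)
print(f"Population implied by the goal: {target / 1e9:.2f} billion")
```

The unsettling part is what the equation leaves out: nothing in it asks whether shrinking the population is an acceptable way to hit the target.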
Check out my BlockDelta profile for all of my contact details and details of my other postings.