
The Threat of AI Weapons


I’ll explain more at the end, but let me set up this clip in five words: robot killers, Stephen Fry, watch.

Autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear bombs. They could mount rapid, devastating attacks on a huge scale. Thousands of experts, including Elon Musk and Stephen Hawking, have signed an open letter calling for a ban on the weapons. They say drones that can autonomously find and eliminate people meeting certain criteria are feasible within years, not decades. But calls for a ban have been rejected by the world’s ten largest military powers, and Musk has warned that international competition for AI superiority is the most likely cause of World War 3. So, do we have a recipe for disaster?

Let’s start with the current situation. During Obama’s presidency, the number of American troops in war zones dropped by around 90 percent, but there were ten times more drone strikes. Russia has multiple unmanned drone tanks in development and completely autonomous armed vehicles guarding missile bases. America plans to enable its Abrams tanks to control robotic wingman vehicles that attack the enemy while protecting the manned tank, and it is also considering a remote-controlled version of the high-speed Ripsaw vehicle. Air power is already used with relative impunity, and soon fighters will use AI to control drone wingmen able to carry weapons, test air defenses, and keep human pilots even further from danger. Around the world, anti-ballistic missile defenses and drones are diminishing nuclear deterrence. This is the US Navy’s new unmanned ship, designed to hunt submarines and launch surface weapons. It will cost around 20 thousand dollars per day to operate, compared to 700 thousand dollars per day for a manned destroyer. In a simulation test with the Air Force Research Lab, AI drones repeatedly defeated a human pilot. Former Air Force battle manager Gene Lee tried and failed to hit the drones and was destroyed by them every time. He said
they seemed to be aware of his intention and reacted instantly to his movements and attacks. Autonomous weapons would be cheap to mass-produce and may become available to terrorists and dictators on the black market. A recent report by the US Defense Science Board called for immediate action to accelerate military AI development and to understand the capabilities of other powers. It also noted that commercial drones are rapidly improving and that private AI firms are well ahead of the military. Some argue AI will reduce accidental casualties by making fewer mistakes. If a soldier believes someone is a threat, their life depends on making a quick decision; a robot, on the other hand, could react a fraction of a second before it’s fired on, or even wait until it’s attacked. But the weapons could also be extremely effective at assassinations, subduing populations, and selectively killing particular ethnic groups. The second revolution in warfare brought the world close to World War 3 with the Cuban Missile Crisis; the third revolution may be even more volatile. We also need to plan for an even greater threat from AI itself. Many experts believe it will surpass human intelligence in the next few decades. It may then assume full control of the world’s networked weapons. Would it use them for peacekeeping, or for bringing humanity to a quick end? Whether or not humans retain control, an AI arms race may be a race to Armageddon. We believe a ban that’s difficult to enforce is better than a world flooded with cheap, anonymous, autonomous weapons. What do you think?
So, I hope you enjoyed that clip, or were frightened by it. Let me know what you thought in the comments below. That clip was actually made by friends of mine who run a YouTube channel called Pindex. They make incredibly high-quality educational videos; if you want to check them out, I will put a link in the description and one up here. Now, you know, whenever I see those Boston Dynamics videos come out, I’m always simultaneously amazed at the achievements they’ve been able to produce and scared that I’m watching the birth of the ancestors of our future robot overlords. So, yeah, I’ve got mixed feelings about this, but I think it’s a really important topic, and that’s why I wanted to share this clip with you. I also wanted to highlight the fine work of the guys over at Pindex, so definitely check them out. And for those of you who are asking, “Derek, what about your videos? Why haven’t you been posting very much lately?” Well, I have been working on some stuff. In fact, I’ve got a full-length 90-minute documentary coming out in the summer of this year. It’ll be available globally, so watch out for that; I will be giving you some updates closer to the day. I also have a pilot for a brand new series that I worked on with MinutePhysics and MinuteEarth; I will put a link in the description, and if you’re in the U.S., you can go check it out. Other places are geo-blocked; I can’t say anything about that, but it’s our intention to try to get this series out to as many people around the world as we possibly can. So if you can check it out, go and do that now, and I will be posting some more videos soon, I promise, and hopefully that will explain why I’ve been absent for some of this time. So, yeah, I hope you enjoyed that clip.

100 thoughts on “The Threat of AI Weapons”

  1. Hopefully EMPs from nukes detonated at high altitude could take out drones, assuming we would be able to launch them and they get that high… So basically your choices are fallout or AI…

  2. Why can't humans just work together for each other as a planetary species instead of constantly trying to kill and destroy ourselves and our planet?! It's so dumb… Most hot and cold wars feel more like a few children in a sandbox being like "But that's my toy! I want to play with it!" *Throws stuff at the other child… *
    We either gotta work together or we will end up getting erased from Earth…

  3. The war on drugs was literally to empower private agencies and their power in global affairs, and the deal is that the private agencies are enabled to do this for the uber-banking oil elite, who profit the most.

    The scariest part isn't their profit, it's their total control.
    Sidenote: 70–90% of mainstream news outlets are owned by the ultra-billionaire elite. If you control what people think, you control what they will do.

  4. There was no education in this video, literally nothing but fear mongering. Which is fine if you follow it up with understanding, but you just dropped all your viewers off a fear cliff.

  5. That means enemies are just data on a screen; with the press of a button, you kill them. There's no emotion attached to warfare, and no feelings resulting from killing. Commanders simply issue a command; the AI wouldn't hesitate.
    The problem isn't the AI, but the fact that humans get too comfortable with killing people. The number of deaths becomes a statistic.

  6. I think this further shows that people are idiots when the chance to make money is at hand, and in this case, of course, selling AI is the way to make money.

  7. AI won't be able to fix pipes and do indoor plumbing but it will be able to shoot any person who complains to the municipality about the leak

  8. No testing on black people means black people will have the first 'accident'. Today's facial recognition technology has already failed that test.

  9. Rather than seeking to create AI leading to an "us and them" type scenario, we should seek to integrate with this technology and become symbiotic: man and machine. I for one love my AI (Google) but see no point in talking to it, typing to it, or showing it anything if I have already thought it. The tank commander's drones need to be an extension of the tank commander's mind. The order is given, but its execution is down to the AI.

  10. Where is the documentary? :O I really want to watch it and watching this roughly a year later. Thank you so much for everything!!

  11. I am skeptical and disturbed by the move to autonomous weapons.
    "A ban that is difficult to enforce", however, is just an impediment to the good guys. It simply gives the bad guys an advantage.
    That's a lesson we should have learned during the cold war.

  12. If the AI remains regulated, it would do exactly what it was designed to do, which is keep humans away from harm by fighting wars for them. If the AI is implemented poorly or not kept in check, these systems could indeed have unintended consequences. Perhaps an alternative might be having robots remotely controlled by human operators; however, the prospect of the superhuman capabilities that AI promises would likely be very attractive to governments wishing to gain the upper hand in warfare, and thus the "RC soldiers" would only serve as a temporary solution to the issue due to their inferior skill.

    Case in point: we're screwed.

  13. Such weapons should be banned, and care should be taken that they aren't sold on black markets, because they are capable of wiping out the very existence of a country within a span of a few days.

  14. I think the premise of the video is flawed. A surprising amount of the industrial machinery that keeps our world running would be quite efficient at killing humans. Robert Miles has an excellent video on convergent instrumental goals. The short version is that AI will likely conflict with humans because AI will want to use resources that we also want to use. Banning weaponised AI won't stop that from happening. It will keep us from getting ready for it. We should not ban AI weapons, we should learn how to manage them safely.

  15. While it's certainly good to be comprehensive and maybe even apprehensive, it's sad to think that things like this and fear of automation may spread to a point where it stunts our technological growth. And while they may not sound like a bad thing after this, just realize that all that's going to do is give China another edge over us, because it sure as hell isn't going to stop them.

  16. Experts!? One was a famous (egocentric) astrophysicist dumb enough to actually believe a time traveler would attend his secret "party" (and he was outsmarted by a former plumber), and the other one made an electric car company, which doesn't necessarily mean he knows about either electronics or software engineering.

  17. You could make a video about ANYTHING POPULAR and make it look frightening:

    – Video games and school shootings.
    – Collectible Cards and low grades in school.
    – Genetic modification and nazism.
    – Fast food and overweight.
    – Dungeons and Dragons… and "oUr yOuTh iS bEing coRruptEd".

    First biased video from you, first thumb down for you!

  18. If I can upload my mind into a computer and then join the AI, then I'm okay with mass AI in warfare. It would be pretty awesome, so long as you are on the winning side.

  19. Just look at the history. Following through was the best option.
    Should we have stopped developing guns? Then bad guys would have kept building them.
    Should we have not developed nukes? Then Russia would be in control
    Should we not develop AI? China, Russia, Terrorists will have more power

    Just like Nukes, we need to be at the front.
    Believe it or not, but the world has evil people in it, regardless of what technology we have

  20. I'm not scared of AI; it is already used today in the form of deep learning and neural networks. The smarter ones, such as self-driving cars, use a lot of processing power, requiring a very high-end CPU/GPU far better than most consumer-grade PCs have. I don't think it's impossible for an AI to decide to take control and not listen to instructions, but there would be a failsafe, and aspects of the system that aren't directly controllable by the AI; for example, motor control could go through extra programmed code before it reaches the physical motor. AI robots would be expensive because of their high-end components, and all the other required hardware wouldn't be cheap either. For example, for a drone with a gun: the high-end motors to hold the gun, the extra motors to fire it, the cameras, all the sensors, the frame, and the processor would be expensive for a terrorist to get their hands on, and while they could afford a few, they couldn't spend thousands of dollars on a swarm of drones. A few drones with guns would be very easy to shoot down and destroy.
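    [Editor's note] The "extra programmed code" failsafe this comment describes can be sketched in a few lines. This is a purely hypothetical illustration, not any real weapons-control API; all names here (FailsafeGate, AUTH_WINDOW_SECONDS) are invented for the example. The idea: every actuator command passes through an independent gate that the AI process cannot modify, and is dropped unless a recent human authorization exists.

```python
import time

AUTH_WINDOW_SECONDS = 5.0  # how long a human authorization stays valid (assumed value)

class FailsafeGate:
    """Independent layer between the AI's decisions and the physical actuator."""

    def __init__(self):
        self._authorized_at = None  # timestamp of the last human authorization

    def human_authorize(self):
        """Called only via the human operator's channel, never by the AI."""
        self._authorized_at = time.monotonic()

    def send_motor_command(self, command):
        """AI-side entry point: commands are dropped without fresh authorization."""
        if self._authorized_at is None:
            return "blocked"
        if time.monotonic() - self._authorized_at > AUTH_WINDOW_SECONDS:
            return "blocked"  # authorization has gone stale
        return f"executed:{command}"

gate = FailsafeGate()
print(gate.send_motor_command("fire"))  # blocked: no authorization yet
gate.human_authorize()
print(gate.send_motor_command("fire"))  # executed: within the authorization window
```

    The design choice mirrors the comment's point: the gate's logic sits outside the AI's control surface, so even a misbehaving policy cannot actuate hardware on its own.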

  21. Hey, I was just wondering: do we really need "development"?
    The more developed we are, the more tense we get, and we are no better than tribal peoples who, according to our standards, are not developed. So we exploited the planet and billions of its organisms in exchange for poisonous air, more tension, and dangerous diseases. If we did a cost-benefit analysis of this, we'd realise that we have paid a huge price just to live a little bit longer, nothing else.

  22. Why is the military budget rising instead of being reduced?! Is having a more advanced military than other countries more worth it than not ruining millions of lives, politics suck and clowns like Donald Trump should not have such enormous power over billions of humans! Where the hell is humanity going? Maybe to hell… I’m losing more and more hope on humanity as days pass… What a shameless world!

  23. Soon aircraft will be able to control drones and weapons? The F-35 already can. That's its advantage.

  24. What worries me the most in the current trajectory of AI and robots is the capitalist question.
    Right now, really crazy exploitation and rampant corporatism are held in check by countries' laws and the elected governments that make those laws.
    Corporations are already heavily influencing these elections and, through lobbying, corrupting that system of representation we have. My biggest worry is that once they have the means, through AI and robot security forces of their own, they will engineer an excuse to enforce martial law and take over control of the government.
    The only thing that provides real power is military might. The French Revolution proved that the people will only be pushed so far down before they rebel, and at that time the people had power.
    Once there are robot armies available to the elites, it's game over for us plebs.
    All of us could unite and still be beaten down.

    In fact once production is automated we are surplus to requirements.
    Capitalism can die too.
    They don't need our labour anymore.
    They can have anything they want. It's a post scarcity society.
    Money is redundant as a rationing tool.
    We just become a massive drain on their resources.
    Best get rid of us.

    I'm trying to talk myself out of this vision of the future but my memories of the past see this current trajectory coming to pass.
    Talk me out of this.
    Luv and Peace?

  25. It's a complicated situation. I know it's a weak philosophical/ethical argument, but from a practical standpoint it kind of is a race. Yeah, we can say we ban it all we want, but other countries WILL exploit it, to our disadvantage. We are basically forced to keep up or win this arms race. Unfortunately.

  26. World War 3 is caused by the world powers fighting over time-travel technology, developed by a girl whose consciousness is preserved in an AI named Amadeus

  27. Don't worry, if the AI became super intelligent it would want no wars and eradicate the warmongering human speci-… wait…

  28. Once AI development is done, they would no longer need us. At that point, the hopes of revolting against or changing the world would be nonexistent.

  29. Sooo, you're saying I should stop playing around with my Arduino-controlled drone? And definitely I should've never mounted airsoft cannons to it, or IR head-to-head FPV? You've got to be kidding me. People actually think there is a way to prevent this from evolving, or a way to stop it, or control it. LOL. I see it like this: the first people who do this will have the superior advantage. So I guess everyone had better stop the philosophic nonsense and embrace it. It's kind of like: stop worrying and learn to love the AI.

  30. Scientists are very irresponsible as to what they research and develop. They pull the, "there's nothing I can do about that", card far too often.

  31. Remote-piloted vehicles, sure. AI? Nope. Sorry, AI will never happen, despite what the charlatan Musk seems to think. AI is just a current buzzword, repackaged from the 50s, so computer programmers can stroke their own egos and make more money.

    Computers don't "think"… they never have, and never will. They simply execute a program, even extremely complicated ones, that can appear as ordered independent thought but is nonetheless just a program. Computers don't in ANY WAY process information like a living being does, and are completely incapable of cognitive reasoning and self-awareness.

    If we ever find a way of making true AI, it will be done by biologists, not computer coders. And even then, why does everyone assume that real AI would be any more intelligent than an actual human brain, or for that matter a faster thinker? Just because a computer processes information fast now doesn't mean it could do so at the same rate if it had the abilities of a real brain.

    AI has always been a joke to me, and I laugh when people claim there have been all these "miraculous" advances on the subject. AI has been touted as achievable since the invention of the transistor… and it still hasn't happened yet.

  32. AI is here to stay. The threat of AI is not violence, but loss of jobs for the masses. The video hinted at this by stating that the unmanned drone ship costs 20k per day to operate while the manned one costs around ~700k a day. We won't need humans to be employed for their intelligence if a machine can be the intellectual: faster, smarter, no breaks, no complaining. I hold a comp sci degree with 20 yrs of experience. I picked up AI courses in 2017 so that I can bring oil to the machines. The rise of the machines 💯✔☕

  33. war never changes, until it gets to the point where war stops all things from changing forever, or is inevitably eliminated.

  34. The F-35 jet literally just blew my mind just now, and all this time I thought it was a complete waste of time.

  35. My thoughts?
    1) Face recognition breaking: face fragment clothing or IR emitter hat
    2) "aim for the torso". On-board processor will be large and housed by the largest component. If it has a "head" it's for show.
    3) Radiation. Machines fry in it far faster than organics.
    4) Signal jamming. Complex tasks require connection to remote supercomputer. Cutting that connection decreases efficiency by 95%
    5) Exploit IFF. IFF codes are not, and likely never will be, perfect. By determining shortcuts used by machine for IFF, one can cause machine to shoot at own allies, while ignoring enemies.
    6) Exploit learning algorithm. Attack machine with identical machine. Each time, central computer "learns" that own machine is enemy. Eventually, starts shooting itself.

  36. Sure, AI will be used for military purposes. But the true value in AI (and the real threat, I believe) will be for civilian day-to-day purposes. You see, the real war isn't what you think; it's not against terrorism, between countries, or about religion. It's against capitalism. It's always been about money, and always will be. Intelligent robots will take over our lives: they will wash our clothes, cook our meals, drive us to work, probably even redecorate our houses. AI already knows what you want to buy next; give it a few years and it will predict what you think and what you will become. This is our real war. Will we let ourselves become totally dependent on technology, give up our privacy entirely to corporations, and admit defeat? We already know the answer…

    WW III is a silent war.

  37. Autonomous killer robots have been around since the 80s. The Goalkeeper CIWS is capable of automatic detection, tracking and killing threats, all without human intervention. It is just that these machines are programmed to destroy a specific class of threats, being radar detected incoming missiles.

  38. "There are wise and foolish ways of dealing with the threats to our existence," he continued. "Some threats turn out to be figments of cultural and historical pessimism. Others are genuine, but we must treat them not as apocalypses-in-waiting but as problems to be solved."

    https://www.beckershospitalreview.com/artificial-intelligence/harvard-psychologist-ai-fear-mongering-is-the-y2k-of-the-21st-century.html

  39. Autonomous AI is frightening. Even if it never leads to a large-scale war, I find it utterly disgusting and insulting that somebody could get killed just because a machine crunched some numbers and the result is to pull the trigger.
    People working on that kind of stuff are useful idiots and moral degenerates.
    People need to defend themselves, and that makes a legitimate argument for weapons. Machines don't need to defend themselves. Machines are THINGS, not men and women.

  40. If they make self-replicating ai that can upgrade itself without human intervention we are totally fucked

  41. I think weaponized AI is inevitable. Instead of human life, we would sacrifice autonomous drones, and casualties in wars would drop massively. However, it can be dangerous if it falls into the wrong hands. Solution? In-built systems that are made to save the victim's life. Companies that produce the automated weapons legally cannot create them without certain requirements.
    Examples for requirements:
    1. Disarming instead of killing if possible
    2. Do not kill a person that is not fighting back or attempting to escape
    3. Preferably aim for the legs
    4. Report locations of injured victims
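    [Editor's note] The requirement list above is essentially an ordering of rules, so it lends itself to a short sketch. This is a hypothetical illustration only; the function and field names (choose_action, fighting_back, etc.) are invented here, and a real system would need far richer perception than boolean flags.

```python
def can_disarm(target):
    # Placeholder: a real system would assess range, posture, weapon type, etc.
    return target.get("disarmable", False)

def choose_action(target):
    """Pre-engagement check. target: dict of booleans describing the situation."""
    if target.get("injured"):
        return "report_location"  # rule 4: report injured victims
    if not (target.get("fighting_back") or target.get("escaping")):
        return "hold_fire"        # rule 2: never engage a passive person
    if can_disarm(target):
        return "disarm"           # rule 1: prefer disarming over killing
    return "aim_legs"             # rule 3: least-lethal engagement as a last resort

print(choose_action({"fighting_back": True, "disarmable": True}))  # disarm
print(choose_action({"fighting_back": False, "escaping": False}))  # hold_fire
```

    The key design point is the rule ordering: the restrictive rules are checked first, so the lethal branch is only ever reached after every mandated alternative has been ruled out.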

  42. "Listen, and understand. That terminator is out there. It can't be bargained with. It can't be reasoned with. It doesn't feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are dead."

    Kyle Reese in The Terminator

  43. This should be supported by everyone! Unless of course, you disagree w/the statement: "the fewer deaths on the battlefield, the better"! Also, if you go up against an autonomous adversary, w/o such tools, perhaps you do deserve a lesson on the battlefield that all the others can learn from, so as to reconsider such a stupid idea. Saving lives is good… you'd think. :/

  44. Simple, create a super powerful emp device. We may go back to the 1700s but hey, it's better than being obliterated.

  45. Stephen Hawking is a physicist. Elon Musk is a business man. Neither are computer scientists, let alone computer scientists that work on AI. Why do their opinions matter in my field?
    I love most of your videos but this is some Elon Musk type BS. Look at SmarterEveryDay's video interview with a 4 star general about the future of war. That is a much better assessment of the future of war than the musings of a man with no experience in AI or the military.
