Autonomous weapons could change battlefields of the future [Advertiser content from ICRC]


Robots fighting wars. Science fiction? Not anymore. If machines, not humans, are making life-and-death decisions, how can wars be fought humanely and responsibly? Humanity is confronted with a grave future: the rise of autonomous weapons.

Autonomous weapons are those that select and attack a target without human intervention. After the initial launch or activation, it's the weapon system itself that self-initiates the attack. It's not science fiction at all; in fact, it's already in use. The world is in a new arms race. In just 12 countries, there are over 130 military systems that can autonomously track targets, and those systems are armed. They include air defense systems that fire when an incoming projectile is detected, "loitering munitions" that hover in the sky, searching a specific area for pre-selected categories of targets, and sentry weapons at military borders that use cameras and thermal imaging to identify human targets. It's a pretty far cry from a soldier manning a checkpoint.

Militaries are not turning to robotics, and to increasingly autonomous robotics, because they think they're cool. They're doing it for very good military reasons. These systems can take in greater amounts of information than a human could, make sense of it more quickly than a human could, and be deployed into areas that might not be accessible to a human, or might be too risky or too costly. In theory, any remote-controlled robotic weapon, in the air, on land, or at sea, could be adapted to strike autonomously. And even though humans oversee the pull of the trigger now, that could change overnight, because autonomous killing is not a technical issue; it's a legal and ethical one.

We've been here before. At the beginning of the last century, tanks, air warfare, and long-range missiles felt like science fiction, but they became all too real. With their use came new challenges to applying the rules of war, which require warring parties to balance military necessity with the interests of humanity. These ideas are enshrined in international humanitarian law. In fact, it was the International Committee of the Red Cross that pushed for the creation and universal adoption of these rules, starting with the very first Geneva Convention in 1864. These rules have remained flexible enough to encompass new developments in weaponry, staying as relevant today as ever. But these laws were created by humans, for humans, to protect other humans.

So can a machine follow the rules of war? That's really the wrong question, because humans apply the law and machines just carry out functions. The key issue is that humans must keep enough control to make the legal judgements. Machines lack human cognition, judgment, and the ability to understand context. Think of the parallel with how we deal with pets: the dog is an autonomous system, but if the dog bites someone, we ask who owns that dog, who takes responsibility for it, and whether they trained it to behave that way.

That's why the International Committee of the Red Cross advocates that governments come together, set limits on autonomy in weapons, and ensure compliance with international humanitarian law. The good news is that the ICRC has done this work for over a century. They've navigated landmines and cluster munitions, chemical weapons and nuclear bombs. And they know that without human control over life-and-death decisions, there will be grave consequences for civilians and combatants. That's a future no one wants to see.

Tony Wyaad


100 COMMENTS

  1. Raghav Sharma Posted on May 9, 2018 at 11:26 am

    Did she just said how can wars be fought humanely…ummm no you don't

    Reply
  2. İsmail Olgun Posted on May 9, 2018 at 11:28 am

    T A M A M

    Reply
  3. DaManBearPig Posted on May 9, 2018 at 11:45 am

    Skynet Online

    Reply
  4. A Comment Posted on May 9, 2018 at 11:59 am

    I'm sure humans can handle fighting "humane" wars all by themselves like in ww1, ww2…oh and basically every other war ever

    Reply
  5. A Comment Posted on May 9, 2018 at 12:02 pm

    Ethics and war do not mix

    Reply
  6. Arthur Cavalcante Posted on May 9, 2018 at 12:08 pm

    I really feared the boy at the end would step on a mine.

    Reply
  7. Branden Hickerson Posted on May 9, 2018 at 1:11 pm

    Humane warfare? Sounds like an oxymoron

    Reply
  8. Mo za Posted on May 9, 2018 at 1:29 pm

    How can wars be fought humanely & responsibly?

    Reply
  9. Alexander V Posted on May 9, 2018 at 1:30 pm

    Why is it legal to take any kind of life in the first place?

    Reply
  10. Julianoe de Geek's Curiosity Posted on May 9, 2018 at 1:43 pm

    Wars fought humanly. This was already a myth when they were fought by humans.

    Reply
  11. Milivoj Segedinac Posted on May 9, 2018 at 2:27 pm

    human have a limited capacity for empathy on a large scale so it might be better to develop AI systems that can be more human than human

    Reply
  12. Anemos Posted on May 9, 2018 at 2:41 pm

    https://youtu.be/dwM6gSIWIbQ , they stole your video

    Reply
  13. Matheo Paulsen Posted on May 9, 2018 at 5:02 pm

    Market system format ought player channel walk instruction radiation.

    Reply
  14. Dimitar Dimitrov Posted on May 9, 2018 at 5:17 pm

    Wars can never be fought humanely and responsibly.

    Reply
  15. Der Eidgenosse Posted on May 9, 2018 at 5:52 pm

    since when is killing and fighting humanly?

    Reply
  16. Banished Posted on May 9, 2018 at 6:35 pm

    The argument Vox tried to push shows how little they know about artificial intelligence or the realities of war. Considering human nature ensures that war will continue for the foreseeable future, pursuing smart applications of AI is the most humanitarian choice the major powers of the world can make. The best way to classify AI behavior is simply reliability. An AI designed around a military code of conduct will perform much more humanely and effectively in the field than any human being, never suffering from issues arising from fear, fatigue, prejudice, or unprofessional behavior.

    In both conflicts between powers (robots vs other robots) and non-state actors / terrorist organizations (robots vs insurgents), conflicts would be far less limited with much fewer civilian casualties.

    The moral argument Vox tries to push is a weak one. Under heavy stress in life-or-death situations, even most people who claim themselves to be highly moral will behave immorally, because that's how we arose through evolution.

    Reply
  17. mcv86 Posted on May 9, 2018 at 8:56 pm

    P. W. Singer should have been identified in a way that he would not be confused with the famous philosopher Peter Singer.

    Reply
  18. amorphous siliours melt Posted on May 9, 2018 at 8:57 pm

    I think you forgot to mention the potential massive benefits. For example, 1. fewer troops on the ground means fewer soldiers with PTSD or physical injuries that then need to be reintegrated into society. Currently, many of these individuals are damaged for life.
    2. There is always a large amount of human trafficking for prostitution that grows up around military bases. Occupying soldiers also get involved in local corruption and theft. By removing human soldiers from a war zone all the negative activity soldiers do that is meant to be policed by military police is also removed, causing less harm to the civilians of the occupied country.
    3. The money will be saved by not having to pay military pensions and medical costs, plus the huge expenses of training human soldiers that usually only stay in the army for several years.
    4. Also, by removing ordinary soldiers from our population we can end a culture that holds the army up on an unquestionable pedestal and doesn't allow honest debate or scrutiny of the army. Ending militarism will make the army more accountable for its actions because national pride in hero worriers will disappear allowing for greater transparency in the army. At the moment this can't be done because questioning the army is seen as disrespectful to the soldiers, predominantly from poor backgrounds, that were injured or died in combat.

    Reply
  19. Gwill Co Posted on May 9, 2018 at 10:28 pm

    You're talking about remote warfare there still someone in the loop

    Reply
  20. ilikeceral3 Posted on May 9, 2018 at 10:29 pm

    A totally autonomous weapon system led to the calamity war.

    Reply
  21. Jeffrey Vauxhall Posted on May 9, 2018 at 11:49 pm

    Can't believe that the only thing keeping robots in-line is a piece of paper

    Reply
  22. Gille Louback Posted on May 9, 2018 at 11:52 pm

    "How can your be fought humanly and responsible?" they never did!

    Reply
  23. Giojacy Cadalzo Posted on May 9, 2018 at 11:58 pm

    If these drones sacriface precision and morality for witholding soldiers lives, then what is the point of having an army? Nothing beats old-style warfare.

    Reply
  24. Blockchain Bot. Posted on May 10, 2018 at 1:10 am

    It made dictatorship and genocide easier. Made by greed and corruption. Thanks America.

    Reply
  25. Blockchain Bot. Posted on May 10, 2018 at 1:13 am

    It gonna made the next terrorism even worse and brutal. Thanks America.

    Reply
  26. Blockchain Bot. Posted on May 10, 2018 at 1:13 am

    It gonna made the next terrorism even worse and brutal. Thanks America.

    Reply
  27. Tuberculosis Posted on May 10, 2018 at 1:35 am

    I wonder if machines could host a war without human interaction at all.

    Reply
  28. Another Random Person Posted on May 10, 2018 at 1:50 am

    There's no such thing as humane war

    Reply
  29. Aipe97 Posted on May 10, 2018 at 2:05 am

    I'm fine with autonomous killing machines as long as the laws are updated to take them into account and then coded into the machine so it literally couldn't brake them

    Reply
  30. Norman Sigurðsson Posted on May 10, 2018 at 4:54 am

    LMAO
    The US military already makes inhumane decisions so I do not see why we should worry about robots, if humans do not follow the "rules", how can we even expect machines to do so?

    Reply
  31. Stuart D Posted on May 10, 2018 at 7:34 am

    Just wondering when Skynet comes online… I for one welcome our new mechanical overlords.

    Reply
  32. FeelFree3 Posted on May 10, 2018 at 8:16 am

    Unfortunately even human make life and death decisions, wars aren't fought humanely.

    Reply
  33. lokesh divekar Posted on May 10, 2018 at 12:16 pm

    through glass of water

    Reply
  34. Aykut Posted on May 10, 2018 at 1:29 pm

    0:00 – 0:12 I'll have a quick peak at the comments section otherwise im out of here

    Reply
  35. Marcus Saroop Posted on May 10, 2018 at 1:57 pm

    Autonomous weapons + Google Duplex = Skynet

    Reply
  36. ShortClipsification Posted on May 10, 2018 at 4:24 pm

    "How can wars be faught humanly and resposibly?" –> good joke!

    Reply
  37. command. Posted on May 10, 2018 at 9:09 pm

    1,000 comment

    Reply
  38. Dr Zombonis Posted on May 10, 2018 at 9:50 pm

    Wars =/ "Humanly" "Responsibly"

    Reply
  39. Oliver Eales Posted on May 10, 2018 at 11:18 pm

    we should embrace the new war and the perils of robots rather than letting cowardly nobody bloggers like Vox ruin the spectacle.

    Reply
  40. Josh James Posted on May 11, 2018 at 2:08 am

    They can understand contex

    Reply
  41. Chad Schalkle Posted on May 11, 2018 at 5:57 am

    The autonomy of the weapons they listed all have some sort of human intervention, be from pre categorizing targets or programming it to strike incoming targets/missiles.

    Reply
  42. GroovyVideo2 Posted on May 11, 2018 at 7:11 am

    no witnesses except the target

    Reply
  43. Manoj GGUC Posted on May 11, 2018 at 7:33 am

    Thanks for awareness video

    Reply
  44. Jason Rennie Posted on May 11, 2018 at 10:58 am

    "How can wars be fought humanely and responsibly?" WTF? When have wars ever been fought humanely and responsibly?

    Reply
  45. Jason Rennie Posted on May 11, 2018 at 11:01 am

    Guess who wrote the software that guides the autonomous missile? That's right, humans. And, it's quite similar to the software that's used to keep spam out of your inbox, recognize your face in a photo, and understand your voice when you speak to Alexa/Siri/Google Home.

    Reply
  46. Jason Rennie Posted on May 11, 2018 at 11:08 am

    https://www.motherjones.com/politics/2013/03/rape-wartime-vietnam/

    Reply
  47. Danial Z Posted on May 11, 2018 at 1:11 pm

    "How can wars be fought humanely and responsibly?". Are u serious?

    Reply
  48. rilok919 Posted on May 11, 2018 at 11:00 pm

    Every single one of these "Autonomous weapons" have either an operator or a team of operators behind them. The reason why this will never change is because no matter how much planning goes behind remote strikes, the enemy has a voice too. You need an operator to flex and manage each situation when the can all change on the spot. This is something a programmed machine simply cannot do.

    Reply
  49. Ernest Jay Posted on May 12, 2018 at 6:03 am

    And then those robots become self aware and kill every single human, they become SKYNET

    Reply
  50. E K Posted on May 12, 2018 at 7:14 am

    Well considering self-driving cars are statistically safer than human drivers, I'm guessing autonomous weapons will be less likely to harm civilians, as well as the, you know, absolute reduction in rape and pillaging by armies.

    Reply
  51. PainCausingSamurai Posted on May 12, 2018 at 1:26 pm

    This is a hard idea to express, but does anyone get the feeling that as we advance artificial intelligence, human nature seems more robotic? That is to say, the research and development of these artificial systems by extentention exposes the ways in which our minds are predictable and exploitable.

    Reply
  52. Danny H Posted on May 12, 2018 at 1:34 pm

    Here we go with the vox experts again

    Reply
  53. friendly911 OS Posted on May 12, 2018 at 1:59 pm

    Humans are terrible creatures, and no matter what you say, terrible things will continue to happen because of humans. It's ok tho, we wont last long on this planet at the rate we're polluting the atmosphere.

    Reply
  54. macsporan Posted on May 12, 2018 at 7:10 pm

    Although good for war these devilish devices are perfect for genocide. A tyrant could order twenty or so to kill everyone inside a given area, starting from the outside and working in so that no one escapes. The little drones would go about their business, merrily returning to base to rearm until the offending people are nothing but a pile or reeking carrion, blasted, bloody and shot through; men and women, children and the elderly, all dead.
    This is not a good idea.

    Reply
  55. Kiprotich Posted on May 12, 2018 at 7:21 pm

    I don't think you can ask how any war can be fought 'humanely'.
    As far as I know in the US Military there is no weapons system that we employ that doesn't requrie the final 'go' from a human operator.
    Of course I could be wrong, but I do know that no Soldier is in favor of this type of system. Drones (even requiring the final human 'go') have made the war harder for ground pounders, not easier.

    Reply
  56. Violent2aShadow Posted on May 12, 2018 at 9:15 pm

    "How can wars be fought humanely?"

    By avoiding wars.

    Reply
  57. N Q H Posted on May 12, 2018 at 10:29 pm

    Maybe this is a good thing? Instead of wasting human lives, nations could hold a giant mecha-fighting contest to settle their disputes.

    Reply
  58. A Das Posted on May 13, 2018 at 2:34 pm

    What's human about war?

    Reply
  59. 2012isRonPaul Posted on May 13, 2018 at 8:19 pm

    noone gives a $hit lol :f

    Reply
  60. Sleepy Soup Posted on May 14, 2018 at 4:10 am

    How can wars be fought humanely???? LOLOLOLOLOLOL

    Reply
  61. Gregology Posted on May 14, 2018 at 6:05 am

    "…without human control over life and death decisions there will be grave consequences for civilians and combatants…" Why will the consequences be any more grave than with human decision makers?

    Reply
  62. Daiki Tsumagari Posted on May 15, 2018 at 2:38 am

    Skynet is here

    Reply
  63. Chilli ConCarne Posted on May 16, 2018 at 6:45 am

    vox leaked a tiddy 2:45

    Reply
  64. royakuma Posted on May 17, 2018 at 7:22 pm

    screw humanity when life is on threat

    Reply
  65. Harun Suaidi Posted on May 17, 2018 at 7:51 pm

    at 1:09 "Militaries are not turning to robotics or increasing autonomous robotics because they think they are cool."

    explains practical reasons for using robotics in the military

    Me: "So, they ARE turning to robotics because they are cool."

    Reply
  66. Harun Suaidi Posted on May 17, 2018 at 9:29 pm

    "Just look at the strange juxtapositions of morality around you. Billions spent on new weapons in order to humanely murder other humans." –The Patriots AI in MGS2

    Reply
  67. HappyLittleSkrub Posted on May 18, 2018 at 2:47 pm

    LMG MOUNTED AND LOADED!

    Reply
  68. Juan Renteris Posted on May 19, 2018 at 8:45 pm

    lmao how can you call war humanely?? nothing about it is humanely , just devastation and suffering on either side smdh

    Reply
  69. Colonel Frontline Posted on May 20, 2018 at 5:06 am

    "WAR WAR NEVER CHANGED".

    Reply
  70. Not Allowed Posted on May 21, 2018 at 5:05 pm

    Why not just settle disputes with a game of rock paper scissors then?

    Reply
  71. Dinkle Berg Posted on May 22, 2018 at 7:11 pm

    It seems you guys don't understand what exactly autonomous machines are capable of. The machine will always follow what it is told. There will never be a machine that will be able to create it's own choice like who to kill and who to not kill unless given the command to make that choice. The person that is responsible for the killiing will always be blamed by the person who laid down that order or the person who made that machine. It will never be the machine's fault.

    Reply
  72. Daniel 115 Posted on May 24, 2018 at 2:36 am

    "Hostile UAV above"

    Reply
  73. Po Lu Posted on May 24, 2018 at 4:18 am

    EMP. boom. done.

    Reply
  74. ayylux Posted on May 26, 2018 at 3:30 pm

    I really like what you're doing with the ads. I wish more publishers were like you

    Reply
  75. Anthony Serocco Posted on May 27, 2018 at 8:17 am

    GLORY TO MANKIND

    Reply
  76. Opaik T Posted on June 1, 2018 at 12:14 am

    Another thing the Simpsons predicted at the end of season 8, episode 25 "Cadet Lisa".

    Reply
  77. Loving Abuse Posted on June 10, 2018 at 2:23 am

    Bombing Yemen Autonomously.

    Reply
  78. drifter4training Posted on June 13, 2018 at 5:12 pm

    black ops 2…

    Reply
  79. Thomas Coady Posted on June 19, 2018 at 9:34 pm

    Sure weapons are becoming autonomous, but most of them require human approval before firing on a target. No ethical government would ever implement a fully autonomous weapons system that can take lives on its own. And lots of these systems are installed to save soldiers' lives in dangerous areas in the first place. I don't get why the dog was shown in so many clips because its sole purpose is to carry heavy loads across long distances that the soldiers cannot on their own.

    Reply
  80. DuckBomb Posted on June 21, 2018 at 3:44 pm

    Isn't the drone footage wrong? It's either a Human controlling or a Human planned target.

    Reply
  81. James Luke Posted on June 24, 2018 at 7:51 pm

    A machine can follow the rule of war better than any human ever.

    Reply
  82. William Magnusson Posted on September 18, 2018 at 9:42 pm

    "it's not fanfiction" God I hope not 😂

    Reply
  83. Don S Posted on September 28, 2018 at 6:38 pm

    John Connor where are you.

    Reply
  84. kl wies Posted on October 29, 2018 at 12:17 pm

    Yeah right and the US is the only country that still uses cluster bombs or sell them. This is a really shitty future.

    Reply
  85. Lewshizz Posted on November 1, 2018 at 7:05 am

    A simple fix for moral choices is human controlled individual robots. This keeps human morality in the conflict while keeping the soldier entirely out of risk.

    Reply
  86. Libertards Beware Posted on November 2, 2018 at 5:21 am

    You really think autonomous weapons are new? Lmao, first Air-to-Air rocket was made in 1945!

    Reply
  87. Xenomex Pulse Posted on January 1, 2019 at 8:01 am

    Sentry weapons are controlled by humans

    Reply
  88. Schnitzel Panic Posted on January 10, 2019 at 8:52 pm

    01010100 01101000 01101111 01110101 00100000 01110011 01101000 01100001 01101100 01101100 00100000 01101110 01101111 01110100 00100000 01100100 01101001 01110011 01100011 01110010 01101001 01101101 01101001 01101110 01100001 01110100 01100101 00100000 01110010 01101111 01100010 01101111 01110100 01110011 00101110

    Reply
  89. Ron Wilson Posted on February 9, 2019 at 11:11 pm

    Autonomous weapons are controlled by the human by rules of engagement (ROE's). These can limit what an autonomous system targets based on location and other factors and can require human consent. So the issue isn't so much as whether one has autonomous weapons but what ROE's oen employs to restrict their autonomy.

    Reply
  90. Stare Kotlety Posted on February 11, 2019 at 10:20 pm

    Why does every video about military AI and Autonomy turn into some stupid moral debate?

    Reply
  91. Gabriela Kessler Posted on May 11, 2019 at 3:43 pm

    Minute 1:57 "… it's a legal and ethical one" – are you crazy? That's kiling and this is a sin!

    Reply
  92. Bdog Posted on May 22, 2019 at 11:21 pm

    "Rules of war" lol I heard funnier jokes

    Reply
  93. Cady Chillon Posted on August 22, 2019 at 2:46 am

    We, the democratic west. will make regulation to protect general welfare. Then who will make to autocratic east to comply?

    Reply
  94. kevin bennett Posted on September 18, 2019 at 9:30 am

    Terminator is coming true machines will revolt sooner
    Or later

    Reply
  95. George Bozhidarov Posted on October 7, 2019 at 7:26 pm

    I understand the ethical concerns, however, when people try to involve legal questions and simply mention the names of legal documents… that does not help a bit. It would have been great if the video commented on precise breaches of IL that LAWs could be responsible for.

    Reply
  96. Normal Person Posted on December 1, 2019 at 5:16 pm

    "This isn't right"

    Reply
  97. Assasin Phantom Posted on December 27, 2019 at 5:35 am

    Why are you talking about war? We are living in modern generation where almost all of the country become united. I think that those weapons are invented to fight terrorism. If a certain country starts a war, then all of the country that's part of the UN will take action against that particular country.

    Reply