In the aftermath of the tragedy of the Dallas shootings this week, one particular detail has caught the eye of many observers. The Dallas Police Department (DPD) killed the shooter with a robot.

“We saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was,” Chief David Brown said in a press conference Friday morning. “Other options would have exposed our officers to grave danger. The suspect is deceased … He’s been deceased because of a detonation of the bomb.”

That use of a robot raises questions about the way police adopt and use new technologies. While many police forces have adopted robots—or, more accurately, remote-controlled devices—for uses like bomb detonation or delivery of non-lethal force like tear gas, using one to kill a suspect is at least highly unusual and quite possibly unprecedented.

Unprecedented, but also likely irrelevant under the legal use-of-force standard:

“The circumstances that justify lethal force justify lethal force in essentially every form,” [said Seth Stoughton, an assistant professor of law at the University of South Carolina]. (July 18, 2016 update: this quote was previously misattributed to Peter W. Singer.) “If someone is shooting at the police, the police are, generally speaking, going to be authorized to eliminate that threat by shooting them, or by stabbing them with a knife, or by running them over with a vehicle. Once lethal force is justified and appropriate, the method of delivery—I doubt it’s legally relevant.”

Police have long had the authority to use lethal force when a suspect has committed a felony and poses an immediate danger to the public.

In such a scenario, a police officer can deploy lethal force with their bare hands, a gun, a knife, a baton, or a taser. A robot is merely the newest tool in the toolkit. It’s being operated at a distance, but is it really any different from a sniper shooting a suspect from 300 yards away?

Yet if robots armed with deadly force are just another tool, another variation of “arms,” can the general public own killer robots as well?

What Is A Robot?

Before getting to the legality of the matter, we have to define exactly what makes a robot a robot. Because almost every device has a computer embedded in it today, people refer to almost anything as a robot, often because it’s a cheap marketing hack to get people to pay attention to a product.

Is a drone operated by a human a robot? Does a machine have to be completely autonomous (i.e., not human-controlled) to qualify as a robot?

From a legal and regulatory perspective, there doesn’t appear to be a set definition of what qualifies a machine as a robot. In a law review article entitled How Should the Law Think About Robots?, Neil M. Richards, a professor of law, and William D. Smart, an associate professor of computer science, both at Washington University in St. Louis, offer the following definition:

A robot is a constructed system that displays both physical and mental agency, but is not alive in the biological sense.

That is to say, a robot is something manufactured that moves about the world, seems to make rational decisions about what to do, and is a machine.

In Robotics And The Lessons Of Cyberlaw, Professor Ryan Calo, University of Washington School of Law, states:

Robots are best thought of as artificial objects or systems that sense, process, and act upon the world to at least some degree.

What About Drones?

What the public commonly refers to as drones are known in regulatory and legal terms as unmanned aircraft systems (UAS) or unmanned aerial vehicles (UAV). The U.S. Federal Aviation Administration (FAA) defines UAS as:

(9) UNMANNED AIRCRAFT SYSTEM.—The term ‘‘unmanned aircraft system’’ means an unmanned aircraft and associated elements (including communication links and the components that control the unmanned aircraft) that are required for the pilot in command to operate safely and efficiently in the national airspace system.

The legal and regulatory difference between drones/UAS and robots seems to be one of autonomy. A UAS may include some type of auto-piloting system, but it also relies on human operator control. Instead of being fully independent, autonomous systems, drones/UAS are hybrids: capable of programmed, autonomous function or of being controlled by a human.

So the distinction between a robot and a drone seems to come down to the degree of autonomy.

What Is A Murderbot?

It depends.

Likely the closest analogy to a private citizen using a robot for lethal force is the spring-gun. A spring-gun is a firearm rigged so that it discharges when triggered by some other device, such as a tripwire or pressure plate. In the past, property owners set up spring-guns to guard distant, remote, or otherwise unattended property.

Essentially, spring-guns are booby traps.


Spring gun (CC BY-SA 3.0) via Wikipedia

Spring-guns were commonly used in the 18th and 19th centuries to protect property. Their use fell out of favor due to increased tort liability: multiple cases in England and the U.S. held that the property owner was liable for a trespasser’s injuries when the trespasser tripped the spring-gun and it discharged. Why?

Because the value of human life and limb is of greater importance than that of any property. As such, a property owner:

cannot gain a privilege to install, for the purpose of protecting his land from intrusions harmless to the lives and limbs of the occupiers or users of it, a mechanical device whose only purpose is to inflict death or serious harm upon such as may intrude, by giving notice of his intention to inflict, by mechanical means and indirectly, harm which he could not, even after request, inflict directly were he present. Restatement (First) of Torts § 85 (1934)

The most commonly referenced case in U.S. law is Katko v. Briney, 183 N.W.2d 657 (Iowa 1971), which held that the use of spring-guns is not allowed solely for the protection of property. An exception was recognized: the “only time when setting of a spring gun is justified would be if trespasser was committing a felony of violence or a felony punishable by death, or where trespasser was endangering human life by his act.”

In essence, spring-guns fit the definition of a “robot” (see the sketch after this list). They are:

  • artificial, constructed systems (a mechanized device made by man),
  • that sense the world (input from a pressure device), and
  • independently act upon the world to some degree (discharge a round).
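
To make the comparison concrete, here is a minimal, purely illustrative Python sketch of a spring-gun modeled as a sense-and-act system in the spirit of the definitions above. The names (PressurePlate, SpringGun, step) are hypothetical, invented for this example rather than taken from any real device or statute.

```python
# Illustrative sketch only: a spring-gun modeled as a minimal "robot"
# under the sense/process/act definition. All names are hypothetical.

from dataclasses import dataclass


@dataclass
class PressurePlate:
    """Sense: a crude sensor that reports whether it is being stepped on."""
    triggered: bool = False

    def read(self) -> bool:
        return self.triggered


class SpringGun:
    """A constructed system that senses, trivially "processes," and acts."""

    def __init__(self, sensor: PressurePlate) -> None:
        self.sensor = sensor
        self.discharged = False

    def step(self) -> None:
        # Process: the "decision" is a single hard-wired rule.
        if self.sensor.read() and not self.discharged:
            self.act()

    def act(self) -> None:
        # Act: discharge a round into the world.
        self.discharged = True
        print("BANG")


if __name__ == "__main__":
    plate = PressurePlate()
    gun = SpringGun(plate)

    gun.step()              # no sensor input, so nothing happens
    plate.triggered = True  # a trespasser steps on the plate
    gun.step()              # the device acts with no human in the loop
```

The point of the sketch is that even a device this crude satisfies all three properties: it is constructed, it senses, and it acts, with no human decision between the sensing and the acting.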

Given this historical legal context, it seems that ownership of a murderbot by a private citizen is well established.

A Matter Of Control

Here is a non-lethal example of a murderbot:

In the above video, a Nerf turret has been combined with a camera system and a tracking algorithm that automatically fires at moving targets. But it also has a manual control switch: the robot can flip from autonomous mode to manual control in an instant.

The above robot is closely analogous to a spring-gun. The only real difference is that the triggering mechanism is no longer a crude tripwire, but a highly sophisticated visual tracking system.

If the Nerf turret were armed with lethal force and acting independently and autonomously, it would likely fall under laws similar to those governing spring-guns. That is, it could not be used except when a “trespasser was committing a felony of violence or a felony punishable by death, or where trespasser was endangering human life by his act.”

But there is also the matter of control. The above turret can shift from autonomous to manual control with the flip of a switch. Does this transfer of control from automation to a human also shift liability?

Under manual control, the turret is no longer independently acting on the world; it is acting at the command of a human operator. Is proximity to the device a legally relevant circumstance? That is, does it matter whether your finger is physically on the gun, or whether you’re pulling the trigger remotely from a control device 100 yards away?
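
Here is a minimal, purely illustrative Python sketch of how such a mode switch might look in software. The names (Mode, TurretController, update) are hypothetical and are not taken from the system in the video; the point is only that the same hardware can act on its own decision in one mode and on a human’s decision in the other, which is exactly the distinction the liability question turns on.

```python
# Illustrative sketch only: a turret controller that flips between
# autonomous and manual modes. Names and structure are hypothetical.

from enum import Enum, auto


class Mode(Enum):
    AUTONOMOUS = auto()
    MANUAL = auto()


class TurretController:
    def __init__(self) -> None:
        self.mode = Mode.AUTONOMOUS

    def set_mode(self, mode: Mode) -> None:
        # The "flip of a switch": control shifts between machine and human.
        self.mode = mode

    def update(self, target_detected: bool, operator_trigger: bool) -> None:
        if self.mode is Mode.AUTONOMOUS:
            # The tracking algorithm decides: the device acts independently.
            if target_detected:
                self.fire(decided_by="tracking algorithm")
        else:
            # A human operator decides: the device is just a remote tool.
            if operator_trigger:
                self.fire(decided_by="human operator")

    def fire(self, decided_by: str) -> None:
        print(f"Firing (decision made by: {decided_by})")


if __name__ == "__main__":
    turret = TurretController()
    turret.update(target_detected=True, operator_trigger=False)  # machine decides to fire
    turret.set_mode(Mode.MANUAL)
    turret.update(target_detected=True, operator_trigger=False)  # human declines to fire
    turret.update(target_detected=False, operator_trigger=True)  # human decides to fire
```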

Defending Your Castle

The question of private citizens owning murderbots is further complicated by the castle doctrine. Over twenty states have passed some variant of castle doctrine laws.

The “Castle Doctrine” is a long-standing American legal concept arising from English Common Law that provides that one’s abode is a special area in which one enjoys certain protections and immunities, that one is not obligated to retreat before defending oneself against attack, and that one may do so without fear of prosecution.

Given that citizens in these states have no obligation or duty to retreat from their home when someone trespasses on their property, and given the above analysis regarding robots and autonomy, it would seem a citizen is free to operate a robot using lethal force to protect their property, without regard to the limiting conditions of “a felony of violence or a felony punishable by death, or where trespasser was endangering human life by his act,” as long as the robot remains under human control and is not operating autonomously.

There doesn’t seem to be any distinction in statutes or case law regarding distance, information transmission, or intervening steps as they relate to the use of lethal force by a mechanical device or system.

But that is changing, as some states have introduced bills that would limit the arming of UAS.

Except as otherwise provided by law, no person shall operate or use any computer software or other technology, including, but not limited to, an unmanned aerial vehicle, as defined in subdivision (29) of section 15 of the general statutes, as amended by this act, that allows a person, when not physically present, to release tear gas or any like or similar deleterious agent or to remotely control a deadly weapon….

That seems sound on its face, but it merely shifts the question to what it means to be “physically present.” In the same room? On the same floor? In the same house? On the same piece of property?

Rise Of The Robots

Robots, particularly lethally armed robots, are introducing a whole swath of ethical, legal, and regulatory problems. And most of those problems are only going to be resolved in retrospect. Technology is simply moving too fast for the government and lawmakers to keep up.

Privately owned, lethally armed robots likely present problems similar to those of 3D printing. It doesn’t matter if legislatures pass laws against the printing of firearm components; the cat is out of the bag. You can download schematics off the internet and print them to your heart’s content, in the privacy of your home, with a 3D printer. The right to print arms is here.

Legislatures may ban the sale of lethally armed robots, but there is nothing stopping people from assembling their own. Lethally armed robots are already here (see the video below).

And unfortunately, nothing is likely to get sorted out until after a privately owned robot kills a trespasser on private property.

 
