Strict Liability for Killer Robots?

The Almost Human episode “Unbound” raised an interesting question: what is your liability for building a robot that goes on a killing spree?

As explained in the story, Nigel Vaughn (played by John Larroquette) built an XRN named Danica that went on a three-day rampage at her “product demo” for elected officials and other VIPs. The XRN was built for combat and killed at least 26 police officers.

What would be Vaughn’s civil liability for Danica’s extremely high death count?

A plaintiff could bring a products liability claim against Vaughn on the theory that Danica (a product) had a defect in either her design or manufacture. Merrill v. Navegar, Inc., 26 Cal. 4th 465, 479 (2001), citing Soule v. General Motors Corp. (1994) 8 Cal. 4th 548, 560.

Case law holds that in a strict liability case based on defective design, “a product is defective . . . either (1) if the product has failed to perform as safely as an ordinary consumer would expect when used in an intended or reasonably foreseeable manner, or (2) if . . . the benefits of the challenged design do not outweigh the risk of danger inherent in such design.” Merrill, 26 Cal. 4th at 479, citing Barker v. Lull Engineering Co. (1978) 20 Cal. 3d 413, 418.

Focusing on the benefits of the design versus the risk of danger, a jury could consider “the gravity of the danger posed by the challenged design, the likelihood that such danger would occur, the mechanical feasibility of a safer alternative design, the financial cost of an improved design, and the adverse consequences to the product and to the consumer that would result from an alternative design.” Merrill, 26 Cal. 4th at 479, citing Barker, 20 Cal. 3d at 431.

Danica the XRN killed in excess of 30 people over three days. This was after the DRN program had been cancelled because the android police officers were going “crazy,” with some killing themselves. Moreover, the XRN had the same type of “synthetic soul” as the DRNs, meaning the decommissioned design had not been changed. While none of the DRNs went on a three-day rampage, Dr. Vaughn was on notice of a possible defect that would impact the behavior of a combat android.

The XRN was designed for combat, not law enforcement. Given the instability of the DRNs’ “synthetic souls,” Dr. Vaughn should have been concerned with the “gravity of the danger” posed by a combat android with emotional instability.

A three-day killing rampage at a product demo would be difficult to predict. The DRNs are helpful to show that something could go very wrong, but predicting that Danica would stage a one-robot-woman Iwo Jima reenactment might not have been foreseeable. However, it was foreseeable from the DRNs’ failures that violence could occur, especially given the nature of Danica’s combat programming.

The MX program demonstrates there was a safer alternative design. While utterly lacking imagination and conversational skills, a safer android was ultimately put into police service.

Are there adverse consequences to using MXs as police officers? Yes. They simply are not human, lacking the ability to relate to those they are supposed to protect. They have neither empathy nor compassion, and those upholding the law need those qualities so the populace is not living in fear of armed robots walking the streets.

Danica ironically proved this point when a mother and her little girl got into Danica’s cab. Instead of killing them both, Danica smiled when the little girl told her, “I think you are pretty.”

A kid sweet talked a killer robot. That would not happen with an MX.

Sweet talk aside, not even the great defense attorney Dan Fielding could defend Nigel Vaughn from strict liability for Danica’s “defective” design that resulted in a three-day rampage.