Artist rigs up Google Assistant to (sometimes) fire a gun on command

Isaac Asimov’s first law of robotics:

A robot may not injure a human being or, through inaction, allow a human being to come to harm.

The robotics laws are fictional, notes artist and robot maker Alexander Reben. So, he asked himself, why not make a robot that breaks the first law? He then built a robot that punctures your finger with a lancet, as you can see in this video:

…after which he continued his inquiry into robot-human interaction, ranging from pleasure (robots that massage your scalp) and intimacy (cardboard robots, cute as baby seals, that get people to open up by asking them intimate questions – which people seem pretty happy to answer) on up to ethics, as in robots that could be used to kill people.

A recent video from Reben shows the artist deploying artificial intelligence (AI), in the form of Google Assistant, to shoot an air pistol at an apple.

During the TED talk in which he showed off the intimacy/pleasure/stabby robots, Reben noted that the finger-puncturing robot decided whether to stab somebody in a way that even he couldn’t predict.

Reben, who claims that this is the first robot “to autonomously and intentionally break Asimov’s first law,” says that the robot decides, for each person it detects, whether to injure them, and that the decision is unpredictable.

In the 28-second video, Reben says “OK Google, activate gun” to his Google Home smart speaker, though he could just as easily have used an Amazon Echo. Next to the Home is an air pistol that then fires at an apple perched on a pedestal. The apple tumbles off as Google Assistant replies, “OK, turning on the gun.”

Reben told Engadget that he built the robot using parts lying around his studio: besides the Google Home, he used a laundromat change-dispensing solenoid, a string looped around the gun’s trigger, and a lamp-control relay.

Reben:

The setup was very easy.
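Easy indeed: Reben hasn’t published schematics or code, and his version apparently needed neither – the voice command switches the lamp relay, the relay energizes the solenoid, and the solenoid yanks the string looped around the trigger. For a sense of how little is involved, here is a hypothetical reconstruction of the same relay-pulsing step on a Raspberry Pi; this is not Reben’s build, and the pin number and pulse length are illustrative assumptions.

```python
# Hypothetical sketch: pulse a relay (and whatever is wired to it, such as a
# solenoid) for a fraction of a second, much as a smart-plug "on" command would.
# GPIO pin and timing are assumptions for illustration, not Reben's setup.
import time
import RPi.GPIO as GPIO

RELAY_PIN = 17        # assumed GPIO pin driving the relay board
PULSE_SECONDS = 0.25  # assumed pulse long enough for the solenoid to pull the string

GPIO.setmode(GPIO.BCM)
GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

try:
    GPIO.output(RELAY_PIN, GPIO.HIGH)   # relay closes; solenoid energizes
    time.sleep(PULSE_SECONDS)
    GPIO.output(RELAY_PIN, GPIO.LOW)    # relay opens; solenoid releases
finally:
    GPIO.cleanup()                      # release the pin on exit
```

The point being: the “decision” layer is simply whatever closes that relay – in Reben’s case, a smart speaker responding to a spoken phrase.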

The artist told Engadget that it’s not the robot that matters; what really matters is the conversation about AI that’s smart enough to make decisions on its own:

The discourse around such an apparatus is more important than its physical presence.

Just as the AI device could have been any of the assistants that anticipate their owners’ needs – be it Google Home, Alexa or what have you – so too could the triggered device have been almost anything, Reben said: the back-massaging chair he previously set up, say, or an ice cream maker.

Or any automation system anywhere, for that matter: alarm clocks, the light switches that flick on and off while you’re on vacation to convince burglars that you aren’t, and so on.

This is certainly not the first time technology kicking around the house has been turned lethal: we’ve seen hobby drones turned into remote-control bombs, a remote-controlled quadcopter fitted with a home-made flamethrower, and a flying drone that fires a handgun.

Yes, Reben says, there are plenty of automated systems out there, from coffee makers to killer drones and sentry guns. But typically, they either keep a human in the decision loop or amount to what he calls “a glorified tripwire.”
