“Please ask yourself: when was the last time you saw next-generation, world-class technology for education, healthcare, housing, etc. consistently prioritized for underserved communities like this?”
Arguing that money would be better spent on uplifting programs and policies rather than on high-tech policing of underserved communities, Rep. Alexandria Ocasio-Cortez on Thursday blasted the New York Police Department for deploying a robotic police dog in a Bronx apartment building earlier this week.
Responding to a New York Post article about the NYPD’s use of a “DigiDog” robotic K-9 unit in response to a home invasion, Ocasio-Cortez (D-N.Y.) tweeted, “Shout out to everyone who fought against community advocates who demanded these resources go to investments like school counseling instead. Now robotic surveillance ground drones are being deployed for testing on low-income communities of color with under-resourced schools.”
Minutes later, she tweeted, “Please ask yourself: when was the last time you saw next-generation, world class technology for education, healthcare, housing, etc. consistently prioritized for underserved communities like this?”
Boston Dynamics, the manufacturer of the “Spot” robot, advertises the 70-pound quadruped, which can run about three and a half miles an hour and climb stairs, for $74,500 retail.
“Spot comes ready to operate, right out of the box,” the company’s website says. “With its flexible API [application programming interface] and payload interfaces, Spot can be customized for a variety of applications.”
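That customization happens through the company’s publicly documented Spot Python SDK. As a rough illustration only, a minimal client built on the `bosdyn-client` package might look like the sketch below; the hostname and credentials are placeholders, and exact lease-handling calls vary between SDK releases.

```python
# Illustrative sketch only: connect to a Spot robot with the bosdyn-client SDK,
# power it on, and command it to stand. Hostname and credentials are placeholders;
# real deployments also register an e-stop endpoint before powering on.
import bosdyn.client
from bosdyn.client.lease import LeaseClient, LeaseKeepAlive
from bosdyn.client.robot_command import RobotCommandClient, blocking_stand

sdk = bosdyn.client.create_standard_sdk('ExampleSpotClient')
robot = sdk.create_robot('192.168.80.3')     # robot's IP address (placeholder)
robot.authenticate('username', 'password')   # placeholder credentials
robot.time_sync.wait_for_sync()              # required before issuing commands

lease_client = robot.ensure_client(LeaseClient.default_service_name)
lease = lease_client.acquire()               # take control of the robot
with LeaseKeepAlive(lease_client):
    robot.power_on(timeout_sec=20)           # spin up the motors
    command_client = robot.ensure_client(RobotCommandClient.default_service_name)
    blocking_stand(command_client, timeout_sec=10)  # command Spot to stand up
```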
The NYPD version of the robot, which is still in its testing phase, is equipped with additional cameras and lights.
“This dog is going to save lives, protect people, and protect officers and that’s our goal,” NYPD Technical Assistance Response Unit Inspector Frank Digiacomo told WABC in December. “This robot is able to use its artificial intelligence to navigate things [in] very complex environments.”
Boston Dynamics says it does not want Spot weaponized. And while the company says its robots aren’t meant to kill people, it’s easy to imagine how they could be customized or otherwise used to do just that. In 2016, Dallas police rigged a bomb disposal robot with explosives to blow up a sniper who had killed five officers.
Boston Dynamics’ robot dog is now armed—in the name of art. A group called MSCHF gave Spot a paintball gun and plan to let others remotely control it inside an art gallery.
But not everyone’s laughing. Here’s how the prank’s sparked a kerfuffle: https://t.co/6YIyjDdeYq 📽️: MSCHF pic.twitter.com/HaqrgAhJ1k
— WIRED (@WIRED) February 23, 2021
Matthew Guariglia, privacy policy analyst at the San Francisco-based digital rights advocacy group Electronic Frontier Foundation, wrote earlier this year that “the arrival of government-operated autonomous police robots does not look like predictions in science fiction movies.”
“An army of robots with gun arms is not kicking down your door to arrest you,” said Guariglia. “Instead, a robot snitch that looks like a rolling trash can is programmed to decide whether a person looks suspicious—and then call the human police on them.” And while “police robots may not be able to hurt people like armed predator drones used in combat—yet—as history shows, calling the police on someone can prove equally deadly.”
This is especially true given the well-documented algorithmic racial bias to which artificial intelligence is prone.
A more insidious threat of police robots, says Guariglia, involves surveillance.
“The next time you’re at a protest and are relieved to see a robot rather than a baton-wielding officer, know that that robot may be using the IP address of your phone to identify your participation,” he wrote. “This makes protesters vulnerable to reprisal from police and thus chills future exercise of constitutional rights.”
Common Dreams’ work is licensed under a Creative Commons Attribution-Share Alike 3.0 License. Feel free to republish and share widely.