In a letter to House of Representatives leaders on Wednesday, over 100 racial justice and civil liberties groups called on Congress to end federal funding for surveillance technologies that police have used to spy on activists and demonstrators. The New York City Council voted to pass the Public Oversight of Surveillance Technology (POST) Act, a bill requiring the New York City Police Department (NYPD) to disclose its use of surveillance tools. And Detroit residents and activists crowded a city council meeting to oppose a contract that would expand local law enforcement's use of facial recognition.
Even before such legislation, some vendors voluntarily committed to ending or pausing their relationships with police. Amazon and Microsoft said they would stop selling facial recognition to law enforcement, at least temporarily, citing a lack of regulation. IBM announced it would exit the general-purpose facial recognition business, and Foursquare, a location technology platform used by a number of mobile apps and services, decided not to provide analytics on data from the recent protests.
While this appears to be a step in the right direction, a healthy dose of skepticism about these companies' intentions is warranted. Tech giants are quick to attribute their actions (or reactions) to noble concerns, but financial or political motivations often play a role. Deborah Raji, a technology fellow at the AI Now Institute, found that IBM had quietly removed facial recognition features from its APIs last fall, and the Associated Press reported that the company's decision "is unlikely to affect its bottom line." Amazon, meanwhile, is under increasing pressure from regulators, consumers, shareholders, and employees as its Rekognition cloud service continues to be sold to police.
Even companies with less of a horse in the race may not be acting out of the goodness of their hearts. Microsoft, whose president Brad Smith recently described the company's position on facial recognition as "principled," had tried to sell its facial recognition technology to the U.S. Drug Enforcement Administration as recently as 2017, according to emails obtained by the American Civil Liberties Union. And a report from The Intercept and the nonprofit Investigative Fund found that IBM, in collaboration with the NYPD, developed a system that allowed officials to search camera footage for people by skin color, hair color, gender, age, and other attributes.
Writing in OneZero this week, Jathan Sadowski, a postdoctoral fellow in smart cities at the University of Sydney, borrows a phrase from Chris Gilliard, a professor at Macomb Community College: "Black power-washing." While moratoriums on the sale of surveillance technologies carry more weight than, say, tweets against systemic racism, they are ultimately calculated decisions. When "doing good" happens to serve a company's best interests, such measures fall short of a genuine ethical commitment.
Sadowski advocates holding companies accountable by dismantling police surveillance infrastructure outright, from Stingray devices for tracking mobile phones and real-time analytics to predictive algorithms and data acquisition systems. For one thing, he argues, these technologies are deeply flawed at a technical level. Take facial recognition: a study published by the National Institute of Standards and Technology (NIST) last December found that systems misidentified Black people more often than white people, echoing groundbreaking work on facial recognition by Raji, Joy Buolamwini, Timnit Gebru, and Helen Raynham that reached the same conclusion.
The POST Act – and the 13 similar laws passed by cities across the country – is a positive step. So are the regulations on law enforcement's use of drones in 44 states; the facial recognition bans in San Francisco, Oakland, Somerville, Brookline, and San Diego; and the reform bill proposed by Democrats in both houses of Congress, which includes restrictions on the use of facial recognition with police body cameras. All of these measures would keep companies like Palantir, Ring, and Clearview AI – and to a lesser extent Microsoft, Google, and Amazon – from entering into contracts that violate ordinary people's rights.
"The existing order of power is unsustainable," Sadowski writes. "The companies that have profited from [surveillance] technologies and the governments that have used them against the public have no moral authority to tell us what to keep and what to dispose of. Dismantling the machinery of policing is a necessary way to counter the forms of power exercised through this infrastructure."
This increasingly seems to be the best option.
For AI coverage, send news tips to Khari Johnson and Kyle Wiggers – and be sure to subscribe to the weekly AI newsletter and bookmark our AI channel.
Thank you for reading,
AI Staff Writer