Brooklyn resident Fabian Rogers knew he had to act in 2018 when his penny-pinching landlord suddenly attempted to install a facial recognition camera in the entrance of a rent-stabilized building he’d called home for years. Under the new security system, all tenants and their loved ones would be forced to submit to a face scan to enter the building. The landlord, like many others, tried to sell the controversial tech as a safety enhancement, but Rogers told Gizmodo he saw it as a sneaky attempt to jack up prices in a gentrifying area and force people like him out.
“They were trying to find ways to expedite ways of flushing people out of the building and then try to market new flipped-over apartments to gentrifiers,” Rogers told Gizmodo.
Rogers says he tried to speak out against what he saw as an invasive new security measure but quickly realized there weren’t any laws on the books preventing his landlord from implementing the technology. Instead, he and his tenant association had to go on a “muckraking tour” attacking the landlord’s reputation with an online shame campaign. Remarkably, it worked. The exhausted landlord backed off. Rogers now advocates against facial recognition on the state and national levels.
Despite his own success, Rogers said he’s seen increasing efforts by landlords in recent years to deploy facial recognition and other biometric identifiers in residential buildings. A first-of-its-kind law discussed during a fiery New York City Council hearing Wednesday, however, seeks to make that practice illegal once and for all. Rogers spoke in support of the proposed legislation, as did multiple city council members.
“We are here to address an invisible but urgent issue that affects all New Yorkers: the use of biometric surveillance technology,” Council member Jennifer Gutiérrez said in a statement. “It is our responsibility as elected officials to thoroughly examine its potential benefits and risks.”
Council members on Wednesday expressed repeated concerns over the ability of private businesses and landlords to abuse biometric identifiers or sell them off to third parties. Council member Carlina Rivera, who is sponsoring a bill restricting facial recognition in residential areas, said she feared aggressive landlords could use the tech to issue petty lease violations against tenants, which could eventually lead to their eviction. If left unchecked, she said, the racially biased algorithms driving these systems risked further fueling gentrification, threatening to “erode what should be a diverse collective identity in the city.”
Privacy and civil rights advocates say the bill—along with a sister bill seeking to ban facial recognition use in sports stadiums and other large venues—could have wide implications beyond the Big Apple and serve as an example for other local legislatures to follow.
“Facial recognition technology poses a significant threat to our civil liberties, our civil rights, and the privacy of our citizens,” National Action Network NYC Field Director Derek Perkinson said during a rally outside City Hall on Wednesday. “It is biased and broken… In the name of Al Sharpton, what’s right is right, what’s wrong is wrong.”
How would the NYC bills impact facial recognition?
The two bills under consideration during the council hearing this week would approach limiting facial recognition from two different angles. On the housing side, a bill introduced last week would make it unlawful for landlords who own multiple buildings to install biometric identification systems to scan tenants. Landlords, under this bill, would be banned from collecting biometric data on anyone unless they have “expressly consented” in writing or through a mobile app.
The other new bill, also introduced last week, would modify administrative laws to prohibit places or providers of public accommodation from using biometric identifying technology. These public accommodations could include retail stores, movie theaters, sports stadiums, and hotels, and could directly implicate Madison Square Garden, which gained national notoriety earlier this year for using facial recognition to identify and promptly boot attorneys from its premises. New York City already has a law requiring businesses like these to post a sign informing the public that they collect biometrics, but lawmakers and advocates say it does little to prevent wide swaths of faces from being sucked up and potentially sold to data brokers.
What happened during the NYC Council hearing on facial recognition?
Wednesday’s hearing, jointly hosted by the New York City Council’s Committees on Technology and Civil Rights, kicked off with lawmakers questioning senior members of the city’s Office of Information Privacy (OIP), which is in charge of advising the mayor and other city agencies on privacy protection and data sharing initiatives. The OIP leaders refused to offer much insight into the ways local agencies like the New York Police Department handle biometric data. Instead, one of the city’s leading data privacy bureaucrats spent the better part of two hours dancing around questions and declining to take any position on the two bills in question.
Privacy advocates testifying at the hearing were upset with the dillydallying of the OIP leaders, with one accusing administration officials of spreading “misinformation” and appearing to withhold available data. “The New York Police Department is systematically breaking transparency and oversight laws,” Surveillance Technology Oversight Project Executive Director Albert Fox Cahn said during the hearing. Fox Cahn said the city’s current data privacy practices amounted to a “free for all.”
Council members warned facial recognition used by private businesses like Madison Square Garden could lead to an “Orwellian” reality where people of color are wrongly identified as shoplifters or some other banned person and unjustly denied entry. Not all the lawmakers were in agreement though. Council Member Robert Holden went to bat for the tech and said he believed laws restricting private firms’ freedom to use the system for security amounted to government overreach.
Biometrics: ‘If it’s compromised, it’s compromised for life.’
Advocates speaking in favor of the bill spent most of their testimony attempting to convince lawmakers of the unique threat the tech posed to residents. Fox Cahn said the “timeframe of harm” associated with biometric identifiers sets them apart from other types of personal data, since they stick with people for the entirety of their lives. “If it’s compromised, it’s compromised for life,” he said.
Others, like Surveillance Resistance Lab Senior Researcher and Organizer Alli Finn, said these surveillance tools, left unchecked, don’t just affect New Yorkers—they amount to a “monumental threat to democracy.” Even improved accuracy levels, Finn said, won’t address the underlying issue. “Increased accuracy rates will never fix the fundamental flaws,” Finn told the lawmakers. “They will always reflect the biases of those who make them.”
Rogers, the advocate who successfully fought off his landlord’s attempt to install facial recognition in his apartment building, said he was optimistic these and other bills across the country could gain traction. Still, he acknowledged some inherent difficulties of pushing back against a tool many people simply find convenient.
“Corporate convenience is what leads to techno-solutionism being the quickest go-to option,” Rogers said. “I think as long as advocates are still energized, collaborating, and trying to do the political education that makes it feasible and understandable for a fifth grader, then I think we will get to a point where folks understand regulation and enforcement is essential.”