It was a somber day, and the aftermath continues. The Federal Bureau of Investigation, along with Washington, D.C., police and other state authorities, is now hunting down the perpetrators. One of the tools being employed to identify the individuals caught in photos of the event is facial recognition technology. In fact, Clearview AI, a facial recognition technology company, said it saw a 26 percent jump in use of its product the day after the mob's attack.
While there’s a public will to punish those who put U.S. democracy and human lives at stake, we find ourselves in yet another debate over whether security should always take precedence over privacy in times of national crisis. The use of facial recognition technology remains controversial globally, and only a few municipalities and cities in the U.S. have regulated or outright banned its use. At the federal level, there are no hard-and-fast rules about how law enforcement can apply the technology, nor are there standards or best practices governing how it does so.
In this week’s edition of the Privacy Insider newsletter, you’ll see evidence of this ongoing debate. The Federal Trade Commission, in fact, made history this week by reaching a settlement with a photo-sharing company that allegedly created facial recognition algorithms using photos users had uploaded. The users didn’t consent to such use, so, while there isn’t a facial recognition technology statute the FTC could base a case on, the agency could nail the company for deceiving its users — an authority granted to the agency under Section 5 of the FTC Act.
And in New York, the state is considering a bill on facial recognition technology. It’s one to watch, because it allows for a private right of action for violations of the law. Only one other U.S. state law has a similar model: the Illinois Biometric Information Privacy Act. Plenty of lawsuits have been filed citing BIPA violations, so it’s an important provision for New York to consider — and one that will no doubt incite plenty of debate if the bill moves forward in the coming months.
Enjoy reading, and we’ll see you next week!
Here are the top stories you might have missed:
2. FTC settles with photo-storage company over its facial-recognition technology
The Federal Trade Commission made history this week when it reached a settlement with a photo storage app the agency claims used customers’ photos to develop facial recognition technology without telling them, The Verge reports. The settlement requires Everalbum Inc. to delete the photos and videos as well as any facial recognition algorithms built using them. It’s the first time the FTC has enforced a case “focused primarily on the misuse” of facial recognition technology, the agency tweeted on Jan. 11.
3. High Court rules against intelligence agencies’ bulk surveillance practices
A U.K. High Court decision against intelligence agencies’ bulk surveillance tactics has privacy advocates claiming victory, Infosecurity Magazine reports. Privacy International fought to end the practice at the Investigatory Powers Tribunal. The Tribunal ruled in favor of intelligence agencies in 2016. But the High Court last week said authorities are wrong to search individuals' property “without lawful authority, even in cases of national security,” the report states.
4. Are Apple’s new Privacy Nutrition Labels legit?
Apple recently revealed its Privacy Nutrition Labels in the App Store, but they’re flawed. Fast Company reports that it’s impossible to verify app developers’ disclosures within their nutrition labels because “it’s entirely self-reported. … There is currently no way for Apple to know what an app does with user data after the data is sent to the app. But by calling it equivalent to ‘Privacy Nutrition Labels,’ Apple irresponsibly implies that this privacy information is vetted, when that is absolutely false.”
5. New York to consider bipartisan biometric privacy law
New York state lawmakers introduced a bipartisan bill last week that would impose rules around biometric data. If the New York Biometric Privacy Act (AB 27) passes, New York would be one of only a few states with biometric laws on the books. The law would allow for a private right of action for violations. It would require written consent from individuals before their biometric data could be collected, and it would restrict businesses from selling or profiting from biometric data, JD Supra reports.
6. New Zealand central bank hacked via third-party vendor
New Zealand’s central bank reported one of its databases had been breached after a hacker gained access to one of its third-party file-sharing vendors, the Associated Press reports. Commercially and personally sensitive information was accessed. “We are working closely with domestic and international cybersecurity experts and other relevant authorities as part of our investigation and response to this malicious attack,” said Governor Adrian Orr.
7. A debrief on what you should know about the UK Information Commissioner’s Office
The Information Commissioner’s Office (ICO) might not be a pop-culture term, but it’s certainly well known to anyone following privacy and data protection. That’s because it’s the U.K.’s data protection authority and one of the most active data protection authorities in Europe. It gained a bit of mainstream fame in 2018, when its enforcement officers raided the offices of Cambridge Analytica, the infamous data analytics firm behind the Facebook data scandal. This Osano blog post outlines some of the ICO’s recent actions and what makes it a regulator you should know in the privacy space.
8. Opinion: Canada’s new privacy bill encouraging, but needs work
In an opinion piece for CBC News, Vass Bednar and Mark Surman examine Canada’s recently proposed Consumer Privacy Protection Act, or Bill C-11. The act is an opportunity for Canada to set an example beyond its borders, the authors write. “Just like pollution, abuse of data affects individuals and the collective. When we're on Facebook or YouTube, your data is mixed with my data. In order to get better treatment from online services, we need a way to push for our rights together, not just as individuals. Otherwise, the burden on individuals to manage their digital privacy will remain absurdly high.”