“Garbage in, garbage out” is one of the most important principles in computer science, but many people seem unable to internalize it. Maybe it’s the power of marketing; maybe it’s the decades-old “let’s enhance” trope in popular entertainment; or maybe it’s Arthur C. Clarke’s old observation that “any sufficiently advanced technology is indistinguishable from magic”—whatever the reason, it’s not unusual for folks to assume that a fancy computer system will always produce valuable outputs, regardless of the quality of the inputs.
When it comes to facial recognition, the failure to grok GIGO can have tragic outcomes. I wrote about this a month ago, when the Federal Trade Commission (FTC) banned Rite Aid from using such systems for five years—largely thanks to low-quality imagery, the drugstore chain often falsely accused people of being shoplifters, with harmful, humiliating consequences.
Now, as the Guardian reports, a Texas man has made some particularly horrific allegations about treatment he says resulted from bad facial recognition—again, thanks to low-quality imagery.
Harvey Eugene Murphy Jr is suing Macy’s and Sunglass Hut parent EssilorLuxottica for falsely identifying him as the perpetrator of a Houston store robbery that took place in 2022 when he was living in California. Murphy was arrested last October and released a couple of weeks later after his alibi was confirmed—and, he claims, after he was beaten and gang-raped in jail. “All of this happened to Murphy because the Defendants relied on facial recognition technology that is known to be error prone and faulty,” the suit states.
Using grainy pictures to wreck people’s lives is bad enough, but a startling Wired article yesterday detailed some truly next-level misplaced faith in technology’s ability to solve crimes.
There’s a company called Parabon NanoLabs that uses machine learning to transmogrify DNA samples into a 3D rendering of a person’s face. To Parabon’s credit, the company seems cautious about making overly wild claims about the accuracy of its results—it told Wired that it generates rough predictions rather than anything that can be used for “individual identifications”—but some of its law-enforcement clients seem to think the renderings can then be fed into facial recognition systems to help crack cases. At least one homicide detective made such a request; several officers told the publication they think the method is worth considering.
As Electronic Frontier Foundation general counsel Jennifer Lynch commented in the piece, “There’s no real evidence that Parabon can accurately produce a face in the first place,” so running its renderings through facial-recognition systems “puts people at risk of being a suspect for a crime they didn’t commit.”
So sure, let’s enhance, but let’s also be deeply skeptical of the results. It’s a good thing that U.S. lawmakers are now pushing for more scrutiny of the impact of police facial recognition on people’s civil rights.
More news below, but do check out Term Sheet writer Allie Garfinkle’s interview with Keith Rabois, in which the VC bigs up Miami and slams Silicon Valley—and Stanford grads, of which he is one. Also recommended: Eleanor Pringle’s piece on YouTube sensation MrBeast puncturing the revenue hype around his first X video post.
David Meyer
Want to send thoughts or suggestions to Data Sheet? Drop a line here.
NEWSWORTHY
Chinese gaming rules. Draft guidelines for new Chinese curbs on the gaming industry have mysteriously vanished from the government’s website. As TechCrunch notes, the surprise release of the draft rules last month whacked the market; their disappearance accordingly gave a fillip to Tencent and NetEase shares. A key propaganda official reportedly lost his job over the debacle.
Microsoft’s underdog benefits. Microsoft’s Edge browser and Bing search engine—and its advertising services—are reportedly likely to escape the heaviest obligations imposed by the EU’s new Digital Markets Act antitrust law. Per Bloomberg, the European Commission is “leaning toward the reprieve” because of the services’ non-dominant positions in their respective markets.
Amazon’s French fine. The French data protection regulator CNIL has fined Amazon $35 million for “excessive” surveillance in its logistics facilities, Reuters reports. The watchdog issued the penalty over systems that tracked the idle time of workers’ scanners and measured the speed of scanning. “More generally, the CNIL considered it was excessive to keep all the data collected by the system, as well as the resulting statistical indicators, for all employees and temporary workers, for a period of 31 days,” it added.
ON OUR FEED
“These messages appear to be an unlawful attempt to disrupt the New Hampshire Presidential Primary Election and to suppress New Hampshire voters.”
—The New Hampshire Department of Justice decries robocalls that purported to be a recording of President Biden, urging people not to bother voting in the state’s upcoming presidential primary. The department said the message “appears to be artificially generated.” Get ready for a whole lot of these shenanigans this year.
IN CASE YOU MISSED IT
Fintech startup Brex is laying off 20% of staff, by Kylie Robison
Streaming TV subscribers in the U.S. jumped unexpectedly at the end of 2023—now analysts are rethinking a key ‘narrative’ about the business, by Rachyl Jones
AI far too expensive to replace humans in most jobs, MIT study finds, by Bloomberg
AI chatbot calls itself ‘useless,’ writes elaborate poem about its shortcomings, and says it works for ‘the worst delivery firm in the world’, by Marco Quiroz-Gutierrez
Tech CEO dies after falling to the stage at company event, by Chris Morris
Another AI unicorn? $80 million Series B led by Andreessen Horowitz yields a $1.1 billion valuation, source says, by Paolo Confino
SEC blames ‘SIM swap’ attack for disastrous X hack ahead of Bitcoin ETF approval, by Leo Schwartz
The world needs an International Decade for Data–or risk splintering into AI ‘haves’ and ‘have-nots,’ UN researchers warn, by Tshilidzi Marwala and David Passarelli (Commentary)
BEFORE YOU GO
iOS security update. iPhone and iPad users should hurry to download the latest iOS/iPadOS update, 17.3, on security grounds. For one thing, Apple says it patches a vulnerability in the WebKit browser engine that could allow “maliciously crafted web content [to] lead to arbitrary code execution,” and that the flaw may already have been exploited.
But, perhaps more importantly, the update also introduces a feature called Stolen Device Protection, which demands the use of Face ID or Touch ID to change sensitive settings when the device isn’t at a regular location such as the owner’s home or workplace. For really sensitive actions, like signing out of an Apple ID or disabling the Find My tool, the feature also imposes an hour-long delay if the phone is somewhere unfamiliar. The idea is to thwart thieves who know the owner’s passcode and hope to quickly make the device untraceable.
The feature is turned off by default, so head on over to the Face ID & Passcode section in the settings app to activate it.
This is the web version of Data Sheet, a daily newsletter on the business of tech. Sign up to get it delivered free to your inbox.