Apple details its CSAM detection system, says it expects to set a match threshold of 30 known CSAM images before an iCloud account is flagged for manual review
Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week …
MacRumors · Joe Rossignol
Related Coverage
- Security Threat Model Review of Apple's Child Safety Features Apple
- After criticism, Apple to only seek abuse images flagged in multiple nations Reuters
- Apple details the protective layers against misuse of CSAM detection and Communication Safety in Messages systems iThinkDifferent · Rida Imran
- Apple says its new child safety feature will look for images flagged in multiple countries Neowin · Fiza Ali
- Apple's controversial new child protection features, explained The Verge · Adi Robertson
- Apple's controversial plan to try to curb child sexual abuse imagery The Verge · Kim Lyons
- Apple races to temper outcry over child-porn tracking system South China Morning Post
- Apple details the ways its CSAM detection system is designed to prevent misuse 9to5Mac · Chance Miller
- Apple Shares How its Photo-Scanning System Is Protected Against Abuse PetaPixel · David Crewe
- Apple Will Keep Clarifying This CSAM Mess Until Morale Improves Gizmodo · Victoria Song
- Apple Details its CSAM Detection System's Privacy and Security iPhone in Canada Blog · Usman Qureshi
- Apple warns retail and online employees to be ready for iPhone backdoor questions MacDailyNews
- Apple's New Scanning Tools Raising More Concerns, Even Inside Apple Techdirt · Mike Masnick
- Apple offers another look at the security and privacy baked into CSAM photo scanning iDownloadBlog.com · Evan Selleck
- Tim Sweeney: No cloud provider with better privacy policy than Apple iMore · Stephen Warwick
- Green and Stamos: Apple has now sent a clear message that it is safe to build and use systems that directly scan people's personal phones for prohibited content New York Times
- Action on sexual abuse images is overdue, but Apple's proposals bring other dangers The Guardian · Ross Anderson
- Apple, You Broke Your Privacy Promises and our Hearts The Mac Observer · John Kheit
- Apple's New CSAM Detection Feature Could've Been Communicated Better, Craig Federighi Admits In Interview Redmond Pie · Oliver Haslam
- WSJ: Craig Federighi comments in response to criticism of the child porn detection software MACお宝鑑 … · Danbo
- Apple exec defends the company's much-criticized plan to scan iPhones for child abuse images, saying the feature has been misunderstood Insider · Kevin Shalvey
- CORE PROBLEM Apple changes course on plan to scan users' iPhones for child abuse pics as exec admits company ‘jumbled’ announcement The Sun · Jack Williams
- Apple clarifies its photo scanning policy in response to backlash Tom's Guide · Alan Martin
- Apple vice president supported checking user photos on iPhone Gizchina · Abdullah
- Child safety features built to withstand snooping, says Apple Livemint · Prasid Banerjee
- How much CSAM porn is too much CSAM porn? 30 images, says Apple Philip Elmer‑DeWitt · Philip Elmer-DeWitt
- Why Apple's ‘Shock’ New Update Will Radically Change Your iPhone Forbes · Zak Doffman
- Facebook Adds End-to-End Encryption for Audio and Video Calls in Messenger The Hacker News · Ravie Lakshmanan
- Apple clarifies its sex abuse scans would look for ‘images flagged in multiple countries’ Engadget · Andrew Tarantola
- New CSAM Detection Details Emerge Following Craig Federighi Interview TidBITS · Adam Engst
- Apple applies limits to new system scanning for child sex abuse The Hill · Lexi Lonas
- Apple employees voice concern over iPhone photo scanning Reclaim The Net · Didi Rankovic
- Apple asks to be trusted, turns out many people don't MSPoweruser · Surur
- Apple: Anti-Child Porn System Won't Trigger Until at Least 30 Images Are Detected PCMag · Michael Kan
- Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears The Verge · Adi Robertson
- Apple releases more information regarding child sexual abuse photo detection MobileSyrup · Brad Bennett
- Craig Federighi Thinks We're All Confused About Apple's New iCloud Photo Scanning FrontPageTech.com · Corina Garcia
- Apple admits ‘jumbled messages’ over controversial photo-scanning tech Trusted Reviews · Chris Smith
- Apple CSAM Detection failsafe system explained SlashGear · Chris Burns
- Memo: Apple warns staff to be ready for questions about CSAM scanning, says it will address privacy concerns by having an independent auditor review the system Bloomberg · Mark Gurman
- Apple regrets confusion over ‘iPhone scanning’ BBC
- Apple Preparing Its Employees to Answer CSAM-Related Questions iPhone Hacks · Sanuj Bhatia
- Apple Could've Done A Better Job Communicating About CSAM: Apple SVP Craig Federighi fossbytes.com · Mohammed Abubakar
- Apple details user privacy, security features built into its CSAM scanning system AppleInsider · Mike Peterson
- Apple Says Its iCloud Child-Porn Scanning System Won't Trigger Alerts Until It Detects At Least 30 Images Variety · Todd Spangler
- Craig Federighi Tries to Clarify Apple's Upcoming Child Safety Features Pixel Envy · Nick Heer
- Survivors Laud Apple's New Tool To Spot Child Sex Abuse But The Backlash Is Growing NPR · Bobby Allyn
- Apple's Craig Federighi defends the new ‘Expanded Protections for Children’ saying they are not a backdoor iThinkDifferent · Rida Imran
- Apple's Software Chief Explains ‘Misunderstood’ iPhone Child-Protection Features (Exclusive) | WSJ YouTube
Discussion
-
@kurtopsahl
Kurt Opsahl
on x
Apple just released a Threat Model Review of its new scanning program, focusing on the threat of a secret attempt to add new hashes to the DB. The system is designed to be tamper-resistant and tamper-evident. But what about overt attempts? https://www.apple.com/...
-
@runasand
Runa Sandvik
on x
Apple intends to prevent misuse of the child safety feature — that is only available in the US — by working with organizations operating in different sovereign jurisdictions. https://www.apple.com/... https://twitter.com/...
-
@kenli729
Kenneth Li
on x
Because the first three briefings didn't quite cut it — After criticism, Apple to only seek abuse images flagged in multiple nations https://www.reuters.com/... @josephmenn @StephenNellis
-
@kendraserra
Kendra Albert
on x
Given Apple's heavy reliance on security researchers as an accountability mechanism for the CSAM scanning tool, I imagine that they're also going to commit to not pursuing litigation against security researchers.
-
@timsweeneyepic
Tim Sweeney
on x
The US constitution protects against arbitrary government search of one's home and personal effects. Data one stores privately and doesn't share is a personal effect. Apple backdooring iOS to examine personal iCloud data is a suspicionless search of personal effects.
-
@kendraserra
Kendra Albert
on x
Sorry I had to. https://twitter.com/... https://twitter.com/...
-
@timsweeneyepic
Tim Sweeney
on x
There's not. “If we lose freedom here, there is no place to escape to. This is the last stand on earth.” Sent from my iPhone. https://twitter.com/...
-
@matthew_d_green
Matthew Green
on x
I'm glad that Apple is feeling the heat and changing their policy. But this illustrates something important: in building this system, the *only limiting principle* is how much heat Apple can tolerate before it changes its policies. https://www.reuters.com/...
-
@sarahjamielewis
Sarah Jamie Lewis
on x
Apple have given some interviews today where they explicitly state that the threshold t=30. Which means the false acceptance rate is likely an order of magnitude *more* than I calculated in this article. https://twitter.com/...
-
@sarahjamielewis
Sarah Jamie Lewis
on x
Some quick calculations with the new numbers: 3-4 photos/day: 1 match every 286 days. 50 photos/day: 1 match every 20 days.
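A minimal sketch of the arithmetic behind those figures; the per-photo false acceptance rate of roughly 1/1000 is an assumption inferred from the quoted numbers, not a figure published by Apple or Lewis:

```python
# Back-of-the-envelope check of the cadence quoted above. The per-photo
# false acceptance rate (FAR) of ~1/1000 is an assumption inferred from
# the tweet's own numbers, not a published figure.
FAR = 1 / 1000

for photos_per_day in (3.5, 50):
    days_per_false_match = 1 / (photos_per_day * FAR)
    print(f"{photos_per_day} photos/day -> 1 false match every "
          f"{days_per_false_match:.0f} days")
# 3.5 photos/day -> 1 false match every 286 days
# 50 photos/day  -> 1 false match every 20 days
```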
-
@riana_crypto
Riana Pfefferkorn
on x
Can everyone please quit dismissing human rights concerns about Apple's plan as “hypothetical” & “speculative,” while claiming that child abuse is a harm that's actually happening now? There are 1 million Uighurs in Chinese concentration camps. Harms ARE happening now, worldwide.
-
@pwnallthethings
@pwnallthethings
on x
How about dynamic analysis? Unless Apple is planning on giving the iCloud app the get-task-allow permission so you can attach a debugger, that would be out of the question on a vanilla iPhone. You'd need to resort to jailbreaks, or, heaven forbid, a Corellium device
-
@paulbernaluk
Prof Paul Bernal
on x
We need to keep the heat on. It can work. https://twitter.com/...
-
@sarahjamielewis
Sarah Jamie Lewis
on x
In 2017 Whatsapp said they were seeing 4.5 billion photos shared per day. You can't extrapolate a false acceptance rate from 100 million tests. https://twitter.com/...
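One way to make that concrete (an illustrative sketch, not Lewis's own calculation): even zero false positives in 100 million tests only bounds the true rate at about 3 in 100 million with 95% confidence, which at WhatsApp-scale volume still leaves room for a hundred-plus false matches per day.

```python
# Illustrative "rule of three": if 0 events are observed in n independent
# trials, the ~95% upper confidence bound on the event rate is 3/n.
n_tests = 100_000_000          # Apple's reported NeuralHash test-set size
far_bound = 3 / n_tests        # 3e-8 per photo, at 95% confidence

daily_photos = 4_500_000_000   # WhatsApp's 2017 figure cited above
print(f"FAR upper bound: {far_bound:.0e} per photo")
print(f"False matches/day still consistent with that test: "
      f"{far_bound * daily_photos:.0f}")   # -> 135
```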
-
@runasand
Runa Sandvik
on x
Apple's new document brings up many good questions to unpack and explore. For example, how good is the cross-border collaboration between NCMEC and other child safety organizations? Is this a new version of Apple deciding what is “bad” enough? https://www.apple.com/...
-
@sarahjamielewis
Sarah Jamie Lewis
on x
Anyway keep up the pressure. The fact that Apple felt it necessary to do a PR blitz today along with releasing new slivers of information regarding parametrization is a good sign.
-
@evacide
Eva
on x
I am much less worried about secret attempts to add new hashes to the system than I am about government orders to point the system at a database of its own choosing. https://twitter.com/...
-
@kavyapearlman
Kavya Pearlman
on x
@Apple's recent report misses: 1. An independent technical security review of the CSAM content detection algorithm 2. Accounting for covert attempts from malicious actors who will use hacking, especially by famous & powerful people of interest https://www.apple.com/... https://twit…
-
@sarahjamielewis
Sarah Jamie Lewis
on x
Also the fact that they gave a single number for the threshold indicates that they are planning to use a single, global threshold. Which will result in worse privacy for heavy-use accounts, and will mean the obfuscation can be trivially broken as I explain in the article.
-
@pwnallthethings
@pwnallthethings
on x
Might be a slightly self-indulgent thread, but how exactly does Apple suppose that security researchers will do this without running across anti-research minefields that Apple has intentionally laid down to block exactly this kind of research? https://twitter.com/...
-
@pwnallthethings
@pwnallthethings
on x
And notice the implicit assumption here in the first place. Security researchers *will* do the review, fighting over all of the obstacles that make actually doing the review anything but simple. For free. Why? Why is this considered acceptable?
-
@pwnallthethings
@pwnallthethings
on x
Does Apple ask other auditors for free labor after setting them up to fail? “Hi accountants, our calculation on this napkin is correct and the warehouse of receipts is subject to inspection by accountants who wish to verify it”? Of course not. Only this industry gets screwed.
-
@sarahjamielewis
Sarah Jamie Lewis
on x
Some more information about NeuralHash too. They state they did not train it on CSAM images (which makes one wonder what they *did* train it on). This 100 million number needs some inspection given that there are billions of images exchanged every day. https://twitter.com/...
-
@sarahjamielewis
Sarah Jamie Lewis
on x
Apple's new threat model document contains some actual justification for the numbers! (https://www.apple.com/...) They are assuming a 1/100000 false acceptance rate for NeuralHash, which seems incredibly low. And assuming that every photo library is larger than the actual largest one…
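Taking the tweet's numbers at face value, here is a hedged sketch of the tail calculation such a threshold rests on; the library size of 100,000 photos is a stand-in for "larger than the actual largest one", not Apple's exact figure:

```python
import math

# Sketch of the threshold math implied above, under assumed parameters:
# per-photo false acceptance rate p = 1/100,000 (per the tweet) and a
# deliberately oversized library of n = 100,000 photos.
p, n, t = 1 / 100_000, 100_000, 30   # FAR, library size, match threshold

lam = n * p   # expected false matches per account; here exactly 1.0
# Tail of a Poisson(lam): probability of t or more false matches.
tail = sum(math.exp(-lam) * lam**k / math.factorial(k)
           for k in range(t, t + 40))
print(f"P(account falsely crosses t={t}) ~ {tail:.1e}")   # ~1.4e-33
```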
-
@howelloneill
Patrick Howell O'Neill
on x
Here is Apple's newly released doc on how they plan to protect their new CSAM tech against abuse: https://www.apple.com/...
-
@runasand
Runa Sandvik
on x
Apple even makes a reference to security researchers in its latest document about the child safety features. As if these researchers only exist to make sure Apple does not mess up. https://www.apple.com/... https://twitter.com/...
-
@normative
Julian Sanchez
on x
Apple has a new threat modeling white paper out regarding the controversial “child safety features” they recently announced. https://www.apple.com/...
-
@inafried
Ina Fried
on x
Apple is announcing a few fresh details on its child sexual abuse imagery detection program that it says are designed to avoid the system being manipulated
-
@inafried
Ina Fried
on x
One step is that it says the system will be able to be audited by third parties. Also, it will only flag images that have been identified by agencies in multiple countries as CSAM, to prevent one government from trying to use the system to scan for non-CSAM material.
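A minimal sketch of the "multiple countries" rule as described; the organization names and hash values are hypothetical placeholders, and Apple's threat model describes shipping only hashes provided by child safety organizations in separate sovereign jurisdictions:

```python
from collections import Counter

# Hypothetical sketch of the cross-jurisdiction rule: only hashes vouched
# for by organizations in at least two separate jurisdictions are included
# in the on-device database. Names and hashes are placeholders.
databases = {
    "org_us": {"hashA", "hashB", "hashC"},
    "org_uk": {"hashB", "hashC", "hashD"},
    "org_de": {"hashC", "hashE"},
}

counts = Counter(h for hashes in databases.values() for h in hashes)
shipped = {h for h, c in counts.items() if c >= 2}
print(shipped)   # {'hashB', 'hashC'}
```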
-
@benedictevans
Benedict Evans
on x
Part of Apple's latest PR push on iPhone CSAM scanning is to make sure people understand they're only scanning iCloud uploads, not iMessage. But why not? What's the moral logic of scanning a shared iCloud album but not an iMessage group chat?
-
@charlesarthur
Charles Arthur
on x
Great to see that Tim Sweeney doesn't understand that CSAM is illegal both to store and transmit, nor that neither Apple nor the NCMEC is “the government”. This is a useful guide. https://www.justice.gov/... https://twitter.com/...
-
@timsweeneyepic
Tim Sweeney
on x
No surprise. Apple has long made personal privacy part of its very DNA. Engineers chose to join Apple for less pay and a tougher work environment because they believe in product excellence and chose to serve on the front lines of privacy as a human right. https://twitter.com/...
-
@runasand
Runa Sandvik
on x
Apple's left hand doesn't know what its right hand is doing. First it sued @CorelliumHQ for enabling people to inspect iOS devices. Now it's saying those same people will be able to spot if the photo matching process is misused in some way. https://twitter.com/...
-
@alecmuffett
Alec Muffett
on x
“Apple software head [#CraigFederighi] says plan to scan iPhones for child abuse images is ‘misunderstood’” Craig, I really don't think that *that* is the problem here. The problem is that Apple unwisely & illiberally over-reached into people's privacy. https://www.cnet.com/...
-
@evacide
Eva
on x
I'd like to take this moment to make it clear to poor Craig that no, I don't misunderstand Apple's plans to check photos in iCloud against NCMEC's database of CSAM. It's well-meaning but it's also creating a mechanism that Apple will be forced to use for other things. https://twi…
-
@bobmcmillan
Robert McMillan
on x
Apple's Craig Federighi says the iPhone's CSAM threshold will be about 30 images and that the on-phone database will ship internationally, but engages in magical thinking about whether Apple is scanning images on your phone. https://www.wsj.com/...
-
@joannastern
Joanna Stern
on x
I sat down with Apple's Craig Federighi to talk about the company's child protection features. He spoke about the stumbles the company made in the announcement. I pushed him to explain everything in plain English. Here's the exclusive video interview: https://www.wsj.com/...
-
@alexstamos
Alex Stamos
on x
@runasand Yes, and making a full-throated argument for these changes would require Apple to admit that there was a human impact to being 15 years behind the rest of the industry in child safety.
-
@keleftheriou
Kosta Eleftheriou
on x
Wait, so Apple wanted to ensure that the Child Sexual Abuse Material (CSAM) tech is understood to be totally separate from the iMessage photo scanning feature, and yet they're calling it “Communication, Safety, And Messages”? 🥴👏 https://twitter.com/...
-
@lapcatsoftware
Jeff Johnson
on x
The day after settling a copyright lawsuit against security researchers: “Apple has always been at peace with security researchers.” https://twitter.com/...
-
@ihnatko
Andy Ihnatko
on x
@siegel I'm not even kidding here: Apple screwed up the messaging on this so completely that I wonder if a certain key person or two is on an extended vacation or personal leave and wasn't around to oversee this. Apple jumped on a rake with both feet, from the top of a ladder.
-
@lorenzofb
Lorenzo Franceschi-Bicchierai
on x
Apple's rollout of its CSAM scanning tool has been just a huge PR mess. Company failed to explain exactly how this works, which users it affects, what data it touches, etc. And then...Apple has been going around complaining public and press “misunderstood” everything.
-
@runasand
Runa Sandvik
on x
@alexstamos Anyone who's ever tried to announce changes to the office desk layout, the menu in the cafeteria, or the types of snacks by the coffee machine knows that humans do not like surprises.
-
@timsweeneyepic
Tim Sweeney
on x
Governments want this search capability but many, including America, are constitutionally prohibited from it. Perhaps Apple thinks that if they give governments this massive surveillance gift at this critical time, regulators will look the other way on their antitrust abuses.
-
@hatr
Hakan
on x
“Because it's on the [phone], security researchers are constantly able to introspect what's happening in Apple's [phone] software,” (11-min interview w/ Apple's software chief Craig Federighi) https://www.wsj.com/...
-
@josephfcox
Joseph Cox
on x
Apple says possible for security researchers to verify how its CSAM system works because it's being done on the device. Apple does not have the best reputation for making research easy, at all. If anything, tries to block it. https://www.wsj.com/... https://twitter.com/...
-
@madbitcoins
Mad Bitcoins
on x
It's strange that 30 or so is the magic number of unacceptable “known bad signature” child porn images. I suppose there must be a threshold somewhere. Again the key is it's matching a known database of bad sigs, not scanning images for new ones. https://www.wsj.com/... https://tw…
-
@benedictevans
Benedict Evans
on x
Seriously, the whole point of the iOS security model is that this is supposed to be impossible for third parties. You can't just snoop around inside the OS watching what's going on - that's how malware works.
-
@timsweeneyepic
Tim Sweeney
on x
My fear is that what Apple is ultimately trying to backdoor here is not our iPhones but democracy and rule of law itself.
-
@timsweeneyepic
Tim Sweeney
on x
Apple already made such a deal in China, selling out the privacy of iCloud users there by putting iCloud servers in a data center operated by a government owned enterprise. Now Americans are told “trust us” because, though we just sold you out now, we won't do it again.
-
@alexstamos
Alex Stamos
on x
@benedictevans Right, until three days ago Apple was arguing that DMCA 1201 should outlaw pretty much any reverse engineering tool that could be used for security research.
-
@benedictevans
Benedict Evans
on x
@alexstamos Their neat little ‘on-device=private, off-device=not private’ theory has led them right into a corner.
-
@kaepora
@kaepora
on x
I'm sorry, but what the hell is Craig Federighi talking about here?! Since when are security researchers “constantly able to introspect what's happening in iOS”?!?! iOS is the blackest of black boxes, that's the point researchers have been making since last week! It's opaque! htt…
-
@mgsiegler
M.G. Siegler
on x
Few days later, it's even more clear that first and foremost this was the mother of all communications fuck-ups from Apple. And they have strong competition from players like the CDC and, as always, Facebook. What on Earth was Apple thinking rolling this out with this strategy?! …
-
@benedictevans
Benedict Evans
on x
Wouldn't Apple sue you if you did that? https://twitter.com/...
-
@alexstamos
Alex Stamos
on x
30 images is a higher bar than I expected, and indicates that Apple's goal is more to prevent mass sharing of known CSAM instead of tracking down the original creators of CSAM found elsewhere. Then why apply to all of the photo roll and not just shared albums? https://twitter.com…
-
@alexstamos
Alex Stamos
on x
Federighi is right about the confusion caused by bundling the iMessage and iPhoto protections into one announcement, but the bigger PR misstep was: 1) Not being clear about the goals 2) Not announcing the parallel roadmap for iCloud encryption
-
@gsterling
Greg Sterling
on x
Apple somewhat naively assumed its privacy reputation would shield it against criticism of CSAM on-device scanning. https://www.wsj.com/...
-
@adrianweckler
Adrian Weckler
on x
More clarity: Apple's child abuse image scanning system will only trigger alert when “around 30” images are detected. Great explainer (and interview) here. https://twitter.com/...
-
@dsilverman
Dwight Silverman
on x
Facing criticism about privacy on the iPhone, Apple's Craig Federighi says new tools aimed at best ensuring privacy while fighting illegal images https://www.wsj.com/... via @WSJ I appreciate that there's a text version of this story as well as video.
-
@lanceulanoff
Lance Ulanoff
on x
Exactly: “In hindsight, introducing these two features at the same time was a recipe for this kind of confusion.” https://www.wsj.com/...
-
@charlesarthur
Charles Arthur
on x
Ah, so “about 30” is the magic number for tripping Apple's CSAM system. Wonder how they arrived at that number. https://www.wsj.com/... https://twitter.com/...
-
@charlesarthur
Charles Arthur
on x
Re Apple's CSAM scanning, @kesenwang points out that Apple updated its privacy policy in late May 2019 to allow scanning for CSAM. Apple's been working on this for years. @josephmenn @benthompson @gruber https://web.archive.org/... v https://web.archive.org/... (tip @techmeme) ht…
-
@timkhiggins
Tim Higgins
on x
Apple's Craig Federighi discusses how new tool to combat child porn is designed with safeguards, such as independent audits, to protect user privacy, w/ @JoannaStern https://www.wsj.com/...
-
@huntressofcrete
Artemis
on x
Apple is flagging iCloud photos that match known pedo databases. It's not one photo, but a threshold of multiple photos uploaded that triggers an alert. Pretty sure Apple has no interest in aggravating a huge customer base through algorithmic mistakes. https://www.wsj.com/...
-
@joannastern
Joanna Stern
on x
@johnmisczak @reckless He went into a bit more on it and we detail that in the news story here and in the column I have coming soon https://www.wsj.com/... https://twitter.com/...
-
@reckless
Nilay Patel
on x
One thing I'd note here is that Federighi says there are multiple points of auditability in the system and... it would be good if those were clearly spelled out and people were able to audit them
-
@reckless
Nilay Patel
on x
Here's @JoannaStern with Apple's Craig Federighi on the child protection system and how it works https://www.wsj.com/...
-
@carlquintanilla
Carl Quintanilla
on x
“It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” Mr. Federighi said. “We wish that this would've come out a little more clearly for everyone ..” @WSJ $AAPL https://www.wsj.com/...
-
@reneritchie
Rene Ritchie
on x
Where @JoannaStern does one of her always phenomenal interviews/videos — this time digging into Apple's new Child Safety features with Apple's head of software engineering https://twitter.com/...
-
@kaepora
@kaepora
on x
The sheer volume of the doublespeak Apple has done on this is nuts. Over and over again this past week, Apple issues “clarifications” where they deny something they've said and then *immediately rephrase the same claim in a new way.* Interview is here: https://www.wsj.com/...
-
@kaepora
@kaepora
on x
Craig Federighi today to the @WSJ: “We're not looking for CSAM on iPhones. [...] The sound-bite that got out early was “Apple is scanning my phone for images.” This is not what's happening” Dude, this is just blatant lying. This is directly contradicted by Apple's own statement. …
-
@kenroth
Kenneth Roth
on x
Apple says it will scan your iPhones only for evidence of child sexual abuse, but it is dangerously opening a new path for surveillance that could be used for anything. For example, @Apple has a long history of bowing to China when deemed necessary. https://www.nytimes.com/... ht…
-
@darchmare
Jeff
on x
Reuters: “Apple's child protection features spark concern within its own ranks -sources” Looks like the “screeching voices of the minority” include those within Apple itself. Good. https://www.reuters.com/...
-
@snowden
Edward Snowden
on x
@RichFelker @Apple Apple is not doing this as part of a crusade to protect children, or they would not allow the scans to be bypassed by disabling iCloud Photos. They are doing it to be seen to be doing something while minimizing their involvement. This hurts users and doesn't …
-
@matthew_d_green
Matthew Green
on x
Here's an op-ed @alexstamos and I co-authored about the risks of Apple's content scanning plan. It's short and easy to read, and I'm hoping it makes the issues digestible to non-technical people. https://www.nytimes.com/...
-
@reuters
@reuters
on x
Exclusive: Some Apple employees are speaking out internally about the company's plan to scan iPhones and computers for child sex abuse images. Apple's move has also provoked protests from tech policy groups https://www.reuters.com/... https://twitter.com/...
-
@kurtopsahl
Kurt Opsahl
on x
“Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan ... Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests...” https://www.reuters.com/...
-
@chrismessina
@chrismessina
on x
Can you imagine being an Apple Genius having to explain how CSAM detection works OVER AND OVER AND OVER again? https://www.macrumors.com/... https://twitter.com/...
-
@bzamayo
Benjamin Mayo
on x
*APPLE WARNS STAFF TO BE READY FOR QUESTIONS ON SAFARI TAB DESIGN https://twitter.com/...
-
@markgurman
Mark Gurman
on x
Latest news on Apple child abuse images system: the company warns staff to be prepared for questions from customers, the threshold of illicit images in your library for Apple to be alerted is ~30, and Apple will have an independent auditor for its database https://www.bloomberg.co…
-
@diogomonica
@diogomonica
on x
If people yelling on Twitter can cause policy change at Apple, imagine what will happen when nation states exert pressure to increase the scope of on-device searches. https://www.reuters.com/...
-
@0xabad1dea
Badidea
on x
They have now said they'll only use hashes submitted by multiple countries, which does help mitigate this. I still think it's a bad plan that accomplishes very little against CSAM and sets itself up for abuse. But I won't deny this is an improvement https://www.reuters.com/...
-
@matthew_d_green
Matthew Green
on x
But even this description isn't accurate, because the new threat model Apple outlined this week actually expands the ideal functionality as follows. https://www.apple.com/... https://twitter.com/...
-
@matthew_d_green
Matthew Green
on x
Apple has also started emphasizing that they will include “hash publications” to prevent selective targeting of individuals. This is great! None of these things were properly described in the technical/security reviews they did last week! https://twitter.com/...
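A hedged sketch of why a hash publication helps (all data below is made up): publishing one digest for the database lets anyone confirm that every device, in every region, received the same database, so a selectively altered copy stands out.

```python
import hashlib

# Hedged sketch with made-up data: if Apple publishes the digest of the
# one database it ships, an auditor can detect a device that was quietly
# given a different (e.g., selectively expanded) database.
def digest(db: bytes) -> str:
    return hashlib.sha256(db).hexdigest()

published = digest(b"global-db-v1")           # value from the publication
devices = {
    "us_device": b"global-db-v1",
    "eu_device": b"global-db-v1",
    "targeted": b"global-db-v1+extra-hash",   # tampered copy
}
for name, db in devices.items():
    status = "ok" if digest(db) == published else "MISMATCH"
    print(name, status)                       # 'targeted' prints MISMATCH
```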
-
@geoffdelc
Geoffrey Delcroix
on x
“Scanning hashes is not like scanning content” is kind of the new “but this identifier is anonymous don't worry”, isn't it? https://twitter.com/...
-
@dailypostdan
Dan McGarry
on x
I can't get behind Apple's use of blanket surveillance tools to combat child porn. There is no tool that effectively combats child porn that cannot be turned on innocent people. That's how tech works. The tools you use to end X are equally good on Y. https://www.bloomberg.com/...
-
@byjulialove
Julia Love
on x
Apple employees have flooded Slack with concerns about the company's abuse-scanning feature. It's a level of debate that was almost unheard of when I was on the beat a few years ago. Story with @josephmenn @StephenNellis https://www.reuters.com/...
-
@kateconger
O...K
on x
Apple employees are pushing back on the company's plans to scan for CSAM, via @josephmenn @byjulialove https://www.reuters.com/...