
Chronicles

The story behind the story


[Thread] WhatsApp says Apple's approach to CSAM is a setback to user privacy, will be more fraught outside the US; WhatsApp flagged 400K+ cases to NCMEC in 2020

Alex Stamos / @alexstamos : @thegrugq @Paxxi @pwnallthethings That's a great question, and Facebook actually did a breakdown this year. I'm hoping to get these folks to re-write this as a peer-reviewed article with more detail. cc @JohnDaveBuckley https://research.fb.com/...

Thaddeus E. Grugq / @thegrugq : @alexstamos @Paxxi @pwnallthethings Can you help make sense of the 12m was Facebook messenger, 18 million overall for Facebook... sort of numbers? Also the “N million a month” stuff? This is known CSAM... traded? Harassment? Assholes???

@wcathcart Will Cathcart

Discussion

  • @pwnallthethings @pwnallthethings on x
    OK so a slightly longer thread on how we got to the Apple CSAM thing and why it's not going away
  • @wcathcart Will Cathcart on x
    Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven't shared with anyone. That's not privacy.
  • @wcathcart Will Cathcart on x
    We've worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it's shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption. https://faq.whatsapp.com/…
  • @wcathcart Will Cathcart on x
    We've had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It's not how technology built in free countries works.
  • @roi Roi Carthy on x
    I literally had to put a story on the front page of the FT to get you guys to remove 120,000 people who traffic in CSAM in **public** WhatsApp groups, Will. https://twitter.com/...
  • @mikeisaac Rat King on x
    (facebook was just waiting for an opportunity like this to hammer apple on privacy. some context from our previous coverage.... https://www.nytimes.com/...
  • @wcathcart Will Cathcart on x
    This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
  • @wcathcart Will Cathcart on x
    Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people's privacy?
  • @lapcatsoftware Jeff Johnson on x
    You know you've done something very wrong when even Facebook is creeped out by it. https://twitter.com/...
  • @avibarzeev @avibarzeev on x
    FB has a “concern” with Apple using crypto tech to identify child exploitation photos on local devices, even though it preserves full privacy for anyone who doesn't exploit kids. Meanwhile FB scans your photos in the cloud. “Privacy” to FB means /their/ privacy, not yours. https:…
  • @blakereid Blake E. Reid on x
    Apple's rake-stepping yesterday is now enabling semi-credible privacy dunks from Facebook 🙃 https://twitter.com/...
  • @wcathcart Will Cathcart on x
    Apple once said “We believe it would be in the best interest of everyone to step back and consider the implications ...” https://www.apple.com/...
  • @sunchartist @sunchartist on x
    Pot calling the kettle black https://twitter.com/...
  • @susanlitv Susan Li on x
    Definitely no love loss between #Apple & #Facebook 👇 $aapl $fb https://twitter.com/...
  • @can @can on x
    @ranjanxroy can you even use WA without giving FB full access to your address book?
  • @willhamill Will Hamill on x
    The danger with using the “Think of the children!” argument to let Apple away with their invasion of your devices and data is it doesn't take a leap to see how once the precedent is established it can trivially be misused. https://twitter.com/...
  • @pwnallthethings @pwnallthethings on x
    WhatsApp: Apple is bad for scanning for CSAM. This is the wrong approach and we would *never* do this. WhatsApp (but quieter) in 2018: we also do this and have since 2011. https://www.judiciary.senate.gov/ ... https://twitter.com/...
  • @can @can on x
    @sp990 @thijsniks @ranjanxroy I'm old enough to remember WhatsApp was about never having any ads when founded. If Signal flip flopped constantly on policy, I'd stop using them too.
  • @wcathcart Will Cathcart on x
    ..."it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect." Those words were wise then, and worth heeding here now.
  • @pinboard @pinboard on x
    @pwnallthethings You're a really smart person and I know you understand the significance of doing this on the device vs. server side. The status quo is a tradeoff (because horrifying amounts of CSEM *do* get uploaded to any image sharing site), and Apple's move is a big shift in …
  • @st_eppel David Grunwald on x
    Hey @Apple - when @Facebook is legitimately able to claim the moral high ground against you, you've probably screwed up bad https://twitter.com/...
  • @pinboard @pinboard on x
    The threat is not just individual governments' misuse of this architecture, but that the whole thing in the hands of supranational entities with state-like power who believe in design by YOLO. It's a planetary social experiment with no checks or controls, run by high-IQ idiots
  • @floorter Floor on x
    It's a bit awkward to see a Facebook representative dunking on Apple about privacy. But in the end it doesn't really matter if WhatsApp implements this if the underlying OS does. https://twitter.com/...
  • @can @can on x
    @sp990 @thijsniks @ranjanxroy And the larger point is that if you have the data and change the policy later, your users are fucked. Signal doesn't have the data to begin with so you can't gotcha people. So not sure if your analogy works here.
  • @bcrypt Yan on x
    there's lots of reasons to object to apple's CSAM proposal but “they shouldn't build software to scan your device” doesn't resonate with me. given a choice between plaintext backups scanned in the cloud and end-to-end-encrypted backups scanned on-device, i'd pick the latter.
  • @saranbyte Saran on x
    It's not a good look for Apple when the head of WhatsApp even thinks it's a bad move 😬 https://twitter.com/...
  • @aditi_muses Aditi Agrawal on x
    Not Twilight Zone: Head of Facebook-owned WhatsApp is berating Apple's latest move for violating people's privacy. Pay attention. Apple didn't create a slippery slope; it's now a free fall off a cliff. https://twitter.com/...
  • @bcrypt Yan on x
    not saying this is the choice we are facing, but i hope this makes it clear that on-device vs in-cloud is not the issue here so much as the scanning itself.
  • @bcrypt Yan on x
    given that CSAM scanning is only enabled for users who opt into icloud photo backups, i'm guessing apple would have built it into icloud if they could. they can't because of end-to-end encryption. https://twitter.com/...
  • @tihmstar @tihmstar on x
    Technically he is not wrong. Yet hearing Whatsapp/Facebook talking about privacy is somewhat ironic afterall. Is really annoying how jailbreaking is still absolutely neccessary to have usable devices. First it was about usability, now it is about security/privacy... https://twitt…
  • @can @can on x
    @thijsniks @ranjanxroy i guess my bar for trusting FB is p low 1) they've used SMS intended for 2FA for advertising 2) terms can change easily 3) if that's the level of comfort we have, apple doesnt have access to your photos either
  • @pwnallthethings @pwnallthethings on x
    The completely perverse thing about the whole discussion is that *on-device* scanning enables you to do equivalent levels of CSAM protection *and then also encrypt everything in the cloud*.
  • @justinschuh @justinschuh on x
    I once heard a friend describe the broader problem here as “there is nothing more legally and ethically fraught than running an open bit bucket on the Internet.” https://twitter.com/...
  • @pinboard @pinboard on x
    The fucked up thing here is that design decisions about a worldwide surveillance architecture of social control get made by a small clique of individual companies, with no accountability to the billions of people their decisions affect, and no consequences for getting it wrong. h…
  • @bcrypt Yan on x
    sorry this should say “because of eventual plans to do e2e encryption for icloud photos, or at least i hope” :)
  • @wcathcart Will Cathcart on x
    Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.
  • @charlesarthur Charles Arthur on x
    Interesting WhatsApp declaration. Cathcart links to a blogpost (CEI = child exploitation imagery, same thing as CSAM = child sexual abuse material). Note the number of accounts zapped *per month*. https://twitter.com/... https://twitter.com/...
  • @chrismessina Chris Messina on x
    Children don't have an absolute right to privacy from their parents. Will children really report CSAM that they receive via iMessage? Regardless, they can only block senders, not report them, presuming they can even figure that out. Curious @wcathcart's take on this. https://twit…
  • @sarahjamielewis Sarah Jamie Lewis on x
    Update: These initial expressions of hesitance from Whatsapp are at least a small sign that there is some fight left in corporate e2e providers to reject mandates of on-device mass surveillance. Some reasons for optimism there. https://twitter.com/...
  • @mikeisaac Rat King on x
    re: some of WhatsApp's statements, apple spent most of yesterday rebutting certain claims that WA is pouncing on mostly pointing out that if users turn off upload to iCloud then the system doesn't work and phones/iCloud won't be scanned I expect more back and forth to come https:…
  • @omanreagan Michael on x
    Even the head of Facebook's WhatsApp thinks what Apple is proposing is a setback for privacy. Think about that for a minute. https://twitter.com/...
  • @luke_metro @luke_metro on x
    How long has Facebook waited for a chance like this to dunk on Apple for privacy violations https://twitter.com/...
  • @earcos Eduardo Arcos on x
    100% agree with Will Cathcart. Wrong approach from Apple to a VERY sensitive problem. https://twitter.com/...
  • @mikeisaac Rat King on x
    @joelipper they're loving this
  • @tomwarren Tom Warren on x
    the head of WhatsApp has concerns about Apple's image scanning approach. “Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.” https://twitter.com/...
  • @wcathcart Will Cathcart on x
    What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?
  • @wcathcart Will Cathcart on x
    Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?