VOICE ARCHIVE

Michael Keller

@mhkeller
10 posts
2023-02-07
New w/ @kateconger: Over 120k views on a video of child sexual assault. A recommendation engine pushing abuse-related content. Unpaid bills for detection software. We looked at Twitter's efforts against child sex abuse since Elon Musk made it “priority #1” https://www.nytimes.com/...
New York Times

Tests show CSAM still spreads and gets recommended on Twitter, including known material that's easier to detect; one video had 122K+ views in just over a month

2022-12-13
This week, @MissingKids, the center that receives reports of online child exploitation, held a roundtable with leading tech platforms about combating this illegal imagery. For the first time in the event's decade-long history, Twitter reps didn't attend.
Reuters

Elon Musk says Twitter Blue subscribers will see half the number of ads as non-subscribers and that the company plans to offer a higher tier with no ads in 2023

Twitter owner Elon Musk tweeted on Monday that Twitter's Basic blue tick will have half the number of advertisements …

Wall Street Journal

Similarweb: Twitter ad manager visits fell nearly 74% YoY in October and 85% YoY in November, the largest drop since Musk's buyout, continuing into December

Laura Forman / Wall Street Journal

2020-03-06
NEW: Two measures were announced today to combat the growing problem of online child sexual abuse: one a bill in Congress, the other an int'l initiative establishing guidelines for tech companies. Here's our story https://www.nytimes.com/... First, more about the bill... 1/
New York Times

Five Eyes nations release guidelines for platforms to fight child porn; US senators introduce a bill to remove Section 230 for failure to police such content

Michael H. Keller / New York Times

2019-11-10
Tests of Google didn't find similar imagery, but @CdnChildProtect shared documentation with us showing how Google repeatedly refused to remove images the Canadian Center's analysts discovered, including explicit photos of children younger than 12. 10/
New York Times

Tech companies have been failing to deal with child abuse images for years, and they still don't have a common standard for identifying illegal video content

How PhotoDNA Works: The uploaded image — in this instance a photograph of Dr. Farid — is turned into a square and colors are removed …
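PhotoDNA's actual algorithm is proprietary and not spelled out in the article excerpt above. As a rough, illustrative sketch of the general pipeline the caption names (drop color, normalize to a fixed square, then summarize local regions so that a lightly edited copy produces a nearly identical signature), the idea might look like the following; all function names here are hypothetical:

```python
# Illustrative sketch only: PhotoDNA's real algorithm is proprietary.
# This mirrors the general steps the caption describes -- drop color,
# normalize to a fixed square, then summarize local regions so that
# small edits to an image barely change its signature.

def to_grayscale(pixels):
    """RGB rows of (r, g, b) tuples -> luminance values."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in pixels]

def resize_square(gray, size):
    """Nearest-neighbor resample to a size x size square."""
    h, w = len(gray), len(gray[0])
    return [[gray[y * h // size][x * w // size] for x in range(size)]
            for y in range(size)]

def block_signature(gray, blocks=4):
    """Mean intensity of each block -> a short, robust feature vector."""
    step = len(gray) // blocks
    sig = []
    for by in range(blocks):
        for bx in range(blocks):
            cells = [gray[y][x]
                     for y in range(by * step, (by + 1) * step)
                     for x in range(bx * step, (bx + 1) * step)]
            sig.append(sum(cells) // len(cells))
    return sig

def distance(a, b):
    """Sum of absolute differences; a small distance suggests a match."""
    return sum(abs(x - y) for x, y in zip(a, b))

# Tiny synthetic image: a horizontal gray gradient.
image = [[(x * 16, x * 16, x * 16) for x in range(16)] for _ in range(16)]
sig1 = block_signature(resize_square(to_grayscale(image), 8))

# A slightly brightened copy still lands close to the original signature.
brighter = [[(min(r + 5, 255),) * 3 for (r, g, b) in row] for row in image]
sig2 = block_signature(resize_square(to_grayscale(brighter), 8))
```

Production systems extract far more robust features than block means (gradients, overlapping cells) and compare signatures against databases of known illegal material; the sketch only conveys the shape of the pipeline, which is why near-duplicates can be matched without storing the imagery itself.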

We found 75 images that matched on Bing, DuckDuckGo and Yahoo before stopping the program. After our tests, Microsoft said it found a flaw in its scanning practices and was re-examining search results. But subsequent tests found even more sexual abuse imagery. 8/

Here's the full story, with our complete findings and moving portraits by @kholoodeid of abuse survivors and their parents. She has traveled across N. America taking photos for this series, documenting the people this issue affects. Thanks for reading. https://nyti.ms/36PVNXt 15/

Enterprise cloud storage scans even less. Amazon doesn't scan anything at all, and neither does Microsoft Azure. Both cited customer privacy reasons and said their terms of service prohibited illegal imagery. 13/

We found other gaps, too. Many cloud storage companies, including Dropbox, Google Drive and Microsoft OneDrive, don't scan files when they're uploaded, only when shared. We found court cases in which offenders exchanged logins, getting around prevention measures. 12/

As one woman said, now 21 and dealing with lifelong effects of the abuse she and her sister endured: “It's more than just images. When I'm in public with my little sister, and I see some man looking at her, that is one of the first things I think about. You're always worried.” 3/