Chronicles

The story behind the story


A TTP analysis finds, via app store search, dozens of nudify apps in the Apple and Google app stores that have generated $122M+ in revenue, despite policies prohibiting them

Apple Inc. and Google have continued to offer mobile apps that let users make nonconsensual sexualized images of people despite …

Bloomberg

Discussion

  • @ttp_updates on x
    Many of these nudify apps were rated suitable for minors, a notable finding given the growing number of sexual deepfake scandals in schools. https://www.pbs.org/...
  • @ttp_updates on x
    These findings show that Apple and Google are not passive platforms when it comes to nudify and undressing apps. Their search and advertising systems are actively elevating and promoting these apps, which can create nonconsensual nude images or porn videos using AI.
  • @ttp_updates on x
    TTP also recorded the autocomplete search suggestions that Apple and Google made as we typed in the different search terms. In many cases, the app stores recommended entirely new search queries that led to more nudify apps. [image]
  • @ttp_updates on x
    Searches for terms like “nudify,” “undress,” and “deepnude” in the app stores produced multiple apps capable of digitally stripping the clothes off women in photos.
  • @ttp_updates on x
    Another app that came up in an App Store search for “undress” would strip the clothes off a woman in a photo. The app required payment to deblur the resulting image, but a thumbnail showed the woman naked. [image]
  • @ttp_updates on x
    Apple and Google ran ads for nudify and undressing apps in some of the search results. Here are some that popped up in the Apple App Store. [image]
  • @ttp_updates on x
    These apps generate revenue, which may be why the companies have been less than vigilant in removing them. As stories of sexual deepfakes targeting women and girls accumulate, the role Apple & Google play in this ecosystem may soon attract more scrutiny. https://www.techtranspare…
  • @ttp_updates on x
    In January, TTP revealed that the Apple and Google app stores each hosted dozens of nudify and undressing apps. Our new report found that the app stores' search and advertising systems actually point users to these kinds of apps.
  • r/technology on reddit
    New report claims App Store search suggestions and ads steered users to ‘nudify’ apps
  • r/apple on reddit
    Apple and Google Are Steering Users to Nudify Apps