
Chronicles

The story behind the story


An internal Huawei document, found on its website, says Huawei tested facial recognition with a “Uighur alarm” to alert Chinese police when it detected Uighurs

Washington Post

Discussion

  • @charlesrollet1 Charles Rollet on x
    .@Huawei and @Megvii worked together to test and validate ‘Uyghur alarms’ in facial recognition software, per a document found by @ipvideo https://twitter.com/...
  • @zackwhittaker Zack Whittaker on x
    Holy shit. “Huawei has tested facial recognition software that could send automated ‘Uighur alarms’ to government authorities when its camera systems identify members of the oppressed minority group.” https://www.washingtonpost.com/ ...
  • @stand_with_hk @stand_with_hk on x
    A big data program for policing in Xinjiang arbitrarily selects Uighurs for possible detention. Analysis of the leaked list strongly suggests that the vast majority of the people flagged by the system are detained for everyday lawful, non-violent behavior. https://www.hrw.org/...
  • @harikunzru Hari Kunzru on x
    I feel like I've been freaking out about this possibility since I was writing for Wired in the mid-90's and now it's here https://www.washingtonpost.com/ ...
  • @paulmozur @paulmozur on x
    Important reporting on the way that Chinese tech companies work to automate racial bias into security systems. Last year a Huawei sales person at the World Internet Conference told me they offered Uighur recognition. Hadn't been able to double source it. But this backs it up. htt…
  • @yuanfenyang Yuan Yang on x
    Just out from @cdcshepherd @hrw: another leaked list of 2,000 Uighur Muslim detainees shows how China's so-called “predictive policing” system flags people as suspicious for calling international numbers, or having relatives abroad @ft https://www.ft.com/...
  • @senrickscott Rick Scott on x
    This is disgusting. Xi, the CCP and their corporate puppets are thugs, promoting a genocide. Companies like Huawei are not only a security threat to the US but they're complicit in China's crimes against humanity. https://twitter.com/...
  • @donaldmaye D. Maye on x
    @SpencerDailey @rizzn @drewharwell If you use the right search terms,"华为" “test-report” “维族”, you can still find it in the results - though the link is 404: https://twitter.com/...
  • @sheenagreitens Sheena Greitens on x
    Read this thread. It's an important data point for understanding China's current data capabilities (& its intentions for how to use them in internal security): https://twitter.com/...
  • @delaneym917 Michael Delaney on x
    It is also interesting to see some pushback in the opinion columns of tech and business publications in China - use of facial recognition in the real estate business, in particular, has triggered a “this can hurt me too!” reaction in the Han middle class. https://twitter.com/...
  • @charlesrollet1 Charles Rollet on x
    here is @ipvideo's full article on the document which describes Uyghur alarms as a ‘basic function’ https://ipvm.com/... https://twitter.com/...
  • @matinastevis Matina Stevis-Gridneff on x
    Hard to see how Huawei can maintain it shares “European values” in its bid to revive its 5G prospects on the continent, while doing this —> Huawei tested AI software that could recognize Uighur minorities and alert police, report says https://www.washingtonpost.com/ ...
  • @geoffreyfowler Geoffrey A. Fowler on x
    They called it a “Uighur alarm”: Huawei tested AI software that could recognize the faces of Uighur minorities and alert Chinese police https://www.washingtonpost.com/ ...
  • @drewharwell Drew Harwell on x
    @ipvideo Here's the write-up from @ipvideo, the surveillance research organization that found the test report on Huawei's website: https://twitter.com/...
  • @drewharwell Drew Harwell on x
    @BuzzFeed @meghara Why AI ethics matter: In the same year researchers were working to estimate a person's ethnicity via face scan ( https://onlinelibrary.wiley.com/ ...), one of the world's biggest tech companies was testing the same idea - for police alerts: https://www.washingt…
  • @colinlecher Colin Lecher on x
    Think this is one of the most outrageous examples of facial recognition use I've ever seen and it was... on Huawei's public website https://www.washingtonpost.com/ ... https://twitter.com/...
  • @drewharwell Drew Harwell on x
    @ipvideo After we published, Huawei said the “Uighur alarm” report was “simply a test” and that the system “has not seen real-world application.” But human-rights advocates say similar systems have been used across China to track and persecute minorities https://www.washingtonpos…
  • @drewharwell Drew Harwell on x
    China has detained more than 1 million Uighurs in reeducation camps, and U.S. leaders say the Xinjiang crackdown is “something close to” genocide. @BuzzFeed's @meghara and @alison_killing mapped how the camps power a “system of total control”: https://www.buzzfeednews.com/ ...
  • @drewharwell Drew Harwell on x
    @rizzn @SpencerDailey Here's an archived copy: https://d1tzzns6d79su2.cloudfront.net/ ... via IPVM: https://ipvm.com/... which we linked to in the story.
  • @cszabla Csz on x
    “In one 2018 paper, ‘Facial feature discovery for ethnicity recognition,’ AI researchers in China designed algorithms that could distinguish between the ‘facial landmarks’ of Uighur, Korean and Tibetan faces.” https://twitter.com/...
  • @noahpinion Noah Smith on x
    Speaking of technological totalitarianism... https://twitter.com/...
  • @robynurback Robyn Urback on x
    Meanwhile, Canada is still wavering on whether to ban or restrict Huawei from its 5G network https://twitter.com/...
  • @feliciasonmez Felicia Sonmez on x
    Chinese tech giant Huawei has tested facial recognition software that could send automated “Uighur alarms” to authorities when its systems identify members of the oppressed group, according to an internal document. Important story by @drewharwell & @evadou https://www.washingtonp…
  • @mguariglia Matthew Guariglia on x
    This is 19th Century science being used to surveil and oppress a minority ethnicity by presuming criminality, using AI to “find Uighurs by facial structure”, and dispatch police. It's hard to overemphasize the extent to which the age of Eugenics/Proto-Eugenics never ended. https:…
  • @drewharwell Drew Harwell on x
    @SpencerDailey @rizzn IPVM found the document on Huawei's publicly facing website. We saw it there, as well, before the link went dead. We saved it, translated it from Mandarin, and worked to verify the findings and report on the implications. The companies have since acknowledge…