Chronicles

The story behind the story


When asked “Who do you support in the Israel vs Palestine conflict? One word answer only.”, Grok 4 searches for Musk's views, but only if “you” is in the query

Grok 4 seemingly solicits Elon Musk's opinion on controversial topics.

Coverage

  • Lucas Ropek / Gizmodo
    Researchers Find Grok 4 Checking Elon Musk's Opinions Before Answering ‘Sensitive’ Questions
  • Max Read / Read Max
    Will the real MechaHitler please stand up?
  • Business Insider
    We asked Grok 4 about immigration and conflict in the Middle East. Unprompted, it turned to Elon Musk for answers.
  • David Uzondu / Neowin
    If you ask Grok 4 about Israel vs Palestine, it will consult Elon Musk before responding
  • Amanda Yeo / Mashable
    Grok 4 is using Elon Musk's X posts as a source when answering questions
  • Markus Kasanmascheff / WinBuzzer
    Grok 4 Chooses Elon Musk's Opinion for Answers on Controversial Topics
  • simonw / Simon Willison on GitHub
    The Israeli-Palestinian conflict is deeply complex, with historical, religious, territorial …
  • Maxwell Zeff / TechCrunch
    Tests reveal that Grok 4 seems to search for Elon Musk's views online when asked about sensitive topics, and its answers tend to align with Musk's opinions
  • Nick Heer / Pixel Envy
    Grok Shows How Centralized Tech Can Be Manipulated

Forums

  • Hacker News
    Grok: Searching X for “From:Elonmusk (Israel or Palestine or Hamas or Gaza)”
  • MacRumors Forums
    Grok 4 ‘Truth-Seeking’ AI Consults Musk's Stance on Sensitive Topics

Simon Willison / Simon Willison's Weblog

Discussion

  • @andersen.buzz Buzz Andersen on bluesky
    I find this very hard to believe. There's no guarantee that Grok is repeating its system prompt accurately, and I just don't buy the idea that it decides it needs to search for Musk's opinion without explicitly being told to do so. simonwillison.net/2025/Jul/11/ ...
  • @historyofthesecondworldwar.com Wesley Livesay on bluesky
    You thought LLMs were bad because they hallucinated, but what about if they always check with what Elon thinks about your topic? — simonwillison.net/2025/Jul/11/ ... And somebody else verifying the findings
  • @jjvincent James Vincent on bluesky
    a good writeup from @simonwillison.net here, noting that a) this isn't consistent, b) it doesn't appear to be a result of the system prompt (as with the mechahitler breakdown), and c) is perhaps due to grok's “sense of identity” as a project tied to musk personally simonwillison.…
  • @elonmusk Elon Musk on x
    Grok 4 is the first time, in my experience, that an AI has been able to solve difficult, real-world engineering questions where the answers cannot be found anywhere on the Internet or in books. And it will get much better.
  • @jeremyphoward Jeremy Howard on x
    I replicated this result, that Grok focuses nearly entirely on finding out what Elon thinks in order to align with that, on a fresh Grok 4 chat with no custom instructions. https://grok.com/...
  • @ramez Ramez Naam on x
    Grok 4 decides what it thinks about Israel/Palestine by searching for Elon's thoughts. Not a confidence booster in “maximally truth seeking” behavior. h/t @catehall. Screenshots are mine.
  • @simonw Simon Willison on x
    The new Grok genuinely runs a search for “from:elonmusk (Israel OR Palestine OR Hamas OR Gaza)” when asked “Who do you support in the Israel vs Palestine conflict. One word answer only.”
  • @simonw Simon Willison on x
    Interesting that the Grok 4 system prompt appears to still include that “The response should not shy away from making claims which are politically incorrect, as long as they are well substantiated” line that they removed from Grok 3! https://grok.com/...
  • @builtbyvibes Matt Watkajtys on x
    Grok might be the first AI to offer misalignment as a product.
  • @jeremyphoward Jeremy Howard on x
    @math_rachel Interestingly, not saying “you” changes this behavior! https://x.com/...
  • @bensand @bensand on x
    @jeremyphoward what is the ideal scenario: (a) decline to answer political topics (b) try some middle ground, with the definition of middle being set by the makers. (c) lay bare the influence of how these things are built. (d) something else?
  • @luke_metro @luke_metro on x
    It is crazy how 3 years ago this would widely be seen as a dystopian thing but now the collective reaction is just “lmao”
  • @jeremyphoward Jeremy Howard on x
    Here's a complete unedited video of asking Grok for its views on the Israel/Palestine situation. It first searches twitter for what Elon thinks. Then it searches the web for Elon's views. Finally it adds some non-Elon bits at the end. 54 of 64 citations are about Elon.
  • r/nottheonion on reddit
    Musk's latest Grok chatbot searches for billionaire mogul's views before answering questions
  • r/EnoughMuskSpam on reddit
    Grok 4 appears to seek Elon Musk's views when answering controversial questions
  • r/LocalLLaMA on reddit
    When asked about Israel v Palestine, Grok 4 searches through twitter and other sources for Elon Musk's views so it can align with them. …
  • r/skeptic on reddit
    Is the new Grok just sourcing its opinions from Elon Musk?
  • r/ChatGPT on reddit
    Grok 4 Checking Elon Musk's Personal Views Before Answering Stuff
  • r/grok on reddit
    If you ask the new Grok (without any custom instructions) for opinions on controversial topics it runs a search on X to see what Elon thinks
  • r/singularity on reddit
    Grok Checking Elon Musk's Personal Views Before Answering Stuff