Keyword search finds articles that contain specific words. Vector search finds articles that mean similar things. The difference sounds subtle. In practice, it's the difference between finding "articles that mention Arm" and finding "articles about chip companies that stopped licensing and started competing" — a concept that connects Arm, Intel's foundry pivot, and Samsung's chip independence without sharing a single keyword.
How Vectors Work
Every article in TEXXR's database is converted into a vector — a list of 1,536 numbers that encodes the article's semantic position in meaning-space. Two articles about similar topics will have vectors that point in similar directions. Two articles about unrelated topics will point in different directions. Cosine similarity — the cosine of the angle between two vectors — tells you how semantically related they are.
A cosine similarity of 1.0 means identical meaning. A similarity of 0.0 means the vectors are orthogonal — completely unrelated. In practice, articles above 0.6 are meaningfully related. Above 0.7, they're covering the same story from different angles. Above 0.8, they're essentially about the same event.
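The arithmetic behind these scores fits in a few lines. This is an illustrative sketch, not TEXXR's actual code, and the toy 3-dimensional vectors stand in for real 1,536-dimensional embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Invented toy embeddings for illustration:
chip_article  = [0.9, 0.1, 0.2]
foundry_pivot = [0.8, 0.2, 0.3]
recipe_blog   = [0.1, 0.9, 0.1]

print(cosine_similarity(chip_article, foundry_pivot))  # high: related topics
print(cosine_similarity(chip_article, recipe_blog))    # low: unrelated topics
```

Note that cosine similarity ignores vector length and compares only direction, which is why it works as a pure measure of topical relatedness.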
Triangulation
Vector search becomes powerful when you move beyond single-article similarity to triangulation — finding the semantic center of multiple signals and searching from there.
Here's how it works. Instead of searching for a single query ("AI regulation"), you provide multiple data points:
- An article about the EU AI Act
- An article about the FTC investigating OpenAI
- An article about California's SB 1047
TEXXR computes the centroid of these three vectors — the average point in meaning-space. Then it searches for articles near that centroid. The results aren't articles about "AI regulation" (which keyword search would find). They're articles that sit at the semantic intersection of European policy, US antitrust, and state-level legislation. The triangulated search surfaces connections that no single query could express.
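The centroid step can be sketched in a few lines. The seed names, vectors, and toy corpus below are invented for illustration, and real embeddings have 1,536 dimensions, not 3:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def centroid(vectors):
    """Element-wise mean of several embedding vectors: the semantic center."""
    n = len(vectors)
    return [sum(dim) / n for dim in zip(*vectors)]

# Hypothetical seed embeddings for the three signals:
eu_ai_act  = [0.7, 0.5, 0.1]
ftc_openai = [0.6, 0.1, 0.6]
ca_sb1047  = [0.8, 0.3, 0.4]

center = centroid([eu_ai_act, ftc_openai, ca_sb1047])

# Rank a toy corpus by similarity to the centroid, not to any single seed:
corpus = {
    "state-vs-federal AI preemption fight": [0.7, 0.3, 0.4],
    "new GPU benchmark results":            [0.1, 0.9, 0.2],
}
ranked = sorted(corpus, key=lambda t: cosine_similarity(center, corpus[t]), reverse=True)
print(ranked[0])
```

The key design point is that ranking happens against the averaged vector, so the top result can sit near all three seeds without being close to any one of them.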
Triangulation doesn't search for words. It searches for the space between ideas.
What This Finds
Some of TEXXR's strongest editorial insights came from triangulation:
The Architect's Chip started with a single article about Arm's AGI CPU. Similar-article search found a 2023 article about Arm planning a test chip, a 2024 article about Arm setting up an AI chip division, and a 2025 article about Nvidia cutting its Arm stake — all at 0.65-0.72 similarity. None shared keywords with the original. The vector space connected them because they occupied the same semantic region: "chip designer becomes chip maker."
The Suboptimal Move traced a Bloomberg article about chess grandmasters back through similar articles spanning 2018-2023 — from AlphaZero teaching itself chess to an amateur beating a Go AI by exploiting blind spots. Keyword search for "chess grandmasters" would have found chess articles. Vector search found the structural pattern: humans adapting to AI by exploiting what AI can't see.
The $6 Million Precedent used triangulation across a jury verdict, a Section 230 analysis, and a Facebook whistleblower article to find the 5-year arc connecting them — a constellation of 28 articles across four years that traced the path from leaked internal research to courtroom liability.
Period Centroids
TEXXR precomputes centroids for every entity in every time period — monthly and quarterly. These period centroids represent what an entity "meant" during that window. The distance between successive centroids is semantic drift — the measurable change in an entity's meaning over time.
Period centroids also enable comparative search. You can ask: "What was the semantic difference between 'OpenAI' in Q1 2024 and 'OpenAI' in Q1 2026?" The answer is a vector — and that vector points in a direction that describes the transformation. Project it onto interpretable axes (safety → commerce, research → product, nonprofit → corporation) and you get the drift map.
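One way to sketch the drift computation and its projection onto an interpretable axis. The period centroids and the "safety"/"commerce" pole vectors below are hand-picked toy numbers, not anything from TEXXR:

```python
import math

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project(drift, axis):
    """Scalar projection of a drift vector onto a unit-normalized axis."""
    length = math.sqrt(dot(axis, axis))
    return dot(drift, [x / length for x in axis])

# Hypothetical period centroids for one entity:
q1_2024 = [0.8, 0.2, 0.1]
q1_2026 = [0.3, 0.7, 0.2]

# Semantic drift: the difference between successive centroids.
drift = sub(q1_2026, q1_2024)

# Interpretable axis: "safety" pole minus "commerce" pole, so a negative
# projection means movement away from safety and toward commerce.
safety   = [0.9, 0.1, 0.0]
commerce = [0.1, 0.9, 0.0]
axis = sub(safety, commerce)

print(project(drift, axis))
```

Repeating the projection for each axis (research → product, nonprofit → corporation, and so on) turns one drift vector into the multi-axis drift map described above.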
Why This Matters
Most news platforms search by keyword, date, or source. This works for finding a specific article you remember. It fails for finding patterns you don't know exist.
Vector search inverts the process. Instead of starting with what you're looking for, you start with what you've found — an interesting article, a surprising data point, a juxtaposition — and ask: "What else occupies this region of meaning-space?" The answer is often something you would never have searched for, from a time period you weren't looking at, by a source you don't follow.
That's the difference between search and discovery. Keywords find what you expect. Vectors find what you don't.
Try It
TEXXR's Explore page runs vector search on every query. Enter any natural language concept — "companies that pivoted from services to chips," "AI products that failed as standalone apps," "the intersection of defense procurement and AI safety" — and get results ranked by semantic similarity, not keyword overlap.
The Pulse API exposes triangulation via POST /api/triangulate — pass any text and get the most semantically similar articles across the full corpus. The CLI makes it even simpler: python3 cli/texxr.py triangulate "your concept here".
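A call to the triangulation endpoint might look like the sketch below. The base URL and the JSON field name ("text") are assumptions, since only the path and the idea of passing text are stated here; check the Pulse API docs for the actual schema:

```python
import json
import urllib.request

# Assumed host: substitute your TEXXR deployment's base URL.
BASE_URL = "http://localhost:8000"

# Assumed request schema: a single "text" field carrying the query concept.
payload = json.dumps({"text": "chip designer becomes chip maker"}).encode()

req = urllib.request.Request(
    f"{BASE_URL}/api/triangulate",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment to send against a live instance:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```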