
Chronicles

The story behind the story


Meta rolls out three new Ray-Ban Meta features: “live AI” for videos and live translations for Early Access Program members, and Shazam for US and Canada users

Markus Kasanmascheff / WinBuzzer: Meta Ray-Ban Smart Glasses Add Live AI and Translation: What's Next, Therapist?
Alan Martin / Tom's Guide: Meta is putting Shazam and AI into its Ray-Ban smart glasses — here's what's new
Stan Schroeder / Mashable: Ray-Ban's Meta glasses can now chat with you about your surroundings
The Indian Express: Ray-Ban Meta glasses get smarter with live AI, live translate, and also Shazam
Naomi Li Gan / Tech in Asia: Meta adds live AI, translation to Ray-Ban smart glasses
Colin Kirkland / MediaPost: Meta Rolls Out Live AI Features On Ray-Ban Glasses
Kyle Wiggers / TechCrunch: Meta updates its smart glasses with real-time AI video
Enacy Mapakame / Cryptopolitan: Meta makes three major updates to its Ray-Ban glasses, including live AI
Juli Clover / MacRumors: Meta's Smart Glasses Gain Live AI and Live Translation
Rohith Bhaskar / Notebookcheck: Ray-Ban Meta Glasses updated with Live AI, Translation, and Shazam support
Scott Stein / CNET: Meta's Ray-Bans Can Now Do Real-Time Live AI And Translation
Harshita Mary Varghese / Reuters: Meta adds live translation, AI video to Ray-Ban smart glasses

Bluesky:
Ben Wagner / @benwagner: It seems to me that Apple really is missing the boat here. Meta being able to build a far superior voice control from scratch while Siri continues to be mostly useless is baffling. — And, Meta isn't really able to deeply integrate. An Apple version with really deep phone integration would be a hit
Paul Armstrong / @paularmstrongtbd: Jesus Christ what is coming in January? Can't any of this wait?
Ben Wagner / @benwagner: This product continued to impress me after using it for a week, and this is before these updates. — That being said, the AI needs a ton of work... I asked the ai for the score of an NHL game and it twice told me it couldn't find it and once gave me an inaccurate score. — Seems like table stakes?

Instagram:
Mark Zuckerberg: Clutch new update for Ray-Ban Meta glasses — song recognition with Shazam

X:
Rowan Cheung / @rowancheung: @AndrewCurran_ Been waiting for live real time translation. Such a simple, useful feature that can help billions of people.
Andrew Curran / @andrewcurran_: @rowancheung It's wonderful, and it feeds the live translation into the little speakers by the ear so it doesn't distract from the conversation. Like magic honestly.
Andrew Curran / @andrewcurran_: META is rolling out the v11 update for Ray-Bans this morning. Early access users will gain access to Live Language Translation, Shazam integration, and an interesting new feature they're calling ‘Live AI’. The design goal is live context awareness, and proactive behavior.
David Woodland / @davidsven: Today we start the rollout of one of the biggest updates of the year to Ray-Ban Meta Glasses. Here are some of my favorite things in this update: 1) “Hey Meta, start Live AI.” You can now have a conversation with AI that streams video of what you see into the context of the

The Verge