Chronicles

The story behind the story


Google updates Maps with an “immersive view”, which combines Street View and aerial shots, for select cities, and will let third-party apps use Live View's AR


9to5Google / Kyle Bradshaw

Discussion

  • @google @google on x
    Immersive view in @GoogleMaps is a new way to experience what a neighborhood, landmark, restaurant and venue will be like — and even feel like you're right there. It uses advances in computer vision and AI to fuse together billions of Street View images. #GoogleIO https://twitter…
  • @jetscott Scott Stein on x
    Between Apple and Google doing this, maps are going to become rapidly virtualized to a different degree https://twitter.com/...
  • @pollagarmiany @pollagarmiany on x
    Central Kurdish (Sorani) is now also available on @Google Translate! It's been already available on @Bing for some years. If you speak Kurdish but can't read Sorani yet, you definitely should learn it. It's not that hard. Just 32-36 letters. https://twitter.com/...
  • @sundarpichai Sundar Pichai on x
    6/ We're adding 24 new languages to Google Translate, spoken by more than 300 million people. We did it with an AI technique called ‘zero-shot learning’ — the model learned to translate without ever seeing translations in these languages. https://blog.google/...
  • @marie_haynes Dr. Marie Haynes on x
    Current translation services tend to cover about 100 of the languages in the world. Google has just announced new machine learned translation integrating deep learning with NLP. Very cool. https://ai.googleblog.com/...
  • @krzyzanowskim Marcin Krzyzanowski on x
    year is 2022 Google: we're adding 24 NEW languages to our translator Apple: 16 languages to translate should be enough for everyone 🍿 https://twitter.com/...
  • @iamhaks @iamhaks on x
    Kurdish Sorani is finally being added to google translate. Now we have both Sorani and Kurmanci on there ✌🏼☀️ https://twitter.com/...
  • @ariesasandino @ariesasandino on x
    We have gone beyond the world of “The Jetsons!” Google unwrapped a new prototype pair of augmented reality glasses that can automatically translate speech for wearers that speak different languages. Having both a VCR and HBO was the highlight of my childhood!
  • @bseratg @bseratg on x
    Super excited to share Tigrinya & Oromo have been added to Google Translate! One of the most meaningful features I've had the pleasure of supporting. Just launched today but it won't get to users worldwide until the next couple of days. https://twitter.com/...
  • @google @google on x
    We recently launched multisearch in the Google app, which lets you search by taking a photo and asking a question at the same time. Later this year, you'll be able to take a picture or screenshot and add “near me” to get local results. #GoogleIO https://twitter.com/...
  • @garrettsussman Garrett Sussman on x
    The potential for Google Lens is fascinating. From discovering where to buy things you see IRL to seeing the actual reviews of those things. But honestly, I don't see people using it. Has anyone ever used Google Lens in a practical situation, outside of the novelty of it? https:/…
  • @searchliaison @searchliaison on x
    Imagine you want to learn more about multiple items on a shelf. In the future, Google Search will offer “scene exploration” to help you with this or similar tasks, to bring back info about multiple objects all at once. Stay tuned & learn more! #GoogleIO https://blog.google/... ht…
  • @sundarpichai Sundar Pichai on x
    2/ In Google Search, we unveiled a new multisearch feature to help people find information about local businesses. And our “scene exploration” advancement uses your camera to pan a scene and show insights about multiple objects in a space. https://blog.google/...
  • @marie_haynes Dr. Marie Haynes on x
    You can now search using images AND text at the same time and add “near me” to the search to find local options. Apparently not *yet* MUM powered. https://twitter.com/...
  • @wittednote @wittednote on x
    Today at Google I/O, we shared our vision to make the whole world around you searchable - making it easier and more natural for you to find and explore information: https://blog.google/... → A mini-thread on what this means... 🧵 #GoogleIO (1/4)