Google patents smart glasses with adaptive assistant powered by gaze and voice

Google has received a patent for an automated assistant designed to offer suggestions to users wearing smart glasses.

Google has received a patent for an automated assistant designed to offer suggestions to users wearing smart glasses, adjusting these recommendations based on the user's visual focus or spoken commands.

Previously, Google had been developing a pair of smart glasses utilizing augmented reality (AR) technology; however, the company reportedly discontinued its 'Project Iris' AR smart glasses last year to concentrate on creating similar hardware for original equipment manufacturer (OEM) partners.

It remains uncertain whether Google will introduce a pair of AR glasses utilizing the technology outlined in its recent patent. A document published on the World Intellectual Property Organization (WIPO) website, titled "Adapting assistant suggestions rendered at computerized glasses according to changes in user gaze and/or other user input," details the functionality of an "automated" assistant that can respond to audio and visual inputs from the smart glasses worn by the user. According to the company, this automated assistant would display suggestions on the smart glasses' screen, allowing users to select options through technology that monitors their "gaze."

" This implies that the device would incorporate some form of eye-tracking technology to facilitate the assistant's functionality. When a user shifts their gaze, the assistant would utilize the smart glasses' camera and microphone to dynamically "adapt" its suggestions based on the user's visual perspective and verbal commands. Google patents smart glasses with display, speaker & eye-tracking https://t.

Google illustrates a scenario in which a user, while navigating an unfamiliar city, receives restaurant recommendations aligned with their line of sight. Users can activate the assistant by tapping the glasses or employing a designated wake phrase.