Here’s how I use my Ray-Ban® Meta smart glasses as a person with vision impairment
Author: Guide Dogs
Uploaded: 2024-12-04
Views: 32,973
Description:
As of 10 April 2025, Meta’s Look and Tell feature is available on Ray-Ban Meta smart glasses. This powerful multimodal AI update allows users to ask questions about their surroundings and receive real-time spoken responses, whether it's identifying objects, reading text, or describing visual elements. Previously unavailable in the UK due to legislative requirements, this feature marks a significant step forward in accessibility and independence for users.
In this video, Dennis speaks to camera and shares how he uses his pair of Ray-Ban® Meta smart glasses as a person with vision impairment.
Dennis uses his smart glasses to help him when he’s cooking, to read bus timetables and more. AI functionality can help answer questions Dennis has about objects in front of him, responding through the frames’ built-in speakers. Always be aware that AI can make mistakes.
[Visual description: Video shows Dennis seated in a chair, speaking to camera while wearing his Meta smart glasses.]
Whilst Guide Dogs may suggest various third-party websites and applications, these are not endorsed by Guide Dogs. Guide Dogs has no control over third parties and cannot be held responsible for the accuracy of the information and support they provide, or for the suitability and quality of any products or services.