Confirming earlier reports, Amazon announced that it is developing smart glasses specifically for its delivery drivers. The glasses use AI-powered sensing and computer vision to identify what the built-in cameras see and provide real-time guidance to drivers via a heads-up display (HUD) embedded directly in the lenses.
Amazon said the glasses have been in development for some time, with hundreds of delivery drivers testing early versions and providing feedback. The goal is to improve overall delivery efficiency while enhancing driver safety on their routes.
AI visually identifies packages, and the heads-up display provides walking navigation
According to Amazon, the smart glasses activate automatically once the driver parks the vehicle. The heads-up display then shows a list of packages to be delivered at the driver's current location. As the driver pulls items from the stack of packages, the glasses can even visually verify that the correct package is being picked up.
When the driver leaves the vehicle to make a delivery on foot, the glasses' heads-up display provides turn-by-turn navigation to the destination, identifies potential hazards along the way, and assists the driver in navigating complex locations such as apartment buildings.
Simply put, the device lets drivers find addresses and deliver packages without repeatedly pulling out or operating their phones. They can even capture proof-of-delivery photos through the glasses.
Comes with a controller vest and replaceable batteries, and supports prescription lenses
In terms of hardware, the smart glasses will be paired with a special vest that has an integrated controller and a dedicated "panic button" that allows drivers to quickly call emergency services if needed while on the route.
The glasses themselves use a swappable-battery design to last a full shift, and to accommodate different drivers they also support prescription or photochromic lenses.
Future features: Wrong address warning, pet detection
Amazon expects future versions of the glasses to have more powerful AI capabilities, such as warning drivers when they are about to drop off a package at the wrong address, and alerting them to additional potential hazards, such as pets in a yard.
Improve safety and efficiency, consumer version may be available by the end of 2026
Beryl Tomay, Amazon's vice president of transportation, said the glasses "reduce the need for drivers to manage both their phone and packages simultaneously," helping drivers stay focused on their work and thereby improving their safety. She also said that among drivers in the test program, Amazon has observed an average time saving of approximately 30 minutes per shift.
Although Amazon's announcement did not mention a version of the smart glasses for general consumers, previous reports from The Information suggest Amazon is also developing a consumer model, which could launch in late 2026 or early 2027.