Tag: Ray-Ban

Meta updates its second-generation Ray-Ban Meta smart glasses and launches the Oakley Meta Vanguard, designed for sports use.

Meta smart glasses have received a major update: the new "Conversation Focus" feature can instantly turn them into a hearing aid, and they can even automatically play matching Spotify music while you take in the scenery.

Meta recently confirmed that it has begun rolling out a new wave of software updates for its smart glasses product line (including the Ray-Ban Meta and Oakley Meta series). The update brings two killer features: "Conversation Focus" and Spotify integration powered by multimodal AI. The initial rollout goes to users enrolled in the Early Access program.

A savior in noisy environments: "Conversation Focus"

This feature essentially gives the smart glasses the ability to suppress ambient noise and enhance human voices, much like a hearing aid. When users are in crowded or noisy environments (such as restaurants or parties), activating the feature lets the glasses' microphone array use beamforming to lock onto and amplify the voice of the person directly in front of the wearer while reducing background noise. Meta's official description says the enhanced voice will sound slightly "clearer," helping users follow the person speaking. Users can activate it quickly with the voice command "Hey Meta, start Conversation Focus," or by assigning a touch shortcut to a long press on the temple.

AI DJ at your fingertips: Spotify plays whatever you're looking at

The other interesting update is the deep integration with Spotify. Through multimodal AI, the glasses can now act as your personal DJ. Simply say, "Hey Meta, play a song to match..."
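Meta hasn't disclosed how Conversation Focus is implemented, but the textbook building block for this kind of microphone-array voice enhancement is delay-and-sum beamforming. The NumPy sketch below is purely illustrative (the function name, array geometry, and parameters are assumptions, not Meta's code): each channel is delayed so that sound arriving from the target direction lines up in time, then the channels are averaged, so the frontal voice adds coherently while off-axis noise partially cancels.

```python
import numpy as np

def delay_and_sum(channels, mic_positions, direction, fs, c=343.0):
    """Hypothetical delay-and-sum beamformer (not Meta's code).

    channels:      (num_mics, num_samples) array of microphone signals
    mic_positions: (num_mics, 3) microphone coordinates in meters
    direction:     (3,) unit vector pointing from the array to the talker
    fs:            sample rate in Hz
    c:             speed of sound in m/s
    """
    # Mics farther along `direction` hear the wavefront earlier, so they
    # need a larger compensating delay (in seconds).
    delays = mic_positions @ direction / c
    delays -= delays.min()  # shift so every delay is non-negative

    num_samples = channels.shape[1]
    freqs = np.fft.rfftfreq(num_samples, d=1.0 / fs)
    out = np.zeros(num_samples)
    for signal, d in zip(channels, delays):
        # A fractional-sample delay is a linear phase shift in frequency.
        spectrum = np.fft.rfft(signal) * np.exp(-2j * np.pi * freqs * d)
        out += np.fft.irfft(spectrum, num_samples)
    # Averaging keeps the time-aligned frontal voice at full level while
    # uncorrelated background noise is attenuated.
    return out / len(channels)
```

A shipping product would run this per short audio frame and layer adaptive filtering or neural noise suppression on top, but the steering principle is the same.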

Meta leverages AI to advance its vision for the Metaverse, with comprehensive upgrades to Horizon Engine, Horizon Studio, and entertainment content.

Is Meta's "Metaverse" dream shattered? Rumors suggest Meta will drastically cut Reality Labs staff and bet all its resources on AI smart glasses.

Meta, which once made the bold move of renaming itself to demonstrate its commitment to the metaverse, has recently had to bow to market realities. According to a Business Insider report citing sources, Meta is planning layoffs at Reality Labs, the core department responsible for developing the metaverse, and will redirect the savings to its recently booming AI smart glasses and wearable devices.

The restructuring of the metaverse department could cut up to 30% of its staff

This wave of layoffs has been described as precise "surgery" and is expected to begin as early as next month. Sources indicate that the teams responsible for developing VR headsets and VR social platforms will be the most affected, with layoffs estimated at between 10% and 30%. While Meta hasn't completely abandoned the metaverse, it has clearly decided to stop pouring money into that "bulky" VR dream.

The $70 billion lesson: consumers want "fashion," not bulky "headsets"

Over the past four years, Reality Labs, the hardware embodiment of the metaverse vision, has accumulated losses exceeding $70 billion. What truly gave Meta a glimmer of hope for profitability wasn't the high-end Quest Pro, but the Ray-Ban Meta smart glasses, a collaboration with Ray-Ban. These glasses, which resemble ordinary sunglasses but feature a built-in camera and AI voice assistant, have been a major market success, proving that consumers are more willing to pay for lightweight, stylish tech gadgets than for bulky VR headsets that isolate them from the world. Meta spokesperson Nissa Anklesaria confirmed that, given the current momentum, the company is adjusting its investment portfolio and shifting resources from the metaverse to AI glasses.

To further strengthen its design capabilities, Meta even poached former Apple design executive Alan Dye. He will lead a new creative studio within Reality Labs focused on the fusion of design, fashion, and technology, reporting directly to CTO Andrew Bosworth. Meta CEO Mark Zuckerberg stated on Threads that AI glasses will change the way humans connect with technology, and that the new studio will be dedicated to making every interaction "natural and thoughtful."

Analysis: from "Ready Player One" to "Cloud Lover"

In my view, Meta's strategic shift is in line with broader changes in the technology landscape. In 2021, Apple and Google were both pushing aggressively into VR/AR, and Meta invested heavily in the metaverse to gain a competitive edge. But with Apple's Vision Pro fizzling out, industry players like HTC gradually slowing their VR application development, and Apple and Google clearly aiming at the smart glasses market for XR and AR applications, Meta has likely realized it no longer needs to shoulder the heavy burden of VR development alone. Rather than forcing users to strap on a headset and enter a virtual world like "Ready Player One," it makes more sense to use lightweight AI glasses, as in "Cloud Lover"...

Meta's new Ray-Ban Display smart glasses have been leaked in a video, featuring a heads-up display and dedicated wristband controls.

Just before Meta Connect 2025, a video showcasing Meta's new Ray-Ban Display smart glasses leaked online. The video reveals that the glasses will feature a heads-up display (HUD) and allow more precise interaction via a dedicated wristband. UploadVR reports that in addition to the new Ray-Ban Display, Meta will also launch new Oakley camera glasses.

The leaked video confirms that the new Ray-Ban glasses retain the classic Wayfarer design with transparent lenses. Their biggest highlight is the built-in HUD, which can display real-time map previews and friend messages, and even surface information related to what the wearer is looking at. The dedicated wristband allows for more precise control, such as replying to chats by swiping a finger to type text, making interactions more natural and avoiding the need to constantly reply by voice or phone.

Besides the Ray-Ban Display, the new Oakley Sphaera-style sports glasses will feature a single camera positioned centrally above the nose bridge, clearly designed to let cyclists and outdoor sports enthusiasts easily record their activities. These glasses do not have a built-in display, however, leaning more toward shooting, streaming, and sports recording, which differentiates them from the interactive experience of the Ray-Ban Display.

Although Meta has not officially responded, the newly revealed product aligns with rumors circulating over the past year that Meta would integrate heads-up display technology into its smart glasses. More details are expected at the Meta Connect event on September 17th (US time), where, beyond the next generation of smart glasses, observers are also looking forward to Meta further explaining its integration strategy for AI assistants, AR applications, and the Horizon OS ecosystem.

Meta will collaborate with Oakley to launch new smart glasses, which will be officially unveiled on June 20th

Following its collaboration with Luxottica Group on Ray-Ban smart glasses, Meta recently announced that it will unveil smart glasses in partnership with Oakley on June 20th (Pacific Time). Compared to the more fashion-oriented Ray-Ban design, the Oakley-branded smart glasses are expected to have a sportier style, and will be Meta's second such product after Ray-Ban.

Reports indicate that the Oakley-branded smart glasses may be based on the Sphaera series of polarized sports sunglasses, previously an official Olympic choice, and will retain the architecture of the earlier Ray-Ban collaboration, adding features such as photo and video capture, social media sharing, and voice assistant services. The main difference is the camera placement, now centered on the frame and optimized for sports use, designed for outdoor activities such as cycling.

However, unlike previous years, when smart glasses were announced at the Connect conference in September, the announcement of the Oakley collaboration was brought forward to June. This may be because the product's overall design does not differ significantly from the Ray-Ban collaboration, and numerous rumors about it were already circulating. On the other hand, announcing the Oakley collaboration in June not only attracts more outdoor enthusiasts during the summer, but also leaves more time to prepare a more capable pair of smart glasses for the September event.

Google unveiled the progress of its Android XR platform at Google I/O 2025 this year and announced collaborations with Samsung, Xreal, and Gentle Monster; Meta is expected to respond by announcing its new smart glasses products.

Meta is reportedly looking to add facial recognition back to its smart glasses as part of its environmental awareness capabilities.

The Information reports that Meta originally intended to add facial recognition to its smart glasses but later scrapped the design. Recently, however, the company has reworked the feature and plans to include it as part of the environmental awareness capabilities of its new smart glasses. The initial reason for the cancellation was the glasses' insufficient battery life: once activated, the battery would last only about half an hour. Sources indicate the feature may be improved in a product expected to launch in 2026, providing several hours of use.

Besides building facial recognition into smart glasses to enable various real-time AI recognition applications, Meta also appears to plan to apply the technology to smart headphones with integrated cameras. These headphones could likewise use environmental awareness to alert the wearer to nearby information, such as obstacles or landmarks.

However, the feature raises potential privacy and security concerns. There have even been internal discussions at Meta about whether an indicator light should turn on when environmental awareness is enabled, a decision that could spark privacy controversies around the device. In a recent software update for the smart glasses developed with Ray-Ban, Meta enabled the Meta AI function by default; users can only turn it off by disabling the "Hey Meta" wake word. Meta also stated that users' voice interactions will be used for training data.

Apple does not plan to win over the market with Vision Pro's specifications, but hopes to improve virtual visual immersion by lowering development difficulty.

Apple has no plans to launch an M4 Ultra, possibly revising its Vision Pro market strategy

Besides reports suggesting that Apple's screen-equipped HomePod may have its release schedule pushed back by the delayed launch of the new Siri, Bloomberg reporter Mark Gurman listed three main reasons why Apple hasn't released an M4 Ultra. Gurman also indicated that, given the Vision Pro's lackluster performance, Apple will likely readjust its development strategy, perhaps focusing on more appealing augmented reality glasses.

Apple has no plans to release an "M4 Ultra" processor. In previous interviews, Apple has said it wouldn't offer an Ultra tier for every M-series generation. Accordingly, while the Mac Studio line was recently updated, its top chip combines two M3 Max dies using UltraFusion packaging to create the M3 Ultra, rather than being based on the M4 Max. The reason is that the M4 Max's design doesn't include the die-to-die connector UltraFusion requires, so unless Apple adopts a new technology to link two processors, an "M4 Ultra" is difficult to achieve without a native connector. Given Apple's current processor designs, building an all-new Ultra-class processor would obviously require higher costs and more development time, and since it would serve only specific market demands, Apple will likely keep the M3 Ultra as its highest-end processor for now.

Perhaps with the upcoming Mac Pro update Apple might release a processor above the M3 Ultra, but that depends on the anticipated M5 processor. Considering the performance of Ultra-class processors and actual product update cycles, Apple doesn't need to release a new Ultra with every iteration; this eases upgrade anxiety for users of such products and gives Apple more time to invest in high-performance computing processors.

Regarding the future Vision Pro lineup, Gurman points out that while the product incorporates many advanced technologies and offers a novel user experience, its price makes it difficult to penetrate the general consumer market. With Meta's Quest series of virtual reality headsets available at far lower prices, many people are unlikely to consider the high-priced Vision Pro. Therefore, weighing improvements in specifications, wearing comfort, and overall cost reduction, Apple may begin shifting its focus to other, more attractive products, such as lighter and more portable augmented reality glasses. Gurman indicates that Apple has already invested more ideas and foundational technology research into augmented reality glasses, but it may take another 3 to 5 years before a product that meets expectations is released. However, Gurman also stated that Apple will not immediately release Vision...

Reports indicate that Meta is currently working on at least three new smart glasses, but "Orion" will not be sold as a commercial model.

Bloomberg News reports that Meta is currently developing at least three new smart glasses models and has already outlined a concrete product development roadmap. Its first smart glasses designed specifically for augmented reality (AR) use could launch as early as 2027.

In late September last year, Meta showcased AR glasses codenamed "Orion," demonstrating its vision for future smart glasses devices. That model represents over 10 years of research, aiming to operate independently without a separate connection to a smartphone or other device, and to let users interact with the world through holographic AR images. However, the Bloomberg report indicates that "Orion" will not be sold commercially; a marketable version, codenamed "Artemis," is planned for release in 2027.

New smart glasses codenamed "Supernova" will continue the collaboration with Ray-Ban and be sold in more countries and regions. In addition, Meta plans further collaborations with Luxottica Group, the parent company of Ray-Ban, including smart glasses codenamed "Supernova 2" based on Oakley's Sphaera model, which are expected to be better suited to use during exercise, with the camera positioned in the center of the frame. The "Hypernova" smart glasses, meanwhile, will focus on a more complete augmented reality experience, letting users run simple-interface apps on the glasses and view notifications or photos; they are expected to be priced around $1,000. Previous reports indicated that Meta plans to add a display to its Ray-Ban smart glasses in the second half of 2025, which may be this "Hypernova" model.

Furthermore, Meta appears to be developing a wristband device to control the smart glasses, somewhat similar to the wearable controller previously shown alongside "Orion." Smart headphones are also a likely next product from Meta, aiming to compete with Apple's AirPods series.

In addition to showcasing its smart glasses products early next year, Samsung also plans to announce its mixed reality platform in December.

Yonhap News Agency, citing sources, reports that Samsung may unveil its new smart glasses at the Unpacked event in January 2025, and is also expected to announce the mixed reality platform the glasses will use in December. Previous reports indicated that while Samsung would announce the glasses at the January 2025 Unpacked, the actual market launch was likely in the second half of 2025; the early reveal is primarily meant to showcase the product's design and attract more developers to build application services on the new platform. The reports did not, however, reveal specific details about Samsung's upcoming platform.

The glasses have been under development since February 2023 in deep collaboration with companies such as Google and Qualcomm. Their design resembles ordinary glasses or sunglasses, weighing only about 50 grams. They are expected to use a Qualcomm Snapdragon AR1 processor, a 12-megapixel camera, and a 155mAh battery. They will not have in-lens projection displays, so their functionality is essentially similar to the glasses released by Meta. In addition, Samsung will use Gemini AI technology, developed in collaboration with Google, to provide gesture recognition, facial recognition, and object recognition, along with functions such as QR code scanning and payment.

Samsung will launch a new smart glasses device as early as the second half of 2025, using a Qualcomm processor.

Following earlier confirmation that it will continue working with Qualcomm and using Snapdragon processors in its products, reports indicate that Samsung may launch a new smart glasses device as early as the second half of 2025. Samsung may initially produce 500,000 units to gauge actual market demand.

Like the glasses Meta developed with Ray-Ban, Samsung's glasses will use Qualcomm's Snapdragon AR1 processor, with overall weight kept to about 50 grams, slightly heavier than Meta's model. Specifications will include a 12-megapixel camera and a 155mAh battery. They will likewise lack in-lens projection displays, making their functionality largely similar to Meta's glasses. However, unlike Meta's glasses, which use Meta's own AI technology, Samsung will employ Gemini AI technology, developed in collaboration with Google, to provide gesture recognition, facial recognition, and object recognition, along with features such as QR code scanning and payment.

Some believe Samsung may announce the smart glasses alongside the Galaxy S25 series in late January next year, but they may instead be officially unveiled when the second-half flagship phone is announced and then promoted at the IFA 2025 exhibition.

In addition to updating Meta AI artificial intelligence assistant functions, Meta and Ray-Ban's smart glasses also add more application functions.

In addition to announcing updates to the Meta AI assistant, Meta also announced at Connect 2024 an enhanced user experience for the smart glasses it develops with Ray-Ban. This includes letting users hold fluent conversations with the glasses about what they see, directly recognize and dial phone numbers on signs, remember where they parked, and use real-time translation.

With voice interaction and image recognition added to Meta AI, the glasses can offer more practical functions, supporting multimodal operation through the Llama 3.2 large language model. New features let users interact with Meta AI through more natural, fluent conversation, have Meta AI explain details of objects and scenes more intuitively, translate text on signs, dial phone numbers seen on signs, and even scan QR codes directly with the built-in camera.

Other features include remembering the user's parking location, keeping track of to-do lists and shopping items, and even offering outfit-matching suggestions. Meta will also collaborate with the "Be My Eyes" service to help more visually impaired people "see" what is in front of them. The translation function can handle long passages of text, and real-time translation for English, French, Italian, and Spanish is planned by the end of this year, allowing users to hold real-time translated conversations through the glasses.

In addition, Meta announced a limited-edition version of the smart glasses made of transparent material. The hardware specifications are the same as the standard version, but the transparent design offers a clear view of the internal structure for a more high-tech look.
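The multimodal flow described here (pointing the camera at a sign, then asking the assistant about it) can be approximated with the open-weights Llama 3.2 vision models. The snippet below is a minimal sketch using the Ollama Python client, not Meta's actual glasses pipeline; the image path and prompt are illustrative assumptions.

```python
import ollama

# Hypothetical stand-in for a frame captured by the glasses' camera.
IMAGE_PATH = "sign_photo.jpg"

# Send the image plus a question to a Llama 3.2 vision model running
# locally in Ollama, roughly analogous to asking the glasses
# "Hey Meta, what does this sign say?"
response = ollama.chat(
    model="llama3.2-vision",
    messages=[
        {
            "role": "user",
            "content": "Read the text on this sign and list any phone numbers on it.",
            "images": [IMAGE_PATH],
        }
    ],
)

print(response["message"]["content"])
```

On the glasses themselves this runs against Meta's hosted models rather than a local instance, but the image-plus-text prompt shape is the same idea.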
