Tag: Metaverse

Meta leverages AI to advance its vision for the Metaverse, with comprehensive upgrades to Horizon Engine, Horizon Studio, and entertainment content.

Is Meta's "Metaverse" dream shattered? Rumors suggest Meta will drastically cut Reality Labs staff and bet all its resources on AI smart glasses.

Meta, which once made the bold move of renaming itself to demonstrate its determination, has recently had to bow to market realities. According to a Business Insider report citing sources, Meta is planning layoffs at Reality Labs, the core division responsible for developing the metaverse, and will redirect the saved resources to its recently booming AI smart glasses and wearable devices.

The restructuring of the metaverse division could cut up to 30% of its staff. This wave of layoffs has been described as precise "surgery" and is expected to begin as early as next month. Sources indicate that the teams building VR headsets and VR social platforms will be hit hardest, with cuts estimated at between 10% and 30%. While Meta has not completely abandoned the metaverse, it has clearly decided to stop pouring money into that "bulky" VR dream.

The $70 billion lesson: consumers want "fashion," not bulky "headsets"

Over the past four years, Reality Labs, the hardware embodiment of the metaverse vision, has accumulated losses exceeding $70 billion. What gave Meta a genuine glimmer of profitability was not the high-end Quest Pro but the Ray-Ban Meta smart glasses, a collaboration with Ray-Ban. These glasses, which look like ordinary sunglasses but pack a built-in camera and AI voice assistant, achieved great success in the market, proving that consumers are more willing to pay for lightweight, stylish gadgets than for bulky VR headsets that wall them off from the world. Meta spokesperson Nissa Anklesaria also confirmed that, given the current momentum, the company is adjusting its investment portfolio and shifting resources from the metaverse to AI glasses. To further strengthen its design capabilities, Meta even poached former Apple senior designer Alan Dye.
He will lead a new creative studio within Reality Labs focused on the fusion of design, fashion, and technology, reporting directly to CTO Andrew Bosworth. Meta CEO Mark Zuckerberg said on Threads that AI glasses will change how humans connect with technology, and that the new studio will be dedicated to making every interaction "natural and thoughtful."

Analysis: from "Ready Player One" to "Her"

In my view, Meta's strategic shift tracks the broader change in the technology landscape. In 2021, both Apple and Google were pushing aggressively into VR/AR, and Meta invested heavily in the metaverse to gain a competitive edge. But with Apple's Vision Pro fizzling out, industry players like HTC gradually slowing their VR application development, and Apple and Google clearly aiming at the smart-glasses market for XR and AR applications, Meta should realize it no longer needs to shoulder the heavy burden of VR development alone. Rather than forcing users to wear headsets into a virtual world like "Ready Player One," better to reach them through lightweight AI glasses, as in the film "Her"...

Meta unveiled several prototype designs for virtual reality headsets, including the Tiramisu, which aims for ultra-realistic image quality.

Reports suggest that Meta will drastically cut spending on the metaverse, with Reality Labs facing a 30% budget reduction and layoffs as it shifts its focus to AI.

Just a few years after Facebook changed its name to Meta and announced its all-in push into the "metaverse," the social media giant appears to be facing a major strategic shift. According to a Bloomberg report, Meta is planning significant cuts to its metaverse division, Reality Labs, with budget reductions potentially reaching 30%, and even layoffs in early 2026 are not ruled out. The news was reportedly discussed during a series of budget meetings held at Meta CEO Mark Zuckerberg's private estate in Hawaii. Sources familiar with the matter revealed that Reality Labs was asked to cut "deeper than average," a sign that the company's patience in this area is wearing thin.

Burning through $70 billion in four years, investors are losing patience

Reality Labs, Meta's core division for developing VR (virtual reality) and AR (augmented reality), has accumulated losses exceeding $70 billion since 2021. Although Mark Zuckerberg has repeatedly stressed that this is a long-term investment in the future, to investors it looks more like a bottomless pit of resource consumption. The report points out that the metaverse concept has not caught on with consumers as expected: while players are still willing to buy VR headsets for games (such as shooters), their willingness to wander virtual worlds (such as Horizon Worlds) or spend real money on virtual clothing remains low. The budget cut is expected to directly affect the Meta Horizon Worlds virtual world platform and hardware development of the Quest headset line.

Strategic shift: from virtual worlds to AI and smart glasses

Although Mark Zuckerberg says he still believes people will ultimately spend the vast majority of their time in virtual worlds, the budget cut is read as a signal that he understands this vision is still years, or even decades, from being realized. So where will the money saved go?
The answer is clearly the hottest topic right now: AI. The report indicates that Meta's future spending will focus on developing large AI models, chatbots, and hardware products closely tied to AI experiences, such as the recently well-received Ray-Ban Meta smart glasses. This also explains why Mark Zuckerberg has mentioned the "metaverse" less and less in public appearances and earnings calls lately, focusing instead on AI agents and open-source models.

Meta will bring developers customizable "talking" AI NPCs to Horizon Worlds


Meta is further infusing its metaverse platform, Horizon Worlds, with generative AI, allowing developers to create AI NPCs (non-player characters) driven by large language models (LLMs) for more lifelike interactive experiences. The update is expected to launch soon and will be a key development for the next phase of Horizon Worlds.

Previously, NPCs in Horizon Worlds mostly relied on pre-defined scripts for responses, resulting in stiff and shallow dialogue between players and characters. With Meta's continued investment in AI, these NPCs will be able to interact with players in real time via voice, moving beyond one-way communication to respond based on context and player actions. In other words, players will no longer face "cold, programmatic responses" but will have the opportunity to engage in more realistic, multi-layered interactions with characters.

Meta also showcased new features in the Worlds Desktop Editor in its latest developer update. Developers can use the tool to customize character appearances and set backstories and interaction rules, making NPCs more than just "props in the scene": characters with personalities, memories, and even motivating backstories. Characters players encounter in Horizon Worlds may therefore exhibit drastically different interaction patterns, further enhancing immersion.

Meta has already introduced AI NPCs in worlds such as Bobber Bay Fishing and Profit or Perish, letting players experience this new form of interaction. Compared with traditional characters with fixed dialogue, these AI NPCs respond more flexibly and may even gradually reveal new behavioral characteristics as the game progresses.

This update further highlights Meta's ambition to deeply integrate generative AI with its metaverse vision. From introducing AI assistants and AI modeling tools to now giving NPCs dialogue capabilities, Meta is gradually injecting more "living elements" into its virtual worlds. In the future, developers will be able to design not only environments and quests but also "virtual residents" who interact with players over the long term, perhaps a crucial step toward making the metaverse a truly "organic social space."

With the upcoming Meta Connect conference, Meta is widely expected to reveal more details about the integration of generative AI with Horizon Worlds. Beyond game developers, applications in education, social networking, and business will likely find room to use AI NPCs. For players, characters in the metaverse will no longer be mere background, but "digital partners" who can co-create experiences with them.
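The interaction model described above, a developer-defined persona and backstory combined with context-aware replies, can be sketched in a few lines. This is an illustrative mock-up, not Meta's actual NPC API: the `NPC` class, its fields, and the canned fallback reply are all invented for demonstration, and the real LLM call is stubbed out.

```python
from dataclasses import dataclass, field

@dataclass
class NPC:
    """Hypothetical LLM-driven NPC with a fixed persona and rolling memory."""
    name: str
    backstory: str  # set by the developer, analogous to an editor-authored backstory
    memory: list = field(default_factory=list)  # rolling record of the conversation

    def build_prompt(self, player_line: str) -> str:
        # Prepend the persona and recent memory so every reply stays
        # in character and can reference earlier exchanges.
        recent = "\n".join(self.memory[-6:])
        return (f"You are {self.name}. {self.backstory}\n"
                f"Conversation so far:\n{recent}\n"
                f"Player: {player_line}\n{self.name}:")

    def respond(self, player_line: str, llm=None) -> str:
        prompt = self.build_prompt(player_line)
        # A real integration would send `prompt` to a hosted LLM; here we
        # fall back to a canned reply so the sketch runs standalone.
        reply = llm(prompt) if llm else f"(as {self.name}) Welcome back to the bay!"
        self.memory.append(f"Player: {player_line}")
        self.memory.append(f"{self.name}: {reply}")
        return reply

# Invented example character, loosely inspired by the fishing-world setting.
guide = NPC("Mara", "A fishing guide who remembers every catch in Bobber Bay.")
print(guide.respond("Any luck today?"))
```

The key design point is that persona and memory live outside the model call, so swapping in a different LLM backend does not change the character's identity or what it remembers.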

Netflix and Roblox partner to create "Netflix Nextworld," a digital theme park featuring elements from TV series and anime.


Netflix has announced a collaboration with Roblox to create "Netflix Nextworld," a digital theme park featuring elements from Netflix series and anime. Built on Roblox's platform, Netflix Nextworld will initially be available as a beta experience, letting fans easily take part in activities tied to their favorite series and anime.

Netflix Nextworld currently includes themed activities for *Stranger Things*, *One Piece*, *Cobra Kai*, and Zack Snyder's *Rebel Moon*, as well as interactive content from the upcoming animated series *Jurassic World: Chaos Theory*. While exploring the digital theme park, fans can collect various collectibles and wearable items to decorate personal spaces called "Fan Pods." They can also socialize online in a shared space called the "Streamship," and even take part in content premieres and group viewings. "Netflix Nextworld" is currently accessible on iOS and Android mobile devices, as well as Windows PCs and Macs. Netflix expects it to attract more users and drive growth in its subscriber base.

Disney invests $1.5 billion in Epic Games to create a vast, open "game and entertainment universe"


Disney announced a $1.5 billion investment in Epic Games to create a vast, open "games and entertainment universe." Disney stated that this universe will offer rich gameplay experiences and will connect with Epic Games' popular game, Fortnite. Players will be able to experience various Disney content within it and interact with its stories, characters, and other elements. Even when players temporarily leave the universe, it will continue to function, offering a different experience upon re-entry.

Disney CEO Bob Iger said this is Disney's largest investment in the gaming market to date, and that Disney will work with Epic Games to create a sustainable, open, and interconnected virtual ecosystem, expected to integrate with Fortnite's large community. While Disney has consistently invested in new technologies and the gaming market, and previously built up its metaverse efforts, it surprisingly cut members of its metaverse team during last year's layoffs. The newly announced collaboration with Epic Games may indicate a plan to leverage a gaming-industry player's technological resources to accelerate its metaverse strategy. Details of the collaboration have not yet been revealed, but it may build on the Fortnite framework, integrating Disney's rich content assets to create a metaverse environment open to wider player participation.

Meta's Codec Avatars technology brings metaverse applications significantly closer to everyday use


On a podcast hosted by Russian-American computer scientist Lex Fridman, Fridman interviewed Meta CEO Mark Zuckerberg remotely, hundreds of miles apart, entirely within a metaverse environment, using Meta's flagship Quest Pro virtual reality headset (launched last year) and its Codec Avatars technology (first showcased in 2018). During the interview, both Fridman and Zuckerberg wore the Quest Pro while Codec Avatars rendered their facial expressions in real time in the virtual setting, heightening the realism of the conversation.

Due to current limitations, however, Codec Avatars only displays real-time facial expressions; rendering a full-body image may require additional wearable devices. And while Codec Avatars originally required an NVIDIA Titan X-class graphics card for rendering, previously announced details show it now running on a custom RISC-V chip from Meta Reality Labs occupying only 1.6 mm², allowing it to be applied directly in standalone virtual reality headsets.

By comparison, the personalized faces in Meta's early Horizon Worlds were mostly composed of simple lines; the faces rendered with Codec Avatars have clearly convinced more people that the metaverse is a future trend. Beyond Meta's push into metaverse applications, companies including Google and Logitech have also tried to create more realistic "face-to-face" communication and interaction using existing cameras and other devices, but those approaches face limitations such as higher setup costs and the difficulty of using them anytime, anywhere. Meta's approach of pairing virtual reality headsets with custom chips and related technologies therefore makes it easier for people to interact in the metaverse through virtual vision, while realistic facial expressions shrink the sense of distance.

Currently, virtual reality headsets still require other methods to represent full-body movement in the metaverse environment. Perhaps in the future there will be simpler, more intuitive ways to achieve this, and even "tactile" sensations, letting users interact more naturally in the metaverse environment.

Apple and Adobe collaborate on USDZ, an open file format, to make AR content more accessible and shareable.

Apple, NVIDIA, Adobe, and others establish the Alliance for OpenUSD to promote standards for 3D content development

Following Apple's 2018 collaboration with Pixar and Adobe to create the open USDZ file format, and subsequent partnerships among NVIDIA, Pixar, Adobe, Autodesk, Siemens, and other companies to further promote the USD standard, the Alliance for OpenUSD (AOUSD) has now been announced in partnership with the Joint Development Foundation (JDF), an affiliate of the Linux Foundation, to advance standards for 3D content development.

The alliance aims to enhance the functionality of OpenUSD (Open Universal Scene Description), promoting standardization of the 3D ecosystem, enabling greater interoperability between 3D creation tools and data, and allowing developers and content creators to describe, compose, and produce large-scale 3D projects and build an ever-expanding range of 3D products and services.

OpenUSD, created by Pixar Animation Studios, is a high-performance 3D scene description technology that provides interoperability across tools, data, and workflows. Known for helping capture artistic expression and streamline film content production, its power and flexibility make it an ideal content platform for new industries and applications.

The alliance will develop written specifications detailing OpenUSD's functionality. This will enable greater compatibility and wider adoption, integration, and implementation, allowing other standards bodies to incorporate it into their own specifications. The project will be led by the Linux Foundation's JDF, ensuring open, efficient, and effective development of the OpenUSD specification while assisting in obtaining International Organization for Standardization (ISO) certification. AOUSD will collectively define the direction for industry-wide improvements to OpenUSD technology and invites companies and organizations to join in shaping the future of its development.
Steve May, Pixar's Chief Technology Officer and Chairman of AOUSD, stated: "Universal Scene Description (USD), invented by Pixar, is the technological foundation of our state-of-the-art animation workflow. OpenUSD is based on Pixar's years of research and application in film production. We open-sourced the project in 2016, and now OpenUSD's influence has expanded to film, visual effects, animation, and other industries increasingly reliant on 3D data for media exchange. With the announcement of AOUSD, we reveal an exciting next step: OpenUSD technology will continue to evolve and become an international standard." ...
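For readers who have never seen what "scene description" means in practice, here is a minimal hand-written `.usda` layer, the human-readable text encoding of Universal Scene Description. This is a purely illustrative sketch (the prim names are invented), showing the kind of file these tools exchange:

```usda
#usda 1.0
(
    defaultPrim = "Scene"
)

def Xform "Scene"
{
    def Sphere "Ball"
    {
        double radius = 0.5
        double3 xformOp:translate = (0, 1, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because the format is a plain-text hierarchy of typed "prims" and attributes like this, different tools can read, layer, and override the same scene, which is the interoperability the alliance is standardizing.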

Microsoft: The new Bing service will change the traditional search usage model of the past decades

Microsoft maximizes its cloud service investment and will continue to build technical resources for AI application development.

Following its recent quarterly results, which highlighted the shift of its primary revenue source to the cloud, Microsoft CEO Satya Nadella emphasized that the company will continue helping customers expand their businesses through the Azure cloud platform, maximizing the return on its cloud investments, while continuing to invest in technology resources for future artificial intelligence applications to create even greater revenue streams.

Compared with its past reliance on software licensing as its main source of revenue, Microsoft shifted its focus to cloud services long ago. Under Nadella's leadership, it has transformed its revenue model from software licensing to the broader development opportunities of cloud services, and has recently aligned itself with the rise of artificial intelligence. Microsoft's Azure OpenAI service, developed in partnership with OpenAI, has accumulated more than 11,000 customers, including IKEA, Volvo, Mercedes-Benz, and Zurich Insurance, as well as companies such as Flipkart, Humane, Kahoot, Miro, and Typeface that are building native services on the cloud platform, and it added nearly 100 new customers per day during the quarter. Beyond Azure OpenAI and the existing Azure platform services, Microsoft's recent integration of Copilot technology into many products, its substantial revenue from businesses such as Office, and its cloud-based development platforms have also won significant developer support and contributed to profits.

The gaming market is another area Microsoft is actively expanding into.
With regulatory bodies worldwide gradually approving Microsoft's $68.7 billion acquisition of Activision Blizzard, a US court ruling that the deal does not constitute a market monopoly, and Microsoft's 10-year guarantee to Sony covering games including the Call of Duty series, Microsoft expects to further boost its Xbox gaming business, expand the Xbox Game Pass subscription service, and support future applications such as the metaverse and AI training. Market observers believe Microsoft has shifted from its earlier horizontal expansion model to vertical integration through its cloud platform, committing deeply to multiple business areas.

Facebook's parent company acquires US bank's trademark for $60 million

Meta's revenue grew by double digits in the second quarter, with its advertising business notably outperforming Google's.

Meta announced its financial results for the second quarter of fiscal 2023, ended in June, showing revenue of $32 billion, up 11% year over year, and net income of $7.79 billion, up 16% year over year. This double-digit revenue growth, the strongest since the end of 2021, contrasts sharply with the three consecutive quarters of declining revenue that preceded it. Advertising revenue grew 12%, outpacing Google's. Daily active users exceeded 2.06 billion and monthly active users reached 3.03 billion, with average revenue per user of $10.63.

By segment, the Family of Apps (including Facebook and Instagram) generated $31.72 billion in revenue, of which advertising accounted for $31.50 billion and other revenue $225 million. Reality Labs, responsible for developing virtual reality and augmented reality products and services, reported revenue of $276 million but a loss of $3.74 billion. Meta forecasts third-quarter revenue between $32 billion and $34.5 billion, a growth rate of around 15%.

In recent developments, Meta confirmed the launch of its new virtual reality headset, Quest 3, in the third quarter of this year, announced a new version of its large language model, Llama 2, which has been adopted by Alibaba, and launched new social products such as Threads. Meta also stated that it continues to invest in data centers and artificial intelligence technologies, expecting spending to keep rising in 2024, which clearly includes investment in the development of metaverse applications.

Meta reiterates its commitment to strengthening AI technology development and reshaping its next-generation infrastructure architecture


Meta announced a next-generation infrastructure overhaul, spanning both hardware and software layers, to strengthen artificial intelligence (AI) development, deploy new technologies more efficiently, and ultimately drive future metaverse applications through AI. The new AI-focused infrastructure includes Meta's first custom chip for executing AI models, a new AI-optimized data center design, and a supercomputer equipped with 16,000 GPUs for accelerated computing.

Meta emphasizes AI as a core element of its products: enhancing personalized experiences, developing safer and fairer products, creating richer experiences, and helping businesses reach the audiences they value most. Furthermore, Meta plans to reshape how its engineers write software through CodeCompose, an internally developed generative AI coding tool, to improve developer productivity throughout the software development lifecycle.

Since establishing its first data center in 2010, Meta has continuously improved its infrastructure, from the Big Sur hardware of 2015 to development of the PyTorch framework, and on to last year's supercomputer designed for AI research. It is currently refining its infrastructure architecture in three main ways:

• MTIA (Meta Training and Inference Accelerator): MTIA is Meta's first internally developed custom accelerator chip series, designed for inference workloads. Built for internal workloads, MTIA offers better compute performance and processing efficiency than CPUs; deploying MTIA chips alongside GPUs can improve per-task performance, reduce latency, and increase processing efficiency.

• Next-generation data centers: Meta's next-generation data center design supports existing products while also enabling training and inference for future AI hardware. The new data centers will be optimized for AI, supporting liquid-cooled AI hardware and high-performance AI networks that connect thousands of AI chips into data-center-scale AI training clusters. They are also expected to be faster and cheaper to build, and will complement other new hardware such as the MSVP (Meta...), an ASIC solution Meta developed internally to support the ever-growing audio and video content market.
