Apple is set to bring its artificial intelligence capabilities to the Vision Pro headset, according to Bloomberg’s Mark Gurman. The move is part of Apple’s broader strategy to integrate its recently unveiled suite of AI features, known as Apple Intelligence, across its product lines, including iPhones, iPads, and Macs.
The suite includes features such as an enhanced Siri, proofreading tools, and personalized emojis, all part of Apple’s push to weave AI throughout its ecosystem.
Why Is Vision Pro’s AI Integration Delayed?
While introducing AI to the Vision Pro seems a logical step, the headset’s high cost and currently narrow market reach pose challenges. For all its impressive features, the Vision Pro remains an unusually pricey device with a limited audience so far.
Gurman notes that Apple Intelligence will not debut on the Vision Pro this year. The delay stems from the need to adapt these features for a mixed reality environment: Apple’s main challenge is rethinking how they should appear and function in a headset rather than on a traditional screen like a MacBook’s or an iPhone’s.
Improving Vision Pro Demos
In addition to its AI plans, Apple is changing how the Vision Pro is showcased in retail stores. Potential customers will now be able to view their own media through the headset, enhancing the demo experience, and Apple is swapping the demo headband from the Solo Knit Band to the Dual Loop Band to improve comfort during demonstrations.
Adding to Apple’s spatial computing ambitions, analyst Ming-Chi Kuo anticipates, based on his latest supply chain analysis, that Apple plans to start mass production of AirPods equipped with infrared cameras by 2026. These next-generation AirPods are expected to support new spatial audio experiences and gesture controls, particularly when paired with the Vision Pro.
Featured Image courtesy of Hadrian/Shutterstock