It won’t admit it, but Apple made a virtual reality headset


Mark Zuckerberg’s “metaverse”: Apple’s move into “spatial” computing and the emergence of augmented and mixed reality

Apple ships 200 million of its marquee iPhones a year. But the iPhone wasn’t an immediate sensation, with sales of fewer than 12 million units in its first full year on the market.

According to one analyst’s estimate, Apple will sell about 150,000 headsets in its first year on the market and 1 million in the second.

Magic Leap, a startup that caused quite a stir with previews of a mixed-reality technology that could conjure the spectacle of a whale breaching through a gymnasium floor, has shifted its focus to industrial, health care and emergency uses after it had trouble marketing its first headset to consumers.

Microsoft also has had limited success with HoloLens, a mixed-reality headset released in 2016, although the software maker earlier this year insisted it remains committed to the technology.

So far, virtual, augmented and mixed reality have struggled to catch on. Some of the gadgets deploying the technology have even been openly mocked, most notably Google’s internet-connected glasses released more than a decade ago.

Although Meta’s virtual reality headset is the top-selling device in the category, the metaverse remains a ghost town; the hardware is mostly popular with video game players looking for even more immersive experiences. Cook and other Apple executives avoided referring to the metaverse in their presentations, instead describing the Vision Pro as the company’s first leap into “spatial computing.”

“Metaverse” is how Mark Zuckerberg describes these alternate realities. It’s a geeky concept that he tried to push into the mainstream by changing the name of his social networking company to Meta Platforms in 2021 and then pouring billions of dollars into improving the virtual technology.

How much can the Apple Vision Pro do? A look at video calling, face-to-face meetings, and presentation creation

One area where the Apple Vision Pro looks like it will excel is video calling. FaceTime looks slick, with virtual app sharing and a room-filling interface that expands as life-size people join the call. Microsoft and Meta have both been working on immersive meetings, but Apple’s demos were about consumption, not creation; even Apple admitted as much. Consumption is powerful for many things, such as viewing a movie together, sharing pictures and videos, and reviewing a presentation. Reviewing a presentation is still work, but what happens when you want to make edits? Again, we don’t know.

Analysts are not expecting the Vision Pro to be a big hit right away. Most people won’t see a compelling reason to wear something wrapped around their face for an extended period of time, and the hefty price won’t help.

Despite such skepticism, the headset could become another milestone in Apple’s lore of releasing game-changing technology, even though the company hasn’t always been the first to try its hand at making a particular device.

Although Vision Pro won’t require physical controllers that can be clunky to use, the goggles will have to either be plugged into a power outlet or a portable battery tethered to the headset — a factor that could make it less attractive for some users.

Apple on Monday revealed a new device that connects users to both the virtual and the real world, as the company tries to popularize a technology that others have failed to use to capture the public’s imagination.

The company said it drew upon decades of product design while working on the Vision Pro, an effort that involved more than 5,000 patents.

Apple’s lineage of breakthroughs dates back to a bow-tied Jobs peddling the first Mac in 1984, a tradition that continued with the iPod in 2001, the iPhone in 2007, the iPad in 2010, the Apple Watch in 2014 and AirPods in 2016.

I think Apple nailed text legibility here and made the environment compelling to use as a mobile workspace, but $3,499 is a lot to pay for a giant virtual workspace and TV screens.

The technology felt almost like a tease, according to one analyst. “It looked like the beginning of a very long journey.”

The initial reviews were mixed, and skeptics wondered if Apple could make virtual reality anything more than a niche technology. But if any company can take it mainstream, it is Apple, with two billion users of its products.

The companies differ on how we should use these devices. Meta wants you to live in a computer on your face; Apple wants to put virtual computers into your world.

What’s perfect for the office or when you’re working remote? The case Apple made for the Vision Pro headset during its WWDC 2023 keynote

At the end of the demonstration, Joanna Stern of The Wall Street Journal took off the headset and was left with a couple of thoughts, among them: “Did I just do drugs?”

These are difficult times for virtual reality. Enthusiasm for virtual worlds, often called the metaverse, rose during the pandemic but waned as lockdowns eased. Metaverse-related start-ups have raised less money in the first five months of this year than in the same period last year.

We’ve seen Apple struggle to adapt the iPad for creation over the years, even after the company blurred the lines with the iPad Pro — a hybrid device much like the Surface Pro that blends laptop and tablet. Apple spent most of its time during the iPad Pro announcement in 2015 demonstrating productivity apps like Office and Photoshop, with a focus on professionals getting work done. Almost 10 years later, I still grab a laptop when I want to get work done because iPad apps and the OS still haven’t quite caught up to macOS or Windows for multitasking and creation.

“This powerful combination of capabilities makes Apple Vision Pro perfect for the office or for when you’re working remote,” said Allessandra McGinnis, a product manager for Apple Vision Pro, during Apple’s WWDC 2023 keynote. We didn’t really see just how powerful these capabilities are or how well the voice, eye, and hand gestures let you control and manipulate documents. Instead, Apple showed a 10-second demo of team collaboration on a document from the headset wearer’s point of view. But it was just a static document, and we didn’t see how you can interact with or create a document. What’s perfect for the office about this? We don’t really know yet.

In fact, it looks like you’ll need a physical keyboard and mouse for that precise type of control on the Vision Pro, and developers will need to adapt their apps for the headset’s new input methods. Apple demonstrated using the Magic Trackpad and Magic Keyboard with the device when you want to type lengthy emails or peruse a spreadsheet. The Vision Pro is also compatible with Mac computers: you can bring your Mac’s screen into the headset as a private 4K display and run it alongside the headset’s own apps.

The 3D content shown inside the headset didn’t come from people creating it there; it was dragged and dropped in from Messages. There was a brief demo of a virtual keyboard being used to send a message, but not the complex kind of “pro” interactions for text, document, and image manipulation, using just your voice, hands, and eyes, that we’ve come to expect from pro devices with a traditional mouse and keyboard attached.

That’s probably because the “pro” label has long since lost its meaning across the industry after those early MacBook Pro days. Apple adopted the moniker for the iPhone 11 Pro in 2019, after other phone makers had started using “pro” names. I wish I could tell Chaim Gartenberg what it meant to be a pro when he asked that question more than four years ago.

The MacBook Pro was one of Apple’s first Macs to switch to Intel, announced alongside an Intel-powered iMac that was targeted more at consumers with a built-in iSight camera, DVD burning capabilities, and a bundle of digital lifestyle apps. The MacBook Pro was about justifying the switch to Intel for power and performance per watt. Steve Jobs stood onstage and even showed off SPECint benchmarks for CPU integer processing power during the announcement. Apple didn’t use any benchmarks to justify its “pro” label on the Vision Pro.

Every successful Apple product of the past 20 years has disappeared into our lives at one point or another, like the iPhone into our pockets, the Apple Watch onto our wrists and the AirPods into our ears. Wearing the Vision Pro for hours on end will call into question what it means to compute, but also what it means to live in the real world. My forehead felt cool when I took the Vision Pro off after around 30 minutes, a testament to Apple’s considerate design. But my face also breathed with relief, the way it has after using other heads-up displays. The air feels more real out here.

Mixed-reality headsets offer a chance to create a very realistic human experience. My demo didn’t achieve that. In addition to capturing your face in digital form, the cameras inside the headset can give you a hyper-realistic digital twin to use when you chat. In my FaceTime demo I chatted with the digital twin of an Apple employee who cheerfully talked me through some of these features. But she felt disembodied. She was real, but she wasn’t. I have no recollection of her name.

Apple apps in a virtual dock in augmented reality: what I saw in Photos and Safari, and what was missing

The interesting part was my interaction with the apps. I opened Photos by pinching my forefinger and thumb together, scrolled through photos by “grabbing” each image and swiping to the left, and expanded panoramic photos by staring at the “Expand” option and tapping. I scrolled 2D web pages in Safari using my eyes and a couple of fingers. Audio interactions aren’t ready yet, so I was unable to record or send a message. Most of the content I saw wasn’t fully volumetric, nor could I pinch the apps to scale them up or bring myself into them. An Apple representative said, though, that app makers will be able to build these experiences in the future.

In home mode, a virtual dock of Apple apps floated in front of me, and I could still see my real-life living room surroundings. The home screen for Apple apps in augmented reality is just as boring as it sounds. The app containers themselves were certainly not reinvented, and their icons were not little grabbable globules or anything else that conferred volume. They were just there.

The Vision Pro interface is intuitive—within a few gestures and taps on the digital crown, I had it down. External cameras obviate the need for hand controllers, because the device sees your hands. And internal eye-tracking cameras see where your eyes are looking, so it knows which app you want to open or close.

Source: https://www.wired.com/story/apple-vision-pro-hands-on-demo/

An external battery pack, a still-hefty headset, and a calibration process

I assumed this external battery pack meant the headset itself would feel as light as a feather, but it still felt hefty. Once I adjusted my back strap, I proceeded to do another calibration process, during which I heard an audible chime of approval. (Still, a light orb appeared in the middle distance throughout my demo.)

“People’s tolerance for wearing something on their head for an extended period of time is limited,” says Leo Gebbie, a VR analyst at CCS Insight. “If it is something that people will wear all day, it needs to be light and comfortable. No one has really achieved that just yet in the VR world.”

Also, the screens we already use every day aren’t totally reliable. You’ve probably had the experience where you want to snag a photo or video of something, so you launch your phone’s camera app, only to see the image stutter or the app crash. Now imagine that happening with your entire field of vision.

I was struck by the words Cook used when he announced the new Vision Pro headset: “So today, I’m excited to announce an entirely new AR platform with a revolutionary new product.”

During the keynote, Alan Dye, Apple’s VP of human interface design, said that the company intended to avoid a common virtual reality cliché by making sure that you are never isolated from the people around you. You don’t say? The Vision Pro’s EyeSight feature, which shows the eyes of the person wearing the headset on the device’s outer screen, could be a passable way to simulate eye contact. When I’m wearing Meta’s Quest 2, anyone in the same room as me can’t look me in the eye unless I physically remove the headset from my face.

A day in the life with the headset: FaceTiming my family and counting my breaths

The year is 2025. I’m sitting on the couch next to my husband, who appears in my field of vision whenever we talk to each other. He fades out of view so I can pay more attention to the screens in front of me. I am editing a presentation for work, FaceTiming with my family across the country, learning Spanish, and watching cooking demos on YouTube, all at the same time.

I get a notification: it’s time for my daily meditation. The screens darken, and the room around me fades away completely so I can focus on a calming animation and count my breaths. When I’m done, the session is automatically logged in my journal, along with my daily mood: pleasant. Just like yesterday. The FaceTime call comes back into focus; my family members are laughing at something funny my mom said. I’m not sure what it was, but my avatar laughs along with theirs anyway.

Source: https://www.theverge.com/23751675/apple-vision-pro-vr-headset-ios-17-mental-health-mood-journal

Self-centered reality: what a $3,499 screen strapped to your face might cost you

Even if the Vision Pro isn’t supposed to take you out of reality, it puts you at the center of a screen-filled world, and it is a more isolating experience than holding a 6-inch screen in your hand. How do you show someone a funny TikTok you just watched in your headset? How do you watch a cute video you took of your kid with your spouse? How do you show everyone on your FaceTime call that your cat just jumped into your lap? Won’t Apple please think of the pet moments we’re going to miss?

To be clear, I don’t want to declare this product — which is still many months away from shipping — a disaster for society or anything like that. Even if it takes off, nobody will come to my house and replace my phone with a headset, and anyone who doesn’t spend nearly $3,500 on one can simply opt out of the whole experience.

But I also can’t help noticing the juxtaposition between the Vision Pro’s very self-centered nature and Apple’s simultaneous push for better tools to manage mental health. In the very same keynote in which it announced the Vision Pro, Apple revealed a couple of new features for iOS 17 to help people understand their own emotions and moods. You’ll be able to log your daily mood and moment-to-moment emotions in the Health app, and you’ll also be able to access a standard survey that health professionals use to screen for depression and anxiety.

There’s also a new journaling app, which can automatically prompt you to stop, reflect, and write a journal entry based on things you’ve recently photographed or places you’ve been. It’s probably not a bad thing to take a break from the constant pressure to share your photos, videos, and thoughts with the world and just write something for yourself.

For years, Apple has worked on ways to track and reduce our screen time. If we already have a hard time putting our phones down, how hard will it be to peel ourselves away from TikTok when we’re actually wearing the screen? Apple is willing to outfit us with a few flimsy tools to help us keep healthy relationships with our phones, but it’s also willing to sell us a screen to literally keep strapped to our faces.