iOS 15 beta hands-on: A surprisingly complete preview


The iOS 15 public beta is live today, which means a larger swath of people can now check out the latest features coming to iPhones later this year. Despite being a beta, it’s surprisingly complete, with most of the coming changes already available. Some of the updates getting the most buzz are the new Focus modes and FaceTime sharing tools, but there are also changes across Messages, Maps, Weather, Safari, Photos and more to check out.

So far, the preview software seems largely stable. But as always with betas, think twice about how willing you are to risk bricking your phone in exchange for early access to new features. Regardless of whether that’s you, we’ve put together a detailed preview of how iOS 15 will work when it launches in the fall.

FaceTime: SharePlay, screen sharing and spatial audio

Though it would have been a lot more helpful during the throes of the pandemic, FaceTime’s new SharePlay feature will still be useful for many of us. Whether you want to watch an episode of Ted Lasso with a long-distance buddy or provide remote tech support to your relatives, SharePlay and screen sharing over FaceTime will make your life a little easier.

A composite of two screenshots showing FaceTime's new control panel and screen sharing feature.

Screenshots of the iOS 15 beta

Unfortunately, my colleague Mat Smith and I had to futz around for ages before we figured out how to SharePlay something. While screen sharing is more straightforward — just press a button at the bottom right of a new control panel at the top of FaceTime calls — SharePlay options only show up when you have a compatible media app open during a chat. Mat and I are seasoned tech journalists, and we still spent some time hunting for a SharePlay-specific button, which would have been the more intuitive approach.

Once we figured it out, things went a little more smoothly. When you try to play an episode or video while on a FaceTime call, a window pops up asking if you want to use SharePlay. From there, you can choose to stream with your caller (or callers), play it only for yourself, or cancel.
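Under the hood, SharePlay hooks into Apple’s new GroupActivities framework, which is why the option only surfaces inside apps that have adopted it. For the developer-curious, here’s a minimal sketch of how a video app might offer an episode for shared playback; the WatchEpisodeActivity type and its fields are hypothetical placeholders, not code from Apple or any real streaming app.

    import GroupActivities

    // Hypothetical activity a video app might define for SharePlay.
    struct WatchEpisodeActivity: GroupActivity {
        let episodeTitle: String

        var metadata: GroupActivityMetadata {
            var meta = GroupActivityMetadata()
            meta.title = episodeTitle
            meta.type = .watchTogether
            return meta
        }
    }

    func startSharedPlayback() async {
        let activity = WatchEpisodeActivity(episodeTitle: "Example Episode")

        // Ask the system whether the user wants to share playback with the
        // FaceTime call or just watch locally -- the same choice the pop-up
        // in the beta presents.
        switch await activity.prepareForActivation() {
        case .activationPreferred:
            // Start the shared session for everyone on the call.
            _ = try? await activity.activate()
        case .activationDisabled:
            // No active call (or sharing declined): play locally instead.
            break
        case .cancelled:
            break
        @unknown default:
            break
        }
    }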

As a reminder, depending on the app, both you and your friend will need subscriptions to watch stuff together on SharePlay. For Apple’s services like TV+ and Music, you’ll both need a subscription or trial. Other streaming apps like HBO Max, Hulu and Disney+ will decide for themselves whether all parties need accounts to watch shows together over SharePlay, but it’s unlikely any of them will let you stream without one.

On our attempts to stream episodes of Mythic Quest and Central Park over SharePlay, though, Mat and I kept getting a failure notice saying “Unable to SharePlay. This title isn’t available to SharePlay with people in different countries or regions.” That’s odd, since both shows are available in both our regions, and it’s a shame you wouldn’t be able to watch with someone abroad. Apple hasn’t said whether this limit will be in place when iOS 15 launches, but if it is, it’ll be disappointing for anyone who was looking forward to SharePlaying with overseas partners, family and friends. We’ll update this article if Apple confirms it either way.

Screen sharing worked better. I was able to show Mat my dubious shopping list on Instagram, though, as with other video chat apps, my camera automatically turned off whenever I shared my screen. When Mat streamed his display, his camera stayed on. We suspect this has something to do with the fact that he was using a more capable iPhone 12 mini while I was on an aging iPhone XR that was burning up from my testing. This is a known issue with SharePlay that’s detailed in the iOS 15 developer beta release notes, so it may get fixed in time.

A composite showing three screenshots of FaceTime's SharePlay feature in the iOS 15 beta.

Screenshots of the iOS 15 beta

Two other FaceTime features are also live in this beta: links to join calls from non-Apple devices and spatial audio. The latter lets you hear each person in a call from the direction where they’re positioned on your FaceTime grid. Since it requires multiple people on the beta to work, I couldn’t fully experience it. I got on a call with Mat and our former colleague Chris Velazco, and while Mat and I were able to hear each other from different directions, Chris wasn’t on the beta and didn’t notice the effect.

I also sent FaceTime web links to Chris, as well as Engadget staffers Nathan Ingraham and Valentina Palladino. The URL brought us to a page that prompted us to enter our names, and as the host I could choose to allow or block each would-be participant. Chris was able to join my call from a non-Apple laptop, while Valentina and Nate went through the browser on their Macs. Meanwhile, I was using an iPhone. Everyone looked and sounded great… to me.

Valentina and Nate couldn’t hear each other until they switched to the FaceTime app on their MacBooks, and Chris couldn’t hear anyone else on the call — all anyone heard was my beautiful voice. (As it should be.) This appears to be an issue with how browsers handle audio input devices, or possibly a bug in the beta.

It’s not yet clear whether quirks like the region-specific SharePlay restriction will carry over to the stable release. But so far, barring some glitches, the updates to Apple’s video calling app appear meaty and potentially very useful.

Focus modes

I’ve spent too much time talking about FaceTime, so I’m going to try to succinctly describe the other iOS 15 features I’ve tested thus far. One of them felt incredibly relevant as I finished this article on deadline: Focus modes. Here, Apple lets you build custom profiles that, when enabled, only allow notifications from specific apps or people.

A composite showing three screenshots of the Focus mode feature in the iOS 15 beta. The first two show shortcuts to enable profiles like Do Not Disturb, Personal, Sleep and Work. The screenshot on the right shows a detailed Settings page for the Work profile.

Screenshots from the iOS 15 beta

Three placeholders are available at the start: Work, Bedtime and Personal. The first time you enable each one, you’ll have to set up which contacts and apps are allowed through. You can also choose to enable your Focus Status so people who try to reach you will see that you’re away when they’re using a compatible app. Developers of messaging apps will have to adopt Apple’s API for this, so that friends who hit you up on, say, Telegram or Facebook Messenger will see your status too.
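For developers who want to support this, Apple exposes a Focus Status API in the Intents framework. Here’s a minimal sketch of how a messaging app might check whether the local user has notifications silenced before relaying that to its own service; it assumes the app has the required focus-status entitlement, and sendStatusToChatService is a made-up stand-in for whatever backend call a real app would use.

    import Intents

    // Minimal sketch: ask whether the local user has notifications silenced,
    // so the app can show an "away" status to contacts. Requires the
    // com.apple.developer.focus-status entitlement in a real app.
    func shareFocusStatusIfAllowed() {
        INFocusStatusCenter.default.requestAuthorization { authorization in
            guard authorization == .authorized else { return }

            // isFocused is optional: nil means the status is unknown.
            let isFocused = INFocusStatusCenter.default.focusStatus.isFocused ?? false
            sendStatusToChatService(silenced: isFocused)
        }
    }

    // Hypothetical helper -- a real app would tell its chat service to show
    // contacts that the user has silenced notifications.
    func sendStatusToChatService(silenced: Bool) {
        print(silenced ? "Show 'notifications silenced' to contacts" : "Clear away status")
    }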

For now, only Apple’s own Messages supports it, and I was able to see below our conversation that Mat had silenced notifications. I sent a message anyway, and the app showed my text was “delivered quietly.” Just like in Slack, you can choose to “notify anyway” so your message breaks through the wall of silence. (I’m not an awful person, so I didn’t; poor Mat had already put up with my relentless testing and FaceTiming all day.)

With each Focus mode, you can also pick a home screen showing just the apps you want. To do so, you’ll first have to create each page as an additional panel on your main screen, then select the relevant one when customizing your Focus mode. I created a barebones page with just four apps and designated it as my Personal screen, then made a different one for Work. Apps can appear on multiple pages — Instagram and Twitter could be placed on every page, for example. When a mode was enabled, I couldn’t see any other page; swiping sideways only showed the App Library and the Today view.

I haven’t spent enough time with the beta to know how useful these customized views will be, but I’m already in love with the ability to pick different notification profiles. You can also set them to activate automatically based on the time of day, your location or app usage. Again, this is something I’ll need to use for more than a few days, but I appreciate the concept. Unfortunately, I haven’t encountered the new notification summaries in the beta yet.

Live Text (aka Apple’s version of Google Lens)

Many other iOS 15 updates are similar to features that competitors already offer, and the most obvious of these is Live Text. This tool scans the photos on your device for words and turns them into text you can actually use, whether it’s copying and pasting a phone number to another app or translating foreign words on a menu. This is basically Apple’s answer to Google Lens, which has been around for years.

A composite showing three screenshots of Apple's Live Text feature through the viewfinder in the Camera app in the iOS 15 beta. The left screenshot shows a small yellow frame focused on the middle of a bottle of green moisturizer; the middle screenshot shows that part of the bottle highlighted, with options above it.

Screenshots of the iOS 15 beta

Similar to Lens, Apple’s version shows a small symbol at the bottom right of each image in the Photos app to indicate it’s found something. Tap that icon and all the characters in the picture are highlighted, and you can select the portions you need. I snapped a picture of my bottle of moisturizer and was able to copy all the words on the label, while URLs were identified as links I could tap through. You can also use Live Text via the Camera app’s viewfinder without snapping a shot: when your phone detects words in the scene, the same icon appears in the bottom right, and tapping it pulls up the snippets Live Text noticed.

So far, this generally performed as expected, though it’s worth noting that, as the name suggests, Live Text only works on images with a fair amount of text in them. Even a photo of my dinner, which included a container of yogurt with a brand name prominently displayed on it, didn’t trigger Live Text. Google Lens, meanwhile, will identify buildings, pets, furniture and clothes in pictures with nary a letter in them.
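Live Text itself isn’t something you drive with code, and Apple hasn’t detailed how it works internally, but developers who want a comparable on-device result have had text recognition in the Vision framework since iOS 13. Here’s a minimal sketch of that route; recognizeText is a hypothetical helper, not part of Live Text.

    import UIKit
    import Vision

    // Minimal sketch: pull recognized lines of text out of a photo using the
    // Vision framework, similar in spirit to what Live Text surfaces.
    func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
        guard let cgImage = image.cgImage else {
            completion([])
            return
        }

        let request = VNRecognizeTextRequest { request, _ in
            let observations = request.results as? [VNRecognizedTextObservation] ?? []
            // Keep the top candidate string for each detected text region.
            completion(observations.compactMap { $0.topCandidates(1).first?.string })
        }
        request.recognitionLevel = .accurate
        request.usesLanguageCorrection = true

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        DispatchQueue.global(qos: .userInitiated).async {
            try? handler.perform([request])
        }
    }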

Maps, Photos and generally tighter integration

Elsewhere in iOS 15 you’ll find updates to Maps, Weather and Photos. In some cities, Apple’s maps look richer and more detailed than before, thanks to meticulous drawings of individual trees, lanes, traffic lights and more. I was able to explore a golf course in San Francisco, as well as the Conservatory of Flowers and the Dutch Windmill in Golden Gate Park, in surprisingly detailed 2D and 3D views. I was disappointed when I zoomed in super close on Penguin Island at the San Francisco Zoo and found no cute little feathered friends, but I guess that’d be too much to ask.

A composite of three screenshots from the Maps app in the iOS 15 beta showing 3D drawings from around San Francisco. Landmarks include the San Francisco Zoo and Penguin Island.

Screenshots of the iOS 15 beta

Memories in Photos has also been updated to give you greater control over who shows up in them and what music plays in the background. You can now edit your pictures’ descriptions to create richer alt text that stays with each image as you forward it to friends. I liked using this to identify people and places in a photo for contacts who are blind or have low vision. Even though I added keywords like “sunset” and people’s names to some pictures’ descriptions, searches for those words in my iPhone’s Spotlight didn’t return those images. It would be a nice touch, but the descriptions don’t appear to be indexed for search yet.

But that’s another update in iOS 15: Spotlight searches will now include your photos in the results, too. It uses Apple’s own machine learning to detect what’s in your library, though, and this is still sometimes inaccurate. I searched for “Cherlynn” and “Sunset” and was shown screenshots with my name in them, plus an image of a red-hot map of New York from the Weather app that Apple thought was a sunset. It’s not perfect, but at least photos are better integrated into Spotlight now.

Another update that provides better integration across iOS is the consolidation of media your friends send you. Apple calls this Shared with You, and items from your recent interactions with each person show up there — pictures that Mat sent me of his adorable baby niece, as well as the screenshots he shared from our FaceTime adventures, all appeared on his contact card in the Phone app.

A composite of two screenshots showing the Weather app in the iOS 15 beta.

Screenshots of the iOS 15 beta

There’s still a ton more to explore, not only in this public beta but in iOS 15 when the final release is ready. The Weather app has new maps that appropriately show just how scorching it’s been in the New York area these last few days, and we still have to test things like Safari extensions on mobile and ID and keys support in Wallet. For now, this has been an intriguing taste of what to expect from the software update. Despite a few snags, it looks like iPhone users will have plenty to look forward to later this year.



