Google’s own mobile chip is called Tensor


Rick Osterloh casually dropped his laptop onto the couch and leaned back, satisfied. It’s not a mic, but the effect is about the same. Google’s chief of hardware had just shown me a demo of the company’s latest feature: computational processing for video that will debut on the Pixel 6 and Pixel 6 Pro. The feature was only possible with Google’s own mobile processor, which it’s announcing today.

He’s understandably proud and excited to share the news. The chip is called Tensor, and it’s the first system-on-chip (SoC) designed by Google. The company has “been at this about five years,” he said, though CEO Sundar Pichai wrote in a statement that Tensor “has been four years in the making and builds off of two decades of Google’s computing experience.”

That software expertise is something Google has come to be known for. It led the way in computational photography with its Night Sight mode for low-light shots, and weirded out the world with how convincingly its conversational AI Duplex could mimic human speech — right down to the “ums and ahs.” Tensor both leverages Google’s machine learning prowess and enables the company to bring AI experiences to smartphones that it couldn’t before.

Holding a Diet Coke in one hand and gesturing animatedly with the other, Osterloh threw around hyperbolic marketing language like “We’re transforming the computing experience” and “It’ll be what we consider to be a pretty dramatic transformation overall.”

[Image: Six Pixel 6 and Pixel 6 Pro devices lying on a grey surface at various angles. Credit: Google]

He’s alluding to Tensor enabling experiences that previous chips (the company’s mostly used Qualcomm’s Snapdragon processors in its prior phones) couldn’t deliver. Things like being able to run multiple AI-intensive tasks simultaneously without a phone overheating, or having enough power to apply computational processing to videos as they’re being captured.

That belief in Tensor’s significance is part of why Google chose to announce it today ahead of the Pixel 6’s actual launch in the fall. The company isn’t giving away all the details about the processor yet, nor is it sharing specific information about its latest flagships now. But “there’s a lot of new stuff here, and we wanted to make sure people had context,” Osterloh said. “We think it’s a really big change, so that’s why we want to start early.”

Plus, there’s an added benefit. “Information gets out,” Osterloh added. “Nowadays it’s like, stuff leaks.”

A new chip design with AI infused

Thanks to those leaks, though, we’ve been hearing rumors for a while about Google’s effort to make its own mobile processor, under the code name Project Whitechapel. While the company won’t publicly discuss code names, it’s clear that work on Tensor has been going on for a long time.

The chip’s name is an obvious nod to the company’s open-source machine learning platform, TensorFlow, Osterloh said, and that should tell you how big a role AI plays in this processor. Though Google isn’t ready to share the full details about Tensor yet, Osterloh did explain that the SoC is an ARM chip designed around a TPU, or Tensor Processing Unit. The mobile chip was co-designed with Google’s AI researchers, and its TPU is based on the larger versions in the company’s data centers.
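
Google hasn’t shared Tensor’s programming model, but on-device ML on Android typically runs through TensorFlow Lite, with the interpreter handing work to whatever accelerator the SoC exposes. As a rough, generic sketch of that flow (the model file here is hypothetical, and nothing below is specific to Tensor):

```python
import numpy as np
import tensorflow as tf

# Hypothetical model file; on a phone, a delegate could route
# execution to an on-SoC accelerator instead of the CPU.
interpreter = tf.lite.Interpreter(model_path="scene_classifier.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed a dummy frame shaped like the model's expected input.
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
```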

[Image: A rendering of Google’s Tensor mobile chip. Credit: Google]

It’s not just designed to speed up machine learning tasks on your phone, either. Osterloh said they’ve also redesigned the image signal processor, or ISP. Specifically, he said there are a “few points in the ISP where we can actually insert machine learning, which is new.”

Google also reconstructed the memory architecture to make it easier to access RAM and to allow for data manipulation while processing images. There are also a few places where Osterloh said they’ve directly encoded image processing algorithms into the hardware. He said this allows Google to do “stuff that was previously impossible to do on standard SoCs,” though he didn’t share specifics on what Tensor now enables that previous SoCs couldn’t.

Of course, with this being Google’s first mobile chip, Osterloh concedes people might see the company as unproven. He did push back, though: “I think people are pretty aware of Google’s overall capability.”

It’s natural to wonder if the company can compete in areas like power efficiency and heat management. Osterloh said they’ve designed Tensor to perform some tasks more power efficiently than previous processors they’ve used while staying within a thermal threshold. Similar to existing processors, Osterloh said “the system has a bunch of different subsystems, [and] we can use the most power efficient element of it for the task at hand.”

[Image: Side view of the peach/gold Pixel 6 Pro. Credit: Google]

Though there’s an ongoing global chip shortage, Osterloh is confident that Google can manage demand. “Everyone’s affected by this, no doubt,” he said. “The positive thing about this is it’s under our control, we’re making this and we’re responsible for it. So we think we should be okay.”

Why make Tensor?

So what can Tensor do that other mobile processors can’t? Google is saving most of the juicy bits for the Pixel 6’s launch in the fall. But it did offer two examples of areas that will see dramatic improvement: photography, and voice recognition and processing. At our meeting, Osterloh showed off demos of new Tensor-enabled features on the Pixel 6 and 6 Pro, which also gave us our first look at the phones. The handsets feature a distinctive new look and bright color options, along with a horizontal camera bump that spans the width of the rear. That bump is “a very intentional part of the design,” Osterloh noted. “We’ve really been known for photography, [so] we wanted to really emphasize this.”

Google’s upgraded the cameras themselves, but the promised photography improvements aren’t just from optical hardware; Tensor is behind some of them. With previous chips, the company kept running into limits when trying to improve photography on its phones. “These weren’t designed for machine learning, or AI, and certainly not designed to optimize for where Google’s headed,” he said.

[Image: Three Pixel 6 phones in black, green/blue, and red/peach color schemes. Credit: Google]

So where is Google headed? Towards a world of “ambient computing,” a vision that Osterloh and many of his colleagues have touted in the past. They see a future where all the devices and sensors we’re surrounded by can communicate with Google (sometimes via the Assistant) or the internet. But Osterloh knows that for most people, the most important device is still going to be the smartphone. “We see the mobile phone as the center of that.”

So when Google wanted to improve beyond the limits of contemporary processors, it had to do something different. “What we’ve done in the past, when we encountered these kinds of engineering constraints and technical constraints, is we take on the problem ourselves,” Osterloh said.

Upgrading photo and video processing

With Tensor, the Pixel 6 can do things like concurrently capture images from two sensors, with the main one recording at normal exposure and the wide-angle running at a much faster shutter speed. Osterloh said the system runs a number of different machine learning models in real time to figure out things about the scene, like whether there’s a face in the frame and whether the device is moving or shaking. The Pixel 6 then combines all that information and uses it to process photos, so that if you’re trying to capture a hyperactive puppy or toddler, you’ll be less likely to get a blurry shot.
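
Google hasn’t published the fusion algorithm, but the idea Osterloh describes (blend a well-exposed main-sensor frame with a sharper fast-shutter frame based on motion cues) can be sketched in toy form. Everything below, the weighting scheme included, is invented for illustration and is not Google’s method:

```python
import numpy as np

def fuse_frames(normal_exposure, fast_shutter, motion_score):
    """Toy exposure fusion: lean on the sharp, fast-shutter frame
    when motion is high, and on the well-exposed frame when the
    scene is still. The blend weight is illustrative, not Google's."""
    w = np.clip(motion_score, 0.0, 1.0)  # 0 = static scene, 1 = heavy motion
    return (1.0 - w) * normal_exposure + w * fast_shutter

# Dummy frames standing in for the two sensor streams.
main = np.random.randint(0, 256, (720, 1280, 3)).astype(np.float32)
wide = np.random.randint(0, 256, (720, 1280, 3)).astype(np.float32)
blended = fuse_frames(main, wide, motion_score=0.7)
```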

Tensor will also let Google perform computationally intensive tasks while you’re shooting video. Osterloh said that in the past the company hasn’t been able to apply a lot of machine learning to video, since it would be too taxing for a phone processor. But “that all changes with Tensor,” he said. One thing they’ve been able to run is an HDRnet model on videos, which drastically improves quality in tricky situations like when the camera is pointing at the sun.
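
HDRnet is a published Google research model for learned, real-time image enhancement; the article doesn’t say how the Pixel 6 wires it into the video path. A schematic of the per-frame loop, with the model call stubbed out by a placeholder tonemap:

```python
import numpy as np

def hdr_model(frame):
    # Stand-in for an HDRnet-style learned tonemap; the real thing
    # runs a small neural network on every frame.
    return np.clip(frame ** 0.8, 0.0, 1.0)  # illustrative gamma lift

def process_video(frames):
    # Enhance each frame as it streams in -- the sustained per-frame
    # workload Osterloh says older phone SoCs couldn't handle.
    for frame in frames:
        yield hdr_model(frame)

clip = np.random.rand(30, 360, 640, 3)  # 30 dummy frames in [0, 1]
enhanced = list(process_video(clip))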

Osterloh showed me demos of how the Pixel 6 did both these things, including a before-and-after example of a blurry photo of an active child and video comparisons of a campground at sunset. While there was a clear difference, I unfortunately can’t show you the results. Besides, these were controlled demos from Google. I can’t really gauge how impressive and useful these features are until we get to test them in the real world.

Improvements in voice and speech

I did get to see a more telling preview, though. Osterloh also showed me how voice dictation will work on the Pixel 6 in Gboard. On the upcoming phone, you’ll be able to hit the microphone button in the compose field, narrate your message and use hotwords like “Send” or “Clear” to trigger actions. You can also edit typos via the onscreen keyboard while the mic is still listening for your dictation.

This all works via a new Speech On Device API, and I was impressed that the system was smart enough to distinguish between when you say “Send” in “I will send the kids to school” versus when you’re telling it to send the message. Osterloh told me the algorithm is looking not just for the hotword but also your tone of voice and delivery before it triggers the action.
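
Google hasn’t documented the Speech On Device API publicly, so here is only the shape of the disambiguation problem in toy form; the features and the threshold below are entirely made up:

```python
def classify_token(token, prosody_confidence, trailing_pause):
    """Toy disambiguation of 'Send' as a command vs. dictated text.
    Per Osterloh, the real system weighs tone of voice and delivery;
    this heuristic (isolated word + pause) is purely illustrative."""
    COMMANDS = {"send", "clear"}
    is_command = (
        token.lower() in COMMANDS
        and trailing_pause             # spoken as a standalone utterance
        and prosody_confidence > 0.8   # made-up threshold
    )
    return "command" if is_command else "text"

print(classify_token("Send", prosody_confidence=0.9, trailing_pause=True))   # command
print(classify_token("send", prosody_confidence=0.3, trailing_pause=False))  # text
```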

Finally, there are a couple more things that Osterloh showed me: Live Caption with Translate, as well as Android 12’s Material You design. Thanks to Tensor, Android’s Live Caption feature, which provides subtitles for anything playing through your device’s sound system, will be able to translate what’s being said in real time as well. This all happens on device, so the next time you’re watching a foreign-language TED Talk or your favorite international TV show, it won’t matter if they don’t have subtitles — Tensor will provide.
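
Mechanically, this chains two on-device models per audio segment: a speech recognizer and then a translator. A schematic with both models stubbed out (the phrases and functions are placeholders, not Google’s pipeline):

```python
def transcribe(audio_chunk):
    # Stand-in for an on-device speech recognizer.
    return "hola a todos"

def translate(text, target="en"):
    # Stand-in for an on-device translation model.
    return {"hola a todos": "hello everyone"}.get(text, text)

def live_caption(audio_stream):
    # Chain recognition and translation per chunk; per the article,
    # both steps stay on the device.
    for chunk in audio_stream:
        yield translate(transcribe(chunk))

for caption in live_caption([b"\x00" * 1600]):  # one fake audio chunk
    print(caption)  # -> "hello everyone"
```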

[Image: Android 12 Beta. Credit: Google]

A look at Material You

Meanwhile, Material You, which Google first unveiled at I/O this year, is what Osterloh called the biggest UI change in Android maybe since the beginning. I’ve been waiting to see the feature in the Android 12 public beta, but it’s still not available there. At our meeting, Osterloh showed me how it works: he changed the wallpaper of a Pixel 6 from something rosy-hued to a scene of a body of water, and the system’s icons and highlights quickly updated to match. App icons were repainted as well, but something new I learned from this demo is that if you’re like me and prefer your icons to keep their original colors, you can opt to leave them untouched.
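
Google hasn’t detailed the extraction algorithm, but the core move, deriving a palette from the wallpaper and repainting system accents, can be approximated with Pillow. A toy sketch; the quantization approach is mine, not Android’s:

```python
from PIL import Image

def dominant_color(wallpaper_path, palette_size=16):
    """Reduce the wallpaper to a small palette and return its most
    common color -- a toy stand-in for Material You's extraction."""
    img = Image.open(wallpaper_path).convert("RGB").resize((64, 64))
    quantized = img.quantize(colors=palette_size)
    counts = quantized.getcolors()           # [(count, palette_index), ...]
    _, index = max(counts)                   # most frequent palette entry
    palette = quantized.getpalette()
    r, g, b = palette[index * 3 : index * 3 + 3]
    return (r, g, b)

# accent = dominant_color("wallpaper.png")  # hypothetical file
```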

We’ve gotten a really good look at what’s coming in the fall, though Google is still keeping plenty of details under wraps. We don’t know yet whether Google will say if it had help from other manufacturers in developing Tensor, and details about CPU and GPU cores, clock speeds and other components will be shared later this year. But with the new chip, Google’s been able to realize a years-long dream.

“We kind of see this as The Google Phone,” he said. “This is what we set out to build several years ago and we’re finally here.”
