Google confirms, and partly explains, the reason

Large language models (LLMs) require a lot of RAM to run well.

Pixel 8 owners can now activate Gemini Nano from the developer settings.

We got confirmation of this when Google announced in early March that only the Pixel 8 Pro, with its 12GB of RAM, would get Gemini Nano. The reason is that its smaller sibling has 4GB less RAM. So the answer is yes: it is a RAM issue. Google has since partially reversed course by allowing Pixel 8 owners to enable Gemini Nano from the developer settings.

In any case, this is bad news for the millions of Android phones with 8GB of RAM or less: depending on the manufacturer, you may not get the full AI experience. The big question now is what Apple can achieve, since the company's current top model has "only" 8GB of RAM.

The challenge is that Google wants certain AI functions to stay resident in RAM at all times, so the user can call them up without waiting. Smart replies in messages are one example. "The Pixel 8 Pro with 12GB of RAM was a perfect place for us to put Gemini Nano and see what we could achieve. When we looked at the Pixel 8, which has 4GB less RAM, it was not such an easy call to simply say, 'Well, we'll enable it on the Pixel 8 too,'" says Siang Zhao, VP of Device Software and Services. Reportedly, that is because it would have "degraded the experience" too much.
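To make the trade-off concrete, here is a minimal Kotlin sketch of the idea behind keeping a model resident in memory: the expensive load happens once, and every later smart-reply request is answered from the copy already sitting in RAM. All names here (OnDeviceModel, ResidentModel, suggestReplies) are illustrative stand-ins, not Google's actual AICore or Gemini Nano API, and the canned replies and simulated load time are placeholders.

```kotlin
import kotlin.system.measureTimeMillis

// Stand-in for an on-device language model. The real Gemini Nano would hold
// several gigabytes of weights in memory once loaded; a sleep fakes that cost here.
class OnDeviceModel {
    init {
        Thread.sleep(2_000) // simulate the slow, one-time load of model weights into RAM
    }

    // Placeholder smart replies; a real model would generate these from the message.
    fun suggestReplies(message: String): List<String> =
        listOf("Sounds good!", "On my way.", "Can we talk later?")
}

// The model is loaded lazily, once, and then kept resident for the life of the process,
// mirroring the "always in RAM" behaviour described above.
object ResidentModel {
    val model: OnDeviceModel by lazy { OnDeviceModel() }
}

fun main() {
    // The first request pays the load cost; later requests hit the copy already in RAM.
    val cold = measureTimeMillis { ResidentModel.model.suggestReplies("Dinner at 7?") }
    val warm = measureTimeMillis { ResidentModel.model.suggestReplies("Running late, sorry!") }
    println("cold request: $cold ms, warm request against the resident model: $warm ms")
}
```

The same reasoning explains Google's hesitation: on a 12GB phone, keeping the model resident is an acceptable fixed cost, while on an 8GB phone pinning several gigabytes would squeeze everything else the system is running.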

Hanisi Anenih
