Google is planning to update Assistant with features powered by generative AI, according to a report from Axios. In an email obtained by the outlet, Google tells staff members that it has already started exploring a “supercharged” Assistant powered by the newest large language models (LLMs), similar to the technology behind ChatGPT and Google’s own Bard chatbot. According to the email, “A portion of the team has already started working on this, beginning with mobile.”
As part of this change, Google says it’s condensing the team that works on Assistant. The email obtained by Axios states that the company is “eliminating a small number of roles,” although it’s unclear how many employees are affected. According to Axios, Google laid off “dozens” of workers. The Verge reached out to Google to confirm this, and we’ll update this article if we get more information.
“We remain deeply committed to Assistant and we are optimistic about its bright future ahead,” Peeyush Ranjan, the vice president of Google Assistant, and Duke Dukellis, the company’s product director, write in the email.
Possibilities with Generative AI
While Google doesn’t elaborate on what kinds of features it plans on bringing to Assistant, there are some pretty big possibilities. For example, Assistant could tap into the same technology that powers its AI chatbot, Bard, possibly allowing it to answer questions based on the information it gleans from across the web.
Optimizing User Experiences
“Hundreds of millions of people use the Assistant every month and we’re committed to giving them high-quality experiences,” Google spokesperson Jennifer Rodstrom says in a statement to The Verge. “We’re excited to explore how LLMs can help us supercharge Assistant and make it even better.”
It’s still not clear when Google plans on bringing this technology to its smart home products, though. And I don’t think a lot of people (myself included) would be entirely comfortable with that, given the potential privacy implications.