Google has everything it needs to fight ChatGPT


ChatGPT’s ability to answer questions in a direct, conversational manner has led some to proclaim that AI chat will kill the traditional search engine. Google is seriously responding to this and – from what has already been shown – it should be more than capable of competing. The issue is the user experience.

Questions and answers

Fundamentally, Google’s mission to “organize the world’s information and make it universally accessible and useful” can be broken down into two components.

Users ask questions and Google provides the answers. The queries – first keywords, then naturally worded questions – were originally typed into a box and then spoken. Responses started out as links to websites that might have relevant information, but they also evolved.

Google has started providing immediate answers to simpler questions that are more or less facts, using information from databases, listings, and, more often than not, Wikipedia. This shift to direct responses coincides with smartphones and their relatively smaller screens becoming the mainstream device. Then came wearables and other audio-first devices like smart speakers and smart displays.

Other questions cannot be answered as easily, but Google still tries, using something called a Featured Snippet: a direct quote from a website that it thinks will answer your question. In recent years, Google has been criticized for these Snippets from all sides. Sometimes it cites a source that is clearly wrong, while the owners of that content accuse Google of stealing clicks to keep users on Search.

This same kind of complex question is something ChatGPT excels at, generating the answer itself for many queries instead of sending users elsewhere. Early adopters believe the future of search will involve getting direct answers all the time through back-and-forth conversation, with the ability to ask follow-ups. In fact, ChatGPT can also ask clarifying questions about your query as needed. Meanwhile, it can debug code, write essays (with the ability to specify the number of paragraphs), summarize, explain, and much more.

What Google has

LaMDA

Google has been working on the same language model technology that underpins ChatGPT for some time, albeit in a less flashy way. In fact, it has devoted a portion of I/O to its work on natural language understanding (NLU) and large language models for two developer conferences in a row now.

LaMDA (Language Model for Dialogue Applications) is “Google’s most advanced conversational AI yet.” It was introduced at I/O 2021 as able “to talk about any topic,” with the caveat that it was still in the R&D phase. Google’s examples of talking to the planet Pluto and a paper airplane were meant to demonstrate how LaMDA “captured many of the nuances that distinguish open-ended conversation,” including sensible, specific responses that encourage more back-and-forth.

Other qualities that Google wants are “interestingness” (whether the answers are insightful, unexpected, or witty) and “factuality,” or sticking to the facts.

A year later, LaMDA 2 was announced, and Google began letting the public try three specific LaMDA demos through the AI Test Kitchen app.

MUM

In addition to LaMDA, Google highlighted multimodal models that “allow people to naturally ask questions across different types of information” with MUM (Multitask Unified Model). Noteworthy is the example query offered by Google that cannot be answered by a search engine today, but is something this new technology can solve:

I hiked Mt Adams and now I want to hike Mt Fuji next fall, what should I do differently to prepare?

MUM would understand that you are comparing two mountains and that the time range you provided falls in Mt. Fuji’s rainy season, therefore requiring waterproof gear. It could surface articles written in Japanese, where there is more local information, while the most impressive example was tied to Google Lens:

So now imagine taking a picture of your hiking boots and asking, “Can I wear these to hike Mount Fuji?” MUM would be able to understand the content of the image and the intent behind your query, let you know that your hiking boots would work fine, and then point you to a list of recommended gear and a blog about Mt. Fuji.

It was still an exploratory query, but more concretely, Google announced that it’s adding MUM to Lens so you can take a picture of a broken bike part (whose name you don’t know) and get instructions on how to fix it.

PaLM

If MUM allows questions to be asked across a variety of media and LaMDA can sustain conversations, PaLM (Pathways Language Model) is what can answer those questions. It was announced in April and received an onstage mention at I/O. PaLM is capable of:

Question answering, semantic parsing, proverbs, arithmetic, code completion, general knowledge, reading comprehension, summarization, logical inference chains, common-sense reasoning, pattern recognition, translation, dialogue, joke explanations, physics question answering, and language understanding.

It is powered by a state-of-the-art AI architecture called Pathways, which can “train a single model to do thousands or millions of things” compared to the current highly individualized approach.

Into products

When Google announced LaMDA in 2021, Sundar Pichai said its “natural conversational capabilities have the potential to make information and computing radically more accessible and user-friendly.”

Google Assistant, Search, and Workspace were specifically identified as products into which it hopes to “incorporat[e] better conversational features.” Google may also offer “capabilities for developers and enterprise customers.”

In this post-ChatGPT world, many have commented that direct answers could harm Google’s ad-based business model, since people would no longer need to click on links if they already have the answer. In the examples Google has provided, there is no indication that it wants to stop linking to content.

There are major safety and accuracy concerns, which Google has always emphasized when demoing these models. The fact that they “can invent things” seems to be the biggest bottleneck.

Meanwhile, it’s not clear that people want every interaction with a search engine to be a conversation. That said, Google acknowledged internally that the conversational approach “really hits a need that people seem to have.”

Google is reportedly at “code red” over ChatGPT and has reassigned several teams to work on competing AI products and demos. Another technology showcase at I/O 2023 is more than likely, but whether that means LaMDA, MUM, and PaLM will be prominently integrated into Google’s biggest products is up in the air.

In May, Pichai reiterated that “conversation and natural language processing are powerful ways to make computers more accessible to everyone.” From everything the company has envisioned, the ultimate goal is to make Google Search able to answer questions like a human being.

Not surprisingly, Google has the technology to get there, but the company’s eternal challenge is turning R&D into real products, and rushing does not seem wise for a search engine the world needs to be consistently correct.
