"Search" is dead. What is the new paradigm?

post by shminux · 2022-12-23T10:33:35.596Z · LW · GW · 9 comments

For over two decades, internet search (largely synonymous with Google for most of that time) has been the main way to augment the human brain with digital information outside of it. First virtual assistants and now LLMs are challenging and changing the way people access, process, and use data. The experience is becoming more like true capability augmentation: still clunky and unreliable, but improving at a dizzying rate. In particular, it is becoming dramatically easier to create rather than merely consume, though consumption is greatly facilitated, too.

Assuming we are still a long way from the prophesied AI doom, what might be a better name for the way people interact with information?

Comments sorted by top scores.

comment by tricky_labyrinth · 2022-12-23T21:46:11.905Z · LW(p) · GW(p)

Related: it seems some search engines are already integrating LLMs:
- One approach is directly providing links; see https://metaphor.systems, brought up yesterday at https://www.lesswrong.com/posts/rZwy6CeYAWXgGcxgC/metaphor-systems [LW · GW]
- Another is LLM summarization of search-engine-provided links; see https://you.com/search?q=what+was+the+recent+breakthrough+in+fusion+research%3F for an example (a rough sketch of this pattern is below)
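
For concreteness, the second pattern (retrieve links with a conventional search engine, then have an LLM summarize them) might look roughly like the following minimal sketch. The `fetch_search_results` stub is hypothetical, standing in for whatever search backend is used, and the OpenAI completion call is just one example of a text-completion model (pre-1.0 `openai` client); none of this is claimed to match how you.com actually works.

```python
# Rough sketch of "LLM summarization of search-engine-provided links".
# `fetch_search_results` is a hypothetical stand-in for a real search backend;
# the OpenAI call is only an illustrative choice of model (openai<1.0 API).
import openai


def fetch_search_results(query: str) -> list[dict]:
    """Hypothetical search backend returning [{'title', 'url', 'snippet'}, ...]."""
    raise NotImplementedError("plug in a real search API here")


def answer_with_citations(query: str) -> str:
    results = fetch_search_results(query)[:5]  # keep the prompt short
    context = "\n".join(
        f"[{i + 1}] {r['title']} ({r['url']}): {r['snippet']}"
        for i, r in enumerate(results)
    )
    prompt = (
        "Answer the question using only the sources below, citing them as [n].\n\n"
        f"Sources:\n{context}\n\nQuestion: {query}\nAnswer:"
    )
    response = openai.Completion.create(
        model="text-davinci-003", prompt=prompt, max_tokens=256, temperature=0
    )
    return response.choices[0].text.strip()
```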

Replies from: shminux
comment by shminux · 2022-12-23T22:17:36.624Z · LW(p) · GW(p)

For many queries, Google has for some time been offering an answer rather than a link: calculations, graphs, facts, etc. It is becoming more of an answer engine than a search engine, though rather slowly. I assume Google is now working furiously to catch up with other LLM UIs, and it is in a good position to do so if it lets go of the Search mentality.

Replies from: Algon
comment by Algon · 2022-12-23T22:42:37.234Z · LW(p) · GW(p)

I think I read a thread somewhere saying that Google has a lot of tooling built and many teams already dedicated to integrating LLMs into its products. But apparently the economics don't make sense at the moment: the cost of using these models would need to come down by 1-2 OOM before they'd deploy anything. And that seems plausible? Like, I haven't done a detailed analysis, but Davinci is at around $0.1 per 1000 words, which sounds way too high for augmenting search.

On the other hand, I expect that few people will need Gopher-like models. The mythical average person probably wants to hear what's new about celebrity X, or get a link to a YouTuber's channel, and so on. When they need a link to a Wikipedia page, or an answer to a pub quiz question, I suspect GOFAI is enough. So maybe cost is slightly less of an issue if only 1 in 10 to 1 in 100 queries needs these models at all.
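
To make the order of magnitude concrete, here is a minimal back-of-the-envelope sketch in Python. The per-word price is the figure quoted above; the answer length, global query volume, and the fraction of queries that actually need an LLM are illustrative assumptions, not data.

```python
# Back-of-the-envelope check of the cost claim above.
# Assumptions (illustrative only):
#   - ~$0.10 per 1000 words of LLM output, as quoted above
#   - ~100 words of generated text per augmented query
#   - ~8.5e9 searches per day (commonly cited 2022 figure for Google)
#   - the LLM is invoked on anywhere from all queries down to 1 in 100

cost_per_word = 0.10 / 1000      # dollars per generated word
words_per_query = 100            # assumed length of an LLM answer
searches_per_day = 8.5e9         # assumed global query volume

for llm_fraction in (1.0, 0.1, 0.01):
    daily_cost = searches_per_day * llm_fraction * words_per_query * cost_per_word
    print(f"LLM on {llm_fraction:>4.0%} of queries: ~${daily_cost / 1e6:,.1f}M per day")
```

Under these assumptions, routing only 1 in 10 to 1 in 100 queries to the model shaves 1-2 OOM off the bill, which is the same gap as the hoped-for price drop.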

comment by Mateusz Bagiński (mateusz-baginski) · 2022-12-23T11:16:05.527Z · LW(p) · GW(p)

"GPT" may become a verb just like "google" did. Although it has one or two syllables two many, so would probably get shortened to something like "jeept".

Replies from: TheMcDouglas
comment by CallumMcDougall (TheMcDouglas) · 2022-12-26T16:17:27.378Z · LW(p) · GW(p)

Or "prompting" ? Seems short and memorable, not used in many other contexts so its meaning would become clear, and it fits in with other technical terms that people are currently using in news articles, e.g. "prompt engineering". (Admittedly though, it might be a bit premature to guess what language people will use!)

Replies from: mateusz-baginski
comment by Mateusz Bagiński (mateusz-baginski) · 2022-12-26T18:29:53.056Z · LW(p) · GW(p)

Maybe, though "prompting" refers more generally to giving prompts in order to get the right kind of response/behavior from the LLM, not necessarily to using it as a smarter version of a search engine.

comment by Jon Garcia · 2022-12-23T21:05:33.633Z · LW(p) · GW(p)

"Let me see what Chatty thinks," (or whatever humanesque name becomes popular).

I assume people will treat it just like talking to a very knowledgeable friend. Just ask a question, get a response, clarify what you meant or ask a followup question, and so on. Conversation in natural language already comes naturally to humans, so probably a lot more people will become a lot more adept at accessing knowledge.

And in future iterations, this "friend" will be able to create art, weave stories, design elucidating infographics, make entertaining music videos, teach academic subjects, try to sell you stuff (hmm), spread conspiracy theories (oops), etc., based on the gist of what it thinks you're looking for (and based on what it knows about you personally from your history of "friendship" with it). It would be nice if we could make it truthful and cooperative in a way that doesn't amplify the echo chamber effect of existing social media and search engines, but unfortunately, I don't see that as being very profitable for those deploying it.

comment by Algon · 2022-12-23T10:44:52.048Z · LW(p) · GW(p)

Chat? That is how most people will use it, I imagine.

EDIT: It is still early days though, and the shape of things is unclear. What will be the most popular use cases, the ones that stand out in people's minds as what you use LLMs for? I don't know yet, so any naming seems premature.

Replies from: GuySrinivasan
comment by GuySrinivasan · 2022-12-23T13:41:21.596Z · LW(p) · GW(p)

Ramble your question into a mic, get a good coherent answer. I will hate using audio. Newer generations will not even notice.