>>215347967
>Ezra Klein: Eliezer Yudkowsky, welcome to the show.
Eliezer Yudkowsky: Thanks for having me.
>So I wanted to start with something that you say early in the book, that this is not a technology that we craft; it’s something that we grow. What do you mean by that?
Well, it’s the difference between a planter and the plant that grows up within it. We craft the A.I. growing technology, and then the technology grows the A.I.
For the central, original large language models, before a bunch of the clever stuff they're doing today, the central question is: What probability have you assigned to the true next word of the text? As we tweak each of these billions of parameters (actually, it was just millions back then), does the probability assigned to the correct token go up? And this is what teaches the A.I. to predict the next word of text.
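To make that concrete, here is a minimal toy sketch of the training signal he is describing, written in PyTorch. The tiny model, the vocabulary size and the token ids are invented for illustration and are nowhere near a real transformer; the point is only the loop: score every possible next token, look at the probability given to the one that actually came next, and tweak all the parameters so that probability goes up.

import torch
import torch.nn as nn

vocab_size, embed_dim = 100, 32            # toy sizes; real models are vastly larger
# stand-in for a transformer: map a token to scores ("logits") over every possible next token
model = nn.Sequential(nn.Embedding(vocab_size, embed_dim), nn.Linear(embed_dim, vocab_size))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

context = torch.tensor([17])               # the current token (a real model sees the whole prefix)
true_next = torch.tensor([42])             # the word that actually came next in the training text

logits = model(context)
loss = nn.functional.cross_entropy(logits, true_next)    # -log P(true next token)
p_before = logits.softmax(-1)[0, 42].item()

loss.backward()                            # which way should each parameter move to raise that probability?
optimizer.step()                           # tweak them all a little in that direction

p_after = model(context).softmax(-1)[0, 42].item()
print(f"P(correct token): {p_before:.4f} -> {p_after:.4f}")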
Even on this level, if you look at the details, there are important theoretical ideas to understand there: It is not imitating humans. It is not imitating the average human. The actual task it is being set is to predict individual humans.
Then you can repurpose the thing that has learned how to predict humans to be like: OK, now let’s take your prediction and turn it into an imitation of human behavior.
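A sketch of that repurposing step, again with an invented toy model standing in for the trained predictor: imitation is just repeatedly sampling from the predicted next-token distribution and feeding the sample back in.

import torch
import torch.nn as nn

vocab_size, embed_dim = 100, 32
# stand-in for the trained predictor from the sketch above
model = nn.Sequential(nn.Embedding(vocab_size, embed_dim), nn.Linear(embed_dim, vocab_size))

def generate(model, prompt_tokens, n_new=10):
    tokens = list(prompt_tokens)
    for _ in range(n_new):
        # toy conditioning on the last token only; a real model sees the whole prefix
        logits = model(torch.tensor([tokens[-1]]))
        probs = logits.softmax(-1)                        # the learned prediction
        next_token = torch.multinomial(probs, 1).item()   # sample it: prediction becomes behavior
        tokens.append(next_token)
    return tokens

print(generate(model, [17]))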
And then we don’t quite know how the billions of tiny numbers are doing the work that they do. We understand the thing that tweaks the billions of tiny numbers, but we do not understand the tiny numbers themselves. The A.I. is doing the work, and we do not know how the work is being done.
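You can see what he means by printing the raw parameters of even the toy model in the sketches above: the optimizer that produced them is a few lines of well-understood code, but the numbers themselves are just numbers, with nothing legible written on them.

# the optimizer is understood; the numbers it produced are not
for name, p in model.named_parameters():
    print(name, tuple(p.shape), p.detach().flatten()[:5].tolist())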