Quote:
Originally Posted by brainz01
Great discussion and summary of LLM generative AI.
I tend to agree that it's often "interpolation" — figuring out how to string words or pixels together into something resembling reality.
Also, when I say interpolation I don't even mean it figuratively. The model in effect says the answer lies between these two vectors, literally interpolates between them, collapses the result to an index, and whatever word that index points to is the next word.
Oh! Important thing left out. There's a well-known technique called word2vec that maps each word to a vector, and each word in the vocabulary gets a unique index. The raw output of an LLM is a score for each of its roughly 50,000 vocabulary entries; the highest-scoring (or sampled) index is looked up to see what word it is. Vectors -> interpolate -> index -> word. That's it.
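That last lookup step can be sketched in a few lines. This is a toy illustration with a made-up six-word vocabulary and invented scores, not anything from a real model; real vocabularies have ~50,000 entries:

```python
# Toy sketch of the final step of next-word prediction.
# The vocabulary and scores below are invented for illustration.
vocab = ["the", "cat", "sat", "on", "mat", "dog"]

# Pretend these are the model's output scores (logits), one per vocab entry.
logits = [0.1, 2.3, 0.4, 0.2, 1.1, 0.9]

# Pick the index with the highest score...
next_index = max(range(len(logits)), key=lambda i: logits[i])

# ...and look that index up in the vocabulary to get the next word.
next_word = vocab[next_index]
print(next_word)  # -> cat
```

In practice models usually sample from the score distribution rather than always taking the maximum, but the index-to-word lookup at the end is the same.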
Google Translate maps the languages it supports into an n-dimensional shared word space. To convert an English sentence to Basque, it first projects (in the linear-algebra sense) the sentence vector into that universal word space, then projects that universal sentence back down into Basque.
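The "pivot through a shared space" idea can be sketched as two linear maps. Everything here is hypothetical — the matrices are random placeholders and the dimensions are tiny; this only shows the up-then-down projection shape of the computation, not Google's actual system:

```python
import numpy as np

# Hypothetical encoder/decoder matrices (random, for illustration only):
# the encoder maps an English sentence vector into a shared space,
# the decoder maps a shared-space vector into Basque.
rng = np.random.default_rng(0)
enc_english = rng.standard_normal((4, 3))  # English (3-d) -> shared (4-d)
dec_basque = rng.standard_normal((3, 4))   # shared (4-d) -> Basque (3-d)

english_sentence_vec = np.array([0.5, -1.2, 0.3])

shared = enc_english @ english_sentence_vec   # project up into the shared space
basque_sentence_vec = dec_basque @ shared     # project back down into Basque

print(basque_sentence_vec.shape)  # -> (3,)
```

The point is that neither language maps directly to the other; both map to and from the shared space.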
The classic example: the universal vector for king, minus man, plus woman, equals... queen.