
Maximum subjectively preferred loudness level

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,314
Location
UK
Perhaps when it starts citing references we'll then be able to check.
You may be missing my earlier post. There are no references when writing code, nor are there for passing the bar or MBA exams. In those cases, where the procedure is what matters, ChatGPT delivers. It is not the knowledge but the use of the knowledge...
 

Axo1989

Major Contributor
Joined
Jan 9, 2022
Messages
2,820
Likes
2,816
Location
Sydney
You may be missing my earlier post. There are no references when writing code, nor are there for passing the bar or MBA exams. In those cases, where the procedure is what matters, ChatGPT delivers. It is not the knowledge but the use of the knowledge...

ChatGPT doesn't have semantic knowledge. It is built on patterns of proximity between word-tokens in a very large corpus of text. Per my previous link:

... McCoy and Ullman explain that “the word ‘dog’ is represented as the vector [0.308, 0.309, 0.528, −0.925, ….]”. If you plot that into a coordinate system, then words that often co-occur with “dog” in the training data will be positioned close to “dog”. This “map” of how words are related to each other is also called the “vector space” or “latent space” or even just “space”.
Once GPT-3 is trained, it doesn’t “know” anything about its training data any more. All it knows is those coordinates. Dog is [0.308, 0.309, 0.528, −0.925, ….], and that …. stands for a lot more numbers. It also knows what other words (or tokens) “dog” is close to. All those tokens and their coordinates across billions of different parameters make up the “latent space” of the model.

If you think about it only in terms of how humans understand semantic concepts, then your analysis will be orthogonal to the operation of the generative text tool.
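The "closeness" in that quote is usually measured with cosine similarity between vectors. Here is a minimal sketch of the idea, using made-up 4-dimensional vectors (real models use hundreds or thousands of dimensions, and the values below are purely illustrative):

```python
import math

# Toy embeddings: each word maps to coordinates in a "latent space".
# These values are invented for illustration only.
embeddings = {
    "dog": [0.308, 0.309, 0.528, -0.925],
    "puppy": [0.301, 0.315, 0.510, -0.900],
    "carburetor": [-0.740, 0.102, -0.330, 0.450],
}

def cosine_similarity(a, b):
    """How closely two vectors point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Words that co-occur in the training data end up near each other,
# so "dog" scores high against "puppy" and low against "carburetor".
print(cosine_similarity(embeddings["dog"], embeddings["puppy"]))
print(cosine_similarity(embeddings["dog"], embeddings["carburetor"]))
```

The point is that all the model "has" is geometry like this: proximity of coordinates, not semantic understanding.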
 

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,314
Location
UK
ChatGPT doesn't have semantic knowledge. It is built on patterns of proximity between word-tokens in a very large corpus of text. Per my previous link:
Which is what I said: "It is not the knowledge but the use of the knowledge..."
 

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,314
Location
UK
Maybe we are saying the same thing; I didn't get how that statement was meaningful except in vague terms?
I am using "forum language" :)
 