
ChatGPT to calculate DIY speakers

OP
sarumbear

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,321
Location
UK
Did you look at the videos? It does screw up calculations.
Did you check the calculations I posted? My post is only about using it for speaker calculations.
 

napfkuchen

Senior Member
Joined
Mar 9, 2022
Messages
331
Likes
460
Location
Germany
I gave two of these problems to ChatGPT with the exact same prompt and the results were correct for me. ;)
But even if they are not, telling ChatGPT it did something wrong helps: it will apologize (which gets annoying quickly...), "rethink" the prompt, and most of the time produce the results I was looking for.
 

pjn

Active Member
Joined
Aug 28, 2021
Messages
176
Likes
167
I'm sorry, Dave. I'm afraid I can't do that.
Just in case someone was wondering -

Are you Hal 9000?


No, I am not Hal 9000. I am ChatGPT, a language model developed by OpenAI. Hal 9000 is a fictional artificial intelligence character from the novel and movie "2001: A Space Odyssey" by Arthur C. Clarke and Stanley Kubrick. While Hal 9000 is a work of fiction, ChatGPT is a real technology that is designed to simulate human conversation and provide information and assistance to users.
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,346
Likes
6,819
Location
San Francisco
posters on this thread blindly decide that because of the underlying technology ChatGPT can't work;
We decided this based on experience using it and finding that it doesn't work. In my case, several hours of trying to get it to work.

It treats quantitative and qualitative information the same way, remember.

I don't know that anyone said it CAN'T work. It just doesn't work as it exists today.

Exactly. It's trained to achieve certain goals, mainly completeness, and mathematical correctness is not one of them. Because it's a language model, not a calculator (and this is the last time I say it), any calculation it performs correctly is more or less a matter of chance, like any other correct answer it provides.

ChatGPT today is not suitable for any work where correctness actually matters. Maybe the next version(s) will be better, I don't know. I think we all agree it's promising but not ready for prime-time, why is there any debate on this?
 
OP
sarumbear

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,321
Location
UK
ChatGPT today is not suitable for any work where correctness actually matters.
Does that include speaker design by a DIY enthusiast, which my post was aimed at?
 

voodooless

Grand Contributor
Forum Donor
Joined
Jun 16, 2020
Messages
10,371
Likes
18,283
Location
Netherlands
Does that include speaker design by a DIY enthusiast, which my post was aimed at?
I don't see how this is more useful than tools like Hornresp or WinISD. They actually show you the response; you can tweak it on the fly and see the result in real time.
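For what it's worth, the core of what those simulators compute for a sealed box fits in a few lines. Below is a minimal sketch using the standard Thiele/Small closed-box relations; the driver parameters are hypothetical, chosen only for illustration:

```python
import math

def closed_box(fs, qts, vas, vb):
    """System resonance and Q of a driver in a sealed box (standard Thiele/Small)."""
    alpha = vas / vb                  # compliance ratio Vas/Vb
    fc = fs * math.sqrt(1 + alpha)    # system resonance frequency
    qtc = qts * math.sqrt(1 + alpha)  # total system Q
    return fc, qtc

def spl_db(f, fc, qtc):
    """Relative level in dB of the 2nd-order high-pass closed-box response."""
    r = (f / fc) ** 2
    mag = r / math.sqrt((1 - r) ** 2 + r / qtc ** 2)
    return 20 * math.log10(mag)

# Hypothetical woofer: Fs = 30 Hz, Qts = 0.40, Vas = 60 L, in a 30 L sealed box
fc, qtc = closed_box(30, 0.40, 60, 30)
print(f"Fc = {fc:.1f} Hz, Qtc = {qtc:.2f}")
for f in (20, 40, 80, 160):
    print(f"{f:>4} Hz: {spl_db(f, fc, qtc):+6.2f} dB")
```

Tools like WinISD and Hornresp add interactive plotting, vented alignments, and excursion checks on top of calculations like this, which is where their real value over a chatbot lies.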
 

xaviescacs

Major Contributor
Forum Donor
Joined
Mar 23, 2021
Messages
1,499
Likes
1,977
Location
La Garriga, Barcelona
ChatGPT today is not suitable for any work where correctness actually matters. Maybe the next version(s) will be better, I don't know. I think we all agree it's promising but not ready for prime-time, why is there any debate on this?
I don't think it's promising at all.

As I've said, the amount of computational power and investment in training makes it seem like more than what it actually is. It's nothing smart, it's brute force. IMO it's a marketing device to convince people to pay to use OpenAI models, nothing more. Someone has decided to loan those people a lot of money under the promise that a lot of data scientists and researchers will start using OpenAI models and that this will make huge profits.

The truth, however, is that those models aren't special at all, and that ChatGPT gets its appearance from being hugely doped, not from being a clever model that changes anything. It's not an advance in the field of statistical learning; it's marketing, that's all.
 

F1308

Major Contributor
Joined
May 24, 2020
Messages
1,058
Likes
910
Before spending anything by following those instructions, would anyone please ask ChatGPT which is heavier, a kilogram of iron or a kilogram of hay?

Don't be surprised when reading the answer.
 
OP
sarumbear

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,321
Location
UK
Before spending anything by following those instructions, would anyone please ask ChatGPT which is heavier, a kilogram of iron or a kilogram of hay?

Don't be surprised when reading the answer.
Both a kilogram of iron and a kilogram of hay weigh the same amount, which is one kilogram. The weight of an object is measured in units of mass such as kilograms, pounds, or grams. Therefore, in this case, both objects have the same weight because they have the same mass. It is important to note that the properties of the materials are different, and their densities, shapes, and sizes can affect how heavy they feel or appear to be when lifted.
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,346
Likes
6,819
Location
San Francisco
Does that include speaker design by a DIY enthusiast, which my post was aimed at?
Yes, for example, I tried to get it to design a crossover and it gave me conflicting instructions. As I've said before, I don't think it's useful unless you already know the answer or have calculated it with another method, in which case why bother?
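For context on how small the calculation being botched actually is: the textbook 2nd-order Butterworth crossover values a DIYer would check against fit in a few lines. This is a sketch assuming an idealized, purely resistive load; the 8 Ω / 2 kHz figures are hypothetical, and real driver impedance varies with frequency, so these are starting points at best:

```python
import math

def butterworth2_crossover(f_c, r_load):
    """Component values for one 2nd-order Butterworth crossover section at
    frequency f_c (Hz) into a purely resistive load r_load (ohms).
    From the normalized Butterworth polynomial s^2 + sqrt(2)*s + 1:
      low-pass (woofer):  L in series, C to ground
      high-pass (tweeter): C in series, L to ground (same values)."""
    w0 = 2 * math.pi * f_c
    inductor_mH = math.sqrt(2) * r_load / w0 * 1e3
    capacitor_uF = 1 / (math.sqrt(2) * r_load * w0) * 1e6
    return inductor_mH, capacitor_uF

# Hypothetical 8-ohm two-way crossed over at 2 kHz
L_mH, C_uF = butterworth2_crossover(2000, 8)
print(f"L = {L_mH:.2f} mH, C = {C_uF:.2f} uF")
```

Note that many DIY designs use 2nd-order Linkwitz-Riley instead (different Q, hence different component values), which is exactly the kind of detail an LLM can silently mix up between one sentence and the next.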

It's nothing smart, it's brute force. IMO it's a marketing device to convince people to pay to use OpenAI models, nothing more.
I think this is a little bit like saying Jimi Hendrix was nothing special, he just played the same instrument louder and in a different way. In my tests with ChatGPT I've seen it do things with natural language processing that I didn't think I would see in this decade. Whether it's mere brute force or something more interesting, it can still take a very wide range of arbitrary inputs and do something useful (or sometimes, almost useful) with them. This is what I think is promising, despite the obvious flaws.
 

F1308

Major Contributor
Joined
May 24, 2020
Messages
1,058
Likes
910
Both a kilogram of iron and a kilogram of hay weigh the same amount, which is one kilogram. The weight of an object is measured in units of mass such as kilograms, pounds, or grams. Therefore, in this case, both objects have the same weight because they have the same mass. It is important to note that the properties of the materials are different, and their densities, shapes, and sizes can affect how heavy they feel or appear to be when lifted.
Is that your answer or ChatGPT's?
Because a friend of mine told me he got a very different, inaccurate answer.
 
OP
sarumbear

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,321
Location
UK
Is that your answer or ChatGPT's?
Because a friend of mine told me he got a very different, inaccurate answer.
I copied and pasted ChatGPT's answer. Asked again, the answer had different wording but was the same correct one.

Both are equally heavy, as they both weigh one kilogram. The unit of measurement "kilogram" refers to a specific amount of mass, and it is a fundamental unit of the metric system used to measure weight and mass. Therefore, a kilogram of iron and a kilogram of hay weigh the same.
 

Emlin

Addicted to Fun and Learning
Joined
Jul 8, 2018
Messages
790
Likes
1,113
I copied and pasted ChatGPT's answer. Asked again, the answer had different wording but was the same correct one.

Both are equally heavy, as they both weigh one kilogram. The unit of measurement "kilogram" refers to a specific amount of mass, and it is a fundamental unit of the metric system used to measure weight and mass. Therefore, a kilogram of iron and a kilogram of hay weigh the same.
I asked Bing a similar question and it said:
A pound of lead and a pound of feathers both weigh the same, about 16 ounces or 0.45 kilograms. Is there anything else you would like to know?

But note that it thinks a pound is only 'about' 16 ounces. So much for precision.
 

tomtoo

Major Contributor
Joined
Nov 20, 2019
Messages
3,712
Likes
4,774
Location
Germany
I asked Bing a similar question and it said:
A pound of lead and a pound of feathers both weigh the same, about 16 ounces or 0.45 kilograms. Is there anything else you would like to know?

But note that it thinks a pound is only 'about' 16 ounces. So much for precision.

Bing? ChatGPT?
 

F1308

Major Contributor
Joined
May 24, 2020
Messages
1,058
Likes
910
I copied and pasted ChatGPT's answer. Asked again, the answer had different wording but was the same correct one.

Both are equally heavy, as they both weigh one kilogram. The unit of measurement "kilogram" refers to a specific amount of mass, and it is a fundamental unit of the metric system used to measure weight and mass. Therefore, a kilogram of iron and a kilogram of hay weigh the same.
Many thanks. It is great to see it climbing the curve of knowledge.
 

jcr159

Member
Joined
Apr 8, 2021
Messages
21
Likes
15
Generative AI knows nothing about loudspeakers, acoustics, or anything at all. It builds sentences based on statistics and a training set. Because the statistics are extremely complex (the complexity seems to be beyond our ability to understand) and the training set is extremely large, formally correct sentences appear to be meaningful.
But that's not the case at all. Humanizing this fact by using the term "hallucination" is just a cheap gimmick.
In the general case, you will achieve nothing but well-known platitudes; nothing replaces years of study and practice.

This.

I wish others would understand this better. None of the current "AI" works the way folks say; that is projecting your own learning model onto the AI of your choice. The models don't understand sentence structure any more than paint colors... they are given the option to "say" anything they want trillions of times and given feedback that a given word order doesn't generate high acceptance.

Think about that for a minute. Current AI models are trained essentially by users telling them that their output is "pleasing", "best", or "correct".

They are no different from social media... they tell people what they want to hear, and if not, they adjust their outputs until people like what they have to say.

o_O
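The "sentences from statistics" point is easy to demonstrate with a toy bigram model. It is many orders of magnitude simpler than a transformer, but the principle of choosing the next word from observed statistics, with no understanding of meaning, is the same (the corpus is a made-up sentence for illustration):

```python
import random
from collections import defaultdict

corpus = ("the kilogram of iron weighs the same as the kilogram of hay "
          "because a kilogram is a kilogram").split()

# Count which word follows which: pure statistics, no understanding.
follows = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    follows[a].append(b)

def babble(start, n=8, rng=random):
    """Emit up to n words, each drawn from words seen after the previous one."""
    out = [start]
    for _ in range(n):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))
    return " ".join(out)

print(babble("the", rng=random.Random(0)))
```

Every output is locally plausible (every word pair occurs in the corpus), yet the model has no idea what a kilogram is.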
 

tomtoo

Major Contributor
Joined
Nov 20, 2019
Messages
3,712
Likes
4,774
Location
Germany
This.

I wish others would understand this better. None of the current "AI" works the way folks say; that is projecting your own learning model onto the AI of your choice. The models don't understand sentence structure any more than paint colors... they are given the option to "say" anything they want trillions of times and given feedback that a given word order doesn't generate high acceptance.

Think about that for a minute. Current AI models are trained essentially by users telling them that their output is "pleasing", "best", or "correct".

They are no different from social media... they tell people what they want to hear, and if not, they adjust their outputs until people like what they have to say.

o_O

OT
I wonder how they do that. Do they backpropagate the right answer 10,000 times, so the weighting is heavy on it?
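Roughly no, at least in the published recipes (e.g. RLHF, as described for InstructGPT): the right answer is not replayed thousands of times. Human preference ratings train a reward signal, and a policy-gradient step then nudges the weighting of whatever the model actually sampled, up if it was rewarded, down if not. A toy sketch of that reinforcement step, with three canned replies standing in for model outputs (everything here is illustrative, not anyone's actual training code):

```python
import math
import random

# Toy policy: a preference score (logit) for each of three canned replies.
logits = {"helpful": 0.0, "rude": 0.0, "nonsense": 0.0}
reward = {"helpful": 1.0, "rude": -1.0, "nonsense": -1.0}  # stand-in for human feedback

def probs():
    z = {k: math.exp(v) for k, v in logits.items()}  # softmax over logits
    total = sum(z.values())
    return {k: v / total for k, v in z.items()}

rng = random.Random(0)
lr = 0.1
for _ in range(2000):
    p = probs()
    choice = rng.choices(list(p), weights=list(p.values()))[0]  # sample a reply
    # REINFORCE: scale the log-probability gradient of the sampled reply by its reward.
    for k in logits:
        grad = (1.0 if k == choice else 0.0) - p[k]  # d log p(choice) / d logit_k
        logits[k] += lr * reward[choice] * grad

print(probs())  # "helpful" should now carry almost all the probability mass
```

The point is that the update weights whatever was sampled by the reward it earned; nothing is replayed verbatim, which is part of why correctness is only reinforced statistically rather than guaranteed.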
 

pjn

Active Member
Joined
Aug 28, 2021
Messages
176
Likes
167
One thing ChatGPT does is just make things up. For example, if I ask for a reference on protein X and disease Y, it comes up with a completely bogus (but real-looking) reference. If I then challenge it about this and tell it to try again, it comes up with a real reference. It's a bit like a student who thinks their questioner won't bother checking whether they have plagiarized something.
So, essentially a child, IMHO.
 

Rick Sykora

Major Contributor
Forum Donor
Joined
Jan 14, 2020
Messages
3,603
Likes
7,293
Location
Stow, Ohio USA
No, I haven't looked at specifics for Alexa or its APIs. They are quite different under the skin from the GPT large language model though.
As for Alexa, it would have comparable issues, based on this from a recent ZDNET article…

…(Amazon CEO Andy) Jassy acknowledges that Alexa was powered by a large language model but aims to equip the virtual assistant with one that can execute more complex tasks…

As with ChatGPT, if I did not find its mistake before building, I would be wasting my precious DIY time and money!
 