Did you look at the videos? It does screw up calculations.
I said for calculation. Do please read my question again.
Did you check the calculations I posted? My post is only about using it for speaker calculations.
Did you look at the videos? It does screw up calculations.
I gave two of these problems with the exact same prompt to ChatGPT and the results were correct for me.
Just in case someone was wondering - I'm sorry, Dave. I'm afraid I can't do that.
We decided this based on experience using it and finding that it doesn't work. In my case, several hours of trying to get it to work.
Posters on this thread blindly decide that because of the underlying technology ChatGPT can't work.
Exactly. It's trained to achieve certain goals, mainly completeness, but mathematical correctness is not one of them, because (and it's the last time I say this) it's a language model, not a calculator. Any calculation it performs correctly is more or less a matter of chance, like any other correct answer it provides.
Does that include speaker design by DIY enthusiasts, which my posts were aimed at?
ChatGPT today is not suitable for any work where correctness actually matters.
I don't see how this is more useful than tools like Hornresp or WinISD. They actually show you the response, and you can tweak it on the fly and see the result in real time.
Does that include speaker design by DIY enthusiasts, which my posts were aimed at?
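For contrast, the kind of deterministic calculation tools like WinISD perform is simple enough to write down directly. Here is a minimal sketch of the textbook closed-box (sealed) alignment from Thiele-Small parameters; the driver values used in the example are made up for illustration:

```python
import math

def sealed_box(fs, qts, vas_l, vb_l):
    """Closed-box alignment from Thiele-Small parameters.

    fs    - driver free-air resonance (Hz)
    qts   - driver total Q
    vas_l - equivalent compliance volume (litres)
    vb_l  - box internal volume (litres)
    Returns (fc, qtc, f3): in-box resonance, system Q, -3 dB point.
    """
    alpha = vas_l / vb_l                 # compliance ratio
    fc = fs * math.sqrt(1 + alpha)       # in-box resonance
    qtc = qts * math.sqrt(1 + alpha)     # in-box total Q
    # -3 dB frequency of the resulting 2nd-order high-pass response
    a = 1 / qtc**2 - 2
    f3 = fc * math.sqrt((a + math.sqrt(a**2 + 4)) / 2)
    return fc, qtc, f3

# Hypothetical 8" woofer: fs = 30 Hz, Qts = 0.38, Vas = 60 L, in a 40 L box
fc, qtc, f3 = sealed_box(30, 0.38, 60, 40)
print(f"Fc = {fc:.1f} Hz, Qtc = {qtc:.2f}, F3 = {f3:.1f} Hz")
```

Unlike a chat model, this gives the same answer every time, which is rather the point being made above.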
I don't think it's promising at all.
ChatGPT today is not suitable for any work where correctness actually matters. Maybe the next version(s) will be better, I don't know. I think we all agree it's promising but not ready for prime time, why is there any debate on this?
Both a kilogram of iron and a kilogram of hay weigh the same amount, which is one kilogram. The weight of an object is measured in units of mass such as kilograms, pounds, or grams. Therefore, in this case, both objects have the same weight because they have the same mass. It is important to note that the properties of the materials are different, and their densities, shapes, and sizes can affect how heavy they feel or appear to be when lifted.
Before spending any money following those instructions, would anyone please ask ChatGPT: what is heavier, a kilogram of iron or a kilogram of hay?
Don't be surprised when reading the answer.
Yes, for example, I tried to get it to design a crossover and it gave me conflicting instructions. As I've said before, I don't think it's useful unless you already know the answer or have calculated it with another method, in which case why bother?
Does that include speaker design by DIY enthusiasts, which my posts were aimed at?
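Whether or not ChatGPT can manage it, the arithmetic behind a basic passive crossover is well established. A sketch of the textbook 2nd-order Linkwitz-Riley component formulas follows; it assumes purely resistive driver impedances, which real drivers are not, so treat the numbers as starting points only:

```python
import math

def lr2_crossover(fc, r_woofer, r_tweeter):
    """Component values for a 2nd-order Linkwitz-Riley (Q = 0.5) crossover.

    fc        - crossover frequency (Hz)
    r_woofer  - woofer nominal impedance (ohms)
    r_tweeter - tweeter nominal impedance (ohms)
    Low-pass leg: series L, shunt C.  High-pass leg: series C, shunt L.
    LR2 gives L = R / (pi * fc) and C = 1 / (4 * pi * fc * R).
    Returns (lp_L_mH, lp_C_uF, hp_C_uF, hp_L_mH).
    """
    lp_l = r_woofer / (math.pi * fc) * 1e3           # series inductor, mH
    lp_c = 1 / (4 * math.pi * fc * r_woofer) * 1e6   # shunt capacitor, uF
    hp_c = 1 / (4 * math.pi * fc * r_tweeter) * 1e6  # series capacitor, uF
    hp_l = r_tweeter / (math.pi * fc) * 1e3          # shunt inductor, mH
    return lp_l, lp_c, hp_c, hp_l

# Example: 2.5 kHz crossover into nominal 8-ohm drivers
lp_l, lp_c, hp_c, hp_l = lr2_crossover(2500, 8, 8)
print(f"LP: {lp_l:.2f} mH + {lp_c:.2f} uF; HP: {hp_c:.2f} uF + {hp_l:.2f} mH")
```

A deterministic formula like this never gives "conflicting instructions", which is exactly the contrast drawn in the post above.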
I think this is a little bit like saying Jimi Hendrix was nothing special, he just played the same instrument louder and in a different way. In my tests with ChatGPT I've seen it do things with natural language processing that I didn't think I would see in this decade. Whether it's mere brute force or something more interesting, it can still take a very wide range of arbitrary inputs and do something useful (or sometimes, almost useful) with them. This is what I think is promising, despite the obvious flaws.
It's nothing smart, it's brute force. IMO it's a marketing device to convince people to pay to use OpenAI models, nothing more.
Is that your answer or ChatGPT's?
Both a kilogram of iron and a kilogram of hay weigh the same amount, which is one kilogram. The weight of an object is measured in units of mass such as kilograms, pounds, or grams. Therefore, in this case, both objects have the same weight because they have the same mass. It is important to note that the properties of the materials are different, and their densities, shapes, and sizes can affect how heavy they feel or appear to be when lifted.
I copied and pasted ChatGPT's answer. Asked again, the answer had different wording but the same correct one.
Is that your answer or ChatGPT's?
Because a friend of mine told me he got a very different inaccurate answer.
I copied and pasted ChatGPT's answer. Asked again, the answer had different wording but the same correct one.
Both are equally heavy, as they both weigh one kilogram. The unit of measurement "kilogram" refers to a specific amount of mass, and it is a fundamental unit of the metric system used to measure weight and mass. Therefore, a kilogram of iron and a kilogram of hay weigh the same.
I asked Bing a similar question and it said:
A pound of lead and a pound of feathers both weigh the same, about 16 ounces or 0.45 kilograms. Is there anything else you would like to know?
But note that it thinks a pound is only 'about' 16 ounces. So much for precision.
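The hedge really is unwarranted: the avoirdupois pound has been defined as exactly 0.45359237 kg, and as exactly 16 ounces, since the 1959 international yard and pound agreement, so the conversion can be carried out exactly:

```python
from decimal import Decimal

LB_TO_KG = Decimal("0.45359237")   # exact by definition (1959 agreement)
OZ_PER_LB = 16                     # exact by definition

oz_in_kg = LB_TO_KG / OZ_PER_LB    # one avoirdupois ounce in kilograms
print(f"1 lb = {LB_TO_KG} kg exactly; 1 oz = {oz_in_kg} kg exactly")
```

Using `Decimal` rather than floats keeps the division exact, so there is no "about" anywhere in the chain.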
My thanks.
I copied and pasted ChatGPT's answer. Asked again, the answer had different wording but the same correct one.
Both are equally heavy, as they both weigh one kilogram. The unit of measurement "kilogram" refers to a specific amount of mass, and it is a fundamental unit of the metric system used to measure weight and mass. Therefore, a kilogram of iron and a kilogram of hay weigh the same.
Generative AI knows nothing about loudspeakers, acoustics, or anything at all. It builds sentences based on statistics and a training set. Because the statistics are extremely complex (the complexity seems to be beyond our ability to understand) and the training set is extremely large, its formally correct sentences appear to be meaningful.
But that's not the case at all. Humanizing this fact by using the term "hallucination" is just a cheap gimmick.
In the general case, you will achieve nothing but well-known platitudes; nothing replaces years of study and practice.
This.
I wish others would understand this better. None of the current "AI" works the way folks say; that is projecting your own learning model onto the AI of your choice. The models don't understand sentence structure any more than paint colors: they are given the option to "say" anything they want trillions of times and given feedback when a given word order doesn't generate high acceptance.
Think about that for a minute. AI models are currently trained, essentially, by users telling them that their output is "pleasing", "best", or "correct".
They are no different from all social media: they tell people what they want to hear, and if not, they adjust their outputs until people like what they have to say.
As for Alexa, it would have comparable issues, based on this from a recent ZDNET article…
No, I haven't looked at specifics for Alexa or its APIs. They are quite different under the skin from the GPT large language model, though.