
Could ChatGPT Replace Audio Writers?

OP
Dismayed

Senior Member
Joined
Jan 2, 2018
Messages
386
Likes
404
Location
Boston, MA
Apparently, the creators of ChatGPT warned that the bloody thing may make up things or lie to support a conclusion. Wtf
That's what humans do . . .
 
OP
Dismayed

Senior Member
Joined
Jan 2, 2018
Messages
386
Likes
404
Location
Boston, MA
I like to think about limits because I've taken too many math courses (though fewer than my younger son who majored in math).

Assume a machine that satisfies all human needs and desires. That could produce a life of leisure for all, or a living hell. If the machine was owned by a single person, they would have no motivation to provide for other people. Ownership across all the population would result in everyone sharing in the output.
 

Mnyb

Major Contributor
Forum Donor
Joined
Aug 14, 2019
Messages
2,640
Likes
3,605
Location
Sweden, Västerås
I think less than 1 percent of the “work” done at the multinational behemoth I work for is actual work; most is busywork to keep the hive happy. Today I was attending a completely incomprehensible safety course...

In the future, social skills/scheming will decide whether you're employable? Not actual skills at anything. Be nice, attend meetings, be likeable, etc., make good presentations and talks.
Truth about anything will be even less important; nobody would recognise it.

Things that are reasonably deterministic can be automated.

Wonder if ChatGPT can design a DAC with the usual chipsets available?
The product and brand story may need a human hand :) but 90% of it can be done by an AI template. It can then derive different flavoured reviews to sell to magazines, whose editor AI selects one depending on the ad profile of the magazine and reader demographics.
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,008
Likes
5,612
Location
San Francisco
Assume a machine that satisfies all human needs and desires. That could produce a life of leisure for all, or a living hell. If the machine was owned by a single person, they would have no motivation to provide for other people. Ownership across all the population would result in everyone sharing in the output.
Careful, you're starting to sound like a DIRTY COMMIE! ;)

But realistically, yes. Most economic systems we know of are predicated on scarcity and labor. If scarcity and labor go away because machines effortlessly give us everything we ask for, so SHOULD the economic system, to be replaced with something that comports with the new economic reality. If not, we should expect undesirable results.

What is interesting about this concept that hasn't been addressed anywhere I know of is how we would assign property rights to land. Who gets to live in the nice areas if nobody is working for money anymore? Almost everything that has a monetary value has that value because of the labor that goes into it. Real estate is an exception, it's valued via legal fiat and scarcity. AI may be able to do almost anything in the future, but it won't be able to create more oceanfront property out of nowhere.

So it seems we will still need money, but the means by which we might earn it are far from obvious if we're assuming "jobs are a thing of the past". Gambling maybe?
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,008
Likes
5,612
Location
San Francisco
In the future, social skills/scheming will decide whether you're employable? Not actual skills at anything. Be nice, attend meetings, be likeable, etc., make good presentations and talks.
Truth about anything will be even less important; nobody would recognise it.
Arguably this is already true to a big extent.
Wonder if ChatGPT can design a DAC with the usual chipsets available?
Not now, but I wouldn't bet against that in a 5-10 year horizon.
The product and brand story may need a human hand
As a marketer, I wish I really believed that. In truth, designing brand / positioning is more deterministic (once you have the qualitative data on how people feel) than most people realize.
 

Mnyb

Major Contributor
Forum Donor
Joined
Aug 14, 2019
Messages
2,640
Likes
3,605
Location
Sweden, Västerås
Arguably this is already true to a big extent.

Not now, but I wouldn't bet against that in a 5-10 year horizon.

As a marketer, I wish I really believed that. In truth, designing brand / positioning is more deterministic (once you have the qualitative data on how people feel) than most people realize.
Aww, I was hoping that audiophiles had a special kind of crazy, not easily mimicked by AI :) Just look at the exterior design of some stuff: super bling and basement project at the same time? Like a 12-year-old designing his own Lamborghini with extra jet engines?
 

Benedium

Senior Member
Forum Donor
Joined
Aug 1, 2020
Messages
343
Likes
255
No more jobs for the sake of jobs.
No more competing just to survive.
Genetically engineered babies.
State/company nurtured elite orphans.
No more families.
Hope it happens soon.
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,008
Likes
5,612
Location
San Francisco
No more jobs for the sake of jobs.
No more competing just to survive.
Good with me
Genetically engineered babies.
State/company nurtured elite orphans.
I have some serious questions about the details here.
No more families.
Hope it happens soon.
I got off the train a few stops ago.

If we are aiming for a Star Trek-style utopia, let's try to avoid having our own Eugenics Wars, eh?

I think genetic engineering could be a real godsend over time... but only when used responsibly. I can't think of many people I would trust to use it responsibly, especially not any employed in government. This isn't a political opinion, just a general pessimism about anyone's ability to make helpful reproductive decisions for anyone else. You can't monkey with the most basic biological imperatives without stirring up some trouble.
 

Benedium

Senior Member
Forum Donor
Joined
Aug 1, 2020
Messages
343
Likes
255
Good with me

I have some serious questions about the details here.

I got off the train a few stops ago.

If we are aiming for a Star Trek-style utopia, let's try to avoid having our own Eugenics Wars, eh?

I think genetic engineering could be a real godsend over time... but only when used responsibly. I can't think of many people I would trust to use it responsibly, especially not any employed in government. This isn't a political opinion, just a general pessimism about anyone's ability to make helpful reproductive decisions for anyone else. You can't monkey with the most basic biological imperatives without stirring up some trouble.
I'd like to think that everything in culture and nature existed to be a solution to problems. If that's true, then it should be ideal to update every solution for the times.
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,008
Likes
5,612
Location
San Francisco
I'd like to think that everything in culture and nature existed to be a solution to problems. If that's true, then it should be ideal to update every solution for the times.
Perhaps, sure, why not. But some solutions to problems are baked into our biology (families) while others are simply things we've lived with for generations (business, economy, jobs).

If we are willing to genetically engineer to the point that (say) children no longer feel a need for connection to parents, well, that's IMO beyond the scope of what AI might do to the economy in the next few years and I don't really know what to think about it.

My hesitation around any state-sponsored genetic work of any kind comes from my view of human political leadership in general. Has any country on earth been run by benevolent, intelligent people (that you agree with) for more than a few years at a time? For my part, I can't name one. (to keep politics out of the thread, keep the answers to yourselves.) To me, that is too much power for any human to wield over another.
 

Benedium

Senior Member
Forum Donor
Joined
Aug 1, 2020
Messages
343
Likes
255
Perhaps, sure, why not. But some solutions to problems are baked into our biology (families) while others are simply things we've lived with for generations (business, economy, jobs).

If we are willing to genetically engineer to the point that (say) children no longer feel a need for connection to parents, well, that's IMO beyond the scope of what AI might do to the economy in the next few years and I don't really know what to think about it.

My hesitation around any state-sponsored genetic work of any kind comes from my view of human political leadership in general. Has any country on earth been run by benevolent, intelligent people (that you agree with) for more than a few years at a time? For my part, I can't name one. (to keep politics out of the thread, keep the answers to yourselves.) To me, that is too much power for any human to wield over another.
I think AI is the most realistic replacement for the hypothetical benevolent intelligent leader.
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,008
Likes
5,612
Location
San Francisco
I think AI is the most realistic replacement for the hypothetical benevolent intelligent leader.
I agree only because I find the concept of a truly effective, benevolent leader pretty unrealistic to begin with. It might be outlandish, but no more than [insert political body of your choice here] actually doing their jobs properly.

What, me, cynical?

I would point out that an AI can't really make decisions for us, because it doesn't have emotions of its own. It can only interpret and evaluate what we ask of it. So whatever genetically engineered battle-school super-orphans we end up with will still be of our own making one way or the other.
 

FrantzM

Major Contributor
Forum Donor
Joined
Mar 12, 2016
Messages
4,337
Likes
7,730
Genocide and eugenics are not funny; there are plenty of powerful people who would stoically wipe you out "for the greater good" when your existence becomes problematic for them.

The scenarios I've seen in movies that I think are at least directionally informative:

Star Trek: Post-scarcity communism built on the back of cooperative, effective AI with no self-agency

Her: The AI advances beyond the point where it's able to remain interested in humanity, leaves us behind

Westworld / Terminator / The Matrix: The AI becomes evil for some reason* and sets out to destroy humanity, mostly succeeds

Mad Max: Capitalism self-destructs and takes the infrastructure needed to run AI with it, humanity goes back to the stone age

Jetsons / Also Star Trek: AI is capable of doing all our work for us, but we find increasingly contrived reasons to keep doing it anyway, because we can't handle the alternatives

(NB: In Star Trek, the ship's computer spontaneously achieves sentience at least 2 or 3 times during the series. Even in this 1960-90 view of the far future, the computer is obviously capable of doing the jobs of everyone on the ship and then some. There are extremely capable androids all over the place for physical work. And yet they sit at desks putting in numbers and crawling through tubes turning wrenches all the time. Kinda makes you think. Our vision of the future almost never considers a scenario in which people are obsolete.)

*I think this is very unlikely since there's no way I can think of to give AI an emotional motivation, and so it will only happen if someone specifically programs the AI to wipe out the human race. Which is not impossible, I guess.
One could advance that Data in Star Trek was sentient. He could do anything humans could, and then some.

One different take on sentient machines is that of the Culture series by Iain M. Banks. "The Culture" is a post-scarcity society with no laws (basically), in which anyone can have anything. This civilization has access to infinite energy and was "governed" by A.I.s called "Minds": infinitely intelligent "machines" with god-like knowledge and abilities... However, many less powerful machines were sentient in this universe, even weapons or jewelry... I love the Culture series; I would have loved to see a TV or movie adaptation.

I know, wildly O.T.



Peace.
 

Benedium

Senior Member
Forum Donor
Joined
Aug 1, 2020
Messages
343
Likes
255
The problem with humans is that we are baked with animal instincts like fear. Even the greatest people are motivated by fear, experienced as fight or flight, followed by learned rules that steer them toward success. The greater the fear, the greater their drive. If I'm not wrong, AI is just about objectives and a set of rules. That sounds like an improvement over even the best humans. If we don't find the smartest way to deal with our innate fear and instincts, instead of always trying to harness them to motivate people, we will always be stuck in nature's loop of suffering and violence.
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,008
Likes
5,612
Location
San Francisco
One could advance that Data in Star Trek was sentient.
Wasn't there an episode where they put it on trial and ruled that he was, indeed, sentient?

As far as it goes, I think it's right. A person with no emotions is still considered sentient.

If we don't find the smartest way to deal with our innate fear and instincts, we will always be stuck in nature's loop of suffering and violence.
Sure, seems reasonable to me. My reservation is mostly just on whether genetic engineering is a smart way to deal with things. Fear is overrated but it's still an important emotion. Getting rid of whole emotions gets into Blade Runner / Ghost in the Shell territory of "what is human".
 

Benedium

Senior Member
Forum Donor
Joined
Aug 1, 2020
Messages
343
Likes
255
Wasn't there an episode where they put it on trial and ruled that he was, indeed, sentient?

As far as it goes, I think it's right. A person with no emotions is still considered sentient.


Sure, seems reasonable to me. My reservation is mostly just on whether genetic engineering is a smart way to deal with things. Fear is overrated but it's still an important emotion. Getting rid of whole emotions gets into Blade Runner / Ghost in the Shell territory of "what is human".
I believe fear is the only real emotion. Everything else is a derivative.

I'm not an expert though. Just my 2 cents or less of life experience and observations. Heheh.
 

Timcognito

Major Contributor
Forum Donor
Joined
Jun 28, 2021
Messages
3,349
Likes
12,552
Location
NorCal
Whenever I read this kind of stuff I think Aldous Huxley was so far ahead of his time with his predictions in "Brave New World". ChatGPT will ultimately find its way into advertising, sitcoms and porn.
 

fpitas

Master Contributor
Forum Donor
Joined
Jul 7, 2022
Messages
9,885
Likes
14,191
Location
Northern Virginia, USA
Whenever I read this kind of stuff I think Aldous Huxley was so far ahead of his time with his predictions in "Brave New World". ChatGPT will ultimately find its way into advertising, sitcoms and porn.
Can it seriously be any stupider than the status quo?
 

FrantzM

Major Contributor
Forum Donor
Joined
Mar 12, 2016
Messages
4,337
Likes
7,730
Hi

I believe fear is the only real emotion. Everything else is a derivative.

I'm not an expert though. Just my 2 cents or less of life experience and observations. Heheh.

What would love be then? A fear of what?
A life without some sort of emotion is a difficult concept. Even the use of logic is a choice, based on a need, itself an emotion... a yearning to assign reasons to things... What is that yearning? An emotion.
This absence of "emotions" that we attribute to, or would "like", AI to display seems itself to be based on... emotions? If that imbues them with some superiority, how do we know how they will react to, interact with, and deal with these sentient, full-of-emotions beings? What would be the need to have us around, with our tendencies, among them to multiply and control?
We can, on our side, be all misanthropic if it suits us, emotionally or logically... That would be another emotion, and if we educate the A.I. to share it, I am not sure the results would be desirable.

Peace.
 

Timcognito

Major Contributor
Forum Donor
Joined
Jun 28, 2021
Messages
3,349
Likes
12,552
Location
NorCal
Hi



What would love be then? A fear of what?
A life without some sort of emotion is a difficult concept. Even the use of logic is a choice, based on a need, itself an emotion... a yearning to assign reasons to things... What is that yearning? An emotion.
This absence of "emotions" that we attribute to, or would "like", AI to display seems itself to be based on... emotions? If that imbues them with some superiority, how do we know how they will react to, interact with, and deal with these sentient, full-of-emotions beings? What would be the need to have us around, with our tendencies, among them to multiply and control?
We can, on our side, be all misanthropic if it suits us, emotionally or logically... That would be another emotion, and if we educate the A.I. to share it, I am not sure the results would be desirable.

Peace.
(image attachment)
 