
Could ChatGPT Replace Audio Writers?

FrantzM

Major Contributor
Forum Donor
Joined
Mar 12, 2016
Messages
4,377
Likes
7,881
Borrowing from Ricardian comparative advantage: in theory, there can still be productive exchanges even if your trading partner is better than you at everything you do.

However, this theory relies on there being limited capacity for production. You allocate your time / resources to the thing you're best at, and the trading partner allocates their time to whatever they are best at, and you trade, and both come out ahead vs. not trading.
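To make that concrete, here's a toy calculation of the textbook argument, in Python, with numbers I made up on the spot (a sketch, not real data):

# Toy Ricardian example: the partner is faster at BOTH tasks,
# yet both sides still gain from trade because each has limited hours.
HOURS = 10.0

# hours needed to produce one unit of each good
me      = {"review": 2.0, "measurement": 6.0}   # slower at everything
partner = {"review": 1.0, "measurement": 1.5}

# Baseline: no trade, each splits their hours evenly between the two goods.
def split_evenly(cost):
    return {g: (HOURS / 2) / c for g, c in cost.items()}

me_alone, partner_alone = split_evenly(me), split_evenly(partner)
# me_alone      -> 2.5 reviews, ~0.83 measurements
# partner_alone -> 5.0 reviews, ~3.33 measurements

# With trade: I specialize in reviews (my *relative* strength: a review costs me
# 1/3 of a measurement in forgone output, vs 2/3 for the partner) and sell
# 2.5 of them for 1.25 measurements, a price between our two opportunity costs.
me_trade = {"review": HOURS / me["review"] - 2.5, "measurement": 1.25}
# -> 2.5 reviews and 1.25 measurements: same reviews, more measurements than me_alone.

# The partner buys 2.5 reviews for 1.25 measurements; how many hours do they
# need to match their old no-trade bundle on top of what they bought?
hours_needed = (partner_alone["review"] - 2.5) * partner["review"] \
             + (partner_alone["measurement"] + 1.25) * partner["measurement"]
print(f"partner matches the no-trade bundle in {hours_needed:.2f}h of {HOURS}h")
# ~9.38h, so the partner comes out ahead too; the spare time is pure gain.
# The catch, as below: this only works because hours are scarce on both sides.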

The trouble is that limited capacity doesn't really apply to AI. One piece of software can run 10 billion instances, in theory.

So if AI gets better at our jobs than we are, there isn't really any way back in, unless we come up with something that is 1) worthwhile and 2) can't be done by AI.

And "can't be done by AI" is a list that seems to shrink faster than the list of worthwhile activities grows.

This also leaves aside the possibility of "true" AI, something that approximates a thinking machine on par with or beyond human intelligence. (Doesn't need to actually think, just needs to behave as if it does.) Such a thing doesn't seem too far off anymore. Many of the "computers will never be able to..." hurdles have been cleared in the past few years alone.

Once a machine is smarter than you for any given task you might both perform... are you even employable IN THEORY? I say no. This is a significant break with economic orthodoxy, but again, workers are built into the math for traditional economics, because the idea of a distinction between man and machine is inherent to the concepts of labor and capital.

To the extent that distinction weakens or even goes away (just for the purposes of employment, let's not get all Blade Runner here) then people are simply no longer required for production, and will not be paid for it. Which actually destroys our entire concept of an economy... it's predicated on investors paying people who are performing productive work.

Once workers are out of the picture, one final round of investment can happen before money itself becomes conceptually invalid.

Vonnegut's Player Piano coming in hot...
EXCELLENT POST!!

I had to scream.

I find these advances in AI terrifying. Their sheer availability is no laughing matter. Aren't we playing with a fire we cannot extinguish? Haven't we crossed the Rubicon, a point of no return? The question needs to be asked: for what purpose?

A very concerned Human being.

Peace.
 

fpitas

Master Contributor
Forum Donor
Joined
Jul 7, 2022
Messages
9,885
Likes
14,213
Location
Northern Virginia, USA
Too many humans doing redundant jobs anyway. My guess is AI only needs to learn from the best of us, the top 1% or 0.1%
That may be true. Sad, but true. In any event, my career is safe. Well, until we have supercomputers like on Star Trek...
 

Benedium

Senior Member
Forum Donor
Joined
Aug 1, 2020
Messages
343
Likes
255
That may be true. Sad, but true. In any event, my career is safe. Well, until we have supercomputers like on Star Trek...
I thought the arts and humanities were safe. But it seems to me they have tackled the hardest problem first. Aren't science and tech much more logical, and thus more computable?
 

fpitas

Master Contributor
Forum Donor
Joined
Jul 7, 2022
Messages
9,885
Likes
14,213
Location
Northern Virginia, USA
I thought the arts and humanities were safe. But it seems to me they have tackled the hardest problem first. Aren't science and tech much more logical, and thus more computable?
Some is, some isn't. From what I've seen, design can't be taught effectively. It's like art: you can teach how to hold the brush, what colors and pigments do in general, but some people just lack the ability.
 

FrantzM

Major Contributor
Forum Donor
Joined
Mar 12, 2016
Messages
4,377
Likes
7,881
Next phase, population control and quality vs quantity.
Who would define "quality" then?
We can be flippant about it, but perhaps the scariest part is not recognizing this as a threat. This is not the usual threat. It has the potential to do everything we do better, perhaps even by our own metrics.
Who would be left then? And to do what?


Peace
 

fpitas

Master Contributor
Forum Donor
Joined
Jul 7, 2022
Messages
9,885
Likes
14,213
Location
Northern Virginia, USA
Just keep in mind: the chatbots will have total recall of every internet post or thread, like this one. Anything you say can and will be used against you :oops:
 

Overseas

Major Contributor
Joined
Feb 1, 2021
Messages
1,098
Likes
603
Apparently, the creators of ChatGPT warned that the bloody thing may make up things or lie to support a conclusion. Wtf
 

fpitas

Master Contributor
Forum Donor
Joined
Jul 7, 2022
Messages
9,885
Likes
14,213
Location
Northern Virginia, USA
Apparently, the creators of ChatGPT warned that the bloody thing may make up things or lie to support a conclusion. Wtf
So it really IS like an audio writer!
 

sarumbear

Master Contributor
Forum Donor
Joined
Aug 15, 2020
Messages
7,604
Likes
7,324
Location
UK
Too many humans doing redundant jobs anyway. My guess is AI only needs to learn from the best of us, the top 1% or 0.1% or 0.01%
How do you rank us?
 

fpitas

Master Contributor
Forum Donor
Joined
Jul 7, 2022
Messages
9,885
Likes
14,213
Location
Northern Virginia, USA
I have no problem admitting I'm more useless than most of yous, heheh. Just imagine if AI could learn from all the scientists in the world.
A lot of the scientists I've worked with are kind of crackpots. Brilliant perhaps, but...odd. I hope the AI learns from them!
 

Benedium

Senior Member
Forum Donor
Joined
Aug 1, 2020
Messages
343
Likes
255
Who would define "quality" then?
We can be flippant about it, but perhaps the scariest part is not recognizing this as a threat. This is not the usual threat. It has the potential to do everything we do better, perhaps even by our own metrics.
Who would be left then? And to do what?


Peace
Answers are in all the sci-fi movies, I'm sure. Dystopian or utopian is up to us.
 


kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,358
Likes
6,881
Location
San Francisco
Next phase, population control and quality vs quantity.
Genocide and eugenics are not funny; there are plenty of powerful people who would stoically wipe you out "for the greater good" when your existence becomes problematic for them.
Answers are in all the sci-fi movies, I'm sure. Dystopian or utopian is up to us.
The scenarios I've seen in movies that I think are at least directionally informative:

Star Trek: Post-scarcity communism built on the back of cooperative, effective AI with no self-agency

Her: The AI advances beyond the point where it's able to remain interested in humanity, leaves us behind

Westworld / Terminator / The Matrix: The AI becomes evil for some reason* and sets out to destroy humanity, mostly succeeds

Mad Max: Capitalism self-destructs and takes the infrastructure needed to run AI with it, humanity goes back to the stone age

Jetsons / Also Star Trek: AI is capable of doing all our work for us, but we find increasingly contrived reasons to keep doing it anyway, because we can't handle the alternatives

(NB: In Star Trek, the ship's computer spontaneously achieves sentience at least 2 or 3 times during the series. Even in this 1960s-90s view of the far future, the computer is obviously capable of doing the jobs of everyone on the ship and then some. There are extremely capable androids all over the place for physical work. And yet they sit at desks putting in numbers and crawling through tubes turning wrenches all the time. Kinda makes you think. Our vision of the future almost never considers a scenario in which people are obsolete.)

*I think this is very unlikely since there's no way I can think of to give AI an emotional motivation, and so it will only happen if someone specifically programs the AI to wipe out the human race. Which is not impossible, I guess.
 
Last edited:

fpitas

Master Contributor
Forum Donor
Joined
Jul 7, 2022
Messages
9,885
Likes
14,213
Location
Northern Virginia, USA
Genocide and eugenics are not funny; there are plenty of powerful people who would stoically wipe you out "for the greater good" when your existence becomes problematic for them.
"The Greater Good" usually defined as "we can steal all their stuff!"
 

kemmler3D

Major Contributor
Forum Donor
Joined
Aug 25, 2022
Messages
3,358
Likes
6,881
Location
San Francisco
"The Greater Good" usually defined as "we can steal all their stuff!"
At that point, they will already have all the stuff under lock and key, the issue will be that we keep annoying them by asking for some of it.
 