
Can anyone explain the vinyl renaissance?

From what I have read (hard to verify, of course, but it comes from things written by mastering and remastering engineers), Toole is correct. In many cases only the vinyl masters were kept by the studios, as that was all that was considered valuable (they also often kept the original individual track tapes used to mix the vinyl masters). I believe the "myth" is that the subtle changes needed for the LP master tapes "sounded dreadful". In many cases the "mastering moves", including reverb, EQ changes, and flying in extra sounds, "improved" the original master tape. The "degradation" happens when a tape is used to cut an acetate, a mold is made from that acetate, the pressing die is made from the mold, and then a piece of vinyl is pressed from the pressing die. The main degradation happens during all these physical steps to mass-produce vinyl albums. As @atmasphere points out, a direct-cut acetate can be objectively better than any analog tape recording machine.

The other big "degradation" happens because decades can pass between when the master / cutting tapes were originally created and when the transfer to digital takes place. Tapes are perishable to a greater or lesser degree depending on formulation, storage, use, and abuse.
Rubbish!

The changes needed to make a master stamper for LP are not subtle at all. I don't know that a direct-cut acetate was better than any analog tape, especially if one considers 15 i.p.s. tape. It might be comparable. OTOH, how many direct-to-disc recordings are there? Not even 1%.

I recorded many albums off FM radio when stations would play whole albums. All things considered, they were almost always better than LP. Broadcast RTR, even copied to broadcast tape cartridges, was better than the total degradation of the LP format. Revisionist history is a b*tch. Unless someone can provide good data, the idea that they kept the "most precious" vinyl masters doesn't even make sense.
 
From what I have read (hard to verify, of course, but it comes from things written by mastering and remastering engineers), Toole is correct. In many cases only the vinyl masters were kept by the studios, as that was all that was considered valuable (they also often kept the original individual track tapes used to mix the vinyl masters). I believe the "myth" is that the subtle changes needed for the LP master tapes "sounded dreadful". In many cases the "mastering moves", including reverb, EQ changes, and flying in extra sounds, "improved" the original master tape. The "degradation" happens when a tape is used to cut an acetate, a mold is made from that acetate, the pressing die is made from the mold, and then a piece of vinyl is pressed from the pressing die. The main degradation happens during all these physical steps to mass-produce vinyl albums. As @atmasphere points out, a direct-cut acetate can be objectively better than any analog tape recording machine.

The other big "degradation" happens because decades can pass between when the master / cutting tapes were originally created and when the transfer to digital takes place. Tapes are perishable to a greater or lesser degree depending on formulation, storage, use, and abuse.
The question would then be what a vinyl master would be like. Does it mean "this is a master that with default settings on a lathe cuts a good acetate"? I'd assume in that case that the "default settings on the lathe" would determine what that vinyl master sounds like when transferred to CD. It would also imply that vinyl was indeed the standard in practice.

Given the cassette conversation, were "vinyl masters" also used for making cassettes?
 
Toole has described the situation when CD first came along, that a distressingly large amount of studios had only kept the so-called Vinyl Master of albums, so, when it was decided to re-release the album on CD, that was all they had to work with!

They were dreadful because they were vinyl masters, not because the copying (A/D digitising) was dreadful. It's a myth that early A/D conversion for CD was audibly problematic.

I haven't seen this thread pop up in my New Posts list for some time, and clearly there's something very wrong with me if I've decided to post in this thread again. :)

Nevertheless, to this point about vinyl masters, and to @levimax 's follow-up comment about it, it might very well be true that a "distressingly large" number of studios kept only the vinyl masters.

However, there is a truly enormous amount of music out there, and a "distressingly large" number can still represent a minority, even a small minority of the total number.

Moreover, studios are not, to my knowledge, the main repository of master tapes. Many if not all of the major record labels, which are the main repository, had vaults and tape storage facilities in more than one location. Furthermore, their control over, and knowledge of, the retention and location of master tapes was far from total or perfect.

This is why so many CDs over the decades, from the mid-1980s through to the present, have been made from master tapes that were thought to be lost or destroyed - or were not known/understood to be the masters. Maybe the best known examples are many of the CDs Steve Hoffman (problematic figure, I know) mastered for MCA in the 1980s. He found many tapes marked "Do Not Use" because they were the original masters and in the days of vinyl you had to use the vinyl cutting copy to make records. (And cassettes were made from copies of the master, not the true master.) They had long been ignored and they were generally not the tapes you would get if you asked someone to pull the masters for you and didn't go hunting around the shelves yourself. And for these reasons they tended to be in very good condition! He used many of these "Do Not Use" tapes to make those CDs.

Now, of course it is true that many 1980s CDs were indeed made from vinyl cutting tapes; it is likely that at least some of the famous early CDs prized for their "warmth" and "analogue" sound were made from vinyl masters. For example, the "2-track" Pink Floyd Wish You Were Here, the "non-TO" Pink Floyd Dark Side of the Moon, the "black triangle" Beatles Abbey Road, and one or both of the two earliest CD masterings of Fleetwood Mac's Rumours. I don't know with certainty that all of these are from LP masters, but I'm confident that at least some of them are. Barry Diament has also written that he had to master two of the later Led Zeppelin albums for CD from vinyl cutting tapes (if memory serves, Presence and In Through the Out Door), because that's what he was given to work with.

So it's really a mixed bag.

As to the question of how much the mastering moves required to make a vinyl cutting copy impact the sonics, I'll only say that of all the Zeppelin CDs Diament mastered, Presence sounds the worst compared to other available masterings IMHO.
 
So it's really a mixed bag.

That's certainly true. A very mixed bag.

And who knows what has really been lost to that huge Universal fire in 2008 (up to 175,000 master tapes destroyed) and what they are now using as "masters". Likely first release CDs if you ask me. They talked of digital "clones" they were able to use...
 
That's certainly true. A very mixed bag.

And who knows what has really been lost to that huge Universal fire in 2008 (up to 175,000 master tapes destroyed) and what they are now using as "masters". Likely first release CDs if you ask me. They talked of digital "clones" they were able to use...
I can only hope by digital "clones" they meant flat transfers created precisely to be digital backups, to guard against eventual analogue master tape degradation.

If they are instead talking about CDs, first release or not, that's truly sad - not because of CD's 16-bit/44.1kHz format, but because even in the 1980s when digital limiting had not been invented and mastering engineers tended towards a somewhat minimalist approach, vanishingly few CDs were truly flat transfers. They were almost all EQ'd or otherwise modified from the master tape in some audible way.
 
Revisionist history is a b*tch.
Amen to that! The Vinyl Defenders League has a lot to answer for.
Unless someone can provide good data the idea they kept the "most precious" vinyl masters doesn't even make any sense.
Just to be clear, my initial comment, relayed from Toole, was made by him in a tone of disappointment bordering on dismay. Nothing 'most precious' about his view of vinyl masters. He clearly would have preferred that the final studio master had been retained and was available for direct transfer to CD.
The question would then be what a vinyl master would be like. Does it mean "this is a master that with default settings on a lathe cuts a good acetate"? I'd assume in that case that the "default settings on the lathe" would determine what that vinyl master sounds like when transferred to CD. It would also imply that vinyl was indeed the standard in practice.
There seem to be two stages of remastering to get from final studio master tape to what is actually in the grooves of an LP:-

- Firstly, create a vinyl master tape from the studio master tape, with whatever changes deemed fit. As well as the changes, this adds one more generation of copy to an already nth-generation tape.

- Secondly, 'ride' the gain and compression controls of the cutting lathe, making on-the-fly adjustments by ear. This changes dynamic compression, among other things, on a moment-by-moment basis. Not good! Plus the RIAA EQ is applied at this point; it has existed in several versions, the record owner usually has no clue which version was used, and the owner's reverse-RIAA circuit often doesn't result in flat output.

Note that if the first step above includes more severe adjustments, to allow plenty of safety margin for 'LP cutting demands' and also for 'record owners with very average players', then the second step becomes much easier -- lazier, if you will -- requiring less focused attention and skill, less time, and fewer screw-ups requiring rework.

Now tell me: if you were Management in a big business operation at the centre of the music industry making money from churning out recordings when vinyl was king, would you really apply minimal changes in step one above, resulting in higher costs and problems in step two? Only a Grade A vinyl defender would assume so.
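To make the RIAA point in the second step concrete, here is a minimal sketch of the standard playback (de-emphasis) curve, built from the three published time constants. The function name is my own; the cutting lathe applies the inverse (pre-emphasis) of this curve.

```python
import numpy as np

# Standard RIAA time constants (seconds); the lathe applies the
# inverse (pre-emphasis) of the playback curve computed here.
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_gain_db(f):
    """RIAA playback (de-emphasis) gain in dB at frequency f (Hz),
    normalized to 0 dB at 1 kHz."""
    def h(freq):
        s = 2j * np.pi * freq
        return (1 + s * T2) / ((1 + s * T1) * (1 + s * T3))
    return 20 * np.log10(np.abs(h(f)) / np.abs(h(1000.0)))

# Roughly +19 dB at 20 Hz and -20 dB at 20 kHz relative to 1 kHz, so a
# reverse-RIAA stage that deviates even slightly leaves an audible tilt.
```

The ~40 dB total swing across the audio band is why even small errors in the playback-side inverse curve are audible as a frequency-response tilt.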

...to this point about vinyl masters, and to @levimax 's follow-up comment about it, it might very well be true that a "distressingly large" number of studios kept only the vinyl masters.

However, there is a truly enormous amount of music out there, and a "distressingly large" number can still represent a minority, even a small minority of the total number.
...
Yes, I agree. I was careful when raising this to avoid giving the impression that it was either general practice or scarce.

I know it was often enough to severely disappoint Toole -- and I don't know what that means numerically or proportionally.

It's just amazing that it happened at all. And something to bear in mind.

cheers
 
I haven't seen this thread pop up in my New Posts list for some time, and clearly there's something very wrong with me if I've decided to post in this thread again. :)

Don’t sweat it. Posts like the one you made, and the contributions of others on this subject in the past few pages, show that this thread still has some juice in terms of interesting subjects.
 
I can only hope by digital "clones" they meant flat transfers created precisely to be digital backups, to guard against eventual analogue master tape degradation.

If they are instead talking about CDs, first release or not, that's truly sad

If they truly did lose 175,000 masters, going back 30, 40 or more years, I can be pretty confident that only a very small fraction of them were properly digitally archived offsite. The sheer scale of setting up suitable RTRs to restore the tapes, then play and archive that many masters to a digital "clone", would have taken many decades even if it could be done.

Just a monumental loss and I don't think the listening public of those artists going forward will ever really know the provenance of the music. But then again, we never really did.
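The "many decades" claim holds up to a back-of-envelope check. All the numbers below are my own illustrative assumptions, not Universal's actual figures.

```python
# Back-of-envelope: real-time digitization effort for 175,000 reels.
# hours_per_tape and work_hours_per_year are assumed, illustrative values.
tapes = 175_000
hours_per_tape = 1.5           # playback plus handling/restoration overhead
work_hours_per_year = 2_000    # one full-time operator per transfer station

years_single_station = tapes * hours_per_tape / work_hours_per_year
# ~131 years for a single station; even ten parallel stations need ~13 years
```

Even with generous parallelism, a proactive archive of that size would have been a multi-decade, multi-station project.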
 
but LP was usurped as soon as people got a compact medium they could record what they wanted and take it anywhere.
But wasn't the source of what they were recording (in most cases) LPs?
 
If they truly did lose 175,000 masters, going back 30, 40 or more years, I can be pretty confident that only a very small fraction of them were properly digitally archived offsite. The sheer scale of setting up suitable RTRs to restore the tapes, then play and archive that many masters to a digital "clone", would have taken many decades even if it could be done.

Just a monumental loss and I don't think the listening public of those artists going forward will ever really know the provenance of the music. But then again, we never really did.
A lot would have depended on sales, at a guess. After all, they probably put effort into albums that could make money into the future. Since they could always cut from the stereo master in practice, they only needed a "vinyl master" for cutting additional acetates when the first set of stampers were exhausted. And they probably only kept valuable tapes offsite as it were. There would be first generation copy masters all over the place by the look of things (some of those "vinyl masters" may just have been copy masters) and not all first generation copies seem to have been poor, either.

Properly prepared first generation CDs would do nicely as "digital clones" in practice, especially the later ones made from 16-bit digital masters or better.

Don't forget Apple, either. They hold a huge stock of digital copies, with a lot of those at 24/96 which they have requested/required for years.

A lot of what was lost though were the multitrack originals. That makes them not available for remixing. It also raises the question of who is archiving all those multitrack digital file sets for modern albums being made in home studios around the world. Want a surround mix, anyone? You might have to wait for better AI.
 
I can only hope by digital "clones" they meant flat transfers created precisely to be digital backups, to guard against eventual analogue master tape degradation.

If they are instead talking about CDs, first release or not, that's truly sad - not because of CD's 16-bit/44.1kHz format, but because even in the 1980s when digital limiting had not been invented and mastering engineers tended towards a somewhat minimalist approach, vanishingly few CDs were truly flat transfers. They were almost all EQ'd or otherwise modified from the master tape in some audible way.
I seem to remember a ton of people joining in from here to say that the signal isn't "holy". Good CD copies will do if what we want to do is hear the music in stereo. I know we are supposed to be sceptical these days, but the reason we have mastering engineers is to save us from flat masters.

I know, I've argued for us to have access to flat copies in the past and it is a purist ideal, but not necessarily the best solution.
 
A lot would have depended on sales, at a guess. After all, they probably put effort into albums that could make money into the future. Since they could always cut from the stereo master in practice, they only needed a "vinyl master" for cutting additional acetates when the first set of stampers were exhausted. And they probably only kept valuable tapes offsite as it were. There would be first generation copy masters all over the place by the look of things (some of those "vinyl masters" may just have been copy masters) and not all first generation copies seem to have been poor, either.

Properly prepared first generation CDs would do nicely as "digital clones" in practice, especially the later ones made from 16-bit digital masters or better.

Don't forget Apple, either. They hold a huge stock of digital copies, with a lot of those at 24/96 which they have requested/required for years.

A lot of what was lost though were the multitrack originals. That makes them not available for remixing. It also raises the question of who is archiving all those multitrack digital file sets for modern albums being made in home studios around the world. Want a surround mix, anyone? You might have to wait for better AI.
Since the fire, I think much music is stored more carefully, like Universal's use of Iron Mountain. Remember, it was Universal Studios where the big fire happened in 2008.
 
A lot of what was lost though were the multitrack originals. That makes them not available for remixing. It also raises the question of who is archiving all those multitrack digital file sets for modern albums being made in home studios around the world.

Good point. Losing the multitrack masters is terrible, but losing the final 2-track masters as well is even worse.
 
I seem to remember a ton of people joining in from here to say that the signal isn't "holy". Good CD copies will do if what we want to do is hear the music in stereo. I know we are supposed to be sceptical these days, but the reason we have mastering engineers is to save us from flat masters.

I know, I've argued for us to have access to flat copies in the past and it is a purist ideal, but not necessarily the best solution.

That’s not my point or my position. I agree that in many or most cases we would not prefer a flat transfer as the final released product that we listen to.

But the point of a master source is to be the original mixdown recording, a “clean slate” from which a final master can be produced for the commercial release that we listen to.

Because we have plentiful empirical evidence that multiple masterings/(re)issues of music on CD sound different from each other and the first issue is not always flawless or widely preferred, it’s obvious that we want to have the original mixdown master as the archival base source from which future releases can be made. This isn’t “purist” in the way you are claiming. It’s just good practice and common sense, and I’d have thought that was obvious from my prior comment.

It’s bad if the only existing copy of an album is a CD that has a particular mastering engineer’s 1980s transfer equipment (which might very well include a sub-optimal/misaligned playback deck and/or non-transparent studio analogue gear before the ADC) and EQ choices baked into it. That reduces the fidelity of future releases that have to use that CD as their base source, and none of those baked in characteristics can be undone or reversed with any precision.

The hardware analogy to the position you’re taking here would be to say that from a “purist” point of view it would be ideal to use fully linear, load-independent amplification, but if high-quality solid-state amps were all destroyed in a fire then it’s no big deal to use a tube amp and just make it more linear with EQ and DSP in the playback chain. That position wouldn’t fly here, and for good reason.

I have noticed here at ASR a tendency among some members to ignore or dismiss the importance of mastering in the perceived sound quality of the listening experience, and I find that bizarre. Yes, mastering comes before the playback chain, and there’s no objective measurement standard for right vs wrong in mastering. But that doesn’t mean that mastering makes no sonic difference - it obviously and clearly does, in cases when multiple masterings exist and their differences are obvious.

So having the original, unadulterated stereo source available as a basis for mastering is important and goes way beyond a “purist” concern for anyone interested in fidelity as the standard, as most of us here at ASR are.
 
OK - if you mean cassettes of all kinds, I'd agree. I was assuming you meant pre-recorded cassettes. Of which I never bought any - and knew no-one who bought more than one or two. They were seen as too likely to be eaten by cheap players to invest more than the cost of a blank tape in.

Though I've not been able to find any UK based data along the lines of the chart shown above.
That was the joy of life in the 80s and early 90s - if someone among your friends, at school, at work, in the youth club, whatever, had some music you liked, you could get the music for free - just dub it to a cassette tape.
Of course, when the mp3 craze began many years later, the same thing happened in a different way.



 
That’s not my point or my position. I agree that in many or most cases we would not prefer a flat transfer as the final released product that we listen to.

But the point of a master source is to be the original mixdown recording, a “clean slate” from which a final master can be produced for the commercial release that we listen to.

Because we have plentiful empirical evidence that multiple masterings/(re)issues of music on CD sound different from each other and the first issue is not always flawless or widely preferred, it’s obvious that we want to have the original mixdown master as the archival base source from which future releases can be made. This isn’t “purist” in the way you are claiming. It’s just good practice and common sense, and I’d have thought that was obvious from my prior comment.

It’s bad if the only existing copy of an album is a CD that has a particular mastering engineer’s 1980s transfer equipment (which might very well include a sub-optimal/misaligned playback deck and/or non-transparent studio analogue gear before the ADC) and EQ choices baked into it. That reduces the fidelity of future releases that have to use that CD as their base source, and none of those baked in characteristics can be undone or reversed with any precision.

The hardware analogy to the position you’re taking here would be to say that from a “purist” point of view it would be ideal to use fully linear, load-independent amplification, but if high-quality solid-state amps were all destroyed in a fire then it’s no big deal to use a tube amp and just make it more linear with EQ and DSP in the playback chain. That position wouldn’t fly here, and for good reason.

I have noticed here at ASR a tendency among some members to ignore or dismiss the importance of mastering in the perceived sound quality of the listening experience, and I find that bizarre. Yes, mastering comes before the playback chain, and there’s no objective measurement standard for right vs wrong in mastering. But that doesn’t mean that mastering makes no sonic difference - it obviously and clearly does, in cases when multiple masterings exist and their differences are obvious.

So having the original, unadulterated stereo source available as a basis for mastering is important and goes way beyond a “purist” concern for anyone interested in fidelity as the standard, as most of us here at ASR are.
I'm trying to tread carefully here, but there are a lot of ifs floating around.

We have no idea that the original mixdown recording is a "clean slate". The mixing engineer could just as easily be using a sub-optimal/misaligned tape deck and/or non-transparent studio analogue gear, may be mixing on an analogue desk with less than perfect slider controls, and could also be recording to another sub-optimal misaligned tape deck.
So in most cases, we'd just be insulting engineers who were doing their best with the equipment they had. Guessing that the mastering engineer in the 1980s couldn't set up their deck, or went out of their way to use poor amplification, is just guessing. I fancy the mixing process is much more likely to produce errors, by its nature, than playing back a two-track master into an ADC. There's nothing necessarily pure about a studio master; it is usually what we have, though. I'm not convinced that we have a software equivalent of fully linear, load-independent amplification anyway, to refer back to your hardware example.

There are two bigger problems. The first is definitely there in your comments: the taste of the engineer. But for a lot of those transfers, the engineer is going to be more in line with the contemporary taste of when the recording was made than an engineer with modern tastes today, and that may mean more for the presentation. The second, covered in posts here already, is the quality of the master. But again, if the stereo master was used, it's likely to have been cleaner then than it is now, and the tape player in better condition as well, compared to the aging beasts that are used today (of course today's engineers are doing great work on them, but it's still an issue). The ADC wouldn't be to modern standards, but where are the measurements that show it is guaranteed audibly inferior? By how much?

I do agree that mastering is important. However, I'd argue that the largest component of the difference between early masters and today's is going to be subjective. And we have to get off the remastering bandwagon at some point, especially from analogue originals. It's less of a deal than you're making out.

The exception to this relates to the original multitracks, which if preserved well allow us to have Atmos mixes and such.

The point is to be able to listen to the music, ultimately. And to turn your hardware analogy on its head: if it suddenly became impossible to use fully linear amplification to listen to music, I'd be first in the queue for that tube amp. For all our audiophilic leanings, at the end of the day I'm here to listen to music, and if I had to listen on a 1960s cassette deck or a 1930s wireless set, I would. The point of what we do is to listen to recorded music. If I can't have the standard of equipment I have in my living room right now, I'm not giving up. For me, that position would absolutely fly here. What's the alternative, give up?
 
Rubbish!

The changes needed to make a master stamper for LP are not subtle at all.
According to @atmasphere this is not correct, but good information is hard to come by. In your opinion, what are the "not subtle" audible changes made to an LP master? To me, "sum to mono" for lower bass is not a big deal, as a lot of people do that with a sub anyway. Some roll-off of highs and lows is also not a big deal for most pop music, and neither is dynamic compression. RIAA is applied at the cutting stage, so the LP master is not RIAA EQ'd. These issues do affect classical music much more than other types of music. Also, if you look at the specs for a Studer A80 or similar, they already roll off the highs and lows, and the other specs are nothing special. I do agree that by the time an original master becomes a stamper the quality has been degraded, but the degradation from original master tape to LP master tape is not the worst of it, and in some cases an LP cutting master is "better" than the original, if a skilled mastering engineer did his job correctly.

Studer A80 Specs.jpg
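The "sum to mono for lower bass" move mentioned above, often called elliptical EQ, can be sketched in mid/side terms: high-pass the side (L minus R) signal so low frequencies collapse to mono while the rest of the stereo image is untouched. This is a minimal illustration with an assumed one-pole filter and an arbitrary 150 Hz corner, not a model of any real cutting console.

```python
import numpy as np

def elliptic_eq(left, right, fs, fc=150.0):
    """Sum the stereo signal to mono below roughly fc (Hz) by high-pass
    filtering the side (L-R) component with a one-pole filter."""
    mid = 0.5 * (left + right)
    side = 0.5 * (left - right)
    a = np.exp(-2.0 * np.pi * fc / fs)  # one-pole high-pass coefficient
    filtered = np.empty_like(side)
    prev_x = prev_y = 0.0
    for i, x in enumerate(side):
        y = a * (prev_y + x - prev_x)   # y[n] = a*(y[n-1] + x[n] - x[n-1])
        filtered[i] = y
        prev_x, prev_y = x, y
    # Recombine: bass (where the side signal is filtered out) is now mono
    return mid + filtered, mid - filtered
```

Feeding it a fully out-of-phase 20 Hz tone (pure side content, the worst case for groove cutting) yields heavily attenuated output, while an out-of-phase 5 kHz tone passes nearly unchanged.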
 
I'm trying to tread carefully here, but there are a lot of ifs floating around.

We have no idea that the original mixdown recording is a "clean slate". The mixing engineer could just as easily be using a sub-optimal/misaligned tape deck and/or non-transparent studio analogue gear, may be mixing on an analogue desk with less than perfect slider controls, and could also be recording to another sub-optimal misaligned tape deck.
So in most cases, we'd just be insulting engineers who were doing their best with the equipment they had. Guessing that the mastering engineer in the 1980s couldn't set up their deck, or went out of their way to use poor amplification, is just guessing. I fancy the mixing process is much more likely to produce errors, by its nature, than playing back a two-track master into an ADC. There's nothing necessarily pure about a studio master; it is usually what we have, though. I'm not convinced that we have a software equivalent of fully linear, load-independent amplification anyway, to refer back to your hardware example.

There are two bigger problems. The first is definitely there in your comments: the taste of the engineer. But for a lot of those transfers, the engineer is going to be more in line with the contemporary taste of when the recording was made than an engineer with modern tastes today, and that may mean more for the presentation. The second, covered in posts here already, is the quality of the master. But again, if the stereo master was used, it's likely to have been cleaner then than it is now, and the tape player in better condition as well, compared to the aging beasts that are used today (of course today's engineers are doing great work on them, but it's still an issue). The ADC wouldn't be to modern standards, but where are the measurements that show it is guaranteed audibly inferior? By how much?

I do agree that mastering is important. However, I'd argue that the largest component of the difference between early masters and today's is going to be subjective. And we have to get off the remastering bandwagon at some point, especially from analogue originals. It's less of a deal than you're making out.

The exception to this relates to the original multitracks, which if preserved well allow us to have Atmos mixes and such.

The point is to be able to listen to the music, ultimately. And to turn your hardware analogy on its head: if it suddenly became impossible to use fully linear amplification to listen to music, I'd be first in the queue for that tube amp. For all our audiophilic leanings, at the end of the day I'm here to listen to music, and if I had to listen on a 1960s cassette deck or a 1930s wireless set, I would. The point of what we do is to listen to recorded music. If I can't have the standard of equipment I have in my living room right now, I'm not giving up. For me, that position would absolutely fly here. What's the alternative, give up?

Respectfully, not a single point you make here changes the fact that preserving the original mixdown master is in every case preferable to not preserving it and instead preserving only a CD that has been made from that master.

Once again, note that I am not saying the original mixdown master is preferable for release to consumer music listeners. So everything you say about an original mixdown master and/or a flat digital transfer aka clone of that master possibly being mediocre, not done to the highest standard, or a result of a mixing engineer's "taste" is absolutely true. But none of that means that it's preferable, or even equally good, to only have access to a CD that has had additional work done on top of that original master/clone work.

Again, that CD might be preferable to the original master on its own as a release. But for the purposes we are discussing here - the existence of a version of the recording used as a source for future mastering of future releases - the pig is better than the pig with lipstick on it, because the original pig can have the benefit of technological advances in lipstick, of different tastes in lipstick color or application technique, and so on. If it's felt that the original lipstick that was put on the pig is as good as it gets, then fine - the original lipstick on a pig can be reissued as-is, or those who like it can simply play their copy (or easily buy one on the used market).

RE the hardware analogy, you've completely missed the point: I would be right there with you in line to get that tube amp if all the better amps suddenly disappeared. My point was never that I would simply refuse to listen to the music anymore - I'm sorry, but that's just absurd. Of course none of us would "give up" - my point is simply that if all those better amps disappeared and all we had was the nonlinear, load-dependent tube amp, that would not be a good thing and I'd be sad about it.

Your final point illustrates exactly the critique I was making in the comment you are responding to: this bizarre compulsion to try to claim that mastering doesn't really matter when in fact mastering of the source routinely makes more of an audible difference in what we hear than anything else in the playback chain except for speakers and the room.

It's just silly to dismiss the importance of variations in mastering, and it's equally silly to try to argue that the loss of the original, first-generation stereo source for a stereo piece of music is of no importance and not worth lamenting.
 
According to @atmasphere this is not correct, but good information is hard to come by.

I agree that this is a question that's not easy to answer in a definitive, blanket way.

However, with all respect, atmasphere's posts here are invariably anecdotal in nature. The pattern is quite clear and repeated: the posts start with a blanket factual claim, and the evidence usually turns out to be what he did or saw in this or that situation. More than once his posts have even stated explicitly that what he did was in his opinion a best practice that was not consistently followed by others in the industry.

So to my eyes, most of atmasphere's posts are evidence for what he has done, and what he thinks or wishes would be (or would have been) industry standard practice, but were not.
 