[Slowhand] Re: Da Boss Man Pt. 1

Luke Pacholski LukPac at lukpac.org
Sun Aug 22 19:02:03 EDT 2004


AG, re:

>As *I* use it (correct me if others use it differently), "on the 
>fly" copying is where you use one (source) drive and another to burn 
>(destination) at the same time.

That's the norm, yeah. *Although*, I think (don't quote me) some 
programs these days still read to the hard drive first, then burn.

>The problem with this - as Luke has pointed out - especially a few 
>years back with older PCs, was that if the reader hits any 
>scratches, dirt, hair, dust, anything, and wants to go back and 
>re-read a sector, this can cause a hiccup or gap on the copy being 
>burned.

True. However, decent software *should* stop writing with a "buffer 
underrun" whenever it encounters such a situation. What that means is 
you only have a burned disc up to that point, after which you have 
nothing.

>It doesn't work the same way with CD-Audio. When CDs were invented, 
>the engineers never in their wildest dreams thought of this kind of 
>problem, they designed the error correction routines to allow for 
>the problems that would likely be encountered with pressed CDs. 
>Here's one example - if you look at the data capacities for an audio 
>CD versus those for a data mode CD-ROM, you will see that the CD-ROM 
>has less room for you to use - the "missing" room is in the form of 
>the far more robust error correction and checksum routines present 
>on CD-ROMs, since losing even a single bit of data on for instance a 
>program or file can prevent it from working altogether. Whereas a CD 
>has 44,100 samples PER SECOND and the error correction in the player 
>can compensate for some missing samples.

The error correction actually isn't the biggest problem here - the 
error correction used on CDs is so robust that gaps of up to 2.4mm 
can be *correctly* reconstructed from the surrounding data. No, I 
don't mean "it will get close", but actually "the data you read off 
is exactly what was put down." Compared to most "pinholes" on CDs, 
2.4mm is huge. Gaps of up to 8.5mm can be approximated - you 
shouldn't be able to hear any problems.

More here:

http://www.disctronics.co.uk/technology/cdbasics/cd_frames.htm

No, the larger problem is the way data is stored on audio CDs. As CDs 
were meant to be played at 1x, and not on a computer, the audio data 
is essentially one long spiral of data, without block addressing. 
Poor software might re-read parts of blocks and miss others. Like 
this:

Data:  ABCDEFGHIJKLMNOPQRSTUVWXYZ
Pass1: ABCDE
Pass2:    DEFGH
Pass3:           KLMNO

Etc... Good software reads and re-reads so that the passes overlap. 
I.e., it can say "I've already read DE, so I'll start this next read 
at F." More here:

http://www.cdrfaq.org/faq02.html#S2-15
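
To make the problem concrete, here's a rough sketch of that naive 
approach - plain Python with made-up names, nothing from any real 
ripping program. The idea is that each fresh seek on an audio CD can 
land a few samples early or late, so stopping and restarting between 
chunks doubles or drops samples:

import random

disc = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"   # stand-in for the audio spiral

def seek_and_read(start, length):
    # On an audio CD a seek is only approximate: the drive lands a
    # few samples early or late of the position it was asked for.
    slop = random.randint(-2, 2)
    pos = max(0, start + slop)
    return disc[pos:pos + length]

# Naive extraction: read a chunk, stop (say, to write it to the hard
# drive), then seek back to where we *think* we left off.
ripped = ""
pos = 0
while pos < len(disc):
    ripped += seek_and_read(pos, 5)
    pos += 5   # assumes the seek was exact - it wasn't

print(ripped)   # samples get doubled or dropped at each chunk boundary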

---
>The problem occurs because the Philips CD specification doesn't 
>require block-accurate addressing. While the audio data is being fed 
>into a buffer (a FIFO whose high- and low-water marks control the 
>spindle speed), the address information for audio blocks is pulled 
>out of the subcode channel and fed into a different part of the 
>controller. Because the data and address information are 
>disconnected, the CD player is unable to identify the exact start of 
>each block. The inaccuracy is small, but if the system doing the 
>extraction has to stop, write data to disk, and then go back to 
>where it left off, it won't be able to seek to the exact same 
>position. As a result, the extraction process will restart a few 
>samples early or late, resulting in doubled or omitted samples. 
>These glitches often sound like tiny repeating clicks during 
>playback.
>
>On a CD-ROM, the blocks have a 12-byte sync pattern in the header, 
>as well as a copy of the block's address. It's possible to identify 
>the start of a block and get the block's address by watching the 
>data FIFO alone. This is why it's so much easier to pull single 
>blocks off of a CD-ROM.
>
>With most CD-ROM drives that support digital audio extraction, you 
>can get jitter-free audio by using a program that extracts the 
>entire track all at once. The problem with this method is that if 
>the hard drive being written to can't keep up, some of the samples 
>will be dropped. (This is similar to a CD-R buffer underrun, but 
>since the output buffer used during DAE is much smaller than a 
>CD-R's input buffer, the problem is magnified.)
>
>Most newer drives (as well as nearly every model Plextor ever made) 
>are based on an architecture that enables them to accurately detect 
>the start of a block.
>
>An approach that has produced good results is to do jitter 
>correction in software. This involves performing overlapping reads, 
>and then sliding the data around to find overlaps at the edges. Most 
>DAE programs will perform jitter correction.
---
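
And here's a minimal sketch of the overlap idea described above - 
again just an illustration, not EAC's (or any other ripper's) actual 
code. You deliberately ask the drive to start the next read a bit 
*before* where you left off, then slide the new chunk around until 
it lines up with samples you already have:

def splice(have, new_chunk, overlap=3):
    # Find the last few samples we already have inside the new read...
    tail = have[-overlap:]
    idx = new_chunk.find(tail)
    if idx == -1:
        return have + new_chunk   # couldn't line it up; blind append
    # ...and keep only what comes after them.
    return have + new_chunk[idx + len(tail):]

ripped = "ABCDE"
# We asked the drive to restart back at "C", but it landed one sample
# early and returned this instead:
jittered_read = "BCDEFGHI"
print(splice(ripped, jittered_read))   # ABCDEFGHI - no doubled samples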

>This is ALSO why the "it sounds fine to me" test is no proof that 
>the copy is identical to the original - it's possible that the copy 
>you made has a percentage of bad data, but when you play back the 
>copy the CD player is "fixing" the errors. This is great - unless 
>THAT disc gets copied, and then THAT disc gets copied, etc.  The 
>problem snowballs on later generations. Here's one literal example: 
>with my older PC/soft, I could make a copy of the same silver 
>(pressed) CD with both Adaptec/Roxio and with EAC.  Both of them 
>played perfectly on my home CD player.  Now my car player is a 
>little older and has a harder time "seeing" CD-R discs since they're 
>less reflective.  The disc copied with EAC mounts up instantly and 
>plays perfectly. The Roxio copy would grind and spit a while, and 
>either not play, or would play jerkily
>because the car player had a harder time correcting for the errors.

While I agree that burning errors isn't a good idea, I don't think 
that was your problem. Even if your Roxio rip/burn did contain errors 
compared to the original, that would have *nothing* to do with not 
being able to read the new disc. Let's take my example above. After 3 
passes you get the following data:

ABCDEDEFGHKLMNO

Compared to the original, yeah, that has errors. But as far as any 
drives reading that are concerned, that's what the data is supposed 
to look like. The resulting disc would be exactly the same as a disc 
that had that data in the first place.
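
Or, as a toy illustration (again, just Python for the sake of 
argument, not anything a real program does):

original    = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
jittery_rip = "ABCDEDEFGHKLMNO"   # the three passes above, glued together

# The rip differs from the source disc - that's a ripping error:
print(jittery_rip == original[:len(jittery_rip)])   # False

# But the burner writes those bytes verbatim, so the burned copy reads
# back as exactly what was written - "wrong" relative to the original,
# yet perfectly readable in its own right:
burned_copy = jittery_rip
print(burned_copy == jittery_rip)                    # True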

If a disc is having problems being read, it's a problem with the 
burning step, *not* the ripping step.

Luke
-- 
http://lukpac.org/

