Melco ripping embarrassment

I wonder how the UnitiServe knew it had a 100% accurate rip?

When you’re ripping CDs (using any transport) it’s not uncommon to hit minor errors. This is what AccurateRip protects against, using publicly available databases of checksums of known-good bitstreams. In our collection the error rate was about 20% of CDs until I began cleaning them before ripping. Different transports had issues with different CDs. Cleaning the CD and re-ripping usually resolves the problem.
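For anyone curious what that kind of verification looks like in practice: the real AccurateRip database is keyed by disc ID and uses its own CRC scheme, but a simplified sketch of the idea (hypothetical disc ID, and SHA-256 standing in for the real checksum) might look like this:

```python
import hashlib

# Hypothetical stand-in for the AccurateRip database. The real service keys
# entries by disc ID and uses its own CRC, not SHA-256 -- this only sketches
# the idea of matching a rip against hashes of known-good bitstreams.
KNOWN_GOOD = {
    ("disc-123", 1): "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def verify_rip(disc_id: str, track: int, pcm: bytes) -> bool:
    # Hash the reassembled audio payload and look it up in the database.
    digest = hashlib.sha256(pcm).hexdigest()
    return KNOWN_GOOD.get((disc_id, track)) == digest

# A "rip" whose payload happens to be the bytes b"test" matches the entry;
# a single extra byte does not.
assert verify_rip("disc-123", 1, b"test")
assert not verify_rip("disc-123", 1, b"test\x00")
```

If the hash matches the database entry, many independent drives produced the same bitstream, which is strong evidence the rip is error-free.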

The Melco being very much slower, is it reading the CD many times and taking a majority vote to work around read errors? The output would still be the same perfect bitstream as an AccurateRip-verified rip made elsewhere.

Perhaps comparing every UnitiServe rip to an independently verified accurip bitstream is comparing apples and oranges.

I also remember that horrible feeling when the penny dropped and I knew I had to re-rip all mine, plus knowing that this time it was going to take even longer.
Kept the D100 even if it’s hardly used now, as you never know.


I don’t think that’s quite how it works.

.sjb

I reckon 90% of my listening is via Tidal now. I really hope they ripped the music with an overpriced Buffalo NAS :slight_smile:

I still have mine too as I occasionally buy a CD when I can’t source a 24-bit version of the album, but otherwise I generally only buy 24-bit downloads.


Same as you really. I use my Qobuz Sublime account and download directly to the Melco.

I really am quite happy for it to be proved wrong, but so far it does answer all the diagnostics given here. The biggest point being that a Melco-ripped file “only” sounds better through a Melco server.

I have vowed never to buy from Qobuz again!

I just like the extremely easy way to buy and download the music to the Melco directly, with nothing else in the way: no PC, no stick, etc.
Plus being a Sublime member gets very nice discounts.


Have you noticed the sneaky Qobuz clause in their contract which requires you to agree that the download contract is fulfilled as soon as the download begins?
That was introduced after I forced them (via an intervention by PayPal) to refund me after a purchased Bob Dylan Basement Tapes album wouldn’t download correctly. When I asked for a refund they told me that ‘Qobuz doesn’t do refunds’. They wanted me to take something else to the same value instead, but I didn’t want anything else at the time.
They later fixed the download problem but I told them I had subsequently bought the album from another source. They insisted on seeing the HDTRACKS invoice before they would refund me.
I became very wary after that and would only buy when there was no alternative. But in the last year or so I have downloaded a couple of albums which had track names too long to be accommodated in my system, so they couldn’t be copied. (I download to a PC and then copy to the Melco.) In some cases I was able to edit the track name and so fix the problem, but recently bought an album which had track names too long to be edited in my system.
My email to them was not responded to and I had to buy the album elsewhere.
So, never again. HDTracks and Prostudiomasters have very slick downloading processes which also allow re-downloading if necessary, so goodbye to extracting tracks from .tar files!


I’ve had occasional issues with track name length but never had an issue editing them using one of mp3tag’s features to create a new track name from slices of metadata - ‘tracknumber_trackname’ style. Not that the track name matters at all, just a number would do.
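For anyone who prefers scripting this, a rough stdlib-only sketch of that kind of bulk rename (this isn’t mp3tag, just an illustration of the ‘tracknumber_trackname’ truncation idea; `MAX_NAME` is a made-up limit for the target filesystem or NAS):

```python
import tempfile
from pathlib import Path

MAX_NAME = 40  # hypothetical limit on file-name length imposed by the NAS

def short_name(tracknumber: int, title: str, ext: str = ".flac") -> str:
    # 'tracknumber_trackname' style, truncated so the whole name fits MAX_NAME.
    stem = f"{tracknumber:02d}_{title}"
    return stem[: MAX_NAME - len(ext)] + ext

def rename_tracks(folder: Path, titles: dict) -> list:
    # Rename every file matching 'NN*.flac' to its shortened form.
    renamed = []
    for num, title in sorted(titles.items()):
        for f in list(folder.glob(f"{num:02d}*.flac")):
            target = folder / short_name(num, title)
            f.rename(target)
            renamed.append(target.name)
    return renamed

# Demo in a throwaway folder:
d = Path(tempfile.mkdtemp())
(d / "03 placeholder.flac").touch()
new = rename_tracks(d, {3: "An Extremely Long Track Title That Will Not Fit"})
assert all(len(n) <= MAX_NAME for n in new)
```

As the post says, the title portion barely matters for playback; the track number prefix alone would keep the album in order.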

Life’s too short to be going through that malarkey! I’m looking for simplicity, not tech solutions…

It’s a few clicks of the mouse to amend all the track names simultaneously.

I use Roon so haven’t amended anything for years :slightly_smiling_face:

I wouldn’t buy anything from HDtracks, there’s too much uncertainty about the origin of the files. :face_with_raised_eyebrow:


This was discussed to death on a few threads. IIRC, it was established that the packed data was identical once the RIFF format header was stripped off; however, what was not established was how the data was packed. The RIFF format is strict but allows for some variability in how samples are grouped and time-coded. The AccurateRip checksum reports on the reassembled bit sequence only. Anything that rips data from a CD is sending that (or a checksum of it), not the RIFF payload, which AccurateRip doesn’t understand.
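To illustrate why two rips can carry an identical payload yet differ at the file level: the RIFF container permits extra chunks alongside the audio data, so a minimal hand-rolled sketch (hypothetical chunk contents, not any particular ripper’s output) can build two WAVs whose bytes differ but whose `data` payloads match:

```python
import hashlib
import struct

def make_wav(payload: bytes, extra_chunk: bytes = b"") -> bytes:
    # Minimal RIFF/WAVE container: 'fmt ' chunk, optional extra chunk, 'data' chunk.
    fmt = struct.pack("<HHIIHH", 1, 2, 44100, 44100 * 4, 4, 16)  # 16-bit stereo PCM
    chunks = b"fmt " + struct.pack("<I", len(fmt)) + fmt
    chunks += extra_chunk
    chunks += b"data" + struct.pack("<I", len(payload)) + payload
    return b"RIFF" + struct.pack("<I", 4 + len(chunks)) + b"WAVE" + chunks

def data_payload(wav: bytes) -> bytes:
    # Walk the chunk list, ignoring everything except the 'data' chunk contents.
    pos = 12
    while pos < len(wav):
        cid = wav[pos:pos + 4]
        size = struct.unpack("<I", wav[pos + 4:pos + 8])[0]
        if cid == b"data":
            return wav[pos + 8:pos + 8 + size]
        pos += 8 + size + (size & 1)  # chunks are word-aligned
    raise ValueError("no data chunk")

samples = struct.pack("<4h", 100, -100, 200, -200) * 1000

# A made-up LIST/INFO chunk, as a ripper might embed to record its own name:
info = b"ISFT" + struct.pack("<I", 4) + b"rip\x00"
extra = b"LIST" + struct.pack("<I", 4 + len(info)) + b"INFO" + info

wav_a = make_wav(samples)
wav_b = make_wav(samples, extra)

# The files differ (different file-level checksums)...
assert hashlib.md5(wav_a).hexdigest() != hashlib.md5(wav_b).hexdigest()
# ...but the audio payload is bit-identical once the container is stripped.
assert data_payload(wav_a) == data_payload(wav_b)
```

So comparing whole files proves nothing either way; only the extracted payloads are comparable.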

There was certainly talk of processing two files byte for byte and looking at not just the payload but the sample-sequence packing. I remember because I suggested this. I actually have more time now and could probably do this over Xmas and New Year if someone cares to supply me with two WAV files: one from a Melco, the other from something like EAC or dbPoweramp, both having passed AccurateRip.

Certainly I remember a similar exercise I documented on the old forum, where I was trying to determine why EAC rips sounded so much better than MediaPlayer rips, and just looking at the 4-byte samples (not how they were packed) was enough to show significant deviation (a minuscule fraction like 0.00x%, but multiplied over 1 second it gave a statistically significant number of deviant samples). Of course, in that case one rip was most definitely not passing AccurateRip, and packing was not examined at all (why bother? Divergent samples were explanation enough). That’s going back 8 years or more though.
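That kind of sample-level comparison is easy to sketch. Assuming 16-bit stereo PCM, something like this counts deviant frames between two rips (synthetic data here, not a real rip):

```python
import struct

def deviant_samples(rip_a: bytes, rip_b: bytes) -> tuple:
    # Compare two rips frame by frame (16-bit stereo, 4 bytes per frame)
    # and report how many frames differ, plus that as a percentage.
    assert len(rip_a) == len(rip_b)
    frames_a = struct.iter_unpack("<hh", rip_a)
    frames_b = struct.iter_unpack("<hh", rip_b)
    total = diffs = 0
    for fa, fb in zip(frames_a, frames_b):
        total += 1
        diffs += fa != fb
    return diffs, 100.0 * diffs / total

# One second of 44.1 kHz "audio" with a handful of flipped bits:
good = struct.pack("<hh", 1000, -1000) * 44100
bad = bytearray(good)
bad[40] ^= 0x01      # corrupt one byte in each of three separate frames
bad[4000] ^= 0x01
bad[40000] ^= 0x01

diffs, pct = deviant_samples(good, bytes(bad))
assert diffs == 3
assert pct < 0.01  # a tiny fraction, yet a real count of wrong samples
```

Even a 0.00x% deviation rate means dozens or hundreds of wrong samples per second at 44.1 kHz, which is the point the old exercise made.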

I’d caution anyone against falling into the trap that a certain (allegedly) science-based audiophile publication frequently does, where assumptions are made based on an understanding of maybe 10% of a topic, completely unaware that there is another 90%. Those assumptions and bold declarations are often totally wrong, ill-informed and anything but scientific.

I see a lot of assumptions made on this topic where people are absolutely sure they know. Therein lies the danger.

If people are interested I can certainly do this. It won’t be fast though - I have a day job and a family :grin:


I hope one or more of the Melco owners on here take up your offer. You didn’t state how to send files to you, but perhaps @Richard.Dane would oblige by forwarding them upon request. Otherwise, if you authorise it, he can pass your email address to specified people.

I don’t believe Naim will facilitate the transfer of files for legal reasons. But if someone wants to take me up on this and we agree in this thread, Richard can share my details and we can work out a location to fetch them from off-forum.

I agree with you on this. At first I attributed this to the D100 I had borrowed.

But I am now back to using a Buffalo BDXL and the rips sound better when I connect it to the Melco than when using dbPoweramp on a Mac. And this makes more sense, as any noise from the ripping device should not carry over to the file stored on disk.

AccurateRip cannot be trusted these days when fewer are buying CDs.

I simply believe Melco has tuned the file format and the hard-disk driver to minimize electrical noise when reading.

I use the recommended settings from the Melco blog: store as uncompressed FLAC with sharing turned off. I ran Minim for a while but I am back on Twonky, and use SongKong running on the Melco to edit metadata.

Art is a useful analogy. David Hockney is an amazing artist who has produced digital works in addition to his long career of paintings and other media.

If you made a digital copy of one of his digital paintings, it would be identical, but a photocopy of the image would not be.

The difference with digital is that everything starts from a blank canvas (excuse the pun). A CD is blank until a laser cuts pits to store the binary code. When a copy is made, there is nothing to copy except what was recorded in the first place, because all that is being copied is bits (literally an ‘on’ or ‘off’ signal). Each bit holds only one piece of information and it’s either there or it isn’t. It’s therefore easy to copy a digital file because all you need to do is copy the little on and off markers.

A separate checksum (calculation) runs to confirm that the bits in the original file have been copied and none missed or introduced. There is therefore no complication introduced by variables such as media, temperature, humidity, equipment and so on, as there is for an analogue signal copy.
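That verification step is cheap and routine. A sketch of the idea, hashing a copy and its source with SHA-256 (stand-in data, not real audio):

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def file_digest(path: Path) -> str:
    # Hash the file in chunks so large rips don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

# A throwaway "rip" standing in for a real audio file:
src = Path(tempfile.mkdtemp()) / "track01.flac"
src.write_bytes(b"\x01\x00" * 100_000)

dst = src.with_name("copy_of_track01.flac")
shutil.copyfile(src, dst)

# Identical digests mean no bit was missed or introduced in the copy.
assert file_digest(src) == file_digest(dst)
```

If even one bit had changed, the digests would differ, which is exactly the confidence the post describes.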

This enables us to be completely confident that a copied digital file is identical. The world’s economic systems would collapse without such confidence.

If the two files you have created are identical (have you checked this?) then they are identical. A file cannot store information in some other way that cannot be read; it would show up as extraneous bits in the file.

This is not accurate. A CD has two levels of error correction: one is deterministic, the other isn’t. Given enough misreads, the non-deterministic level must be used, and therefore error is introduced.

They almost certainly won’t be identical at the file level. The header metadata and the number of lead-in zero-byte samples will almost certainly differ between any two ripping mechanisms, even if the AccurateRip checksum is the same. The files will (almost) certainly have different checksums, so you cannot use a file checksum as a reliable way to verify anything.

A common misconception is that when ripping CDs we are dealing with data files. This is patently false. The data on a CD is digital, but the arrangement has more in common with a vinyl LP than with a file on your computer. Tracks are laid out with time markers in the TOC that tell the laser head physically where to go on the disc, as a “best guess” (but pretty bang-on accurate by the standard of our ears) location to find the start of a stream. There are absolutely no files making up the audio portion of a CD.

In ripping, then, you are taking non-file-organised data laid out in the Red Book order along with its error-correction data, reassembling only the resulting bitstream, and then subdividing that and storing those samples in whichever format is required.

An important distinction here is that with files, you can checksum file X and compare it to file Y to ensure that two parties have identical data. But when ripping, you cannot do that. The best you can do is say: if I reconstruct payload X from Red Book and you reconstruct payload Y from a file, we hope the payloads are the same. But after that we go our separate ways. Red Book stays on the disc in its non-file format, and you take the payload and write it to a file; we trust that, once reconstructed, the payload is the same again, but we don’t verify it.

Because AccurateRip verification is not done on a file; it is done on the payload before it is written to file.
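The distinction is easy to demonstrate: hash the payload at the point a ripper would verify it, then note that the resulting WAV file no longer hashes the same, even though the payload inside it is untouched. (A sketch with synthetic samples, not a real rip:)

```python
import hashlib
import io
import struct
import wave

def payload_checksum(pcm: bytes) -> str:
    # Checksum of the raw PCM payload -- no container involved.
    return hashlib.sha256(pcm).hexdigest()

# Stand-in for the bitstream reassembled from a rip:
pcm = struct.pack("<8h", 0, 1, 2, 3, 4, 5, 6, 7) * 512

# This is the point where AccurateRip-style verification happens:
ripped = payload_checksum(pcm)

# Now write that payload into a WAV container...
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)
    w.setframerate(44100)
    w.writeframes(pcm)

# ...the payload read back out is unchanged, but the file as a whole
# hashes differently because of the container bytes around it.
buf.seek(0)
with wave.open(buf, "rb") as w:
    stored = payload_checksum(w.readframes(w.getnframes()))

assert stored == ripped
assert hashlib.sha256(buf.getvalue()).hexdigest() != ripped
```

So two rippers can agree perfectly at the payload stage and still produce files that look different to any file-level comparison.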

Now, I personally do have a view on this topic. After all, I use EAC rather than a Melco or a Core, so what I think is pretty easy to figure out. I may work in IT for a living and have a fair grasp of some of this stuff, but I like to think I am smart enough to say honestly, “I have not programmatically compared two such files from a Melco and method X, taking into account the finer details of the RIFF format, and therefore I do not have definitive proof.”


The number buying is irrelevant; what counts is the number ripping, or who have already ripped. For recent CD releases the reduction in buyers may well mean AR data is limited, but it’s different for the millions of releases over the decades that people have ripped and submitted to AR, and are ripping even now as they shift their longstanding CD collections to music files.