In a world where ultra-thin laptops and tablets (none of which have optical disc drives) are beginning to supplant the desktop, in a tentatively named ‘post-PC era’, the optical disc is losing its relevance. Its limited storage compared to multi-terabyte hard disks, its vulnerability to scratches, its limited reliability and lifetime, and its bulky size have made it one of the next technologies to disappear from the home. Increasing reliance on digital download delivery and online purchases has made the optical disc a bit of an outcast.
Despite this, I still use optical discs, although I realize we are sitting on the tipping point. There are advantages and disadvantages to each technology – delivery of high-quality, high-definition content is still best achieved using optical discs due to the limited bandwidth of present internet connections and download costs. Backups to write-once optical discs also have a permanence that cannot be accidentally deleted, erased or overwritten. And failures of drive mechanisms are separated from the media – you can always read the disc in a different drive, whereas a failure of a hard disk’s head subsystem pretty much leaves the data stranded forever (or involves an extremely expensive data recovery operation that most people cannot justify). And unlike flash memory, which stores data as charges in a semiconductor circuit that degrade over time – properly constructed, burnt and stored optical discs should last longer.
But still, despite the advances of optical discs – from the initial beginnings of CD at around 650–700MB, through DVD at 4.7/8.5GB, to Blu-ray at 25/50GB and BDXL at 100/128GB – the capacity of optical discs has never lagged so far behind hard disks as it does now. Backups of hard disks to optical discs are unwieldy, and multiple-disc sets are vulnerable to the failure of any one disc. The time-consuming nature of writing and reading discs (especially the higher-capacity discs, which tend to be slower) makes them even less attractive, especially when coupled with expensive prices for Blu-ray optical drives and BDXL media (which is currently much more expensive than hard disks on a per-gigabyte basis).
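To put the capacity gap into numbers, here’s a quick back-of-envelope sketch. The 2TB drive size is an assumption for illustration, and filesystem overhead and file spanning are ignored:

```python
import math

# Nominal capacities of each optical generation, in decimal GB (as marketed).
DISC_CAPACITIES_GB = {
    "CD": 0.7,
    "DVD": 4.7,
    "DVD DL": 8.5,
    "Blu-ray SL": 25,
    "Blu-ray DL": 50,
    "BDXL": 128,
}

def discs_needed(drive_gb: float, disc_gb: float) -> int:
    """Discs required for a full backup, ignoring overhead and spanning."""
    return math.ceil(drive_gb / disc_gb)

if __name__ == "__main__":
    drive_gb = 2000  # an illustrative 2TB hard disk
    for name, capacity in DISC_CAPACITIES_GB.items():
        print(f"{name:>10}: {discs_needed(drive_gb, capacity):5d} discs")
```

Even at BDXL capacities, a full backup of a single modern drive runs to a spindle’s worth of discs – and a failure of any one of them breaks the set.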
Unfortunately, low-cost optical media has a bad reputation. Many people have purchased off-brand, counterfeit or low-quality media which is incompatible with their optical drives, and while these discs may initially work, the data disappears over time – the dye degrades and the data becomes unreadable, or the reflective layer oxidises and turns dark and spotty. Many people don’t read their discs regularly to check their condition, and most don’t understand the concept of the digital cliff (with regards to media, but equally it applies to most error-corrected digital transmissions, for example digital TV and radio). I’ll explain that further as we go on, but the key take-away point is this: just because you can read a disc today, you don’t know how close to disaster you might be. You might just have a box of backup discs which are dead. That’s exactly what happened to my Ritek first-generation Blu-ray media.
One way of staving off disaster is to make sure you use relatively modern writer drives, as they will be supported by the manufacturer and the laser will not have drifted too far from calibration. And then – this is key – upgrade the firmware on a regular basis. The firmware of the optical drive contains the program code which operates the drive, but most importantly it contains a media code table with write strategies. You can think of this as a “recipe for burning a particular sort of disc”. If a generic recipe is used (e.g. for a disc the burner doesn’t know about or support), the disc may write slowly (e.g. limited to 4x or 8x when the disc should be 16x capable) and with a generic write strategy, resulting in marginal-quality burns (which struggle on read-back – slow read speeds, or corrupted data straight after the burn).
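Conceptually, the media table amounts to a lookup from media code to write strategy, with a conservative generic fallback for unknown discs. Here’s a minimal sketch of that idea – every media code and strategy parameter below is invented for illustration and doesn’t come from any real firmware:

```python
# A conceptual model of a burner firmware's media table: the drive reads the
# media code off the blank disc and looks up a matching write strategy
# ("recipe"). Unknown discs fall back to a slow, generic strategy.
# All entries and parameters here are made up for illustration only.

MEDIA_TABLE = {
    ("MCC", "004"):      {"max_speed": 16, "strategy": "tuned-for-MCC-004"},
    ("YUDEN000", "T03"): {"max_speed": 16, "strategy": "tuned-for-TYG"},
}

GENERIC_FALLBACK = {"max_speed": 4, "strategy": "generic"}

def pick_write_strategy(manufacturer_id: str, media_type_id: str) -> dict:
    """Return the burning recipe for a disc; unknown media get the slow
    generic fallback - the situation a firmware update (with a newer,
    bigger media table) fixes."""
    return MEDIA_TABLE.get((manufacturer_id, media_type_id), GENERIC_FALLBACK)
```

In this model, a firmware update is simply a bigger `MEDIA_TABLE`, and “stratswapping” amounts to editing its entries.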
There are ways with supported burners to inspect the firmware and look at the media table. MCSE by ala42 of CD-Freaks is quite a good tool for this. Combine it with DVD Identifier, which reads the Media Code Block data from the disc, to tell whether a disc is supported. By doing so, you can stick to the quality discs which you know work well – normally MCC (Verbatim) and TYG (Taiyo Yuden) are the “standards” of the industry when it comes to quality and are safe buys. One thing to be aware of is the existence of fakes which use the media codes belonging to these manufacturers to make their discs appear more compatible – if you find any media “Made in Hong Kong” or “Made in China”, be very careful. Discs from CMCMAG (CMC Magnetics), MBI (Moser Baer India) and RITEK (Ritek) tend to inhabit the cheap-bulk media category and are perfectly suitable for burns where safety isn’t critical – their quality is “acceptable”. Discs from OPTODISC, PRODISC, PRINCO, LEADDATA and INFOME tend to be poor and not really worth risking your data with. This covers about 98% of the discs you will ever encounter on the market nowadays, given many smaller manufacturers have pulled out or shut down.
Using the tool, you can even modify the firmware and “stratswap” – that is, swap around the burning recipes in a quest to either improve write quality or over-speed media which aren’t supported or are poorly supported. I used to do this a lot, but it comes with a grave danger – especially if you have no method of seeing whether a particular burn is better than another.
In fact, as an aside – I was involved in the development of MCSE in beta testing the tool on my LG GSA-4163B burner, which had suffered laser failure a long while back. You can find me as lui_gough in the changelog. That was 2005. I’ve been a long-time member of CD Freaks (now myCE) with a post-count above 1000. I’ve been a long-time user of Omnipatcher and EEPROM Tool (also useful tools for LiteOn owners), and have “overclocked” several of my old drives (e.g. turning an LH-18A1P into an LH-20A1P, and my 1633s into a 1653s).
But onto the star of this post – the Milleniata disc (or M-Disc). This disc promised to change the whole landscape of optical discs when a press release made the news a year or two back. They promised a disc which would not suffer the disc rot, from failure of the dye or reflective layer, that regular discs did. They claimed to use an inorganic, stone-based data layer into which data is etched, lasting up to 1,000 years (ambitious!). And best of all, they backed this up with research originating from the US defence forces!
It sounded too good to be true – vapourware, almost. Except for the fact that just this year, it made it into a computer shop! It’s real and it’s here, now! In fact, ARC Computers has occasional stock of it – I managed to get one spindle from the Parramatta branch (the only spindle they had). The price? About $20 for 10 discs – it seems expensive, but compared to the price of Kodak Gold (the former “gold archival standard”, which from memory is rebadged MAM-A media) at about $7 per disc, this was a bargain. It is a bit special though: because it requires about five times the laser power to burn, you do need a special M-Disc capable burner, which at the moment means Hitachi-LG Data Storage branded burners (i.e. modern LG burners). Luckily, you can get one for under $25 – a cheap capital investment indeed, if the discs do live up to their claims! In this post, I’ll try my best to test these discs and see just how they perform. Unfortunately, without a time machine, I can’t tell how well they will hold up over time – or whether there are any unforeseen failure modes which the accelerated projected-life testing doesn’t account for – but at least I can get some metrics on how the discs are now.
Interestingly, if you try eBay, or buying directly from M-Disc, all sources are somewhat more expensive than ARC. I have it on good authority that the spindle cost price is about $15. As we can see, even the spindle itself is quite unique, with its base plate made more square in shape to make packing a bit more efficient.
The discs themselves, curiously, are made in the Czech Republic. That’s not a common sight – most discs are made in Taiwan, China, Japan, India, Vietnam and possibly Malaysia or Singapore. This in itself could be a cause for worry, especially if the quality control isn’t very good, resulting in inconsistent performance.
In fact, things point to an immature operation – especially when you see lot-codes labelled onto the spindles, which would be used to track defective discs back to a production run, so they know whether they’re doing things right or not.
The discs themselves are a bit different to what I expected. Most of the discs publicly shown to date have the M-Disc silkscreening covering the top surface of the disc entirely, as you would expect from non-printable discs. These ones instead had a discreet hub silkscreen, with the top surface nice and clean (suitable for thermal printing). This could also confuse some people, who may put the disc in upside down and try to use it that way – sadly, it is NOT a special double-sided disc; I tested it upside down with no recognition at all. Unfortunately, photographs with flash didn’t give a good representation of the appearance of the disc, so here are some that have been taken without flash, but poorly lit.
The top surface of the disc is a medium grey colour; it looks like there was some sort of spin-coated top layer, judging by the uneven appearance close to the inner hub.
The bottom surface of the disc is a very shiny silver, similar to a single-layer pressed DVD. There is a peculiarity – the disc itself seems more transparent than regular discs with an aluminium reflective coating, which is of note. There is a code in the Burst Cutting Area (BCA – the area near the hub ring, in between the lead-in and the clamping zone) in a font similar to those used on Prodisc blanks – which may mean these discs were made using some Prodisc-inspired equipment. The code is ‘SMS36-E 31’ on all discs.
In the clamping zone, there is also a printed code. I have no idea what it means, but it’s definitely an identifying feature of these discs (most manufacturers have particular codes and fonts in the burst cutting area, which allow you to identify the originating manufacturer without ever needing a burner to read the MCB data). Unfortunately, inspecting these discs revealed quite a few had a dusty surface, and some even had slight scuffs on the data surface – these will compromise burn quality by shielding the material from the laser that burns it. There were definitely circular scratches, as if the discs had rotated against each other – after all, they were packed without any foam donuts or protective discs on the top or bottom. This is not a good start.
The drive I used was the sub-$25 drive linked to earlier – nothing special about it, except the M-Disc logo which is making an appearance on new M-Disc capable drives. The firmware is IN01, which is an initial release (version 01) firmware, and as of testing there was no more recent version available.
So, how do we test a disc? Technically, to properly test a disc, you require special calibrated reference drives and analyser systems which give you readouts on thirty-something parameters. Not being able to afford drives like this, there are ways with particular drives and software, utilizing diagnostic modes, to get some stats about the disc itself. Specifically, certain models of LiteOn drives (1S, 2S, 3S, 4S, 5S and 6S series preferred) and BenQ drives (DW1640) can be used to get some good information on the readability of discs. For those more “wealthy”, certain Plextor drives with PlexTools Professional also provide exceptionally good test results. Various NEC drives can also produce statistics, however those are not comparable. One has to keep in mind that the numbers DO NOT come from a calibrated drive, so repeatability can be a problem, and the values themselves are not absolute. They provide a good indication of how well a disc is read by ONE given drive, and different drives with different tolerances to disc warpage, jitter, focus and tracking errors will return different figures – however, a disc with low figures on multiple drives likely points to a good disc! Testing at different drive speeds will yield different results, and there are standard “rules of thumb” in the hobbyist community around that, but these tests are somewhat controversial. In general, testing on older, pickier drives will yield a more sensitive result which goes bad at the slightest anomaly – however, on this occasion, I found my favourite LiteOn 1653S and LH-20A1P had both died, which was a shame (more on that in another post), so I’ll have to make do with a modern LiteOn iHBS212, which is not capable of providing Beta, Jitter and TE/FE values correctly.
The first thing we do when getting a new disc for testing is to identify it – DVD Identifier is the tool of choice:
----------------------------------------------------------------------------
 Unique Disc Identifier : [DVD+R:MILLENIA-001-001]
----------------------------------------------------------------------------
 Disc & Book Type       : [DVD+R] - [Not Available]
 Manufacturer Name      : [Manufacturer Not Found In Database]
 Manufacturer ID        : [MILLENIA]
 Media Type ID          :
 Product Revision       :
 Blank Disc Capacity    : [2,295,104 Sectors = 4.70 GB (4.38 GiB)]
 Recording Speeds       : [1x-2.4x , 4x]
----------------------------------------------------------------------------
 ** INFO : Hex Dump Of 'Media Code'-Block Listed Below
 ** INFO : 4-Byte Header Preceding 'Media Code'-Block Present
 ** INFO : Format 11h (Method 1) - ADIP Information
 0000 : 01 02 00 00 a1 0f 02 00 00 03 00 00 00 26 05 3f   .............&.?
 0010 : 00 00 00 00 00 00 01 4d 49 4c 4c 45 4e 49 41 30   .......MILLENIA0
 0020 : 30 31 01 38 23 54 37 0c 12 82 00 26 fa 00 0b 0e   01.8#T7....&....
 0030 : 07 08 02 03 01 0b 0e 07 08 02 03 01 00 00 00 00   ................
 0040 : 00 00 00 00 01 00 38 38 00 4b 64 9d 14 10 10 10   ......88.Kd.....
 0050 : 0a 0c 05 00 00 00 00 4b 64 9d 14 10 10 10 0a 0c   .......Kd.......
 0060 : 05 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 0070 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 0080 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 0090 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 00a0 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 00b0 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 00c0 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 00d0 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 00e0 : 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00   ................
 00f0 : 00 00 00 00 00 00 00 00 00 00 00 00               ............
----------------------------------------------------------------------------
 [ DVD Identifier V5.2.0 - http://DVD.Identifier.CDfreaks.com ]
----------------------------------------------------------------------------
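The [MILLENIA] manufacturer ID that DVD Identifier reports is sitting right there in plain ASCII in the dump. Here’s a small sketch that extracts it – the offsets are simply read off this particular dump (4-byte header included), so treat it as an illustration rather than a general ADIP parser:

```python
# First three rows of the ADIP 'Media Code'-Block dumped above (including the
# 4-byte header); the remaining rows aren't needed for the media code.
adip = bytes.fromhex(
    "01020000a10f0200000300000026053f"   # 0000
    "000000000000014d494c4c454e494130"   # 0010
    "303101382354370c12820026fa000b0e"   # 0020
)

# Offsets read off the dump: an 8-byte ASCII manufacturer ID at 0x17,
# followed immediately by a 3-byte media type ID.
manufacturer_id = adip[0x17:0x17 + 8].decode("ascii")
media_type_id = adip[0x1f:0x1f + 3].decode("ascii")

print(manufacturer_id, media_type_id)  # MILLENIA 001
```

Together, these form the unique disc identifier MILLENIA-001 shown at the top of the report.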
While the disc must be burnt in a compatible burner (due to laser power and strategy requirements), it can be identified correctly in other burners. That suggests it is compliant with the DVD+R physical specifications.
The next thing to do when testing a disc is to see how well a burner is able to focus on the disc surface and keep track of the grooves. There is a non-destructive test for this (it does not consume discs) which you can run only on BenQ DW1640 drives, using a tool called QSuite provided by BenQ themselves. Alternatively, LiteOn drives with certain test firmwares in the 5S and 6S series are verified to be able to run TE/FE tests on written discs using DVDScan. We can run the test at varying speeds (2.4x, 4x, 8x, 12x, 16x), which should give an indication of the ability to overspeed the disc when burning (another unique feature of the BenQ DW1640).
The Milleniata discs scored as follows:
Looking at these graphs, it looks like the disc is fine for a 4x burn, marginal at 8x in terms of tracking and focus error. Given that the disc itself can only be burnt at 4x, this shouldn’t be a problem, in theory! However, this suggests that the disc may have problems with mechanical rigidity at high spin rates and is warping or wobbling on its outer edge, which will definitely cause problems if readback is attempted at high speeds – it may mean that readback at 16x is not possible!
Let’s compare this with a Mitsubishi Chemical Corp (MCC) disc. This particular disc is a B-grade reject which was rebadged as a no-brand (it did NOT pass QC for Verbatim) and cost me just 12c/disc. While the tracking error and focus error are initially spiky (due to a manufacturing issue – probably a worn stamper), there is no great out-of-bounds increase towards the outer edge suggesting mechanical issues (click for larger). In fact, this disc burnt at 22x just fine, and 24x burns on this batch of discs resulted in throttle-back during the burn due to tracking issues at the outer edge. (The outer edge of the disc is most vulnerable to oxidation attack, and is the most difficult to burn and read back well due to mechanical wobble – so not filling your discs to the brim can make them last longer!)
Okay, so we’ve established that our M-Disc should be fine for a 4x burn, so we can put it into the M-Disc capable burner and burn it. To analyse the burn process, we can use the (freeware, though no longer supported) Nero CD-DVD Speed to burn an image and graph the speed:
So nothing special here – it’s a pretty straightforward CLV (Constant Linear Velocity) burn – the data rate remains a constant 4x (with no dips, so no WOPC – Walking Optimum Power Control which stops burning, tests the burn quality and adjusts the laser power periodically during burn). It does illustrate the slow rate of data burning, which is one of the annoyances of optical media, but otherwise, it’s what we can expect. CPU Usage cannot be reported correctly due to the use of an (unexpected) 6-core CPU on this machine.
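To put that constant 4x rate into perspective, the burn time for a full disc falls straight out of the numbers. This is a back-of-envelope sketch (1x DVD speed is 1,385 kB/s; lead-in, lead-out and finalisation overhead are ignored, so real burns run a little longer):

```python
SECTOR_BYTES = 2048            # user data per DVD sector
ONE_X_DVD_BPS = 1_385_000      # 1x DVD speed in bytes per second (11.08 Mbit/s)
CAPACITY_SECTORS = 2_295_104   # capacity reported by DVD Identifier earlier

def burn_minutes(speed_factor: float) -> float:
    """Idealised CLV burn time in minutes, ignoring lead-in/lead-out
    and finalisation overhead."""
    return CAPACITY_SECTORS * SECTOR_BYTES / (speed_factor * ONE_X_DVD_BPS) / 60

print(f"4x CLV burn of a full disc: {burn_minutes(4):.1f} min")
```

That works out to roughly 14 minutes of pure writing per disc – multiply that across a backup set and the tedium becomes clear.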
So, it’s burnt, that’s the end of the story right? Maybe do a quick file verify and if that works, we’re good? Not yet!
Now comes the exciting part – PI/PIF/Jitter testing. PI stands for Parity Inner (errors corrected by the inner stage of error correction), and PIF stands for Parity Inner Failure (errors the inner stage could not correct). Basically, what we’re testing is the number of corrected and uncorrected errors in the multiple stages of error correction data burnt to the disc, to establish how much margin we have until it’s unreadable. Roughly speaking, PI errors should not exceed 280 counts per 8 ECC blocks. PIF errors should not exceed 4 counts per ECC block. Jitter should remain below about 11–12% to be reliably readable, with a bit of variance depending on the drive used to test, and the speed you’re testing at. The ECMA specifications seem to suggest that they conduct testing at 1x, which would take a LONG time.
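Those rules of thumb can be written down directly. Here’s a sketch of a pass/fail check on a scan’s worst-case samples – the limits are the ones just quoted (with 12% taken as the lenient end of the jitter range), not anything from a calibrated analyser:

```python
# Rule-of-thumb limits for DVD quality scans, as quoted above.
PI_LIMIT_PER_8ECC = 280    # max PI errors per 8 ECC blocks
PIF_LIMIT_PER_1ECC = 4     # max PI failures per 1 ECC block
JITTER_LIMIT_PCT = 12.0    # approximate readability ceiling

def scan_within_limits(pi_max, pif_max, jitter_max_pct, pif_window=1):
    """Check a scan's worst-case samples against the rule-of-thumb limits.

    pif_window is the number of ECC blocks the drive sums PIFs over
    (8 on BenQ drives, 1 on Lite-On). Dividing down is only a rough
    approximation - a tight burst of failures inside one block can be
    hidden by the averaging.
    """
    return (pi_max <= PI_LIMIT_PER_8ECC
            and pif_max / pif_window <= PIF_LIMIT_PER_1ECC
            and jitter_max_pct < JITTER_LIMIT_PCT)
```

For example, a BenQ scan peaking at 24 PIFs (summed over 8 blocks) still averages within the per-block limit, while the same peak from a Lite-On scan would be a clear fail.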
So here’s the first disc, tested back in the BenQ DW1640 using Nero CD-DVD Speed:
It’s not a good sight! While the PI error rate is excellent and well within limits, the PI failure rates are way too high. Jitter is acceptable. Granted, this disc was slightly dusty at the outer edge, but there seem to be quite a few errors in the early parts of the disc too. BenQ drives report PIFs summed over 8 blocks, so the reported rate can be up to 8x the per-block PIF limit – however, if the failures occur too close together, the disc can be unreadable. It could be a first-disc problem – some burners “learn” how to burn discs over time by storing calibration data in their EEPROMs.
But let’s ask for a second opinion on the disc – from a LiteOn iHBS212 using DVDScan, which reports sums over 1 block:
Broadly, this scan tells a similar story – the errors in the inner region just don’t quite meet the expected limits. The disc isn’t unreadable, but it could become so if it degraded over time. A fresh burn should look much nicer – in fact, here’s the result from the 12c disc:
So, let’s try another one – here’s a second disc, as reported by the BenQ:
This looks NASTY! The test actually paused to realign itself with the disc, suggesting disc warpage was affecting the test quite badly and invalidating the results towards the outer edge; the disc itself is actually readable (by virtue of slowing the read rate). The second-opinion drive seems to have very little trouble with this disc, being of more modern vintage, and with testing conducted at 4x rather than 8x (as is customary on Lite-On drives):
In fact, this is a good burn according to the Lite-On drive. So why don’t we try a third disc?
Again, we see a similar result to the second disc. Hmm. It looks like there’s a bit of room for improvement in the physical construction of these discs – one of my fears is that they may eventually delaminate, or the polycarbonate may flex and warp too much and hinder readback. Only time will tell. The second-opinion drive tells a different story – a good burn:
So, is it really just a lack of rigidity causing errors to show up when they’re not really there? I suspect it is. To test the hypothesis, I did a readback speed test – and on the BenQ, this happened:
So the disc is readable, but the drive had to slow down for the read to succeed. Let’s try this with a different drive and see what happens:
For comparison, here’s the same BenQ drive reading back the MCC disc (the 12c comparison disc):
So, it’s definitely not a drive shortcoming that the discs cannot be read at full speed. Considering that there are burners which write up to 24x (in fact, the LG used to burn it is one of them) – this suggests the media physical construction isn’t quite up to snuff.
So let’s test another burn – disc number 4 on the spindle:
Now this one resembles the first disc more than the second and third. We’re beginning to see what may be manufacturing variance coming into play, and the result from this burn isn’t very encouraging. A second opinion from the Lite-On drive backs up the result:
The errors on the inner zone definitely show up on both drives suggesting the burn was indeed flawed.
It’s a tough conclusion to write. The Milleniata concept of discs with an innovative, stone-based data layer that “etches, rather than burns” is genuinely unique and patented. The discs are visually distinguishable from regular discs, and their promise of long-term data storage, backed up with research, is hard to dispute.
The disappointment lies in the slow write speed of just 4x, and the variable burn and readback quality, which may be linked more to the physical construction of the disc and the quality of manufacture than anything else. Of course, this can be improved by the manufacturer over time, and firmware updates can possibly eke out more speed and quality. For an initial release to the market, the Milleniata experience was quite decent.
The other disappointment is that the burn quality of these discs fresh out of the burner isn’t class-leading by any means. While over time the better burns on organic dye-based discs will fade and errors will creep in, leaving the Milleniata the clear winner, it would be preferable for the Milleniata discs to have much better burn quality initially, as that extra readback margin could make them last even longer! Pricing could be improved, and so could media compatibility – allowing other burners to be used with the discs. I think there might be an exclusivity agreement between LG and Milleniata which limits compatibility to LG drives – although that’s better than before, when Milleniata-specific burners were required.
For now, it’s an interesting curiosity that’s worth using to back up your most precious data, just in case their claims are true. It’s cheap insurance compared to a full-blown data-recovery service call for a hard disk, and it’s readback-compatible with any DVD drive. Unfortunately, as optical drives are on their way out, there’s no guarantee a single DVD drive will be left in existence in 1,000 years. So much for that.