Investigation: How Accurate is that 3.5 digit Multimeter?

A few weeks ago, one of the commenters suggested that I test some more inexpensive test equipment, after my stint at reverse engineering USB charger doctors/detectors and determining their accuracy. While I won't go out of my way to buy low-end test equipment, at the time I reasoned that testing the meters I already had would be a waste of time.

But in the end, curiosity got the better of me, so I decided I would give a pile of 3.5 digit multimeters a test and see just how accurate they were. These sorts of meters are now very affordable and turn up almost everywhere – they can even be had for under $5, probe leads and batteries included, while more deluxe units offer more than the "basic" 19 ranges (of the DT830 and its variants) and claim better accuracy.

Seeing as I had some test equipment, I decided to test them against the Keithley Model 2110 5.5 digit digital multimeter as a reference. After almost two weeks of work, I’ve managed to come up with some results! But first, a few notes and important points.

Getting a Reference

In general, it’s considered bad form to use a multimeter to calibrate another multimeter. Part of the reason for this is that a proper calibration normally uses sources with a very high level of precision and stability. Utilizing other sources and cross-referencing readings can be vulnerable to errors introduced by the reference multimeter, its resolution/accuracy, loading on the source, instability in the sources, etc.

Unfortunately, high accuracy calibration sources are not something I possess. That being said, if the reference multimeter is sufficiently accurate and in calibration, it is possible to get a reasonable reference reading by which to judge your other multimeters. For this to work, the reference multimeter needs to have more digits than the meters being tested, and its error must be less than the reading gradations of the meter under test. To understand whether that holds, you must consult the datasheet.

In our case, I have chosen my highest accuracy instrument – the Keithley Model 2110 to serve as the reference measurement. Its 5.5 digit display surpasses the 3.5 digit display on the meters to be tested by two digits. I also performed some comparisons between its stated error in reading versus the step sizes on the 3.5 digit multimeters we will be testing.

Reading-Error

This table is compiled from the errors specified as a % of reading + % of range. Only the values at the zero and full-scale readings are presented; however, further computation can provide more insight. A careful examination of the table shows that the Keithley 2110's readout error is almost small enough to be negligible at the 3.5 digit level. For example, a 3.5 digit multimeter reads up to 2V in 1mV steps, but for a 2V signal the Keithley will be in its 10V range with 0.44mV of error (i.e. less than one display unit). For voltage and resistance measurements in general, this appears to be the case.

However, this is not the case for current: for 200mA of input, the readout error on the Keithley is 0.5mA, which is 5 display units on the meters under test. At 10A, the full-scale error reaches 3 units. Therefore, we must be a little careful in our interpretation of the current results.
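
To make the arithmetic concrete, here is a minimal Python sketch of how a "% of reading + % of range" specification turns into an absolute error and into display counts on a 3.5 digit meter. The coefficients used are placeholders for illustration only – substitute the figures from your own meter's datasheet.

```python
# Sketch: turning a "% of reading + % of range" accuracy specification into
# an absolute error, and into display counts on a 3.5 digit meter.
# The coefficients below are placeholders, not the actual 2110 datasheet figures.

def spec_error(reading, meter_range, pct_of_reading, pct_of_range):
    """Worst-case error, in the same units as the reading."""
    return reading * pct_of_reading / 100 + meter_range * pct_of_range / 100

def error_in_counts(error, display_step):
    """Express an absolute error as counts of the display's smallest step."""
    return error / display_step

# Example: a 2 V signal on a reference meter's 10 V range, with an assumed
# spec of 0.003 % of reading + 0.0035 % of range.
err = spec_error(2.0, 10.0, 0.003, 0.0035)          # volts
print(f"Reference error: {err * 1000:.2f} mV")
print(f"= {error_in_counts(err, 0.001):.2f} counts on a 2 V / 1 mV display")
```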

In order to give this experiment the best chance of success, the multimeters were all kept in my room for days, temperature controlled by an air conditioner at 23 degrees C. The Keithley 2110 has not been switched off in weeks, and has been in thermal equilibrium, thus avoiding drift. It’s important to note that the above error margins are typically conservative, and actual measurement units may perform better than specified in the data sheet – so we might be able to ignore the source meter’s influence in the results after all.

Another complication is the granularity of readings. We are comparing a 5.5 digit result to a 3.5 digit result, and due to rounding alone, you should expect to see differences of up to about one unit in the last digit of the 3.5 digit result.

SIM-Del

As a result, we can sometimes see this kind of sawtooth pattern. This was generated by producing a sequence of numbers (1, 2.002, 3.004, 4.008, etc), rounding them to 2 decimal places, and finding the difference between the "actual" and rounded values. Basically, because the input value might sometimes lie right on the reported result (i.e. zero error), or it might lie on the boundary of the reported result (i.e. maximum error), the results have to be seen as closely-spaced spot checks. The actual maximum error would be the "furthest" distance from the zero-difference (delta) line.

SIM-PC

However, if we plot this as a percentage error, we can see that we get a decaying sawtooth shape. This is because, as the input value gets larger, the rounding error stays the same absolute size and therefore represents a smaller fraction of the whole value. This is partly why meter errors are rated with a % of reading (for linearity issues) and a % of range (for offset/rounding issues).
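
For the curious, the simulated sawtooth is easy to reproduce. The following sketch uses a step of 1.002 (similar to the sequence described above), rounds each "true" value to a 0.01 display resolution, and computes both the delta and the percentage error.

```python
# Reproducing the simulated sawtooth: "true" values stepping by 1.002
# (similar to the sequence described above), rounded to a 0.01 display
# resolution, then compared against the unrounded value.

true_values = [1 + 1.002 * n for n in range(100)]
displayed   = [round(v, 2) for v in true_values]

delta   = [d - v for v, d in zip(true_values, displayed)]              # stays within +/- half a step
percent = [100 * (d - v) / v for v, d in zip(true_values, displayed)]  # decaying sawtooth

for v, d, e, p in list(zip(true_values, displayed, delta, percent))[:5]:
    print(f"true = {v:7.3f}  shown = {d:6.2f}  delta = {e:+.3f}  ({p:+.3f} %)")
```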

At times, the meters struggle to keep stable display values, often “dropping” by 2-3 counts. I have chosen the value that the meter rests at after 15 seconds of measurement as the “final” value.

So do keep these points in mind when you look at the graphs.

The Contenders

The contenders in this round are a set of multimeters which have been lying around my room getting occasional use. These things just seem to collect, and are of various ages. Given that these units never came with calibration certificates (and you wouldn’t expect them to for the small money you pay for them), I’m not sure how reliable their values are. Many of the original spec sheets are lost, but some of them claimed 0.5% accuracy +/- 3-5 digits if I recall correctly. This is not really going to be an article about which meter to buy, as calibrations do drift over time and can be impacted by environment, transit and care. All meters have been treated with care, however. It’s more of a look at what you can expect if you “pick up” a random meter and make a measurement.

DSC_9505

Dick Smith Q1459

The oldest meter in the bunch (over 8 years), this one comes with a red rubberized skin, data hold functionality and backlight. It doesn’t feature any transistor test options, but does have an audible beeper for continuity testing and a square wave output. It does differ from some other “19-range” meters by having a 200M ohm range, but sacrifices a 2M ohm range for it. How peculiar.

Further research seems to show that this is a rebranded Uni-Trend UT33D as confirmed by the imagery and rear serial number label format.

DSC_9507

Unbranded DT-830B

Arguably one of the more popular meters on the market today for beginning hobbyists, these units have no branding whatsoever and can differ subtly in the quality of the casing and the colour. This one, with the B suffix, measures resistance to 2M ohm, and has a transistor tester but no continuity beeper. This one is a fairly recent purchase, from two years back.


DSC_9509

Excel DT9205A

Another Chinese special, this one is a more deluxe meter of a different sort, featuring capacitance test (but with limited range), wider resistance scale, continuity with beeper, transistor test, DC and AC current with wider range and a data hold facility.

This one comes with a yellow rubberized skin, and was about twice the price of the DT830’s. It features a large display but still only the average 3.5 digits.


DSC_9508

Unbranded DT-830D

Another variant of the same DT830 recipe, this one features square wave output and continuity beeper with the diode test moved to the 2k ohm range. Otherwise, it’s pretty similar to the yellow unit, just a bit older at around eight years old.


DSC_9506

Jaycar Digitech QM1540

A more deluxe meter, this one is technically on loan to me, featuring capacitance and inductance test, AC and DC current and high resistance ranges with temperature and transistor gain available via external plug-in dongle. It has a data hold feature as well, but just the regular 3.5 digits. This unit is about 1 year old.

Further research determined that this is a rebranded version of the Mastech MS8269.


The Results

As much as I would have wanted to test every range and every value, equipment, time and patience are all limited. In order to undertake these measurements, special-purpose devices and improvisations were made. However, I have covered the most useful ranges on the meters, taking 6976 measurements by hand – a painstaking operation that took more than a whole week of spare time.

Resistance – Methodology

Testing for resistance can be a bit tricky. The first thing you need is a large set of resistances – unfortunately I don't own any precision decade boxes, so I sorted out my 1/4W (mostly carbon film) resistors and soldered them to some Veroboard as a reference set.

DSC_9592

They mostly ascend in resistance, although I do have one resistor out of place (see if you can spot it). Unfortunately, I don't have every E12 value in there, but that's just life. Knowing that carbon film resistors can be affected by temperature and humidity, the air conditioning took care of that as best it could. Using just one set of test leads (so the lead resistance contribution stayed consistent), I measured the set with the Keithley, then measured it again with each of the meters. I then determined the absolute difference in readings (which includes rounding errors – i.e. sub one-display-unit errors) and also the percentage error (as an absolute value – no negative values).
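
To make that bookkeeping concrete, here is a small sketch of the calculation; the resistor values are made up for illustration, not my measured data.

```python
# The error bookkeeping used throughout this article: each reference (Keithley)
# reading is paired with the meter-under-test reading, then the absolute delta
# and the absolute percentage error are computed. Values below are made up.

pairs = [
    # (reference_ohms, meter_ohms)
    (9.994, 10.1),
    (21.88, 22.0),
    (46.71, 46.9),
]

for ref, dut in pairs:
    delta = abs(dut - ref)        # includes sub-display-unit rounding error
    pct = 100 * delta / ref       # absolute value, no negative errors
    print(f"ref = {ref:7.3f} ohm   meter = {dut:6.1f} ohm   "
          f"delta = {delta:.3f} ohm   error = {pct:.2f} %")
```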

Resistance – 200 Ohm Range

R-Del-200R

On the 200 ohm range, the meters read within about 9 digits of the Keithley value, generally over-reporting the resistance slightly. This wasn't the case with the DT9205A, which was abysmal – a whole 30 digits out near 150 ohms! Yikes!

R-PC-200R

As a percentage, we can see that the errors are astronomical for small resistance values, only dipping below 3% above 50 ohms or thereabouts. Most of the meters except the DT9205A managed to be below 1% of error by 56 ohms.

Resistance – 2kohm Range

R-Del-2kR

Performance was more consistent in general for values in the 2k ohm range. The DT9205A and DT-830B under-reported by up to 5 and 6 digits respectively, whereas the black DT-830D over-reported by about 6 digits. The remaining two units were pretty much within 3 digits, which is a good result.

R-PC-2kR

As a percentage, it can be seen that the majority of readings lie within 0.5% of the Keithley value, with the exception of a spike in both DT-830's and the DT9205A overall.

Resistance – 20kohm Range

R-Del-20kR

In the 20kohm range, the DT9205A remains the worst, being up to 11 digits out. The other units seemed to be within 5 digits, increasing in error with higher resistance values.

R-PC-20kR

As a percentage, however, most of the meters were better than 0.5% of error, and mostly better than 0.4% as well, with the exception of the DT9205A.

Resistance – 200kohm Range

R-Del-200kR

In the 200kohm range, different trends seem to emerge. While the DT9205A has been the outlier so far, its performance is now more similar to the DT-830B, under-reporting resistances by anywhere from 1-8 digits. The QM1548 didn’t do too badly with the error increasing steadily as the resistance value increased to just over 4 digits. The other meters were within 3 digits.

R-PC-200kR

As a percentage, we can see all meters except the DT9205A and DT-830B managed to cling to or stay below the 0.5% error line.

Resistance – 2Mohm Range

R-Del-2MR

The trend seems to change for the 2M range, with most meters under-reporting the resistance values. The DT9205A retains its habit of being least accurate for the majority of the range, while all other meters except the DSE Q1459 "tangle" together in a trend pointing towards a maximum error of about 11 digits. The DSE Q1459 goes the opposite way, peaking at 7 digits over-reported.

R-PC-2MR

As for percentage error, the stats look a little worse now, with errors hovering slightly higher near 0.7%, with the exception of the DT9205A of course.

Resistance – 20Mohm Range

R-Del-20MR

The 20M ohm range is a bit of a "special" class which the DT-830's aren't a part of. In all cases, the trend seems to be the same – the meters are anywhere from 7-10 digits out at ~8Mohm, but otherwise within ~3 digits for smaller resistances.

R-PC-20MR

In general, in terms of percentage error, the meters performed well for the smaller Mohm resistances, rapidly degrading at ~8Mohm. The DT9205A retains its characteristic problems with accuracy.

Resistance – Conclusion

From a quick examination, I'd have to say that the DSE Q1459 did a good job over all the ranges compared with the others. Its errors seemed more controlled throughout. The standout dud is the Excel DT9205A, which consistently missed the mark almost every single time. The rest of the meters showed varying trends depending on the range – some better, some worse. At times, errors at the extremes of scale reached 10 digits – in other words, rendering the last digit on the screen technically meaningless.

DC Voltage – Methodology

To test the error in DC voltage, my Manson HCS-3102 switching-mode power supply was used to supply power from 0.8 – 36.2V in 0.1V steps. The actual voltage was recorded with the Keithley 2110 in parallel, alongside the reading shown on the multimeter – thus paired readings were recorded. This was necessary due to the inherent voltage "drift" of the switch-mode control circuitry. Sawtooth patterns due to the granularity of the power supply's steps are evident, as explained in the earlier section.

DC Voltage – 2V Range

DCV-Del-2V

Testing below 0.8V was not possible, as the power supply was not able to produce such low voltages. It seems that each meter has its own characteristics. The DT9205A was over-reporting by about 5 digits, with the DT-830B under-reporting by about 7 digits at the extreme. There was one spike in readings experienced by both the DT-830B and QM1548. Overall, the DSE Q1459 remained remarkably close to the actual reading (within 2 units).

DCV-PC-2V

The accuracy of the DSE is clearly shown in the percentage error graph with it near or below 0.1% – an exemplary result. Aside from the spike, all were below about 0.4% which is reasonably close to what specification sheets may tell you.

DC Voltage – 20V Range

DCV-Del-20V

This test, spanning the entire 2-20V range really makes for an interesting graph. The curves are due to the fact the power supply voltage isn’t incremented in exact multiples, whereas the “steps” up and down are due to the granularity of the 3.5 digit display.

Of all of them, the DT-830B is the worst, hitting a whopping 14 digits of error through the range. Everything else remained within 5 digits, however, interestingly, the DSE that performed so well in the last test is now the second worst. The DT9205A isn’t half bad.

DCV-PC-20V

As a percentage, we can see that at the low end of the range, the DT9205A struggles, and the DT-830B struggles across the board. The rest of the meters seem to meet their 0.5% spec and are under 0.3% for the top half of the scale.

DC Voltage – 200V Range

DCV-Del-200V

Unfortunately, we can’t get too far with voltage with the Manson power supply I have here, and I wasn’t going to bother jury rigging a rectifier and capacitor filter to a variac output to go further. Regardless, it shows that at the low end of the 200V scale, it seemed all meters were within 2 digits, and all seemed to under-report the voltage slightly.

DCV-PC-200V

The magnitude of the error starts off higher, as we’re in the low end of the scale, but settles down to sit under 0.5% as expected.

DC Voltage – Conclusion

It seems the meters themselves aren't too bad at measuring DC voltage. That being said, the yellow DT-830B doesn't seem to be in good calibration, and while the DSE Q1459 was good on the 2V scale, it wasn't anywhere near as accurate on the 20V scale. The DT9205A didn't fare too well, but I suppose it wasn't too bad either. Jaycar's Digitech QM1548 performed decently through most of the tests, apart from a spike in the 2V range.

AC Voltage – Methodology

In order to measure AC voltage, a 230V 50Hz true sine wave sourced from an HIP-300 inverter was fed through a 1A variac to produce voltages between 1-260V. This was measured by the Keithley and the meter under test in parallel. The paired readings were taken at every nudge of the variac, with no regard for having a particular number of steps.

AC Voltage – 200V and 750V Scale Combined

ACV-Del

When reading the graph, please note that the 750V range has units which are 10 grid lines tall. It seems that the meters generally had trouble with accurate AC measurements – the DT9205A surprises again with 8 digits of deviation, and the DT-830B with about 11 digits. All meters had difficulty keeping the error within 5 digits on the 200V range. When switched to the 750V range, only the lower end was tested, but generally the error didn't exceed 4 digits within the tested area.

ACV-PC-Full

The impact of this can be seen as whopping percentage errors at low readings, below about 50V. The DSE, DT-830D and DT-830B are all offenders when it comes to error.

ACV-PC-Enlarged

Zoomed in, we can see that the other units started at lower error values, and at higher voltages they were all able to meet about 0.5% error, except for the DT-830B. On the higher 750V range, the DT9205A had higher errors than the others, while the DSE performed very well with errors below 0.3% (a complete reversal of its performance in the 200V range).

AC Voltage – Conclusion

At a glance, all meters weren’t terribly accurate when it came to measuring AC Voltage. However, some were worse than others – the DT830’s and the DSE being poor at low voltage in the 200V range, and the DT9205A being poor in the upper 200V range and 750V range. However, into the 750V range, the DSE managed to reverse its behaviour. On the whole, it seems Jaycar’s Digitech QM1548 was most consistent when it came to AC voltage.

DC Current – Methodology

Testing DC current was performed by having the Keithley and the device under test in a series circuit with a load. For testing the 200mA range, a load comprising 12×560 ohm resistors in parallel was used. For testing the 10A range, a 50W MR16 halogen downlight globe and the Manson HCS-3102 power supply were used. Unfortunately, this limited the readings to about 4-5A.

DSC_9503
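
As a sanity check on the 200mA load: twelve 560 ohm resistors in parallel come to roughly 46.7 ohm. The supply voltage used isn't recorded here, so the quick calculation below simply assumes a couple of plausible voltages to show where the current lands on the range (the meters' burden voltage is ignored).

```python
# Sanity check on the 200 mA test load: 12 x 560 ohm resistors in parallel.
# The actual supply voltage isn't stated, so the voltages below are assumed
# examples only (and the meters' burden voltage is ignored).

n, r_each = 12, 560.0
r_load = r_each / n                      # n equal resistors in parallel
print(f"Load resistance: {r_load:.1f} ohm")

for v_supply in (5.0, 9.0):              # assumed supply voltages
    i_ma = 1000.0 * v_supply / r_load
    print(f"At {v_supply:.0f} V: ~{i_ma:.0f} mA on the 200 mA range")
```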

DC Current – 20 and 200mA range

LDCI-Del

Due to the minimum supply voltage and the load construction, testing below about 10mA was not possible. In the brief portion tested on the 20mA range, the DSE performed the worst of the lot, over-reporting by about 4 digits while the other units remained well within 2 units. The DT9205A surprisingly put in a good result.

On the 200mA range, the DSE managed to perform poorly again, but in the opposite direction, now under-reporting current by about 9 digits at full scale. The worst result was about 12 digits over-reported by the DT9205A, closely followed by the DT-830D with over-reporting by 8 digits.

Surprisingly, the cheap yellow DT-830B was the most consistent throughout these tests, despite performing poorly in other tests (ACV, DCV) previously. A break is seen in the data points due to an automatic range change by the Keithley Model 2110 – this accompanies a change in burden resistance, which changes the current flow step-wise.

LDCI-PC

In terms of error percentage, in the 200mA range, close to all readings were under 0.7% with the better meters staying mostly under 0.5%. The 20mA range saw mixed performance from most, but the DT9205A produced an exemplary under 0.2% error result.

DC Current – 10A Range

HDCI-Del

Testing on the high 10A DC current range produced interesting results. It seems the DSE Q1459 is not suited to continuous current measurement, as noted on the front panel of the meter – and the results clearly indicate why. The meter heated up significantly, which likely affected the resistance of the shunt, causing the error to follow a very non-linear shape.
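
As a rough, back-of-the-envelope illustration of why a warm shunt matters: the meter infers current from the voltage across its shunt, so a fractional resistance change of roughly alpha × ΔT shifts the reading by about the same fraction. The temperature coefficient and temperature rises below are assumed values for illustration, not measurements of the Q1459's shunt.

```python
# Back-of-the-envelope: how much a current reading shifts if the meter's
# internal shunt warms up. The reading scales with the shunt resistance,
# which changes by roughly (tempco * temperature rise).
# Both numbers below are assumptions for illustration only.

alpha = 3900e-6   # per deg C -- roughly copper; a proper manganin shunt is ~10e-6

for delta_t in (5, 20, 50):          # assumed temperature rise in deg C
    shift = alpha * delta_t          # fractional change in shunt resistance
    print(f"dT = {delta_t:2d} C  ->  reading shifts by ~{100 * shift:.1f} % "
          f"(~{shift * 5.0 * 1000:.0f} mA on a 5 A reading)")
```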

The other meters seemed to perform better, although the DT9205A retains its signature inaccuracy – it managed to deviate by a whole 24 digits by 4.5A. The rest were able to keep within about 4 digits.

HDCI-PC

Still, the numbers reflect the difficulty of accurate current measurements, with even the better meters struggling to keep under 1.5% error. The DT9205A and DSE Q1459 aren't even worth considering here.

DC Current – Conclusion

Surprisingly, the standout performer is the yellow DT-830B again, which failed abysmally at voltage but seems great at current accuracy. The Jaycar QM1548 and DT-830D both performed fairly well too. Unfortunately, the DSE was inconsistent and performed poorly at high currents, whereas the DT9205A was poor all round.

What are you Calibrating?

So maybe those results weren't what you hoped for, and you're thinking there must be some way to improve that multimeter! Well, maybe you're in luck, because depending on your meter, there may be one or (sometimes) several trimpots to play with.

DSC_9510 DSC_9512

DSC_9511

Unfortunately, as it's not labelled, who knows what setting you're really adjusting. Worse still, the trimpots aren't exactly precision units either, and it's unlikely that, even with the best handling and care, they will retain their precise values for very long.

Other times, you might be unlucky and come across something like my black DT830, which has no trimpot installed at all. I wonder how that works – calibration-free, with a fixed calibration courtesy of surface-mount resistors?

Whatever it is, I don’t really think it’s a great idea to touch it unless you have some decent equipment, patience and time to check just how well it is performing. From the data presented above, it seems very unlikely that changing the calibration will fix reading errors for all ranges, as the errors don’t seem to even be in the same direction for all ranges. I wonder whether these units were even properly calibrated at the factory, or whether they just did a one-voltage spot check on the calibration and fine tuned it for that.

Conclusion

Well, it was a fairly involved investigation which took a fair amount of time, but rather surprisingly, I suppose it proves some of the things I've anecdotally heard about 3.5 digit multimeters – namely that the last digit is pretty useless on many of them. Judging from the amount of error, this could well be true depending on which unit you have. It's also interesting to see the kinds of errors in play – some meters showing "curved" delta plots which suggest a linearity problem, others showing offset problems, and with no consistency between ranges.

One must really wonder what they are calibrating if they choose to change the values of the trimpot inside their meter – it could well be point calibrated for one particular voltage, but the other ranges could still be incorrect. This probably comes down to the choice of components (e.g. resistors) used inside, not being of high precision. As a result, it’s probably pointless to even try calibrating one of these units.

I suppose this does tell you just how indicative the readings are – it’s fine for telling the difference between 9V and 12V, but it’s not able to tell whether the absolute value of a voltage is 5.00V or 5.05V. At the low end of the range, the errors can be quite high, and thus readings should be avoided there (as well as for small resistances <50 ohms).

It’s all hit and miss really – some meters are good at certain ranges or types of readings, and terrible at others. However, in general, it seems the DT9205A takes the wooden spoon. I suppose that’s why quality multimeters exist, and are more expensive while being also better specified.


Tech Flashback: Microsoft Word Version 5.5 (MS-DOS) and 6.0a (Windows)

Most people who use a computer have undoubtedly used a word processor before. Whether it was to type up an assignment, a letter, a thesis or something else entirely, the dominant word processor for at least the past decade has undoubtedly been Microsoft Word. Its file formats and ubiquity have led to a large number of semi-compatible attempts at producing rival office programs, and frustration due to the lack of file format documentation.

It’s important to remember that Microsoft Word wasn’t always the dominant word processor. The word processing market goes back all the way to purpose built Wang machines, through WordStar on CP/M machines and so on.

The first one that I used was actually WordPerfect 5.1 for DOS (which I might have a "flashback" about in the future), followed by Microsoft Works for DOS (which Sparcie covered very well in his post). The landscape changed very quickly with the introduction of WYSIWYG graphical interfaces, namely that of Microsoft Word for Windows. This post from the Computer History Museum shows how quickly Microsoft Word for Windows took the market, despite the new paradigm being initially shot down as inefficient, difficult and "different". The success of Microsoft Word meant no recovery for the rivals – we watched Samna Word/Lotus Word Pro and WordPerfect disappear into the abyss.

In this post, we will look at two versions of Microsoft Word, version 5.5 for MS-DOS and version 6.0a for Windows.

Microsoft Word for MS-DOS Version 5.5

This look at Word version 5.5 was completely unplanned, and only came about due to stumbling across it during some research on Word 6.0. As it turns out, Microsoft Word version 5.5 is actually still available for download from Microsoft's site, mainly as a "fix" for Y2K issues. This meant that I could grab a copy and give it a spin to see what was possible under MS-DOS back in 1991.

word5-install

Extracting the files allowed me to run the setup even within DOSBox, which meant that I didn't need to go to the lengths of using a full-blown MS-DOS VM. The software is actually quite advanced for something running under MS-DOS – it makes use of more complex graphics modes and directly interfaces with input devices (keyboard and mouse), making the setup slightly more involved than expected. The setup does seem to have a bug in the user name/organization dialogue, which required some retrying to get around.

As computers of the time came in various configurations, there were options in the setup to install Word to a floppy disk, as hard disks (and available space) were not a given. Likewise, as Word 5.5 was changed from prior versions of Word to be “more like” the Windows counterpart, there are options in regards to configuring the program as well.

By default, the user interface fires up in the regular 80×25 text mode with ANSI colours, which is generally the most compatible mode.

word5.5UI

Users of later MS-DOS versions will note that the interface semantics – scroll bars, window borders, shadows made of text, and shortcut key highlights – are quite similar to those of Works and even the built-in EDIT.COM. It's very spartan, with just one row of contextual "tooltip" text at the bottom and one row of "status" indications.

word5-menus

A quick perusal of the menus shows just how many features there are. The shortcuts themselves will be relatively unfamiliar to Windows users.

word5.5about

The about dialogue dates the software to 1990.

word5.5transition word5.5mainhelp

Hyperlinked help documents are available, with special help for those transitioning from an earlier version of Word.

Careful watchers of the menu may have noted something called Ribbon in the View menu. Rest assured, this has nothing to do with the Ribbons introduced in the 2007 interface, but was a name for the toolbar that you would normally expect to see on Windows versions of Word to do your formatting.

word5.5-theribbon

So, what does it actually look like when you compose a document?

word5.5-interfacetest

Like this! That’s right – that’s what your document looks like in a non WYSIWYG editor. There wasn’t any real drawing features either – although there was a Line Draw mode which allowed you to draw lines by inserting ASCII characters as you moved your cursor around.

word5.5-linedrawmode

Basically, you’re just writing your text and inserting formatting codes, and the only way to be really sure of what it would look like is to print it out! Another alternative was to use Print Preview, if your graphics card supported graphical mode …

word5.5-printpreview-VGA

Speaking of graphical modes, the software supported numerous modes, as some graphics card and monitor combinations didn't work well in certain modes, or had performance issues.

word5.5-graphicsmode-dropdown1 word5.5-graphicsmode-dropdown2

Utilizing the most basic 25-line graphics mode changes the display somewhat, as the character generation isn’t performed by the VGA’s BIOS.

word5.5-display-graphicsmode-25

It’s possible to bump it all the way up to 60-lines, but it’s likely you would have had to squint at your monitor. At least this mode allowed you to fit a lot more document onto your screen. Unfortunately, DOSBox isn’t too happy emulating these modes, resulting in on-screen garbage after a while.

word5.5-display-graphicsmode

What else caught my eye? Well, for one, there seemed to be a way of inserting an image into your document. Unfortunately, I had no compatible images, and it's likely that they aren't displayed in text mode anyway – but the format support was very limited, with mostly printer graphics languages supported (not even PCX, which I remember WordPerfect supporting).

word5.5-insert-picture-format-support

word5.5-insert-picture

The file browser was the "norm", but it seems you also had to set the image's size on the spot as well. Oh, how that would have been a nightmare – you were literally "flying blind".

word5.5-pagenumbers

It was pretty nice to see that automated page numbering was available, but with more limited formatting choices.

word5.5-paragraph

Paragraph options were spread across multiple sub-dialogues which could be accessed with the "buttons" at the bottom. This was a common design choice to avoid over-cluttered dialogues … speaking of which …

word5.5-preferences

… there were many options you could customize in preferences. Maybe the blue background burns your eyes – you could opt for a much more classical black in the Colours dialog.

word5.5-preferences-colours

An autosave feature was also available, but not turned on by default. This would be a great help when the power unexpectedly goes out, although in general MS-DOS programs rarely crashed in comparison with Windows programs of the time.

word5.5-preferences-customize

As an MS-DOS word processor, Word 5.5 was well advanced. That was to be expected because, at the time it was released, the MS-DOS word processing market was very much being cannibalized by the Windows market. The lack of WYSIWYG presentation is quite apparent to those who have "grown up" mostly using graphical word processors, and feels like flying blind. The features, however, are very extensive. As such, this can be considered the "state of the art", with only one more Word for MS-DOS to be released (version 6).

Microsoft Word 6.0a for Windows

I suppose all that talk about Microsoft Word 5.5 is merely a prelude to Microsoft Word 6.0a for Windows. This is the version of Microsoft Word I remember most, because it was my first exposure to graphical word processing. Interestingly, because of a desire to synchronize version numbers amongst the Word family, they settled on version 6.0 despite this being only the third release of Microsoft Word for Windows (it would otherwise have been a version 3.0).

DSC_9591

This part of the flashback was made possible by a generous OCAU member who was willing to send me the set of disks for the cost of postage. Word 6.0 came on a total of nine(!) high density 3.5″ floppy disks. These were painlessly imaged with the KryoFlux so they could be mounted in my Windows 3.11 VM.

word6-install

word6-icons

With that many floppies, the install process was somewhat involved, but it completed easily. It's noteworthy that, despite this really being a third release, the colour scheme (grey dialog box backgrounds) in the setup shows it was aligned much more with PowerPoint and Excel 4.0 than with their 3.0 releases, which featured white dialog box backgrounds. Another giveaway is the fact that it installs into the Microsoft Office program group, rather than the individual program groups which earlier software used. In essence, the idea was to portray this as an essential part of the "Office" productivity "suite" which Microsoft sold very successfully.

word6-splash

Starting the software brings up this very memorable splash screen showing a pen and paper, with a copyright date of 1994. Unfortunately, as the VM is extremely fast, the splash isn't fully displayed (even with screen capturing at 60fps) before disappearing to make way for a user-friendly interface with tips.

word6-tip

By default, it displays documents in the drafting mode, which is more familiar to existing users, but it’s possible to toggle (with a click on the icons next to the horizontal scroll bar) into WYSIWYG mode (the default choice of most modern word processors). This makes it look like you are dealing with a sheet of paper, with rulers and margins.

word6-wysiwygmode

A quick walk through the menus shows you the features which existed, mostly contemporary with Excel 4.0 and PowerPoint 4.0.

word6-menus

word6-about

The about dialog shows us all the version information.

Some of the integrated features are provided by third parties, whose copyrights are presented in the dialog as well.


Nice features? Well, for one, the toolbar is customizable by right-clicking, and right-clicking in the document itself invokes context-sensitive menus.

word6-toolbar-customize word6-rightclick

word6-fontselector

It is a little different – for example, font rendering was costly in terms of resource consumption, so the font selection is "without" preview – unlike what I remember from Mac-based word processors of the time.


Just playing around, it seemed many of the functions I would use are all there in some form, just less refined. For example – want to format the text? Well, the dialog, even today, is pretty similar.

word6-font-selection-colour

Formatting fonts is a cinch, as is putting in WordArt (in a similar way to that demonstrated with PowerPoint 4.0 – as an OLE object). It's also possible to bring up the drawing toolbar at the bottom for some simple shapes, although there wasn't really a large library of pre-defined autoshapes like we have today.

word6-wordartformat

word6-table

Putting in bulleted text was easy using the toolbar button, but some features, like pressing Enter twice to "escape" bullet mode back into paragraph text, weren't implemented, thus necessitating a manual click of the button to toggle bullet mode off.

It was possible to easily define tables with a quick drag of the mouse, but it seems like table borders were not solid by default, and appear “dotted”.

word6-sampletable

Drawing is easily accomplished with the drawing toolbar. I was also able to put in a caption for cross-referencing purposes too. It’s amazing just how many features were available that I just didn’t know about back then because I was never taught properly how to use the software!

word6-sampledrawing

Many properties were available for the drawn shapes to produce more complex filled objects. Pretty cool, given the limitations of most displays at the time in regards to colour.

word6-drawingprops word6-drawingprops2 word6-drawingprops3

Back then, if you keyed in a hyperlink (for example), it wouldn't be recognized, because hyperlinks just weren't a common thing yet. Likewise, if you made a typo, it would just sit there without any indication. Spell checking was a resource-intensive process, so you had to invoke it manually.

word6-spelling

Tracking changes doesn’t seem to be a feature, and neither were comment bubbles, but instead, you had the Annotate options, where you could enter your comments and even record voice annotations. It doesn’t seem to be a commonly used feature … and I don’t remember seeing that option in modern versions of Word.

word6-annotations

Best of all? Well, there’s print preview, but you probably don’t need it if you edit using WYSIWYG mode. Images are a cinch, along with the supplied clip art.

word6-printpreview

Conclusion

Given the finite time I have to spend with these old pieces of software, it's not possible to document every single feature that was available. It was definitely a trip down memory lane to see the software in action, especially given that I'm now much more adept at using it. The feature-set developments and refinements can be seen; some of them are really subtle, yet they are great conveniences we take for granted. That being said, the core functionality we take for granted today was already in place by the early 1990s, and the revolutionary nature of WYSIWYG editing has never been clearer.


Quick Review: Toshiba Exceria 64GB UHS-I Class 3 microSDXC Card (SD-C064GR7VW060A)

Our demand for storage is pretty much insatiable, and companies continue to roll out new products which are larger, faster and sometimes even cheaper than before. Of note is the proliferation of microSD slots in Android and Windows phones and tablets, as well as the use of microSD cards in action cameras, which are beginning to demand even higher speeds to support the capture of high quality 4K video.

This post will look at the Toshiba Exceria 64GB UHS-I Class 3 microSDXC card (model number SD-C064GR7VW060A), which was sent to me for testing by a friend. The card was (apparently) a chance discovery at a local retail outlet, selling at a reasonable (~$1/GB) price – which is great news, as Toshiba cards have rarely been seen at retail here.

DSC_9584

The package it comes in is mostly blue in colour, a departure from previous Exceria cards with their "type-x" nomenclature. The box is a thin card box with colour print and silver foil, boasting an impressive 95MB/s read, 60MB/s write speed and 64GB capacity. As with most flash cards, it is "water proof" (compatible? eh?), and this one is Made in Japan. Plastic seals adorn every joint, and an anti-counterfeit hologram is visible on the top tab. As with most Exceria cards, there are warnings not to use the card in adapters – this is because the high-speed signalling can be incompatible with adapters or cause errors.

This card differs from the majority of cards on the market, as most of them are still rated in the older Speed Class system, which tops out at Class 10 (10MB/s), denoted by a 10 enclosed in a C. The newer UHS Speed Class ratings have a number inside a U and represent the minimum write speed in units of 10MB/s. This is the first card I have that is rated for a class greater than UHS Speed Class 1 (i.e. a U with a 1 in it).

This can lead to confusion: as UHS-I cards with a Class 10 rating are commonly seen, people may naturally assume that the I in UHS-I represents the speed class, when it actually denotes the bus interface. There is also a faster UHS-II bus available for even higher performance cards.

So, in short, this is a card that communicates over the UHS-I bus, with a minimum write speed of UHS Speed Class 3 (30MB/s). It can only achieve its maximum speed over the UHS-I bus, with backwards compatibility to various slower “regular” SD card bus speeds but with performance limitations.
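
To summarise the markings discussed above, here is the mapping from the class symbols to the minimum sequential write speeds defined by the SD Association:

```python
# Minimum sequential write speeds implied by the SD card speed markings
# discussed above (per the SD Association's class definitions), in MB/s.

speed_class = {"Class 2": 2, "Class 4": 4, "Class 6": 6, "Class 10": 10}  # number in a "C"
uhs_speed_class = {"U1": 10, "U3": 30}                                    # number in a "U"

card = {"bus": "UHS-I", "uhs_class": "U3"}   # the Exceria reviewed here
print(f"Class 10 guarantees {speed_class['Class 10']} MB/s; "
      f"{card['uhs_class']} guarantees {uhs_speed_class[card['uhs_class']]} MB/s "
      f"over the {card['bus']} bus.")
```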

DSC_9585

The box itself contains text on the back including a disclaimer about usable storage capacity in binary gigabytes. The inside of the box elaborates more about the speed limitations of different SD card bus speeds, and can be accessed by cutting along the dotted line. As this was “on loan”, I didn’t cut it open.

DSC_9587

As with most of the “higher end” Exceria cards, this one came in a clear plastic shell which is held by a larger milky-white plastic tray. It’s a bit of overkill if you ask me.

DSC_9588 DSC_9589

The card itself has standard white printing including the speed information and capacity on the front. A laser etched serial number is also placed on the front. The rear is unmarked, although features a ribbed appearance.

For reference, the card information is as follows:

Capacity: 62,813,896,704 bytes
CID: 02544d554330453320f1ea4c3d00e7b1
CSD: 400e00325b590001d3ff7f800a4000e3
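
For those curious, the CID register follows a fixed field layout in the SD specification (manufacturer ID, OEM ID, product name, revision, serial number and manufacturing date), so it can be picked apart as in the minimal sketch below – assuming the hex string above is the raw register, most-significant byte first.

```python
# Minimal sketch: decoding the 16-byte CID register printed above, assuming the
# hex string is the raw register (most-significant byte first) and following
# the field layout in the SD specification.

cid = bytes.fromhex("02544d554330453320f1ea4c3d00e7b1")

mid = cid[0]                                 # manufacturer ID (0x02 is commonly listed as Toshiba)
oid = cid[1:3].decode("ascii")               # OEM / application ID
pnm = cid[3:8].decode("ascii")               # product name, 5 ASCII characters
prv = f"{cid[8] >> 4}.{cid[8] & 0xF}"        # product revision, BCD
psn = int.from_bytes(cid[9:13], "big")       # product serial number
mdt = ((cid[13] & 0x0F) << 8) | cid[14]      # manufacturing date, 12 bits
year, month = 2000 + (mdt >> 4), mdt & 0xF

print(f"MID=0x{mid:02x}  OID={oid!r}  PNM={pnm!r}  rev={prv}  "
      f"serial=0x{psn:08x}  manufactured {year}-{month:02d}")
```

Running this over the CID above gives an OID of "TM", a product name of "UC0E3", revision 2.0 and a manufacturing date of July 2014.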

Performance Testing

The card passed a full fill and multiple passes of verify with no problems at all. The behaviour of the card does deserve some mention, however.

Sequential Read with HDTune Pro and Transcend RDF8

With a new card straight out of the box, the sequential read speed seems to fall somewhat short of the stated 95MB/s.

hdtblank-exceria3-rdf8

However, in a pattern which seems to be consistent with high performance cards, after filling every sector, the card actually gets faster. But only by a little.

hdtwritten-exceria3-rdf8

The speed averages 78.9MB/s, with a few variations along the way which are probably due to the flash memory in the card itself, or timing issues between the card and reader. This is still blisteringly fast in general, and leaves even the SanDisk Extreme (~45MB/s) in the dust.

Sequential Read with HDTune Pro and Realtek RTS5301

hdtblank-exceria3-rts5301

The Realtek-based reader didn't have any compatibility problems using the microSD slot, and was able to negotiate UHS-I with the card straight away. The read performance, however, was lower at 67.9MB/s.

CrystalDiskMark with Transcend RDF8

cdm-rdf8

With the reference CrystalDiskMark benchmark, we see the card achieve a much more interesting read score of 82.64MB/s, which is among the fastest results we have seen with this reader/USB 3.0 controller combination. The write speed can also be seen to exceed 60MB/s, thus beating the specification on the package. The card appears to perform excellently at sequential, large-file, streaming-style operation, but takes severe penalties at small-block 4k accesses. This is not an ideal situation if you wish to use the card with embedded systems, but photographers and videographers should be pretty happy with it.

CrystalDiskMark with Realtek RTS5301

cdm-rts5301

The speeds registered with the Realtek-based unit are inferior to those of the Transcend RDF8. This is probably due to performance limitations of the chipset, but it's good to see it put in a strong result nonetheless.

H2testW with Transcend RDF8

h2testw-exceria3

A nearly full write and verify of the filesystem was completed without error. The speeds with the RDF8 were slightly below the CrystalDiskMark results, likely due to the additional overhead of checking the test data.

Conclusion

The card is one of the few on the market with a UHS Speed Class rating higher than 1, and it seems to perform relatively well.

The card provides read and write performance very similar to the full size Exceria Type2, but in a microSDXC form factor. The high speeds generally applied only to large sequential accesses, with significant penalties for small 4k block accesses, making this card a good choice for photographers and videographers, but a poor choice for embedded systems users. The price is very competitive, especially when compared with the main market competitors, and it is Made in Japan, which should mean quality.

That being said, it doesn’t mean that other cards on the market which are only UHS Speed Class 1 are slower. They may not have been qualified for speed class 3, but may still be able to offer 30Mb/s or faster writes (as the database seems to show). Of course, devices that mandate a certain UHS Speed Class will still require those certified cards. Maybe this is a sign we will see more UHS Speed Class 3 marked products on the shelves soon.
