What's In A Name? The story of DLT vs dlt
Paw Prints: Writings of the maddog
The TK50 was a cartridge tape and tape drive system launched by Digital Equipment Corporation in 1984. It was not the first cartridge tape to come out. 3M had developed a cartridge tape called “QIC” (for “Quarter Inch Cartridge”) in 1972, and our Unix systems supported it, but Digital decided to create its own tape cartridge and drive, and of course it was deemed to be “proprietary”.
For its day it was a competitive system:
o it used a pocket-size tape cartridge (o.k....for a large, coat-sized pocket)
o it used a “serpentine motion” which meant that it would write to the end of the tape, then shift the heads slightly and continue writing as the tape moved towards the beginning of the tape
o it was a streaming tape system, not “stop-start”
The last bullet needs a bit of explanation. “Stop-start” tapes would write one block of information at a time, with the block size usually ranging from some very small minimum to some fairly large maximum (for the day). Block sizes of about 64 Kbytes were typical, but block sizes of 80 bytes were not unknown. After each block there would be an “inter-record gap” of approximately ¾ of an inch, then another block of data, and so on down the tape.
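Those inter-record gaps are why block size mattered so much on stop-start tapes: every block, tiny or huge, paid for the same ¾-inch gap. A little arithmetic makes the point. The figures below (1600 bytes per inch, a 2400-foot reel) are typical 9-track numbers of the era chosen purely for illustration; they are not TK50 specifications.

```python
# Rough illustration of why block size mattered on stop-start tapes:
# every block costs one 3/4-inch inter-record gap of blank tape.
# Density and reel length are assumed 9-track-style figures, NOT TK50 specs.
DENSITY = 1600                    # bytes per inch of tape (assumed)
GAP_BYTES = int(0.75 * DENSITY)   # a 3/4-inch gap "costs" 1200 bytes of tape
TAPE_BYTES = 2400 * 12 * DENSITY  # raw capacity of a 2400-foot reel

def usable_bytes(block_size):
    """Real data that fits when every block pays for one gap."""
    blocks = TAPE_BYTES // (block_size + GAP_BYTES)
    return blocks * block_size

for size in (80, 8 * 1024, 64 * 1024):
    print(f"{size:>6}-byte blocks: {usable_bytes(size) / 1e6:5.1f} MB of data")
```

With 80-byte blocks the gaps eat over 90% of the reel; with 64-Kbyte blocks almost all of it holds data. That is why administrators of the day fussed over blocking factors.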
When outside the drive the tape was completely inside the cartridge, and there was a leader that was grabbed by the tape drive mechanism which then pulled the tape from the cartridge and wrapped it around a take-up reel, pulling the tape past the read heads.
When the tape first came out, it had a capacity of 98 megabytes. Yes, dear reader, you read that correctly....98 Megabytes. Of course the average size of a 5 ¼ inch Winchester disk in those days was either five or ten megabytes, so 98 Megabytes of storage was a lot.
However, it was not as simple as just saying it was 98 Megabytes. If you kept the data buffers full so the tape was always writing data, you could fit 98 Megabytes on it. On the other hand, if the buffer was empty the tape drive (trying not to stop forward motion) would start writing “nulls” to the tape in the hope that the buffer would soon have data in it, and as long as that continued the tape would keep moving and consuming space. If the buffer still was not filled, then after a while the tape would stop, back up, and re-position itself to slightly before where it had last written data. When the buffer filled again, the tape drive would start back up and continue writing from where it had left off. Therefore the two ways of filling your tape with 98 Megabytes of data were:
o always keep the buffer full of data to be written
o allow the tape to stop, back up and restart (which was very, very, VERY slow)
Otherwise there was the potential of ending up with a tape that was mostly “nulls” and very little real data.
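The trade-off above can be sketched as a toy simulation: while the host keeps the buffer fed the drive streams data; on an empty buffer it pads with nulls for a short grace period, then gives up and repositions. All the names, probabilities, and the grace period here are invented for illustration; real drive firmware was of course far more involved.

```python
# Toy model of a streaming tape drive, assuming invented timings:
# a full buffer writes a data block; an empty one writes filler
# "nulls" for a few ticks, then forces a slow stop/back-up/restart.
import random

def stream(feed_probability, blocks=1000, grace=3):
    """Return (data_blocks, null_blocks, repositions) for one run."""
    data = nulls = repositions = 0
    starving = 0                               # consecutive empty-buffer ticks
    for _ in range(blocks):
        if random.random() < feed_probability:  # host refilled the buffer
            data += 1
            starving = 0
        elif starving < grace:                  # keep streaming, write filler
            nulls += 1
            starving += 1
        else:                                   # give up: stop, back up, restart
            repositions += 1
            starving = 0
    return data, nulls, repositions

random.seed(1)
for p in (0.95, 0.50):
    d, n, r = stream(p)
    print(f"buffer full {p:.0%} of the time: "
          f"{d} data blocks, {n} nulls, {r} repositions")
```

Keep the buffer nearly always full and the run is almost entirely data; starve it and the output fills with padding, or the drive spends its time repositioning instead of writing.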
In those days of one-million-instruction-per-second CPUs and relatively small (2 Mbyte) memory systems with 5 Mbyte, relatively slow disks, keeping that buffer full was easier said than done. The time the disk head spent moving from one cylinder to another, with no data transferring, might be enough to empty the little I/O buffer in RAM. So we watched the little TK50 tape drive chug away, hour after hour, trying to back up our 10 Mbyte disk drive (or, if we were lucky, TWO 10 Mbyte disk drives).
Over the years Digital changed the TK50 to a TK70, taking the storage capacity up to 294 Mbytes by quadrupling the density of tracks on the tape. The TK70 tape drive could read TK50 tapes, but could only write TK70 format (and of course could only write to TK70 cartridges which were the same size, but of different tape density). About that time Digital decided to change the name of the tape architecture from Compac II to Digital Linear Tape (DLT), and later designs took the capacity up to ten Gigabytes.
Digital tried to put the tape out as an industry standard, but try as it might, the “TK**, Compac, DLT” tape was always perceived as a Digital proprietary format.
Then in 1994 the technology was sold off to Quantum. Quantum, of course, did not make computer systems; they “simply” made storage devices. One of the first things they did was rename the tape format from “Digital Linear Tape” to “digital linear tape”, and almost overnight the format became popular. Eventually capacities of 800 Gbytes became possible before Quantum switched to a more “open” standard, “Linear Tape Open” (LTO).
This story of how DLTs went from a Digital “proprietary” format to an “industry standard” of dlt reminds me of a story told about George Pullman, the creator of the famous “Pullman Company”, which made railroad travel luxurious. Pullman had started his company, but was struggling under a lack of capital to really get it going. One of the main railroad barons of the day went to Pullman and offered a very good deal for financing in the creation of a joint company. Pullman kept hesitating, and finally the railroad man asked Pullman why he was not jumping on the deal. Pullman asked “What will the company be called?”, and the railroad man quickly said “'The Pullman Company', of course.” Pullman then smiled and signed the paperwork.
Whether or not this story is true, it illustrates that something as innocent as a name will create a perception of either a proprietary standard or an open one. Digital struggled to make its tape an “open” industry standard, but people perceived it as being closed. Quantum, who was in the market of selling to all industry system vendors, was in the proper position to create what could be considered at least a pseudo standard in the industry.
Next: The lowly dd(1) command
I used one of those! At the university we had a couple of VAXen and one of them had a TK-something reader. I guess it was TK40, because I remember one hard disk being around 100 MB in size and the other around 1 GB, and we had a stack of tapes to make backups. Once I actually had to reinstall the system and it took a whole bunch of those tapes, like 10, to do it. At the same time I had a Colorado QIC system at home, and I remember it could back up something around 40 or 50 MB, which was a whole lot at the time.
Now, about 20 years later, my PC at home has close to 2 TB of storage, and more RAM than storage in any of the computers I used to use back then. *sigh*