Legends of operating system development

Doghouse – OS Development

Article from Issue 251/2021
Author(s): Jon "maddog" Hall

Thinking about the history of Linux, maddog sheds light on why there are so many different flavors of operating systems.

Perhaps because I had been reminiscing about the 30 years of Linux kernel development, or perhaps because of a recent discussion over whether the GNU project or the Linux kernel was the more significant part of the system that most people call "Linux," I began pondering how many people think about software creation as if it were being done today instead of how it was done 40 or 50 (or more) years ago. This kind of thinking often creates "urban legends" about software that are passed on from person to person.

Take the conspiracy theory that system companies created all these different operating systems to "lock in their customers." I have been in the computer field for over 50 years and participated in many engineering meetings about new functionality. Not once did I ever hear customer lock-in cited as a reason for creating new functionality.

If you remember (or maybe you were not around then, so just trust me on this), computers (even mainframes) had relatively small amounts of memory, measured in kilobytes (not gigabytes or even megabytes), and slow, single-core CPUs with slow disk drives.

Normally, these computers were so slow and small that it was hard to create an operating system that could handle batch processing, time-sharing, real-time, and other workloads simultaneously. We also had ideas for creating interfaces in the operating system to help it handle application-specific code for fields such as medicine, manufacturing, education, and so forth.

If locking customers in to their systems had been the sole reason for the APIs and functionality vendors supplied, Digital Equipment Corporation (DEC) would have needed only one operating system on its PDP-11s, instead of the multitude (over 11) of operating systems DEC offered (and engineered and supported).

As computer speed and memory size started to grow, the idea of having one operating system that could "do it all" began to appear, although offerings might still be split between business (84 percent of the computer market) and scientific (16 percent of the market) computing.

However, there were still differences in interfaces from vendor to vendor, which meant big differences in APIs and user training, and applications that ran on only one or two vendors' systems.

Out of this came the idea of making applications portable, so you could have the same application on various operating systems. Standards development began to help software portability, including standards for languages and language runtime systems that resulted from hard work by people from each company to create and test them. Some companies made "extensions" to those standards, and (for the most part) good programmers stayed away from those extensions.

Eventually, as more and more computers were used by more and more people, experience with portability showed that it was better and cheaper to have the same interfaces across lots of computers than to have the operating system finely tuned to the actual load.

Since Unix was created not by a system vendor but by a third party, it was portable across lots of different hardware and offered the same programming interfaces everywhere. Of course, as Microsoft operating systems rolled out, the same thing could be said for most of the desktop systems that evolved.

This desire to have the same operating system on every platform was demonstrated to me time after time when DEC's Unix engineers would bring out really great functionality that our customers loved, but the first thing out of the customers' mouths was "When can I have this on my Sun systems?" Why couldn't the Unix companies see this coming, when so many of their customers loved using the same software on all of their PCs bought from IBM, Dell, Compaq, etc.?

All of this came back to me in the conversation about GNU and the kernel. Yes, Richard Stallman wanted to build a complete operating system called GNU. But in the environment of the early 1980s, with very expensive hardware, no real Internet, and many other limitations, he started with freely available, freely changeable development tools that gave developers the same tools across a wide variety of operating systems.

If Richard had first started working on a kernel, he would have had nothing to run on it. And Linus Torvalds did not have to worry about the libraries, compilers, utilities, windowing system, etc., because those already existed.

Some people might ask, "What about BSD?" The issue with BSD is one of timing. By the time version 1.0 of the Linux kernel was released in early 1994, Linux distributions like SLS, Red Hat, Debian, Slackware, and Yggdrasil (and others) had already been released, and the BSD lawsuit was still going on.

History and circumstances mean a lot. Many things in the computer industry have happened almost by accident.

The Author

Jon "maddog" Hall is an author, educator, computer scientist, and free software pioneer who has been a passionate advocate for Linux since 1994 when he first met Linus Torvalds and facilitated the port of Linux to a 64-bit system. He serves as president of Linux International®.
