Aging Computer Infrastructure

Doghouse – Old Systems

Article from Issue 237/2020
Author(s): Jon "maddog" Hall

As unemployment claims surge, US computer systems are straining under the increased load. In this column, maddog weighs in on COVID-19 and COBOL.

The first language I ever programmed in was Fortran, which I learned in 1969 by reading a book and practicing on an IBM 1130 computer that ran one job at a time and had basically no operating system. You linked the device drivers for the hardware into your program and effectively booted your program, not the operating system. Later in life, I would tell my students that we were not doing "Computer Science," but "Computer Black Magic."

Few systems were "networked," or if they were, they were dial-up networks to computers that held "bulletin boards" for exchanging programs and data. Music was on vinyl, and graphics were ASCII art printed on line printers (for the most part).

I am writing this today because of a YouTube video by Russell Brandom (https://www.youtube.com/watch?reload=9&v=Ox_Wm6XQnxI), who criticized the US government for still using COBOL on certain systems. He said that these systems were not up to handling the huge number of unemployment claims currently being filed, whereas companies like Netflix can scale to meet the large demands placed on them "even during the coronavirus outbreak." He implied that it was the "fault" of COBOL.

Then one of my Facebook fans asked for my comment on a comparison between the "government's system" and Netflix.

Here are a few caveats before I even start.

First, unemployment claims are handled on a state-by-state basis, so their systems, software, and rules are different between states.

Second, I have not studied any of these systems, at all, nor have I spent a lot of time closely studying Netflix.

Third, I know that in many government departments it is hard to get money to update systems that are "working fine," particularly when you cannot prove a cost savings in doing the job you are already doing.

So with those three caveats, I started drawing comparisons.

The systems that enroll unemployment claims were probably written some time ago, and COBOL was a fairly good language in which to write rules-based transactions.
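This kind of rules-based transaction is easy to express in almost any language. A minimal sketch in Python, with field names and thresholds invented purely for illustration (no real state system works exactly this way):

```python
# Hypothetical rules-based eligibility check for one unemployment claim.
# Field names and thresholds are invented for illustration only.

def eligible(claim):
    """Apply a fixed set of business rules to a single claim record."""
    rules = [
        claim["weeks_worked"] >= 20,     # minimum recent work history
        claim["base_wages"] >= 2600.00,  # minimum earnings in base period
        not claim["voluntary_quit"],     # did not quit without cause
    ]
    return all(rules)

claim = {"weeks_worked": 34, "base_wages": 18250.00, "voluntary_quit": False}
print(eligible(claim))  # True
```

The appeal of COBOL for this kind of work was much the same: each claim is one record, and each rule is one readable statement applied to it.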

The systems might typically have input during business hours, between 9am and 5pm (notice my use of this quaint designation for time) five days a week and not on federal or state holidays.

When unemployment is low, meeting the demands for new enrollment would be "easy," and when unemployment started to go up in peak times (mild recession, seasonal changes), people could work after hours and on weekends to meet the demands.

Typically this work might be done on highly dependable mainframe computers, designed with hot-swap and redundant capabilities in mind. Even if the mainframe were to fail dramatically, the machine could be fixed relatively quickly, and the processing of claims would continue.

In peak times, enrollment might be delayed a day or two, and people might not even notice it.

Sizing the system for the highest peak load it might ever see would not be cost effective, since that computer would probably not be shared with any non-departmental load (such as the agriculture department).

Over the past couple of years, the unemployment rate has been very low. In "normal" years, it might be 3.5 to 4.5 percent; if unemployment goes to zero percent (everyone has a job), it becomes hard for employers to find new employees.

In the recession of 2008, the unemployment rate went up to 10 percent of the total workforce, but that happened over several months. As people lost their jobs, they would apply for unemployment funds. The systems could handle it, because, at the peak of that recession, nationwide claims were less than one million claims per week.

In this pandemic, the unemployment rate went from 3.5 percent to, by some estimates, over 25 percent of the workforce, with over 30 million claims filed in a six-week period: roughly 5-6 times the peak of the 2008 recession and 10-12 times the "normal" rate.
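The back-of-the-envelope arithmetic behind those multiples, using the figures quoted above (the "normal" weekly figure is an assumption chosen to be consistent with the 10-12x claim, not an official statistic):

```python
# Rough ratios implied by the figures in the text.
pandemic_claims_per_week = 30_000_000 / 6  # ~5 million claims/week over six weeks
peak_2008_per_week = 1_000_000             # approximate peak weekly claims, 2008 recession
normal_per_week = 450_000                  # assumed "normal" weekly claims (illustrative)

print(pandemic_claims_per_week / peak_2008_per_week)  # 5.0
print(pandemic_claims_per_week / normal_per_week)     # ~11.1
```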

Netflix and many other online services have different demands. They are meant to be available "24 by 7." Typically, as requests for service come in, they are distributed across a large group of servers, and servers may be added or removed (for maintenance) as needed.
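The idea of spreading incoming requests across an elastic pool can be sketched in a few lines; server names here are hypothetical, and real services use dedicated load balancers rather than a simple round-robin loop:

```python
from itertools import cycle

# Hypothetical pool of application servers; machines can be added to or
# removed from this list as capacity or maintenance needs change.
servers = ["app-1", "app-2", "app-3"]

assign = cycle(servers)  # round-robin: hand requests to each server in turn
for request_id in range(5):
    print(request_id, "->", next(assign))
```

The point is architectural: because each request can go to any server in the pool, capacity scales by changing the size of the pool, not by replacing one big machine.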

Services do see "peak loads," but they tend to be focused more around holidays and events. In any case, while Netflix is probably experiencing higher than average loads, I doubt very sincerely that they are 10-12 times their "normal" peak load, or even 5-6 times.

Netflix also has other techniques for handling loads. They can offer only SD movies instead of HD movies. They can stream at lower speeds.

COVID-19 is bringing out many lessons. When we talk about repairing our "aging infrastructure" in the US, we normally talk about highways and bridges. Perhaps we should also talk about our aging computer systems.

The Author

Jon "maddog" Hall is an author, educator, computer scientist, and free software pioneer who has been a passionate advocate for Linux since 1994 when he first met Linus Torvalds and facilitated the port of Linux to a 64-bit system. He serves as president of Linux International®.
