Balancing materials, power, and cost in modern computer design
In Balance
"maddog" looks at the idea of balance in modern computer design – the trade-offs that must be made by designers to meet changing requirements.
A few years ago, I wrote an article about how much paper tape (used on an old ASR-33 Teletype) it would take to hold 2 terabytes (TB) of data. Without going into the detail of that particular article, I had figured out that it would take more than 6,330 years to read in or write out 2TB of paper tape, assuming the Teletype did not break in that time.
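As a quick sanity check on that number – a minimal Python sketch, assuming the ASR-33's 10 characters per second and one byte per tape character:

```python
# Back-of-envelope: how long would an ASR-33 take to read or write
# 2TB of paper tape at 10 characters per second?
TAPE_RATE_CPS = 10               # ASR-33 speed: 10 characters per second
DATA_BYTES = 2 * 10**12          # 2TB (decimal terabytes)
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = (DATA_BYTES / TAPE_RATE_CPS) / SECONDS_PER_YEAR
print(f"{years:,.0f} years")     # about 6,338 years
```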
That was one illustration of the issue of "balance" – the various trade-offs that computer designers have made through the years to accommodate the technology on hand. In earlier days, storage devices had such small capacities that large main memories did not make much sense.
To start with the easiest example of balance, try to imagine a modern-day cell phone built out of the transistors available in 1968. Some of those transistors cost $1.50, while a gallon of gasoline (about four liters) only cost 35 cents. The size of the transistor meant that your cell phone would probably cover the state of Texas, and the power requirements would demand the output of the world's largest hydroelectric plant while creating a real danger of climate change on the spot.
I have not done any actual calculations to back up the statements in that last paragraph, but I don't really need to. The point comes through that a cell phone of today would have been impractical to build back then, even if we had known how to build it. So, the first thing that helps create the balance of the modern-day computer is the density of materials – the number of components we can put in a small space through micro-miniaturization. This density also helps lower power consumption, which in turn lowers the heat generated. All of these things help create an affordable product once a solution is found.
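Still, purely as an illustration – every number below is an assumption, and a charitable one at that – a back-of-envelope sketch shows the scale of the problem:

```python
# Hypothetical: a modern phone's processor built from 1968 discrete
# transistors. All figures are loose assumptions for illustration only.
TRANSISTORS = 15e9      # assumed transistor count of a modern phone SoC
AREA_CM2_EACH = 1.0     # assumed board area per discrete part, with wiring
WATTS_EACH = 0.01       # assumed 10 milliwatts per transistor
DOLLARS_EACH = 1.50     # the 1968 price quoted above

print(f"Board area: {TRANSISTORS * AREA_CM2_EACH / 1e10:,.1f} square km")
print(f"Power draw: {TRANSISTORS * WATTS_EACH / 1e6:,.0f} megawatts")
print(f"Parts cost: ${TRANSISTORS * DOLLARS_EACH:,.0f}")
```

Even with those charitable numbers, you get more than a square kilometer of circuit boards, a dedicated power station, and tens of billions of dollars in transistors alone – Texas-sized exaggeration or not, that phone was not going to be built.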
Density helps with another factor: speed. As Rear Admiral Grace Murray Hopper used to demonstrate with her 12-inch length of telephone wire representing a nanosecond (and, later, her pepper-grain-sized picoseconds), light and electricity take time to travel through circuits, and it takes time for a charge to build up and change a 0 to a 1. If you do not give the electricity enough time, then the signal that was supposed to be a 1 ends up looking like a 0 to the circuit that is waiting for it. Of course, as you shrink the circuits, electricity does not have to travel as far, and it does not take as many electrons to charge the smaller circuit, so cycle times can decrease.
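Hopper's wire makes the arithmetic easy to reproduce. A minimal sketch, assuming (as a common rule of thumb) that signals in circuit traces propagate at about half the speed of light:

```python
# How far can a signal travel in one clock cycle?
C = 299_792_458    # speed of light, meters per second
PROP = 0.5         # assumed propagation speed in a trace, relative to c

for hz, label in [(1e6, "1 MHz"), (100e6, "100 MHz"), (3e9, "3 GHz")]:
    cm = C * PROP / hz * 100       # distance per cycle, in centimeters
    print(f"{label:>8}: {cm:,.1f} cm per clock cycle")
```

At 3GHz, a signal travels roughly 5cm per cycle – the circuit had better be smaller than that.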
However, as the circuits keep getting smaller, it becomes more difficult for manufacturers to make them in large quantities. The good news is that new manufacturing techniques and new materials continue to allow the components to become smaller and closer together.
So far, I have been discussing issues that most people would assign to CPUs, but they also affect other devices. However, balance also affects the ratio of CPU speeds to memory sizes, cache sizes, and pipelining in processors. Few of the early computers had floating-point instructions, caches, or pipelining. The cost of the components was so high that these mechanisms for speeding up calculations were just not affordable for most computers. The Intel 386 processors had no floating-point hardware; you had to buy another Intel chip (the 387 math coprocessor) to put on the motherboard to get that feature. Thus, a lot of people avoided doing floating-point calculations unless they had to, substituting integer instructions whenever possible.
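One common substitution was fixed-point arithmetic: scale your values into integers, do all the math with integer instructions, and convert back only for display. A minimal sketch (my own illustration, not period code):

```python
# Fixed-point arithmetic: fractional values stored as scaled integers
# so that all the math uses integer instructions. SCALE = 1000 keeps
# three decimal places of precision.
SCALE = 1000

def to_fixed(x):
    """Convert to fixed point once, at the edges of the program."""
    return int(round(x * SCALE))

def fixed_mul(a, b):
    """Multiply two fixed-point values, rescaling to keep the format."""
    return (a * b) // SCALE

price = to_fixed(19.95)        # stored as the integer 19950
qty = to_fixed(3)              # stored as the integer 3000
total = fixed_mul(price, qty)  # 59850, meaning 59.850

print(total / SCALE)           # 59.85 -- floating point only for display
```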
Likewise, when your memory size is 4,000 bytes, having a processor run at several gigahertz is not practical; you just will not be able to deliver data fast enough to keep the processor fed. And if your RAM is small and your processor slow, the average person will not enjoy the wait between actions – for example, between taking a picture and being ready to take the next one.
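To put assumed numbers on that imbalance: a processor consuming one 64-bit word per cycle would chew through a 4,000-byte memory in a fraction of a microsecond:

```python
# How long does a tiny memory keep a fast processor busy?
# Assumed numbers, purely to illustrate the imbalance.
CLOCK_HZ = 3e9         # assumed 3GHz processor
BYTES_PER_CYCLE = 8    # assumed appetite: one 64-bit word per cycle
MEMORY_BYTES = 4000    # the 4,000-byte memory from the text

cycles = MEMORY_BYTES / BYTES_PER_CYCLE
print(f"{cycles:.0f} cycles ({cycles / CLOCK_HZ * 1e9:.0f} nanoseconds)")
# 500 cycles, about 167 nanoseconds -- then the processor starves
```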
This brings me to the concept of "real time." Slow processors, small memories, and other factors mean that tasks that seem real time today were not so a few years ago; back then, they were completely impractical (in terms of cost or technology) to complete within the time their results were needed – a loose interpretation of real time.
Balance in a system is both technological and financial: Can the customer afford the solution? Often, our solutions are limited by the base technologies, the balance of putting them all together, and the total cost. All three have to be considered.