Balancing materials, power, and cost in modern computer design

In Balance

Article from Issue 180/2015
Author(s): Jon "maddog" Hall

"maddog" looks at the idea of balance in modern computer design – the trade-offs that must be made by designers to meet changing requirements.

A few years ago, I wrote an article about how much paper tape (used on an old ASR-33 Teletype) it would take to hold a terabyte (TB) of data. Without going into the details of that particular article, I had figured out that it would take more than 6,330 years to read in or write out 2TB of paper tape, assuming the Teletype did not break in that time.
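
As a sanity check of that figure, here is a minimal back-of-the-envelope sketch in Python. The 10-characters-per-second rate is a commonly quoted speed for an ASR-33 and is my own assumption, not a number taken from the original article:

# Rough check of the paper-tape figure; the 10 bytes/second rate is assumed.
bytes_total = 2 * 10**12                  # 2 TB of data
rate = 10                                 # ASR-33 reads/punches ~10 characters per second
seconds = bytes_total / rate
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:,.0f} years")              # roughly 6,300 years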

That was one illustration of the issue of "balance" – the various trade-offs that computer designers have made through the years to accommodate the technology on hand. In those earlier days, storage devices had such small capacities that large main memories did not make much sense.

To start with the easiest example of balance, try to imagine a modern-day cell phone built out of the transistors available in 1968. Some of those transistors cost $1.50 each, while a gallon of gasoline (about four liters) cost only 35 cents. The size of those transistors meant that your cell phone would probably cover the state of Texas, and its power requirements would demand the output of the world's largest hydroelectric plant while creating a real danger of climate change on the spot.

I have not done any actual calculations to back up the statements in that last paragraph, but I don't really need to. The point comes through: a cell phone of today would have been impractical to build back then, even if we had known how to build it. So, the first thing that helps create the balance of the modern-day computer is the density of materials – the number of components we can put in a small space through micro-miniaturization. This density also helps lower power consumption, which in turn helps lower the heat generated. All of these things help create an affordable product once a solution is found.

Density helps with another factor: speed. As Rear Admiral Grace Murray Hopper used to demonstrate with her 12-inch length of telephone wire representing a nanosecond (and, later, grains of pepper representing picoseconds), light and electricity take time to travel through circuits, and a charge takes time to build up and change a 0 to a 1. If you do not give the electricity enough time, the signal that was supposed to be a 1 ends up looking like a 0 to the circuit waiting for it. Of course, as you shrink the circuits, electricity does not have to travel as far, and it does not take as many electrons to charge the smaller circuit, so cycle times can decrease.
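
Hopper's prop is easy to verify with a few lines of Python; the only figure below is the vacuum speed of light, not anything taken from her lectures or from this column:

# Light in a vacuum covers roughly one foot per nanosecond.
c = 299_792_458                   # speed of light in metres per second
ns = 1e-9                         # one nanosecond
inches = (c * ns) / 0.0254        # convert metres to inches
print(f"{inches:.1f} inches")     # about 11.8 inches -- close to her 12-inch wire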

However, as circuits keep getting smaller, it becomes more difficult for manufacturers to make them in large quantities. The good news is that new manufacturing techniques and new materials continue to allow components to become smaller and closer together.

So far, I have been discussing issues that most people would assign to CPUs, but they also affect other devices. Balance also shows up in the relationship between CPU speed and memory size, cache size, and pipelining in processors. Few of the early computers had floating-point instructions, caches, or pipelining. The cost of the components was so high that these mechanisms for speeding up calculations were just not affordable for most computers. Many of the early Intel 386 processors had no floating-point hardware; you had to buy another Intel chip (the 387 math coprocessor) to put on the motherboard to get that feature. Thus, a lot of people avoided doing floating-point calculations unless they had to, substituting integer instructions whenever possible.
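
As a small illustration of that integer substitution, here is a minimal fixed-point sketch in Python; the scale factor and the price example are my own invention, not anything from the column:

# Fixed-point arithmetic: represent values as scaled integers so that no
# floating-point hardware is needed. The scale factor is an arbitrary choice.
SCALE = 1000                      # three implied decimal places

def fixed_mul(a, b):              # multiply two fixed-point values
    return (a * b) // SCALE

price = 19990                     # 19.99 represented as 19.99 * 1000
qty = 3000                        # 3.0 represented as 3.0 * 1000
total = fixed_mul(price, qty)     # 59970, i.e. 59.97
print(total)                      # integer arithmetic only -- no FPU involved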

Likewise, when your memory size is 4,000 bytes, having a processor run at several gigahertz is not practical; you just will not be able to deliver the data fast enough to keep the processor fed. And, if your RAM is small and your processor slow, the average person will not really enjoy the wait between actions – for example, between taking a picture and being ready to take the next one.
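
To put some rough numbers on that imbalance (my own illustrative figures, not from the column): even if such a processor touched only one new byte per clock cycle, a 4,000-byte memory would be swept through almost instantly:

# How long a tiny memory lasts against a fast clock (illustrative numbers).
clock_hz = 3 * 10**9              # a 3 GHz processor
memory_bytes = 4000               # a 4,000-byte memory
seconds = memory_bytes / clock_hz # assuming one new byte consumed per cycle
print(f"{seconds * 1e6:.2f} microseconds")   # about 1.33 microseconds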

This brings me to the concept of "real time." Slow processors, small memories, and other factors mean that tasks that seem to be real time today were not so real time a few years ago; they were completely impractical (in terms of cost or technology) to complete in the time they needed to be finished – a loose interpretation of real time.

Balance in a system is both technological and financial: Can the customer afford the solution? Often, our solutions are limited by the base technologies, the balance of putting them all together, and the total cost. All three have to be considered.

The Author

Jon "maddog" Hall is an author, educator, computer scientist, and free software pioneer who has been a passionate advocate for Linux since 1994 when he first met Linus Torvalds and facilitated the port of Linux to a 64-bit system. He serves as president of Linux International®.
