
The Microprocessor as a Modern Odysseus

by Dennis Báthory-Kitsz


Note: This talk was given before the Vermont Chapter of IEEE in Burlington, Vermont, March 25, 1982
Copyright ©1982 by Dennis Báthory-Kitsz


Introduction

The microprocessor promises to be the universal machine tool of the next generation of industrial and appliance devices in this century. Tonight I will attempt to describe something about what a microprocessor is, some of the many human applications it is capable of assuming, some of the unusual attitudes its use encourages, and some local applications.



Miniaturization

An anniversary of historic proportions passed very quietly a few months ago. It has been just over ten years since the creation of the first true microprocessor -- a general-purpose digital device capable of receiving binary instructions and data, calculating, producing and storing results. A true computer had been created on a single slice of silicon.

Perhaps not since the plagues of the late Middle Ages has something so small had such a great impact on its contemporary landscape. The microprocessor was brought to fruition as the indirect result of a program of miniaturization undertaken during the heyday of the American space effort. Mechanical control and computational equipment had to be packed into a tiny capsule, together with an astronaut or two, and function reliably thousands of miles out in space. Split-second decisions were crucial, and as space travel moved further and further away from the surface of the earth, proportionately more and more control had to be taken by the spacecraft and its occupants. When radio waves were first seconds and then minutes away from ground control, each succeeding element of ship management and life support had to be taken over by computer equipment on board the craft.

Out of this drive for miniaturization, the first microprocessor was developed. It was announced in November 1971, and was called the 4004. Officially it was designed and manufactured for business purposes -- probably because by 1971 it was clear that America's initial goals in space had been met. Through the space programs the technical world had learned an enormous amount, and set about working on new applications for this knowledge.

The story is related in the March 1982 issue of Byte magazine. It sounds like history. In fact, it is history. In the broadest, most optimistic sense, the microprocessor offers to humankind the greatest body of tools since the industrial revolution of the Nineteenth Century, and perhaps even since the beginning of civilization.

Although the microprocessor uses electricity, its impact is even more important than the discovery of electricity itself, for electricity was a discovery, an increase in the scale of human awareness. But the microprocessor, like the wheel or the pulley or the telephone, was the result of inspired application of known but partially understood natural resources. Because, unlike raw electricity, it is numbered among the fruits of human labor, it better serves the human condition.

To start, let me present a curious scenario.



Surviving a Major Disruption

The great inventions of pre-Twentieth Century history have in large part been monumental. I mean monumental in both literal and figurative senses -- that the inventions are large not only in breadth of conception and general purpose, but also in physical substance. The advancement of technology has made possible more capable, yet more complex, machines and other tools for human production and creativity.

But with this capability and complexity has come a price, a hidden and not-so-hidden cost: the increasing interdependence of society. Society in fact becomes not a fabric, but rather something as durable as tempered glass. It is strong, yet with a structure so delicate that a sharp tap in crucial areas can cause it to splinter into a heap of rubble. Major wars of the past have been fought by and in an agrarian world. Where industrialized and industrializing nations were involved, the societal setbacks and dislocations were almost incomprehensible. Food and medical supplies were disrupted because they no longer came from self-sufficient units; their delivery depended on the whole remaining intact. Individual societal units, unless they had remained essentially agricultural, were stripped of even the basics of existence.

This fragility is doubly frightening today. Not only are our international destructive capabilities far more outrageous, but our cultural and intercultural dependencies are incontrovertible. Even putting the innate horrors of war aside, disruption of supplies and distribution systems and communication networks might set global society back centuries.

There are, however, nearly two million personal computers in this country, and perhaps several million more microprocessor-based computing tools of one kind or another. Their presence at once removes the possibility of society being set back to the stone age; the survival of a single microcomputer represents the survival of civilization, and the fast and efficient retrieval of the technological prowess made possible by the lightning-fast speeds of a digital computer. And because these microcomputers are hardy, dozens, perhaps thousands, will survive intact.

Such broad generalization needs to be qualified just a little: first, someone handy enough to use it must survive, and second, makeshift power of some sort must also be created. But the latest generation of these microprocessor-based personal computing machines is being programmed by grade-school children, and uses batteries ... or can run on solar cells, or wind, or water, or even biochemical power based on human waste. Beyond that, the mathematical and scientific knowledge of thirty centuries has been concentrated in the operating systems of personal computers costing no more than a business suit.



Progenitors of an Electronic Civilization

The arrival of computer power at the consumer level was not the work of the genius creators and purveyors of the microprocessor chips themselves. Basically, it was not even the work of the bold, risk-taking hobbyists and experimenters who graced the pages of Time magazine a few weeks ago. Instead, the enthusiastic acceptance of millions of personal computers by a lay public was brought about by ordinary capitalist marketing strategy. Seeing a product with consumer potential, three vendors of American cultural nitty-gritty -- Magnavox, Warner Brothers, and Radio Shack -- began to manufacture and market appliance-level products. They are the now-classic Odyssey and Atari video games, and the TRS-80 Model I microcomputer.

The demand for microprocessors for video games was at record high levels in mid-1981 as manufacturers placed parts requests in anticipation of heavy Christmas sales. A single game maker's order was rumored to be in excess of four million microprocessors. Othello, Simon, computer chess, and hundreds of other toys with sophisticated innards were designed. A new field was born, it seemed, overnight. Yesterday's New York Times business section contains a dozen pages of advertisements for Software Developers, Microprocessor Design Engineers, and Programmers.

What brought about this incredible burgeoning in demand for personnel of whom relatively few are well trained? It's important to be candid about the origins of such demand: defense applications. From the time the first stone was thrown, humankind has been methodically working toward more efficient methods of destruction and annihilation.

The March 1980 issue of High Technology magazine describes so-called Kamikaze missiles with pattern matching, loaded with course and target data. These missiles travel quickly, close to the ground, and contain minicomputers with 100 kilobytes of memory. In development are units with fast pattern-matching and self-reprogramming, with high-speed dual minicomputers and 200 kilobytes of memory. In five years, High Technology predicts, detailed pattern recognition will be coupled with target discrimination to select, say, a single supply truck out of a random group of vehicles. Faster microcomputers will take over this task.

The defense applications are inevitable, even if it is astounding that a powerful computer -- the culmination of centuries of scientific progress -- should be used to guide an implement of horror.

However, the popular enthusiasm is another matter. The microprocessor's entry into the American mainstream was made initially through arcade games -- first in bars, and then at home. The success of Pong led the way. At the time Pong first arrived, CB was king, capturing the popular imagination with dreams of riding high with ten-wheelers and evading the dreaded smokies. And Radio Shack was also riding high with those 10-4's. A pair of engineers by the names of Leininger and French had designed a small computer using the new Z80 microprocessor.

There is a story about the first meeting between the nervous engineers and Chuck Tandy, president of Radio Shack and the other companies under the corporation that bears his name. They had written a short BASIC program to calculate benefits and other mundane items as a demonstration. They suggested Tandy sit down at the machine and give it a try; he entered his salary, and the program immediately crashed. They were using a preliminary, integer-based language whose greatest number was 32767. Tandy's salary was too large for the computer.
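
To make the arithmetic concrete, here is a minimal sketch in C -- not the original integer BASIC, which is not reproduced here, and with an invented salary figure -- showing why 32767 is the ceiling: a 16-bit signed integer simply cannot hold anything larger, and on two's-complement machines an oversized value wraps around to a negative number.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        int16_t salary = 32767;            /* the largest value a 16-bit signed integer can hold */
        printf("maximum storable value: %d\n", salary);

        int32_t entered = 50000;           /* a hypothetical figure larger than the limit */
        int16_t stored  = (int16_t)entered;
        /* On two's-complement hardware the value wraps: 50000 - 65536 = -15536 */
        printf("%d stored in 16 bits becomes: %d\n", (int)entered, (int)stored);
        return 0;
    }

An early integer-only BASIC behaved analogously, which is why a salary in the tens of thousands was enough to stop the demonstration cold.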

Nevertheless, in 1977 Tandy Corporation hesitantly announced a very expensive new product. A few typed sheets with hand drawings were all store sales personnel had to explain this small "home and business computer". Neither Tandy nor the few initial customers had any idea that the TRS-80 would become the first true home computer, the Model T of the microprocessor age.

Far from Texas, the tale of the Apple Computer was being created. It is often told now ... how these two young folks were filled with the energy of Silicon Valley, that enclave of high-tech development southeast of San Francisco. They even traded their venerable Volkswagen bus, it is told, to raise the cash to create the Apple I computer. They hand-made a bunch, which sold as fast as they could turn them out. A few months later, they were in business. A classy looking computer was designed, aimed at the very high-tech community of which they were an integral part.

The news of the Apple II -- both accurate and fictional -- spread throughout the country. It was a status symbol: a sleek, beige-skinned belle that bespoke a classic power in its very appearance. It has become almost a mythical machine. Yesterday's Doonesbury shows how Duke, on the lam for trying to kill Zeke for burning down his house, has computerized his wholesale drug dealing. His contact gives him a call about some missing marijuana:

"Wait a minute, let me check the Apple. What's your account number with us? Here it is. For future reference, your number is TB765035." On the other end of the phone, Diaz is incredulous. "Are you crazy? You keep records?". Duke is cool; he replies, "It facilitates billing, Diaz. Besides, in the case of a raid, I can destroy everything with the touch of a button. No more fooling with fires or toilets." The proverbial computer bug is the butt of Trudeau's joke; Duke says, "Bad news man. We've lost ten bales in the computer." Diaz: "No problem. I'll just send a couple guys over to help you take it apart."



The Diminution of Big Brother

We are careering toward a year that is infamous -- even though it has not even arrived ... 1984. That ominous annum was described by George Orwell in the year I was born. The greatest threat, it seems, was the certain, horrendous, inevitable omniscience of Big Brother. The electronic eyes that watched, and the ears that heard. Nothing was safe, not even history.

A fear of computers has justifiably grown up as a result of both the Orwell fiction and the reality of computer abuse by both governmental and business agencies.

But ironically, the appearance of personal computers may well weaken and diminish, or at least delay, the appearance of the menacing Big Brother. The presence of disconnectable personal computers in millions of homes encourages not only computer literacy and understanding of hardware -- plus a little bit of caginess -- but also a familiarity that destroys the psychological threat that the Big Computer / Big Brother era tended to encourage.

The growth of a computer-literate public is a threat to secrecy and domination. Although personal computers have not made headway in the communist-dominated countries of Eastern Europe, the copying machine has, and I would like to offer that as an example of how repressive governments can be threatened by a technologically literate populace.

During the recent period of martial law in Poland, the highlights of government restrictions were reported. However, there were some secondary, very interesting reactions. Among the new limitations on press activity were the banning of photocopying machines and their dismantling by the government. This new technology posed a serious threat because of the simplicity and availability inherent in the process.

By comparison, large and vibrant microcomputer user groups, phone networks, and even ham broadcasting have begun in the United States. A computer can be hidden more easily than a transmitter; data tapes or discs can be circulated and copied with great speed. The universal communications standards hand an amazing new power to the public.


Who Am I?

Before I continue, you might be curious to know who I am and why I, who have no formal credentials in any field resembling computers, electronics, or engineering, would be speaking here. On the announcement of this meeting, I was identified as the President of Green Mountain Micro, and that's true. Professor Sundaramurthy gave you a thumbnail biography at the beginning of this talk. And what is crucial from that description is that I am a composer and co-director of the Dashuki Music Theatre. That is the key to both why I am involved with microcomputers and why the points of view I express tonight might strike you as unusual.

I do not believe in the mystique of machines. Radios, automobiles, telephones, typewriters, even toasters -- their use should come with understanding. Computers of any kind, being mostly a collection of simple switches, should be nothing more than a study in very detailed yet very simplistic tedium. I began my first electronic composition in 1969 very warily; I knew that there would be no way of performing the work with the technology available then. As a matter of fact, it is still unperformed. But after purchasing an analog music synthesizer ten years ago, I undertook a process of learning the electronics -- and eventually the software -- that would allow me to perform not only electronic music per se, but would also assist me during the composition or auditioning of works in progress.

This course has led me to find the simplest and, of course, the cheapest ways to accomplish this task. I have also faced four remarkable biases: the musician who believes that no electronic machine can assist in the production of art; the computer career person who believes that the computer is a professional's tool; the software designer who believes it is incumbent upon every computerist to find a software solution to a problem; and the hardware designer who believes that any software solution can be converted to an equivalent circuit, no matter how complex.

Because of these viewpoints, and the relatively irreverent viewpoints that artists normally hold, you will be hearing me generalize and simplify somewhat tonight. But if I do so, please realize that it is because such simplification and generalization contributes to reducing any problem to its essentials, to a level at which the solution -- it is hoped -- will become more evident.



Defining a Microprocessor

I began speaking of microprocessors in terms of their purpose, but without defining them or discussing digital electronics. From the perspective of being outside the "black boxes", that is, as a user and not a designer of digital logic devices, it is possible to explain them very simply. A digital logic circuit, packaged in a small plastic or ceramic case, acts like a block of switches and relays. On a signal from the outside world, different internal switches are activated, producing some sort of known result at the output of the "black box".

The simplest analogy is a pair of switches and a light bulb. If I hook the switches in series, both switches must be flipped on to activate the bulb; if I hook the switches in parallel, either switch will light the bulb. The first is an AND condition, the second is an OR condition. If I add a normally-closed relay between switches and light bulb, another pair of unique situations is created. With the first hookup, both switches must be on to turn the relay on, extinguishing the light; with the second hookup, either switch will snap on the relay, killing the light. These are NAND and NOR gates. Latching relays are akin to digital latches, and mechanical step-counter relays produce a result like digital counter circuits.
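
For readers who prefer code to relay diagrams, here is a minimal sketch in C of the same four conditions. Each gate is nothing more than a rule mapping two on/off inputs to one on/off output; the function names are my own shorthand, not any standard library.

    #include <stdio.h>

    /* Truth-table view of the switch-and-relay analogy above. */
    int and_gate(int a, int b)  { return a && b; }      /* two switches in series        */
    int or_gate(int a, int b)   { return a || b; }      /* two switches in parallel      */
    int nand_gate(int a, int b) { return !(a && b); }   /* series plus inverting relay   */
    int nor_gate(int a, int b)  { return !(a || b); }   /* parallel plus inverting relay */

    int main(void)
    {
        printf(" a b | AND OR NAND NOR\n");
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf(" %d %d |  %d   %d   %d    %d\n",
                       a, b, and_gate(a, b), or_gate(a, b),
                       nand_gate(a, b), nor_gate(a, b));
        return 0;
    }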

What, then, is a microprocessor? It is a combination of thousands of these digital logic circuits combined in such a way that, when the input is presented with a pattern of on and off conditions, a known, useful result will occur. The pattern of on and off conditions is the processor's instruction set, and the output is data.

The remarkable thing is that, depending on the original microprocessor designer's choices, each half-inch by two-inch circuit can respond differently and predictably to hundreds of different input situations. The instructions and data come from memory, and may return to other portions of memory, which is an electronic storage area used by the processor to execute a program ... a complete set of instructions that define a task.
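
As an illustration only -- the opcodes and memory layout below are invented, not those of any real processor -- a toy fetch-and-execute loop in C shows the idea: instructions and data share one memory, and the processor repeatedly fetches a pattern, decodes it, and produces a known result.

    #include <stdio.h>
    #include <stdint.h>

    /* Invented opcodes for a three-instruction toy machine. */
    enum { HALT = 0, LOAD = 1, ADD = 2, STORE = 3 };

    int main(void)
    {
        uint8_t memory[16] = {
            LOAD,  10,       /* 0-1: copy memory[10] into the accumulator     */
            ADD,   11,       /* 2-3: add memory[11] to the accumulator        */
            STORE, 12,       /* 4-5: write the accumulator back to memory[12] */
            HALT,            /* 6:   stop                                     */
            0, 0, 0,         /* 7-9: unused                                   */
            7, 5, 0          /* 10-12: the data itself                        */
        };

        uint8_t acc = 0, pc = 0;              /* accumulator and program counter */
        for (;;) {
            uint8_t op = memory[pc++];        /* fetch the next instruction      */
            if (op == HALT) break;
            uint8_t addr = memory[pc++];      /* fetch its operand address       */
            if (op == LOAD)  acc = memory[addr];          /* decode and execute  */
            if (op == ADD)   acc = acc + memory[addr];
            if (op == STORE) memory[addr] = acc;
        }
        printf("memory[12] now holds %d\n", memory[12]);  /* prints 12 */
        return 0;
    }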

Most remarkable of all is the cost. Circuitry has been miniaturized to the point where all of what I have described -- memory and processor -- can fit on a single integrated circuit chip. And the cost of a microprocessor like this chip has dropped from over $300 in 1973 to $2.50 in quantity.

Someone said recently that if Detroit could build cars like Silicon Valley could make digital circuits, the cars would cost $9.95 and get 200 miles to the gallon. I understand that one of Detroit's engineers responded by saying that if Detroit followed Silicon Valley's lead, the car would also fit in the palm of your hand.



A Question of Smallness

But the question of smallness is an astounding one. I have some things in these overall pockets. I have some processor and memory chips which I've split open, together with a stylus microscope so you can look at them closely. I have a John Bell Engineering control computer, a single board device which I plan to use in a solar-powered garage installation. And finally, I have a Radio Shack pocket computer, a full BASIC-programmable computer that sells for under $150.



Defining the World by Steps

I would like to reflect on how the computer view of the world -- if it could have one -- differs from ours ... and how it is surprisingly the same. We live in a world of seemingly infinite gradations, gradations that blend one into the other so that the moment of change seems imperceptible. But this is a deception borne out of human physical weaknesses. Consider...

What these images have in common is that when we attempt to examine them in minute detail, our senses fail ... our perceptions weaken. There is a step so small that we cannot discern it.

This is nothing new, and belongs more properly in the realm of philosophy. But in the past, our tools have attempted to imitate the way we work and think. As Seneca said, "Omnis ars naturae imitatio est" -- all artifice is in imitation of life. But we ourselves work and think so well that the marvelous machines cannot live up to our expectation of them -- except by brute force, they cannot compete with us.

It is from this perspective that the microprocessor represents a dramatic and irrevocable departure from our human past.

First of all, use of a processor to perform human tasks acknowledges that these tasks can be broken down into discrete, quantifiable steps. Granted, these steps may at first seem large in human perception, but that too is deceptive. The microprocessor, as I said, just celebrated its tenth anniversary, and cannot be expected to undertake human tasks flawlessly.



Some Failures and Some Successes

The General Motors "Computer Command Control" engines are a perfect example. These engines, installed for the first time on 1982 Cadillacs, were to be the pride of modern American technology applied to the automobile, itself the pride of past American technology. The first reports are in, and those reports are a 22-state class-action suit against General Motors.

Although GM has been sullen about the failure, the symptoms seem to be: engine shutdown in heavy traffic, extremely poor gas mileage, and unexpected malfunctions. It's hard to analyze what could have gone wrong with the mating of these two technologies, but my suspicion is that both systems are working properly. Unfortunately, the engine's electromechanical expectations are not met by the computer's decisions.

In other words, any computer operating in the real world uses two criteria: the programmer's requirements and the user's input of data. If the programmer expects something different from what the user provides, anomalous results will follow. Now you can call the system engineer when your computer hoists the white flag, but driving a country road on a dark night is another situation.

The GM computer might be evaluating and acting too quickly. If I tap the gas pedal of an ordinary car very hard and release it very quickly, I might get a slight extra buck for a second. With the computer engines, the car will surge ahead and then stall. Stop-and-go city traffic seems to confuse it. The software isn't up to it.

Here's another example, but one that does undertake human tasks remarkably well. I received a rather frantic letter from Brad Naples of New England Digital. He was very upset with me. He assumed that since I had mentioned New England Digital as a Vermont example of the use of microprocessors, that I intended to talk about the Synclavier, a second-generation digital musical instrument. He sent me a reprint of one of my very own articles, circling the paragraph which stated that the heart of the Synclavier was a custom minicomputer, not a microprocessor.

Brad's assumption was wrong, at least partly. The point I was going to make in passing was that New England Digital had good reasons for choosing to use minicomputer design for their digital synthesizer. I was going to talk mainly about their choice of a Z80 microprocessor to assist in the control of input and storage for their real-time laboratory sampling system.

But Bradley's letter got me to thinking. There were lots of reasons that NED might have chosen mini technology. Let me offer something more dramatic. There is no quintessential difference between a minicomputer and a microprocessor. They are both digital devices acting upon some sort of binary data according to a fixed set of instructions.

The differences, examining the state of the art at the beginning of the 1980's, are merely practical ones. The minicomputer, because it uses separate, fast integrated circuits, can perform faster than a microprocessor, which is limited by often ponderously slow speeds -- if you consider up to 10 million clock cycles per second slow. Internal heat buildup, reliability, and other problems continue to limit the speed of integrated processors.

Flexibility is also a key reason for creating a custom minicomputer. I'm guessing here, but it seems that if an extremely complex task must be performed very quickly, the best way to go about it is to create a device specifically oriented to the task. And a computer built of separate circuits (rather than out of a single, general-purpose device) is likely to be more suited to the designer's task.

But the essence is there. My guess is that any engineer will choose the tool which is both most efficient and most flexible, given the present task and the potential for future changes in that task. Were a high-speed, microprogrammable microprocessor available, I'm betting that it would be in use in lots of installations -- including ones such as New England Digital's.

The differences may seem a small point, but the use of computer technology in music is quite an advanced one.



Artistic Applications

Sound is a complicated, real-time event. As I speak to you, the air molecules are transmitting ever-changing levels of pressure. The frequency of those variations, their intensity, their reflections off the walls around us, and even the difference in time it takes to reach each of your two ears -- all contribute to the overall sound that you hear. But as I talk, you hear -- except for an occasional extraneous noise -- just one pattern of sound, one line: monophonic sound.

To reproduce even that single line of sound requires enormous precision, since incredibly small changes of level or frequency can be detected by our ears. Music is for the most part polyphonic ... of many voices or parts. These many parts combine into a single sound wave, a wave representative of all the original sounds, and one that can be perceived -- that is, understood by our ears and our intellect -- to be all of those original sounds.
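
A minimal sketch in C, with arbitrarily chosen frequencies, loudness levels, and sample rate, shows how several simultaneous voices collapse into the single pressure curve the ear receives: each sample of the composite wave is simply the sum of the individual waves at that instant.

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        const double PI = 3.14159265358979323846;
        const double sample_rate = 8000.0;                 /* samples per second (arbitrary)   */
        const double freq[3]  = { 220.0, 277.2, 329.6 };   /* three voices: roughly A, C#, E   */
        const double level[3] = { 0.5, 0.3, 0.2 };         /* relative loudness of each voice  */

        for (int n = 0; n < 10; n++) {                     /* print the first few samples      */
            double t = n / sample_rate;
            double mix = 0.0;
            for (int v = 0; v < 3; v++)                    /* the polyphony reduces to a sum   */
                mix += level[v] * sin(2.0 * PI * freq[v] * t);
            printf("sample %2d: %+.5f\n", n, mix);
        }
        return 0;
    }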

The job of composers is not merely to reproduce sound, but to create new music, and often new sounds with it. Instrument invention, development and perfection was at a high level in the Nineteenth Century. But from the mists of prehistory through the greater past, composers never felt obligated to understand what sound was, but only how to use it. They might have been curious, certainly, though they left the heavy work to musical theorists, philosophers, physicists and mystics. But creating -- there's the key word -- electronic music (at the software and hardware design level) involves describing it successfully. To say sound is vibration is simplistic, although Beethoven might have been more than satisfied with the wealth of science in that idea.

Contemporary programmers, hardware designers, and composers need something with more specific boundaries, say: Sound is large-scale molecular disturbance whose average rate of disturbance falls within the region that human ears can detect and decode. The object causing the disturbance, the medium for transmitting the sound to the human eardrum, and the human eardrum itself all provide similar versions of the disturbance, which, when viewed over time and distance, consist of quantizable variations in molecular pressure.

A time-and-distance description might help answer the question "if a tree falls in a forest, and no one is there to hear it, is a sound made?" If the event occurred, all the necessary processes were put in motion to create a sound. Thus, if some long-term transmission medium -- a tape recorder, for example -- preserved the event for later playback, who would conclude that a sound was not made? Only someone who defines sound in terms of the moment it falls upon the ear, or perhaps when it is transmitted to the brain and cognitively perceived. From that view, it takes time for sound to travel from the disturbance to the human detecting it; so when the tree falls, no sound is made or heard at that instant anyway. The tree might not even be making the sound, but merely instituting a sound-creation event sequence. In fact, it truly is the overall time as we conceive it that allows sound to be detected and understood. We'll come back to this question in the future; it's more important than this amateur philosophical exploration would indicate.

Since Edison first cut grooves in a cylinder of hot wax, sound recording and reproduction has been based on keeping an instantaneous, proportional image of the sound's energy. A century of audio development has been spent perfecting this system, each component being tailored to reproduce the variations in sound pressure precisely, or to alter them in a known way with amplifiers or filters. The goal was at one time the delayed simulation, through recording and playback, of the original sonic event.

But concepts and methods have been changing. Much of the popular music recorded today cannot be performed live. It is the result of overlaying sound in separate recording sessions -- ten trees falling, all at different times, with the sound preserved on distinct yet parallel tracks of a recorder. The "reproduced" result sounds like the whole that never was. Background music squeezes performers into tiny sonic boxes. Even the enhancements added to supposedly purist classical recordings place the listener cheek-to-cheek with the soloist, who seems to stand angelically above the orchestra.

This potpourri of descriptions and metaphysical sidetracking has a point. Einstein once claimed that were it not for the philosophical attitudes introduced during his youth, he could never have conceived of the theory of relativity. No fact was evident, he suggested, until humanity was ready to perceive it. Likewise, the ability to create electronic sound arises out of the historical manipulation of real sound by recording engineers and their devices. A public acquiescence to the super-reality (surreality?) of music assembled for such recordings has opened the public mind and ear. Not only does it become possible to create or synthesize familiar sounds, but the ability (and desire) to invent sounds previously unknown comes into play.

Let me unhitch the cart and lead the horse behind it for a moment. The appearance of video arcade games with their plethora of bloops, bleeps and squawks has begun to educate the public ear to sound with purely electronic genesis. Although the introduction of electronic sound has been subtle -- beginning with the famous Maxwell House coffee commercial of the Sixties -- it has been mostly for effect or in partial simulation of "real" instruments. The video games have legitimized a blatant electronic sound. Miniature electronic keyboards have replaced plastic reed organs as kiddie toys, and ubiquitous electronic themes on tiny games have become a recognizable and unremarkable part of holiday mornings in America.

The problems of designing a legitimate system for real musicians are enormous. New England Digital has succeeded in doing that by using, at present, a custom designed minicomputer. In the future, though, I expect that microprocessor technology will change that viewpoint.



The Uniqueness of a Local Application

Now I would like to turn from the artistic subtleties of music to the mundane subject of wastewater ... something I used to call sewage before I was otherwise educated by wastewater engineers. DuBois & King, Inc., of Randolph, Vermont, undertook the design of a unique wastewater treatment plant for the Village of Alburg, Vermont. The company chose to use a method of waste treatment known as spray irrigation. This is essentially the detoxification of waste materials using traditional processes, and the dilution of the sludge into a sprayable, organic fertilizer. This would then be used to fertilize a hay crop that could provide income for the operation of the plant with three yearly cuttings.

Successful spraying requires proper temperature, wind, and rain conditions. The system must not be allowed to spray sludge during high wind (for obvious reasons), during freezing weather, or when the ground is already very wet. Although the Alburg location was somewhat experimental, experiences in other areas of the country suggested that it could be a viable option in Alburg.

However, conditions were nevertheless experimental, and refinement and continuous adjustment of the system could be expected for a few years. And it is this continuous adjustment that led project managers William H. Baumann, Jr., and Steven E. Mackenzie to select a microprocessor control system in place of a traditional cam-timer arrangement.

Since timing the opening and closing of spray valves was to be done by microprocessor control, the major weather conditions were also fed to the system. Alarm conditions, including high and low pressure extremes, water on pump room floor, high and low well levels, and power loss, were sent to the processor instead of to individual alarms. This, combined with battery system backup, allowed not only alarm conditions to be signalled, but a record of the alarm to be kept by the processing system.

The software control was designed by Bristol Systems in Waterbury, Connecticut, and tailored to the DuBois & King specifications. However, this resulted in an interesting anomaly. At the time, it was the engineering company's first microprocessor installation. Spray valves would be opened and shut only when all conditions reported to the central processor allowed them to be, but one of those conditions was rainfall. The original software would shut the system down whenever rainfall was detected, and restore the system to operation a specified time period after the rain stopped.

Working with human operators is very flexible because judgment is involved. A human will not shut down the system for half a day after a passing shower; the computer would. So a simple control system had to be changed to accommodate different lengths and intensities of rainfall.

What is important in this installation is that very flexibility. The rainfall parameters -- as well as temperature, seasonal, and wind speed parameters -- could be changed from the microprocessor control panel by any operator with the correct password. Instead of endlessly readjusting mechanical timing systems, the microprocessor software could make changes to the operation of the entire wastewater treatment plant within minutes.
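
Here is a hedged sketch, in C, of the kind of decision logic involved. The parameter names and threshold values below are invented for illustration and are not taken from the Bristol Systems software; the point is simply that the limits live in data an operator can retune from the panel, rather than in mechanical cams.

    #include <stdio.h>

    /* Adjustable operating limits; an operator with the password could
       change these from the control panel rather than rewiring timers.
       The names and numbers are illustrative assumptions only. */
    struct limits {
        double max_wind_mph;        /* no spraying in high wind           */
        double min_temp_f;          /* no spraying in freezing weather    */
        double max_recent_rain_in;  /* no spraying onto saturated ground  */
        int    rain_holdoff_min;    /* wait this long after rain stops    */
    };

    int ok_to_spray(struct limits lim, double wind_mph, double temp_f,
                    double recent_rain_in, int minutes_since_rain)
    {
        if (wind_mph > lim.max_wind_mph)               return 0;
        if (temp_f < lim.min_temp_f)                   return 0;
        if (recent_rain_in > lim.max_recent_rain_in)   return 0;
        if (minutes_since_rain < lim.rain_holdoff_min) return 0;
        return 1;   /* all conditions satisfied: the spray valves may open */
    }

    int main(void)
    {
        struct limits lim = { 15.0, 33.0, 0.25, 90 };

        printf("calm, mild, dry day:  %s\n",
               ok_to_spray(lim, 5.0, 60.0, 0.00, 600) ? "spray" : "hold");
        printf("just after a shower:  %s\n",
               ok_to_spray(lim, 5.0, 60.0, 0.10, 20) ? "spray" : "hold");
        return 0;
    }

Tuning the length and intensity thresholds for rainfall then becomes a matter of changing a few numbers, which is exactly the flexibility the cam-timer arrangement could not offer.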



The Integrity of an Industry

The microprocessor, as with any commercial venture (and it is a commercial venture), has also created problems. Among them is a decrease in sensitivity to human foibles, but this is inherent in the development of any technology which by its nature stresses uniformity of action and consistency of results. It is not an art form.

But my point here is more mundane. It is becoming increasingly clear that extensive self-interest -- to the injury of others -- is being practiced by many American firms. The first practice is what is rapidly becoming a health scandal for the "clean" semiconductor industry. Workers in underdeveloped and developing nations are being put to work in factories for what are considered good wages by local standards, and the jobs are thought to be prestigious by friends and family. The problem is that many of the factories would fail government health and safety inspection in the United States.

Young women are having their eyesight ruined by hours staring into binocular microscopes with neither sufficient breaks nor regular vision tests. They are, in effect, being used up and thrown away in less than two years. Hazardous chemicals and gene-damaging metals are not being handled or overseen with the care required of on-shore operations. Poisons are part of the job description.

A second area of concern is more political, and although I understand such items are usually not included in the agendas of professional organizations, I will attempt to spark your concern. It has been said that the United States, and particularly U.S. business, still acts as a colonial master toward Central and South American nations. I believe we all should be concerned about corporations whose economic interests are so tied in with the survival of a nation's government that that government's repression and brutality are overlooked.

I am speaking specifically of Central America, and more specifically of El Salvador. The U.S. electronic industry maintains a great number of factories in that country -- a nation physically smaller than Vermont. But the economic stakes are high enough that the present regime is encouraged, the rebels are provoked, and innocents die. In that matter I ask your concern.
