Before Television – BTV

Green Libertarianism: A Young Person's Guide – chapters

What Have Computers Done to Our Minds?

A brief discourse on technological progress

BTV = Before Television

I was part of the last generation to develop its consciousness before the advent of television. I did not experience television until I was six years old, and we didn’t own one in our home until I was nine. More importantly, all of the adults in my life, including parents and teachers, were raised and educated BTV – before television. Television, and then computers, the internet, VCRs, DVDs, and the national corporate media have completely changed our consciousness during the past 50 years, and this New Age of electronic media closely parallels the rise of science fiction as literature and the obsession with an alien presence here. The effects have been highly political, drastically changing the economic, social, and cultural life of nearly everyone in this country who is in any way “plugged in” to them.
Few Americans now in their teens or 20’s, unless they have lived in a remote place, have experienced anything like the personal freedom, connection with nature, and sense of local community which I experienced as a child. Is this really a problem? No, because it has no solution. It is a change which we can make ourselves aware of, and in certain respects compensate for with our personal lifestyle choices. What we need to do is understand both the positive and negative implications of these changes, and attempt to direct our individual and community lifestyle choices in a healthier, freer, more natural and humanistic direction.
Some parents have actually made the choice of having no television in their homes, encouraging reading, crafts, and hobbies of the same kinds which nearly everyone practiced when I was a child. Others have opted for high-tech, internet-based home schooling, intensive sports programs, music, dance, skating, art, and other kinds of private or group instruction, and so forth. Forming an intentional community of some sort (most are religiously based, but that needn’t be the case) is highly desirable, both for the nurturing, health, and sanity of children, and to maintain the sort of lifestyles which are good for people of all ages. But the vast majority of Americans simply haven’t done it, and aren’t about to do it.
If you were raised in the average environment of public schools, lots of TV and video games, computers, Top 40 radio, fads in clothes, cars, hair, and gadgets, you will probably see no reason to make any radical changes in your lifestyle. You are probably interested in having a good job, owning your own home, marrying someone you love and with whom you share many or most interests and aspirations, and raising children to be pretty much like you are. This book might lead you to question some or all of these goals, and to re-think them or make some better choices. Whatever happens, I have tried to provide some alternatives to the present assumptions which underlie American society at the beginning of the 21st century. This book is more for those who are unhappy with the current state of affairs, and wish to head out in some different direction. It is only by defining and understanding where we are today that we can even think about being somewhere else. These are personal choices which all of us must make for ourselves.

=======================

A Brief Discourse on Technology

We live in a highly technological age, and virtually no part of the world is free of its attractions and liabilities. Even the most isolated and “pre-industrial” civilizations now rely on automobiles, power boats, farm machinery, and now, of course, computers, cell-phones, and every other kind of modern technology. And every nation, no matter how poor or disadvantaged, wants to spend an inordinate part of its national income on military, police, and other repressive and destructive institutions. More than 70% of our “foreign aid” over the past century has been devoted to military and “internal security” purposes.
The U.S. government spends 20% of its budget on direct military spending; another 20% or so on interest on the national debt which is almost entirely attributable to past military spending; and another 10-15% on health care, pensions, and other services for veterans of past wars. At the same time, spending 1% of the budget on aid to families with dependent children was considered to be an unconscionable waste of the taxpayer’s money, and 50 cents per taxpayer spent on support for the arts, and another 50 cents for public broadcasting are under continual attack by “conservative” senators and congressmen.

I. The Abuse of Science, Sociology, and Mathematics

In defending a radical, logical opposition to today’s technocracy, it is important to distinguish the human uses of new science and technology from its abuses. Most of the criticism of technology and techno-think is directed towards its rampant abuses, not its utilitarian values.

The primary abuse of physics is the nuclear arms race. The primary abuse of rocket science is an ICBM nuclear arsenal (which I live next to here in central Montana). The primary abuse of economics is its role as apologetics for the multi-national corporate aristocracy. The clearest abuse of mathematics may be seen in the actions of another Montanan-by-choice, Theodore Kaczynski.

If we include theology, then its abuse may be seen in monotheistic fundamentalism, whether Christian, Islamic, or Judaic; and its resulting conflicts in the Middle East and elsewhere based on putting one faith above nature, and creation as a whole, in being “the one true faith” and the only accurate representation of God’s scheme of things.
Is it wrong (“Satanic”) to teach children calculus and quantum theory? Evolution? The Marxist theory of historical development? Of course not! Should we attempt to indoctrinate our teachers and schoolmasters in some particular faith or ideology? Or should we encourage diversity and choice? These are the vital issues surrounding another great abuse: the abuse of education by brainwashing, “training for capitalism,” racism, imperialism, genocide, or whatever. Liberal education, it would seem, has suffered even greater setbacks than liberal politics or religion.

What about high-tech terrorism based on our utter dependence on massive, energy intensive machines, buildings, and other accessories of civilization? As this is being written, we have just witnessed the first major, successful attack on the 48 states since the War of 1812, accomplished by 18 men armed with pocket knives, but in control of 4 airliners which were used as guided missiles against some of our tallest and most important government and financial buildings. The death toll is now estimated to be approaching 7000 — more than Pearl Harbor and the Titanic, combined. (A few years later, we know the 9-11 death toll was some 3500, and at least 2 of the four airliners are believed to have been shot down or otherwise disposed of by our own military forces. The strike on the Pentagon is now believed to have been a military aircraft or missile, not an airliner. And instead of 3500 deaths, the number has now multiplied to millions of casualties in Iraq, Afghanistan, and other parts of the world as a consequence of the U.S. “response.”)

In the 1970’s, a movement known as “appropriate technology” emerged, led by counter-culture leaders such as Stewart Brand, who founded the Whole Earth Catalog and associated enterprises and publications, and E. F. Schumacher, a British economist and former bureaucrat in Britain’s state-owned coal industry who wrote a charming little book called Small is Beautiful which became an international best seller. The gist of this movement was that we need to free ourselves of technological domination by governments and large corporations by regaining control of our economy, our tools, and “the means of production.” Children of the upper middle class “dropped out” to form rural communes, urban collective businesses, schools, community centers, and all sorts of other humanistic, more or less anti-corporate and anti-technological endeavors.

Much of the recent policy debate between advocates of “appropriate technology” and those who believe that no one should attempt to control its development and evolution centers around this question. In the 1930’s, there was an explicit political movement – the Technocrats – who believed that all social problems could be “re-engineered” by science and technology to correct or eliminate them. Much of Nazi ideology had a similar “scientific” aura and rationale. Marxists called their system “scientific socialism” to distinguish it from the softer “social democrats” or “Utopian Socialists” – a term which Marx originated. Yet, his followers would also rely heavily on the French tradition, with its phalanxes and rule by scientists and engineers.

The classical Liberals – the laissez-faire, free trade, rule of law, parliamentary democracy advocates – often had a different view, based closely on emerging evolutionary biology. A healthy society must not overprotect its weakest members; captains of industry are like ecological dominants, evidence of the perfect working of the principle of the “survival of the fittest.” Technology, for them, was just one part of this process. Clearly, we must leave entrepreneurs free to develop and experiment with any and all technology. We must protect their right to exploit their discoveries and inventions, a principle which was later severely questioned by libertarian purists. Since studying their arguments, I have been able to find little social value in granting monopoly protection to most scientific patents and discoveries. Although “trade secrets” and the fruits of well-financed research and development programs are known to be keys to success in the marketplace for new technologies and products, the fact that pharmaceutical companies spend far more on advertising and promotion than they do on research suggests that their arguments for monopoly profits from patent “protection” are bogus.

In the United States, the latter position has obviously had the upper hand for the past couple of decades, if not before. Although the American natural environment is less ravaged than Europe’s or that of the heavily populated parts of Asia, we now lag behind the rest of the world in international initiatives to address global ecological problems. Indeed, we are as reactionary as the Vatican or the Islamic Republics on many of these issues – especially those pertaining to population control.

Clearly, we must begin to address such issues as overpopulation, land use, and non-renewable resource exploitation sooner rather than later. The human species is rapidly approaching some form of negative utopia in which life has lost all meaning, and in which the physical conditions of most people’s existence have again fallen to a level of barest subsistence on a day-by-day basis. All the advances of science and technology, the arts, culture, and human understanding will be swept away in a radioactive holocaust, genocide by genetic engineering, and total management of all information sources and political responses. This happened under Nazism and Bolshevism, and can just as well happen under a theocracy or rule by some other rigid, rigorously-enforced ideology. We may also be reaching the point where, for a great many people, violent revolution carried out by small, autonomous groups (AKA “terrorism”) is seen to be the only practical recourse.

Americans have traditionally preferred fighting to switching. We remain an essentially warrior culture – something which all the liberal panaceas in the world will not change. They can weaken us, or deceive us temporarily, but eventually we will rally to face any threat – foreign or domestic. We have finally “met the enemy, and they are us,” as Pogo so cogently put it.

We’ve got a tiger by the tail, to use another metaphor. If we let go, it will turn and rend us. If we hold on, we will be dragged to our death. We can continue the American vision of post-World-War II global supremacy – a program no one understands or wishes to pursue any further – or we can let go, and find ourselves immersed in a seething caldron of nuclear terrorism (which we invented) or the Old World imperialistic struggles for resources and territory, as well as religious wars (which we tried to avoid, but have now finally caught up with us).

We Americans have the distinction of having supplied the nuclear technology to make what has always been a hopeless struggle into one which threatens human civilization, itself. We may, indeed, be recognized by future explorers and archaeologists as the civilization which destroyed itself – Western, Christian, Scientific, Humanistic civilization. If any people survive, they are likely to be at the very margins of today’s scientific, technological civilization, uncontaminated by its technology and values.

II. What are Computers Doing to Our Minds?

When I first encountered computers on a direct, personal level, I was a graduate student in philosophy, with special interests in what was then called “cybernetics”, philosophy of mind, and the relationship between human and machine thinking – the field now known as “Artificial Intelligence.” Although I soon found myself both out of school and unemployed, the subject continued to interest me. I had been working as a computer operator in a large computing center of a prestigious, research-oriented university – UCLA – where I had recently graduated and was facing the choice of continuing my education there, or moving elsewhere in pursuit of an academic career. The choice I finally made was neither; instead, I quit school entirely and returned to Montana, vowing that I would never again take a course for academic credit.

Part of my revulsion was based on my research in the economics of education, and the seeming counter-productivity of most formal schooling. The rest was based on a then common fear or suspicion of technology, although I was more scientific and better-trained academically than most of the so-called “counter culture.” My newly-acquired knowledge of computers and how they were becoming ubiquitous and essential to the American way of life made me wonder just where we were headed, and as an avid reader of science fiction, I was very future-conscious and future-oriented.
I was especially concerned about the computer’s role in government, for I was also a political libertarian. The libertarian left (much of which is also called “anarchism”) was beginning to shape my thinking about social philosophy. It was just at that time in my life that I was familiarizing myself with that rarified part of the political spectrum where left and right overlap, and for those who find themselves in this territory, the history of our political life can easily seem to have been one long, unmitigated disaster – the gradual erosion of a free society into an empire or other elitist, totalitarian state.
The idea of governments armed with massive, powerful computers regulating, structuring, and evaluating the most minute and private aspects of our lives filled me with horror. I knew that we were living in an age of declining freedom and social morality, overpopulation, environmental degradation, and the imminent danger of complete annihilation from nuclear war. We were losing, or had already lost, the ability to plan and determine the course of our individual lives. It appeared that governments everywhere were becoming totalitarian.

Computers were an obvious tool for oppressive governments, and at this time, governments were the main impetus to computer development. The very first computers in the United States were actually built to do the numerous and complex integrations required for artillery trajectories. Later, they were to be employed in the Manhattan Project in the development of the atomic bomb. The first commercial builders of computers could hardly imagine any business applications, and estimated that only a dozen or so computers could ever be sold — mainly for record-keeping functions, accounting, and the like. Thus, digital technology remained primarily a government domain for a decade or two, where tax collection, the census, and similar functions might provide a likely application for magnetic or other coded information storage, and electronic data processing.

Military applications proceeded apace. It was widely believed in the late 1960’s that the main benefit of the Apollo Program (to land a man on the moon by the end of the decade) was the impetus it gave to computer development and design. Because governments had ordered and paid for their development, generally for purposes of defense and scientific research, engineering of weapons systems and other military and aerospace applications, computers were not yet recognized as a means of personal empowerment.

Those of us who worked in computing centers soon found that we were empowered simply by our access to computers. This was where the really smart people worked, and in order to continue our work, we had to adjust to an authoritarian setting, although universities were the least oppressive and most amenable to creative, divergent thinking. We became something like a new priesthood, serving the machine-gods who had become a sort of oracle or Divine Presence. If you’ve seen the original film of “2001: A Space Odyssey,” you’ll know what I’m talking about. Although no such computers existed in the late 1960’s, the fear was already there, and by 2001, “virtual reality” and the internet far exceeded those earlier predictions.

If you were an engineering or science type, you designed, built, and thoroughly understood digital technology. If you were a business type, you may have sold computers, programmed them, or otherwise employed them in your business planning and administration. If you were an artist or an academic, you could begin to use computers to create new patterns of light, line, or sound, or use them in research – perhaps textual analysis of a great novel, or the deciphering of ancient, previously untranslated texts.

If you were an urban planner, computers would prove very useful, and economic planning was supposed to have been revolutionized by the development of computers. In short, almost any field was open to the development of computer programs which would inevitably change the ways we worked, solved problems, and carried out the everyday tasks of production, distribution, and the applications of theoretical knowledge and information to our everyday lives.

In the late 1960’s, the personal computer, so far as we knew, did not yet exist — even in the science fiction where most futuristic technology first appeared. Shows like “Star Trek” had a ship’s computer which could answer questions (this was even anticipated in a charming 1957 film called “Desk Set” with Spencer Tracy and Katharine Hepburn). But the impending horror of a totally centralized, computerized and thus “regulated” society seemed to be the real prospect we were facing. In the epic science fiction novel Dune, by Frank Herbert, computers have been outlawed in that distant future, feudalistic civilization, to be replaced by human “mentats” – carefully trained logical thinkers who could evaluate complex data and make probabilistic predictions from it.

Thus, the decade or so before personal computers became widely available was the last time that a principled – if hysterical – opposition to the further development of computers and their intrusion into our everyday lives was expressed. Student radicals had actually taken over university computing centers (including the University of California, Santa Barbara, the year before I began working there) and in one case totally destroyed a large, multi-million dollar system at a Canadian university. For a while, working in a computing center could be seen as “hazardous duty” – even on a university campus! Ted Kaczynski’s so-called “Unabomber Manifesto” expresses this period and thinking very well, although in a rather convoluted and distorted fashion, reflecting the mental state of the author.

The computers we used in those days should be described for the benefit of younger readers who’ve never seen a computer which wouldn’t fit on a small desk or in a briefcase. The IBM 360/91 I operated cost more than $5 million ($35-40 million in today’s dollars), and filled a large room – perhaps 1500 square feet, carefully air-conditioned and kept immaculately clean. The computer itself (CPU – central processing unit – and RAM, or Random Access Memory) was water-cooled, with a radiator system holding more than a hundred gallons of distilled water. RAM then cost about $1.00 per byte, so a 4 MB (4 million bytes) memory like ours – one of the very largest in use at that time – cost $4 million in 1969 dollars and accounted for most of the cost of the entire computer system. Now, it would cost about 10 cents per MB, and a gigabyte or more is commonly found on a chip about the size of a postage stamp inside a flash drive or on a memory card.
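
To make the comparison concrete, here is a rough back-of-the-envelope sketch in Python, using only the figures quoted above (the dollar amounts are approximate recollections, not catalog prices):

    # Back-of-the-envelope memory-cost comparison, using the figures quoted
    # above (1969: roughly $1.00 per byte; later: roughly 10 cents per MB).
    MB = 1_000_000                     # bytes (decimal convention, as in the text)

    cost_1969_per_byte = 1.00          # dollars per byte, circa 1969
    cost_later_per_byte = 0.10 / MB    # about 10 cents per megabyte

    ram_bytes = 4 * MB                 # the 360/91's 4 MB memory
    print(f"4 MB in 1969: ${cost_1969_per_byte * ram_bytes:,.0f}")
    print(f"4 MB later:   ${cost_later_per_byte * ram_bytes:,.2f}")
    print(f"Per-byte cost ratio: {cost_1969_per_byte / cost_later_per_byte:,.0f} to 1")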

In the 360/91, 4 MB filled several large cabinets roughly the size of supermarket coolers – perhaps 4 x 4 x 20 feet. They contained millions of wires and transistors, which had to be hard-wired into place. The original memory location “bit” was a bead-sized doughnut of magnetic ferrite with three wires going through it. A current along one of the wires would magnetically polarize an individual doughnut either positively or negatively. The other wire would reverse the polarization, changing a 1 to a 0, or vice versa. The third wire was a “read” wire, to tell the CPU whether that location was presently a 1 or a 0. All “binary” digital computers work on the same principle, but today’s hardware looks very different, and the equivalent of billions of such “doughnuts” is now microscopically “printed” on a single memory chip.
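
For readers who think in code, here is a toy sketch (in Python) of a single core bit as described above. It is a deliberately simplified model of my description, not of the real hardware, which selected bits with coincident currents on two drive wires and read them out on a separate sense wire:

    # Toy model of one magnetic-core "doughnut": one wire sets the
    # polarization, a second reverses it, and a third reads it out.
    class CoreBit:
        def __init__(self):
            self.state = 0          # polarization: 0 or 1

        def write(self, value):
            """First wire: polarize the core to 0 or 1."""
            self.state = 1 if value else 0

        def flip(self):
            """Second wire: reverse the polarization (0 <-> 1)."""
            self.state ^= 1

        def read(self):
            """Third ("read") wire: report the current polarization."""
            return self.state

    bit = CoreBit()
    bit.write(1)
    bit.flip()
    print(bit.read())               # prints 0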

Similar developments can be seen in graphics, programming, speed, and “user-friendliness.” To use a computer for any obvious task then required hundreds or thousands of hours of “programming,” usually in the form of mathematical symbols or formulas. FORTRAN was the language of choice. Crude word processors, graphics displays, music synthesizers, and remote terminal access were just then being developed. The business “spreadsheet” was practically unheard of, but CAD (Computer-Aided Design) was beginning to be used in engineering to do routine and repetitive calculations, and to graphically display drawings of parts or whole systems.
Computers were also used in accounting (payrolls, billing, inventories, etc.). In fact, this last was by far their largest commercial application, usually in banks, insurance companies, and other large corporate enterprises. Soon, very large offices which had once been filled with rows of bookkeepers with adding machines (like Jack Lemmon in the 50’s film “The Apartment”) were replaced by a single mainframe computer and a cadre of keypunch operators.

What is now called a “data entry clerk” was then a “keypunch operator,” for that is exactly what they did. Instead of just “scanning” in data from barcodes or whatever (they were also just then being developed), the “keypunch operator” typed in data or program codes on punched “IBM cards” – something which today’s computer users may have never seen. I still find old ones placed as bookmarks in some of the books I owned at that time. They were also good for taking notes.

As a single line of characters was typed along the top of the card, a coded sequence of holes beneath it translated the characters into “machine language.” This consisted of a set of electrical impulses corresponding to the codes punched through the cards, generated as the cards were run through a “card reader” and thus transferred into computer memory.

All computer programs were at some point “keypunched” on these unwieldy cards. Each card contained a single line of code or data in a computer program, and a typical program might use boxes of them, at 500 to the box. Keypunch machines and card readers were very expensive, and prone to failure. A typical academic computer user might hire both a programmer and a keypunch operator if there was much programming and data to record.
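
A small sketch may help convey the card-and-deck arrangement: one card per line of code, one deck per program. The “encoding” below is purely illustrative – it simply records which columns are non-blank, and is not the real punch code:

    # Illustrative sketch of the card/deck idea: each 80-column card holds one
    # line of code, and a program is a deck of such cards. The "encoding" here
    # just records which columns are non-blank; it is NOT the real punch code.
    CARD_COLUMNS = 80

    def punch_card(line):
        """Return the column positions that would carry punches (illustrative only)."""
        if len(line) > CARD_COLUMNS:
            raise ValueError("line too long for a single card")
        return [col for col, ch in enumerate(line.ljust(CARD_COLUMNS)) if ch != " "]

    program = [
        "      PROGRAM HELLO",
        "      PRINT *, 'HELLO'",
        "      END",
    ]
    deck = [punch_card(line) for line in program]   # one card per line of code
    print(len(deck), "cards in this deck")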

The impact of the Apollo Program could be seen very clearly in the computer center where I worked at UCLA. IBM made less than 20 360/91’s like ours, and NASA owned most of them, and used them in the Apollo Program. In fact, I had the pleasure of watching the live television coverage of the first landing in the Sea of Tranquility from the machine room of our own 360/91. All the rest were bought by large research universities, including Stanford and Princeton. UCLA had two – one for general use, and the other for its large biomedical research facility in the School of Medicine.
Silicon chips were the technological breakthrough which in the early 1970’s made every previous generation of computer immediately obsolete. Instead of being a large bundle of wires and transistors, very slow, and very costly to manufacture, any microprocessor could now be more or less photographically “printed” on silicon wafers at a scale so tiny that powerful microscopes were required to see the circuits and junctions. Random Access Memory (RAM) chips containing 4K (4000) memory locations were soon developed, and reduced in cost to a few dollars apiece. Thus, the greatest expense in making computers was drastically reduced.

By the mid-1970’s there were single-chip CPU’s or “micro-processors” such as the Motorola 6800, the Zilog Z-80, and later, the Intel 8088 – the first processor used in the IBM PC or Personal Computer. RAM chips “grew” every few years by a factor of four: from 4K to 16K to 64K to 256K to 1MB to 4MB, and so on. Now, 256MB costs less than $50 as part of a “memory board” that can be plugged into a personal computer (2006). Today (2017), you can buy a 32GB USB memory stick (“thumb drive”) for about $10.

“Silicon Valley” sprang into being, and beginning with the Apple, the powerful desktop personal computer became a reality. Since that time, the formula known as “Moore’s Law,” after one of the founders of Intel, has been that each year, a given quantity of computing power and speed will cost about 30% less than it did the year before. This is a rapid rate of development which probably cannot be sustained (they said that ten years ago, too), but it is a tangible form of “progress” which has never been equaled in all the history of technology. Even Henry Ford’s remarkable reduction in the price and availability of automobiles by mass production pales in comparison.
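
Taking that 30%-per-year figure at face value (it is my rough rule of thumb, not the canonical transistor-count statement of Moore’s Law), the compounding is easy to see:

    # Compounding of the "30% cheaper per year" rule of thumb quoted above.
    def relative_cost(years, annual_drop=0.30):
        """Cost of a fixed amount of computing power after `years`, as a fraction of today's."""
        return (1 - annual_drop) ** years

    for years in (1, 5, 10, 20):
        print(f"after {years:2d} years: {relative_cost(years):.4f} of today's cost")
    # After 10 years, the same computing power costs about 2.8% of what it does today.
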
III. The Idea of Progress in the Evolution of Digital Technology

The Idea of Progress, an issue dear to the hearts of important thinkers in the mid-20th century, seems to have been stood on its ear in what can now only be called “the Cybernetic Revolution.” It’s already over, or in its final “set a long-term course” phase, and many believe it will almost cease to be an issue in the new Millennium. I concur with this prediction. Technics do, indeed, shape civilization and all its art, culture, and intellectual content and direction. Many seemed to understand this in mid-century. My father read books and journals at that time, when I was growing up, and there were sarcastic references to “an air-conditioned nightmare” and a nation of imbeciles “dumbed down” by television and other commercial mass media.

The same Lewis Mumford who wrote The City in History and Technics and Civilization also wrote an impassioned plea for nuclear disarmament, In the Name of Sanity. Bertrand Russell provided a similar humanistic perspective tempered by fears for a future dominated by Stalinesque leaders with nuclear arsenals. Yet, the wonders of technology, exemplified by the slogan “Atoms for Peace,” became a dominant theme in popular and commercial culture during the 1950’s. The United States attempted to become, again, the Empire it had abandoned two centuries before. The “100% American” of the 1950’s took more pride in his nation and its recent victory over tyrants and dictators, even as our leaders were protecting and installing tyrants and dictators around the world. The 1950’s, like the 80’s and 90’s, was an age of upward mobility and professional success – success being counted mainly, but not exclusively, in material terms. Much of popular culture was reinforced and enhanced by a yuppie-esque pursuit of wealth and beauty.

Yet, everyone was not benefitting equally or proportionately. Here are the roots of the Black Revolt in the 1960’s, and of the peace and social justice movements which accompanied and reinforced a larger Third-World liberation and anti-colonialist impulse. It is to this post-war period of euphoric ambition – the GI Bill and the amalgamation of the working class with intellectuals – that we can also trace the roots of the Counterculture in the Beat and jazz scenes, whose members were predominantly ethnic (i.e., of non-British ancestry).

Whenever we visited foreign countries, or listened to fine arts and educational radio and television, we understood how limited American popular culture had become – how the “melting pot” had consistently denied our individual characters and heritage, leaving us a rootless, history-deprived society. In our family, religion got much of the blame for this, even as we respected and encouraged the moral foundations of traditional Christianity and western European civilization.

It was in this context that computers emerged. The classic film “Desk Set” (mentioned above), with Spencer Tracy and Katharine Hepburn, offers one of the few brilliant insights into the effects computers were having on our daily lives. The fact that it was made in 1957 is extraordinary, and qualifies it as “science fiction,” for there was then nothing like this computer in existence. Basically, the story describes a computer programmed with all of human knowledge, which thus becomes an infallible source of truth and guidance – a kind of oracle which exposes and makes fun of contemporary American culture. The ironic title, Desk Set (sub-titled “His Other Woman”), reminds us of our own obsession with the latest desk-top computer technology. Although this computer was not a desktop, it became the accessory of a “smart working girl” and her greedy boss.

Now, the Internet has become the oracle for all knowledge and information, the “world-wide-web” which connects all the computers and data-bases in the world! Computers are the one known area where the technology has actually exceeded the wildest expectations and imaginary machines of the most optimistic science-fiction writers. In contrast, we are still far behind the science fiction standards in robotics and propulsion systems – even with respect to what was written half a century or more, ago.

IV. How lives changed at the Dawn of the Cybernetic Age

The largest threat posed by computers was evident to me from the beginning (say, 1970): what would they do to the way we think? Students from “Third World” underdeveloped countries often remarked that computers posed a real dilemma for those of them who were learning science or engineering with the help of computers, but planned to return to their own countries and do their work without computers later on. Even the pocket calculator was yet to be developed, and those who didn’t have computers could only look forward to doing calculations with slide rules!
People in their 50’s and older may remember the large $30-$50 logarithmic slide rule, with 20-30 different scales, and carried in a leather holster like a large sheath knife by the rather awkward-looking science and engineering students. The handwriting was on the wall when I could go to a campus lost-and-found auction and buy as many of these antiques as I wanted for two or three dollars apiece!

If we were to become dependent on computers, what would happen if the computers were somehow not available? Obviously, this was a real problem so long as computers filled large rooms and cost millions of dollars. When I returned to Montana in 1972 and lived 30 miles from a city, and the same distance from any usable computer, I called the phone company and asked what it would cost to install a phone line that could handle a remote computer terminal. It would have been necessary to extend a private line (our normal rural phone used a noisy 8-party line not well-suited for modems!) for about 8 miles, at a cost of several thousand dollars per mile – not an economically feasible proposition. In fact, the Mountain Bell representative seemed amused that anyone would even consider such a thing.

Another aspect of this dependency was observable in the computer centers where I worked. Those who were really dedicated to the newest incarnation of the God of the Machine often seemed to lose other aspects of their humanity. Staring fixedly into CRT displays which more resembled oscilloscopes than modern color monitors; forgetting to eat, wash, change clothes, and otherwise interact with friends and families; these “hackers” turned out to be next year’s millionaires or literal “rocket scientists”. Some went mad, or disappeared into the counterculture, never to be seen or heard from again.

We know, now, that this was a form of addiction, or obsessive-compulsive behavior. Denial was a large aspect of the problem. I can still remember a young man, ambitious and clean-cut, who over several months turned into a kind of Mr. Hyde before our very eyes. On one occasion, his aging father, an immigrant from some European country, came to the computer center to rescue his son from the infernal machines which had somehow captured his soul. All entreaties were in vain, and the old gentleman finally departed in tears, leaving his son to complete the work of genius he was performing.

Anyone wishing to deal with computers had first to deal with the new priesthood of the Cybernetic Age. They resembled Ross Perot. Even the lowliest technician from IBM wore a three-piece suit and tie to the shop in the back of the machine room, where the coat might be hung on a chair and sleeves rolled up, but vest and tie stayed on. One could have easily mistaken them for FBI agents. They knew nothing about programming or what the computer was doing, but they had the ability to locate, replace, or adjust any defective part or mechanism in what was undoubtedly the most complex machine ever built. Basically, they were glorified mechanics, usually trained in the military or according to a military-style regimen.

One of my friends wrote an article for an underground paper entitled “Cyborgs” in which he “exposed” this new class of machine-bound humans. Even a person driving a car, he maintained, is a cyborg – half human (or less than half), and half machine, plunging headlong into an unknown future which, when contrasted with the “flower children” of the 1960’s, began to resemble H.G. Wells’ future technocracy of Morlocks in The Time Machine. But most social criticism of the dawning Cybernetic Age was restricted to the infuriating unresponsiveness of trying to deal with computers, and the form-letters which they were already generating in a blizzard of meaningless paperwork. Someone would get a bill or other document from a company or the government, and assuming that it was from a real mind and a real person, would call or write to straighten out the problem. Soon, the unwitting customer or client would discover that no person had written or even seen the letter, and that the problem would be identified as a “computer error” for which there was likely to be no redress or adjustment very soon.

“It’s the computer” became everyone’s favorite excuse for mistakes or inaction. This was the time when the phrase “Do not fold, spindle, or mutilate” became a cliché – not with respect to people, where we sometimes hear it today, but with respect to one’s phone bill or other document, itself a punched “IBM card” which would at some point have to be read at very high speed by a card-reader, the slowest and weakest link in the flow of “information.” Any damage to the card might cause a jam in the card reader or loss of valuable data.

It was at this point that some of us began to see that computers were not just machines, but harbingers of the beginning of a new era in culture and belief; that computers had already become the newest oracles of what we now might call a “virtual” religion; and that we should properly speak of computer theology rather than computer science.

There was also an ecological aspect to all this. The evolution and proliferation of computers constituted an ecological system, subject to most of the same rules identified by the investigators of living systems. There were many extinctions and dead-ends in the evolution of new computer species. And some lay dormant for years or decades until someone figured out a way to utilize them.

The “mouse,” for example, was developed at Xerox PARC in the early 1970’s (building on Douglas Engelbart’s work at SRI in the 1960’s), but didn’t become a common feature of computers until a decade or so later with the advent of the Macintosh. The punched card technology became entirely extinct, as did many other forms of data storage and retrieval. Dot-matrix printers replaced the costly and complex line printers, only to be superseded by laser and ink-jet printers which continue to become cheaper and better with each passing year.

But it was the advent of the personal computer which totally changed the game from one of centralized authority and superstition to the age-old American ideal of individualism and “do it yourself” technology. Once computers became affordable to the average professional or working person, the mystique was gone. Although computers no longer seemed to pose the same totalitarian threat they had, before, other potential dangers lurked in the shadows.

For one thing, computers in the workplace were not entirely the labor-saving devices they were intended to be. They may have saved certain kinds of labor costs for employers, but they also imposed heavy costs on many of the users, including the still-controversial effects of CRT radiation and the damage to fingers and tendons (carpal tunnel syndrome) from long hours of steady typing, uninterrupted by inserting paper, erasing, or other more natural and spontaneous movements once required of typists.
In terms of labor relations, many once-salaried and low-pressure jobs became piece-work nightmares, in which each worker’s productivity could be precisely monitored and measured. Slower typists were demoted or fired, regardless of their other talents or value to the firm. Those who refused to become “computer literate” found their employment opportunities severely curtailed.

As computers became more and more essential – not only to the completion of repetitious typing or calculating tasks, but to the creative end of business, such as design, layout and typesetting, and the robotics found in manufacturing – the standards for products and productivity improved or increased at what seemed to be an accelerating rate. Those who could use computers effectively had an immediate and enormous advantage over their competitors who could not. Soon, it began to appear as though computers were becoming a kind of magical “answer” for every workplace or industrial problem.

The marketers and vendors of computers and computer accessories became the new prophets of the cybernetic religion. In government and other large-scale institutions, the rate of mechanization and replacement of already advanced and successful technology snowballed, costing hundreds of thousands of jobs with little or no increase in productivity or service to the clients or customers. Somehow, the American way of doing business was no longer proving to be effective, and was being replaced by an attitude which began to see computers and other high technology as ends in themselves, regardless of their negative impacts on the average person.

V. The Idea of Progress – Computers

The message we’ve been sold is that everything is getting better every year – just like the machines, themselves. More computers means a higher standard of living. Ever more costly and complex gadgets are somehow believed to have improved “the quality of life.” Yet, most of us cannot afford to upgrade our equipment every couple of years, and re-learn the software. Most of us couldn’t even figure out how to program a VCR, until screen “menus” (again, based on micro-processors) simplified the process to the level of a complete idiot.

But more love and care is not being put into products, yet. “Quality control” is still seen as primarily a technological or economic issue – not a matter for human aspirations. “Efficiency” is no longer the Puritan virtue it once was. It means, instead, dehumanizing the workers (or the customers) for the sake of corporate profits. It means cutting public services by governments, not improving them. Being “cost-effective” doesn’t mean that we should get maximum value for our money, but that we should spend a minimum of money for anything except our own immediate material desires – the next generation of gadgets, in other words.

Talk to college business students or recent MBA recipients and you are likely to be in for a shock. Not only are most of these people not knowledgeable about philosophy, the arts, or broader community issues; in most cases, they are not the least bit interested, either. They hope to make a lot of money for themselves, so they can then purchase “leisure,” apparently. Or gain power over others, attract suitable members of the opposite sex (as many as possible, it would seem), or perhaps just become “rich and famous” so that they might be featured on some TV “lifestyles” segment.
They’re not actually interested in enriching their lives and minds, or improving their cultural awareness, and certainly not interested in helping others to do so, or in creating a society in which all may achieve and prosper. In fact, most of them seem to think we’re playing a zero-sum game, in which one person’s gain must necessarily come at someone else’s expense. Our victory must mean someone else’s defeat – hopefully someone of a different race and culture. Ultimately, it becomes a “casino model of society” or “party culture,” where self-destructive behavior makes a few people rich and comfortable at tremendous human costs to a much larger number of people.

Fortunately, computers are no longer restricted to a wealthy elite. They may, indeed, become the “great equalizers” and the ultimate expressions of an Open Society in which no one group or faction can control the future, or abuse those with less power and influence. Although I am not inclined to use computers more than is necessary and beneficial, I feel fortunate, indeed, to have a small, affordable computer which performs all the functions which are useful and beneficial to me. It is, indeed, an empowerment tool of great flexibility and utility. The fact that civilization reached the high level it did before they were invented is almost more miraculous than the fact that computers were invented at all, and perfected to their present level.

VI. Cyber-linguistic Socialism

How do we explain the fact that the American people and our intellectual and moral leaders have virtually no standing in the policy decisions of this country? If it is truly the case that the most legitimate and pressing issues are not even on the table, or have been long since discarded from active consideration, how do we explain this fact, or maintain our illusions that we live in a society with principles, and that the democratic process actually works to bring forward the best ideas and policies?

Of course, we can admit that we no longer have a democracy, and that the corporate media refuses to consider or discuss the real issues of poverty, racism, sexism, the nuclear threat, environmental crisis, or whatever. Even when they do discuss these issues, there is a nearly complete censorship in their reluctance to publish certain writers or schools of thought.

Now, thanks to the internet and a new wave of popular activism bolstered by other telecommunications including public broadcasting, cable, CNN, C-SPAN and various other channels and networks, all of us can publish and read exactly what we like. With all sorts of related progress in the epistemology of political rhetoric and action, a true participatory democracy is beginning to arise. Is it enough, yet, to swing an election? Sometimes, in some places. But more importantly, it is changing the consciousness of the average news reader or viewer; changing the media, itself; and changing the way that elected officials manage or “coordinate” government functions and public policy.

Amazingly, all of this has not much shifted the balance between right and left, which was already very heavily weighted towards a more or less constantly rightward-moving Center. The Internet, itself, is a scientific, academic and political instrument, developed under government contracts in support of defense research and other kinds of academic, scientific endeavors like the National Science Foundation, NASA, and a number of other agencies and institutions. Soon, it would become a library resource, publisher of academic journals, a medical management tool, and so forth. Computers had been developing concurrently to serve all the various academic, scientific, business, consumer, and entertainment functions, and it was a natural development in “cyber-ecology” to expand the WWW to encompass all these applications, and many more, besides.

But few imagined, or had any idea, what the political consequences might be. With the advent of the personal computer, most fears of a computerized dictatorship were tossed out the window. Both Right and Left saw in computers a means of personal and community liberation, and perhaps even the demise of the giant nation-state and centralized governments of all kinds. Legislators could immediately access all the research, news, and opinions they could possibly assimilate, and the public consciousness became a concrete, quantifiable reality which could be “studied” and interpreted with a pseudo-scientific exactness which brooked no argument or refutation.

It is now possible to have national referendums and town meetings to decide every kind of issue, and some progress has been made in that direction, but there is little progress, yet, in actually moving in the direction which socialist or social democratic activists might favor.

The fact is, there is now probably two or three times as much Right-wing activity and propaganda on the WWW as there is Leftist, and the class division which made computers accessible to the professional class while working-class people hardly knew what they were has had grim political consequences.

But it is in education and the media that computers have had by far the greatest social impact. How terribly insecure and impermanent our young people must feel, seeing yesterday’s most glamorous technologies thrown on a junk-heap of the obsolete and over-costly. For the educated and elite classes, it is Brave New World — light, sexy, and scientific. For the working poor, it is 1984 — bleak, frightening, and dictatorial. Meanwhile, there is a growing incidence of irrational violence and other destructive behavior directed not at the system, but at anyone and everyone within reach.

These “flavors” — entirely refuted and unwanted — have nevertheless come to dominate our national consciousness. In spite of all the technological savvy, greed, and ambition, there is very little critical thinking or what the Right calls “Secular Humanism” (academic, scientific, progressive philosophizing) going on in either our public or private media and education systems.

So what is our new paradigm of cybersocialism going to look like? And what sort of education systems and media will it foster and maintain? What I discovered as an economics student (most interested in the history of economic thought and comparative economic systems) is that the science of economics has defined its own boundaries so narrowly that it is up to social philosophers (who, one hopes, thoroughly understand economic theory) to actually define the ends of human civilization, and the means to attain them. Thus, any sort of economic system is essentially meaningless and irrelevant unless it reflects a deeper set of social values and (natural) scientific understanding. That’s why the work of people like Noam Chomsky and other progressive, humanistic Leftists is so valuable, and so under-appreciated in our centralized, totalitarian corporate state.

 
