Before Television – BTV

Green Libertarianism: A Young Person's Guide – chapters

What Have Computers Done to Our Minds?

A brief discourse on technological progress

BTV = Before Television

My own early childhood was part of the last generation to develop its consciousness before the advent of television. I did not experience television until I was six years old, and we didn’t own one in our home until I was nine. More importantly, all of the adults in my life, including parents and teachers, were raised and educated BTV – before television. Television, and then computers, the internet, VCRs, DVDs, and the national corporate media have completely changed our consciousness during the past 50 years, and this New Age of electronic media closely parallels the rise of science fiction as literature and the obsession with an alien presence here. The effects have been highly political, drastically changing the economic, social, and cultural life of nearly everyone in this country who is in any way “plugged in” to them.
Few Americans now in their teens or 20’s, unless they have lived in a remote place, have experienced anything like the personal freedom, connection with nature, and sense of local community which I experienced as a child. Is this really a problem? No, because it has no solution. It is a change which we can make ourselves aware of, and in certain respects compensate for with our personal lifestyle choices. What we need to do is understand both the positive and negative implications of these changes, and attempt to direct our individual and community lifestyle choices in a healthier, freer, more natural and humanistic direction.
Some parents have actually made the choice of having no television in their homes, encouraging reading, crafts, and hobbies of the same kinds which nearly everyone practiced when I was a child. Others have opted for high-tech, internet-based home schooling, intensive sports programs, music, dance, skating, art, and other kinds of private or group instruction, and so forth. Forming an intentional community of some sort (most are religiously based, but that needn’t be the case) is highly desirable, both for the nurturing, health, and sanity of children, and to maintain the sort of lifestyles which are good for people of all ages. But the vast majority of Americans simply haven’t done it, and aren’t about to do it.
If you were raised in the average environment of public schools, lots of TV and video games, computers, Top 40 radio, fads in clothes, cars, hair, and gadgets, you will probably see no reason to make any radical changes in your lifestyle. You are probably interested in having a good job, owning your own home, marrying someone you love and with whom you share many or most interests and aspirations, and raising children to be pretty much like you are. This book might lead you to question some or all of these goals, and to re-think them or make some better choices. Whatever happens, I have tried to provide some alternatives to the present assumptions which underlie American society at the beginning of the 21st century. This book is more for those who are unhappy with the current state of affairs, and wish to head out in some different direction. It is only by defining and understanding where we are today that we can even think about being somewhere else. These are personal choices which all of us must make for ourselves.


A Brief Discourse on Technology

We live in a highly technological age, and virtually no part of the world is free of its attractions and liabilities. Even the most isolated and “pre-industrial” civilizations now rely on automobiles, power boats, farm machinery, and now, of course, computers, cell-phones, and every other kind of modern technology. And every nation, no matter how poor or disadvantaged, wants to spend an inordinate part of its national income on military, police, and other repressive and destructive institutions. More than 70% of our “foreign aid” over the past century has been devoted to military and “internal security” purposes.
The U.S. government spends 20% of its budget on direct military spending; another 20% or so on interest on the national debt which is almost entirely attributable to past military spending; and another 10-15% on health care, pensions, and other services for veterans of past wars. At the same time, spending 1% of the budget on aid to families with dependent children was considered to be an unconscionable waste of the taxpayer’s money, and 50 cents per taxpayer spent on support for the arts, and another 50 cents for public broadcasting are under continual attack by “conservative” senators and congressmen.

I. The Abuse of Science, Sociology, and Mathematics

In defending a radical, logical opposition to today’s technocracy, it is important to distinguish the human uses of new science and technology from its abuses. Most of the criticism of technology and techno-think is directed towards its rampant abuses, not its utilitarian values.

The primary abuse of physics is the nuclear arms race. The primary abuse of rocket science is an ICBM nuclear arsenal (which I live next to here in central Montana). The primary abuse of economics is its role as apologetics for the multi-national corporate aristocracy. The clearest abuse of mathematics may be seen in the actions of another Montanan-by-choice, Theodore Kaczynski.

If we include theology, then its abuse may be seen in monotheistic fundamentalism, whether Christian, Islamic, or Judaic, and in the resulting conflicts in the Middle East and elsewhere, based on putting one faith above nature and creation as a whole – in being “the one true faith” and the only accurate representation of God’s scheme of things.
Is it wrong (“Satanic”) to teach children calculus and quantum theory? Evolution? The Marxist theory of historical development? Of course not! Should we attempt to indoctrinate our teachers and schoolmasters in some particular faith or ideology? Or should we encourage diversity and choice? These are the vital issues surrounding another great abuse: the abuse of education by brainwashing, “training for capitalism,” racism, imperialism, genocide, or whatever. Liberal education, it would seem, has suffered even greater setbacks than liberal politics or religion.

What about high-tech terrorism based on our utter dependence on massive, energy-intensive machines, buildings, and other accessories of civilization? As this is being written, we have just witnessed the first major, successful attack on the 48 states since the War of 1812, accomplished by 19 men armed with pocket knives, but in control of four airliners which were used as guided missiles against some of our tallest and most important government and financial buildings. The death toll is now estimated to be approaching 7000 — more than Pearl Harbor and the Titanic, combined. (A few years later, we know the 9-11 death toll was some 3500, and at least two of the four airliners are believed to have been shot down or otherwise disposed of by our own military forces. The strike on the Pentagon is now believed to have been a military aircraft or missile, not an airliner. And instead of 3500 deaths, the number has now multiplied to millions of casualties in Iraq, Afghanistan, and other parts of the world as a consequence of the U.S. “response.”)

In the 1970’s, a movement known as “appropriate technology” emerged, led by counter-culture leaders such as Stewart Brand, who founded the Whole Earth Catalog and associated enterprises and publications, and E. F. Schumacher, a British economist and former bureaucrat in Britain’s state-owned coal industry who wrote a charming little book called Small is Beautiful which became an international best seller. The gist of this movement was that we need to free ourselves of technological domination by governments and large corporations by regaining control of our economy, our tools, and “the means of production.” Children of the upper middle class “dropped out” to form rural communes, urban collective businesses, schools, community centers, and all sorts of other humanistic, more or less anti-corporate and anti-technological endeavors.

Much of the recent policy debate between advocates of “appropriate technology” and those who believe that no one should attempt to control its development and evolution centers around this question. In the 1930’s, there was an explicit political movement – the Technocrats – who believed that all social problems could be “re-engineered” by science and technology to correct or eliminate them. Much of Nazi ideology had a similar “scientific” aura and rationale. Marxists called their system “scientific socialism” to distinguish it from the softer “social democrats” or “Utopian Socialists” – a term which Marx originated. Yet, his followers would also rely heavily on the French utopian tradition, with its phalansteries and rule by scientists and engineers.

The classical Liberals – the laissez-faire, free trade, rule of law, parliamentary democracy advocates – often had a different view based closely on emerging evolutionary biology. A healthy society must not overprotect its weakest members; captains of industry are like ecologic dominants, evidence of the perfect working of the principle of the “survival of the fittest.” Technology, for them, was just one part of this process. Clearly, we must leave entrepreneurs free to develop and experiment with any and all technology. We must protect their right to exploit their discoveries and inventions, a principle which was later severely questioned by libertarian purists. Since studying their arguments, I have been able to find little social value in granting monopoly protection to most scientific patents and discoveries. Although “trade secrets” and the fruits of well-financed research and development programs are known to be keys to success in the marketplace for new technologies and products, the fact that pharmaceutical companies spend far more on advertising and promotion of their products than they do on research indicates that their arguments in favor of monopoly profits resulting from patent “protection” are bogus.

In the United States, the latter position has obviously had the upper hand for the past couple of decades, if not before. Although the American natural environment is less ravaged than Europe’s or the heavily populated parts of Asia, we now lag behind the rest of the world in international initiatives to address global ecologic problems. Indeed, we are as reactionary as the Vatican or Islamic Republics on many of these issues – especially those pertaining to population control.

Clearly, we must begin to address such issues as overpopulation, land use, and non-renewable resource exploitation sooner rather than later. The human species is rapidly approaching some form of negative utopia in which life has lost all meaning, and in which the physical conditions of most people’s existence have again fallen to a level of barest subsistence on a day-by-day basis. All the advances of science and technology, the arts, culture, and human understanding will be swept away in a radioactive holocaust, genocide by genetic engineering, and total management of all information sources and political responses. This happened under Nazism and Bolshevism, and can just as well happen under a theocracy or rule by some other rigid, rigorously-enforced ideology. We may also be reaching the point where for a great many people, violent revolution carried out by small, autonomous groups (AKA “terrorism”) is seen to be the only practical recourse.

Americans have traditionally preferred fighting to switching. We remain an essentially warrior culture – something which all the liberal panaceas in the world will not change. They can weaken us, or deceive us temporarily, but eventually we will rally to face any threat – foreign or domestic. We have finally “met the enemy, and they are us,” as Pogo so cogently put it.

We’ve got a tiger by the tail, to use another metaphor. If we let go, it will turn and rend us. If we hold on, we will be dragged to our death. We can continue the American vision of post-World War II global supremacy – a program no one understands or wishes to pursue any further – or we can let go, and find ourselves immersed in a seething cauldron of nuclear terrorism (which we invented) or the Old World imperialistic struggles for resources and territory, as well as religious wars (which we tried to avoid, but have now finally caught up with us).

We Americans have the distinction of having supplied the nuclear technology to make what has always been a hopeless struggle into one which threatens human civilization itself. We may, indeed, be recognized by future explorers and archaeologists as the civilization which destroyed itself – Western, Christian, Scientific, Humanistic civilization. If any people survive, they are likely to be at the very margins of today’s scientific, technological civilization, uncontaminated by its technology and values.

II. What Are Computers Doing to Our Minds?

When I first encountered computers on a direct, personal level, I was a graduate student in philosophy, with special interests in what was then called “cybernetics”, philosophy of mind, and the relationship between human and machine thinking – the field now known as “Artificial Intelligence.” Although I soon found myself both out of school and unemployed, the subject continued to interest me. I had been working as a computer operator in a large computing center of a prestigious, research-oriented university – UCLA – where I had recently graduated and was facing the choice of continuing my education there, or moving elsewhere in pursuit of an academic career. The choice I finally made was neither; instead, I quit school entirely and returned to Montana, vowing that I would never again take a course for academic credit.

Part of my revulsion was based on my research in the economics of education, and the seeming counter-productivity of most formal schooling. The rest was based on a then common fear or suspicion of technology, although I was more scientific and better-trained academically than most of the so-called “counter culture.” My newly-acquired knowledge of computers and how they were becoming ubiquitous and essential to the American way of life made me wonder just where we were headed, and as an avid reader of science fiction, I was very future-conscious and future-oriented.
I was especially concerned about the computer’s role in government, for I was also a political libertarian. The libertarian left (much of which is also called “anarchism”) was beginning to shape my thinking about social philosophy. It was just at that time in my life that I was familiarizing myself with that rarified part of the political spectrum where left and right overlap, and for those who find themselves in this territory, the history of our political life can easily seem to have been one long, unmitigated disaster – the gradual erosion of a free society into an empire or other elitist, totalitarian state.
The idea of governments armed with massive, powerful computers regulating, structuring, and evaluating the most minute and private aspects of our lives filled me with horror. I knew that we were living in an age of declining freedom and social morality, overpopulation, environmental degradation, and the imminent danger of complete annihilation from nuclear war. We were losing, or had already lost, the ability to plan and determine the course of our individual lives. It appeared that governments everywhere were becoming totalitarian.

Computers were an obvious tool for oppressive governments, and at this time, governments were the main impetus to computer development. The very first computers in the United States were actually built to do the numerous and complex integrations required for artillery trajectories. Later, they were to be employed in the Manhattan Project in the development of the atomic bomb. The first commercial builders of computers could hardly imagine any business applications, and estimated that only a dozen or so computers could ever be sold — mainly for record-keeping functions, accounting, and the like. Thus, digital technology remained primarily a government domain for a decade or two, where tax collection, the census, and similar functions might provide a likely application for magnetic or other coded information storage, and electronic data processing.

Military applications proceeded apace. It was widely believed in the late 1960’s that the main benefit of the Apollo Program (to land a man on the moon by the end of the decade) was the impetus it gave to computer development and design. Because governments had ordered and paid for their development, generally for purposes of defense and scientific research, engineering of weapons systems and other military and aerospace applications, computers were not yet recognized as a means of personal empowerment.

Those of us who worked in computing centers soon found that we were empowered simply by our access to computers. This was where the really smart people worked, and in order to continue our work, we had to adjust to an authoritarian setting, though universities were the least oppressive and most amenable to creative, divergent thinking. We became something like a new priesthood, serving the machine-gods who had become a sort of oracle or Divine Presence. If you’ve seen the original film of “2001: A Space Odyssey,” you’ll know what I’m talking about. Although no such computers existed in the late 1960’s, the fear was already there, and by 2001, “virtual reality” and the internet far exceeded those earlier predictions.

If you were an engineering or science type, you designed, built, and thoroughly understood digital technology. If you were a business type, you may have sold computers, programmed them, or otherwise employed them in your business planning and administration. If you were an artist or an academic, you could begin to use computers to create new patterns of light, line, or sound, or employ them in research – perhaps textual analysis of a great novel, or deciphering ancient, previously untranslated texts.

If you were an urban planner, computers would prove very useful, and economic planning was supposed to have been revolutionized by the development of computers. In short, almost any field was open to the development of computer programs which would inevitably change the ways we worked, solved problems, and carried out the everyday tasks of production, distribution, and the applications of theoretical knowledge and information to our everyday lives.

In the late 1960’s, the personal computer, so far as we knew, did not yet exist — even in the science fiction where most futuristic technology first appeared. Shows like “Star Trek” had a ship’s computer which could answer questions (this was even anticipated in a charming 1957 film called “Desk Set” with Spencer Tracy and Katharine Hepburn). But the impending horror of a totally centralized, computerized and thus “regulated” society seemed to be the real prospect we were facing. In the epic science fiction novel Dune, by Frank Herbert, computers have been outlawed in that distant future, feudalistic civilization, to be replaced by human “mentats” – carefully trained logical thinkers who could evaluate complex data and make probabilistic predictions from it.

Thus, the decade or so before personal computers became widely available was the last time that a principled – if hysterical – opposition to the further development of computers and their intrusion into our everyday lives was expressed. Student radicals had actually taken over university computing centers (including the University of California, Santa Barbara, the year before I began working there) and in one case, totally destroyed a large, multi-million dollar system at a Canadian university. For a while, working in a computing center could be seen as “hazardous duty” – even on a university campus! Ted Kaczynski’s so-called “Unabomber Manifesto” expresses this period and thinking very well, although in a rather convoluted and distorted fashion, reflecting the mental state of its author.

The computers we used in those days should be described for the benefit of younger readers who’ve never seen a computer which wouldn’t fit on a small desk or in a briefcase. The IBM 360/91 I operated cost more than $5 million ($35-40 million in today’s dollars), and filled a large room – perhaps 1500 square feet, carefully air-conditioned, and kept immaculately clean. The computer itself (CPU – central processing unit – and RAM, or Random Access Memory) was water-cooled with a radiator system holding more than a hundred gallons of distilled water. RAM then cost about $1.00/byte, so that a 4 MB (4 million byte) memory like we had (one of the very largest in use at that time) cost $4 million in 1969 dollars, and was most of the cost of the entire computer system. Now, it would cost about 10 cents per MB, and a gigabyte or more is commonly found on a chip about the size of a postage stamp inside of a flash drive or on a memory card.

In the 360/91, 4 MB filled several large cabinets roughly the size of supermarket coolers – roughly 4 X 4 X 20 feet. They contained millions of wires and transistors, which had to be hard-wired into place. The original memory location “bit” was a bead-sized doughnut of ferrite, a magnetic ceramic, with three wires going through it. A current along one of the wires would magnetically polarize an individual doughnut either positively or negatively. The other wire would reverse the polarization, changing a 1 to a 0, or vice versa. The third wire was a “read” wire, to tell the CPU whether that location was presently a 1 or a 0. All “binary” digital computers work on the same principle, but today’s hardware looks very different, and billions of such “doughnuts” are microscopically “printed” on a single memory chip.
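The three-wire core described above can be mimicked in a few lines of code. This is only a toy sketch of the simplified account given here (real core memory used coincident half-currents on two drive wires and a destructive read on a sense wire); the class and method names are my own illustration, not any historical interface.

```python
# Toy model of one magnetic-core "doughnut": two wires change the
# polarization, a third wire reads it out.

class CoreBit:
    def __init__(self):
        self.state = 0          # polarization: 0 or 1

    def set(self):              # pulse on the first wire: polarize to 1
        self.state = 1

    def flip(self):             # pulse on the second wire: reverse polarization
        self.state = 1 - self.state

    def read(self):             # the "read" wire reports the current state
        return self.state

bit = CoreBit()
bit.set()                       # 0 -> 1
bit.flip()                      # 1 -> 0
print(bit.read())               # prints 0
```

A 4 MB memory was simply some 33 million of these doughnuts, each hand-threaded with its three wires.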

Similar developments can be seen in graphics, programming, speed, and “user-friendliness.” To use a computer for any obvious task then required hundreds or thousands of hours of “programming”, usually in the form of mathematical symbols or formulas. FORTRAN was the language of choice. Crude word processors, graphics displays, music synthesizers, and remote terminal access were just then being developed. The business “spreadsheet” was practically unheard of, but CAD (Computer-Aided Design) was beginning to be used in engineering to do routine and repetitive calculations, and to graphically display drawings of parts or whole systems.
Computers were also used in accounting (payrolls, billing, inventories, etc.). In fact, this last was by far their largest commercial application, usually in banks, insurance companies, and other large corporate enterprises. Soon, very large offices which had once been filled with rows of bookkeepers with adding machines (like Jack Lemmon in the 1960 film “The Apartment”) were replaced by a single mainframe computer and a cadre of keypunch operators.

What is now called a “data entry clerk” was then a “keypunch operator,” for that is exactly what they did. Instead of just “scanning” in data from barcodes or whatever (they were also just then being developed), the “keypunch operator” typed in data or program codes on punched “IBM cards” – something which today’s computer users may have never seen. I still find old ones placed as bookmarks in some of the books I owned at that time. They were also good for taking notes.

As a single line of characters was typed along the top of the card, a coded sequence of holes beneath it translated the characters into “machine language.” This consisted of a set of electrical impulses corresponding to the codes punched through the cards, generated as the cards were run through a “card reader” and thus transferred into computer memory.

All computer programs were at some point “keypunched” on these unwieldy cards. Each card contained a single line of code or data in a computer program, and a typical program might use boxes of them, at 500 to the box. Keypunch machines and card readers were very expensive, and prone to failure. A typical academic computer user might hire both a programmer and a keypunch operator if there was much programming and data to record.
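The zone-and-digit punch scheme described above followed the basic Hollerith code: a digit was a single punch in the row of the same number, while a letter combined a zone punch (row 12, 11, or 0) with a digit punch. The sketch below illustrates that mapping; the function name is my own, and real keypunches also encoded special characters that are omitted here.

```python
def hollerith_punches(ch):
    """Rows punched (12, 11, 0, or 1-9) for one card column under the
    basic Hollerith alphanumeric code."""
    if ch.isdigit():
        return [int(ch)]                      # digit: single punch
    if "A" <= ch <= "I":
        return [12, ord(ch) - ord("A") + 1]   # zone 12 + digits 1-9
    if "J" <= ch <= "R":
        return [11, ord(ch) - ord("J") + 1]   # zone 11 + digits 1-9
    if "S" <= ch <= "Z":
        return [0, ord(ch) - ord("S") + 2]    # zone 0 + digits 2-9
    raise ValueError("special characters omitted in this sketch")

# One line of code, column by column, as the card reader would see it:
for ch in "FORTRAN":
    print(ch, hollerith_punches(ch))
```

Each 80-column card thus carried at most 80 characters, which is why a modest program filled whole boxes of them.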

The impact of the Apollo Program could be seen very clearly in the computer center where I worked at UCLA. IBM made fewer than 20 360/91’s like ours; NASA owned most of them, and used them in the Apollo Program. In fact, I had the pleasure of watching the live television coverage of the first landing in the Sea of Tranquility from the machine room of our own 360/91. All the rest were bought by large research universities, including Stanford and Princeton. UCLA had two – one for general use, and the other for its large biomedical research facility in the School of Medicine.
Silicon chips were the technological breakthrough which in the early 1970’s made every previous generation of computer immediately obsolete. Instead of being a large bundle of wires and transistors – slow and very costly to manufacture – a microprocessor could now be more or less photographically “printed” on silicon wafers at a scale so tiny that powerful microscopes were required to see the circuits and junctions. Random Access Memory (RAM) chips containing 4K (4096) memory locations were soon developed, and reduced in cost to a few dollars apiece. Thus, the greatest expense in making computers was drastically reduced.

By the mid-1970’s there were single-chip CPU’s or “micro-processors” such as the Motorola 6800, the Zilog Z-80, and later, the Intel 8088 – the processor used in the first IBM PC or Personal Computer. RAM chips “grew” every few years by a factor of four: from 4K to 16K to 64K to 256K to 1MB to 4MB, and so on. Now, 256MB costs less than $50 as part of a “memory board” that can be plugged into a personal computer (2006). Today (2017), you can buy a 32GB USB memory stick (“thumb drive”) for about $10.

“Silicon Valley” sprang into being, and beginning with the Apple, the powerful desktop personal computer became a reality. Since that time, the formula, known as “Moore’s Law” after one of the founders of Intel, has been that each year, a given quantity of computing power and speed will cost 30% less than it did the year before. This is a rapid rate of development which probably cannot be sustained (they said that ten years ago, too), but it is a tangible form of “progress” which has never been equaled in all the history of technology. Even Henry Ford’s remarkable reduction in the price, and increase in the availability, of automobiles by mass production pales in comparison.
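Taken at face value, a 30%-per-year decline compounds dramatically; a few lines of arithmetic show a fixed quantity of computing falling to under 3% of its original cost in a decade. (The 30% figure is the formulation quoted above, not the canonical transistor-count statement of Moore's Law; the starting cost is arbitrary.)

```python
# Cost of a fixed quantity of computing, falling 30% per year:
# cost(n) = cost(0) * 0.7 ** n
cost = 100.0                    # arbitrary starting cost, "year 0"
for year in range(10):
    cost *= 0.7                 # each year costs 70% of the last
print(round(cost, 2))           # prints 2.82 -- roughly a 35x reduction
```

No other industrial process, not even Ford's assembly line, has ever sustained that kind of compounding for decades.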

III. The Idea of Progress in the Evolution of Digital Technology

The Idea of Progress, an issue dear to the hearts of important thinkers in the mid-20th century, seems to have been stood on its ear in what can now only be called “the Cybernetic Revolution.” It’s already over, or in its final “set a long-term course” phase, and many believe it will almost cease to be an issue in the new Millennium. I concur with this prediction. Technics do, indeed, shape civilization and all its art, culture, and intellectual content and direction. Many seemed to understand this in mid-century. My father read books and journals at that time, when I was growing up, and there were sarcastic references to “an air-conditioned nightmare” and a nation of imbeciles “dumbed down” by television and other commercial mass media.

The same Lewis Mumford who wrote The City in History and Technics and Civilization also wrote an impassioned plea for nuclear disarmament, In the Name of Sanity. Bertrand Russell provided a similar humanistic perspective tempered by fears for a future dominated by Stalinesque leaders with nuclear arsenals. Yet, the wonders of technology, exemplified by the slogan “Atoms for Peace,” became a dominant theme in popular and commercial culture during the 1950’s. The United States attempted to become, again, the Empire it had abandoned two centuries before. The “100% American” of the 1950’s took more pride in his nation and its recent victory over tyrants and dictators, even as our leaders were protecting and installing tyrants and dictators around the world. The 1950’s, like the 80’s and 90’s, was an age of upward mobility and professional success – success being counted mainly, but not exclusively, in material terms. Much of popular culture was reinforced and enhanced by a Yuppie-esque pursuit of wealth and beauty.

Yet, not everyone was benefiting equally or proportionately. Here are the roots of the Black Revolt in the 1960’s, and of the peace and social justice movements which accompanied and reinforced a larger Third-World liberation and anti-colonialist impulse. It is to this post-war period of euphoric ambition – the GI Bill and the amalgamation of the working class with intellectuals – that we can also trace the roots of the Counterculture in the Beat and Jazz scenes, whose members were predominantly ethnic (i.e., of non-British ancestry).

Whenever we visited foreign countries, or listened to fine arts and educational radio and television, we understood how limited American popular culture had become – how the “melting pot” had consistently denied our individual characters and heritage, leaving us a rootless, history-deprived society. In our family, religion got much of the blame for this, even as we respected and encouraged the moral foundations of traditional Christianity and western European civilization.

It was in this context that computers emerged. The classic film “Desk Set” (mentioned above) with Spencer Tracy and Katharine Hepburn offers one of the few brilliant insights into the effects computers were having on our daily lives. That it was made in 1957 is extraordinary, and qualifies it as “science fiction,” for there was then nothing like this computer in existence. Basically, the story describes a computer programmed with all of human knowledge, which thus becomes an infallible source of truth and guidance – a kind of oracle which exposes and makes fun of contemporary American culture. The ironic title, Desk Set (sub-titled “His Other Woman”), reminds us of our own obsession with the latest desk-top computer technology. Although this computer was not a desktop, it became the accessory of a “smart working girl” and her greedy boss.

Now, the Internet has become the oracle for all knowledge and information, the “world-wide-web” which connects all the computers and data-bases in the world! Computers are the one known area where the technology has actually exceeded the wildest expectations and imaginary machines of the most optimistic science-fiction writers. In contrast, we are still far behind the science fiction standards in robotics and propulsion systems – even with respect to what was written half a century or more ago.

IV. How Lives Changed at the Dawn of the Cybernetic Age

The largest threat posed by computers was evident to me from the beginning (say, 1970): what would they do to the way we think? Students from “Third World” underdeveloped countries often remarked that computers posed a real dilemma for those of them who were learning science or engineering with the help of computers, but planned to return to their own countries and do their work without computers later on. Even the pocket calculator was yet to be developed, and those who didn’t have computers could only look forward to doing calculations with slide rules!
People in their 50’s and older may remember the large $30-$50 logarithmic slide rule, with 20-30 different scales, and carried in a leather holster like a large sheath knife by the rather awkward-looking science and engineering students. The handwriting was on the wall when I could go to a campus lost-and-found auction and buy as many of these antiques as I wanted for two or three dollars apiece!

If we were to become dependent on computers, what would happen if the computers were somehow not available? Obviously, this was a real problem so long as computers filled large rooms and cost millions of dollars. When I returned to Montana in 1972 and lived 30 miles from a city, and the same distance from any usable computer, I called the phone company and asked what it would cost to install a phone line that could handle a remote computer terminal. It would have been necessary to extend a private line (our normal rural phone used a noisy 8-party line not well-suited for modems!) for about 8 miles, at a cost of several thousand dollars per mile – not an economically feasible proposition. In fact, the Mountain Bell representative seemed amused that anyone would even consider such a thing.

Another aspect of this dependency was observable in the computer centers where I worked. Those who were really dedicated to the newest incarnation of the God of the Machine often seemed to lose other aspects of their humanity. Staring fixedly into CRT displays which more resembled oscilloscopes than modern color monitors, forgetting to eat, wash, change clothes, or otherwise interact with friends and families, these “hackers” turned out to be next year’s millionaires or literal “rocket scientists.” Some went mad, or disappeared into the counterculture, never to be seen or heard from again.

We know, now, that this was a form of addiction, or obsessive-compulsive behavior. Denial was a large aspect of the problem. I can still remember a young man, ambitious and clean-cut, who over several months turned into a kind of Mr. Hyde before our very eyes. On one occasion, his aging father, an immigrant from some European country, came to the computer center to rescue his son from the infernal machines which had somehow captured his soul. All entreaties were in vain; the old gentleman finally left in tears, leaving his son to complete the work of genius he was performing.

Anyone wishing to deal with computers had first to deal with the new priesthood of the Cybernetic Age. They resembled Ross Perot. Even the lowliest technician from IBM wore a three-piece suit and tie to the shop in the back of the machine room, where the coat might be hung on a chair and sleeves rolled up, but vest and tie stayed on. One could have easily mistaken them for FBI agents. They knew nothing about programming or what the computer was doing, but they had the ability to locate, replace, or adjust any defective part or mechanism in what was undoubtedly the most complex machine ever built. Basically, they were glorified mechanics, usually trained in the military or according to a military-style regimen.

One of my friends wrote an article for an underground paper entitled “Cyborgs,” in which he “exposed” this new class of machine-bound humans. Even a person driving a car, he maintained, is a cyborg – half human (or less than half) and half machine, plunging headlong into an unknown future which, when contrasted with the “flower children” of the 1960’s, began to resemble H.G. Wells’ future technocracy of Morlocks in The Time Machine. But most social criticism of the dawning Cybernetic Age was restricted to the infuriating unresponsiveness encountered in trying to deal with computers, and the form-letters which they were already generating in a blizzard of meaningless paperwork. Someone would get a bill or other document from a company or the government, and assuming that it came from a real mind and a real person, would call or write to straighten out the problem. Soon, the unwitting customer or client would discover that no person had written or even seen the letter, and that the problem would be identified as a “computer error” for which there was likely to be no redress or adjustment any time soon.

“It’s the computer” became everyone’s favorite excuse for mistakes or inaction. This was the time when the phrase “Do not fold, spindle, or mutilate” became a cliché – applied not to people, as we sometimes hear it today, but to one’s phone bill or other document, itself a punched “IBM card” which would at some point have to be read at very high speed by a card-reader, the slowest and weakest link in the flow of “information.” Any damage to the card might cause a jam in the card reader or loss of valuable data.

It was at this point that some of us began to see that computers were not just machines, but harbingers of the beginning of a new era in culture and belief; that computers had already become the newest oracles of what we now might call a “virtual” religion; and that we should properly speak of computer theology rather than computer science.

There was also an ecological aspect to all this. The evolution and proliferation of computers constituted an ecological system, subject to most of the same rules identified by the investigators of living systems. There were many extinctions and dead-ends in the evolution of new computer species. And some lay dormant for years or decades until someone figured out a way to utilize them.

The “mouse,” for example, was invented at the Stanford Research Institute in the 1960’s and refined at Xerox PARC in the early 1970’s, but didn’t become a common feature of computers until the advent of the Macintosh a decade or so later. Punched-card technology became entirely extinct, as did many other forms of data storage and retrieval. Dot-matrix printers replaced the costly and complex line printers, only to be superseded by laser and ink-jet printers which continue to become cheaper and better with each passing year.

But it was the advent of the personal computer which totally changed the game from one of centralized authority and superstition to the age-old American ideal of individualism and “do it yourself” technology. Once computers became affordable to the average professional or working person, the mystique was gone. Although computers no longer seemed to pose the totalitarian threat they once had, other potential dangers lurked in the shadows.

For one thing, computers in the workplace were not entirely the labor-saving devices they were intended to be. They may have saved certain kinds of labor costs for employers, but they also imposed heavy costs on many of the users, including the still-controversial effects of CRT radiation and the damage to fingers and tendons (carpal tunnel syndrome) from long hours of steady typing, uninterrupted by inserting paper, erasing, or other more natural and spontaneous movements once required of typists.
In terms of labor relations, many once-salaried and low-pressure jobs became piece-work nightmares, in which each worker’s productivity could be precisely monitored and measured. Slower typists were demoted or fired, regardless of their other talents or value to the firm. Those who refused to become “computer literate” found their employment opportunities severely curtailed.

As computers became more and more essential – not only to the completion of repetitious typing or calculating tasks, but to the creative end of business, such as design, layout and typesetting, and the robotics found in manufacturing – the standards for products and productivity improved or increased at what seemed to be an accelerating rate. Those who could use computers effectively had an immediate and enormous advantage over their competitors who could not. Soon, it began to appear as though computers were becoming a kind of magical “answer” for every workplace or industrial problem.

The marketers and vendors of computers and computer accessories became the new prophets of the cybernetic religion. In government and other large-scale institutions, the rate of mechanization and replacement of already advanced and successful technology snowballed, costing hundreds of thousands of jobs with little or no increase in productivity or service to the clients or customers. Somehow, the American way of doing business was no longer proving to be effective, and was being replaced by an attitude which began to see computers and other high technology as ends in themselves, regardless of their negative impacts on the average person.

V. The Idea of Progress – Computers

The message we’ve been sold is that everything is getting better every year – just like the machines, themselves. More computers means a higher standard of living. Ever more costly and complex gadgets are somehow believed to have improved “the quality of life.” Yet, most of us cannot afford to upgrade our equipment every couple of years and re-learn the software. Most of us couldn’t even figure out how to program a VCR until screen “menus” (again, based on micro-processors) simplified the process to the point where a complete idiot could manage it.

But more love and care is still not being put into products. “Quality control” is still seen as primarily a technological or economic issue – not a matter of human aspirations. “Efficiency” is no longer the Puritan virtue it once was. It means, instead, dehumanizing the workers (or the customers) for the sake of corporate profits. It means cutting public services by governments, not improving them. Being “cost-effective” doesn’t mean that we should get maximum value for our money, but that we should spend a minimum of money on anything except our own immediate material desires – the next generation of gadgets, in other words.

Talk to college business students or recent MBA recipients and you are likely to be in for a shock. Not only are most of these people not knowledgeable about philosophy, the arts, or broader community issues; in most cases, they are not the least bit interested, either. They hope to make a lot of money for themselves, so they can then purchase “leisure,” apparently. Or gain power over others, attract suitable members of the opposite sex (as many as possible, it would seem), or perhaps just become “rich and famous” so that they might be featured on some TV “lifestyles” segment.
They’re not actually interested in enriching their lives and minds, or improving their cultural awareness, and certainly not interested in helping others to do so, or in creating a society in which all may achieve and prosper. In fact, most of them seem to think we’re playing a zero-sum game, in which one person’s gain must necessarily come at someone else’s expense. Our victory must mean someone else’s defeat – hopefully someone of a different race and culture. Ultimately, it becomes a “casino model of society” or “party culture,” where self-destructive behavior makes a few people rich and comfortable at tremendous human costs to a much larger number of people.

Fortunately, computers are no longer restricted to a wealthy elite. They may, indeed, become the “great equalizers” and the ultimate expressions of an Open Society in which no one group or faction can control the future, or abuse those with less power and influence. Although I am not inclined to use computers more than is necessary and beneficial, I feel fortunate, indeed, to have a small, affordable computer which performs all the functions which are useful and beneficial to me. It is, indeed, an empowerment tool of great flexibility and utility. The fact that civilization reached the high level it did before they were invented is almost more miraculous than the fact that computers were invented at all, and perfected to their present level.

VI. Cyber-linguistic Socialism

How do we explain the fact that the American people and our intellectual and moral leaders have virtually no standing in the policy decisions of this country? If it is truly the case that the most legitimate and pressing issues are not even on the table, or have been long since discarded from active consideration, how do we explain this fact, or maintain our illusions that we live in a society with principles, and that the democratic process actually works to bring forward the best ideas and policies?

Of course, we can admit that we no longer have a democracy, and that the corporate media refuses to consider or discuss the real issues of poverty, racism, sexism, the nuclear threat, environmental crisis, or whatever. Even when they do discuss these issues, there is a nearly complete censorship, evident in their reluctance to publish certain writers or schools of thought.

Now, thanks to the internet and a new wave of popular activism bolstered by other telecommunications including public broadcasting, cable, CNN, C-SPAN and various other channels and networks, all of us can publish and read exactly what we like. With all sorts of related progress in the epistemology of political rhetoric and action, a true participatory democracy is beginning to arise. Is it enough, yet, to swing an election? Sometimes, in some places. But more importantly, it is changing the consciousness of the average news reader or viewer; changing the media, itself; and changing the way that elected officials manage or “coordinate” government functions and public policy.

Amazingly, all of this has not much shifted the balance between right and left, which was already very heavily weighted towards a more or less constantly rightward-moving Center. The Internet, itself, is a scientific, academic and political instrument, developed under government contracts in support of defense research and of academic science sponsored by the National Science Foundation, NASA, and a number of other agencies and institutions. Soon, it became a library resource, a publisher of academic journals, a medical management tool, and so forth. Computers had been developing concurrently to serve all the various academic, scientific, business, consumer, and entertainment functions, and it was a natural development in “cyber-ecology” to expand the WWW to encompass all these applications, and many more, besides.

But few imagined, or had any idea, what the political consequences might be. With the advent of the personal computer, most fears of a computerized dictatorship were tossed out the window. Both Right and Left saw in computers a means of personal and community liberation, and perhaps even the demise of the giant nation-state and centralized governments of all kinds. Legislators could immediately access all the research, news, and opinions they could possibly assimilate, and the public consciousness became a concrete, quantifiable reality which could be “studied” and interpreted with a pseudo-scientific exactness which brooked no argument or refutation.

It is now possible to have national referendums and town meetings to decide every kind of issue, and some progress has been made in that direction, but there is little progress, yet, in actually moving in the direction which socialist or social democratic activists might favor.

The fact is, there is now probably two or three times as much Right-wing activity and propaganda on the WWW as there is Leftist; and the class division, which made computers accessible to the professional class while working class people hardly knew what they were, has had grim consequences politically.

But it is in education and the media that computers have had by far the greatest social impact. How terribly insecure and impermanent our young people must feel, seeing yesterday’s most glamorous technologies thrown on a junk-heap of the obsolete and over-costly. For the educated and elite classes, it is Brave New World — light, sexy, and scientific. For the working poor, it is 1984 — bleak, frightening, and dictatorial. Meanwhile, there is a growing incidence of irrational violence and other destructive behavior directed not at the system, but at anyone and everyone within reach.

These two dystopian “flavors” — entirely unwanted, and officially disavowed — have nevertheless come to dominate our national consciousness. In spite of all the technological savvy, greed, and ambition, there is very little critical thinking, or what the Right calls “Secular Humanism” (academic, scientific, progressive philosophizing), going on in either our public or private media and education systems.

So what is our new paradigm of cybersocialism going to look like? And what sort of education systems and media will it foster and maintain? What I discovered as an economics student (most interested in the history of economic thought and comparative economic systems) is that the science of economics has defined its own boundaries so narrowly that it is up to social philosophers (who, one hopes, thoroughly understand economic theory) to actually define the ends of human civilization, and the means to attain them. Thus, any sort of economic system is essentially meaningless and irrelevant unless it reflects a deeper set of social values and (natural) scientific understanding. That’s why the work of people like Noam Chomsky and other progressive, humanistic Leftists is so valuable, and so under-appreciated in our centralized, totalitarian corporate state.



The Alien Contingency (1999, 2001)


This essay reflects a lifetime of being interested in science fiction and the possibility that alien species from Outer Space have visited Earth and influenced elections and other human events….   When I started putting together a book, “A Young Person’s Guide to Life, Love, Art, and Philosophy,” this was one of the first topics I wanted to present.  I’ve updated a few of the references in here, but it’s basically the same as what I saved in 2006.   I’m posting individual chapters, here, in case the book is never finished or published….  Paul Stephens, June 28, 2017

The Alien Contingency (1999, 2001)

In arguing with “true believers” about flying saucers, alien invasions, and a future in which interstellar wars between different species and cultures may come to pass, a number of interesting contingencies arise. Personally, I acknowledge the possibility that alien cultures, more advanced in science and technology than our own, have visited earth, and may still be here, observing and altering the course of human history in various ways.
What probability do I place on this being true? At present, I conjecture that there is about a 1 in 7 chance (roughly 15%) that this is true. There is also the possibility that life on earth was “seeded” here by aliens millions or billions of years ago, rather than having spontaneously evolved from materials indigenous to earth. The second conjecture depends on the first, but the converse is not the case. Life may have evolved independently on earth, and only recently come to the attention of extra-terrestrial species who were attracted by our nuclear explosions or high-power radio transmissions (mostly television and radar, which could have already traveled about 60 light-years into space).
This estimate, entirely free of scientific evidence or of statistical or other mathematical derivation, is a very rough guess based on my reading of science and science fiction, and on the credibility of the evidence which various people put forward to “prove” either that aliens are here, or that they couldn’t possibly be here. No doubt the estimate will seem overly generous to many professionals in the field (even those who strongly believe there is life in other solar systems are very doubtful that it could ever have traveled to earth), while it will seem drastically conservative to those who are totally convinced of an ongoing alien presence.
If I had ever personally observed flying saucers, or had the experience of alien abduction, I would undoubtedly give the alien presence a much higher probability, which is why I stress that this is a personal estimate, based upon my own personal experience. Tens or hundreds of millions of dollars are presently being spent on such projects as SETI (the “search for extra-terrestrial intelligence”), although most of it is now privately funded, since Congress decided this was an unconscionable waste of the taxpayers’ money. It seems unlikely that such projects would continue if it were already known with certainty that extraterrestrials are here, and that our government is spending billions of dollars studying, preserving, and reverse-engineering alien spacecraft and other technology. Yet, there are many highly educated and experienced people who will claim that this is true. I hope to include some comments or analysis from such people in this chapter. [See attached PDF – AlienConting-ET&Sci-Tech]

What I am wondering, and intend to explore in this essay, is whether or not the alien presence is a contingency which is taken seriously in national security planning. It drastically changes the nature of defense policy, for example, if we are to prepare for an alien invasion, rather than merely an attack from some other human nation – no matter how large and powerful it may be. In fact, it changes the whole picture so drastically that it seems clear either that nothing is being done about this contingency at all, or that it totally overshadows all other defense considerations.
For example, even though space-based defense systems, large nuclear arsenals, and sophisticated laser and other “ray gun” or electro-magnetic-pulse (EMP) weapons are totally useless (and potentially suicidal or “doomsday”) defensive weapons if deployed against other human nations, they may be entirely rational and necessary if deployed against an impending alien invasion. Of course, it would be up to the defense authorities to somehow prove or demonstrate that such an invasion was impending – something they are neither willing nor able to do, at least publicly. Yet, it wouldn’t surprise me if a great deal of military policy and research, development, and procurement of advanced weapon systems is actually based on this idea.
On the other hand, it is virtually certain that we will eventually have a global nuclear war – with nuclear winter, destruction of the ionosphere, a proliferation of mutations and the degeneration of the gene pool (both human and for other species) along with other genuine threats to our very existence as an ecosystem – unless we dispose of our nuclear arsenals before that happens. It is almost as certain that we will have plagues and famines resulting from biological warfare and genetic engineering killing billions of people unless we control these technologies along with nuclear weapons. Further exponential population growth and environmental degradation also threaten our health and well being in a much more obvious and direct way than any potential alien invasion.
Yet, even these more tangible and human-caused threats seem to be largely ignored or denied in our real-world policy-making efforts. And it is a non-partisan or bi-partisan kind of denial. When Democrats controlled the Presidency and Congress, they were no more interested in ending the Cold War, disposing of nuclear weapons, and removing the causes of war than the Republicans are, today. There are too many jobs, corporate profits, and “national security” issues involved to let this happen. Indeed, whatever programs and treaties once attempted to address them have been put on the back burner. They have become politically inexpedient, and any major candidate who focuses on them is likely to lose the next election, largely because of negative news coverage and biased polls which misrepresent the issue and then misinterpret voter interest in it. (See the chapter “Fads and fallacies in the name of Democracy”).
Better, perhaps, that we consider alien invasion as a kind of generic disaster scenario. At least it has a following in Hollywood – something the anti-nuclear movement hasn’t had since Jane Fonda retired. Perhaps, as Jung suggested, alien contact is only a metaphor for a human-caused crisis (the nuclear arms race) or disasters. In a similar fashion, George Lucas’ Star Wars became a generic term for space-based weapon systems and a revival of the nuclear arms race among human nations. As long as we keep thinking in these terms, and each successive generation is imprinted with images of space flight and combat with alien species, we will continue to spend the billions of dollars and risk billions of lives in an enterprise which is so stupid and improbable that we can hardly conceive of its actual costs and consequences.
Whatever the case may be, a 15% probability of an alien presence is not enough to focus a nation’s energies to meet that threat. But it is certainly worth a lot of research and contingency planning. The chances of a serious nuclear accident, in any given year, are supposed to be about 1 in 50 (historically, there has been about 1 catastrophic accident every 10 years, any one of which could have precipitated some larger nuclear exchange in one of many “doomsday” scenarios). The chance of a nuclear war being started more or less by accident is approximately the same. For some of us, this is far too great a risk to take, when the consequences may include the extinction of the human species.
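The way a “small” annual risk compounds over decades can be sketched with a bit of arithmetic. This is purely an illustration: the 1-in-50 annual figure is simply the rough estimate quoted above, and the assumption that each year’s risk is independent is mine, not the text’s.

```python
# Illustrative sketch: how a small, independent annual risk compounds.
# The 1-in-50 annual probability is the essay's rough estimate; the
# year-to-year independence is an assumption made for illustration.

def cumulative_risk(annual_p: float, years: int) -> float:
    """Probability of at least one event occurring in `years` years."""
    # P(at least one) = 1 - P(no event in any single year)
    return 1 - (1 - annual_p) ** years

if __name__ == "__main__":
    for span in (10, 25, 50):
        print(f"Over {span} years: {cumulative_risk(1 / 50, span):.0%}")
```

On these assumptions, a 2% annual chance compounds to roughly an 18% chance over a decade and nearly two-in-three odds over fifty years, which is why a per-year risk that sounds small can still be judged intolerable when the stakes are extinction.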
A similar logic could apply to the presence of aliens. Even if there are some, the chances that they are hostile and mean to destroy us are much less than the probability of their existence and presence here. Perhaps there is a 1 in 100 chance that we are actually being threatened by hostile aliens. Or 1 in 10,000. Without concrete, reproducible evidence that we actually face such a threat, any calculation of probabilities is sheer guesswork and speculation. If there are such beings here, it would seem prudent to treat them as though they were welcome, and to attempt to establish peaceful relations with them, rather than immediately assuming they are some sort of hostile force. But that would be a strange reversal, at least for American diplomacy and “foreign policy.”
One can’t help but wonder whether the nuclear arms race, itself, has been stimulated by a fear of alien invasion. For those who were fans of the short-lived television series Dark Skies, or of more successful series such as The X-Files or Earth: Final Conflict, these contingencies have already been worked out in fiction. Dark Skies interpreted “post-Roswell” history as being largely the consequence of alien infiltration and intervention. President Truman, according to this account, started the war with the aliens by refusing to surrender unconditionally in a meeting at Roswell, and then ordering the alien emissary’s ship destroyed. Thus began an alien infiltration into the highest levels of government, and the show subsequently dramatized the Kennedy assassination as being alien-caused. Finally (the show was cancelled after the first season), the aliens were portrayed as being at large within the government as a kind of CIA-like underground, manipulating government policies at home and abroad. A similar conclusion was reached in the X-Files TV series and subsequent feature films.
This is the issue which concerns us, here. What political methods and policy options are available to those who would resist such an alien takeover? In Earth, Final Conflict the Taelons attempt to peaceably acquire a benign and wisely scientific hegemony over earth. Unfortunately, they, too, are threatened by another species in an intergalactic war, and there is much deception and intrigue in the effort to recruit human soldiers and adapt them to warfare in interstellar space. At first, humans are genetically altered to be “companions” to the Taelons. Later, under a secret program which the “good Taelon” Da-an opposes, large numbers are altered or created as warrior clones and other secret operatives. A resistance grows to overthrow the Taelons, leading to a situation where earthlings and resistance leaders are making contact with another alien species at war with the Taelons. A few years later, it is the “Twilight of the Taelons”, who are running out of their vital energy, and threatened by yet another alien species.
All this makes an interesting story, but it is little different from H.G. Wells’ War of the Worlds written nearly a century ago. It was viruses or other micro-organisms which destroyed the Martians, and we would probably do just as well to rely on such natural obstacles to alien colonization, ourselves. In any case, there is no use planning for such contingencies in the absence of any evidence that they might happen. It all begins to sound like the ritual to keep away elephants, which was later judged a success due to the fact that no elephants have appeared.
If we can solve our very human problems of the nuclear arms race, environmental degradation, plagues (whether human-caused or “natural”), and overpopulation, we will be doing very well, indeed. But it wouldn’t be the first time that preparation for a war which otherwise would never have happened has been the cause of war, genocide, and otherwise undreamed-of devastation and human suffering.

Aliens, 9-11, and the War on Terror

The text of this chapter so far was written in about 1999. Now, it is September, 2001, and we are experiencing the immediate aftermath of “the attack on America” by 19 Islamic “terrorists” who commandeered four airliners, crashing two of them into the World Trade Center towers and one into the Pentagon. The film Armageddon has been released in the interim, in which a number of asteroids and large meteors hit the earth. In some of the scenes, skyscrapers in New York City are destroyed in a vision so much like what we have just seen on the nightly news that people have remarked about the similarity. This has nothing to do with aliens (we’d have to go back to the War of the Worlds film from the 1950’s, in which several Los Angeles landmarks are destroyed by alien ray-guns, to make that connection), but the fact remains that science fiction often seems like prophecy when compared with current history.
Nearly every science fiction plot, today, has something to do with aliens. It’s the one unifying theme, and the source, perhaps, of most of the popularity of this genre. What is the significance of this fear of aliens? Is it merely a transposed fear of the outsider, the foreigner, and thus a kind of xenophobia? Now, we actually have legitimate sciences concerned with this issue – exobiology, and the like. We routinely sterilize spacecraft and quarantine their crews, lest they bring back some alien microbes which could wreak havoc on our bodies, since we have no immunity or resistance to such organisms.
Paradoxically enough, Congress and the Defense Department have eliminated nearly every kind of “search for extraterrestrials,” flying saucer research, or whatever. Yet, many professional, experienced and legitimate ex-government or military people claim that “Area 51” in Nevada (said to be the repository of alien remains and technology) exists, and that much of the recent explosion in advanced technology was obtained by “reverse engineering” flying saucers and other alien machines. The unwillingness of governments even to admit the possibility of this alien presence may be due to political pressures of various kinds, or simply to the requirements of military secrecy. If there actually were such programs and the alien presence they imply, wouldn’t disclosure cause widespread panic over the government’s failure to protect us?
This is one of those issues which will not be resolved by any sort of idle speculation, no matter how sophisticated. Either there is an alien presence and the government knows about it, and is keeping it a secret in order to use these technologies and the threat of hostile alien action to further restrict our freedom and well-being, or it is being made up by some sort of cadre of people, in or out of the government, who hope to stampede the country into war, dictatorship or other harmful outcome. Who knows?
My position, and that of this book, is that we don’t need a “national security state,” and that creating one acts as a self-fulfilling prophecy, creating the very threats which the national security state is supposed to counter. Whether the imagined “enemy” is aliens, terrorists, or “godless communists,” the consequences are pretty much the same. We lose our political self-determination, our individual rights, and our freedom of expression and association, while our national wealth and human resources are channeled into weapons of mass destruction and a militaristic regimentation which has by now become entirely politically and socially respectable. For the past year or two, and especially in recent days (following “the attack on America”), I have witnessed an astounding change in young people. These changes include a surge in enlistments in the military, a marked tendency to become politically right-wing and nationalistic, and a willingness to believe implicitly whatever we are told by the media and those in authority.
For those of us who came of age during the Vietnam War and the social revolutions of the 1960’s and 70’s, this is inconceivable, and it is for this reason, primarily, that I have undertaken to write this book. We must continue to “Question Authority!” and otherwise maintain our mental, spiritual, and material freedom and independence from government and the agencies of coercion and repression which are constantly being expanded and marshalled against us. We will have more to say in other chapters about the education system and the co-option of the youth culture, and their roles in creating this vast reversal in youth consciousness.

Objectivity… The Latest Crisis

(edited 6-26-17)

For anyone possessing a philosophical turn of mind, it is obvious that there is something radically wrong with the way we think about social and political issues. Denial and blame have replaced most enlightened and enlightening discussion in the political realm, while the social sciences, although often sophisticated and meaningful, are rarely applied effectively within the constraints of organized government and a political process which is largely poll-driven and subject to control by a few large media conglomerates.
The fact remains that few people in public life even make the pretense of being fair-minded or “objective” in their thinking and policies. It is as though objectivity, itself, is no longer a valid goal or principle, probably because of the abuses to which this concept or criterion has long been subjected. In Soviet Russia, the term “objective” became a kind of buzzword meaning something like “material reality” or “the facts, themselves, independent of analysis.” Something of this idea was imported into American thinking via the emigre novelist-philosopher Ayn Rand. Although largely discredited as an academic philosopher, her influence is still pervasive and must be taken into account in any understanding of the contemporary American political/social status quo.
Ayn Rand, of course, is the originator of a copyrighted “philosophy” known as “Objectivism.” After studying and learning the principles of this closed and rigid system, most people found themselves torn between their love of freedom, order, peace and prosperity (all of which, of course, Objectivism promises to deliver) and their distaste for the actual principles and reasoning which Ayn Rand offered them. Objectivism was actually a useful part of many people’s education (myself included), and Ayn Rand was neither a bad thinker nor a bad writer, but only a somewhat warped personality. She lacked the one essential quality of a philosopher — dispassionate objectivity — and with it, some essential human qualities, including simple kindness and some measure of reverence for life, nature and the larger universe. Because she grew up and spent her formative teenage and young adult years in the Russian Revolution and its aftermath, her hatred of Marxism and Communism was automatic, and her distrust of government in general followed close behind.
Her anti-authoritarian arguments are as powerful and absolute as any which have ever been made, rivaling John Stuart Mill’s and Kropotkin’s, and in many ways resembling Herbert Spencer’s and the other Social Darwinists’. Once convinced of them, as I was in my own formative teenage and young adult years, one is not likely ever to change or abandon them. Much of contemporary libertarianism and even anarchism is directly attributable to Ayn Rand. That she was not an explicit anarchist is most likely due to her experiences during the McCarthy Era, and the fact that she perceived a total breakdown of authority as presaging something like the Russian Revolution and the resulting bloodshed and terror. She was already a Republican in the 1940 election, having worked for Wendell Willkie. Yet, if one takes literally her ideas of the sovereignty of the individual, and the necessity of a social contract, renewed and affirmed by each successive generation (and each individual therein), there is really no place for a sovereign state in Ayn Rand’s system. Much of the confusion and distress which her philosophy has caused has been based on that particular contradiction.
In concrete form, we have witnessed since the Nixon Administration the absurdity of an Ayn Rand disciple, Alan Greenspan, advising Republican Presidents and finally presiding over the funny-money system of the Federal Reserve, while Objectivism advocated a gold standard, no central bank, and certainly no Federal Reserve System, as such. How Mr. Greenspan reconciles this particular contradiction in his own mind is, I suppose, his business, but our next President would do well to find someone for these jobs who is not burdened with so many supposed misgivings.
The main issue — the discrediting of the very idea of “objectivity” because of Ayn Rand’s advocacy of it — is a much deeper and apparently insoluble dilemma. Either we dismiss Objectivism, and with it, objectivity, as being irrelevant or counter-productive, or we “buy into” Ayn Rand’s total world-view, and with it, a total rejection of the Welfare State, welfare as such, and virtually all “interventionist” government, including subsidies for the arts, protection of the family farm, a mercantilist trade policy, the role of the world’s policeman (something which Ayn Rand seemed to support, so long as her former tormentors were still in power in the Kremlin), the protector of the environment, etc., etc.
The peculiar thing about Ayn Rand’s philosophy (and its attractiveness, obviously, for Mr. Greenspan and his ilk) is that it is emphatically pro-business! No other philosophy has so glorified the power of money and the nobility of counting and accumulating it. Ayn Rand explicitly opposed the idea that money or the love of it is the root of all evil. As a novelist, she wrote basically the same book three times, but her heroes evolved from engineers, architects, and creative artists, none of whom cared a damn about money, into the financiers and industrial magnates of the 1890’s, transposed into the 1950’s and beyond. The main hero in Atlas Shrugged, to be sure, was an inventor (a physicist-philosopher of Irish antecedents) who never seemed to want to capitalize on his genius, and thus eschewed conspicuous consumption, but he chose his friends exclusively on the basis of their material success.
For those having some background in Marxist thought, or the persistent Russian view of capitalism, all this fits into place. The real motivation of the Soviet leaders since Lenin’s day has been to make themselves into the “captains of industry” which they imagined Western capitalists accomplished by a similar process — first, by taking over the government, and then by planning (or conspiring, depending on one’s point of view) to make themselves rich and powerful at the expense of everyone else, while at the same time making the country rich and powerful, as well.
The dialectical process is also very evident in Ayn Rand’s life and career, as is the principle that ideology reflects one’s class affiliation. As a poor, struggling writer, Ayn Rand’s heroes were poor and struggling. As a wealthy, upper-East Side “novelist-philosopher” and successful writer of screenplays, Ayn Rand’s heroes tended to occupy a similar territory, and her ideas went from opposing tyranny and oppression to attempting to gain the instruments of tyranny and oppression for themselves and their friends.
I wonder that there has never been a “Marxist Critique” of Ayn Rand. The reason is probably that Marxists were quite happy with her work, while it was the real anarchists, libertarians and advocates of a free society who attacked her most vociferously. The history of this movement and its opponents is a book or two in itself, but much current history is attributable to it, whether or not one agrees with the Objectivist agenda. It is safe to say that the Libertarian Party owes its existence to Objectivism, and so does the transformation of the national psyche into an obsession with economics — the baby-boomers and their “counterculture,” the yuppies with their snobbery and (upper) class-consciousness, and the continuing superstition that the United States is “the greatest country in the world,” “a beacon of freedom and opportunity,” etc., etc. For such a radical individualist, Ayn Rand had a peculiar knack for playing to the crowd.
I hope the reader will understand that I am viewing this with a sense of wonder, not hatred or contempt. Give her credit for beating the game, even while she committed suicide with cigarettes. This is another example of the denial which seemed to dog the footsteps of the Objectivist “inner circle.” She will be remembered, while other 20th century intellectuals — more “respectable,” today — are long forgotten. Many scientists revere her for her advocacy of the scientific method and the purity of scientific knowledge. Most scientists, after all, do profess to believe in an objective reality, and practice a generic form of “objectivism” in their daily lives. Computers are likewise an “objectivist” factor in our cultural evolution — a profound “reality check” for those of us who’ve learned to use them and to think logically.
Has anyone else noticed that Objectivism (the movement) actually may constitute a kind of Marxian synthesis between State socialism and capitalism? The two are actually shown to be the same. Both are “scientific” societies, technological, dynamic, change-oriented, ruled by the strong and able, etc., etc. Both are actually quite “macho” — a rather peculiar position for a woman philosopher to take!
I remember agonizing discussions among college-age disciples of Ayn Rand in the 60’s about whether or not she hated children (since none of her major characters had them), or what the Objectivist position on homosexuality might be. What are now called “family values” got relatively short shrift in the Objectivist system. Intelligence, however, was practically divine, which proved quite an attraction for the highly-gifted who had gotten little recognition or approval, elsewhere. Some of them seem to have felt they could never be grateful enough for this special recognition.
At this point, I believe that a new interpretation — a new dispensation, if you will — of the Objectivist faith is in order. We need to talk about concrete social and political principles — methods by which the real-world problems of today may actually be dealt with. Unless we expect to “bottom out” with some total collapse of the government and social order, and then to be led by the wise and good to the foundation of a better State, we’d better start breaking the free-fall and start the climb back out of the hole we’ve dug for ourselves over the past 30 years. We’d better figure out who knows what they’re talking about, and listen to them.
Ayn Rand’s stuff is mostly of academic interest, since she didn’t follow her own principles, herself. Like most of us, she was somewhat confused in many ways, and unable to reconcile the reality “out there” with her own finely-developed instincts and prejudices.
It would be easier for us, now, if Ayn Rand had never existed. She actually contributed nothing of value except a vision of what the future might be, and what sort of people might lead “us” (not the masses) there. Political leadership or success in statecraft were not on her agenda, and those who followed her rarely, if ever, changed their mind about the general worthlessness and incompetence of government to serve important human needs and aspirations.
Perhaps that’s the really harmful legacy of Ayn Rand: the conviction that government is “the enemy” and doomed to be the province of the corrupt and incompetent. For those of us raised on the American frontier, with strong traditions of populism and participatory democracy, civic virtue and public recognition and esteem for the virtuous, this is a heresy of unspeakable danger and destructiveness. Our pioneer system was practically anarchism by today’s standards and definitions, but it worked, and we thought of it as “good government” and “reform.” It was the Eastern monied interests who played the bogey-men — the very “heroes” and “heroines” whom Ayn Rand glorified in her magnum opus, Atlas Shrugged.
Beyond the idea of community — a concept quite alien to Ayn Rand in any case, although some have considered her positively utopian in her creation of a higher social order based on “reason, purpose, and self-esteem” — there is the international dimension. Ayn Rand seems to have caught on in other countries, too — most notably the Netherlands, Australia, and South Africa. One wonders if she might not have a following, today, in Russia! Or what might have happened to the family of Alisa Rosenbaum in the Stalinist purges? What might be Ayn Rand’s role in (or attitude about) the ending of the Cold War, if she were alive, today?
Or did it end? Are we not about to see a transition to an alliance of the left and right wing nationalists, again? Will the Communist Party have a resurgence, once the bastard form of gangster capitalism which has taken over, there, has run its course? Has everyone unlearned all the Marxist history and theory they’ve been taught over the past 70 years? It seems unlikely. Perhaps we’re still operating under the false assumption that Marxists are stupid, and that all Marxist thought is now discredited and safe to ignore.
In fact, it seems to be thoroughly vindicated in many respects. We needn’t see it as a threat. Marxists aren’t intellectually invincible, but often they are found to be much more liberal and realistic in their thinking than their bourgeois colleagues.
And what about spirituality? Here is another concept which Ayn Rand appropriated and turned to her own advantage. It’s much more common, nowadays, to think in terms of a spiritual dimension than it was 30 years ago. The purely materialistic vision has not been at such a low ebb since 1814. Even scientists have become mystics, if only because life and society are ultimately mysterious. I would venture to say that great scientists are far more playful and imaginative than political leaders, and far more effective, too, in altering the course of history. Like their medical colleagues, they have found it very profitable to be smart and specialized. But there is still plenty of room for the humanistic generalist, or pure research scientist. They are treated with a mixture of awe and wonder.
In short, the scientific community, broadly construed to include such fields as “economic science”, cybernetics, and psychology, has the best hope of saving us — our once peaceful, harmonious society — from self-destruction. Just as democratic values and an international esthetics are becoming universal, we seem to be locked into a paranoia of “us against them.” Who is them? Them are us! “We have met the enemy, and he is us!” as Pogo so eloquently put it.

From Objectivism to Technocracy

The impetus to technocracy is still seen to be a threat, and in many respects, it is, leaving us with something like a military-driven economy in every case, whether it be Nazism or Cold War communism mirroring our own military-industrial complex. There must be other ways to do it, and it needs to be decentralized, empowering, humanistic, and even holistic (i.e., respectful of whole systems and interdisciplinary concerns) to be viable. It’s very difficult to run a country, an economy, or an ecosystem by means of “scientific” planning and centralization of control functions. In fact, we can say it is impossible, if our goal is to approach some optimal social model. Total Quality Management, other “excellence” theories, and the real-world practices of different kinds of enterprises have clearly shown that the old-fashioned democratic, community-based, cooperative ownership and management systems work the best.
It is important, here, to say something about goals and the kinds of organizations which can meet certain kinds of goals. Nearly always before in history, the ultimate test of a nation-state was its success in military combat. If it defeated its rivals in battle, it could dictate terms, and more or less impose its own vision of reality (and its own form of economic success) on the rest of the world, who might then be robbed of their land and natural resources or forced to pay some other kind of tribute. The ethical principle involved is an abomination: namely, “I beat you up, so now you have to do what I tell you to do.”
Among “civilized” people, the question then becomes, “Who started it?” And this question can go on forever. Ultimately, it doesn’t matter. What matters is the rules, and the responsibility of each nation to police itself so that it doesn’t develop some idea of power and conquest, and march forth to do damage to some other nation.
In reading about the origins of World War I in Barbara Tuchman’s The Guns of August, it seemed to me that the greatest single cause of this (and probably any other) war was the opposition of various absurd notions of personal honor and chauvinistic “team rivalry” — much like today’s high school or college sports teams. One can say that all wars ultimately involve conquest of land (and its redistribution among the victors) and other resources. What was clearly learned from the Treaty of Versailles is that one sophisticated nation cannot morally blame a defeated one and exact reparations from it. It’s like making the losers pay court costs along with whatever damages or punishments are inflicted on them. Far better to have the winner pay them, for that nation can afford to do so, and is benefiting in many other ways from the victory. This was our policy following World War II, and in most respects, it was an unqualified success.
The failure of the United States and Soviet Union to maintain their hegemony over the rest of the world is due largely to the fact that both nations are founded on anti-imperialistic principles, and dissidents in either nation would never let their governments forget this basic contradiction in their behavior. If each nation can finally begin to look outward and reconstruct itself according to emerging higher standards of planetary consciousness, we might save ourselves, and pool our resources towards positively reconstructing our less fortunate, poverty- and violence-ridden neighbors, wherever in the world they might be.
It is time for a sophisticated ethics of international brotherhood, voluntarism, and mutual aid. Let us be done with the empires, and begin again to live as human beings in a common habitat on planet Earth. These are the “objective conditions” of our present social existence, and when we finally acknowledge and accept them, we may begin to re-think our political and social reality, and alter it to fit the humanistic and ecological imperatives which are becoming ever-clearer. If “human life on earth is the ultimate value”, as Ayn Rand maintained (in the non-gender-inclusive form “man’s life”), then even her version of “objectivity” has something to offer, and we can proceed to work together to improve the quality of life for everyone.
Paul Stephens