

The Impact of the Microprocessor,

or,

Is 1984 Here Two Years Early?

by Dennis Báthory Kitsz

1982


Note: This talk was presented before the Rensselaer ACM in 1982
Copyright ©1995 by Dennis Báthory-Kitsz


The microprocessor could be the most significant invention -- in concept -- since the discovery of the wheel and the development of written language. Consider: the wheel by itself is nothing more than an object, a meaningless slab of tree trunk or stone. But by applying its shape to tasks of motion, humankind liberated itself from the physical limitations of the body. The world could somehow be bent to the human will. War was possible. Likewise, by itself, written language is little more than a series of scratchings on rock, or impressions in a tablet. But its discovery changed not only the time and distance across which communication was possible, but the nature of communication itself. History began. Immortality was present on earth.

Since then, there have been the discoveries of all the basic tools, different methods of writing, and a long period of application. The wheel has grown in variety, extent and conception from a prehistoric cart to both interplanetary exploration and Cruise missiles. Written language has developed from simple expressions or monetary notations in clay to both the multilayered poetic masterpieces of T. S. Eliot and governmental bureaucratese.

The microprocessor is the first tool which is at once both wheel and writing. It is a general-purpose tool designed to understand human communication. It is small and eminently portable -- a human-sized tool. It is a creation which melds humanity's two greatest inventions -- and carries with it all the potential wonders and potential terrors of any monumental human development. It must merely be applied to its task.

Since we are present at the infancy of microelectronics, the questions of microprocessor applications and their impact make up my topic tonight. As an author, writing about microcomputers, I get a glimpse of the thousands of interests and diverse applications of my readership. As a composer, I feel I am sensitive to a side of human experience often left out of the curriculum of the sciences. That side of me immediately reacted to something I saw this afternoon ... RPI's deification of its largest computer by stripping the insides of a Gothic chapel to install it -- a chapel with its architectural harmonies shattered by stark lines and its acoustics violated by clicks of keys and beeps of irate computers.

As a hardware and software designer, I believe I understand both the limitations and possibilities of these devices. And finally, as somewhat of a political activist, I become very concerned with what I see, and I want to share those concerns with you.

The questions that have come to my mind over the past few years are these:

How does microprocessor technology affect our work ethic now, and how will it in the future? This is a philosophical question, but it bears directly on the practical question: how will microprocessor technology affect jobs? Does the computer community have any responsibility in providing guidance for our nation during a high-tech transition?

Secondly, in what way has the microprocessor begun to affect the face of the arts? In other words, in its presence everywhere from the new Lucas film editing studios, through digital music synthesizers and word processors, to computer-generated visual imagery, does microprocessor technology impact -- that is: clarify, alter, or blur -- the definition of humanity, the nature of humanness, and the community of humankind? I will not address this question tonight, but I feel you should consider it as I present other ideas.

Next comes the question, has the appearance of the microprocessor encouraged the world of George Orwell's 1984? If you recall, Orwell's conception embodied something very important that our more generalized remembrances of the book often leave out: that is, the people of Orwell's world by and large accepted that world. They accepted Big Brother, they understood Newspeak, and they did not rail against the rewriting of history. It was a world where contradiction -- war is peace, freedom is slavery, ignorance is strength -- was the norm. It was a medieval concept, and there is no guarantee against its return.

On a more immediate level, has the microprocessor become the tool of new imperialist colonization? How does high technology affect our relationship with other nations, particularly our relationship with the Soviets and Eastern Europe? How has the technological explosion in the United States affected our politics in, for example, Korea, the Philippines, and El Salvador? How has the existence, and perhaps future dominance, of high technology in the West affected our perspective on the Third World? And, even more importantly, how has it affected their perspective on us? What is happening right now?

And finally, an almost bizarre question: what is the role of the microprocessor after a nuclear holocaust, if there is an "after"? How could the survival of a single personal computer (and its owner) change the planet's destiny?



First, let's look at what a microprocessor is, how it works, and the state of the art.

A microprocessor is a general-purpose electronic calculation, comparison, and storage device capable of high speed and inexpensive manufacture. Most important, its operation can be changed by programming, the process of providing ordered instructions and data to perform tasks. Singly or in groups, microprocessors can be programmed to perform or direct the performance of virtually any type of human labor.

Most human labor -- not creative activity, but labor -- is repetitive, progressive, or physical with only minor decision-making. That is, repetitive labor is a continuous string of identical processes, such as filling bottles of nail polish, cutting and stitching shirt sleeves, or digging a series of trenches; progressive labor is somewhat developmental in the short term but still repetitive on the larger scale, such as assembling vehicles and televisions, creating maps, engineering drawings and circuit boards, or washing clothes; and labor with minor decision-making involves mining, stocking store shelves, and cleaning contaminated areas such as the office building in Albany or the Three Mile Island nuclear reactor.

Mechanical devices have long replaced human labor in filling bottles of nail polish because mass-marketed nail polish made it economical to design and build a reasonably complicated mechanical device whose only job was to fill bottles, cap them, and usher them along into paper boxes and cardboard cartons. A million bottles later the machine might pay for itself. But should you need to fill and box, say, cherry pies, this machine would be useless. Not only would the size of the material be different -- nail polish vs. cherry filling -- but the process would change. Complicated mechanical cam timers, linkages, gears, valves and pistons would need to be replaced. The pie needs a crust pressed on, not a cap screwed on.

The microprocessor, on the other hand, is fully programmable. By creating a sort of mechanical language, the programmer can make the bottle-filler change into a pie-filler with only the alteration of the software and a few filling pipes.

Why can the microprocessor do this? It is because the microprocessor, though not intelligent by itself, is fast enough to check continuously every one of hundreds of parameters in the bottle or pie filling process: whether the bottle is there, how to position it, whether there is enough nail polish left, and so on. In other words, the processor is being more repetitive than any human operator -- who after a time uses hearing, touch, sight and smell to judge the state of the assembly line -- could ever be.
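
To make that concrete, here is a minimal sketch, in C, of the kind of polling loop such a controller runs. The sensor and actuator routines -- reservoir_low(), bottle_present(), dispense(), and the rest -- are hypothetical names standing in for whatever inputs and outputs a real filling machine would provide; the small simulation behind them exists only so the sketch can run.

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical sensor and actuator routines. On a real machine these
   would read and write input/output ports; here they merely simulate
   a small reservoir of nail polish so the loop has something to do. */
static int polish_remaining_ml = 45;

static bool reservoir_low(void)   { return polish_remaining_ml < 15; }
static bool bottle_present(void)  { return true; /* conveyor keeps feeding */ }
static void position_bottle(void) { printf("positioning bottle\n"); }
static void dispense(int ml)      { printf("dispensing %d ml\n", ml);
                                    polish_remaining_ml -= ml; }
static void cap_bottle(void)      { printf("capping bottle\n"); }
static void sound_alarm(void)     { printf("reservoir low -- call the operator\n"); }

int main(void)
{
    /* The controller's whole job is this loop: check every condition,
       over and over again, far faster than any human operator could. */
    for (;;) {
        if (reservoir_low()) {
            sound_alarm();
            break;                /* in a real plant: wait for a refill */
        }
        if (!bottle_present())
            continue;             /* nothing under the nozzle; look again */
        position_bottle();
        dispense(15);
        cap_bottle();
    }
    return 0;
}

Turning this bottle-filler into a pie-filler is then largely a matter of swapping a few of those routines -- a hypothetical press_crust() in place of cap_bottle() -- and changing some constants, rather than rebuilding cams, linkages and valves.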

How fast is a processor? One classic example is the Intel 8080, introduced around 1974. Its external clock, that is, the string of pulses which trigger its operations, ran at one million pulses per second -- 1 MHz. Each complete instruction could take from four to ten of these cycles, meaning a programmer would have at least 100,000 instructions to play around with every second. Complicated timing and evaluation tasks would use up a lot of these instructions in loops, but nevertheless there are still thousands of instructions left.

In 1974, this processor cost $375. That $375 bought a lot of power, considering that mechanical timers cost thousands of dollars. But the "cost-effectiveness" implications are even more astounding when you consider the state of the art in 1982: Advanced Micro Devices has a processor, now being shipped in limited quantities, called the 29116. It uses one clock cycle per instruction and runs at 20 MHz. That's better than 150 times the computing power for the same price. Essential to a microprocessor system is memory, but the prices have tumbled so far that this consideration almost drops into irrelevancy: 256 bytes of memory cost $10 eight years ago; 64 kilobytes now costs under $7 ... one-seven-hundredth the cost.
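
The arithmetic behind that comparison is simple enough to check. Here is a small sketch, using nothing but the clock rates and cycle counts just cited -- it ignores the two chips' very different architectures, so take it only as a rough bound.

#include <stdio.h>

int main(void)
{
    /* Figures as cited above: an Intel 8080 clocked at 1 MHz, taking
       4 to 10 clock cycles per instruction, against an AMD 29116
       clocked at 20 MHz at one cycle per instruction. */
    double i8080_slow = 1.0e6 / 10.0;  /*    100,000 instructions per second */
    double i8080_fast = 1.0e6 /  4.0;  /*    250,000 instructions per second */
    double am29116    = 20.0e6 / 1.0;  /* 20,000,000 instructions per second */

    printf("8080:  %.0f to %.0f instructions per second\n", i8080_slow, i8080_fast);
    printf("29116: %.0f instructions per second\n", am29116);
    printf("gain:  %.0f to %.0f times the raw instruction rate\n",
           am29116 / i8080_fast, am29116 / i8080_slow);
    return 0;
}

Depending on the instruction mix, that is a gain of roughly 80 to 200 times in raw instruction rate, the range in which the "better than 150 times" figure above sits -- and it says nothing yet about the falling prices of memory and support circuitry.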

What are the implications? First, some random events.

Item: A recent feature news story showed a Japanese family who created small plastic parts in their home factory. Both husband and wife worked a full day filling molds, removing parts, checking quality, and boxing the results. They purchased a microprocessor-controlled industrial robot, which now does the work for them.

Item: In Alburg, Vermont, a wastewater treatment plant in an environmentally difficult area for such treatment needs only a part-time operator because weather, soil, and other conditions are monitored by a microprocessor system. It not only treats wastewater and grows a hay crop from it, but also "learns" the weather over the long term, saving and re-averaging the data from disks.

Item: Most high-end FM receivers now offer microprocessor-controlled tuning, eliminating the need to find and center on stations.

Item: Several months ago, a Bloomingdale's advertisement in the New York Times offered a microprocessor-controlled hair remover for "getting rid of that ugly, unwanted stubble".

Item: One of my friends had used her microprocessor-controlled microwave oven for so long that she was unable to produce an edible meal with an ordinary oven when the microwave stopped working.

And an item I would add: This afternoon, I discovered that traditional drafting is no longer taught at RPI. It is in the catalog, but that is the only place you can find it on this campus. Computers have taken over the rest of the task.

I would suggest that all of these items have an eventual, subtle, inexorable effect on the American work ethic. I do not defend the work ethic, but only suggest that if the nation has matured through the work ethic, then undermining it may well be cracking one of the binders that holds our society together.

If the premise of our work ethic is that good and gain and achievement and reward can only come through effort and drive and sweat, then microprocessor -- or robot -- assistance or total replacement is in direct and serious contradiction. The likelihood is not so much of laziness or sloth, but of forced irresponsibility. Skills can be forgotten and are often hard to relearn. Who knows how to beat a rug in the age of vacuum cleaners? Or cook on a wood stove in the age of Corningware electric stovetops? Or draw water from a stream in an age of municipal water systems? Yes, all those methods are alive and well where I live in rural Vermont. But they have moved from the norm to the quaint.

The owner of our country store is young to be retired. But he was a printer in the city, and one of his tasks was to raise any bent dots on metal printing plates so the reproduced image was clear, perfect, beautiful. He's told me about that process over and again, speaking almost lovingly of it. He watched the transition from metal to lithography, from hand-set type to photographic type to automatic typesetting from the newspaper writer's own video display. He now runs a country store, having experienced not so much the collapse of his life's work as the withering of its significance.

As each task of human sweat is taken by machines, the nature of the social fabric will change, as will the delicacy with which the technology of communication, manufacture, farming, and distribution must be held in balance.

Tom Shea of InfoWorld (November 27, 1982) looks at it quite another way: "Organized religions may take on an Eastern emphasis, with the focus on turning inward. Because computers will give us so much leisure time, it will be not only possible but necessary to go on a voyage of self-discovery."

On the other hand, Paul Snigier, editor of Digital Design, in December 1980 described a deteriorating situation in the American workplace: assembly jobs will vanish; blue-collar minimal-judgment jobs will vanish; middle management will fall victim to decision-making computers; and office automation will hit hard. He fears anti-computer legislation, but states frankly, "computers kill jobs". He says, "Rather than act when the deteriorating situation has antagonized U.S. society and labor unions, we must educate our leading lawmakers. What we need now are 'early warnings' for workers whose jobs are in jeopardy. Some might suggest retraining. It's rubbish. Retraining for what? Jobs are already vanishing. Besides, after investigation and interviews with participants in existing retraining programs, my conclusion is that most retraining is a cruel joke. Retraining is a non-solution." His solution: work sharing and four-day workweeks.

Another interesting response is posited in Joseph H. Delaney's story, My Brother's Keeper, from the October issue of the science fiction magazine Analog. It presents a surprisingly typical bureaucratic response to the post-industrial society. The author describes American society 30 years from now as energy-poor, its farmland exhausted, and beset by huge social problems. The solution: "If you meet the income requirements you'll have to register. Then you get to pick your new dependents out of the pool... You'll have to support at least one person, maybe two, and give him basic subsistence. The law also requires you to attempt re-education and rehabilitation, so that your ward can get a job and get off the program."



So how far have we gotten? How much have the microprocessor and its effects -- especially the video game and the personal computer -- invaded the American consciousness? How much has been accepted, and what are those implications?

I think the newspaper comics are somehow the best reflections of social trends. First come the "intellectual" funnies, then the wide-open, mass audience types. My first inkling that the personal computer had made it in the social consciousness was a year ago in the Barre-Montpelier Times-Argus. A funny fellow by the name of Danziger produced this one ... Pop sits in the back room watching Kojak, beer in hand. On the counter is a computer with this message:

"Hi there and welcome to Mom and Pop's computerized general store -- Mom isn't here and Pop is busy. Please type in your name. ----------------------- Oh hi Fred. How is your wife? Code 1: Good. Code 2: Down with the miseries. Code 3: Compared to what? Select answer. Ha, ha, Fred, that's a good one."

Not long afterward, Garry Trudeau's Doonesbury took a shot at the home computer. Duke is on the lam, and gets a call from his contact Diaz about some drugs. Diaz complains about getting shortchanged. Duke: "Wait a minute, let me check the Apple. What's your account number with us?" Diaz: "What are you talking about, man?" Duke: "Here it is. For future reference, your number is TB765035." Diaz: "Are you crazy? You keep records?" Duke: "It facilitates billing, Diaz. Besides, in the case of a raid, I can destroy everything with the touch of a button. No more fooling around with fires or toilets. Okay, your file's coming up now... Bad news, man. We've lost ten bales in the computer." Diaz: "No problem. I'll just send a couple guys over to help you take it apart."

But for me the real kicker was a down-to-earth comic called Motley's Crew, drawn by Templeton & Forman. A few days ago, one of the characters bought a home computer. His wife calls to him, "Did you give the computer the figures on our bills and income?" He responds, off-frame, "Yeah." Another question, "And did the computer recommend a way to balance our budget?" He says, "Yeah, but it didn't tell us which bank to rob!"

What does this mean? It means that people are now and will be more comfortable with the notion of a computer, their own computer. But beyond that? If computer power is put in the hands of individuals, that's good. If people get comfortable with the machine, and comfortable with the concept, that's good. If computers allow individuals and small businesses to cope better with increasing government bureaucracy, that's good. If they can do their readin', writin', and 'rithmetic better, faster and more accurately, that's good.

So what's in question? The invisible computer, that's what. And that sense of comfortableness. And a lack of alertness and diligence.

Admittedly, computers can be fun to have around. They remember your name, some can talk, and each succeeding program vies for your attention in new ways. Madison Avenue style computing. Cars that remind you in a pleasant voice that the door is open. Electronic telephone company voices that help you through your credit card calls.

But other things are happening. Electronic banking is being pushed very hard by bank executives who claim the cost of processing paper checks is too high. General Motors is installing computers in its top-of-the-line vehicles to control the engine and monitor its efficiency. In Columbus, Ohio, an experimental talk-back terminal setup known as QUBE has been in operation for nearly a decade. The French government is replacing phone directories with terminals.

All these items have one thing in common: it's tough to unplug them. Already, General Motors is being hit with a 27-state class action suit against its "Computer Command Control" automotive computer, not only because it doesn't work as well as claimed, but also because it records and monitors -- ostensibly for warranty violations -- the number and length of time the car has been driven over 80 miles per hour. It's a short leap from warranty violations to "Okay, buddy, pull over and let me look at your speed monitor ... and your drug sensor monitor ... and your fuel consumption record ..." ... and so forth.

Once you've got an electronic phone directory in your home, moreover, you are actually wired into a central telephone office with very sophisticated electronics. Again, it's a short leap from calling up a phone or service directory to the authorities monitoring you. Sophisticated equipment is difficult for the layperson to discern, and who's monitoring whom is immediately open to question. With digital transmission, there's no way for the untrained person to know.

Let's combine these so-called services with the possibility of in-house robotics. Right now, industrial robots are selling for less than $35,000. Sears is selling smart washing machines with built-in processors. Combine Sears plastic technology with industrial techniques and -- presto -- personal robots. Not sophisticated, to be sure, but with enough capability to make them even more attractive than a personal computer. The Hitchhiker's Guide to the Galaxy, Douglas Adams' book which has been dramatized by the BBC and is running on National Public Radio in this country, even posits a future in which computers are given GPP's -- Genuine People Personalities -- and a robot is described as "your plastic pal who's fun to be with".

But that's not too far-fetched. The middle-class home of the very near future includes not only the electronic appliances of today -- microprocessor-controlled hi-fi, washer-dryer, microwave, video game, personal computer -- but also electronic banking, two-way teletext phone service, and personal robots. In this scenario -- somewhat expressed in the opening of Monty Python's movie Time Bandits -- the middle-class American becomes dependent on the devices, those who service them, and the very fact that they are there. Pulling the plug on these sophisticated monitoring devices is no more a possibility in the immediate future than cutting your electricity or phone service is a possibility now. You can do it, but it throws you back quite far into a self-sufficiency that our nation's population is hardly prepared to accept: witness the lines at gasoline stations during the fuel shortage, or the madness during urban power outages. Self-sufficient existence is for the modern American akin to savagery or barbarism, or at best the wishes of some aging commune member in the wilds of New England.

Stay with my scenario a little longer. Combine this technology with the governmental penchant for misstatement or colorful ambiguity. Combine this technology with the Communist tendency to rewrite history from a present perspective -- for example, how Stalin slips in and out of influence, or how Mao's achievements are waning. Combine this technology with the manner in which the slaughters of innocents are diplomatically explained as peacekeeping missions. Combine this technology with the manner in which more and more industrial and government secrets are being legally protected here and have long been hidden in other industrial countries. And combine this technology with the inevitable separation of average humans from the internal, sophisticated operations of the machines in their lives.

What, then? With the general acceptance of microprocessor-initiated technology, and the general acceptance of the distortions of reality introduced into political speech, history and liberty, war becomes the maintenance of peace; too much freedom comes to be viewed as slavery to the rule of chaos; and too much knowledge, seen as too broad and shallow a perspective, comes to be seen as weakening society. It is, at last, a short leap to the motto of the Ministry of Truth: War is peace; freedom is slavery; ignorance is strength. We may well be ready for 1984, right now. Our programming languages suggest a kind of Newspeak. Our attention to reality is flavored by the 15-minute commercial break.

I said earlier that this was a medieval concept. Humans have accepted such contradictions and controls before, and may again. The Crusades of Christianity were in some ways undertaken as moral war. Medieval sermons flew in the face of the visible, assessable world around them. The ignorance of the many provided a firm foundation for the nobility and stability of the few. The Renaissance, democratic liberties, and the host of developments of so-called "modern" history are again endangered by a new and different kind of priesthood, a high-tech religion.

However, Douglas Adams suggests all our worry might be pointless. In The Hitchhiker's Guide to the Galaxy, he suggests something entirely different. Arthur Dent, one of two remaining earthlings to survive the destruction of the Earth to make way for a hyperspace bypass, finds out something very interesting about white mice from Slartibartfast, the Magrathean who supervised construction of Earth's fjords ten million years ago. Slartibartfast says, "These creatures you call mice, you see, they are not quite as they appear. They are merely the protrusion into our dimension of vastly hyperintelligent pandimensional beings. The whole business with the cheese and the squeaking is just a front... How better to disguise their real natures, and how better to guide your thinking. Suddenly running down a maze the wrong way, eating the wrong bit of cheese, unexpectedly dropping dead of myxomatosis. If it's finely calculated the cumulative effect is enormous... You see, Earthman, they really are particularly clever hyperintelligent pandimensional beings. Your planet and people have formed the matrix of an organic computer running a ten-million-year research program."

Great fiction, but consider: American scientists are working simultaneously toward faster and faster processors, are working hard on artificial intelligence, and are working quickly on genetic engineering. Is it too hard to imagine such a biological computer matrix? Could artificial intelligence at last be achieved through biological storage in the human brain combined with the processor's speed and calculation ability? In fact, the way questions are handled now, but without the morality of it?



My focus tonight is really on the morality of our approach to high technology, and the responsibility of those of you involved in high technology. In my introduction, I suggested that there was a strong political element in this field. I believe there is.

I would first like to consider it from a somewhat philosophical standpoint: how far would the development of the microprocessor have gotten if our language were not alphabetic? What if it were ideogrammatic, such as Chinese or Japanese? Conversely, what have microcomputer displays, dot-matrix characters, character-set limitations, and the like done to our perception of the world and other cultures? Is not high technology in our minds inevitably intertwined with what we once called the Roman alphabet and now call ASCII? In a world of Cyrillic, Katakana, Hebrew, Greek, Arabic and dozens more, are we not indeed ASCII chauvinists?

Westerners are blessed or cursed, depending on your point of view, with linear thinking. Classical logic presents hypothesis and conclusion. Strings, equations, linearities. Our perception of other dimensions springs from this. Area is length and width; volume is length, width and depth. It is carried to its social extreme in the goal-oriented and work-oriented aspects of American existence.

Our ethic and our success as a society have engendered in us a strong sense of superiority over other cultures. They have also encouraged our desire to develop our own skills at the expense of other nations and cultures. This has been given many names: imperialism, exploitation, colonialism. We use and develop our technology, consciously and unwittingly, in continuing this imperialism or exploitation or what have you.

This is not meant to be a judgment of the American dream or a vilification of the nation's industrial and post-industrial growth. But our administration in Washington is using high-technology boycotts as a kind of political weapon in its tussle with the Soviet Union. I have had a wonderful correspondence with a textile engineer in Budapest who is in awe of American high-tech developments. He is working with the few microelectronic components he has received as gifts in order to develop automated textile machinery. In a centrally planned economy, he has even ventured into his own private business for the purpose. As a person and a technologist, I want to help; but does my government approve? Should I support its use of high technology as a weapon?

Simultaneously, our Administration is vigorously supporting repressive governments in which U.S. companies have substantial investments. I am speaking specifically of South Korea and the Philippines -- U.S. support for both governments is painful to many Americans -- and also of El Salvador. It is curious that the presence of a major Texas Instruments assembly plant in El Salvador coincides with our intervention there. I believe it is our responsibility as high technologists to think about these actions by our government and the companies we will work for or purchase from.

Let me quote from Colin Norman in the New York Times, October 5, 1980. The article was entitled The Menace of Microelectronics. "In part, the attention being lavished on microelectronics -- and the government funds being poured into its development -- are a response to international economic pressures. In an interdependent world economy, governments have little choice but to insure that their high-technology industries are in good health. If the implications are sobering for the advanced industrial countries, they are even more alarming for countries that barely have one foot on the industrial ladder. The microelectronic revolution adds a new factor to international competition that could make it more and more difficult for poor countries to close the technological gap... The nostalgic hope seems to be that microelectronics, along with other high-technology industries, will lead the way back to the golden days of the postwar era, when the world economy expanded at a rate that provided high demand that in turn created millions of new jobs. Such a development is at best unlikely..."

The key here is Norman's suggestion that these implications are "even more alarming for countries that barely have one foot on the industrial ladder." It certainly accentuates the differences between the industrialized and postindustrialized West and the Third World. But it also has permitted continuing and extensive exploitation of them by high-tech companies.

For example, Ms. Magazine was the first popular magazine to report that the use of toxic chemicals and binocular microscopes in Third World countries was not prohibited or regulated by health and safety laws. American companies build their components overseas because it is cheaper. But not only is the labor cheaper (because it is the only work around) but also the health and safety regulations are lax or non-existent. Young women are most often hired for work in integrated circuit assembly plants. After two years of working with toxic chemicals and looking through badly adjusted binocular microscopes, they are in effect discarded ... unhealthy, and with eyesight too bad to continue working. When you go to Radio Shack and pick up an integrated circuit stamped "made in El Salvador", you are the final participant in a process which has been killing and blinding young workers, and a silent supporter of our government's participation in the bloodshed there. Likewise, every American high-tech product made in Korea or the Philippines is a vote for the military regimes who control by the force of death.

I believe the violence embodied in our so-called "clean" technology must be dealt with. It is not a clean technology. It is a technology which not only exploits the laborers off our shores -- it is one which is polluting the waters of the Silicon Valley and causing birth defects; and it is one which is continuing the cycle of impersonal, potentially final holocaust in the world. The blasting and burning of Space Invaders is more than relevant.

Here is just one contribution; I refer to High Technology magazine, March 1980. The article, by Contributing Editor Franklin Meyer, was entitled Kamikaze missiles will carry electronic pilots: Advances in computer chips are leading to smart weapons that can read maps, make evasive maneuvers, and seek and destroy targets.

"The computer in all the strategic missiles, a Litton LC45 6C, is also used in the F-18 navigation system. It is a general purpose 16-bit word-length minicomputer which is packaged to fit into the guidance set aboard the cruise missile. Together with the memory, it occupies less than one cubic foot. The memory is a Litton 9000 ferrite-core type, densely packaged on two circuit boards, each containing 32,000 16-bit words. The computer itself is a microprocessor-based machine using AMD 2901 circuits."

This field -- our field -- is creating the parts that are built into the missiles set to destroy civilization. My state was a leader in the nuclear freeze movement, and I support that freeze. I encourage you to consider the destructive ends of the circuits and software that you develop, and voice your feelings. In a tough economy, that can be difficult; but isn't a tough economy better than no economy at all?

And, at last, what is your responsibility as professionals in technology? Is it to discover, research, develop, and then offer your work to the highest bidder? Or is it to be partners with those who would employ it against you, your sisters and brothers, or your neighbors? You deal in the most powerful human invention, I believe, since the wheel and since the written word.



Finally, it is an intriguing and sobering consideration that microprocessors might dramatically change the rebuilding of human civilization should there be a nuclear holocaust -- a holocaust in which, ironically, microprocessors and their hardware and software engineers would themselves have played an active role.

These devices, inert silicon packages, can be affected by radiation. But with minimal protection, they can survive intact. Consider: should but one personal computer, a few solar power cells, and one competent operator survive, the technological and computational expertise of 5,000 years of civilization will still be at hand. The culture will be in a shambles, the surface of the earth perhaps a smoldering wreckage. But if humankind has the capability to learn, and to remember, a sole surviving microcomputer running on the energy of the sun will provide a splendid example of humankind's finest achievement, as well as a tool for the rekindling of civilization -- a civilization, I hope, of humanity and humility.



