Computers, Websites, and other Digital Gadgetry
What is novel today is old-hat tomorrow; but what is old-hat to someone today is still novel for someone else. These are our own thoughts about a variety of electronic novelties, for whoever finds them of interest.
The Age of the Philadelphia Computer
Computers have a long slow history. The computer industry, however, had an abrupt start and sudden decline, in Philadelphia.
Computers, Digital Cameras, and Cellphones
The Industrial Revolution had a lot to do with the manufacture of cotton cloth by religious dissenters in the neighborhood of Manchester, England in the Eighteenth Century. What needs more emphasis is the remarkable fact that Quakerism and the Industrial Revolution both originated about the same time, in about the same place. True, the industrializing transformation can be seen in England as early as 1650 and as late as 1880; the Industrial Revolution thus extended from before Quakerism was even founded to long after most Quakers had migrated to America. No Quaker names are much mentioned except perhaps for Barclay and Lloyd in banking and insurance, and Cadbury in candy. As far as local history in England's industrial midlands is concerned, the name mentioned most is Richard Arkwright, whose behavior, demeanor and beliefs were anything but Quaker.
He seems to have invented nothing, stealing the patents and ideas of others freely, while disgustingly boasting about his rise from rags to riches. Some would say his skill was in the organization, others would say he imposed an industrial dictatorship on a reluctant agricultural community. He grew rich by coercing orphans, convicts and others he obviously disdained into long, unpleasant, boring and unwelcome labor that largely benefited him, not them. In the course of his strivings, he probably forced Communism to be invented. It is no accident that Karl Marx wrote the Communist Manifesto while in Manchester visiting his friend Friedrich Engels, representing reasonably well the probable attitudes of Arkwright's employees. What Arkwright recognized and focused on was that enormous profits could flow from bringing piecework weaving into factories where machines could do most of the work. Until his time, clothing was mostly made by piecework at home, with middlemen bringing it all together. The trick was to make clothing cheaper by making a lot of it, and making a bigger profit from a lot of small profits. Since the main problem was that peasants intensely disliked indoor confinement around dangerous machines, the industrial revolution in the eyes of Arkwright and his ilk translated into devising ways to tame such semi-wild animals into submission. For their own good.
Distinctive among the numerous religious dissenters in the region, the Quakers taught that it was an enjoyable experience to sit indoors in quiet contemplation. Their children were taught to submit to it at an early age, and their elders frequently exclaimed that it was a blessing when everyone remained quiet, enjoying the silence. Out of the multitude of religious dissenters in the first half of the Seventeenth century, three main groups eventually emerged: the Quakers, the Presbyterians, and the Baptists. Only the Quakers taught that silence was productive and enjoyable; the Calvinist sects leaned toward the idea that sitting on hard English oak was good for the soul, and that training and discipline were what kept 'em in line.
The Quaker idea of fun through daydreaming was peculiarly suitable for the other important feature of the Industrial Revolution that Arkwright and his type were too money-centered to perceive. If workers in a factory were accustomed to sit for hours, thinking about their situation, someone among them was bound to imagine some small improvement to make life more bearable. If such a person was encouraged by example to stand up and announce his insight, eventually the better insights would be adopted for the benefit of all. Two centuries later, the Japanese would call this process one of continuous quality improvement from within the Virtuous Circle. In other cultures, academics now win professional esteem by discovering "win-win behavior", which displaces the zero-sum or win/lose route to success. The novel insight here was that it has become demonstrably possible to prosper without diminishing the prosperity of others. In addition, it was particularly fortunate that many Quaker inhabitants of the Manchester region happened to be watchmakers, or artisans of similar trades that easily evolved into the central facilitators of the new revolution -- becoming inventors, machine makers and engineers.
The power of this whole process was relentless, far from limited to cotton weaving. When Charles Babbage sufficiently contemplated the punched cards carrying the simple instructions of the knitting machines, he made an intellectual leap to the underlying concept of the tabulating machine. Using what were later called IBM cards, he had the forerunner of the stored-program computer. There were plenty of Arkwrights getting rich in the meantime, and plenty of Marxists stirring up rebellion with the slogan that behind every great fortune is a great crime. But the quiet folk were steadily pushing ahead, relentlessly refining the industrial process through a belief in welcoming the suggestions of everyone.
We are indebted to Paul W. Schaffer, the curator of the ENIAC museum, for the novel concept that much of the complexity of modern computers can be reduced to a few adjectives. Before we get to that, let's explain how "computing" was done before the University of Pennsylvania revolutionized it.
We once (1940-55) used calculating machines, which were sort of overgrown calculators. As big as baby grand pianos, bearing no resemblance at all to the hand calculators which sit on top of desks, calculating machines were noisy as all get-out. A typical "calculating shop" used to contain eight or ten machines, each with a specialized function. Key-punch machines, usually several of them, put holes in cards to be fed into the other machines. A couple of sorting machines counted the holes in the cards and shuffled them into pockets for specified holes, the specifications set by temporary wiring boards on the back of the machine. A collator was capable of more complicated sorting and sequencing. And the calculator itself was able to count various combinations of holes in cards, and even print out the calculations on very large rolls of paper with perforated holes along the edges. Five or six trained operators would move the piles of cards around, feeding them into the appropriate machines in a prescribed order. Sitting off in a corner was the super-operator, whose job it was to design the minute sequences of manipulation and string wires around a wiring board at the back of the machines. His were the brains of the calculating system, and the wires were strung in accordance with his design. My recollection is that IBM refused to sell these machines, and a typical cluster rented for $1100 a month in 1955. The Pennsylvania Hospital was considered very advanced for having this arrangement as its billing system, but its primitive quality can be seen in the system architecture. The whole system revolved around the concept of producing a bill for every patient in the hospital, every day. If the patient went home, he was given the latest bill. If he remained another day, the old bill was discarded and a new updated one made. It was simple, it was clever, and please don't tell Jefferson.
(On the other hand, something appears very wrong about all this progress that fifty years later, the same hospital with hundreds of computers, today cannot produce a hospital bill within a month of patient discharge.)
Well, back to adjectives. John Mauchly the mathematician came to the fundamental recognition that just about everything in mathematics and calculating could be done by "iteration", and re-iteration. Don't be afraid of the words. They only mean you take a small piece of arithmetic, and perform it over and over, millions of times. You really don't need to redesign new machines for each new process, since anything you might want to do could be done by reducing it to the same sort of iteration. So there's the first adjective: Mauchly's iterative design concept amounted to a "general purpose" computer. Many different patterns, perhaps, but all performed on the same machine, just as many different pieces of music are performed on the same piano.
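Mauchly's insight can be made concrete with a modern sketch -- a deliberate anachronism, since no programming language existed for him to write in. The point is that one iterative engine, fed different small operations, performs entirely different calculations:

```python
# An illustrative sketch of the "general purpose" idea, not anything
# Mauchly himself could have written: one loop, many uses.
def iterate(step, state, times):
    """Apply the same small operation over and over."""
    for _ in range(times):
        state = step(state)
    return state

# Multiplication reduced to repeated addition: 7 x 4
product = iterate(lambda acc: acc + 7, 0, 4)

# Newton's iteration reduces square roots to repeated
# add-and-divide: here, the square root of 2
root = iterate(lambda x: (x + 2 / x) / 2, 1.0, 10)
```

Different patterns, but the same machine runs them all, just as many different pieces of music are performed on the same piano.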
John Presper Eckert
His graduate student, John Presper Eckert, eliminated the moving parts. Instead of metal hammers and prongs moving around, Eckert moved electrons. This step vastly increased the speed of the processing and even decreased effective maintenance. The early computers required a man to go around with a wheel-barrow, constantly replacing vacuum tubes as they burned out. But by moving electrons instead of mechanical parts, the iterative speed was so great that overall maintenance, per million calculations, was less. Eckert gave us the "electronic" computer. Together with Mauchly, the two ideas blended into the electronic, general purpose, computer.
John von Neumann
So then along came John von Neumann, observing this thing at work. Millions of punched cards were fed into the machine, but the holes in the cards represented data; the instructions were still wired in by physically connecting one contact to another, which had to be changed when the instructions changed. Von Neumann immediately saw how to get rid of half of this non-electronic effort. His contribution was to punch the instructions into program cards and feed them into the machine when the program instructions changed. So now, we had "stored instruction sets". As we still have today, the trio created the idea of a stored-instruction, general purpose, electronic computer, and actually made a working model of it. That's what promoted it ahead of the Mark I and other mechanical computers that had been developed in Europe. Vastly increased speed, vastly decreased costs -- and lots of big bucks for the manufacturer.
So, off to court, to sue for patent protection. Thousands of patents have been granted for various small innovations in the system, but who was entitled to claim ownership of the basic idea? Who invented the general purpose, electronic, stored-instruction calculator? Some puzzled judge finally worked his way out of that jig-saw puzzle by declaring that no one owned the right to have an overall patent. His reasoning was that since von Neumann had rushed to publish his work rather than rushing to the patent office first, it had become the property of the public and no longer belonged to the inventor. Compared with the contributions of a great many present computer billionaires, it really seems as though Mauchly, Eckert and von Neumann were conservatively entitled to a trillion dollars apiece. But life is not fair, and the law is an ass. Or is that so?
In later lawsuits, of which there were a great many, it came out that Mauchly and Eckert were employees of the University of Pennsylvania. They did what they were told to do and were paid for doing it. Maybe the University is entitled to trillions, thereby allowing them to pay their history professors better. But then, one final idea. The University accepted government money to do the job. Maybe all us citizens are entitled to trillions since we collectively commissioned and paid for this work. So, to recover, we seek out a class action lawyer. Standard procedure for class action lawyers is to take most of the money for themselves, sending each member of the class a share which is less than the postage required to send it.
The final outcome in this particular case was a suit between Honeywell (in Minneapolis) and Univac in Philadelphia. Univac essentially decided to go into the patent infringement business and Honeywell refused to pay royalties. Meanwhile, IBM engineers had been called as consultants for some rush job and reported to Mr. Watson that those people in Philadelphia had something really great. So, IBM paid several million to settle the patent infringement claim and started to mass-produce these things, with economic success everybody knows about. Meanwhile, the Honeywell/Univac case droned on and was eventually won by Honeywell when they produced a professor from Iowa who claimed he had the same ideas earlier. The computer concept was thus declared to be "prior art", preventing patent enforcement. By that time, IBM had such a long lead on the litigants, that the market was theirs, and IBM competitors just sort of dwindled away.
Note: This article was written in 1999, long before Computerized Medical Insurance Exchanges were such a disaster:
My first encounter with a computer was in 1958, and I have loved them ever since. As president of what called itself the Delaware Valley Hospital Computing Society, I remember giving a dinner speech concluding as follows: "If you want to be happy for a day, get drunk. If you want to be happy for a week, get married. But if you want to be happy for a lifetime, get a computer!" After fifty years, my affection continues. But to be candid, billions of dollars about to be spent on computers in medical care will mostly be wasted. Even worse, like malpractice suits, computers will induce behavioral changes in the system costing far more than the directly visible costs.
That's unpopular news at present since the National Business Coalition for Health has launched a major lobbying campaign to persuade Congress to spend an initial billion dollars inducing physicians to maintain an electronic medical record. Various health insurance companies already provide financial incentives to doctors to file electronic claims forms, eventually threatening to reject any claim submitted on paper. The American College of Physicians has established a rather large department to develop programs for physicians to use in their practices; twenty years ago the University of Indiana started much the same thing. The College of Physicians of Philadelphia has spent close to a million dollars on such a project. It is reported that Microsoft Corp. has a massive project underway to supply electronic medical records. It sounds fairly easy to obtain large research grants from the government to devise something, anything, useful in this area. In my own case, training funds really weren't necessary, since I eagerly got into the field when everybody was a beginner. I was just as good a beginner as any other beginner. But let me repeat: the electronic medical record has been in the past and will be for decades, an expensive digression. In health care, creating more administrative work isn't the solution, it is the problem.
For fifty years the problem with an electronic medical record was that it took too much of the doctor's time to complete his part of the input, and then cost him too much to pay employees to do the rest. Presumably, automatic voice recognition and dictation will soon make it possible to record doctors' notes without handwriting or typing. Since, however, the elimination of current paper forms and check-off boxes will create a major problem in organizing the dictation verbiage, it could add five or ten additional years before programmers manage to rearrange dictation material and effectively integrate it into organized form, complete with laboratory results, dictated x-ray and EKG reports, even small images of the original material. Temperature, blood pressure, weight, photographs and the like can all be readily integrated into the stored electronic record, but to do so usefully is an expensive programming project. Doctors are quite right to be anxious that they will lose control of the usefulness of their records in order to ease the task of programmers, speed up the sluggish pace of development, and reduce what will surely be an unexpected cost overrun. Storage and retrieval of such records is known to be an achievable but expensive task, which however also risks sacrificing the speed and ease requirements of the medical task it is supposed to serve -- in the name of cost-effectiveness.
Computers are no longer an unfamiliar tool; physicians have altogether too much experience with "vaporware", unrealized promises of convenience, and the damaging effect on medical quality of the philosophy of Quick and Dirty. To respond to their resistance to design blunders with an accusation of undue conservatism is to provoke an icy stare and gritted teeth. Inevitably, the effective use of automation will require a redesign of workflow with major disintermediation of "gopher" staff; after all, that is how cost savings are to be achieved. That will provoke the outcry that physician time is the most expensive component in the process, but unfortunately, physicians will discover that Information Specialists with a business background will brush that argument aside. The most overpaid people on the face of the earth are investment bankers, but information consultants have persuaded business executives that inefficiency of the investment process is more expensive than even an investment banker's time. Having been through this themselves, insurance executives are unlikely to pay the slightest attention to physicians dancing to a familiar old tune.
For all that, data input is not the real problem; it's just the first problem. It's in a class with data storage and retrieval, which is expensive and cumbersome when you add a need for instant access and total privacy. But costs will come down steadily, and eventually, we can expect automated fingerprints or other biological identification, and cheap instant retrieval. Doctors will be able to make rounds in the hospital with a computer in their pocket, record telephone calls in their entirety, dial automatically and whatnot. There are problems with wireless transmission inside buildings with steel girders, and legal requirements for signatures on narcotic orders, but if we are determined, these problems can be overcome as easily as they were with electronic check writing and stock brokerage. Cost may top twenty billion dollars in twenty years, but it all can be done if we insist.
But then you encounter the real problem. Information will accumulate in these records in staggering amounts. Even if you resolutely resist demands to have the nurses record every groan, and the orderlies file every laundry slip, the legitimately important medical information will be exposed as the massive heap of transients that it really is. Plaintiff lawyers will insist no scrap of data may be deleted, and hospital administrators will insist on compliance, when in fact most of a doctor's concentrated effort is devoted to brushing aside momentarily distracting data in order to see what's going on and react to it instantly. When a quick look doesn't solve the problem, the doctor goes back for additional data. If you disrupt these skills and traditions of coping with information overload, evolved over centuries, you will at best impose frustrating delays on a complex system under pressure, and ultimately inspire elaborate systems of short-cuts. The Armed Forces are famous for paperwork, but even they know better than to ask a pilot for his Social Security number as he starts a bombing run. The hospital nursing profession has already just about collapsed under paperwork pressure. If you see five nurses in a hospital, three of them will be sitting down writing something. The terrible truth is that no one reads it, no one checks it, and ultimately it sits in the record room waiting for a plaintiff lawyer with unlimited time to sieve out some misrecorded misconception or uninformed conclusion. My faith in the computer is such that I feel sure that methods can be devised to produce periodic summaries, automatic alarm signals, and mostly effective prioritization of data elements. Unfortunately, medical care is changing at such a rapid rate that ad hoc automation of physician thought processes cannot keep up with the pace of medical progress.
You would think some things would be unthinkable, but since I can remember the organized campaign to suppress the CAT scan as an unnecessary expense, I confidently predict that programmer inability to keep up with some advance in medical care will at times lead to an organized outcry that we should slow down the pace of improving medical care, so that computer clerks can keep up with it. But that is only a small part of the issue, which at its center is that physician time will be dissipated and his attention distracted by presenting him with unwieldy amounts of neatly printed, spell-checked, encrypted and de-encrypted, biometrically secure, hierarchically prioritized -- avalanches of data which are irrelevant to the issues of the moment. The goal is not, after all, an electronic record. The local goal is to decrease the cost of medical care by increasing the productivity of the physician, and the overarching goal is high-quality patient care at a reasonable price. Behind all that, the fact that the impetus comes from NBCOH -- the ones paying the insurance premiums -- suggests that the local goal is not so much the improvement of care as oversight reassurance that the care provided has been as good and as cheap as possible. The goal is legitimate, but this cybernation approach looks to be self-defeating by being overly specific.
If the reader has the patience for it, let me now cite a historical example of the third-party tail wagging the medical dog. In this case, third-party health insurance similarly overextended its reach by imposing internal health system changes, trying to facilitate the role of monitoring it externally. Specifically, the system of diagnostic code numbers was changed from one devised by the medical profession for its purposes, into a different coding system devised outside medical profession sponsorship, which seemed to suit the needs of payment agencies better even though it suited medical purposes less. After twenty-five years, it is now clear that third-party payers have shot themselves in the foot on this matter, and everyone is worse off. The topic, please pardon the obscurity, is the diagnostic coding system.
To go back to beginnings, the American Medical Association perceived a need for a diagnostic coding system in the 1920s. Organizing or even merely indexing vast amounts of information about a disease required more specificity than freestyle verbal nomenclature could provide. Quite a distinguished panel of specialists and consultants then produced the Standard Nomenclature of Diseases (SNODO), which in time became the Standard Nomenclature of Diseases and Operations. In order to reduce ambiguity, this system developed a branching-tree code design for anatomy, linked to a branching tree for causes of disease, ultimately linkable to a branching tree of procedures. These three sets of three-digit codes linked the components together with hyphens (000-000-000). The first digit of each was the most general, as in Digestive, Musculo-skeletal, etc., and subsequent digits were progressively more specific and detailed, as in "Digestive, large intestine, sigmoid colon". The causes of disease would resemble "Infections, bacterial, streptococcal". An example of Procedures would be "Incision; incision and drainage; incision, drainage and insertion of a drain". In nine digits, it was thus possible to represent "incision, drainage, and insertion of a drain into a streptococcal infection of the sigmoid colon". After a while, the codes grew from three to five and six digits, again repeated three times, so an immensely detailed, unambiguous description might be coded in fifteen digits by a physician who knew the rules but didn't own a codebook. This code was ultimately taken over by the Academy of Pathology, expanded, and is called SNAP. The pathologists absolutely refused to give it up.
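The branching-tree idea can be sketched in a few lines of code. The digit assignments below are invented for illustration; they are not the actual SNODO tables:

```python
# A sketch of a SNODO-style branching-tree code. Each additional digit
# narrows the meaning of the digits before it. All digit values here
# are hypothetical, chosen only to show the structure.
anatomy = {
    "6":   "Digestive",
    "64":  "Digestive > large intestine",
    "647": "Digestive > large intestine > sigmoid colon",
}
etiology = {
    "1":   "Infections",
    "12":  "Infections > bacterial",
    "123": "Infections > bacterial > streptococcal",
}
procedure = {
    "5":   "Incision",
    "55":  "Incision > incision and drainage",
    "554": "Incision > drainage > insertion of a drain",
}

def decode(code):
    """Decode a hyphenated anatomy-etiology-procedure code."""
    a, e, p = code.split("-")
    return f"{procedure[p]} | {etiology[e]} | {anatomy[a]}"

print(decode("647-123-554"))
```

A physician who knew the tree could compose such a code from memory, which is exactly the property the essay describes: the rules, not a codebook, carry the meaning.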
The rest of the profession gradually yielded to the pressure of hospital administrators, who were pressured by the Association of Medical Record Librarians, responding to the views of outside statistical interests, particularly insurance. A simpler, shorter coding system was needed, they felt, concentrating on the thousand most common diseases. The International Classification of Diseases was produced, reducing the millions of SNODO diagnoses to 999 by heavy use of several varieties of "Miscellaneous" or "Not Otherwise Classifiable (NOC)". Since the goal was to count the incidence of common diseases, the coding system was stripped of any logical tree-branching and became a short list of what was most common, starting with 1 and going to 999. In time, of course, the commonness of conditions changed, and various complaints from various directions forced the ICD to go to four digits, then five. Unanticipated conditions or complications eventually required the patchwork of some alpha "modifiers", and the original short hodge-podge became a long and bewildering hodge-podge. Coding accuracy declined markedly, but ho-hum. The health insurance companies paid the bill, no matter what the code said. At another place, we will discuss the entertaining way that Ross Perot became a billionaire out of the computer chaos of Blue Cross and Medicare at this time, but right now the central theme to follow is DRG, Diagnosis Related Groups. Try to follow, please.
By 1980, Medicare was fifteen years old. It was clear that certain things just had to be changed because the excuse that the system was new and untried was beginning to wear thin. The early designers of the system based their payments on auditing a hospital's yearly costs, auditing the proportion of patients who were Medicare beneficiaries, and paying a proportionate share. That was easy and reasonably accurate, but it had a rather significant flaw: it took no account of whether the patients needed to be in the hospital in the first place. Or whether they needed to stay so long. The response they adopted (in the Budget Reconciliation Act of 1983) is a measure of just how desperate they must have felt. They knew full well how inaccurate the ICD coding system was in practice, but it was all there was. Consultants, particularly at Yale, ran computer simulations of various subsets of ICD codes to find a formula that would produce approximately the same hospital payments as the system of cost reimbursement. If memory serves, the original formula was to divide the thousand ICD codes into 27 diagnosis-related groups (DRG). Eventually, the process was tweaked to seventy or eighty groups. Walter McNerney, then Past President of the American Hospital Association, told Congress hospitals could live with this system, and promptly we had a system for paying out hundreds of millions of dollars. It was touted as a highly sophisticated advance in the arcane science of hospital reimbursement, so it must have included a lot of deliberate overpayment. I can remember trying to remonstrate with McNerney, who felt he didn't have time for the discussion. Physicians had very little to do with the DRG portion of the 1983 Medicare Amendments because the AMA had long insisted that physicians and hospitals go their separate ways on reimbursement.
Russell Roth, who was president of the AMA at the time, recounted many times the episode in the Oval Office, when it was announced to Lyndon Johnson that Dwight D. Eisenhower was in the next room waiting for him. LBJ excused himself to leave, and on the way out said to Wilbur Cohen, "Give him anything he wants." Things were destined to change, but at least for a very long time, physician and hospital reimbursements were strictly independent.
The Wedding of Computers and Medicine:
First Annual Fuller G. Sherman Lecture
George Ross. Fisher, M.D.
October 1, 1987
Fuller G. Sherman, M.D. was born August 15, 1894, graduated with academic distinction from Jefferson Medical College in the class of 19** with his second doctorate degree, was certified by the American Board of Internal Medicine, practiced for many years in Woodbury, New Jersey, and retired from practice in 19** to live in his native state of Maine.
The competitive strengths of Dr. Sherman's character have actually been easier to see during the so-far thirty years of his retirement from medicine. Past the age of 90, he attended Bowdoin College, taking courses in Shakespeare and geology; he plays par golf, holds a Masters' certificate in tournament bridge, is a distinguished cabinet maker, and does creditable work in oil painting. Two notable achievements were once, on the day after his graduation, to have flattened an Associate Dean of this Medical School with a single punch; and secondly to have consistently outperformed the Dow Jones Industrial Average on the New York Stock Exchange. Because of the latter, of course, he was able to endow the lectureship we inaugurate today. In both of these adventures, he illustrated the truism that in life, everything is a matter of timing.
He has been my teacher, employer, referring physician, and friend. He is now my patient, allowing me to judge he has as good a chance as any of us to live another thirteen years. If he does, he will have the almost unheard-of opportunity to observe the practice of medicine in three different centuries. There can be little doubt he will study the next century harder than any of us, as his patronage of computer science demonstrates today.
My subject has three parts: yesterday, today, and tomorrow. The exhilarating nature of the computer world lies in only a little yesterday, a little of today, and a great deal of tomorrow. For our purposes, yesterday began about thirty-five years ago when the Chairman of the International Business Machines Corporation, Thomas Watson, made the decision to gamble the whole future of his successful typewriter and tabulator company on mass-producing computers. There were then only a few dozen of those machines in existence, mostly owned by the military. They cost millions and were expensive to operate. Typically, a bushel of worn-out vacuum tubes was replaced every day. You could walk around inside them without stooping over. By 1960, IBM was selling a thousand of these machines a month to large corporations for about $4 million apiece. Technology in 1960 had greatly reduced the maintenance cost, but the University of Pennsylvania still had to rent them for $300 an hour at the academic rate. Machines of equal power can today be purchased for a thousand dollars, and are the size of typewriters. I own five of them; there are about twelve million others in existence, up from nine million last year. One surgeon recently told me he bought one of the best, about a year ago, but had not yet had time to take it out of the box. The cost of these miraculous machines was thus trivialized in a single generation, and each year the Sunday supplements have promised us that within two years, five at most, such things as medical diagnosis would be relegated to computers. It never happened, of course, because science fiction writers had not heard Dr. Sherman's professor Thomas McCrae (Dr. Maddrey's predecessor by seven) repeatedly intone that "most diagnoses are missed because the doctor didn't look, not because he didn't know". The problem of diagnosis today, as then, is one of information gathering, not information manipulation.
A generalization can be offered. If you hear a prediction about computers, be fairly certain it will never happen, unless it already exists. So many brilliant minds are at work, with financial rewards providing unlimited resources, that the immediately achievable is achieved immediately. Mr. William Gates, a self-made billionaire at the age of 31, illustrates how among people who are successful in this field, there is no motivation for idle chatter.
The amazing drop in the cost of computers has made it possible to have a personal computer, that is, one dedicated to use by a single person. Personal, or stand-alone, computers can now do almost everything a large main-frame computer can do except cope with multitudes of users. But, having only a single master, they cater solely to his needs and undergo an unexpected transformation into tools not appropriate to big shared machines, becoming extensions of one user's brain. Word-processors and spreadsheets transform the way we think and work; such generalized mind-expanders prove to be more powerful than programs which merely calculate acid-base balance or remind us of potential adverse drug interactions.
Word processing is a utility as revolutionary as Gutenberg's invention of movable type; it can be expected to raise the standards of thought just as much as the standard of typing. A program costing less than $200 permits preliminary display on the cathode ray tube, where prose can be corrected and modified repeatedly before it is printed on paper. The machine will find all spelling errors and most grammatical errors, and permit any character, paragraph or page to be replaced, repositioned or erased. It will index the material, automatically insert hyphens, place footnotes and references where they belong, and allow unlimited experimentation with different margins, page sizes or paragraphing. When finally printed on paper, the right margin can be automatically justified, and the words become unified into a finished page. More important than these aesthetic advantages, word processing permits the author to revise repeatedly until what he writes finally says what he means.
A second innovative creation on a personal computer is a hybrid of two steps, the generalized data management system and the spreadsheet. Many small stored globs of information are aggregated on request into something meaningful. If the unit of data is a single patient, a glob such as blue eyes can then be effortlessly linked with any other glob of information, such as antibodies to retroviruses. The spreadsheet concept then organizes such data cells into rows and columns which can be fed into formulas which operate serially on every row in a column, generating a new column of derivatives. The user need not, and in fact commonly does not, know statistical theory, but can proceed anyway to command regression analysis on eye color and AIDS, or any more plausible hypothesis in clinical research. These programs will then transform selected numbers into colored graphs on request (slide). The ability to use statistical tools without understanding them will, of course, create abuses of this system, which in the case of regression analysis would be to overemphasize the validity of 95% confidence limits. Ultimately, the value of the computer product will depend on the brainpower of the individual user. When convincingly packaged data can be processed in massive amounts by chimpanzees at the keyboards of $1000 machines, it is a little daunting to await the misinformation which will be generated by the 5% error content of mountains of data. By the rules of regression analysis, one conclusion in twenty will be reached by the operation of chance alone. Since editors are intrigued by papers which reach unexpected conclusions, thoroughly documented spreadsheet research which later turns into smoke will someday be their proper torment. The exciting future of computers is thus not to replace doctors in some profession-threatening way, but rather to extend the capabilities of their minds in powerful ways which continue to reflect the personality of the user.
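The one-conclusion-in-twenty arithmetic is easy to verify with a small simulation (a modern sketch, written long after this talk; the function name, sample sizes, and seed are all illustrative). Test many hypotheses against pure noise at the 95% confidence level, and about one in twenty will look "significant":

```python
import math
import random

def fraction_spurious(num_studies=2000, n=100, seed=7):
    """Run many 'studies' on pure noise and count spurious findings.

    Each study draws n observations from a standard normal (so there is
    no real effect) and declares 'significance' whenever the sample
    mean falls outside the 95% confidence band around zero.
    """
    rng = random.Random(seed)
    threshold = 1.96 / math.sqrt(n)  # 95% band for the mean of n normals
    hits = 0
    for _ in range(num_studies):
        mean = sum(rng.gauss(0, 1) for _ in range(n)) / n
        if abs(mean) > threshold:
            hits += 1
    return hits / num_studies

# Roughly 0.05: one "conclusion" in twenty, by chance alone.
print(fraction_spurious())
```

The fraction hovers near five percent no matter how large the studies are; more data makes each spurious finding better documented, not less likely.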
As has been said of corporations, the dedicated personal computer projects the lengthened shadow of the man.
Early steps in that direction would please Adam Smith, producing exalted results from trivial and mercenary motivations. Whether you like it or not, and whether cost-effective or not, the entanglement of medical practice with insurance and reimbursement is making it essential for every practicing physician to employ a computer; those who avoid it have done so out of fear of the disruptions, not because they deny the value of the office machine. Once the machine is installed, word processing is seen as a free bonus, and the financial affairs of the practice become raw material for database programs and spreadsheets. In this way, the practicing physician acquires a mind-expander when he merely sought to reduce his clerical expenses. He also, by the way, merges into the mainstream of business computing, and like everyone else, will find that the new IBM model 50 has become the modern standard, just as the IBM Selectric typewriter became the business standard thirty years earlier.
To understand why this is so, notice that IBM spends $5-6 billion annually on research but withholds most new products from the market. Then, about every seven years, a bundled package of innovations is released as a "new generation" which makes existing machines obsolete and dominates the field for the next seven years. Throughout the seven-year gestational period, other companies also bring forward innovations but must recover their costs by releasing them immediately. IBM watches market reactions, preparing to submerge would-be pioneers in a tidal wave of releases. In effect, other companies test the market for IBM to exploit. Almost every component of the 1987 new generation is new, and very little of it is unique to IBM (slide). However, an irresistible market standard is created when many innovations are released at once by the largest volume producer. The new 1987 machines seem mainly designed to permit, for the first time, several personal computer users to share one octopus machine, a mildly useful thing for the doctor and his secretaries. The main importance of using 32-bit technology lies in the fact that the doctor's personal computer uses the same system and thus can talk the same language as the mainframes in hospitals, laboratories, insurance companies, and the Internal Revenue Service. The profession must not let itself get lost in the chatter of intercomputer communications; while the physician must adapt to the equipment that is available, he mostly needs many different mind-expanding applications on a single machine. The creation of a market standard will almost surely prove to be a dominant force even for physicians. The IBM model 50 will be what to buy until 1993, as will IBM stock, but physicians need to establish their own culture within the commercially available environment.
1993 will not, of course, be a far enough horizon for Dr. Sherman's third century, so we might look across the valley over two intervening mountains. Fourteen years from now, the new 2001 models will also be composed of standardized refinements of whatever exciting advances may accumulate in the meantime. By that time, computers should be able to accept voice dictation, since they can already understand a spoken vocabulary of about 1000 words. Computer scanners can now read pages of typing with 90% accuracy, so we can expect much less typing and the accession of much larger volumes of day-to-day information. I understand x-ray films require a resolution of 2000 by 1600 pixels; since advanced computer screens already achieve 1600 by 1200 pixels, the silver-coated films we use now will likely disappear.
In the shorter term, it would be a fair prediction that the exciting programs of the next seven years will exploit the telecommunication power of 32-bit processing and the vast storage capacity of CD-ROMs. Multimillion-dollar mainframe computers operate in units of 32 bits, but personal computers now mostly use units of 16. Declining prices of transistor chips make 32-bit technology affordable for personal computers, so we can expect to see the doctor's computer talking on equal terms with the big computer in the hospital, the drug store, and the Library of Medicine. It is exciting to predict a role reversal, with once imperious mainframe owners outmaneuvered by users of agile PCs. Organizational whales like hospitals, department stores, banks, and the Internal Revenue Service have had their day of forcing everyone else to conform to their convenience; the little piranha fish will grow sharp teeth.
Since the profession of Medicine is after all in the knowledge business, it is breathtaking to contemplate the migration of medical journals and libraries from paper to an electronic medium, and the dissemination of libraries to the doctor's consultation room. Compact disc technology is already the cheapest form of information storage, and its inevitable price decline has scarcely begun (slide). Grolier's encyclopedia is now available on a plastic disc you could put in your shirt pocket; the entire encyclopedia takes up only 20% of the disc. This unerasable form of storage is sometimes called WORM (write once, read many). Inexpensive scanners can convert pages of print to a computer file in 20 seconds; rather accurate programs can convert foreign languages to pidgin English. Between the two processes, whose only present limitation is price, whole libraries will surely soon be swallowed up on plastic discs, and medical journals may appear in that form as soon as someone figures out how to incorporate drug advertising. However, don't run out and buy a CD-ROM just yet; there are twenty different types, and they are incompatible with each other. The exasperating power of IBM is well illustrated by the fact that this titanic information storage revolution will not take a step forward until someone like IBM is able to impose a standard which will discipline the present Tower of Babel.
In closing, the point must be made that the main hindrances to adoption of computers by the medical profession will not be technical; they will be sociological. "Not in my back yard" is the spirit with which most new things are greeted, even in the learned professions. A plain fact of human behavior is that how you stand is determined by where you sit. For years, banks have transferred patient payments to physician bank accounts without the creation of a single piece of paper. But electronic funds transfer has made very limited progress in 20 years, primarily because the payers and their banks do not wish to surrender the interest float which develops during the delay of transfer. Blue Shield of Pennsylvania has over a million floating dollars earning interest at all times while the obsolete paper check depositing process limps on. The videotape machines which are everywhere provide a warning example of how technical potential is easily frustrated. Instead of ten thousand college professors giving mediocre lectures on Hamlet, it is clear that some professor at Oxford could give the very best lecture on videotape, which all students everywhere could watch at home without even paying tuition. Since it hasn't happened, and it won't happen, perhaps the point I am striving to make becomes clear.
Try to imagine the resistance which pharmacists would create to electronic drug ordering; indeed, the nursing profession is very resistant to physician orders which come in any way except handwriting on the floor chart. While I cannot identify the economic incentive which explains the delay, Jefferson Hospital has just installed a system in which laboratory results are instantaneously transmitted to the clinical floors. However, a similar system was installed at the old Philadelphia General Hospital in 1965. These and many other examples of apparently irrational delays most likely have their explanations in the motivations of people rather than the limitations of machines. Therefore, in predicting a revolution in medical information handling we must not, for example, underestimate the capability of printers and typesetters at the New England Journal of Medicine to hold up electronic publishing, or of the librarians of the world to resist the destruction of their careers by plastic discs. IBM is in the business of selling bridal supplies and has repeatedly proved to be a shrewd judge of bridal psychology. They obviously believe 1993 will be soon enough for the real wedding of medicine and computers; maybe it can wait for the year 2000. Meanwhile, it will not matter much that the bride's father can easily afford the wedding, or that the groom is anxious to perform. Medicine, the bride-to-be, hasn't yet said "yes".
The Internet provides new blessings, but new problems as well. Identity theft has now ballooned from a rarity to a fairly serious issue. After initial turf confusion, the issue has been assigned to the U.S. Secret Service. If it happens to you, that's where you make your anguished call. (1-877-ID-THEFT) or www.consumer.gov/idtheft
There's a certain logic to regarding identity theft as a modern form of counterfeiting, which has been with us since the days of William Penn. Shirley Vaias, representing the Philadelphia regional Secret Service, recently addressed The Right Angle Club of Philadelphia on the topic. It makes sense to learn the Service is headquartered on Independence Mall, across from the Mint. The crude forms of printing in the 18th Century made counterfeiting easy, and ever since those early days, there's been a race between improvements in technology and improvements in counterfeiting. We now have paper with little red fibers in it, watermarks, serial numbers, color-shifting inks, microprinting of secret messages in the portraits, special magnetic strips, and probably lots of other clever things we aren't told about. The Bureau of Printing and Engraving is changing the currency, one bill at a time, and recently there was a new ten-dollar bill. A counterfeit version was in circulation within six hours.
ATM machines are equipped with counterfeit-recognition devices, and special gadgets are provided for banks and retail stores, but one detection device traditionally catches most fake bills. After handling huge amounts of currency, bank tellers catch a counterfeit just by the feel of the paper. Color photocopiers are getting better and cheaper, but of course, they can't change the serial numbers, so they aren't as smart as they seem. About one-hundredth of one percent of the currency in circulation appears to be fake, so you are pretty safe, but the possessor of a bad bill is deemed to be the one out of luck. The consequence is that many citizens suspect a bad bill, take it to a bank and have it instantly confiscated without recourse. That would seem to discourage reporting a counterfeit, encourage passing it off to an unsuspecting friend, and overall seems terribly unfair; but it results from the wisdom of the ages. Experience shows honest citizens are indeed tempted to try to pass the money on. While the banks don't enjoy being policemen, the effect is that counterfeits will circulate until they hit a bank, and thus confiscation is fairly comprehensive.
As the printing of money gets more complicated, the special presses needed to produce good money have become a monopoly of certain German companies, who sell the machines to other countries. Some of the presses capable of printing American currency thus got into the hands of some Russians, who sold them to the North Koreans. So for a while at least, the North Korean government was printing American currency. This provoked vigorous countermeasures, the nature of which is confidential.
A bill of any denomination costs the government about half a cent to produce and lasts about four years in circulation. When tons of old bills are retired from circulation, the serial numbers are recycled; to an outsider, that sounds like an impossibly tedious job, but they say they do it. There's also the issue of seignorage, a term for the profit the government makes when the paper currency gets destroyed in one way or another, costing less than a cent to replace. Just how profitable the currency business is, cannot be accurately determined, because a lot of it is buried or hidden in mattresses and might someday resurface. But there is a substantial profit, which like any shrewd businessman, the government weighs against the cost of detection. Bail bonds and casinos are big sources of bad money, as could be readily imagined, and hence it is in their interest to get pretty sophisticated (and extremely unpleasant) about detection. On balance, however, it can be expected that legalized gambling in Philadelphia will promote more counterfeiting in the local economy, and hence is an offsetting cost of the tax revenue.
Over the centuries, governments have learned how to cope with counterfeiting, and there is actually much less of it than a century ago. You win some and you lose some; life just goes on. With internet identity theft, however, the criminals are developing techniques faster than governments have learned to combat them, and it is governments who struggle to catch up. Unfortunately, everybody takes a business-like approach to the matter, asking whether the precautions cost more or less than the losses. It would seem that if money continues its migration from paper currency to bookkeeping entries, it will eventually seem unsatisfactory for only one party in a transaction, a bank let us say, to keep the books while the public simply trusts them. Eventually, each individual will be forced to seek the protection of some sort of computerized system keeping the counter-parties honest, on behalf of the public, and to prevent a paralysis of commerce. Identity theft is getting expensive enough to warrant the effort.
Just how to do all that is not too clear. So, in the meantime, just let the Secret Service figure it out.
Dan Rottenberg, who wrote an outstanding book about Anthony Drexel called The Man Who Made Wall Street, had access to many private papers that had to be omitted from that book because of space limitations. He tells an interesting tale about telegrams between Drexel and his bulbous-nosed protege at the New York office, J.P. Morgan.
Around 1880, Morgan put AT & T together, but before the telephone came into being, most high-speed communication was by telegraph. Naturally, Drexel and Morgan could afford to have a private telegraph line going between them. It would have been a bit much for them to use Morse Code themselves, so the scraps of conversation were written down and some have been preserved.
Spam, of course. If you want to avoid hackers, intruders, and unwanted advertisements, then as now, you have to be a zillionaire. Since, however, Morgan's private library on Madison Avenue had lots and lots of pornography hidden away, it does almost boggle the mind to imagine what might have been accomplished with a telegraphic wire tap.
As life becomes more cluttered, it's time to keep track of what is important. Here is our approach.
We're trying something new; let's explain it. From our calendar, you can learn when you can hear this magnificent orchestra.
Look down the left column of the home page of Philadelphia Reflections, and click on a box called "Philadelphia Calendars." (It's about the 12th one down). In time, the screen will display a list of group activities, like Music, or Sports, or Computer Discussion Groups, or whatever. Clicking one will display a list of activities. If there are enough activities, we may have to go to the third row of choices.
At the moment, we start with the Philadelphia Orchestra concerts. Click that choice, and you get a year's calendar, marked with what is presently known about the schedule. If you wish, you can drag the calendar to your desktop, dropping it there. In the URL box, there's usually a tiny icon to the left of the text; it's meant for dragging purposes and is called a Favorite-Icon, or Favicon.
So, if you want to know what's at the Kimmel Center for some date, what time it starts, or who the soloist is, you can find it in this nook of Philadelphia Reflections. Frequent fliers in the music world can even have their own calendar on their own website, by dragging the Favicon. We hope to attract many such calendars, and it would be just fine to have local field hockey schedules, local poetry readings, etc. Just so it's located from Trenton to lower Delaware, the area once known as the Quaker colonies, now called the Philadelphia region. Why not just print it out? You can do this, but you lose any last-minute updates provided by the calendar creator, and that's one of the great features of Internet communication.
To pull this off, several components must be assembled:
1. The calendar creator, the secretary, or executive office of the sponsoring organization, must either have iCal (free for Apple users), Outlook (about $100 for PC users and at a 60% discount at some online stores), or the Mozilla calendar available for both Macintosh and PCs. Users, however, only need normal Internet access. If you create one, you are responsible for the accuracy of the calendar.
2. The calendar server is a computer that holds the master copy of the calendar, and it really should be running night and day, every day. Both iCal and Outlook can make such an arrangement. The server can be anywhere in the world.
3. The calendar clearing house, Philadelphia Reflections, in this case, offers a defined selection from the millions of potential calendars in the world. Our selection merely claims to be local to Philadelphia, nothing more.
4. The end-user. Just what the user does with the information is rapidly evolving. The software is changing, getting more convenient every day. We'll comment from time to time, passing on suggestions as clever folks perfect them.
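Behind the scenes, the calendars these programs publish and subscribe to are plain-text files in the iCalendar (.ics) format. As a rough sketch of what a calendar server hands out (the sample event below is invented for illustration, not taken from our actual orchestra calendar), a few lines of Python are enough to pull the events back out:

```python
# A tiny invented iCalendar (.ics) sample, the plain-text format that
# iCal, Outlook, and Mozilla calendars exchange.
SAMPLE_ICS = """BEGIN:VCALENDAR
VERSION:2.0
BEGIN:VEVENT
SUMMARY:Philadelphia Orchestra
DTSTART:20070215T200000
LOCATION:Kimmel Center
END:VEVENT
END:VCALENDAR"""

def parse_events(ics_text):
    """Return a list of {property: value} dicts, one per VEVENT block."""
    events, current = [], None
    for line in ics_text.splitlines():
        line = line.strip()
        if line == "BEGIN:VEVENT":
            current = {}            # start collecting a new event
        elif line == "END:VEVENT":
            events.append(current)  # event finished
            current = None
        elif current is not None and ":" in line:
            key, value = line.split(":", 1)
            current[key] = value
    return events

events = parse_events(SAMPLE_ICS)
print(events[0]["SUMMARY"], "at", events[0]["LOCATION"])
```

A real .ics feed has more wrinkles (folded long lines, time zones, recurring events), but the point is that the file the end-user subscribes to is ordinary text any program can read, which is why last-minute updates propagate so easily.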
Oh, yes, one more thing. If you want to post a calendar with us, click the button on the front page, called "Contact us." It's a self-addressed e-mail.
This little morality tale was told to me by two unrelated sources, one of whom was a staff aide to Wilbur Cohen, the author of the Medicare law. And the other was a high official of Pennsylvania Blue Shield, the appointed administrative agent for Medicare in Pennsylvania. Its relevance to the more recent SNAFU with Insurance Exchanges introducing the world to Obamacare should be fairly obvious.
After Lyndon Johnson rammed the Medicare amendment to the Social Security Act through Congress in 1965, he wasn't shy about drawing attention to it. The press was present in great numbers, with staff officials who had a role in crafting the document, members of Congress, and anyone else who was standing around. The legislation was laid before him and signed with twenty different pens to be presented as mementos to the in-group. Each pen was only used to inscribe about half of one letter of his name, so it was a slow but joyful process. As intended, it got lots and lots of publicity.
[photo: H. Ross Perot]
So, thousands of thankful old folks saw the ceremony on television, heard that the law was in effect immediately, and proceeded to dump their medical bills into a shoe box, sending them to Medicare to be paid. Unfortunately, Medicare didn't have an office, a staff, or even a telephone number. These things take time. As fast as they could, the Medicare staff constructed a system of carriers and intermediaries, intermediaries for Part A and carriers for Part B. And almost without exception, they appointed the local Blue Cross and Blue Shield organizations to be the carriers and intermediaries. Consequently, the organization of Medicare was patterned closely after the organization of the two administrative corporations. Meanwhile, the bills from old folks just kept pouring in through the postal service. It was about all the staff in Washington could do just to direct the mail out to the local intermediaries and at least get it out of their hair.
Less than a year later, that's how the claims made their way to Camp Hill, PA, a little suburban town near Harrisburg. In desperation, Blue Shield had rented a local vacant supermarket and piled the mailbags ten feet high. There were quite a few telephone calls of inquiry, and the old folks were politely told the matter was being looked into. It was beginning to look as though one supermarket wasn't big enough.
Computers were, of course, rented from IBM, who had a policy of renting, not selling, its valuable equipment. Keypunch operators, computer operators were hired, air conditioning was installed, and one team after another of computer programmers was hired -- and fired. Consultants were called, scratched their heads, sent big consultation bills, and turned sadly away. Sorry, but somehow it just doesn't work.
So that's how it happened that one Friday afternoon, a vice-president of Texas Blue Cross named H. Ross Perot came in, accompanied by a fellow with glasses so thick they looked like the bottoms of Coca-Cola bottles. So far as anyone can remember, the guy with the coke-bottle glasses never said one word. The desperate, hopeless mess was explained to Perot, whose salary at that time was rumored to be twenty-five thousand dollars a year, about right for a Blue Cross executive. His background as a kindred Blue Cross person inspired confidence, and the conversation rambled on for an hour or so. Meanwhile, the guy with the coke-bottle glasses went over to the Penn-Harris Hotel across the street and got to work. By the end of the weekend, he had come back a couple of times, but eventually, would you believe, it really, well, it really worked. Contracts were quickly signed, the wheels began to turn, and the mailbags in the supermarket began to march through the processing cycle. Blue Shield, the Medicare program, the finances of the nation's elderly, and Lyndon Johnson's reputation -- all were rescued.
As everyone now knows, the Medicare processing contracts made Ross Perot into a billionaire, living on Bermuda in the lap of luxury, eventually upsetting the re-election hopes of George Bush, senior, by running for President himself on a third-party ticket that had something or other to do with giant sucking sounds. A Congressional investigating committee looked into the outrageous profits Perot had extracted from his homeland's elderly, and volleyed and thundered. Whether Perot actually thumbed his nose at them is doubtful, but he certainly was in a position to do so.
Meanwhile, whatever happened to that guy with the coke bottle glasses, no one seems to know.
On a hot summer evening, attendance at computer user-group meetings is light, so after a recent one got through discussing spammers and new software, we adjourned to the sidewalk tables of a nearby pizza joint, just off Broad Street. United only by a common interest in computer hardware, software, and techniques, this group comes from many corners of the Philadelphia social scene and goes back to those corners after the meeting. Whatever their background, computer geeks are uniformly good at math. But on this particular evening, the conversation turned to medical experiences.
A middle-aged man with a ponytail related that he had recently experienced Bell's Palsy, a paralysis of the muscles of one side of the face. The group was fascinated to hear how saliva dripped from a corner of his mouth, and his unclosable eye dried out and got sore. His doctor told him there was not much to do except wait, an opinion confirmed by a specialist. Immediately, this man who makes his living programming computers set out to find the best acupuncture person he could. It was a familiar story, and I remained as quiet as I could while he essentially related that when the regular medical profession confesses failure, the patient feels released to take matters into his own hands. His instincts were to do something, anything, even if that something was plainly futile. He could not bear the idea of doing nothing, and sure enough, in time he got better.
I said nothing because it was a familiar reaction to conditions that either would or would not get better by themselves. The more serious, studious members of our little club were quiet, too, because their instincts were to do what they were told; they wouldn't have acted the same way in his place, and didn't completely approve. In a moment, a large muscular man took up the medical subject by telling how he had spent two years in a hospital after falling down an 18-story elevator shaft. "Man, oh man, it seemed like it took two weeks as I was going down, and when I hit I wasn't knocked out." He had landed on one buttock and his leg was nearly wrenched off, but he remained awake, not bleeding much. Within minutes, he was headed for an operating room and reached out to grab the clipboard from the nurse. On it was written "amputation", which he circled, wrote "No amputation!!", then dated and signed it. No way were they going to cut off his leg. His leg was in fact saved; he now walks with scarcely a limp. Then another computer nerd chimed in.
This man's story was that he spent fifteen months in a hospital after a motorcycle accident. He somersaulted seven times through the air before he landed on his chest, and twenty years later he could still remember every single twist of all seven turns. He, too, related many hospital disputes about morphine injections and contemplated surgeries. Both men related dubious experiences with young interns and medical students, and numerous proposed remedies that had been rejected. All three of these medical veterans expressed violent hatred of HMOs, for reasons unspecified. I was quiet; no argument from me. This motorcyclist eventually had his vehicle repaired and proceeded to ride it for six more months until he sold the machine. So there.
We all had different thoughts about this, I suppose. I was lost in thought about why they had survived, and imagined that not being knocked unconscious meant that they had not hit their heads. Heavily muscled men like these were probably cushioned by their muscles; a skinny little bony nerd would have been much more smashed up. The negative side of that protection came out in hearing them both describe their heart problems, with bypass surgery and whatnot later in life. Neither man smoked, but I bet they both did at one time.
As we strolled home from the pizza joint, it occurred to someone that the national political conventions were on television that evening. The women were fighting with the blacks. The studious nerds were silent; the wild men merely grunted. That didn't seem like something to fight about, or even to comment on.
As I walked into the darkness, I wondered if such a conversation would seem normal in any other city in the world.
When money was tangible you had to guard it, now that it's mostly virtual you have to verify it. Hardly anybody can, and that's a problem.
When money and wealth were wampum, precious metals, and paper currency, these physical objects required physical protection. It was all a big nuisance, with six-guns on the belt, bank vaults, and appraisers of one sort or another. But now that wealth is merely a bookkeeping entry on someone's computer, things may be even more of a nuisance, because verification is almost beyond us. Counterfeiting of the computer variety must be left to institutions to detect or deflect, causing them to introduce firewalls of various sorts that also block legitimate inspection by customers. "Trust but verify" doesn't work so well in this environment. Let's use a personal example, slightly fictionalized to protect the innocent.
Several software products now exist to download transaction information automatically from various institutional sources to a customer's home computer; they are either free or cost a nominal amount, and are quite "user-friendly". In my case, however, the reports they generated were quite significantly at variance from the monthly reports which were issued directly by my counterparties. Dear Sirs, Please explain.
What I soon discovered was that everyone blamed someone else, and everyone blamed me for bothering them. Quite obviously, I had little understanding of these specialized accounting niceties, and quite obviously I had too much spare time on my hands. Telephone help desks, often located in India, will not give out telephone numbers for incoming calls and are programmed to check the size of your account before placing you in a call-back queue. The first call is usually taken by a trainee whose job it is to screen out the silliest sort of help request, and then to refer to a supervisor if things rise in complexity. Supervisors have supervisors. That's if you are lucky. More commonly, the tedious software business has been farmed out to a vendor, and the contracting agency has neither the necessary understanding of the issue nor any ability to fix it. From the sound of it, the vendor often gives the contracting agency the same sort of isolation treatment that they would give a customer if he could find their telephone number. And guess what. At the end of the day, one of those high-handed defensive linemen -- turns out to have been at fault.
Let's explain one problem. On the surface, we were talking about a $40,000 difference in account balances; one may have been correct, but a second one must have been wrong. That rises to lawsuit level, so the matter got intensive study. It turns out the stockbroker had misinterpreted instructions for a "sweep-account" system. When a stock in your portfolio pays a dividend, the amount of the dividend is subtracted from that stock's line item and added to the line item of your money-market fund. That's fine, but there is one exception. When the money market fund itself pays a dividend, subtracting that dividend cancels out the addition, and the dividend essentially disappears from your net worth. Was this intentional? Certainly not; no one could stay in business doing that. It's not even a highly stupid error, since you can easily see yourself making the same oversight of the one implicit exception to the rule of sweep accounting. Because this "bug" involved one institution making a mistake and transmitting it to a second institution, the systematic error did not unbalance any books until it reached mine. But since I did not detect the error for five months, there must be dozens, hundreds, maybe thousands of customers who did not detect it. Ouch. Do the math yourself to judge whether this was a serious error.
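For readers who think in code, the bookkeeping slip just described can be sketched in a few lines. This is only an illustration of the logic, not the broker's actual software; the account names and dollar amounts are hypothetical.

```python
# A minimal sketch of the sweep-account bug described above.
# Account names and amounts are hypothetical.

def post_dividend_buggy(accounts, payer, amount):
    # The general sweep rule, applied blindly: subtract the dividend
    # from the paying line item, add it to the money-market fund.
    accounts[payer] -= amount
    accounts["money_market"] += amount

def post_dividend_fixed(accounts, payer, amount):
    # The one implicit exception: when the money-market fund itself
    # pays the dividend, there is nothing to sweep -- just add it.
    if payer == "money_market":
        accounts["money_market"] += amount
    else:
        accounts[payer] -= amount
        accounts["money_market"] += amount

buggy = {"stock": 10_000.0, "money_market": 5_000.0}
fixed = {"stock": 10_000.0, "money_market": 5_000.0}

# The money-market fund pays a $50 dividend.
post_dividend_buggy(buggy, "money_market", 50.0)
post_dividend_fixed(fixed, "money_market", 50.0)

print(buggy["money_market"])  # 5000.0 -- the dividend has vanished
print(fixed["money_market"])  # 5050.0
```

Run on an account paying monthly money-market dividends, the buggy version silently loses one dividend per month, which is exactly why the error only surfaces when someone reconciles against the counterparty's own statement.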
This illustration, only one of several on my personal report, leads to at least two larger principles. The first is that the transformation of money from tangible to virtual has occurred so rapidly that bullet-proof safeguards have not had time to emerge. After a century of use, most people cannot balance their checkbooks, but enough people can balance them so that systematic errors are not likely to slip past. When enough people with home computers repeatedly test the internal complexities of their virtual money accounts, confidence will develop that the system is probably working. Confidence is an important matter; it is possible to imagine quite a bank panic if the public suddenly got the idea that virtual money is maybe a mere vapor. In fact, the securitized credit panic of 2007 is a little like that. With a few new regulations and a lot of computer programming it surely will be possible to know who owns how many bum mortgages. That innovative mortgage system got ahead of its tracking verification, and we now just have to hope nothing serious happens before that gets fixed.
The second important lesson is that our health insurance system has a similar problem of far greater size and complexity. We are here talking about at least ten percent of Gross Domestic Product, in which one daily unit of measurement is truckloads of insurance claims forms. Stocks and bonds are admittedly complicated, but compared with thousands of different diagnoses, drugs, procedures, and hospitals, verifying financial transactions is trivial. With a twenty billion dollar budget and ten years of lead time, we might have a shot at it. Except for the fact that during the ten-year interval, medical care will have changed so much that you will have to start over on the project.
|Skype on the iPhone|
Skype on the iPhone works exactly as you would expect.
The iPhone automatically detects all wireless hotspots in the vicinity (it does this with or without Skype installed), and it uses the wireless connection for all internet traffic while connected.
Start Skype and you can see who's online and have a conversation with them or call them off-net through Skype just as you do on your computer.
The only deficiency I can see is that you can't multi-task while Skyping; on a cell call you can switch to other applications, but doing so during a Skype call disconnects it.
The AT&T cell + data package seems to be less money than Verizon's and the iPhone beats the pants off a Blackberry.
I have broken free from the landline tether.
I had a land-line Verizon home phone number forever but I have canceled it.
That number had been call-forwarded to my cell phone
- About $60 a month.
So I now have a Skype-in number:
call that number and if I'm offline (most of the time) the call will forward to my cell phone @ $0.02/minute.
- Exactly $60 a year (for the number, plus charges for any calls which I expect will be very few.)
$720/year vs. $60/year. Duh.
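The "Duh" is easy to check. Here is the arithmetic, using the rates quoted above; the number of forwarded minutes per year is my own hypothetical figure, since actual usage will vary.

```python
# Cost comparison from the figures above: a call-forwarded land line
# vs. a Skype-in number forwarding to a cell phone.
landline_per_year = 60 * 12     # about $60/month for the land line
skypein_number = 60             # $60/year for the Skype-in number
forwarding_rate = 0.02          # $/minute when offline
forwarded_minutes = 300         # hypothetical yearly forwarded usage

skypein_per_year = skypein_number + forwarding_rate * forwarded_minutes

print(landline_per_year)   # 720
print(skypein_per_year)    # 66.0 -- even with 300 forwarded minutes
```

Even a generous allowance for forwarded calls barely dents the ten-to-one saving.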
Why have any number other than my cell phone at all? In my case, I need a local area code for the guard at the gate of my condo where the phone is blocked for all non-local calls. Skype also offers international Skype-in numbers so your mother in France can call you with a local number. Etc.
The iPhone is an option for international cell phone use but it can be expensive. Here are AT&T's recommendations to reduce this expense.
When using your service outside the U.S., Puerto Rico or U.S. Virgin Islands (for either voice or data), international roaming rates apply. Your iPhone provides access to email, Visual Voicemail, Web browsing and other applications that can use a significant amount of data, so remember-international data roaming can get expensive quickly.
How iPhone Users Can Minimize International Data Charges:
This concludes the general topic of computers, computing and digital devices. The more specific topic area of the Internet, websites and website programming can be reached by clicking on the title below. It's large, so wait a moment for it to come up:
Only a decade ago, the Quakertown exit of the Pennsylvania Turnpike made possible a quick trip from the city to the country, letting you off in the cornfields between Sumneytown and Lansdale. Today, the rush hour traffic is as bad as anywhere else, even on the four-lane express highway known as Forty Foot Road. A comfortable two-lane highway would be about forty feet wide, so presumably the name denotes what was, until quite recently, a modern miracle of a two-lane highway. It's all built up for miles, but almost all the commercial buildings are new. Exurban sprawl has positively lurched across the landscape, making prosperous people rich, and poor people prosperous. It won't be long before the housing subdivisions demand traffic signals to protect the school children, speed limits to reduce collisions by teenagers, and other things destined to bring high-speed travel to a crawl, all day long. When that happens, it won't be called farm country anymore.
|Alderfer Auction Company|
On Fairground Road, where occasionally corn is still growing, a number of large new commercial enterprises have located, among them a moving and storage company with ten or so truck loading platforms in the back. Behind that is another large new building, also with a parking lot for fifty or so cars, the auction house. Different categories come up for auction on different days, so used furniture, for example, comes up every few weeks and has to be stored as things accumulate for the big day. With a moment's thought, you can easily see why the auction is affiliated with or owned by a moving and storage company. As you go through the entrance, you are invited to sign up and identify how you plan to pay, just in case you buy something; the product of this registration is a card with a number in big colored letters. That's your number, your payment arrangement, and soon you will find no one cares anything about you except that number. The auction I was interested in was for used books, one of three or four auctions conducted in different rooms.
Nearly a hundred people had numbers for used books, maybe a similar number for antique furniture and paintings. Obviously, one other purpose of the registration process is to create a mailing list of customers interested in various objects, possibly linked to a program which sends out flyers and announcements. Country auctions have always been a source of local entertainment, so non-buying spectators are able to come and watch if they wish. There seemed to be few if any casual sight-seers; just about everybody is a buyer or a potential buyer. Players, as they say.
Most of the customers probably set their alarm clocks for 5 AM or earlier; the auction is centrally located, but most everybody comes from a considerable distance. At 9 AM, very promptly, the auction began, and from his manner, you could tell the auctioneer was anxious to get started. The objects for sale had been on display for a day, but most people arrived around 7 AM to examine the goods, which are frequently sold in lots, meaning a box full of thirty or forty books more or less on the same topic. At the stroke of nine, the auctioneer's chant began, "Do I have ten dollars, yeh, ten, ten, ten, five, five, ten, fifteen, twenty, twenty, sold for fifteen. Your number, sir?" Two assistants took down the customer number, the lot number, and the price; one of the two recorded the transaction in a computer, the other on a list by hand. One gathers the man without a computer was on the look-out for shills, people trying to bid up the price without getting stuck for a purchase. The auctioneer repeatedly assured the audience that no one but a real bidder was allowed to bid; if you bid, you owned it, and no excuses about being confused. When he reached the hundredth sale, he stopped for a drink of water, and proudly noted the first hundred sales took thirty-seven minutes. It required four other assistants to fish out the lots next in line, holding them up for confirmation only, since inspecting them at a distance was out of the question. After each sale, an assistant dumped the prize in the new owner's lap.
And yet one is entitled to wonder a little. The ordinary run of books thirty or forty years old will sell for between ten and twenty dollars. Books about golf, just about any old book about golf, "go" for about forty dollars. Children's books are about sixty dollars. And, to my great surprise, boxes or albums of old photographs go for over a hundred dollars. A lady next to me excitedly brought an album of old photos back to her seat and thumbed through them. "Are you a dealer?" Yes. "Who buys this stuff?" I don't know, they come to my store and just buy it. Like the auctioneer, she had a feeling for what the retail price would be, made a calculation, and knew what she could afford to pay wholesale. What the stuff actually represented, why people wanted it, what was a good one and what was a bad one -- these people in the trade had very little idea. But they knew very precisely what a fair price was; the auctioneer starts high and gradually lowers it until the lot sells. Fun? Lots of fun. When a familiar insider makes a mistake and pays too much, the others laugh heartily at him. Why this funny system works has long been a mystery, but everyone except a socialist readily acknowledges it does work. At least it works better than any known substitute.
Although the ritual of the country auction has been essentially unchanged for centuries, it is just another transaction system. In the past fifty years, the world economy has been transformed by computerized efficiencies in transaction systems, with vast prosperity resulting from small savings endlessly repeated. Banking and Wall Street have concentrated most of the standardized transactions, in perfectly astounding volume; lots and lots of people have become immensely rich for producing small efficiencies in high volume. Those of us who have not become immensely rich can easily identify trivial innovations which resulted in wealth, and we easily sense the unfairness of old photos being worth more than books of poetry. After all, the country auction is still grossly inefficient; the seller pays the auction company 20% of the price, and the buyer pays another 10%. There's 3% for the credit card company and 7% for the sales tax. Forty percent of this transaction is going to the middlemen, over and over and over again. The goal is to reduce transaction costs to the level of Wall Street, considerably less than one percent, which still buys lots of yachts for middlemen.
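The middlemen's forty percent works out as follows; the hammer price here is a hypothetical round number, and the percentages are the ones quoted above.

```python
# The middlemen's cut at the country auction, per the percentages above.
hammer_price = 100.00          # hypothetical sale price, in dollars

seller_commission = 0.20 * hammer_price   # 20% paid by the seller
buyer_premium     = 0.10 * hammer_price   # 10% paid by the buyer
card_fee          = 0.03 * hammer_price   # 3% to the credit card company
sales_tax         = 0.07 * hammer_price   # 7% sales tax

middleman_total = seller_commission + buyer_premium + card_fee + sales_tax
print(round(middleman_total))  # 40 -- forty cents of every dollar
```

Against Wall Street's fraction of one percent per transaction, the gap between the two systems is roughly a factor of fifty.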
As you walk out of the country auction, it doesn't take a mathematical genius to multiply thirty percent times the number of transactions, times a guess at the average sales price. No wonder these auction people are so cheerful, so much in love with their work. But two other parties are cheerful, too. That is, the buyer and the willing seller.
Since this book was written as a website, then printed as a book from the website, the reader is invited to take a look at both forms of communication, and judge the advantages both ways. On the front cover is printed the web address of the whole book, and what you read here in book form will flow seamlessly forth if you enter the address ("URL") in the box provided on every browser.
PAGINATION. So that's one difference already. The book comes out in single pages, while the website scrolls down the pages without stopping for individual page breaks. Mostly, that doesn't make any difference to the reader, but if pagination is desired, the Adobe company provides a way of converting into pages in a format they describe as ".pdf format", and then provides Adobe Reader as a program to read such paginated copy. Depending on the version, it is possible to make notations on the pages and send them onward to others, thus enabling a conference mode for committees and editors, etc. Extra-charge features include the ability to add material only, without erasing any, thus leaving a permanent trail for legal purposes. If you press the "print" button on this particular website, you will be offered the options of converting the material to .pdf format, or others depending on your computer, and printing from the .pdf on your computer's printer. Home printing is thus a two-step process.
LINKS (footnotes). If you look at the screen version of this book on your home computer, you will find many passages within sentences appear in blue type. This is a signal you are looking at a link, and links are not exclusively confined to blue color. You can detect linking by running the computer cursor over the screen, with the effect that other passages "light up" and change color in some way, a signal that quickly double-clicking on that spot will be taken as an order to open up another site on the web. In effect, you are able to look at footnotes which display the entire reference, meanwhile providing the ability to keep on linking in other directions if the footnote provides footnotes of itself.
LINKS (related topics). In the website but not the printed version, pages which seem to relate to the same topic are grouped together in the margins, for browsing purposes. We have provided these suggested relationships within the 2500 pages of the site; to go outside the site, the reader will have to go to a Search engine, such as Google or Yahoo, but that ability is part of almost every browser.
VIDEOS. At several pages in this book, an image includes an arrow within a circle. This is the indicator which YouTube employs, and is rapidly becoming an industry standard. If you are looking at the page on a computer screen, click on the arrow to cause the downloading of an audio-visual platform, and in a moment the video will appear. If your computer does not have iTunes or similar, it may be necessary to download a copy before this feature becomes operative.
MAPS and SATELLITE VIEWS. Like the YouTube features, it will be necessary first to download a copy of Google Earth before proceeding. If Google Earth is resident, you can click the small button on the first page of the website and be taken on a guided tour. Just about every page on this website is labeled with its GPS markings, so you can read articles about topics which are geographically located near the one you started with.
Roaming around these features with this book as a starting point, you can quickly gather a general idea of the power of the computer and surmise how much more power is going to be available in a year or two. Too complicated? Naw, just a little complicated. The thing to worry about is how addictive it tends to be.
My father-in-law, a prominent obstetrician in Binghamton, New York, regularly took his family to New York City sometime between Thanksgiving and Christmas. The three-day junket was described as a visit to do Christmas shopping. Another relative made similar trips from home in Tyler, Texas. Several of my patients made such visits to Philadelphia from their homes in West Virginia, stopping by to make a medical visit to me during the same trip. From the seasonal crowds in Penn Station and in the shops on Chestnut Street, it was clear that an annual visit to the big city was a common custom in the upper crust of small to medium-sized cities, for whom the more expensive shops of the bigger city provided big-ticket items bought infrequently, and the distinctive luxuries which made them stand out from the socially less-enlightened back home.
These shopping visits were not confined to purchasing, although that was the main focus. It was a time to go to the theater, orchestra and opera, maybe an occasional ballet and art exhibit. The choice of a large city might be related to returning to the University, or another period of professional training for a drop-in visit because these associations made it possible to observe the latest trends and innovations, a useful issue in the smaller towns. The ladies could observe the trends in fashions, and everyone would have a chance to dress up in the better hotels and restaurants. This recirculation between the small towns and the big one at the hub unified the region, establishing hierarchy rather widely. And it hardened traditions in the big city, since islanders tend to return to the same hotel, restaurants and social gathering spots even more than the local residents do; there isn't time in a brief visit to shop for new venues, unless the trip itself reveals that times and places have somehow changed in an important way.
|Wannamakers Pipe Organ|
That's all changed, today. The pipe organ at Wannamakers, the cluster of department stores around Eighth and Market, the theater district, Caldwell's and Bailey Banks and Biddle upscale jewelers, the fancy women's clothing shops on Walnut and Chestnut Streets, and the bespoke men's tailor shops -- all have disappeared into a slough of retail despond. The excited crowds of upscale shoppers have dwindled, at least in the center city shopping area. Students of sociology point to the decline of the department store as central to this phenomenon, blaming that in turn on the spread of national brand names by television and more electronic forms of advertising. The department store did your comparison shopping for you, putting its brand name on the product and placing its reputation behind the choice. If Wanamaker could determine that a Japanese radio was of high quality, it became Wanamaker's radio. Today, Samsung and Sony do their own advertising and sell their products through outlets in the suburban malls. Philadelphia residents enjoy as much retail choice and pricing as ever; they just shop in the malls located along the Interstate circumferential highway, just outside what used to be the outermost suburbs. So the volume of retail shopping among Philadelphians probably hasn't changed a great deal; it has merely shifted to the malls, where there is parking for your car to transport goods which the department stores used to deliver. It is the annual visits from the subordinate small cities at moderate distance that have disappeared. Small cities now have their own shopping malls, carrying national brand-name merchandise. Losing this source of business, the associated entertainment industry has declined to a point below sustainability for most of them -- in the center city hub. Along with the disappearance of this regional recirculation, small cities have lost their sense of affiliation with a bigger one.
The small-town professional class which was the biggest participant in this annual migration is professionally more isolated but so are their clients. The upper crust of the small town now must constrain its horizon to the smaller town professionals, with their lesser claim to distinction. For a while, the disparity can be overcome by specialization, but ultimately the distinction of the big-city specialist rests on assembling a richer experience from wider drawing power. In Medicine at least, the insistence of Medicare on paying the same fee for the same service lessens the economic incentive for self-repair of the system.
|Wannamaker's Christmas Light Show|
Meanwhile, the nature of Christmas itself is becoming standardized. Fewer people make their own Christmas presents, whether through knitting or baking. If the process of commercial gifts goes the full distance, eventually the joy of searching for exactly the right gift will seem more like paying your taxes. These things already cost too much, are worth too little, and neither the process of giving a gift nor the process of receiving a welcome gift will retain much joy. Or significance. What was until recently a Christian religious celebration has become diversified into a generic "Happy Holiday", presumably in order to avoid offense to other religious groups who themselves likely persist in their old traditions of ritual greeting. The assault on Christmas is however not primarily cultural, but commercial. The silent ostentation of elaborate outdoor lighting and the secular versions of Christmas carols endlessly replayed over loudspeakers in stores, probably have a more destructive effect on the community winter solstice ceremony than any competition for religious adherence.
The coming next step in the modification of the Christmas season is dimly visible in the assault of pocket telephones on the suburban shopping mall. After enjoying only a few years of victory over the center city department store, malls must now confront shoppers with portable telephones containing a camera and GPS geographical locator. Seeing something he likes, the shopper of the future can photograph its bar code in the shop display, and be immediately told of all the neighboring stores which sell the same product for a lower price. Electronics are thus about to turn Christmas shopping into an electronic auction, no doubt making it eventually easier to do the shopping from home.
What Christmastime means to me is a recollection of what it once was like at the nation's oldest hospital, and not so terribly long ago, at that. Before 1965, the Pennsylvania Hospital had been staffed for centuries with unpaid student nurses, working under the direction of unpaid doctors in training, supervised by volunteer attending physicians. Of the five hundred beds, only forty were filled with paying patients, and the rest were housed in long communal halls. On Christmas morning at 7 AM, the drowsy patients were astonished to be awakened by a procession of very pretty student nurses, led by Miss McClellan the grim-looking Directress of Nursing, and followed by a handful of internes and residents, all singing Christmas carols and carrying lighted candles in the dark. Miss McClellan herself was never heard to utter a note, but the student nurses had been trained in four-part harmony, and the interne doctors were enthusiastic followers. The faces of the poor old indigents in the beds were filled with pure delight as we traipsed past, chanting of the travels of Orient kings, the pregnancy of virgins, and other miracles of the occasion.
Dealing with a topic as complicated as the causes of the 2007 financial crisis, it's quite possible for two viewpoints to be entirely in agreement, until abruptly coming to different conclusions. In this paper, we consider the relative merits of blaming government housing subsidies in various forms, relative to blaming the unanticipated effects of the computer revolution. The subsidy argument has just been succinctly and effectively argued by a lawyer, Peter J. Wallison. Agreeing with every word he writes, I nevertheless hold the perspective that the disruptive effects of the computer revolution were equally responsible, if not more so. Politics versus technology, choose your poison.
Mr. Wallison served as a lawyer in the financial loins of Washington, and thus has the perspective of a Reaganite who sees government as the main problem; with the significant distinction that his proposals for solution also lie in government corrective action, particularly "covered bonds" and step-wise privatization of the Federal Housing Authority (FHA). While agreeing with both reform proposals, my concern here is about too little general recognition in the analysis of how vulnerable the banking system has become, to revolutions made possible by even primitive computers of the 1960s. Such revolutions soon grew many times magnified by the inexpensive high-speed internet. If that analysis is correct, it predicts mere legislative action for the housing industry will prove inadequate; banking has taken a radical new direction.
Mr. Wallison's argument in the January 3, 2011 edition of the Wall Street Journal is admirably succinct. He points out the New Deal Federal Reserve deliberately suppressed interest rates to the benefit of the housing industry, but made a significant exception for the Savings and Loans. (It was forced to abandon that approach by the innovation of money market funds, in turn made feasible by the widespread adoption of the IBM 360 computer.) When the Savings and Loans collapsed, that segment of the market was awarded to the GSEs (Fannie Mae and Freddie Mac, insured by FHA). In 1992, Congress imposed the goal of promoting "affordable housing" on the GSEs, which is to say the subsidization of "subprime" (i.e. high risk) mortgages. By 2007, half of all mortgages were subprime, and by September 7, 2008, Fan and Fred were insolvent, effectively replaced by the Federal Reserve (i.e. the taxpayers) as the final guarantor against national insolvency. It will take a decade to restore the economy from its present setback, but Mr. Wallison's proposals do indeed have some chance of eventually leading to a viable economy. He proposes the threshold for "jumbo" mortgages be reduced by $50,000 every six months until mortgages are effectively privatized. And he also suggests we create a pool of mortgage assets as security for a bond issue, thus privatizing existing mortgages in the way Europeans describe as a "covered bond" system. Go ahead, do it; it might work, and nothing else is on offer.
|IBM 360 Computer|
Meanwhile take a look at banks; we seemingly can't get along without them. But other institutions are undermining them, with cheaper products made possible by computers. For two centuries, banks transformed short-term borrowing into long-term loans; no one else could do it. It's a simple idea, and it works: a constant or even rising pool level can be maintained by a steady inflow of short-term deposits. But it is risky; the risk is that some event will precipitate a sudden rush of withdrawals, a run on the bank. Sooner or later, the law of averages catches up. The risk is real; it happens every few years. A price in the form of interest must be imposed to maintain reserves against occasional bank runs, and collectively the whole nation must maintain a central "bank", charging interest to maintain reserves against simultaneous runs on multiple banks. No device for protecting a nation against a universal bank panic has ever been created which is as effective as placing the risk in the hands of private bankers, who can expect to be stripped and shorn if things get out of control. Robert Morris demonstrated this point in 1779, and the nation seemingly must re-learn it every few decades. The IBM 360 computer made it possible to transform short-term into long-term in greater volume and at lower cost by allowing banks to get bigger; but it could also perform the short-long transformation in cheaper ways than depository banks do, and from there the bank-competitive process we know as securitization has gone on to commercial credits, auto loans, credit cards, high-velocity stock trading, and mortgage-backed securities. These approaches are often cheaper and more convenient than the trusty old banking system, and Credit Default Swaps show its power isn't exhausted; any legislation to prohibit CDS is sure to be circumvented. Insurance is also on the edge of being threatened.
An industrial revolution of this magnitude takes decades of tweaks to become stabilized, but it will suffice for now, if we can establish reasonable protections against the risk shifted into the securitization or investment banking arena. As risk shifts, remuneration for accepting risk must shift as well. This new system for generating capital must not be starved because depository bankers resist the loss of their share of profitability; politics will have much to answer for if that happens.
Most likely, the main obstacles to getting this system fixed will come from overseas. Fifty years of disillusionment with the United Nations will make nations, the United States chief among them, resist loss of sovereignty in something so vital as finance. But that's for the future. For nearly a century, the past has been disrupted by idle notions of the fairness of coerced redistribution, in ways Mr. Wallison has succinctly described. But meanwhile we almost willfully ignore technological upheavals which everyone welcomed but no one fully anticipated.
|Late Hour Calls|
My fancy new cell phone has an annoying habit of ringing a bell every time an e-mail arrives, which is a little puzzling when it rings in the middle of the night. The email program displays time of arrival, so after a while, I took the trouble to see who was emailing me at 4 AM. It seems to be spam and other commercial programs, but it is also an occasional letter with a large attachment, which had been sent several hours earlier. At this, a light began to go on in my head.
I had been told the internet measures the size of files and puts big ones at the end of the queue. That seemed to explain the occasionally delayed transmission of ultra-large emails at times of heavy internet traffic. And it brings up the issue of net neutrality. If the traffic in large files grows enough, it might eventually clog the wires and bring things to a halt. The internet providers would have to spend money to build additional capacity, and it only seems fair to charge big users more for the costs they have created. That would seem a reasonable technological argument for allowing the networks to impose differential pricing, and for overturning the idea of net neutrality.
|Comcast and NBC|
Unfortunately, it might or might not be a sincere argument for resisting net neutrality, since there are major commercial issues at stake as well. For example, Comcast is trying to purchase NBC; its motives are clarified by remembering that a few years ago it tried to purchase Walt Disney. In both cases, a common carrier would be acquiring a "content provider", and thus acquiring a competitive advantage over competitive internet network providers who lack a captive source of content. A strong temptation would exist to slant the internet charges to the disadvantage of other competitors, thus providing a motive to get involved in insincere arguments about net neutrality. What we seem to have here is a familiar antitrust legal doctrine of "vertical integration". For years, vertical integration was prohibited, but the U.S. Supreme Court reversed that prohibition a few years ago, in the case of State Oil v. Kahn. Lewis van Dusen and I had been in the audience of the State Oil arguments, because of our interest in the implications of vertical integration for the medical profession (doctors versus hospitals, for example).
Although the example of Curtis Publishing was not introduced into the arguments of State Oil versus Kahn, it was much in my mind and might well have been used effectively to demonstrate the vulnerability of any corporation which attempts to become vertically integrated by purchasing its suppliers and/or distributors. Curtis Publishing, a few blocks from my office, had been a successful magazine publisher, so successful that it had enough profits to buy Canadian forests to use for paper pulp in its magazines. The outcome was the bankruptcy of the profitable magazine company when the paper pulp business fell on hard times. No antitrust action to prohibit vertical integration was necessary; the dismal fate of Curtis and similar integrators stood as an effective restraint on anyone else who was tempted to get into the vertical integration business. That may be a little hard to follow, and it took the Supreme Court many years to get to that point. But the fact remains that vertical integration is no longer illegal because it is effectively restrained by recognition of its dangers.
So, if we are getting into the insincere argument business, it is time for someone to put his arm around the shoulders of Comcast. Let's whisper that avoidance of the net neutrality dispute is kindly advice, offered solely for Comcast's own good.
And, having gone this far in poking into other people's business, there might be some value in giving some advice to the antitrust lawyers. This sort of case can take years, even decades, to evolve through the legal system. And while its resolution will be phrased in legal terms, I'm not so sure that's sincere, either. It takes me back to the IBM case, where one of the junior lawyers was courting one of my daughters. This young fellow sat for months in front of a microphone at a deposition, doing nothing but read documents into the record. Although he was handsomely paid, the lawyer finally got so sick of the boring futility of dictating a mountain of transcript no one would ever read, into a microphone in an empty room, that he quit. And in the opinion of observers on the courthouse steps, the case was finally determined by the Judge's decision that the patent infringement business was trivial compared with the fact that IBM was mass-producing the greatest innovation of the century -- and the patent-infringement people were just getting in the road.
That may or may not have been the case, but it raises the question of whether antitrust law is wisely based when it considers not the welfare of competitors, but the strength and vitality of competition itself. What might thus be considered paramount, and perhaps occasionally is so, is the economic welfare of the nation. At present, the newspapers regard this issue as a fight between Netflix and Comcast, and so feel free to devote their attention to other matters. I don't see it that way. I believe it directly challenges the operation of the Law, which contends that vertical integration eventually takes care of itself. To me, that is only true if circumstances give us enough time to wait it out. In the long run, as Maynard Keynes quipped, we are all dead.
|Get the Lead Out|
At a local outlet of a well-known chain of computer stores, the geek told me that small computer towers don't last as long as big-box desktops, perhaps only three years compared with the old five-year lifespan. And that's because they get hotter. Which is because they run faster than they used to, and also because a federal regulation prohibiting the use of lead in soldering joints makes the wiring wear out sooner. By the time he was done explaining things to me, I was ready to run out and join the local political Tea Party. Because I don't think it's very likely that toddler children will be eating my solder very soon, or even ever. And indeed, I have trouble imagining any children anywhere in the world ever nibbling on computer innards, even once. Maybe the concern is that the heat will vaporize the lead, and little children crawling on the floor will inhale the lead vapor, getting lead poisoning that way. While that may be somewhat more plausible than eating computer parts, or eating vegetables grown in the neighborhood of trash disposal, or breathing the air full of lead fumes -- it doesn't really seem very plausible at all.
It is generally reckoned that 835 million computers worldwide were manufactured in 2010. If they cost an average of $500 apiece and lasted 40% less long than if they used lead solder, the world would end up buying 300 million additional computers per year, conservatively spending $150 billion more a year to do so. Are the dangers of lead poisoning so threatening that such a cost is justified on a hypothetical basis? The people who do the soldering are possibly at somewhat greater risk, but you could buy a lot of masks and air purifiers for a fraction of what the extra computers alone would cost. Can this possibly be true?
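The back-of-envelope arithmetic above can be checked in a few lines. This is only an illustration using the essay's own figures (835 million units per year, an assumed $500 average price, and a lifespan falling from five years to three); under this crude steady-state replacement model, a 300-million-unit estimate is if anything conservative:

```python
# Back-of-envelope check of the lead-free-solder cost claim.
# All inputs are the essay's own figures; this is an illustration, not data.

units_per_year = 835_000_000   # computers manufactured worldwide, 2010
avg_price = 500                # assumed average price, dollars
old_life, new_life = 5, 3      # years: lead-solder lifespan vs. hotter lead-free

# If the installed base must be replaced every `life` years, shortening the
# lifespan multiplies the steady-state replacement rate by old_life/new_life.
extra_units = units_per_year * (old_life / new_life - 1)
extra_cost = extra_units * avg_price

print(f"extra units/year: {extra_units / 1e6:.0f} million")    # ~557 million
print(f"extra spending/year: ${extra_cost / 1e9:.0f} billion")  # ~$278 billion
```

The model ignores growth in the installed base, which would only raise the figure further.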
Is it possible that the geek in the computer store is just selling warranty insurance, or more expensive computers, when he passes on this news? Is it possible that the makers of fume ventilators are promoting their products in this way? How about the plaintiff trial lawyers? Are they calculating that frenzied citizens will wander into jury duty, eager to punish the evil makers of computers with gigantic penalties, of which the lawyers will get 40%? Or are the makers of cool computer boxes competing indirectly with the evil makers of hot ones?
This article ends with a comment section. Those who can offer references to the facts in this case are urged to send them in. Something in this story doesn't stand the light of day, and perhaps a way can be found to shine a little of that light on the facts.
Here's some advice for new authors of books: You can't write the first chapter until you have written the last chapter. That is, you have to hit the reader between the eyes in the first chapter, draw him into the argument, making a pauseless transition from a general statement of the author's thesis into a relentless march of evidence toward the conclusion. This general design comes easier with practice and is therefore much harder for beginners to accomplish. But even experienced authors are usually unable to keep the overall design of their message constantly in mind, to be able to sit down and write the book straight through from beginning to end. It's true that Sir Walter Scott was said to turn over the last page of a book, and immediately begin writing the first page of the next one without getting up from his desk. But we aren't talking about pot-boilers, we're talking about serious books. That includes almost all non-fiction and the great majority of serious fiction.
As a matter of fact, the description includes the majority of short articles as well; newspaper editorials would be a good example. Although the style of an editorial is to start with a generality, marshal a description of some recent events, and end up with a short summary, that isn't in fact how it is usually composed. The editorial writer starts with a one-liner, or call to action, organizes some recent events and some historical arguments as a reason to issue such a call, and then ends up by summarizing things in the first paragraph. Having mentally designed the editorial into such a three-step pattern, with experience a professional editorialist can sit down and write the editorial from beginning to end and, after a few touch-ups, it's ready for the printer. He really has gone through the organizational process which a book author needs to go through, but the article is short enough that the reconstruction is performed in his head. In a book, it is generally necessary to write out the chapters in a jumbled way, and later re-organize them. A new author with his first manuscript generally doesn't adequately appreciate this, and has to be muscled by the editor, at least just a little bit. One of my editors summarized his job as follows: you tell every new author to take the first four chapters of his book and throw them away.
That's cruel, of course, and is seldom accepted graciously. The brusqueness is justified by understanding that the fresh new author thinks he's all finished when he isn't. He's silently telling himself he means to tell that editor, "Don't you touch a single comma of it." In the old days, authors were rare and had to be coddled. Book publishers in the Eighteenth century purchased the manuscript in its entirety, either then losing money or making a huge fortune, but leaving the author with only his manuscript price. At that time, publishers called themselves booksellers. As things evolved, booksellers often had to support a starving author while the book was being written, offering an "advance" payment to be deducted from royalties paid after final sales to readers. Author royalties were about ten percent of sales. The royalty system persists today, but advance payments are uncommon and negotiated around the tax code effects. All of these payment evolutions reflect the underlying issue: good authors used to be rare, but now are frequent. Book publishers used to be wealthy, but now are rapidly going bankrupt and extinct. Authors of excellent books have a hard time finding someone to publish them. The advent of the personal computer around 1980 is what caused this.
In 1980 I published a book, writing it on my brand-new Radio Shack TRS-80, Mod I. The editor of the publishing house had never heard of such a notion, scoffed at it, and declared he would never touch such a thing as a computer. In 2010 more than 800 million personal computers were manufactured and sold, and by this time almost no publisher will accept a manuscript without an accompanying magnetic disc, which makes revisions cheap and easy and shifts the cost of key-entry from the publisher to the author. At first, manuscripts were shipped to India for key entry. Now, it is the diskette which is mailed or e-mailed to India, and the editor is often located in India. We are fast approaching a day when the keyboard and author remain at home, sending material to gigantic server computers in China, from which an editor anywhere in the world can retrieve the material and revise it, returning it to the server where the author can comment on the revisions without moving from his desk. After that stage looms the prospect of the reader paying a fee on his credit card to access the "book" directly from the server, and reading it at home. At that point maybe the book will have been completed, and maybe it will be revised some more. In a sense, a book will never be definitively finished, and allowing the public to read it will only be an episode within an unending process of revision. Newspapers, magazines, and books are all struggling to find a way to cope with this unpredictable evolution. Like most revolutions, this one doesn't have a clear idea where it is going.
So let's reflect back on the central process of authoring. In addition to the old maxims of the trade, there is the Euclidean reality that you can't write the last chapter until you somehow write a first one. The original first chapter, the one the author struggled so hard to compose, is destined to be cast off and replaced by a new one, one that succinctly announces what is about to be said. After that must come a new second chapter, which takes the reader from the initial disconcerting summary back to the origins of the problem now about to be clarified. Then follows a third chapter, probably one shifted forward from the assorted chapters of evidence into prominence as the key piece of evidence which leads to the other confirmatory pieces; after that marches the parade of confirmatory evidence, ending with one zinger of a conclusion.
Voltaire or some other cynic would probably comment that what has here been outlined is merely an elaborated process of what editorial writers do: start with the conclusion and find slanted facts to fit it. Some may indeed do that. But there is some hope that the inevitable impact of technology on authorship can bring us to a system where many authors will assemble the facts, and only then derive a conclusion from them. If politicians would only adopt that system, maybe we could hope for a perfect world.
|Modern Print Press|
There are many more authors than publishers of books. Since almost every school child owns and uses a home computer, this disparity might be even greater except for a technical barrier between home computers and high-speed printing presses in the way they treat illustrations. A presently insurmountable mismatch arose when commercial printing presses migrated from movable type (i.e. Gutenberg style) to page-images, whereas the computer industry aimed for cheap printers which perfected the Gutenberg method instead of replacing it. Commercial printers need to produce high-volume output inexpensively, while cheap computer printers produce low-volume output and tolerate a rather high unit price. Most of the barriers between the two have been overcome, except for photos and other detailed images. Let's give a simplified explanation.
Since 1993, when Adobe invented the method, commercial printers have generally worked from what amounts to a photograph of each page, called a PDF or "Portable Document Format". To some extent, portions of a page can be stitched together like a patchwork quilt, but of course all the pieces must be uniform in their technology. The industry standard is that everything is printed at 300 dots per inch. That's essentially 300 pixels per inch. The establishment of this standard made it possible for huge high-speed presses to produce hundreds of pages of newsprint a minute on machines which cost millions of dollars apiece. Commercial printing during the first half of the Twentieth century accepted the massive cost of the printing machine in order to promote production speed. Home computer printers sacrificed production speed in order to become cheap; their profitability comes from the ink, not the printer.
Low-volume desktop printers can afford to take the time to examine each character or image as it comes along and readjust appropriately. Essentially, computer printers do individual typesetting every time a new page is printed. They are thus able to exploit considerable compression, for storage or for speedy electronic transmission to a remote printing location. For them, 72 dots per inch are sufficient, since computer-driven printers have acquired the facility to guess at the gaps between dots (dithering) well enough to fool the eye of the reader. Since the same is true of display monitors, there is resistance in the computer industry to sacrificing the interests of the multitude to the needs of those comparatively few authors and publishers who use the mass-printing industry to keep their unit costs low. Computer printing squashes thousands of pixels down to 72 per inch.
But it's hard to convert photo images back from 72 to 300 dpi, since the dithering trick won't stretch that far. A good illustration is the washing of wool socks. You can throw argyle socks in a washing machine and they will shrink to the size of baby booties, but they won't stretch back up if you decide to wear them. The usual expedient is to enlarge a small picture and take a second picture of it, then enlarge the enlargement, and so on. Alternatively, the Genuine Fractals program by the Altamira Group comes closer to achieving the desired result, but even it has limits. When someone has more than a very few pictures that need stretching between Internet screen display and commercial printing, the current best advice is to store two different-density copies of the same image, and use each as required.
A slight re-design of workflow is advised, to create still a third version of the image for purposes of storage. The biggest possible image, with the most pixels possible, should be stored on the author's or publisher's own computer. A second, shriveled 72-dpi image is sent to be stored on the host computer of a website, and can be used for desktop printing as well. When commercial printing happens to be desired, the much larger stored image can be shriveled to 300 dpi by conversion of the page to PDF format, or else (for editing) this third version of the image is incorporated into Office Word, subsequently re-incorporated back into a PDF file for final printing. The PDF conversion is quick and simple, once a decision is made as to what the final product should look like after it is printed. Fine art display or other highly demanding graphics will follow this general outline as well. Those who have any aspiration for high-density output would be well advised to go back to the original photograph. In 2008, that translates into another maxim for the photographer who takes the original picture: always take all photos in RAW format, with a view toward later flexibility of use. Less demanding output can be produced from degraded copies of the RAW original, the most common of which is now the so-called JPEG format. Those who discard an original RAW image are almost always sorry, and those who buy cheaper cameras that go straight to JPEG are just asking for frustration. Now that memory chips have become cheap, the extra technology to generate the RAW image does not greatly increase the cost of a camera, but it greatly enhances the owner's ability to retouch images without their breaking up under the repeated dithering and re-dithering usually required for retouching. A number of steps in the photography process could be eliminated if the user would accept some new step resembling retouching as part of every snapshot.
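The mismatch the last few paragraphs describe is ultimately pixel arithmetic: a press wants 300 dots per inch, a screen only 72, so the press copy needs roughly seventeen times as many pixels for the same physical size. A minimal sketch (the 4x6-inch illustration size is a hypothetical example; the 300-dpi and 72-dpi figures are from the text):

```python
# Pixel budgets for the multiple-copy workflow described above.

def pixels_needed(width_in, height_in, dpi):
    """Pixel dimensions required to print at the given physical size and resolution."""
    return round(width_in * dpi), round(height_in * dpi)

# A hypothetical 4x6-inch illustration:
print(pixels_needed(4, 6, 300))  # press copy   -> (1200, 1800)
print(pixels_needed(4, 6, 72))   # screen copy  -> (288, 432)
```

Going from the second figure back up to the first is the stretch that dithering cannot honestly supply, which is why the original, highest-pixel copy is the one worth keeping.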
Moore's Law is named after Gordon Moore, who pointed out that computer chips seemed to double their transistor count, and hence their effective speed, every couple of years, an important issue affecting the cost of computers, the heat they give off while operating, and so on. In fifty years, there have been enough doublings of speed to make it often irrelevant whether they get any faster for the job they are intended to do. No one really cares whether a blink of an eye gets any quicker. The question is beginning to arise whether it makes any practical difference if the trading of stocks gets faster.
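The cumulative effect of those doublings is easy to underestimate; a two-line sketch, assuming the conventional two-year doubling period over the essay's fifty-year span:

```python
# Fifty years of Moore's-Law doublings, at one doubling every two years.
years, doubling_period = 50, 2
speedup = 2 ** (years // doubling_period)
print(speedup)  # 2**25 = 33554432, i.e. a roughly 33-million-fold speedup
```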
At the moment, it's widely quoted that 70% of stock trades on major exchanges are now conducted between two unattended computers; it won't be long before 100% are. All the inefficiencies of trading pits, with shouting and shoving, winking and maybe some front-running, will vanish into the humming of progressively smaller electric machines. Kinks will appear, then get ironed out, perhaps after another Long Term Capital episode or two. The cost of trading will become vanishingly small, the business essentially a chess game between mathematics wizards, and volatility will smooth out. So far as we can see, that's the end of the line, beyond which a computer speed sixteen times as fast serves no extra purpose.
But John Bogle, who invented the index fund and grew Vanguard to several trillion in assets, amused himself recently in front of a bedazzled audience. Several large funds, maybe even a lot of them, are taking what looks like a static mass of sleeping stocks and trading them internally at a rate of thousands per second, hoping to make a tiny fraction of a penny per trade individually, and a whole lot of profit for the managers in aggregate, while giving the appearance of standing still. Is this a good thing or a bad thing? Hard to say. John Bogle seems to imply it might even be a bad thing. Whatever it is, it is not the end of the line.
For completeness, look at the opposite end of this spectrum. Once more, it is John Bogle who points out that the return on a stock can be divided into its earnings, its dividends, and its speculative volatility. Total returns for the past century have averaged 9.1%, but if you strip off the effect of price-to-earnings variance, you have an investment return of 8.8%, essentially the same in the eyes of normal people. The way an investor can strip away the P/E volatility is to buy the whole company: when you own the whole company, public opinion stops influencing the price. Holding companies can do that, as can private equity funds, and even Warren Buffett. If you are playing this game, all you need is a big-enough holding company with honest management, or at least one independent method for estimating a fair price. If you are a value investor like Warren, buying the company for a P/E ratio well below 12.5 and holding it forever, you ought to achieve a return which significantly exceeds 8.8%. If you are an investment bank on Wall Street, you may buy the stock cheap, fix it up, and sell it rather soon for a much higher P/E ratio. Either way, there are transaction costs and taxes only twice: when you buy and when you sell. An investment company can do all kinds of things, but an individual investor should know enough to confine his buying (of shares of these intermediaries) to a youthful stage of life, and his selling to his retirement years. It cannot be claimed the quirks have been completely worked out, but it's a start. Come back in a few years and see what has been added to this idea to make it air-tight.
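Bogle's decomposition, as paraphrased above, amounts to a one-line formula: the investment return is dividend yield plus earnings growth, and whatever remains of the total return is the speculative part. A minimal sketch; only the 9.1% and 8.8% figures come from the text, and the 4.4%/4.4% split of the investment return is an illustrative assumption:

```python
# Bogle's decomposition of stock returns, as described in the text.

def speculative_return(total_return, dividend_yield, earnings_growth):
    """Residual return attributable to P/E expansion or contraction (percent)."""
    return total_return - (dividend_yield + earnings_growth)

# 9.1% total return over the past century; an assumed 4.4 + 4.4 = 8.8%
# investment return (the essay's figure) leaves the speculative residue.
spec = speculative_return(9.1, dividend_yield=4.4, earnings_growth=4.4)
print(f"speculative return: {spec:.1f}% per year")  # about 0.3% per year
```

Buying the whole company, in this framing, simply sets the speculative term to zero and leaves the 8.8% investment return.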
|Dr. Russel Kaufman|
The Right Angle Club was recently honored by hosting a speech by Dr. Russel Kaufman, the CEO of the Wistar Institute. Dr. Russel is a charming person, accustomed to talking on Public Broadcasting. But Russel with one "L"? How come? Well, sez Dr. Kaufman, that was my idea. "When I was a child, I asked my parents whether the word was pronounced any differently with one or two "Ls", and the answer was, No. So if I lived to a ripe old age, just think how much time and effort would be wasted by using that second "L". In eighty years, I might spend a whole week putting useless "Ls" on the end of Russel. I pestered my parents about it to the point where they just gave up and let me change my name". That's the kind of guy he is.
|The Wistar Institute|
The Wistar Institute is surrounded by the University of Pennsylvania, but officially has nothing to do with it. It owns its own land and buildings, has its own trustees and endowment, and goes its own academic way. That isn't the way you hear it from numerous Penn people, but since it was so stated publicly by its CEO, that has to be taken as the last word. It's going to be an important fact pretty soon, since the Wistar Institute is about to embark on a major fund-raising campaign, designed to increase the number of laboratories from thirty to fifty. The Wistar performs basic research into the scientific underpinnings of medicine, often making discoveries which lead to medical advances, but usually not engaging in direct clinical research itself. This is a very appealing approach for the many drug manufacturers in the Philadelphia region, since there can be many squabbles and changes about patents and copyrights when commercial applications make an appearance. All of that can be minimized when fundamental research and applied research are undertaken sequentially. Philadelphia ought to remember better than it does that it once lost the whole computer industry when the computer inventors and the institutions which supported them got into a hopeless tangle over who had the rights to what. The results in that historic case visibly annoyed the judge, who felt the patent infringement industry was interfering with the manufacture of the greatest invention of the Twentieth century.
Patents are a tricky issue, particularly since the medical profession has traditionally been violently opposed to allowing physicians to patent their discoveries; for that matter, Dr. Benjamin Franklin never patented any of his many famous inventions. But the University of Wisconsin set things in a new direction with the patenting of Vitamin D, leading to a major funding stream for additional University of Wisconsin research. Ways can indeed be devised to address the various ethical issues involved, since "grub-staking" is an ancient and honorable American tradition, one which has rescued other, far rougher industries from debilitating quarrels over intellectual property. You can easily see why the Wistar Institute badly needs a charming leader like Russel, to mediate the forward progress of our most important local activity. From these efforts in the past have emerged the rabies and measles vaccines, and the fundamental progress which made the polio vaccine possible.
It was a great relief to have it explained that there is essentially no difference at all between Wisters with an "E" and Wistars with an "A". There were two brothers who got tired of the constant confusion between them, you see, and agreed to spell their names differently. When the Wistar Institute gathered a couple of hundred members of the family for a dinner, the grande dame of the family declared in a menacing way that there is no difference in how they are pronounced, either. It's Wister, folks, no matter how it is spelled. Since not a soul at the dinner dared to challenge her, that's the way it's always going to be.
1.-2. Save the cost of publishing long bibliographies in every copy of a book whose readers mostly make no use of the bibliography, while still making the bibliography available to those who will use it. The Economist magazine now does this in the form of notifying the reader that the source documents for their articles are available on the Economist web site. This is a suitable methodology for publications with only limited bibliographies, but very large circulation and short shelf life. Essentially, it is a free service to readers which reduces the clutter and intimidation of citations to essentially unavailable sources.
On the other hand, a recent book about Thomas Jefferson had 120 pages of bibliography citations. Unless that book has an unusually scholarly readership, most of the cost of printing and distributing those 120 pages was wasted. Printing the book without the citations, but also publishing a diskette, Kindle edition, or website containing nothing but citations, would produce considerable savings for the publisher and reader, and they ought to be willing to pay for it. Unfortunately, there is resistance to anything new, and you may have to do it both ways until the idea catches on. The cost should include the right to some recognizable mark on the book, signifying that this feature is available. Bowker and advertisers should be encouraged to use something smaller but similar. In that way, the concept can be advertised in advance of actual market penetration.
Some thought should be given to making use of the searchability of such a bibliography. At negligible cost, it can be re-sorted and listed in a wide variety of orders (author, date, publication source), since the incremental cost of such additional material is minimal. For example, identifying all the citations available at one location should assist the scholar in deciding where to pursue his work. Perhaps there are ways to produce it which would help the librarian locate the material within the library, or help the scholar pick out material in the same location before moving on.
3. Widen the availability of the text of primary sources. Much of this would be of interest to the non-professional reader if he could get to it more easily, and it would enlarge his sense of participating in the interpretation, his "buy-in". Unfortunately, most photocopying is still of poor quality, and the most useful way to capture the original is to key it in. Therefore, I recommend searching for ways to induce the scholar to do it for you: if he really thinks it is an important document, would he please keyboard it for everyone else. If a way is provided for counting the number of "hits" on a document's citations, it will lead you to the popular documents to begin with. Please don't try to start with "A" and end with "Z". After you have produced a digitized copy, then photocopy it if you wish. The local Athenaeum makes quite a lot of revenue from selling reprints of architectural drawings, so there are exceptions.
4. Do not limit yourself to primary sources. There are copyright issues here, but a link to Amazon will get you revenue from Google, and a used copy of a book from ABEbooks will be delivered to your home by United Parcel Service. It's often cheaper than parking near a library.
|Thomas Jefferson: The Art of Power: Jon Meacham: ISBN: 978-1400067664||Amazon|
Dithering is originally a photographic term, referring to the process of smoothing out rough parts of an excessively enlarged photo. Carried over to the profession of writing history, the term alludes to smoothing over the rough parts of a narrative with a little unacknowledged conjecturing.
In photography, when a picture is enlarged too far, it breaks apart into bits and pieces. Dithering fills in the blank gaps between the "pixels" actually recorded by the camera. It amounts to guessing what a blank space should look like, based on what surrounds its edges. In the popular comic strip, Blondie's husband Dagwood works for an explosive boss called Mr. Dithers, who "dithers" between the outbursts in which he vents his frustration. But that's to dither in the Fifteenth-century non-technical sense, meaning to hesitate in an aimless, trembling way. If the cartoonist who draws Blondie will forgive me, I have just dithered in the sense I am discussing in history: I haven't the faintest idea what the cartoonist intended by giving his character the name of Mr. Dithers, so I invented a plausible theory out of what I do know. That's what I mean by a third, new meaning for dither, in the sense of "plausible but wholly invented". It's fairly common practice, and it is one of the things which Leopold von Ranke's insistence on historical documentation has greatly reduced.
To digress briefly about photographic dithering, the undithered product is of degraded quality, to begin with. Dithering systematically removes the jagged digital edge, and restores the original analog signal. In that case, dithering the result brings it back toward the original picture. That's not cheating, it's the removal of a flaw.
Dithering of history comes closer to resembling a different photographic process, one which samples the good neighboring pixels on all sides of a hole and synthesizes an average of them to cover the hole. It works best with four-color graphics, converting them to 256-color approximations. But the key to all photographic approaches is to apply the same formula to all pixel holes. That's something a computer can do but a historian can't, and historical touch-ups seem less legitimate because the reader can't recognize when one has happened, and can't reverse it by adding or removing a filter. Dithering may somewhat improve the readability of the history product occasionally, usually at the expense of accuracy, sometimes considerable accuracy. It's conventional among more conscientious writers to signal what has happened by writing, "As we can readily imagine Aaron Burr saying to Alexander Hamilton just before he shot him." That's less misleading than just saying "Burr shouted at Hamilton, just before he blew his brains out." The latter sends no signal or footnote to the reader, except perhaps to one who knows that Hamilton was shot in the pelvis, not the head. In small doses, dithering may harmlessly smooth out a narrative. But there's a better approach: write history in short blogs of what is provable, later assembling the blogs like beads in a necklace. Bridges may well need to be added to smooth out the lumps, but that becomes a late editorial step, consciously applied with care. And consequently, author commentary is more likely to be recognized as commentary, rather than rejected as fiction.
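The neighbor-averaging process described above (sample the good pixels around a hole, apply the same formula everywhere) can be sketched in a few lines. The grid and its values are invented for illustration:

```python
# A toy version of photographic hole-filling: replace a missing pixel with
# the average of its known neighbors, using the same formula for every hole.

def fill_hole(grid, r, c):
    """Fill a missing pixel (None) with the mean of its known 4-neighbors."""
    neighbors = []
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < len(grid) and 0 <= cc < len(grid[0]) and grid[rr][cc] is not None:
            neighbors.append(grid[rr][cc])
    grid[r][c] = sum(neighbors) / len(neighbors)
    return grid[r][c]

image = [[10, 20, 30],
         [40, None, 60],
         [70, 80, 90]]
print(fill_hole(image, 1, 1))  # (20 + 80 + 40 + 60) / 4 = 50.0
```

The point of the analogy is the uniformity: the same rule fills every hole, and the operation is mechanically reversible in a way a historian's conjecture never is.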
Dithering the holes is often just padding. It would be better to spend the time doing more research.
The New York Times ran an article by Kevin Carey on March 8, 2015, predicting such big changes ahead for colleges, bringing an end of college as we know it. A flurry of reader responses followed on March 15, making different predictions. Since almost none of them mentioned the changes I would predict, I now offer my opinion.
Colleges have responded to their current popularity mostly by building student housing and entertainment upgrades, presumably to attract even more students. What I am seeing seems to be a way of taking advantage of current low interest rates with the type of construction which can hope for conventional mortgages, or even resale, in the event of a future economic slump. In addition, colleges are admitting many more students from foreign countries, probably hoping not to lower their standards for domestic admissions. They probably hope to establish a following in the upper class of these countries, eventually enabling them to maintain expanded enrollments by lowering standards for a worldwide audience of students, rather than merely a domestic one. With luck, that might sustain an image of superiority for American colleges even after the foreign nations eventually build up their own standards. The example would be that of Ivy League colleges sending future Texas millionaires back to Texas, a practice which still maintains an aura of superiority for the Ivy League well after the time when competing Texas colleges became well-funded themselves. The Ivy League may even be aware of the time when the Labour Party was in power in England and, for populist reasons, deliberately underfunded Oxford and Cambridge. American students kept arriving anyway, seeking prestige rather than scholarship.
|William F. Buckley Jr|
Television courses seem to be a different phenomenon. A good course is a hard course, so a superior television course will prove to be even harder. In fact, it might be said the main purpose of college is to teach students how to study; graduates of first-rate private schools find college rather easy, giving them extra time for extracurricular activities which are not invariably trivial. I well remember William F. Buckley Jr. pouring out amazing amounts of written prose for the college newspaper and other outlets, in spite of carrying a rigorous academic workload. I feel sure he did not acquire that talent in college but rather came to Yale already loaded for bear. I certainly do not know what place tape-recorded classes will eventually assume, but I do feel such courses would be most useful for graduate students, who have already learned how to study in solitude.
To return to the excess of dormitories under construction, the approaching surplus might also lead to a better use: faculty housing. An eviction of students from dormitories would lead urban universities to resemble London's Inns of Court in physical appearance, with commuting day-students, mostly attending from nearby. The day is past, although students do not believe it, when there was much difference between living in Boston and living in California, and the much-touted virtue of seeing a new environment will eventually lose its charm. It may all depend on how severely an economic decline retards the traditional pressure to escape parental control, but it is at least possible to foresee one improvement which could result from fiscal stringency.
The Franklin Institute gives out an annual award for business innovations, and a few years ago it was given to Michael Dell. The banquet is very splendid and well-attended by people willing to pay high prices. So, it happened that the founder of Dell Computers was wandering around the dinner table where I was seated. Being a gregarious sort of guy, he introduced himself and told his story.
As he relates it, his mother gave him a new IBM portable computer for his 19th birthday. So, he took it upstairs to his bedroom along with a screwdriver, and took it all apart.
What he discovered annoyed his mother but intrigued the birthday guests. Every single part of the computer was a component obtained from some other manufacturer. So he got in touch with these parts makers and asked for their prices. His discovery was that he could assemble a duplicate computer for half the price his mother had paid. One thing led to another, and he was soon producing Dell computers for much less than IBM was charging. Naturally, there is a market for such a product, particularly when it is sold by mail order, without middlemen. In a short time he became a billionaire, IBM got out of the business, and the market was all his. As is so common with stories like this, the boom eventually faded and he was soon engaged in new adventures. But he took his screwdriver to other tables, and we never did hear about those later exploits.
|F. Hastings Griffin|
Hastings Griffin ("Haste") died last week in his nineties. The Orpheus Club put on a concert at his memorial service, and the squash world probably put on something too, because he was the reigning world champion for his age group. And his wife was there in all her glory, having married and outlived three men, all of whom were roommates at Princeton; among women, that's a championship on a different level. I knew Haste as a fellow member of the Shakspere Society, where his booming voice was an arresting feature, particularly when you knew his motorcycle was parked outside, ready for the 30-mile night trip to his home near Valley Forge. But he was most famous as the lawyer on the losing side of a lawsuit which cost Philadelphia the whole computer industry.
|John Mauchly (on Left) and J. Presper Eckert|
As a matter of fact, I am very friendly with Ben Heintzen, the lawyer on the winning side of the same case. So, over a period of years, I was able to piece together the main facts, checking remarks from one side against the recollections of the other. First of all, the computer as we know it was assembled by Mauchly and Eckert, on the faculty of the University of Pennsylvania. Eckert had patented it, but the University had a rule that faculty patents belonged to the university. Unfortunately for that position, all of the money was government money. Right there you have the makings of a big lawsuit, but there was much more. Ben Heintzen had discovered a paper by a Midwestern professor, Iowa I believe, who seems to have put the patent in the public domain by publishing its main substance, or what the lawyers contended was the essence of the case. Furthermore, the case had many plaintiffs and defendants, working more or less together, but under the team leadership of Sperry Rand for the defendants and Honeywell for the plaintiffs. The case dragged on for more than eight years, to the delight of the law firms and the dismay of the Judge, who had been heard to growl that he didn't want to spend the rest of his life listening to this same case. All a losing lawyer had to do was wait for the verdict to learn who won, and then file an appeal claiming the Judge had acted in prejudice. Furthermore, the Judge expressed the opinion that IBM wanted to mass-produce computers, whereas Sperry was really only in the "patent infringement business." Somebody said that; perhaps it was the Judge himself. Well, there's more.
|Sperry Rand Building|
It happens that Sperry Rand had round holes in their punch-cards, and IBM had square holes. The hanging chad issue became famous in the Gore-Bush presidential election, and you would suppose square holes would have more of a tendency to hang their chads than round ones, but it was actually the other way around. It seemed so to Sperry Rand, too, so they finally hired IBM engineers to tell them what the matter was, and those IBM engineers were hanging around while the trial was going on. They must have picked up the gossip in the lunch room, and reported back to Tom Watson at IBM something like, "Do you know what these people are doing with computers?" So they were given orders to stretch out the hanging chad matter and see what else they could learn. When Watson heard more, he told his lawyers to ask what Sperry wanted in return for letting IBM out of the lawsuit, and the answer came back, "Ten million dollars". To which Watson replied, "Pay them immediately because we are going to mass-produce those things." At that time, there were only a handful of computers, all doing such things as calculating field artillery aiming instructions. So Watson was essentially betting his whole company on success. At that time, General Electric, RCA, Sperry, Burroughs, Honeywell, and others were in Philadelphia, trying to imitate what they had heard the machines were capable of, so it was not a sure-fire gamble at all, but it was certainly successful in moving computers to upstate New York, and eventually to Silicon Valley.
|UNIVAC (Sperry Rand) Unimatic terminal|
Since half of this story comes from Griffin, let me reconcile a point that came up at his funeral. One of his partners heard him boast he had never lost a case, and when challenged on it, replied that it didn't matter what the jury decided; it was the judge who must approve the size of the settlement. His claim rested on getting settlements down to much less than the client had feared, so the client was persuaded he had been lucky. Well, in this case it was a little different. The chief lawyer of the firm took the case away from Griffin and carried it himself. Shortly afterward, Griffin was heard to shout at the boss, "You are going to lose this case!" The next morning he was standing at the airport next to the President of Sperry Rand. The President came close and asked him, "How do you think this case is going?"
To which Haste replied, "Well, sir, you'll have to ask my boss."
Quakerism and the Industrial Revolution
The Industrial Revolution extended over two centuries and was more important than all the wars, governments, and agitations of its time. Quakerism began at the same time, in the same place. Was that only coincidence?
The basic concepts of a computer can be reduced to three adjectives, related to the ideas of three men at the University of Pennsylvania. Mauchly made a general-purpose computer. Eckert made an electronic version of it. And von Neumann designed the stored instruction set. An electronic, stored-instruction, general-purpose computer resulted.
Computerizing Medical Care
Healthcare is mainly information processing, but utilizing computers has been a disappointment. Be prepared for high costs and continuing disappointment for decades to come.
Report Identity Theft to the Secret Service
Identity theft is now under the jurisdiction of the U.S. Secret Service.
The Beginnings of E-Mail
Tony Drexel and J.P. Morgan may well have been the first.
Keeping dates straight
Here's how to use a computer to be reminded of what's going on.
When Medicare started it was chaos, worse confounded.
Intelligentsia, Philly Style
All computer user groups are different. In Philadelphia they are more different.
Making Money (8): Virtual Money
When money was tangible you had to guard it, now that it's mostly virtual you have to verify it. Hardly anybody can, and that's a problem.
iPhone, Skype, Land Lines and International rates
The iPhone is the bomb and with Skype it is really the most functional cell phone around. Don't leave home without it.
Internet, Websites, and related Programming
Technical Comments related to programming this and other websites require a special topic section of their own.
Country Auction Modernized
On Fairgrounds Road, in the Quaker farmlands of Bucks County, efficiency and computerized streamlining are nibbling at the enduring customs of country auctions.
Printed Books v. Websites
Since this booklet is provided in both printed book form and online website form, the reader has a chance to compare the two media. Each has its merits.
It once was a tradition to go back to the big city for Christmas shopping.
Causes of the 2007 Crash: Political and Technological
After 4 years, we are gradually piecing out the causes of the second great crash. It seems the two main causes were government subsidies of cheap housing in one form or another, and the impact of computers on banking.
Net Neutrality and Vertical Integration
Net neutrality is a new issue with several other issues buried in it. Here we take up its relation to antitrust, particularly the legal concept of vertical integration.
Do Computers Thrive on Lead Poisoning?
There's a rumor that regulations prohibiting lead in solder will shorten the life of computers.
Future Directions for Book Authoring
Personal computers have already revolutionized book authorship by greatly expanding it, and revolutionized book publishing by drowning it in authorship. What comes next?
Modern Printing and Post-Modern Printing
Like computer monitors, computer printers operate at 72 dpi. Commercial publishing uses 300 dpi, and it's a little awkward to switch back and forth.
Is Stock Trading Passe?
Computers have been in some sort of use for trading stocks, for over fifty years. Are they reaching their limits?
Wistar Institute, Spelled With an "A"
The Wistar Institute is properly pronounced "Wister", but in fact it's all the same family. Its fame in biomedical research makes that quite irrelevant.
Revenue Stream for Historical Documents
It's better to start writing history with what you know for certain. And it's usually better to stop there, too.
New Looks for College?
Many people predict big changes are in store for Universities. Here's my take on it.
Michael Dell, the Millionaire Teen-ager.
Happy Birthday, Mike.
The Lawsuit That Ate Philadelphia
Other cities want to attract Silicon Valley; Philadelphia drove it away.