Philadelphia Reflections

The musings of a physician who has served the community for over six decades


Nanoparticles: The Dwarfs are Coming

{Professor Shyamelendu Bose}
Professor Shyamelendu Bose

Professor Shyamelendu Bose of Drexel University recently addressed the Right Angle Club of Philadelphia about the astounding changes which take place when particles are made small enough, the subject of a new scientific field called nanotechnology. The prefix "nano" derives from the Greek word for "dwarf"; in modern scientific usage it denotes a billionth of something, as in a nanometer, which is a billionth of a meter. Or nanotubes, or nano calcium, or nano-anything you please. The nanometer is likely to remain the dominant reference, because it is at around this width that particles begin to act strangely.

{Nanoparticles}
Nanoparticles

At this width, normally opaque copper particles become transparent, cloth becomes stain-resistant, and the bacteria which produce clothing odor can be suppressed. Because the retina is peculiarly sensitive to this wavelength, colors assume an unusual brilliance, as in the colors of a peacock's tail. Normally stable aluminum powder becomes combustible. Normally insoluble substances such as gold become soluble at this size, malleable metals become tough and dent-proof, and straight particles assume a curved shape. Damascus steel is unusually strong because of the inclusion of nanotubes of nanometer width, and the brilliance of ancient stained-glass colors was apparently created by repeated grinding of the colored particles.

{Richard Feynman of Cal Tech}
Richard Feynman of Cal Tech

Practical exploitation of these properties has almost instantly transformed older technologies and suggested the underlying explanation for others. International trade in materials made with nanotechnology has grown from a few billion dollars a year to $2.6 trillion in a decade, particularly through remaking common articles of clothing, once easily rumpled or soiled, into ones which are stain- and water-resistant. Scientists with an interest in computer chips almost immediately seized upon the idea, since many more transistors can be packed together in more powerful arrangements. Richard Feynman of Cal Tech seems to be acknowledged as the founding visionary of this whole astounding field, which promises to devise new methods of drug delivery to disease sites by rolling metal nanosheets into nanotubes, then filling the tubes with a drug for delivery to formerly unreachable sites. Or by bending nanowires into various shapes for the creation of nano prostheses.

{Cal Tech on Los Angeles}
Cal Tech on Los Angeles

And on, and on. At the moment, the limitations of this field are only the limitations of imagination about what to do with it. For some reason, carbon is unusually subject to modification by nanotechnology. It brings to mind that the whole field of "organic" chemistry is based on the uniqueness of the carbon atom, suggesting the two phenomena are the same or closely related. For a city with such a concentration of the chemical industry as Philadelphia has, it is especially exciting to contemplate the possibilities. And heartening to see Drexel take the lead in it. There has long been a concern that Drexel's emphasis on helping poor boys rise in the social scale has diverted its attention from helping the surrounding neighborhoods exploit the practical advances of science. The impact of Cal Tech on Los Angeles, of M.I.T. on Boston, of Carnegie Mellon on Pittsburgh, and of the research triangle of Durham on North Carolina seems absent or attenuated in Philadelphia. We once let the whole computer industry get away from us when our lawyers diverted us into the patent-infringement industry, and that sad story has a hundred other parallels in Philadelphia industrial history. Let's see Drexel go for the gold cup in this one -- forget about basketball, please.

More, or Fewer, Raisins in the Pudding

{Manuscripts}
Manuscripts

Some book publishers do indeed regard history as handfuls of paper in a manuscript package, mostly requiring rearrangement in order to be called a book. Librarians are more likely to see historical literature as dividing into three layers of fact mingled with varying degrees of interpretation. It starts with primary sources, documents allegedly describing pure facts. Scholars come into the library to pore over such documents and comment on them, usually writing scholarly books of commentary, called secondary sources. Unfortunately, many publishers reject anything which cannot be copyrighted or is otherwise unlikely to sell very well, however important historians say it may be. Authors of history who make a living at it tend to focus on the general reading public, generating tertiary sources: sometimes textbooks, sometimes "popularized" history in rising levels of distinction. Sometimes these authors go back to original sources, but much of their product is based on secondary sources, which are now much more reliable because of the influence of a German professor named Leopold von Ranke. Taken together, these are what it is traditional to call the three levels of historical writing.

{Leopold von Ranke}
Leopold von Ranke

Leopold von Ranke formalized the system of documented (and footnoted) history around 1870, vastly improving the quality of history in circulation by insisting that nothing could be accepted as true unless based on primary documents. Von Ranke did in fact transform Nineteenth-century history from the opinionated propaganda into which it had largely declined, into a renewed science. At its best, it aspired to return to Thucydides, with footnotes -- that is, clear powerful writing, ultimately based on the observations of those who were actually present at the time. Unfortunately, Ranke also encased historians in a priesthood, worshipping piles of documents largely inaccessible to the public, and often discouraging anyone without a Ph.D. from hazarding an opinion. Printing twenty or thirty pages of bibliography per book is a cost which modern publishing can ill afford; it makes scholarship a heavier task for the average graduate student, because the depth of scholarship tends to be measured by the number of citations, and it incidentally "turns off" the public about history. All that seems quite unnecessary, since links to primary sources could be provided independently (and to everyone) on the Internet at negligible cost -- supplied not merely to the scholar, but to any interested reader, with no need to labor through citations to get at documents in a locked archive.

{E-Books}
E-Books

The general history reader remains content with tertiary overviews and a few brilliant secondary ones, because document fragility bars public access to primary papers; but many might enjoy reading primary sources if they were physically more available. The general reader also needs impartial lists of "suggested reading", instead of the "garlands of ibids", as one wit describes the bibliographies employed by scholars. If you glance through the annual reports of the Right Angle Club, you will see I have increasingly included separate Internet links to secondary as well as primary sources, because the bibliographies within the secondaries lead back to the primaries. Unfortunately, you must fire up a computer to access these treasures. The day soon approaches when scholars can carry a portable computer with two screens, one displaying the historian's commentary while the second displays related source documents. It seems likely history on paper will persist while it remains cheaper, but also because e-books make it hard to jump around. Newspapers and magazines particularly encounter this obstacle, because publication deadlines give them less time for artful re-arrangement. E-books are sweeping the field in fiction, because fiction is linear; non-fiction wanders around, a subtlety often unappreciated by computer designers.

Prisoners in the Stone

{A Prisoner of the Stone}
A Prisoner of the Stone

The first revolution in book authorship has already taken place. Almost all manuscripts arrive on the publisher's desk as products of a home computer; in fact, many publishers refuse to accept manuscripts in any other form. Not only does this eliminate a significant typing cost for the publisher, but it allows him to experiment with type fonts and book design as part of the decision to publish the book. What's more, anyone who remembers the heaps of paper strewn about a typical editor's office knows what an improvement it is just to conquer the trash piles. To switch to the author's point of view, composing a book on a home computer has greatly eased the constant need for small revisions. An experienced author eventually learns to condense and revise the wording in his mind while still working on the first draft, but even a novice is now able to pause and select a more precise verb, eliminate repetition, tighten the prose. He can type in the sloppy prose and then immediately improve it, word by word. The effect is to leave less for the copy editor to do, and to increase the likelihood the book will be accepted; the author can see how the book will look far more readily than he could with mere typescript.

It must be confessed, however, that a second revolution caused by the computer has already come -- and gone. Not so long ago, I linked primary documents already on the Internet to the appropriate commentary within my blog. The idea was to let the reader flip back and forth to the primary sources if he liked, without interrupting the flow of my supposedly elegant commentary. In those early days of enthusiasm, volunteers were eager to post source material into the ether, just asking for someone to read it. I had linked up Philadelphia Reflections to nearly a thousand citations when an invincible flaw exposed itself. Historians were eager enough to post source documents, but not so eager to maintain them. One link after another was abandoned by its author, producing a broken link for everyone else. The Internet tried to locate something which wasn't there, slowing the pages to a pitiful speed. Reluctantly, I went through my website, removing the broken links -- and with them, most of the links. Maybe linking was a good idea, but it didn't work. There is thus no choice but to look to institutional repositories for historical storage, and funding to pay for maintaining availability for linkage. It is not feasible to free-load, although only recently it had seemed to be.
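Today, at least, the chore of discovering which of a thousand links have quietly died is easy to automate. A minimal sketch using only the Python standard library (the function name is my own; real sites may demand a User-Agent header or a full GET rather than a HEAD request):

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def find_broken_links(urls, timeout=10):
    """Try each URL and collect the ones that no longer respond.

    A link counts as broken if the request errors out (DNS failure,
    malformed URL, timeout) or returns an HTTP status of 400 or above.
    """
    broken = []
    for url in urls:
        try:
            request = Request(url, method="HEAD")  # ask for headers only
            with urlopen(request, timeout=timeout) as response:
                if response.status >= 400:
                    broken.append(url)
        except (HTTPError, URLError, ValueError, OSError):
            broken.append(url)
    return broken
```

Run periodically, a script along these lines would have flagged the abandoned citations long before any reader stumbled over them.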

This pratfall assumed technology would provide more short-cuts than it in fact did. A more difficult obstacle emerges after some thought about the nature of writing history, because it seems remarkably similar to Michelangelo's description of how to carve a statue. Asked how to carve, he said it was simple: "Just chip away the stone you don't want, and throw it away." Michelangelo saw statues as "prisoners of the stone" from which they were carved, while the insights and generalizations of history emerge from a huge mass of unsorted primary documents. But there is a different edge to writing history. Uncomfortably often, the process of writing history is one of disregarding documents which fail to support a certain conclusion -- often documents which send inconvenient messages to modern politics. Carried too far, de-selecting disagreeable documents amounts to an attempt to destroy alternative viewpoints. By this view of it, what the author chooses to disregard is as important as what he chooses to include. Unless awkward linkages are consciously maintained in some form, they will soon enough disappear by themselves. It is a great fallacy to assume that ancient history can be isolated from current politics, or even to believe that history teaches the present. Often, it is just the other way around.

Dithering History

{Dithering}
Dithering

Dithering is originally a photographic term, referring to the process of smoothing out rough parts of an excessively enlarged photo. Carried over to the profession of writing history, the term alludes to smoothing over the rough parts of a narrative with a little unacknowledged conjecturing.

In photography, when a picture is enlarged too far, it breaks apart into bits and pieces. Dithering fills in the blank gaps between "pixels" actually recorded by the camera. It amounts to guessing what a blank space should look like, based on what surrounds its edges. In the popular comic strip, Blondie's husband Dagwood works for an explosive boss called Mr. Dithers, who "dithers" between outbursts, when he vents his frustration. But that's to dither in the Fifteenth-century non-technical sense, meaning to hesitate in an aimless trembling way. If the cartoonist who draws Blondie will forgive me, I have just dithered in the sense I am discussing in history: I haven't the faintest idea what the cartoonist intended by giving his character the name of Mr. Dithers, so I invented a plausible theory out of what I do know. That's what I mean by a third new meaning for dither, in the sense of "plausible but wholly invented". It's fairly common practice, and it is one of the things which Leopold von Ranke's insistence on historical documentation has greatly reduced.

To digress briefly about photographic dithering: the undithered product is of degraded quality to begin with. Dithering systematically removes the jagged digital edges and restores something closer to the original analog signal. In that case, dithering brings the result back toward the original picture. That's not cheating; it's the removal of a flaw.

Dithering of history comes closer to resembling a different photographic process, which samples the good neighboring pixels on all sides of a hole and synthesizes an average of them to cover the hole. It works best on images with many colors, as when converting full-color graphics to 256-color approximations. But the key to all photographic approaches is that the same formula is applied to every pixel hole. That's something a computer can do but a historian can't, and historical touch-ups seem less legitimate because the reader can't recognize when one has happened, and can't reverse it by adding or removing a filter. Dithering may occasionally improve the readability of the historical product, but usually at the expense of inaccuracy, sometimes large inaccuracy. It's conventional among more conscientious writers to signal what has happened with a phrase like "as we can readily imagine Aaron Burr saying to Alexander Hamilton just before he shot him." That's less misleading than just writing "Burr shouted at Hamilton, just before he blew his brains out," which sends no signal or footnote to the reader -- except perhaps to one who knows that Hamilton was shot in the pelvis, not the head. In small doses, dithering may harmlessly smooth out a narrative. But there's a better approach: write history in short blogs of what is provable, later assembling the blogs like beads on a necklace. Bridges may well need to be added to smooth out the lumps, but that becomes a late editorial step, consciously applied with care. And consequently, author commentary is more likely to be recognized as commentary, rather than rejected as fiction.
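The neighbor-averaging repair described above is simple enough to state exactly. A sketch (the image is a plain list of rows of brightness values; the function name is my own invention): every hole gets the identical formula, the mean of its intact four-neighbors, which is precisely the uniformity a historian cannot promise.

```python
def fill_holes(image, is_hole):
    """Replace each missing pixel with the average of its intact
    four-neighbors (up, down, left, right), applying the same
    formula to every hole -- the discipline a computer can keep
    and a historian cannot."""
    rows, cols = len(image), len(image[0])
    repaired = [row[:] for row in image]
    for r in range(rows):
        for c in range(cols):
            if is_hole[r][c]:
                neighbors = [image[nr][nc]
                             for nr, nc in ((r - 1, c), (r + 1, c),
                                            (r, c - 1), (r, c + 1))
                             if 0 <= nr < rows and 0 <= nc < cols
                             and not is_hole[nr][nc]]
                if neighbors:  # leave the hole alone if nothing borders it
                    repaired[r][c] = sum(neighbors) / len(neighbors)
    return repaired
```

Because the averaging reads only from the original image, the result does not depend on the order in which holes are visited -- the reversibility and uniformity the paragraph contrasts with historical touch-ups.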

Dithering the holes is often just padding. It would be better to spend the time doing more research.

Ruminations About the Children's Education Fund (3)

The Right Angle Club runs a weekly lottery, giving the profits to the Children's Educational Fund. The CEF awards scholarships by lottery to poor kids in the City schools. That's quite counter-intuitive, because ordinarily scholarships are given either to the best students among the financially needy, or to the neediest among the top applicants. Either way, the best students are selected; this program selects by lottery among poor kids. The director of the project visits the Right Angle Club every year or so, to tell us how things are working out. This is what we learned this year.

The usual system of giving scholarships to the best students has been criticized as social Darwinism, skimming off the cream of the crop and forcing the teachers of the rest to confront a selected group of problem children. According to this theory, good schools get better results because they start with brighter kids. Carried to the extreme, this view leads to maintaining that the kids who can get into Harvard are exactly the ones who don't need Harvard in the first place. Indeed, several recent billionaires in the computer software industry, who voluntarily dropped out of Harvard, seem to illustrate this contention. Since Benjamin Franklin never went past the second grade in school, perhaps he, too, somehow illustrates the uselessness of education for gifted children. Bright kids don't need good schools, or some such conclusion; and since dumb ones can't make any use of good schools, perhaps we just need cheaper ones. Or some such convoluted reasoning, leading to preposterous conclusions. Giving scholarships by lottery, therefore, ought to contribute something to educational discussions, and this, our favorite lottery, has been around long enough for tentative conclusions.

{Darwinism}
It is now safe to conclude the data show better schools generally lead to better academic results later on. Compared with other children who entered the lottery pool but did not win, the scholarship students were far more likely to graduate from high school and be admitted to college. Since the winners were randomly selected, differences in talent cannot account for the difference in outcomes; so it seems possible to say the schools -- not the ability of the kids -- caused the difference. That is putting it strongly (obviously smarter kids do better in school), but random selection balances talent between the two groups. It could still be argued it was not the new teachers in the chosen schools, but rather the new companion students, who made the difference, by means of peer pressure and good example. But kids who won the lottery were better off for the experience regardless, and Philadelphia is better off. The evidence at least implies children born in poverty can benefit from better schools than the ones we have been providing them; one does not have to be a doctrinaire liberal to agree we should get to work improving inner-city schools.
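Because the lottery randomizes who attends which school, the comparison described above reduces to a standard two-sample proportion test. A sketch with invented numbers (the counts below are hypothetical, not the Fund's actual data):

```python
from math import erf, sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Compare two graduation rates: return the z statistic and the
    two-sided p-value under the pooled null hypothesis of no difference."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 600 of 1000 lottery winners graduated, versus 500 of 1000 losers.
z, p = two_proportion_z(600, 1000, 500, 1000)
```

With these invented numbers z comes out near 4.5 and the p-value far below 0.05, so chance alone would be an implausible explanation for the gap -- which is the whole argument for comparing winners against losers of the same lottery.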

Just what improving schools means in practical terms does not yet emerge from the experience. Some would say we ought to fire the worst teachers; others, that we ought to raise salary levels to attract better ones. Most people would agree there is some level of mixture of good students and bad ones at which the culture becomes harmful rather than helpful; whether just one obstreperous bully is enough to disrupt a whole class, or whether something like 25% well-disciplined students would be enough to restore order in the classroom, has not been quantitatively tested. What seems indisputable is that the kids and their parents accurately recognize something desirable to be present in certain schools but not in others; their choice is wiser than the non-choice imposed by assigning students to neighborhood schools. Maybe it's better teachers, but that has not been proved.

It seems a pity not to learn everything we can from a large random experiment such as this. No doubt every charity has a struggle just with its main mission, without adding tasks not originally contemplated. However, it seems inevitable that the data will show differences in success among types of schools, and among types of students; combining the two in large enough quantity ought to show that certain types of schools bring out superior results in certain types of students. Providing the families of students with such specific information ought then to result in still greater improvement in the selection of schools. No doubt the student gossip channels already take some informal advantage of such observations. Providing school administrations with the same information ought to provoke conscious improvements in the schools, leading to a virtuous circle. Done clumsily, revised standards for teachers could lead to strikes by the teachers' unions; significant progress cannot be made without the cooperation of the schools and the encouragement of public opinion. After all, one thing we really have learned is that offering student applicants a wider choice of schools leads to better outcomes. What we have yet to learn is how far you can go with this idea. But for heaven's sake, let's hurry and find out.

New Looks for College?

{Kevin Carey}
Kevin Carey

The New York Times ran an article by Kevin Carey on March 8, 2015, predicting big changes ahead for colleges -- the end of college as we know it. A flurry of reader responses followed on March 15, making different predictions. Since almost none of them mentioned the changes I would predict, I now offer my own opinion.

{Cambridge University}
Cambridge University

Colleges have responded to their current popularity mostly by building student housing and entertainment upgrades, presumably to attract even more students. What I see seems to be a way of taking advantage of current low interest rates, with the type of construction which could qualify for conventional mortgages, or even be sold off, in the event of a future economic slump. In addition, they are admitting many more students from foreign countries, probably hoping not to lower their standards for domestic admissions. They presumably hope to establish a following in the upper class of those countries, eventually enabling them to maintain expanded enrollments by drawing on a worldwide audience of students rather than merely a domestic one. With luck, that might preserve an image of superiority for American colleges even after foreign nations build up their own standards. The precedent is that of Ivy League colleges sending future Texas millionaires back to Texas, which still maintains an aura of superiority for the Ivy League well after the time when competing Texas colleges became well-funded themselves. The Ivy League may even remember the time when the Labour Party was in power in England and, for populist reasons, deliberately underfunded Oxford and Cambridge; American students kept arriving anyway, seeking prestige rather than scholarship.

{William F. Buckley Jr.}
William F. Buckley Jr.

Television courses seem to be a different phenomenon. A good course is a hard course, so a superior television course will prove to be even harder. In fact, it might be said the main purpose of college is to teach students how to study; the graduates of first-rate private schools find college rather easy, providing them with extra time for extra-curricular activities which are not invariably trivial. I well remember William F. Buckley Jr. pouring out amazing amounts of prose for the college newspaper and other outlets, in spite of carrying a rigorous academic workload. I feel sure he did not acquire that talent in college but rather came to Yale already loaded for bear. I certainly do not know what place recorded courses will eventually assume, but I do feel such courses would be most useful for graduate students, who have already learned how to study in solitude.

To return to the excess of dormitories under construction, the approaching surplus might also lead to a better use: faculty housing. An eviction of students from dormitories would lead urban universities to resemble London's Inns of Court in physical appearance, with commuting day-students mostly attending from nearby. The day is past, although the students do not believe it, when there was much difference between living in Boston and living in California, and the much-touted virtue of seeing a new environment will eventually lose its charm. It may all depend on how severely an economic decline retards the traditional pressure to escape parental control, but it is at least possible to foresee one improvement which could result from fiscal stringency.

Millennials: The New Romantics?

{Romantic Era}
Romantic Era

As a compliant teenager, I was taught that the Enlightenment period (Ben Franklin, Voltaire, etc.) was followed by the Romantic period of, say, Shelley and Byron. Somehow, the idea was also conveyed that Romantic was better. Curiously, it took a luxury cruise on the Mediterranean to make me question the whole thing.

It has become the custom for college alumni groups to organize vacation tours of various sorts, with a professor from Old Siwash as the entertainment. In time, two or three colleges got together to share expenses and fill vacancies, and the joint entertainment was enhanced with the concept of "our professor is a better lecturer than your professor" -- a light-hearted variation on gladiatorial duels, analogous to putting two lions in a den of Daniels. In the case I am describing, the Harvard professor was talking about the Romantic era as we sailed past the trysting grounds of Chopin and George Sand. Accompanied by unlimited free cocktails, the scene was very pleasant indeed.

{Daniel Defoe}
Daniel Defoe

In the seventy years since I last attended a lecture on such a serious subject, it appears the driving force behind Romanticism has become not Rousseau, but Daniel Defoe, with Robinson Crusoe on his desert island as the role model. Unfortunately for the argument, a quick look at Google assures me Defoe lived from 1660 to 1731, was a spy among other things, and wrote, for religious reasons, the book which was to help define the modern novel. His personal history is not terribly attractive, involving debt and questionable business practices, and his prolific writings were sometimes on both sides of an issue. He is said to have died while hiding from creditors. Although his real-life model Alexander Selkirk spent only four years on his island, Defoe has Crusoe totally alone for more than twenty years before the fateful day when he discovers Friday's footprint in the sand.

{Robinson Crusoe}
Robinson Crusoe

But the main point is that Defoe was born well before William Penn founded Pennsylvania and died before George Washington was born. The romanticism he did so much to promote arose at least as early as the beginning of the Enlightenment, and certainly could not have been a retrospective reaction to it. Making allowance for the slow communication of that time, it seems much more plausible to say the Enlightenment and the Romantic periods were simultaneous reactions to the same scientific upheavals. Some people, like Franklin, embraced the discoveries of science; others were baffled to find their belief systems challenged by it. While some romantics -- like Campbell's Gertrude of Wyoming, set in Pennsylvania's Wyoming Valley and depicted lying on the ocean beaches of Pennsylvania watching the flamingos fly overhead -- were merely ignorant, the majority seemed to find the scientific revolution too baffling to argue with. Their reasoning for clinging to challenged premises was of the nature of claiming unsullied purity. Avoidance of the incomprehensible reasonings of science leads to the "noble savage" idea, where the untutored innocent, young and unlearned, feels justified in contesting the credentialed scientist as an equal.

Does that sound like a millennial to anyone else?

"Sir"

In 1938 when I was 14 years old, I entered a new virtual country with its own virtual language. That is, I went to an all-male boarding school during the deepest part of the worst depression the country ever had.

{Boarding School}
Boarding School

While it should be noted I had a scholarship, there is little doubt I was anxious to learn and emulate the customs of the world I had entered. My life-long characteristic of rebellion was born here, but at first it evoked a futile attempt to imitate -- not to challenge, but to adopt what I could afford to adopt. The afford part was a real one, because the advance instructions for new boys announced a jacket and tie were required at all meals and classes, and a dark blue suit with a white shirt for Sunday chapel. That's exactly what I arrived with, and let me tell you my green suit and brown tie were pretty well worn out by the first Christmas, when I came home on the train for ten days' vacation, the first opportunity to demand new clothes. First-year students were identified by the requirement of a black cap outdoors, and by never, ever, walking on the grass. The penalty for disobeying the "rhine" rules was to carry a brick around, and if discovered without the brick, to carry two bricks. But that's not what I am centered on right now. The thing which really bothered me was unwritten, equally enforced by the peer pressure of my fellow students: the custom of addressing all my teachers as Sir. The other rules only applied until the first Christmas vacation, but the unwritten Sir rule proved to be life-long.

{Sir}
Sir

And it was complicated. It was Sir as an introduction to a question, not SIR! as a sign of disagreement; you were to use it to introduce a request for teaching, not as any sort of rebuke or resistance. Present-day students will be interested to know that every one of my teachers was a man; my recollection is that, except for the Headmaster's secretary, the Nurse was the only other female employee. The average class size was seven: seven boys and a master. Each session of classes was preceded by an hour of homework, the assignment for which was posted outside a classroom containing a large oval maple table. Needless to say, the masters all wore a jacket and tie, most of them of the finest style and workmanship. They always knew your name, and called on every student for answers, every day. The masters relaxed a little during the two daily hours of required exercise, when they took off their ties and became the coaches, but they were just as formal the following day in class. I had been at the head of the class of what Time Magazine called the finest public high school in America, but I nearly flunked out of my first semester at this boarding school. It was much tougher than I felt any school had a right to be, but they really meant it. Over and over, the Headmaster in the pulpit intoned, "Of those to whom much is given, much is expected."

I had some new-boy fumbles. Arriving a day early, I found myself with only a giant and a dwarf for company at the dining table. I assumed the giant was a teacher, but he was a star on the varsity football team; and I assumed the dwarf was a student, but he was the assistant housemaster. One was to become a buddy, the other a disciplinarian, but I had them reversed, calling the student "Sir" and the master by his first name. A bad mistake, of which I have been reminded at numerous reunions since.

{Yale}
Yale

When I later got to Yale, I began to see the rules behind the "Sir" rule. In the first place, all of the boarding school graduates used it, and none of the public school graduates did, although many of the public school alumni began, falteringly, to imitate it. Without realizing it, a three-year habit had turned out to be a way of announcing a boarding school education. The effect on the professors was interesting; they rather liked it, so it was reinforced. It had another significance: the graduates who said "Sir" had acquired upper-class practices which the red-brick fellows seldom had. The only time I can remember its being scorned was eight years later, by a Viennese medical professor with a thick accent, who was obviously puzzled by its significance -- hereditary aristocracy, perhaps. Indeed, I remember clearly the first time I was addressed as Sir. I was an unpaid hospital intern, but the medical students of one of the hospital's two medical schools flattered me with the term. In retrospect, I can see it was a way of announcing that graduates of their medical school knew what it meant, while the other medical school was just red-brick. Although the latter's students had mostly graduated from red-brick colleges, their medical school aspired to be Ivy League.

If you traveled in Ivy League circles, the Sir convention was pretty universal until 1965, when going to class tieless reached almost all college faculties, thus extending permission to students to imitate them. Perhaps this had to do with co-education, since the Sir tradition was never very strong in women's colleges, and was denounced by the girls when the men's colleges went co-ed. Perhaps it had to do with the SAT test replacing school background as the major selection factor for admission. Perhaps it was the influx of central European students, children of European graduates for whom an anti-aristocratic posture was traditional -- and, until they came to America, largely futile. Perhaps it was economic: the American balance of trade had been positive for many decades before 1965; afterward, it has been steadily negative.

In Shakespeare's day, "Sirrah" was a slur applied to persons of inferior status. In Boswell's eighteenth-century day, his Life of Johnson immortalized Johnson's characteristic put-down with a one-liner, which survives today as a virtually textbook description of how to dominate a boardroom dispute. "Why, Sir," was and remains a signal that you, you ninny, are about to be defeated with a quip. It's a curious survival, a very effective way of asserting small-group dominance, which even the soft-spoken Quakers use effectively. Whatever, whatever.

The 90-plus-year tradition of addressing your professor as "Sir" is gone, probably for good, except among those for whom it is a deeply ingrained habit -- along with the tradition of female high school teachers, followed, I suppose, by that of male college professors.

 


8 Blogs

Nanoparticles: The Dwarfs are Coming
A lot of basic science will have to be revised when we fully understand what happens to particles after they get small enough.

More, or Fewer, Raisins in the Pudding
The primary sources of history are supposed to be factual. Tertiary history is mostly an interpretation of many primary bits.

Prisoners in the Stone
History emerges from a welter of facts, with the irrelevant and inconvenient material selectively removed.

Dithering History
It's better to start writing history with what you know for certain. And it's usually better to stop there, too.

Ruminations About the Children's Education Fund (3)
By selecting children for scholarships by lottery, it emerges that different schools make big differences.

New Looks for College?
Many people predict big changes are in store for Universities. Here's my take on it.

Millennials: The New Romantics?
The romantic period of literature is said to have followed the Enlightenment. Maybe they were just different people at the same time.

"Sir"