Culture: The Flavors of Philadelphia Life
Philadelphia began as a religious colony, a utopia if you will. But all religions were welcome, so Quakerism persists mainly in its effects on others, both locally and throughout America, in art, clubs, and the way of life.
Sociology: Philadelphia and the Quaker Colonies
Early Philadelphia had many faces: its people were varied and interesting, its history turbulent and of lasting importance.
The Philadelphia Media
A big event, presently scheduled to happen between the November 2008 elections and the actual seating of the newly elected government, has been postponed twice and may be postponed again. The event in question is the federal prohibition by Congress of further American broadcasting of analog television, the only free television now being broadcast. Congress has passed such a law, and the President has signed it. Even if you aren't a cynic, it's hard to believe the cable television industry isn't overjoyed, if not a little culpable. Presumably, useless analog television sets will then all be put out at the curb for trash collection at once. One could, however, imagine that trash collectors will be busy attending protest riots in Washington, so you can't be entirely sure the trash will be picked up. The Constitution provides for lame-duck sessions of Congress just before this impending moment, so perhaps all can be rescued by chastened legislators.
That's one vision of the future. An alternative is that the spin industry will have worked us into a buying frenzy for high-definition television (HDTV) by then, so the nation may eagerly line up in front of discount stores to buy a technological marvel that renders current television a laughable joke we can hardly wait to eliminate. Judge for yourself, because HDTV-capable television sets are already on display in the shops, side by side with the old-fashioned variety. Unfortunately, it is a little hard to observe much difference. The present spin is that HDTV, or digital television, will be such a dramatic improvement that any transitional disruption will be ignored.
In fact, improved reception isn't the issue at all; more bandwidth is. Analog transmission is a bandwidth hog, severely limiting the number of channels available for licensing. Digital transmission would permit a vast increase in the number of channels available to use the free airwaves, making heftier competition for cable transmission, also a notorious political plum, as all utilities somewhat are. Dig a little deeper and you find that the issue is not only fifty dollars a month versus free, or ten channels versus two hundred; it also has to do with the vast overbuilding of fiberoptic networks by that bearded CEO fellow who went to jail, and the subsequent scooping up of fiberoptic by the Chinese, Google, or whoever. But set that aside. The current buzz is that WHYY, Channel 12 of the Public Broadcasting System, will expand to channels 12.1, 12.2, 12.3 and 12.4. That will give Philadelphia a full-time arts and culture channel, in addition to the present Channel 12, which is now somewhat overweighted with environmental photography. It's hard to know whether the existing arts and culture institutions, which charge admission to their content, will welcome or deplore a new competitive medium in their midst. Some of the grimmer realities of the matter are highlighted by the fact that Channel 12 will be given yet another two channels to fill, and presently isn't at all sure where any content will come from. One of the most discouraging features of the present single PBS channel is the obvious shortage of money for programming, sadly evident in the irritating and humiliating marathons of public pleading and begging for contributions. If there isn't enough revenue to support one channel, how can four be financed?
Here's some advice for new authors of books: you can't write the first chapter until you have written the last chapter. That is, you have to hit the reader between the eyes in the first chapter, draw him into the argument, and make a pauseless transition from a general statement of the author's thesis into a relentless march of evidence toward the conclusion. This general design comes easier with practice, and is therefore much harder for beginners to accomplish. But even experienced authors are usually unable to keep the overall design of their message constantly in mind, able to sit down and write the book straight through from beginning to end. It's true that Sir Walter Scott was said to turn over the last page of a book and immediately begin writing the first page of the next one without getting up from his desk. But we aren't talking about pot-boilers; we're talking about serious books. That includes almost all non-fiction and the great majority of serious fiction.
As a matter of fact, the description includes the majority of short articles as well; newspaper editorials would be a good example. Although the style of an editorial is to start with a generality, marshal a description of some recent events, and end with a short summary, that isn't in fact how it is usually composed. The editorial writer starts with a one-liner, or call to action, organizes some recent events and some historical arguments as a reason to issue such a call, and then ends up by summarizing things in the first paragraph. Having mentally designed the editorial around such a three-step pattern, an experienced editorialist can sit down and write it from beginning to end and, after a few touch-ups, it's ready for the printer. He really has gone through the same organizational process a book author needs to go through, but the article is short enough that the reconstruction is performed in his head. In a book, it is generally necessary to write out the chapters in a jumbled way and later re-organize them. A new author with his first manuscript generally doesn't adequately appreciate this, and has to be muscled by the editor, at least just a little bit. One of my editors summarized his job as follows: you tell every new author to take the first four chapters of his book and throw them away.
That's cruel, of course, and is seldom accepted graciously. The brusqueness is justified by understanding that the fresh new author thinks he's all finished when he isn't. He's silently telling himself that he means to tell that editor, "Don't you touch a single comma of it." In the old days, authors were rare and had to be coddled. Book publishers in the Eighteenth century purchased the manuscript in its entirety, either losing money or making a huge fortune, but leaving the author with only his manuscript price. At that time, publishers called themselves booksellers. As things evolved, booksellers often had to support a starving author while the book was being written, offering an "advance" payment to be deducted from royalties paid after final sales to readers. Author royalties were about ten percent of sales. The royalty system persists today, but advance payments are uncommon and negotiated around the effects of the tax code. All of these payment evolutions reflect the underlying issue: good authors used to be rare, but now are plentiful. Book publishers used to be wealthy, but now are rapidly going bankrupt and extinct. Authors of excellent books have a hard time finding someone to publish them. The advent of the personal computer around 1980 is what caused this.
In 1980 I published a book, writing it on my brand-new Radio Shack TRS-80, Mod I. The editor of the publishing house had never heard of such a notion, scoffed at it, and declared he would never touch such a thing as a computer. In 2010 there were more than 800 million personal computers manufactured and sold, and by this time almost no publisher will accept a manuscript without an accompanying magnetic disc, which makes revisions cheap and easy and shifts the cost of key-entry from the publisher to the author. At first, manuscripts were shipped to India for key entry. Now, it is the diskette which is mailed or e-mailed to India, and the editor is often located in India. We are fast approaching a day when the keyboard and author remain at home, sending material to gigantic server computers in China, from which an editor anywhere in the world can retrieve the material and revise it, returning it to the server computer where the author can comment on the revisions without moving from his desk. After that stage looms the prospect of the reader paying a fee on his credit card to access the "book" directly from the server, reading it at home. At that point maybe the book will have been completed, and maybe it will be revised some more. In a sense, a book will never be definitely finished; allowing the public to read it will only be an episode within an unending process of revision. Newspapers, magazines, and books are all struggling to find a way to cope with this unpredictable evolution. Like most revolutions, this one doesn't have a clear idea where it is going.
So let's reflect back on the central process of authoring. In addition to the old maxims of the trade, there is the Euclidean reality that you can't write the last chapter until you somehow write a first one. The original first chapter, the one the author struggled so hard to compose, is destined to be cast off and replaced by a new first one, one that succinctly announces what is about to be said. After that must come a new second chapter, which takes the reader from the initial disconcerting summary back to the origins of the problem now about to be clarified. Then follows a third chapter, probably one shifted forward from the assorted chapters of evidence into prominence as the key piece of evidence, the one which leads to other confirmatory pieces; after that marches the parade of confirmatory evidence, ending with one zinger of a conclusion.
Voltaire or some other cynic would probably comment that what has here been outlined is merely an elaborated process of what editorial writers do: start with the conclusion and find slanted facts to fit it. Some may indeed do that. But there is some hope that the inevitable impact of technology on authorship can bring us to a system where many authors will assemble the facts, and only then derive a conclusion from them. If politicians would only adopt that system, maybe we could hope for a perfect world.
Modern Print Press
There are many more authors than publishers of books. Since almost every school child owns and uses a home computer, this disparity might be even greater except for a technical barrier between home computers and high-speed printing presses: the way they treat illustrations. A presently insurmountable mismatch arose when commercial printing presses migrated from movable type (i.e. Gutenberg style) to page-images, whereas the computer industry aimed for cheap printers which perfected the Gutenberg method instead of replacing it. Commercial printers need to produce high-volume output inexpensively, while cheap computer printers produce low-volume output and tolerate a rather high unit price. Most of the barriers between the two have been overcome, except for photos and other detailed images. Let's give a simplified explanation.
Since 1993, when Adobe invented the method, commercial printers have generally worked from what amounts to a photograph of each page, called a PDF or "Portable Document Format" file. To some extent, portions of a page can be stitched together like a patchwork quilt, but of course all the pieces must be uniform in their technology. The industry standard is that everything is printed at 300 dots per inch; that's essentially 300 pixels per inch. The establishment of this standard made it possible for huge high-speed presses to produce hundreds of pages of newsprint a minute on machines which cost millions of dollars apiece. Commercial printing during the first half of the Twentieth century accepted the massive cost of the printing machine in order to promote production speed. Home computer printers sacrificed production speed in order to become cheap. Profitability comes from the ink, not the printer.
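To make the 300-versus-72 arithmetic concrete, here is a small sketch in Python (the language and the function name are my own illustration, not anything from the printing industry) of the pixel counts the two standards imply for a letter-size page:

```python
# Pixel dimensions implied by print resolution.
# Assumptions: letter-size page (8.5 x 11 inches); dpi figures from the text.

def pixels_for_page(width_in, height_in, dpi):
    """Return (width_px, height_px) needed at a given dots-per-inch."""
    return round(width_in * dpi), round(height_in * dpi)

press = pixels_for_page(8.5, 11, 300)   # commercial press standard
screen = pixels_for_page(8.5, 11, 72)   # typical monitor / desktop printer

print(press)    # (2550, 3300): over 8 million pixels per page
print(screen)   # (612, 792): roughly 17 times fewer pixels
```

The ratio explains why the two worlds diverged: a press must move more than sixteen times as much image data per page as a desktop printer bothers with.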
Low-volume desktop printers can afford to take the time to examine each character or image as it comes along and readjust appropriately. Essentially, computer printers do individual typesetting every time a new page is printed. They are thus able to exploit considerable compression, for storage or for the speed of electronic transmission when printing at a remote location. For them, 72 dots per inch are sufficient, since computer-driven printers have acquired the facility to guess the gaps between dots (dithering) well enough to fool the eye of the reader. Since the same is true of display monitors, there is resistance in the computer industry to sacrificing the interests of the multitude to the needs of those comparatively few authors and publishers who use the mass-printing industry to keep their unit costs low. Computer printing squashes thousands of pixels down to 72 per inch.
But it's hard to convert photo images back from 72 to 300 dpi, since the dithering trick won't stretch that far. A good illustration is the washing of wool socks: you can throw argyle socks in a washing machine and they will shrink to the size of baby booties, but they won't stretch back up if you decide to wear them. The usual expedient is to enlarge a small picture and take a second picture of it, then enlarge the enlargement, and so on. Alternatively, the Genuine Fractals program by the Altamira Group comes closer to achieving the desired result, but even it has limits. When someone has more than a very few pictures that need stretching between Internet screen display and commercial printing, the current best advice is to store two different-density copies of the same image, and use each as required.
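The argyle-socks point can be shown in miniature. Below is a toy Python sketch (a single row of made-up grayscale pixel values stands in for a real photograph; the 4:1 factor roughly mirrors 300-versus-72 dpi) demonstrating that shrinking an image discards detail that no amount of stretching brings back:

```python
# Downsampling discards detail that upsampling cannot recover.

def downsample(pixels, factor):
    """Average each block of `factor` pixels into one (simple box filter)."""
    return [sum(pixels[i:i + factor]) // factor
            for i in range(0, len(pixels), factor)]

def upsample(pixels, factor):
    """Naive stretch: repeat each pixel `factor` times."""
    return [p for p in pixels for _ in range(factor)]

original = [10, 200, 10, 200, 10, 200, 10, 200]  # fine alternating detail
small = downsample(original, 4)                  # [105, 105]: detail averaged away
stretched = upsample(small, 4)                   # [105] * 8: a flat gray smear

print(small)
print(stretched)
print(stretched == original)  # False: the socks won't stretch back
```

Real resampling software uses cleverer filters than this, but the information loss is the same in kind, which is why keeping a high-density copy alongside the 72-dpi one is the safe course.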
A slight re-design of workflow is advised, to create still a third version of the image, for storage. The biggest possible image, with the most pixels possible, should be stored on the author's or publisher's own computer. A second, shriveled 72-dpi image is sent to be stored on the host computer of a website, and can be used for desktop printing as well. When commercial printing happens to be desired, the much larger stored image can be shriveled to 300 dpi by conversion of the page to PDF format, or else (for editing) this third version of the image is incorporated into Microsoft Word, subsequently re-incorporated back into a PDF file for final printing. The PDF conversion is quick and simple once a decision is made as to what the final product should look like after it is printed. Fine art display and other highly demanding graphics will follow this general outline as well. Those who have any aspiration for high-density output would be well advised to go back to the original photograph.

In 2008, that translates into another maxim for the photographer who takes the original picture: always take all photos in RAW format, with a view toward later flexibility of use. Less demanding output can be produced from degraded copies of the RAW original, the most common of which is now the so-called JPEG format. Those who discard an original RAW image are almost always sorry, and those who buy cheaper cameras that go straight to JPEG are just asking for frustration. Now that memory chips have become cheap, the extra technology to generate the RAW image does not greatly increase the cost of a camera, but it greatly enhances the ability to retouch images without breaking up under the repeated dithering and re-dithering usually required for retouching. A number of steps in the photography process could be eliminated if the user would accept some new step resembling retouching as part of every snapshot.
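The three-version workflow described above can be sketched in a few lines of Python. The function names and the sample dimensions are my own illustrations; the point is simply that both derived versions are computed down from the master, never up:

```python
# Three-version workflow: a full-resolution master kept at home, a
# 300-dpi copy derived for the press, a 72-dpi copy derived for the web.

PRINT_DPI = 300
WEB_DPI = 72

def derived_sizes(width_in, height_in):
    """Pixel dimensions needed for the print and web versions of an
    image meant to appear at the given physical size in inches."""
    return {
        "print": (round(width_in * PRINT_DPI), round(height_in * PRINT_DPI)),
        "web":   (round(width_in * WEB_DPI), round(height_in * WEB_DPI)),
    }

def master_is_sufficient(master_px, width_in, height_in):
    """The stored master must be at least as large as the print version,
    since shrinking down is safe but stretching up degrades the image."""
    need_w, need_h = derived_sizes(width_in, height_in)["print"]
    return master_px[0] >= need_w and master_px[1] >= need_h

# A 4x6-inch print needs 1200x1800 px. A 3000x2000 master (as from a
# RAW original) is ample; a 612x792 web-sized copy is not.
print(derived_sizes(4, 6))
print(master_is_sufficient((3000, 2000), 4, 6))  # True
print(master_is_sufficient((612, 792), 4, 6))    # False
```

This is also the arithmetic behind the RAW maxim: only the master has pixels to spare, so it is the one copy that must never be thrown away.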
Current American law makes it illegal to broadcast the present free (analog) version of television after February 17, 2009. Couch potatoes, man the barricades.
Future Directions for Book Authoring
Personal computers have already revolutionized book authorship by greatly expanding it, and revolutionized book publishing by drowning it in authorship. What comes next?
Modern Printing and Post-Modern Printing
Like computer monitors, computer printers operate at 72 dpi. Commercial publishing uses 300 dpi, and it's a little awkward to switch back and forth.