The “Land Battleship”

Modern life, especially in the era since roughly the introduction of the CD player in the early 1980s, has conditioned us to deal with an increasingly rapid pace of technological change. One day $1000 “bag” style cell phones weighing 5 pounds are being used by highly paid professionals who think of them as status symbols. Seemingly in the blink of an eye, everyone on the planet is suddenly carrying smartphones that make bits of the original “Star Trek” series seem old-fashioned.

"Spock, where's my Android?"

“Spock, where’s my iPhone?”

Film cameras, once widely used after Kodak brought the box camera to the market around 1900, suddenly vanished in favor of digital photography.

Let’s get some perspective here. Phonographs (‘record players’ to some) arrived around 1875, and for the next 110 or so years were the standard for sound reproduction in most homes (tape recorders only turned up in the 1960s in any quantity). The CD appeared and lasted maybe 20 years, only to be replaced by digital media for a large segment of the population. Analog (film) photography had a similar lifespan – perhaps a century – before digital all but destroyed the medium. In both cases, the transition was sudden.

We’re used to this sort of change. Earlier generations were not. For example, archaeologists suggest it might have taken 10,000 years for humans to move from stone to early copper tools. Now, we shift our paradigms on a regular basis. It’s no wonder we’re a bit neurotic.

But why all this background material?

Planes, Trains, and Tanks

Envision the technology of the Civil War. Probably the biggest advances involved breech-loading rifles and early forays into rapid firing guns like the Gatling. Otherwise, warfare was conducted more or less as it had been for hundreds of years, with cavalry charges and men marching into battle in tight formation. Ships were still mainly wood, with early “ironclad” steam powered vessels making some of their first appearances.

Then, in a generation, it all changed. The machine gun, steel ships, larger ordnance, the automobile, telegraphy, and early radios changed the face of war.


World War I German Maxim Machine Gun

The problem, however, was that military leaders often clung to outmoded tactics and refused to acknowledge the effect that modern technology was having on their profession. Thus we had the First World War, complete with trenches, million-man battles that achieved nothing aside from keeping burial details busy, and stagnation for years on end. Cavalry charges against fixed positions and machine guns were quickly found to be suicide missions.

Airplanes, which started off as observation platforms, emerged four years later with machine guns and bombs. Navies became focused on larger vessels, with battleships and ‘dreadnought’ class ships reaching up to 35,000 tons. The contraption that quickly became known as  the ‘tank’ made its first appearance on the battlefield.


World War I tank

Into this mix came a number of fanciful ideas about how warfare might continue to evolve. One of these was a paper by Rear Admiral Bradley Fiske entitled “If Battleships Ran on Land,” which introduced some very amusing and (to modern eyes) ridiculous speculation.

Fiske’s paper, which was presented to the US Naval Institute around 1912, talked about the difference between armies and ships. Fiske, a well-known technical innovator with a range of significant patents already in his pocket (including telescopic sights on warships, a massive innovation at the time that no one else seems to have thought of), appears to have been writing in a more theoretical vein. The paper discusses the difference between, say, a battleship that can be steered and maneuvered by one man (the helmsman) and the problem of controlling an army of hundreds of thousands. He asks:

“Now can anybody imagine the entire standing army of Germany being carried along at 27 miles an hour and turned almost instantly to the right or left by one man? The standing army of Germany is supposed to be the most directable organization in the world; but could the Emperor of Germany move that army at a speed of 27 miles an hour and turn it as a whole (not its separate units) through ninety degrees in three minutes?”

Basically, Fiske was talking about the problem of command and control in the field. Note that today’s military commanders, with instant access to nearly every unit in the field via radio and satellite communications, would probably shrug and ask what the problem was. But in Fiske’s day, the problem of large-unit tactics was a major issue. Men were still galloping from one unit to another on horseback to deliver movement orders to attacking units. Radios were available, but they were heavy and often unreliable under battlefield conditions. Infantry units still used semaphore flags to pass movement orders along.

The amusement factor comes into play when the ‘civilian’ reaction to Fiske’s paper is considered. The paper was reprinted in The Book of Modern Marvels (1917), a volume dedicated to newly discovered and emerging technologies; that book is a gold mine of material from this period and will be covered in future blogs.

The reprint was prefaced by the fanciful illustration shown below.

"Hard to port, Mister Christian! Mind that building!"

“Hard to port, Mister Christian! Mind that building!”

This is pretty obviously not what Fiske was talking about. Clearly, some artist got hold of the “land battleship” phrase and decided to run with it. What would such a behemoth look like, anyway? One major question is how one would get it on shore in the first place. Or was it supposed to be amphibious, sailing across the ocean to emerge on its wheels to attack the hapless enemy?

As the Russians say, “it is to laugh.”

From the Ridiculous to…the Even More Ridiculous

As if the above illustration wasn’t enough, an inventor named Frank Shuman (one of the fathers of solar power, incidentally) apparently either had a high fever or decided to engage in a fit of whimsy. He wrote another article, also found in the Book of Modern Marvels, entitled “The Giant Destroyer of the Future”. In it, he envisioned a monster machine with 200′ high latticework wheels and massive weights hanging beneath. He also uses the term “battleship on land” to describe the device, though whether he’d read Fiske’s article is unknown.


Shuman’s “Giant Destroyer” on Land

Shuman envisions his invention running rampant across the enemy’s countryside, destroying towns and defenses with frightening ease. It sports a heavily armored cabin holding “perhaps thirty men,” suspended between the two wheels along with the massive steam (!) engines needed to run the beast. He elucidates:

“I am fully aware the problem of obtaining engines which will give this war machine a speed of one hundred miles per hour is not easily solved. But if thousands of horsepower can be developed by the engines of pitching and rolling battleships it is not unreasonable that competent engineers can be found to design and build steam engines of twenty thousand horsepower, fed by oil-fired boilers.”

He waxes almost lyrical at the end of the article. “The commander gives a signal. The machine moves. It gains headway. Soon it travels at express-train speed. […] An enemy village, occupied by enemy soldiers lies in front. The machine speeds on toward it. It reaches them. Houses are battered down as if they are made of paper.”


River? What River?

Clearly, Shuman had good intentions and was definitely not a crank. Or at least he wasn’t a crank when he invented a number of other very important devices. In this case, how he thought his “land battleship” could be driven by steam engines and would be robust enough to hold up under the pounding it would receive is unknown. Maybe he was just being fanciful.

Or maybe this was an outgrowth of the massive loss of human life during the Great War — the conflict that was supposed to be “the war to end all wars.” The idea of a few soldiers mowing down vast swaths of enemy troops and infrastructure with little loss of life on their own side was surely appealing after horrors like Verdun (over 700,000 casualties), Passchendaele, the Somme, and the disastrous Gallipoli landings (over 500,000 casualties). In 1917, the US was just entering the war and hadn’t experienced the huge numbers of war dead suffered by the other powers.

Fiske and Shuman also could not have foreseen the subsequent rise of air power and its impact on warfare, even a decade later. In 1921, Billy Mitchell showed that bombers could sink even the largest warships in his demonstration against several German ships seized at the end of the war. No one listened. In 1938, a few B-17 bombers made the news by intercepting the Italian liner Rex while she was still 620 miles off New York. This had the sole effect of infuriating our Navy, which thought of the oceans as its private hunting ground and promptly lobbied to have the Army Air Corps limited to operational flights of under 100 miles from shore.

It wasn’t until World War II, which saw the sinking of a number of unescorted battleships and other large vessels by enemy aircraft, that the era of the battleship came to an end. The Land Battleship was obsolete before it could ever be built, or even seriously considered.

Would it have worked? It’s pretty doubtful, and the advent of anti-tank guns as well as larger  bombers would likely have made short work of the Land Battleship, as was found with the seagoing version. It would have been interesting as a terror weapon, but probably little more. Like the bag phone, film camera, and tape recorder, the battleship in all its forms (with apologies to the USS Missouri) went into the dustbin of technology.

But it would have kicked serious ass at a “monster truck” competition.

Into the Modern Age: The Adoption of Standard Time

Our modern perception of time, with its almost slavish devotion to millisecond accuracy, was a totally alien concept to even our recent ancestors. Although mechanical timepieces were known even in the ancient world, time and timekeeping were inexact. Accuracy was generally low, with devices often “drifting” significantly. Where public clocks were in use, men employed as clock setters periodically adjusted a city’s clocks according to local custom, which was generally based on local mean time, determined by the position of the sun at noon. Since each degree of longitude represents a difference of roughly 4 minutes on a solar-based time system, under solar time if it was noon in Boston it was 11:24 in Charleston, South Carolina, 11:36 in Washington, DC, 11:48 in New York City, and so forth.
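To make that arithmetic concrete, here is a minimal sketch of the 4-minutes-per-degree rule in Python. The city longitudes are approximate values chosen for illustration, not figures from any period source, so the last digit of each result can vary slightly depending on the values used.

# Local mean time from longitude: 360 degrees / 24 hours = 15 degrees per hour,
# so each degree of longitude is worth about 4 minutes of solar time.
# Longitudes below are approximate, in degrees west of Greenwich.
CITIES = {
    "Boston": 71.06,
    "New York": 74.01,
    "Washington, DC": 77.04,
    "Charleston": 79.93,
}

def local_mean_time(city, reference="Boston"):
    """Local mean time in `city` at the moment it is noon in `reference`."""
    offset = (CITIES[city] - CITIES[reference]) * 4   # minutes behind the reference city
    minutes_after_midnight = 12 * 60 - round(offset)
    return "{}:{:02d}".format(minutes_after_midnight // 60, minutes_after_midnight % 60)

for name in CITIES:
    print(name, local_mean_time(name))
# Roughly reproduces the figures above: Boston 12:00, New York 11:48,
# Washington 11:36, Charleston about 11:25.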

This system worked because communities were relatively isolated from one another, and the idea of time based on the sun’s position was well understood. There was no need for pinpoint accuracy. A person traveling by foot, or even by horse, was not concerned with exact time schedules and certainly had no need or desire to know precisely when they might arrive at their destination. Upon arrival, they simply adjusted their watch (if they owned one) to the local mean time customary to that location.

Most histories cite the rise of the railroads, with their complex timetables and need for precision across larger distances, as the impetus behind the creation of what we now know as Standard Time. This is overly simplistic, however. While railroads were indeed a factor in the equation, other groups (including astronomers and telegraph companies) played a much larger initial role in its adoption. Railroad companies jumped on the bandwagon only later, once it was apparent that a standards-based system was about to be adopted. They did so solely to protect their interests, and to guarantee that the new system met their own needs.

Typical early railroad timetable, with no time standard or zone indicated

The Origins of Standard Time

The concept of standard time did not originate in the United States. The United Kingdom adopted the use of a single time zone for the entire country on December 11, 1847, when railways in that country switched from local mean time to a single zone based on GMT (Greenwich Mean Time). The UK had the advantage of covering only about 8 degrees of longitude, so a single time zone sufficed. Communities at the furthest reaches needed to shift their clocks by no more than about half an hour (forward or backward, depending on their location) to adopt GMT as their standard time. Adoption was relatively rapid: by 1855, most of the country’s public clocks were set to GMT.

The United States had a far larger problem, since its territory (excluding Alaska and Hawaii, which were not even states at this time) stretched across 57 degrees of longitude. Hundreds of time standards were in use across the country. One source cites the presence of no fewer than 38 local times in mid-19th-century Wisconsin, 27 in Michigan, and 23 in Indiana. Unlike England, the US also lacked a well-known central city or feature on which to base any national system of timekeeping.

An Increased Need for Precision

We have already noted the popular misconception that railroads were the prime motivating force behind the adoption of a standard time system in the US. However, railroads did suffer from time-related problems as their areas of operation grew. To compensate for the plethora of local mean times, each railroad eventually introduced its own corporate timekeeping standard, generally known as railroad time. These times were used to coordinate the movement of trains as well as arrival/departure schedules. In many cases companies adopted the local mean time used at their home office as the official time for their entire line. A Philadelphia company adopted that city’s time as its standard, for example, while a New York firm ran its trains on New York local time.

Boston & Providence RR Standard Time notice (courtesy Harvard University Collection of Historical and Scientific Instruments)

Initially, train timetables were established based on known travel time and distance between stations, a practice that often resulted in accidents and delays. Beginning in the 1850s, schedules were managed using another new piece of technology: the telegraph. Once widely adopted, it allowed railway companies to send updated schedules and other data along the line in a nearly instantaneous manner, warning stations of delayed or disabled trains. By 1855, most companies used telegraphic means to manage schedules and to ensure that trains were spaced safely along the line.

Central RR of New Jersey timetable prior to establishment of Standard Time. Note the comment that “New York Time is the standard, as indicated by the Clock in Elizabeth Station”

This system worked well, but often generated confusion among passengers. A traveler arriving at a train station served by several lines might be confronted with half a dozen clocks, each set to an individual line’s railroad time. These displayed times might not even match the local mean time for the station, thus adding another dimension of complexity. Anyone scheduling a trip involving multiple railway companies had to account for differences in each company’s standard, since in some cases the arrival time of a train operated by one line might be later than the departure time used by another. So while railroad companies were fully capable of operating under their own time system, this scheme did nothing for the passenger.

The first known discussion of a standardized railroad time in the United States came in 1852, when a writer in an industry journal recommended that New York City’s time be adopted by all operating companies as “the first [or sole] meridian of railroad time.” No serious discussion of this suggestion appears to have resulted, though the subject was brought up occasionally afterward.

It was not until 1869, when Charles F. Dowd presented a plan at a New York convention of trunk line (railway) operators, that some momentum toward adoption of a national railroad time began. Dowd noted the confusion travelers felt when confronted by multiple independent time systems in use by the various lines, stating that “these variations are governed by no general principle which would enable a person familiar with them in one locality, to judge of them in another. Any traveler, therefore, upon leaving home, loses all confidence in his watch, and is, in fact, without any reliable time.” He proposed a system that adopted Washington DC’s local mean time to manage all national railroad schedules, with each city retaining its own local mean time for other purposes. A year later he elaborated this system still further by proposing four zones, indexed on the Washington meridian and separated by exactly one hour; each zone’s offset would become the standard railroad time within it.

Charles Dowd’s Proposed system of time zones
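For the curious, here is a toy sketch in Python of how a Dowd-style zoning scheme works: every location gives up its own local mean time and takes the whole-hour offset of its zone instead. The base meridian and city longitudes are illustrative assumptions, not values taken from Dowd’s actual proposal.

# Toy model of a Dowd-style scheme: four railroad-time zones, each one hour wide,
# indexed on the Washington meridian (assumed here to be about 77 degrees west).
BASE_MERIDIAN_W = 77.0   # illustrative assumption
DEGREES_PER_HOUR = 15.0  # 360 degrees / 24 hours

def zone_offset(longitude_west):
    """Whole hours by which a location's railroad time lags the base (Washington) zone."""
    return round((longitude_west - BASE_MERIDIAN_W) / DEGREES_PER_HOUR)

for city, lon in [("New York", 74.0), ("Chicago", 87.6),
                  ("Denver", 105.0), ("San Francisco", 122.4)]:
    print(city, zone_offset(lon))
# New York 0, Chicago 1, Denver 2, San Francisco 3: four zones, one hour apart,
# instead of hundreds of separate local mean times.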

However, after some initial discussion the railroads lost interest in such a plan, remarking that “the disadvantages the system seeks to avoid are not of such serious consequences as to call for any immediate action on the part of railroad companies.” In other words, companies were perfectly content with their existing system of “private” railroad times and apparently felt any inconvenience and confusion it caused individual travelers was not their problem to solve.

Astronomical Influences

With railroads losing interest, the task of gathering support for a standard time scheme fell to another group. Astronomers also had the problem of maintaining a universal time system for the purposes of coordinating celestial observations.  In 1875 a researcher named Cleveland Abbe wrote of the difficulties he experienced when attempting to coordinate observations of an aurora borealis on April 7, 1874. He had recruited 100 observers – twenty from the Army’s Signal Service along with eighty volunteers – in an attempt to determine the aurora’s altitude above the Earth, which was then an unknown quantity. These observers were instructed to telegraph observations of the aurora, along with the precise time the phenomenon became visible, to Abbe’s office. They were instructed to note their exact location and to state the time in terms of local mean time. Abbe could then identify and correct each time to a single standard.

The experiment failed because, as Abbe stated, “the errors of the Observers’ clocks and watches, and even of the standards of time used by them, are generally not stated…so that the uncertainty of this vitally important matter will be found to throw obscurity upon some interesting features.” He also discovered later on that some had used railroad time rather than local mean time. This lack of accuracy rendered many of the observations useless for his purposes.

 

Cleveland Abbe (1838-1916), first
chief meteorologist of the US weather service

Abbe’s letter, which was written to the relatively new American Metrological Society, urged “action by the Society to secure the adoption of a uniform standard of time.” The Society formed a Committee on Standard Time, with Abbe as chairman, and began researching options. In 1879 the committee issued its Report on Standard Time, which became a key document in the adoption of uniform time across the United States. This report noted the seventy-five “standard” times then in use by the railroads, and recommended adoption of no more than five time zones for North America. These would be known as “Railroad and Telegraph Time” and would be indexed to the Greenwich Meridian – GMT.

However, the document went further. It recommended a subsequent step: the establishment of a national standard for the United States using a single meridian through the Mississippi Valley, six hours behind Greenwich Time. Cities and towns would discontinue the use of local mean time, instead adopting the time standard in use by their primary local railroad. The whole system was to be implemented by the railroad and telegraph companies, since they were already equipped to manage such a large venture and had experience managing time schedules across long distances.

In 1880 Abbe also began corresponding with Sandford Fleming, a Canadian railway engineer who was starting his own campaign for adoption of a prime meridian system using a single, universal world time. In one of his letters to Fleming, Abbe wrote that his goal was to have the rail and telegraph companies “make a beginning,” and then to “hammer away at our national congress to call for its action on the subject.” The pair also called for an international convention to discuss the problem of standard time, which plagued steamship lines and other companies involved in wide-ranging trade as well. This partnership, along with growing international interest in resolving the problem, eventually resulted in the International Meridian Conference, held in Washington DC in 1884. In the meantime, a great deal of progress was made between 1881 and 1884 toward the creation of Standard Time zones for the US and Canada.

Sandford Fleming, Canadian Railroad
Engineer and Standard Time advocate


First, legislative efforts began to bear fruit. In 1881, an astronomer at Yale University persuaded the Connecticut legislature to enact a statute making New York City’s local time the standard for Connecticut. This was the first legislation of its type, and it forced local railroads to display this time in their stations. In that same year, the Naval Observatory caused a bill to be introduced into the US House of Representatives. This bill “specified the Naval Observatory’s Washington time for official uses” – in other words, it established a national time standard, similar to railroad time, for official national business. The bill died in committee, but it showed that interest in the establishment of a national standard existed even at the Federal level.

In 1882, petitions from the American Metrological Society and other concerned groups persuaded the President to sign an act authorizing Abbe “to call an International Conference to fix on and recommend for universal adoption a common prime meridian, to be used…in the regulation of time throughout the world.” Soon afterward, the State Department sent invitations to various foreign governments, asking them to participate.

In the same year, observatory directors (who were the nation’s de facto timekeepers) and groups such as the American Society of Civil Engineers began discussing a multi-zone time system for use across the United States. Such a system, they realized, could meet the needs of the scientific community as well as railroads, steamship companies, and the public at large.

The Railroads Take Charge

As part of efforts to stimulate adoption, Abbe’s committee invited various industrialists and scientists to participate in the Society. One invitation went to William F. Allen, secretary of the General Time Convention maintained by US railroad companies. In April 1883 Allen, having become aware of the efforts being made toward adoption of Standard Time by other agencies, wrote an editorial for the General Time Convention. He stated that “We should settle this question among ourselves, and not entrust it to the infinite wisdom of the…State legislatures.” Laws were seen as unwelcome because “there is little likelihood of any law being adopted in Washington, effecting [sic] railways, that would be as universally [acceptable] to the railway companies.”

The adoption of Standard Time accelerated following Allen’s editorial. It is clear the railroads had no internal need to adopt the new system. Instead, their interest in implementing a national standard stemmed from a desire to prevent “outsiders” from designing a system that might not be in the railroads’ best interests.

Meetings in 1883 formalized a system of meridians, each exactly one hour apart, that formed the basis for what was then known as Standard Railway Time. By a happy coincidence, the proposed Eastern meridian of Allen’s system coincided, to within 6 seconds, with the meridian lying exactly five hours from GMT. This reinforced Allen’s belief that Greenwich time should be adopted as a world standard.

Despite scattered opposition from individuals claiming a departure from solar time was against God’s will or “unnatural,” Standard Railway Time was formally implemented on November 18, 1883. Allen wrote that on that date in Philadelphia, “the time-ball made its rapid descent, the chimes of old Trinity rang twelve measured strokes, and local time was abandoned, probably forever.” The new system was almost universally adopted within a year. New Yorkers set their clocks back four minutes. Chicagoans stopped theirs for nine minutes. Those in Philadelphia, New Orleans, and Denver did nothing since they were located at the base meridian for their time zone.

Official Railroad Guide December 1883,
following adoption of Standard Railway Time
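The adjustments cities made that day were simply the gap between their local mean time and the new zone meridian (75, 90, 105, or 120 degrees west), at 4 minutes per degree. A quick sketch, again using approximate longitudes chosen for illustration rather than historical survey figures:

# Minutes a city's clocks moved on November 18, 1883: the difference between
# its local mean time and the new zone meridian, at 4 minutes per degree.
# Positive values mean clocks were set back; longitudes are approximate.
def adjustment_minutes(longitude_west, zone_meridian_west):
    return round((zone_meridian_west - longitude_west) * 4)

for city, lon, meridian in [("New York", 74.0, 75),
                            ("Philadelphia", 75.2, 75),
                            ("Chicago", 87.63, 90)]:
    print(city, adjustment_minutes(lon, meridian))
# New York: about 4 minutes back; Philadelphia: about -1 (essentially nothing,
# since it sits almost on the 75th meridian); Chicago: about 9, matching the
# figures quoted above.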

 

Naturally, some conflict occurred. A few towns refused to adopt the new system. A Boston man was threatened with arrest because he arrived several minutes late for an appointment with a local judge who refused to acknowledge the new time system.  In 1918, the US formally adopted Standard Time and abolished all local timekeeping. The modern era had arrived, for better or for worse.

Halloween: Thank the Celts

The holiday we all know as Halloween (or Hallowe’en to some) is often thought of as one dedicated to children. Some folks I’ve encountered over the years even thought it was started in the early 20th century by candy companies as an excuse to sell more product! In fact, nothing could be further from the truth. Halloween is an extremely old holiday that originated with the Celtic/Gaelic peoples of Ireland, France, and England several thousand years ago, when it was known as Samhain (pronounced “sah-win”).

Samhain was (more or less) the Celtic “New Year,” but it was much more than that. It marked the time when final harvests were brought in and cattle were driven from summer pastures to winter quarters. It also involved huge bonfires that were part of an annual cleansing ritual for both people and animals. While we’re not completely sure of all the events surrounding the holiday, it appears the Celts also extinguished their home fires at this time, re-lighting them from embers carried home from the communal bonfires. This may have been a ritual designed to bond the community more closely together for the coming winter. Or maybe everyone cleaned their chimneys at this point and needed an excuse to put the fire out temporarily. We’ll probably never know for sure.

 

Samhain was also thought of as one of the times when the “door to the afterlife” or “other world” was open, and the souls of those who died during the previous year made their way to whatever fate awaited them there. Our modern Halloween ritual of going from door to door asking for treats probably comes from the Celtic practice of offering feasts for the departing dead. It was also probably the case that performers or local residents went from door to door, mimicking the spirits and receiving treats of some type. Those who refused might have been tormented by the “spirits”, thus giving rise to the practice of soaping windows and other light pranks in the modern era.

Since the “doors” were thought to be open, this was also considered an auspicious time for divination practices. One old rhyme says that young girls would throw a ball of yarn into a suitable location, like a cellar or old house thought to represent a threshold or liminal space, then wind it up while repeating a rhyme intended to disclose the name of the true love they would later marry. Whether this exact rhyme dates back to Celtic times is debatable, but it’s an interesting example of the sort of thing that might be attempted at this time of year.

Page 318 of The Journal of American Folklore, Vol. 5, No. 19, Oct. - Dec., 1892

Divination Folklore Example

For that matter, the tradition of the Jack O’ Lantern probably derives from the legend of a man known as “Stingy Jack,” who managed to trick the Devil on several occasions and in doing so kept his soul as his own. When he died, he was refused admission to heaven, and the Devil, having agreed earlier never to take his soul, told him to bugger off. The Devil did take pity on Jack, however, and gave him an ember to light his way through eternity, which Jack carried in a gourd (think “pumpkin”) he happened to be eating at the time of his death.

Roman Holiday

One general truism of history is that it’s easier to get a conquered population to adopt a conqueror’s religious practices by “overlaying” new religious holidays onto those that already exist. This is often intentional, but it’s also true (as some students of mythology have suggested) that certain patterns of holiday celebration are common across cultures. For example, many cultures have Spring and Autumn celebrations and others celebrate New Year around the time the sun starts noticeably spending more time in the sky. Some events just naturally fall on similar dates that align with seasonal changes. In any case, the Romans had holidays of their own that fell on roughly the same date as Samhain. The first, Feralia, was the Roman day honoring the dead (what an amazing coincidence!). The second was for Pomona, the Roman goddess of trees and fruit. Pomona’s symbol was the apple — think of the French word pommes, meaning apple — which may (or may not) explain the practice of bobbing for apples on Halloween.


Pomona and Her Apples

The Christians Take Over

Keeping with the custom of overlaying holidays in order to get a subject population to accept a new religion, Christianity decided to pull a fast one by moving All Saints’ Day, which was originally celebrated in May, to November 1st. This happened as early as the first part of the 8th century in the British Isles, but the move wasn’t made official until 835 CE. Another Christian holiday, albeit one generally only celebrated by the Catholic church, is All Souls’ Day, which usually falls on November 2nd. Both are (yes, you guessed it) festivals honoring those who died during the previous year, and are intended to send them speedily to their eternal reward.

So How Did We Get Halloween?

Derivation of the name is pretty obvious when the final piece of the puzzle becomes known. As one history of Halloween notes, “All Saints Day was also known as All Hallows, or All Hallowmas (Hallowmas is Old English for All Saints Day). Since Samhain was celebrated the night before November 1, the celebration was known as All Hallows Eve, and later called Halloween.” This also explains the “Hallowe’en” spelling, since “even” or “e’en” are contractions for “evening.” “Hallows Even” can easily become one word with a dropped “v” when said quickly.

All Hallows and other holidays were abolished under Protestant law, which abandoned the idea of saints altogether. It wasn’t until much later that the holiday formerly known as All Hallows morphed into what we now know as Halloween, complete with costumes, decorations and increasingly macabre displays, not to mention a sales bonanza for stores that sell appropriate paraphernalia during the season. What was a religious holiday became a secular one, and also diverged into related holidays like The Day of the Dead, which is celebrated mainly in Latin American countries.

But what about all the stuff with black cats, witches, and other creepy creatures? That all came in later, probably over hundreds of years, as bits of folklore were added to the story. How witches came to be so reviled in Christian tradition is a tale for another day. As for cats: in Medieval times they were commonly thought to be agents of the Devil (and companions of witches…), which led to numerous cat-killing sprees that may have contributed to the spread of the Black Plague…but that’s yet another tale for the future.

So if you want to be an authentic Halloween creature, pick a skeleton or corpse. All those vampires, witches, and werewolves are Johnny Come Lately characters that weren’t part of Samhain, or even the early Roman and Christian versions of the holiday. That’s so gauche…

Premature Burial and the Modern Age

“To be buried while alive is, beyond question, the most terrific of these extremes which has ever fallen to the lot of mere mortality.” — Edgar Allan Poe, “The Premature Burial.”

The fear of being shoveled six feet under while still breathing ranks fairly high on the top-ten list of Things People Least Want To Happen. Even worse is the idea of actually regaining consciousness afterward, with no one to call for help and no means of escape. This fear was very real in the days before modern medical practice and the invention of instruments capable of accurately detecting heartbeat (EKG or ECG) and brain activity (EEG). This long-held dread, coupled with a number of scientific discoveries and highly class-based social conditions in the Victorian era, led to a “moral panic” of sorts that created “protective societies” for the dead and scenes of hysteria at pauper funerals. Let’s look at the process that allowed this panic to emerge.

Back in these Bad Old Days, the tools available to a physician — presuming one was available, that is — examining a patient for signs of life were:

  • Listening and watching for breathing
  • An ear to the chest to listen for a heartbeat
  • Checking for a pulse
  • Whether the patient was starting to smell bad following failure of the other tests

Given the potential for error, the opinion of the presiding physician wasn’t always considered the last word. Prior to the advent of modern mortuary practice, corpses generally were “laid out” and prepared for burial by family members. It wasn’t uncommon for relatives (often children) to take turns keeping watch over Grandma’s body while these preparations were carried out and the funeral service arranged. This wasn’t just a sign of respect, either — one of the reasons for watching the body was to make sure it didn’t show any lingering signs of life. News stories, even up to the early 20th century, are full of accounts of someone actually regaining consciousness during a funeral service, or of a family member detecting movement during the period of mourning and reviving the fortunate survivor.

Poe’s story, which was at least partly cribbed from an earlier tale called The Buried Alive published in an issue of Blackwood‘s magazine of 1821, is only one example of the genre. The Danger of Premature Interment (1816) and Thesaurus of Horror (1821) both include lurid, terrifying tales of some poor soul trapped in their own coffin, scratching in terror and beating at the wood of the “narrow house” that confines them. Some stories circulated, like modern urban legends, from one generation to the next with only minor changes of location to keep them fresh.


Antoine Wiertz, The Premature Burial, 1854.

In one, a Henry Watt(s) is described as having been saved from his coffin by his own groaning, after it had been nailed up and was ready to be lowered into the grave. The same story was heard in Yorkshire, Surrey, and Oxford (at least) in the late seventeenth…or was it the eighteenth? No, it was certainly the nineteenth century (See, variously, The Yorkshire Wonder: Being a Very True and Strange Revelation How One Mr. Henry Watt Made a Great Noise in A Coffin (1698); The Surrey Wonder: Giving a True And Strange Relation of Mr Henry Watts (London? 1770); The Surprising Wonder of Doctor Watts (1800?)).

Another factor that played into this fear, especially in Europe where burial plots or crypts were often re-used after a few years, involved accounts of disinterred people showing signs of having awoken following burial. Many of these tales certainly belong purely to the realm of folklore, while others likely could be explained by other circumstances. But the myth persisted and the fear of premature burial was very real. It wasn’t until the invention of the stethoscope in 1816 that doctors had access to a marginally more accurate means of determining actual death. The concepts of ‘brain death’ or ‘cellular death’ were far in the future, and it was generally believed that once someone crossed the threshold there was no coming back.

Then, starting in the early 1800s, everything started to change.

Electrifying Experiments

Today, electricity runs the world. In the 18th century, it was little more than a toy. Wealthy men bought or had built crude electrostatic generators, which they used to make party guests’ hair stand on end or shock unwary friends.


Party trick: eighteenth century static electricity experiments.

Electricity quickly became a fad, with people experimenting with its possible uses in all sorts of odd contexts.


A 1753 experiment with static electricity and “electro-horticulture”

Experimenters also used “Galvanic piles” (today we call them “batteries”) in even stranger displays. In 1803, one Giovanni Aldini obtained the corpse of a recently hanged criminal named Forster. When the contacts of a large battery were connected to the man’s mouth and ear, observers saw Forster’s “jaw begin to quiver, the adjoining muscles were horribly contorted, and the left eye actually opened” (Aldini, An Account of the Late Improvements in Galvanism, London 1803).


A galvanized corpse

Some medical researchers thought it might be possible to use electricity, along with “dextrous management” (whatever that means) to resuscitate the dead — or indeed to put the living into a state of “suspended animation” from which they could be revived. Whether the latter belief was inspired by party tricks gone wrong, with participants knocked senseless by a jolt of electricity and later revived by another, is best left to the reader’s imagination.

Such experiments introduced significant ambiguity into the whole issue of life and death. If a Galvanic pile could cause a corpse to twitch, was the corpse really a corpse? Was it possible to move back and forth across the “threshold” of death, possibly bringing back knowledge of the afterlife? Such uncomfortable questions had probably never been considered outside the realm of religion. Now, it seemed, there might be ways to cheat or toy with death.

Other discoveries were soon to add even more ambiguity to the question, and to put the fear of premature burial into the minds of generations of Europeans and Americans.

Knockout Drops

Prior to 1846, all surgeries were carried out with the patient conscious and writhing in agony, held down by burly surgical assistants while the surgeon sliced off an infected leg or arm. Experienced surgeons could remove a leg, start to finish, in less than a minute (Dr. Robert Liston, who practiced in the 1840s, is said to have taken as little as 30 seconds on some occasions).

But in 1846, Liston decided to operate on a man named Frederick Churchill after the patient was rendered unconscious using inhaled Ether. A few minutes later, Churchill awoke, looked around, and asked when the operation was going to commence (peals of laughter from the audience…no, really). But because Ether was dangerous and unreliable, other chemicals were tried. In 1847 James Simpson, a professor of midwifery at Edinburgh University, spent the better part of a year sniffing, drinking, and otherwise testing every chemical compound he could lay his hands on. One day he woke up on the floor after having inhaled concentrated Chloroform vapors. Anesthesia had arrived.


Anesthesia using Ether

While the use of Chloroform dramatically improved surgical outcomes, it introduced a new problem. Patients were said to be “dead to the world” while under its influence; they felt (or at least remembered) no pain. So what state was the human body in while anesthetized? It wasn’t dead, since the patient was still breathing, yet it no longer responded to pain.

The discovery that multiple states of awareness existed between consciousness and death added even more complexity to the whole issue. New terms were coined to describe these proposed conditions: “trance,” “syncope,” “coma,” “catalepsy,” “suspended animation,”  and “human hibernation” were probably most popular, though others can certainly be added.

The idea that humans might be able to put themselves into a state resembling death (usually known as a trance or suspended animation), while retaining the ability to recover on their own was probably also influenced by other, unrelated discoveries. Europeans were more and more enthralled by the wonders of the Far East, with Indian “fakirs” and ascetic holy men claiming the ability (sometimes tested under allegedly scientific conditions, sometimes blindly accepted) to enter a state in which they required no food, water, or air. This, along with the discovery of various species of frog and toad that can “go dormant” for long periods in order to escape cold or dry spells convinced some researchers that humans might be able to perform similar feats.

Body Snatching

The final component that allowed fears of premature burial to emerge as a near “moral panic” in late Victorian Europe was the prospect of unwanted dissection by anatomists or surgeons after death, or of vivisection (accidental or otherwise) if the subject was not actually dead.


Is she dead or alive?

This fear was especially prominent among the poor, who worried that doctors would be too badly trained, too lazy, or otherwise unmotivated to ensure someone was actually dead before pronouncing them as such. (Compare this to modern fears, in areas like South America and Southeast Asia, that rich European or American clients hire doctors to steal organs from poor people.) Numerous cases of anatomists buying corpses of recently hanged criminals from professional body snatchers can be found.

A nightwatchman disturbs a body-snatcher

In some cases such fears were justified, while in others they were simply one more element of fear in the class struggle between the Victorian rich and poor. Elizabeth Hurren’s book Dying for Victorian Medicine: English Anatomy and its Trade in the Dead Poor, c.1834–1929 covers the myth and reality of this situation in great detail.

The Stage Is Set

We now have all the elements for a moral panic over premature burial. Medical science has discovered that death may not be absolute, that electricity can be used to “animate” corpses that, therefore, might not be corpses, and that various “not dead but not conscious either” states seem to exist. The looting of corpses from cemeteries is relatively common, not just for use as cadavers but also for petty theft of grave goods. The idea that Grandpa might be disinterred and victimized while still alive is even more horrifying than outright abuse of his corpse.

Into this volatile mix stepped an interesting cast of characters. One of the first, a Dr. Franz Hartmann, published a pamphlet in 1894 entitled Buried Alive: An examination into the occult causes of apparent death, trance and catalepsy. This was mostly a catalog of lurid tales of vivisepulture and its horrors, all undocumented and frequently ridiculous. A reviewer in the British Medical Journal in 1896 lambasted it, noting “what is to be thought of the credibility of an author who gravely narrates a case of an Englishman who died of typhoid fever in 1831, who was buried four days later, and after another four days of burial in a coffin in a grave was exhumed and found alive, and who stated that he had been conscious all the time, and that his lungs had been paralyzed and used no air, and that his heart did not beat?”

The same reviewer also (rightly) ridiculed Dr. Hartmann’s “still more startling statement […] that even putrefaction of the body is not a certain sign that life has entirely departed.” One wonders what signs Dr. Hartmann would accept!

Hartmann’s work was followed by that of William Tebb, Walter Hadwen, and others, who founded the London Association for the Prevention of Premature Burial (LAPPB) in 1896. This was not the only such group formed during the era, either: similar organizations existed in Germany and other countries. According to its charter, the LAPPB’s objectives were:

1. The prevention of premature burial generally, and especially amongst the members.

2. The diffusion of knowledge regarding the predisposing causes of the various forms of suspended animation or death-counterfeits.

3. The maintenance in London of a center of information and agitation.

4. According to an arrangement entered into with skilled and experienced medical experts in respect to death verifications, members of the association are guaranteed against premature burial.

Tebb was an archetypal, and extremely active, Victorian reformer. He was (variously) active in the abolitionist movement, an anti-vivisectionist, an early anti-vaccination crusader, a teetotaler, a vegetarian, a supporter of protections for children and animals, and a tireless crusader against premature burial. His LAPPB published a pamphlet called Premature burial, and how it may be prevented, with special reference to trance catalepsy, and other forms of suspended animation in 1905. He and others obsessed with this subject danced on the fringes of numerous controversial practices, effectively acting as anti-establishment gadflies for highly visible causes, no matter how odd or unsupported their positions might be. (One could say the same about certain modern anti-vaccination groups, but that’s another story.)

The Germans created an interesting and possibly unique response to fears of premature interment: “waiting mortuaries” where allegedly deceased individuals could be taken and monitored for signs of life. These first appeared in the 1790s and remained popular for much of the 19th century before dying out (sorry) in its later decades. In these facilities, bodies might be fitted with strings tied to bells that would ring if the corpse-patient (which would it be?) moved. Workers would sit for extended periods, generally among putrefying bodies, watching for movement. One hopes they were really, really well paid for this work.

While the fears generated by such groups were largely imaginary and overstated, Victorian-era doctors often had few professional credentials, and some may well have been incompetent at determining whether a patient had actually become a corpse. Groups like the LAPPB and others did have the effect of forcing governments to tighten standards governing which doctors were permitted to issue death certificates.

The process became easier once improved instrumentation became available, and doctors in the later Victorian era would generally have accurate stethoscopes and ophthalmoscopes (the latter used to examine the retina for deterioration), as well as hypodermic syringes to inject a small amount of ammonia under the skin in order to check for an inflammatory response. They might also have a magnesium lamp that would allow them to examine the thin skin between the fingers for signs of circulation (“Grave Doubts,” Journal of British Studies, Vol. 42, No. 2, April 2003). Such instruments, as well as more stringent training standards, did much to reduce claims of medical incompetence.

Marketing Opportunities

Naturally, all these fears inspired entrepreneurs. Certain types of “patent” coffins were designed to allow accidentally interred victims to activate surface-mounted flags via a long cord. The victim could simply begin pulling the cord to signal their distress to cemetery workers.


Safety Coffin

Those interred in surface crypts could enjoy the protection of special vaults with internally accessible handwheels. Someone awaking in such a chamber could simply turn the wheel to open the crypt from within and escape back to the land of the living.

Premature burial vault

Some people were so fearful of accidental premature interment that they left specific instructions in their wills to prevent the possibility altogether. One woman, for example, instructed her physician to sever the arteries in her neck so thoroughly that there would be no chance of her awaking in her coffin.

It was long rumored that certain very wealthy people (Mary Baker Eddy, founder of the Christian Science movement, is often cited in this myth) had telephones installed in their crypts in order to allow them to call for help if needed. None of these claims are known to have ever been verified, but they make great stories.

In time, still further improvements in the ability to detect signs of life using EEG and ECG machines, as well as additional research into the processes involved in death, would make groups such as the crusading LAPPB and its sister organizations obsolete. Today the debate has receded into subtle and often ridiculously acrimonious arguments over the exact nature of brain death (witness the Terri Schiavo case of 2001–05) and claims of “NDEs” (Near-Death Experiences) by patients who say they temporarily “crossed over” and returned. Many such claims lie firmly in the area of mysticism and wishful thinking, and are the modern stepchildren of Spiritualist and other movements popular during the Victorian era.

Premature burial fears seem quaint today, and are largely relegated to rural areas in less developed countries. A century ago, they were very real and very common in the most advanced nations on the planet. Ironically, the rapid advance of scientific understanding both created the panic of the late Victorian era and ended it.

The Tea Party Myth

One of the fundamental underpinnings of the American system is the concept that we broke away from Britain over issues like “taxation without representation”, or the idea that the American colonies were somehow excessively taxed. One of the cornerstones of this myth is the infamous Boston Tea Party, which is said to have been a protest against high taxes levied on imported tea.

The problem, as with so many aspects of our perception of the American past, is that this event was mythologized and altered drastically in order to fit the American independence narrative. Instead of a protest against high taxes, the original Tea Party was a protest against lowered taxes. What really happened was that the British East India Company (EIC), a major business with connections directly to King George III, was losing money at an alarming rate to Dutch and other importers who were selling their goods at lower prices than the EIC could match. The directors of the EIC therefore petitioned the King for aid, which came in the form of an import duty exemption granted to their company. The EIC was thus permitted to import tea into the colonies on a duty-free basis, which meant its price would undercut all others (and probably drive them out of business).

The Tea Party itself was by no means a spontaneous uprising of flag-waving patriots. It was instigated by leading figures such as Paul Revere and John Hancock. The latter was especially interested in rejecting British tea, as he’d made a fortune smuggling in Dutch tea and selling it to the colonists. He knew he’d lose that business if the EIC’s tax exemption was allowed to proceed, and used the classic tactic of whipping up a convenient mob to do his dirty work.

To be fair, the Tea Party was also about the colonists’ lack of control over the decision making process involved in the tea tax (also known as the Tea Act of 1773). As colonists rather than British citizens living in England, they had no direct representation in Parliament and this was a legitimate sticking point that eventually resulted in the Revolutionary War and American independence.

The Tea Party itself, however, was made into a myth about higher taxes causing economic hardship among common citizens, and about a spontaneous demonstration by “patriots” against evil gentry oppressing the lowly. This isn’t even remotely true, yet it’s become a rallying cry for today’s Tea Party. What they’re missing is that the event for which their movement was named involved a company (the EIC) that was “too big to fail” in its day, and special legislation (the Tea Act) rushed through Parliament in order to benefit a small group of corporate executives. The parallels with the TARP act passed under the Bush administration in 2008 are too obvious to ignore. So today’s Tea Party has it all right…except totally backwards.

Myth of the Empty Continent

For many Americans, the traditional imagery of pre-Columbian North America is one of small or medium sized native tribes living in vast, primeval forests or on rolling plains largely unaltered by human hands. The first part of this evocative description is relatively accurate, since settlers moving West often traveled through large swaths of unoccupied land, encountering native tribes only occasionally if at all.

The "American Progress" conception, with the "light" of civilization driving the "darkness" of primitivism inexorably Westward.

However, physical and other evidence indicates the pre-Columbian population of North America was much larger and more advanced than popular myth has taught. As a result, the land in many regions was far from pristine. Native tribes engaged in large-scale agriculture, built towns, roads, and temples, and conducted far-ranging commerce. Vast areas of forest were regularly burned to create and maintain fields and prairies for agricultural use. Natural resources were managed in order to improve the availability of various foods. Thriving civilizations were found across the continent. Yet by the time settlers arrived in significant numbers, many tribes had vanished or had been vastly reduced in number and power through disease and warfare, and native settlements and agricultural works had largely disappeared after perhaps centuries of neglect.

No firm count is possible due to a dearth of written records, but estimates of pre-Columbian native populations in the Americas range between 20 and 100 million, with a “consensus estimate” of between 40 and 80 million. Geographer William Denevan has suggested a total of 53.9 million, with a breakdown of “3.8 million for North America, 17.2 million for Mexico, 5.6 million for Central America, 3.0 million for the Caribbean, 15.7 million for the Andes, and 8.6 million for lowland South America.” He estimates only 5.6 million remained by 1650.

An absence of major settlements and large tribes convinced many 19th-century American scholars and settlers that the continent had been largely empty prior to the arrival of European explorers. They also seem to have been unaware of first-person accounts written by early explorers such as Hernando De Soto, as well as letters and diaries kept by early colonists who described the size and complexity of the native societies they encountered.

15th century sketch of a Native American village encountered by European settlers

The decimation of native tribes by disease, and the resulting impression that North America was home only to small, primitive settlements, helped foster the “manifest destiny” myth that the newly formed American nation was destined to conquer and civilize a largely “empty” continent. 19th-century historians such as Frederick Jackson Turner popularized the latter, in conjunction with the related myth of the frontier as a steadily westward-moving line between civilization and primitive life. Later this view was adopted into public school history texts and taught to generations of students. Vestiges of these teachings persist to this day.

For an excellent introduction to this subject, pick up a copy of Charles Mann’s book 1491: New Revelations of the Americas Before Columbus.

No, It Wasn’t Columbus

“In the Year of Fourteen Ninety-Two…” begins the bit of doggerel that nearly every American schoolchild learns. That’s when “Columbus Sailed the Ocean Blue” and, some people still believe, discovered America. To this day I know people who entertain a vision of old Chris sloshing ashore (probably in Virginia) with a priest and a Spanish flag, claiming all the land in the name of Isabella.

Oh, the irony of it.

For several decades now, at least, it’s been well known that dear Chris was by no means the first European to set foot on this side of the Pond. At the very least, we know that Vikings had a settlement at L’Anse aux Meadows in Newfoundland in the 11th century, which puts paid to Chris’ claim of primacy. In fact, Columbus never set foot on the North American continent — he only “found” the islands of the Caribbean, and very few of those. There, he set out to enslave the natives, torture them into revealing where their hidden gold mines (which didn’t exist) were located, and watch as 95% of the population died of European diseases against which they had no resistance.


15th century drawing of Native Americans suffering from smallpox

So why does the Columbus myth persist?

To a large degree, it’s because that’s what was taught in public schools for decades under the guise of “social studies”. I haven’t reviewed a recent grade-school text to see what’s currently being taught, but generations of Americans were victimized with the simplistic myth that “Columbus did it.” At best, he managed to popularize the New World and open it to Europeans, thanks to the publicity that surrounded his return to Europe.

I’d also bet the same legions of school kids got the “Columbus’ sailors almost mutinied because they were afraid of falling off the edge of the Earth” myth, but that’s a story for another day.