Thursday, December 30, 2010

Imitation Not Affectation, Pt. 1

Since I read J.E. Lendon's marvelous Soldiers and Ghosts a few years ago, I've been struggling to get a firm grasp on the concept of imitation. (In one of my first blog posts, "Churchill and the Personal," I discussed the role of imitation in Winston Churchill's life and biographical writings, stressing the importance of his exploration of his father's life and of the politico-military career of his ancestor, John Churchill, Duke of Marlborough.) Iain McGilchrist gives it a special place in his recent The Master and His Emissary, an exploration of recent research into the neuropsychology of the brain's two hemispheres and its application to history.

We are used to thinking of imitation in a reduced, deracinated sense of copying or simulation, as something almost dishonest or at least distasteful. It is a view that fits in with our misunderstanding of creativity, our present-day cult of originality, and our academic witch-hunts of plagiarists.  The ancient world, the Middle Ages, and the Renaissance understood the matter differently. At the very least it was what Thomas Mann (in his speech "Freud and the Future," quoted by McGilchrist) calls "mythic imitation."

This is no mere act of copying, but of becoming; not a matter of Napoleon behaving like Charlemagne, but of becoming Charlemagne, as Mann would have it. The left hemisphere will only allow the former interpretation of imitation as legitimate or possible; the right hemisphere, open to paradox and metaphor, is accepting of the latter. We find this type of imitation operating among the early Caesars, but clearly in trouble by the time of the late Roman emperors, such as Julian in the 360s. It seems to have resurfaced in the Middle Ages (a period McGilchrist does not discuss), where it was present not just in the mystical Imitation of Christ, but also in much of the behavior discussed by Johan Huizinga in his classics, The Waning of the Middle Ages and Homo Ludens, examinations of symbolic and play behavior in civilization.

This mythic imitation, which we might also call creative imitation, is central to the Renaissance. Almost nothing about the age is explicable without this understanding. For the writers, thinkers, artists, musicians, craftsmen, and rulers who directly participated, it was necessary to absorb as much of the ancient world as possible. It was not about memorizing facts or techniques, but about absorbing antiquity into their very being. Recovery of the past became a creative act, and imitation became a necessary part of an old world renewed.

In a very real sense, they still thought of themselves as part of the Roman Empire. Much of the political propaganda of the late Renaissance and Reformation established the national monarchies, not just the Holy Roman Empire, as legitimate successors to Augustus and Constantine. When Leonardo Bruni and Machiavelli sought military reform, they turned to the Romans; in the former case by analyzing their military terminology to prove that the legions were primarily composed of foot soldiers; in the latter, by consciously and unconsciously adapting Roman military formations to the modern world of cannons and arquebuses. Machiavelli took imitation directly into his political works, the most important of which, The Discourses on the First Ten Books of Livy, was couched as a commentary on the Roman historian Titus Livy. This was not entirely a conscious effort; as Sydney Anglo points out, many of the examples Machiavelli picks support his arguments badly and sometimes contradict his points. There was an unconscious reworking of the material. Machiavelli may also have been imitating his own father, who had edited an early printed edition of Livy when the great political writer was a boy.

You will have gathered that little of what we think of as imitation was involved in the Renaissance idea of imitation. It was not a superficial, external affectation, but, as in the case of Napoleon two centuries ago, and Churchill in the last century, a dynamic process of absorbing and re-imagining the past with both conscious and unconscious aspects.

In my next post, I want to examine this process more deeply through the work of Albrecht Dürer.

Friday, December 17, 2010

A Slight Pressure on the Trigger

Not long ago, I ran across a news story about the U.S. Army's latest infantry weapon, the XM25, a "smart" grenade launcher using a laser rangefinder, onboard computers, and programmable ammunition to explode small grenades above an enemy hidden from sight. It's a case of exporting a difficult skill to a machine with an easy-to-use (dare I say point-and-shoot) interface. Armies around the world have a long history of this; you could reduce the entire history of firearms to this one theme without undue distortion.
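To make the offloading concrete, here is a rough sketch, in Python, of the kind of calculation such a weapon must perform: take the lased range, add a small offset so the round bursts just past the cover, and convert that to a distance and time the fuse can count down. The velocity and offset figures below are invented for illustration; they are not the XM25's actual fire-control values.

def program_airburst(lased_range_m, avg_velocity_mps=210.0, overshoot_m=2.0):
    # Illustrative only: the velocity and overshoot are assumed values,
    # not specifications of any real weapon.
    detonation_range_m = lased_range_m + overshoot_m           # burst just beyond the cover
    time_of_flight_s = detonation_range_m / avg_velocity_mps   # crude constant-velocity estimate
    return detonation_range_m, time_of_flight_s

rng, tof = program_airburst(500.0)
print(f"Detonate at {rng:.0f} m, roughly {tof:.2f} s after firing")

The difficult judgments of range and timing that once belonged to the soldier are reduced to a few lines of arithmetic inside the weapon; only aiming and the decision to fire remain with the human.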

This off-loading of skill has followed three arcs. The first made it easier to fire (or use) the gun. The second facilitated rapid fire. The third simplified hitting the target.

The first guns were awkward things fired by touching a coal or heated wire to the powder while holding the stock under one arm. Hitting anything at all was chancy. First the coal was replaced by a slow match, a smoldering, yard-long coil of treated cord that functioned much like a punk on the Fourth of July. So that the soldier could keep his eye on the enemy when firing, a long, S-shaped, hinged rod (the serpentine) was introduced to mechanically lower the match into the powder. As yet there were no triggers. The serpentine had to be held back by the fingers of the right hand to prevent premature firing; with the addition of levers and springs for control, the trigger was invented. A slight pressure on the trigger at the right moment fired the gun with minimal distraction. This was the first real user interface in firearms.

For over two hundred years this was the dominant system, but after 1650 it was gradually replaced by the flintlock, wherein a piece of flint was driven against a piece of steel, sending a shower of sparks into the powder. It was easier and safer to use, as having a burning match around loose gunpowder was always an invitation to disaster. Flintlocks were also more reliable, misfiring half as often as matchlocks. After 1811, mercury fulminate, which explodes when struck sharply, much like the cap in a cap gun, began to displace flints. Again, it was more reliable and easier to use. (With each change, the amount of effort in loading and firing the gun decreased.) By 1860, the modern methods of firing were in place.

The second arc was to make loading and firing faster. Each of the innovations just described made the process a little better, so that an experienced soldier in 1815 could fire six to ten times faster than one in 1500. So long as guns remained muzzle loading (loaded from the open end of the barrel), little more could be done. To make them faster (and easier) to load, they had to be loaded from the breech, which led to leakage of gas. Various things had been tried before 1800; however, the real breakthroughs came in the decades following the Napoleonic Wars. Interchangeable parts and machine tools made all the difference. Guns were produced with finer tolerances between parts, which in turn meant less gas leakage. Then, with the introduction of fulminate primers, cartridges, which had previously been made of paper, began to incorporate metal. Originally just a metal disc to hold the primer was used; later, cartridges became completely metallic. These expanded slightly when fired, making a tighter gas seal. Even better, since they were not crushable like paper, metallic cartridges could be fed from a magazine (either a box or tube with a spring at one end). By the 1860s a greenhorn armed with a magazine rifle holding seven or more shots could fire faster than a drill sergeant armed with an older, muzzle-loading weapon. From that point on, repeating weapons have only become faster.

The third arc was accuracy. Initially, matchlocks and flintlocks improved this, as each required less attention than its predecessor, but the real star was rifling. It was expensive and took longer to load, so it was not popular before the second quarter of the nineteenth century, when the French found a way (the Minié ball) to create an acceptable rate of fire with muzzle-loading rifles, using a conical bullet with an expanding base that created a gas-tight seal. This gave them and their British allies an advantage over the Russians in the Crimean War (1853-6), but it was the new breech-loading, repeating rifles that made the greatest difference. Firing and loading had been simplified to the point that a soldier could give most of his attention to hitting his target. For the first time, gun makers and ordnance boards began paying attention to sights. They worked out ways to make the sights simple and foolproof, so that less training would be needed to make a marksman. By the last decades of the nineteenth century, smokeless powder removed the last major obstacle to accuracy, eliminating the dense clouds of smoke created each time a gun was fired.

Over the last century-and-a-half, military rifles have continued to focus on creating weapons that are easy to use, fire rapidly, and are simple to aim. With the XM25, it appears we may have reached a new plateau, one where a computer does everything but select the target and make the decision to pull the trigger. It seems that everything else has been transferred to the gun. Will those functions also be externalized?

Sunday, December 12, 2010

A Bit about Interfaces

Interface is a modern word for a very old thing. The OED gives its first use in 1882 as a scientific term. The modern usage only dates from the 1960s. Language is our interface with one another and, to a limited extent, with dogs and parrots, which have been experimentally shown capable of understanding human language. The calendar markings Marshack detected on 30,000-year-old bones were at once both interface and recording. (See the post: Embodiments of Memory.) Interfaces have been around for as long as we have been fully human, if not longer.

Recently interfaces, the apparatus that allows us to control and receive feedback from our creations, have forced themselves on my consciousness. They don't get much attention in history. They are not entirely neglected, but they are often relegated to specialist niches. Outside of the history of technology, it seems to me that interfaces appear most frequently in naval and aviation history. Nevertheless they are important, both because they constrain what we can do and because they directly mediate between the mind and the environment.

Interfaces do not just pop into existence. They evolve and change over time, their characteristics adapting to the needs of the moment and granting us new capabilities. We can see this in the early history of aircraft. We are used to airplanes controlled by a combination of a joystick or yoke (a semicircular wheel attached to a moveable column) and pedals, but this was not the case in the earliest machines. The Wrights for many years used a harness connecting the pilot by wires to the control surfaces and wings (which warped or bent), an ingenious system for translating the movements of the body, as well as hands and feet, into the motion of the airplane. This allowed the involvement of the whole body in control and reception of feedback, but it was difficult to master. It also allowed the Wrights to build highly unstable machines that were difficult to fly. A more easily mastered system was needed.

Another early system involved two wheels, one controlling pitch (movement of the nose up and down) and the other wheel roll (movement of the wingtips up and down), combined with rudder pedals for steering the machine. The use of three separate controls required a higher level of coordination than the joystick.

What the joystick did was combine pitch and roll control into an easily mastered interface, making learning and flying easier. (Later, of course, machine-gun triggers would be added for military aircraft, further unifying the controls.) The joystick was also capable of numerous improvements and adaptations to other purposes. To us the joystick may seem an obvious device for controlling aircraft, weapons, and anything else where delicate or accurate movements are involved, but it emerged slowly, and anonymously, appearing by 1910.
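A toy sketch makes the point. With the earlier arrangement, pitch and roll each needed a separate wheel and a separate hand; with the stick, one hand and one motion do both. The mapping below is purely illustrative (the axis ranges and the deflection limit are my own assumptions), but it shows how little there is to learn once the two controls are folded into one device.

def stick_to_surfaces(stick_x, stick_y, max_deflection_deg=20.0):
    # Illustrative only: inputs run from -1.0 to 1.0; the deflection limit is an assumption.
    aileron_deg = stick_x * max_deflection_deg    # side-to-side motion of the stick -> roll
    elevator_deg = stick_y * max_deflection_deg   # fore-and-aft motion of the stick -> pitch
    return aileron_deg, elevator_deg

# Push the stick forward and to the right: the machine noses down and banks right.
print(stick_to_surfaces(0.5, -0.3))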

The joystick represents some of the most important characteristics of good interfaces. It simplifies control and feedback and is easier to learn than comparable systems. It also has limitations: in this case, it required more stable aircraft than the Wrights had constructed, and it provided a narrower scope for feedback than their system had.

Before I wander too far, let me sum up. We need to look at interfaces in history because they constrain our abilities at the same time that they enable new ones. They take time to evolve and their evolution may have serious implications for how a device is used. They also allow the transfer of a skill from ourselves to the machine. These are points I hope to take up in later posts.

Tuesday, December 7, 2010

Human Fallibility

Today is the 69th anniversary of Pearl Harbor. The degree of confusion that reigned before, during, and after the attack is something that many people cannot accept. The American tendency towards conspiracy theories and witch hunts, the need to lay the blame at someone's doorstep, has never been stronger than in the continuing struggle to understand Pearl Harbor.

The simple truth is that the attack was very daring, very well conducted, and not as successful as it needed to be. We were scarcely able to believe that the Japanese were capable of carrying it out. The Japanese were unwilling to accept that we could recover that quickly. Part of this was racism on both sides, part of it was wishful thinking by both parties, and a very large part of it was, and is, the failure to understand how significantly war had changed between 1918 and 1941. It was difficult for anyone to grasp all of the changes. Some understood the transformation in tactics and materiel, some saw how logistics and strategy had changed, a few even comprehended the new information warfare of global communications. Few, if any, were able to put all of that together.

One of our chief problems in looking back is trying to put ourselves in the minds of commanders and politicians, from Yamamoto to FDR. We look at the rapid metamorphosis of naval warfare over the next four years and read it backward. We forget that, despite the successful Taranto raid by British torpedo planes on the Italian fleet the year before, no naval air attack on this scale had ever been attempted. We forget too how much attention in Washington was then directed at Europe and the Atlantic. Finally, we forget how much the human factor, and human fallibility, control events, whether then or now.


I've written a blog post every day for the past month. I did this as much to prove to myself that I could do it as to acclimatize myself to blog writing. I am going to slow down now, allow myself more time and more thought, in order, I hope, to produce better results. I expect I'll be putting up something two or three times a week.

Monday, December 6, 2010

Embodiments of Memory

One of the constants of human history has been the embodiment of various human mental skills and needs in our tools. The 30,000-year-old calendars of Marshack mentioned previously were an early example of this, as were the accounting tokens described by Denise Schmandt-Besserat that just preceded writing. Both were means of investing a part of our memory in physical objects. Writing greatly intensified this trend, but only in the last hundred and fifty years, with the advent of mechanical and electronic recording, have we greatly increased the pace. Today we stand at a crossroads, for companies such as Intel are seriously contemplating neural computing interfaces by 2020. Even if they are off by as much as Bell was in the 1960s about the advent of practical picture phones, that is, by about thirty years, the realization of a direct connection between your brain and cyberspace is likely to happen by 2050 or so.

Of course it might prove impractical, but we already have practical, though limited, wearable computers today. How quickly they will catch on is anyone's guess, though I suspect five to ten years. (But, like Alonzo Gray on machine guns, I may be completely missing the mark.) My point is that the present situation, in which we locate more and more of our memory in personal electronic devices and in clouds of computer servers online, is going to greatly intensify this process of offloading our mental processes. We cannot know what that will look or be like, but for the historian it opens up a golden opportunity, one in which we may gain new insights into how the processes of externalization work, and which may allow us to apply them to the past. That is, if all the information we experience does not drive us crazy.

For more on this, see: "New Ways of Remembering" on the MU eLearning Blog.

Sunday, December 5, 2010

A Case of Mistaken Hindsight

Last night, I wrote about Alonzo Gray and his opinions on the future of cavalry with respect to machine guns, motor vehicles, and airplanes. My point was our difficulty in recognizing why our predecessors failed to see the possibilities of new machines. There is a reverse side to this too. When we look back at the beginnings of a technology, we notoriously do so with the false clarity of hindsight. We often make the mistake of attributing all the power of the mature machine to its embryonic form. We then wonder why it failed to have the impact we would have expected, or why it was not sufficiently appreciated at the time, and we lay the blame on some mistaken and fixed prejudice against it. The machine gun is a good case in point.

Mechanical precursors to the machine gun antedate the First World War by two centuries. The Puckle Gun of 1718, which operated something like a gigantic revolver, was the first. It had the misfortune both of appearing after the end of a long and exhausting war (the War of the Spanish Succession, 1701-1714) and of being overly complicated, probably too complex to be produced in any numbers.

The mitrailleuse (a multi-barreled, hand-operated weapon) is a completely different story. It was adopted in large numbers by the French army under Napoleon III in the 1860s. It did cause terrible casualties on occasion during the Franco-Prussian War (1870-1871), but it was hampered in part by conceptual issues, in part by secrecy, and in part by its own nature. The main conceptual issue was whether to treat it as an infantry weapon or as an artillery piece. That may sound like a trivial, quibbling distinction, but it was critical. Because it was decided that it should operate as artillery, it was used at the ranges typical of artillery in its day. Unfortunately, its range was too limited for that role, so its employment was restricted in ways it would not have been had it been assigned directly to the infantry (as Maxim machine guns were in the First World War). Because it was kept in great secrecy before the war, the crews were not fully prepared, and issues that would normally have been worked out before the fighting began were not.

We are used to seeing film from the World Wars of machine guns cutting down entire platoons in the blink of an eye. The mitrailleuse had a much lower rate of fire and could not easily be directed across a wide area. Instead, it was possible, maybe even typical, for it to simply obliterate one or two men at a time. This was undoubtedly terrifying to advancing troops, but it would not necessarily stop an attack. It simply did not have the technical sophistication of the later fully automatic weapons, or even of some contemporary hand-cranked ones like the Gatling, the Nordenfelt, or the Gardner.

Because we tend to project the character of the mature technology back on its earliest variants, it is natural for us to exaggerate the non-technical problems (inappropriate use as an artillery weapon and the effects of secrecy) and neglect the technical issues when we read about the French defeats in 1870. Of course we do not do this just with machine guns. We do it with most technologies unless we have lived through their evolution ourselves. That makes it too easy for us to criticize decisions made about machines and the uses they were put to in any given time. It also makes us wonder about the intelligence or character of men and women who failed to see the potential of new technologies as they first emerged. This is just one more of those filters through which we too often distort the past.

Saturday, December 4, 2010

"Facts Important for Cavalry to Know"

We forget sometimes how rapidly things changed in the past, and how much difference four or five years could make. In 1910, Capt. Alonzo Gray of the 14th U.S. Cavalry wrote:

I give it as my humble opinion that increased range of firearms and the addition of machine guns, increase the sphere of action of, and necessity for, well-organized cavalry; that bicycles, motorcycles and automobiles will prove to be only valuable auxiliaries to cavalry in transporting information back to the rear, and thus saving an unnecessary expenditure of horse flesh; and that while flying machines may bring information, by so doing they will widen the sphere of action of good cavalry; and, more than ever before, as a result of such information, it will be necessary to have good cavalry ready to move on extremely short notice.
(Alonzo Gray, Cavalry Tactics as Illustrated by The War Of The Rebellion, Together with Many Interesting Facts Important for Cavalry to Know, 1910, p. 3)

That bit about the machine guns has stuck with me since I first read it in 1985. In 1910, this may not have been an unreasonable opinion. Gray may not have had cavalry charging against machine guns in mind, though, as the Australian Light Horse would soon prove in Palestine, such a thing was possible and could succeed in the right circumstances. He probably had in mind the use of machine guns as a lightweight substitute for horse artillery, which made perfect sense and was a widely held idea.

Likewise he could not imagine motor vehicles that would move cross country. The tank was still six years off, and the Jeep about thirty. Airplanes were interesting but not yet taken seriously, least of all by the U.S. Army, which would lag well behind the Europeans.

Had he written five years later, the picture would have altered radically. By then, airplanes had proven their worth; cavalry had been immobilized by trenches and barbed wire on the Western Front, where they spent their time waiting to pursue the enemy once the infantry had broken through; machine guns had proven of more utility than horses; and, in London, Winston Churchill had convened the Landships Committee, whose ultimate product would be the tank.

As with most of us, he could not foresee what would happen as a given technology matured, but only what it could do at the time. Likewise, he could not see how the combination of several technologies would create whole new contexts and shape the future. No one could then, not even H.G. Wells. No one can now.

Friday, December 3, 2010

Fact and Fiction

What do we really know of the past? I think we all have a strong tendency to picture it very much as we have seen it in movies and television, or perhaps read it in a novel, and so to have a stilted view. I know that the Sherlock Holmes stories influence how I see the late Victorians and picture London. Likewise, the Hornblower stories of C.S. Forester, or the more recent novels of Patrick O'Brian about his naval heroes Aubrey and Maturin, strongly color the view of even professional historians about war at sea in the age of Nelson. Fortunately, both Forester and O'Brian were excellent researchers and built convincing stories on the exploits of figures like Lord Thomas Cochrane and on Nelson himself. I have seen their works cited to provide examples in more than one naval history in recent years. Perhaps the intent is simply to provide a context that many readers will quickly understand, but it does create an interesting tension between fact and fiction. How much does a historian writing this way tend to mingle the historical and fictional characters in his mind?

When I was working on my dissertation, Kenneth Branagh's production of Henry V was released. I was working on the sixteenth century, not the fifteenth, but I know it did have some influence. One of the figures I focused on was Sir John Smythe, a cousin of Queen Elizabeth, a sometime soldier, and a thoroughly pedantic and old-fashioned writer on military affairs. He advocated giving up firearms and returning to the good old longbow of Henry V's time. Shakespeare is supposed to have based the character of Fluellen, one of Henry's officers and advisers, partly on Smythe. Ian Holm played Fluellen, and did so in a way that would have fit Smythe like a glove. I am not sure how much this influenced my view of him in the dissertation, but I think it did have some effect, possibly reinforcing my view of him as a pedant. (Curiously, others think Fluellen was based more on Sir Roger Williams, another Elizabethan soldier who was diametrically opposed to Smythe, and about whom I also wrote.)

Another figure I wrote about was Blaise de Monluc, considered one of the models for d'Artagnan, even though he lived a century earlier. I do know that the fictional character made the real one more attractive to me, and perhaps made me more accepting of some of the tales of adventure he told in his memoirs.

I don't want to belabor the point, but it is good to realize that historians may be just as influenced by literature, movies, or television as anyone else. The effects may not be great, but these subtle influences can easily color a historian's depiction of the character and personality of his subjects.

Thursday, December 2, 2010

The Ticking of the Clock

In the past few posts I have been focusing on clocks in the fourteenth and fifteenth centuries. The human desire to tell time runs very deep, of course, and has an amazingly long history. It is also one of the central parts of the story of who we are and, I believe, how our minds developed. Tonight I just want to focus on one bit of it, as a way to draw attention to the importance of time in the evolution of culture and cognition.

About thirty years ago, I became aware of the work of Alexander Marshack. Over a number of years, from the sixties to the nineties, he carefully researched and sought to explain a variety of artifacts, in bone or stone, showing careful groupings of markings. Because these markings showed signs of being more than mere tallies (there seemed to be some differentiation among them), he came to the conclusion that they might be numeric notations, almost certainly representing some sort of lunar calendars.

His thesis was controversial because of the age of these artifacts. Some of them date back almost 30,000 years. In other words, if Marshack had it right, we have been keeping careful track of time for 300 centuries. Incidentally, it also means that we could already count well that far back in the past.

Now let us pause for a moment and consider what this might mean. It indicates that we have been aware of passing time, and must have had language for it, for about six times longer than we have had writing. Likewise, we have had some concept of numbers six times longer than we have been writing. There is in fact a school of thought, largely based on the work of Denise Schmandt-Besserat, that the earliest writing, cuneiform, arose out of an accounting system based on fired-clay tokens stored in clay envelopes. (Since the tokens could only be gotten at by breaking the envelopes, she argues that it became common to make impressions of the tokens in the clay before they were placed inside and the envelope was fired. Later the tokens were replaced by markings made with a stylus. There are a lot of problems with all of this, and it may not be a full explanation of the origin of writing, but we do know that writing long remained largely a vehicle for taking inventories and settling debts.)

Around the time we began to alter our sense of time with the clock, Europeans were learning the concept of zero with the very slow spread of Hindu-Arabic numerals and changing their record keeping by developing double-entry bookkeeping. These are aspects of the information revolution that marked the years between 1300 and 1600, and we need to keep them in mind. But, and here is the point to which I wish to direct your attention, and whose importance is only beginning to dawn on me, these represent changes in our cognition. I am not yet sure how large these changes were, nor am I sure how they fit into the larger pattern of changes wrought by technology, art, and humanism in those years; nevertheless, given what has followed, I think it worth pondering what changes to our minds were wrought by something as seemingly insignificant as the ticking of the clock and the notation of nothing (zero).


Further Reading:

There is a good, brief essay about Marshack's work by a friend and collaborator of his, Michael Hudson, from UMKC - "After the Ice Age: How Calendar-keeping shaped Early Social Structuring."

For Denise Schmandt-Besserat, see her web page at the University of Texas.

Wednesday, December 1, 2010

A Matter of Timing

If mechanical clocks had not been invented when they were, would they have eventually been developed? A friend of mine insists they would have been; I am inclined to agree, but am not certain of when. The idea of equal hours was already in existence, for time candles were based on it, and those had been around since at least the ninth century. Sand glasses were invented about the same time as the mechanical clock and were also based on equal hours. Very elaborate clepsydras (water clocks) had been built that moved figures around to tell time. They were in many ways similar to mechanical clocks, but they were terribly prone to error due to temperature variation. In all likelihood, the idea of falling water gave rise to the idea that a clock could be built using falling weights. (Spring-driven clocks came a bit later.) But here's the thing: elaborate clepsydras had been known and built for well over a thousand years, but prior to the thirteenth century, we know of no attempt to build a weight-driven clock.

So, the question is, when would this have been done? We cannot know. It might have waited a few decades or a few hundred years. The medieval and Renaissance mind was fertile when it came to inventions. Perhaps only the needs of long-distance navigation would have forced the issue. In that case the outcome of Cloudesley Shovell's demise might not have been the race to create the chronometer, but the race to invent a simple clock. The idea of the watchmaker God (God conceived of as a craftsman who built and wound up the world, then sat back and watched it go) would then not have arisen in the previous century, and an important underpinning of the philosophy and science of the early modern world would have been missing. That leads to the unknowable question, what would have replaced it? Would it have taken philosophy and science in a different direction?

Matters of timing are important. We have a good idea of what happens when an invention comes along before the world is ready. Ada Lovelace wrote what is now considered the first computer program in 1842-1843, but the machine for which it was written was never completed, and her effort bore no fruit for over a century; it remains in the realm of inspiring historical curiosities. What we lack is a good idea of what happens to inventions that are born late. Even more importantly, we have difficulty imagining how the lack of one idea at a critical time, perhaps an idea inspired by an invention, might have affected history, but that is a subject for consideration at another time.

Tuesday, November 30, 2010

Counterclockwise

I want to try a thought experiment. What if the mechanical (weight- or spring-driven) clock had not been invented, or had failed to gain acceptance? This is known as a counterfactual (or a what-if), as it runs contrary to events. Exercises of this sort are useful for getting a better grasp on the real consequences of an event, or in this case, a technology.

So let us suppose that, at the end of the thirteenth century, the mechanical clock was unknown and was not subsequently invented. This does not mean that there were no timekeeping devices, or that there were no geared devices of any sort. There were sundials and water clocks (clepsydras), as there had been since ancient times. There were time candles, known at least as far back as the ninth century. There may or may not have been sandglasses. One of the curious things about the history of timekeeping is that mechanical clocks and sandglasses appear in the record at about the same time. For short timings, the sorts of things we might use an egg timer for today, Europeans would recite a given number of Paternosters (Lord's Prayers).

Of these methods, the sandglass and the time candle were the only ones to keep constant time with equal minutes or hours. Both were used more for timing than for telling time, though sandglasses would be used by ships for centuries to time a watch, which normally consisted of eight "glasses" turned every half-hour. Time candles seem to have been rare. They maintained a constant time by the simple expedient of marking them at fixed intervals.

Sundials were incapable of keeping equal hours, as they were in fact based on the unequal hours of the sun itself. Clepsydras could be geared to work with an escapement, like a mechanical clock in fact, and so could be made to keep standard hours if desired. The Chinese had in fact created at least one water clock of this sort, but they had not followed up on it. Most water clocks were far simpler, and while they could be made to ring an alarm, like a mechanical clock, they were based on how much water would flow through an opening in a given time, thus moving a float. They were not given to unequal hours in the same way as a sundial, but they were irregular due to temperature fluctuations that affected the flow of water, particularly in winter.

It seems unlikely that equal hours would have caught on without the mechanical clock; at the very least, they would have taken much longer to do so. Conceivably some form of clepsydra, like the one known Chinese example, might have been used for monastic or town clocks, but it would have been awkward in a bell tower and useless in much of northern Europe. Without equal hours, time would have continued to move according to the natural rhythms of the sun and the seasons. Work routines would not have been disrupted in the 1380s and after.

Quite possibly the idea of standardized measurements would have been weakened. I may be mistaken, but standard hours were the first measurement to have been constant over a long period across most of Europe. There were no time zones, so the actual time varied from place to place, according to when noon occurred, but the length of an hour became fixed long before other international measurements.

The sciences would have progressed more slowly, or followed a different path. As the accuracy of mechanical clocks increased, fine timing became vital both to astronomy and to experimental science. By the end of the seventeenth century, seconds had become critical, and for a few experiments very accurate clocks that measured one one-thousandth of a minute were in use. If the example of standardized minutes had not been available, would other kinds of standardized measurements have been postponed as well? Would scientists today still be making mistakes in converting English, French, and German inches, feet, and pounds back and forth?

The implications of not having equal seconds, minutes, and hours do not end there. No digital electronic device would work, nor would many analog ones, including televisions. Telecommunications as we know it could not exist. Railway and flight schedules would be a nightmare, as hours of varying length, local times differing at every longitude, and a lack of uniform time zones would mean that timetables would have to be recalculated constantly for a myriad of departure locations and destinations. Of course navigation would be just as hard, for while it is possible to go north or south without knowing the time, all east-west travel (moving from one longitude to another) requires knowing the time as precisely as possible. A few minutes' error can be catastrophic, as Sir Cloudesley Shovell and his men discovered in 1707, when their ships were lost due to the inability of the navigators to correctly calculate their longitude. They found themselves in the Scilly Isles, rather than off the coast of Brittany, in a storm. Only after chronometers were introduced was this sort of thing avoidable. Sailing ships only travelled at a few miles per hour. Imagine the navigational errors in a jet flying a hundred times as fast.
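The arithmetic behind that disaster is simple enough to spell out. The Earth turns fifteen degrees of longitude per hour, so every four minutes of clock error shifts a ship's calculated east-west position by a full degree, roughly sixty nautical miles at the equator and proportionally less (by the cosine of the latitude) farther north or south. A small sketch of the calculation:

import math

def longitude_error_nm(clock_error_minutes, latitude_deg):
    # The Earth turns 15 degrees of longitude per hour; one degree of longitude
    # spans about 60 nautical miles at the equator, shrinking with the cosine of latitude.
    error_degrees = (clock_error_minutes / 60.0) * 15.0
    return error_degrees * 60.0 * math.cos(math.radians(latitude_deg))

# A four-minute clock error near the latitude of the Scillies (about 50 degrees north):
print(f"{longitude_error_nm(4, 50):.0f} nautical miles off")   # roughly 39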

We would simply not have the modern world in any recognizable form without mechanical clocks. They are one of those central inventions that not only had a direct effect on civilization, but had a tremendous knock-on effect (like a cue-ball sending billiard balls in several directions) as well. I have only scratched the surface of the possibilities here, and barely mentioned the cognitive effects, which might have been even more profound.

Monday, November 29, 2010

Mediated Senses

This morning, I saw an "Infographics Feature" from the MIT Technology Review with the portentous title, "The Mobile Device Is Becoming Humankind's Primary Tool." One of the graphs shows that there will soon be almost as many cell phones and mobile devices in the world as there are people. It then goes on to illustrate what surveys show about how those devices are used.

Also, a friend, in response to yesterday's post on "Horses and Biplanes," sent me a piece about railroads and the introduction of time zones from Wired, "Nov. 18, 1883: Railroad Time Goes Coast to Coast." He was struck by how difficult it was to get everyone's time synchronized and by the scale of the problems this caused.

Those two pieces do not have much to do with each other superficially, but they do share a link. The MIT feature is about how smartphones and other mobile devices are coming to mediate our interactions with reality. The Wired piece is about a juncture when technology greatly increased its role in mediating one of our senses - time. Likewise, "Horses and Biplanes" ended by observing that our digitally mediated environment might be affecting our ability to understand the embodied past. It might make it easier to understand it in an abstract way, but at the same time it makes it harder for us to identify with the physical and sensory experiences of our ancestors.

Clocks in the fourteenth century were the first complex machines to mediate between one of our senses and the world. Earlier, language, writing, art, and calendars had all played roles in mediating reality, but with the invention of mechanical timekeeping, I suspect a fundamental change occurred. Today our sense of time is almost wholly artificial and our trust in instrumentation is nearly complete. (I am reminded of those stories of people who have driven their cars into creeks and ponds because their GPS system told them that there was a road or bridge that did not exist.) That would have been essentially inconceivable in the 1380s, but one of the effects of clocks on history, it seems to me, is that they made us more amenable to trusting devices over our senses.

Sunday, November 28, 2010

Of Horses and Biplanes

In the early years of the last century, men and women were confronted by many different kinds of machinery for the first time. Aircraft were a tremendous unknown, but pilots quickly learned to adapt. Even though they were flying hundreds and thousands of feet in the air, and had to learn to think and act in three dimensions, one is struck by the extent to which attitudes towards the horse seem to have carried over into early aviation. It is notable that large numbers of the aces of the First World War were former cavalrymen, so this is perhaps to be expected, but it is driven home forcefully by the memoir of the Red Baron, Freiherr Manfred von Richthofen, The Red Battle Flyer.

The early pages are filled with Richthofen's exploits steeplechasing with various horses prior to the War. He wrote of his horses and his planes in much the same way, sometimes giving the machine almost animate qualities, as in this passage:

I was surrounded by an inky blackness. Beneath me the trees bent down in the gale. Suddenly I saw right in front of me a wooded height. I could not avoid it. My Albatros managed to take it. I was able to fly only in a straight line. Therefore I had to take every obstacle that I encountered. My flight became a jumping competition purely and simply.  [Emphasis added.]

The metaphors he uses throughout the book are all of the chase and the hunt. One feels that he and his contemporaries created a relationship with their machines similar to the ones that they had previously had with their horses. (It is striking to note that one of the arguments against the introduction of enclosed cockpits in the thirties was that the pilots could not clearly hear the engines; they were using all of their senses, literally flying by the seat of their pants.)

We seem always to have treated our machines, at least the complex ones, as if they are alive. While we may logically know they are not alive, the intuitive and imaginative portions of the mind do treat them as if they are. Sailing ships were virtually living machines, and were certainly treated as such by their crews. Like early aircraft, they had to be operated by feel, sight, and sound. In a sense, they spoke to the whole person. This is something we are losing today as we come to rely more and more on sight in operating our digital machines.

In a previous post, I suggested that clocks created a different understanding of time, one actually rooted in the structure of the left and right hemispheres of the brain. We cannot fully recover the sense of time our pre-fourteenth-century ancestors had. I am also suggesting that as we move away from the analog feedback that machines have given us in the past, and replace it with digital readouts and displays, even if they mimic the old dials and gauges, we are less and less able to experience machines in the way our predecessors did, even a century ago. In dealing with the past, we need to keep this in mind.


The quote is taken from Manfred von Richthofen, The Red Battle Flyer, trans. J. Ellis Barker, 1918, p. 94.

Saturday, November 27, 2010

A Change of Direction

It must seem that I am departing a long way from the original emphasis of this blog in looking at military history, and particularly at battles. Of course I am, but that is my prerogative. The truth is that I love military history and have previously written about how battles were narrated and depicted, and to some extent about what that could tell us about the mental processes of those who wrote or drew or painted them. In point of fact, it is the mental processes that hold my attention, even if the drums and trumpets, the colorful uniforms, and the complexity of the flow of events distract me now and again.

But there is something else I want to point out, something that, while it simply represents my point of view, is, to me, very important. Any kind of history, any specialty, whether by subject, nation, region, or time period, is just a piece of universal history. Universal history has gone out of fashion in the past century, but properly understood as a holistic picture of the world, one that has no proper bounds other than the nature of humanity, it must be the end point of all historical endeavors. Maybe that sounds romantic or simply crazy to you. This is really my starting point, whether I am writing about the battle of Mobile Bay, the cognitive effects of clocks, the workings of Machiavelli's imagination (I am inevitably drawn back to Machiavelli as if to a lodestone), Tolkien's hatred of machinery, the significance of soldiers crying in a folk song, or the Red Baron's relationship to his airplanes. I was mistaken in trying to make this a blog about one aspect of history, for neither history nor my mind works that way. In the end, what matters is what these subjects can tell us about the spirit of a time, the nature of humanity, and our relationship with the world.

As always, I am feeling my way toward something that I may never reach. I suppose that could be taken as history as a form of mysticism, which is, perhaps, not far from how I really view history. Sometimes I may sound pompous, even to myself. Oftentimes I may make statements that are maddeningly vague and which I cannot completely prove (and may later repudiate).

Everything in history is a clue to who we are and may become.

Friday, November 26, 2010

Clocks and the Brain

We know that perceptions of time changed with the coming of the clock, but the change was not instantaneous. Rather, it was a slow, even glacial, series of revolutions, as Paul Glennie and Nigel Thrift would have it (Glennie and Thrift, 162). Nevertheless, significant changes were underway by 1400 and fully elaborated by 1500. These, if I understand the neuropsychology summarized by Iain McGilchrist correctly, should have already begun to have an effect on brain organization (McGilchrist, 74-77). They may have applied more to the elites and certain professions (sailors and merchants, for instance) but appear to have been widespread.

To begin with, we need to recognize different kinds of time. There is natural, experiential time, through which we maintain a continuous, narrative flow of sensations in context. This is strongly associated with the right hemisphere of the brain. Damage to the right posterior cortex can cause a loss of this continuous time and impair the ability to understand narratives. McGilchrist likens this to Capgras Syndrome, also caused by right-hemispheric deficits. In the case of Capgras, the same individual seen at different times in different moods or with a different haircut is perceived as a different person, usually an impostor (McGilchrist, 54). The person affected with Capgras sees only differences and cannot properly connect other people seen in different contexts or times. Individuals with damage to the right posterior cortex, likewise, often cannot connect things that happen at different times or even sequentially.

The second kind of time, time experienced as discrete units (hours, minutes, seconds), is largely the province of the left hemisphere. The more abstract the perception of time, the more the left hemisphere is involved. Glennie and Thrift go so far as to argue there are many different kinds of clock time, depending on factors such as how important it is to know the precise time and how easy it is to get the correct time. These would still all be left-hemispheric functions, I believe, and greater precision should generally be related to greater abstraction. Likewise, linear time appears to be a more abstract experience than cyclical time. It arises as a major point of view only in a few societies, and is primarily associated with the monotheistic religions of the Middle East and Europe.

Prior to 1300, time for most people was cyclical and continuous (Berman, 306, notes 7 & 9). Linear time was part of Christianity, but was primarily of interest to clerics. Even the yearly cycle of saints' days and feasts, drawn from the linear history of Christianity, would have reinforced the cycle of the seasons.

In the thirteenth century, before the spread of clocks, civil statutes and other documents expressed time in terms of the sun and the canonical hours (such as Prime, Terce, Sext, and None, which varied in length according to the season) that were rung on church bells. By the early fourteenth century, at Bristol, regulations of the port, market, and trade were using the standardized (unvarying) hours struck by a clock (Glennie and Thrift, 172). By the end of the century, artisans were also denoting time in this way in depositions. Less than a century later, clock time was common in correspondence (173) and it was not uncommon to record important events down to the quarter of an hour (181 & n. 60).

So what can we make of all this? Obviously men and women did not suddenly lose their sense of continuous, narrative time. Over the course of these two centuries, and the two that followed (when minutes and seconds first became important), they were gradually developing that portion of the left brain associated with abstract time and abstract analysis to a greater degree than in the past. I believe this happened across the board, from peasants to kings. This goes along with many other aspects of the Renaissance information revolution, although other aspects of it ran in the contrary direction. Because human perception of time is so central to understanding the world, it is critical to our ability to comprehend the minds and perceptions of our ancestors. We cannot build up a nuanced portrait without this kind of knowledge.


Sources:

Paul Glennie and Nigel Thrift, "Revolutions in the Times," in David N. Livingstone and Charles W.J. Withers, eds., Geography and Revolution. University of Chicago Press, 2005, 161-198.

Iain McGilchrist, The Master and His Emissary: The Divided Brain and the Making of the Western World. Yale University Press, 2009.

Morris Berman, The Reenchantment of the World. Cornell University Press, 1981.

Thursday, November 25, 2010

Clock Time


Clocks are one of those things that affect a great many others. In the case of clocks, it is because they affect one of our fundamental perceptions: time. But in the beginning, I think they had a second effect, changing, or perhaps sharpening, our common understanding of cause and effect. For a period of about five hundred years the clock was the most complex device in existence, certainly the most complex one most people would ever encounter. It was also understandable. One could look at the clockworks, big or small, and have their arcane processes explained. One thing followed logically after another. Effect followed cause; all was visible, orderly, and explicable. Is it any wonder that it would become a model for the universe and the divine order?

For the small percentage of humanity living in European towns, clock time came to replace nature as the timekeeper from the 1380s onward. No longer did hours vary in length with the seasons. They were uniform; urban workers were now expected to work the same length of time each day regardless of the month or the changing length of daylight.
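To put numbers on the change: an unequal, seasonal daytime hour was simply one-twelfth of the daylight, however long the daylight happened to be. Here is a small sketch, using rough daylight figures for a northern European town (the eight- and sixteen-hour values are approximations for midwinter and midsummer, not measurements):

def unequal_hour_minutes(daylight_hours):
    # A seasonal daytime "hour" is one-twelfth of the daylight, expressed in modern minutes.
    return daylight_hours * 60.0 / 12.0

print(f"Midwinter hour: about {unequal_hour_minutes(8.0):.0f} minutes")    # ~40
print(f"Midsummer hour: about {unequal_hour_minutes(16.0):.0f} minutes")   # ~80

The clock replaced those elastic hours with sixty fixed minutes, winter and summer alike.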

Humankind was learning to live to an artificial rhythm, one at odds not only with the world around them but also with their own circadian rhythms. One wonders if they did not also become more irritable. People also became more used to breaking things up into units, to atomizing their experiences.

New geographies of time emerged. This was not immediate, and for the first century or two clock time was communal: private citizens could rarely afford clocks, and watches were unknown before 1500. But clocks became smaller and more accurate, slowly making their way into homes, first of nobles, then of the bourgeoisie. The minute, the hour, and the day were mapped not on the sun and the moon, but on the clock. As clocks increased in accuracy, minutes began to matter.

These changes coincided with the emergence of humanist notions of historical time and time periods. Religion and mythology had long constructed chronologies with different ages in them, but these were not historical time periods. Typically, all of history was one and continuous. The Renaissance began to see that there were discontinuities and that the ancient world and the modern world were not one seamless whole. Through the work of men like Bruni, the differences came to be understood and appreciated.

So it was not only the daily sense of time that was being altered in the fourteenth and fifteenth centuries. It seems to me that the entire human sense of time was changing radically in those years. Likewise, I would argue that both clocks and the new understanding of history were creating a broader command of causality, one that would come to rely less and less on divine intervention. These were important changes in the mental processes of Renaissance Europeans.

Wednesday, November 24, 2010

Rapid Change Is Nothing New

We tend to think of the rapid change of our times as somehow both the norm and unique at the same time. We want our time to be special, so we too often fail to understand the extent or the kinds of changes our ancestors saw, while, at the same time, magnifying every little fad of ours into momentous change. The truth is that there were changes going on in Europe five, six, and seven hundred years ago that were cognitively as wrenching as anything we have experienced.

I've mentioned humanism in the preceding posts. It introduced a new educational agenda, along with new ways of looking at language, history, politics, and authority. Meanwhile, in England and Bohemia, serious religious upheavals were underway, leading to religious wars that lasted fifteen years in Bohemia, and occasionally spilling out into neighboring lands. But of course all of Europe was wracked by wars, and when not by wars, often by marauding bands of ex-soldiers. The Plague had thinned the population and altered social and economic relationships all over the continent.

If the humanists were mainly interested in the word, the artists of the time were evincing a new interest in the eye. The development of linear perspective was critical in the rise of the sciences and engineering in Europe. It taught a new way to observe the present, just as humanist textual analysis taught new ways to see the past.

There were so many new technologies contributing too. If, when Leonardo Bruni wrote his De Militia in 1422, printing was not yet a reality, he was a beneficiary of cheap paper and eyeglasses. His childhood would have been marked by revolts and labor unrest, including riots by workers forced to work according to the new concepts of time that resulted from the spread of clocks. Metallurgy was also on the move among Europeans in the wake of the blast furnace and the trip hammer, though these had been in existence for some time. We can see this in the development of armor between 1300 and 1500, moving from chainmail supplemented by smaller pieces of steel plate, to full suits of plate, culminating in the magnificently worked suits that we admire in museums today, and in which we anachronistically dress the Knights of the Round Table.

One of the reasons, particularly after 1400, for improvements in armor was the introduction of guns to Europe by the 1320s. These, as well as the powder they used, underwent continual improvements in these years, ultimately wrecking medieval walls with unprecedented speed by the mid-1400s, requiring new types of fortification, and affecting the lives of city dwellers everywhere by fostering new types of city planning (to ensure the most effective fortifications). They also altered the balance of power on more than one occasion, contributing to the endemic warfare and political turmoil of the times.

It is against this backdrop that we must see the people and events of the Renaissance. Life did not stop moving, nor did the human mind cease to find new directions for development. Likewise the battlefield was in constant flux, with major changes coming with each new war or even each new campaign.

Tuesday, November 23, 2010

The Importance of a Single Word

Why was it important for the Renaissance to recognize the Roman soldier (miles, plural milites, in Latin) as a foot soldier? Was it really so vital? Consider that many of the Roman sources (Caesar, Livy, Vegetius, and Frontinus) were already well known, and that they were used as models. Only a dozen years before Bruni worked out the real meaning of miles in detail, Christine de Pisan (also spelled Pizan) had written a treatise called The Book of Deeds of Arms and of Chivalry, in which she largely updated Vegetius' late Roman work on warfare. Authors of the time were doing what we all do, casting the Romans as people like themselves. By and large they lacked historical perspective, and to them Roman soldiers were very like medieval men-at-arms.

The new historical perspective of the Renaissance gave a somewhat clearer picture of the Romans and their legions, though it remained confused. A century after Bruni wrote De Militia, Machiavelli would publish his Art of War, a work with a long life. During his political and administrative career, Machiavelli had created both the ideology and the structure of a citizen, infantry-dominated militia army. He codified it and tried to bring it up to date in The Art of War. The details he laid out were not widely adopted. The basic ideology was. The book was still being reprinted as a useful military treatise for study almost three centuries later.

Of more concrete importance was the fact that the new understanding of miles allowed a better grasp of the Roman legions. We are still working out some of the details, and many of the important points had not been clarified even a century ago, but what was known was influential. The mixed nature of heavy and light infantry, combined with some cavalry, would affect various experiments in military organization down to the Napoleonic Wars. Even more important, the descriptions of the Roman three-line order and of their ability to relieve tired troops with fresh ones in the midst of battle inspired Maurice and William Lewis of Nassau in the 1590s to begin experimenting with new infantry drills. While these did not directly imitate the Roman methods, they provided a consistent method for the slow muskets of the time to maintain a continuous fire. From Maurice, it is a relatively straight line to Napoleon, Grant, and Moltke. Tactics and military organization followed a continuous evolution from that point onwards.

These developments might have come about without the new understanding of Roman history, but they might well have taken a different route. We cannot know with certainty, but this change, itself the result of a new sensitivity to language and history (that is, a cognitive change), seems very important.



For Bruni's work and its influence, see C.C. Bailey, War and Society in Renaissance Florence: The De Militia of Leonardo Bruni, Toronto: University of Toronto Press, 1961.

For more on Maurice of Nassau's indebtedness to Roman examples, see Geoffrey Parker, "The Limits to Revolution in Military Affairs: Maurice of Nassau, the Battle of Nieuwpoort (1600), and the Legacy," The Journal of Military History, vol. 71, no. 2 (2007), pp. 331-372.

Monday, November 22, 2010

An Information Revolution

I want to jump forward to the Late Middle Ages and Renaissance. The Western mind was in much ferment and information technology, broadly defined, was racing forward. There were new understandings of texts and language, of time and space, of cause and effect. Changes in belief were gradually making headway. Armies and soldiers underwent almost constant evolution for a period of something like three or four hundred years.

At times I tend to write like a technological determinist, looking to inventions as the source of change. That is largely because I find it so difficult to separate the various threads of the story. Often I see things happening as part of a co-evolution. The early portion of the information revolution, which centered on the development of mechanical paper mills by the 1280s, cheap paper, and a consequently lower cost for manuscript books, would not have taken off as fast, or at all, had not the medieval universities provided a ready market. (I wonder too whether the development of reading glasses, also before the 1280s, was not just as important, allowing scholars a longer or easier working life.) The humanism of the centuries that followed was greatly facilitated by cheap paper and less expensive books. Printing, developed somewhere around 1440, needed cheap paper, the pre-existing market for books, and the demand of an educated audience, whether scholastically trained in the universities or drawn from the new humanists. The demands and preferences of the humanists and scholastics came to define the form the printed book took.

So what has any of this to do with wars and soldiers? Well, for one thing, we begin to find military treatises, often with some basis in ancient Greece and Rome, as well as works meant to inspire the nobility going off to war. Initially, these could be quite dear and show an abysmal lack of understanding of the past, even of the terms used by the Romans. It was not until Leonardo Bruni's De Militia (1422) that it was firmly established that Roman soldiers were foot soldiers, not cavalry. This work and the translations of ancient texts that followed provided much food for thought in the succeeding decades. While we might have arrived at the same point without them, the Roman examples were important, and at times critical. This is one of the threads I want to follow in subsequent posts.

Sunday, November 21, 2010

Cognitive Change and Tactical Change

From what I wrote last night, it follows that periods in which we see significant changes in cognition, belief, and expansion of the general body of knowledge should also be those where we see changes in warfare. This is because these are the periods in which the limits on thought and action were changing. It has been the case in several periods. The earliest for which we have good evidence is classical Greece.

Among the ancient Greeks, for a period of two to three hundred years, hoplite warfare was dominant. It centered on a citizen soldier armed with a spear and sword, carrying a large, round, bronze-covered shield, and wearing a close-fitting bronze helmet, greaves, and a cuirass of either bronze or laminated linen. Hoplites lined up in formations several men deep (eight perhaps being usual) and charged. The major portion of the battle is supposed to have been the othismos, which modern historians depict as a prolonged shoving match with shields, after which one side or the other broke. Most casualties are assumed to have occurred after the othismos. That is the way they fought one another, and modern historians consider it to have been fairly ritualized. (There are a lot of uncertainties about the mechanics of all of this, and frankly it is difficult to read the scholarly articles and be completely convinced. The blog "Hollow Lakedaimon" has a number of compelling posts following the arguments for and against the prevailing interpretation.)

Fortunately, my point is not about the details of hoplite battle, but rather that it changed dramatically in the later years of the fifth century B.C., and continued to change throughout the following century. The changes become discernible in the Peloponnesian War and the conflicts that followed. As early as Sphacteria in 425 B.C., when Athenian light troops played a decisive role against the Spartans, and Delium in 424 B.C., when the Thebans began to vary the formations and tactics of hoplites to defeat the Athenians, it is clear that the old, formalized, semi-ritualized forms of combat were obsolescent. In the decades that followed, changes both in how hoplites fought and in the use of light troops led to a revolution in Greek warfare, culminating in the rise of Philip and Alexander of Macedon, who incorporated Greek ideas into a much more flexible system.

This coincided with tremendous cognitive changes at Athens and elsewhere. This was the age of Socrates (who in fact fought at Delium), Thucydides (who was a general as well as an historian), Sophocles, Euripides, Aristophanes, and Xenophon (another commander and literary figure). Plato and Aristotle (who was Alexander's tutor) followed. No one doubts that new ways of thinking emerged in this time period, but they are not generally tied to other developments, such as the change in equipment and tactics. As I will try to show in future posts, these changes often occur in periods of intellectual ferment. The old limitations are removed and new modes emerge not just in philosophy and literature, but in all walks of life, war included.

We cannot separate military history from cultural and intellectual history.

Saturday, November 20, 2010

Becoming Aware of Mental Boundaries

There are many sorts of limiting factors that have kept actors in historical events from behaving in certain ways or that prevent an event from unfolding in the way that would seem most likely or logical to us. Of these, the hardest to grasp are often mental limits. When we read history, we must keep in mind not only the factors that governed the thought and behavior of individuals in the past, but also the ones that govern our thoughts and judgments as we read about them or try to place ourselves in their shoes.

There is a trope in science fiction that suggests that a modern person could survive more easily in the past than a person from the past could survive in our time. It is largely hogwash, for each time has a set of skills and a basic body of knowledge that is quite different from its remote predecessors and successors. I mention this because I think it speaks to the core of the way we see ourselves in regard to past generations. We need to be as aware of the bounds of our own mental world as we are of those of the actors in the historical dramas we read and re-imagine in our minds.

Factors that limit our mental worlds fall into two broad categories. One might be termed the limits of knowledge and belief. The other consists of limits imposed by mental habits and cognitive processes. These are not always distinct, and some things fall in between, particularly phobias, which might be termed both belief and habit. I am thinking here of Grant's previously mentioned horror of returning the same way he came, which so influenced his behavior on campaign when temporarily checked. In my previous discussion of the Romans' repeated loss of fleets to storms during the First Punic War, I suggested that lack of knowledge was one of the main reasons for it. Clearly that would fall into the first category. At the same time, the commanders of the fleets were clearly landlubbers, with the mental habits and equipment of soldiers, not sailors. That would fall into my second category.

When dealing with figures and events from the past two or three centuries (at least in the West), the differences between their minds and ours are largely the result of differences in knowledge and belief. As one moves back beyond that, differences in mental habits and cognition become more and more important. A close friend once told me that he did not like to read about periods before the American Civil War, because he found it too difficult to understand the way people thought in earlier times.

I have noticed that, moving as far back as Elizabethan times, significant differences in mental habits appear. For instance, I was floored by Francois de La Noue, the third-ranking Huguenot general in the French Wars of Religion. He wrote penetrating analyses of campaigns and battles, showing an excellent understanding of cause and effect while ferreting out lessons from contemporary combat, but when he turned to larger events and the ultimate causes of the Wars, he slipped into a species of magical thinking, invoking God's wrath on France for impiety, witchcraft, gambling, and similar sins. It is as if he crossed some threshold of abstraction and lost his reason. Of course it was reasonable in his time, even if such thinking (as emerged from certain quarters after Hurricane Katrina and the 2004 tsunami) draws derision today. Many people still think in that way, but they are not usually leading military or political figures (I hope).

Even when we read something as concrete and seemingly non-intellectual as an account of a battle, we are dealing with this sort of thing, at least unconsciously. To develop a truer understanding of the past, we need to make it explicit and pull it out into the open.

Friday, November 19, 2010

But Why Didn't They ...

Most often a historian tries to show why something happened. In other words, the author is mainly interested in creating a chain of cause and effect. That is the way it is normally done, but it may leave readers wondering why the actors in the historical drama did not do things in a different way, or why they did not know something that seems obvious to us.

For instance, in the First Punic War, the Romans managed to win almost every naval battle, but it took them a very long time to win the war, in large part because they kept losing entire fleets to storms. Since they normally sailed close to land, it seems logical that they should have been able to beach their galleys and avoid major losses. The problem was probably threefold. First, as J.H. Thiel noted repeatedly in Roman Sea-Power Before the Second Punic War (1954), the Romans had no prior history of large-scale naval or maritime endeavors. They had very little knowledge of the sea and of storms at sea. That their naval leaders failed for a long time to gain such knowledge is largely due to the fact that they were not admirals in any real sense. Instead, they were magistrates, elected from the aristocracy and replaced yearly, intended to lead legions on land. Few of them commanded at sea for more than a campaigning season, so they had little chance to gain experience.

Note that there are two limiting factors here. (There were probably others that were specific to a given time and place, but these two seem the most important and the most widespread.) The structure of Roman government meant that men would be chosen from among the large land-owning families to serve for a year. They lacked knowledge of the sea, and they lacked the time to gain it. So one limitation was structural. The other was the simple unfamiliarity of the citizens with the sea. As more than one nation has found out, you can make seamen into soldiers, but it is much more difficult for a landsman to become a sailor.

Some historians, such as Thiel, do spend time on these kinds of negative explanations. Any good historian will be aware of the limiting factors, but they may not discuss them in detail or at length. One reason is often that the limiting factors are complex and structural, difficult to explain in a short space. Another is that historians internalize many of these as they work over the material and become intimately familiar with the events, the people, and the time in question. The knowledge becomes implicit after a while. Implicit knowledge is difficult to communicate, difficult sometimes even to verbalize. It also creates a barrier between the author and the audience, because the author knows things and has difficulty comprehending what knowledge the audience has or what it lacks. At what point does the explanation become a digression, the digression a distraction, and the distraction a separate project?

So the question of "why didn't they" can apply just as much to the limitations on the author as on the people about whom he is writing.

Thursday, November 18, 2010

The More Things Change, the More Their Names Stay the Same

The last two posts on Mobile Bay remind me of an old problem, the conservatism of naming. Over and over again in history we see a word change its meaning, sometimes drastically, in a short period of time and create intense confusion.

Take the torpedo. To the Victorians the name could mean: (1) a stationary naval mine, (2) a spar torpedo, (3) a towed torpedo, (4) a primitive land mine, and (5) an automobile or fish torpedo - what we would think of as a torpedo today. A spar torpedo was essentially a long pole or boom with an explosive charge attached. It was extended from the bow or at an angle from a vessel and was intended to detonate when it struck another boat or ship. It was with one of these that the Confederate submarine Hunley sank the USS Housatonic. Towed torpedoes were explosive charges fitted with vanes or fins. When towed by a ship, the vanes caused them to swing out to the side of the ship. As the attacking ship passed, it was intended that the towed torpedo would strike the enemy vessel and explode. The fish torpedo (so called because it swam under the water) was also often called a Whitehead torpedo, after its inventor.

From the context, we can usually tell which weapon the author intended. This is, as I said, an old problem, and the solution is not always so obvious. The early history of guns and gunpowder is a case in point. For many years, there were scholars who believed that guns and gunpowder were an Arabic invention, or at least that they appeared in the Middle East before they did in Europe. One of the chief reasons for this was that the terms barud (pronounced with both vowels long, something like "bay-rood" I believe) and naft originally seem to have meant saltpeter and an inflammable mixture used as an incendiary weapon, respectively, but both came to mean gunpowder. Historians running into these terms were not always certain which meaning was intended, nor were they at all certain when the usage changed. In fact the use of naft to mean either gunpowder or an incendiary continued side by side for a time. (J.R. Partington has a good discussion of this in Chapter V of his A History of Greek Fire and Gunpowder, 1960, reprinted 1999, esp. pp. 189-197.)

Similar confusion arose in China and in Europe over the terminology of the new weapons. Decades of scholarship were needed to sort it all out.  Today we continue to apply old words in specific ways to new technologies. We might recall that, a hundred years ago, a computer was a person who made computations, or that a tank was still strictly something for holding liquids. The more things change, the more their names remain the same.

Wednesday, November 17, 2010

Damning the Torpedoes Again

Twenty years after the Civil War, David Dixon Porter, foster brother of David Farragut and himself one of the major Union naval heroes, published The Naval History of the Civil War. While there is much in it that is clearly pro-Navy propaganda aimed at glorifying the officers of the fleet, it also tells us much about how a professional sea officer viewed the events of the War. There are a couple of things in his chapter on the Battle of Mobile Bay that I think deserve mention in the context of the previous post.

He makes much of Farragut's aggressiveness and personal bravery, noting that he "had already earned the sobriquet of 'The Old Salamander.'" (p. 567) Presumably salamander is here intended in the older sense of a fiery spirit, not of an amphibian with a long tail. By the way, he gives the famous quote as "'D--n the torpedoes,' he replied, 'follow me!'" (p. 573) Note too, as I failed to note last night, that this followed directly on the sudden sinking of the ironclad monitor Tecumseh by a torpedo.

Clearly Farragut was aggressive to a fault, but he had hesitated to attack without ironclads. He had been outside Mobile Bay since January, and had repeatedly requested at least one ironclad. Porter writes: "At that time Farragut had not an ironclad, and, being convinced that it would be madness to attack these forts without such aid, made his wants known to the Navy Department, and the vessels were eventually supplied." (p. 567)

This point is made again later in the chapter, where it is reiterated that he thought it foolish to force the forts and face at least one Confederate ironclad without a single monitor of his own. As it was, he eventually got four. But note that even as aggressive an officer as this felt he needed a technological force multiplier in the shape of ironclad monitors. They did not prove vital in the battle that followed, but they gave Farragut and his men the psychological edge they needed.

Tuesday, November 16, 2010

Damning the Torpedoes

To amplify literally means to enlarge, increase, or exaggerate. Amplitude is a related term, often used to describe the height of a wave. I often think of amplifying as a heightening. It is in this sense of heightening or exaggerating an existing tendency that I meant "technology does not so much create, as it amplifies" in my previous post. It may do so at the expense of another tendency.

We are terribly aware of it today, for instance, as the internet seemingly breaks down certain kinds of received authority, both by allowing self-publication (like this) and also by permitting formerly isolated enthusiasts, experts, and autodidacts to pool information on myriad topics (as in the case of Wikipedia). Automobiles represented a similar tendency in American society in the last century, pushing the restlessness and desire to spread out, so noted throughout our history, to its logical conclusion.

Returning to steamers and ironclads for the moment, we can see different ways they functioned within the naval and maritime culture of the day.

I have already noted the geometric mania that emerged in mid- and late-Victorian naval tactics. While this was partly the result of the increased maneuverability gained with steam, it was also an opportunity for British officers to demonstrate the extent to which they were making their profession into a science. Reducing every discipline to a science was a ruling passion of that century, and the naval and military services of the major powers were as prone to it as any other group. Not only were they able to present their new tactics as having a mathematical and geometric (and therefore scientific) basis, but the transformation of the naval officer into an engineer was also gaining its first toehold, as engineers became essential for both the design and the running of a warship. This would not have been the case had sailing ships continued to dominate the sea lanes.

I have also noted the extreme aggressiveness of Royal Navy officers, who seem to have been taught to fight even against greatly superior odds. This Nelsonic ideal was already present in the small US Navy, and seems to have influenced other fleets as well. In the American Civil War steamers and ironclads were used by both sides and produced some spectacular battles and incidents. Mobile Bay (August 1864) was one of the largest and most spectacular. While it is not clear that Farragut actually said the famous line, "damn the torpedoes, full speed ahead," it captures the force and daring with which he handled the Union squadron.

The attack itself would have been impossible, or at least nearly inconceivable, fifty years earlier. Not even the insanely brave Lord Thomas Cochrane, one of the principal models for Horatio Hornblower, would likely have led a fleet of sailing ships against two fortresses, a small collection of enemy vessels, and torpedoes (as naval mines were then called). Sailing ships did not normally attack fortifications on land; they were too vulnerable to hot shot and other incendiaries. In the 1850s, during the Crimean War, they had begun to do so, but tentatively and always wary of Russian torpedoes. It was only with the advent of true armored warships after 1858 that such an attack became practicable. Even then, torpedoes were a terrible danger, one well understood since the sinking of the USS Cairo by one twenty months previously. Even given the aggressiveness of officers like David Farragut or his foster brother David Dixon Porter, it would have been expressed very differently without ironclads.

In future posts, I hope to make this matter of technology's ability to amplify or heighten clearer, as well as to delineate the way it operates in greater detail.

Monday, November 15, 2010

Amplification

In these posts, I have been focusing on the century from the collapse of the empire of Napoleon to the collapse of the continental European empires. For me, this period is a laboratory for studying the effects and significance of technology on society, culture, and mind. The changes were across the board, but in military and naval terms, the technical changes are well documented and understandable. These were not trivial changes, but profound. Some of the reactions they provoked should give us pause in considering our own.

A quarter-century ago, I began this journey looking for the deeper effects of technology by focusing on a specific and highly visible area (early modern military thought). What I learned then still troubles me. While I have learned much more, adding layers and subtlety to my interpretation, the first and most important lesson has been this:

technology does not create so much as it amplifies.

Sunday, November 14, 2010

A Geometric Mania

Last night I noted the Victorian vogue for the ram as a naval weapon. I want to explore a little further some consequences of that.

From at least the time of the Anglo-Dutch Wars (1650s and 1660s) to the Battle of Navarino (1827), naval tactics were determined by the wind. Because more guns could be concentrated on the sides of ships than at the ends, it was necessary for warships to present their broadsides to the enemy. This meant that the normal battle formation was line ahead, that is, each ship followed in a line directly behind the one in front. Previously, the line abreast, with the ships sailing side by side, had been preferred, as this had allowed oared galleys the best scope to use their rams.

The introduction of steam allowed warships to maneuver without wind, and by the 1850s, just prior to the development of the ironclad and the reintroduction of the ram, some theorists began speculating on new possibilities for tactics and naval formations.

After the American Civil War, during which much new technology debuted, but without significant tactical innovations, a long debate ensued between different schools of thought. In the Royal Navy, this took the form of arguments for the old line ahead (supplemented as necessary by the line abreast) against what were termed "group" tactics. In these, small groups of three or four ships were deployed in triangles, diamonds, or similar formations to maneuver and fight in a quasi-independent manner during a battle.

This combined with another obsession of the time, the desire to establish the precise turning radius and best speed of each ship in the fleet. It was felt that if these could be determined, then the handling of a squadron on the day of battle would be much improved. (The basic issue was that, during the 1860s, 1870s, and 1880s, warship design evolved so quickly that it was rare for more than two ships to have similar characteristics, making it difficult for a squadron to maneuver together.) Edward Reed, designer of most of the early British ironclads, devoted considerable space to problems of turning radius, top speed, and related topics in his 1869 treatise, Our Iron-Clad Ships. He pointed out the difficulties of determining these measurements empirically, showing in effect that both human and technical factors were involved, causing considerable variation from day to day.

Unfortunately, his advice was often ignored. The combination of interest in the new tactics and the obsession with speeds and turning radii led to the sort of geometrical mania that often afflicts military and naval theorists. The result was that various admirals tried out a number of very intricate evolutions for their squadrons, almost as if playing variations on musical themes. Queen Victoria was likely fortunate that these were never put to the test of war. Not only were many of them ill-conceived, but they may have put many of her seamen in peril from her own ships.

Sir George Tryon, Commander-in-Chief of the Mediterranean Fleet, flew his flag in HMS Victoria. It was while practicing one of his pet evolutions, that of having two lines of ships sailing parallel to each other reverse direction by executing a 180-degree turn towards each other, that the Victoria (and Sir George) were lost. (Wikimedia Commons has an animation diagramming the maneuver and collision.)

Tryon either failed to calculate the correct turning radii of the Victoria and Camperdown, or made a mistake in his flag signals, and he ignored the problem when it was pointed out to him. As a result, the ram of Camperdown tore into his flagship, which quickly went down. In the heat of battle, there is no telling how many other ships might have been lost in this way. The demise of the ram led capital ships back to more conventional tactics, while the development of long-range gunnery allowed much wider spacing between vessels.
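
A rough back-of-the-envelope check shows why the turning-radius calculation mattered; the figures below are only illustrative, close to those usually cited for the disaster rather than drawn from this post. If two parallel columns are separated by a distance S, and each ship turns 180 degrees toward the other on turning diameters d1 and d2, the reversed tracks clear one another only if the separation exceeds the sum of the diameters:

$$ S > d_1 + d_2, \qquad S \approx 1200 \text{ yards (six cables)}, \qquad d_1 + d_2 \approx 800 + 800 = 1600 \text{ yards}. $$

Since 1,600 yards is greater than 1,200, the condition fails: the reversed tracks had to cross, and without a change of helm or speed a collision was all but guaranteed.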

Saturday, November 13, 2010

The Return of the Ram

Emulation and offensive mindedness took a particularly bizarre turn in naval warfare during and after the American Civil War. In Classical times, naval warfare had largely been dominated by oared galleys equipped with massive, bronze rams. Tactics revolved around ramming or boarding enemy ships. Even before the end of antiquity, ramming was little used, and the rams were turned into spurs, better suited to breaking an enemy's oars or superstructure than punching a hole in his hull. With the advent of the sailing ship of the line in the years around 1600, the galley was relegated to a minor status, suitable for use in shallow waters and to protect estuaries. Ramming made no sense in an age of ships at the mercy of the wind and armed with heavy batteries of artillery.

That changed with the coming of steam. Once again ships could maneuver in any direction without reference to the wind. Some in the decades before the 1860s thought the ram might be the weapon of the future. This was particularly the case in the French and American navies, interested as they were in negating the Royal Navy's huge advantage in ships of the line. The ram attached to the steamship seemed to offer a way for smaller vessels to inflict fatal damage on a large one. Some experimentation was underway before the bombardment of Fort Sumter, but it was the need of the Confederacy to counter the potential naval power of the Union that led it to build armored rams. The Confederates were emulating the substance of an ancient weapon and its tactics in a way seldom seen in history.

The ram was to have a curious history in the years that followed. The Confederates had a certain amount of success with it, beginning with the attacks on wooden warships at Hampton Roads in 1862. The CSS Virginia (formerly the USS Merrimack), the first of these rams, was successful with her ram, but she also demonstrated the fragility of the weapon. Her ram stuck fast in the hull of the USS Cumberland, nearly sinking the Virginia, but it sheared off instead. Had the Virginia retained her ram, the story of the next day, when she faced the revolutionary USS Monitor in battle, might have been different. As it was, neither ironclad was able to do significant damage to the other with her guns.

For the duration of the war, the South continued to build rams, and experienced some success with them. The North built only a few, intended for use on rivers, and, curiously, under the command of an army officer. Their successes were minor, but rams were to exercise a powerful fascination on the naval mind in the decade following the peace. Most major navies built at least some rams, and the Austrians won the one major naval action in which they played a central part (the Battle of Lissa, 1866), most famously sinking the Italian ironclad Re d'Italia by ramming. Tactical treatises were written to investigate the best way to use the ram.

In the end, though, the ram was sunk as an important weapon by a new technology, perfected just a year after Lissa by an Englishman working for the Austrians (Robert Whitehead, who was also the maternal great-grandfather of the children depicted in The Sound of Music). This was the modern (or automobile, i.e., self-propelled) torpedo. In effect, the torpedo took the idea of the ram (punching a hole below the waterline of a ship) to its logical extension, for it was a ram at a distance. By 1893, when the ram-bowed HMS Victoria was accidentally rammed and sunk by HMS Camperdown (a disaster parodied in the Alec Guinness film Kind Hearts and Coronets), the ram was effectively dead as a weapon. Ramming would remain a tactic for emergencies, and was several times employed against submarines in the World Wars, but it was never again considered for use against armored ships in battle.

At the beginning of this post, I commented that the reintroduction of the ram was an example both of emulation and of the offensive-mindedness I have previously noted in the Civil War era and after. At its heart, the ram is a purely offensive weapon. Unlike guns, which can be fired at a distance and used to keep an attacker at bay, the ram can only be used by charging directly at an enemy and driving the attack home. (The ancients did have an awkward tactic of forming their galleys into a circle with the rams pointing outward, but it was rarely effective and would have been pure suicide against ironclads armed with big guns capable of firing from a distance.)

The sailors of this era had been just as impressed by the success of Horatio Nelson's extreme aggressiveness at St. Vincent, the Nile, Copenhagen, and Trafalgar as the soldiers of the period were by Napoleon's on land. Unlike their counterparts on terra firma, the sailors were confronted not with minor advances in weaponry, but by a whole series of revolutions in propulsion, gunnery, construction, and, finally, armor. Armor plating, in particular, was a threat to the offensive, giving the defensive a great advantage and, in so doing, posing the threat of a stalemate at sea. This was unacceptable, and the ram was, for a while, one of the responses to an unacceptable situation. It took time for this bit of emulation to prove itself more of a problem than a solution, and its supporters could always point to the great naval battles of the past.

To Qualify What I Have Been Writing

There is an inherent contradiction between the brevity of a blog post and the length and complexity of a human life, a war, or even of a battle. I find as I write that whatever I compose is a great simplification of reality. This was true of my dissertation and the small amount of work I have published. It is even truer of these blog posts. My goal here is to explore some topics over time and in a provisional way, mulling things over and thinking aloud, rather than trying to set down any final and irrevocable statement of fact or interpretation. History is too big and too complex for that.

Friday, November 12, 2010

Emulation and Creativity

"Play, become suddenly and unexpectedly too real, confuses the emotions."
    --Algernon Blackwood, "The Wings of Horus"

J.E. Lendon's Soldiers and Ghosts (Yale, 2006) continues to haunt me four years after I first read it. There are many things there to chew over, but for the present, I want to draw attention to the sometimes literal imitation of the mythic heroes of Greece and Rome by warriors and generals. The examples that struck me most forcefully are the attempts by Julian (sometimes called the Apostate, the last formally pagan emperor of Rome) during his final campaign, in Persia in 363. (Lendon details this in his evocatively titled Chapter XIII, "Julian in Persia, AD 363: Triumph of the Ghosts.") Julian literally tried to repeat various scenes from the life of Alexander and the Trojan War, even at the risk of his life and his army. He was emulating Achilles, Odysseus, and Alexander because it was the correct thing to do, just as emulation of past forms was the correct way of writing at the time. There was no real possibility of creativity or originality. Julian was reviving something essentially dead.

In a previous post ("Churchill and the Personal"), I noted Winston Churchill's tendency to emulate his father and his ancestor Marlborough. One can possibly see a similarity to Julian in this behavior, particularly in his antics in taking part in the cavalry charge at Omdurman in 1898, and in the defense of Antwerp in 1914. Whatever he was doing, he was certainly playing in some very deep and serious sense, but I do not get the impression he was trying to revive something. In his own creative and peculiar way, he was trying to learn as well as to lead. Had he not done these things, and many others, had he not written his way through the four volumes of Marlborough, the story of an earlier war between Britain and a power that sought world domination, could he have emerged in 1940 as the savior of his country? Could he have worked as he did with Eisenhower and FDR without having so completely internalized the ideal of coalition warfare created by Marlborough and Prince Eugene of Savoy?

The difference, I think, between Julian and Churchill is the difference between emulation for its own sake and emulation as a creative process. The latter is a process that has shaped civilization in many times and places. It is underappreciated today, for we have moved away from it, but it was the essence of the Renaissance and much of what emerged from it. This is something I will return to in later posts. For now, it is enough to note its existence.

Thursday, November 11, 2010

Remembrance

It is supposed to be a day for remembrance, though over time what we are asked to remember has changed somewhat.

Originally it was to commemorate the end of a war and to celebrate peace. Later it became a day to honor the service of veterans, but that was still a time when war was seen as somewhat glorious, so there remained a sense of great accomplishment and victory. Now, when wars have changed their character so, the emphasis is even more on the service of the individual and his or her sacrifices.

What will it become in another 91 years? Will there still be wars and will there still be veterans? Or will the sacrifices from a "war to end all wars" finally be consummated so that there are no more wars and no need to create more veterans? Would that not be the ultimate way to honor service and sacrifice?

Wednesday, November 10, 2010

Churchill and the Personal

In the past two posts I have written a bit about the importance of quirks and of offensive mindedness. This leads me to one of the most unusual, and, at least before 1918, most offensive-minded political figures of the last century, Winston Churchill. Joseph Maiolo notes that in war, in politics, and in his historical writing, Churchill believed in great men and "inspired leadership" as the answer to most problems. (Joseph Maiolo, Cry Havoc. New York: Basic Books, 2010, p. 173.) This is true of Churchill at many levels, and we might extend it without exaggeration by saying that politics, war, and history were all personal for WSC to an extent rarely seen. At least until he became prime minister in 1940, everything seems to have been a great adventure for him.

A couple of examples ought to suffice to make the point. In 1914 Winston was First Lord of the Admiralty, a position roughly equivalent to the American Secretary of the Navy prior to the consolidation of the armed forces in 1947. The tradition had been that the First Lord, unless an admiral, stayed out of actual operations and planning. Churchill, in the unique position of being able to stay in contact with naval squadrons all over the world through radio and telegraph, could not help but direct operations himself. As if this were not enough, he also agreed to take over the air defenses of the United Kingdom from the Army, and popped over to Belgium, where he ended up personally directing the British forces at Antwerp for two days, possibly saving the Belgian Army from destruction. Over and over in the first year of the War, he became directly involved in ways that no other politician in Britain or America ever had or would again.

This deep personal involvement is also reflected in his biographical writing. His two main subjects were his father, Randolph Churchill, and his illustrious ancestor, John Churchill, First Duke of Marlborough. Not only did he research and write long biographies of both men, but there is strong evidence that he consciously tried to follow, and complete, his father's political career, holding the same offices and exhibiting similar patterns of political behavior (indeed many contemporaries between the Wars were afraid he had inherited his father's madness as well). The case of Marlborough is more complex, for the imitation was much less direct, but it is important to note that he spends much time justifying many of his ancestor's personal and political actions, which seemed dishonorable to the Victorians and their successors. Perhaps of more import, he seems to have used Marlborough's close association with the Imperialist general Prince Eugene, and his handling of his Dutch allies, as a model for his own cooperation with Roosevelt and Eisenhower. I believe that Winston was able to draw both examples and comfort from his biographical and historical work. He was clearly working with a strong belief in great men. Whether that belief was due to his own perception of himself, or whether the reverse was true, is not exactly clear, but it made history and biography personal for him in a way that has rarely been true for other writers.