Thursday, December 30, 2010

Imitation Not Affectation, Pt. 1

Since I read J.E. Lendon's marvelous Soldiers and Ghosts a few years ago, I've been struggling to get a firm grasp on the concept of imitation. (In one of my first blog posts, "Churchill and the Personal," I discussed the role of imitation in Winston Churchill's life and biographical writings, stressing the importance of his exploration of his father's life and of the politico-military career of his ancestor, John Churchill, Duke of Marlborough.) Iain McGilchrist gives it a special place in his recent The Master and His Emissary, an exploration of research into the neuropsychology of the brain's two hemispheres and its application to history.

We are used to thinking of imitation in a reduced, deracinated sense of copying or simulation, as something almost dishonest or at least distasteful. It is a view that fits in with our misunderstanding of creativity, our present-day cult of originality, and our academic witch-hunts of plagiarists.  The ancient world, the Middle Ages, and the Renaissance understood the matter differently. At the very least it was what Thomas Mann (in his speech "Freud and the Future," quoted by McGilchrist) calls "mythic imitation."

This is no mere act of copying, but of becoming; not a matter of Napoleon behaving like Charlemagne, but of becoming Charlemagne, as Mann would have it. The left hemisphere will only allow the former interpretation of imitation as legitimate or possible; the right hemisphere, open to paradox and metaphor, accepts the latter. We find this type of imitation operating among the early Caesars, but clearly in trouble by the time of the late Roman emperors, such as Julian in the 360s. It seems to have resurfaced in the Middle Ages, a period McGilchrist does not discuss, where it was present not just in the mystical Imitation of Christ, but also in much of the behavior discussed by Johan Huizinga in his classics, The Waning of the Middle Ages and Homo Ludens, examinations of symbolic and play behavior in civilization.

This mythic imitation, which we might also call creative imitation, is central to the Renaissance. Literally nothing about the age is explicable without this understanding. For the writers, thinkers, artists, musicians, craftsmen, and rulers who directly participated, it was necessary to absorb as much of the ancient world as possible.  It was not about memorizing facts or techniques, but about absorbing antiquity into their very being. Recovery of the past became a creative act, and imitation became a necessary part of an old world renewed.

In a very real sense, they still thought of themselves as part of the Roman Empire. Much of the political propaganda of the late Renaissance and Reformation established the national monarchies, not just the Holy Roman Empire, as legitimate successors to Augustus and Constantine. When Leonardo Bruni and Machiavelli sought military reform, they turned to the Romans; in the former case by analyzing their military terminology to prove that the legions were primarily composed of foot soldiers; in the latter, by consciously and unconsciously adapting Roman military formations to the modern world of cannons and arquebuses. Machiavelli took imitation directly into his political works, the most important of which, The Discourses on the First Ten Books of Livy, was couched as a commentary on the Roman historian Titus Livy. This was not entirely a conscious effort; as Sydney Anglo points out, many of the examples Machiavelli picks support his arguments badly and sometimes contradict his points. There was an unconscious reworking of the material. Machiavelli may also have been imitating his own father, who had edited an early printed edition of Livy when the great political writer was a boy.

You will have gathered that little of what we think of as imitation was involved in the Renaissance idea of imitation. It was not a superficial, external affectation, but, as in the case of Napoleon two centuries ago, and Churchill in the last century, a dynamic process of absorbing and re-imagining the past with both conscious and unconscious aspects.

In my next post, I want to look more deeply into this process by looking at the work of Albrecht Durer.

Friday, December 17, 2010

A Slight Pressure on the Trigger

Not long ago, I ran across a news story about the U.S. Army's latest infantry weapon, the XM25, a "smart" grenade launcher using a laser rangefinder, onboard computers, and programmable ammunition to explode small grenades above an enemy hidden from sight. It's a case of exporting a difficult skill to a machine with an easy-to-use (dare I say point-and-shoot) interface. Armies around the world have a long history of this; you could reduce the entire history of firearms to this one theme without undue distortion.
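To make the idea of exporting a skill concrete, here is a rough sketch, in Python, of the kind of calculation such a fire-control computer takes off the soldier's hands. Every name and number in it is hypothetical and deliberately simplified; real airburst fire control must account for ballistics, drag, and how the round actually meters out its flight, so this illustrates only the principle, not the XM25's actual logic.

# Illustrative sketch only, not the XM25's fire-control logic.
# The soldier lases the obstruction; the computer adds a small offset so the
# grenade bursts just beyond it and converts that distance into a fuze setting.

MUZZLE_VELOCITY_MPS = 210.0   # assumed average round velocity (made up)
BURST_OFFSET_M = 2.0          # burst a couple of meters past the cover

def fuze_setting(lased_range_m: float) -> float:
    """Return a time-of-flight fuze setting, in seconds, for a lased range."""
    burst_distance = lased_range_m + BURST_OFFSET_M
    return burst_distance / MUZZLE_VELOCITY_MPS   # ignores drag and bullet drop

# A wall lased at 340 m yields a burst a little over 1.6 seconds downrange.
print(round(fuze_setting(340.0), 2))

The arithmetic is trivial; the point is that the judgment it replaces, estimating range and lofting a grenade over cover, once took years of practice to acquire.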

This off-loading of skill has followed three arcs. The first made it easier to fire (or use) the gun. The second facilitated rapid fire. The third simplified hitting the target.

The first guns were awkward things fired by touching a coal or heated wire to the powder while holding the stock under one arm. Hitting anything at all was chancy. First the coal was replaced by a slow match, a smoldering, yard-long coil of treated cord that functioned much like a punk on the Fourth of July. So that the soldier could keep his eye on the enemy when firing, a long, S-shaped, hinged rod (the serpentine) was introduced to mechanically lower the match into the powder. As yet there were no triggers. The serpentine had to be balanced with the fingers of the right hand to prevent premature firing; with the addition of levers and springs for control, the trigger was invented. A slight pressure on the trigger at the right moment fired the gun with minimal distraction. This was the first real user interface in firearms.

For over two hundred years this was the dominant system, but after 1650 it was gradually replaced by the flintlock, wherein a piece of flint was driven against a piece of steel, sending a shower of sparks into the powder. It was easier and safer to use, as having a burning match around loose gunpowder was always an invitation to disaster. Flintlocks were also more reliable, misfiring half as often as matchlocks. After 1811, mercury fulminate, which explodes when struck sharply, much like the cap in a cap gun, began to displace flints. Again, it was more reliable and easier to use. (With each change, the amount of effort in loading and firing the gun decreased.) By 1860, the modern methods of firing were in place.

The second arc was to make loading and firing faster. Each of the innovations just described made the process a little better, so that an experienced soldier in 1815 could fire six to ten times faster than one in 1500. So long as guns remained muzzle loading (loaded from the far end of the barrel), little more could be done. To make them faster (and easier) to load, they had to be loaded from the breech, which meant leakage of gas at the joint. Various things had been tried before 1800; however, the real breakthroughs came in the decades following the Napoleonic Wars. Interchangeable parts and machine tools made all the difference. Guns were produced with finer tolerances between parts, which in turn meant less gas leakage. Then, with the introduction of fulminate primers, cartridges, which had previously been made of paper, began to incorporate metal. Originally just a metal disc held the primer; later, cartridges became completely metallic. These expanded slightly when fired, making a tighter gas seal. Even better, since they were not crushable like paper, metallic cartridges could be fed from a magazine (either a box or tube with a spring at one end). By the 1860s a greenhorn armed with a magazine rifle holding seven or more shots could fire faster than a drill sergeant armed with an older, muzzle-loading weapon. From that point on, repeating weapons have only become faster.

The third arc was accuracy. Initially, matchlocks and flintlocks improved this, as each required less attention than its predecessor, but the real star was rifling. It was expensive and took longer to load, so it was not popular before the second quarter of the nineteenth century, when the French found a way (the Minie ball) to achieve an acceptable rate of fire with muzzle-loading rifles, using a conical bullet with an expanding base that created a gas-tight seal. This gave them and their British allies an advantage over the Russians in the Crimean War (1853-6), but it was the new breech-loading, repeating rifles that made the greatest difference. Firing and loading had been simplified to the point that a soldier could give most of his attention to hitting his target. For the first time, gun makers and ordnance boards began paying attention to sights. They worked out ways to make the sights simple and foolproof, so that less training would be needed to make a marksman. By the last decades of the nineteenth century, smokeless powder removed the last major obstacle to accuracy, eliminating the dense clouds of smoke created each time a gun was fired.

Over the last century and a half, the design of military rifles has continued to focus on weapons that are easy to use, fire rapidly, and are simple to aim. With the XM25, it appears we may have reached a new plateau, one where a computer does everything but select the target and make the decision to pull the trigger. It seems that everything else has been transferred to the gun. Will those functions also be externalized?

Sunday, December 12, 2010

A Bit about Interfaces

Interface is a modern word for a very old thing. The OED gives its first use in 1882 as a scientific term. The modern usage only dates from the 1960s. Language is our interface with one another and, to a limited extent, with dogs and parrots, which have been experimentally shown capable of understanding human language. The calendar markings Marshack detected on 30,000-year-old bones were at once both interface and recording. (See the post: Embodiments of Memory.) Interfaces have been around for as long as we have been fully human, if not longer.

Recently, interfaces, the apparatus that allows us to control and receive feedback from our creations, have forced themselves on my consciousness. They don't get much attention in history. They are not entirely neglected, but they are often relegated to specialist niches. Outside of the history of technology, it seems to me that interfaces appear most frequently in naval and aviation history. Nevertheless they are important, both because they constrain what we can do and because they directly mediate between the mind and the environment.

Interfaces do not just pop into existence. They evolve and change over time, their characteristics adapting to the needs of the moment and granting us new capabilities. We can see this in the early history of aircraft. We are used to airplanes controlled by a combination of a joystick or yoke (a semicircular wheel attached to a moveable column) and pedals, but this was not the case in the earliest machines. The Wrights for many years used a harness connecting the pilot by wires to the control surfaces and wings (which warped or bent), an ingenious system for translating the movements of the body, as well as hands and feet, into the motion of the airplane. This allowed the involvement of the whole body in control and reception of feedback, but it was difficult to master. It also allowed the Wrights to build highly unstable machines that were difficult to fly. A more easily mastered system was needed.

Another early system involved two wheels, one controlling pitch (movement of the nose up and down) and the other wheel roll (movement of the wingtips up and down), combined with rudder pedals for steering the machine. The use of three separate controls required a higher level of coordination than the joystick.

What the joystick did was combine pitch and roll control into an easily mastered interface, making learning and flying easier. (Later, of course, machine gun triggers would be added for military aircraft, further unifying the controls.) The joystick was also capable of numerous improvements and adaptations to other purposes. To us the joystick may seem an obvious device for controlling aircraft, weapons, and anything else where delicate or accurate movements are involved, but it emerged slowly, and anonymously, appearing by 1910.
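For readers who like to see such things spelled out, here is a minimal sketch, in Python with made-up names, of why the single stick simplified matters: one hand commands both pitch and roll, and the feet handle yaw, whereas the earlier arrangement demanded a separate wheel, and a separate hand, for each of the first two axes. It is an illustration of the mapping, not any real aircraft's control system.

# Hypothetical illustration: a single stick maps its two axes onto pitch
# and roll, leaving yaw to the rudder pedals.

def control_surfaces(stick_fore_aft: float, stick_side: float, pedals: float) -> dict:
    """Map normalized inputs (-1.0 to 1.0) to control-surface deflections."""
    return {
        "elevator": stick_fore_aft,  # pull back -> nose up (pitch)
        "ailerons": stick_side,      # push left -> roll left
        "rudder": pedals,            # left pedal -> yaw left
    }

# A gentle banked turn to the left: one coordinated motion of hand and feet.
print(control_surfaces(stick_fore_aft=0.1, stick_side=-0.3, pedals=-0.2))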

The joystick represents some of the most important characteristics of good interfaces. It simplified control and feedback and was easier to learn than comparable systems. It also had limitations: it required more stable aircraft than the Wrights had constructed, and it provided a narrower scope for feedback than their system had.

Before I wander too far, let me sum up. We need to look at interfaces in history because they constrain our abilities at the same time that they enable new ones. They take time to evolve and their evolution may have serious implications for how a device is used. They also allow the transfer of a skill from ourselves to the machine. These are points I hope to take up in later posts.

Tuesday, December 7, 2010

Human Fallibility

Today is the 69th anniversary of Pearl Harbor. The degree of confusion that reigned before, during, and after the attack is something that many people cannot accept. The American tendency towards conspiracy theories and witch hunts, the need to lay the blame at someone's doorstep, has never been stronger than in the continuing struggle to understand Pearl Harbor.

The simple truth is that the attack was very daring, very well conducted, and not as successful as it needed to be. We were scarcely able to believe that the Japanese were capable of carrying it out. The Japanese were unwilling to accept that we could recover that quickly. Part of this was racism on both sides, part of it was wishful thinking by both parties, and a very large part of it was, and is, the failure to understand how significantly war had changed between 1918 and 1941. It was difficult for anyone to grasp all of the changes. Some understood the transformation in tactics and materiel, some saw how logistics and strategy had changed, a few even comprehended the new information warfare of global communications. Few, if any, were able to put all of that together.

One of our chief problems in looking back is trying to put ourselves in the minds of commanders and politicians, from Yamamoto to FDR. We look at the rapid metamorphosis of naval warfare over the next four years and read it backward. We forget that, despite the successful Taranto Raid by British torpedo planes on the Italian fleet the year before, no naval air attack on this scale had ever been attempted. We forget too how much attention in Washington was then directed at Europe and the Atlantic. Finally, we forget how much the human factor, and human fallibility, control events, whether then or now.


I've written a blog post every day for the past month. I did this as much to prove to myself that I could do it as to acclimatize myself to blog writing. I am going to slow down now, allow myself more time and more thought, in order, I hope, to produce better results. I expect I'll be putting up something two or three times a week.

Monday, December 6, 2010

Embodiments of Memory

One of the constants of human history has been the embodiment of various human mental skills and needs in our tools. The 30,000-year-old calendars of Marshack mentioned previously were an early example of this, as were the accounting tokens described by Denise Schmandt-Besserat that just preceded writing. Both were means of investing a part of our memory in physical objects. Writing greatly intensified this trend, but only in the last hundred and fifty years, with the advent of mechanical and electronic recording, have we greatly increased the pace. Today we stand at a crossroads, for companies such as Intel are seriously contemplating neural computing interfaces by 2020. Even if they are off by as much as Bell was in the 1960s about the advent of practical picture phones, that is, by about thirty years, the realization of a direct connection between your brain and cyberspace is likely to happen by 2050 or so.

Of course it might prove impractical, but we already have practical, though limited, wearable computers today. How quickly they will catch on is anyone's guess, but I think five to ten years is a fair estimate. (But, like Alonzo Gray's opinion of machine guns, I may be completely missing the mark.) My point is that the present situation, in which we locate more and more of our memory in personal electronic devices and in clouds of computer servers online, is going to greatly intensify this offloading of our mental processes. We cannot know what that will look or be like, but for the historian it opens up a golden opportunity, one in which we may gain new insights into how the processes of externalization work and apply them to the past. That is, if all the information we experience does not drive us crazy.

For more on this, see: "New Ways of Remembering" on the MU eLearning Blog.

Sunday, December 5, 2010

A Case of Mistaken Hindsight

Last night, I wrote about Alonzo Gray and his opinions on the future of cavalry with respect to machine guns, motor vehicles, and airplanes. My point was our difficulty in recognizing why our predecessors failed to see the possibilities of new machines. There is a reverse side to this too. When we look back at the beginnings of a technology, we notoriously do so with the prescience of hindsight. We often make the mistake of attributing all the power of the mature machine to its embryonic form. We then wonder why it failed to have the impact we would have expected, or why it was not sufficiently appreciated at the time, and we blame some mistaken and fixed belief against it. The machine gun is a good case in point.

Mechanical precursors to the machine gun antedate the First World War by two centuries. The Puckle Gun of 1718, which operated something like a gigantic revolver, was the first. It had the misfortune both of appearing after the end of a long and exhausting war (the War of the Spanish Succession, 1701-1714) and of being overly complicated, probably too complex to be produced in any numbers.

The mitrailleuse (a multi-barreled, hand-operated weapon) is a completely different story. It was adopted in large numbers by the French army under Napoleon III in the 1860s. It did cause terrible casualties on occasion during the Franco-Prussian War (1870-1871), but it was hampered in part by conceptual issues, in part by secrecy, and in part by its own nature. The main conceptual issue was whether to treat it as an infantry weapon or as an artillery piece. That may sound like a trivial, quibbling distinction, but it was critical. Because it was decided that it should operate as artillery, it was used at the ranges typical of artillery in its day. Unfortunately, its range was too limited for that use, so its employment was restricted in ways it would not have been had it been assigned directly to the infantry (as Maxim machine guns were in the First World War). Because it was kept in great secrecy before the war, the crews were not fully prepared, and issues that would normally have been worked out before the fighting began were not.

We are used to seeing film from the World Wars of machine guns cutting down entire platoons in the blink of an eye. The mitrailleuse had a much lower rate of fire and could not easily be directed across a wide area. Instead, it was possible, maybe even typical, for it to simply obliterate one or two men at a time. This was undoubtedly terrifying to advancing troops, but it would not necessarily stop an attack. It simply did not have the technical sophistication of the later fully automatic weapons, or even of some contemporary hand-cranked ones like the Gatling, the Nordenfelt, or the Gardner.

Because we tend to project the character of the mature technology back on its earliest variants, it is natural for us to exaggerate the non-technical problems (inappropriate use as an artillery weapon and the effects of secrecy) and neglect the technical issues when we read about the French defeats in 1870. Of course we do not do this just with machine guns. We do it with most technologies unless we have lived through their evolution ourselves. That makes it too easy for us to criticize decisions made about machines and the uses they were put to at any given time. It also makes us wonder about the intelligence or character of men and women who failed to see the potential of new technologies as they first emerged. This is just one more of those filters through which we too often distort the past.

Saturday, December 4, 2010

"Facts Important for Cavalry to Know"

We forget sometimes how rapidly things changed in the past, and how much difference four or five years could make. In 1910, Capt. Alonzo Gray of the 14th U.S. Cavalry wrote:

I give it as my humble opinion that increased range of firearms and the addition of machine guns, increase the sphere of action of, and necessity for, well-organized cavalry; that bicycles, motorcycles and automobiles will prove to be only valuable auxiliaries to cavalry in transporting information back to the rear, and thus saving an unnecessary expenditure of horse flesh; and that while flying machines may bring information, by so doing they will widen the sphere of action of good cavalry; and, more than ever before, as a result of such information, it will be necessary to have good cavalry ready to move on extremely short notice.
(Alonzo Gray, Cavalry Tactics as Illustrated by The War Of The Rebellion, Together with Many Interesting Facts Important for Cavalry to Know, 1910, p. 3)

That bit about the machine guns has stuck with me since I first read it in 1985. In 1910, this may not have been an unreasonable opinion. Gray may not have had cavalry charging against machine guns in mind, though such a thing was possible and could succeed in the right circumstances, as the Australian Light Horse would soon prove in Palestine. He probably had in mind the use of machine guns as a lightweight substitute for horse artillery, which made perfect sense and was a widely held idea.

Likewise he could not imagine motor vehicles that would move cross country. The tank was still six years off, and the Jeep about thirty. Airplanes were interesting but not yet taken seriously, least of all by the U.S. Army, which would lag well behind the Europeans.

Had he written five years later, the picture would have altered radically. By then, airplanes had proven their worth; cavalry had been immobilized by trenches and barbed wire on the Western Front, where they spent their time waiting to pursue the enemy once the infantry had broken through; machine guns had proven of more utility than horses, and, in London, Winston Churchill had convened the Landship Committee, whose ultimate product would be the tank.

As with most of us, he could not foresee what would happen as a given technology matured, but only what it could do at the time. Likewise, he could not see how the combination of several technologies would create whole new contexts and shape the future. No one could then, not even H.G. Wells. No one can now.

Friday, December 3, 2010

Fact and Fiction

What do we really know of the past? I think we all have a strong tendency to picture it very much as we have seen it in movies and television, or perhaps read it in a novel, and so to end up with a stilted view. I know that the Sherlock Holmes stories influence how I see the late Victorians and picture London. Likewise, the Hornblower stories of C.S. Forester, or the more recent novels of Patrick O'Brian about his naval heroes Aubrey and Maturin, strongly color the view of even professional historians about war at sea in the age of Nelson. Fortunately, both Forester and O'Brian were excellent researchers and built convincing stories on the exploits of figures like Lord Thomas Cochrane and on Nelson himself. I have seen their works cited to provide examples in more than one naval history in recent years. Perhaps the intent is simply to provide a context that many readers will quickly understand, but it does create an interesting tension between fact and fiction. How much does a historian writing this way tend to mingle the historical and fictional characters in his mind?

When I was working on my dissertation, Kenneth Branagh's production of Henry V was released. I was working on the sixteenth century, not the fifteenth, but I know it did have some influence. One of the figures I focused on was Sir John Smythe, a cousin of Queen Elizabeth, a sometime soldier, and a thoroughly pedantic and old-fashioned writer on military affairs. He advocated giving up firearms and returning to the good old longbow of Henry V's time. Shakespeare is supposed to have based the character of Fluellen, one of Henry's officers and advisers, partly on Smythe. Ian Holm played Fluellen, and did so in a way that would have fit Smythe like a glove. I am not sure how much this influenced my view of him in the dissertation, but I think it did have some effect, possibly reinforcing my view of him as a pedant. (Curiously, others think Fluellen was based more on Sir Roger Williams, another Elizabethan soldier who was diametrically opposed to Smythe, and about whom I also wrote.)

Another figure I wrote about was Blaise de Monluc, considered one of the models for d'Artagnan, even though he lived a century earlier. I do know that the fictional character made the real one more attractive to me, and perhaps made me more accepting of some of the tales of adventure he told in his memoirs.

I don't want to belabor the point, but it is good to realize that historians may be just as influenced by literature, movies, or television as anyone else. The effects may not be great, but they can subtly color a historian's depiction of the character and personality of his subjects.

Thursday, December 2, 2010

The Ticking of the Clock

In the past few posts I have been focusing on clocks in the fourteenth and fifteenth centuries. The human desire to tell time runs very deep, of course, and has an amazingly long history. It is also one of the central parts of the story of who we are and, I believe, of how our minds developed. Tonight I just want to look at one bit of it, as a way of drawing attention to the importance of time in the evolution of culture and cognition.

About thirty years ago, I became aware of the work of Alexander Marshack. Over a number of years, from the sixties to the nineties, he carefully researched and sought to explain a variety of artifacts, in bone or stone, showing deliberate groupings of markings. Because these had some aspects that indicated they were more than mere tallies, that is, there seemed to be some differentiation in the markings, he came to the conclusion that they were numeric notations, almost certainly representing some sort of lunar calendar.

His thesis was controversial because of the age of these artifacts. Some of them date back almost 30,000 years. In other words, if Marshack had it right, we have been keeping careful track of time for 300 centuries. Incidentally, it also means that we could already count well that long ago.

Now let us pause for a moment and consider what this might mean. It indicates that we have been aware of passing time, and must have had language for it, for about six times longer than we have had writing. Likewise, we have had some concept of numbers six times longer than we have been writing. There is in fact a school of thought, largely based on the work of Denise Schmandt-Besserat, that the earliest writing, cuneiform, arose out of an accounting system based on fired-clay tokens stored in clay envelopes. (Since the tokens could only be gotten at by breaking the envelopes, she argues that it became common to make impressions of the tokens in the clay before they were placed inside and the envelope was fired. Later the tokens were replaced by markings made with a stylus. There are a lot of problems with all of this, and it may not be a full explanation of the origin of writing, but we do know that writing long remained largely a vehicle for taking inventories and settling debts.)

Around the time we began to alter our sense of time with the clock, Europeans were learning the concept of zero through the very slow spread of Hindu-Arabic numerals and changing their record keeping by developing double-entry bookkeeping. These are aspects of the information revolution that marked the years between 1300 and 1600, and we need to keep them in mind. But, and here is the point to which I wish to direct your attention, and whose importance is only beginning to dawn on me, these represent changes in our cognition. I am not yet sure how large these changes were, nor am I sure how they fit into the larger pattern of changes wrought by technology, art, and humanism in those years; nevertheless, given what has followed, I think it worth pondering what changes to our minds were brought about by something as seemingly insignificant as the ticking of the clock and the notation of nothing (zero).


Further Reading:

There is a good, brief essay about Marshack's work by a friend and collaborator of his, Michael Hudson, from UMKC - "After the Ice Age: How Calendar-keeping shaped Early Social Structuring."

For Denise Schmandt-Besserat, see her web page at the University of Texas.

Wednesday, December 1, 2010

A Matter of Timing

If mechanical clocks had not been invented when they were, would they have eventually been developed? A friend of mine insists they would have been; I am inclined to agree, but am not certain of when. The idea of equal hours was already in existence, for time candles were based on it, and those had been around since at least the ninth century. Sand glasses were invented about the same time as the mechanical clock and were also based on equal hours. Very elaborate clepsydras (water clocks) had been built that moved figures around to tell time. They were in many ways similar to mechanical clocks, but they were terribly prone to error due to temperature variation. In all likelihood, the idea of falling water gave rise to the idea that a clock could be built using falling weights. (Spring-driven clocks came a bit later.) But here's the thing: elaborate clepsydras had been known and built for well over a thousand years, but prior to the thirteenth century, we know of no attempt to build a weight-driven clock.

So, the question is, when would this have been done? We cannot know. It might have waited a few decades or a few hundred years. The medieval and Renaissance mind was fertile when it came to inventions. Perhaps only the needs of long-distance navigation would have forced the issue. In that case the outcome of Cloudesley Shovell's demise might not have been the race to create the chronometer, but the race to invent a simple clock. The idea of the watchmaker God (God conceived of as a craftsman who built and wound up the world, then sat back and watched it go) would then not have arisen in the previous century, and an important underpinning of the philosophy and science of the early modern world would have been missing. That leads to the unknowable question, what would have replaced it? Would it have taken philosophy and science in a different direction?

Matters of timing are important. We have a good idea of what happens when an invention comes along before the world is ready. Ada Lovelace wrote what is now considered the first computer program in 1842-1843, but the machine for which it was written was never completed, and her effort bore no fruit for over a century; it remains in the realm of inspiring historical curiosities. What we lack is a good idea of what happens to inventions that are born late. Even more importantly, we have difficulty imagining how the lack of one idea at a critical time, perhaps an idea inspired by an invention, might have affected history, but that is a subject for consideration at another time.