Saturday, September 21, 2013

Walled Gardens

Occasionally I read something that bothers me in obscure and subtle ways. Such things gnaw at me and lead to reflection. Recently an article at an important journalism education site (Poynter) by a distinguished e-journalist and professor (Bill Adair, who has created a Pulitzer-winning news site) provoked such a reaction. The only way I can explain it is to say that it strikes at the heart of my beliefs about books: how we interact with them, and how important they are in shaping our patterns of cognition (and hence our fundamental construction of reality).

Adair writes about his disappointment in current ebooks and discusses the kind of multimedia ebooks he would like to see. He begins with a biography of Bruce Springsteen that he feels ought to have included recordings of the music. Given the mess that music copyright has become, the author did not try to incorporate audio files, but agreed with Adair that this should happen in the future. Perhaps this would have been a good idea, but the lack of music forced Adair to go out and construct his own soundtrack as he read the book. Adair says he downloaded several albums. I wonder if he also searched out interesting and different performances of these songs on the web, performances of which he was previously unaware. Is that important? The difference between the two behaviors marks two distinct levels, or kinds, of engagement with the book and the music.

The second book Adair read is Dan Brown's Inferno. He understandably wanted to see maps, illustrations, and other materials described in the book. He did find a website that pulled together this material, but felt it should have been built into the ebook. Unlike the Springsteen bio, what he wanted here was not too different from an illustrated edition of a print book, though he would have liked animated maps showing the movements of the characters. I find myself more sympathetic to this than I do to his desire for a soundtrack, partly because of the tradition of illustrations in books and partly because of a peculiarity in the way I relate to music.

Adair confronts me with a fundamentally different perspective on the book than the one I have evolved over the half-century of my existence. In this article, Adair gives the impression of wanting ebooks to be self-contained vehicles for consumption. I'm not sure that is his intention; some comments towards the end of the article point beyond that; but it is how it initially struck me. Quite possibly I misrepresent him in some important ways and am merely using him as a stalking horse.

Books have been at the center of my life since infancy. I was read to even before I was born, and many of my most vivid childhood memories are of being read to or reading. That seems to be true for a lot of people, each of whom has their own understanding of, and relationship to, books. Three books that I was given as a child and teenager shaped my perception of books in ways I did not then understand. I was a precocious reader, and when I was eight my grandmother Reed gave me a set of the full Encyclopedia Britannica. This was not the children's edition, but the complete, adult, 1968 edition. It sat on the shelves of my bedroom closet where I could easily reach it. My mother still laughs about seeing me sitting cross-legged on the bed with volumes of it open in front of me. In the days before Wikipedia it was my place for quickly finding information. Anything I wanted to find was seemingly in its pages, and it created in me the habit of looking things up as they struck me. Today it sits, cherished but rarely used, on the bottom shelf of a bookcase in my living room, replaced by the Internet. From those volumes I developed a lifetime's habits that have carried over into the age of the web, but I also gained an understanding of how all information is interrelated and how one must go beyond the confines of a single book, even one as voluminous and authoritative as the Britannica.

Years later, the Britannica was inadvertently involved in another important lesson about the authority of books. In working in early modern European history, one inevitably confronts the witch craze and its difficult historiography. One of the stranger incidents is the witch-cult theory of Egyptologist Margaret Murray that rose to prominence in the 1920s and 1930s. Briefly, she held that there had been an organized pagan cult that survived from the neolithic to the end of the Middle Ages in Europe. Her influence on academics was never large, but her ideas were propagated to the wider public (being embraced by the emerging neopagan and wiccan movements, as well as by numerous horror writers and movie makers) both through her books and through having authored the article on witchcraft in the 1929 Britannica. She fell into complete disrepute academically in the Sixties after the publication of Elliot Rose's critique of her work, A Razor for a Goat. This drove home to me the transitory nature of the authority of books, but also something of the complex web of relations between books.

The second of the three works that shaped my understanding of books at that early age was W.S. Baring-Gould's The Annotated Sherlock Holmes. Baring-Gould collected all 56 original short stories and four novels, along with several essays on pressing subjects (such as the exact number of wives Dr. Watson had), into a wonderful two-volume edition, the broad margins of which were filled with notes on obscure points of Sherlockian chronology, names, places, events, objects that were no longer so familiar (such as antimacassars) and why they were so named, and the values of old British coins. (I gained a solid working knowledge of pre-decimal British coins that stood me in good stead when studying history at a graduate level.) Baring-Gould showed me how much could be gleaned from works of fiction about the world and also how those fictional worlds could interpenetrate with our own. It made me understand that scholarship and fiction could not only live together, but could also complement and supplement one another. Later, reading Samuel Rosenberg's Naked Is the Best Disguise: The Death and Resurrection of Sherlock Holmes, a deeply flawed but fascinating book on Conan Doyle and his creation, I was led back to Baring-Gould more than once. I think this was one of the first times I began to notice the interplay that can occur between books.

The third book was Brigadier Peter Young's Edgehill, 1642: The Campaign and Battle, published in the 1970s by Roundwood Press as part of a series that also includes Young's Marston Moor, 1644: The Campaign and Battle, and Margaret Toynbee's Cropredy Bridge, 1644: The Campaign and Battle. The book is a good, solid account of the two opposing armies and the actions of the first campaign of the English Civil Wars. What distinguishes this, and its accompanying volumes, from most histories is the extensive publication of contemporary sources and accounts of the battle that make up about half of the book. It is possible for the reader to have easy access to the materials the author used and to see the battle through many different eyes. It's something that I wish could be done more often. For me as a high school student, it opened up something of how the author had gone about his work, and something of the historical method, for the first time. It also gave me access to the network of connections that exist between a published book and its sources, which of course is part of a much larger network of interdependent books and sources.

Peter Young's approach in Edgehill brings us back to a parallel track to what Adair would like to see in ebooks. Young provided original source material, pictures, and maps. All of that was possible in a printed book, but of course animations, video, and sound were not. Were he alive and writing today, I have little doubt that an Edgehill ebook would at least contain animated maps, along with animated diagrams or videos of re-enactors demonstrating the complexities of seventeenth-century military drill. It might well contain performances of contemporary songs as well.

So what is it that struck me so viscerally in Adair's article and gnaws at me as I write this? I think it is a fundamental difference between two different platforms, one I've not really seen explored in essays, though surely I've just missed it. What Young was doing in Edgehill and Baring-Gould did in his Annotated Sherlock Holmes was to try to expand the web of relationships beyond the text. In Young's case, he wanted to open up the sources he had used, which in most instances would have required travel, or at least a microfilm machine, to access. Baring-Gould was trying to open up an existing text to make it more accessible to readers and perhaps more interesting. Both were dealing with the limitations of ink on paper, and, as happens so often, pushing up against a technical limitation led to a creative solution. Fundamentally, it was about increasing access.

The ebooks Adair critiques are completely different creatures. An ebook exists on a platform that can do many other things, far beyond just presenting text and pictures. In almost all cases, the reader can instantly go out and explore the net to find additional resources. Adair is arguing that those resources be pre-packaged by the author and publisher. While that is convenient, it is also an attempt to reduce the scope of discovery by making it too easy for the reader not to explore what might be out there. Fundamentally, it is about limiting access.

More and more we are seeing this kind of approach in etextbooks from the major publishers. For them, textbooks are no longer books, but elaborate, interactive, multimedia platforms, seemingly intended to keep the student in a walled garden and dependent on their products. The etextbook today is often an accumulation of texts, interactive animations, video, online quizzes and assignments, collaborative note-taking, to the point that the instructor can almost seem superfluous. It attempts to be a totality.

Adair may not have this in mind. He is writing for journalists and in the last part of his article, he is advocating using multimedia in books as a means for journalists to bring difficult-to-access content to readers, but in the next sentence refers to Hollywood using enhanced ebooks to promote television and the recent film version of Les Miserables.

Let me suggest a scenario. Suppose Dan Brown's publishers decide to put out a special, deluxe, enhanced ebook of his best-known work, The Da Vinci Code, for its tenth anniversary. Presume for a moment that they want to provide an experience that will cause readers to pay a higher price for this ebook. They certainly have a mountain of material from which to choose. They could easily include the text, and parts or all of the movie (perhaps having the book and movie keyed to each other in such a way that clicking on a line in the book would take one to the corresponding scene in the movie, and clicking on a scene in the movie would take the viewer to the corresponding chapter in the book). They could have very high definition images of the paintings that play such a critical part in the story. There is, for instance, a 16 gigapixel image of Da Vinci's Last Supper available that allows viewers to move in for super closeups. (That works out to more than one pixel for every 4/1000ths of an inch.) While the file size would make it prohibitive to include the whole image in an ebook right now, it would allow the publisher to include very high-quality closeups of specific features of the painting mentioned in the book. The ebook might also contain any of a number of essays (in print or video) that take readers through the symbolism and history of the book. Add to that an online forum where readers could discuss the book, maybe even chat with the author on occasion, and the publisher would certainly charge a premium. They might even be able to use the forum to market other products. But it would be closed, pulling the consumer (I'm not sure the buyer of this work should be called a reader) ever deeper into the product, rather than directing them out, to find other understandings of the art works, other interpretations of the symbols, the whole sordid history of the Priory of Sion, or even the larger world of Grail research or a reading of the Gnostic gospels.
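The pixel-density aside can be checked with a little arithmetic. Here is a rough sanity check; the mural's dimensions (about 460 cm by 880 cm) are my assumption from published measurements, not a figure from the article:

```python
# Rough check of the pixel-density claim for a 16-gigapixel scan of
# the Last Supper. Assumed mural size: ~460 cm x 880 cm.
import math

CM_PER_INCH = 2.54
height_in = 460 / CM_PER_INCH   # ~181 inches tall
width_in = 880 / CM_PER_INCH    # ~346 inches wide
total_pixels = 16e9             # 16 gigapixels

# Pixels per linear inch, assuming square pixels spread evenly.
px_per_inch = math.sqrt(total_pixels / (height_in * width_in))
pixel_pitch = 1 / px_per_inch   # inches per pixel

print(f"~{px_per_inch:.0f} pixels per inch, one pixel per {pixel_pitch:.4f} in")
```

Under these assumptions the scan resolves roughly 500 pixels per inch, a pitch of about 2/1000ths of an inch, so the article's "more than one pixel for every 4/1000ths of an inch" holds comfortably.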

In short, I am reacting to a difference between consumption and reading, between life in a walled garden and a life of exploration. This brings us back to the web of books I mentioned earlier. At some point in my life, I became aware of this phenomenon. I'm not sure when or how, but I do know that historians were actively exploring this when I was in college. Carlo Ginzburg's oddly titled work, The Cheese and the Worms: The Cosmos of a Sixteenth-Century Miller, was the first that I read. Using transcripts of two heresy trials of a Friulian miller named Domenico Scandella (also known as Menocchio), Ginzburg tried to reconstruct the books he had read, how he read them, and how these books became interconnected and exerted a mutual influence on one another (and with folk beliefs) in Menocchio's mind. (The title refers to Menocchio's belief that God did not create the world, but that it arose from spontaneous generation, just as it was believed that worms were spontaneously generated by the fermentation of cheese.)

A decade after Ginzburg, Lisa Jardine and Anthony Grafton published an important micro-study of the reading habits of Elizabethan polymath Gabriel Harvey ("Studied for Action": How Gabriel Harvey Read His Livy, Past and Present, 129, Nov. 1990). By analyzing the copious annotations Harvey made in the margins of his books, Jardine and Grafton were able to study the way he referred back and forth to them, as well as how his reading influenced an important political circle at Elizabeth's court. As with Menocchio, but in a much more organized fashion, they were able to examine the way reading one work influenced the understanding of another. In short, they were able to sketch out a small web of works and their interconnections.

My own studies at the time were attempting to unravel how printing, firearms, and, to a lesser extent, clocks influenced the cognitive world of sixteenth-century military intellectuals. The most sophisticated and studied of them, Machiavelli, gave a clear, if metaphorical, guide to the interplay of his reading and experience in a famous letter to Francesco Vettori. He describes his composition process as a conversation with various historical figures, whom we know he drew from Livy and from more recent historical works, as well as from his own personal experience as a diplomat. He constructed his longest work, the Discourses on the First Ten Books of Livy, around the great Roman historian, whose work he had known since childhood. Yet he clearly drew conclusions from it that are unsupported by Livy's examples and that must have come from his reading of others, as well as from personal experience as a bureaucrat and diplomat.

Isn't this the manner in which we all read? We do not read any one book in isolation. Instead we read one book in the context of all we have read before. I am most aware of this in reading non-fiction, and particularly history, but am conscious of it too in reading fiction. It may be that Sherlock Holmes pops up when reading a mystery, or Carl Jung when reading a science fiction novel. In some cases, there are straightforward reasons for this. Nero Wolfe and Archie Goodwin are clearly and openly a takeoff on Holmes (both Sherlock and Mycroft) and Watson. Plot elements are widely borrowed from one author of fiction to another. Rosenberg, who wrote the Sherlock Holmes study Naked Is the Best Disguise, at one point made his living as a lawyer for a film studio. One of the major lessons he learned was that when two stories share plot elements, it is more likely that they derive them from a third, common source than that one borrowed directly from the other. It's been said that most modern authors owe Shakespeare royalties.

But sometimes in fiction it is less obvious. I don't know if Frank Herbert consciously borrowed ideas about the collective unconscious from Jung in his six Dune novels, but I find it impossible to read them without thinking about Jung, as well, of course, as the explicit references to the Oresteia, since most of the major characters bear the same family name (Atreides). With the references to the Oresteia, a whole other set of associations unfolds, across the many different versions of the plays, starting with Aeschylus and concluding with Sartre, Eugene O'Neill, and T.S. Eliot, the last two with their wonderful observations on the playing out of curses, metaphorical and otherwise, and Sartre's play with its desolation reflecting back on the existential crises of Herbert's main characters. Herbert's novels, for many years the best-selling science fiction of all time, encourage this kind of exploration beyond their boundaries and reflection on their ideas. They draw one inward into oneself but also push one out to see the world as a complex ecology of living beings and developing minds.

There are many different styles of reading. It may be that a closed and self-referencing platform is exactly what some readers want. The problem, to me, is that this may become the norm. Commerce has a way of progressively reducing and homogenizing our options until they are all the same. If our ebooks must be enhanced, I want the ability to turn those features on and off. I don't want them to be intrusive, obnoxiously distracting me from the text. For instance, the faint underlining that Kindle uses to show popular highlights is sometimes annoying and something I toggle on and off. Even the page-turning animations that most reading apps offer as an option are distracting to me.

More problematic is what the walled gardens of our textbooks, and potentially our children's and young adult books, might offer to the next generation. Will they be encouraged to go out and look up things on their own, or will it all be spoon-fed to them? Will critical reading be possible for such a generation, or will they fall for the apparent totality of information presented to them? That is what bothers me about Adair's article. It is not that I believe he advocates anything like this, but rather that the direction he suggests will lead publishers down a path that ends there.

Monday, December 26, 2011

A Most Dangerous Day

Sometimes turning points go unrecognized. Likewise, some things that seem like turning points are not.

Christmas Day of 1861 may have been the most dangerous day of the Civil War for the North. There were greater panics in Washington over later events. The emergence of the ironclad CSS Virginia (formerly the USS Merrimack) at Hampton Roads the following March 8 terrified many in the cabinet, though the scale of the threat was grossly exaggerated. Lee's invasion of Pennsylvania, which culminated with his defeat at Gettysburg in 1863, was a similar crisis, and is often seen, erroneously, as the turning point of the War. Neither event posed the kind of threat Lincoln's cabinet grappled with on December 25 and 26, 1861.

The threat on those dates came not from the Confederacy, but from across the Atlantic, and from the blustering of the American political establishment. It is known to history as the Trent Affair. Capt. Charles Wilkes, already in violation of his orders, had seized Confederate representatives James Mason and John Slidell from the British mail packet ship Trent, and the Lincoln administration had imprisoned them. Not only was the arrest a clear violation of international law, and a betrayal of the principles over which the United States had fought the War of 1812 (which had resulted from Royal Navy ships stopping American ships and removing any sailors who appeared to be British subjects to serve in the long wars against France), but it was also a highly popular act of political insanity. Americans in 1861 loved "twisting the lion's tail," but few understood that the recent ironclad revolution made that much more dangerous than in the past.

In 1861, the ironclad was the new super weapon, the stealth bomber of its day, but it was so new that it was extremely rare. While we tend to think of ironclads as quintessentially American, in 1861, both the USA and the CSA had exactly none in commission. Both countries were aware of the need for them, and were feverishly trying to complete one, but only England and France possessed them. In December 1861, Queen Victoria had two in commission, one launched but incomplete, and one building, while Napoleon III had one in commission, and several building or launched but incomplete. It was already assumed that no wooden warship could stand against them. And unlike the soon-to-be-launched Monitor, these were all sea-going vessels with armor superior to anything the US could produce. Finally, since it was clear that Napoleon would declare war if Victoria did, there was no question of breaking a blockade through political maneuvering.

On Christmas Day, 1861, Lincoln's cabinet was assembled to consider the response to the British ultimatum. London had intentionally toned down its original response, but many still believed that the administration could not accept it as is. Lincoln initially favored the recommendation of Senator Charles Sumner to submit the matter to arbitration, but the French attitude made that almost impossible. There were still voices for war with Britain in the cabinet, but Secretary of State Seward wanted to release the prisoners and comply with the main British demands. No decision was reached that day, but Lincoln told Seward to bring his best arguments back to the cabinet the next day, while he would draw up the best argument he could make for arbitration. When the cabinet reconvened on the 26th, Lincoln presented no argument to Seward, accepting his position completely, afterward telling him that he had been unable to find a single satisfactory argument for arbitration. He understood that a request for arbitration would not be acceptable to Britain or France, and that he would be faced with an international as well as a civil war.

Over the course of those two days, American foreign policy grew up, and the possibility of victory in world wars not yet imaginable emerged. The future "Special Relationship" between America and Britain was still far off, but relations between the two countries pulled back permanently from the brink of war.



Bibliography:

Foreman, Amanda. A World on Fire: Britain’s Crucial Role in the American Civil War. Random House, 2011.

Goodwin, Doris Kearns. Team of Rivals: The Political Genius of Abraham Lincoln. Simon & Schuster, 2005.

Reed, Sir Edward James. Our Iron-Clad Ships: Their Qualities, Performances, and Cost. J. Murray, 1869.

Symonds, Craig L. Lincoln and His Admirals. Oxford, 2008.

Tuesday, March 29, 2011

A Hall of Mirrors

We live in a hall of mirrors, a world of augmented memories constantly recreating not only the past, but also the memories of past futures never realized. Those futures were also built on memories reconstituted according to the beliefs of their times (as are ours) with the memory technologies of their times, in turn incorporating earlier futures past, in infinite regress. Past and future are continuously revisited, re-imagined, and reconstructed as we feel it should have been, sometimes taken as fact, sometimes as fiction, but always fictitious.

Most of the time we are no more aware of this than we are that all of our memories are re-imagined each time we recall them. Memory, both personal and collective, is an act of imagination that rewrites each memory even as it is remembered. Like Heraclitus' river, it is an ever-changing stream into which we can never step twice.

This is what neuroscience tells us about memory, but we do not act upon these findings, instead continuing to behave as if memory, and thus reality, were fixed entities. Just as the ancient Greeks and most philosophers in the West have rejected Heraclitus' notions of flux and constant change as the bases for reality, in favor of eternal verities and archetypes, we will most likely reject the science that tells us that our minds are in a constant state of flux and change, forever failing to understand or acknowledge the consequences.

Curiously, we know these things to be true, as we speak often of how unreliable our memories are, and legally accept the possibility of false memories. Is it so much easier to ignore this and go on living in a world of a fixed and concrete sort? This is an issue long recognized by historians, though only a few books, such as Thomas Desjardin's These Honored Dead, about the construction and reconstruction of the Battle of Gettysburg, or Jill Lepore's The Name of War (about King Philip's War), deal with it at length.

These are no mere academic arguments; as a people that argues political and cultural positions from history on a daily basis in the national media, these are vital issues. If we are going to make cases based on the ideas of the Founding Fathers (though which ones and at what point in their lives is always a sticky point), beliefs found in the Bible (where again we are dealing with the problem of which one and even more with what point in time), the Enlightenment (with a diversity of opinion ranging from Hume to Rousseau, generally I prefer Montesquieu), Lincoln (whose ideas evolved rapidly), FDR (the idealistic pragmatist), then we need to understand how they re-imagined events themselves, and how we have re-imagined them as well. If we can't be bothered to do this, then we are simply surrendering our minds to manipulation by extremists, propagandists, and advertisers.

This is not the worst of it; the way it contributes to an inflexible mindset is the greater danger. If you think you can fit the past into a little, unchanging box, you are more likely to treat the present in the same way. In an era of rapid change and rolling crises, that is a recipe for disaster, if not extinction. Neuroscience is undoubtedly getting a lot wrong that will have to be corrected later, and a number of unsupportable claims are being based on its findings, but our own experiences of memory and perception show that it is fundamentally right about remembering being a form of re-imagining. We need to learn to act on that insight and stop ignoring it.

Wednesday, March 9, 2011

"The problem is to change the rules...."

Last weekend, I reread a favorite essay, Gregory Bateson's "From Versailles to Cybernetics." He delivered it as a speech in 1966 and published it a few years later in Steps to an Ecology of Mind. In places it reads as a jeremiad, but the overall point, and the overall tone, are something else entirely. Speaking just a few weeks shy of his sixty-fourth birthday, he considered the two most important events of his lifetime to be the 1919 Treaty of Versailles and the emergence of cybernetics in the half-decade after the Second World War. The first he saw as a great tragedy, the second a great sign of hope, though one too easily misused and abused.

The Versailles Treaty seems as pivotal today, though for somewhat different reasons, as it did forty-five years ago. To us, its importance results from its boundary drawing in western and central Asia; to Bateson it represented a series of attitudinal changes, including important ones about how international relations should be conducted.

He saw cybernetics as having a similar impact, that is, as having changed attitudes about how the world should be run. But he also warned that the use of cybernetic theories (specifically game theories) to determine the parameters of international relations was quite dangerous. He might have added that using those theories in that way to determine any patterns of behavior is hazardous.

We tend to see cybernetics in a very limited and narrow way. It isn't just about computers, or games, or even thermostats, but about complex relationships, which is what computers, games, and thermostats analyze. Your immune system is just as much a cybernetic system as any supercomputer. Cybernetics is really the philosophy and study of systems that adapt to their environment.

One reason it represented such attitudinal change was that it broke with older ideas of causation. (If you think concepts of causation are unimportant or incidental, I would direct your attention to the debates over evolution and creationism, to the arguments concerning the causes and remedies of the present economic and ecological crises, to most of the developments in cancer research over the past few decades, to the causes of almost any war you care to mention, or simply to how a flower grows.) As long as we saw cause as rather direct and acting in one direction, even if it might have multiple effects (so-called billiard-ball causality, as the arguments resembled a cue ball striking and transmitting motion to one or more other balls), we could have only the most limited understanding of complex systems. We were stuck with either trying to reduce everything to simple logical or physical systems, or throwing up our hands and shouting "Deus vult!" ("God wills it!")

Cybernetics added information to the equation; it developed the idea of feedback (and feedback loops) and made possible the understanding of complex interactions within cells that become cancerous, and between them and the immune system, itself dependent on a variety of feedback mechanisms. It permitted us to look more closely at the events that led to both World Wars, the arms races, the diplomatic miscalculations, and the societal and psychological states that produced them.

Unfortunately, Bateson saw in his own time that, instead of being used to interpret the detailed complexity of political, economic, or social systems, cybernetics was employed in a deterministic way to give advice about what to do. Too few variables were allowed into play. The games and simulations were based on reductionist assumptions about human behavior. The results could have led to nuclear war. Used this way, cybernetics simply reinforced the existing rules of the game. It could not lead to a way out because it did not allow for new rules. In his words, "The problem is to change the rules...." Cybernetics could be (can be) used to lead to greater flexibility or greater rigidity. (Steps to an Ecology of Mind, 1972, p. 477.)

We remain in much the same situation today. Limited variables, limited rules, and limited choices are considered viable and acceptable. That should be unacceptable, and, unless feedback kicks the social, political, and economic systems into new states through a chaotic series of events, the results are likely to be dire.

Wednesday, January 26, 2011

Change Was Not Slow

In my last posts (now some weeks past), I touched on ways Renaissance thought and behavior differed from our own. One habit of our minds is to see the past as homogenous, changing slowly, and constantly becoming more modern (what we used to mean by "progress" before we became disillusioned about it). It's an illusion born of our own lazy rationalizations. The idea that change was slow would have come as a shock to many people in many places in the past. To them the world could be wild and unpredictable.

Suppose you had been born in London in 1600 and lived to the ripe old age of one hundred. (There were a few centenarians around, though far fewer than today.) You would have lived through one change of dynasty, multiple civil wars, revolution, the beheading of one king, nearly a decade of military dictatorship, a major outbreak of plague, a fire that nearly destroyed your city, an invasion, and the overthrow of another king. You would have seen the birth of the modern British political system in the fights between Whigs and Tories (the latter still going very strong), the issuance of a Bill of Rights, and the balance of parliamentary and royal power.

Meanwhile, almost all of the English colonies in North America were founded, Boston became a major city, British interest in India began with the acquisition of Bombay, and there was an abortive attempt to gain and maintain a foothold in North Africa. Economically, you would have witnessed the birth of stock markets, the first modern economic bubble as tulip speculation ravaged the Dutch economy, and the creation of the Bank of England (along with modern methods of financing war and government debt). Militarily, the entire way wars were fought changed on both land and sea, as guns and ships were improved, pikes were replaced by bayonets, sieges were reduced to an exact science, and jurists sought to impose order and restraint through generally accepted laws of war, to prevent a repetition of the horrors of the wars in Germany and the Low Countries during the years of your middle age.

Technologically, there were practical experiments with submarines, vacuum pumps were invented (establishing that vacuums do exist and proving Aristotle and the Catholic Church wrong), and Galileo would discover the regularity of the pendulum while others put it to use to make more accurate clocks. With the telescope he would discover the moons of other planets and their imperfections (again refuting Aristotle). Hooke and Leeuwenhoek would discover minuscule worlds with the microscope. The scientific method would be formulated and societies created to promote it, Newton and Leibniz would lay out the rules of calculus, Kepler would do the same for planetary motion, and Newton would follow up with laws of motion and gravity.

All of that, plus the political philosophies of Hobbes and Locke, religious radicalism advocating pacifism and communism, visible climatic change (it got much colder and the Baltic repeatedly froze), and rapid changes in style and fashion might have come to dominate your life. Nor should we forget the decline of witchcraft persecutions, which had reached their climax in England in 1644-45 and in New England in 1692 (though the last prosecution for witchcraft in England came as late as 1944).

It was hardly a boring time to be alive. The political change was as rapid as anything we have experienced. Social and intellectual change was almost as great. All of these spheres saw great swings of the pendulum. Can we really think that change was always slow and moved in only one direction?

Thursday, January 6, 2011

Imitation Not Affectation, Pt. 3

In the first two parts of this essay, I strove to define mythic or creative imitation based on my reading of both Iain McGilchrist and J.E. Lendon, considered it in the context of Machiavelli and Durer, wondered if it might have some relationship to Tolkien's idea of "sub-creation," and suggested it is closely allied to the principle of anagogy, which is seeing or moving through and to a higher level of thought or understanding. Having used Durer as an example, I will continue with him.

For centuries, one of Durer's most famous and most enigmatic pieces, Melencolia I, has defied full explanation. The tools on the ground are mostly carpentry tools, symbolic of Saturn, who as a god and planet ruled melancholics. The figure of the winged woman is likely to be the soul, while the ladder propped against the structure indicates unfinished work. The sphere, polyhedron, dividers, and magic square may be multivalent, referring to the soul's work, the search for harmony, or the movements of the heavens (specifically Saturn or Jupiter).

The whole engraving may be read allegorically, though all such interpretations remain somewhat confused, as is the composition. Wojciech Balus pointed this out and focused instead on the way Durer deliberately manipulated compositional elements in the engraving to create a strong sense of "the undecidable": the unsettled, chaotic feeling of melancholy. (See: Wojciech Balus, "'Melencolia I': Melancholy and the Undecidable," Artibus et Historiae, 15:30 (1994), 9-21.) I think Balus is correct but does not go far enough. The picture is constructed to invite us to share the emotion, to move into and through it, and to imitate it.

We are being invited into the artist's mind. Put another way, the print becomes a powerful form of non-verbal communication, one dependent not on a "reading" of the symbols but communicated by the disharmonious composition and comprehended by the human abilities of empathy and mimesis. These are the means of putting ourselves into the mind of another, of understanding melancholy by imitating the artist's state of mind. Imitation (per McGilchrist) is fundamentally a matter of empathy and mimesis, by which we may learn anything from deep emotions to the skills of the most complex disciplines and professions.

I have pursued the theme of imitation in Durer because he gives it masterful expression, allows us to see different sides of it, and also because he illustrates another theme from McGilchrist essential to understanding the Renaissance: the idea of the "skill that hides itself." This may sound like an odd notion, but we are all familiar with such skills in some basic form. As Sam Willis notes in The Fighting Temeraire (Pegasus Books, 2010, 136-137), these are akin to learning to ride a bicycle: they may be imitated but never fully taught, and they cannot be readily described.

We have a tendency to think of skills as things, precisely definable and measurable, absolutely testable and certifiable. (This misunderstanding of skill is one of the points at which our modern educational system, indeed our whole culture, has begun to crumble.) The ancient, medieval, and early modern world had a different, more organic understanding. Looked at this way, a skill was a mystery mastered over time, an indissoluble part of a whole body of lore and knowledge; the expression of that skill, whether in a painting, a piece of joinery, a necklace, a book, or anything else, was proof of mastery, a matter of quality, not of abstract testing. Most importantly, it became implicit knowledge, very much in the sense of David Bohm's Wholeness and the Implicate Order (Routledge, 1980). What this means is that it was internalized to the point at which the execution of the skill became effortless and "self-hiding," in McGilchrist's formulation. The ultimate self-hiding skill for the Renaissance man or woman was mythic, creative imitation.

So let us return to the example of Durer's Melencolia I. His skill is hidden on multiple levels. We have already seen how he manipulated conventional compositional techniques to create the desired effect. Balus goes into great detail on this, showing, for instance, how Durer raised the vanishing point of the perspective to create a sense of unease, and how he complemented this by removing many of the horizontal lines that would visually point to the vanishing point outside the frame of the picture. The extent of Durer's concealed skill does not end there. The polyhedron that dominates part of the print is extremely complex and very difficult to draw in perspective, a real virtuoso performance given the tools and techniques of the day, as Terence Lynch observed in the 1980s. (Terence Lynch, "The Geometric Body in Durer's Engraving Melencolia I," Journal of the Warburg and Courtauld Institutes, 45 (1982): 226-232.)

Durer also gives us important clues to how self-hiding skills and imitation declined and were eventually replaced by modern concepts of measurable, certifiable skills and training. After his years in Italy, he had acquired a solid grounding in the new techniques of perspective. He capitalized on the market for instructional texts that had flourished since the last decades of the fifteenth century, publishing books on the proper way to draw in perspective and to create mechanical drawing aids. This was part of a flood of books that explained the "mysteries" of many crafts and created an atmosphere in which it became possible to reduce those mysteries to simple sets of skills that could be taught to anyone. The foundations of the rationalization of work had been laid.

Once skills could be reduced to instructions, the methods for learning them changed. They could be learned in parts and with a much reduced attention to imitation. The importance of imitation was gradually reduced over the subsequent four centuries, replaced by modern methods of teaching and working, and left as the preserve of small children and certain professions. Method could replace skill and deep understanding. If McGilchrist is correct, then it also represented a transition from a way of learning and understanding that privileged the strengths of the right hemisphere to one that relied more heavily on the left. Whether he is correct about this or not, it made possible key aspects of our modern world, with its emphasis on ease of learning, exact reproduction, rationalization of work and knowledge, and mechanization. We are materially richer for that, but we are in crisis; what comes next?

Sunday, January 2, 2011

Imitation Not Affectation, Pt. 2

In the previous post, we looked at mythic (or creative) imitation, noting that it is not what we normally mean by that word. It is a deep psychological process of absorbing, re-imagining, and reinterpreting the character of another person or culture from the past into one's being. In the Middle Ages and Renaissance, this was tied closely to the Imitation of Christ, but like most human phenomena, imitation is nuanced and many-hued.

Albrecht Durer was the leading Northern Renaissance painter and probably the greatest master of the woodcut. Among other distinctions, he composed more self-portraits than anyone prior to Rembrandt. The earliest is a silverpoint he did at the age of 13, but the one I find most interesting and disturbing is the "Self-Portrait in a Fur-Trimmed Coat" (1500). As with some of his depictions of Christ, such as the "Self-Portrait as the Man of Sorrows" (1522) and his "Head of the Dead Christ" (1503), where he used himself as a model, this self-portrait, through the use of formal compositional elements borrowed from depictions of Jesus, is a conscious portrayal of the artist as Christ. It intrigues me more than the "Man of Sorrows" because of the detail and the haunting gaze. Are we to take this, as Erwin Panofsky and Roland Bainton did in the 1940s, as an Imitation of Christ? Bainton wrote of this and other works: "Such examples enable one to comprehend how fully the men of that generation moved in a perpetual Passion Play in which each and all might take the role of Christus." (Roland H. Bainton, "Durer and Luther as the Man of Sorrows," Art Bulletin, 29:4 (1947), 272.) That would place it solidly in the tradition of mythic imitation previously discussed.

Should we see this in the context of Renaissance individualism? Could he be likening the artist's creativity to the deity's? Is he seeing himself as a reflection of Christ with a clear-eyed stare into the mirror? I think all are possible and may have co-existed with the Imitation of Christ in Durer. We are seeing something of the beginnings of the modern ego there, but tinged with humility, piety, and imitation; not the form it would take a century and a half later, when Newton drew prideful parallels between himself and Christ.

It is too easy to read our own ideas into Durer's work, but an element of the Imitation of Christ, or, as Bainton put it, of living out a passion play, is clearly part of this. That much is evident in some of his later works, where he identifies his own illnesses with Christ's suffering. But I am also suggesting that, at least in the 1500 self-portrait, he is commenting on his own talent and creativity by creating a formal portrait that is a compositional imitation of paintings of Jesus. There may be an element of ego in this; I think that is why I am bothered by his eyes, but there is much more. The closest recent equivalent I can find is in Tolkien's writing on what he called "sub-creation". This is the idea (I am simplifying here) that no human can be truly creative in any independent sense, but rather that all human creations must be a reflection of the Creator. If the sub- or secondary creation were complete and convincing, it would not require suspension of disbelief and would contain its own truths, reflecting higher truths. This allowed Tolkien to create a world that was not Christian while still reflecting his very devout Catholicism. (See his 1938 essay "On Fairy-Stories" in J.R.R. Tolkien, Tree and Leaf, Houghton Mifflin, 1989.)

Although it is a considerable extension of his idea, I think the best way to understand both Tolkien's and much of Durer's best work is to consider it anagogic. That includes not only the self-portrait of 1500 but also Durer's amazingly detailed nature studies, such as his famous Young Hare, The Little Owl, and even The Large Turf (literally a watercolor study of a piece of dirt and grass). There is a tradition going back to early medieval times of four ways of reading the Bible: literal, allegorical, tropological (moral), and anagogical. We are all too familiar with the literal and the allegorical. The tropological way of reading the Bible involves finding moral parallels that move beyond mere allegory, metaphor rather than simile. If you can find anagogic in a dictionary at all, you are likely to find something about a mystical reading of the Bible. It is an act of seeing (literally, of being led) through to a higher reality beyond the word or image. The four modes of interpretation naturally became models for interpreting the Book of Nature as well as holy writ. Both Tolkien and Durer engage in allegory on occasion, but at their best they go beyond it, seeing through the reality in front of them to the numinous beyond. That is what I believe happens in truly mythic imitation: it leads us through or beyond ourselves into a new level or mode of understanding, of being. But that takes us to the next part of this discussion, and that is another post.