Saturday, August 5, 2023

Is 2023 a Pivotal Year?

I am trying to think when I have felt this way before. I suspect I must have, but cannot recall. Maybe it is age, or the converging foci of my professional and personal interests, but this year feels particularly significant. We are almost 2/3rds of the way through 2023. It is clear that something has shifted or is shifting in climate change and our understanding of it. Instead of a decades-long shift, we seem to have hit a sudden shift of state. We are seeing things happen in a few months that were supposed to take until the middle of the century. Despite some aspects slowing in recent years, such as the rate of population growth, it feels as if the Great Acceleration is now in overdrive. 

Meanwhile, the world political situation seems to continue to undergo ever more rapid change. Maybe I am exaggerating this, but it is clear that America is at a crisis point where it must move one way or another. NATO has suddenly added more members. Ukraine and Russia appear locked in a deadly embrace for the foreseeable future while Putin stumbles and appears more withdrawn. China remains an enigma, but while Xi flexes his muscles internally and externally, there are signs of something wrong - extensive corruption in his nuclear missile forces has forced the replacement of their leaders with more politically reliable outsiders, while the foreign minister first went missing, then was officially replaced, and now Beijing is trying to remove all official references to him. They are locked in a long-term struggle with the US, but the two economies remain deeply enmeshed. That makes for a very different kind of Cold War than the one of my childhood and youth. The War in Ukraine continues to cause global problems with oil and grain supplies. Instability seems to have increased again in the Horn of Africa. South Asia's economic challenges are being compounded by climatological disasters. It goes on and on. Oil, wheat, heat - they are causing huge disruptions, destruction, and death across even affluent countries that are more-or-less self-sufficient in the first two. 

Since last November, we have had a heightened sense of both the threat and the promise of information and communication technology (ICT). We had grown so accustomed to the constant change and penetration of these technologies in our lives that we (myself included) had missed much of what was happening. Over the past decade, first with ISIS, then the Syrian Civil War, and now the Russo-Ukrainian War, the more deeply sinister aspects of our connectedness have come into focus: first as a medium of propaganda, radicalization, and disinformation; now as a means of passive intelligence collection, reconnaissance, and targeting. The distinction between civilians and soldiers has been blurring for more than a century, was further dissolved by the long-term requirements for nuclear deterrence, and is all but gone now. The front line is no longer in Syria, Sudan, or Ukraine. It is your cell phone, your tablet, or your laptop. Information and drone warfare are the defining characteristics of today's conflict (a forever war created as much by the greed of ICT companies and the surreality of intelligence agencies as by the GWOT) just as tanks, airplanes, and submarines were of that Great War that began 109 years ago last week. Oh, and just to make matters more unstable, the richest man in the world, whose actions appear to many to be increasingly erratic, owns half the satellites in orbit, communication satellites that are now vital to the functioning of most militaries around the world. 

Of course last November marked the beginning of the hype and rapid spread of generative AI (GAI). There had been a lot of warning tremors, mostly graphical in nature, as we became familiar with the strange output of applications like Stable Diffusion, DALL-E, and Midjourney. At the end of November, OpenAI announced ChatGPT. Whatever you think of either the company or the product (or their critics), it made most of us aware of the potential for good and evil of GAI. It was gasoline thrown on a fire, or maybe on a lot of different fires. How do we react? How should we react? What does it mean? How much of it is hype? Going back to events, this time 108 years ago, I am reminded of the confusion and difficulty the British government and military had responding to the first Zeppelin raids. How to give warnings? Was it best to attack their bases or pull fighters back from the Western Front? What to do about shelters? What information to release after an attack? How to minimize disruption to war production? 

Those, however, were less intense than the situation we face today. The crisis may be less about GAI itself and more about how companies like Microsoft have chosen to implement it as part of all of their products, about how companies have chosen to use it to feed their holy cows (productivity and profit), or about how seriously it threatens to disrupt every aspect of creativity and education. The intense exploitation of human labor to prepare training data for it is another aspect. The environmental and climatological problems will be severe if usage continues to increase and if the energy and thermal efficiency of processors is not increased by two or more orders of magnitude. This week a study predicted that by 2025, GAI might use as much energy as all human workers. And, of course, it is folded into both the Cold War and, because of its ability to crank out disinformation, our forever information war. 

The responses to all of this from Left, Center, and Right have been pretty predictable, and equally inadequate. Just to take the GAI mess, I have been very attracted to the critique of it that built up around two groups, DAIR and Critical AI. I think they are right for the most part. Their months (really years) of critique are paying off in certain arenas. The press is finally beginning to understand the real dangers and pay less attention to the self-serving, existential-risk mongering of the AI leaders. Educators are tuned into their critiques. Frankly, though, I do not think they are doing much good. As with most other technologies, our society, culture, and economy embrace them regardless of cost. They may not be Don Quixotes, but it feels a bit like France in May 1940 - much depends on whether they are more like the French Army or the BEF. 

I have deeper concerns. While the view of mind (consciousness, sentience, intelligence, etc.) that some of the GAI creators and supporters espouse is hideous - call it a sort of reduction of everything to data science - there are many other models of mind out there, and it is not always clear, in fact it rarely seems clear, which one they follow. (I will try to explain my own model in another blog post, but it is derived from the one Gregory Bateson espoused, with additions to take into account later research and philosophy.) I know it is too much to ask people who are writing about these issues to go around explaining just what they conceive mind to be; it is just that it bugs me. Why do I care? It tells you a great deal about their humanity, their attitudes toward humanity, and their attitudes toward the more general issues of dealing with the other sentient, conscious, or intelligent species (and possibly systems) that we share the planet with. Critics and commentators look at the problems GAI raises for humans, often with a very good display of feminist, queer, and decolonial theory and practice, but I have seen nothing about what it might mean for other species. My impression is that they are often locked into ideas about the nature of mind that are too narrow.

We inhabit a world with a great many other minds. They may range from the simplest to ones as complex as our own. We need to also acknowledge that some theories of mind - and these do not require mysticism and can be firmly grounded in materialist notions of the world - do not see mind as confined to the individual, but as the interactions between individuals and the world around them. Once we start down that road, we are in a different place regarding other species, our technologies, what we are doing to the environment, the climate, our cultures, societies, economies, etc. Maybe there is a theoretical framework out there that can embrace all of that. I don't know. My feeling is that our present theories are both limiting and failing us. Maybe they can expand, maybe we need a new theory, maybe we just need to walk away from any kind of overarching theory. The latter is where I am. I have been there for probably thirty-five years. Instead of elevating theory to dogma or ideology, theories to me are simply tools. Like all tools, a theory is good for some jobs and not others. Just as having the wrong tools does, having the wrong theories limits what you can accomplish. While there is still a lot of good analysis going on within those theories, it feels more and more incomplete with each passing year. Maybe we need to break it apart and see what we can do with it. Maybe we need to walk away from theory for a while and work from an understanding that the problems are even bigger and more complex than we have theorized, must be approached both from a cold realism and from a deep empathy and compassion, that our best efforts will always be somewhat inadequate, and that we cannot ignore any part of the world or its systems. 

We are also going to have to change our understanding of who and what we are in more than one way. We can start by recognizing that we are part of not just a biosphere but also a noösphere. I am not using that in the sense Vernadsky and the Russian Cosmicists mean it, nor in the usage of Pierre Teilhard de Chardin - though I was raised on some of his concepts. I propose this not as a philosophy or a theology but as a practical and pragmatic concept. The biological world is full of communication and many kinds of minds, networks of mind and communication that we know cross species. We are only now becoming aware of how complex this all is. We are adding to the complexity with our computers and digital communications. We have been thinking of how those tools affect us, and we may be thinking about how their energy and other resource demands affect the biosphere, but we have done very little so far to begin considering how they play in the larger mental life-world in which they, and we, are embedded. 

What is the effect on all living minds if we are poisoning not just our physical environment but our mental environment with disinformation, misinformation, a basic distrust of reality, a basic distrust of ourselves and others? Does it make it easier for us to ignore them? Does it make it easier for us to destroy them? Does it keep us from making the connections we need both with them and to help keep them from the destruction and harm we create? 

There are some who think that GAI will alter what it means to be human and to be more or differently creative. I have a lot of doubts about that, but the moment is pregnant with possibilities. I am convinced that we will end up with some kind of hybrid creativity. How we think about them, ourselves, and the thinking world around us can lead to different outcomes, some of them probably unimaginable. Where we are now, our understanding or misunderstanding of the dramatic climate and weather we are experiencing, the levels and kinds of misinformation we accept, the theories or ideologies we hold to, the local, national, and global political situations and the infinite war - all of these will shape the sorts of creativity that emerge. 

These emergent creativities will be vitally important. They could give us tools to solve our problems and answer the most important questions, allowing us to become something more than we are, allowing the thinking and feeling world to become more than it is, or they could limit, stunt, and destroy. I firmly believe that this year, and the next two or three, are really pivotal - not just because of AI, but because our reactions to the environmental, climatological, social, and political turmoil are going to shape things for a very long time. 

I hope I am wrong. I hope that I am just blowing events and temporary conditions out of all proportion. Given that I am writing this, I obviously believe that I have at least some small insight, but I really want to be wrong. 
