Thursday, October 26, 2017

The Robot Sophia and Misunderstanding Technology

I am really bothered by the amount of attention the android Sophia is getting for dumb publicity stunts. First off, the blatancy of the stunt (making a robot a citizen of a country) is horrendous. So is the idiotic banter with the robot. But really, we've had robots (and robotic weapons) for a century and a half. In fact, I would argue that the first real robot was a weapon perfected between 1866 and 1868: the locomotive (or Whitehead) torpedo. In its original form it was self-propelled and could correct its own depth (though it could not correct its course until gyroscopes were introduced some three decades later). Its first actual combat use came 140 years ago this past May, when the Peruvian rebel ironclad Huascar was unsuccessfully attacked with two torpedoes by boats from HMS Shah.

Of course, I could also argue that the first practical robot was the thermostatically controlled egg incubator built around 1620 by Cornelius Drebbel, who is also credited with inventing the first practical submarine. Thermostats really didn't take off until Victorian times, but they represent the basic model of cybernetic control.
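To make that last point concrete: a thermostat is nothing more than a sensor, a setpoint, and a corrective action wired into a loop. Here is a minimal sketch of that negative-feedback loop in Python; the numbers and the crude room physics are invented purely for illustration, not a model of any real device.

```python
# Minimal sketch of a thermostat as a cybernetic (negative feedback) control loop.
# The room model and all the numbers here are made up for illustration only.

setpoint = 20.0      # desired temperature, degrees C
temperature = 15.0   # current room temperature
heater_on = False

for minute in range(60):
    # Sense: compare the measured temperature to the setpoint (with a small deadband).
    if temperature < setpoint - 0.5:
        heater_on = True      # too cold: turn the heater on
    elif temperature > setpoint + 0.5:
        heater_on = False     # too warm: turn the heater off

    # Act: the environment responds (crude, invented physics).
    if heater_on:
        temperature += 0.3    # heater warms the room
    temperature -= 0.1        # room loses heat to the outside

    print(f"minute {minute:2d}: {temperature:4.1f} C, heater {'on' if heater_on else 'off'}")
```

The essential thing is not the heater but the loop itself: the difference between what is sensed and what is wanted drives the correction, and that pattern is what the cyberneticians generalized far beyond heating a room.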

We have to start approaching our technologies from an evolutionary and co-evolutionary perspective and quit freaking out. Although there are some things wrong with it, primarily the author's unstinting optimism and capitalist tendencies, one of the best places to start is Kevin Kelly's 2010 book, What Technology Wants. Kelly makes a good argument (I think) for treating technology as a seventh kingdom of life, as well as sketching out some useful approaches to technology (the Amish Hacker chapter is a must-read). Conversely, it is useful to look at what cybernetics tells us about ourselves and our minds, which was part of the discussions of the (in)famous Macy Conferences. To me the most approachable starting point for that remains Gregory Bateson's work in his collection, Steps to an Ecology of Mind, and his follow-up book, Mind and Nature. He was not trying to argue that living beings are machines, or that machine metaphors are the best way of understanding mind (though he recognized them as useful heuristics), but that cybernetics is useful for understanding the continuum of mind and the nature of learning.

That leads me to the second thing I think we need to do: understand technologies as part of a much larger, interconnected web or webs, both in the way technologies interact with each other and in the way they interact with the larger informational space we perceive as the world of living things. I have written before in this blog about the importance of even simple human-machine interfaces, but we really need to understand how complex the interactions are between machines, between machines and humans, and between machines and nature. That allows us to enter into an ecological view that is very enlightening and useful. If we do not, we tend to limit ourselves to simplistic us-versus-them scenarios and to replaying the original Star Trek or 2001 in our heads. The latter, of course, does place machines, specifically HAL 9000 and the Alien Monolith, in an evolutionary perspective stretching from the Australopithecines to the Star Child.

So I think the hype and the marketing surrounding Sophia are counter-productive, as they take us only where the PR people, the developers, and the venture capitalists (not necessarily in that order, and not necessarily wanting the same things) want us to go. It is vital that we deepen and broaden our thinking about robots, AI, and all of the other technologies enveloping and transforming our lives and worlds.
