Posthuman Life reading group (Summer 2015) by Phil Percs

Not sure why I didn’t notice this before:

Posthuman Life reading group (Summer 2015) on David Roden’s new book.

All beings so far have created something beyond themselves; and do you want to be the ebb of this great flood and even go back to the beasts rather than overcome man? What is the ape to man? A laughingstock or a painful embarrassment. And man shall be just that for the overman: a laughingstock or a painful embarrassment…

Man is a rope, tied between beast and overman–a rope over an abyss…

Speculative Posthumanism and “Dark Phenomenology” (Debbie Goldgaber)
Interview with David Roden (friend of the blog R. Scott Bakker, at Figure/Ground)
Announcement Post (Debbie Goldgaber)

July 8: Intro and Chapter 1 (hosted by Debbie Goldgaber)

July 15: Chapters 2 + 3 (hosted by BP Morton)

Queering the Human: Is the Transhuman already here? (B.P. Morton)
Dark Posthumanism I: summer’s ice (David Roden, at enemyindustry)

July 22: Chapters 4 + 5 (hosted by Debbie Goldgaber)

The King of Weird Futures (Rick Searle, at Utopia or Dystopia)

July 29: Chapters 6 + 7 (hosted by Debbie Goldgaber)

A Galapagos Objection to Speculative Posthumanism? (Jon Cogburn)
Posthuman Life: The Galapagos Objection (David Roden, at enemyindustry)

August 5: Chapter 8 and concluding remarks (hosted by Jon Cogburn)

Hosted by Phil Percs.

Cognitive Ecology: The Hard-Wiring of Human/Natural Systems

How does the environment shape the ways an animal processes information and makes decisions? How do constraints imposed on nervous systems affect an animal’s activities? My friend R. Scott Bakker of Three Pound Brain, commenting on my post on Kevin Kelly, mentioned Daniel Dennett’s recent critique of the notion of the Singularity at Edge.org: http://edge.org/response-detail/26035. Expounding on this, Scott said:

I think Dennett completely misses the point of the Singularity, but his view on the problems arising out of the proliferation of special AI strikes me as sound.

Dennett’s never sat down to work out an understanding of cognitive ecology, so I think what he says suffers for want of clarity. But this is what he’s angling at, and to that extent I’m inclined to agree with him. His whole position (like mine) turns on evolution sculpting pre-established harmonies between biological systems. Now we’re in the process of demolishing those harmonies. His point is systematic, even if it doesn’t come across that way.

Habitat destruction is a useful analogue.

A ‘cognitive ecology’ consists of those environmental information structures prone to cue various heuristic systems, systems adapted (via evolution/learning) to solve on the cheap. Their economy derives from their selectivity, the fact they need only be sensitive to certain information, and can neglect everything else. They can neglect everything else, take it for granted, simply because, ancestrally at least, it always remained both sequestered and fixed.

This is why neuroscience and AI pose the myriad conundrums they do. And this is why a neuroscientifically rationalized world filled with AI will very likely scramble our ancestral heuristic regimes. All the lacunae and the continuities we evolved to take for granted can no longer be taken for granted.

What I take away from this, and from Dennett, is the fear that we are ceding too much authority to our external intelligence systems (from Dennett’s post):

What’s wrong with turning over the drudgery of thought to such high-tech marvels? Nothing, so long as (1) we don’t delude ourselves, and (2) we somehow manage to keep our own cognitive skills from atrophying.

(1) It is very, very hard to imagine (and keep in mind) the limitations of entities that can be such valued assistants, and the human tendency is always to over-endow them with understanding—as we have known since Joe Weizenbaum’s notorious Eliza program of the early 1970s. This is a huge risk, since we will always be tempted to ask more of them than they were designed to accomplish, and to trust the results when we shouldn’t.

(2) Use it or lose it. As we become ever more dependent on these cognitive prostheses, we risk becoming helpless if they ever shut down. The Internet is not an intelligent agent (well, in some ways it is) but we have nevertheless become so dependent on it that were it to crash, panic would set in and we could destroy society in a few days. That’s an event we should bend our efforts to averting now, because it could happen any day.

The first is the notion that we humans have a tendency to anthropomorphize things, to grant a life and intelligence to objects and processes that essentially have none, that are neither human nor remotely concerned with human emotions, behaviors, or intentions. Second, Dennett fears that if we stop exercising our brains as environmental adaptive systems, hard-wired over the eons to perform specific functions in our dealings with the environment, we may lose those capacities that are not hard-wired at all but are culturally transmitted, learned through generational techniques of memory and performance (ritual and mimesis, among others).

What Scott is referring to is that we are severing ourselves from both of these relations of mind and learning, replacing them with a dependence on technical mediators and machinic agents to do the work we once did ourselves: allowing the machines to do our thinking for us, and off-loading the performative work of knowledge transmission and memory storage into machines rather than our brains.

For Scott, we are in the process of demolishing the pre-established harmonies between biological systems, harmonies that are the product of millions of years of hard-earned biological engineering we call evolution. In our time we are relinquishing this natural heritage and entering an artificial one that severs our roots in biological systems altogether.

But I wonder whether this process is new at all. Haven’t we been slowly evolving external systems of mind and knowledge for thousands of years, ever since we developed language, writing, and other external storage: clay tablets, papyrus, paper, microfilm, disk storage, up to current experiments in atomic inscription? Is this something new, or is it rather the end point of a curve that started millennia ago? I’d suggest that we are entering the final phase of a mutation and transformation begun thousands of years ago, rather than a sudden disruption of the artificial/natural divide. I would even wonder whether we have ever truly been natural at all. Haven’t we always been the one creature on this planet without a harmonious presence in Nature, at odds with our environment, needing clothing and protection from the elements unlike most other organisms? Have we ever been natural?

As Walter J. Ong said years ago: “In a writing or print culture, the text physically bonds whatever it contains and makes it possible to retrieve any kind of organization of thought as a whole. In primary oral cultures, where there is no text, the narrative serves to bond thought more massively and permanently than other genres.”1 We know that there are two forms in which cultures have transmitted their knowledge: 1) ritual and performative art; and 2) writing or other graphic, iconic, or material symbolic transmission. This is nothing new.

We also know that various cultures have produced either open or closed forms of knowledge transmission, some based on elite and controlled circulation, others on more democratic and open forms. Books like Paul Connerton’s How Societies Remember detail such systems.

In some ways it comes down to how we organize information in societies. At first glance, the universe seems hostile to order: thermodynamics dictates that over time order, or information, disappears. Yet we have built effective systems that fight this entropic dissolution of order. One of the central engines of civilization’s growth was the data storage device we still use: the library, or archive. Such repositories extend back through many civilizations, and it always came down to how those civilizations used them and organized the information within them in this battle against disorder and entropy. As Roman civilization crumbled, the Catholic Church accumulated and transmitted the treatises it deemed worthy of inscription through its vast library storage systems. Yet it used a system that was elitist and secretive in most ways, containing knowledge in Latin, a dead language, while the populace of the realms spoke vernacular languages and never learned of these works. Only later did certain Latin scholars, working among themselves, slowly uncover and begin to vernacularize such knowledge. I won’t go into the details here.

So, in this sense, the slow awakening and democratization of knowledge during the Renaissance opened the door onto learning and memory that had been closed off since the days of Greece and Rome. In our time the age of print is coming to an end, with libraries and other institutions of physical storage slowly fading away as new electronic and digital systems migrate the vast storehouse of information and texts from print to digital form. Many in our culture see this as both a threat and an error; humanists and old-school liberal educators deplore the effects of this new medium on life and memory.

We have also barely begun to understand this new medium and the glut of information pooling into vast data-enclaves over which humans no longer have much control. That this Big Data world is prone to abuse at the hands of commercial and governmental organizations is just beginning to surface, as the Snowden affair reminds us. Yet my deeper concern is that we have no backup form of information storage that would keep that knowledge permanent without depending on electricity. If the electric grids of the planet went down, this digital environment would collapse in short order, and with it the accumulated knowledge of many civilizations. This, to me, is a global concern.

The other aspect is the performative knowledge that can never be put into digital form: the ritual practices of cultural memory that have never been part of the written or oral record, but have been maintained in mimetic and other physical forms across thousands of years of iterative practice. What of these?

In his recent book Posthuman Life: Philosophy at the Edge of the Human, David Roden, speaking of Stelarc’s notion of extending intelligence beyond the Earth, tells us that these performances decouple the body from its ecology and from the empathic responses of observers – even when dangling from skin hooks over a city street, Stelarc never appears as suffering or abject. They register the body’s potential for “off world” environments rather than its actual functional involvements with our technological landscape. Space colonization is not a current use-value or industrial application, but a project for our planned obsolescence:

The terrestrial body will be obsolete from the moment a certain sub-population feels compelled to launch itself into an impossible, unthinkable future of space colonization. To say that the obsolescence of the body is produced is to say that it is compelled. To say that it is compelled is to say that it is “driven by desire” rather than by need or utility.2

As Roden remarks, the “basis of our interest in becoming posthuman is not our formal responsibility to current or future members of our species; any attempt to account for the posthuman is a necessarily irresponsible risk to the integrity of the species” (ibid.). What he’s saying is much the same as Scott: that we are disconnecting the hard-wired biological systems that have governed our evolutionary heritage for millions of years; the “integrity of the species” is at stake.

As we disconnect from both our biological systems and our capacity to know and learn, migrating these into external artificial agents, or else allowing those agents to take over thought and reasoning and make decisions for us ubiquitously, one must ask: What are we giving up for such conveniences? Are we giving up our humanity? If we become so dependent on external memory systems and integrated AI decision systems to do our thinking and selecting for us, what will we become? How is this affecting our own cognitive ecology? Will we devolve into systems of stupidity? We may still retain the ability to use and retrieve data, but will we become more machine-like in the process: more automaton than human, versed in the latest technologies of search and selection, but unable to reason about these processes? Will we follow the gradient curve of laziness, let the machines think for us, accept only what is given to us, and trust them to make the calls we are no longer willing or able to make for ourselves? Will we give up our ability to think and reason altogether?

Will most of our intelligence be pre-processed for us, outside our control, by algorithmic agents and self-learning mechanisms, agents that control the selection, sorting, retrieval, collation, and processing of the archival knowledge we have access to, while deciding whether we have the coded access to that level of information in the first place? What are the legal ramifications? Will knowledge be so thoroughly commercialized that one will pay for research access through a corporate account? Will all knowledge and power be bound to corporate and governmental control? Will we no longer be free to investigate and learn on our own, but be dependent on pre-selected information gathered, collated, analyzed, composed, and served up to us as knowledge without our consent or control? Will we become slaves of machinic intelligence, by design and ubiquity, without ever realizing what is happening before it is too late? Will these non-intentional, non-conscious systems and algorithms become our agents of tyranny without even knowing their own role in this enslavement? Will the rule of Code become our next invisible dictatorship, our lives coded and reprogrammed to serve these new ubiquitous commercial research and learning regimes? Will we become so enamored of our toys that we give up our remaining freedoms to the systems of knowledge and power in which we’ve invested so much desire? Or will our children become so normalized to this native world of artificiality that a cultural amnesia sets in, when such institutions as libraries and the freedom to explore on our own become relics of the past? Will our children and their children grow up in technological enclaves and never realize that their parents once roamed the wild lands of knowledge?

If we are beginning to decouple our biological systems from the environment that has constrained our thinking and being since the beginning of our evolution, what will this mean? Are we entering dangerous territory by dissolving the core relation between environment and knowledge? If the ubiquitous Infosphere becomes our new environment, an artificial environment with knowledge systems hidden in every object, so that our dependency on it scrambles the signals hard-wired into our biological systems, what effect will this have? When I read about AI, Smart Cities, Big Data, and all the other intelligent agents that have become so much a part of our economic, political, social, and spiritual heritage, what does it mean for us to disconnect from the natural and mold ourselves to this new artificial world? What if we wake up and our children have become so normalized to this Infosphere of ubiquitous computing that they will never know there was a natural order outside the realm of information? Are we ready for such a brave new world?

  1. Ong, Walter J. (2007-03-16). Orality and Literacy: The Technologizing of the Word (New Accents) (p. 139). Taylor & Francis. Kindle Edition.
  2. Roden, David (2014-10-10). Posthuman Life: Philosophy at the Edge of the Human (Kindle Locations 4331-4334). Taylor & Francis. Kindle Edition.