On David Roden’s Dark Phenomenology

I originally made a post on FB (Facebook) on Steven Shaviro’s new book Discognition, which elaborates on aspects of Frank Jackson’s notion of qualia. At the end of that essay he mentions the work of a friend, David Roden. David is developing an approach he terms ‘dark phenomenology’, and it is this that I wish to clarify and expand upon. To do this I’ll be digressing across a spectrum of concepts, authors, philosophers, neuroscientists, etc. Bear with me…

Terrence W. Deacon, in his recent Incomplete Nature: How Mind Emerged from Matter, argues that “a complete theory of the world that includes us, and our experience of the world, must make sense of the way that we are shaped by and emerge from such specific absences. What is absent matters, and yet our current understanding of the physical universe suggests that it should not. A causal role for absence seems to be absent from the natural sciences” (p. 3). As he suggests in his conclusion, “It’s time to recognize that there is room for meaning, purpose, and value in the fabric of physical explanations, because these phenomena effectively occupy the absences that differentiate and interrelate the world that is physically present” (p. 541).

David Roden, in Posthuman Life, argues that our understanding of human agency in terms of iterability and différance “leads to a moderately revisionary (but still interesting) account of what human rationality and agency consists in. But this leads us beyond the human by suggesting how rationality and agency depend on structures that are shared by nonhuman systems that may lack the capacities associated with human agency, or have other powers that humans do not enjoy…” (p. 45).1 For Roden, first-person experience is fractured by these “dark” elements of experience, which offer “no standard for [their] own description or interpretation.”

To understand the notions of iterability and différance we need to work through the logics of ‘presence’ and ‘absence’ in the Western philosophical tradition. Most of Western philosophy from Plato to the postmoderns was based on the logic of ‘presence’ rather than ‘absence’. Deacon and, I would argue, Roden in his dark phenomenology both offer the perspective that ‘absence’, not ‘presence’, is key to our current understanding of how we build up our perceptions of the world. As David reports it, “the problem of interpretation arises because there are empirical and theoretical grounds for holding that some phenomenology is ‘dark’. Dark phenomenology is experienced; but experiencing it offers no standard for its own description or interpretation” (p. 76).

So let’s begin…

First, presence describes an original state, a state that must have come first. As I gaze out into the world I can say the world is present to my observing eye. If that is the case, then my observing consciousness must be present to my own self-reflection. It thus follows that meaning, in its purest sense, as conscious thought, must be present to me as I gaze out onto the world. Presence is therefore the main predicate of a text’s meaning (its sense or its reference), despite the fact that this meaning is always absent and in need of reconstruction through reading or interpretation.

For this reason, a second moment of presence invades consciousness as absence (i.e., in the parlance of postmodern thought: the disappearance of the world behind the veils of language, consciousness going astray, the reign of death, non-sense, irrationality). In this way gaps, absences, and deficiencies of all imaginable kinds (the structurality or play of a structure) are subordinated to a principle of presence. Is it possible to imagine an absence without reference to the principle of presence? It would be a radical absence: something always and from the beginning absent, missing, lost to experience. If there were such an absence, how could we glimpse it?

We glimpse it between repetitions, as their repeatability. If the present moment can be repeated (i.e., remembered), then preceding the present moment is the possibility of its being repeated in memory (i.e., memory itself as repeatability). So memory precedes and exceeds the present moment, which we will have remembered.

In Shaviro the crux comes here: “This leads to the ironic consequence that first-person experience cannot be captured adequately by first-person observation and reflection. ‘What the subject claims to experience should not be granted special epistemic authority since it is possible for us to have a very partial and incomplete grasp of its nature.’”

This “incomplete grasp” of nature/reality is Deacon’s as well as Roden’s acknowledgment that what is important is not what is present to consciousness, but rather what is absent in presence. Let me clarify. In chapter 7 of Posthuman Life David develops his dark phenomenological approach. He lays the ground by arguing for a substantive or substantial formalism based on a non-teleological account of human/technique interaction, one which – as in other cognitive-scientific accounts – sees our evolutionary cognitive adaptations within a human and technological schema that supports abstraction but not autonomous self-augmentation. Let me digress…

Michael Tomasello, in his recent book A Natural History of Human Thinking, maintains that our prehuman ancestors, like today’s great apes, were social beings who could solve problems by thinking. But they were almost entirely competitive, aiming only at their individual goals. As ecological changes forced them into more cooperative living arrangements, early humans had to coordinate their actions and communicate their thoughts with collaborative partners. Tomasello develops what he terms the “shared intentionality hypothesis,” which captures how these more socially complex forms of life led to more conceptually complex forms of thinking. In order to survive, humans had to learn to see the world from multiple social perspectives, to draw socially recursive inferences, and to monitor their own thinking via the normative standards of the group. Even language and culture arose from the preexisting need to work together and coordinate thoughts.

What this implies is that we developed external memory systems that could be transmitted across time, from generation to generation. Merlin Donald, in his book Origins of the Modern Mind, develops a staged history of this notion. Donald traces the evolution of human culture and cognition from primitive apes to the era of artificial intelligence, and presents an original theory of how the human mind evolved from its presymbolic form. In the emergence of modern human culture, Donald proposes, there were three radical transitions. During the first, our bipedal but still apelike ancestors acquired “mimetic” skill – the ability to represent knowledge through voluntary motor acts – which made Homo erectus successful for over a million years. The second transition – to “mythic” culture – coincided with the development of spoken language. Speech allowed the large-brained Homo sapiens to evolve a complex preliterate culture that survives in many parts of the world today. In the third transition, when humans constructed elaborate symbolic systems ranging from cuneiform, hieroglyphics, and ideograms to alphabetic languages and mathematics, human biological memory became an inadequate vehicle for storing and processing our collective knowledge. The modern mind is thus a hybrid structure built from vestiges of earlier biological stages as well as new external symbolic memory devices that have radically altered its organization.

My own view is that these external memory storage and transmission systems have been part of an evolving and elaborate combination of technics and technology which humans have shaped, but which in turn has shaped our cognitive relations with each other and our environments. Bernard Stiegler argues in his Technics and Time that “technics” forms the horizon of human existence. This fact has been suppressed throughout the history of philosophy, which has never ceased to operate on the basis of a distinction between epistêmê and technê. The thesis of the book is that the genesis of technics corresponds not only to the genesis of what is called “human” but of temporality as such, and that this is the clue toward understanding the future of the dynamic process in which the human and the technical consist. Another digression… this time on Aristotle and the notions of epistêmê and technê:

Epistêmê is the Greek word most often translated as knowledge, while technê is translated as either craft or art. Without going into a full history, there are moments where Aristotle runs the two forms together. Aristotle says that the person with epistêmê and the person with technê share an important similarity, and he contrasts the person of experience (empeiria) with someone who has technê or epistêmê. Yet at other times he argues that a person who has a technê goes beyond experience to a universal judgment. Aristotle goes on to say that in general the sign of knowing or not knowing is the ability to teach: because technê can be taught, we think it, rather than experience, is epistêmê (981b10). Presumably the reason the one with technê can teach is that he knows the cause and reason for what is done in his technê. So we can conclude that the person with technê is like the person with epistêmê: both can make a universal judgment and both know the cause.

All this brings me back to something David says in Chapter 7 of his book:

“Abstraction exposes habits and values to a manifold of sensory affects and encounters (§ 8.2). It entails that the evolution of particular technologies depends on hugely complex and counter-final interactions, catalysed by transmissibility and promiscuous reusability (Ellul 1964: 93).” (p. 160)

Now if we put that into the perspective of Tomasello’s “shared intentionality hypothesis,” along with Donald’s notion of successive hybrid cognitive revolutions in the transmission of cultural memory – representational systems of external storage that are complex and counter-final (i.e., having no teleological or autonomous drive) – we begin to see a picture emerging of what David terms dark phenomenology. Following Stiegler, David argues that the “essence of a technology is not simply to be found in an analysis of its internal functioning but in the concrete ways in which these functions are integrated in matter. The invention of a new device is neither the instantiation of an abstract Platonic diagram nor the invention of an isolated thing, but the production of a mutable pattern open to dynamic alteration (Stiegler 1998: 77–8)” (p. 162). He goes on to say:

“This reaffirms my claim that a phenomenological ontology which reduces abstract technical entities to their uses is inadequate. Technical entities are more than bundles of internal or external functions. They are materialized potentialities for generating new functions as well as modifiable strategies for integrating and reintegrating functions…” (p. 162)

What is important here is the notion of “materialized potentialities”. What does this mean? Consider Aristotle’s proposal in Book Theta of the Metaphysics that a thing is said to be potential if, when the act of which it is said to be potential is realized, there will be nothing im-potential – “that is, there will be nothing able not to be” (HS, 45; see http://www.iep.utm.edu/agamben/). Giorgio Agamben offers us an opening onto this. Agamben argues that this ought not be taken to mean simply that “what is not impossible is possible” but rather highlights the suspension or setting aside of im-potentiality in the passage to actuality. This suspension, though, does not amount to a destruction of im-potentiality, but rather to its fulfilment; that is, through the turning back of potentiality upon itself, which amounts to its “giving of itself to itself,” im-potentiality, or the potentiality to not be, is fully realized in its own suspension, such that actuality appears as nothing other than the potentiality to not not-be. While this relation is central to the passage from voice to speech or signification and to attaining the experience of language as such, Agamben also claims that in this formulation Aristotle bequeaths to Western philosophy the paradigm of sovereignty, since it reveals the undetermined or sovereign founding of being. As Agamben concludes, “an act is sovereign when it realizes itself by simply taking away its own potentiality not to be, letting itself be, giving itself to itself” (HS 46).

Ultimately this leads us back to David’s dark phenomenology, which is part of what is now termed ‘speculative realism’ in the sense that he uses the concept of ‘withdrawal’ as part of his substantial formalism:

“The conditions for the phenomenology of technology thus show that the existence of technological items exceeds their phenomenological manifestation. Technologies can withdraw from particular human practices (Verbeek 2005: 117). If SP is correct, they may even withdraw from all human practices.” (p. 163)

This notion of withdrawal or disconnection began in the Object-Oriented substantial formalism of Graham Harman, although David uses the concept a little differently. Harman modifies Heidegger’s notion of readiness-to-hand (Zuhandenheit), saying it “refers to objects insofar as they withdraw from human view into a dark subterranean reality that never becomes present to practical action any more than it does to theoretical awareness” (p. 1).2 The point here is that this is not conscious; it is absence under the sign of presence (as we observed in the beginning). We never have direct access to objects, only indirect access, since we apprehend that which is attained only by way of absence rather than direct presence. One could draw from this a complete history of ontology as ‘eye’ or ‘optical’ based, and an opposing one based on senses other than the eye: affective relations, etc. The notion of the eye has been central to metaphysics since Aristotle, or before.

William McNeill, in The Glance of the Eye: Heidegger, Aristotle, and the Ends of Theory, explores the phenomenon of the Augenblick, or glance of the eye, in Heidegger’s thought, and in particular its relation to the primacy of seeing and of theoretical apprehending (theoria) both in Aristotle and in the philosophical and scientific tradition of Western thought. McNeill argues that Heidegger’s early reading of Aristotle, which identifies the experience of the Augenblick at the heart of ethical and practical knowledge (phronesis), proves to be a decisive encounter for Heidegger’s subsequent understanding and critique of the history of philosophy, science, and technology. It provides him with a critical resource for addressing the problematic domination of theoretical knowledge in Western civilization.

So Harman and Roden are both developing a form of counter-theoretic or dark phenomenology, in the sense that it is no longer guided by the ‘glance of the eye’. As Harman suggests, when things “withdraw from presence into their dark subterranean reality, they distance themselves not only from human beings, but from each other as well. If the human perception of a house or tree is forever haunted by some hidden surplus in the things that never becomes present, the same is true of the sheer causal interaction between rocks or raindrops” (TB, p. 2).

So when David tells us that the existence of “technological items exceeds their phenomenological manifestation,” and that “technologies can withdraw from particular human practices,” he is countering this whole scientific tradition of the eye and exposing us to a darker apprehension of absence rather than presence. Or, I should qualify, of “presence within absence,” which is apprehended indirectly through various apparatuses, etc. This leads Roden to state:

“Thus we should embrace a realist metaphysics of technique in opposition to the phenomenologies of Verbeek and Ihde. Technologies according to this model are abstract, repeatable particulars realized (though never finalized) in ephemeral events (§ 6.5).” (p. 163)

I’ll need to expand on this, but the post has grown too long as is. I did not go into the work of Verbeek or Ihde, so I will have to take that up at another point. The main thrust, as David tells us, is that he is moving toward “a model that addresses the ‘abstract particularity’ of technique while leaving room for a more detailed metaphysical treatment of technicity” (p. 163). This notion of technicity, as Arthur Bradley tells us, follows – as Roden does – the work of Derrida:

In Jacques Derrida’s view, we live in a state of originary technicity. It is impossible to define the human as either a biological entity (a body or species) or a philosophical state (a soul, mind or consciousness), he argues, because our “nature” is constituted by a relation to technological prostheses. According to a logic that will be very familiar to readers of his work, technology is a supplement that exposes an originary lack within what should be the integrity or plenitude of the human being itself. To put it in a word, what we call the “human” is thus the product of an aporetic relation between interiority and exteriority where each term defines, and contaminates, its other. If Derrida was arguably the first thinker to explicitly propose a philosophy of originary technicity— although there are obvious precedents in the work of Marx, Nietzsche, Bergson, Husserl and Leroi-Gourhan— this line of enquiry has been pursued, refined and extended by a number of other figures including, most notably, Bernard Stiegler. The technological turn in continental philosophy also feeds into a more general crisis about what— if anything— might now be said to be “proper” to humanity. This can be witnessed in the recent debate— gathering together voices from science fiction, cultural theory and the human, life and cognitive sciences— about our so-called “posthuman” future.3

Ultimately, as David reminds us, if “phenomenology cannot tell us what phenomenology is a priori, then phenomenological investigation cannot secure knowledge of phenomenological necessity. In particular, we have no grounds for holding that we understand what it is to occupy a world that any sophisticated cognizer must share with us” (p. 76).

  1. Roden, David (2014-10-10). Posthuman Life: Philosophy at the Edge of the Human (p. 160). Taylor and Francis. Kindle Edition.
  2. Harman, Graham (2011-08-31). Tool-Being: Heidegger and the Metaphysics of Objects (p. 1). Open Court. Kindle Edition.
  3. Armand, Louis; Bradley, Arthur; Zizek, Slavoj; Stiegler, Bernard; Miller, J. Hillis; Wark, McKenzie; Amerika, Mark; Lucy, Niall; Tofts, Darren; Lovink, Geert (2013-07-19). Technicity (Kindle Locations 1468-1478). Litteraria Pragensia. Kindle Edition.

The original post on FB for those who don’t have access to it:

Steven Shaviro discusses David Roden‘s notion of dark phenomenology in the first chapter of his book Discognition (“Thinking Like a Philosopher”), and I quote:

“When we no longer have concepts to guide our intuitions, we are in the realm of what David Roden calls dark phenomenology. Roden extends the arguments of Kant, Sellars, and Metzinger. Since I am able to experience the subtlety of red, but I can only conceive and remember this experience as one of red in general, there must be, within consciousness itself, a radical “gulf between discrimination and identification”. This leads to the ironic consequence that first-person experience cannot be captured adequately by first-person observation and reflection. “What the subject claims to experience should not be granted special epistemic authority since it is possible for us to have a very partial and incomplete grasp of its nature”.

“In other words, rather than claiming (as Dennett does, for instance) that noncognitive phenomenal experience is somehow illusory, Roden accepts such experience, espousing a full “phenomenal realism”. But the conclusion he draws from this non-eliminativist realism is that much of first-person experience “is not intuitively accessible”. I do not necessarily know what I am sensing or thinking. It may well be that I can only figure out the nature of my own experiences indirectly, in the same ways – through observation, inference, and reporting – that I figure out the nature of other people’s experiences. Introspective phenomenological description therefore “requires supplementation through other modes of enquiry”. Roden concludes that we can only examine the “dark” areas of our own phenomenal experience objectively, from the outside, by means of “naturalistic modes of enquiry… such as those employed by cognitive scientists, neuroscientists and cognitive modelers”.

“Roden’s account of dark phenomenology is compelling; but I find his conclusion questionable. For surely the crucial distinction is not between first person and third person modes of comprehension, so much as between what can be cognized, and what cannot. Phenomenological introspection and empirical experimentation are rival ways of capturing and characterizing the nature of subjective experience. But dark phenomenology points to a mode of experience that resists both sorts of conceptualization.” (Kindle Locations: 490-560)1

In the above passage one discovers the differences between the neuroscientific community and the philosophical community: the neurosciences are stripping the lineaments of Kantian intuition and/or ‘phenomenological introspection’ (first person) out of the equation altogether, while those within the philosophical world seek to save this last bastion of Kantian thought from erosion in a sea of technological systems outside the purview of consciousness. This is the battle confronting 21st-century thought: the neurosciences vs. philosophy. On the one hand you have those who believe philosophy should be seen not so much as opposing the sciences as being the guardian of thought itself, maintaining that without philosophy the scientists would not have the theoretical frameworks within which to carry on their conceptual discourses. On the other you have the neuroscientists, who couldn’t care less about the specifics of thought, but rather seek an understanding of the very real and empirical operations and functions of the brain that give rise to thought. It’s this intermediary realm between material and immaterial that is at issue. In older forms the physicalist arguments reduced everything to the brain, but the newer neurosciences are taking into consideration that things are not so easily reduced; yet there is no agreement among scientists or philosophers as to what this gap or blank between the material and immaterial is, or even whether such questions are pertinent to the task. So for scientists it’s not so much about frameworks as it is about the pragmatic truth of actual processes in real time, which have nothing to do with philosophical intuitionism and much more to do with the way the brain interacts with the environments within which it is folded.

Already the neurosciences, imaging technologies (fMRI, etc.), and interface tech are bridging the material/immaterial gap without understanding the full details of the processes involved. Along with computer/brain interfaces that can be applied intrinsically and extrinsically to a person – allowing those whose bodies were otherwise incapacitated new access to speech, communication, and computing systems – there is the intraoperative collusion of biochemical and hardware intermediation that until recently would have been seen as impossible. Yet in our time technology and invention are bringing a revolution in such splicings of human and machine. More and more, those like Andy Clark are being proven right that humans are already becoming cyborgs – or maybe we always already were. The technology we create is in turn changing who and what we are as humans. Some say this is the posthuman divide, a crossing of the Rubicon between human and technology that will change our mode of being in the world forever. What it will lead to is anyone’s guess. David Roden terms it the disconnection thesis: a point beyond which we just don’t know what is being reached in the way of ‘wide descendants’ (posthuman progeny), one we can only speak of speculatively rather than ontologically with any depth of resolution.

Only time will tell who will come out on top here; but I suspect, if history has a say, that the sciences will uncover the processes of thought in the brain as being outside the control of the first-person navigator we term the Subject altogether. Philosophers want to retain a connection to our sense of Self and Personality, to hold onto the metaphysical basis of human thought and exceptionalism. But the sciences are day by day eroding the very ground and foundations of human subjectivity and selfhood upon which Western metaphysics since Plato has encircled itself. The battle continues… and, as Steven suggests, Roden’s “dark phenomenology points to a mode of experience that resists both sorts of conceptualization.” Where it will lead, we will need to follow…

1. Steven Shaviro. Discognition. Repeater (April 19, 2016)

2 thoughts on “On David Roden’s Dark Phenomenology”

  1. Thanks again for this wonderful commentary, Craig. As always, I’m floored by the way that you come at stuff I’ve written from unexpected directions, with greater erudition than I could bring to bear, if I’m honest. I think New Substantivism needs embedding in the kind of empirically informed cognitive ethology that Tomasello and others are developing. That being said, I’m leery about the trope of originary technicity, in part because its rhetoric of contamination seems to draw on a model of presence that its proponents have to reject; in part because it replicates a transcendental method of accounting for subjectivity and agency that the epistemology of dark phenomenology places in doubt.

    Harman’s notion of withdrawal suggested itself strongly when I was writing about the relationship between technique and its phenomenology. This being said, iterable techniques withdraw not because they are ontologically closed (somehow) but because they’re subject to unbounded differential repetition – as Steven Shaviro implies, they’re intimately involved (which, on reflection, is perhaps a reason for retaining some of what’s good in the idea of originary technicity). So this isn’t an OOO model of technology, but a pluralist account for which techniques must be both historically particular and repeatable.


    • Yep, that’s why I said in the paragraph above: “This notion of withdrawal or disconnection began in the Object-Oriented substantial formalism of Graham Harman, although David uses this concept a little differently.” I should have been more explicit about what I meant by “differently”… 😉 And, yes, the technicity aspect needs a revisioning… let’s face it, there are always further revisions and changes in notions. One can’t take anything as fixed in iron… just as you revised Ellul and Ihde and Verbeek, etc. One takes ideas and then modulates them into one’s own tune… like good jazz!

      When you say: “Abstraction exposes habits and values to a manifold of sensory affects and encounters (§ 8.2). It entails that the evolution of particular technologies depends on hugely complex and counter-final interactions, catalysed by transmissibility and promiscuous reusability (Ellul 1964: 93).” (p. 160)

      I think that fairly well sums up the argument against telos, as well as your notion of pluralism, iterability, and repetition.

