Plasma Research at the University of Missouri

Ever wonder how stupid our government is? I do all the time. Take plasma fusion for instance. The science underpinning much of fusion energy research is plasma physics. Plasmas—the fourth state of matter—are hot gases, hot enough that electrons have been knocked free of atomic nuclei, forming an ensemble of ions and electrons that can conduct electrical currents and respond to electric and magnetic fields. The science of plasmas is elegant, far-reaching, and impactful. Comprising over 99% of the visible universe, plasmas are also pervasive. Plasma is the state of matter of the sun’s center, corona, and solar flares. Plasma dynamics are at the heart of the extraordinary formation of galactic jets and the accretion of stellar material around black holes. On Earth it is the stuff of lightning and flames. Plasma physics describes the processes giving rise to the aurora that gently illuminates the far northern and southern nighttime skies. Practical applications of plasmas are found in various forms of lighting, in semiconductor manufacturing, and of course in plasma televisions.

University of Missouri engineer Randy Curry and his team have developed a method of creating and controlling plasma that could revolutionize American energy generation and storage. Besides liquid, gas and solid, matter has a fourth state, known as plasma. Fire and lightning are familiar forms of plasma. Life on Earth depends on the energy emitted by plasma produced during fusion reactions within the sun. However, Curry warns that without federal funding of basic research, America will lose the race to develop new plasma energy technologies. The basic research program was originally funded by the Office of Naval Research, but continued research has been funded by MU.

The difference between these multibillion-dollar programs and the one offered by the University of Missouri is that physicists usually rely on electromagnetic fields to harness the power of plasma, the fourth state of matter, in fusion power experiments. But University of Missouri researchers have managed to create rings of plasma that can hold their shape without the use of outside electromagnetic fields—possibly paving the way for a new age of practical fusion power and leading to the creation of new energy storage devices.

Traditional efforts to achieve nuclear fusion have relied upon multi-billion-dollar fusion reactors, called tokamaks, which harness powerful electromagnetic fields to contain the superheated plasmas in which fusion reactions take place. The ability to create plasma with self-confining electromagnetic fields in the open air could eliminate the need for external electromagnetic fields in future fusion experiments, and with it, much of the expense.

The researchers created plasma rings about 15 centimeters in diameter that flew through the air across distances of up to 60 centimeters. The rings lasted just 10 milliseconds, but reached temperatures hotter than the sun’s fiery surface, at around 6600 to 7700 kelvin (6327 to 7427 degrees Celsius). Plasma physicists suspect that magnetic fields are still involved—but that the plasma rings create their own.
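
As a quick sanity check on the figures above (my own arithmetic, not taken from the article), the kelvin-to-Celsius conversion is just an offset of 273.15:

    T_{\mathrm{C}} = T_{\mathrm{K}} - 273.15,
    \qquad 6600\ \mathrm{K} \approx 6327\ ^{\circ}\mathrm{C},
    \qquad 7700\ \mathrm{K} \approx 7427\ ^{\circ}\mathrm{C}.

Both ends of that range sit above the roughly 5800 K of the solar photosphere, which is the sense in which the rings are hotter than the sun’s surface.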

“This plasma has a self-confining magnetic field,” said Randy Curry, an engineer and physicist at the University of Missouri in Columbia. “If one can generate and contain it without large magnets involved, of course fusion energy would be an application.” But the researchers’ success in creating self-contained plasma rings came as a surprise. “We did not expect that,” Curry says.

The plasma device at MU could be enlarged to handle much larger amounts of energy, according to Curry. With sufficient funding, they could develop within three to five years a system that would also be considerably smaller. He noted that they used old technologies to build the current prototype of the plasma-generating machine; with newer, miniaturized parts, he suggests, they could shrink the device to the size of a bread box.

According to Science, President Barack Obama last week submitted a $3.8 trillion budget request to Congress for 2014 that, if enacted, would boost the research budgets of nearly every federal agency. His continued support for science stands out in an otherwise flat budget that aims to shrink the federal deficit by clamping down on entitlement programs and raising money by revising the tax code. The president’s spending blueprint should lift the spirits of a community that, along with all other sectors of the economy, has endured a bumpy political ride for the past year. The president’s $143 billion request for research and development more than erases a nearly $10 billion dip from 2012 to 2013 caused by sequestration—the $85 billion, across-the-board cut in discretionary spending that went into effect in March.

With such breakthroughs as the University of Missouri team is working on, one sees just the opposite in both European and United States projects using large fusion reactors based on magnetic coils, which are over budget and continue to cost taxpayers and governments more money than projected. At the Joint European Torus (JET), Euratom is investigating the possibility of recasting JET as an international facility after 2018, asking the other six ITER partners—China, India, Japan, Russia, South Korea, and the United States—to contribute to the cost of keeping it running. But with ITER already expected to cost several times the original estimate, the partners may not be keen to shoulder the extra burden. (here)

So if Randy Curry of the University of Missouri and his team can produce plasma fusion that does not need magnetic coils, why should we continue funding such large Manhattan-style projects as JET and the Max Planck reactors? “This plasma has a self-confining magnetic field,” said Curry: if this is true (see video here), then the cost of maintaining such large fusion reactors would be a thing of the past. What’s interesting is that if the international community could get behind such projects, we could truly have clean, safe energy for the world, because unlike the older forms, which had waste products, this form does not and is self-renewable. Let’s hope that within the next ten years they can make solid headway toward this goal.

History, Cosmology, and Philosophy

Spontaneous creation is the reason there is something rather than nothing, why the universe exists, why we exist. It is not necessary to invoke God to light the blue touch paper and set the universe going. … We seem to be at a critical point in the history of science, in which we must alter our conception of goals and of what makes a physical theory acceptable. It appears that the fundamental numbers, and even the form, of the apparent laws of nature are not demanded by logic or physical principle.

– Stephen Hawking, The Grand Design

Anytime we mention history we discover a truth: history is always past, beyond us, transcendent. So if history is always and forever fallen into past time, the flow of an irreversible zone of non-meaning that we can neither contemplate nor imagine, then what are the conditions necessary for its arising in discourse? We never have direct access to history – unless there are time-travelers among us; we only ever have indirect access to it through thinking it. But then is history nothing but fantasy? How do we think something that can never be directly or indirectly known? And what of that greatest of all histories, the Universe itself? Cosmological history? How do those strange travelers of time, the physicists, cosmologists of the Big Bang and other theories, formulate their grand histories of the universe (or multiverse) when they never have direct access to that strange history? More importantly, how can our understanding of cosmology and the sciences help us transform philosophy as we’ve come to know it into an instrument that allows us to both epistemologically and ontologically evaluate it and justify the truth of it by these sciences and their physical and mathematical theories? Or is it science itself that should be transformed by philosophy?

Continue reading

Heroes of Science: Arthur Galston

In his early research, the biologist Arthur Galston experimented with a plant growth regulator, triiodobenzoic acid, and found that it could induce soybeans to flower and grow more rapidly. However, he also noted that if applied in excess, the compound would cause the plant to shed its leaves.

The Military-Industrial Complex of the era used Galston’s findings in the development of the powerful defoliant Agent Orange, named for the orange stripe painted around steel drums that contained it. The chemical is now known to have contained dioxins, which have proven to be associated with cancers, birth defects and learning disabilities. From 1962 to 1970, American troops released an estimated 20 million gallons of the chemical defoliant to destroy crops and expose Viet Cong positions and routes of movement during the Vietnam War.

As an activist he wrote letters and academic papers and gave broadcasts and seminars that described the environmental damage wrought by Agent Orange, noting that the spraying of riverbank mangroves in Vietnam was eliminating “one of the most important ecological niches for the completion of the life cycle of certain shellfish and migratory fish.” Galston traveled to Vietnam to monitor the impact of the chemical. In 1970, with Matthew S. Meselson of Harvard University and other scientists, Galston charged that Agent Orange also presented a potential risk to humans. The scientists lobbied the Department of Defense to conduct toxicological studies, which found that compounds in Agent Orange could be linked to birth defects in laboratory rats. The revelation led President Richard M. Nixon to order a halt to the spraying of Agent Orange.

Continue reading

Emergence of Scientific Culture

Just started reading Stephen Gaukroger’s The Emergence of a Scientific Culture: Science and the Shaping of Modernity 1210-1685. This is the first in a projected series of works that will trace the historical emergence and consolidation of scientific culture in the West during the modern era. The second volume, The Collapse of Mechanism and the Rise of Sensibility: Science and the Shaping of Modernity, 1680-1760, was recently published as well. Another volume, The Naturalisation of the Human and the Humanisation of Nature: Science and the Shaping of Modernity, 1750-1825, should follow soon. He says in the foreword that future volumes will bring us right up to current debates over the unification of Science and scientific naturalism. What I do like is that he doesn’t seem to have any axe to grind. He includes both the religious and non-theistic philosophical and cultural perspectives that underpin that history.

At the center of this work is the theme of natural philosophy as it emerged out of scholastic Aristotelianism during the thirteenth century. It was this enterprise that underpinned the systematic theology of that era, as well as “giving natural philosophy a cognitive priority that was to become one of the key features of early-modern scientific culture” (17). It was during this era that “natural philosophy was transformed from a wholly marginal enterprise into the unique model of cognitive inquiry generally” (17).

I’ve barely begun to skim the surface of this promising work, but am already fascinated by the richness and depth of detail that I see in its 500 or so pages. For those interested, he has a short introductory essay on his second volume at Berfrois:

Natural Philosophy and a New World Picture

Heroes of Science: Pierre Gassendi

Pierre Gassendi was one of the prodigies of the early seventeenth century. He was born in 1592 in Provence, went to college at Digne, and by the age of sixteen was lecturing there. After studying theology at Aix-en-Provence, he taught theology at Digne in 1612. When he received his doctorate in theology, he became a lecturer in philosophy at Aix, and then canon of Grenoble. Quite early in life, Gassendi began his extensive scientific researches, assisted and encouraged by some of the leading intellectuals of Aix, like Peiresc. The philosophy course that he taught led Gassendi to compile his extended critique of Aristotelianism, the first part of which appeared as his earliest publication in 1624, the Exercitationes Paradoxicae adversus Aristoteleos. This was followed by several scientific and philosophical works, which gained Gassendi great renown in the intellectual world and brought him into contact with the man who was to be his lifelong friend, Father Marin Mersenne. In 1633, Gassendi was appointed Provost of the Cathedral of Digne, and in 1645, professor of mathematics at the Collège Royal in Paris. Gassendi retired in 1648 and died in 1655.

In spite of his tremendous role in the formation of “the new science” and “the new philosophy,” Gassendi’s fame has survived mainly for his criticisms of Descartes’ Meditations and not for his own theories, which throughout the seventeenth century had rivaled those of his opponent. He is also remembered for the part he played in reviving the atomic theory of Epicurus. But, by and large, until quite recently, Gassendi’s status as an independent thinker has been most neglected. Perhaps this is due in part to Descartes’ judgment of him, and in part to the fact that he usually presented his ideas in extremely lengthy Latin tomes, which are only now being translated into French.

But Gassendi, in his lifetime, had an extremely important intellectual career, whose development, perhaps more than that of René Descartes, indicates and illustrates what J. H. Randall called “the making of the modern mind.” Gassendi started out his philosophical journey as a sceptic, apparently heavily influenced by his reading of the edition of Sextus brought out in 1621, as well as by the works of Montaigne and Charron. This phase of “scientific Pyrrhonism” served as the basis for Gassendi’s attacks on Aristotle as well as on the contemporary pseudoscientists and made Gassendi one of the leaders of the Tétrade. However, he found the negative and defeatist attitude of humanistic scepticism unsatisfactory, especially in terms of his knowledge of, and interest in, the “new science.” He announced then that he was seeking a via media between Pyrrhonism and Dogmatism. He found this in his tentative, hypothetical formulation of Epicurean atomism, a formulation that, in many respects, comes close to the empiricism of modern British philosophy.

– Richard H. Popkin, The History of Scepticism

Note: adding a new category that will offer historical and critical biographical details on the history of science and key players within that history.

Machinic Life: The Replicants are (among) Us

“Organisms are resilient patterns in a turbulent flow—patterns in an energy flow.”

– Carl Woese, microbiologist

“I believe that I have somewhere said (but cannot find the passage) that the principle of continuity renders it probable that the principle of life will hereafter be shown to be part, or consequence, of some general law…”

– Charles Darwin in a Letter to George Wallich

“Pan-mechanism is not simply the claim that being is composed entirely of machines, but that all interactions are machinic interactions.”

– Levi R. Bryant (MOO)

For a long while there was a thin red line that divided inanimate matter from animate life forms, chemistry from biology, but in the last few years many scientists working within biophysics and molecular biology are blurring such distinctions and discovering new and surprising things about matter and its operational life. Take the ribosome for instance:

The ribosome is a tiny organelle present in all living cells in thousands of copies that manufactures the protein molecules on which all life is based. It effectively operates as a highly organized and intricate miniature factory, churning out those proteins—long chain-like molecules—by stitching together a hundred or more amino acid molecules in just the right order, and all within a few seconds. And this exquisitely efficient entity is contained within a complex chemical structure that is just some 20–30 nanometres in diameter—that’s just 2–3 millionths of a centimetre! Think about that—an entire factory, with all the elements you’d expect to find in any regular factory, but within a structure so tiny it is completely invisible to the naked eye.1

Another scientist, Peter M. Hoffmann, tells us that in his work in molecular biology using touch-based rather than sight-based atomic force microscopy (AFM) he “discovered the fascinating science of molecular machines. I realized that life is the result of noise and chaos, filtered through the structures of highly sophisticated molecular machines that have evolved over billions of years. I realized, then, there can be no more fascinating goal than to understand how these machines work—how they turn chaos into life.”2

Attacks against reductionist or methodological naturalism have become a staple of the new turn toward religion in science. Religious philosophers like Alvin Plantinga, in Where the Conflict Really Lies: Science, Religion, and Naturalism (2011), would have us believe that there is a deep and serious conflict between naturalism and science:

“Taking naturalism to include materialism with respect to human beings, I argue that it is improbable, given naturalism and evolution, that our cognitive faculties are reliable. It is improbable that they provide us with a suitable preponderance of true belief over false. But then a naturalist who accepts current evolutionary theory has a defeater for the proposition that our faculties are reliable. Furthermore, if she has a defeater for the proposition that her cognitive faculties are reliable, she has a defeater for any belief she takes to be produced by her faculties. But of course all of her beliefs have been produced by her faculties—including, naturally enough, her belief in naturalism and evolution. That belief, therefore—the conjunction of naturalism and evolution—is one that she can’t rationally accept. Hence naturalism and evolution are in serious conflict: one can’t rationally accept them both.” (p.xiv)

Yet if we return to the beginnings of this naturalist tradition in the seventeenth century, we find that with the invention of the first microscopes, scientists began searching for the secret of life at ever smaller scales. Biological cells were first described in Robert Hooke’s Micrographia in 1665. It took until 1902 for chromosomes to be identified as carriers of inheritance. The structure of DNA was deciphered in 1953, and the first atomic-scale protein structure was obtained in 1959. Yet, even while scientists dissected life into smaller and smaller pieces, the mystery of life remained elusive.

Continue reading

+1 Standard Model: Experiments Deliver a Death Blow to Supersymmetry?

Cambridge scientists at the Large Hadron Collider (LHC) at CERN, near Geneva, have spotted one of the rarest particle decays ever seen in nature.

The result is very damaging to new theories like the extremely popular Supersymmetry.

Current knowledge about the most fundamental matter particles (quarks and leptons, such as an electron) and the forces between them is embedded in the so-called Standard Model. The particle masses are a consequence of their interactions with the Higgs field. Exciting the Higgs field in particle collisions at the LHC recently resulted in the discovery of the Higgs boson. (Science Daily, Nov. 13, 2012)

In particle physics, supersymmetry (often abbreviated SUSY) is a symmetry that relates elementary particles of one spin to other particles that differ by half a unit of spin, known as superpartners. In a theory with unbroken supersymmetry, for every type of boson there exists a corresponding type of fermion with the same mass and internal quantum numbers (other than spin), and vice versa.
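
Schematically, and only as an illustrative aside in standard textbook notation rather than anything drawn from the article, the supersymmetry generator Q maps bosonic states to fermionic superpartner states whose spin differs by half a unit, and unbroken supersymmetry forces the masses within a supermultiplet to be equal:

    Q\,|\text{boson},\ s\rangle \sim |\text{fermion},\ s \pm \tfrac{1}{2}\rangle,
    \qquad Q\,|\text{fermion}\rangle \sim |\text{boson}\rangle,
    \qquad m_{\text{superpartner}} = m_{\text{particle}} \quad \text{(unbroken SUSY)}.

If that mass equality held in nature, superpartners would already have shown up alongside the known particles; this is why, as the next paragraph notes, supersymmetry can only survive as a broken symmetry.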

There is no direct evidence for the existence of supersymmetry. It is motivated by possible solutions to several theoretical problems. Since the superpartners of the Standard Model particles have not been observed, supersymmetry must be a broken symmetry if it is a true symmetry of nature. This would allow the superparticles to be heavier than the corresponding Standard Model particles.

"Standard Model Lagrangian" mug from CERN.


The Standard Model of particle physics is a theory concerning the electromagnetic, weak, and strong nuclear interactions, which mediate the dynamics of the known subatomic particles. Developed throughout the mid to late 20th century, the Standard Model is truly “a tapestry woven by many hands”, sometimes driven forward by new experimental discoveries, sometimes by theoretical advances. It was a collaborative effort in the largest sense, spanning continents and decades. The current formulation was finalized in the mid 1970s upon experimental confirmation of the existence of quarks. Since then, discoveries of the bottom quark (1977), the top quark (1995), and the tau neutrino (2000) have given further credence to the Standard Model. More recently, (2011–2012) the apparent detection of the Higgs boson completes the set of predicted particles. Because of its success in explaining a wide variety of experimental results, the Standard Model is sometimes regarded as a “theory of almost everything”.

Read the article at Science Daily: click here…

Stephen Hawking: Science vs. Philosophy?

“The strolls of a sceptic through the debris of culture—rubble and dust as far as the eye can see. The wanderer has found everything already in ruins, furrowed down and across by the plough of unremitting human thought. The wanderer puts forth his walking stick with caution; then he comes to a halt, leaning on it, and smiles.”


– Bruno Schulz, The Wanderings of a Sceptic


Stephen Hawking in his new book, The Grand Design, throws down a challenge to all those philosophers who pretend to deal with the great questions:

Why is there something rather than nothing?
Why do we exist?
Why this particular set of laws and not some other? 

He goes on to say that at one time these questions were for philosophy, but now, he tells us, “philosophy is dead”. [1] He attacks philosophy, saying that it “has not kept up with modern developments in science, particularly physics. Scientists have become the bearers of the torch of discovery in our quest for knowledge” (GD: Loc 42). The arrogance with which he states this position is almost that of an old-time dogmatist in its scathing belittlement of philosophy and philosophers.

Just for the fun of it let’s take him at his word and see just what he’s up to with his game of science taking the full helm of traditional metaphysical thought from philosophy, and discover what answers he provides to the questions above.

Continue reading

Epistemic Naturalism: Quine, Goldman, Kuhn, and Brassier

“Philosophy of science is philosophy enough.”
– W.V. Quine

Broadly speaking, the Analytical tradition in philosophy can be characterized by an emphasis on clarity, formal logic, and the analysis of language, and by a profound dependence on and respect for the natural sciences. Some of the main precursors of this movement in philosophy are Bertrand Russell, Ludwig Wittgenstein, G.E. Moore, Gottlob Frege, and the logical positivists who derive from them.

W.V. Quine was one of the first to propound an influential naturalized epistemology. He ultimately wanted to replace traditional epistemology with the natural sciences (i.e., psychology). He distinguished two approaches: the psychological study of how people produce theoretical “output” from sensory “input,” and the logical reconstruction of our theoretical vocabulary in sensory terms. In Quine’s view, the second approach cannot succeed, and so we are left with psychology. The basis of this view is a theory of knowledge that limits its scope and methods to those of the natural sciences and their conclusions. Within this domain there are three main forms of naturalized epistemic theories: replacement, cooperative, and substantive naturalism. Replacement naturalism would have us abandon traditional forms of epistemology in favor of naturalist science and its methods. Cooperative naturalism tells us that traditional epistemology would benefit from the cognitive sciences. Substantive naturalism centers on the factual assertions of ‘facts of knowledge’ and ‘natural facts’.

Alvin Goldman, on the other hand, provided what he termed causal reliabilism. This is a theory of knowledge which states that a justified true belief counts as knowledge only if it is caused in a suitably reliable way. What Goldman tells us is that it is necessary also to construct a theory of what epistemic justification really is, as opposed to how common sense takes it to be. That theory will be grounded in our psychological understanding of how beliefs are formed, and it will include assessments of those processes in terms of reliability.

Thomas Kuhn brought a naturalistic approach, drawn from the history and sociology of science, to epistemological questions. Kuhn-inspired naturalism is not incompatible with the naturalism that draws on psychology and the natural sciences. Such naturalistic epistemologists as Alvin Goldman and Philip Kitcher have fruitfully applied insights from both the natural and the social sciences in the attempt to understand knowledge as a simultaneously cognitive and social phenomenon.

Naturalistic epistemologists seek an understanding of knowledge that is scientifically informed and integrated with the rest of our understanding of the world. Their methods and commitments differ, because they have varying views about the precise relationship between science and epistemology and even about which sciences are most important to understanding knowledge.

Epistemic naturalists usually try two sorts of approaches: 1) they try to show that the issue is empirical and then apply scientific data, results, methods, and theories to it directly; or 2) they try to undermine a problem’s motivation by showing it arises only on certain false, non-naturalistic assumptions.

Yet, despite its efforts, naturalistic epistemology does face serious challenges from the problems of circularity and normativity. Its proponents are seeking nothing more nor less than the unification of science and philosophy. Others, such as Ray Brassier, seek instead a revisionary naturalism within this same tradition.

Brassier, in his work Nihil Unbound, pushed the limits of nihilism to their final extent. He linked epistemological naturalism in Anglo-American philosophy (Sellars) with anti-phenomenological realism in French philosophy. Against certain post-analytical streams of thought that have tried to bring together Heidegger and Wittgenstein against scientism and scepticism, he offers a version of eliminative materialism loosely coupled with speculative forms of philosophy.

It is this non-dialectical turn in materialism that I’ve found congenial to my own thought. As Brassier tells us: “The junction of metaphysics and epistemology is marked by the intersection of two threads: the epistemological thread that divides sapience from sentience and the metaphysical thread that distinguishes the reality of the concept from the reality of the object. … For just as epistemology without metaphysics is empty, metaphysics without epistemology is blind.” (T 279)1

It is this fine line or balancing act between the two disciplines that marks the distinction needed to obviate many of the difficulties we face within both the Analytical and Continental traditions. Against grand theories and final narratives that try to fit science into a ‘Theory of Everything’, Brassier wants to do something different: “Science does not need to deny the significance of our evident psychological need for narrative; it just demotes it from its previously foundational metaphysical status to that of an epistemically derivative ‘useful fiction’.” (interview)

As he recently related, he is a “nihilist precisely because I still believe in truth, unlike those whose triumph over nihilism is won at the cost of sacrificing truth. I think that it is possible to understand the meaninglessness of existence, and that this capacity to understand meaning as a regional or bounded phenomenon marks a fundamental progress in cognition.” (Ibid.) The notion of a regional or bounded conception of phenomena is key to this form of epistemic naturalism, which some have called a revisionary naturalism. His thought is aligned with Wilfrid Sellars’s work in that, as he said in correspondence on Being’s Poem, “Sellars is concerned with developing a metaphysical vision in which not only are secondary qualities integrated and their relationship to primary qualities explained, but the articulation between the sensation of the former and the conception of the latter is also accounted for.” It is just here that epistemology and metaphysics touch base with each other without one or the other having some central priority over the other.

1. Elliott, Jane; Attridge, Derek (2012-03-12). Theory After ‘Theory’ (p. 279). Taylor & Francis. Kindle Edition.

Stephen Jay Gould – The Political Side of Science

“This truth involves both a menace and a promise. It shows that the evils arising from the unjust and unequal distribution of wealth, which are becoming more and more apparent as modern civilization goes on, are not incidents of progress, but tendencies which must bring progress to a halt; that they will not cure themselves, but, on the contrary, must, unless their cause is removed, grow greater and greater, until they sweep us back into barbarism by the road every previous civilization has trod.”

– Henry George, Progress and Poverty
 

 Stephen Jay Gould used to love touting that there was no progress in evolution. As he once said: “The fact of evolutionary change through time doesn’t represent progress as we know it. Progress isn’t inevitable. Much of evolution is downward in terms of morphological complexity, rather than upward. We’re not marching toward some greater thing.”

Even though he was an anti-progressivist, Gould was an avid advocate of leftist politics and a supporter of Science for the People, a “magazine for Working Scientists active in the Anti Capitalist Movement”. Gould was born and raised in the borough of Queens in New York City. His father Leonard was a court stenographer, and his mother Eleanor was an artist. Raised in a secular Jewish home, Gould did not formally practice organized religion and preferred to be called an agnostic. Politically, though he “had been brought up by a Marxist father,” he stated that his father’s politics were “very different” from his own. According to Gould, the most influential political book he read was C. Wright Mills’s The Power Elite, as well as the political writings of Noam Chomsky. Gould continued to be exposed to progressive viewpoints on the politicized campus of Antioch College in the early 1960s. In the 1970s Gould joined a left-wing academic organization called “Science for the People.” Throughout his career and writings he spoke out against cultural oppression in all its forms, especially what he saw as pseudoscience in the service of racism and sexism.

In an essay, “Towards a Science for the People,” Bill Zimmerman, Len Radinsky, Mel Rothenberg, and Bart Meyers argue from a Socialist perspective for a new politicization of science, saying that “science is inevitably political, and in the context of contemporary American corporate capitalism, that it contributes greatly to the exploitation and oppression of most of the people both in this country and abroad”. They understand that the difficulties for a scientist reside in the economic funding of the sciences: “Some scientists have recognized this situation and are now participating in nationally coordinated attempts to solve pressing social problems within the existing political-economic system. However, because their work is usually funded and ultimately controlled by the same forces that control basic research, it is questionable what they can accomplish. For example, sociologists hoping to alleviate some of the oppression of ghetto life have worked with urban renewal programs only to find the ultimate priorities of such programs are controlled by the city political machines and local real estate and business interests rather than by the needs of the people directly affected by such programs.”

These radical scientists see little hope in changing the system through effective reform: “Traditional attempts to reform scientific activity, to disentangle it from its more malevolent and vicious applications, have failed. Actions designed to preserve the moral integrity of individuals without addressing themselves to the political and economic system which is at the root of the problem have been ineffective. The ruling class can always replace a Leo Szilard with an Edward Teller. What is needed now is not liberal reform or withdrawal, but a radical attack, a strategy of opposition. Scientific workers must develop ways to put their skills at the service of the people and against the oppressors.”

Gould was a tireless worker against the troubling view of creationism: see McLean vs. Arkansas. Yet one critic, Robert Wright, maintains that Gould played unwittingly into the hands of the Creationists because of his “thinking on the fundamental issue of “directionality,” or “progressivism”—that is, how inclined evolution is (if at all) to build more complex and intelligent animals over time”. In his article The Accidental Creationist, Wright tells us “Gould is not helping the evolutionists against the creationists, and the sooner the evolutionists realize that the better. For, as Maynard Smith has noted, Gould “is giving nonbiologists a largely false picture of the state of evolutionary theory.” Gould was a long-time promoter of “punctuated equilibria” as the main engine of evolution, rather than the orthodox Darwinian emphasis on “natural selection”. Most Darwinists see Gould as a popularizer who seems to have a lot of authority in the eyes of the reading public, but who is considered out of touch with the mainstream views within his own scientific community. As Daniel C. Dennett, a defender of the orthodox Darwinian stance, states it:

“What Darwin discovered, I claim, is that evolution is ultimately an algorithmic process — a blind but amazingly effective sorting process that gradually produces all the wonders of nature. This view is reductionist only in the sense that it says there are no miracles. No skyhooks. All the lifting done by evolution over the eons has been done by nonmiraculous, local lifting devices — cranes. Steve (Gould) still hankers after skyhooks. He’s always on the lookout for a skyhook — a phenomenon that’s inexplicable from the standpoint of what he calls ultra-Darwinism or hyper-Darwinism. Over the years, the two themes he has most often mentioned are “gradualism” and “pervasive adaptation.” He sees these as tied to the idea of progress — the idea that evolution is a process that inexorably makes the world of nature globally and locally better, by some uniform measure.” 

But Gould argued against those like Daniel Dennett who suggest that evolutionary development is driven by a purpose – that there is a guiding hand, as it were, in evolutionary development – an inevitable progress up a ‘ladder’ from lower to higher life forms and, finally, to homo sapiens. Natural selection itself does not imply a progression from lower to higher life forms, argues Gould: “Life is a ramifying bush with millions of branches, not a ladder. Darwinism is a theory of local adaptation to changing environments, not a tale of inevitable progress. ‘After long reflection’, Darwin wrote, ‘I cannot avoid the conviction that no innate tendency to progressive development exists’.” (An Urchin in the Storm, p211)

One of Gould’s recurrent themes was life’s ‘contingency’. He does not deny that natural selection leads to a greater complexity of life forms. But the developing complexity of life, Gould maintains, is only a by-product ‘incidental’ to evolution and not necessary or inevitable. And complex creatures represent only a tiny proportion of the whole.

Whether we agree with Gould’s science or not we can all agree that he tried to fight the good fight, give people hope, to create a body of work that would defend us against ourselves. As one pundit, David Prindle, Ph.D., argues, “Stephen Jay Gould may teach us that the best political theory is not political theory per se but, rather, science expanded to its philosophical potential. A grand theory of life may be a better starting point for addressing legitimacy, justice, and equality than is any set of explicitly political assumptions.” (Stephen J. Gould as political theorist)

 

Books of Interest

Just discovered three books of interest.

1. Theory After ‘Theory’. Editors Elliott, Jane; Attridge, Derek. Taylor & Francis. Routledge (2011)

This volume has essays by Brian Massumi, Ray Brassier, Peter Hallward, Eugene Thacker, Bernard Stiegler and others. The editors, speaking of the late demise of theory, tell us that “for some, ‘Theory’ was already passing with the end of the 1970s, whereas for others, the 1980s and early 1990s represent the height of ‘Theory’, in which feminist, postcolonial, queer and critical race theorists made their most significant contributions. Since the mid-1990s, the story goes, theory has continued to diversify, drawing on the work of a range of new figures and examining a host of new archives and arenas, but its newer incarnations offer at most a kind of afterlife of the once vital object that was ‘Theory’, a diluted form lacking in both intellectual substance and institutional prominence. As a result, conversations regarding the status of theory have become akin to an ongoing wake, in which participants debate the merits of the deceased and consider the possibilities for a resurrection desired by some and feared by others.”

Brian Massumi offers a political ensemble: “The present tense where memory and perception come disjunctively together is the time of the event that is like a lost between of the towers and their ruins, an interval in which life was suspended for an instantaneous duration that was more like a stilled eternity than a passing present, comprehending reflection gone AWOL.”

Ray Brassier tells us that “the question ‘What is real?’ stands at the crossroads of metaphysics and epistemology. More exactly, it marks the juncture of metaphysics and epistemology with the seal of conceptual representation.”

Peter Hallward seeks a politics of movement and mobilization: “Recent examples of the sort of popular will that I have in mind include the determination, assembled by South Africa’s United Democratic Front, to overthrow an apartheid based on culture and race, or the mobilization of Haiti’s Lavalas to confront an apartheid based on privilege and class. Conditioned by the specific strategic constraints that structure a particular situation, such mobilizations test the truth expressed in the old cliché, ‘where there’s a will there’s a way’. Or to adapt Antonio Machado’s less prosaic phrase, taken up as a motto by Paulo Freire: the partisans of such mobilizations assume that ‘there is no way, we make the way by walking it’ (Machado 1978).”

Eugene Thacker delves into the debates within the biopolitical spectrum:  “Today, in an era of biopolitics, it seems that life is everywhere at stake, and yet it is nowhere the same. The question of how and whether to value life is at the core of contemporary debates over bare life and the state of exception.”

2. F. Vander Valk. Essays on Neuroscience and Political Theory: Thinking the Body Politic. Taylor & Francis. Routledge (2012)

There is an interesting essay by Adrian Johnston, author of several excellent works, especially his books on Žižek and Badiou: Žižek’s Ontology: A Transcendental Materialist Theory of Subjectivity (2008) and Badiou, Žižek, and Political Transformations: The Cadence of Change (2009). His essay in this book, Toward a Grand Neuropolitics – or, Why I am Not an Immanent Naturalist or Vital Materialist, delves into the philosophy of “immanent naturalism” as typified by William Connolly, whose stance in his books Neuropolitics and A World of Becoming offers Johnston grist for the mill. Johnston mentions Jane Bennett’s new work Vibrant Matter as well. I’ve only been able to do a cursory scan of this and other essays within this excellent volume, but am intrigued by the subject already.

As Frank Vander Valk says in the introduction to the volume: “One of the consequences of the claims about the revolutionary nature of neuroscience has been that established concepts, ideas, and texts from political theory have not been sufficiently integrated into the emerging discussion of social (and political) neuroscience. This collection addresses that problem by explicitly connecting neuroscience research to major figures in the history of political theory (e.g. Aristotle, Hobbes) and specific issues in the field (e.g. deliberative democracy, gender, subjectivity). These are important first steps, not only in working through what neuroscience means (and does not mean!) for political theory, but also for providing examples of the contribution that political theorists can make to understanding the richness of biocultural entities.”

3. A Leftist Ontology: Beyond Relativism and Identity Politics. Editor Carsten Strathausen. (2009)

William Connolly, whom we met in the previous volume, tells us in the introduction to this grouping of philosophical discussions by George Kateb, Charles Taylor, and Judith Butler, among others, that although each of them may differ over critical stances within leftist political and philosophical traditions, they all converge on three important aspects of the ontological dimension:

First, each embraces a positive ontological orientation, as when Taylor focuses on the complexity of human embodiment, supports a fugitive philosophy of transcendence, seeks to become more closely attuned to a final moral source that cannot be known in a classical epistemic way, and defines ethical life in terms of a plastic set of intrinsic purposes to be pursued rather than a set of universal laws to be obeyed. Each of the others takes different stances on the same issues. Second, each theorist discerns a loose set of relations between the ontology adopted, the ethical-political priorities endorsed, and specific dangers and possibilities to be identified. None suggests that an ontology determines a political stance, but all contend that it filters into politics, so that it would be a mistake to say that ontology has no influence on politics. Taylor’s faith in the grace of a loving God, for instance, enters into his politics, even if the element of mystery he discerns in divinity means that he does not delineate the tight set of moral commands presented by Pope Benedict XVI and a large section of the evangelical movement in America. Third, each figure acknowledges the ontology he or she embraces to be susceptible to reflective and comparative defense; but most conclude that it is unlikely to be established either by such airtight arguments or universal recognition that it rules every other possibility out of court. Each party-though perhaps to different degrees-is thus a pluralist, seeking to bring their onto-orientation into the public realm while recoiling back on tensions and uncertainties in it enough to invite open-textured negotiations with others. Each advances a bicameral orientation to citizenship, seeking to give his or her own orientation public presence while conceding a place to others. Discernible in the differences between them is the common appreciation of a paradoxical element in politics.”

Is the Sun an Autopoietic System?

The Sun was formed about 4.57 billion years ago from the collapse of part of a giant molecular cloud that consisted mostly of hydrogen and helium and which probably gave birth to many other stars. This age is estimated using computer models of stellar evolution and through nucleocosmochronology.  The result is consistent with the radiometric date of the oldest Solar System material, at 4.567 billion years ago.  Studies of ancient meteorites reveal traces of stable daughter nuclei of short-lived isotopes, such as iron-60, that form only in exploding, short-lived stars. This indicates that one or more supernovae must have occurred near the location where the Sun formed. A shock wave from a nearby supernova would have triggered the formation of the Sun by compressing the gases within the molecular cloud, and causing certain regions to collapse under their own gravity.  As one fragment of the cloud collapsed it also began to rotate due to conservation of angular momentum and heat up with the increasing pressure. Much of the mass became concentrated in the center, while the rest flattened out into a disk which would become the planets and other solar system bodies. Gravity and pressure within the core of the cloud generated a lot of heat as it accreted more gas from the surrounding disk, eventually triggering nuclear fusion. Thus, our Sun was born.

– from Sun – Wikipedia

Is the Sun an Autopoietic System?

Autopoiesis means self-production, and an autopoietic system is a system that produces itself. The concept of “autopoiesis” was originally proposed by the biologists Humberto Maturana and Francisco Varela, and the term is coined from the Greek words “auto” (self) and “poiesis” (creation or production) (Maturana & Varela 1972, Varela et al. 1974, Maturana & Varela 1980; 1987).

“An autopoietic machine is a machine organized (defined as a unity) as a network of processes of production (transformation and destruction) of components that produces the components which: (i) through their interactions and transformations continuously regenerate and realize the network of processes (relations) that produced them; and (ii) constitute it (the machine) as a concrete unity in the space in which they (the components) exist by specifying the topological domain of its realization as such a network. It follows that an autopoietic machine continuously generates and specifies its own organization through its operation as a system of production of its own components, and does this in an endless turnover of components under conditions of continuous perturbations and compensation of perturbations.” (Maturana & Varela 1980; p.79)

In short, an autopoietic system is a unity whose organization is defined by a particular network of production processes of elements, not by the components themselves or their static relations. Summarizing the concept of autopoiesis, the system has three fundamental features: (1) elements as momentary events, (2) reproduction of the system’s boundary, and (3) constitution of elements by the system.

The crucial point of autopoiesis in systems theory is the shift in how elements are viewed: from substances to momentary events. Conventionally, the elements of a system are assumed to persist, for example cells in a living system or actors in a social system. In autopoietic systems theory, however, elements are momentary events that have no duration: they disappear as soon as they are realized. Consequently, the system must continually produce new elements in order to keep itself in existence. The boundary of the system is thus determined circularly by the production of its elements, and this is what makes the system autopoietic.
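
A minimal toy sketch of this circular production, purely my own illustration under loose assumptions (the function and its parameters are hypothetical, not anything taken from Maturana, Varela, or Luhmann): elements vanish at every tick, and the system persists only as long as the current elements manage to produce the next ones.

    import random

    def autopoietic_run(initial_elements=10, ticks=20, seed=1):
        """Toy model: elements are momentary events that vanish each tick.

        The 'system' is nothing over and above the ongoing production of
        new elements by the current ones; if production fails, it dissolves.
        """
        rng = random.Random(seed)
        elements = initial_elements
        for t in range(ticks):
            # Each current element may produce 0, 1, or 2 successor events.
            produced = sum(rng.choice((0, 1, 2)) for _ in range(elements))
            elements = produced  # the old elements disappear entirely
            print(f"tick {t:2d}: {elements} elements")
            if elements == 0:
                print("production stopped: the system has dissolved")
                break
        return elements

    if __name__ == "__main__":
        autopoietic_run()

The point of the toy is only that the system’s boundary (here, nothing more than the count of live elements) is not maintained by any persisting substance; it is regenerated, or not, at each step.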

In this sense, an autopoietic system does not emerge “bottom-up”, because the concept of bottom-up assumes that elements are given before they emerge as a system. Autopoietic systems intrinsically imply a circular relation between the system and its elements. As Niklas Luhmann once related:

“Whether the unity of an element should be explained as emergence ‘from below’ or as constitution ‘from above’ seems to be a matter of theoretical dispute. We opt decisively for the latter. Elements are elements only for the system that employs them as units and they are such only through this system. This is formulated in the concept of autopoiesis.” (Luhmann 1984; p.22)

In this sense I believe that the Sun and all stars are indeed autopoietic systems.

Randall Honold Attacks E.O. Wilson as Epistemologist

“I think what you wrote is an exclamation of triumph. You had lived out your passion to travel far, to discover and embrace novel styles of visual art, to ask the questions in a new way, and from all that create an authentically original work. In this sense your career is one for the ages; it was not paid out in vain. In our own time, by bringing rational analysis and art together and joining science and humanities in partnership, we have drawn closer to the answers you sought…”

              – Edward O. Wilson, The Social Conquest of Earth, on Paul Gauguin

“What would an OOO critique of Wilson look like, then? What alternative, maybe better, understandings of evolution and climate change could OOO give us?”

              – Randall Honold, E.O. Wilson, Climate Change, and OOO (Part One)

In a post on Environmental Critique, Randall Honold related an outburst he’d made in a recent meeting in which E.O. Wilson’s new work, The Social Conquest of Earth, was discussed, saying: “The problem with Wilson is he’s an unrepentant epistemologist!” Is he? Consider what Wilson says about consciousness and its pretensions: “Consciousness, having evolved over millions of years of life-and-death struggle, and moreover because of that struggle, was not designed for self-examination. It was designed for survival and reproduction. Conscious thought is driven by emotion; to the purpose of survival and reproduction, it is ultimately and wholly committed” (SCE 203-207 KL*). Yes, I will agree that Wilson is an affirmer of Science above other forms of knowledge, that he is no lover of philosophy, and that he sees in it part of the problem rather than the solution. But does the fact that Wilson supports an unrepentant form of scientific humanism and sponsors a return to the purity of Reason and a New Enlightenment lead us to his underlying epistemology? If not, then what does? Let’s find out….

Wilson’s approach is to bring together the best we have in both science and the humanities. It is a sort of Sellarsian move of bringing the manifest and scientific images together in a unified picture of the universe and our place in it. He believes we are tottering on the edge of economic, climatic, social, and planetary collapse. To solve the problems we face, he tells us, we will need to bring together information from multiple disciplines, ranging from molecular genetics, neuroscience, and evolutionary biology to archaeology, ecology, social psychology, and history (SCE 230-231). I would only add art and philosophy, and even love, into the mix. For Wilson the Church of Science holds the Truth and is the keeper of Knowledge. Yet for many of us Science has no corner on truth; it is no monolithic tower of knowledge dictating to the rest of the world the one true version of the Gospel of Truth. Science is only one of many conditions of philosophy, as Badiou has stated over and over.

Scientists are not guileless observers, patiently recording the facts that nature places before them, but crafty cultural operators, manipulating vast technical resources to precipitate artificial new phenomena, and then networking like mad through the production, distribution, and exchange of masses of words, diagrams, and statistics. They negotiate, in short, not with the objective world but with each other. As Jonathan Rée remarked recently, “the norms of science, like those of morality or politics, are ideals rather than realities, and pointing out that we do not always live up to them is not the same as telling us to stop trying.” (Jonathan Rée: The Cult of Science) Paul Feyerabend, in his book The Tyranny of Science, once argued that science refers not to a single entity but rather to a complex and ever-changing array of practices, theories, values, and institutions. Therefore to speak of science in the singular, let alone to describe it as tyrannous, is a category error.2 In other words, there is no monolithic substance or object we can call science; there is instead an interrelated set of disciplines and practices, a network of programs, institutions, and learning processes and methods that are all related within an epistemic framework of knowledge we call the sciences.

Continue reading

Alternate Life Forms: The Shadow World of our Biosphere

NASA released information regarding alternate life forms that eerily coexist with us on planet Earth.

As New Scientist describes it:

We could be witnessing the first signs of a “shadow biosphere” – a parallel form of life on Earth with a different biochemistry to all others. Bacteria that grow without phosphorus, one of the six chemical elements thought to be essential for life, have been isolated from California’s Mono Lake. Instead of phosphorus, the bacteria substitute the deadly poison arsenic.

But this is nothing new…

Arsenic is chemically similar to phosphorus, and while it is poisonous for most Earth life, it is incorporated into the biochemistry of some organisms. Some marine algae incorporate arsenic into complex organic molecules such as arsenosugars and arsenobetaines. Fungi and bacteria can produce volatile methylated arsenic compounds. Arsenate reduction and arsenite oxidation have been observed in microbes (Chrysiogenes arsenatis). Additionally, some prokaryotes can use arsenate as a terminal electron acceptor during anaerobic growth and some can utilize arsenite as an electron donor to generate energy. It has been speculated that the earliest life on Earth may have used arsenic in place of phosphorus in the backbone of its DNA.

Continue reading