Friday, June 5, 2009

Postmodern Scientific Inquiry

The world of the policy analyst doesn't often coincide with the world of broad trends in intellectual history. As such, words that I know appear overwrought to Evan - such as "postmodern" - can be quite exciting for me when I get the chance to re-engage them. It brings me back to the halcyon days of studying sociology at William and Mary, when I was able to put down the math and the economics and dive into "theory". Recently I ran into postmodernism twice in the same day: first, at the Postmodern Conservative blog, and second, in an old book on postmodern literary criticism I discovered at my mother-in-law's house, which had belonged to my late grandfather-in-law, a University of Chicago classicist. The Postmodern Conservative ("pomocon") blog was a new application of postmodernism for me. The use of postmodernism in literary criticism is obviously an old (the oldest?) endeavor.

It raised a question for me - and perhaps other people have already engaged this - about what "postmodern science" would look like. Postmodern humanities, literary criticism, and philosophy are quite clearly delineated. Certain social sciences have strong "postmodern" trends; in sociology this paradigm is more often referred to as "poststructuralism". But the natural sciences are grounded in empiricism and positivism, and therefore seem to me to be far more closely tied to modernism. It has been suggested that the future will increasingly privilege science, but is there an inherent contradiction if the future also continues to displace modernism with some variant of postmodernism? Will science survive in a postmodern future, or will it have to change to survive?

I suppose from a relatively sophomoric perspective on intellectual history (i.e., mine), postmodernism is most starkly characterized by three qualities: (1.) self-referentialism, (2.) deconstruction, and (3.) a residual "skepticism of modernism" category. Instead of concluding that science is doomed to irrelevance in a postmodern future (and clearly, the future may not be postmodern at all), I want to assume that it will survive and highlight some ways in which the natural sciences are already adapting to these three foundational qualities of postmodern inquiry.


Self-Referentialism

Postmodernism is often dismissed as "relativism" because of its insistence on acknowledging self-referentialism: simply put, that our reality is often contextual. While this can clearly degenerate into a crass relativism, a self-referential outlook is also capable of illuminating individual biases. The application of self-referentialism to art and literary criticism is obvious; everyone experiences these things differently. The application to the social sciences has also occurred smoothly (albeit somewhat later): sociologists speak of the "social construction of reality" and the dependence of our interpretation of reality on the social environment in which we develop. While these versions of self-referentialism are clear, self-referentialism in the natural sciences may be less evident. One development in physics that I've always found interesting is string theory, particularly M-theory, which posits 11 dimensions of spacetime and the coexistence of multiple universes, each defined by the vibrations of sub-subatomic objects known as "strings". While TV shows treat multiple dimensions as paranormal, serious physicists understand that the existence of multi-dimensional spacetime opens the theoretical possibility of time travel and parallel universes. The first link I provide is to a synopsis of the "Fringe" series on Fox, describing its foray into multidimensional reality in the season finale. It all sounds fantastic and fictional until you listen to the second link I provide, to Dr. Michio Kaku of the City University of New York. His description of string theory is eerily similar to that of the fictional Dr. Walter Bishop of "Fringe". As I understand it, string theory is definitive of physics today in much the same way that evolution undergirds all of modern biology.
String theory also forces us to acknowledge that our everyday understanding of reality is inseparable from the fact that we are confined to four-dimensional spacetime and cannot conceive of the other dimensions that may define unknown worlds beyond our own. Sociologists can talk about how our social environment shapes our understanding of reality, and art critics can talk about how an understanding of art depends on contexts, tastes, and personal experiences, but these sorts of self-referentialism pale in comparison to string theory's assertion that there are whole dimensions beyond the grasp of our minds.


Deconstruction

As I understand it, deconstruction in postmodernism emerged with Heidegger's engagement with pre-Platonic understandings of "being", and is more famously associated with Derrida's method of textual criticism. Derrida's deconstruction essentially sought to break a problem (or a text) down into its constituent parts and foundational assumptions. These constituent parts are often inherently contradictory and may reveal an underlying instability or conflict in a superficially coherent text. Berger and Luckmann connect this method of understanding "knowledge" with the contextualism of self-referentialism by reversing Derrida's project: they explain how various social environments "construct" a fundamentally contingent reality that people experience as they develop. Deconstruction can therefore be understood as the skeptical unearthing of the contextual contingencies touched on above.

Once again I feel that although on the surface the natural sciences seem firmly tied to a superficially tangible "reality", they have actually been quite willing to deconstruct their understanding of phenomena. One example that stands out in my mind is Richard Dawkins's selfish gene theory, which has revolutionized "Darwinian" evolution by pointing out that reproduction by organisms is really a sideshow to the reproduction of genetic material, which occurs considerably less visibly, not to mention less salaciously. The selfish gene fits Derrida's understanding of deconstruction beautifully. Seeming contradictions such as intragenomic conflict, "junk DNA", the sacrifice of the vast majority of a beehive for the reproductive prospects of a single female, female cannibalism, etc. are all explained by gene-based rather than organism-based evolution. As far as we know, we can't take this process any further than the gene, because the gene is the most fundamental reproducing biological unit.
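The gene-level accounting behind these "seeming contradictions" can be sketched with Hamilton's rule, r*B > C, which says a self-sacrificing trait can spread when relatedness times benefit to kin exceeds the cost to the actor. The numbers below are purely illustrative (the 0.75 figure is the standard relatedness between full sisters in haplodiploid insects like bees); this is a toy calculation, not a model from Dawkins's book:

```python
# Toy illustration of gene-level (rather than organism-level) bookkeeping:
# Hamilton's rule says an altruistic trait is favored when r*B > C, where
# r = genetic relatedness to the beneficiary, B = reproductive benefit
# conferred, C = reproductive cost to the altruist.

def gene_favored(relatedness, benefit, cost):
    """Hamilton's rule: altruism spreads at the gene level when r*B > C."""
    return relatedness * benefit > cost

# A worker bee forgoing her own reproduction (cost ~ 1 offspring-equivalent)
# looks paradoxical for the organism, but full sisters in haplodiploid bees
# share ~0.75 of their genes. If her labor yields 2 extra sister-equivalents:
print(gene_favored(0.75, 2.0, 1.0))   # 0.75 * 2.0 = 1.5 > 1.0 -> True

# The same sacrifice directed at distant kin (r = 0.25) would not pay:
print(gene_favored(0.25, 2.0, 1.0))   # 0.25 * 2.0 = 0.5 > 1.0 -> False
```

The point of the sketch is just that once the gene, not the organism, is the unit whose reproduction is tallied, the hive's "sacrifice" stops being a contradiction.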

I include a graphic of what is known as a Mandelbrot Set in this section as a teaser for further exploration of the relevance of complex adaptive systems for the practical application of deconstructionism in the natural (and social) sciences. Suffice it to say that complex adaptive systems research explores how the non-linear interaction of very, very simple processes can produce an "emergent" pattern that is not only surprisingly complex, but also surprisingly robust and ordered. Breaking down our complex reality into these (1.) non-linear interactions, and (2.) simple foundational processes certainly seems to me to be an endeavor of which Derrida would approve.
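To make the Mandelbrot teaser concrete: the entire famously intricate set is generated by iterating a single simple non-linear rule, z → z² + c, and asking whether the orbit stays bounded. A minimal sketch in Python (a coarse ASCII rendering, not a serious visualization):

```python
# The Mandelbrot set as an emergent pattern: one simple non-linear rule,
# z -> z**2 + c, iterated from z = 0, determines membership for each c.

def escape_time(c, max_iter=50):
    """Iterations before |z| exceeds 2 (divergence), or max_iter if bounded."""
    z = 0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:  # once |z| > 2 the orbit is guaranteed to diverge
            return n
    return max_iter

def render(width=60, height=24, max_iter=50):
    """Coarse ASCII view: '#' marks points whose orbits never escape."""
    rows = []
    for j in range(height):
        y = 1.2 - 2.4 * j / (height - 1)
        row = ""
        for i in range(width):
            x = -2.0 + 3.0 * i / (width - 1)
            row += "#" if escape_time(complex(x, y), max_iter) == max_iter else "."
        rows.append(row)
    return "\n".join(rows)

print(render())
```

Nothing in the three-line update rule hints at the boundary's infinite filigree; the complexity is entirely emergent from the non-linear iteration, which is exactly the phenomenon complex adaptive systems research generalizes.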

Skepticism of Modernism

I'm not entirely sure whether the renowned Kurt Gödel would appreciate my inclusion of his visage in this section or not. Gödel is famous for his "incompleteness theorem", which in a nutshell says that any consistent, finite set of axioms rich enough to express basic arithmetic cannot prove every true statement expressible in its own language. The incompleteness theorem has huge implications both for falsification in general (the foundational empirical paradigm of science under modernism) and for the "theories of everything" pursued by scientists like Hawking and Kaku (granted, they both know this, of course!). The incompleteness theorem's relationship with falsification is probably best explored in the context of what's known as the "liar's paradox". I'll take the liberty of expanding on the basic liar's paradox to make it applicable to the problem of falsifiability, and hope that I don't jumble things too badly:

It is a conceptually trivial task to evaluate my claim about a random person on the street that "that man is a liar". We can investigate the congruence of the man's statements with reality and ascertain whether my statement about him was true or false. The falsification criterion, presumably, is that if he has told no lies he is not a liar. Since he has certainly made only a finite number of statements over the course of his lifetime, it is theoretically possible to catalogue and evaluate them and apply certain decision rules to determine whether my statement is true or false. This is an admittedly weird example of the everyday task of science. But what if that man were to approach us and say "I am lying"? That statement cannot simultaneously be true and consistent with any of the decision rules (axioms) that we have identified for the purpose of falsification. His statement can only be true if he is in fact lying. But if he is in fact lying, then it is not true that he is lying!
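The contrast between the two cases can be made mechanical with a few lines of Python. The ordinary claim "that man is a liar" is decidable by checking his (finite) statements against our decision rule; the self-referential "I am lying" admits no consistent truth value at all, which a brute-force search over both possibilities confirms:

```python
# Case 1: "that man is a liar" - decidable by an explicit decision rule.
# Represent each of his statements by whether it was true (True) or a lie (False).
def is_liar(statements):
    """Decision rule from the text: he is a liar iff he has told at least one lie."""
    return any(not s for s in statements)

print(is_liar([True, False, True]))  # one lie on record -> True, he is a liar
print(is_liar([True, True]))         # no lies on record -> False, not a liar

# Case 2: "I am lying". If v is the statement's truth value, the statement
# asserts its own falsehood, so consistency requires v == (not v).
# Brute force over both possible assignments:
consistent = [v for v in (True, False) if v == (not v)]
print(consistent)  # [] - neither True nor False is consistent: the paradox
```

The empty list is the whole point: the falsification machinery that handles case 1 effortlessly has no consistent answer to return in case 2.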

I personally haven't completely wrapped my mind around the implications of the incompleteness theorem, but I do know that it did irreparable damage to the strong claims of positivists like Russell to have identified complete systems of mathematical truth. While practical applications of the incompleteness theorem aren't presented here, it certainly raises doubts about the limits of falsification as a vehicle for scientific inquiry if indeed some theories cannot be falsified without revealing the axioms used for falsification to be inconsistent themselves. Falsification has already been challenged by Thomas Kuhn, and we may find that a postmodern science will have shed its Popperian pretensions for a more Kuhnian outlook. As a statistician of sorts myself, I don't say this lightly, and I also don't think we'll totally abandon falsification - but its shortcomings may stand out in sharper relief.

As a final note on the problems of modernity-as-a-system-of-positivist-falsification, I'd note that George Soros in finance (and certainly others in other fields) more explicitly connects the practice of self-reference with the inherent incompleteness of any axioms used to explain a system that incorporates oneself. Indeed, the implications of Gödel will probably be much broader in the social sciences (where the scientist is part and parcel of the object of analysis) than in the natural sciences. As Keynes said: "The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else." But natural scientists can still be more circumspect about the validity of their methods of falsification, particularly when the theory being falsified involves multidimensional space that our axioms may not be able to grapple with. (Obviously it's trivial for our math to grapple with multidimensional space - students learn how to do that in middle school! But in the natural sciences, the axioms of falsification involve broader empirical techniques than mathematics alone, and it is these that may not be up to the task.)

Concluding Thoughts

Even over the course of writing this post I've become more optimistic, not only about the fact that the lessons of postmodernism are not lost on natural scientists, but that the future will provide an intriguing and fruitful synthesis of modern scientific foundations and postmodern deconstructions of those foundations. Indeed, many of the greatest discoveries of the past century have already embraced this synthesis.
