Stevens, Scalia, and the hermeneutics of Scripture

While my parents were in town, my father mentioned attending an event where Justices Stevens and Scalia held a discussion on interpretation of the Constitution. Scalia’s position, at least as it was related to me, was that interpretation of the Constitution begins with an investigation of the original intent of the authors of the law, whether the Constitution itself or the various court decisions through the years. For instance, if a case involves an early court decision, it is important to begin by examining all aspects of that case, including the notes taken by the clerks in private sessions. In other words, Scalia would subscribe (loosely) to the first hermeneutic principle proposed by Fee and Stuart in their influential book How to Read the Bible for All Its Worth: a passage cannot mean to us what it could not have meant to its original audience. The current meaning of the passage rests foundationally on what the passage originally meant, and thus to properly interpret a passage our first move is to attempt to determine the passage’s original meaning. The first question of scriptural analysis is one of original meaning: the conditions and intent of the author in its original context.

Stevens, on the other hand, wants to suggest that our accounts of the original intent are unreliable, or at least uncertain. How do we know the notes taken by the 22-year-old scribe accurately reflect the original intent of the Justices? How can we possibly know what nine guys 200 years ago meant when they were writing something? Going back and speculating on original intent seems a difficult, if not impossible, enterprise. There is too much of a gap there, and even if we could be certain of their intent, what if it doesn’t really apply today? How do we know that what justices thought and how they interpreted law 200 years ago is still a valid interpretation today? Furthermore, how do we know the intent of the original founders jibes with our current reality? They seemed to think non-whites were only three-fifths of a person, and that Native Americans weren’t people at all. This seems not just old-fashioned today, but plain wrong. We believe today, at least in theory, that the ideals enshrined in the Declaration of Independence should apply to all people – not just white males. In such cases, how far can we really take the intent of the original framers or Justices?

The disagreement, then, is one of hermeneutic principles. What approach do we take when we are trying to interpret a document written in a particular time period, context, and culture that isn’t our own? It isn’t as much a disagreement in results as it is one of method. Where do we even start when we are trying to figure out what something means *for us*? It doesn’t seem to be a terribly clear-cut question.

This doesn’t only apply to the Constitution, of course. For my purposes, it’s much more interesting in the approach different people take to Scripture. The approach taken by many churches today rests firmly with Stuart and Fee (and Scalia) – that first of all a passage cannot mean to us what it could not have meant to its original audience. If we take this as a starting point, though, it seems to leave us with some difficulties. First, the tools for getting at what the text meant “back then” are historical, not theological. In other words, to figure out what the passage meant, we need primarily to be good historians or anthropologists, rather than good theologians. In some sense, this makes theology subordinate to history, and, to borrow a criticism from Barth, makes the starting point of theology man instead of God. We can see this tension at work in the Quest for the Historical Jesus and the reactions to it. The tools for getting at the Historical Jesus are history and literary criticism, not theology. As a result (and because he tends to look a lot like the people who “created” him), the Historical Jesus turns out to be a pretty bad place to start a theological journey.

Second, as Stevens points out, our attempt to understand the actual intent and original context of the author and audience is, at best, speculative and uncertain. The two-thousand-year gap is a big one to close, and while we can make guesses about the original intent, we are so far removed culturally from the modern Near East, let alone the ancient Near East, that our statements about the situation of the church in a particular city at a particular time are all a kind of fiction. This seems especially true because we generally attempt to read our source document (say, Ephesians) to get clues about what the cultural and socioeconomic context of the church was, then apply those cultural and socioeconomic realities to the source document as a lens to determine what the text “meant”. The unfortunate reality is that we don’t have a lot of extra-biblical sources that tell us what the church in Ephesus was like independent of Ephesians, and thus our socio-cultural reasoning about particular New Testament churches tends to be circular.

Finally, the attempt to limit the meaning of the text to its original context seems to deny, in a sense, that the word of God is “living and active” – that it has any relevance for today. Christianity has proven remarkably resilient, surviving and even thriving in contexts quite different from its origin. A large part of this, it can be argued, is that the teachings of the Bible can be painted and repainted in new contexts while still remaining relevant. While Stuart and Fee would certainly not argue that Scripture could never speak to a subject outside its original context, I contend that their first hermeneutic principle is nonetheless highly restrictive, and when taken seriously it effectively limits the interpretation of Scripture to narrow historical contexts that have little relevance to today. Modern notions of egalitarianism, capitalism, and democratic government were completely outside the scope of the patriarchal, feudal, authoritarian structures of the day – structures that formed the basis for much of the original context of Scripture. If we accept Fee and Stuart’s first principle and apply it rigorously, it seems the scope of Scripture, and the critique it can bring to bear, is highly limited.

In short, I think as we approach Scripture we should be more open to the view of Stevens – that we should start first with what Scripture means and how it speaks to us today, and then go look at what it meant as a secondary enterprise. This is a starting point that makes many people (including me) a bit uncomfortable, because it seems to endorse the Liberal principle that Scripture can be interpreted however it needs to be in light of our current context, rather than being grounded by a guiding, universal context. However, if we believe both that God is the same yesterday, today and forever, and that his words speak to us where we are, I think we must believe that a community which openly and honestly submits itself to Scripture can faithfully follow Christ without needing to first interpret scripture across a gap that may well be intractable.

Sony NEX-3 – First Thoughts

Christmas came early in the form of my Sony NEX-3. One of the recent trends in cameras has been a new class of small, light, mirrorless interchangeable-lens bodies that bridge the gap between point-and-shoots and larger SLRs. Panasonic and Olympus have a variety of bodies in the Micro Four Thirds mount, Samsung has the new NX100, and Sony has come out with the NEX-3 and NEX-5. After some deliberation, I decided one of the smaller cameras would be a nice addition to my gear, and that I would get significantly more use out of it than a smoker, at least between now and my birthday. At any rate, I placed an order for the NEX-3, and thanks to Amazon Prime it showed up this morning. Fantastic.

So far, it’s a definite case of the good, the bad and the ugly. Thoughts and pictures below.

The Good:

There’s a lot to like about this camera. Sony has managed to cram an APS-C sensor into a body almost the same size as a typical point and shoot. The sensor isn’t bad either – 14MP, well-controlled noise at high ISO, and nice dynamic range. Several of the shots below were taken handheld at ISO 6400. Whatever else it is, the NEX-3 is an impressive technical achievement.

The Bad:

Lenses. There are currently only 3 of them for the entirely new E mount, and none of them are terribly good. The 18-55 kit lens has quite a bit of distortion (there is a Lightroom profile, so less of a concern), the 16mm pancake is interesting, but too wide to be useful as a walkaround lens, and the 18-200 is amazingly expensive and currently unavailable. To be fair, I’m used to shooting with lenses that cost 2-3 times what this camera kit does, but the kit lens is far from the best piece of glass I’ve ever used. Sony has an aggressive roadmap for the E mount, but for now, there’s a severe shortage of good glass.

The Ugly:

The user interface. People have commented on this quite a bit, and I won’t rehash it here, but I will say it’s not as bad as reported. There are some definite awkward moments while using the camera, but it’s certainly not unusable. The battery life leaves quite a bit to be desired – if you were going to be using it all day long you’d probably need a spare. The autofocus is generally good, but has moments where it just doesn’t get it. Metering can be a bit poor, though again I’m used to Nikon’s class-leading meter.

I’ll have more thoughts in the coming days, especially after I get my M39 adapter and throw some rangefinder lenses on the body.

May you approach God with an open spirit…

May you approach God with an open spirit
not seeking to disguise your face,
or cover the nakedness of your soul.

May you know God
not as dispassionate object,
a thing to be mastered and controlled,
but as loving Father,
incarnate Son,
indwelling Spirit,
three persons,
one God.

And may your heart be opened
not only to Heaven,
but every person you meet,
each encounter a revelation,
new wisdom
new mystery
unending unveiling
face to face with the Eternal Thou.

Type 1 and Type 2 Errors of Doctrine

Dr. Richard Beck recently had a couple of posts on his blog regarding “The Theology of Type 1 and Type 2 Errors”, specifically dealing with the ideas of “saved” and “lost”. His second post expanded on the (I think) interesting idea that the disagreements we have as Christians are fundamentally disagreements about what God is like. Both of these posts are rather interesting, and they got me thinking about the idea of Type 1 and Type 2 errors in terms of things like doctrine.

For those of you who aren’t statisticians or scientists dealing with automated classification systems, Type 1 and Type 2 errors are specific terms we use when talking about the kinds of errors we can make when classifying or predicting events. Because I deal with classification more than I do with statistics per se, I tend to think of Type 1 and Type 2 errors in the slightly different but related vocabulary of “false positives” and “false negatives”. Simply put, a false positive occurs when we declare something to be true when it is in fact false, or say something happened when in fact it did not. A false negative occurs when we incorrectly say something is false when it was really true, or that nothing happened when in fact something did.
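The mapping between the two vocabularies is mechanical enough that it can be written down in a few lines. Here is a minimal sketch (the function name and labels are my own, not standard library code) that names the outcome of a single yes/no decision given what we declared and what was actually true:

```python
def classify_outcome(predicted: bool, actual: bool) -> str:
    """Name the outcome of a single yes/no decision."""
    if predicted and actual:
        return "true positive"
    if predicted and not actual:
        return "false positive"   # Type 1 error: we said yes, reality said no
    if not predicted and actual:
        return "false negative"   # Type 2 error: we said no, reality said yes
    return "true negative"

print(classify_outcome(True, False))   # false positive (a Type 1 error)
print(classify_outcome(False, True))   # false negative (a Type 2 error)
```

The two “false” branches are the two error types; the two “true” branches are the cases where the classifier got it right.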

One important aspect of Type 1 and Type 2 errors is that they are inherently related – we can set an arbitrary Type 1 error rate (even down to zero), but as we decrease our chance of making a Type 1 error, we increase our chance of making a Type 2 error. One of the easiest (and most classic) examples to illustrate this is the legal system. Consider a capital murder trial. The jury commits a Type 1 error if they convict the defendant when he or she was actually innocent. The verdict is a false positive, because we’re saying the defendant committed the crime when in fact they did not. We have falsely sentenced an innocent person, possibly to die. On the other hand, the jury commits a Type 2 error if they acquit the defendant when he or she was in fact guilty. This verdict was a false negative – we said the defendant didn’t commit the crime, though in fact they did. Notice how we can change, and indeed bias, the frequency of our errors. We can reduce our Type 1 error rate to zero if we never convict anyone, but then every guilty person also goes free. Likewise, we can make sure no murderers ever escape justice if we sentence everyone to prison, regardless of their actual guilt. Short of these two extremes, however, we can never be certain we will avoid errors – and furthermore, we should expect to commit them; the best we can do is bias ourselves toward making certain *types* of errors.
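The trade-off can be made concrete with a toy model of the jury: convict whenever an “evidence score” exceeds some threshold. The scores below are made up for illustration – guilty defendants tend to score higher, but the distributions overlap, so no threshold is error-free; raising it trades Type 1 errors for Type 2 errors:

```python
# Hypothetical evidence scores; higher means more incriminating.
innocent_scores = [2, 3, 4, 5, 6]   # actually innocent
guilty_scores   = [4, 5, 6, 7, 8]   # actually guilty

def error_rates(threshold):
    # Type 1 rate: fraction of innocent defendants convicted (score > threshold)
    type1 = sum(s > threshold for s in innocent_scores) / len(innocent_scores)
    # Type 2 rate: fraction of guilty defendants acquitted (score <= threshold)
    type2 = sum(s <= threshold for s in guilty_scores) / len(guilty_scores)
    return type1, type2

for t in (3, 5, 7):
    t1, t2 = error_rates(t)
    print(f"threshold={t}: Type 1 rate={t1:.1f}, Type 2 rate={t2:.1f}")
```

With a low threshold we convict readily (Type 1 rate 0.6, Type 2 rate 0.0); with a high threshold we acquit readily (Type 1 rate 0.0, Type 2 rate 0.8). No threshold drives both to zero at once.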

Critically, both Type 1 and Type 2 errors are errors. This sounds obvious, but isn’t always appreciated. A practical example in my field is the idea of “security” and “reliability” in circuit breakers. Reliability means that the circuit breaker *must* open when there is a problem. Failure to do so could mean the destruction of property and even death. In other words, it is unacceptable to have a false negative. If there is a real problem, we need to act on it. On the other hand, we don’t want the device to operate when there isn’t a problem either. If your circuit breaker tripped every time you turned on a light switch, it would become annoying quickly. If this actually happened, you would uninstall the technology that’s intended to protect you because, in effect, it kept crying “wolf”. This is called “security” – if the device operates when it isn’t supposed to it can give us headaches. In this example, both kinds of errors are bad. They are not, however, equally bad. In this case, killing someone is much worse than annoying someone, so circuit breakers tend to be biased toward reliability at the expense of security. There are things we can do that can reduce the rates of *both* types of errors, but we cannot eliminate both of them completely.

Perhaps the trickiest part of this whole deal is that for any given instance, it’s impossible to *know* whether you’ve made an error simply by looking at the data. Statistically, the concept of Type 1 and Type 2 errors is related to the probability that the results you saw would have been generated “by chance”. In other words, our conclusion about the data is supported – the data does appear to indicate that what we’re saying happened really happened. The problem is that there is a small (but finite) probability the data could have looked that way simply by chance. There is a chance you can interpret the data “correctly” (by applying whatever criteria are appropriate), reach an incorrect conclusion, and furthermore not be aware that your conclusion is incorrect.
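A small simulation makes the point. Even when nothing is going on (a perfectly fair coin), the data will occasionally look lopsided enough that we apply our criterion “correctly” and still reach the wrong conclusion. The criterion here is arbitrary and mine, not a standard test: declare the coin biased if 100 flips produce fewer than 40 or more than 60 heads.

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

def looks_biased(n_flips=100, lo=40, hi=60):
    """Apply the (arbitrary) criterion to one run of a fair coin."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads < lo or heads > hi

trials = 10_000
false_positives = sum(looks_biased() for _ in range(trials))
# The coin is fair in every trial, so every "biased" verdict is,
# by construction, a Type 1 error. With this criterion the rate
# works out to a few percent of trials.
print(false_positives / trials)
```

Every flagged trial looked genuinely suspicious from the data alone; only because we built the simulation do we know those verdicts were errors. Real data offers no such vantage point.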

But this was supposed to be a post about doctrine, right?

In the absence of certainty (actually being God), we have to start with the premise that there is at least a possibility we will be wrong about some of our doctrinal decisions. In fact, it’s more than a possibility – it is almost a certainty that everybody is wrong about something. Obviously we aren’t aware of the doctrinal errors we make – if we were, we would correct them. Our reading of the text (data) may be perfectly consistent, “correct”, and still be wrong. In other words, we could select a good, appropriate hermeneutic, apply it consistently and honestly to the full body of Scripture, and still come to a conclusion that is in fact not the way God will ultimately decide things. Furthermore, because we chose an appropriate measure of interpretation and applied it correctly, there would be no way we could externally verify that we reached an incorrect (from God’s perspective) conclusion.

This seems problematic. If we can never be certain about doctrinal correctness (i.e. we accept that we can look at a text “correctly” and still commit a Type 1 or Type 2 error), does that bring everything to a standstill? Well, no, I would suggest. Remember that the idea of Type 1 and Type 2 errors comes out of statistics, and the field of statistics didn’t collapse because we can’t be certain about things. In fact, it thrives because of it. In such a system, what criteria could we apply to produce an acceptable body of doctrine and belief? I think in general, we can look to scientific inquiry as a guide for ways in which we can improve our ability to avoid making both Type 1 and Type 2 doctrinal errors.

First, in science, we require experiments to be repeatable. One scientist’s study doesn’t confirm something to be true. In 1989, several scientists with credible reputations claimed to have discovered cold fusion – a claim that, if true, promised a safe and clean energy source that would basically solve the world’s energy problems. The initial results were confirmed by major research labs at Texas A&M and Georgia Tech. But as a larger group of scientists attempted to replicate the results, there were problems – nobody could get it to work. The researchers who initially confirmed the results discovered there had been problems in their experimental setup which caused erroneous results. After additional investigation, the original scientists’ claims were rejected. Doctrinally, I believe we can apply a similar principle of repeatability. Can other people who are looking at the same data I am, and reading with a similar method, at least verify that my conclusion is sound? To be clear, this is not a call for the democratization of doctrine. This is not a suggestion to adopt the most widely held belief as true. The majority of people in Iceland believe in gnomes and fairy spirits, but that doesn’t make it true. Our doctrinal reading must conform to the text. But if I am the only person who reads the text this way, and almost nobody else can even see where I’m coming from, that would call into question how repeatable my conclusion really is.

Second, scientists generally follow a particular method in reaching their conclusions. I can’t change the method simply because it gives results more in line with what I want. If I can convince people there is something flawed about the method, then I might be able to suggest its change – and in fact the method of scientific inquiry has changed over time. But changes in the method are made by the community as a whole over time – not by a few rogue individuals who want different results. In this sense, the tradition of science is important. The scientific community decides what methods are “good” and what methods are “bad”. These decisions are not arbitrary – in fact there are often very good reasons why a particular method is followed. Likewise doctrinally, adherence to a “good” hermeneutic is of paramount importance. If particular doctrines do not conform to reasonable and standard hermeneutics, as informed by the greater tradition of Christianity, we should be sufficiently skeptical of them. This is not to say our hermeneutic is required to be static – indeed it seems obvious that our understanding of God should grow and change over time. It is to say, however, that changes in our method need to be informed and accepted by the broader community before truly becoming orthodox.

Finally, even though this is far from egalitarian, experts should be trusted more than laypeople. We tend to trust Stephen Hawking more than Billy Joe Jones when it comes to the field of theoretical physics. That’s not to say Hawking always gets it right, or that Billy Joe might not have some interesting things to say on the subject. It is to say, though, that if our lives were on the line and we could only choose one person to answer a question about neutrinos, we’d be placing a call to Cambridge instead of Mobile. Modern Evangelical Christianity tends to push the other direction – in general with a large anti-academic bias where experts are largely distrusted. Academics don’t always get it right, and laypeople don’t always get it wrong, but experts generally possess tools and training which allow them to make better sense of data than someone without such training. In general, we are less likely to commit Type 1 and Type 2 errors when we assign greater value to the opinions of people who have spent years of their lives not only learning about Christian theology, but living lives which have been shaped by a serious commitment to spiritual formation. We shouldn’t immediately dismiss the viewpoints of people who don’t meet these criteria, but we should be inherently suspicious of new viewpoints that arise (or old viewpoints that are perpetuated) primarily by people who have little training and little obvious commitment to spiritual formation and discipline.

Evangelism After Christendom (Part 3)

We left Stone as he was questioning the assumptions that underpin much of the modern shifts in Christian evangelism – namely that success can be judged on whether or not the result attracts more people. Rather, Stone’s premise declared that only with the proper telos can evangelism truly be said to be successful. In other words, Christian evangelism isn’t primarily about attracting people to Christ – rather it’s about living lives that are a virtuous witness to God’s reign of peace, and any attraction people have to that is simply a by-product.

Stone ends Part 1 with the following statement:

My conviction is that plurality, historicity, and difference, while naturally producing feelings of insecurity, are nonetheless central to the task of telling the story of the people of God. For that story is itself the story of an encounter with difference (including God’s difference!) and a record of how that encounter makes a people distinctive in the world. The story of the people of God is the story of a people who encounter other stories in a variety of ways, sometimes in the form of a gift and an offer while at other times in the form of a confrontation and a scandal. We need not be paralyzed in making decisions about our own story or frightened about allowing it to interact with other stories, provided we do so with appropriate discipline, suspicion, self-criticism, and humility. After all, our story is not entirely rosy. It is a story of detours and dead ends, reversals and failure. It is a record of faithlessness, stubbornness, and rebellion as much as it is a story of obedience and hope. We need the whole of the Bible, because as a whole it does not shrink from narrating both sides of the story.

Stone’s basic conviction is that “conversion” has much less to do with accepting a few core propositions (e.g. God exists, Jesus is the Son of God, etc.) than with a complete change of worlds – participation in a new reign, a new story, a new reality. Just as the Bible is presented not as a list of facts or a historical document, but as a story, Stone’s view is that the bottom line to Christian practice and living is fundamentally participation in that story. As he quotes MacIntyre, “I can only answer the question, ‘What am I to do?’ if I can answer the prior question ‘Of what story or stories do I find myself a part?’”

As a result, Stone spends Part 2 looking at Israel, Jesus, and the early church to provide a foundation for the overarching story of God, which will in turn provide a foundation for his later movements in our participation in the continuing story of God.

Stone begins by looking at the people of Israel. He starts with what, for many Christians, is a commonly overlooked point: that the Bible is an entirely Jewish production. As he points out, most Christians, when asked what the most discussed topic in the Bible is, are unlikely to answer “Israel”. Rather, “we have been trained to think of the Bible as handing over information about important beliefs (sin, death, salvation, faith, God, etc.).” As Stone points out, there are interesting peculiarities about Israel’s formation. The story of the Old Testament is not a story about a god or gods as much as it is a story of a particular people, chosen and set apart as a community with a completely different identity and purpose, “chosen, called, liberated and led by God”. Israel’s identity, whether Abraham’s trust in God or Moses’s attack on contemporary social, political, and economic standards, performs as a contrast story to the prevailing norms of the day.

The idea of God’s free choice, of election, is powerful in the consciousness of Israel. Stone writes that our tendency is to universalize this and try to make Israel stand as a symbol for all God’s people past and future, but that the Bible doesn’t really lend itself to this interpretation. There is, however, an ambiguity in the meaning of election – that Israel is both chosen by and chosen for. Because of this, Israel has a “double relationship” of sorts, with God and the nations. The prophets call the people of Israel to remembrance both with terms of intimacy with God (beloved, firstborn) and with terms of warning as idolatry crept in (adulterer, prostitute). Stone:

Remembering turns out to be one of the central and defining activities of the people of Israel. It is the basis for both their cultic and their moral life. It funds prophetic reform and liberative praxis and is never to be simply equated with a “conservative” as over against a “progressive” outlook. Remembering likewise gives this people’s existence its narrative quality – God’s dealings with them in the past are decisive for making sense out of the present and guiding them into the future.

But where does this call to remembrance lead, exactly? Stone again:

As Micah’s vision makes clear, the “ways” of God embodied in this particular people (for the nations) are ways of justice and peace, the very substance of what Israel will come to understand as holiness.  The prophets critique any understanding of holiness that is purely formal, ceremonial, and positional, and that does not include the transformation of human hearts (Jer. 31:27-34) along with social and economic arrangements. … God’s purpose in history is not just the creation of holy individuals but the creation of a holy people, a people whose very existence in the world is a living testimony to the rule of God. Holiness, therefore, is unreservedly social, political, and economic.

Stone terms this way of living, God’s character and God’s ways, “shalom”. But there is something surprising about the prophets’ vision:

What we learn from the Hebrew prophets, therefore, is that to live  toward and out of shalom, as the beginning and the end of the story of the people of God, is to be eminently realistic. It is not shalom but the present order that lacks legitimacy.  It is not hope but complacency that has no firm basis in reality. … [M]uch of God’s rule of shalom may still be coming, but it is no less “real”.

Because of this hope and this realistic confidence in God’s presence and activity in history, the people of God are released from the burden of needing to control history, “to make things come out right”. To be the people of God is not a matter of presuming that our plans coincide with God’s; it is a matter of trusting, being open, and being guided and led into an uncontrollable future.

You don’t have to know much about modern fundamentalist or mainline evangelical Christianity to realize this vision is a sharp contrast to that presented in most churches today. While it may be popular to talk about a “Christian worldview”, often that worldview looks suspiciously similar to the capitalist, democratic worldviews that also came out of Enlightenment liberalism. He will critique this in detail later, but even at this point, Stone levels a rather scathing (and unfortunately all too accurate) criticism at modern Christianity: namely, that it serves more as a perpetuator of the status quo than an alternative community founded on the principles of God. From the time of Constantine on, he argues, the church has been so “in bed”, as it were, with the “powers and principalities” of the world that it has no hope of offering a substantive critique of their practices. When churches are structured like corporations, when their leaders are elected by democratic ballot, and when their economic structures treat individuals as spiritual consumers (and we must pause here a moment and acknowledge these things to be true), the church has effectively adopted the idolatrous practices of its surrounding culture rather than remaining true to – remembering – its true calling.

Psalms of lament

I recently discovered Mark Hamilton’s blog, and have started reading through his series on worship and its relation to the Psalms. One that stuck out to me was his discussion of Psalm 3, and our tendency to make worship a “power of positive thinking” event:

One of the more disturbing aspects of worship in Christian congregations today is the strong bias toward good cheer and superficial encouragement, no matter the circumstances, no matter the feelings that people bring with them to the service, no matter how much we have to hide or deny to keep up the facade.  In some places, we do not confess our sins, do not acknowledge systemic evil in the world, do not lament the suffering of people (unless someone runs a plane into a building), as if we believed that hope can only survive in a pretend world.

Definitely worth the read.

Klein

So for those of you who don’t know, we recently picked out a dachshund puppy, who came to live with us today. We’ll try not to be those people who take way too many pictures of the puppy, but to be honest, right now he’s too cute not to take pictures of. So without further ado…

Evangelism After Christendom – Reflections (part 2)

Remember that book I was reading a long time ago? Evangelism After Christendom?  Yeah. It’s back thanks to a Kindle edition.

When we left Stone, he was attempting to give us the idea that in the Christian tradition, evangelism could be viewed as a core practice in a loosely MacIntyrian sense. Stone also takes some time to point out core problems with the way evangelism is often executed in modern churches. Chief among these problems, he argues, is that evangelism has become essentially a marketing regime which seeks to attract new people by either a) trying to make the gospel more intellectually respectable, b) trying to demonstrate that it is practical (good for society, the economy, or personal psychology), or c) attempting to alter the traditional “stuffiness” that has characterized church in the past and instead make church more accessible to a wider audience. Stone:

Creative reconstructions of evangelism are being attempted today, and they succeed in expanding the church by adapting it to new generations that are put off by boring liturgies, irrelevant preaching, and stuffy pipe-organ music.  But while these reconstructions have triumphed in making the church more relevant to the tastes, expectations, preferences, and quest for self-fulfillment of both the unchurched and the dechurched, they have utterly failed to challenge the racism, individualism, violence, and affluence of Western culture.  They in no way subvert an existing unjust order but rather mimic and sustain it.  Our greatest challenge is to find ways of practicing evangelism in a post-Christendom culture without at the same time playing by the rules of that culture.

Cliff’s notes? Marketing evangelism works – at least if what you mean by “works” is “attracts more people” – but it doesn’t do a terribly good job of remaining true to the Christian ethos, which, if you will remember from our first discussion, is what really matters. Stone again:

We kid ourselves if we think we have moved beyond Christendom simply because we are able to reach more people by getting rid of our stained glass and stuffy sermons and providing a “product” that is more user-friendly. Neither large-scale revivals that boast thousands of converts nor fast-growing megachurches that have dropped from the sky into suburban parking lots as of late are in any way indications of the proximity of God’s reign, nor is their winsomeness and friendliness to be equated with Isaiah’s “peace.” In fact, the failure of evangelism in our time is implied as much by the vigorous “success” of some churches in North America as by the steady decline of others.

This is, I think, a profound statement. You may recall a recent post where we talked about the metrics we use to evaluate whether God is “working.” What is true on an individual level is also in many ways true for Christianity as a collective – namely that we tend to view God “working” in rather selfish terms – specifically when it looks like our agenda is “winning”, our political candidates are getting elected, and our numbers are increasing. There is no shortage of problems with this theology, as pointed out in the previous post, but Stone adds another: by using metrics of success that are external to the practice, we distort and subvert the practice itself, trading excellence for sheer effectiveness – indeed, confusing the two. Returning to the oft-used analogy of sports, effectiveness is winning a championship; excellence is playing to your highest potential day in and day out and letting the results speak for themselves. Ted Williams is considered one of the finest hitters ever to play the game of baseball, but he never won a World Series. You don’t necessarily have to be excellent to be effective – in fact, effectiveness can be achieved in plenty of ways contrary to the ethos (ideals) or telos (purpose) of the tradition you find yourself a part of.

One way Stone proposes that we counter this tendency is to first ground evangelism theologically, rather than allowing it to be whatever it wants in order to be successful.

Those who think theologically rarely think about evangelism, and those who think about evangelism rarely take the discipline of theology very seriously.  For one thing, very little in the present reward system of most churches supports thinking theologically about evangelism. Excellence in evangelism is almost wholly governed by numerical measures of success, and pastors are rewarded primarily insofar as they attain those measures.  Those who produce the literature on evangelism – especially that which concentrates on the models that are widely touted as successful in the North American context – are particularly reluctant to think critically about the theology presupposed in their practice. Their focus instead is on finding new and creative ways to express Christian beliefs and practices – forms that are more indigenous, user-friendly, and “relevant” to the experience of contemporary human beings, or more successful in making converts in an already crowded marketplace of competitors.

This book is written out of the conviction that there is no substitute for serious theological inquiry about evangelism as a practice.  In fact, theological inquiry is itself an intrinsic part of that practice.  We cannot proceed by merely trotting out a handful of “successful” pastors of fast-growing congregations to tell us what “works”.  For it is the very question of what we are working toward, what is deemed valuable and beautiful, what we are seeking, that in our time must be reexamined and that too often goes unchallenged altogether.

The “practicality” of theology does not lie merely in its strategic movement toward concrete proposals for action. Practical theology is not a bag of tricks, but a process of laying bare the assumptions that guide our practice and then drawing critically upon the practical wisdom of Scripture and the Christian tradition in order to rethink and reconstruct those assumptions.

Stone’s conclusion? Evangelism isn’t about trying to translate the message we think we know into a new context, but about residing in a changing context and remaining (or becoming) faithful witnesses of God’s peace. This is not about setting up an alternate culture that never interacts with the world around it. It is not a culture that is different because it shuns sex, drugs and rock and roll, but because it challenges, in the case of our current position, the very foundations of modern society – the economic, social and political power structures that so often serve as today’s “powers and principalities of this dark world”.  Evangelism, for Stone, is primarily about remaining grounded in a life of faithful dedication to the ethos of the Christian tradition – in his words, “witness to God’s reign of peace”.

When the practice of evangelism is not grounded firmly in the comprehensive life of witness, the church is inevitably instrumentalized, reduced to a mere tool in the service of heralding the gospel, rather than the social embodiment of God’s new creation in Christ, the very news that is to be heralded as good. For, as always, the embodiment is the heralding; the medium is the message; incarnation is invitation.  That is why, as I shall attempt to argue throughout this book, it is impossible for the church to evangelize the world and, at the same time, to serve as a chaplain to the state and allow itself to be disciplined by the logic of the market.

There are some real issues in that statement – issues that challenge the predominant theology (primarily soteriology and eschatology) in some deep and profound ways. My personal belief is that most people are not ready for the type of change that Stone is outlining, but that it might be possible to move things slowly in that direction.

The Future of Everything: The Science of Prediction

This weekend I finished reading David Orrell’s book “The Future of Everything: The Science of Prediction”. As an applied scientist, I have long had a side interest in the public perception of scientific modeling. In particular, as science is pressed more and more into the service of politics and ideology, the general lack of understanding of what scientists know and how they know it should be a deep concern to us all. In The Future of Everything, Orrell attempts to give an overview of how scientific modeling has developed, what its shortcomings are, and how far we can really expect mathematical models to predict the future.

Effectively, Orrell starts with the following observation: despite an exponential increase in funding and computing power over the last 100 years, predictive models (particularly in the fields of weather and economic forecasting) have made surprisingly little progress in producing accurate predictions about the future. In fact, modern weather forecasts for beyond a few days are only marginally more accurate, on the whole, than a forecast based on the climatological average for a particular day, in spite of their increasing complexity. Orrell spends much of the book exploring why models fail to give accurate predictions, with climate, the economy, and genetics as his three case studies.

Over the course of the book, Orrell explores a variety of shortcomings in modern mathematical models which aren’t necessarily solved by better computers or more complicated models. Some of the most important ones are (in no particular order):

  • Attempting to model complex non-linear systems is mathematically problematic: In the 18th century, mathematical modeling seemed to offer limitless progress.  Newton’s laws had transformed a seemingly complicated universe into a few lines of mathematics. If we could predict the course of the stars and planets, surely the world was at our command.  Well, not exactly.  As it turns out, Newton’s laws of motion describe one of the easiest physical systems to model. As Orrell says, part of Newton’s genius was picking a system that was possible to model – the same being true of Gregor Mendel’s study of genetic traits in peas. There may be simple equations for how a planet moves around the sun, but trying to predict how the wind blows (or how a plane flies) is a lot more complicated.
  • Chaos: Jeff Goldblum made chaos a trendy term in Jurassic Park, but it remains fairly misunderstood. In modeling, a chaotic system is one where small changes in the initial conditions can dramatically alter the trajectory of the system. Because we can never know the precise initial conditions of a system like the atmosphere or the economy, small perturbations in the initial conditions (or parameters) used in models can have a large effect on the resulting predictions. The fact that model parameterization is often at least somewhat subjective compounds this issue.
  • Computational irreducibility: Systems exist which are fairly simple, non-chaotic, produce clear patterns, behave according to only a few rules, and yet are computationally impossible to predict. The best example of this is Conway’s Game of Life. The Game of Life functions according to only four rules, yet it is impossible to write equations which will predict the state of a cell at any arbitrary time. The only way to find out is to run the system.
  • Emergent properties: Emergent properties refer to the unpredictable ways in which simple entities interact to produce complex results. Think “the whole is greater than the sum of its parts.” Emergent behavior cannot be reduced to simple physical laws.
  • Feedback loops: Most systems have competing positive and negative feedback loops which control the system. One example is blood clotting. Positive feedback is necessary to quickly stop bleeding. If unchecked, all your blood would clot and you would die, so negative feedback slows the process when it reaches an appropriate level. The way feedback loops interact with each other complicates model parameterization.
  • Matching the model to past observed data does not ensure accurate predictions: Just because a model matches past observed data does not mean it is correct, nor that it offers any predictive power about the future. A chicken might build a model that predicts a long and happy life based on observations of the farmer coming to feed him every morning. That model holds well, until the day he becomes the farmer’s dinner.
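The computational-irreducibility point is easy to see in code. Below is a minimal sketch of Conway’s Game of Life (the `step` function and the blinker pattern are my own illustration, not from Orrell’s book): even though the rules fit in a few lines, there is no formula for the board at generation n – you have to simulate every generation in between.

```python
from collections import Counter

# Minimal Conway's Game of Life on an unbounded grid.
# The rules fit in a few lines, yet there is no closed-form equation
# for the state at generation n: you have to run every step.

def step(live):
    """Advance one generation. `live` is a set of (row, col) cells."""
    # Count live neighbours of every cell adjacent to a live cell.
    counts = Counter(
        (r + dr, c + dc)
        for (r, c) in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # A cell is alive next generation if it has exactly three live
    # neighbours (birth), or two and is already alive (survival).
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a row, which oscillates with period 2.
blinker = {(1, 0), (1, 1), (1, 2)}
print(step(blinker))                   # flips to a vertical line of three
print(step(step(blinker)) == blinker)  # → True
```

Even for this trivial pattern, the only way the program can report the state two generations out is by computing both intermediate generations – which is exactly what Orrell means by “the only way to find out is to run the system.”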

Orrell summarizes as follows:

  • Prediction is a holistic business. Our future weather, health, and wealth depend on interrelated effects and must be treated in an integrated fashion.
  • Long-term prediction is no easier than short-term prediction.  The comparison with reality is just farther away.
  • We cannot accurately predict systems such as the climate for two reasons: (1) We don’t have the equations. In an uncomputable system, they don’t exist; and (2) The ones we have are sensitive to errors in parameterization. Small changes to existing models often result in a wide spread of different predictions.
  • We cannot accurately state the uncertainty in predictions.  For the same two reasons.
  • The effects of climate change on health and the economy (and their effects on the climate) are even harder to forecast. When different models are combined, the uncertainties multiply.
  • The emergence of new diseases is inherently random and unpredictable. Avian flu may be the next big killer – but a bigger worry is the one that no one has heard about yet.
  • Simple predictions are still possible. These usually take the form of general warnings rather than precise statements.
  • Models can help us understand system fragilities.  A warmer climate may cause tundra to melt and rainforests to burn, thus releasing their massive stores of carbon.  However, the models cannot predict the exact probability of such events, or their exact consequences.
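The sensitivity to parameterization in the summary above can be illustrated with the logistic map, a textbook toy chaotic system (the example and parameter values are mine, not Orrell’s): two copies of the “same” model whose growth parameter differs by a tenth of a percent agree closely at first, then produce completely unrelated forecasts.

```python
# Logistic map x_{n+1} = r * x * (1 - x): a textbook chaotic system.
# Two runs of the "same" model, with the growth parameter r differing
# by 0.1%, start out nearly identical and end up unrelated.

def trajectory(r, x0=0.5, steps=60):
    """Iterate the logistic map and return the full list of states."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

base = trajectory(3.9)             # nominal parameterization
tweaked = trajectory(3.9 * 1.001)  # off by one part in a thousand

# The first few forecasts agree closely...
early_gap = max(abs(a - b) for a, b in zip(base[:4], tweaked[:4]))
# ...but a few dozen steps out, the trajectories bear no resemblance.
late_gap = max(abs(a - b) for a, b in zip(base[40:], tweaked[40:]))
print(early_gap, late_gap)   # tiny early gap, large late gap
```

No amount of extra computing power fixes this: the tiny parameter error is amplified at every step, which is the mechanism behind Orrell’s claim that varying a handful of parameters within apparently reasonable bounds yields radically different answers.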

So where does that leave us? Orrell again:

Einstein’s theory of relativity was accepted not because a committee agreed that it was a very sensible model, but because its predictions, most of which were highly counterintuitive, could be experimentally verified.  Modern GCMs (Global Climate Models) have no such objective claim to validity, because they cannot predict the weather over any relevant time scale. Many of their parameters are invented and adjusted to approximate past climate patterns.  Even if this is done using mathematical procedures, the process is no less subjective because the goals and assumptions are those of the model builders. Their projections into the future – especially when combined with the output of economic models – are therefore a kind of fiction.  The fact that climate change is an important and contentious issue makes it all the more important that we acknowledge this.  The problem with the models is not that they are subjective or objective – there is nothing wrong with a good story, or an informed and honestly argued opinion. It is that they are couched in the language of mathematics and probabilities: subjectivity masquerading as objectivity.  Like the Wizard of Oz, they are a bit of a sham.

[A]s I argued in this book, we cannot obtain accurate equations for atmospheric, biological, or social systems, and those we have are typically sensitive to errors in parameterization.  By varying a handful of parameters within apparently reasonable bounds, we can get a single climate model to give radically different answers.  These problems do not go away with more research or a faster computer; the number of unknown parameters explodes, and the crystal ball grows murkier still. … We can’t mathematically calculate the odds, even if it looks serious, scientific, and somehow reassuring to do so.

Orrell is careful to point out, however, that the fact that we cannot guarantee the accuracy of our predictions does not mean they are necessarily wrong, or shouldn’t be heeded. Varying parameters in climate models may indeed produce a wide range of results, but that doesn’t mean we should take a wait-and-see approach. Economic models failed spectacularly to predict the current economic crisis – but it still happened.

Orrell’s argument, then, is for a kind of literacy when using scientific models to inform decisions. Scientific predictions can be helpful, and often are. But they are limited in their ability to predict future events with certainty, and these problems aren’t necessarily going to be solved with better data and models, or with more powerful computers. They shouldn’t be ignored, but rather viewed for what they are: a tool for helping us understand the present, and hopefully make the best decisions we can about the future.