
As Barth's concept and usage of Krisis develops in the second edition, it becomes a key to understanding his revolutionary theological reading of The Epistle to the Romans.

In September 1921 Barth wrote the preface to his second edition of The Epistle to the Romans, and one can fairly regard the second edition as written during a period of intense and unpredictable turmoil spanning August 1918 (the date of the preface to the first edition), early 1919 (publication and critical reception), and mid-1921: Armistice and German capitulation, Swiss general strike, European famine, military demobilization (less prominent but still present in Switzerland), and the "Spanish" H1N1 influenza pandemic. General crisis pervaded this historical backdrop and lent authenticity to the concept of Krisis that Barth so effectively deployed in the second edition.

Krisis is a distinctive development in the second edition; Barth used the term only once in the first edition (and not in a particularly theological sense: um der Gleichartigkeit der Krisis willen, in der alle Menschen aller Stufen immer wieder vor Gott stehen, "thanks to the similarity of the Krisis in which humans of all ranks stand before God"). By contrast, Barth introduces Krisis in the preface to the second edition. He responded to the charge that he imposed his meanings upon the text of Romans rather than drawing meaning from it --the charge of eisegesis in place of exegesis-- acknowledging that this suspicion is really the most obvious thing one can say about his whole endeavor (Der Verdacht, hier werde mehr ein- als ausgelegt, ist ja wirklich das Naheliegendste, was man über meinen ganzen Versuch sagen kann). He continued:

Wenn ich ein „System” habe, so besteht es darin, daß ich das, was Kierkegaard den „unendlichen qualitativen Unterschied” von Zeit und Ewigkeit genannt hat, in seiner negativen und positiven Bedeutung möglichst beharrlich im Auge behalte, „Gott ist im Himmel und du auf Erden”. Die Beziehung dieses Gottes zu diesem Menschen . . . ist für mich das Thema der Bibel und die Summe der Philosophie in Einem. Die Philosophen nennen diese Krisis des menschlichen Erkennens den Ursprung. Die Bibel sieht an diesem Kreuzweg Jesus Christus.

I translate this as:

If I have a system, it consists in this, that I keep in view as tenaciously as possible, in both its negative and positive sense, that which Kierkegaard termed the "infinite qualitative distinction" between time and eternity. "God is in heaven and you are on earth." The relation of this God and this human . . . is for me the theme of the Bible and the sum total of philosophy in one. The philosophers call this Krisis the origin of human recognizing. The Bible looks at [this] crossroad [as] Jesus Christ.

Sir Edwyn Hoskyns, by contrast, translated:

. . . if I have a system, it is limited to a recognition of what Kierkegaard called the 'infinite qualitative distinction' between time and eternity, and to my regarding this as possessing negative as well as positive significance: "God is in heaven and thou art on earth." The relation between such a God and such a man . . . is for me the theme of the Bible and the essence of philosophy. Philosophers name this KRISIS of human perception—the Prime Cause: the Bible beholds the same cross-roads—the figure of Jesus Christ.

One can see here how Hoskyns subtly renders Barth more as a Cambridge don than as a brilliant, activist pastor. It is no denigration of such dons to note that their métier is careful nuance, and such nuances can easily grow wearisome. By translating des menschlichen Erkennens den Ursprung as the Prime Cause (of human perception), Hoskyns imports a philosophical term more allusive to St. Thomas than Barth's text really allows. (For example: Cum igitur Deus sit prima causa universalis non unius generis tantum, sed totius entis, impossibile est quod aliquid contingat; ST I:103:7, "Therefore as God is the first universal cause, not of one genus only, but of all being in general, it is impossible for anything to occur" [outside of God's governing].) "Philosophers call this Krisis the origin of human recognizing" (or perceiving): Erkennens in the text is not exactly the same as its relative Erkenntnis. The phrase "the essence of philosophy" shies away from Barth's declarative unity between the theme of the Bible and the sum total (Summe, not Summa) of philosophy.

In the first usage of the critical term Krisis, Barth's received English translation already steers in a more philosophical and less expressive direction. Barth thanks his brother, the philosopher Heinrich Barth, for leading him "to recognize the importance" of Plato and Kant, but Karl does not wish to be a philosopher in the academic sense: "I have also paid more attention to what may be culled from the writings of Kierkegaard and Dostoevsky that is of importance for the interpretation of the New Testament. The latter I owe more particularly to the hints given me by Eduard Thurneysen" (und das vermehrte Aufmerken auf das, was aus Kierkegaard und Dostojewski für das Verständnis des neuen Testamentes zu gewinnen ist, wobei mir besonders die Winke von Eduard Thurneysen erleuchtend gewesen sind).


I have long loved the elegant verse of David T. W. McCord, "Books fall open, / you fall in, / delighted where / you've never been." Sometimes that's even true. (A poet with a day job, McCord also raised millions for Harvard as director of the Harvard College Fund.)

Several months ago a book fell open for me, one that I had long known about but never really read: Karl Barth's The Epistle to the Romans. I started to read it not long after I resolved to nurture my knowledge of Greek by reading some New Testament daily. I chose Romans because it was there (and because I read much of it for a seminar with the late great Prof. Christian Beker): it demands attention, is linguistically difficult, and seems to be found in the midst of every major turn in Christian history, such as those associated with Martin Luther and Karl Barth.

I chose Barth because I also have access, courtesy of Princeton Theological Seminary Library, to the German text of Barth's second edition, 1922, which opened the way for so-called "dialectical theology" and a decisive turn from previous liberal Protestantism by many German-language writers. So I could exercise my Greek, Latin (I read in parallel in the Nova Vulgata), and German, languages that I spent a lot of time and effort learning when I was an undergraduate, and which I do not wish to lose at this later stage of my life.

Little did I realize, when I began this project in 2019, that it would become so timely. I began Romans exactly because I have been so dismayed by the slow slide into fascist authoritarianism underway in my native country. (I hesitate to call it my home any longer; I live here as a resident alien.) I had some inkling that it might have something to say to my condition. I was little prepared for how much it had to say.

Barth's book came out in 1919 (finished in the last days of the War, December 1918), with a second edition in 1922, later minor alterations, clarifications, and republications, and an English translation by Sir Edwyn Hoskyns in 1933. The Epistle to the Romans established Barth's early reputation as a disruptor; the second edition in particular critiqued (or bulldozed) the cultural liberalism of the Protestant theology of his teachers at Marburg and elsewhere. The term "dialectical theology," which arose in the early 1920s, highlighted the frequent use of paradoxical formulations and polar opposites by several writers (Barth, Gogarten, and Brunner prominent among them) and was always --given the prominence of "dialectic" among the heirs of Hegel, and especially Marx--something of a misnomer. There is a great deal of distinguished scholarship regarding the origins and development of "dialectical theology" in the 1920s and the disagreements of its original proponents as time went on (and especially in the 1930s), and I do not intend to rehearse any of that here.

My focus instead is how unexpectedly contemporary the second edition of Barth's Epistle to the Romans now seems to be, despite its readily apparent and important differences with the present moment. I am enough of a historian to remember and acknowledge that history does not repeat itself, or even often rhyme (in words erroneously attributed to Mark Twain), but somehow sometimes retains a certain metrical force, like blank verse in iambic pentameter. Or in Yogi Berra's reputed words, "It's like déjà vu all over again."

When reading the German second edition (available in the Karl Barth Digital Library, or via Hathitrust here; Vorsicht: Fraktur!), I immediately felt great sympathy for the late Sir Edwyn Hoskyns, translator (13th Baronet of Harewood, County Hereford, and Dean of Corpus Christi College, Cambridge). Sir Edwyn faced the daunting challenges of Barth's German, which intentionally pushes beyond normal boundaries with multiple-compounded words, neologisms, paratactic paragraph-long sentences, and now-obscure references to contemporaneous events and well-known persons. Sir Edwyn's English version of the German sixth edition (1928) unavoidably presses Barth's free-range German into Oxbridge English and the language of the Authorized Version. He inadvertently misses or minimizes Barth's Sprachspiel, his playing with the language in an attempt to demolish over-familiar and well-worn phrases bearing the echoes of discredited theologies, politics, and cultural outlooks. On the other hand, following Barth's German too closely would nearly completely obscure his meaning and significance for English readers, and linguistically daze them.

Barth's German is variously playful, allusive, hortatory, monitory, and nearly aphoristic. (I will give examples in succeeding blog entries.) It reminded me less of the weighty German of Barth's later Kirchliche Dogmatik than of writers associated with German "expressionism": Georg Büchner (especially Woyzeck, adapted for Alban Berg's opera), Frank Wedekind (the Lulu plays, the basis of Berg's opera, and Frühlings Erwachen or Spring Awakening, source of the 2006 musical), and Fritz von Unruh (especially Opfergang, an anti-war drama written during the Siege of Verdun/Schlacht um Verdun). I also thought of writers associated with the Austrian Jugendstil, especially Robert Musil, Arthur Schnitzler, Stefan Zweig, and Georg Trakl (associated with Der Brenner, influential in the revival of Kierkegaard in German-speaking lands, and a link via friendship to Ludwig Wittgenstein). The influence of both Kierkegaard and Nietzsche is especially prominent in Barth's second edition of Romans, and has been studied thoroughly.

Barth used his expressive, nearly expressionist language to torpedo the discredited cultural, Protestant hegemony represented above all by his (respected) teacher Wilhelm Herrmann. A "Lutheran Neo-Kantian," Herrmann taught Barth to speak of God dialectically or in opposites: dogmatic/critical, Yes/No, veiling/unveiling, objective/subjective, in line with an emphasis upon the religious experience of the individual (from Friedrich Schleiermacher). So Barth did not repudiate everything that Herrmann taught him. But—Herrmann was one of the 93 signers of the Manifesto "To the Civilized World" (An die Kulturwelt: Ein Aufruf), which unequivocally supported German militarism and military actions in 1914, specifically including the "Rape of Belgium" (in essence confirmed by later scholarship despite certain Allied fabrications). Herrmann was not the only prominent Protestant theologian to sign An die Kulturwelt: so did Adolf Deissmann, Adolf von Harnack, Adolf Schlatter, and Reinhold Seeberg (father of Erich Seeberg, the eventual Dean of the "German Christian" Protestant theological faculty in Berlin during the National Socialist régime). So Barth's demolition of the language of liberal Protestantism extended his cultural critique of the discredited, compromised, fatally flawed accommodation of liberal Protestant theology to German militarism. Barth's citizenship and employment as a Swiss pastor allowed him both access to the German linguistic community and freedom from German war-time censors, however sympathetic the officially neutral German-speaking Swiss may have been to the German side, and however anxious they were about any kind of wartime dissent.

Barth began his project of demolishing the whole discredited line of Protestant piety focused on individual experience (and its accompanying social and political irrelevance) in 1919, but by 1922 his second edition reflected his dismay with the disaster then unfolding in revolutionary Russia as it morphed into the early Soviet Union. Barth's demolition extended to the right (socially mainstream, bürgerlich Protestantism), to the left (red to redder socialism), and to a Roman Catholicism already susceptible to revanchist or restorationist, imperial-fascist fantasies in Austria, Italy, Spain, and elsewhere. Safe in stable Switzerland, he nevertheless experienced (with the rest of Europe) the disorienting political and cultural shock of imperial crackup, and the cowardice of German wartime leaders, especially the Kaiser, Tirpitz, and Ludendorff. Nor did economic disaster spare Switzerland: the general strike (Landesstreik) in November 1918 strained civil society more than at any time since the Sonderbund war of 1847. The H1N1 "Spanish" influenza epidemic of 1918-1919 hit Switzerland very hard, and the inadequate official response brought the nation to the brink of civil war.

Barth wrote his second edition, then, in a lightly industrialized (weaving) town in the midst of entrenched economic disparities, national and international political dysfunction, and pandemic. This is not the same world as 2019-2020, but it seems eerily adjacent. Even more, the collapse of a public theological or religious rhetoric associated with denuded assumptions of political and cultural hegemony, and the thorough discrediting of its moral authority (in Europe then), can only remind one of the cultural bankruptcy of American evangelicalism now, so thoroughly colonized or even weaponized by the current chaotic American régime. In America in 2020, as in Europe in 1922, all these deteriorating conditions reflect both acknowledged and unacknowledged wounds and diseases left festering for decades.

Barth's response was vigorous social critique and withering theological appraisal that reflected his wide reading and wide-ranging cultural allusions, which extend far beyond normal academic theological discourse. He speaks with a preacher's or prophet's voice, not the voice of a professor (although he is certainly academically well informed). The world of his text —what it presumes, upon which it comments, whose pretensions it exposes—will be the content of further blog entries.

I finally got around to reading Stephen Greenblatt's Will in the World: How Shakespeare Became Shakespeare years late, and after thoroughly enjoying the first part of his book The Swerve: How the World Became Modern. I emphasize the first part, because that book is really two texts: one is a thoroughly entertaining and engaging account of how Poggio Bracciolini found and published the sole sizable manuscript of Lucretius' De Rerum Natura, a text that informed 15th-17th century skepticism towards Christianity. The second, less recondite and rather pretentious text is essentially an anti-religious polemic from a rather tired point of view: that the so-called "Renaissance" bloomed with insightful and wise humanism and overcame centuries of superstition and dogmatism --in other words, Burckhardt's or Gibbon's view of the Middle Ages. This second text springs not only from a historically questionable point of view, but contains factual errors (such as ascribing the New Testament sentence "Jesus wept" to Luke rather than John --an easy error any publisher's fact-checkers should have caught). The failures of the second part of Greenblatt's text do not overshadow the value of the first, but one does wonder, why bother?

Will in the World is an equally engaging reading of most of Shakespeare's works (poetry and theatre) to inform careful speculation about his life, his inner life, and how he came to produce them. It does not seek to explain Shakespeare so that his achievement becomes merely quotidian --far from it! The known facts about Will's life, threadbare as they are, survive largely thanks to the careful record-keeping of legal proceedings in Elizabeth's England, and Greenblatt bases his account on well-received points of text well known to scholars. What Greenblatt brings is careful but not over-fussy readings of Shakespeare's texts themselves, with an eye to what may have informed his imagination, such as the premature death of his son Hamnet in the period just before Will wrote Hamlet --an unsubtle play on names, especially since neither orthography nor pronunciation was standardized in Will's time. Consequently Greenblatt's text is noticeably furnished with careful speculations, hedged with verbs such as must have, likely have, and may have, and words such as probably and apparently —all in service of trying to pry beyond the veil of historical silence to ask what was going on in Will's mind and spirit.

Scholars may or may not approve of some, all, or any of Greenblatt's well-founded conjectures, and David Stenhouse justifiably would retitle the book Will in the Work. Greenblatt discovers no new facts, but does not disguise his conjectures. For a non-specialist reader, he conveys vividly the peculiarities of London ca. 1590-1610, with its odd legal features of property and time (such as "liberties," a legal trick by which the civic legal authority over formerly monastic lands, once held by houses long since dissolved, continued to be limited). Will's world is evoked very credibly in many places; in others not so much. Will's Catholic sympathies, or downright Catholicism, seem less well established than Greenblatt would make them. Will's utter silence regarding the leading source of social unrest and legal danger in his time still seems more prudential than protective: we just don't know what he thought about any questions of faith.

The problem is that, for so prominent a figure in his time, Shakespeare leaves no letters, very few reliably autograph manuscripts, no diaries, few reliable personal reminiscences from those who knew him, and few legal records beyond the barely factual. No wonder speculation remains that the figure of Will was a front for someone more notably connected or well-born, who however remains equally elusive, or more so. Compared even with earlier figures such as John Calvin or Heinrich Bullinger, figures at home in a different but equally litigious, dangerous, and ambiguous culture, Will left so little --but what he did leave was sometimes startlingly revelatory of human feelings that are still accessible, as well as affections and desires long thought unworthy (such as same-sex desire) but now recovered and celebrated.

I'm glad I read it --it helps me to hear Will's works and words on stage. Where's Will? is a more serious game of Where's Waldo?, however --and he's still hard to spot. Greenblatt's New Historicism serves him well in this case. The book is worth reading, and if you wish to dive into Will's texts, I recommend it. His reading of Julius Caesar alone in its Elizabethan context is worthwhile —and in the current age of political lies, alternative facts, fake news, looming tyranny, and unbridled ambition, highly topical.

O Dieu dont le Fils unique, par sa vie, sa mort et sa résurrection, nous a mérité les récompenses du salut éternel, faites que, méditant ces mystères dans le très saint Rosaire de la bienheureuse Vierge Marie, nous mettions à profit les leçons qu'ils contiennent afin d'obtenir ce qu'ils nous font espérer. Par le même Jésus-Christ, votre Fils, notre Seigneur. Amen. [O God, whose only Son, by his life, death, and resurrection, has won for us the rewards of eternal salvation: grant that, meditating on these mysteries of the most holy Rosary of the Blessed Virgin Mary, we may put to use the lessons they contain and obtain what they lead us to hope for. Through the same Jesus Christ, your Son, our Lord. Amen.]

Three years later, Jones' book still resonates broadly with the continuing decline of WCA.

The End of White Christian America, by Robert P. Jones. New York: Simon and Schuster, 2016. 309 pages. [With a new afterword covering the 2016 election.] ISBN 9781501122323, available at Sacred Heart University Library.

Jones' book garnered considerable attention in the religious media in 2016. Reading it three years later, one cannot help but ask frequently "What about...?" regarding some event since 2016, because the book's preparation and publication date precluded any coverage of the interminably long, bitter 2015-2016 election cycle. Since November 2016, so many "what abouts?" arise, even over Jones' basic contention that a carefully-defined "White Christian America" (WCA) is dying or has died. The "new afterword covering the 2016 election" is identical with the article, "T—— Can't Reverse the Decline of White Christian America," The Atlantic, July 4, 2017. (I em-dash the name to try to discourage trolls from spamming this post.)

Jones sticks consistently to a concept of white protestant Christian America, its churches, web of associations, and cultural agenda (abbreviated WCA). He is clear about when this infrastructure of influence extends to cover the Eastern Orthodox (whom he glibly labels "Greek Orthodox," although assorted Greek, Russian, and other derivations and jurisdictions are a huge question in those communities). Jones maintains a boundary between Christians of white, European descent and African American Christians, because of the heritage of slavery and Jim Crow entwined with the churches, but he gives little attention to the growing Asian presence in the mainline WCA churches, or to the growth of other ethnically-based Christian churches (Afro-Caribbean, for example).

Jones efficiently traces the distinctions between the two historical descendants of pre-1920 WCA: mainline, ecumenically-oriented churches, and evangelical churches. I believe he fails to consider fully, however, just how porous those distinctions can be, or the significant differences between what has been called "soft-core" versus "hard-core" evangelicalism --the latter more doctrinally oriented and fundamentalist, the former more experiential and open to individuals who move in and out of a community (seen in the rise of name-brand groups such as The Vineyard). Particular communities, in fact, have changed position within WCA with some difficulty, yet irreversibly. Why and how?

For example, Hope College in Michigan was a mainline college with a heritage in the Reformed Church in America (the less-enclosed of the Dutch Protestant denominations) through the 1970s. It liked to remember its association with Robert Schuller '47 (his son was a classmate there, '76), and the Science Center (!) was named after Norman Vincent Peale. But it went more evangelical in the 1980s-1990s under the leadership of Gordon van Wylen; by the 1990s the College had moved clearly and definitely in an evangelical direction under Chaplain Ben Patterson and a milquetoast senior college leadership. (See James Kennedy's Can Hope Endure, 2005.) To this day it is contrary to college policy to "by statement, practice, or intimation . . . promote a vision of human sexuality that is contrary to this [fundamentalist] understanding of biblical teaching." Jones identifies opposition to gay and lesbian rights as one of the binding commitments that evangelical Christians must maintain without compromise in order to show their bona fides --and the problems this will bring with a younger generation of Americans. Numerous alumni/ae have repudiated this position, or simply dropped any sense of allegiance or support, while support has increased from evangelicals. The cost of changing official policy for the college, given its choices, would probably be prohibitive.

By contrast, Princeton Theological Seminary moved in the opposite direction. Beginning from an inchoate, majority position in the 1970s, the Seminary fully mirrored the protracted and heated conversations and conflicts that coursed through the Presbyterian General Assembly, especially after the union of the (northern) UPCUSA and the (southern) PCUS in 1983. Under the long presidency of Thomas Gillespie (1983-2004) the Seminary maintained the (PCUSA) Presbyterian Church's official line of "acceptance of members but without ordination," despite the evident hypocrisy of this decision within its own community. Any discussion or criticism of these policies was resisted and tamped down (as I discovered 1992-1996 during the course work and examinations for my Ph.D.). These hypocrisies culminated with the death of the community's beloved and celebrated musician David Weadon in 1996, from HIV-related causes; he died afraid to reveal his illness for fear of losing his job and health coverage. Thomas Gillespie did express remorse and a change of heart, too late for David, of course. In 2011 the PCUSA finally voted to allow the ordination of gay and lesbian persons. The Seminary now hosts a chapter of BGLASS (Bisexual, Gay, Lesbian, and Straight Supporters) and formally hosts discussions of gay and lesbian issues in an affirming manner in its Center for Theology, Women, and Gender. From a position squarely in the evangelical opposition to gay and lesbian rights, the Seminary has moved to the center in tandem with its historic denominational alliance.

Jones' book suggests that such institutional shifts are rare, but I believe that they are more common than he realizes, because they reflect the changing concerns of individual white protestant Christians as well. Evangelical churches have (historically) surely seen their share of former or lapsed members of mainline churches who have undergone some kind of conversion or new-life experience and joined an evangelical congregation. The traffic certainly moves in the other direction as well: numerous evangelical Christians have moved out of evangelical churches in response to changing understandings of scripture, history, cultural and religious political connections, and geographies. Very little work has been done on cross-overs. In 1993 Benton Johnson, Dean Hoge, and Donald Luidens tried to examine the famous claim by Dean Kelley (1972) that conservative churches were growing (and liberal churches declining) because the more liberalizing denominations were weak: low commitment, and moral and theological commitments too fuzzy to mobilize members' energies. Their 1989 study of 500 Presbyterian baby-boomer confirmands (in other words, people confirmed around age 14 between roughly 1960 and 1980) found that they did not, in general, leave the Presbyterian church because they sought doctrinal and moral orthodoxy in conservative churches. Some remained in mainline churches, and many left, but not for Dean Kelley's supposed reasons.

Jones' book examines WCA responses to three general topics: political involvement, gay and lesbian rights, and racial tensions and histories. Since his preparation (late 2015) and publication in 2016, much has happened that actually confirms his historical narratives of change, decline, and acceptance. As he wrote in The Atlantic in July 2017, the emphases and apparent desires of the present Presidential administration will not reverse WCA decline --indeed, the present leadership may be the death-rattle of WCA rather than its rejuvenation. One of the benefits of reading Jones' book now (2019) is that it contextualizes numerous deeply divisive conflicts of the past 26 months in what went before: the world did not begin anew in November 2016. The politics of nostalgia and fear have proven very powerful, and the traditional evangelical narrative of persecution has found a weird new life in the face of public anti-Semitism. Ironically some of the mainline churches have found a new voice for social inclusion in an era marked by rising hate speech, acts of violence, anti-semitism, and anti-immigrant rhetoric and actions.

John Fea's book Believe Me: the Evangelical Road to D—T— develops the narrative of persecution, nostalgia, and fear, which will wind up at the dead end that looms for the "court evangelicals" and possibly for evangelicalism as a whole. The signs of coming trouble and a profound day of reckoning are unmistakable. Young evangelicals problematize or flee the label: the Princeton Evangelical Fellowship, continuing from 1931, changed its name to Princeton Christian Fellowship because of the narrow and overly partisan meanings that have gathered around the term "evangelical." (The Princeton Fellowship pre-dated the use of the term "evangelical" by the National Association of Evangelicals by a decade.) David Gushee, formerly an evangelical (still very much a professor at Mercer University) has written Still Christian: Following Jesus Out of American Evangelicalism (2017) in a similar spirit.

Three years later, Jones' book still resonates broadly with the continuing decline of WCA. The mainline denominations still struggle with the realities of decline, some rather badly. Many do nothing meaningful whatsoever about preparing or adapting present and future clergy to the reality of ministry as a part-time job. The evangelical wing of WCA has in large part completely mortgaged its moral and cultural standing to the current incumbent of the White House. One shudders to think of the court evangelicals' fall when that person is either impeached, not re-elected, or even retires after eight years. Currently 72 years old, in any case he won't be around forever. What then? Who will write this story then, and what will the next social form of evangelicalism look like?

Three years later, Jones' book could also benefit from some examination of the end of white, Catholic America as well. The changing composition of Catholicism can mask for a while the dramatic departure of white Catholics, many deeply angry and hurt by the continuing pedophile scandals and insider infighting regarding positions associated with the current Pontiff. White Catholic America (the other WCA?) replicated white, protestant, Christian America's web of institutions in the earlier 20th century to a remarkable degree. Those are also coming apart, for somewhat different but related reasons. In some states, the cultural power of the Catholic Church, once palpable, has dissipated almost completely. This would be a fascinating companion study that could probe much more deeply into the realities described by David Masci and Gregory A. Smith in October 2018. Personally speaking, I sometimes feel that I work on a campus filled with pissed-off Catholics. They are not happy, and for that Church the day of reckoning approaches, as well.

Jones' final chapter, a "eulogy," organizes a lot of material on the frame of Kübler-Ross' theory of the stages of grief, from denial and anger to acceptance. This is a dubious scheme insofar as many pastoral and health practitioners have abandoned it as unhelpful and overly schematic, but it serves Jones acceptably as a prism with which to examine topics ranging from the Institute for Religion and Democracy (IRD), essentially a destructive and disruptive distraction, to Russell Moore's various activities in the Southern Baptist Convention, to the panentheistic wanderings of some mainline denominational leaders (such as John Dorhauer of the United Church of Christ). As a sociologist (and not particularly a theologian), Jones tends to locate the initiative in the churches with people, and unintentionally commits the theological mistake that doomed WCA: the assumption that this is entirely under human control. If the past decade has shown anything, it is how little is under effective human control. Many preachers like to quote Alfred, Lord Tennyson's famous line in Morte d'Arthur: "The old order changeth, yielding place to new" --but few go on to the next lines:

And God fulfils Himself in many ways, 
Lest one good custom should corrupt the world.

Did whatever was good in the WCA wind up corrupting the world, or vice-versa (and what does "world" mean here, anyway)?  Jones is a sociologist of religion and sticks to his domain.  Inside (or emerging from) the dying or dead WCA, churches might see their situation differently. How now will God fulfil God's intentions for the church and for all creation?  Karl Barth's famous rebuke of religion in the church resonates broadly.

I am indebted to John Fea for pointing out Michael J. Douma's How Dutch Americans Stayed Dutch: An Historical Perspective on Ethnic Minorities (Amsterdam University Press, 2014, 978-90-8964-645-3). Douma's book has been a delight, enlightening and useful to my continuing question "how can I teach about evangelicalism to students who have almost no awareness of it?" without becoming either divine or pedantic. Americans of Dutch heritage are no more uniformly evangelical than any other group, but Douma's insights provide clues to the challenge of teaching about other people's arguments to those who don't already know or care about them.

Fea pointed out Douma's book and Douma's response to a misleading article in The Economist that sets out a complex reality in simplistic, bite-sized terms appropriate to that magazine's readers. The pretext was remarks by U.S. Ambassador Pete Hoekstra, and the sagas of Betsy DeVos and Erik Prince, all recognizably Dutch-American conservatives of a certain positivist stripe. Like many American academics, in past months I have winced at the antics, pratfalls, and utter cluelessness of Betsy DeVos, incumbent Secretary of Education. Anyone who knows West Michigan (and Holland, Michigan in particular) will know the name well, from examples such as the DeVos Field House at Hope College and the endless genuflection towards the Amway Corporation, alleged to be a barely legalized, cult-like pyramid scheme. A member of the Van Andel family (DeVos' relations) has established rules restricting access to Holland State Park's Big Red Lighthouse appropriate to a medieval lord of the manor; Erik Prince (Betsy's brother) remains a person of interest to Robert Mueller's investigation. Anyone long familiar with the Dutch American pale of settlement in West Michigan might roll their eyes.

To West Michigan Dutch American culture, I am an outsider with one foot inside that small tent. One quarter of my personal ancestry is Dutch (my maternal grandmother, Fries to be exact), and my mother lived decades as a Dutch American expatriate in distant, foreign parts --those of industrial eastern Michigan. (Her ashes are fittingly interred in Grand Rapids.) I earned my bachelor's degree at Hope College, but only after three years at Michigan State in heathen East Lansing. So I could have been an insider, but chose otherwise. (I will say more in a subsequent post.)

Douma's eminently readable book, accessible public history well-informed by theoretical, scholarly insights, presents Dutch American ethnicity as an evolving set of internal disagreements about how to cope with an external human and natural environment very different from the particular, original locations in the small country from which the ancestors emigrated. He limits his investigation to the 19th and 20th-century Dutch immigration to the Middle West, which was only tangentially related to 17th- and 18th-century Dutch American immigration to New York and New Jersey; he also leaves aside Dutch "Indos" from Indonesia.

Location, location: the emigrants came from pre-industrial villages and small cities in Groningen, Friesland, Utrecht, and Overijssel that were transformed by industrialization and modern transportation shortly after their departure in the 19th century. They arrived in differing areas of the Middle West: West Michigan, the plains of Iowa (Pella, Orange City), and burgeoning Chicago (South Holland), and were dispersed throughout Wisconsin.

The emigrants' descendants experienced varying personal and community outcomes in urban, small city, and rural locations. Dutch-American immigrant identity largely evaporated by the 1920s in many locations except two areas with a critical mass of shared ancestry: the West Michigan axis of Holland and Grand Rapids, and the neighborhoods of Pella, Iowa (southeast of Des Moines) and Orange City (northwest Iowa). Three of those towns were anchored by colleges associated with the Reformed Church in America (RCA): Hope College (Holland, Michigan), Central College (Pella, Iowa), and Northwestern College (Orange City, Iowa). Grand Rapids became the Mecca of the Christian Reformed Church (CRC), and home of Calvin College and Theological Seminary (to become Calvin University in 2020).

The educational institutions are an important hook: Dutch Americans were justly famous for their work ethic and religious commitment. As my mother said, "God made the world, but the Dutch made Holland," referring to the dikes, sea-walls, and canals of the Netherlands, and intending the remark to mean, "therefore, get to work." Dutch Protestant Christianity of the Reformed tradition carried all the marks of Calvin's humanist character: based on texts (the Bible above all), given to theological reflection, and leaning towards pietism in a rather learned, cerebral manner. The revivalist enthusiasms of late 19th-century America were alien to Dutch temperaments, and Dutch Americans became evangelical only as those immigrant tendencies passed. Originally birthed in the afscheiding (secession) of orthodox, traditional Dutch Calvinists from the Netherlands state Protestant church (Nederlandse Hervormde Kerk) in 1834, the secessionists in America fell out amongst themselves in 1857 over the Dutch immigrants' incorporation into the American Reformed Protestant Dutch Church (now RCA), which some considered to be entirely too worldly, lax, and American.

Consequently: the Dutch colleges became involved in Reformed disputes (Hope, founded in 1851 and chartered in 1866, tied to the RCA; Calvin, founded in 1876, to the CRC; the RCA founded Northwestern College in 1882, and took control of Central College, founded 1853, in 1916). Consequently (also), Dutch Protestant religion took on a disputatious character that both nurtured and was fed by intellectual argument. Consequently (also), Dutch Americans became over-represented in skilled trades, the professions, and the sciences. West Michigan, which lacked major extractable natural resources and depended upon manufacturing and trade (with its access to the Great Lakes), owed much of its economic development to skilled labor, and to the manufacture of furniture, building materials (such as bricks), and pharmaceuticals.

Douma's book lends some weight to a view that Dutch American cultural and economic impact was not hindered but furthered by intra-Dutch immigrant debates and rivalries. In West Michigan cities the narcissism of small differences between the RCA and CRC correlated with a range of economic and cultural positions and produced varying responses to, and acceptance of, mainstream Anglo-American culture (regarding organizations such as the Freemasons, for example). Southern Michigan, originally part (with northern Ohio and Indiana) of the Midwestern "third New England," was by the mid-19th century long habituated to Yankee habits of thrift and cultural positions such as Abolitionism; the Dutch immigrants were both similar to and different from the Yankees, as well as from the numerous other ethnic minorities present (especially Eastern European). Dutch Americans were at first outsiders to the fraught American conflicts that foreshadowed the Civil War, and a number of young Dutch American men absorbed "American" habits and dispositions through war-time military service. Dutch American rivalries extended a discourse that unintentionally preserved or prolonged Dutch American identity in those areas of Michigan and Iowa that held a critical mass of Dutch descendants. In time these descendants remembered not Dutch culture so much as the culture of their grandparents or great-grandparents: "Tulip Time" in Holland, Michigan (an ethnic festival in May) harks back not so much to the Netherlands as to memories of an idealized Netherlands in the minds of the early immigrants. Dutch American identity has by now evaporated or turned into genealogical interests with a barely religious overlay. The institutions of the CRC, the RCA, and the colleges have moved on to other identities and evolving missions.

What does this tell me about teaching American evangelicalism to secular or minimally Catholic undergraduates who don't have (or sometimes want) a clue? It reminds me that cultural identities are always a work in time, evolving in changing circumstances, and apt to idealize their own pasts. Their disputes, far from weakening them (unless they become too divisive), in fact strengthen them by giving the participants something to really care about. Whether many Evangelicals' current, nearly cultic devotion to the Chief Executive will in fact divide them from their recent compatriots (those Evangelicals who did not support him) remains to be seen: how divisive will that dispute become? Douma's book also reminds me of the way that religious commitment can be felt as nearly an ethnic identity, and thoroughly entangled with multiple, sometimes conflicting other commitments.

Sometimes it can also really help when a professor includes a sufficient (but not overpowering) testimony: "here's why this subject really matters to me." I find that students often respond to genuine commitment: this is important because it expresses something close to my heart. (I have seen, believe it or not, the teaching of accounting standards enlivened in this manner.) My subsequent post will tell a bit more of my own story.

 

Joseph Smith Jr.'s role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of Clayton Christensen's theory of "disruptive innovation."

NB This is a very long post, and might be more readable in this .pdf document.

Abstract: Christensen's theory of disruptive innovation has been popularly successful but has faced increasing scrutiny in recent years. Christensen is a Latter-day Saint, and the career of Joseph Smith, Jr. and his ideas form a powerful backdrop for Christensen's theory. Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of "disruptive innovation" as an idea. Uncritical embrace of the popular theory, especially in higher education, implies acceptance of cultural assumptions and suggests that the theory is less useful than has been claimed. This essay is a reflection on history and experience, not a theoretical quid-pro-quo or take-down.

Something always struck me as oddly familiar and decidedly off-key about Christensen's confident claims that innovation would explosively disrupt American higher education.  A previous boss dismissed my views as the reflex of one in a dying profession (librarianship). 

I never accepted her cynical critique; neither could I put my finger on why I still disagreed strongly. Then, while teaching a historical survey course on American Religion, a new insight came to me about the idea's interesting and problematic deep structure. My insight sprang from some familiarity with both the discourse of innovation in higher education and 19th-century American religion, two widely known but widely separated fields. What I have found will, I hope, give pause to thoughtful educational and academic leaders. Uncritical embrace of "disruptive innovation" might implicate them in religious and cultural commitments they have not examined, especially if they lead faith-affiliated organizations.

The first backstory: Christensen first introduced disruptive innovation in 1995 in an article aimed at corporate managers rather than researchers. In 1997 he expanded his ideas in The Innovator's Dilemma, which caught on in the 2000s with managers, business writers, and consultants in a big way, with the aura supplied by Harvard Business School. Since 2011 he has predicted that as many as half of American colleges and universities would close or go bankrupt within fifteen years (so by 2021-2026). He is still at it: in April 2017 he maintained his claim when speaking to Salesforce.org's Higher Education Summit: "I might bet that it takes nine years rather than ten." Sometimes he's been wrong (about the iPhone, for example), but he does not vacillate.

Disruptive innovation has become big business, popularized not only by Christensen (who has come to regret losing control of the term), but by a host of management consultants, pundits, and experts. Among librarians, David Lewis applied Christensen's ideas in 2004 and expanded them into a book in 2016 that has been on the whole well-received by those in my "dying profession." Predictable pushback came in 2014 from another Harvard professor (of history, Jill Lepore), from a detailed reexamination in the MIT Sloan Management Review, and from others. Christensen in turn defended his ideas and reformulated some of them in 2015, on the 20th anniversary of his initial publication.

Three years later, if the concept has lost some valence or he's just wrong about higher education, why rehash this now and for the umpteenth time?

That's where the second backstory becomes relevant.  Christensen (of old Latter-day Saint stock) is not just coincidentally Mormon; that identity is central to his person and that background to his work. 

When I teach my historical survey of American Religion, in due course we come to the so-called Second Great Awakening in the first half of the 19th century. Scholars give special attention to the "Burnt-Over District" of western New York, home of many potent religious and political ideas associated with "Whig Evangelicalism": abolition, temperance, the rights of women, and reforms of public health, education, prisons, orphanages, etc. The District fostered not only mainstream Christian restorationist and evangelical movements (such as the Disciples of Christ ("Campbellites"), Methodists, and Baptists), but also countless Millennialist, New-Thought, and Spiritualist communes, Shakers, Millerites (early Seventh-Day Adventists) --and Joseph Smith Jr.'s Latter Day Saints.

Smith resists casual dismissal: was he truly a prophet of the living God (the mainstream Mormon view)? --a womanizing fraud (the view of many of his contemporaries, and critics since)? --a self-deluded prophet who eventually bought into his own fabrications and could not extricate himself (a sort of early David Koresh)? --or some kind of mystic or psychic with unusual access to otherworldly regions and the subconscious (a sort of frontier, raw-edged Francis of Assisi)?

Smith promulgated the enormous Book of Mormon (Skousen's critical first edition is 789 pages). He claimed an angel guided him to find ancient plates in a hill near Palmyra, New York, which he translated from unknown languages with the help of seer stones and a hat, dictating on-the-fly to his wife Emma Hale Smith and others, all in 65 days. Even if he made it up, or shared authorship, or plagiarized part, it is an amazing performative achievement. A densely layered anthology of documents, speakers, and authors, the text can be regarded (if not as Scripture) as an early forerunner of "magical realism." All this from a 20-something farmhand with a little primary education.

Smith was truly a "rough stone rolling" whom generations of Mormons have never managed to smooth. "No man knows my history," he is reported to have said; "I cannot tell it: I shall never undertake it. . . . If I had not experienced what I have, I would not have believed it myself." His innovative edginess strongly contrasts with the earnest, family-oriented, upbeat, corporate image of contemporary Mormons.

"Innovative" -there's that word.  In matters of religion, innovation is not always a positive idea.  Smith's most famous innovations were the origins of the Book of Mormon, and the "new and everlasting covenant" (revelation) of both eternal marriage, and the principle of plural marriage.  The last innovation directly challenged 19th century American ideas about the family, and occasioned a furious opposition of a scale rarely seen in American history (leaving aside for the moment the historical plague of racism).  Latter Day Saints were opposed in Ohio, persecuted in Missouri (the Governor ordered extermination); Smith was murdered in 1844 in Illinois by a lynch mob acting out of congeries of fears.

The subsequent succession crisis would have led to fatal splintering were it not for Brigham Young. A considerable majority of Mormons followed his leadership from Illinois in the Great Trek to Salt Lake City (1846-1848); Young's organization both preserved and transformed the Latter-day Saints: they lost their prophet but gained a hyphen. The founder's innovations would have perished without Young's tenacity, sheer longevity (he died in 1877), and "courageous leadership" or "iron-fisted rule," depending on your point of view.

These two long backstories are essential for understanding both the meteoric rise of "disruptive innovation" and its recently waning appeal as an explanatory theory in light of qualms about its accuracy.

Joseph Smith, Jr., can be seen as an exemplary disruptive innovator.

"'Disruption' describes a process whereby a smaller company with fewer resources is able to successfully challenge established incumbent businesses." While incumbents focus upon improving their product or service (especially on behalf of their most profitable customers), they tend to ignore less profitable market sectors, or exceed their needs.  Entrants seek out those ignored market segments and can gain customers by providing more suitable products or services, frequently simpler and at lower cost. Incumbents can fail to respond in kind while entrants "move upmarket," improve their offerings, maintain their advantages and early success, and eventually drive out or acquire incumbents when their formerly most profitable customers shift purchases to the entrants. (Christensen, 2015). 

Since Christensen has complained that terms have become "sloppy" and undermine his theory's usefulness, I have tried to paraphrase his core idea carefully, and to present Mormon history even-handedly.  My central claim is that Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring, cultural matrix that later facilitated the popularization of “disruptive innovation” as an idea.  Christensen's religion did not cause him to create his theory, but did contribute a framework that fostered its reception, as well as its swift extension beyond its first cases in manufacturing disc drives, construction implements, and other tangibles.

Christensen's commitment to his Church is beyond question; the intellectual, doctrinal traditions of his faith powerfully molded his character and thinking, by his own admission.  By all accounts he is a charming and sincere man, and would never have smuggled his religion into his professional thinking; he is an honest and forthright broker.  This essay is a reflection on history and experience, not a theoretical quid-pro-quo, or take-down.

Smith's movement "came to market" at a critical time in American religious history.  The Constitutional de-coupling of religious institutions from the state apparatus was one of the primary innovations of the new nation.  The new "open market" meant that incumbent legally established, financially supported Christian churches had to compete for adherents and support.  In Connecticut, for example, the "standing order" of political and religious aristocracy came to an end in 1817.  Such competition challenged the established churches' previous role as assumed arbiters of social legitimacy.  If you have a wide field of religious choices, you could also choose "none of the above." National anxieties about declining status of established forms of Christianity in large part fueled the striking resurgence of Christian groups loosely termed "evangelical." 

A significant portion of those evangelical groups can be described as "restorationist," appealing to a New Testament proclamation of God's restitution of all things. This was taken to mean a return to the "primitive church," a re-pristination of Christianity --taking it back to the putative purity of the founders. This led to a proliferation of religious bodies, almost all of which inherited the binary certitude from earlier centuries that each was "correct" and hence others were necessarily "incorrect." Each group warranted its appeals in the lost purity of the first Christians. For many groups, the path back to purity had been cleared by curbing the incumbent churches through disestablishment, and they hoped that clearance would level their status.

Since the religious marketplace of early 19th-century America had only recently been opened, doctrinal disputes that now seem arcane often paralleled heated social and cultural divisions. Smith's own family mirrored the national situation: his grandfather was a Universalist (holding that all humans could receive God's corrective grace; Universalists opposed the evangelicals); his mother has been identified as Presbyterian (officially, she would have believed that God's eternal decree predestined some to eternal blessedness and foreordained others to eternal damnation; Presbyterians tended to be allied with evangelicals). Joseph Jr. may have caught a "spark of (evangelical) Methodism" at local rural revival meetings. His maternal grandfather and parents experienced visions and voices; like many farmers they used divining rods to find water and looked for buried treasure, a kind of folk magic. They believed in prophecy and vision, tended towards skepticism about "organized religion," and were receptive to new religious ideas. He is reported to have told his mother, "I can take my Bible, and go into the woods, and learn more in two hours, than you can learn at meeting in two years, if you should go all the time."

In the religious marketplace in western New York, Smith's family were typical of a market segment often ignored by the more well-established groups, who appealed to more prosperous farmers, townspeople, and entrepreneurs (see Johnson's A Shopkeeper's Millennium). Smith's family, on the other hand, were downwardly mobile recent arrivals from Vermont without a network of support, a consequence both of poor decisions and of environmental strains such as the "Year without a Summer" (1816). They typify the impoverished, rural working class on the inner edge of the frontier, a down-market segment less promising to more prominent groups, for whom the competitive religious marketplace was particularly nettlesome.

The 14-year-old Joseph was confused by the "cry and tumult" of Presbyterians vs. Baptists vs. Methodists, all using "both reason and sophistry" to "establish their own tenets and disprove all others." He asked, "What is to be done? Who of all these parties are right; or, are they all wrong together? If any one of them be right, which is it, and how shall I know?" In other words, his market segment saw that ecclesiastical competition compromised the integrity of all parties. Reading a biblical text that directed him to ask God, he went into the woods (again!) and reported he experienced a dramatic theophany: the "Personage" answered "that I must join none of them," their "creeds were an abomination," and adherents "were all corrupt." His take-away: he realized "at a very early period of my life, that I was destined to prove a disturber and annoyer." Joseph's subsequent innovations certainly disturbed and annoyed other groups.

Care must be taken, however, in simply equating Joseph’s social location with a commercial market position, because the religious “marketplace” differs in important ways from commerce: product differentiation, lock-in, and brand loyalty.

The religious "product" is not a commodity, but a sense of living affiliation with a group that makes doctrinal, moral, and behavioral claims in such a way that simultaneous affiliation with more than one group is either prohibited or discouraged.  The ultimate outcome, some kind of eternal blessedness, in principle excludes other ultimate outcomes.  Today many children in "mixed" families can feel religious differences strongly (and opt for "none"). For example, an individual cannot be a Catholic in good standing and simultaneously an evangelical Baptist in good standing -their claims and ideas simply conflict too much; if both present in the same family, some accommodation must be reached.  Joseph Smith Jr. found such exclusive "product differentiation" troublesome.

Religious adherents' "market lock-in" is high: one might radically change one's affiliation once or twice in a lifetime, but changing more often is unusual and perhaps suspect, and "conversion" can exact high social costs.  The religious fervor of New York's Burned-Over District in Joseph Smith Jr.'s time left religious organizations in flux, so that conversion costs were often much lower than before or after.  All early Latter Day Saints nevertheless had to make a clear decision that departed from their inherited religious affiliations.

A religious group's "brand loyalty" involves a constellation of commitments; socialist Fundamentalists and alt-right Episcopalians, for example, are vanishingly rare.  The brand loyalty of early Latter Day Saints evolved from 1830 to 1844, becoming progressively stronger both in positive response to Joseph Smith Jr.'s continuing revelations and in defensive response to violent persecution.  For example, the early Saints' constellation of commitments was ambivalent towards slavery: initially, as Northerners, early adherents opposed it; then revelations and teachings evolved to allow some slave-holders to join in Missouri (a slave state).  After Smith's murder, his son Joseph Smith III and widow Emma Hale Smith repudiated both slavery and plural marriage in the Reorganized Church of Jesus Christ of Latter Day Saints in Missouri, the "minority" successor group.  By contrast, Brigham Young's larger "majority" successor not only retained plural marriage but attempted to legalize slavery in the Utah Territory.  Since Republicans, starting in 1854, sought to abolish the "twin relics of barbarism," slavery and polygamy (a jab at Young's group), it is unclear whether that commitment arose from core convictions or from defensive resistance.

"Disruptive innovation" in the religious marketplace has to be treated carefully, not only because of the special nature of that marketplace, but also because rigorous examination of the idea of "disruptive innovation" shows that it does not mean just any disruption.

Whatever the sources of Joseph Smith Jr.'s ideas, he led a movement that "gain[ed] customers (i.e., adherents) by providing more suitable, often simpler products or services, frequently at a lower cost."  (Latter-day Saints have never had professional clergy; their commitment to mutual assistance is exemplary.)  Market incumbents (more organized and better-financed competing groups) were slow to respond in kind, and as Smith's group moved "upmarket," it maintained its "advantages and early success" --high rates of "lock-in," group cohesion, and brand loyalty.  Smith's group, however, never quite succeeded in driving the "incumbents" out of the market or even acquiring most of their former customers.  Its sense of urgency lost its edge.

Why are the Latter-day Saints "latter-day"?  This code phrase refers above all to a shared set of cultural and religious assumptions and commitments in early 19th-century America.  "Millennialism" was the belief that the coming of the Kingdom of God (promised in the Bible) was imminent and that America, with its special covenant of religious liberty, would be central to its arrival.  Millennialism came in two distinct forms with opposite answers to the question, "Will Christ return before or after the promised millennium (1000 years) of peace?"  Pre-millennialists emphasized the troubles (tribulations) that would both precede and signal Christ's return to reign for 1000 years before the Last Judgement.  Post-millennialists proclaimed that the millennium of peace would culminate in Christ's return and the Last Judgement; their task was to make the world ready for the Kingdom of God.  Both expected the Kingdom very soon: we are all living in the biblical "latter days."

This recondite disagreement has important implications.  Post-millennialists championed social reforms that would make the United States so like the Kingdom of God that American society would usher in the millennium.  Pre-millennialists emphasized that Christ would come only after dramatically increasing tribulations; things getting worse and worse was a sign of his approach --hence they disfavored social reforms as a distraction from the real work of preparation for evil times.  (Historical aside: the War of Secession largely discredited post-millennialism, which morphed into the program of social reforms of the Progressive era.  Pre-millennialism evolved into dispensational Christian fundamentalism, combining the expectation of tribulation with a belief in the factual, literal inerrancy of the Bible.)

Latter-day Saints' enduring post-millennialism shows itself, among other ways, in their boundless optimism.  The familiar, earnest young missionaries (think The Book of Mormon, the Broadway show) are a token of the continuing commitment of the Latter-day Saints to usher in the latter days, although they now expect them less imminently.  Millennialism is no longer common coin.  Despite the popularity of the "Left Behind" series of books and movies, only a small minority of survivalists or "preppers" appeal to Biblical warrants for their expectations of imminent tribulation (disaster).

Detached from Christianity, expectations of imminent disaster and rebirth went rogue in American culture long ago.  Silicon Valley today, for example, is home to many who expect a "singularity" in which artificial intelligence outstrips human intelligence and introduces profound changes to civilization as a whole --another sort of secular millennium, in which technology has replaced a Messiah as the anointed power.  Popular movies and books have made dystopia a cultural cliché.  (What's the disaster this time? Nuclear war, apes, viruses, climate change, or the abrupt disappearance of millions?)  How many jokes about "voting in the last American election" (double entendre) play on similar fears?

"Disruptive innovation's" popularity exploded in the 1990s and 2000s exactly because of the hopes and fears raised by the advent of the Internet, its devices, and social media.  Josh Linkner warned, "disrupt or be disrupted" (The Road to Reinvention, 2014), and that binary choice spoke in apocalyptic tones to incumbent mass media, libraries, bookstores, journalists, travel agents, financial consultants, university presidents, and anyone else who deals in "information" as a commodity.  Such urgent warnings shout to established corporations, "The end is near: you can't innovate fast enough; you're not even the right people to do it."  Incumbent organizations were counted out simply because of their incumbency: MOOCs would inevitably disrupt brick-and-mortar educational institutions, now denigrated because of their mere physicality.

The popular version of "disruptive innovation" played on dystopian fears of the collapse of the known "incumbent" corporations and the rise of an economy of perpetual disruption --Schumpeter's capitalist creative destruction recast as "disruptive innovation" with a brutalist, binary emphasis: disrupt or be disrupted.  The archetypal "creative disruptor" is the technological whiz-kid (I nominate the Mark, "Zuck Almighty") whose revelatory "Book of Faces" and continuing revelations of a "new and everlasting platform" will usher in a thousand-year era of effortless, limitless, and unfailingly upbeat social confabulation.  Except when many kinds of terrorists, Russian autocrats, vaccine deniers, and deranged stalkers confabulate as well.

What does this have to do with Clayton Christensen?  Well, both a little and a lot.  He cannot deny his own popularization of his ideas through his books, media appearances, securities fund (the short-lived Disruptive Growth Fund, launched in 2000 at just the wrong time), and army of students, friends, and defenders such as Thomas Thurston in TechCrunch.  He lost control of "disruptive innovation" as a term of art precisely because of its appeal to those who make a living from in-your-face, counterintuitive claims.  Lepore identified circular reasoning in the popular idea of creative disruption ("If an established company doesn't disrupt, it will fail, and if it fails it must be because it didn't disrupt").  This logical circle may or may not characterize highly disciplined case studies of Christensen's theory, but it certainly rings true of the endless popular iterations.

Whether Christensen's theory holds up almost does not matter to "disruptive innovation" as a popular idea.  By analogy, in Smith's America, as Terryl Givens has noted, what mattered about the Book of Mormon was not its teachings or particular message.  "It was the mere presence of the Book of Mormon itself as an object that . . . served as concrete evidence that God had opened the heavens again."  In that era all manner of revelations appeared: the Shakers' Holy, Sacred, and Divine Roll and Book, the visionary experiences of Ellen G. White (one of the originators of the Seventh-day Adventists), and the visions of rival claimants to Smith's prophetic mantle among the Latter Day Saints after his death.  Kathleen Flake has noted, "Henry Ford wanted a car in every home. Joseph Smith was the Henry Ford of revelation. He wanted every home to have one, and the revelation he had in mind was the revelation he'd had, which was seeing God."  The heavens, once opened, proved harder to close.

The popular idea of "creative disruption" has attached itself, meme-like, to a lot of second- and third-rate scams.  Business theory has fewer brightly defined disciplinary boundaries than physics.  King and Baatartogtokh's conclusion that the theory has limited predictive power does not render Christensen's ideas useless, but it does suggest that "disruptive innovation" will not be the "one theory to rule them all," and in the profits (or prophets) bind them.

Joseph Smith Jr. claimed that the encounter he had with the Holy in the woods warned him not to join any of the (Protestant) groups in his vicinity, whose creeds were all "corrupt" and "an abomination."  Christian restorationists called the very early Christian movement, in the short period reflected in the New Testament texts, "the primitive church," and regarded all subsequent developments as not merely wrong but apostate: those who knew the truth had deliberately denied it.  Joseph Smith Jr. saw his new revelation as a giant delete key pressed on all of Christian history --Eastern Orthodox, Catholic, and Protestant.  All of it had to go.

In a similar manner, popular "disruptive innovation" connotes the destruction of all that is wrong with sclerotic corporate capitalism, and the restoration of the pure "invisible hand" of the marketplace that allegedly guided early capitalists.  This popular view resonates with a great deal of cultural and political libertarianism: the conviction that giant corporations and government bureaucracy are an apostasy betraying the true faith handed down from the founders (either Jesus or Adam Smith, as you wish).  "Move fast and break things," advised the Zuck; what can be disrupted should be disrupted.  Including, it would now seem, democracy wherever it might be found.

Disciplined use of the theory of "disruptive innovation" in carefully defined circumstances provides explanatory clarity, but its predictive power is in fact more of a hope than an established fact, despite the protests of Christensen's defenders.  This means that it is one theory among other theories: Michael Porter's theory of competitive advantage and multifactorial analyses will likely work equally well in other carefully defined situations.  Similarly, The Church of Jesus Christ of Latter-day Saints has found ways of regarding other religious groups positively (even Evangelicals, often the most hostile), and has moved on from the language of "apostasy."  Originally intent on hitting that giant delete key, subsequent Latter-day Saints have found a way to live as active readers of their particular texts in the midst of many other readers of many other texts.  This has relevance on the ground: given official LDS teachings regarding divorce and homosexuality, some LDS families have found informal means to include and tolerate differences among their members, coming to resemble the family life Joseph Smith Jr. knew as a boy.  (Others have continued to shun their "apostates.")

Unlike Smith, Christensen never intended to promulgate a "unified field theory" of religion or business development, and he is not completely responsible for losing control of his theory as a popular idea.  The close of his 2015 re-evaluation of the theory's first twenty years, "We still have a lot to learn," acknowledges that "disruption theory does not, and never will, explain everything about innovation specifically or business success generally."

Christensen's modesty still did not inhibit him from doubling down on his claim that half of American colleges and universities would close by 2025.  Allowing his claim relative rather than revelatory validity dispels the apocalyptic fears of barbarians at the gates.  His primary analogy in The Innovative University (2011) is "changing the DNA of American Higher Education from the Inside Out" (the subtitle).  He claims that all American colleges and universities other than a branch of Brigham Young University in Idaho share the DNA of Harvard: all these institutions, apparently, want to become Research-1 universities if given the money and the chance.  What does that really mean, and is it really true?  Such a simple analogy does grave injustice to community colleges (vital economic links for countless communities and immigrants), specialized schools such as Maine Maritime Academy, and even an elite liberal arts college such as DePauw University.  The popular view that higher education has changed little and can change little is flat wrong: ask any productive historian of higher education.  Change and innovation (whether disruptive or otherwise) will not appear equally and everywhere overnight.  The higher education sector is not (thank heavens) Silicon Valley or McKinsey & Co.

Yet all is not well: the economic model underpinning American higher education is likely unsustainable in the coming decades, for many reasons.  Higher education also represents a huge social and financial investment that is unlikely to dissipate.  Distance education, information technology, changing social expectations, and shifting demographics will all play a role in whether particular colleges and universities remain viable.  Disciplined use of the theory of "disruptive innovation" will likely hold some, but not all, of the explanatory and predictive keys.  The truth is out there, but it will be much more complex.

The striking persistence of popular "disruptive innovation" in senior management circles (typified by the Salesforce.com higher education event) reveals not only persistent fears and enduring threats, but also short attention spans devoted to keeping up with the swift pace of too many events.  I suspect that popular "disruptive innovation" functions in a manner more affiliative than explanatory: "If you don't get it, you're not one of us. --You think Jill Lepore, or King and Baatartogtokh, might be right, eh?  Let's see how long you last in the C-Suite" (--especially if you can pronounce the latter's name).

"Disruptive innovation" elicits fears useful to those who want to shake up certain professions in health, law, librarianship, and the professoriate, but by now it has been overused.  At librarians' meetings (ALA, ACRL) I have developed the habit of responding to the expression "disruptive innovation" with the question, "What are you selling?"  Fear sells "solutions"; its potency as a means of marketing continues nearly unrivaled.  No one ever sold an expensive library services platform with the phrase, "This won't solve all your problems."  Since 1985 I have sat through many presentations that predicted the closure of libraries within ten years --Christensen's remark "I might bet that it takes nine years rather than ten" would find a new audience.  We who are about to be disrupted salute you, Prophet.

Nevertheless: printed books, asking questions, research assistance, and personal relationships with library instructors endure.  They were warned, but they persisted.  It is past time to find a more accurate analysis and predictive theory of the future of libraries and higher education.

If you know Yewno, and if Yewno, exactly what do you know? --that "exactly what" will likely contain machine-generated replications of problematic human biases.

This is the third of my "undiscovered summer reading" posts; see also the first and second.

At the recent Association of College and Research Libraries conference in Baltimore I came across Yewno, a search-engine-like discovery or exploration layer that I had heard about.  I suspect that Yewno, or something like it, could be the "next big thing" in library and research services.  I have served as a librarian long enough both to be very interested and to be wary at the same time --so many promises have been made by the commercial information technology sector, and the reality has fallen far short --remember the hype about discovery services?

Yewno is a so-called search app; it "resembles a search engine --you use it to search for information, after all --but its structure is network-like rather than list-based, the way Google's is. The idea is to return search results that illustrate relationships between relevant sources" --mapping them out graphically (like a mind map).  Those words are quoted from Adrienne LaFrance's Atlantic article on the growing understanding of the Antikythera mechanism as an example of computer-assisted associative thinking (see, all these readings really do come together).  LaFrance traces the historical connections between "undiscovered public knowledge," Vannevar Bush's Memex (machine) in the epochal As We May Think, and Yewno.  The hope is that through use of an application such as Yewno, associations could be traced between ancient time-keeping, Babylonian and Arabic mathematics, medieval calendars, astronomy, astrological studies, ancient languages, and other realms of knowledge.  At any rate, that's the big idea, and it's a good one.

So who is Yewno meant for, and what's it based on?

LaFrance notes that Yewno "was built primarily for academic researchers," but I'm not sure that's strictly true.  When I visited the Yewno booth at ACRL, I thought several things at once: 1) this could be very cool; 2) this could actually be useful; 3) this is going to be expensive (though I have neither requested nor received a quote); and 4) someone will buy them, probably Google or another technology octopus.  (Subsequent thought: where's Google's version of this?)  I also thought that intelligence services and corporate intelligence advisory firms would be very, very interested --and indeed they are.  Several weeks later I read Alice Meadows' post, "Do You Know About Yewno?" on the Scholarly Kitchen blog, and her comments put Yewno in clearer context.  (Had I access to Yewno, I would have searched "yewno.")

Yewno is a start-up venture by Ruggero Gramatica (if you're unclear, that's a person), a research strategist with a background in applied mathematics (Ph.D., King's College London) and business (M.B.A., University of Chicago).  He is the first-named author of "Graph Theory Enables Drug Repurposing," a paper (DOI) in PLOS One that introduces:

a methodology to efficiently exploit natural-language expressed biomedical knowledge for repurposing existing drugs towards diseases for which they were not initially intended. Leveraging on developments in Computational Linguistics and Graph Theory, a methodology is defined to build a graph representation of knowledge, which is automatically analysed to discover hidden relations between any drug and any disease: these relations are specific paths among the biomedical entities of the graph, representing possible Modes of Action for any given pharmacological compound. We propose a measure for the likeliness of these paths based on a stochastic process on the graph.
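To make that method a bit more concrete, here is a minimal sketch (my own, not the paper's or Yewno's code) of path-based inference over a graph of textual relations.  The toy graph, its edge weights, and the scoring rule are all illustrative assumptions; the real system builds its graph automatically from the biomedical literature and scores paths with a stochastic measure of likeliness rather than a simple product of weights.

```python
# Toy path-based inference between a drug and a disease over a relation graph.
# The entities, edges, weights, and scoring rule below are hypothetical.
import networkx as nx

G = nx.Graph()
# Each edge stands for a relation mined from text, with an assumed confidence weight.
G.add_edge("aspirin", "COX-2", weight=0.9)
G.add_edge("COX-2", "inflammation", weight=0.8)
G.add_edge("inflammation", "colorectal cancer", weight=0.6)
G.add_edge("aspirin", "platelet aggregation", weight=0.7)
G.add_edge("platelet aggregation", "stroke", weight=0.5)

def path_score(graph, path):
    """Crude stand-in for a likeliness measure: the product of edge weights along the path."""
    score = 1.0
    for a, b in zip(path, path[1:]):
        score *= graph[a][b]["weight"]
    return score

drug, disease = "aspirin", "colorectal cancer"
for p in sorted(nx.all_simple_paths(G, drug, disease, cutoff=4),
                key=lambda p: path_score(G, p), reverse=True):
    print(" -> ".join(p), round(path_score(G, p), 3))
```

Each surviving path is a candidate "mode of action"; the interesting work lies in building the graph from language and in scoring the paths well.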

Yewno does the same thing in other contexts:

an inference and discovery engine that has applications in a variety of fields such as financial, economics, biotech, legal, education and general knowledge search. Yewno offers an analytics capability that delivers better information and faster by ingesting a broad set of public and private data sources and, using its unique framework, finds inferences and connections. Yewno leverages on leading edge computational semantics, graph theoretical models as well as quantitative analytics to hunt for emerging signals across domains of unstructured data sources. (source: Ruggero Gramatica's LinkedIn profile)

This leads to several versions of Yewno: Yewno Discover, Yewno Finance, Yewno Life Sciences, and Yewno Unearth.  Ruth Pickering, the company's co-founder and Chief Business Development & Strategy Officer, comments, "each vertical uses a specific set of ad-hoc machine learning based algorithms and content. The Yewno Unearth product sits across all verticals and can be applied to any content set in any domain of information."  Don't bother calling the NSA --they already know all about it (and probably use it, as well).

Yewno Unearth is relevant to multiple functions of publishing: portfolio categorization, the ability to spot gaps in content, audience selection, editorial oversight and description, and other purposes that improve a publisher's position, both intellectually and in the information marketplace.  So Yewno Discover is helpful for academics and researchers, but the whole of Yewno is also designed to relay more information about them to their editors, publishers, funders, and those who will in turn market publications to their libraries.  Elsevier, Ebsco, and ProQuest will undoubtedly appear soon in librarians' offices with Yewno-derived information, and that encounter could prove truly intimidating.  So Yewno might be a very good thing for a library, but not simply an unalloyed very good thing.

So what is Yewno really based on? The going gets more interesting.

Meadows notes that Yewno's underlying theory emerged from the field of complex systems, at the foundational level of econophysics, an inquiry that aims to describe economic and financial cycles using mathematical structures derived from physics.  The mathematical framework, involving uncertainty, stochastic (random probability distribution) processes, and nonlinear dynamics, came to be applied to biology and drug discovery (hello, Big Pharma).  This kind of information processing is described in detail in a review article, "Deep Learning," in Nature (Vol. 521, 28 May 2015, doi:10.1038/nature14539).  Deep learning, a development of machine learning, "allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction."  Such deep learning "discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer."  "Deep convolutional nets" have brought about significant breakthroughs in processing images, video, and speech, while "recurrent nets" have brought new learning powers to "sequential data such as text and speech."
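To make "multiple processing layers" and backpropagation concrete, here is a minimal, self-contained sketch (mine, not the review's) of a two-layer network trained by backpropagation on a toy problem; the architecture, data, and learning rate are all illustrative assumptions.

```python
# A tiny two-layer network trained by backpropagation on XOR (toy illustration).
# Each layer computes its representation from the previous layer's representation;
# backpropagation tells each internal parameter how to change to reduce the error.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)   # first processing layer
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)   # second processing layer
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    # forward pass: layer-by-layer representations
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: error signals propagated back through the layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # each parameter is nudged in proportion to its contribution to the error
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))   # should approach [[0], [1], [1], [0]]
```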

The article goes on in great detail, and I do not pretend to understand very much of it.  Its discussion of recurrent neural networks (RNNs), however, is highly pertinent to libraries and discovery.  The backpropagation algorithm is basically a process that adjusts the weights used in machine analysis while that analysis is taking place.  For example, RNNs "have been found to be very good at predicting the next character in the text, or next word in a sequence," and by such backpropagation-driven adjustments machine translations have achieved greater levels of accuracy.  (But why not complete accuracy? --read on.)  The process "is more compatible with the view that everyday reasoning involves many simultaneous analogies that each contribute plausibility to a conclusion."  In their conclusion, the authors expect that "systems that use RNNs to understand sentences or whole documents will become much better when they learn strategies for selectively attending to one part at a time."
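Here is a comparably small sketch of the "predict the next character" idea, written with PyTorch; the corpus, model sizes, and training schedule are my own illustrative assumptions, not anything published by the review's authors or by Yewno.

```python
# Toy character-level RNN: given the characters so far, predict the next one.
# Backpropagation (through time) adjusts the weights while training is under way.
import torch
import torch.nn as nn

text = "where is the knowledge we have lost in information "
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text])
x, y = data[:-1].unsqueeze(0), data[1:].unsqueeze(0)   # inputs and next-character targets

class CharRNN(nn.Module):
    def __init__(self, vocab, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, 32)
        self.rnn = nn.RNN(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, idx):
        h, _ = self.rnn(self.embed(idx))
        return self.head(h)            # a score for every possible next character

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(300):
    logits = model(x)
    loss = nn.functional.cross_entropy(logits.view(-1, len(chars)), y.view(-1))
    opt.zero_grad()
    loss.backward()                    # backpropagation computes the weight adjustments
    opt.step()                         # ...and applies them, mid-analysis

print(round(loss.item(), 4))           # falls steadily as the model memorizes the text
```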

After all this, what do you know? Yewno presents the results of deep learning through recurrent neural networks that identify nonlinear concepts in a text, a kind of "knowledge." Hence Ruth Pickering can plausibly state:

Yewno's mission is "Knowledge Singularity" and by that we mean the day when knowledge, not information, is at everyone's fingertips. In the search and discovery space the problems that people face today are the overwhelming volume of information and the fact that sources are fragmented and dispersed. There's a great T.S. Eliot quote, "Where's the knowledge we lost in information," and that sums up the problem perfectly. (source: Meadows' post)

Ms. Pickering perhaps revealed more than she intended.  Her quotation from T.S. Eliot is found in a much larger and quite different context:

Endless invention, endless experiment,
Brings knowledge of motion, but not of stillness;
Knowledge of speech, but not of silence;
Knowledge of words, and ignorance of the Word.
All our knowledge brings us nearer to our ignorance,
All our ignorance brings us nearer to death,
But nearness to death no nearer to GOD.
Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?
The cycles of Heaven in twenty centuries
Bring us farther from GOD and nearer to the Dust. (Choruses from The Rock)

Eliot's interest is in the Life we have lost in living, and his religious and literary use of the word "knowledge" signals the puzzle at the very base of econophysics, machine learning, deep learning, and backpropagation algorithms.  Deep learning performed by machines mimics what humans do, their forms of life.  Pickering's "Knowledge Singularity" alludes to the semi-theological vision of Ray Kurzweil's millennialist "Singularity": a machine intelligence infinitely more powerful than all human intelligence combined.  In other words, where Eliot is ultimately concerned with Wisdom, the Knowledge Singularity is ultimately concerned with Power.  Power in the end means power over other people; otherwise it has no social meaning apart from simply more computing.  Wisdom interrogates power, and questions its ideological supremacy.

For example, three researchers at the Center for Information Technology Policy at Princeton University have shown that "applying machine learning to ordinary human language results in human-like semantic biases" ("Semantics derived automatically from language corpora contain human-like biases," Science, 14 April 2017, Vol. 356, issue 6334: 183-186, doi:10.1126/science.aal4230).  The results of their replication of a spectrum of known biases (as measured by the Implicit Association Test) "indicate that text corpora contain recoverable and accurate imprints of our historic biases, whether morally neutral as towards insects or flowers, problematic as towards race and gender, or even simply veridical, reflecting the status quo distribution of gender with respect to careers or first names."  Their approach holds "promise for identifying and addressing sources of bias in culture, including technology."  The authors laconically conclude that "caution must be used in incorporating modules constructed via unsupervised machine learning into decision-making systems."  Power resides in such decisions about other people, resources, and time.

Arvind Narayanan, who published the paper with Aylin Caliskan and Joanna J. Bryson, noted that "we have a situation where these artificial-intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from."  The Princeton researchers developed an experiment with GloVe, a program developed at Stanford that represents the co-occurrence of words and phrases, and used it to replicate the Implicit Association Test in machine-learning terms.  The Stanford researchers had turned GloVe loose on roughly 840 billion words from the Web; the Princeton team then looked for co-occurrences and associations of words such as "man, male" or "woman, female" with "programmer, engineer, scientist" or "nurse, teacher, librarian."  They found familiar biases in the distributions of associations, biases that can "end up having pernicious, sexist effects."
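For readers who want to see what such an association measurement looks like in code, here is a sketch in the spirit of the paper's embedding association test; the tiny four-dimensional vectors are made up purely to show the arithmetic (real studies use learned GloVe vectors of hundreds of dimensions).

```python
# Sketch of a word-embedding association measure (in the spirit of the Princeton study).
# The vectors below are invented for illustration; real GloVe vectors are learned
# from word co-occurrence statistics over billions of words.
import numpy as np

vec = {
    "man":        np.array([ 0.9, 0.1, 0.3, 0.0]),
    "woman":      np.array([-0.8, 0.2, 0.3, 0.0]),
    "programmer": np.array([ 0.7, 0.0, 0.5, 0.1]),
    "engineer":   np.array([ 0.6, 0.1, 0.4, 0.2]),
    "nurse":      np.array([-0.7, 0.2, 0.4, 0.1]),
    "librarian":  np.array([-0.5, 0.3, 0.5, 0.0]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def gender_association(word):
    """Positive: the word sits closer to 'man'; negative: closer to 'woman'."""
    return cosine(vec[word], vec["man"]) - cosine(vec[word], vec["woman"])

for w in ("programmer", "engineer", "nurse", "librarian"):
    print(f"{w:>10}: {gender_association(w):+.2f}")
```

If the vectors were learned from a biased corpus, the signs and magnitudes of these differences recover the bias --which is the paper's point.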

For example, machine-learning programs can translate foreign languages into sentences that reflect or reinforce gender stereotypes.  Turkish uses a gender-neutral third-person pronoun, "o."  Plugged into the online translation service Google Translate, however, the Turkish sentences "o bir doktor" and "o bir hemşire" are translated into English as "he is a doctor" and "she is a nurse." . . . . "The biases that we studied in the paper are easy to overlook when designers are creating systems," Narayanan said.  (Source: Princeton University, "Biased Bots," by Adam Hadhazy.)

Yewno is exactly such a system insofar as it mimics human forms of life, which include, alas, the reinforcement of biases and prejudice.  So in the end, do you know Yewno, and if Yewno, exactly what do you know? --that "exactly what" will likely contain machine-generated replications of problematic human biases.  Machine translations will never offer perfect, complete translations of languages, because language is never complete --humans will always use it in new ways, with new shades of meaning and connotations of plausibility, because humans go on living in their innumerable, linguistic forms of life.  Machines have to map language within language (here I include mathematics as a kind of language with distinctive games and forms of life).  No "Knowledge Singularity" can occur outside of language, because it will be made of language: but the ideology of the "Singularity" can conceal its origins in many forms of life, and thus appear "natural," "inevitable," and "unstoppable."

The "Knowledge Singularity" will calcify bias and injustice into an everlasting status quo unless humans, no matter how comparatively deficient, resolve that knowledge is not a philosophical problem to be solved (as in Karl Popper's Worlds 1, 2, and 3), but a puzzle to be wrestled with and contested in many human forms of life and language (Wittgenstein).  Only by addressing human forms of life can we ever address the greater silence and the Life that we have lost in living.  What we cannot speak about, we must pass over in silence (Wovon man nicht sprechen kann, darüber muss man schweigen, sentence 7 of the Tractatus) --and that silence, contra both the positivist Vienna Circle and Karl Popper (who was never part of it), is the most important part of human living.  In the Tractatus Wittgenstein dreamt, as it were, of a conclusive solution to the puzzle of language --but such a solution can only be found in the silence beyond strict logical (or machine) forms: a silence of the religious quest beyond the ethical dilemma (Kierkegaard).

This journey through my "undiscovered summer reading," from the Antikythera mechanism to the alleged "Knowledge Singularity," has reinforced my daily, functional belief that knowing is truly something that humans do within language and through language, and that the quest which makes human life human is careful attention to the forms of human life, and to the way that language, mathematics, and silence are woven into and through those forms.  The techno-solutionism inherent in educational technology and library information technology --no matter how sophisticated-- cannot undo the basic puzzle of human life: how do we individually and socially find the world?  (Find: in the sense of locating, of discovering, and of characterizing.)  Yewno will not lead to a Knowledge Singularity, but to derived bias and reproduced injustice, unless we acknowledge its limitations within language.

The promise of educational and information technology becomes more powerful when approached with modesty: there are no quick, technological solutions to the puzzles of education, of finance, of information discovery, of "undiscovered public knowledge."  What those of us who are existentially involved with the much-maligned, greatly misunderstood, and routinely dismissed "liberal arts" can contribute is exactly what makes those technologies humane: a sense of modesty, proportion, generosity, and silence.  Even to remember those at this present moment is a profoundly counter-cultural act, a resistance to the techno-ideology of unconscious bias and entrenched injustice.

American Values Religious Voices: 100 Days. 100 Letters "has brought together a diverse group of scholars to write letters to [the executive and his] administration, and our elected officials in the House and Senate. These letters articulate core American values that are rooted or reflected in our various faith traditions."

The scholars joined the founder of this idea, Rabbi Andrea L. Weiss, Associate Professor of Bible at the Hebrew Union College-Jewish Institute of Religion in New York City.  She felt that the words and actions of the campaign and its aftermath call into question fundamental values that have long defined this country and its culture.  She and the writers hope that their observations will spark ideas and insights about what it means to be an American, and will contribute to our national discourse.

The group is non-partisan, and characterized by a sense of relevancy and urgency.  The contributors show the kind of diversity that makes us who we are as a nation: religions, races, generations, genders, ages, political viewpoints, sexual orientations, and geographic locations.

Join with me in subscribing to these letters (the form is in the footer of the campaign's home page).  As of today (January 22) three letters have been written and sent, by Rabbi Weiss; John Kutsko, Director of the Society of Biblical Literature and Professor of Biblical Studies at Emory University; and Eric Barreto, Weyerhaeuser Associate Professor of New Testament at Princeton Theological Seminary.

Will this campaign make any difference?  To the current incumbent, perhaps not.  It is vital for the rest of us to stand up, however, and show both the clarity and restraint that are the best values of all those who work in higher education, whether as teaching scholars or in other roles.

Please also write to express your support, to [email] --this is a vital witness at a perilous time in our country's history, and a time when we need to re-affirm the key values of religious diversity, principled argumentation, respect for facts, respect for differences of point of view, and humility.  (And see Frank Bruni in the January 22, 2017 New York Times regarding humility.)

The central problems of Hapsburg Vienna around 1900 --the nature and limits of language, expression, and communication, authenticity, and symbolic expression-- are all involved, in a remarkably similar manner, in the political and social events that crystallized in America in 2016.

Events during 2016 (and not just the US election) have caused me to re-assess my work and many of the assumptions of my life, from the ground up.  What do I mean when I say I'm a Christian, and how does that really come out in my life?  What are my fundamental commitments?  How do I identify myself as a reflective, thoughtful person in a culture that has no room for reflection or thoughtfulness, and a most casual disregard for basic matters of truth?

So I began again, re-examining the authors and composers who have affected me most from my beginnings: the Greeks; the Greek New Testament; the writings of John Calvin and Karl Barth; George Herbert, W. H. Auden, T. S. Eliot; J. R. R. Tolkien; Søren Kierkegaard, Ludwig Wittgenstein; J. S. Bach; Johannes Brahms, to name a few.  (I am very aware that those are entirely European and male, but I have to be honest about how and what I learned, and when, and who I am.)

As regards Wittgenstein, I also have returned to renew my reading of German, with Hermann Hesse's Demian in a dual-language edition, and to the intellectual world of 19th- and early 20th-century German-language writers and thinkers, especially in Vienna.  Hence I returned to Wittgenstein's Vienna, by Allan Janik and Stephen Toulmin (Simon & Schuster, 1973), a book that still holds up well more than 40 years later.

The authors mount a multi-pronged argument: that difficulties regarding communication, language, and expression arose in late-Hapsburg, fin de siècle Vienna that became ripe for philosophical investigation, and that these difficulties were shared across a wide spectrum of artists, musicians, writers, physicians (including psychoanalysts), and scholars --almost all of whom knew each other, at least socially.  The central figure is the now-not-so-well-known Karl Kraus and his scathing critique of Viennese bourgeois and aristocratic life and social realities (almost all of which were officially denied), together with the loose circle of Krausians such as Otto Wagner and Adolf Loos (architecture), Arthur Schnitzler, Hugo von Hofmannsthal, and Robert Musil (literature), Arnold Schoenberg (music), and Gustav Klimt, Oskar Kokoschka, and Egon Schiele (art).  Musil satirically dubbed the semi-fictional empire of Der Mann ohne Eigenschaften (The Man Without Qualities) "Kakania," a play on kaiserlich-königlich or kaiserlich und königlich ("imperial-royal" or "imperial and royal"), expressions that provided status indicators in the complicated social hierarchy of the time and equally concealed the dishonesty and deceit at the core of the Hapsburg state.

Janik and Toulmin articulate their central thesis most clearly:

We have spotlighted the problem of language in Hofmannsthal, because this serves to introduce and illustrate our own central hypothesis about Viennese culture --namely that to be a fin de siècle Viennese artist or intellectual, conscious of the social realities of Kakania, one had to face the problem of the nature and limits of language, expression, and communication. . . . By the year 1900, the linked problems of communication, authenticity, and symbolic expression had been faced in parallel in all the major fields of thought and art. . . . So the stage was set for a philosophical critique of language, given in completely general terms. (pages 117 and 119, authors' emphasis)

Janik and Toulmin then examine how this task presented itself to thinkers such as Ernst Mach, the Kantian traditionalists, and the "anti-philosophers" following Schopenhauer, Kierkegaard, and (to some extent) Mauthner.  They then place Wittgenstein (the early Wittgenstein of the Tractatus) in this context.

These central problems of the nature and limits of language, expression, and communication, authenticity, and symbolic expression are all involved --in a remarkably similar manner-- in the political and social events that crystallized in 2016, although tendencies toward strident nationalism, racism, and the remarkable coarsening of political and social communication were well apparent decades before.  Social media have created a post-factual communication of deceit that is routinely denied by the networks' owners, who benefit from the confusion and monetized deceit.  American society faces a number of problems that simply cannot be addressed given the terms of communication and expression prevailing before and in 2016.  Like the creaking, only semi-functional Kakanian state, the American state and culture form a shaken, badly sagging structure (and infrastructure), and it does not require unusual foresight to anticipate any number of future disasters, whether ecological, political, social, cultural, or military.

Like the blissfully ignorant inhabitants of Kakania in the early 1900s, we 21st-century Americans may all have completely missed the ominous and destructive potential of our times, in large part as a function of inauthentic language, expression, and communication gone wild.  Our language and media have lost their mooring in authentic use and in a useful social understanding of justice.  I will be reading more Wittgenstein, and more about him, and writing here.  We may be facing, as Karl Kraus' writings have it, Die letzten Tage der Menschheit (The Last Days of Humanity).