O God, whose only Son, by his life, death, and resurrection, has won for us the rewards of eternal salvation: grant that, meditating on these mysteries in the most holy Rosary of the Blessed Virgin Mary, we may profit from the lessons they contain and obtain what they lead us to hope for. Through the same Jesus Christ, your Son our Lord. Amen.

A longer perspective drawn from religious traditions can offer significant insights into what seems to have become a cultural cul-de-sac of social media.

Iona Abbey, Easter 2013

Cal Newport's work on Digital Minimalism (see the previous long entry) has led him to "stumble across this growing tension between social media and religion," which he details in his April 8 post. The real issue is not the existence of social media itself, but the manner in which it fragments and dissipates focus and attention --and attention is a common element of religious practices in many streams of faith and tradition, both Eastern and Western.

I noted previously that Newport pays attention to figures as diverse as Aristotle, Thoreau, and Abraham Lincoln, and acknowledges the depths of the discussion of focus and human flourishing, "but he avoids getting pulled off task." Well and good --but the relation between focus, attention, and human flourishing (becoming a better human being) is a very old concern with a long, long literature (to say the least!). Newport's book is not primarily addressed to those in any faith community or tradition. Those who are in such a community, however, cannot fail to pick up the resonances of this discussion.

One might first point out that social media is hardly the first significant distraction to come around, and the idea that "social media might be accidentally undermining religion" needs perspective. Religious practices, whether or not reified into something that contemporary Westerners understand as a "religion," have been around a very long time. Social media is not yet quite 20 years old ("The Facebook" launched February 2004; Friendster, 2002). The view that social media can or will quickly and significantly undermine practices going back thousands of years could well be superficial. Intellectual ideas about what a religion is have changed, and practices have changed; but those practices have been severely challenged before and somehow persisted. Will runaway social media usage really accomplish what centuries of pogroms, persecutions, and cultural coopting have failed to do? Richard Dawkins and Sam Harris should be so lucky.

This observation does not dismiss Newport's question. That social media do in fact seek to capture, control, fragment, and dissipate users' attention is a feature, not a bug. They do that by design, as Newport showed and skewered brilliantly. This is why the great Zuck's "regrets" about fake news and hate speech ring so hollow: those who write and spread fake news and engage in hateful communication are using social media exactly as intended. Social media was meant for everyone to share, and do the trolls share! The idea that the platform does not foster its content is ideological sleight of hand, a fib to deny culpability. Much of the content of social media is nevertheless utter poison, and it might be even more threatening to many religious traditions than the fracturing of attention itself.

A longer perspective can offer significant insights into what seems to have become a cultural cul-de-sac.

Decades ago I was privileged to study with Diogenes Allen at Princeton Theological Seminary when he was in a very productive period of his life. An Oxford (Rhodes) scholar, he was a searching and provocative reader of texts classic and modern, and three works in particular have stuck with me: Kierkegaard's The Sickness Unto Death and Works of Love, and Simone Weil's "Forms of the Implicit Love of God" (with her related "Reflections on the Right Use of School Studies with a View to the Love of God"). These are my sources for the following reflections.

The fragmentation of attention was brilliantly and lucidly addressed by Kierkegaard in 1849 in Sygdommen til Døden (The Sickness Unto Death). SK was deeply enamoured of the Danish language, and thoroughly embedded in Danish history: the events of the unsuccessful revolution of 1848 in Copenhagen "have been of world-historical significance and have overturned everything . . . . Every system has been exploded. In the course of a couple of months, the past has been ripped away from the present with such passion that it seems like a generation has gone by." (From The Point of View for My Work as an Author, written in 1848 and later published in 1859, quoted again below.) In other words, a world not unlike our own in many ways.

But when "the threads of intelligence broke," and when "everyone who has, in various ways, been a spokesman in the past has been reduced either to silence or to the embarrassment of being forced to purchase a brand-new suit of clothing," a new way forward could break open, and SK searched for language and forms of life for that new world. He found it in his word for despair, fortvivlelse, an for- intensification of tvivl (doubt or question); for+tvivlelse is both mega-doubt and raises the stakes: meta-doubt. In despair, such meta-doubt is to be of two minds: not only split apart from one's self, but split apart from God. An anti-Hegelian, SK saw an irresoluble polarity of temporal/eternal, freedom/necessity, consciousness/unconsciousness in human experience, which is fundamentally an experience of divided-ness or fragmentation.

SK rings the changes on the forms of fortvivlelse (despair) through these polarities. All humans experience despair or fragmentation; the depth of despair is to live unaware that one is in despair: the fragmentation that admits no fragmentation. This condition is universal, and the only way out of it is for the self to live "by relating itself to its own self, and by willing to be itself the self is grounded transparently in the Power which posited it" (English transl. by Lowrie, p. 147). To be "grounded transparently" is to do the works of love, which is the Summary of the Law.

The person who truly admits, holds on to, and focuses this despair (fortvivlelse) will open oneself to the Reality of the world (or creation) in its beauty and order, in service to others, in respect for practices that focus this attention, and in friendship. These are, in Simone Weil's terms, the forms of the implicit love of God. God (the Power which posits us) seems absent, and we can only love (or focus upon) God through the forms of Reality by which God's presence is implicit and mediated. The "right use of school studies" develops a lower kind of attention, which is "extremely effective in increasing the power of attention that will be available at the time of prayer, on the condition that [studies] are carried out with a view to this purpose and this purpose alone." Weil writes, "the key to a Christian concept of studies is the realization that prayer consists of attention." (Waiting for God, p. 105)

When I read Newport's Deep Work three years ago, I was very much struck by the sense in which deep work could be extremely effective training of the power of concentration and attention. (Deep work is "professional activities performed in a state of distraction-free concentration that push your cognitive capabilities to their limits. These efforts create new value, improve your skill, and are hard to replicate.") Deep work rightly undertaken can train a form of energy (a habitus) for prayer and focus upon the "meta": Reality, God, Emptiness, Submission, Torah, among other names in varied traditions. This is not really the "deep work" that Newport meant; he was writing at a different level for a different audience. But the resonance is unavoidable for those who have ears to hear.

As a librarian, my work is collaborative and can sometimes feel trapped in the shallows, not the depths. My writing and focused reading help me to return to the depths. Library service, at its best, is a soul-craft that provides a glimpse of the sacred trust of learners, teachers, and traditions as they undergo necessary and unavoidable change. That glimpse does not substitute mere metrics of productivity for depth. A librarian's deep work is not necessarily solitary, but it does require clear boundaries for time with colleagues and time alone.

This sounds high-flown, but it has direct, practical implications. During "crunch time," students do in fact limit digital social media. Project Information Literacy learned in 2011 (.pdf) that "students use a 'less is more' approach to manage and control all of the IT devices and information systems available to them while they are in the library during the final weeks of the term." Further study (.pdf) has shown that they use a "hybrid approach to conducting research and finding information." While this study needs to be updated --students in 2011 had a different experience of social media on smartphones than do students in 2019-- the PIL study suggests that at least the seeds are present for a successful critique and debunking of the casual "the more sharing, the better" slogan, beginning with high-stress times like the close of the semester.

Shifting focus away from the forms of despair that feast upon distraction, and towards purposeful existence, is always going to be a tough sell for many. As Annie Savoy says about Ebby Calvin "Nuke" LaLoosh in Bull Durham, "The world is made for people who aren't cursed with self-awareness." By contrast, I know three young individuals who voluntarily went off social media this past year because they wished to give serious attention to learning surgical nursing, biochemistry, and vocal performance (music). In the words of one, "enough of the self-created drama." If that insight is possible for a few, it might be possible for just enough at a tipping point.

To return: does social media in fact undermine religion?

If by "religion" one means what one sees in an ordinary church, synagogue, or mosque on an ordinary sabbath, then on a superficial level the answer is yes. The great mass of those who lead lives of quiet desperation may never become aware of their own fragmentation or despair --and social media is intended to fragment, to snuff out any intimations that something is not right (almost in the sense of Neo's choice to accept the blue "normal" pill in The Matrix, further reminiscent of Alice in Wonderland's choice of potions).

If by "religion" one goes further not only into self-reflection and self-awareness, but into active contemplation of the order of the universe, then the answer is probably no. In that perspective, social media shrinks to a mere mosquito-buzz level of irritation, or worse, a poisoned well. Despair is a general condition of the human race, neither created nor extended by mere social media. But social media can be its tool.

Social media as a channel of despair will admit no alternative. It becomes reality; reality's fragmentation is both its content and its platform. No wonder the Silicon Valley elite will not allow their own children near it. Only true disruption (not mere "innovative disruption") will reveal fragmentation and despair for what they are --and disruption, "moving fast and breaking things," is what social media was originally all about. In the polarity of despair, its infinitude of disruption has become a finitude of being shared: a world in every way Zucked-up.

Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.

After doubling down, Clay Christensen has tripled down. This is a familiar ploy: repeat something doubtful often enough, and it will seem to come true. In the words of one critic, Christensen is to business what Malcolm Gladwell is to sociology.

Christensen and Michael B. Horn, the former the apostle of disruptive innovation and the latter his St. Timothy, recently repeated their claim that 50% of colleges would fail in the next decade, or 10-15 years. Their goal line has been reasonably consistent: in 2011 it was "as many as half" (we're 8 years in); in 2013 it was 25 percent in the next 10 to 15 years (6 years in); in 2017, "as many as half" within a decade (2 years in). In 2019, Christensen and Horn write that "some college and university presidents . . . tell us in public and private settings that they think the 50 percent failure prediction is conservative -- that is, the number of failures will be far higher." Names? Places? And does executives' saying so make it true (is this presumption of their predictive competence warranted)?

But the reasons for this prediction keep changing, and there's the ploy: save the effect but change the cause. One is reminded of the late Sydney Brenner's Occam's Broom: "sweep under the carpet what you must to leave your hypotheses consistent." The reason for such precipitous closures, foreseen (in 2011) for the period 2021-2026, was disruption due to innovative technologies: the subtitle of The Innovative University is Changing the DNA of Higher Education from the Inside Out. The main idea is that "in the DNA" of American colleges and universities is a desire to be like Harvard: wealthy and comprehensive. By contrast, Brigham Young University/Idaho exhibits a completely different, innovative strand of DNA "in how it serves students by a combination of distance learning, on-site learning, and lower-cost alternatives to residential college" (quoting my review from November 2012).

Implicit in the metaphor of DNA is a certain determinism: you cannot change your own DNA, after all (without extremely powerful, still-developing technologies that occasion many moral questions). Your DNA determines, in this view, what you will be: such "Harvard" DNA will be a fatal flaw in many colleges and universities, according to Christensen and Eyring. Can you really change your DNA from the inside out? It's a clumsy metaphor: no wonder Christensen has abandoned it.

That 2011 book also quite ignored several inconvenient facts about BYU/Idaho. Mormons get a considerable price break there that "Gentiles" do not receive --pointing obviously to a hefty subvention from LDS sources. BYU/Idaho is, after all, a satellite of a powerful, wealthy, comprehensive mother ship in Provo, Utah: the satellite campus can hardly serve as a synecdoche in which what is true of the part is true of the whole. The economics of BYU/Idaho, and the considerable technological subsidy its online instruction receives from the mother system, are simply left out. Apparently Occam's Broom works well in Idaho and Utah.

Is Christensen's central claim even true --that the DNA of American colleges and universities propels them to desire to become Harvard? What about those institutions that could have become Harvard (or Michigan) and chose a different path? Did a liberal arts college that chose to remain a liberal arts college necessarily thereby fail its DNA? Ask too many questions, and the whole edifice collapses.

Christensen's account and predictions rely on a very superficial knowledge of the history of higher education. That lack of knowledge allows him to claim that nothing has really changed in American higher education in 150 years. How about women's education (one of Christensen's real blind spots)? How about public community colleges? A comparison: religious groups in the Abrahamic traditions often work by one person with some kind of authority meeting and talking to others who are supposed to be instructed. Isn't the real point what they say --words with profound differences beneath a reductive similarity of form? Closer to home, James McCosh (of Princeton) and Clayton Christensen (of Harvard Business School) both stand or stood in front of others and talked: is that really the sum of development between them? Again: Occam's Broom.

In their 2019 opinion piece, Christensen and Horn change the goal-posts: now the cause of institutional failure will be changing business models driven by implicitly disruptive technologies. Does anyone remember the educational TV boom in the 1960s? That also was a changing business model. Of course business models are changing, and have changed in the past: again, consult a deeper understanding of American higher education. All current debates about business models, missions, curricula, and the needs of students have a very long history; see, for example, Crisis on the Campus by Michael Harrington (ca. 1951), or the famous General Education in a Free Society (1945). Students have never been mere passive recipients (although the contemporary view of them as "consumers" drives them to passivity). From the colonial and denominational colleges to the land-grant colleges and public universities, the business models have evolved. The business model in higher education is changing as we speak.

In other words, Christensen and Horn present the same tired and superficial nostrums of ten years ago. Even the historical examples in his ur-text, The Innovator's Dilemma (1997), are questionable. Predicting the apocalypse is an old business.

I have argued elsewhere that the fearsome language occasioned by "disruptive technologies" has origins in the pre-Millennial Restorationist theologies of early 19th-century frontier America, especially the Burnt-Over District of upstate New York, showcased pre-eminently by Joseph Smith, Jr. There's a reason that his Latter-Day Saints are latter. Christensen was formed in a community that stands by Smith's proclamations. I do not pretend that he has smuggled theology into business, but rather that there is an elective affinity between such mainstream LDS thinking and business disruption: Joseph Smith, Jr. was supposed to put the rest of Christianity out of business (the "great apostasy" from the 1st century to 1830). How has that worked out for the Latter Day Saints? (Never mind the very current cosmetic name changes.)

I continue to wonder whether discussions of disruptive innovation in higher education are in fact a cloak for expediting other changes, less technological but no less disruptive, initiated by senior academic leadership. To be a disciple of "disruptive innovation" means you're a member of that club. Is this what's called groupthink? Has it served GE well?

So will somewhere around 50% of American colleges and universities fail in the next decade? This might be the case, but not for Christensen's and Horn's (and Eyring's) reasons. In their recent post in Inside Higher Ed, they cite situations in New England. I live in Connecticut: this is daily reality for me. Demographics are shifting: colleges and universities in the northeast quadrant of the continental US are going to have a hard time on that basis alone. Some have already closed, and more probably will --but not because of changing business models driven by disruptive technologies. Demographic change exerts a constant pressure not unlike climate change. The real question is, who can adapt, and how well? The population probably will not achieve replacement rate, even in the southwest.

American higher education may be entering a "perfect storm" of demographic change, economic turmoil, and moral and cultural drift or outright corruption (e.g. the recent admissions scandals). All of this is cause for deep concern; none of it depends upon the snake-oil of "innovative disruption" via technologies that power changing business models. For many institutions, strategies of differentiation based upon price point, purpose, and location will matter a great deal. Strategies based on pure number crunching accompanied by credulous faith in technologies will probably not work. Online education is here to stay, but it will not disrupt on-ground education as much as non-technological demographic trends will. This is not the stuff of disruption, but of long-term anticipated and unanticipated consequences of historical change, about which Harvard Business School professors have no more particular expertise than anyone else. When disruptive innovation gets dumbed down, it isn't disruptive anymore, but just change.

Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.

Three years later, Jones' book still resonates broadly with the continuing decline of WCA.

The End of White Christian America, by Robert P. Jones. New York: Simon and Schuster, 2016. 309 pages. [With a new afterword covering the 2016 election.] ISBN 9781501122323, available at Sacred Heart University Library.

Jones' book garnered considerable attention in the religious media in 2016. Reading it three years later, one cannot help but frequently ask "What about . . . ?" of some event since 2016, because the book's preparation and publication date precluded any coverage of the interminably long, bitter 2015-2016 election cycle. Since November 2016, so many "what abouts" arise, even over Jones' basic contention that a carefully-defined "White Christian America" (WCA) is dying or has died. The "new afterword covering the 2016 election" is identical with the article "T—— Can't Reverse the Decline of White Christian America," The Atlantic, July 4, 2017. (I em-dash the name to try to discourage trolls from spamming this post.)

Jones sticks consistently to a concept of white protestant Christian America, its churches, web of associations, and cultural agenda (abbreviated WCA). He is clear about when this infrastructure of influence extends to cover the Eastern Orthodox (whom he glibly labels "Greek Orthodox," although the assorted Greek, Russian, and other derivations and jurisdictions are a huge question in those communities). Jones maintains a boundary between Christians of white, European descent and African American Christians, because of the heritage of slavery and Jim Crow entwined with the churches, but he gives little attention to the growing Asian presence in the mainline WCA churches, or to the growth of other ethnically-based Christian churches (Afro-Caribbean, for example).

Jones efficiently traces the distinctions between the two historical descendants of pre-1920 WCA: mainline, ecumenically-oriented churches, and evangelical churches. I believe he fails to consider fully, however, just how porous those distinctions can be, or the significant differences between what has been called "soft-core" versus "hard-core" evangelicalism --the latter more doctrinally oriented and fundamentalist, the former more experiential and open to individuals who move in and out of a community (seen in the rise of name-brand groups such as The Vineyard). Particular communities have, in fact, changed position within WCA --with some difficulty, yet irreversibly. Why and how?

For example, Hope College in Michigan was a mainline college with a heritage in the Reformed Church in America (the less-enclosed of the Dutch Protestant denominations) through the 1970s. It liked to remember its association with Robert Schuller '47 (his son was a classmate there, '76), and the Science Center (!) was named after Norman Vincent Peale. But the College went more evangelical in the 1980s-1990s under the leadership of Gordon van Wylen; by the 1990s it had moved clearly and definitely in an evangelical direction under Chaplain Ben Patterson and a milquetoast senior college leadership. (See James Kennedy's Can Hope Endure?, 2005.) To this day it is contrary to college policy "by statement, practice, or intimation . . . to promote a vision of human sexuality that is contrary to this [fundamentalist] understanding of biblical teaching." Jones identifies opposition to gay and lesbian rights as one of the binding commitments that evangelical Christians must maintain without compromise in order to show their bona fides --and the problems this will bring with a younger generation of Americans. Numerous alumni/ae have repudiated this position, or simply dropped any sense of allegiance or support, while support has increased from evangelicals. The cost to the College of changing official policy, given its choices, would probably be prohibitive.

By contrast, Princeton Theological Seminary moved in the opposite direction. Beginning from an inchoate, majority position in the 1970s, the Seminary fully mirrored the protracted and heated conversations and conflicts that coursed through the Presbyterian General Assembly, especially after the union of the (northern) UPCUSA and the (southern) PCUS in 1983. Under the long presidency of Thomas Gillespie (1983-2004) the Seminary maintained the (PCUSA) Presbyterian Church's official line of "acceptance of members but without ordination," despite the evident hypocrisy of this decision within its own community. Any discussion or criticism of these policies was resisted and tamped down (as I discovered 1992-1996 during the course work and examinations for my Ph.D.). These hypocrisies culminated with the death of the community's beloved and celebrated musician David Weadon in 1996, from HIV-related causes; he died afraid to reveal his illness for fear of losing his job and health coverage. Thomas Gillespie did express remorse and a change of heart --too late for David, of course. In 2011 the PCUSA finally voted to allow the ordination of gay and lesbian persons. The Seminary now hosts a chapter of BGLASS (Bisexual, Gay, Lesbian, and Straight Supporters) and formally hosts discussions of gay and lesbian issues in an affirming manner in its Center for Theology, Women, and Gender. From a position squarely in the evangelical opposition to gay and lesbian rights, the Seminary has moved to the center in tandem with its historic denominational alliance.

Jones' book suggests that such institutional shifts are rare, but I believe they are more common than he realizes, because they reflect the changing concerns of individual white protestant Christians as well. Evangelical churches have historically seen their share of former or lapsed members of mainline churches who have undergone some kind of conversion or new-life experience and joined an evangelical congregation. The traffic certainly moves in the other direction as well: numerous evangelical Christians have moved out of evangelical churches in response to changing understandings of scripture, history, cultural and religious political connections, and geographies. Very little work has been done on cross-overs. In 1993 Benton Johnson, Dean Hoge, and Donald Luidens tried to examine the famous claim by Dean Kelley (1972) that conservative churches were growing (and liberal churches declining) because the more liberalizing denominations were weak: low in commitment, with moral and theological commitments too fuzzy to mobilize members' energies. Their 1989 study of 500 Presbyterian baby-boomer confirmands (in other words, people confirmed around age 14 between roughly 1960 and 1980) found that they did not, in general, leave the Presbyterian church because they sought doctrinal and moral orthodoxy in conservative churches. Some remained in mainline churches, and many left, but not for Dean Kelley's supposed reasons.

Jones' book examines WCA responses to three general topics: political involvement, gay and lesbian rights, and racial tensions and histories. Since its preparation (late 2015) and publication in 2016, much has happened that actually confirms his historical narratives of change, decline, and acceptance. As he wrote in The Atlantic in July 2017, the emphases and apparent desires of the present Presidential administration will not reverse WCA decline --indeed, the present leadership may be the death-rattle of WCA rather than its rejuvenation. One of the benefits of reading Jones' book now (2019) is that it contextualizes numerous deeply divisive conflicts of the past 26 months in what went before: the world did not begin anew in November 2016. The politics of nostalgia and fear have proven very powerful, and the traditional evangelical narrative of persecution has found a weird new life in the face of public anti-Semitism. Ironically, some of the mainline churches have found a new voice for social inclusion in an era marked by rising hate speech, acts of violence, anti-Semitism, and anti-immigrant rhetoric and actions.

John Fea's book Believe Me: The Evangelical Road to D—T— develops the narrative of persecution, nostalgia, and fear, which will wind up at the dead end that looms for the "court evangelicals" and possibly for evangelicalism as a whole. The signs of coming trouble and a profound day of reckoning are unmistakable. Young evangelicals problematize or flee the label: the Princeton Evangelical Fellowship, continuing from 1931, changed its name to Princeton Christian Fellowship because of the narrow and overly partisan meanings that have gathered around the term "evangelical." (The Princeton Fellowship pre-dated the use of the term "evangelical" by the National Association of Evangelicals by a decade.) David Gushee, formerly an evangelical (and still very much a professor at Mercer University), has written Still Christian: Following Jesus Out of American Evangelicalism (2017) in a similar spirit.

Three years later, Jones' book still resonates broadly with the continuing decline of WCA. The mainline denominations still struggle with the realities of decline, some rather badly. Many do nothing meaningful whatsoever about preparing or adapting present and future clergy to the reality of ministry as a part-time job. The evangelical wing of WCA has in large part completely mortgaged its moral and cultural standing to the current incumbent of the White House. One shudders to think of the court evangelicals' fall when that person is impeached, not re-elected, or even retires after eight years. Currently 72 years old, he won't, in any case, be around forever. What then? Who will write this story then, and what will the next social form of evangelicalism look like?

Three years later, Jones' book could also benefit from some examination of the end of white, Catholic America. The changing composition of Catholicism can mask for a while the dramatic departure of white Catholics, many deeply angry and hurt by the continuing pedophile scandals and the insider infighting over positions associated with the current Pontiff. White Catholic America (the other WCA?) replicated white, protestant, Christian America's web of institutions in the earlier 20th century to a remarkable degree. Those institutions are also coming apart, for somewhat different but related reasons. In some states, the cultural power of the Catholic Church, once palpable, has dissipated almost completely. This would be a fascinating companion study, one that could probe much deeper into the realities described by David Masci and Gregory A. Smith in October 2018. Personally speaking, I sometimes feel that I work on a campus filled with pissed-off Catholics. They are not happy, and for that Church the day of reckoning approaches as well.

Jones' final chapter, a "eulogy," organizes a lot of material on the frame of Kübler-Ross' theory of the stages of grief, from denial and anger to acceptance. The scheme is dubious insofar as many pastoral and health practitioners have abandoned it as unhelpful and overly schematic, but it serves Jones acceptably as a prism with which to examine topics as diverse as the Institute for Religion and Democracy (IRD), essentially a destructive and disruptive distraction; Russell Moore's various activities in the Southern Baptist Convention; and the panentheistic wanderings of some mainline denominational leaders (such as John Dorhauer of the United Church of Christ). As a sociologist (and not particularly a theologian), Jones tends to locate the initiative in the churches with people, and unintentionally commits the theological mistake that doomed WCA: the assumption that all this is entirely under human control. If the past decade has shown anything, it is how little is under effective human control. Many preachers like to quote Alfred, Lord Tennyson's famous line in Morte d'Arthur, "The old order changeth, yielding place to new" --but few go on to the next lines:

And God fulfils Himself in many ways, 
Lest one good custom should corrupt the world.

Did whatever was good in the WCA wind up corrupting the world, or vice-versa (and what does "world" mean here, anyway)?  Jones is a sociologist of religion and sticks to his domain.  Inside (or emerging from) the dying or dead WCA, churches might see their situation differently. How now will God fulfil God's intentions for the church and for all creation?  Karl Barth's famous rebuke of religion in the church resonates broadly.

To West Michigan's Dutch American culture, I am an outsider with one foot inside that small tent. I don't live there anymore.

To West Michigan's Dutch American culture, I am an outsider with one foot inside that small tent. As a child I was always aware that to my mother there was a qualitative difference between "here" (meaning Saginaw, Michigan) and "there" (meaning Grand Rapids). She spent most of her growing years in a house on Calvin Street in eastern Grand Rapids, just down the street from the then site of Calvin College. Her Dutch American relatives, six aunts and numerous others, lived around the area; in the summers we drove to Newaygo to attend a summer church camp run by her home church, Westminster Presbyterian in Grand Rapids (where her ashes are now interred).

This Dutch American background (such as it was) became more vivid to me in the two years I spent at Hope College, 1974-1976. I earned my degree there after three years at Michigan State (including one year in a music program that gave me practically no transferable credits). I came to Hope as an outsider with some sense of how things worked in that community, but I had been formed by highly negative experiences in a mediocre public high school, and then three years (1971-1974) at completely secular Michigan State. There I studied in an "alternative" residential liberal arts college, Justin Morrill College, which was closed in 1979. I liked Hope's far greater structure, but I never took it as the definition of a liberal arts college. I knew there were other options, some of them very good.

I was completely unprepared for what Hope meant by "Christian college" at that time, since I was really interested in Classics (Greek, Latin, history, philosophy) and German, and kept a low profile in almost everything else except organ performance. My academic experience there was intense (I was the only Classics major and had an Oxford-like experience of demanding, fast-paced tutorials), and it formed in me the habits that Princeton would nurture to maturity. It prepared me for the intensity of graduate studies in a major program in the history of Christianity (Princeton Theological Seminary), without which I would have been lost. That Calvin wrote in Latin (complex, literate, humanistic Latin) was no news to me. (My dissertation work was on the Carolingians, thankfully far removed from the obsessions of the Reformed.) The Christian emphasis, however, was at first a puzzling add-on, even as I was nurturing a desire to study the history of Christianity very deeply.

Hope's cultural pendulum at that point swung "liberal" (thanks to the then-recently-departed President Calvin Vander Werf), so I largely ignored the cultural evangelicalism of many of the students around me. It was a comfortable, even snug world, but it was never really my world; I would not have stayed there had that been possible. My academic work was done. I freely admit that I was in Hope, but not really of it. I joined the German Club; otherwise I was what at Princeton they called "a grind."

After Hope I spent a year as a Fulbright Commission English teaching assistant in Vienna (arranged by a powerful Hope professor with many ties there, Paul Fried), and then took up M.Div. studies at Princeton Theological Seminary. Princeton in turn left me with an enduring respect for serious, top-flight scholarship and tough writers such as Kierkegaard, Barth, and Bonhoeffer, and a far more global sense that both the "Reformed" and "Anglican" worlds were much broader and more diverse than my experience in Holland, Michigan had suggested. Eventually I became a librarian (another story), and ten years later returned to Princeton for doctoral study. The encounters I had with evangelical doctoral students in the Ph.D. seminars were frustrating (with one exception), because their preparation was often so superficial, even glib. (Even that one exception had previously left the Assemblies of God for the Lutherans, ELCA.) While at Hope I had become involved in an Episcopal Church, and that involvement left me with enduring liturgical preferences that eventually made my sojourn in the Presbyterian Church untenable.

Meanwhile, I had grown up in the cultural orbit (a neighboring township) of Frankenmuth, Michigan, which offers a curious contrasting parallel to the Dutch Americans of West Michigan. Frankenmuth ("courage of the Franks") was settled (1845) by immigrants from Rosstal, Franconia (Bavaria) sent by Pastor Johann Konrad Wilhelm Löhe, just two years before A.J. van Raalte led his group to the shores of Lake Macatawa in West Michigan. The settlers of Frankenmuth in time wound up in the very conservative Missouri Synod and worshiped in German well into the 20th century, later than most of the equally conservative Christian Reformed churches worshiped in Dutch. The doctrinal rigidity of the two groups is formally similar, and each has regarded itself as "the true church," to the obvious exclusion of the other tradition (never mind everyone else). Growing up I was a heathen Congregationalist, so I was (or would have been) beyond the pale of respectability in both groups: an outsider, with one foot inside the tent.

These German-American Lutheran families were emotionally convinced, I am sure, that Jesus spoke Martin Luther's German via God's true Bible, just as many Dutch in West Michigan would have assumed that Calvin spoke and wrote in Dutch, and correctly conveyed Jesus' teachings in Dutch (of course! --he could not have been French!). To this day, when I am confronted by passionate attachment to the 1611 Authorized English Bible or the 1662 (or 1928) Book of Common Prayer, I can only smile: I have been here before, with others, in other languages. Linguistic fundamentalists are everywhere, I suppose. Both groups used theology and language as shields against encroaching "American" ideas in rising generations --a losing fight, to be sure.

Douma's book (How Dutch Americans Stayed Dutch) delineates the manner in which Dutch Americans created and marketed new traditions through the development of Tulip Time in Holland, Michigan. Tulip Time marked not how much they remembered about the Netherlands, but how much they had forgotten. Douma specifically refers to Eric Hobsbawm's "invention of tradition": cultural practices or traditions may not be genuinely historic but are adapted or invented to serve ideological ends. Werner Sollors (The Invention of Ethnicity) extended this idea to ethnic traditions, and Douma in turn extends it to Tulip Time in particular. Tulip Time established a new channel of Dutch American ethnic identity, a modern re-interpretation of an actual 19th-century Dutch identity which by the 1930s was passing or had passed away. By 1975 (my only direct experience of Tulip Time), it had become an unintentional but devastating caricature of Dutch Americans themselves, quite apart from anything really related to the Netherlands. It was very precious. It re-interpreted ethnicity in service to an ideology of the market.

I watched (though unawares as a child) this same invention unfold in Frankenmuth, mutatis mutandis. In 1959 William "Tiny" Zehnder, Jr. and Dorothy Zehnder organized a Bavarian Folk Festival to inaugurate major additions and renovations to the old Fischer's Hotel on Main Street. (I remember it!) The Bavarian Inn sat opposite Zehnder's restaurant (another repurposed former hotel), which had been operated since 1927 by William Zehnder, Sr., and then by Tiny's brothers. The original festival (1959) was a success, and the community organized a Civic Events Council to oversee its annual continuation. From its beginning, the Bavarian Festival was an invented tradition, one marked by usually polite sibling and community rivalries. For many years the Festival was a major "all hands" event in a small town, and a major source of social and financial capital. As residents' ability to volunteer decreased --homemakers returned to the work force, and employers did not allow so much time off-- the Festival gently downsized: it is now four days rather than a week, and under the control of commercial entities more than of volunteer community organizations.

Other than commerce, why did the Festival endure? Its continuation was possible because of the unusual, cohesive character of the town, where civil, business, church, and school authorities all knew each other their whole lives. It expressed a positive way forward for a German American identity in a town that still felt it. German Americans, unlike Dutch Americans, had to negotiate the realities of being related to the enemy in two world wars --an enemy who committed the Holocaust and, in defeat, endured a bitterly divided homeland (1949-1989). German Americans sought to be a model all-American minority because earlier generations (especially 1914-1918) had been none too sure about them. In the 19th century, German American celebrations originated in the overlapping circles of workplace, Arbeiterverein (workers' clubs), churches, and civic organizations. That network largely passed by the turn of the 20th century (the Arbeiterverein were sometimes suspected of socialism!). With well-known German American celebrations in Wisconsin and Chicago as both example and warning, Frankenmuth's Bavarian Festival --entirely unrelated to any of those earlier celebrations-- allowed ethnic reclamation by using the word Bavarian rather than German. In the 1960s and 1970s hardly a (West) German flag was to be found: all the flags were the lozenge-patterned blue and white Bavarian flag. I worked as a waiter in the Bavarian Inn in the summers of 1972-1974 and 1976, putting on the slight lilt of Frankenmuth English to complement the hokey costume.

"Historic Frankenmuth" is made-up history at its finest, an imagined narrative in service to an ideology of the market. The town looks like a theme park mashed up with a wedding venue and a fudge shop. Holland, by contrast, is a larger small city with more to do than just tourism; the Dutch kitsch is comparatively restricted to Windmill Island and a few other locations. They are, each in their way, sui generis appropriations of fading ethnic consciousness.

When I lived in Europe, I immediately sensed the profound difference between the invented traditions of Tulip Time and the Bavarian Festival and the national experiences and characters of the Netherlands, Bavaria, and Austria. The gap left me scornful of those invented American ethnicities for a long time. To be sure, each community remembered the largely rural, pre-industrial 19th-century Netherlands or Franconia, with a great deal left out that was present even then. For example, each neglected to mention that in both the Netherlands and much of Bavaria a significant portion of the population was Catholic! (Franconia was historically mixed.) Subsequent to the emigrants' departures, industrialization and the experiences of the wars and the then-very-present Cold War assured a general atmosphere of willful social amnesia and fear of the past that contrasted very oddly with the happy-go-lucky invented pasts in Frankenmuth or Holland. I suspect that imagined history has returned over there as well, in the form of ultra-right or neo-Nazi movements.

Since I had very little background in evangelicalism, scholarly examination of the Bible was nothing new to me: the textual methods were very similar to those employed by Classicists on "difficult" texts. Early on at Princeton (1977) I just could not fathom the passionate objections to documentary hypotheses about the Hebrew Bible ("Old Testament") and the Gospels. I had little appreciation for the anxiety of many classmates, their habits of proof-texting, or the assumption that Jesus' place and time was just like ours. Hence I had little idea how passionately many would cling to their belief that God could bless only procreating, married heterosexuals. It turned out, over a decade or so, that many alumni/ae of Hope were gay or lesbian --so many that I once asked one, "Was I really so socially out of touch that I was completely oblivious to your identity?" He responded, "How could I have expected you to know something that even I did not know or acknowledge about myself at that time?"

During the 1980s and 1990s the horrific experiences of illness and death among numerous gay friends, and among those who survived, meant that I grew away from the homophobic culture in which I was raised, ahead of the curve. I was also living in the East, in much more cosmopolitan, pluralistic environments. I grew impatient with the endless Presbyterian fights over the ordination of gay and lesbian ministers. I was so done with that. When a person I knew in seminary and truly respected was essentially run out of his parish in California (by vengeful elders of a neighboring Presbytery, not by his own congregation), I called B.S. --I had had enough. In 1992 I joined the Church of St. Luke in the Fields in New York City (Episcopal) and embraced my identity as a high-church Episcopalian, but one who likes good preaching, competent theological reflection, and tenacious, progressive social outreach. My "elective affinity" ethnicity had long since become Scottish (in large part because of my name), and my Dutch heritage became less important. My understanding of Calvin was completely revised by reading William Bouwsma's John Calvin: A Sixteenth-Century Portrait (1989) during my Ph.D. residency. Bouwsma restored Calvin to a context of other 16th-century writers and humanists such as Erasmus and Montaigne. I found that my previous understanding of Calvin had been just as invented as Tulip Time. When I visited Hope once for an alumni/ae event, I realized that I had grown away from what I never really embraced anyway.

In the same years, Hope's pendulum swung in an extremely conservative direction during the campus pastorate of a certain Ben Patterson (1993-2000), an evangelical hired by Gordon van Wylen and tolerated by John Jacobson (presidents). Patterson instituted or encouraged practices --such as public confession, confrontations with faculty members, and praying outside the residential rooms of gay students for their conversion and correction-- which I regarded as beyond the pale, divisive, and unfaithful. James Kennedy's Can Hope Endure? A Historical Case Study in Christian Higher Education (2005) confirmed my worst fears. Patterson's departure in 2000 did not usher in much change, however. In 2005 the highly respected Miguel de la Torre (since then at Iliff School of Theology, Denver) was forced out of the faculty (how many Hispanic faculty members did they have then, or since?). De la Torre's offense: he wrote a newspaper column satirically condemning James "Focus on the Family" Dobson's "outing" of the animated character SpongeBob SquarePants as gay. (I'm not making this up! Can sponges be gay? Who knew?) Plainly the College could not tolerate any challenge to televangelists and their ilk, lest its stream of money from evangelical supporters dry up. (I'm looking at you, DeVos and Van Andel families!) Apparently if Dobson said it, then College President James Bultman believed it, and that settled it. (Always beware of making the former baseball coach your College president!)

Nothing changed. In 2009 Dustin Lance Black was insultingly treated by the same College president and by the (still) Dean of Students Richard Frost --treatment that warranted national press attention. Opponents of this rude nonsense organized a group, Hope Is Ready, but unfortunately the College was not. The College's current policy (2011) is riddled with inconsistencies and hypocrisy: "Hope College will not recognize or support campus groups whose aim by statement, practice, or intimation is to promote a vision of human sexuality that is contrary to this understanding of biblical teaching." Further down: "Hope College promotes the indispensable value of intellectual freedom . . . . Hope College affirms the dignity of every person." Obviously this is untrue, a bald-faced lie: you can talk about "it" (non-heteronormative sexualities) but do nothing more than talk. Your talk had better not "promote a vision." (What does that even mean?) It is as though the College says: We talk the talk of intellectual freedom and personal dignity, but we will not walk the walk. If you talk about this particular subject that is "contrary to . . . biblical teaching," we will shut you up. Apparently being gay is, according to Hope College, contagious. This kind of policy relegates the College to the evangelical reservation: only those who agree need apply, and are wanted; the rest are second-rate. It affirms superficiality and mediocrity as a consequence of narrow-minded, misguided Christian faith. It is unfortunately consistent with Richard Frost referring to "you people" in a semi-clandestine conversation with Dustin Lance Black.

In 2013 James and Deborah Fallows visited Holland as part of their journey through America that they called "American Futures," which resulted in their book Our Towns: A 100,000-Mile Journey Into the Heart of America (2018). Holland was one of the first towns they visited, and they saw much to like: a vibrant, highly functional community with both financial and social capital and a sense of the future quite at odds with our paralyzed and dysfunctional national discourse. They wrote about the many positive aspects of Holland, but about its negative aspects, too. In his final post about Holland, James included a number of "I won't live there" messages, the first of which came from me:

I'm a graduate of Hope College, magna cum laude in [XX subject in the late 1970s]. I know the area well. I have some Dutch ancestry. My sister is [an official] about 30 miles north. I know Holland and western Michigan and Dutch-American culture from the inside.

I grant all the excellent qualities you have written about --hard work, ingenuity, social cohesion, and a sense of an America very different from DC or NYC.

I won't live in Holland, and as my own children [three, ages 15-19] have looked at colleges (or will), I have never suggested my alma mater. My reason: the social narrowness of smug Dutch-American culture. Although there is a very significant Latino population in Holland, it has not successfully challenged Dutch-American Christian Reformed hegemony. That hegemony will allow no compromises.

You alluded to this smugness when you mentioned the failure of the gay rights initiative(s) there. I wouldn't want to raise my children in this atmosphere, and I don't want my children going to college in it. The hateful things that were said during that discussion give evidence of the smugness of that culture.

I live in Connecticut now (outside New Haven), and there's a lot wrong with CT. But we experience far more cultural, religious, and racial diversity here. It's not perfect, but we're working on it.

Holland has many fine qualities. But it's suffocating for many people, including me. Do mention the numerous people from Holland, and Western Michigan, who have fled the cultural suffocation.

Later in the same post James Fallows summed up Hope College pretty accurately (and with more than a touch of snark):

Hope College, once considered a "Harvard of the Midwest," now aspires to be a middlebrow Christian college. Babbitt lives! A pharisaical pedagogy prevails ("Thank God, we are not as others!").

James Bultman, Richard Frost, and Hope College trustees: I'm looking at you.  In 2017 Bultman's successor, John C. Knapp, resigned a year after nearly being forced out, by most accounts because he wanted to move the College to a more mainstream, inclusive position, again warranting negative national attention.

In 2016, the 40th anniversary of my graduation from Hope College came without my even remembering it. I received an unsolicited note subsequently from a Hope development officer, and I responded:

The dust-up about Lance Black was truly the end of me and Hope College, then. Living in CT, a state where marriage equality became law and was sustained by popular referendum in 2008, the whole "gay" controversy is just so over; marriage equality is an established fact on the ground here, and was in 2009. Amazing to say, the sky has not fallen in; western civilization did not come to an end here (at least no more than it has anywhere else in the age of Trump); and I don't notice that personal morality has improved (or declined) since 2008. But candor has improved, and that can't be a bad thing. Good friends who have been partners for decades --longer than many so-called "straight" couples-- have become legally equal to my own marriage relationship, and I can't see what's wrong here.

Perhaps this is an overly confessional letter, because you wrote to me at the end of Lent, a good time to attempt greater self-awareness. I just don't think about Hope College or my past relationship with it very much; it doesn't feel relevant to much in my day. Our three children have each found their way through the college application process, and I never considered recommending Hope College to them --I just think they would find it too "other." My younger son is a finalist for a merit scholarship at DePauw University School of Music (vocal performance), but living in Greencastle may be a stretch for him. [In May 2018 he finished his sophomore year there, and is committed to staying to complete his degree.] He calls it the middle of nowhere, but I've let him know that nowhere is somewhere other than central Indiana --I've seen the middle of nowhere, and it's called Houghton, Michigan. DePauw's "look and feel" is much more emotionally and religiously accessible than is Hope's, and since he has both profound faith questions as well as long-time gay friends (though he is straight), I just didn't see him at Hope.

Since I’m not wealthy —I’m director of an academic library— and not the profile of the usual Hope alumnus, I really don’t think I have very much to offer your College. I do wish Hope College well. My own acquaintance with the Reformed tradition at Princeton Seminary led me to understand it as very open to the world, to the new findings of the humanities and sciences, and not afraid of the truth. I suspect that colleges of any theological stripe which regard themselves as the Fortresses of Faith will have a very tough go of it in the coming decades. If Hope College were a good deal more open, and more willing to defy previously-articulated evangelical orthodoxies, it could really have something very positive to offer American higher education. Lord knows that higher education (and especially private higher education) as a sector is in deep trouble.

That note, and this blog post, say what I have to say.

Michael Douma's book was really helpful to me. I can now see, against the course of my own family background, how genuine Dutch identity in the Netherlands changed from the 19th century to the modern, very liberal state. I can see how Dutch Americans evolved their own historical tradition, one that is almost a caricature of the Dutch and really has nothing to do with them --just as Frankenmuth's Bavarian identity has almost nothing to do with contemporary Bavaria and Franconia. That Hope College chose to double down on previous mistakes and became a defensive denizen of the shrinking evangelical academic reservation is a consequence of the "invented narrative" of Dutch American culture, shop-worn and sad. The accelerating withdrawal of younger "New Millennials" from organized religion of every stripe bodes ill for a College that values a defensive orthodoxy over liberating pedagogies.

It's almost July, and I remember how amazingly beautiful West Michigan can be this time of year, especially near the Lake. Shelly and I will visit my sister in Muskegon, and our younger son at Blue Lake Fine Arts Camp. Grand Rapids has changed profoundly: the city adopted LGBTQ anti-discrimination ordinances in 1994, East Grand Rapids in 2015; Holland has yet to do so. The Grand Rapids arts community thrives, as do numerous ethnic communities. There is much to like, and much more ahead than behind.

I regret that Hope College chose the path that it has (Babbitt lives!). I have little to do with it, and little I can do about it. My own life has gone on elsewhere, and for that Deo gratias!

Mouse Books give easy access to classic texts in a new format --especially essays or stories that often are not commercially viable on their own. The Mouse Books project wants to offer readers more ideas, insights, and connections for their lives.

The digital era was supposed to make books and lengthy reading obsolete: Larry Sanger (co-founder of Wikipedia, originator of citizendium.org and WatchKnowLearn.org) memorably critiqued faulty assumptions in 2010 in Individual Knowledge in the Internet Age (here as .pdf; see also my posts here and here). "Boring old books" played a part. Clay Shirky of NYU wrote that "the literary world is now losing its normative hold" on our culture: "no one reads War and Peace. It's too long, and not so interesting. . . . This observation is no less sacrilegious for being true." Ah, the satisfying thunk of a smashed idol. Goodbye, long, boring, not so interesting books.

Except that a funny thing has happened on the way to the book burning (danke schön, Herr Goebbels). Printed books have somehow held on: unit sales of print books were up 1.9% in 2016, at 687.2 million world-wide, the fourth straight year of print growth. Rumors of demise now seem premature. What gives?

The print book is far more subtly crafted than many digital soothsayers realize. Printed books have evolved continuously since Gutenberg: just look at scholarly monographs from 1930, 1950, 1970, 1990, and 2010. The current printed book, whether popular, trade, high-concept, or scholarly monograph, is a highly-designed and highly-evolved object. Publishers are very alert to readers' desires and to what seems to work best. It was hubris to think that a lazily conceived and hastily devised digital book format could simply replace the printed book with an equally useful object: look at the evolution of the epub format (for example).

Designers will always refer to what has been designed previously, as well as to new and present needs and uses, when designing an object: consider the humble door. Poorly done e-books were a product of the "move fast and break things" culture, which doomed many ideas that demanded thinking deeper than the one-sided imaginations of bro-grammer digital denizens could supply.

Enter Mouse Books. Some months ago David Dewane was riding the bus in Chicago. "[I] happened to be reading a physical book that was a piece of classic literature. I wondered what all the other people on the bus were reading." He wondered, why don't those people read those authors on their smart phones? "I wondered if you made the book small enough—like a passport or a smart notebook—if you could carry it around with you anywhere."

David and close friends began to experiment, and eventually designed printed books the size and thickness of a mobile phone. They chose classic works available in the public domain, either complete essays (Thoreau's On the Duty of Civil Disobedience) or chapters (Chapters 4 and 5 of The Brothers Karamazov, "The Grand Inquisitor," in Constance Garnett's translation). These are simply and legibly printed in 11-point Bookman Old Style. Each book or booklet is staple bound ("double stitched") with a sturdy paper cover, 40-50 pages, 3 1/2 by 5 1/2 inches or just about 9 by 14 cm --a very high quality, small product.

David and the Mouse Team (Disney copyright forbids calling them Mouseketeers) aim at ordinary users of mobile phones. They want to provide a serious text that can be worn each day "on your body" in a pocket, purse, or bag, and that offers a choice between pulling out the phone or pulling out something more intellectually and emotionally stimulating. Mouse Books give easy access to classic texts in a new format --especially essays or stories that often are not commercially viable on their own (such as Melville's Bartleby the Scrivener, or Thoreau's essay, which are invariably packaged with other texts in a binding that will bring sufficient volume and profit to market). The Mouse Books project wants to offer readers more ideas, insights, and connections for their lives.

As a business, Mouse Books is still experimental, and has sought "early adopters": willing co-experimentalists and subjects. This means experimenting with the practice of reading, with classic texts of proven high quality, and with complementing the texts with audio content, podcasts, and a social media presence. These supplements are also intended to be mobile --handy nearly anywhere you could wear ear buds.

As a start-up or experiment, Mouse Books has stumbled from time to time in making clear what a subscriber would get for funding the project on Kickstarter, what the subscription levels are, and the differences between US and outside-the-US subscriptions. The subscription levels on the Mouse Books drip (or d.rip) site do not match the subscription option offered directly on the Mouse Books Club web site. For a small "virtual company," this kind of confusion goes with the territory --part of what "early adopters" come to expect. That said, Mouse Books is also approaching sufficient scale that marketing clarity will be important for the project to prosper.

This is a charming start-up that deserves support, and it is highly consonant with the mission of librarians: to connect with others both living and dead, to build insight, to generate ideas. The printed book and those associated with it --bookstores, libraries, editors, writers, readers, thinkers-- are stronger with innovative experiments such as Mouse Books. The printed book continues to evolve, and remains a surprisingly resilient, re-emergent legacy technology.

More about Mouse Books:

Web site: https://mousebookclub.com/collections/mouse-books-catalog

drip site (blog entries): https://d.rip/mouse-books?


The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media.


Typewriters, mechanical watches, vinyl recordings, newspapers, printed books --obsolete technologies, right? Get with the program: countless incumbent industries and professions have been rendered pointless; disrupt or be disrupted --right? This has been the dominant cultural narrative --right?

I first heard about the obsolescence of librarians 35 years ago, at the start of my career. Columbia University soon afterward accepted the dominant cultural narratives and closed its graduate library school, college of pharmacy, and departments of geography and linguistics. Pharmaceuticals? Digital and print librarians? Linguistics and languages? Geographic information systems? --all obsolete (whoops!). Since those who proclaim a profession's demise have usually been selling some replacement, cynicism follows fast. Another prediction of demise, another day.

Entirely outside of libraries, a counter-narrative has grown. David Sax popularized one version in The Revenge of Analog: Real Things and Why They Matter (PublicAffairs, 2016): we interface with the world in tactile, communal ways. At Harvard Business School, Prof. Ryan Raffaelli studies organizational behavior using field research, and he contributes much more sophisticated thinking about re-emergent technologies. He has found that "incumbent" technologies and industries can make a comeback. This story has important implications for libraries.

Some technologies re-emerge from disruption and destruction, especially those with a long history. Count out VHS tapes and punch cards: those were transitional. Typewriters have had a long enough history, as have fountain or nib pens (which have extended the dip, quill-type pen since 1827).

Printed books, like other technologies, brought whole occupations and kinds of work with them: not just printers, but also binders, sellers, retailers, and of course librarians. When the printed book became a candidate for "disruptive innovation" by digital books, its demise, so loudly proclaimed ten years ago, mandated the demise of book stores, libraries, librarians, publishers, and editors. Now anyone can write a book (see Amazon); who needs editors? Who needs libraries or bookstores?

Some disruptions are truly innovative; others are just disruptions, and others just hype shouted as real (see previous post). The disruption narrative is not itself incorrect (although it can be applied poorly), but its corollary --that incumbent industries are necessarily unable to adapt and certain to die-- is less well-founded. Raffaelli's research shows that technologies can re-emerge through a cognitive process in two phases: the first a largely cultural, temporal, and narrative process; the second a competitive process in a re-defined market with distinctive values not strictly established by price. His leading example is the Swiss mechanical watch-making industry; his second is the return and rise of independent book sellers in the USA.

Both the watch-makers and the book sellers lost substantial market share when disruptive, good-enough technologies moved upmarket and claimed their most profitable customers: watchmakers with the rise of cheaper, more accurate quartz watches in the 1970s; book sellers with the rise of major chain bookstores in the 1990s, followed by Amazon. They keenly felt their losses: numerous Swiss firms closed or discontinued manufacturing, and from 1995 to 2009 around 1,400 bookstores closed. Enough hung on, however, to rebound: how did they do it?

Raffaelli identifies the terms of competition: old terms such as price, availability, and quality change with the entry of disruptive technologies into a market. The survivors re-defined the competition: how they want to compete, and what value proposition they offer their customers. He traces a complex process of de-coupling product from organizational identity and renegotiating foundational concepts and business roles. The process is both bottom-up (from the "factory floor" of fundamental, front-line production or service) and top-down (from industry alliances, design thinking, and organizational management).

In the Swiss mechanical watch industry, he has identified entrepreneurs and guardians. Entrepreneurs are alert to market signals, cultural currents, and emerging narratives that suggest that new communities are forming new values. Guardians, by contrast, preserve older technologies and enduring values and counterbalance the entrepreneurs; both are necessary for the process of cognitive re-emergence. When the industry drew near to complete collapse, collectors began to purchase mechanical watches at high prices at auctions, signaling that their small community found genuine value, expressed monetarily in price. Entrepreneurs realized that the market for mechanical watches had not completely disappeared, but changed: the value lay not in keeping time for a price, but in expressing a cultural signal. Guardians, meanwhile, had preserved enough of the technology that recovery was possible; veteran employees preserved crucial tools and skills that enabled a recovery. Each needed the other; the leadership necessary for re-emergence arose not just from the top level of the organization and industry, but from the commitment and wisdom of key skilled workers. Mechanical watches were then marketed as high-end, luxury items that "said something" about their owners. As new customers entered or moved up-market, they adopted such watches as a sign of cultural status and belonging.

Independent booksellers successfully re-framed their market as primarily community and only secondarily inventory. First the chain stores (Borders, Barnes & Noble) out-competed them on price, then Amazon on price and inventory availability. Independent booksellers have focused instead on 3 Cs: Community and local connections, Curation of inventory that enhances a personal relationship with customers, and Convening events for those with similar interests: readings, lectures, author signings, and other group events. The booksellers' trade association (the American Booksellers Association, or ABA) facilitates booksellers' connections with local communities through skills, best practices, effective use of media, and outreach to other local businesses and organizations (even libraries, once considered the booksellers' competitors). The re-emergent market was defined by entrepreneurial booksellers, front-line service guardians, a growing social movement committed to localism, and industry-scale cooperation. Between 2009 and 2017 the ABA reported 35% growth in independent booksellers, from 1,651 to 2,321 nation-wide; and in a sign of the booksellers' integration with community spaces, 2017 sales were up 2.6% over 2016.

Like independent bookstores, the "library brand" remains strongly bound to printed books --after all, the name derives from the Latin "liber" (book), confirmed by the Greek "biblos." The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media. This brand identity will persist even though libraries offer many kinds of resources in many formats --including millions of digital books.

What does such technology and market re-emergence have to do with libraries? These cases suggest that the emerging re-definition of libraries (as both old and new) parallels much of Raffaelli's work, and that the narrative frame of "disruptive innovation in higher education" can be --should be-- challenged by this more useful counter-narrative: "new and re-emergent technologies in higher education."

While libraries' role as mere "book providers" has been challenged by disruptive technological entrants such as the Internet, Amazon, and social media, libraries' role as a channel for trusted, stable information is stronger than ever. Pew Research Center survey data from Fall 2016 found that 53% of Millennials (those 18 to 35 at that time) said they had used a library --an entire generational cohort, not just college students (the study focused on public libraries). This compared with 45% of Gen Xers, 43% of Baby Boomers, and 36% of the Silent Generation. In 2016 Pew also reported growth in the share of Americans who say libraries help "a lot" in deciding what information they can trust: from 24% in 2015 to 37% in 2016; women held that opinion more strongly (41%). Recent anecdotes suggest that such opinions have not changed direction.


Libraries are regarded as very strong assets to a community: the high value placed on pleasant space, safety, and community events also emerged in the Pew studies. Coupled with bottom-up initiatives from front-line librarians and individual organizations, the American Library Association has devoted substantial attention and resources to initiatives such as the ACRL Framework for Information Literacy in Higher Education and the Libraries Transform campaign. Libraries' free-to-all traditions (supported by tuition, tax dollars, and other sources) do not track community impact as easily as independent bookstores' sales figures do. Their value proposition for their communities becomes clear in usage figures (at SHU, growth in usage has outpaced growth in enrollment) and in faculty members' documented turn towards librarians for help in developing undergraduate students' research, critical analysis, and information literacy skills.

As a re-emergent technology, printed books sustain a host of skills, occupations, organizations, and cultural signals that do not boil down to a single, simplistic, marketable narrative. Conceived in the late 20th century as "information resources," books gave way to digital representation; conceived as "documented knowledge," the act of reading books in a library context provides a tangible experience of informed learning, cultural absorption, and community participation. Libraries provide many services, but without the "brand" of reading books, and the sustaining services of librarians, libraries would turn into derelict, zombie storage spaces. Knowledge is a communal good as well as a private act; it is never simply an individual achievement: free to all. We are all culturally embedded in the minds of our predecessors and communities, for weal and woe --and without libraries, bookstores, timekeepers, and printed books, we will not be able to progress from woe to weal.

 

Joseph Smith Jr.'s role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the reception of Clayton Christensen's theory of "disruptive innovation" as a popular idea.

NB This is a very long post, and might be more readable in this .pdf document.

Abstract: Christensen's theory of disruptive innovation has been popularly successful but has faced increasing scrutiny in recent years. Christensen is a Latter-day Saint, and the career and ideas of Joseph Smith, Jr. form a powerful backdrop for Christensen's theory. Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of "disruptive innovation" as an idea. Uncritical embrace of the popular theory, especially in higher education, implies acceptance of its cultural assumptions and suggests that the theory is less useful than has been claimed. This essay is a reflection on history and experience, not a theoretical quid-pro-quo or take-down.

Something always struck me as oddly familiar and decidedly off-key about Christensen's confident claims that innovation would explosively disrupt American higher education.  A previous boss dismissed my views as the reflex of one in a dying profession (librarianship). 

I never accepted her cynical critique; neither could I put my finger on why I still disagreed strongly. Then, while teaching a historical survey course on American Religion, a new insight came to me about the idea's interesting and problematic deep structure. My insight sprang from some familiarity with both the discourse of innovation in higher education and 19th-century American religion, two widely known but widely separated fields. What I have found will, I hope, give pause to thoughtful educational and academic leaders. Uncritical embrace of "disruptive innovation" might implicate them in religious and cultural commitments they have not examined, especially if they lead faith-affiliated organizations.

The first backstory: Christensen first introduced disruptive innovation in 1995 in an article aimed at corporate managers rather than researchers. In 1997 he expanded his ideas in The Innovator's Dilemma, which caught on in the 2000s with managers, business writers, and consultants in a big way, with the aura supplied by Harvard Business School. Since 2011 he has predicted that as many as half of American colleges and universities would close or go bankrupt within fifteen years (so by 2021-2026). He is still at it: in April 2017 he maintained his claim when speaking to Salesforce.org's Higher Education Summit: "I might bet that it takes nine years rather than ten." Sometimes he's been wrong (about the iPhone, for example), but he does not vacillate.

Disruptive innovation has become big business, popularized not only by Christensen (who has come to regret losing control of the term), but by a host of management consultants, pundits, and experts. Among librarians, David Lewis applied Christensen's ideas in 2004 and expanded them into a 2016 book that has been, on the whole, well-received by those in my "dying profession." Predictable pushback came in 2014 from another Harvard professor (the historian Jill Lepore), followed by a detailed reexamination in the MIT Sloan Management Review, among others. Christensen in turn defended his ideas and reformulated some of them in 2015, on the 20th anniversary of his initial publication.

Three years later, if the concept has lost some valence, or if he's just wrong about higher education, why rehash this now, for the umpteenth time?

That's where the second backstory becomes relevant. Christensen (of old Latter-day Saint stock) is not just coincidentally Mormon; that identity is central to his person, and that background is central to his work.

When I teach my historical survey of American Religion, in due course we come to the so-called Second Great Awakening in the first half of the 19th century. Scholars give special attention to the "Burnt-Over District" of western New York, home of many potent religious and political ideas associated with "Whig Evangelicalism": abolition, temperance, the rights of women, and reforms of public health, education, prisons, orphanages, and more. The District fostered not only mainstream Christian restorationist and evangelical movements (such as the Disciples of Christ or "Campbellites," Methodists, and Baptists), but also countless Millennialist, New-Thought, and Spiritualist communes, Shakers, Millerites (early Seventh-Day Adventists) --and Joseph Smith Jr.'s Latter Day Saints.

Smith resists casual dismissal. Was he truly a prophet of the living God (the mainstream Mormon view)? A womanizing fraud (the view of many of his contemporaries, and critics since)? A self-deluded prophet who eventually bought into his own fabrications and could not extricate himself (a sort of early David Koresh)? Or some kind of mystic or psychic with unusual access to otherworldly regions and the subconscious (a sort of frontier, raw-edged Francis of Assisi)?

Smith promulgated the enormous Book of Mormon (Skousen's critical first edition is 789 pages). He claimed an angel guided him to find ancient plates in a hill near Palmyra, New York, which he translated from unknown languages with the help of seer stones and a hat, and dictated on-the-fly to his wife Emma Hale Smith and others, all in 65 days. Even if he made it up, or shared authorship, or plagiarized part, it is an amazing performative achievement. A densely layered anthology of documents, speakers, and authors, the text can be regarded (if not as Scripture) as an early forerunner of "magical realism." All this from a 20-something farmhand with a little primary education.

Smith was truly a "rough stone rolling" whom generations of Mormons have never managed to smooth. "No man knows my history," he is reported to have said; "I cannot tell it: I shall never undertake it. . . . If I had not experienced what I have, I would not have believed it myself." His innovative edginess strongly contrasts with the earnest, family-oriented, upbeat, corporate image of contemporary Mormons.

"Innovative" -there's that word.  In matters of religion, innovation is not always a positive idea.  Smith's most famous innovations were the origins of the Book of Mormon, and the "new and everlasting covenant" (revelation) of both eternal marriage, and the principle of plural marriage.  The last innovation directly challenged 19th century American ideas about the family, and occasioned a furious opposition of a scale rarely seen in American history (leaving aside for the moment the historical plague of racism).  Latter Day Saints were opposed in Ohio, persecuted in Missouri (the Governor ordered extermination); Smith was murdered in 1844 in Illinois by a lynch mob acting out of congeries of fears.

The subsequent succession crisis would have led to fatal splintering were it not for Brigham Young. A considerable majority of Mormons followed his leadership from Illinois in the Great Trek to Salt Lake City (1846-1848); Young's organization both preserved and transformed the Latter-day Saints: they lost their prophet but gained a hyphen. The founder's innovations would have perished without Young's tenacity, sheer longevity (he died in 1877), and "courageous leadership" or "iron-fisted rule," depending on your point of view.

These two long backstories are essential for understanding both the meteoric rise of "disruptive innovation" and its recently waning appeal as an explanatory theory, in light of qualms about its accuracy.

Joseph Smith, Jr., can be seen as an exemplary disruptive innovator.

"'Disruption' describes a process whereby a smaller company with fewer resources is able to successfully challenge established incumbent businesses." While incumbents focus upon improving their product or service (especially on behalf of their most profitable customers), they tend to ignore less profitable market sectors, or exceed their needs.  Entrants seek out those ignored market segments and can gain customers by providing more suitable products or services, frequently simpler and at lower cost. Incumbents can fail to respond in kind while entrants "move upmarket," improve their offerings, maintain their advantages and early success, and eventually drive out or acquire incumbents when their formerly most profitable customers shift purchases to the entrants. (Christensen, 2015). 

Since Christensen has complained that sloppy use of his terms undermines his theory's usefulness, I have tried to paraphrase his core idea carefully, and to present Mormon history even-handedly. My central claim is that Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of "disruptive innovation" as an idea. Christensen's religion did not cause him to create his theory, but it did contribute a framework that fostered the theory's reception, as well as its swift extension beyond its first cases in manufacturing disc drives, construction implements, and other tangibles.

Christensen's commitment to his Church is beyond question; by his own admission, the intellectual, doctrinal traditions of his faith powerfully molded his character and thinking. By all accounts he is a charming and sincere man, and would never have smuggled his religion into his professional thinking; he is an honest and forthright broker. This essay is a reflection on history and experience, not a theoretical quid-pro-quo or take-down.

Smith's movement "came to market" at a critical time in American religious history.  The Constitutional de-coupling of religious institutions from the state apparatus was one of the primary innovations of the new nation.  The new "open market" meant that incumbent legally established, financially supported Christian churches had to compete for adherents and support.  In Connecticut, for example, the "standing order" of political and religious aristocracy came to an end in 1817.  Such competition challenged the established churches' previous role as assumed arbiters of social legitimacy.  If you have a wide field of religious choices, you could also choose "none of the above." National anxieties about declining status of established forms of Christianity in large part fueled the striking resurgence of Christian groups loosely termed "evangelical." 

A significant portion of those evangelical groups can be described as "restorationist," appealing to the New Testament proclamation of God's restitution of all things. This was taken to mean a return to the "primitive church," a re-pristination of Christianity taking it back to the putative purity of the founders. This led to a proliferation of religious bodies, almost all of which inherited from earlier centuries the binary certitude that each was "correct" and hence the others were necessarily "incorrect." Each group warranted its appeals in the lost purity of the first Christians. For many groups, the path back to purity had been cleared when disestablishment curbed the incumbent churches, and they hoped that clearing would level their status.

Since the religious marketplace of early 19th-century America had only recently been opened, doctrinal disputes that now seem arcane often paralleled heated social and cultural divisions. Smith's own family mirrored the national situation. His grandfather was a Universalist (holding that all humans could receive God's corrective grace; Universalists opposed evangelicals); his mother has been identified as Presbyterian (officially, she would have believed that God's eternal decree predestined some to eternal blessedness and foreordained others to eternal damnation; Presbyterians tended to be allied with evangelicals). Joseph Jr. may have caught a "spark of (evangelical) Methodism" at local rural revival meetings. His maternal grandfather and parents experienced visions and voices; like many farmers they used divining rods to find water and looked for buried treasure, a kind of folk magic. They believed in prophecy and vision, tended towards skepticism about "organized religion," and were receptive to new religious ideas. He is reported to have told his mother, "I can take my Bible, and go into the woods, and learn more in two hours, than you can learn at meeting in two years, if you should go all the time."

In the religious marketplace of western New York, Smith's family typified a market segment often ignored by the better-established groups, which appealed to more prosperous farmers, townspeople, and entrepreneurs (see Johnson's A Shopkeeper's Millennium). Smith's family, on the other hand, were downwardly mobile recent arrivals from Vermont without a network of support, a consequence both of poor decisions and of environmental strains such as the "Year without a Summer" (1816). They typify the impoverished, rural working class on the inner edge of the frontier, a down-market segment less promising to more prominent groups, for whom the competitive religious marketplace was particularly nettlesome.

The 14-year-old Joseph was confused by the "cry and tumult" of Presbyterians vs. Baptists vs. Methodists, all using "both reason and sophistry" to "establish their own tenets and disprove all others." He asked, "What is to be done? Who of all these parties are right; or, are they all wrong together? If any one of them be right, which is it, and how shall I know?" In other words, his market segment saw that ecclesiastical competition compromised the integrity of all parties. Reading a biblical text that directed him to ask God, he went into the woods (again!) and reported that he experienced a dramatic theophany: the "Personage" answered "that I must join none of them," for their "creeds were an abomination" and their adherents "were all corrupt." His take-away: he realized "at a very early period of my life, that I was destined to prove a disturber and annoyer." Joseph's subsequent innovations certainly disturbed and annoyed other groups.

Care must be taken, however, in simply equating Joseph's social location with a commercial market position, because the religious "marketplace" differs from commerce in important ways: product differentiation, lock-in, and brand loyalty.

The religious "product" is not a commodity, but a sense of living affiliation with a group that makes doctrinal, moral, and behavioral claims in such a way that simultaneous affiliation with more than one group is either prohibited or discouraged.  The ultimate outcome, some kind of eternal blessedness, in principle excludes other ultimate outcomes.  Today many children in "mixed" families can feel religious differences strongly (and opt for "none"). For example, an individual cannot be a Catholic in good standing and simultaneously an evangelical Baptist in good standing -their claims and ideas simply conflict too much; if both present in the same family, some accommodation must be reached.  Joseph Smith Jr. found such exclusive "product differentiation" troublesome.

Religious adherents' "market lock-in" is high: one might radically change one's affiliation once or twice in a lifetime, but changing more often is unusual and perhaps suspect, and "conversion" can exact high social costs. The religious fervor of New York's Burnt-Over District in Joseph Smith, Jr.'s time left religious organizations in flux, so that conversion costs were often much lower than before or after. All early Latter Day Saints nevertheless had to make a clear decision that departed from their inherited religious affiliations.

A religious group's "brand loyalty" involves a constellation of commitments; socialist Fundamentalists and alt-right Episcopalians are vanishingly rare (for example). The brand loyalty of early Latter Day Saints evolved from 1830 to 1844, becoming progressively stronger both in positive response to Joseph Smith Jr.'s continuing revelations and in defensive response to violent persecution. For example, the early Saints' constellation of commitments was ambivalent towards slavery: initially, as Northerners, early adherents opposed it; then revelations and teachings evolved to allow some slave-holders to join in Missouri (a slave state). After Smith's murder, his son Joseph Smith III and widow Emma Hale Smith repudiated both slavery and plural marriage in the Reorganized Church of Jesus Christ of Latter Day Saints in Missouri, the "minority" successor group. By contrast, Brigham Young's larger "majority" successor not only retained plural marriage but attempted to legalize slavery in the Utah Territory. Since Republicans, starting in 1854, sought to abolish the "twin relics of barbarism," slavery and polygamy (a jab at Young's group), it is unclear whether that commitment arose from core convictions or defensive resistance.

"Disruptive innovation" in the religious marketplace has to be treated carefully, because of not only the special nature of the religious market place, but also rigorous examination of the idea of "disruptive innovation:" it does not mean just any disruption.

Whatever the sources of Joseph Smith Jr.'s ideas, he led a movement that "gain[ed] customers" (i.e., adherents) by providing more suitable, often simpler products or services, frequently at a lower cost. (Latter-day Saints have never had professional clergy; their commitment to mutual assistance is exemplary.) Market incumbents (more organized and better financed competing groups) were slow to respond in kind, and as Smith's group moved "upmarket," it maintained its "advantages and early success": high rates of "lock-in," group cohesion, and brand loyalty. Smith's group, however, never quite succeeded in driving the "incumbents" out of the market, or even in acquiring most of their former customers. The group's sense of urgency lost its edge.

Why are the Latter-day Saints "latter-day"? This code phrase refers above all to a shared set of cultural and religious assumptions and commitments in early 19th-century America. "Millennialism" was the belief that the coming of the Kingdom of God (promised in the Bible) was imminent and that America, with its special covenant of religious liberty, would be central to its arrival. Millennialism came in two distinct forms with opposite answers to the question, "Will Christ return before or after the promised millennium (1,000 years) of peace?" Pre-millennialists emphasized the troubles (tribulations) that would both precede and signal Christ's return to reign for 1,000 years before the Last Judgement. Post-millennialists proclaimed that Christ would return, and the Last Judgement occur, only after the millennium of peace; their task was to make the world ready for the Kingdom of God. Both expected the Kingdom very soon: we are all living in the biblical "latter days."

This recondite disagreement has important implications. Post-millennialists were all about social reforms that would make the United States so like the Kingdom of God that American society would usher in the millennium. Pre-millennialists emphasized that Christ would come only after dramatically increasing tribulations; things getting worse and worse was a sign of his approach --hence they disfavored social reforms as a distraction from the real work of preparation for evil times. (Historical aside: the War of the Secession largely discredited post-millennialism, which morphed into the program of social reforms of the Progressive era. Pre-millennialism evolved into dispensational Christian fundamentalism, combining the expectation of tribulation with a belief in the factual, literal inerrancy of the Bible.)

Latter-day Saints' enduring post-millennialism shows, among other ways, in their boundless optimism. The familiar, earnest young missionaries (think The Book of Mormon, the Broadway show) are a token of the continuing commitment of the Latter-day Saints to usher in the latter days, although they expect them less imminently. Millennialism is common coin no longer: despite the popularity of the "Left Behind" series of books and movies, only a small minority of survivalists or "preppers" appeal to Biblical warrants for their expectations of imminent tribulations (disaster).

Detached from Christianity, expectations of imminent disaster and rebirth went rogue in American culture long ago. Silicon Valley today, for example, is home to many who expect a "singularity" in which artificial intelligence outstrips human intelligence and introduces profound changes to civilization as a whole --another sort of secular millennium, in which technology has replaced a Messiah as the anointed power. Popular movies and books have made dystopia a cultural cliché. (What's the disaster this time? Nuclear war, apes, viruses, climate change, or the abrupt disappearance of millions?) How many jokes about "voting in the last American election" (double entendre) play on similar fears?

"Disruptive innovation's" popularity exploded in the 1990s and 2000s exactly because of the numerous hopes and fears raised by the advent of the Internet and its devices and social media.  Josh Linkner warned, "disrupt or be disrupted," (The Road to Reinvention, 2014) and that binary choice spoke in apocalyptic tones to incumbent mass media, libraries, bookstores, journalists, travel agents, financial consultants, university presidents, and anyone else who deals in "information" as a commodity.  Such urgent warnings shout to established corporations, "The end is near: you can't innovate fast enough; you're not even the right people to do it."  Incumbent organizations were counted out simply because of their incumbency: MOOCs would inevitably disrupt brick-and-mortar educational institutions, now denigrated because of their mere physicality. 

The popular version of "disruptive innovation" played on dystopian fears of the collapse of the known "incumbent" corporations and the rise of an economy of perpetual disruption --Schumpeter's capitalist creative destruction now recast as "disruptive innovation" with a brutalist, binary emphasis: disrupt or be disrupted. The archetypal "creative disruptor" is the technological whiz-kid (I nominate the Mark, "Zuck Almighty") whose revelatory "Book of Faces" and continuing revelations of a "new and everlasting platform" will usher in a thousand-year era of effortless, limitless, and unfailingly upbeat social confabulation. Except that many kinds of terrorists, Russian autocrats, vaccine deniers, and deranged stalkers confabulate as well.

What does this have to do with Clayton Christensen? Well, both a little and a lot. He cannot deny his own popularization of his ideas through his books, media appearances, securities fund (the short-lived Disruptive Growth Fund, launched in 2000 at just the wrong time), and army of students, friends, and defenders such as Thomas Thurston in TechCrunch. He lost control of "disruptive innovation" as a term of art precisely because of its appeal to those who make a living from in-your-face, counterintuitive claims. Lepore identified circular reasoning in the popular idea of creative disruption ("If an established company doesn't disrupt, it will fail, and if it fails it must be because it didn't disrupt"). This logical circle may or may not characterize highly-disciplined case studies of Christensen's theory, but it certainly rings true of the endless popular iterations.

Whether Christensen's theory holds up almost does not matter to "disruptive innovation" as a popular idea. By analogy, in Smith's America, as Terryl Givens has noted, what mattered about the Book of Mormon was not its teachings or particular message: "It was the mere presence of the Book of Mormon itself as an object that . . . served as concrete evidence that God had opened the heavens again." In that era all manner of revelations appeared: the Shakers' Holy, Sacred, and Divine Roll and Book, the visionary experiences of Ellen G. White (one of the originators of the Seventh-Day Adventists), and the visions of rival claimants to Smith's prophetic mantle among the Latter Day Saints after his death. Kathleen Flake has noted, "Henry Ford wanted a car in every home. Joseph Smith was the Henry Ford of revelation. He wanted every home to have one, and the revelation he had in mind was the revelation he'd had, which was seeing God." The heavens, once opened, proved harder to close.

The popular idea "creative disruption" has attached itself, meme-like, to a lot of second- and third-rate scams.  Business theory has fewer brightly defined disciplinary boundaries than physics. King's and Baatartogtokh's conclusion that the theory has limited predictive power does not render Christensen's ideas useless, but does suggest that "disruptive innovation" will not be the "one theory to rule them all," and with the profits or prophets bind them. 

Joseph Smith Jr. claimed that the encounter he had with the Holy in the woods warned him not to join any of the (Protestant) groups in his vicinity, whose creeds were all "corrupt" and "an abomination." Christian restorationists called the very early Christian movement of the short period reflected in the New Testament texts "the primitive church," and regarded all subsequent developments as not merely wrong but apostate: those who knew the truth but deliberately denied it. Joseph Smith, Jr. saw his new revelation as a giant delete key pressed on all of Christian history --Eastern Orthodox, Catholic, and Protestant. All of it had to go.

In a similar manner, popular "disruptive innovation" connotes the cleansing destruction of all that is wrong with sclerotic corporate capitalism, and the restoration of the pure "invisible hand" of the marketplace that allegedly guided early capitalists. This popular view resonates with a great deal of cultural and political libertarianism: giant corporations and government bureaucracy are apostasies betraying the true faith handed down from the founders (either Jesus or Adam Smith, as you wish). "Move fast and break things," advised the Zuck; what can be disrupted should be disrupted. Including, it would now seem, democracy wherever it might be found.

Disciplined use of the theory of "disruptive innovation" in carefully defined circumstances provides explanatory clarity, but its predictive power is in fact more of a hope than an established fact, despite the protests of Christensen's defenders. This means that it is one theory among others: Michael Porter's theory of competitive advantage and multifactorial analyses will likely work equally well in other carefully-defined situations. Similarly, The Church of Jesus Christ of Latter-day Saints has found ways of regarding other religious groups positively (even Evangelicals, often the most hostile), and has moved on from the language of "apostasy." Originally intending to hit that giant delete key, subsequent Latter-day Saints have found a way to live as active readers of their particular texts in the midst of many other readers of many other texts. This has relevance on the ground: given official LDS teachings regarding divorce and homosexuality, some LDS families have found informal means to include and tolerate differences among their members, coming to resemble the family life Joseph Smith Jr. knew as a boy. (Others have continued to shun their "apostates.")

Unlike Smith, Christensen never intended to promulgate a "unified field theory" of religion or business development, and he is not completely responsible for losing control of his theory as a popular idea. The close of his 20-year, 2015 re-evaluation, "We still have a lot to learn," acknowledges that "disruption theory does not, and never will, explain everything about innovation specifically or business success generally."

Christensen's modesty still did not inhibit him from doubling down on his claim that half of American colleges and universities would close by 2025. Allowing his claim relative rather than revelatory validity dispels the apocalyptic fears of barbarians at the gates. His primary analogy in The Innovative University (2011) is "changing the DNA of higher education from the inside out" (the subtitle). He claims that all American colleges and universities other than a branch of Brigham Young University in Idaho share the DNA of Harvard: all these institutions want to become, apparently, Research-1 universities if given the money and the chance. What does that really mean, and is it really true? Such a simple analogy does grave injustice to community colleges (vital economic links for countless communities and immigrants), to specialized schools such as Maine Maritime Academy, and even to an elite liberal arts college such as DePauw University. The popular view that higher education has changed little and can change little is flat wrong: ask any productive historian of higher education. Change and innovation (whether disruptive or otherwise) will not appear equally and everywhere overnight. The higher education sector is not (thank heavens) Silicon Valley or McKinsey & Co.

Yet all is not well: the economic model underpinning American higher education is likely unsustainable in the coming decades, for many reasons. Higher education also represents a huge social and financial investment that is unlikely to dissipate. Distance education, information technology, changing social expectations, and shifting demographics will all play a role in whether particular colleges and universities remain viable. Disciplined use of the theory of "disruptive innovation" will likely hold some, but not all, of the explanatory and predictive keys. The truth is out there, but it will be much more complex.

The striking persistence of popular "disruptive innovation" in senior management circles (typified by the Salesforce.org higher education event) reveals not only persistent fears and enduring threats, but also short attention spans devoted to keeping up with the swift pace of too many events. I suspect that popular "disruptive innovation" functions in a manner more affiliative than explanatory: "If you don't get it, you're not one of us. You think Jill Lepore, or King and Baatartogtokh, might be right, eh? Let's see how long you last in the C-Suite" (especially if you can pronounce the latter's name).

"Disruptive innovation" elicits fears useful for those who want to shake up certain professions in health, law, librarianship, and the professoriate, but by now its been over-used.  At librarians' meetings (ALA, ACRL) I have developed the habit of responding to the expression, "disruptive innovation" with the question, "what are you selling?"  Fear sells "solutions;" its potency as a means of marketing continues nearly unrivaled.  No one ever sold an expensive library services platform with the phrase, "this won't solve all your problems."  Since 1985 I have sat through many presentations that predicted the closure of libraries within ten years - Christensen's remark "I might bet that it takes nine years rather than ten" would find a new audience.  We who are about to be disrupted salute you, Prophet.

Nevertheless: printed books, asking questions, research assistance, and personal relationships with library instructors endure.  They were warned, but they persisted.  It is past time to find a more accurate analysis and predictive theory of the future of libraries and higher education.

On Tyranny is not only about an American moment, but about a worldwide one.

On Tyranny: Twenty Lessons from the Twentieth Century, by Timothy Snyder. Tim Duggan Books (Crown), 2017. 126 p. ISBN 978-0804190114. List price $8.99.

Yale University professor Timothy Snyder has spent a long time learning the languages, reading the documents, exploring the archives, and listening to witnesses of the totalitarian tyrannies of Europe in the last century --particularly of Nazi Germany and the Stalinist Soviet Union. His scholarship bore particular fruit in books such as Bloodlands: Europe between Hitler and Stalin and Black Earth: The Holocaust as History and Warning. He came to recognize that certain characteristics in the development of those tyrannies are present in the world today, and in the United States. This book is no partisan screed: Snyder recognizes in the 45th President features he knows from other contexts, and those other contexts underscore the drift towards totalitarianism apparent from Russia to Europe to the USA. On Tyranny is not only about an American moment, but about a worldwide one.

This short book consists of a brief introduction, twenty short chapters, and an epilogue. Each chapter directs an action, such as no. 12, "Make eye contact and small talk," followed by a historical example or an expansion of the point. All the actions can be undertaken or performed in daily life; there is no grand theory here.

In place of a grand theory, there is a fundamental point: respect and value facts, truth, and accurate usage of our common language. In Moment (magazine), he explained: "Once you say that there isn't truth and you try to undermine the people whose job it is to tell the truth, such as journalists, you make democracy impossible." He told Bill Maher (at 2:02) that while "post-fact" postmodernism might connote "Berkeley, baguettes, and France and nice things," it more likely means that "every day doesn't matter; details don't matter; facts don't matter; all that matters is the message, the leader, the myth, the totality" --a condition of Europe in the 1920s. Such disdain for the truth goes hand-in-hand with conspiracy theories that assign blame to a group accused of undermining the purity of the majority. "Rather than facing up to the fact that life is hard and that globalization presents challenges, you name and blame people and groups who you say are at fault." Jews, Mexicans, Muslims, Rohingya, Tutsis, Hutus, globalists, evolutionists, or any other "outsider." The myth: "Make [fill in the blank] great again."

A librarian or researcher might particularly resonate with Snyder's directions "Be kind to our language," "Believe in truth," and "Investigate" (lessons 9-11). This is all a way to prepare to "be calm when the unthinkable arrives" (lesson 18) --when a leader exploits a catastrophic event to urge followers to trade freedom for security, and suspends the rule of law. The Chief Executive may or may not be attempting to stage a coup; American democracy survived the dark moment after Charlottesville, but barely. Snyder told Salon in August, "We are hanging by our teeth to the rule of law. That was my judgment at the beginning of his presidency and it is still my judgment now. The rule of law is what gives us a chance to rebuild the system after this is all done."

Whether or not current politics will result in tyranny and oppression is still (at this writing) an open question. The importance of Snyder's book is that it points beyond this moment to the wider trends and challenges of a world that is global (like it or not), connected (like it or not), and interdependent with both our natural climates and our accrued, hard-won cultural heritages. A University founded on "a rigorous and interdisciplinary search for truth and wisdom" that "forms the cornerstone of all University life and welcomes people from all faiths and cultures" cannot leave our students unprepared. "In order to make history, young Americans will have to know some" (p. 126). Will that be the twenty-first lesson on tyranny, from the twenty-first century?

--Gavin Ferriby

Hartley argues that a liberal arts education widens a student's horizons, prompts inquiry into human behavior, and uncovers opportunities for products and services that will meet human needs. The "softer" subjects help people determine which problem they're trying to solve in the first place.

The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, by Scott Hartley. New York: Houghton Mifflin Harcourt, 2017. ISBN 978-0544-944770. $28.00 List.

Hartley writes that a "false dichotomy" divides computer science and the humanities, and he extends this argument to STEM curricula as well. For example, Vinod Khosla of Sun Microsystems has claimed that "little of the material taught in liberal arts programs today is relevant to the future." Hartley believes that such a mind-set is wrong, for several reasons. Such a belief encourages students to pursue learning only in vocational terms: preparing for a job. STEM fields require intense specialization, but some barriers to coding (for example) are dropping thanks to web services and communities such as GitHub and Stack Overflow. Beyond narrow vocational boundaries, Hartley argues that a liberal arts education widens a student's horizons, prompts inquiry into human behavior, and uncovers opportunities for products and services that will meet human needs. The "softer" subjects help people determine which problem they're trying to solve in the first place.

That said, the book does not move much further. Hartley never really tries to provide a working definition of a true "liberal arts" education except to distinguish it from STEM or computer science. By using the vocabulary of "fuzzy" and "techie" he encountered at Stanford, he inadvertently extends a mentality that has fostered start-ups notably acknowledged to be unfriendly to women. So far as I could determine, only a handful of Hartley's cited sources were published anywhere other than digitally --and although the "liberal arts," however defined, have a very long tradition of inquiry and literature, Hartley passes that tradition by almost breezily, and it is very little in evidence here. His book is essentially a series of stories of companies and their founders, many of whom did not earn "techie" degrees.

Mark Zuckerberg's famous motto "move fast and break things" utterly discounted the social and cultural values of what might get broken. Partly in consequence of such ignorance of human fallibility and conflict, the previously admired prodigies of Silicon Valley start-ups faced intense social scrutiny in 2017.

Hartley is on to a real problem, but he needs to do much more homework to see how firmly the false dichotomy between the sciences and the humanities is rooted in American (and worldwide) culture. The tendency, for example, to regard undergraduate majors as job preparation rather than as disciplined thinking, focused interest, and curiosity is so widespread that even Barack Obama displayed it ("Folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree," he remarked in Wisconsin in 2014; he did retract it later).

Genuine discussion of the values of humanities and STEM degrees can only take place with the disciplined thinking, awareness of traditions, and respect for diversity that are hallmarks of a true liberal arts education.