
I have long loved the elegant verse of David T. W. McCord, "Books fall open, / you fall in, / delighted where / you've never been." Sometimes that's even true. (A poet with a day job, McCord also raised millions for Harvard as director of the Harvard College Fund.)

Several months ago a book fell open for me, one that I had long known about but never really read: Karl Barth's The Epistle to the Romans. I started to read it not long after I resolved to nurture my knowledge of Greek by reading some New Testament daily. I chose Romans because it was there (and because I read much of it for a seminar with the late great Prof. Christian Beker): it demands attention, is linguistically difficult, and seems to stand in the midst of every major turn in Christian history, such as those associated with Martin Luther and Karl Barth.

I also chose Barth because I have access, courtesy of Princeton Theological Seminary Library, to the German text of Barth's second edition of 1922, which opened the way for so-called "dialectical theology" and a decisive turn from previous liberal Protestantism by many German-language writers. So I could exercise my Greek, Latin (I read the Nova Vulgata in parallel), and German, languages on which I spent a lot of time and effort as an undergraduate, and which I do not wish to lose at this later stage of my life.

Little did I realize, when I began this project in 2019, that it would become so timely. I began Romans exactly because I have been so dismayed by the slow slide into fascist authoritarianism underway in my native country. (I hesitate to call it my home any longer; I live here as a resident alien.) I had some inkling that it had something to say to my condition. I was little prepared for how much it had to say.

Barth's book first appeared in 1919 (finished in the last days of the War, in December 1918), followed by a thoroughly revised second edition in 1922, later minor alterations, clarifications, and republications, and an English translation by Sir Edwyn Hoskyns in 1933. The Epistle to the Romans established Barth's early reputation as a disruptor; the second edition in particular critiqued (or bulldozed) the cultural liberalism of the Protestant theology of his teachers at Marburg and elsewhere. The term "dialectical theology," which arose in the early 1920s, highlighted the frequent use of paradoxical formulations and polar opposites by several writers (Barth, Gogarten, and Brunner prominent among them) and was always --given the prominence of "dialectic" among the heirs of Hegel, and especially Marx--something of a misnomer. There is a great deal of distinguished scholarship on the origins and development of "dialectical theology" in the 1920s and on the disagreements of its original proponents as time went on (especially in the 1930s), and I do not intend to rehearse any of it here.

My focus instead is how unexpectedly contemporary the second edition of Barth's Epistle to the Romans now seems, despite its readily apparent and important differences from the present moment. I am enough of a historian to remember and acknowledge that history does not repeat itself, or even often rhyme (in words erroneously attributed to Mark Twain), but it somehow sometimes retains a certain metrical force, like blank verse in iambic pentameter. Or in Yogi Berra's reputed words, "It's like déjà vu all over again."

When reading the German second edition (available in the Karl Barth Digital Library, or via Hathitrust here --Vorsicht: Fraktur!), I immediately felt great sympathy for its translator, the late Sir Edwyn Hoskyns (13th Baronet of Harewood, County Hereford, and Dean of Corpus Christi College, Cambridge). Sir Edwyn faced the daunting challenges of Barth's German, which intentionally pushes beyond normal boundaries with multiply-compounded words, neologisms, paratactic paragraph-long sentences, and now-obscure references to contemporaneous events and well-known persons. Sir Edwyn's English version of the German sixth edition (1928) unavoidably presses Barth's free-range German into Oxbridge English and the language of the Authorized Version. He inadvertently misses or minimizes Barth's Sprachspiel, his play with the language in an attempt to demolish over-familiar and well-worn phrases bearing the echoes of discredited theologies, politics, and cultural outlooks. On the other hand, following Barth's German too closely would nearly completely obscure his meaning and significance for English readers, and linguistically daze them.

Barth's German is variously playful, allusive, hortatory, monitory, and nearly aphoristic. (I will give examples in succeeding blog entries.) It reminded me less of the weighty German of Barth's later Kirchliche Dogmatik than of writers associated with German "expressionism": Georg Büchner (especially Woyzeck, adapted for Alban Berg's opera), Frank Wedekind (the Lulu plays, the basis of Berg's opera, and Frühlings Erwachen or Spring Awakening, source of the 2006 musical), and Fritz von Unruh (especially Opfergang, an anti-war drama written during the Siege of Verdun/Schlacht um Verdun). I also thought of writers associated with the Austrian Jugendstil, especially Robert Musil, Arthur Schnitzler, Stefan Zweig, and Georg Trakl (associated with Der Brenner, influential in the revival of Kierkegaard in German-speaking lands, and a link via friendship to Ludwig Wittgenstein). The influences of both Kierkegaard and Nietzsche are especially prominent in Barth's second edition of Romans, and have been studied thoroughly.

Barth used his expressive, nearly expressionist language to torpedo the discredited cultural, Protestant hegemony represented above all by his (respected) teacher Wilhelm Herrmann. A "Lutheran Neo-Kantian," Herrmann taught Barth to speak of God dialectically, in opposites: dogmatic/critical, Yes/No, veiling/unveiling, objective/subjective, in line with an emphasis (from Friedrich Schleiermacher) upon the religious experience of the individual. So Barth did not repudiate everything that Herrmann taught him. But Herrmann was one of the 93 signers of the Manifesto "To the Civilized World" (An die Kulturwelt: Ein Aufruf), which unequivocally supported German militarism and military actions in 1914, specifically including the "Rape of Belgium" (in essence confirmed by later scholarship despite certain Allied fabrications). Herrmann was not the only prominent Protestant theologian to sign An die Kulturwelt: so did Adolf Deissmann, Adolf von Harnack, Adolf Schlatter, and Reinhold Seeberg (father of Erich Seeberg, the eventual Dean of the "German Christian" Protestant theological faculty in Berlin during the National Socialist régime). So Barth's demolition of the language of liberal Protestantism extended to a cultural critique of the discredited, compromised, fatally flawed accommodation of liberal Protestant theology to German militarism. Barth's citizenship and employment as a Swiss pastor gave him both access to the German linguistic community and freedom from German wartime censors, however sympathetic the officially neutral German-speaking Swiss may have been to the German side, with their consequent anxiety about any kind of wartime dissent.

Barth began his project of demolishing the whole discredited line of Protestant piety focused on individual experience (and its accompanying social and political irrelevance) in 1919, but by 1922 his second edition reflected his dismay at the disaster then unfolding in revolutionary Russia as it morphed into the early Soviet Union. Barth's demolition extended to the right (socially mainstream, bürgerlich Protestantism), to the left (red and redder socialism), and to a Roman Catholicism already susceptible to revanchist or restorationist, imperial-fascist fantasies in Austria, Italy, Spain, and elsewhere. Safe in stable Switzerland, he nevertheless experienced (with the rest of Europe) the disorienting political and cultural shock of imperial crackup, and the cowardice of German wartime leaders, especially the Kaiser, Tirpitz, and Ludendorff. Nor did economic disaster spare Switzerland: the general strike (Landesstreik) of November 1918 strained civil society more than at any time since the Sonderbund war of 1847-1848. The H1N1 "Spanish" influenza epidemic of 1918-1919 hit Switzerland very hard, and the inadequate official response brought the nation to the brink of civil war.

Barth wrote his second edition, then, in a lightly industrialized (weaving) town in the midst of entrenched economic disparities, national and international political dysfunction, and pandemic. This is not the same world as 2019-2020, but it seems eerily adjacent. Even more, the collapse of a public theological or religious rhetoric associated with denuded assumptions of political and cultural hegemony, and the thorough discrediting of its moral authority (in Europe then), can only remind one of the cultural bankruptcy of American evangelicalism now, so thoroughly colonized or even weaponized by the current chaotic American régime. In America in 2020, as in Europe in 1922, all these deteriorating conditions reflect both acknowledged and unacknowledged wounds and diseases left festering for decades.

Barth's response was a vigorous social critique and withering theological appraisal that reflected his wide reading and wide-ranging cultural allusions, extending far beyond normal academic theological discourse. He speaks with a preacher's or prophet's voice, not the voice of a professor (although he is certainly academically well informed). The world of his text —what it presumes, upon which it comments, whose pretensions it exposes—will be the content of further blog entries.

I finally got around to reading Stephen Greenblatt's Will in the World: How Shakespeare Became Shakespeare years late, after thoroughly enjoying the first part of his book The Swerve: How the World Became Modern. I emphasize the first part, because that book is really two texts. One is a thoroughly entertaining and engaging account of how Poggio Bracciolini found and published the sole sizable manuscript of Lucretius' De Rerum Natura, a text that informed 15th-17th century skepticism towards Christianity. The second, less recondite and rather pretentious text is essentially an anti-religious polemic from a rather tired point of view: that the so-called "Renaissance" bloomed with insightful and wise humanism and overcame centuries of superstition and dogmatism --in other words, Burckhardt's or Gibbon's view of the Middle Ages. This second text not only springs from a historically questionable point of view but contains factual errors (such as ascribing the New Testament sentence "Jesus wept" to Luke rather than John --an easy error any publisher's fact-checkers should have caught). The failures of the second part of Greenblatt's text do not overshadow the value of the first, but one does wonder: why bother?

Will in the World is an equally engaging reading of most of Shakespeare's works (poetry and theatre) to inform careful speculation about his life, his inner life, and how he came to produce them. It does not seek to explain Shakespeare so that his achievement becomes merely quotidian --far from it! The known facts of Will's life are threadbare, preserved largely through the careful record-keeping of legal proceedings in Elizabeth's England, and Greenblatt bases his account on well-received points of text well known to scholars. What Greenblatt brings is careful but not over-fussy readings of Shakespeare's texts themselves, with an eye to what may have informed his imagination, such as the premature death of his son Hamnet in the period just before Will wrote Hamlet --an unsubtle play on names, especially since neither orthography nor pronunciation was standard in Will's time. Consequently Greenblatt's text is noticeably furnished with careful speculations, with verbs such as must have, likely have, and may have, and words such as probably and apparently —all in service of trying to pry beyond the veil of historical silence to ask what was going on in Will's mind and spirit.

Scholars may or may not approve of some, all, or any of Greenblatt's well-founded conjectures, and David Stenhouse justifiably would retitle the book Will in the Work. Greenblatt discovers no new facts, but he does not disguise his conjectures. For a non-specialist reader, he conveys vividly the peculiarities of London ca. 1590-1610, including odd legal features of property and time (such as "liberties," a legal trick by which civic legal authority over formerly monastic lands, once held by houses long since dissolved, continued to be limited). Will's world is evoked very credibly in many places; in others, not so much. Will's Catholic sympathies, or downright Catholicism, seem less well established than Greenblatt would make them. Will's utter silence regarding the leading source of social unrest and legal danger in his time still seems more prudential than protective: we just don't know what he thought about any questions of faith.

The problem is that, for a figure of his time, Shakespeare left remarkably little: no letters, very few reliably autograph manuscripts, no diaries, few reliable personal reminiscences from those who knew him, and few legal records beyond the barely factual. No wonder speculation remains that the figure of Will was a front for someone more notably connected or well-born, who however remains equally elusive, or more so. Compared even with earlier figures such as John Calvin or Heinrich Bullinger, at home in a different but equally litigious, dangerous, and ambiguous culture, Will left so little --but what he did leave is sometimes startlingly revelatory of human feelings that are still accessible, as well as of affections and desires long thought unworthy (such as same-sex desire), but now recovered and celebrated.

I'm glad I read it --it helps me to hear Will's works and words on stage. Where's Will? is a more serious game of Where's Waldo?, however --and he's still hard to spot. Greenblatt's New Historicism serves him well in this case. The book is worth reading, and if you wish to dive into Will's texts, I recommend it. His reading of Julius Caesar in its Elizabethan context is alone worthwhile —and in the current age of political lies, alternative facts, fake news, looming tyranny, and unbridled ambition, highly topical.

I have written previously about Matthew Battles's 2003 book Library: An Unquiet History (Norton)—see this link for text rescued from a previous blog, and page down to January 24, 2011 (or just page-search "Matthew Battles" and it's the second occurrence). I took part in an informal conversation about this book at a recent library conference, and I enjoyed re-visiting it.

Battles seeks to "read the library" (page 14).

I explore the library's intertwined relations of fancy and authenticity, of folly and epiphany, of the Parnassan and the universal. My method . . . mirrors that of Eugene Gant [a character in Thomas Wolfe's Of Time and the River]: I pick up a volume . . . . [and] follow a trail. . . I drop one passage to follow another, threading my way among the ranges of books, lost among the shelves. . . . What I'm looking for are points of transformation, those moments where readers, authors, and librarians question the meaning of the library itself.

pages 20-21

Battles seeks neither a reductive account nor a comprehensive exposition of the history of libraries, but points out transformations of text, reader, author, and librarian. His pursuit takes him to ancient Mesopotamia and classical antiquity, ancient China, the Aztec realm and its predecessors and successors, Renaissance and early modern Europe, all the way to Nazi Germany and the ethnic wars of southeastern Europe. His most recent example is probably the destruction of the Buddhist statues at Bamiyan, Afghanistan, by the Taliban, who sought through iconoclasm and destruction to enforce their lethal, ideologically pure, and ethnically cleansed mockery of Islam.

One of Battles's major points is that the same insight that has led some cultural, political, or military leaders to found, build, and sustain libraries has also led to their destruction: a library is a source of power, prestige, and memory, and when power changes hands, prestige is re-distributed and memories (in the new leaders' view) must be extinguished, so libraries are destroyed. Whether or not the library of Alexandria was in fact torched by Romans, Muslims, or Christians, whether or not the First August Emperor (Shi Huangdi) of China in fact destroyed all records of previous states as well as élite or Mandarin writings, libraries have withered and been dispersed as surely as they have been built. Nazi German librarians made a Faustian bargain to preserve themselves, some of their books, and their profession by subordinating it to fictions of Volk, blood, and soil, only to see themselves even further marginalized in the war and the subsequent re-founding of the German states (Jahr Null). Libraries have an unquiet history not despite their development and success, but exactly because of it.

The years since Battles's 2003 publication have only confirmed this, alas. Islamist insurgents fleeing Timbuktu before advancing French soldiers torched two library buildings in 2013, destroying priceless and unique Sufi manuscripts --the Sufis being insufficiently Islamist in their view (many of the manuscripts were subsequently found to have survived). Many other manuscripts of this center of Islamic learning had been moved and recorded elsewhere (and subjected to the dangers of humidity levels never occurring in Mali). In 2012 and after, manuscripts had to be protected again, in a remarkable multi-pronged effort brilliantly described in Joshua Hammer's The Bad-Ass Librarians of Timbuktu: and Their Race to Save the World's Most Precious Manuscripts (New York: Simon & Schuster, 2016). The Islamic State of Iraq and the Levant (or Syria, ISIL or ISIS or Daesh) destroyed many cultural artifacts of ancient civilization during its crude, terroristic, and bigoted reign, including the Central Library of Mosul, the University of Mosul, the library of a 265-year-old Latin (Dominican) church, and the Mosul Museum Library. The People's Republic of China has determined to destroy Uighur culture and many artifacts and collections of Kashgar and elsewhere in Xinjiang, a remarkable example of Han Chinese racism and bigotry that goes hand in glove with Han Chinese destruction of Tibetan culture.

Digital destruction of a kind is also certainly possible, one that marginalizes Nicholson Baker's carefully enacted, idiosyncratic, and self-hyped outrage at the "loss" of newspapers that were already acidifying (Double Fold: Libraries and the Assault on Paper, 2001). By "digital destruction" I mean the promulgation of (anti-)social media and its assault on any concept of truth, such that cultural memory is relegated to the memory hole. The infuriating, bland shallowness of Mark Zuckerberg leads as example one, but many others follow, including Jack Dorsey of Twitter, and Steve Huffman and Alexis Ohanian of Reddit, who celebrated "freedom from the press." These Panglossian, superficial, and flashy developers, with the sketchiest, most desultory, slap-dash notions of "freedom of speech" and neo-liberal deregulation, have unleashed hordes of edge-lords whose only real goal is to burn it all down: language, truth, discourse, respect for other points of view, and tolerance for disagreement. They join the insufferably woke of right and left to rend any democratic policy and polity, and to worship authoritarian ideologies masquerading as enlightened thinking. Their red pills become the cyanide tablets concealed by the romantic spies of the thrillers. All this digital destruction is as assuredly an assault on the web of language, concepts, habits, skills, and dispositions that build and enact libraries and inquiry as the depredations of Communist Chinese, Islamists, and fascists of every stripe and country --an assault that moves forward 24/7/365.

What might save libraries? Ironically: burial, or going off-line, inaccessible, and impossible to locate. What goes around comes around, in the long run, but it can be a very long run. The arc of history may or may not bend towards justice, but it does oscillate between truth, power, brutality, and lies, and in the end the truth is surprisingly durable. How else did anything at all survive from antiquity, whether classical, Asian, or Meso-American? One of the keys is librarians —who remain slightly suspect and patronized by academic leaders because those leaders know, deep down, that librarians don't work only for them and their institutional goals and objectives. Librarians have something bigger in mind —as Michael Moore knows, and said ten years ago when HarperCollins attempted to intimidate him and require a rewrite of his book Stupid White Men to tone down his criticism of George W. Bush.

I really didn't realize the librarians were, you know, such a dangerous group.
They are subversive. You think they're just sitting there at the desk, all quiet and everything. They're like plotting the revolution, man. I wouldn't mess with them. You know, they've had their budgets cut. They're paid nothing. Books are falling apart. The libraries are just like the ass end of everything, right?

Daily Kos, October 20, 2009

Well, not the ass end of everything. Rather like Balaam's Ass (Numbers 22:21-39): inconvenient both to Balaam and the rulers of the Moabites, seeing an angel blocking the narrow passage to the future, and pointing out injustice, inaccuracy, and lies. It's a tougher job than many might think, rarely recognized, and frequently obscured by librarians' own professional commitments —but speaking through the unlikely and the disparaged, and shining a light on truths, nevertheless. Caveat lector.

I just finished --very much behind the curve--Eric Klinenberg's Palaces for the People, and of course I liked it. It gives pride of place to libraries in the evolving social structure of the United States in the 20th century, and to their difficulties since. So why wouldn't I like that? I was still left with some questions (and maybe Klinenberg will continue on this subject in future work).

Klinenberg focused on public libraries, and that is an important feature and caveat at once. My sister (now retired) was director of an urban public library in Michigan for twenty-four years, and I saw firsthand how physical infrastructure, service commitments, political savvy, and sheer determination assisted in revitalizing Muskegon, Michigan --still a town with difficulties, but with a splendid public library. Deborah Fallows has noted that libraries are where you need to go if you want to find out what is really happening in a town, and that "libraries often are the first institutions that take action to fill gaps in a community." I've seen this truth with my own eyes in a wide variety of locations. The three locations of the Hamden Public Library (CT) are centers of the whole community, and alive with activity.

What about academic libraries? As Joshua Kim noted, "Klinenberg knows that academic libraries are under a quiet attack from every quarter of academia." I suspect that the old saw that "we don't need libraries now that we have the internet (or Amazon)" has finally begun to wear out, but there are many challenges to academic libraries, not just budgets.

In reality, libraries have made their services and technologies so seamless and transparent that users are often unaware that libraries have contributed to their success. I have heard some of my own faculty say, "I don't use the library; I get everything I need on the internet," blissfully unaware that they are finding things exactly because librarians did a lot of work behind the veil, as it were, to make sure that our users have access to expensive resources.

Then there's another old saw: "Students don't read books anymore." Funny --I remember faculty complaining about the growing lack of student reading in the early 1970s. I suspect 'twas ever thus, though without doubt the advent of digital distractions has cut into paper book circulation. But if public library circulation is up, and printed book sales are up, where are all those books going? Pew Research surveys have shown that young people are more likely to use libraries and borrow books than many of their elders --but not college students? Huh?

On a university campus, what I see is this: students use the library's physical space both to stay on task and to do so in the proximity of their friends. It is social capital, in Klinenberg's sense, but indirect. The good students want to keep up with the other good students: they do that through their studies, and many of them do that in the library, regardless of how they interact (or don't) with library resources and services. The library also provides access to workstations and printers that some students lack: it is a relative leveller in an academic society that, like the wider society, continues to stratify.

The question is: do "good students" use the library, or do they become (and remain) "good students" because they use the library? What does "using" the library mean --space, context, resources, services, technology, or some personal amalgam of all of the above? And what is a "good student" anyway? Is that just a proxy for academic high achievers, or can a student be "good" in a way not wholly described by grades? What about the student whose mind or heart is actually changed by something she learns through study, regardless of how that is measured in cumulative academic assessment? (Is there any way even to know this?) Library spaces need not be "palaces" --a slightly misleading metaphor, though well-intentioned. But the spaces need to be adequate. Not every campus can afford a library space that wins architectural awards, and not every campus should.

I think that an emphasis upon cumulative academic assessment --GPA, etc.--and upon the question of how much of it is influenced by library services, resources, and spaces, is too narrow. What an academic library might best provide is a space to be in process, to be different, really to grow or learn, to project or experiment with new ways for a student to be in the world, in the future. This is almost impossible to measure, and so it falls by the wayside of an academic culture increasingly dominated by numerical assessments, evidence-based practice, and "closing the circle" from assessment to improvement --"excellence and innovation in teaching," as my institution puts it. Good academic library spaces can offer (in an economic metaphor) affordances for what such assessment can measure, but so much more.

The digital library space can offer something similar --diluted or strengthened--to distance learners as well. I take Joshua Kim's reproof that online education is much more than MOOCs. At my library we work very hard to make sure that our services to online students match what we do for on-ground students, in both quality and resources. Some of our students taking online courses are in fact enrolled in on-ground university curricula. We have done online consultation sessions with students who were physically located upstairs or in neighboring buildings. It is a challenge and a growth point to figure out how to be a library fully in the digital space --a conundrum, because these days no one is in the physical library space without interacting with the digital. No one is using a paper-only library on a university campus these days.

Campus social infrastructure is evolving, off campus as well as on. Different kinds of spaces on campus (maker-spaces, for example) are thriving and fostering social capital in a manner not quite seen before --similar to what has gone on, but also different, and better (I hope!). I agree with Kim, Klinenberg, and Barbara Fister that libraries are an essential social infrastructure. The question remains whether those who control an institution's fiscal resources will agree. One of the principal weaknesses of academic libraries lies in that dissonance: those who make the most important decisions about the level of funding for an academic library never actually use the academic library themselves. When did you last see a President or Provost in the stacks, or reading, or even interacting with others in the building? What senior leadership sees and what users see can be quite different.


Two recent articles or reports, published completely separately but oddly complementary, give shape to the ominous information landscape of today, so hostile to expertise and alien to nuance. The first, published in Nature, is "Information Gerrymandering and Undemocratic Decisions," by Alexander J. Stewart et al.; the other (.pdf) is Source Hacking: Media Manipulation in Practice, by Joan Donovan and Brian Friedberg, published by the digital think tank Data & Society, founded by danah boyd (lower case). Donovan and Friedberg have roles in the Technology and Social Change Research Project of the Shorenstein Center at the Harvard Kennedy School of Government.

"Information Gerrymandering" reports the results of an experiment in which people were recruited to play a voting game, involving 2,500 participants and 120 iterations. The game divided participants into two platforms, purple or yellow, and the goal was to win the most votes (first past the post). Would-be winners had to convince others to join their party; in the event of a deadlock, both parties lose. The authors write, "a party is most effective when it influences the largest possible number of people just enough to flip their votes, without wasting influence on those who are already convinced." When willingness to compromise is unevenly distributed, the side with more zealots, who on principle oppose any compromise, has an advantage. When both sides use such a zealous strategy, however, deadlock results and both sides lose.

To seed the game the authors added influencers, whom they dubbed "zealous bots," to argue against compromise and persuade others to agree with them. They ran the test in Europe and America (whether purple or yellow performed better), and then ran similar analyses on UK and USA legislative bodies. They write,

[O]ur study on the voter game highlights how sensitive collective decisions are to information gerrymandering on an influence network, how easily gerrymandering can arise in realistic networks and how widespread it is in real-world networks of political discourse and legislative process. Our analysis provides a new perspective and a quantitative measure to study public discourse and collective decisions across diverse contexts. . . .

Symmetric influence assortment allows for democratic outcomes, in which the expected vote share of a party is equal to its representation among voters; and low influence assortment allows decisions to be reached with broad consensus despite different partisan goals. A party that increases its own influence assortment relative to that of the other party by coordination, strategic use of bots or encouraging a zero-sum worldview benefits from information gerrymandering and wins a disproportionate share of the vote—that is, an undemocratic outcome. However, other parties are then incentivized to increase their own influence assortment, which leaves everyone trapped in deadlock.

Information Gerrymandering and Undemocratic Decisions, p. 120

This is oddly synchronous with current events (August-September 2019), which seem turbo-charged to attract attention and conflict, to deflect persuasion, and to obfuscate any nuance. Zealotry is a strategy to maximize attention and conflict, and to discourage the nuance that makes compromise and persuasion possible. Those who shout the loudest get the most attention. Zealous bots, indeed!
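The basic dynamic of the voter game can be sketched in a few lines of code. The following is my own much-simplified toy model, not the protocol Stewart et al. actually used: ordinary agents flip toward the local majority they observe among a random handful of peers, while zealots never flip. The function name and parameters are illustrative inventions.

```python
import random

def simulate_voter_game(n_agents=24, purple_zealots=4, yellow_zealots=0,
                        rounds=400, seed=0):
    """Toy two-party voter game: ordinary agents drift toward the local
    majority they observe; zealots never compromise. (A simplified
    sketch, not the experimental protocol of Stewart et al.)"""
    rng = random.Random(seed)
    half = n_agents // 2
    votes = [1] * half + [0] * (n_agents - half)   # 1 = purple, 0 = yellow
    zealot = ([True] * purple_zealots + [False] * (half - purple_zealots)
              + [True] * yellow_zealots
              + [False] * (n_agents - half - yellow_zealots))
    for _ in range(rounds):
        i = rng.randrange(n_agents)
        if zealot[i]:
            continue                               # zealots never flip
        peers = rng.sample(range(n_agents), 6)     # a random "influence network"
        local_purple = sum(votes[j] for j in peers) / len(peers)
        if local_purple > 0.5:
            votes[i] = 1
        elif local_purple < 0.5:
            votes[i] = 0                           # a tie leaves the vote unchanged
    return sum(votes) / n_agents                   # final purple vote share
```

With zealots on one side only (the defaults above), repeated runs tend to drift the purple share above the even 50/50 start, a caricature of the paper's "undemocratic outcome"; give both sides equal numbers of zealots and the population tends to lock near the split, the deadlock the authors describe.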

That's where the second article comes in, Source Hacking. Zealots can now use online manipulation in very specific ways with extremely fine-grained methods on very narrow slices of online attention or "eyes." Donovan and Friedberg call this "source hacking," a set of techniques for hiding the sources of misleading or false information, in order to circulate it widely in "mainstream" media. These techniques or tactics are:

  • Viral sloganeering, repackaging extremist talking points for social media and broadcast media amplification;
  • Leak forgery, creating a spectacle by sharing false or counterfeit documents;
  • Evidence collages, consisting of misinformation from multiple sources that is easily shareable, often as images (hence collages);
  • Keyword squatting, strategic domination of keywords via manipulation and "sock-puppet" false-identity accounts, in order to misrepresent the behavior of disfavored groups or opponents.

The authors ask journalists and media figures to understand how viral slogans work ("jobs not mobs" was a test case), and to recognize their own role in inadvertently assisting covertly planned campaigns by extremists to popularize slogans already frequently shared in highly polarized online communities, such as Reddit groups or 4chan boards. "Zealous bots" indeed!

Taken together, these two articles vividly delineate how zealots can take over information exchanges and trim the "boundaries" of discourse (gerrymander them) to depress any and all persuasion, nuance, or complexity. These zealots use very precise tactics: viral sloganeering, leaking forged documents, creating collages of false or highly misleading evidence pasted together from bits of truth, and dominating certain keywords (squatting) so as to manipulate algorithms and engage in distortion, blaming, and threats. Such communication reaches a "tipping point" (a phrase used by Claire Wardle of First Draft News in 2017) at which misinformation and misrepresentation overwhelm any accurate representation, nuanced discussion, persuasion, or meaningful exchange.

Those who wanted to "move fast and break things" have certainly succeeded, and it remains to be seen whether anything can remain whole in their wake, outside of communities of gift (scholarly) exchange explicitly dedicated to truth and discernment. Libraries have to house, encourage, foment, and articulate those values and communities --hardly a value-free librarianship, and one that sometimes risks tolerating unjust power relationships because the alternatives are even worse.

The ultimate question for a responsible man to ask is not how he is to extricate himself heroically from the affair, but how the coming generation is to live! It is only from this question, with the responsibility towards history, that fruitful solutions can come, even if for the time being they are very humiliating.

Dietrich Bonhoeffer, "After Ten Years," 1943, translated and published in Letters and Papers from Prison

(And no, that is not a nod to a certain court evangelical who pretends to understand Bonhoeffer, but who can't speak a word of German, and is simply a shoddy scholar.)

Milano - Castello sforzesco - Michelangelo, Pietà Rondanini (1564)
Foto Giovanni Dall'Orto, 6-jan-2006 - 05. Source: Wikimedia; CC-BY-2.0

I've been mulling Arthur C. Brooks's article in The Atlantic, "Your Professional Decline Is Coming (Much) Sooner Than You Think" -- cheerily subtitled "Here's how to make the most of it."

My first thought was: this article has an important core, but somehow was badly edited. The title seems to have been made up by someone for whom such a prospect seems distant. The subtitle is straight-up advice-column mush. Worse, Brooks seems to have buried the lede, something he does not do elsewhere in his writing, and which here is surely not a sign of his alleged decline. I cannot avoid wondering whether The Atlantic's editors were nervous about this article, and whether it touched some raw nerve, perhaps in an editor closer to Brooks's age than the one who titled the article.

Brooks's lede is important, and worth reading and emphasis. In the course of the article he recounts the stories of Charles Darwin and Johann Sebastian Bach, and concludes, "Be Johann Sebastian Bach, not Charles Darwin." How does one do that?

Well, not by working for a think tank, no matter how distinguished. Some of Brooks's nervousness seems to me to be a product of the Massachusetts Avenue hothouse in DC: the American Enterprise Institute is right next door to the Carnegie Endowment for International Peace and the Brookings Institution, and across the street from the Johns Hopkins University School of Advanced International Studies and the Peterson Institute for International Economics. (I regret slighting other worthy organizations on those blocks.) Just a little competitive, no? And from this perch is one supposed not to feel old past 45?

Maybe because I've always worked in higher education, I've grown used these past decades to being an old person amidst the young --since I was 28, in fact. I have known so many young people, and I like them, work well with them, and certainly am not threatened by them. I see their youth and ingenuity up close well enough to know that being young, "before professional decline," can be a great, good thing --and not, much of the time. Especially now, in the age of locked-down anxiety.

Years ago a wonderful Benedictine monk, the Rev. Fr. Gabriel Coless, who has the serene, very long-now outlook of his order, taught me a lasting lesson. Coless was reflecting on a common experience: a mannerly University building operations worker was reviving a recalcitrant air conditioner on a warm September day in a seminar room, while several students gossiped about a particularly idiotic, recent scandal involving a certain professor of theology, and the spouse of another one. Just after the uniformed, sweaty, good-natured mechanic left the room, Coless commented, "the order of practical wisdom and the order of academic intelligence have nothing to do with each other." The mechanic knew nothing of Scholastic theology, but treated people well and was intensely loyal to a wife with a long-term chronic illness. His practical wisdom outshone any of the supposed academic brilliance reposing in that other asinine, arrogant professor. Young people, no less than their elders, can confuse academic brilliance with practical intelligence, and one suspects nowhere so much as in Washington think tanks.

Brooks draws on British psychologist Raymond Cattell's distinction between fluid and crystallized intelligence, and finds that fluid intelligence has special valence for young people. He's right, but sparkling fluid intelligence is not much use to a considerable majority of young people, who either don't have much of it to begin with, or must cope with circumstances very different from those of the denizens of think tanks and the academy. Brooks's example, perhaps drawn from Cattell: poets done with half their creative output by age 40, on account of the waning of their fluid intelligence? If the other half is done by, let's say, age 80, that's a lengthening and changing of creativity, but hardly its senescence. Just lately I've been reading the late W.S. Merwin, who produced such amazing work in his 80s--exactly because he was setting his mental and creative habits when he was in his 20s and 30s. His later work was nothing he might have imagined fifty years before, but he could not have done it without his earlier work. Perhaps I am merely cherry-picking a contrary example, but I believe that there are many others as well. Marilynne Robinson's Gilead, published in 2004, won the Pulitzer Prize for Fiction a year later--when she was 62-63, and twenty years after her earlier well-received novel, Housekeeping. Her writing took a long time to crystallize. Does that mean she lost her fluidity of expression and imagination, or merely removed the incidental to reveal the essential, sculpted away the unnecessary stone to reveal the true figure?

That is Brooks's most arresting and poignant metaphor --shaping his life by subtracting what is extraneous.

What I need to do, in effect, is stop seeing my life as a canvas to fill, and start seeing it more as a block of marble to chip away at and shape something out of. I need a reverse bucket list. My goal for each year of the rest of my life should be to throw out things, obligations, and relationships until I can clearly see my refined self in its best form.

Printed version, July 2019, page 73

After listening to the wisdom of the Hindu sage Acharya ("Teacher"), Brooks makes four specific commitments:

  • Jump (leaving his current status and prestige)
  • Serve (fully sharing ideas in the service of others, primarily by teaching at a university --more about that in a moment)
  • Worship (refreshing the soul, pursuing one's spiritual heritage, shaping work itself as a transcendental commitment)
  • Connect --becoming more conscious of the roots that bind us together, each to each (as aspen trees).

Brooks avoided E.M. Forster's "Only connect" --by now maybe trite, but still just resistant enough to mere sentimentality.

These worthy commitments and important insights were unfortunately buried under a load of repetitive citations from social science and real editorial nervousness. I wish Brooks had started his article with his account of his earlier, unhappy career as a professional French hornist. That story leads directly to Brooks's specific commitments, and foreshadows his later encounter with a famous, bitter old man on a plane. Then distill the social science before the conclusion.

That no one can maintain peak professional performance indefinitely is no news, however many people (especially men, but sometimes women) attempt it. How many failed intimate relationships are the collateral damage of such fantasies! That intelligence and imagination change as one ages is also no real news.

The poignant force of Brooks's piece is that he realizes all this from his vantage point in elite Washington, and is willing to step away before others might wish he had. How different are the current, comparative cases of Associate Justices Ruth Bader Ginsburg and Clarence Thomas --the latter seems simply to have soured in self-imposed isolation beyond what even many conservatives can stomach, while the former must watch many of her life's deep commitments come under assault every term. Her grace and wit outshine his embittered silence.

Brooks apparently desires to teach in a university, and I wish him well. I doubt that he will have to get sucked into the machinery of academic life --advising, committee work, and empty sloganeering from so-called thought-leaders (much less the poisonous atmosphere of freedom-versus-safety controversies). I've certainly seen enough academic colleagues simply rot on the vine (in some cases with alcohol), and in other cases turn so rancid that their colleagues dearly desire their departure under any circumstances whatsoever. I've also watched faculty in their later years connect with students in a manner that changes lives (the students', and sometimes the teacher's). I hope the latter for Brooks, knowing that such connection is forged in a lifetime of experience, some of it unhappy, and in thinking and re-thinking about what is really important.

May the sculpture of his life reveal a strength and liveliness that would be lost in a think-tank, and may his students rise and bless his memory decades later. Only connect.


A library is a good place to soften solitude; a place where you feel part of a conversation that has gone on for hundreds and hundreds of years even when you're all alone. The library is a whispering post. You don't need to take a book off a shelf to know there is a voice inside that is waiting to speak to you, and behind that was someone who truly believed that if he or she spoke, someone would listen. It was that affirmation that always amazed me. Even the oddest, most particular book was written with that kind of crazy courage—the writer's belief that someone would find his or her book important to read. I was struck by how precious and foolish and brave that belief is, and how necessary, and how full of hope it is to collect these books and manuscripts and preserve them. It declares that all these stories matter, and so does every effort to create something that connects us to one another, and to our past and to what is still to come. I realized that this entire time, learning about the library, I had been convincing myself that my hope to tell a long-lasting story, to create something that endured, to be alive somehow as long as someone would read my books, was what drove me on, story after story; it was my lifeline, my passion, my way to understand who I was. . . .

All the things that are wrong in the world seem conquered by a library's simple unspoken promise: Here I am, please tell me your story; here is my story, please listen.

--Susan Orlean, The Library Book, New York: Simon & Schuster, 2018, pp. 309-310

--Whispering post? See this article from Rochford, Essex, England. Rochford is approximately 70km east of central London.

A longer perspective of religious traditions can offer significant insights into what seems to have become a cultural cul-de-sac of social media.


Iona Abbey, Easter 2013

Cal Newport's Digital Minimalism (see previous long entry) has led him to "stumble across this growing tension between social media and religion" which he details in his April 8 post. The real issue is not the existence of social media itself, but the manner in which it fragments and dissipates focus and attention --and attention is a common element to religious practices in many streams of faith or traditions, both Eastern and Western.

I noted previously that Newport pays attention to figures as diverse as Aristotle, Thoreau, and Abraham Lincoln, and acknowledges the depths of the discussion of focus and human flourishing, "but he avoids getting pulled off task." Well and good --but the relation between focus, attention, and human flourishing (becoming a better human being) is a very old concern with a long, long literature (to say the least!). Newport's book is not primarily addressed to those in any faith community or tradition. Those who are in such a community, however, cannot fail to pick up the resonances of this discussion.

One might first point out that social media is hardly the first significant distraction to come around, and the idea that "social media might be accidentally undermining religion" needs perspective. Religious practices, whether or not reified into something that contemporary Westerners understand as a "religion," have been around a very long time. Social media is not yet quite 20 years old ("The Facebook" launched in February 2004; Friendster in 2002). The view that social media can or will quickly and significantly undermine practices going back thousands of years could be superficial. Intellectual ideas about what a religion is have changed, and practices have changed; but those practices have been severely challenged before and somehow persisted. Will runaway social media usage really accomplish what centuries of pogroms, persecutions, and cultural coopting have failed to do? Richard Dawkins and Sam Harris should be so lucky.

This observation does not dismiss Newport's question. That social media do in fact seek to capture, control, fragment, and dissipate users' attention is a feature, not a bug. They do that by design, as Newport showed and skewered brilliantly. This is why the great Zuck's "regrets" about fake news and hate speech ring so hollow: those who write and spread fake news and engage in hateful communication are using social media exactly as intended. Social media was meant for everyone to share, and do the trolls share! The idea that the platform does not foster the content is ideological sleight of hand, a fib to deny culpability. Much of the content of social media is nevertheless utter poison, and it might be even more threatening to many religious traditions than the fracturing of attention itself.

A longer perspective can offer significant insights into what seems to have become a cultural cul-de-sac.

Decades ago I was privileged to study with Diogenes Allen at Princeton Theological Seminary when he was in a very productive period of his life. An Oxford (Rhodes) scholar, he was a searching and provocative reader of texts classic and modern, and three books in particular have stuck with me: Kierkegaard's Sickness Unto Death and Works of Love, and Simone Weil's Forms of the Implicit Love of God (and her related "Reflections on the Right Use of School Studies with a View to the Love of God"). These are my sources for the following reflections.

The fragmentation of attention was brilliantly and lucidly addressed by Kierkegaard in 1849 in Sygdommen til Døden (The Sickness Unto Death). SK was deeply enamoured of the Danish language, and thoroughly embedded in Danish history: the events of the unsuccessful revolution of 1848 in Copenhagen "have been of world-historical significance and have overturned everything . . . . Every system has been exploded. In the course of a couple of months, the past has been ripped away from the present with such passion that it seems like a generation has gone by." (From The Point of View for My Work as an Author, written in 1848 and published in 1859, also below.) In other words, a world not unlike our own in many ways.

But when "the threads of intelligence broke," and when "everyone who has, in various ways, been a spokesman in the past has been reduced either to silence or to the embarrassment of being forced to purchase a brand-new suit of clothing," a new way forward could break open, and SK searched for language and forms of life for that new world. He found it in his word for despair, fortvivlelse, a for- intensification of tvivl (doubt or question); for+tvivlelse is both mega-doubt and a raising of the stakes: meta-doubt. In despair, such meta-doubt is to be of two minds: not only split apart from one's self, but split apart from God. An anti-Hegelian, SK saw an irresoluble polarity of temporal/eternal, freedom/necessity, consciousness/unconsciousness in human experience, which is fundamentally an experience of divided-ness or fragmentation.

SK rings the changes on the forms of fortvivlelse (despair) through these polarities. All humans experience despair or fragmentation; the depth of despair is to live unaware that one is in despair: the fragmentation that admits no fragmentation. This condition is universal, and the only way out of it is for the self to live "by relating itself to its own self, and by willing to be itself the self is grounded transparently in the Power which posited it" (English transl. by Lowrie, p. 147). To be "grounded transparently" is to do the works of love, which is the Summary of the Law.

The person who truly admits, holds on to, and focuses this despair (fortvivlelse) will open himself or herself to the Reality of the world (or creation) in its beauty and order, in service to others, in respect for practices that focus this attention, and in friendship. These are, in Simone Weil's terms, the forms of the implicit love of God. God (the Power which posits us) seems absent, and we can only love (or focus upon) God through the forms of Reality by which God's presence is implicit and mediated. The "right use of school studies" develops a lower kind of attention, which is "extremely effective in increasing the power of attention that will be available at the time of prayer, on the condition that [studies] are carried out with a view to this purpose and this purpose alone." Weil writes, "the key to a Christian concept of studies is the realization that prayer consists of attention." (Waiting for God, p. 105)

When I read Newport's Deep Work three years ago, I was very much struck by the sense in which deep work could be extremely effective training for the power of concentration and attention. (Deep work is "professional activities performed in a state of distraction-free concentration that push your cognitive capabilities to their limits. These efforts create new value, improve your skill, and are hard to replicate.") Deep work rightly undertaken can train a form of energy (a habitus) for prayer and focus upon the "meta:" Reality, God, Emptiness, Submission, Torah, among other names in varied traditions. This is not really the "deep work" that Newport meant; he was writing at a different level for a different audience. But the resonance is unavoidable for those who have ears to hear.

As a librarian, my work is collaborative and can sometimes feel trapped in the shallows, not the depths. My writing and focused reading helps me to return to the depths. Library service, at its best, is a soul-craft that provides a glimpse of the sacred trust of learners, teachers, and traditions as they undergo necessary and unavoidable change. That glimpse does not substitute depth for mere metrics of productivity. A librarian's deep work is not necessarily solitary, but does require clear boundaries for time with colleagues and time alone.

This sounds high-flown but it has direct, practical implications. During "crunch time," students do in fact limit digital social media. Project Information Literacy learned in 2011 (.pdf) that "students use a 'less is more' approach to manage and control all of the IT devices and information systems available to them while they are in the library during the final weeks of the term." Further study (.pdf) has shown that they use a "hybrid approach to conducting research and finding information." While this study needs to be updated --students in 2011 had a different experience of social media on smartphones than do students in 2019-- the PIL study suggests that at least the seeds are present for a successful critique and debunking of the casual "the more sharing, the better" slogan, beginning with high-stress times like the close of the semester.

Shifting focus away from the forms of despair that feast upon distraction, and towards purposeful existence, is always going to be a tough sell for many. As Annie Savoy says about Ebby Calvin "Nuke" LaLoosh in Bull Durham, "The world is made for people who aren't cursed with self-awareness." By contrast, I know three young individuals who voluntarily went off social media this past year, because they wished to give serious attention to learning surgical nursing, biochemistry, and vocal performance (music). In the words of one, "enough of the self-created drama." If that insight is possible for a few, it might be possible for just enough at a tipping point.

To return: does social media in fact undermine religion?

If by "religion" one means what one sees in an ordinary church, synagogue, or mosque on an ordinary sabbath, then on a superficial level the answer is yes. The great mass of those who lead lives of quiet desperation may never become aware of their own fragmentation or despair --and social media is intended to fragment, to snuff out any intimations that something is not right (almost in the sense of choosing the blue "normal" pill in The Matrix, further reminiscent of Alice in Wonderland's choice of potions).

If by "religion" one goes further not only into self-reflection and self-awareness, but into active contemplation of the order of the universe, then the answer is probably no. In that perspective, social media shrinks to a mere mosquito-buzz level of irritation, or worse, a poisoned well. Despair is a general condition of the human race, neither created nor extended by mere social media. But social media can be its tool.

Social media as a channel of despair will admit no alternative. It becomes reality; reality's fragmentation is both its content and its platform. No wonder the Silicon Valley elite will not allow their own children near it. Only true disruption (not mere "innovative disruption") will reveal fragmentation and despair for what it is --and disruption, "moving fast and breaking things" is what social media was originally all about. In the polarity of despair, its infinitude of disruption has become a finitude of being shared: a world in every way Zucked-up.

It wasn't that time stopped in the library. It was as if it were captured here, collected here, and in all libraries--and not only my time, my life, but all human time as well. In the library, time is dammed up--not just stopped but saved. The library is a gathering pool of narratives and of the people who come to find them. It is where we can glimpse immortality; in the library, we can live forever.

--Susan Orlean, The Library Book, pages 11-12

Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.

After doubling down, Clay Christensen has tripled down. This is a familiar ploy: if you say something doubtful, then repetition will make it come true. In the words of one critic, Christensen is to business what Malcolm Gladwell is to sociology.

Christensen and Michael B. Horn, the former the apostle of disruptive innovation and the latter his St. Timothy, recently repeated their claim that 50% of colleges would fail in the next decade, or 10-15 years. Their goal line has been reasonably consistent: in 2011 it was "as many as half" (we're 8 years in); in 2013 it was 25 percent in the next 10 to 15 years (6 years in); in 2017 "as many as half" within a decade (2 years in). In 2019, Christensen and Horn write that "some college and university presidents . . . tell us in public and private settings that they think the 50 percent failure prediction is conservative -- that is, the number of failures will be far higher." Names? Places? And does executives' saying so make it true (is this presumption of their predictive competence warranted)?

But the reasons for this prediction keep changing, and there's the ploy: save the effect but change the cause. One is reminded of the late Sydney Brenner's Occam's Broom: "sweep under the carpet what you must to leave your hypotheses consistent." The reason for such precipitous closures, foreseen (in 2011) for the period 2021-2026, was disruption due to innovative technologies: the subtitle of The Innovative University is Changing the DNA of Higher Education from the Inside Out. The main idea is that "in the DNA" of American colleges and universities is a desire to be like Harvard: wealthy and comprehensive. By contrast, Brigham Young University/Idaho exhibits a completely different, innovative strand of DNA: "in how it serves students by a combination of distance learning, on-site learning, and lower-cost alternatives to residential college" (quoting my review from November 2012).

Implicit in the metaphor of DNA is a certain determinism: you cannot change your own DNA, after all (without extremely powerful, still-developing technologies that occasion many moral questions). Your DNA determines, in this view, what you will be: such "Harvard" DNA will be a fatal flaw in many colleges and universities, according to Christensen and Eyring. Can you really change your DNA from the inside out? It's a clumsy metaphor: no wonder Christensen has abandoned it.

That 2011 book also quite ignored several inconvenient facts about BYU/Idaho. Mormons get a considerable price break there that "Gentiles" do not receive --pointing obviously to a hefty subvention from LDS sources. BYU/Idaho is, after all, a satellite of a powerful, wealthy, comprehensive mother ship in Provo, Utah: the satellite campus can hardly be a synecdoche, as though what is true for the part were true for the whole. The economics of BYU/Idaho and the considerable technological subsidy its online instruction receives from the mother system are simply left out. Apparently Occam's Broom works well in Idaho and Utah.

Is Christensen's central claim even true, that the DNA of American colleges and universities propels them to desire to become Harvard? What about those institutions that could have become Harvard (or Michigan), and chose a different path? Did a liberal arts college that chose to remain a liberal arts college necessarily thereby fail its DNA? Ask too many questions, and the whole edifice collapses.

Christensen's account and predictions rely on a very superficial knowledge of the history of higher education. That lack of knowledge allows him to claim that nothing has really changed in American higher education in 150 years. How about women's education (one of Christensen's real blind spots)? How about public community colleges? A comparison: religious groups in the Abrahamic traditions often work by one person with some kind of authority meeting and talking with others who are to be instructed. Isn't the real point what they say --words with profound differences hidden behind a reductive similarity of communication? Closer to home, James McCosh (of Princeton) and Clayton Christensen (of Harvard Business School) both stand or stood in front of others and talked: is that really the sum of development between them? Again: Occam's Broom.

In their 2019 opinion piece, Christensen and Horn move the goal-posts: now the cause of institutional failure will be a changing business model driven by implicitly disruptive technologies. Does anyone remember the educational TV boom in the 1960s? That also was a changing business model. Of course business models are changing, and have changed in the past: again, consult a deeper understanding of American higher education. All current debates about business models, missions, curricula, and the needs of students have a very long history; see, for example, Crisis on the Campus by Michael Harrington (ca. 1951), or the famous General Education in a Free Society (1945). Students have never been mere passive recipients (although the contemporary view of them as "consumers" drives them to passivity). From the colonial and denominational colleges to the land-grant colleges and public universities: the business models have evolved. The business model in higher education is changing as we speak.

In other words, Christensen and Horn present the same tired and superficial nostrums of ten years ago. Even the historical examples in his ur-text, The Innovator's Dilemma (1997), are questionable. Predicting the apocalypse is an old business.

I have argued elsewhere that the fearsome language occasioned by "disruptive technologies" has origins in the pre-Millennial Restorationist theologies of early 19th-century frontier America, especially the Burnt-Over District of upstate New York, showcased pre-eminently by Joseph Smith, Jr. There's a reason that his Latter-day Saints are latter. Christensen was formed in a community that stands by Smith's proclamations. I do not pretend that he has smuggled theology into business, but rather that there is an elective affinity between such mainstream LDS thinking and business disruption: Joseph Smith, Jr. was supposed to put the rest of Christianity out of business (the "great apostasy" from the 1st century to 1830). How has that worked out for the Latter-day Saints? (Never mind the very current cosmetic name changes.)

I continue to wonder whether discussions of disruptive innovation in higher education are in fact a cloak for expediting other changes, less technological but no less disruptive, initiated by senior academic leadership. To be a disciple of "disruptive innovation" means you're a member of that club. Is this called group think? Has it served GE well?

So will somewhere around 50% of American colleges and universities fail in the next decade? This might be the case, but not for Christensen's and Horn's (and Eyring's) reasons. In their recent post in Inside Higher Ed, they cite situations in New England. I live in Connecticut: this is daily reality for me. Demographics are shifting: colleges and universities in the northeast quadrant of the continental US are going to have a hard time on that basis alone. Some have already closed, and more probably will --but not because of changing business models driven by disruptive technologies. Demographic change exerts a constant pressure not unlike climate change. The real question is: who can adapt, and how well? The population probably will not achieve replacement rate, even in the southwest.

American higher education may be entering a "perfect storm" of demographic change, economic turmoil, and moral and cultural drift or outright corruption (e.g. the recent admissions scandals). All of this is cause for deep concern; none of it depends upon the snake-oil of "innovative disruption" via technologies that power changing business models. For many institutions, strategies of differentiation based upon price point, purpose, and location will matter a great deal. Strategies based on pure number crunching accompanied by credulous faith in technologies will probably not work. Online education is here to stay, but it will not disrupt on-ground education as much as non-technological demographic trends will. This is not the stuff of disruption, but of long-term anticipated and unanticipated consequences of historical change, about which Harvard Business School professors have no more particular expertise than anyone else. When disruptive innovation gets dumbed down, it isn't disruptive anymore, but just change.

Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.