
Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.

After doubling down, Clay Christensen has tripled down. This is a familiar ploy: if you say something doubtful, then repetition will make it come true. In the words of one critic, Christensen is to business what Malcolm Gladwell is to sociology.

Christensen and Michael B. Horn, the former the apostle of disruptive innovation and the latter his St. Timothy, recently repeated their claim that 50% of colleges will fail in the next decade, or 10-15 years. Their goal line has been reasonably consistent: in 2011 it was "as many as half" (we're 8 years in); in 2013 it was 25 percent in the next 10 to 15 years (6 years in); in 2017, "as many as half" within a decade (2 years in); and now again in 2019. Christensen and Horn write that "some college and university presidents . . . tell us in public and private settings that they think the 50 percent failure prediction is conservative -- that is, the number of failures will be far higher." Names? Places? And does executives saying so make it so (is this presumption of their predictive competence warranted)?

But the reasons for this prediction keep changing, and there's the ploy: save the effect but change the cause. One is reminded of the late Sydney Brenner's Occam's Broom: "sweep under the carpet what you must to leave your hypotheses consistent." The reason for such precipitous closures, foreseen (in 2011) for the period 2021-2026, was disruption by innovative technologies: the subtitle of The Innovative University is Changing the DNA of Higher Education from the Inside Out. The main idea is that "in the DNA" of American colleges and universities is a desire to be like Harvard: wealthy and comprehensive. By contrast Brigham Young University/Idaho exhibits a completely different, innovative strand of DNA "in how it serves students by a combination of distance learning, on-site learning, and lower-cost alternatives to residential college" (quoting my review from November 2012).

Implicit in the metaphor of DNA is a certain determinism: you cannot change your own DNA, after all (without extremely powerful, still-developing technologies that occasion many moral questions). Your DNA determines, in this view, what you will be: such "Harvard" DNA will be a fatal flaw in many colleges and universities, according to Christensen and Eyring. Can you really change your DNA from the inside out? It's a clumsy metaphor: no wonder Christensen has abandoned it.

That 2011 book also quite ignored several inconvenient facts about BYU/Idaho. Mormons get a considerable price break there that "Gentiles" do not receive --pointing obviously to a hefty subvention from LDS sources. BYU/Idaho is, after all, a satellite of a powerful, wealthy, comprehensive mother ship in Provo, Utah: the satellite campus can hardly be a synecdoche, as though what is true of the part were true of the whole. The economics of BYU/Idaho and the considerable technological subsidy its online instruction receives from the mother system are simply left out. Apparently Occam's Broom works well in Idaho and Utah.

Is Christensen's central claim true, that the DNA of American colleges and universities propels them to desire to become Harvard? What about those institutions that could have become Harvard (or Michigan), and chose a different path? Did a liberal arts college that chose to remain a liberal arts college necessarily thereby betray its DNA? Ask too many questions, and the whole edifice collapses.

Christensen's account and predictions rely on a very superficial knowledge of the history of higher education. That lack of knowledge allows him to claim that nothing has really changed in American higher education in 150 years. How about women's education (one of Christensen's real blind spots)? How about public community colleges? A comparison: religious groups in the Abrahamic traditions often work by one person with some kind of authority meeting and talking with others who are to be instructed. Isn't the real point what they say? Words with profound differences mask a reductive similarity of communication. Closer to home, James McCosh (of Princeton) and Clayton Christensen (of Harvard Business School) both stand or stood in front of others and talked: is that really the sum of development between them? Again: Occam's Broom.

In their 2019 opinion, Christensen and Horn change the goal-posts: now the cause of institutional failure will be changing business models driven by implicitly disruptive technologies. Does anyone remember the educational TV boom in the 1960s? That too was a changing business model. Of course business models are changing, and have changed in the past: again, consult a deeper understanding of American higher education. All current debates about business models, missions, curricula, and the needs of students have a very long history; see, for example, Michael Harrington's Crisis on the Campus (ca. 1951) or the famous General Education in a Free Society (1945). Students have never been mere passive recipients (although the contemporary view of them as "consumers" drives them to passivity). From the colonial and denominational colleges to the land-grant colleges and public universities: the business models have evolved. The business model in higher education is changing as we speak.

In other words, Christensen and Horn present the same tired and superficial nostrums of ten years ago. Even the historical examples in Christensen's ur-text, The Innovator's Dilemma (1997), are questionable. Predicting the apocalypse is an old business.

I have argued elsewhere that the fearsome language occasioned by "disruptive technologies" has origins in the pre-Millennial Restorationist theologies of early 19th-century frontier America, especially the Burnt-Over District of upstate New York, showcased pre-eminently by Joseph Smith, Jr. There's a reason that his Latter-Day Saints are latter. Christensen was formed in a community that stands by Smith's proclamations. I do not pretend that he has smuggled theology into business, but rather that there is an elective affinity between such mainstream LDS thinking and business disruption: Joseph Smith, Jr. was supposed to put the rest of Christianity out of business (the "great apostasy" from the 1st century to 1830). How has that worked out for the Latter-Day Saints? (And never mind the very current cosmetic name changes.)

I continue to wonder whether discussions of disruptive innovation in higher education are in fact a cloak for expediting other changes, less technological but no less disruptive, initiated by senior academic leadership. To be a disciple of "disruptive innovation" means you're a member of that club. Is this called groupthink? Has it served GE well?

So will somewhere around 50% of American colleges and universities fail in the next decade? This might be the case, but not for Christensen's and Horn's (and Eyring's) reasons. In their recent post in Inside Higher Ed, they cite situations in New England. I live in Connecticut: this is daily reality for me. Demographics are shifting: colleges and universities in the northeast quadrant of the continental US are going to have a hard time on that basis alone. Some have already closed, and more probably will --but not because of changing business models driven by disruptive technologies. Demographic change exerts a constant pressure not unlike climate change. The real question is: who can adapt, and how well? The population probably will not achieve replacement rate, even in the southwest.

American higher education may be entering a "perfect storm" of demographic change, economic turmoil, and moral and cultural drift or outright corruption (e.g. the recent admissions scandals). All of this is cause for deep concern; none of it depends upon the snake-oil of "innovative disruption" via technologies that power changing business models. For many institutions, strategies of differentiation based upon price point, purpose, and location will matter a great deal. Strategies based on pure number crunching accompanied by credulous faith in technologies will probably not work. Online education is here to stay, but it will not disrupt on-ground education as much as non-technological demographic trends will. This is not the stuff of disruption, but of long-term anticipated and unanticipated consequences of historical change, about which Harvard Business School professors have no more particular expertise than anyone else. When disruptive innovation gets dumbed down, it isn't disruptive anymore, but just change.

Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.

Sam Anderson writes in the New York Times Magazine, Sunday, March 24, 2019:

As the English writer G.K. Chesterton once put it, in a quote I found printed in my corny old travel journal: “The whole object of travel is not to set foot on foreign land; it is at last to set foot on one’s own country as a foreign land.” After looking at a Roman stone wall topped by a Saxon stone wall topped by a medieval English wall next to a modern paved street, I began to see what a thin crust of national history the United States actually stands on. I began to realize how silly and narrow our notion of exceptionalism is — this impulse to consider ourselves somehow immune to the forces that shape the rest of the world. The environment I grew up in, with its malls and freeways, its fantasies of heroic individualism, began to seem unnatural. I started to sense how much reality exists elsewhere in the world — not just in a theoretical sense, in books and movies, but with the full urgent weight of the real. And not just in Europe but on every other continent, all the time, forever. I began to realize how much I still had to learn before I could pretend to understand anything. 

https://www.nytimes.com/interactive/2019/03/20/magazine/rick-steves-travel-world.html


The Usefulness of Useless Knowledge, by Abraham Flexner, with a companion essay by Robbert Dijkgraaf. Princeton: Princeton University Press, 2017. 93 pages. ISBN 9780691174761. Sacred Heart University Library: Q126.8 .F54, 2017

This little book is a classic --it began as an essay in Harper's in 1939, and achieved iconic status during the war because of Flexner's association with Albert Einstein and the Institute for Advanced Study in Princeton. It is still worth reading, especially now with a companion essay by Robbert Dijkgraaf, Leon Levy Professor at the Institute, a mathematical physicist and string theorist.

The entire point of the essay is that useless knowledge is not useful yet --as with all things, it is a matter of time. The period 1939-1945 was a spectacular demonstration of this truth: in 1939 an obscure article, "The Mechanism of Nuclear Fission," appeared in Physical Review on the exact day that World War II began in Europe (September 1). The previous year Alan Turing had completed his Ph.D. dissertation, Systems of Logic Based on Ordinals, at Princeton University; Institute mathematician John von Neumann wanted to hire him as a postdoctoral assistant, but Turing returned to England. By 1945 esoteric nuclear fission had resulted in two atomic bombs, and Turing's Bombe, an electromechanical machine made possible by logic based on ordinal numbers, had helped decipher the German Enigma code. In both cases useless knowledge did not remain useless very long.

Flexner freely admits the waste, and the appearance of waste, in a great deal of speculation and experiment, because exactly where the next critical insight will arise is never clear and cannot be predicted. "To be sure, we shall waste some precious dollars." It "looks prodigious. It is not really so. All the waste that could be summed up in developing the science of bacteriology is as nothing compared to the advantages which have accrued from the discoveries of Pasteur" and many others. "Science, like the Mississippi, begins in a tiny rivulet in the distant forest. Gradually other streams swell its volume. And the roaring river that bursts the dikes is formed from countless sources."

The critical factor for Flexner (and for us!) is spiritual and intellectual freedom, and a narrow market ideology can threaten it as surely as any other force. "An institution which sets free successive generations of human souls is amply justified whether or not this graduate or that makes a so-called useful contribution to human knowledge." Flexner is deeply aware that a market-driven economy can crowd out exactly what nourishes the market. In 1939 Flexner's reflection has "a peculiar poignancy. In certain large areas --Germany and Italy especially-- the effort is now being made to clamp down the freedom of the human spirit." Mutatis mutandis, we now see this spirit alive in China, Hungary, Russia, and perhaps in the utter refusal of science and truth by some in the United States. The real enemy is the person "who tries to mold the human spirit so that it will not dare to spread its wings, as its wings were once spread in Italy and Germany, as well as in Great Britain and the United States." What goes around comes around, especially fear as practiced by some politicians.

Dijkgraaf writes in the prologue, "Supporting applied and not-yet-applied research is not just smart, but a social imperative." And yet "our current research climate" is increasingly governed by imperfect "metrics" and policies:

Driven by an ever-deepening lack of funding, against a background of economic uncertainty, global political turmoil, and ever-shortening time cycles, research criteria are becoming dangerously skewed toward conservative short-term goals that may address more immediate problems but miss out on the huge advances that human imagination can bring in the long term.

Nicholas Carr made the case in 2010 (The Shallows) that as the Internet encourages rapid, distracted sampling of small bits of often unconnected information, humans are losing the capacity for reflection and concentration (a point also made abundantly in Cal Newport's 2016 Deep Work). Research skewed by current factors of money, turmoil, and the refusal of truth will miss engagement with deep questions --and remain in the shallows without even the awareness that depths exist.

What does this have to do with a private teaching university? We certainly have no funds for research and little time away from the metered productivity of publication, teaching, and departmental governance. Just those priorities can inadvertently obscure the truth that no one really knows where the next scientific discovery, cultural insight, or social movement will come from. Here is as good as anywhere.

The point of Flexner's essay still holds: major advances invariably come from the most obscure corners. Who knew that nearly incomprehensible physics papers by a Swiss patent-office worker would still be cited and proven correct more than a century later? We sell our students short if we cave in to pressure simply to prepare them for a job at the neglect of their minds and their spirits. Do our skewed metrics just get in the way? Will our students learn the deep respect for truth, and the conviction that truth is possible, that is the basis of any real thinking?

Digital Minimalism, by Cal Newport. Random House, 2019. 256 pages. ISBN 9780525542872. (Other sources cited in this article are listed at the end.)

Minimalist –anything spare or stripped to its bare essentials.  Minimalism became a cultural movement, then a social commitment: live with fewer than 100 things, or keep only Marie Kondo’s things that spark joy. Is digital minimalism a contradiction in terms? Is it relevant to higher education, or is it another so-called luxury like “liberal arts education”?  Read on.

  1. Digital Minimalism

The present is a digitally maximal time –but it’s all very vague.  What’s good is great, and best in huge quantities: maximum use of digital media (and social media in particular) to give “people the power to build community and bring the world closer together” (Facebook’s mission statement). How? Just connect, share, and like, and somehow good things will happen.  “You never know, maybe you’ll find this useful” –one of the weakest sales propositions ever.

Anything or anyone that seems to resist digital maximalism risks the label “Luddite,” a dismissive reference to the 19th-century weavers who destroyed machines to save, they thought, their way of life.  Newport is no Luddite: he’s a professor of computer science at Georgetown, author of erudite papers on distributed networks as well as popular works such as So Good They Can’t Ignore You (2012).  In Deep Work: Rules for Focused Success in a Distracted World (2016) he explored how to maintain focus to do optimal, cognitively demanding work in a world of distractions: high-value, high-impact undertakings rather than low-impact tasks.  Digital Minimalism sprang from his readers’ struggles with the role of new technologies in their personal lives.  The insight came to him on a walk along a deserted beach in the Bahamas (Newport strongly recommends walking as an analog practice) –a lovely location for “deep work”!

Digital minimalism is “a philosophy of technology use in which you focus your online time on a small number of carefully selected and optimized activities that strongly support things you value” –without the infamous Fear of Missing Out (FOMO).  Living this philosophy successfully means engaging long-term in cost/benefit analyses: is the benefit worth the time?  Time is the most truly limited resource. Clutter is costly; optimizing your time is crucial; intentionality is satisfying: consistently following through on your commitments.

Newport unpacks all this lucidly (he is, after all, a computer scientist).  His first chapters lay the foundations: why declutter your digital life? What can you gain? How do you do it and stick to it? His later chapters focus on practices: how to do a digital declutter; how to grow comfortable again with spending time alone; how to reclaim real leisure; how to avoid digital rabbit holes such as clicking “like”; and how to find other digital minimalists: community support. He seeks to answer Andrew Sullivan’s plaintive essay, “I Used to Be a Human Being” (2016): to help upend Sullivan’s lament “by providing a constructive way to engage and leverage the latest innovations to your advantage” –to be able to “say with confidence: ‘Because of technology, I’m a better human being than I ever was before.’”

Wait—isn’t this the point of an education? Newport acknowledges the depths here: Aristotle, Thoreau, Abraham Lincoln; but he avoids getting pulled off-task.  The book is a readable length, but its shadow stretches very far indeed: becoming a better human being reaches far beyond dispelling the enchantments of technology.

Back, for a moment, to Sullivan: his moment of insight came after illness, sleeplessness, the demands of a profitable media business (his blog), and dwindling friendships. “Multi-tasking was a mirage. This was a zero-sum game. I either lived as a voice online or I lived as a human being in the world that humans had lived in since the beginning of time.”  Why zero-sum? He had (has) only so much time to pay attention. The ceaseless wind-tunnel of distraction “denies us the deep satisfaction that comes with accomplishing daily tasks well, a denial perhaps felt most acutely by those for whom such tasks are also a livelihood —and an identity.”

Many university teachers have noticed that students (especially undergraduates) now seem even less prepared to engage in serious thinking, research, writing, and lab work than a decade ago.  Their observations dovetail with major shifts in student mental health observed by counselors in the past few years, validated by Jean Twenge’s research on those born 1995-2012, who grew up with constant access to social media. “Rates of teen depression and suicide have skyrocketed since 2011. . . Much of this deterioration can be traced to their phones . . . . The effect of screen activities is unmistakable: The more time teens spend looking at screens, the more likely they are to report symptoms of depression . . . . This trend has been especially steep among girls.” Twenge’s teenage research subjects of 2015-2016 are enrolling (or will enroll) in university classes in 2018-2021.

Can this be blithely dismissed: That’s progress, you can’t stop it?  “Progress” hides a more sinister reality: the social media apps these young people use so often have been specifically engineered to encourage maximal use through intermittent positive reinforcement and the drive for social approval.

  • Apple engineers Justin Santamaria and Chris Marcellino developed the iPhone push-notification technology that affects the same neurological pathways as gambling and drug use: “reward-based behavior that activates the brain’s dopamine pathways.”
  • Tristan Harris (“Design Ethicist” at Google) notes that humans crave approval, and companies tweak their apps to hook their users with the power of unpredictable positive feedback, sprinkling “intermittent variable rewards [likes, tags, tweets, etc.] all over their products because it’s good for business.” Getting a reward is like winning at a slot machine, and “several billion people have a slot machine in their pocket.”
  • Sean Parker (founding president of Facebook) remembers, “The thought process that went into building these applications, Facebook being the first of them, ... was all about: How do we consume as much of your time and conscious attention as possible?”

The combination of phones and social media apps is specifically designed to hook users –especially young people— into prolonged use, because the business model is to expose them to paid advertising, political, and entertainment content intended to shape their behavior and gather their votes and dollars.  A great many users (especially the young) are compulsively on their phones because they have been hooked –exactly what the phones were designed to do.  Sean Parker fears that social media “literally changes your relationship with society, with each other ... It probably interferes with productivity in weird ways. God only knows what it's doing to our children's brains.”  Sullivan suggests that this enslavement is merely a “new antidepressant of a non-pharmaceutical variety.”

These intentions are not new, but digital engineering now takes them to new extremes.  Timothy Wu writes that newspapers were drastically changed by the introduction of advertising in the 19th century: readers became not just subscribers, but also an audience the newspapers delivered to advertisers.  Matthew Crawford notes that the first industrial assembly lines, by demanding concentration on repetitious tasks, so altered the experience of work that Henry Ford’s workers simply walked out in 1913.  When Ford wanted to add 100 workers to the line, he had to hire 963, and was forced to double the daily wage to keep the line staffed.  In broader social terms, Crawford writes elsewhere that advertising through social media apps claims a large portion of the “attentional commons” for private purposes in the attention economy, with a resulting surfeit of messages and enervated users.  Just as Ford’s innovations in the long term fomented a powerful labor union, could a “user union” come to counterbalance corporate attention engineering?

Resisting such claims on wage labor or attention is not new.  The 19th-century Arts and Crafts movement inspired by John Ruskin and William Morris grew from their revulsion against mechanized production and the Dickensian, oppressive division of sweatshop labor in Victorian England.  Newport invokes Thoreau’s famous axiom in Walden: “The cost of a thing is the amount of what I will call life which is required to be exchanged for it, immediately or in the long run.”  Rather than the standard account of cost in money, Thoreau counts the cost in life: attention, connection, his pleasure in living deliberately.  In the first chapter, “Economy,” Thoreau gives a very straightforward, New England accounting of his life on the pond, replete with tables (his kind of spreadsheet) to show that frequently more is actually less.  By contrast, do not our students, like Thoreau’s Concord neighbors, crushed and smothered under their loads of distraction and debt, come to lead lives of quiet desperation?

Is there any solution or alternative?

A hint of a solution has been given, ironically, by Facebook itself.  Its own David Ginsberg and Moira Burke ask, “is spending time on social media bad for us?” After reviewing a lot of research, they conclude, “it really comes down to how you use the technology.”  This gives the game away: reflective, intentional use (in Newport’s words) “punctures the myth of Facebook as a foundational technology that everyone should just “use” in some generic sense . . . . [they] are encouraging people to think critically about exactly what they want to get out of this service.”

Newport realizes the potential of Ginsberg’s and Burke’s admission. “This mind-set is potentially disastrous” for Facebook because it could result in far less time spent in it, dramatically decreasing its value for advertisers and investors.  Any explicit comparison of the real costs of time and attention with the real benefits of social media threatens Facebook’s business model.

Reflective, intentional, “critical use is a critical problem for the attention economy.”  By developing minimal and deliberate use of digital technology, users might “front only the essential facts of life,” to see if they can learn what it has to teach: to choose a focused life.  Have universities, by so catering to students’ and parents’ anxieties, accepted their students’ distraction by social media unreflectively?  The “attentional commons” of higher education has always faced competition, but now faces determined competitors armed with the specific agenda to “consume as much of [their students’] time and attention as possible” (Sean Parker).

Universities can reclaim their cultural relevance when they come to understand that the greatest threat to education today is not careerism, financial instability, or political hostility, but distraction.  If higher education will ever offer a coherent alternative to a depressed and frazzled generation, it will have to engage the powerful corporate and cultural forces that want to hold students, faculty, and staff hostage to engineered, hyper-palatable mental pseudo-stimuli.  This engagement will have to be smart, flexible, subtle, and persistent if we are to challenge the fast food of social media with the slow cookery of a strenuous education.

The past few months have shown that Facebook and other social media sites are hardly invincible and certainly not foundational, as they face sharp-eyed scrutiny from public, government, and investors alike.  Now is the time for higher education to step up to the challenge of distraction.

Where is this wisdom to be found, and where is the source of this understanding?

  2. Critical Attention, Disruptive Humility, and Analog Reality

Much fuller responses come from a varied handful of writers.  Gary Rogowski and Matthew Crawford return to the attention required by the physical world of objects, activities, and pleasures that live on despite the blandishments of the digital.  Rogowski’s title and subtitle reveal his point: Handmade: Creative Focus in an Age of Distraction.  Examining what his life has to teach, the essential facts of his life (like Thoreau), Rogowski finds a correspondence between our hands and our thoughts: “Long ago we learned to think by using our hands, not the other way around.”  The sheer physicality of the “analog” asks us “to give good evidence of yourself. Do good work.”

Craftsmanship “must reckon with the infallible judgment of reality, where one’s failures or shortcomings cannot be interpreted away.”  Crawford believes “the mechanical arts have a special significance for our time because they cultivate not creativity, but the less glamorous virtue of attentiveness.  Things need fixing and tending no less than creating.”  Sheer physicality, the recalcitrance of the real, is a way of learning the humility of attention. As Rogowski puts it, “situated among our fellows in norms and practices that shape a life, the environment matters.”

The humility of attention is not new. When Alexander Langlands inquired into the origins and true meanings of traditional crafts, he bumped into the embeddedness of craeft, an Old English word connoting an amalgam of knowledge, power, and skill, extended to a sense of wisdom and resourcefulness. “We can’t put our finger on exactly what craeft was.”  As an archaeologist, he constantly studies material culture, the “deep time signatures” of so many traditional crafts.  They remind him that “we are makers, and that we have always lived in a world of making.”  The cognitive contemplation of making exercises the mind in silence and solitude.  Craeft is a form of intelligence, an ingenuity through which we can think, contemplate, and be: powerful, resourceful, and knowledgeable through the medium of making in a world of diminishing resources and increasing environmental instability.  To be craefty is all about a mindful life achieved through beautiful simplicity: the humility of attention, failure, and persistence.

Craeft provides a high-touch physical higher education.  It pays attention to the reality of physical, tangible materials as they both shape and respond to human desires, needs, and intentions.  Langlands’ “deep time signatures” are written with the ink of character on the paper of experience, an in-this-moment contact with forms of past life and future practice that transcend any individual’s encounters.  In the 21st century, is this mere nostalgia, or wishful thinking?

Jaron Lanier, one of the primary inventors of “virtual reality,” practices these deep signatures of time through his music and writing.  Understanding the virtual imitation of reality so deeply, Lanier wrestles all the more acutely with actual reality: difficult-to-play ancient instruments such as the begena (Ethiopian lyre), suling (flute), or esraj (a bowed Indian instrument)—he has collected over a thousand.  He doubles down on being human by re-learning the ancient crafts of playing, the deep signatures of time that such ancient skills require.

Lanier’s craeft of music extends to his craeft of writing, and to his understanding of books.  His Who Owns the Future? (2013) concludes with entwined meditations, “The Fate of Books” and “What Is To Be Remembered?”  Having watched the disruption of the music business and the unintentional (but devastating) impoverishment of musicians, Lanier sees a similar pattern developing for reading and writers.  Six years later, his hopes, fears, and expectations for books are fascinating: some have already come true.  Among them:

  • Readers will be second-class citizens (because ownership of a printed copy has become a contract of access to digital content; the reader is left no capital, nothing to resell –a rejection of a true market economy);
  • Many books will be available only via a particular device, such as a particular company’s tablet (think: Kindle, Nook, or digital rights protection such as Adobe Digital Editions);
  • Readers will spend a lot of time hassling with forgotten passwords, expired credit cards, or being locked into the wrong device . . . for years at a time;
  • People will pay less to read . . . while people will earn still less from writing (repeating a pattern seen in music and other media in which “software swallows everything”);
  • By the time books have mostly gone digital, the owners of the top Internet servers . . . will be more powerful and richer than they were before.

What does Lanier see about a book that is worth saving?  A book is not an artifact, but a synthesis of fully realized individual personhood with human continuity (my italics).  To Lanier, the pattern of devaluation and exploitation of “content” (music, literature) feels off-kilter and short-sighted. The network that fails to recognize and preserve human continuity will serve only its masters, just as digital music has fostered a dehumanization of music.  When the human role is reduced to producing “output” (or “content”) just like a collective or algorithm, and when a winner-takes-all marketplace ideology becomes the sole means of valuation, the very qualities that make music or literature, reading or writing, humanely worthwhile are simply sidelined as irrelevant externalities.  To paraphrase the famous declaration from Vietnam, “we had to burn this art form to price it.”

Lanier proposes this (potentially offensive) thought experiment: “Would you want to send a collectively programmed robot to have sex on your behalf because it was better than you, or would you want to have the sex yourself and get better by doing?” Too often Lanier has heard in response, “I’d prefer to have the best available robot,” silently passing over the nexus of data, servers, technological finance, and market ideology that makes that choice seem preferable. By analogy, “If market pricing is the only legitimate test of quality, why are we still bothering with proving theorems? Why don’t we just let the market determine whether a theorem is true?”  Content-by-algorithm or -collective is slow suicide for creators and researchers and slow homicide for everyone else.  Lanier concludes, “I am arguing that there is more than one way to build an information economy, and we’ve chosen the self-destructive option.”

“Content” at market price seems to be so easily available.  For example, printed books are still common.  Hyper-focused on “content-creation” and marketplace valuation, the “siren servers” of technology obscure how each book bears the marks of the deep signatures of time.  A modern book is a carefully crafted, progressively evolving artifact that only implies the depth of human continuity and experience in its design.  If you doubt this, just look at books printed a century or two ago to see the differences in design and the evolution of writers’ and readers’ expectations.  The book is a synthesis of fully realized personhood with the human continuity signed with time’s deep signatures. A printed book simply transposed into a digital reader as some kind of file brings convenience at the cost of interaction (annotating, bookmarking, easy page-to-page comparison or reference – and the present alternatives to those activities in software are simply dreadful).  A time may come when a digital book represents human continuity, but not yet, nor for a long while.  We are creatures of touch and smell as well as sight, hearing, and taste.

I believe Lanier’s insight (that books and other media are syntheses “of fully realized individual personhood with human continuity”) foregrounds the odd and enlightening “revenge of the analog” (David Sax’s book). The return of “disrupted” technologies (prominently but not only vinyl recordings, paper books, film photography, and board games) is possible not only because digital versions work so well, but because they work too well.  Convenience becomes a trap, a mirage of experience rather than genuine experience. Sax’s different narrative shows that technological innovation is not an inevitable and irreversible march, but “a series of trials that helps us understand who we are and how we operate.”  The overwhelming superiority of digital media ironically leads to a shift in valuing: an older technology can sometimes work better, and its inconvenience and inefficiency become its renewed strength.  Print publishing, retail sales, on-site workers in a vibrant community (such as Shinola in Detroit), on-ground education, face-to-face interaction in meetups, book clubs, meditation groups, summer camps—all of these ideas and practices have proven surprisingly resilient.  Resilience cannot charm away real difficulties: no analog Dr. Pangloss!  But it does suggest that a popular narrative of irresistible disruption might not be all there is; resilience is an alternative to battle-field metaphors of victory and defeat.

The worship of disruptive technologies (and disruption in general) can go too far. “Disruptive innovation” may not in fact be the rule, but the exception.  Clayton Christensen’s celebrated theory is not beyond criticism (from his Harvard colleague Jill Lepore) and faces equally cogent alternatives (his Harvard colleague Michael Porter’s theories of competitive strategy and advantage).  Very recently Alfred University president Mark Zupan called Christensen’s bluff.  Christensen has doubled down on his 2011 prediction that as many as half of American universities would close or go bankrupt within 10 to 15 years.  Zupan has wagered $1 million, to be given to Christensen’s non-profit institute if at least half of all traditional universities fail or merge by 2030; if not, Christensen would contribute the same sum to Alfred University’s endowment.  Will Christensen be willing to put his money where his mouth is?  (See also this previous blog entry.)

The humility of paying careful attention to deeply fallible and human face-to-face encounters and processes calls the bluff of technologism – the belief that (often disruptive) technology will cure all woes, a new variety of traditional snake-oil wonders (climate change: “there’s an app for that!”).  Ryan Raffaelli (also from Harvard Business School) has studied how new value can be created for “old innovations” – a subtle and delightful riposte.  Raffaelli has studied, in particular, how the Swiss watch industry saved itself by reinventing its identity: a technology re-emergence.  Swiss watchmakers who survived the competition of cheaper imports saw auction prices for mechanical watches increase dramatically.  Suddenly they realized that there could still be value latent in an older technology re-conceived as a social and cultural fashion signal.  Just when the Apple Watch might have swept away the Swiss watch industry, the industry instead revived as a step up for those who, having worn an Apple Watch, now wanted something up-market that feels and signals “genuine.”  As went Swiss watches, so have fountain pens (Goulet Pens), notebooks (Moleskine), and independent bookstores, now more numerous than in 2012.

Here—at last!—is an opening for liberal arts education.

  1. Analog Liberal Arts Education Against the Grain: Digital Minimalism in Action

The term “liberal arts” now seems so old-fashioned or vague that it can impede further conversation.  I once spoke with the president of a well-regarded liberal arts college who said that she no longer tried to use the term, because no one knows what it means.  Her view discounts the real value of the phrase “liberal arts education”: ambiguous and elusive, it has thus resisted facile, ideological characterization. The very ambiguity that impedes conversation is what re-conceives value.  The word “liberal” in this case means something far older than the tired liberal-versus-conservative polarization that disfigures (or has destroyed) public discourse: liberal as in liberating, free-making, getting disentangled from the sloppy shortcuts of everyday thinking.  “Liberal arts” connotes a narrative of thoughtful practices out of sync with careerism and digital maximalism.

The narrative of the liberal arts, a braid of habits and practices from (inter alia) classical antiquity, medieval universities, American colleges in the New National period, and the rise of the sciences and social sciences in the Progressive era, is an amalgam that is unavoidably counter-cultural. Powerful ideologies, whether arising from crown, state, church, market, or faction, have always arrayed themselves against it.  Overt ideologies are the obvious culprits, but equally antagonistic have been the habits and dispositions that cloak the subtle, slippery demands of power, position, and fortune.  The “liberal arts” have been both the servants of power and privilege and a primary source of their interrogation. This both/and contradiction or ambiguity, with a strong whiff of the impractical, means that contemporary marketplace ideology can hardly avoid patronizing liberal arts higher education as outmoded, expensive, boring, elitist, wasteful, and ideologically suspect.  That suspicion shows exactly why liberal arts higher education meets a critical need for pause and reflection in an ideologically riven time.  Being a thoughtful, humane person has never been easy.

What are the liberal arts?  Many answers have been proposed that, following William Cronon’s insight, come down to lists: essential facts, mandatory readings, curricular requirements, etc.  The original artes liberales were a medieval list of seven subjects that set male aristocrats off from everyone else.  Corresponding modern lists of competencies are scarcely inspiring and carry the odor of the curriculum committee: competencies in communication, major modes of thought (science, mathematics, arts, social sciences, etc.), cultural heritages, and so on.  Nevertheless, Cronon cannot resist making another list: ten personal qualities he associates with people who seem to embody the values of a liberal education (only a few are given here):

  1. They listen and they hear;
  2. They read and they understand;
  3. They can talk with anyone;
  4. They can write clearly, persuasively, and movingly . . .
  5. They practice humility, tolerance, and self-criticism. . .
  6. They follow E.M. Forster’s injunction from Howards End: “Only connect . . .”

(The qualities left out here are necessary as well: read Cronon’s short essay.)

All of these personal qualities are associated with the analog, “real” world, contra the critics who see “reality” only as an excuse for performative hard-headedness.  Cronon:

A liberal education is not something any of us achieve; it is not a state.  Rather it is a way of living in the face of our own ignorance, a way of groping toward wisdom in full recognition of our own folly, a way of educating ourselves without any illusion that our educations will ever be complete.

Cronon then emphasizes that each of these qualities is also education for human community:

Each of the qualities I have described is a craft or a skill or a way of being in the world that frees us to act with greater knowledge or power. . . . [W]e need to confront one further paradox about liberal education. In the act of making us free, it also binds us to the communities that gave us our freedom in the first place; it makes us responsible to those communities in ways that limit our freedom. In the end, it turns out that liberty is not about thinking or saying or doing whatever we want. It is about exercising our freedom in such a way as to make a difference in the world and make a difference for more than just ourselves.

Cronon’s “craft or a skill or a way of being in the world” returns us to craeft: a way of living that bears the deep signatures of time, that nurtures the craeft of attention, humility, and human continuity.  This is the real point of a liberal arts college education: it cannot be achieved or completed in four years, but is a life-long way of living in the face of our own knowledge and ignorance, wisdom and folly, and inevitably incomplete connection.  Only connect: human freedom in the service of human community, human continuity.

Matthew Crawford proposes a fruitful metaphor for connection: the cultural jig.  The metaphor’s origins lie in carpentry: “a jig is a device or procedure that guides a repeated action by constraining the environment in such a way as to make the action go smoothly, the same each time, without his having to think about it.” For example, a jig can guide a saw so that a carpenter need not measure many boards individually.  A cultural jig can contribute to personal character “that is built through habit, becoming a reliable pattern of responses to a variety of situations,” such as (historically) thrift, parental authority, and personal accountability.  Tradition can be a robust cultural jig, fostering a community of practice in which real independence through interdependence does seem to become possible (though never guaranteed).  The conundrum is that real freedom and self-mastery require some dependence on and mastery of cultural jigs, such as attentiveness.  When a liberal arts education is effective, a student (or graduate) is never averse to seeking genuine cultural jigs to guard against folly, foibles, and simple ignorance.

This line of thinking can get very woolly very fast (and maybe already has).  Yet it can have real-world consequences, such as careful thinking about what makes work meaningful.  The British business researchers Catherine Bailey and Adrian Madden have studied what does, in fact, make work meaningful, and their conclusions bear a striking similarity to the outcomes of a liberal arts education.

How and why do people find their work meaningful?

  • When work matters to more people than just the workers themselves, or their bosses;
  • when it can mean engagement with mixed, uncomfortable, or even painful thoughts and feelings, not just euphoria or happiness, and a sense of coping with sometimes intractable challenges;
  • when meaningfulness is allowed to emerge in an episodic rather than scripted, sustained way, in moments that are not forced or managed but “contain high levels of emotion and personal relevance, and thus become redolent of the symbolic meaningfulness of work;”
  • when workers have the time and space to become reflective, since meaningfulness can be realized over time rather than spontaneously or momentarily;
  • when the feeling of engagement is exactly that: a feeling of personal contribution in an organizational environment of tasks and responsibilities, formed in a narrative of engagement and satisfaction.

By contrast, work becomes meaningless when connections are broken: people from their values; leaders from subordinates; tasks from any real problem; people’s better judgment overridden; personal relationships severed; workers exposed to harm (disconnected from safety and well-managed risk).  Organizations can cultivate meaningful work by stressing personal connections, well-delegated responsibilities, real-world problems, real-world beneficiaries, and recognized accomplishments.

Such meaningful work can become the occasion for the personal attributes that William Cronon identified in liberally educated individuals (above).  In the context of work, these can be extended to build the practice of forgiveness, as Rogowski illumines it: “Screwing up is a given. Forgiveness is not. Unless you practice it.”  A liberal arts education is an ideal way to learn how to fail, since it is a given that one will fail, but also how to recover and go on.  An education that teaches how to fail, however, is neither what a college admissions officer wants to advertise, nor what the anxious parents of a high-school senior want to hear.

Only connect: human, working freedom in the service of a good greater than mere organizational continuity, a craeft that conserves and strengthens human continuity.  The imperative to connect is, in a vital sense, a powerful cultural jig.

But can you make a living doing this?  The predominant narrative in American society now is that a liberal arts education is a frivolous waste or an élite privilege that prepares a young person poorly for the “real world” (always taken to mean: the grimly hyper-competitive world that the writer envisions for everyone else).  Is a liberal arts education of little value in “today’s world”?

A recent Mellon Foundation study answers no: a liberal arts education incurs costs and confers benefits on par with pre-professional degrees.  This nuanced study looks carefully at what constitutes a liberal arts education, noting that mere attendance at a liberal arts college is not always a good proxy.  It tries to control for factors such as parental income and achievement, college selectivity, and typical incomes in various professions.  It warns that observed correlations do not prove causation, since no real control group is available for a meaningful (or even ethical) experiment.  Nevertheless, “claims that liberal education is of little value because it does not lead to employment is clearly not supported by the existing data.”  At the same time: more work needs to be done.

All this seems a long way from digital minimalism – or is it? I am convinced that one of the best ways to counter simplistic digital maximalism – the vague promise of giving “people the power to build community and bring the world closer together” (Facebook) – is exactly the craeft of connection that bears the deep signatures of time in human continuities across a variety of media.

A liberal arts education is analog in a digital world: intentionally inconvenient as a strategy for identifying enduring value.  The power of a “digital cleanse” (Newport) is vastly increased when a liberal arts education can provide genuine alternatives and powerful cultural jigs: engagement with real people, over-arching, important questions, and intractable problems.  Screwing up is a given. Forgiveness is not. Practicing craeft is a jig of attention, humility, forgiveness, failing and moving beyond failure, human continuity in a real community.  Only connect.

Resources linked or mentioned in this blog entry:

Allen, Mike. “Sean Parker unloads on Facebook, ‘God only knows what it’s doing to our children’s brains,’” Axios, November 9, 2017.

Bailey, Catherine, and Adrian Madden. “What Makes Work Meaningful—Or Meaningless.” MIT Sloan Management Review, Summer 2016.

Christensen, Clayton M., Michael E. Raynor, and Rory McDonald.  “What Is Disruptive Innovation?” Harvard Business Review, December 2015.

Crawford, Matthew B.  Shop Class as Soulcraft: An Inquiry Into the Value of Work, Penguin Books, 2009,  especially pages 41-42.

Crawford, Matthew B.  The World Beyond Your Head: On Becoming an Individual in an Age of Distraction.  Farrar, Straus and Giroux, 2015, especially pages 8-20.

Cronon, William.  “’Only Connect’ . . . The Goals of a Liberal Education,” The American Scholar 67, Autumn 1998 (full text available on his website).

Ginsberg, David, and Moira Burke. “Hard Questions: Is Spending Time on Social Media Bad for Us?” Facebook Newsroom, December 15, 2017.

Harris, Tristan. “How Technology is Hijacking Your Mind—from a Magician and Google Design Ethicist,” Medium: Thrive Global, May 18, 2016.

Hill, Catherine B., and Elizabeth Davidson Pisacreta.  The Economic Benefits and Costs of a Liberal Arts Education.  An essay commissioned by The Andrew W. Mellon Foundation under the auspices of the Mellon Research Forum on the Value of Liberal Arts Education.  The Foundation, 2019.

Langlands, Alexander.  Craeft: an Inquiry into the Origins and True Meaning of Traditional Crafts. New York: Norton, 2018.

Lanier, Jaron.  Who Owns the Future? New York: Simon and Schuster, 2013.

Lederman, Doug. “Clay Christensen, Doubling Down,” Inside Higher Ed, April 28, 2017.

Lepore, Jill. “The Disruption Machine: What The Gospel of Innovation Gets Wrong,” New Yorker, June 23, 2014.

Lewis, Paul. “’Our Minds Can Be Hijacked’: The Tech Insiders Who Fear a Smartphone Dystopia,” The Guardian, October 6, 2017.

Porter, Michael. “The Five Competitive Forces That Shape Strategy.” Harvard Business Review, January 2008.

Raffaelli, Ryan, and Carmen Nobel.  “How Independent Bookstores Have Thrived in Spite of Amazon.com,” Harvard Business School, Working Knowledge, November 20, 2017.

Raffaelli, Ryan.  “Technology Reemergence: Creating New Markets for Old Technologies, Swiss Mechanical Watchmaking 1970-2008,” Administrative Science Quarterly, May 2018.

Rogowski, Gary. Handmade: Creative Focus in the Age of Distraction.  Fresno, California: Linden Publishing, 2017.

Sax, David, The Revenge of the Analog: Real Things and Why They Matter.  New York: Public Affairs, 2016.

Sullivan, Andrew.  “I Used to Be a Human Being,” New York, September 2016.

Thoreau, Henry David. Walden: A Fully Annotated Edition, edited by Jeffrey S. Cramer. New Haven: Yale University Press, 2004, especially the first chapter, “Economy.”

Twenge, Jean M. “Have Smartphones Destroyed a Generation?” The Atlantic, September 2017.

Wu, Tim. The Attention Merchants: The Epic Scramble to Get Inside Our Heads. New York: Vintage Books, 2017, especially pages 11-17.

Zupan, Mark.  “Betting on (Non-Profit) Higher Education,” Rochester Democrat and Chronicle, February 28, 2019.


"There was something deeply anti-authoritarian about just looking and observing and telling the truth as you saw it."  Swoon (Caledonia Dance Curry)

Sometimes I am utterly depressed by the state of higher education and the disdain now shown for what used to be called the liberal arts.  This kind of education has come to be seen as élitist, old-fashioned (analog), or simply useless --and higher education now has to be all about usefulness: return on investment, financial incentives, and serving the customer. Arts, humanities, history, and philosophy are banished to irrelevance or reduced to accruing some number of credits in a core curriculum. Questions, beauty, and nuance are for boring people.

At dark moments I love to return to a persistent source of wonder and thoughtfulness: the Metropolitan Museum's Artist Project.  At first sight it seems to be an advertisement for a book --a worthwhile book, but still an ad.  Scroll down the page and one sees an array of six "seasons," each with twenty artists' names, some famous and some less so.  The real treasure is hidden and has to be discovered, a little at a time.

Each three-minute video presents an artist looking at a place, piece, or genre at the Museum and then speaking what she or he sees.  Sometimes the pairing of an artist with an object is an extension of the artist's own work; other times it is a purposeful contrast.

For example, the photographer Thomas Struth is known for his Museum Photographs of visitors to famous art museums (the Art Institute of Chicago, the Accademia in Venice), a series he expanded to include visitors to churches and powerfully significant sites such as Times Square or Yosemite National Park.  Struth makes viewers aware of their own active participation in the completion of the work's meaning, "not as passive consumers but as re-interpreters of the past for the needs of the present" (Metropolitan Museum exhibition press release, 2003). By contrast, Struth responds to the Chinese Buddhist sculpture (Room 208), a silent place "not so easy" to find.  Struth finds these sculptures humble: "can we afford humbleness these days?"  He asks, "Can it change my life? Can it transform my opinion or my existence in some small or larger way?"

Martha Rosler is a multi-media artist who uses photography, video, performance, and space to examine women's experience of everyday life in the public sphere, as in Secrets from the Street: No Disclosure (1980) or her current Irrespective at the Jewish Museum (open until March 3, 2019).  Contrary to expectation, she goes to The Cloisters, the Museum's medieval building and collection at the northern tip of Manhattan, "like another world floating in the clouds."  Early in the 1960s this art interacted with her work in abstract expressionism, but also opened the way for narrativity: "As a first-generation Brooklyn Jew I was trying to see where I fit in that story."

One of the most powerful moments in this anthology of videos is Swoon's account of Honoré Daumier's The Third-Class Carriage, ca. 1862.  Swoon sees a rough painting with almost cartoonish strokes, and yet: "some people have said that this painting sentimentalizes poverty --and I disagree; I think he's getting the complexity of life here."  She sees "something warm and rough and something beautiful and difficult; there's a compassion in this piece." Daumier is "a relentless social observer --he's always expressing his point of view. . . . You feel the love is contained within looking."  Swoon asks, "How do I make something relevant? Daumier's position as seen through this work is just: look at people, observe what's going on, record that, give it fidelity in its simplest truth.  One of the highest functions of art I have identified within my own work is to be the vessel for empathy.  When I see this painting it just strikes the flint. I try to walk in those footsteps."

Humbleness, narrativity, telling the truth as you see it, fidelity, empathy --all solid concepts and forms of self-awareness that, if learned well, make a liberal arts education actually liberating.  No wonder the present STEM masters of the universe want to banish whatever is liberating, inconvenient, and subversive in their brave new world.  Resist: strike the flint.

Three years later, Jones' book still resonates broadly with the continuing decline of WCA.

The End of White Christian America, by Robert P. Jones. New York: Simon and Schuster, 2016. 309 pages. [With a new afterword covering the 2016 election.]  ISBN 9781501122323, available at Sacred Heart University Library.

Jones' book garnered considerable attention in the religious media in 2016. Reading it three years later, one cannot help but frequently ask "What about...?" some event since 2016, because the book's preparation and publication dates precluded any coverage of the interminably long, bitter 2015-2016 election cycle. Since November 2016, so many "what abouts?" arise, even over Jones' basic contention that a carefully-defined "White Christian America" (WCA) is dying or has died.  The "new afterword covering the 2016 election" is identical with the article "T—— Can't Reverse the Decline of White Christian America," The Atlantic, July 4, 2017.  (I em-dash the name to try to discourage trolls from spamming this post.)

Jones sticks consistently to a concept of white protestant Christian America --its churches, web of associations, and cultural agenda (abbreviated WCA). He is clear about when this infrastructure of influence extends to cover the Eastern Orthodox (whom he glibly labels "Greek Orthodox," although the assorted Greek, Russian, and other derivations and jurisdictions are a huge question in those communities). Jones maintains a boundary between Christians of white, European descent and African American Christians, because of the heritage of slavery and Jim Crow entwined with the churches, but he gives little attention to the growing Asian presence in the mainline WCA churches, or to the growth of other ethnically-based Christian churches (Afro-Caribbean, for example).

Jones efficiently traces the distinctions between the two historical descendants of pre-1920 WCA: mainline, ecumenically-oriented churches and evangelical churches. I believe he fails to consider fully, however, just how porous those distinctions can be, or the significant differences between what has been called "soft-core" and "hard-core" evangelicalism --the latter more doctrinally oriented and fundamentalist, the former more experiential and open to individuals who move in and out of a community (seen in the rise of name-brand groups such as The Vineyard). Particular communities, in fact, have changed position within WCA --with some difficulty, yet irreversibly. Why and how?

For example, Hope College in Michigan was through the 1970s a mainline college with a heritage in the Reformed Church in America (the less-enclosed of the Dutch Protestant denominations). It liked to remember its association with Robert Schuller '47 (his son was a classmate there, '76), and the Science Center (!) was named after Norman Vincent Peale. But the College turned more evangelical in the 1980s and 1990s under the presidency of Gordon van Wylen; by the 1990s it had moved clearly and definitely in an evangelical direction under Chaplain Ben Patterson and a milquetoast senior college leadership.  (See James Kennedy's Can Hope Endure, 2005.) To this day it is contrary to college policy "by statement, practice, or intimation . . .  to promote a vision of human sexuality that is contrary to this [fundamentalist] understanding of biblical teaching." Jones identifies opposition to gay and lesbian rights as one of the binding commitments that evangelical Christians must maintain without compromise in order to show their bona fides --and the problems this will bring with a younger generation of Americans. Numerous alumni/ae have repudiated this position, or simply dropped any sense of allegiance or support, while support has increased from evangelicals. The cost to the college of changing official policy, given its choices, would probably be prohibitive.

By contrast, Princeton Theological Seminary moved in the opposite direction. Beginning from an inchoate majority position in the 1970s, the Seminary fully mirrored the protracted and heated conversations and conflicts that coursed through the Presbyterian General Assembly, especially after the union of the (northern) UPCUSA and the (southern) PCUS in 1983. Under the long presidency of Thomas Gillespie (1983-2004) the Seminary maintained the (PCUSA) Presbyterian Church's official line of "acceptance of members but without ordination" despite the evident hypocrisy of this stance within its own community. Any discussion or criticism of these policies was resisted and tamped down (as I discovered during my Ph.D. coursework and examinations, 1992-1996). These hypocrisies culminated with the death in 1996 of the community's beloved and celebrated musician David Weadon, from HIV-related causes; he had been afraid to reveal his illness for fear of losing his job and health coverage. Thomas Gillespie did express remorse and a change of heart --too late for David, of course. In 2011 the PCUSA finally voted to allow the ordination of gay and lesbian persons. The Seminary now hosts a chapter of BGLASS (Bisexual, Gay, Lesbian, and Straight Supporters) and formally hosts affirming discussions of gay and lesbian issues in its Center for Theology, Women, and Gender. From a position squarely within the evangelical opposition to gay and lesbian rights, the Seminary has moved to the center in tandem with its historic denominational alliance.

Jones' book suggests that such institutional shifts are rare, but I believe that they are more common than he realizes, because they reflect the changing concerns of individual white protestant Christians as well. Evangelical churches have historically seen their share of former or lapsed members of mainline churches who, after some kind of conversion or new-life experience, join an evangelical congregation. The traffic certainly moves in the other direction as well: numerous evangelical Christians have moved out of evangelical churches in response to changing understandings of scripture, history, cultural and religious political connections, and geographies. Very little work has been done on such cross-overs. In 1993 Benton Johnson, Dean Hoge, and Donald Luidens examined the famous claim by Dean Kelley (1972) that conservative churches were growing (and liberal churches declining) because the liberalizing denominations were weak: low in commitment, with moral and theological commitments too fuzzy to mobilize members' energies. Their 1989 study of 500 Presbyterian baby-boomer confirmands (that is, people confirmed around age 14 between roughly 1960 and 1980) found that they did not, in general, leave the Presbyterian church because they sought doctrinal and moral orthodoxy in conservative churches. Some remained in mainline churches, and many left, but not for Dean Kelley's supposed reasons.

Jones' book examines WCA responses to three general topics: political involvement, gay and lesbian rights, and racial tensions and histories. Since its preparation (late 2015) and publication in 2016, much has happened that confirms his historical narratives of change, decline, and acceptance. As he wrote in The Atlantic in July 2017, the emphases and apparent desires of the present Presidential administration will not reverse WCA decline -- indeed, the present leadership may be the death-rattle of WCA rather than its rejuvenation. One of the benefits of reading Jones' book now (2019) is that it contextualizes numerous deeply divisive conflicts of the past 26 months in what went before: the world did not begin anew in November 2016. The politics of nostalgia and fear have proven very powerful, and the traditional evangelical narrative of persecution has found a weird new life in the face of public anti-Semitism. Ironically, some of the mainline churches have found a new voice for social inclusion in an era marked by rising hate speech, acts of violence, anti-Semitism, and anti-immigrant rhetoric and actions.

John Fea's book Believe Me: The Evangelical Road to D—T— develops the narrative of persecution, nostalgia, and fear, which will wind up at the dead end that looms for the "court evangelicals" and possibly for evangelicalism as a whole. The signs of coming trouble and a profound day of reckoning are unmistakable. Young evangelicals problematize or flee the label: the Princeton Evangelical Fellowship, in existence since 1931, changed its name to Princeton Christian Fellowship because of the narrow and overly partisan meanings that have gathered around the term "evangelical." (The Princeton Fellowship pre-dated the National Association of Evangelicals' use of the term by a decade.) David Gushee, formerly an evangelical (and still very much a professor at Mercer University), has written Still Christian: Following Jesus Out of American Evangelicalism (2017) in a similar spirit.

Three years later, Jones' book still resonates broadly with the continuing decline of WCA. The mainline denominations still struggle with the realities of decline, some rather badly. Many do nothing meaningful whatsoever to prepare or adapt present and future clergy to the reality of ministry as a part-time job. The evangelical wing of WCA has largely mortgaged its moral and cultural standing to the current incumbent of the White House. One shudders to think of the court evangelicals' fall when that person is impeached, not re-elected, or even retires after eight years; currently 72 years old, he won't be around forever in any case. What then? Who will write this story then, and what will the next social form of evangelicalism look like?

Three years later, Jones' book could also benefit from some examination of the end of white, Catholic America. The changing composition of Catholicism can mask for a while the dramatic departure of white Catholics, many deeply angry and hurt by the continuing pedophile scandals and insider infighting over positions associated with the current Pontiff. White Catholic America (the other WCA?) replicated white, protestant, Christian America's web of institutions in the earlier 20th century to a remarkable degree. Those institutions are also coming apart, for somewhat different but related reasons. In some states, the cultural power of the Catholic Church, once palpable, has dissipated almost completely. This would be a fascinating companion study, one that could probe much deeper into the realities described by David Masci and Gregory A. Smith in October 2018. Personally speaking, I sometimes feel that I work on a campus filled with pissed-off Catholics. They are not happy, and for that Church the day of reckoning approaches as well.

Jones' final chapter, a "eulogy," organizes a lot of material on the frame of Kübler-Ross' theory of the stages of grief, from denial and anger to acceptance. The frame is dubious insofar as many pastoral and health practitioners have abandoned it as unhelpful and overly schematic, but it serves Jones acceptably as a prism through which to examine topics as diverse as the Institute on Religion and Democracy (IRD), essentially a destructive and disruptive distraction; Russell Moore's various activities in the Southern Baptist Convention; and the panentheistic wanderings of some mainline denominational leaders (such as John Dorhauer of the United Church of Christ). As a sociologist (and not particularly a theologian), Jones tends to locate the initiative in the churches with people, and so unintentionally commits the theological mistake that doomed WCA: the assumption that all this is entirely under human control. If the past decade has shown anything, it is how little is under effective human control. Many preachers like to quote Alfred, Lord Tennyson's famous line in Morte d'Arthur: "The old order changeth, yielding place to new" -- but few go on to the next lines:

And God fulfils Himself in many ways, 
Lest one good custom should corrupt the world.

Did whatever was good in the WCA wind up corrupting the world, or vice-versa (and what does "world" mean here, anyway)?  Jones is a sociologist of religion and sticks to his domain.  Inside (or emerging from) the dying or dead WCA, churches might see their situation differently. How now will God fulfil God's intentions for the church and for all creation?  Karl Barth's famous rebuke of religion in the church resonates broadly.

To West Michigan, Dutch American culture, I am an outsider with one foot inside that small tent. As a child I was always aware that to my mother there was a qualitative difference between "here" (meaning Saginaw, Michigan) and "there" (meaning Grand Rapids). She spent most of her growing-up years in a house on Calvin Street in eastern Grand Rapids, just down the street from the then site of Calvin College. Her Dutch American relatives, six aunts and numerous others, lived around the area; in the summers we drove to Newaygo to attend a summer church camp run by her home church, Westminster Presbyterian in Grand Rapids (where her ashes are now interred).

This Dutch American background (such as it was) became more vivid to me in the two years I spent at Hope College, 1974-1976. I earned my degree there after three years at Michigan State (including one year in a music program that gave me practically no transferable credits). I came to Hope as an outsider with some sense of how things worked in that community, but I had been formed by highly negative experiences in a mediocre public high school, and then by three years (1971-1974) at completely secular Michigan State, where I studied in an "alternative" residential liberal arts college, Justin Morrill College (closed in 1979). I liked Hope's far greater structure, but I never took it as the definition of a liberal arts college. I knew there were other options, some of them very good.

I was completely unprepared for what Hope meant by "Christian college" at that time, since I was really interested in Classics (Greek, Latin, history, philosophy) and German, and kept a low profile in almost everything else except organ performance. My academic experience there was intense and demanding (I was the only Classics major and had an Oxford-like experience of demanding, fast-paced tutorials), and it formed in me the habits that Princeton would nurture to maturity. It prepared me for the intensity of graduate studies in a major program in the history of Christianity (Princeton Theological Seminary), without which I would have been lost. That Calvin wrote in Latin (complex, literate, humanistic Latin) was no news to me. (My dissertation work was on the Carolingians, thankfully far removed from the obsessions of the Reformed.) The Christian emphasis, however, was at first a puzzling add-on, even as I was nurturing a desire to study the history of Christianity very deeply.

Hope's cultural pendulum at that point swung "liberal" (thanks to the then-recently-departed President Calvin Vander Werf), so I largely ignored the cultural evangelicalism of many of the students around me. It was a comfortable, even snug world, but it was never really my world; I would not have stayed there even had that been possible. My academic work was done. I freely admit that I was in Hope, but not really of it. I joined the German Club; otherwise I was what at Princeton they called "a grind."

After Hope I spent a year as a Fulbright Commission English teaching assistant in Vienna (arranged by a powerful Hope professor with many ties there, Paul Fried), and then took up M.Div. studies at Princeton Theological Seminary. Princeton in turn left me with an enduring respect for serious, top-flight scholarship and tough writers such as Kierkegaard, Barth, and Bonhoeffer, and with a far more global sense that both the "Reformed" and "Anglican" worlds were much broader and more diverse than my experience in Holland, Michigan had suggested. Eventually I became a librarian (another story), and ten years later returned to Princeton for doctoral study. The encounters I had with Evangelical doctoral students in the Ph.D. seminars were frustrating (with one exception), because their preparation was often so superficial, even glib. (Even that one exception had previously left the Assemblies of God for the Lutherans, ELCA.) While at Hope I had become involved in an Episcopal Church, and that involvement left me with enduring liturgical preferences that eventually made my sojourn in the Presbyterian Church untenable.

During all this time I grew up in the cultural orbit (a neighboring township) of Frankenmuth, Michigan, which offers a curious contrasting parallel to the Dutch Americans of West Michigan. Frankenmuth ("courage of the Franks") was settled (1845) by immigrants from Rosstal, Franconia (Bavaria), sent by Pastor Johann Konrad Wilhelm Löhe just two years before A.J. van Raalte led his group to the shores of Lake Macatawa in West Michigan. The settlers of Frankenmuth in time wound up in the very conservative Missouri Synod and worshiped in German well into the 20th century, longer than most equally conservative Christian Reformed churches kept worshiping in Dutch. The doctrinal rigidity of the two groups is formally similar, and each has regarded itself as "the true church," to the obvious exclusion of the other tradition (never mind everyone else). Growing up I was a heathen Congregationalist, and so was (or would have been) beyond the pale of respectability in both groups: an outsider, with one foot inside the tent.

These German-American Lutheran families were emotionally convinced, I am sure, that Jesus spoke Martin Luther's German via God's true Bible, just as many Dutch in West Michigan would have assumed that Calvin spoke and wrote in Dutch, and correctly conveyed Jesus' teachings in Dutch (of course! -- he could not have been French!). To this day, when I am confronted by passionate attachment to the 1611 Authorized English Bible or the 1662 (or 1928) Book of Common Prayer, I can only smile: I have been here before, with others, in other languages. Linguistic fundamentalists are everywhere, I suppose. Both groups used theology and language as shields against encroaching "American" ideas in rising generations -- a losing fight, to be sure.


Douma's book (How Dutch Americans Stayed Dutch) delineates the manner in which Dutch Americans created and marketed new traditions through the development of Tulip Time in Holland, Michigan. Tulip Time marked not how much they remembered about the Netherlands, but how much they had forgotten. He specifically refers to Eric Hobsbawm's "invention of tradition": cultural practices or traditions may not be genuinely historic but are adapted or invented to serve ideological ends. Werner Sollors (The Invention of Ethnicity) extended this insight to ethnic traditions, and Douma in turn extends it to Tulip Time in particular. Tulip Time established a new channel of Dutch American ethnic identity, a modern re-interpretation of an actual 19th-century Dutch identity that by the 1930s was passing or had passed away. By 1975 (my only direct experience of Tulip Time), it had become an unintentional but devastating caricature of Dutch Americans themselves, quite apart from anything really related to the Netherlands. It was very precious. It re-interpreted ethnicity in service to an ideology of the market.

I watched (though unawares as a child) this same invention unfold in Frankenmuth, mutatis mutandis. In 1959 William "Tiny" Zehnder, Jr., and Dorothy Zehnder organized a Bavarian Folk Festival to inaugurate major additions and renovations to the old Fischer's Hotel on Main Street. (I remember it!) The Bavarian Inn sat opposite Zehnder's restaurant (another repurposed former hotel), which had been operated since 1927 by William Zehnder, Sr., and then by Tiny's brothers. The original festival (1959) was a success, and the community organized a Civic Events Council to oversee its annual continuation. From its beginning, the Bavarian Festival was an invented tradition, one marked by usually polite sibling and community rivalries. For many years the Festival was a major "all hands" event in a small town, and a major source of social and financial capital. As residents' ability to volunteer decreased -- homemakers returned to the work force, and employers no longer allowed so much time off -- the Festival gently downsized: it now runs four days rather than a week, and is controlled more by commercial entities than by volunteer community organizations.

Other than commerce, why did the Festival endure? Its continuation was possible because of the unusual, cohesive character of the town, where civil, business, church, and school authorities all knew each other their whole lives. It expressed a positive way forward with a German American identity in a town that still felt it. German Americans, unlike Dutch Americans, had to negotiate the realities of being related to the enemy in two world wars, an enemy who committed the Holocaust and in defeat endured a bitterly divided homeland (1949-1989). German Americans sought to be a model all-American minority because earlier generations (especially 1914-1918) were none too sure about them. In the 19th century, German American celebrations originated in the overlapping circles of workplace, Arbeiterverein (workers' clubs), churches, and civic organizations. That network largely passed by the turn of the 20th century (the Arbeiterverein were sometimes suspected of socialism!). With well-known German American celebrations in Wisconsin and Chicago as both example and warning, Frankenmuth's Bavarian Festival -- entirely unrelated to any of those earlier celebrations -- allowed ethnic reclamation by using the word Bavarian rather than German. In the 1960s and 1970s hardly a (West) German flag was to be found: all the flags were the lozenge-patterned blue and white Bavarian flag. I worked as a waiter in the Bavarian Inn in the summers of 1972-1974 and 1976, putting on the slight lilt of Frankenmuth English to complement the hokey costume.

"Historic Frankenmuth" is made-up history at its finest, an imagined narrative in service to an ideology of the market. The town looks like a theme park mashed up with a wedding venue and a fudge shop. Holland, by contrast, is a larger small city with more to do than just tourism; the Dutch kitsch is comparatively restricted to Windmill Island and a few other locations. They are, each in their way, sui generis appropriations of fading ethnic consciousness.

When I lived in Europe, I immediately sensed the profound difference between the invented traditions of Tulip Time and the Bavarian Festival and the national experiences and characters of the Netherlands, Bavaria, and Austria. The gap left me scornful of those invented American ethnicities for a long time. To be sure, each community remembered the largely rural, pre-industrial 19th-century Netherlands or Franconia, with a great deal left out that was present even then. For example, each neglected to mention that in both the Netherlands and much of Bavaria a significant share of the population was Catholic! (Franconia was historically mixed.) Subsequent to the emigrants' departures, industrialization, the experiences of the wars, and the then-very-present Cold War assured a general atmosphere of willful social amnesia and fear of the past that contrasted very oddly with the happy-go-lucky invented pasts in Frankenmuth or Holland. I suspect that imagined history has returned over there as well, in the form of ultra-right or neo-Nazi movements.

Since I had very little background in evangelicalism, scholarly examination of the Bible was nothing new to me: the textual methods were very similar to those employed by Classicists on "difficult" texts. Early on at Princeton (1977) I just could not fathom the passionate objections to documentary hypotheses about the Hebrew Bible ("Old Testament") and the Gospels. I had little appreciation for the anxiety of many classmates, their habits of proof-texting, or the assumption that Jesus' place and time was just like ours. Hence I had little idea how passionately many would cling to their belief that God could bless only procreating, married heterosexuals. It turned out, over a decade or so, that many alumni/ae of Hope were gay or lesbian -- so many that I once asked one, "Was I really so socially out of touch that I was completely oblivious to your identity?" He responded, "How could I have expected you to know something that even I did not know or acknowledge about myself at that time?"

During the 1980s and 1990s, the horrific illnesses and deaths of numerous gay friends, and the experiences of those who survived, meant that I grew away from the homophobic culture in which I was raised well ahead of the curve. I was also living in the East, in much more cosmopolitan, pluralistic environments. I grew impatient with the endless Presbyterian fights over the ordination of gay and lesbian ministers. I was so done with that. When a person I knew in seminary and truly respected was essentially run out of his parish in California (by vengeful elders of a neighboring Presbytery, not by his own congregation), I called B.S. -- I had had enough. In 1992 I joined the Church of St. Luke in the Fields in New York City (Episcopal) and embraced my identity as a high-church Episcopalian, but one who likes good preaching, competent theological reflection, and tenacious, progressive social outreach. My "elective affinity" ethnicity had long since become Scottish (in large part because of my name), and my Dutch heritage became less important. My understanding of Calvin was completely revised by reading William Bouwsma's John Calvin: A Sixteenth-Century Portrait (1989) during my Ph.D. residency. Bouwsma restored Calvin to a context of other 16th-century writers and humanists such as Erasmus and Montaigne. I found that my previous understanding of Calvin had been just as invented as Tulip Time. When I visited Hope once for an alumni/ae event, I realized that I had grown away from what I never really embraced anyway.

In the same years, Hope's pendulum swung in an extremely conservative direction during the campus pastorate of a certain Ben Patterson (1993-2000), an evangelical hired by President Gordon Van Wylen and tolerated by President John Jacobson. Patterson instituted or encouraged practices -- public confession, confrontations with faculty members, praying outside the residential rooms of gay students for their conversion and correction -- which I regarded as beyond the pale, divisive, and unfaithful. James Kennedy's Can Hope Endure? A Historical Case Study in Christian Higher Education (2005) confirmed my worst fears. Patterson's departure in 2000 did not usher in much change, however. In 2005 the highly respected Miguel de la Torre (since then at Iliff School of Theology, Denver) was forced out of the faculty (how many Hispanic faculty members did they have then, or since?). De la Torre's offense: he wrote a newspaper column satirically condemning James "Focus on the Family" Dobson's "outing" of the animated character SpongeBob SquarePants as gay. (I'm not making this up! Can sponges be gay? Who knew?) Plainly the College could not tolerate any challenge to televangelists and their ilk lest its stream of money from evangelical supporters dry up. (I'm looking at you, DeVos and Van Andel families!) Apparently if Dobson said it, then College President James Bultman believed it, and that settled it. (Always beware of making the former baseball coach your College president!)

Nothing changed. In 2009 Dustin Lance Black was insultingly treated by the same College president and by the (still-serving) Dean of Students Richard Frost, treatment that warranted national press attention. Opponents of this rude nonsense organized a group, Hope Is Ready, but unfortunately it was not. The College's current policy (2011) is riddled with inconsistencies and hypocrisy: "Hope College will not recognize or support campus groups whose aim by statement, practice, or intimation is to promote a vision of human sexuality that is contrary to this understanding of biblical teaching." Further down: "Hope College promotes the indispensable value of intellectual freedom . . . . Hope College affirms the dignity of every person." Obviously this is a bald-faced lie: you can talk about "it" (non-heteronormative sexualities) but do nothing more than talk. Your talk better not "promote a vision." (What does that even mean?) It is as though the College says: We talk the talk of intellectual freedom and personal dignity, but we will not walk the walk. If you talk about this particular subject in a way that is "contrary to . . . biblical teaching," we will shut you up. Apparently being gay is, according to Hope College, contagious. This kind of policy relegates the College to the evangelical reservation: only those who agree need apply, and are wanted; the rest are second-rate. It affirms superficiality and mediocrity as a consequence of narrow-minded, misguided Christian faith. It is unfortunately consistent with Richard Frost referring to "you people" in a semi-clandestine conversation with Dustin Lance Black.

In 2013 James and Deborah Fallows visited Holland as part of their journey through America that they called "American Futures," which resulted in their book Our Towns: A 100,000-Mile Journey Into the Heart of America (2018). Holland was one of the first towns they visited, and they saw much to like: a vibrant, highly functional community with both financial and social capital and a sense of the future quite at odds with our paralyzed and dysfunctional national discourse. They wrote about the many positive aspects of Holland, but about its negative aspects, too. In his final post about Holland, James included a number of "I won't live there" messages, the first of which came from me:

I'm a graduate of Hope College, magna cum laude in [XX subject in the late 1970s]. I know the area well. I have some Dutch ancestry. My sister is [an official] about 30 miles north. I know Holland and western Michigan and Dutch-American culture from the inside.

I grant all the excellent qualities you have written about --hard work, ingenuity, social cohesion, and a sense of an America very different from DC or NYC.

I won't live in Holland, and as my own children [three, ages 15-19] have looked at colleges (or will), I have never suggested my alma mater. My reason: the social narrowness of smug Dutch-American culture. Although there is a very significant Latino population in Holland, it has not successfully challenged Dutch-American Christian Reformed hegemony. That hegemony will allow no compromises.

You alluded to this smugness when you mentioned the failure of the gay rights initiative(s) there. I wouldn't want to raise my children in this atmosphere, and I don't want my children going to college in it. The hateful things that were said during that discussion give evidence of the smugness of that culture.

I live in Connecticut now (outside New Haven), and there's a lot wrong with CT. But we experience far more cultural, religious, and racial diversity here. It's not perfect, but we're working on it.

Holland has many fine qualities. But it's suffocating for many people, including me. Do mention the numerous people from Holland, and Western Michigan, who have fled the cultural suffocation.

Later in the same post James Fallows summed up Hope College pretty accurately (and with more than a touch of snark):

Hope College, once considered a "Harvard of the Midwest," now aspires to be a middlebrow Christian college. Babbitt lives! A pharisaical pedagogy prevails ("Thank God, we are not as others!")

James Bultman, Richard Frost, and Hope College trustees: I'm looking at you.  In 2017 Bultman's successor, John C. Knapp, resigned a year after nearly being forced out, by most accounts because he wanted to move the College to a more mainstream, inclusive position, again warranting negative national attention.

In 2016, the 40th anniversary of my graduation from Hope College came without my even remembering it. I subsequently received an unsolicited note from a Hope development officer, and I responded:

The dust-up about Lance Black was truly the end of me and Hope College, then. Living in CT, a state where the legislature passed marriage equality, a judgement that was sustained by popular referendum in 2008, the whole “gay” controversy is just so over, and marriage equality is an established fact on the ground here, and was in 2009. Amazing to say, the sky has not fallen in, western civilization did not come to an end here (necessarily more than it has anywhere else in the age of Trump); I don’t notice that personal morality has improved (or declined) since 2008. But candor has improved, and that can’t be a bad thing. Good friends who have been partners for decades —longer than many so-called “straight” couples— have become legally equal to my own marriage relationship, and I can’t see what’s wrong here.

Perhaps this is an overly confessional letter, because you wrote to me at the end of Lent, a good time to attempt greater self-awareness. I just don’t think about Hope College or my past relationship with it very much; it doesn’t feel relevant to much in my day. Our three children have each found their way through the college application process, and I never considered recommending Hope College to them — I just think they would find it too “other.” My younger son is a finalist for a merit scholarship at DePauw University School of Music (vocal performance), but living in Greencastle may be a stretch for him. [In May 2018 he finished his sophomore year there, and is committed to staying to complete his degree.] He calls it the middle of nowhere, but I’ve let him know that nowhere is somewhere other than central Indiana —I’ve seen the middle of nowhere, and it’s called Houghton, Michigan. DePauw’s “look and feel” is much more emotionally and religiously accessible than is Hope's, and since he has both profound faith questions as well as long-time gay friends (though he is straight), I just didn’t see him at Hope.

Since I’m not wealthy —I’m director of an academic library— and not the profile of the usual Hope alumnus, I really don’t think I have very much to offer your College. I do wish Hope College well. My own acquaintance with the Reformed tradition at Princeton Seminary led me to understand it as very open to the world, to the new findings of the humanities and sciences, and not afraid of the truth. I suspect that colleges of any theological stripe which regard themselves as the Fortresses of Faith will have a very tough go of it in the coming decades. If Hope College were a good deal more open, and more willing to defy previously-articulated evangelical orthodoxies, it could really have something very positive to offer American higher education. Lord knows that higher education (and especially private higher education) as a sector is in deep trouble.

That note, and this blog post, say what I have to say.

Michael Douma's book was really helpful to me. I can now see, in the course of my own family background, how genuine Dutch identity in the Netherlands changed as it did from the 19th century to the modern, very liberal state. I can see how Dutch Americans evolved their own historical tradition that is almost a caricature of the Dutch and really has nothing to do with them -- just as Frankenmuth Bavarian identity has almost nothing to do with contemporary Bavaria and Franconia. That Hope College chose to double down on previous mistakes and become a defensive denizen of the shrinking evangelical academic reservation is a consequence of the "invented narrative" of Dutch American culture, shop-worn and sad. The accelerating withdrawal of younger "New Millennials" from organized religion of every stripe bodes ill for a College that values a defensive orthodoxy over liberating pedagogies.

It's almost July, and I remember how amazingly beautiful West Michigan can be this time of year, especially near the Lake. Shelly and I will visit my sister in Muskegon, and our younger son at Blue Lake Fine Arts Camp. Grand Rapids has changed profoundly: for example, the city adopted LGBTQ anti-discrimination ordinances in 1994, East Grand Rapids in 2015; Holland has yet to do so. The Grand Rapids arts community thrives, as do numerous ethnic communities. There is much to like and much more ahead than behind.

I regret that Hope College chose the path that it has (Babbitt lives!). I have little to do with it, or about it. My own life has gone on elsewhere, and for that Deo gratias!

I am indebted to John Fea for pointing out Michael J. Douma's How Dutch Americans Stayed Dutch: An Historical Perspective on Ethnic Minorities (Amsterdam University Press, 2014, 978-90-8964-645-3). Douma's book has been a delight, enlightening and useful for my continuing question -- "how can I teach about evangelicalism to students who have almost no awareness of it?" -- without becoming either divisive or pedantic. Americans of Dutch heritage are no more uniformly evangelical than any other group, but Douma's insights provide clues to the challenge of teaching about other people's arguments to those who don't already know or care about them.

Fea pointed out Douma's book and Douma's response to a misleading article in The Economist that sets out a complex reality in simplistic, bite-sized terms appropriate to that magazine's readers. The pretext was remarks by U.S. Ambassador Pete Hoekstra, and the sagas of Betsy DeVos and Erik Prince, all recognizably Dutch-American conservatives of a certain positivist stripe. Like many American academics, in past months I have winced at the antics, pratfalls, and utter cluelessness of Betsy DeVos, incumbent Secretary of Education. Anyone who knows West Michigan (and Holland, Michigan in particular) will know the name well, from the DeVos Field House at Hope College and the endless genuflection towards the Amway Corporation, alleged to be a barely legalized, cult-like pyramid scheme. A member of the Van Andel family (DeVos relations) has established rules restricting access to Holland State Park's Big Red Lighthouse appropriate to a medieval lord of the manor (photo above); Erik Prince (Betsy's brother) remains a person of interest to Robert Mueller's investigation. Anyone long familiar with the Dutch American pale of settlement in West Michigan might roll their eyes.

To West Michigan, Dutch American culture, I am an outsider with one foot inside that small tent. One quarter of my personal ancestry is Dutch (through my maternal grandmother, Fries to be exact), and my mother lived decades as a Dutch American expatriate in distant, foreign parts -- those of industrial eastern Michigan. (Her ashes are fittingly interred in Grand Rapids.) I earned my bachelor's degree at Hope College, but only after three years at Michigan State in heathen East Lansing. So I could have been an insider, but chose otherwise. (I will say more in a subsequent post.)

Douma's eminently readable book, accessible public history well-informed by theoretical, scholarly insights, presents Dutch American ethnicity as an evolving set of internal disagreements about how to cope with an external human and natural environment very different from the particular, original locations in the small country from which the ancestors emigrated. He limits his investigation to the 19th and 20th-century Dutch immigration to the Middle West, which was only tangentially related to 17th- and 18th-century Dutch American immigration to New York and New Jersey; he also leaves aside Dutch "Indos" from Indonesia.

Location, location: the emigrants came from pre-industrial villages and small cities in Groningen, Friesland, Utrecht, and Overijssel that were transformed by industrialization and modern transportation shortly after their departure in the 19th century. They arrived in differing areas of the Middle West: West Michigan, the plains of Iowa (Pella, Orange City), burgeoning Chicago (South Holland), and scattered settlements throughout Wisconsin.

The emigrants' descendants experienced varying personal and community outcomes in urban, small city, and rural locations. Dutch-American immigrant identity largely evaporated by the 1920s in many locations except two areas with a critical mass of shared ancestry: the West Michigan axis of Holland and Grand Rapids, and the towns of Pella, Iowa (southeast of Des Moines) and Orange City (northwest Iowa). Three of those areas were anchored by colleges associated with the Reformed Church in America (RCA): Hope College (Holland, Michigan), Central College (Pella, Iowa), and Northwestern College (Orange City, Iowa). Grand Rapids became the Mecca of the Christian Reformed Church (CRC), and home of Calvin College and Theological Seminary (to become Calvin University in 2020).

The educational institutions are an important hook: Dutch Americans were justly famous for their work ethic and religious commitment. As my mother said, "God made the world, but the Dutch made Holland," referring to the dikes, sea-walls, and canals of the Netherlands, and intending the remark to mean, "therefore, get to work." Dutch Protestant Christianity of the Reformed tradition carried all the marks of Calvin's humanist character: based on texts (the Bible above all), theological reflection, and leaning towards pietism in a rather learned, cerebral manner. The revivalist enthusiasms of late 19th-century America were alien to Dutch temperaments, and Dutch Americans became evangelical only as those immigrant tendencies passed. Originally birthed in the Afscheiding (secession) of orthodox, traditional Dutch Calvinists from the Netherlands state Protestant church (Nederlandse Hervormde Kerk) in 1834, the secessionists in America fell out amongst themselves in 1857 over the Dutch immigrants' incorporation into the American Reformed Protestant Dutch Church (now RCA), which some considered to be entirely too worldly, lax, and American.

Consequently: the Dutch colleges became involved in Reformed disputes (Hope, founded in 1851 and chartered in 1866, tied to the RCA; Calvin, founded in 1876, to the CRC; the RCA founded Northwestern College in 1882, and took control of Central College, founded 1853, in 1916). Consequently (also), Dutch Protestant religion took on a disputatious character that both nurtured and was fed by intellectual argument. Consequently (also), Dutch Americans became over-represented in skilled trades, the professions, and the sciences. West Michigan, which lacked major extractable natural resources and depended upon manufacturing and trade (with its access to the Great Lakes), owed much of its economic development to skilled labor and the manufacture of furniture, building materials (such as bricks), and pharmaceuticals.

Douma's book lends some weight to a view that Dutch American cultural and economic impact was not hindered but furthered by intra-Dutch immigrant debates and rivalries. In West Michigan cities the narcissism of small differences between the RCA and CRC correlated with a range of economic and cultural positions and produced varying responses to and acceptance of mainstream Anglo-American culture (regarding organizations such as the Freemasons, for example). Southern Michigan, originally part (with northern Ohio and Indiana) of the Midwestern "third New England," was by the mid-19th century long habituated to Yankee habits of thrift and cultural positions such as Abolitionism; the Dutch immigrants were both similar to and different from the Yankees, as well as from the numerous other ethnic minorities present (especially Eastern European). Dutch Americans were at first outsiders to the fraught American conflicts that foreshadowed the Civil War, and a number of young Dutch American men absorbed "American" habits and dispositions through war-time military service.

Dutch American rivalries extended a discourse that unintentionally preserved or prolonged Dutch American identity in those areas of Michigan and Iowa that held a critical mass of Dutch descendants. In time these descendants remembered not Dutch culture so much as the culture of their grandparents or great-grandparents: "Tulip Time" in Holland, Michigan (an ethnic festival in May) harks back not so much to the Netherlands as to memories of an idealized Netherlands in the minds of the early immigrants. Dutch American identity has by now evaporated or turned into genealogical interests with a barely religious overlay. The institutions of the CRC, RCA, and the colleges have moved on to other identities and evolving missions.

What does this tell me about teaching American evangelicalism to secular or minimally Catholic undergraduates who don't have (or sometimes want) a clue? It reminds me that cultural identities are always works in progress, evolving in changing circumstances, and apt to idealize their own pasts. Their disputes, far from weakening them (unless they become too divisive), in fact strengthen them by giving the participants something to really care about. Whether many Evangelicals' current, nearly cultic devotion to the Chief Executive will in fact divide them from their recent compatriots (those Evangelicals who did not support him) remains to be seen: how divisive will that dispute become? Douma's book also reminds me of the way that religious commitment can be felt as nearly an ethnic identity, and thoroughly entangled with multiple, sometimes conflicting other commitments.

Sometimes it can also really help when a professor includes a sufficient (but not overpowering) testimony: "here's why this subject really matters to me." I find that students often respond to genuine commitment: this is important because it expresses something close to my heart. (I have seen, believe it or not, the teaching of accounting standards enlivened in this manner.) My subsequent post will tell a bit more of my own story.


Part of my work since 2009 has been teaching topics in American religion to undergraduates. Since my scholarly training focused on Christianity, most of the class concerned Protestants and Catholics in American history and culture. Most of the students lacked any real working knowledge of any religious community, even if they were graduates of Catholic schools (a small minority). The course met a distribution requirement, and with vanishingly few religion majors I kept a broad focus. Given my students' effective religious illiteracy, things went reasonably well. (I do not intend to exclude any American religion, but I do want to stick to my competencies.)

In teaching about evangelicalism, I hit a concrete wall. My students have assumed that Evangelicals by definition have been always and only conservative Republicans. They might feel some sympathy, I have learned, with a few conservative Evangelical viewpoints, especially amongst the males (immigration; economics; and the racial subtexts). But for the vast majority of my New England small-c catholic and capital-C Catholic students, Evangelicals are a strange tribe: inexplicable in all their ways, potentially hostile to Catholics and Northeasterners in general, and motivated by ineluctable commitments. Neither conservative Republicans nor high-profile Evangelicals are highly visible on the regional Tri-State, southern New England cultural spectrum. As one student wrote, "Evangelicalism: not for me." I am hardly trying to turn them into Evangelicals (I made that abundantly clear, and they heard me), but I had hoped to shed a little light on Evangelical history and culture in hopes of building some respect for this particular "other." I needed help.

I ran across John Fea's blog The Way of Improvement Leads Home after reading several chapters of his book by the same title; Fea's blog is a genuine assistance to those of us who would like to understand Evangelicals better but have no interest in becoming Evangelicals ourselves. His new book Believe Me: The Evangelical Road to Donald Trump (please order from Eerdmans, not Amazon) tells a story from inside Evangelicalism to those Evangelicals who did not vote for Trump, and to the rest of us. Fea attended Trinity International University and teaches at Messiah College (Pa.); he earned his Ph.D. from SUNY/Stony Brook, so he also has commitments to scholarship off the Evangelical academic reservation. Thanks to John, I also began to read Frances Fitzgerald's The Evangelicals: The Struggle to Shape America (a Pulitzer Prize winner) and Robert Jones' The End of White Christian America. I returned to Mark Noll's landmark The Scandal of the Evangelical Mind (1994) as well as George Marsden's landmark Fundamentalism and American Culture (2nd edition, 2006).

In 2017 Matthew Mayhew (Education, Ohio State) et al. published "Expanding Perspectives on Evangelicalism: How Non-evangelical Students Appreciate Evangelical Christianity" (Review of Religious Research 59 (2017): 207-230, DOI 10.1007/s13644-017-0283-8), a survey-based social science project. The investigation revealed distinct differences in students' attitudes towards their evangelical peers related to demographics, institution type, and academic major. Students who self-identified as having religious experience (or identity) were apt to be somewhat more sympathetic to Evangelical students, who might well feel ostracized or devalued in more secular academia. "How do we encourage appreciation of a worldview as polarizing as the one the evangelical narrative represents?" (p. 225) When does a challenging or provocative Evangelical viewpoint become perceived as divisive or hostile? This is an eye of a needle hard to pass through.

This challenge is particularly trying where no Evangelical students are present. I have found an analogy when trying to teach about the fervor of 19th-century Prohibitionists: most students will recognize the problems of alcohol abuse and alcoholism, but advocates for Prohibition simply no longer exist. Students might well respond to the challenges (or provocations) of "hot-button" issues such as abortion rights, LGBTQ rights (and cake-bakers, florists, et al.), and immigrants with or without documents --but lack any awareness of Evangelical resonance. I have had one earnest student say, "I don't believe in evolution because I'm Catholic," and had to point out to her that she may unawares have absorbed an oft-held Evangelical viewpoint, but that her refusal cannot rest on specifically Catholic grounds, at least according to the Pope (then Benedict XVI). I must also reflect that my African American and Hispanic students often will reveal greater awareness of Evangelicalism than white students.

I return to the question: how does one teach about those who regard their faith as primary to those who are unaware of why any faith might be primary? (Granted, the former category can include a great deal of wishful thinking, rationalization, and even fear and hypocrisy when things go wrong: read Believe Me.) Years ago I encountered a similar wall when trying to teach about Dietrich Bonhoeffer and why he chose to participate, however tangentially, in the July 1944 plot against Hitler. One Muhlenberg College student candidly observed, "We don't understand anything about sacrifice because we have never been asked to sacrifice anything." The gulf is one of imagination more than of thinking, or the ability to think. (I am by no means assuming that Bonhoeffer was or would be Evangelical in contemporary North American usages of the word; Eric Metaxas' book has been justly condemned as poorly sourced and even more poorly written, and I decline to link to it.)

In response, I have to cast back to my own limited experience of something bordering Evangelical America, both at Princeton Theological Seminary and at Hope College (in my next post). Personal experience may be a last resort --I am at my last resort.

Mouse Books give easy access to classic texts in a new format --especially essays or stories that often are not commercially viable on their own. The Mouse Books project wants to offer readers more ideas, insight, and connections for their lives.

The digital era was supposed to make books and lengthy reading obsolete: Larry Sanger (co-founder of Wikipedia, originator of citizendium.org and WatchKnowLearn.org) memorably critiqued its faulty assumptions in 2010 in Individual Knowledge in the Internet Age (here as .pdf; see also my posts here and here). "Boring old books" played a part. Clay Shirky of NYU wrote that the literary world is "now losing its normative hold" on our culture: "no one reads War and Peace. It's too long, and not so interesting. . . . This observation is no less sacrilegious for being true." Ah, the satisfying thunk of a smashed idol. Goodbye, long, boring, not so interesting books.

Except that a funny thing has happened on the way to the book burning (danke schoen, Herr Goebbels). Printed books have somehow held on: unit sales of print books were up 1.9% in 2016, at 687.2 million world-wide, the fourth straight year of print growth. Rumors of their demise now seem premature. What gives?

The print book is far more subtly crafted than many digital soothsayers realize. Printed books have evolved continuously since Gutenberg: just take a look at scholarly monographs from 1930, 1950, 1970, 1990, and 2010. The current printed book, whether popular, trade, high-concept, or scholarly monograph, is a highly-designed and highly-evolved object. Publishers are very alert to readers' desires and to what seems to work best. It was hubris to think that a lazily conceived and hastily devised digital book format could simply replace a printed book with an equally useful object: look at the evolution of the epub format (for example).

Designers always refer to what has been designed previously, as well as to new and present needs and uses, when designing an object: consider the humble door. Poorly done e-books were a product of the "move fast and break things" culture, which doomed many ideas that called for thinking deeper than the one-sided imaginations of bro-grammer digital denizens.

Enter Mouse Books. Some months ago David Dewane was riding the bus in Chicago. "[I] happened to be reading a physical book that was a piece of classic literature. I wondered what all the other people on the bus were reading." He wondered, why don't those people read those authors on their smart phones? "I wondered if you made the book small enough—like a passport or a smart notebook—if you could carry it around with you anywhere."

David and close friends began to experiment, and eventually designed printed books the size and thickness of a mobile phone. They chose classic works available in the public domain, either complete essays (Thoreau's On the Duty of Civil Disobedience) or chapters (Chapters 4 and 5 of The Brothers Karamazov, "The Grand Inquisitor," in Constance Garnett's translation). These are simply, legibly printed in 11-point Bookman Old Style. Each book or booklet is staple bound ("double stitched") with a sturdy paper cover, 40-50 pages, 3 1/2 by 5 1/2 inches or about 9 by 14 cm --a very high quality, small product.

David and the Mouse Team (Disney copyright forbids calling them Mouseketeers) aim for ordinary users of mobile phones. They want to provide a serious text that can be worn each day "on your body" in a pocket, purse, or bag, and that gives a choice between pulling out the phone or something more intellectually and emotionally stimulating. Mouse Books give easy access to classic texts in a new format --especially essays or stories that often are not commercially viable on their own (such as Melville's Bartleby the Scrivener, or Thoreau's essay, which are invariably packaged with other texts in a binding that will bring sufficient volume and profit to market). The Mouse Books project wants to offer readers more ideas, insight, and connections for their lives.

As a business, Mouse Books is still experimental, and has sought "early adopters": willing co-experimentalists and subjects. This means experimenting with the practice of reading, with classic texts of proven high quality, and complementing the texts with audio content, podcasts, and a social media presence. These supplements are also intended to be mobile --handy nearly anywhere you could wear ear buds.

As a start-up or experiment, Mouse Books has stumbled from time to time in making clear what a subscriber would get for funding the project on Kickstarter, what the subscription levels are, and the differences between US and outside-the-US subscriptions. The subscription levels on the Mouse Books drip (or d.rip) site do not match the subscription options offered directly on the Mouse Books Club web site. For a small "virtual company," this kind of confusion goes with the territory --part of what "early adopters" come to expect. That said, Mouse Books is also approaching sufficient scale that marketing clarity will be important for the project to prosper.

This is a charming start-up that deserves support, and it is highly consonant with the mission of librarians: to connect with others both living and dead, to build insight, to generate ideas. The printed book and those associated with it --bookstores, libraries, editors, writers, readers, thinkers-- are stronger with innovative experiments such as Mouse Books. The printed book continues to evolve, and remains a surprisingly resilient, re-emergent legacy technology.

More about Mouse Books:

Web site: https://mousebookclub.com/collections/mouse-books-catalog

drip site (blog entries): https://d.rip/mouse-books?

Video: