
To West Michigan's Dutch American culture, I am an outsider with one foot inside that small tent. I don't live there anymore.

To West Michigan's Dutch American culture, I am an outsider with one foot inside that small tent. As a child I was always aware that to my mother there was a qualitative difference between "here" (meaning Saginaw, Michigan) and "there" (meaning Grand Rapids). She spent most of her growing-up years in a house on Calvin Street in eastern Grand Rapids, just down the street from the then site of Calvin College. Her Dutch American relatives, six aunts and numerous others, lived around the area; in the summers we drove to Newaygo to attend a summer church camp run by her home church, Westminster Presbyterian in Grand Rapids (where her ashes are now interred).

This Dutch American background (such as it was) became more vivid to me in the two years I spent at Hope College, 1974-1976. I earned my degree there after three years at Michigan State (one year in a music program there that gave me practically no transferable credits). I came to Hope as an outsider with some sense of how things worked in that community, but I had been formed by highly negative experiences in a mediocre public high school, and then three years (1971-1974) at completely secular Michigan State. There I studied in an "alternative" residential liberal arts college, Justin Morrill College, which was closed in 1979. I liked Hope's far greater structure, but I never took it as the definition of a liberal arts college. I knew there were other options, some of them very good.

I was completely unprepared for what Hope meant by Christian college at that time, since I was really interested in Classics (Greek, Latin, history, philosophy) and German, and kept a low profile in almost everything else except organ performance. My academic experience there was intense and demanding (I was the only Classics major and had an Oxford-like experience of fast-paced tutorials), and it formed in me the habits that Princeton would nurture to maturity. It prepared me for the intensity of graduate studies in a major program in the history of Christianity (Princeton Theological Seminary), without which I would have been lost. That Calvin wrote in Latin (complex, literate, humanistic Latin) was no news to me. (My dissertation work was on the Carolingians, thankfully far removed from the obsessions of the Reformed.) The Christian emphasis, however, was at first a puzzling add-on, even as I was nurturing a desire to study the history of Christianity very deeply.

Hope's cultural pendulum at that point swung "liberal" (thanks to the then-recently-departed President Calvin Vander Werf), so I largely ignored the cultural evangelicalism of many of the students around me. It was a comfortable, even snug world, but it was never really my world; I would not have stayed there had that been possible. My academic work was done. I freely admit that I was in Hope, but not really of it. I joined the German Club; otherwise I was what at Princeton they called "a grind."

After Hope I spent a year as a Fulbright Commission English teaching assistant in Vienna (arranged by a powerful Hope professor with many ties there, Paul Fried), and then took up M.Div. studies at Princeton Theological Seminary. Princeton in turn left me with an enduring respect for serious, top-flight scholarship, for tough writers such as Kierkegaard, Barth, and Bonhoeffer, and with a far more global sense that both the "Reformed" and "Anglican" worlds were much broader and more diverse than my experience in Holland, Michigan had suggested. Eventually I became a librarian (another story), and ten years later returned to Princeton for doctoral study. The encounters I had with Evangelical doctoral students in the Ph.D. seminars were frustrating (with one exception), because their preparation was often so superficial, even glib. (Even that one exception had previously left the Assemblies of God for the Lutherans, ELCA.) While at Hope I had become involved in an Episcopal Church, and that involvement left me with enduring liturgical preferences that eventually made my sojourn in the Presbyterian Church untenable.

During all this time I grew up in the cultural orbit (a neighboring township) of Frankenmuth, Michigan, which offers a curious, contrasting parallel to the Dutch Americans of West Michigan. Frankenmuth ("courage of the Franks") was settled (1845) by immigrants from Rosstal, Franconia (Bavaria), sent by Pastor Johann Konrad Wilhelm Löhe just two years before A.J. van Raalte led his group to the shores of Lake Macatawa in West Michigan. The settlers of Frankenmuth in time wound up in the very conservative Missouri Synod and worshiped in German well into the 20th century, longer than most equally conservative Christian Reformed congregations kept worshiping in Dutch. The doctrinal rigidity of the two groups is formally similar, and each has regarded itself as "the true church," to the obvious exclusion of the other tradition (never mind everyone else). Growing up I was a heathen Congregationalist, so I was (or would have been) beyond the pale of respectability in both groups: an outsider, with one foot inside the tent.

These German-American Lutheran families were emotionally convinced, I am sure, that Jesus spoke Martin Luther's German via God's true Bible, just as many Dutch in West Michigan would have assumed that Calvin spoke and wrote in Dutch, and correctly conveyed Jesus' teachings in Dutch (of course! --he could not have been French!). To this day when I am confronted by passionate attachment to the 1611 Authorized English Bible, or the 1662 (or 1928) Book of Common Prayer, I can only smile: I have been here with others in other languages. Linguistic fundamentalists are everywhere, I suppose. Both groups used theology and language as shields against encroaching "American" ideas in rising generations --a losing fight, to be sure.


Douma's book (How Dutch Americans Stayed Dutch) delineates the manner in which Dutch Americans created and marketed new traditions through the development of Tulip Time in Holland, Michigan. Tulip Time marked not how much they remembered about the Netherlands, but how much they had forgotten. He specifically refers to Eric Hobsbawm's "invention of tradition": cultural practices or traditions may not be genuinely historic but are adapted or invented to serve ideological ends. Werner Sollors (The Invention of Ethnicity) extended this to ethnic traditions, and Douma in turn extends it to Tulip Time in particular. Tulip Time established a new channel of Dutch American ethnic identity, a modern re-interpretation of an actual 19th-century Dutch identity which by the 1930s was passing or had passed away. By 1975 (my only direct experience of Tulip Time), it had become an unintentional but devastating caricature of Dutch Americans themselves, quite apart from anything really related to the Netherlands. It was very precious. It re-interpreted ethnicity in service to an ideology of the market.

I watched (though unawares as a child) this same invention unfold in Frankenmuth, mutatis mutandis. In 1959 William "Tiny" Zehnder Jr. and Dorothy Zehnder organized a Bavarian Folk Festival to inaugurate major additions and renovations to the old Fischer's Hotel on Main Street. (I remember it!) The Bavarian Inn sat opposite Zehnder's restaurant (another repurposed former hotel), which had been operated since 1927 by William Zehnder, Sr., and then by Tiny's brothers. The original festival (1959) was a success, and the community organized a Civic Events Council to oversee its annual continuation. From its beginning, the Bavarian Festival was an invented tradition, one marked by usually polite sibling and community rivalries. For many years the Festival was a major "all hands" event in a small town, and a major source of social and financial capital. As the residents' ability to volunteer decreased, with homemakers' return to the work force and employment that did not allow so much time off, the Festival gently downsized: it now runs four days rather than a week, and is under the control of commercial entities more than of volunteer community organizations.

Other than commerce, why did the Festival endure? Its continuation was possible because of the unusual, cohesive character of the town, where civic, business, church, and school authorities all knew each other their whole lives. It expressed a positive way forward with a German American identity in a town that still felt it. German Americans, unlike Dutch Americans, had to negotiate the realities of being related to the enemy in two world wars, an enemy who committed the Holocaust and in defeat endured a bitterly divided homeland (1949-1989). German Americans sought to be a model all-American minority because other Americans (especially in 1914-1918) had been none too sure about them. In the 19th century, earlier German American celebrations originated in the overlapping circles of workplace, Arbeiterverein (workers' clubs), churches, and civic organizations. That network had largely passed by the turn of the 20th century (the Arbeiterverein were sometimes suspected of socialism!). With well-known German American celebrations in Wisconsin and Chicago as both example and warning, Frankenmuth's Bavarian Festival --entirely unrelated to any of those earlier celebrations-- allowed ethnic reclamation by using the word Bavarian rather than German. In the 1960s and 1970s hardly a (West) German flag was to be found: all the flags were the lozenge-patterned blue and white Bavarian flag. I worked as a waiter in the Bavarian Inn in the summers of 1972-1974 and 1976, putting on the slight lilt of Frankenmuth English to complement the hokey costume.

"Historic Frankenmuth" is made-up history at its finest, an imagined narrative in service to an ideology of the market. The town looks like a theme park mashed up with a wedding venue and a fudge shop. Holland, by contrast, is a larger small city with more to do than just tourism; the Dutch kitsch is comparatively restricted to Windmill Island and a few other locations. They are, each in their way, sui generis appropriations of fading ethnic consciousness.

When I lived in Europe, I immediately sensed the profound difference between the invented traditions of Tulip Time and the Bavarian Festival and the national experiences and characters of the Netherlands, Bavaria, and Austria. The gap left me scornful of those invented American ethnicities for a long time. To be sure, each community remembered the largely rural, pre-industrial 19th-century Netherlands or Franconia, with a great deal left out that was present even then. For example, each neglected to mention that in both the Netherlands and much of Bavaria a significant part of the population was Catholic! (Franconia was historically mixed.) Subsequent to the emigrants' departures, industrialization and the experiences of the wars and the then-very-present Cold War assured a general atmosphere of willful social amnesia and fear of the past that contrasted very oddly with the happy-go-lucky invented pasts in Frankenmuth or Holland. I suspect that imagined history has returned over there as well, in the form of ultra-right or neo-Nazi movements.

Since I had very little background in evangelicalism, scholarly examination of the Bible was nothing new to me: the textual methods were very similar to those employed by Classicists on "difficult" texts. Early on at Princeton (1977) I just could not fathom the passionate objections to documentary hypotheses about the Hebrew Bible ("Old Testament") and the Gospels. I had little appreciation for the anxiety of many classmates, their habits of proof-texting, or the assumption that Jesus' place and time was just like ours. Hence I had little idea how passionately many would cling to their belief that God could bless only procreating, married heterosexuals. It turned out, over a decade or so, that many alumni/ae of Hope were gay or lesbian --so many that once I asked one, "Was I really so socially out of touch that I was completely oblivious to your identity?" He responded, "How could I have expected you to know something that even I did not know or acknowledge about myself at that time?"

During the 1980s and 1990s the horrific illnesses and deaths of numerous gay friends, and the experiences of those who survived, meant that I grew away from the homophobic culture in which I was raised --ahead of the curve. I was also living in the East, in much more cosmopolitan, pluralistic environments. I grew impatient with the endless Presbyterian fights over the ordination of gay and lesbian ministers. I was so done with that. When a person I knew in seminary and truly respected was essentially run out of his parish in California (by vengeful elders of a neighboring Presbytery, not by his own congregation), I called B.S. --I had had enough. In 1992 I joined the Church of St. Luke in the Fields in New York City (Episcopal) and embraced my identity as a high-church Episcopalian, but one who likes good preaching, competent theological reflection, and tenacious, progressive social outreach. My "elective affinity" ethnicity had long since become Scottish (in large part because of my name), and my Dutch heritage became less important. My understanding of Calvin was completely revised by reading William Bouwsma's John Calvin: A Sixteenth-Century Portrait (1989) during my Ph.D. residency. Bouwsma restored Calvin to a context of other 16th-century writers and humanists such as Erasmus and Montaigne. I found that my previous understanding of Calvin had been just as invented as Tulip Time. When I visited Hope once for an alumni/ae event, I realized that I had grown away from what I never really embraced anyway.

In the same years, Hope's pendulum swung in an extremely conservative direction during the campus pastorate of a certain Ben Patterson (1993-2000), an evangelical hired by Gordon Van Wylen and tolerated by John Jacobson (presidents). Patterson instituted or encouraged practices --such as public confession, confrontations with faculty members, praying outside the residential rooms of gay students for their conversion and correction-- which I regarded as beyond the pale, divisive, and unfaithful. James Kennedy's Can Hope Endure? A Historical Case Study in Christian Higher Education (2005) confirmed my worst fears. Patterson's departure in 2000 did not usher in much change, however. In 2005 the highly respected Miguel de la Torre (since then at Iliff School of Theology, Denver) was forced out of the faculty (how many Hispanic faculty members did they have then or since?). De la Torre's offense: he wrote a newspaper column satirically condemning James "Focus on the Family" Dobson's "outing" of the animated character SpongeBob SquarePants as gay. (I'm not making this up! --can sponges be gay? Who knew?) Plainly the College could not tolerate any challenge to televangelists and their ilk lest its stream of money from evangelical supporters dry up. (I'm looking at you, DeVos and Van Andel families!) Apparently if Dobson said it, then College President James Bultman believed it, and that settled it. (Always beware of making the former baseball coach your College president!)

Nothing changed. In 2009 Dustin Lance Black was insultingly treated by the same College president and the (still) Dean of Students, Richard Frost, treatment that warranted national press attention. Opponents of this rude nonsense organized a group, Hope Is Ready, but unfortunately it was not. The College's current policy (2011) is riddled with inconsistencies and hypocrisy: "Hope College will not recognize or support campus groups whose aim by statement, practice, or intimation is to promote a vision of human sexuality that is contrary to this understanding of biblical teaching." Further down: "Hope College promotes the indispensable value of intellectual freedom . . . . Hope College affirms the dignity of every person." Obviously this is untrue, a bald-faced lie: you can talk about "it" (non-heteronormative sexualities) but do nothing more than talk. Your talk had better not "promote a vision." (What does that even mean?) It is as though the College says: We talk the talk of intellectual freedom and personal dignity, but we will not walk the walk. If you talk about this particular subject in a way that is "contrary to . . . biblical teaching," we will shut you up. Apparently being gay is, according to Hope College, contagious. This kind of policy relegates the College to the evangelical reservation: only those who agree need apply, and are wanted; the rest are second-rate. It affirms superficiality and mediocrity as a consequence of narrow-minded, misguided Christian faith. It is unfortunately consistent with Richard Frost referring to "you people" in a semi-clandestine conversation with Dustin Lance Black.

In 2013 James and Deborah Fallows visited Holland as part of their journey through America that they called "American Futures," a journey that resulted in their book Our Towns: A 100,000-Mile Journey Into the Heart of America (2018). Holland was one of the first towns they visited, and they saw much to like: a vibrant, highly functional community with both financial and social capital and a sense of the future quite at odds with our paralyzed and dysfunctional national discourse. They wrote about the many positive aspects of Holland, but about its negative aspects, too. In his final post about Holland, James included a number of 'I won't live there' messages, the first of which came from me:

I'm a graduate of Hope College, magna cum laude in [XX subject in the late 1970s]. I know the area well. I have some Dutch ancestry. My sister is [an official] about 30 miles north. I know Holland and western Michigan and Dutch-American culture from the inside.

I grant all the excellent qualities you have written about --hard work, ingenuity, social cohesion, and a sense of an America very different from DC or NYC.

I won't live in Holland, and when my own children [three ages 15-19] have looked at colleges (or will), I never suggested my alma mater. My reason: the social narrowness of smug Dutch-American culture. Although there is a very significant Latino population in Holland, it has not successfully challenged Dutch-American Christian Reformed hegemony. That hegemony will allow no compromises.

You alluded to this smugness when you mentioned the failure of the gay rights initiative(s) there. I wouldn't want to raise my children in this atmosphere, and I don't want my children going to college in it. The hateful things that were said during that discussion give evidence of the smugness of that culture.

I live in Connecticut now (outside New Haven), and there's a lot wrong with CT. But we experience far more cultural, religious, and racial diversity here. It's not perfect, but we're working on it.

Holland has many fine qualities. But it's suffocating for many people, including me. Do mention the numerous people from Holland, and Western Michigan, who have fled the cultural suffocation.

Later in the same post James Fallows summed up Hope College pretty accurately (and with more than a touch of snark):

Hope College, once considered a "Harvard of the Midwest," now aspires to be a middlebrow Christian college. Babbitt lives! A pharisaical pedagogy prevails ("Thank God, we are not as others!")

James Bultman, Richard Frost, and Hope College trustees: I'm looking at you.  In 2017 Bultman's successor, John C. Knapp, resigned a year after nearly being forced out, by most accounts because he wanted to move the College to a more mainstream, inclusive position, again warranting negative national attention.

In 2016, the 40th anniversary of my graduation from Hope College came without my even remembering it. I received an unsolicited note subsequently from a Hope development officer, and I responded:

The dust-up about Lance Black was truly the end of me and Hope College, then. Living in CT, a state where the legislature passed marriage equality, a judgement that was sustained by popular referendum in 2008, the whole “gay” controversy is just so over, and marriage equality is an established fact on the ground here, and was in 2009. Amazing to say, the sky has not fallen in, western civilization did not come to an end here (necessarily more than it has anywhere else in the age of Trump); I don’t notice that personal morality has improved (or declined) since 2008. But candor has improved, and that can’t be a bad thing. Good friends who have been partners for decades —longer than many so-called “straight” couples— have become legally equal to my own marriage relationship, and I can’t see what’s wrong here.

Perhaps this is an overly confessional letter, because you wrote to me at the end of Lent, a good time to attempt greater self-awareness. I just don’t think about Hope College or my past relationship with it very much; it doesn’t feel relevant to much in my day. Our three children have each found their way through the college application process, and I never considered recommending Hope College to them — I just think they would find it too “other.” My younger son is a finalist for a merit scholarship at DePauw University School of Music (vocal performance), but living in Greencastle may be a stretch for him. [In May 2018 he finished his sophomore year there, and is committed to staying to complete his degree.] He calls it the middle of nowhere, but I’ve let him know that nowhere is somewhere other than central Indiana —I’ve seen the middle of nowhere, and it’s called Houghton, Michigan. DePauw’s “look and feel” is much more emotionally and religiously accessible than is Hope's, and since he has both profound faith questions as well as long-time gay friends (though he is straight), I just didn’t see him at Hope.

Since I’m not wealthy —I’m director of an academic library— and not the profile of the usual Hope alumnus, I really don’t think I have very much to offer your College. I do wish Hope College well. My own acquaintance with the Reformed tradition at Princeton Seminary led me to understand it as very open to the world, to the new findings of the humanities and sciences, and not afraid of the truth. I suspect that colleges of any theological stripe which regard themselves as the Fortresses of Faith will have a very tough go of it in the coming decades. If Hope College were a good deal more open, and more willing to defy previously-articulated evangelical orthodoxies, it could really have something very positive to offer American higher education. Lord knows that higher education (and especially private higher education) as a sector is in deep trouble.

That note, and this blog post, say what I have to say.

Michael Douma's book was really helpful to me. I can now see, through my own family background, how genuine Dutch identity in the Netherlands changed as it did from the 19th century to the modern, very liberal state. I can see how Dutch Americans evolved their own historical tradition, one that is almost a caricature of the Dutch and really has nothing to do with them --just as Frankenmuth's Bavarian identity has almost nothing to do with contemporary Bavaria and Franconia. That Hope College chose to double down on previous mistakes and became a defensive denizen of the shrinking evangelical academic reservation is a consequence of the "invented narrative" of Dutch American culture, shop-worn and sad. The accelerating withdrawal of younger "New Millennials" from organized religion of every stripe bodes ill for a College that values a defensive orthodoxy over liberating pedagogies.

It's almost July, and I remember how amazingly beautiful West Michigan can be this time of year, especially near the Lake. Shelly and I will visit my sister in Muskegon, and our younger son at Blue Lake Fine Arts Camp. Grand Rapids has changed profoundly: for example, the city adopted LGBTQ anti-discrimination ordinances in 1994, East Grand Rapids in 2015; Holland has yet to do so. The Grand Rapids arts community thrives, as do numerous ethnic communities. There is much to like and much more ahead than behind.

I regret that Hope College chose the path that it has (Babbitt lives!). I have little to do with it or about it. My own life has gone on elsewhere, and for that Deo gratias!

I am indebted to John Fea for pointing out Michael J. Douma's How Dutch Americans Stayed Dutch: An Historical Perspective on Ethnic Minorities (Amsterdam University Press, 2014, ISBN 978-90-8964-645-3). Douma's book has been a delight, enlightening and useful to my continuing question --"how can I teach about evangelicalism to students who have almost no awareness of it?"-- without becoming either preachy or pedantic. Americans of Dutch heritage are no more uniformly evangelical than any other group, but Douma's insights provide clues to the challenge of teaching about other people's arguments to those who don't already know or care about them.

Fea pointed out Douma's book and Douma's response to a misleading article in The Economist that sets out a complex reality in simplistic, bite-sized terms appropriate to The Economist's readers. The pretext was remarks by U.S. Ambassador Pete Hoekstra, and the sagas of Betsy DeVos and Erik Prince, all recognizably Dutch American conservatives of a certain positivist stripe. Like many American academics, in past months I have winced at the antics, pratfalls, and utter cluelessness of Betsy DeVos, incumbent Secretary of Education. Anyone who knows West Michigan (and Holland, Michigan in particular) will know the name well, from the DeVos Field House at Hope College and the endless genuflection towards the Amway Corporation, alleged to be a barely legalized, cult-like pyramid scheme. A member of the Van Andel family (DeVos' relations) has established rules for restricted access to Holland State Park's Big Red Lighthouse appropriate to a medieval lord of the manor (photo above); Erik Prince (Betsy's brother) remains a person of interest to Robert Mueller's investigation. Anyone long familiar with the Dutch American pale of settlement in West Michigan might roll their eyes.

To West Michigan's Dutch American culture, I am an outsider with one foot inside that small tent. One quarter of my personal ancestry is Dutch (maternal grandmother, Fries to be exact), and my mother lived decades as a Dutch American expatriate in distant, foreign parts --those of industrial eastern Michigan. (Her ashes are fittingly interred in Grand Rapids.) I earned my bachelor's degree at Hope College, but only after three years at Michigan State in heathen East Lansing. So I could have been an insider, but chose otherwise. (I will say more in a subsequent post.)

Douma's eminently readable book, accessible public history well informed by theoretical, scholarly insights, presents Dutch American ethnicity as an evolving set of internal disagreements about how to cope with an external human and natural environment very different from the particular, original locations in the small country from which the ancestors emigrated. He limits his investigation to the 19th- and 20th-century Dutch immigration to the Middle West, which was only tangentially related to 17th- and 18th-century Dutch immigration to New York and New Jersey; he also leaves aside Dutch "Indos" from Indonesia.

Location, location: the emigrants came from pre-industrial villages and small cities in Groningen, Friesland, Utrecht, and Overijssel that were transformed by industrialization and modern transportation shortly after their departure in the 19th century. They arrived in differing areas of the Middle West: West Michigan, the plains of Iowa (Pella, Orange City), burgeoning Chicago (South Holland), and dispersed locations throughout Wisconsin.

The emigrants' descendants experienced varying personal and community outcomes in urban, small-city, and rural locations. Dutch American immigrant identity largely evaporated by the 1920s in many locations except two areas with a critical mass of shared ancestry: the West Michigan axis of Holland and Grand Rapids, and the Iowa communities of Pella (southeast of Des Moines) and Orange City (in the northwest of the state). Three of those communities were anchored by colleges associated with the Reformed Church in America (RCA): Hope College (Holland, Michigan), Central College (Pella, Iowa), and Northwestern College (Orange City, Iowa). Grand Rapids became the Mecca of the Christian Reformed Church (CRC), and home of Calvin College and Theological Seminary (to become Calvin University in 2020).

The educational institutions are an important hook: Dutch Americans were justly famous for their work ethic and religious commitment. As my mother said, "God made the world, but the Dutch made Holland," referring to the dikes, sea-walls, and canals of the Netherlands, and intending the remark to mean, "therefore, get to work." Dutch Protestant Christianity of the Reformed tradition carried all the marks of Calvin's humanist character: based on texts (the Bible above all), given to theological reflection, and leaning towards pietism in a rather learned, cerebral manner. The revivalist enthusiasms of late 19th-century America were alien to Dutch temperaments, and Dutch Americans became evangelical only as those immigrant tendencies passed. Originally birthed in the afscheiding (secession) of orthodox, traditional Dutch Calvinists from the Netherlands state Protestant church (Nederlandse Hervormde Kerk) in 1834, the secessionists in America fell out amongst themselves in 1857 over the Dutch immigrants' incorporation into the American Reformed Protestant Dutch Church (now RCA), which some considered to be entirely too worldly, lax, and American.

Consequently: the Dutch colleges became involved in Reformed disputes (Hope, founded in 1851 and chartered in 1866, belongs to the RCA; Calvin, founded in 1876, to the CRC; the RCA founded Northwestern College in 1882, and took control of Central College, founded 1853, in 1916). Consequently (also), Dutch Protestant religion took on a disputatious character that both nurtured and was fed by intellectual argument. Consequently (also), Dutch Americans became over-represented in skilled trades, the professions, and the sciences. West Michigan, which lacked major extractable natural resources and depended upon manufacturing and trade (with its access to the Great Lakes), owed much of its economic development to skilled labor and the manufacture of furniture, building materials (such as bricks), and pharmaceuticals.

Douma's book lends some weight to the view that Dutch American cultural and economic impact was not hindered but furthered by intra-Dutch immigrant debates and rivalries. In West Michigan cities the narcissism of small differences between the RCA and CRC correlated with a range of economic and cultural positions and produced varying responses to and acceptance of mainstream Anglo-American culture (regarding organizations such as the Freemasons, for example). Southern Michigan, originally part (with northern Ohio and Indiana) of the Midwestern "third New England," was by the mid-19th century long habituated to Yankee habits of thrift and cultural positions such as Abolitionism; the Dutch immigrants were both similar to and different from the Yankees, as well as from the numerous other ethnic minorities present (especially Eastern European). Dutch Americans were at first outsiders to the fraught American conflicts that foreshadowed the Civil War, and a number of young Dutch American men absorbed "American" habits and dispositions through war-time military service. Dutch American rivalries extended a discourse that unintentionally preserved or prolonged Dutch American identity in those areas of Michigan and Iowa that held a critical mass of Dutch descendants. In time these descendants remembered not Dutch culture so much as the culture of their grandparents or great-grandparents: "Tulip Time" in Holland, Michigan (an ethnic festival in May) harks back not so much to the Netherlands as to memories of an idealized Netherlands in the minds of the early immigrants. Dutch American identity has by now evaporated or turned into genealogical interests with a barely religious overlay. The institutions of the CRC, the RCA, and the colleges have moved on to other identities and evolving missions.

What does this tell me about teaching American evangelicalism to secular or minimally Catholic undergraduates who don't have (or sometimes want) a clue? It reminds me that cultural identities are always works in progress, evolving in changing circumstances and apt to idealize their own pasts. Their disputes, far from weakening them (unless they become too divisive), in fact strengthen them by giving the participants something to really care about. Whether many Evangelicals' current, nearly cultic devotion to the Chief Executive will in fact divide them from their recent compatriots (those Evangelicals who did not support him) remains to be seen: how divisive will that dispute become? Douma's book also reminds me of the way that religious commitment can be felt as nearly an ethnic identity, thoroughly entangled with multiple, sometimes conflicting other commitments.

Sometimes it can also really help when a professor includes a sufficient (but not overpowering) testimony: "here's why this subject really matters to me." I find that students often respond to genuine commitment: this is important because it expresses something close to my heart. (I have seen, believe it or not, the teaching of accounting standards enlivened in this manner.) My subsequent post will tell a bit more of my own story.

 

Part of my work since 2009 has been teaching topics in American religion to undergraduates. Since my scholarly training focused on Christianity, most of the class has concerned Protestants and Catholics in American history and culture. Most of the students lacked any real working knowledge of any religious community, even if they were graduates of Catholic schools (a small minority). The course meets a distribution requirement, and with vanishingly few religion majors, I have kept a broad focus. Given my students' effective religious illiteracy, things went reasonably well. (I do not intend to exclude any American religion, but I do want to stick to my competencies.)

In teaching about evangelicalism, I hit a concrete wall. My students have assumed that Evangelicals by definition have been always and only conservative Republicans. They might feel some sympathy, I have learned, with a few conservative Evangelical viewpoints, especially amongst the males (immigration; economics; and the racial subtexts). But for the vast majority of my New England small-c catholic and capital-C Catholic students, Evangelicals are a strange tribe: inexplicable in all their ways, potentially hostile to Catholics and Northeasterners in general, and motivated by ineluctable commitments. Neither conservative Republicans nor high-profile Evangelicals are highly visible on the regional Tri-State, southern New England cultural spectrum. As one student wrote, "Evangelicalism: not for me." I am hardly trying to turn them into Evangelicals (I made that abundantly clear, and they heard me), but I had hoped to shed a little light on Evangelical history and culture in hopes of building some respect for this particular "other." I needed help.

I ran across John Fea's blog The Way of Improvement Leads Home after reading several chapters of his book by the same title; Fea's blog is a genuine help to those of us who would like to understand Evangelicals better but have no interest in becoming Evangelicals ourselves. His new book Believe Me: The Evangelical Road to Donald Trump (please order from Eerdmans, not Amazon) tells a story from inside Evangelicalism to those Evangelicals who did not vote for Trump, and to the rest of us. Fea attended Trinity International University and teaches at Messiah College (Pa.); he earned his Ph.D. from SUNY/Stony Brook, so he also has commitments to scholarship off the Evangelical academic reservation. Thanks to John, I also began to read Frances FitzGerald's The Evangelicals: The Struggle to Shape America (a Pulitzer Prize winner) and Robert Jones' The End of White Christian America. I returned to Mark Noll's landmark The Scandal of the Evangelical Mind (1994) as well as George Marsden's landmark Fundamentalism and American Culture (2nd edition, 2006).

In 2017 Matthew Mayhew (Education, Ohio State) et al. published "Expanding Perspectives on Evangelicalism: How Non-evangelical Students Appreciate Evangelical Christianity" (Review of Religious Research 59 (2017): 207–230, DOI 10.1007/s13644-017-0283-8), a survey-based social science project. The investigation revealed distinct differences in students' attitudes towards their evangelical peers related to demographics, institution type, and academic major. Students who self-identified as having a religious experience (or identity) were apt to be somewhat more sympathetic to Evangelical students, who might well feel ostracized or devalued in more secular academia. "How do we encourage appreciation of a worldview as polarizing as the one the evangelical narrative represents?" (p. 225) When does a challenging or provocative Evangelical viewpoint become perceived as divisive or hostile? This is a needle's eye hard to pass through.

This challenge is particularly trying where no Evangelical students are present. I have found an analogy when trying to teach about the fervor of 19th-century Prohibitionists: most students will recognize the problems of alcohol abuse and alcoholism, but advocates for Prohibition simply no longer exist. Students might well respond to the challenges (or provocations) of "hot-button" issues such as abortion rights, LGBTQ rights (and cake-bakers, florists, et al.), and immigrants with or without documents --but they lack any awareness of the Evangelical resonance of those issues. I have had one earnest student say, "I don't believe in evolution because I'm Catholic," and had to point out to her that she may have unawares absorbed an oft-held Evangelical viewpoint, but that her refusal cannot rest on specifically Catholic grounds, at least according to the Pope (then Benedict XVI). I must also reflect that my African American and Hispanic students often reveal greater awareness of Evangelicalism than white students.

I return to the question: how does one teach about those who regard their faith as primary to those who are unaware of why any faith might be primary? (Granted, the former category can include a great deal of wishful thinking, rationalization, and even fear and hypocrisy when things go wrong: read Believe Me.) Years ago I encountered a similar wall when trying to teach about Dietrich Bonhoeffer and why he chose to participate, however tangentially, in the July 1944 plot against Hitler. One Muhlenberg College student candidly observed, "We don't understand anything about sacrifice because we have never been asked to sacrifice anything." The gulf is one of imagination more than of thinking, or the ability to think. (I am by no means assuming that Bonhoeffer was or would be Evangelical in contemporary North American usages of the word; Eric Metaxas' book has been justly condemned as poorly sourced and even more poorly written, and I decline to link to it.)

In response, I have to cast back to my own limited experience of something bordering on Evangelical America, both at Princeton Theological Seminary and at Hope College (in my next post). Personal experience may be a last resort --I am at my last resort.

Mouse Books give easy access to classic texts in a new format --especially essays or stories that often are not commercially viable on their own. The Mouse Books project wants to offer readers more ideas, insight, and connections for their lives.

The digital era was supposed to make books and lengthy reading obsolete: Larry Sanger (co-founder of Wikipedia, originator of citizendium.org and WatchKnowLearn.org) memorably critiqued its faulty assumptions in 2010 in Individual Knowledge in the Internet Age (here as .pdf; see also my posts here and here). "Boring old books" played a part. Clay Shirky of NYU wrote that "the literary world is now losing its normative hold" on our culture: "no one reads War and Peace. It's too long, and not so interesting. . . . This observation is no less sacrilegious for being true." Ah, the satisfying thunk of a smashed idol. Goodbye, long, boring, not-so-interesting books.

Except that a funny thing has happened on the way to the book burning. (Danke schön, Herr Goebbels.) Printed books have somehow held on: unit sales of print books were up 1.9% in 2016, at 687.2 million, the fourth straight year of print growth. Rumors of demise now seem premature. What gives?

The print book is far more subtly crafted than many digital soothsayers realize. Printed books have evolved continuously since Gutenberg: just take a look at scholarly monographs from 1930, 1950, 1970, 1990, and 2010. The current printed book, whether popular, trade, high-concept, or scholarly monograph, is a highly-designed and highly-evolved object.  Publishers are very alert to readers' desires and what seems to work best.  It was hubris to think that a lazily conceived and hastily devised digital book format could simply replace a printed book with an object equally useful: look at the evolution of the epub format (for example).

Designers will always refer to what has been designed previously, as well as to new and present needs and uses, when designing an object: consider the humble door. Poorly done e-books were a product of the "move fast and break things" culture, which doomed many ideas that required thinking deeper than the one-sided imaginations of bro-grammer digital denizens.

Enter Mouse Books. Some months ago David Dewane was riding the bus in Chicago. "[I] happened to be reading a physical book that was a piece of classic literature. I wondered what all the other people on the bus were reading." He wondered, why don't those people read those authors on their smart phones? "I wondered if you made the book small enough—like a passport or a smart notebook—if you could carry it around with you anywhere."

David and close friends began to experiment, and eventually designed printed books the size and thickness of a mobile phone. They chose classic works available in the public domain, either complete essays (Thoreau's On the Duty of Civil Disobedience) or chapters (chapters 4 and 5 of The Brothers Karamazov, "The Grand Inquisitor," in Constance Garnett's translation). These are simply, legibly printed in 11-point Bookman Old Style. Each book or booklet is staple bound ("double stitched") with a sturdy paper cover, 40-50 pages, 3 1/2 by 5 1/2 inches or just about 9 by 14 cm --a very high quality, small product.

David and the Mouse Team (Disney copyright forbids calling them Mouseketeers) aim for ordinary users of mobile phones. They want to provide a serious text that can be worn each day "on your body" in a pocket, purse, or bag, and that gives a choice between pulling out the phone or something more intellectually and emotionally stimulating. Mouse Books give easy access to classic texts in a new format --especially essays or stories that often are not commercially viable on their own (such as Melville's Bartleby the Scrivener, or Thoreau's essay, which are invariably packaged with other texts in a binding that will bring sufficient volume and profit to market). The Mouse Books project wants to offer readers more ideas, insight, and connections for their lives.

As a business, Mouse Books is still experimental, and has sought "early adopters:" willing co-experimentalists and subjects. This means experimenting with the practice of reading, with classics texts of proven high quality, and complementing the texts with audio content, podcasts, and a social media presence. These supplements are also intended to be mobile --handy nearly anywhere you could wear ear buds.

As a start-up or experiment, Mouse Books has stumbled from time to time in making clear what a subscriber would get for funding the project on Kickstarter, what the subscription levels are, and the differences between US and outside-the-US subscriptions. The subscription levels on the Mouse Books drip (or d.rip) site do not match the subscription options offered directly on the Mouse Books Club web site. For a small "virtual company," this kind of confusion goes with the territory --part of what "early adopters" come to expect. That said, Mouse Books is also approaching sufficient scale that marketing clarity will be important for the project to prosper.

This is a charming start-up that deserves support, and it is highly consonant with the mission of librarians: to connect with others both living and dead, to build insight, to generate ideas. The printed book and those associated with it --bookstores, libraries, editors, writers, readers, thinkers-- are stronger with innovative experiments such as Mouse Books. The printed book continues to evolve, and remains a surprisingly resilient, re-emergent legacy technology.

More about Mouse Books:

Web site: https://mousebookclub.com/collections/mouse-books-catalog

drip site (blog entries): https://d.rip/mouse-books?


 

The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media.


Typewriters, mechanical watches, vinyl recordings, newspapers, printed books --obsolete technologies, right? Get with the program: countless incumbent industries and professions have been rendered pointless: disrupt or be disrupted --right? This has been the dominant cultural narrative --right?

I first heard about the obsolescence of librarians 35 years ago at the start of my career. Columbia University soon after accepted the dominant cultural narratives and closed its graduate library school, college of pharmacy, and departments of geography and linguistics. Pharmaceuticals? Digital and print librarians? Linguistics and languages? Geographic information systems? --all obsolete. (Whoops!) Since those who proclaim a profession's demise have usually been selling some replacement, cynicism follows fast. Another prediction of demise, another day.

Entirely outside of libraries, a counter-narrative has grown. David Sax popularized one version in The Revenge of Analog: Real Things and Why They Matter (PublicAffairs, 2016): we interface with the world in a tactile, communal way. At Harvard Business School, Prof. Ryan Raffaelli studies organizational behavior using field research, and he contributes much more sophisticated thinking about re-emergent technologies. He has found that "incumbent" technologies and industries can make a comeback. This story has important implications for libraries.

Some technologies re-emerge from disruption and destruction, especially those that have had a long history. Count out VHS tapes and punch cards: those were transitional. Typewriters have had a long enough history, as have fountain or nib pens (which have extended the dip, quill-type pen since 1827).

Printed books, like other technologies, brought whole occupations and kinds of work with them: not just printers, but also binders, sellers, retailers, and of course librarians. The printed book was a prime candidate for "innovative disruption" by digital books, and its demise, so loudly proclaimed ten years ago, mandated the demise of book stores, libraries, librarians, publishers, and editors. Now anyone can write a book (see Amazon); who needs editors? Who needs libraries or bookstores?

Some disruptions are truly innovative; others are just disruptions, and others just hype shouted as real (see previous post). The disruption narrative is not necessarily incorrect (although it can be applied poorly), but its corollary --that incumbent industries are necessarily unable to adapt, and certain to die-- is less well-founded. Raffaelli's research shows that technologies can re-emerge through a cognitive process in two phases: the first a largely cultural, temporal, and narrative process; the second a competitive process in a re-defined market with distinctive values not strictly established by price. His leading example is the Swiss mechanical watch-making industry; his second is the return and rise of independent book sellers in the USA.

Both the watch-makers and the book sellers lost substantial market shares when disruptive, good-enough technologies moved upmarket and claimed their most profitable customers: watchmakers with the rise of cheaper, more accurate quartz watches in the 1970s; book sellers with the rise of major chain bookstores in the 1990s, followed by Amazon. They keenly felt their losses: numerous Swiss firms closed or discontinued manufacturing; from 1995 to 2009 around 1,400 bookstores closed. Enough hung on, however, to rebound: how did they do it?

Raffaelli identifies the terms of competition: old terms such as price, availability, and quality change with the entry of disruptive technologies to market. The survivors have re-defined the competition: how they want to compete, and what value proposition they offer to their customers. He traces a complex process of de-coupling product and organizational identity and renegotiation of foundational concepts and business roles. The process is both bottom-up (from the "factory floor" or fundamental, front-line production or service) and top-down: from industry alliances, design thinking, and organizational management.

In the Swiss mechanical watch industry, he has identified entrepreneurs and guardians. Entrepreneurs are alert to market signals, cultural currents, and emerging narratives that suggest that new communities are forming new values. Guardians by contrast preserve older technologies and enduring values and counterbalance the entrepreneurs; both are necessary for the process of cognitive re-emergence. When the industry drew near to complete collapse, collectors began to purchase mechanical watches at high prices at auctions, signaling that their small community found genuine value, expressed monetarily in price. Entrepreneurs realized that the market for mechanical watches had not completely disappeared, but changed: the value lay not in keeping time for a price, but in expressing a cultural signal. Guardians, meanwhile, had preserved enough of the technology that recovery was possible; veteran employees preserved crucial tools and skills that enabled a recovery. Each needed the other; the leadership necessary for re-emergence arose not just from the top level of the organization and industry, but from the commitment and wisdom of key skilled workers. Mechanical watches were then marketed as high-end, luxury items that "said something" about their owners. As new customers entered or moved up-market, they adopted such watches as a sign of cultural status and belonging.

Independent booksellers successfully re-framed their market as primarily community, and only secondarily as inventory. First the chain stores (Borders, Barnes & Noble) out-competed them on price, then Amazon on price and inventory availability. Independent booksellers have focused instead on 3 Cs: Community and local connections, Curation of inventory that enhances a personal relationship with customers, and Convening events for those with similar interests: readings, lectures, author signings, and other group events. The booksellers' trade association (the American Booksellers Association, or ABA) facilitates booksellers' connections with local communities through skills, best practices, effective use of media, and outreach to other local businesses and organizations (even libraries, once considered the booksellers' competitors). The re-emergent market was defined by entrepreneurial booksellers, front-line service guardians, a growing social movement committed to localism, and industry-scale cooperation. Between 2009 and 2017 the ABA reported 35% more independent booksellers: from 1,651 to 2,321 nation-wide. A sign of the integration of booksellers with community spaces: 2017 sales were up 2.6% over 2016.

Like independent bookstores, the "library brand" remains strongly bound to printed books --after all, the name derives from liber (Latin for "book"), echoed by biblos (Greek). The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media. This brand identity will persist even though libraries offer many kinds of resources in many formats --including millions of digital books.

What does such technology and market re-emergence have to do with libraries? These cases suggest that the emerging re-definition of libraries (as both old and new) is analogous to much of Raffaelli's work, and that the narrative frame of "disruptive innovation in higher education" can be --should be-- challenged by this more useful counter-narrative: "new and re-emergent technologies in higher education."

While libraries' role as mere "book providers" has been challenged by disruptive technological service entrants such as the Internet, Amazon, and social media, libraries' role as a channel for trusted, stable information is stronger than ever. Pew Research Center survey data from Fall 2016 found that 53% of Millennials (those 18 to 35 at that time) said they had used a public library --a finding about a whole generational cohort, not just college students (the study focused on public libraries). This compared with 45% of Gen Xers, 43% of Baby Boomers, and 36% of the Silent Generation. In 2016 Pew also reported that the share of adults who said libraries help "a lot" in deciding what information they can trust rose from 24% in 2015 to 37% in 2016; women held that opinion more strongly (41%). Recent anecdotes suggest that such opinions have not changed direction.


Libraries are regarded as very strong assets to a community: the high values placed on pleasant space, safety, and community events also emerged in the Pew studies. Coupled with bottom-up initiatives from front-line librarians and individual organizations, the American Library Association has devoted substantial attention and resources to initiatives such as the ACRL Framework for Information Literacy in Higher Education and the Libraries Transform campaign. Libraries' free-to-all traditions (supported by tuition, tax dollars, and other sources) do not track community impact as easily as independent bookstore sales figures do. Their value proposition for their communities becomes clear in usage figures (at SHU growth in usage has outpaced growth in enrollment) and in the faculty's documented turn towards librarians in helping undergraduate students develop research, critical analysis, and information literacy skills.

As a re-emergent technology, printed books sustain a host of skills, occupations, organizations, and cultural signals that do not boil down to a single, simplistic, marketable narrative. Conceived in the late 20th century as "information resources," books gave way to digital representation; conceived as "documented knowledge," the act of reading books in a library context provides a tangible experience of informed learning, cultural absorption, and community participation. Libraries provide many services. Without the "brand" of reading books, and the sustaining services of librarians, libraries would turn into derelict, zombie storage spaces. Knowledge is a communal good as well as a private act; it is never simply an individual achievement: free to all. We are all culturally embedded in the minds of our predecessors and communities for weal and woe --and without libraries, bookstores, timekeepers, and printed books, we will not be able to progress from woe to weal.

 

Burnett's and Evans' chapters, "How not to find a job" and "Designing your dream job" should be required reading for all college seniors --example: dysfunctional belief: I am looking for a job; reframe: I am pursuing a number of offers.

Designing Your Life: How to Build a Well-Lived, Joyful Life, by Bill Burnett and Dave Evans.

New York: Knopf, 2017. 238 p.  ISBN: 978-1-101-87532-2

I am reviewing a book that the library doesn't own, because you really should buy your own copy of this one. This is a book with directions, exercises, useful "try stuff" pages that invite margin notations, doodles, objections, and questions.

"Designing your life" sounds strange at first. The book is based by two practitioners of "design theory" at Stanford who have taught one of the most popular courses there in the Design program --the home of "human-centered design." Burnett and Evans ultimately ask: what makes you happy? through the medium of design process: try, prototype, critique, try again:

  • Be curious;
  • Try stuff;
  • Reframe problems;
  • Know it's a process;
  • Ask for help

They reframe a lot of dysfunctional beliefs. You won't find one perfect life; rather, you can have multiple plans and lives within you. We tend to judge our lives by the outcome; rather, life is a process, not an outcome. Their chapters "How not to find a job" and "Designing your dream job" should be required reading for all college seniors --example: dysfunctional belief: I am looking for a job; reframe: I am pursuing a number of offers.

This sounds very like happy-clappy self-help, but it's not: they ask some tough questions. They move in a counter-cultural direction: the things that we say we want (a good grade) are not the things that will really make us happy. It would be really interesting to read this book along with Kahneman's Thinking, Fast and Slow as a way to identify our own dysfunctional beliefs and cognitive fallacies.

Be curious. Go out and get a copy (or download a copy). Try stuff. Reframe your dysfunctional beliefs. Learn that happiness is a process, too. There is no right choice --there is only good choosing.

Joseph Smith Jr.'s role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of Clayton Christensen's theory of “disruptive innovation.”

NB This is a very long post, and might be more readable in this .pdf document.

Abstract: Christensen’s theory of disruptive innovation has been popularly successful but has faced increasing scrutiny in recent years.  Christensen is a Latter-day Saint, and the career and ideas of Joseph Smith, Jr. form a powerful backdrop for Christensen’s theory.  Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of “disruptive innovation” as an idea. Uncritical embrace of the popular theory, especially in higher education, implies acceptance of cultural assumptions and suggests that the theory is less useful than has been claimed.  This essay is a reflection on history and experience, not a theoretical quid-pro-quo or take-down.

Something always struck me as oddly familiar and decidedly off-key about Christensen's confident claims that innovation would explosively disrupt American higher education.  A previous boss dismissed my views as the reflex of one in a dying profession (librarianship). 

I never accepted her cynical critique, but neither could I put my finger on why I still disagreed so strongly.  Then, while teaching a historical survey course on American Religion, a new insight came to me about the idea's interesting and problematic deep structure.  My insight sprang from some familiarity with both the discourse of innovation in higher education and 19th-century American religion, two widely known but widely separated fields.  What I have found will, I hope, give pause to thoughtful educational and academic leaders.  Uncritical embrace of "disruptive innovation" might implicate them in religious and cultural commitments they have not examined, especially if they lead faith-affiliated organizations.

The first backstory:  Christensen first introduced disruptive innovation in 1995 in an article aimed at corporate managers rather than researchers.  In 1997 he expanded his ideas in The Innovator's Dilemma, which caught on with managers, business writers, and consultants in a big way in the 2000s, with the aura supplied by Harvard Business School.  Since 2011 he has predicted that as many as half of American colleges and universities would close or go bankrupt within fifteen years (so by 2021-2026). He is still at it: in April 2017 he maintained his claim when speaking to Salesforce.org's Higher Education Summit: "I might bet that it takes nine years rather than ten."  Sometimes he has been wrong (about the iPhone, for example), but he does not vacillate.

Disruptive innovation has become big business, popularized not only by Christensen (who has come to regret losing control of the term) but by a host of management consultants, pundits, and experts.  Among librarians, David Lewis applied Christensen's ideas in 2004 and expanded them into a 2016 book that has been, on the whole, well received by those in my "dying profession."  Predictable pushback came in 2014 from another Harvard professor (the historian Jill Lepore), followed by a detailed reexamination in the MIT Sloan Management Review, among others.  Christensen in turn defended his ideas and reformulated some of them in 2015, on the 20th anniversary of his initial publication.

Three years later, if the concept has lost some valence or he's just wrong about higher education, why rehash this now and for the umpteenth time?

That's where the second backstory becomes relevant.  Christensen (of old Latter-day Saint stock) is not just coincidentally Mormon; that identity is central to his person, and that background is central to his work.

When I teach my historical survey of American Religion, in due course we come to the so-called Second Great Awakening in the first half of the 19th century.  Scholars give special attention to the "Burnt-Over District" of western New York, home of many potent religious and political ideas associated with "Whig Evangelicalism": abolition, temperance, the rights of women, and reforms of public health, education, prisons, orphanages, and the like.  The District fostered not only mainstream Christian restorationist and evangelical movements (such as the Disciples of Christ or "Campbellites," Methodists, and Baptists), but also countless Millennialist, New-Thought, and Spiritualist communes, Shakers, Millerites (early Seventh-Day Adventists) --and Joseph Smith Jr.'s Latter Day Saints.

Smith resists casual dismissal: was he truly a prophet of the living God (the mainstream Mormon view)? --a womanizing fraud (the view of many of his contemporaries, and critics since)? --a self-deluded prophet who eventually bought into his own fabrications and could not extricate himself (a sort of early David Koresh)? --or some kind of mystic or psychic with unusual access to otherworldly regions and the subconscious (a sort of frontier, raw-edged Francis of Assisi)?

Smith promulgated the enormous Book of Mormon (Skousen's critical first edition is 789 pages).  He claimed an angel guided him to ancient plates in a hill near Palmyra, New York, which he translated from unknown languages with the help of seer stones and a hat, dictating on the fly to his wife Emma Hale Smith and others, all in 65 days.  Even if he made it up, shared authorship, or plagiarized part of it, it is an amazing performative achievement.  A densely layered anthology of documents, speakers, and authors, the text can be regarded (if not as Scripture) as an early forerunner of "magical realism." All this from a twenty-something farmhand with only a little primary education.

Smith was truly a "rough stone rolling" whom generations of Mormons have never managed to smooth. "No man knows my history," he is reported to have said; "I cannot tell it: I shall never undertake it. . . . If I had not experienced what I have, I would not have believed it myself."  His innovative edginess contrasts strongly with the earnest, family-oriented, upbeat, corporate image of contemporary Mormons.

"Innovative" -there's that word.  In matters of religion, innovation is not always a positive idea.  Smith's most famous innovations were the origins of the Book of Mormon, and the "new and everlasting covenant" (revelation) of both eternal marriage, and the principle of plural marriage.  The last innovation directly challenged 19th century American ideas about the family, and occasioned a furious opposition of a scale rarely seen in American history (leaving aside for the moment the historical plague of racism).  Latter Day Saints were opposed in Ohio, persecuted in Missouri (the Governor ordered extermination); Smith was murdered in 1844 in Illinois by a lynch mob acting out of congeries of fears.

The subsequent succession crisis would have led to fatal splintering were it not for Brigham Young.  A considerable majority of Mormons followed his leadership from Illinois in the Great Trek to Salt Lake City (1846-1848); Young's organization both preserved and transformed the Latter-day Saints; they lost their prophet but gained a hyphen.  The founder’s innovations would have perished without Young's tenacity, sheer longevity (he died in 1877), and "courageous leadership" or "iron-fisted rule," depending on your point of view.

These two long backstories are essential for understanding both the meteoric rise of "disruptive innovation" and its recently waning appeal as an explanatory theory in light of qualms about its accuracy.

Joseph Smith, Jr., can be seen as an exemplary disruptive innovator.

"'Disruption' describes a process whereby a smaller company with fewer resources is able to successfully challenge established incumbent businesses." While incumbents focus upon improving their product or service (especially on behalf of their most profitable customers), they tend to ignore less profitable market sectors, or exceed their needs.  Entrants seek out those ignored market segments and can gain customers by providing more suitable products or services, frequently simpler and at lower cost. Incumbents can fail to respond in kind while entrants "move upmarket," improve their offerings, maintain their advantages and early success, and eventually drive out or acquire incumbents when their formerly most profitable customers shift purchases to the entrants. (Christensen, 2015). 

Since Christensen has complained that his terms have become "sloppy" in popular use, which undermines his theory's usefulness, I have tried to paraphrase his core idea carefully, and to present Mormon history even-handedly.  My central claim is that Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of “disruptive innovation” as an idea.  Christensen's religion did not cause him to create his theory, but it did contribute a framework that fostered the theory's reception, as well as its swift extension beyond its first cases in manufacturing disc drives, construction implements, and other tangibles.

Christensen's commitment to his Church is beyond question; the intellectual, doctrinal traditions of his faith powerfully molded his character and thinking, by his own admission.  By all accounts he is a charming and sincere man, and would never have smuggled his religion into his professional thinking; he is an honest and forthright broker.  This essay is a reflection on history and experience, not a theoretical quid-pro-quo, or take-down.

Smith's movement "came to market" at a critical time in American religious history.  The Constitutional de-coupling of religious institutions from the state apparatus was one of the primary innovations of the new nation.  The new "open market" meant that incumbent legally established, financially supported Christian churches had to compete for adherents and support.  In Connecticut, for example, the "standing order" of political and religious aristocracy came to an end in 1817.  Such competition challenged the established churches' previous role as assumed arbiters of social legitimacy.  If you have a wide field of religious choices, you could also choose "none of the above." National anxieties about declining status of established forms of Christianity in large part fueled the striking resurgence of Christian groups loosely termed "evangelical." 

A significant portion of those evangelical groups can be described as "restorationist," appealing to the New Testament proclamation of God's restitution of all things.  This was taken to mean a return to the "primitive church," a re-pristination of Christianity --taking it back to the putative purity of the founders.  This led to a proliferation of religious bodies, almost all of which inherited from earlier centuries the binary certitude that each was "correct" and hence the others were necessarily "incorrect." Each group warranted its appeal in the lost purity of the first Christians.  For many groups, the path back to purity had been cleared by disestablishment's curbing of the incumbent churches, and they hoped that clearance would put all groups on a level footing.

Since the religious marketplace of early 19th-century America had only recently been opened, doctrinal disputes that now seem arcane often paralleled heated social and cultural divisions.  Smith's own family mirrored the national situation: his grandfather was a Universalist (holding that all humans could receive God's corrective grace; Universalists opposed evangelicals); his mother has been identified as Presbyterian (officially, she would have believed that God's eternal decree predestined some to eternal blessedness and foreordained others to eternal damnation; Presbyterians tended to be allied with evangelicals).  Joseph Jr. may have caught a "spark of (evangelical) Methodism" at local rural revival meetings. His maternal grandfather and parents experienced visions and voices; like many farmers they used divining rods to find water and looked for buried treasure, a kind of folk magic.  They believed in prophecy and vision, tended towards skepticism about "organized religion,” and were receptive to new religious ideas.  He is reported to have told his mother, "I can take my Bible, and go into the woods, and learn more in two hours, than you can learn at meeting in two years, if you should go all the time."

In the religious marketplace of western New York, Smith’s family was typical of a market segment often ignored by the better-established groups, who appealed to more prosperous farmers, townspeople, and entrepreneurs (see Johnson’s A Shopkeeper's Millennium).  Smith's family, by contrast, were downwardly mobile recent arrivals from Vermont without a network of support, a consequence both of poor decisions and of environmental strains such as the “Year without a Summer” (1816).  They typify the impoverished, rural working class on the inner edge of the frontier, a down-market segment less promising to the more prominent groups, for whom the competitive religious marketplace was particularly nettlesome.

The 14-year-old Joseph was confused by the "cry and tumult" of Presbyterians vs. Baptists vs. Methodists, all using "both reason and sophistry" to "establish their own tenets and disprove all others."  He asked, "What is to be done? Who of all these parties are right; or, are they all wrong together?  If any one of them be right, which is it, and how shall I know?"  In other words, his market segment saw that ecclesiastical competition compromised the integrity of all parties. Reading a biblical text that directed him to ask God, he went into the woods (again!) and reported a dramatic theophany: the "Personage" answered "that I must join none of them," for their "creeds were an abomination" and their adherents "were all corrupt."  His take-away: he realized "at a very early period of my life, that I was destined to prove a disturber and annoyer."  Joseph’s subsequent innovations certainly disturbed and annoyed other groups.

Care must be taken, however, not to simply equate Joseph’s social location with a commercial market position, because the religious “marketplace” differs in important ways from commerce: in product differentiation, lock-in, and brand loyalty.

The religious "product" is not a commodity, but a sense of living affiliation with a group that makes doctrinal, moral, and behavioral claims in such a way that simultaneous affiliation with more than one group is either prohibited or discouraged.  The ultimate outcome, some kind of eternal blessedness, in principle excludes other ultimate outcomes.  Today many children in "mixed" families can feel religious differences strongly (and opt for "none"). For example, an individual cannot be a Catholic in good standing and simultaneously an evangelical Baptist in good standing -their claims and ideas simply conflict too much; if both present in the same family, some accommodation must be reached.  Joseph Smith Jr. found such exclusive "product differentiation" troublesome.

Religious adherents' "market lock-in" is high: one might radically change one's affiliation once or twice in a lifetime, but changing more often is unusual and perhaps suspect, and "conversion" can exact high social costs.  The religious fervor of New York's Burnt-Over District in Joseph Smith, Jr.'s time left religious organizations in flux, so that conversion costs were often much lower than before or after.  All early Latter Day Saints nevertheless had to make a clear decision that departed from their inherited religious affiliations.

A religious group's "brand loyalty" involves a constellation of commitments; socialist Fundamentalists and alt-right Episcopalians are vanishingly rare (for example).  The brand loyalty of early Latter Day Saints evolved from 1830 to 1844, becoming progressively stronger both in positive response to Joseph Smith Jr.'s continuing revelations and in defensive response to violent persecution.  For example, early Saints' constellation of commitments was ambivalent towards slavery: initially, as Northerners, early adherents opposed it; then revelations and teachings evolved to allow some slave-holders to join in Missouri (a slave state). After Smith’s murder, his son Joseph Smith III and widow Emma Hale Smith repudiated both slavery and plural marriage in the Reorganized Church of Jesus Christ of Latter Day Saints in Missouri, the "minority" successor group.  By contrast, Brigham Young's larger "majority" successor not only retained plural marriage but attempted to legalize slavery in the Utah Territory.  Since Republicans, starting in 1854, sought to abolish the "twin relics of barbarism," slavery and polygamy (a jab at Young's group), it is unclear whether that commitment arose from core convictions or defensive resistance.

"Disruptive innovation" in the religious marketplace has to be treated carefully, because of not only the special nature of the religious market place, but also rigorous examination of the idea of "disruptive innovation:" it does not mean just any disruption.

Whatever the sources of Joseph Smith Jr.'s ideas, he led a movement that "gain[ed] customers (i.e., adherents) by providing more suitable, often simpler products or services, frequently at a lower cost."  (Latter-day Saints have never had professional clergy; their commitment to mutual assistance is exemplary.)  Market incumbents (more organized and better financed competing groups) were slow to respond in kind, and as Smith's group moved "upmarket," it maintained its "advantages and early success" --high rates of "lock-in," group cohesion, and brand loyalty.  Smith's group, however, never quite succeeded in driving the "incumbents" out of the market or even acquiring most of their former customers.  Their sense of urgency lost its edge.

Why are the Latter-day Saints “latter-day”?  This code phrase refers above all to a shared set of cultural and religious assumptions and commitments in early 19th-century America.  "Millennialism" was the belief that the coming of the Kingdom of God (promised in the Bible) was imminent and that America, with its special covenant of religious liberty, would be central to its arrival.  Millennialism came in two distinct forms with opposite answers to the question, "Will Christ return before or after the promised millennium (1,000 years) of peace?"  Pre-millennialists emphasized the troubles (tribulations) that would both precede and signal Christ's return to reign for 1,000 years before the Last Judgement.  Post-millennialists proclaimed that the millennium of peace would culminate in Christ's return and the Last Judgement; their task was to make the world ready for the Kingdom of God.  Both expected the Kingdom very soon: we are all living in the biblical "latter days."

This recondite disagreement has important implications.  Post-millennialists were all about social reforms that would make the United States so like the Kingdom of God that American society would usher in the millennium.  Pre-millennialists emphasized that Christ would come only after dramatically increasing tribulations.  Things getting worse and worse were a sign of his approach --hence they disfavored social reforms as a distraction from the real work of preparing for evil times. (Historical aside: the War of Secession largely discredited post-millennialism, which morphed into the program of social reforms of the Progressive era. Pre-millennialism evolved into dispensational Christian fundamentalism, combining expectation of tribulation with a belief in the factual, literal inerrancy of the Bible.)

Latter-day Saints' enduring post-millennialism shows, among other ways, in their boundless optimism.  The familiar, earnest young missionaries (think The Book of Mormon, the Broadway show) are a token of the Latter-day Saints' continuing commitment to usher in the latter days, although they now expect them less imminently. Millennialism is no longer common coin.  Despite the popularity of the "Left Behind" series of books and movies, only a small minority of survivalists or "preppers" appeal to Biblical warrants for their expectations of imminent tribulation (disaster).

Detached from Christianity, expectations of imminent disaster and rebirth went rogue in American culture long ago.  Silicon Valley today, for example, is home to many who expect a "singularity" in which artificial intelligence outstrips human intelligence and introduces profound changes to civilization as a whole --another sort of secular millennium, in which technology has replaced a Messiah as the anointed power.  Popular movies and books have made dystopia a cultural cliché.  (What's the disaster this time? Nuclear war, apes, viruses, climate change, or the abrupt disappearance of millions?)  How many jokes about "voting in the last American election" (double entendre) play on similar fears?

"Disruptive innovation's" popularity exploded in the 1990s and 2000s exactly because of the numerous hopes and fears raised by the advent of the Internet and its devices and social media.  Josh Linkner warned, "disrupt or be disrupted," (The Road to Reinvention, 2014) and that binary choice spoke in apocalyptic tones to incumbent mass media, libraries, bookstores, journalists, travel agents, financial consultants, university presidents, and anyone else who deals in "information" as a commodity.  Such urgent warnings shout to established corporations, "The end is near: you can't innovate fast enough; you're not even the right people to do it."  Incumbent organizations were counted out simply because of their incumbency: MOOCs would inevitably disrupt brick-and-mortar educational institutions, now denigrated because of their mere physicality. 

The popular version of “disruptive innovation” played on dystopian fears of the collapse of known "incumbent" corporations and the rise of an economy of perpetual disruption --Schumpeter's capitalist creative destruction recast as "disruptive innovation" with a brutalist, binary emphasis: disrupt or be disrupted.  The archetypal "creative disruptor" is the technological whiz-kid (I nominate Mark Zuckerberg, "Zuck Almighty"), whose revelatory "Book of Faces" and continuing revelations of a "new and everlasting platform" will usher in a thousand-year era of effortless, limitless, and unfailingly upbeat social confabulation.  Except when many kinds of terrorists, Russian autocrats, vaccine deniers, and deranged stalkers confabulate as well.

What does this have to do with Clayton Christensen?  Well, both a little and a lot.  He cannot deny his own popularization of his ideas through his books, media appearances, securities fund (the short-lived Disruptive Growth Fund, launched in 2000 at just the wrong time), and army of students, friends, and defenders such as Thomas Thurston in TechCrunch.  He lost control of "disruptive innovation" as a term of art precisely because of its appeal to those who make a living from in-your-face, counterintuitive claims.  Lepore identified circular reasoning in the popular idea of creative disruption ("If an established company doesn't disrupt, it will fail, and if it fails it must be because it didn't disrupt").  This logical circle may or may not characterize highly disciplined case studies of Christensen's theory, but it certainly rings true of the endless popular iterations.

Whether Christensen's theory holds up almost does not matter to "disruptive innovation" as a popular idea.  By analogy, in Smith's America, as Terryl Givens has noted, what mattered about the Book of Mormon was not its teachings or particular message. "It was the mere presence of the Book of Mormon itself as an object that . . . served as concrete evidence that God had opened the heavens again."  In that era all manner of revelations appeared: the Shakers' Holy, Sacred, and Divine Roll and Book, the visionary experiences of Ellen G. White (one of the originators of the Seventh-Day Adventists), and the visions of rival claimants to Smith's prophetic mantle among the Latter Day Saints after his death.  Kathleen Flake has noted, "Henry Ford wanted a car in every home. Joseph Smith was the Henry Ford of revelation. He wanted every home to have one, and the revelation he had in mind was the revelation he'd had, which was seeing God."  The heavens, once opened, proved harder to close.

The popular idea "creative disruption" has attached itself, meme-like, to a lot of second- and third-rate scams.  Business theory has fewer brightly defined disciplinary boundaries than physics. King's and Baatartogtokh's conclusion that the theory has limited predictive power does not render Christensen's ideas useless, but does suggest that "disruptive innovation" will not be the "one theory to rule them all," and with the profits or prophets bind them. 

Joseph Smith Jr. claimed that the encounter he had with the Holy in the woods warned him not to join any of the (Protestant) groups in his vicinity, whose creeds were all "corrupt" and "an abomination."  Christian restorationists called the very early Christian movement of the short period reflected in the New Testament texts "the primitive church," and regarded all subsequent developments as not merely wrong but apostate: those who knew the truth had deliberately denied it.  Joseph Smith, Jr. saw his new revelation as a giant delete key pressed on all of Christian history --Eastern Orthodox, Catholic, and Protestant.  All of it had to go.

In a similar manner, popular "disruptive innovation" connotes the sweeping away of all that is wrong with sclerotic corporate capitalism, and the restoration of the pure, "invisible hand" of the marketplace that allegedly guided early capitalists.  This popular view resonates with a great deal of cultural and political libertarianism: giant corporations and government bureaucracy are an apostasy betraying the true faith handed down from the founders (either Jesus or Adam Smith, as you wish).  "Move fast and break things," advised the Zuck; what can be disrupted should be disrupted.  Including, it would now seem, democracy wherever it might be found.

Disciplined use of the theory of "disruptive innovation" in carefully defined circumstances provides explanatory clarity, but its predictive power is more a hope than an established fact, despite the protests of Christensen's defenders.  This means that it is one theory among others: Michael Porter's theory of competitive advantage and multifactorial analyses will likely work equally well in other carefully defined situations.  Similarly, The Church of Jesus Christ of Latter-day Saints has found ways of regarding other religious groups positively (even Evangelicals, often the most hostile), and has moved on from the language of "apostasy."  Originally intent on hitting that giant delete key, subsequent Latter-day Saints have found a way to live as active readers of their particular texts in the midst of many other readers of many other texts.  This has relevance on the ground: given official LDS teachings regarding divorce and homosexuality, some LDS families have found informal means to include and tolerate differences among their members, coming to resemble the family life Joseph Smith Jr. knew as a boy.  (Others have continued to shun their "apostates.")

Unlike Smith, Christensen never intended to promulgate a "unified field theory" of religion or business development, and he is not completely responsible for losing control of his theory as a popular idea.  The close of his 2015 twenty-year re-evaluation, "We still have a lot to learn," acknowledges that "disruption theory does not, and never will, explain everything about innovation specifically or business success generally."

Christensen's modesty still did not inhibit him from doubling down on his claim that half of American colleges and universities would close by 2025.  Allowing his claim relative rather than revelatory validity dispels the apocalyptic fears of barbarians at the gates.  His primary analogy in The Innovative University (2011) is "changing the DNA of American Higher Education from the Inside Out" (the subtitle).  He claims that all American colleges and universities other than a branch of Brigham Young University in Idaho share the DNA of Harvard: all these institutions want to become, apparently, Research-1 universities if given the money and the chance.  What does that really mean, and is it really true?  Such a simple analogy does grave injustice to community colleges (vital economic links for countless communities and immigrants), specialized schools such as Maine Maritime Academy, or even an elite liberal arts college such as DePauw University.  The popular view that higher education has changed little and can change little is flat wrong: ask any productive historian of higher education.  Change and innovation (whether disruptive or otherwise) will not appear equally and everywhere overnight.  The higher education sector is not (thank heavens) Silicon Valley, or McKinsey & Co.

Yet all is not well: the economic model underpinning American higher education is likely unsustainable in the coming decades, for many reasons.  Higher education also represents a huge social and financial investment that is unlikely to dissipate.  Distance education, information technology, changing social expectations, and shifting demographics will all play a role in whether particular colleges and universities remain viable. Disciplined use of the theory of "disruptive innovation" will likely hold some, but not all, of the explanatory and predictive keys.  The truth is out there, but it will be much more complex.

The striking persistence of the popular "disruptive innovation" in senior management circles (typified by the Salesforce.org higher education event) reveals not only persistent fears and enduring threats, but also short attention spans devoted to keeping up with the swift pace of too many events.  I suspect that popular "disruptive innovation" functions in a manner more affiliative than explanatory: "if you don't get it, you're not one of us. --You think Jill Lepore, or King and Baatartogtokh, might be right, eh?  Let's see how long you last in the C-Suite" (especially if you can pronounce the latter's name).

"Disruptive innovation" elicits fears useful for those who want to shake up certain professions in health, law, librarianship, and the professoriate, but by now its been over-used.  At librarians' meetings (ALA, ACRL) I have developed the habit of responding to the expression, "disruptive innovation" with the question, "what are you selling?"  Fear sells "solutions;" its potency as a means of marketing continues nearly unrivaled.  No one ever sold an expensive library services platform with the phrase, "this won't solve all your problems."  Since 1985 I have sat through many presentations that predicted the closure of libraries within ten years - Christensen's remark "I might bet that it takes nine years rather than ten" would find a new audience.  We who are about to be disrupted salute you, Prophet.

Nevertheless: printed books, asking questions, research assistance, and personal relationships with library instructors endure.  They were warned, but they persisted.  It is past time to find a more accurate analysis of, and predictive theory for, the future of libraries and higher education.

On Tyranny is not only about an American moment, but about a worldwide one.

On Tyranny: Twenty Lessons from the Twentieth Century, by Timothy Snyder.  Tim Duggan Books (Crown), 2017. 126 p. ISBN 978-0804190114. List price $8.99

Yale University professor Timothy Snyder has spent a long time learning the languages, reading the documents, exploring the archives, and listening to witnesses of the totalitarian tyrannies of Europe in the last century --particularly of Nazi Germany and the Stalinist Soviet Union. His scholarship bore particular fruit in books such as Bloodlands: Europe between Hitler and Stalin, and Black Earth: The Holocaust as History and Warning. He came to recognize that certain characteristics in the development of those tyrannies are present in the world today, and in the United States. This book is no partisan screed: Snyder recognizes in the 45th President features he knows from other contexts, and those other contexts underscore the drift towards totalitarianism apparent from Russia to Europe to the USA. On Tyranny is not only about an American moment, but about a worldwide one.

This short book consists of a brief introduction, twenty short chapters, and an epilogue. Each chapter directs an action, such as no. 12, "Make eye contact and small talk," followed by a historical example or an expansion of the point. All the actions can be undertaken or performed in daily life; there is no grand theory here.

In place of a grand theory, there is a fundamental point: respect and value facts, truth, and accurate usage of our common language. In Moment (magazine), he explained: "Once you say that there isn’t truth and you try to undermine the people whose job it is to tell the truth, such as journalists, you make democracy impossible." He told Bill Maher (at 2:02) that while "post-fact" postmodernism might connote "Berkeley, baguettes, and France and nice things," it more likely means that "every day doesn't matter; details don't matter; facts don't matter; all that matters is the message, the leader, the myth, the totality" --a condition of Europe in the 1920s.  Such disdain for the truth goes hand in hand with conspiracy theories that assign blame to a group accused of undermining the purity of the majority. "Rather than facing up to the fact that life is hard and that globalization presents challenges, you name and blame people and groups who you say are at fault."  Jews, Mexicans, Muslims, Rohingya, Tutsis, Hutus, globalists, evolutionists, or any other "outsider."  The myth: "Make [fill in the blank] great again."

A librarian or researcher might particularly resonate with Snyder's directions "Be kind to our language," "Believe in truth," and "Investigate" (lessons 9-11). This is all a way to prepare to "be calm when the unthinkable arrives" (lesson 18) --when a leader exploits a catastrophic event to urge followers to trade freedom for security, and suspends the rule of law. The Chief Executive may or may not be attempting to stage a coup; American democracy survived the dark moment after Charlottesville.  Snyder told Salon in August, "We are hanging by our teeth to the rule of law. That was my judgment at the beginning of his presidency and it is still my judgment now. The rule of law is what gives us a chance to rebuild the system after this is all done."

Whether or not current politics result in tyranny and oppression is still (at this writing) an open question. The importance of Snyder's book is that it points beyond this moment to the wider trends and challenges of a world which is global (like it or not), connected (like it or not), and dependent on both our natural climates and our accrued, hard-won cultural heritages. A University founded on "a rigorous and interdisciplinary search for truth and wisdom" that "forms the cornerstone of all University life and welcomes people from all faiths and cultures" cannot leave our students unprepared. In order to make history, young Americans will have to know some (p. 126).  Will that be the twenty-first lesson on tyranny, from the twenty-first century?

--Gavin Ferriby

Hartley argues that a liberal arts education widens a student's horizons, inquires into human behavior, and finds opportunities for products and services that will meet human needs. The "softer" subjects help people determine which problem they're trying to solve in the first place.

The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, by Scott Hartley.  New York: Houghton Mifflin Harcourt, 2017. ISBN 978-0544-944770 $28.00 List.

Hartley writes that a "false dichotomy" divides computer science and the humanities, and he extends this argument to STEM curricula as well. For example, Vinod Khosla of Sun Microsystems has claimed that "little of the material taught in liberal arts programs today is relevant to the future." Hartley believes that such a mind-set is wrong, for several reasons. Such a belief encourages students to pursue learning only in vocational terms: preparing for a job. STEM fields require intense specialization, but some barriers to coding (for example) are dropping thanks to web services and communities such as GitHub and Stack Overflow. Beyond narrow vocational boundaries, Hartley argues that a liberal arts education widens a student's horizons, inquires into human behavior, and finds opportunities for products and services that will meet human needs. The "softer" subjects help people determine which problem they're trying to solve in the first place.

That said, the book does not move much further. Hartley never really tries to provide a working definition of a true "liberal arts" education except to distinguish it from STEM or computer science. By using the vocabulary of "fuzzy" and "techie" that he encountered at Stanford, he inadvertently extends a mentality that has fostered start-ups notably acknowledged to be unfriendly to women. So far as I could determine, only a handful of Hartley's cited sources were published anywhere other than digitally --and although the "liberal arts," however defined, have a very long tradition of inquiry and literature, Hartley passes it by almost breezily, and it is very little in evidence. His book is essentially a series of stories of companies and their founders, many of whom did not earn "techie" degrees.

Mark Zuckerberg's famous motto "move fast and break things" utterly discounted the social and cultural values of what might get broken. Partly in consequence, the previously admired prodigies of Silicon Valley start-ups are facing intense social scrutiny in 2017, a result in part of their ignorance of human fallibility and conflict.

Hartley is on to a real problem, but he needs to do much more homework to see how firmly the false dichotomy between the sciences and the humanities is rooted in American (and worldwide) culture. The tendency, for example, to regard undergraduate majors as job preparation rather than as disciplined thinking, focused interest, and curiosity is so widespread that even Barack Obama displayed it. ("Folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree" --Barack Obama's remark in Wisconsin in 2014; he did retract it later.)

Genuine discussion of the values of humanities and STEM degrees can only take place with the disciplined thinking, awareness of traditions, and respect for diversity that are hallmarks of a true liberal arts education.

The recent acquisition of BePress & Digital Commons by Elsevier has occasioned a snowstorm of commentary and opinion.  Some of that has not been helpful, even though well-intended.

The recent acquisition of BePress & Digital Commons by Elsevier has occasioned a snowstorm of commentary and opinion.  Some of that has not been helpful, even though well-intended.  Sacred Heart University Library belongs to a 33-member group called the Affinity Libraries Group.  We are all libraries at private, Master's-1 universities (some with several doctoral degrees), mid-sized between the Oberlin Group of liberal arts college libraries and the Association of Research Libraries (ARL).

Much of the following is going to be discussed at a meeting alongside or outside the coming CNI meeting in December in Washington DC --but since CNI is expensive ($8,200/year), SHU is not a member, nor, I suspect, are other Affinity Libraries.  I am hoping that, using one technology or another, the Affinity Libraries can have a conversation as well.

The Affinity Group has changed over the years; we (or they, meaning our predecessor directors) used to meet often, sometimes in quite successful stand-alone events not connected with another gathering such as ALA Annual.  Others have said to me that in some ways the Affinity Group (as it was then) really came down to “professional and personal friends of Lew Miller” (the former director at Butler), and while I’m not sure that’s fair, it is accurate in the sense that personal relationships formed a strong glue for the group. As directors retired or moved on, the group's adhesiveness changed accordingly. I’m avoiding the word or metaphor “decline” here because sometimes things just change, and the Affinity Group is one of them.  No one has been sitting around in the meantime.

We do share a strong commitment to the annual Affinity Group statistics. Perhaps now a discussion about institutional repositories and Digital Commons in particular could garner some interest with attention directed to issues for libraries of our size.

Some of the hoopla surrounding Elsevier’s acquisition of BePress has simply given contributors occasion to express their intense dislike of Elsevier and its business model of maximizing profits above all else, certainly a justified objection given the state of all our budgets.

I think the anonymous Library Loon (Gavia Libraria) has pretty well summed up various points (though I don’t agree with every one of her statements), and Matt Ruen’s subsequent comment on August 9 is also helpful.  Paul Royster at University of Nebraska—Lincoln wrote on September 7 on the SPARC list:

The staff at BePress have been uniformly helpful and responsive, and there is no sign of that changing. They are the same people as before. They have never interfered with our content. I do not believe Elsevier paid $150 million in order to destroy BePress. What made it worth that figure was 1. the software, 2. the staff, and 3. the reputation and relationships. BePress became valuable by listening to their customers; Elsevier could learn a lot from them about managing relationships--and I hope they do.  BePress is also in a different division (Research) than the publications units that have treated libraries and authors so high-handedly. The stronger BePress remains, the better will be its position vis-a-vis E-corp going forward. Bashing BePress over its ownership and inciting its customers to jump ship strikes me as not in the best interests of the IRs or the faculty who use them.

Almost every college library has relationships with Elsevier already; deserting BePress is not a moral victory of right over wrong. The moral issue here is providing wider dissemination and free access for content created by faculty scholars. No one does that better than BePress, and until that changes, I see no cause for panic. Of course there are no guarantees, and it is always wise to have a Plan B and an exit strategy. But cutting off BePress to spite their new ownership does not really help those we are trying to serve.

I share Royster’s primary commitment to disseminating freely the content created by faculty scholars. Digital Commons has done that for SHU in spades, and in my experience has been a game-changer in this university and library. I know that many share such a primary commitment; many also share an enduring and well-grounded suspicion of just about anything Elsevier might do.  As a firm, its behavior has often been downright divisive and sneaky (we can all tell our stories…).  When I first read of the sale, my gut response was, “Really? Great, here’s a big problem when I don’t really want another.”   Digital Commons is one of the three major applications that power my library: 1) the integrated library services platform; 2) Springshare’s suite of research & reference applications; and 3) BePress.  Exiting BePress would be distracting, distressing, and downright burdensome.  As Royster writes, “there are no guarantees.”  Now we have to have a Plan B and an exit strategy, even if we never use it.

What I fear most is Gavia Libraria’s last option (in her blog post): that Elsevier will simply let “BePress languish undeveloped, with an eye to eventually shrugging and pulling the plug on it.”  I have seen similar “application decay” with ebrary, RefWorks, and (actually) SerialsSolutions, several of which have languished (or are languishing) for years without any genuine further development.  I watched their talented creators and originating staff members drift away into other ventures (e.g., ThirdIron).  Were that to happen, it would be bad news for SHU and other Affinity members.  Royster’s statement “they are the same people as before” has not always held true in the past when smaller firms became subject to hiring processes mandated by larger organizations (e.g., SerialsSolutions’ staff members now employed by ProQuest).

On SPARC’s list, there has been a great discussion about cooperating to build a truly useful non-profit, open-source application suite for institutional repositories, digital publishing, authors’ pages (like SelectedWorks), and the rest.  Everyone knows that’s a long way off, without any disrespect to Islandora, Janeway, DSpace, or any other application.  Digital Commons and SelectedWorks are pretty well the state of the art, and their design and consequent workflow decisions have benefited the small staff of the SHU Library enormously (even with the occasional hiccups and anomalies). The Digital Commons Network has placed SHU in the same orbit or gateway as far larger and frankly more prestigious colleges and universities, and I could not be happier with that.  I have my own SelectedWorks page and I like it.  I would be sorry to see all this go --unless a truly practical alternative emerges.  Who knows when that will be?

In the meantime, we will be giving attention to Plan B --until now we have not had one or felt we needed one (probably an unfortunate oversight, but it just did not become a priority).  I really don’t yet know what our Plan B will be.

I sense that if OCLC were to develop a truly useful alternative to Digital Commons (one well beyond DSpace as it presently exists), it might have some traction in the market (despite all of our horror stories about OCLC, granted).  Open Science Framework, Islandora, and others hold promise but probably cannot yet compete feature-by-feature with Digital Commons (at least, I have not seen anything that comes really even close).  If you think I’m wrong, please say so! --I will gladly accept your correction.