
Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.

After doubling down, Clay Christensen has tripled down. This is a familiar ploy: say something doubtful often enough, and repetition will seem to make it come true. In the words of one critic, Christensen is to business what Malcolm Gladwell is to sociology.

Christensen and Michael B. Horn, the former the apostle of disruptive innovation and the latter his St. Timothy, recently repeated their claim that 50% of colleges would fail in the next decade, or 10-15 years. Their goal line has been reasonably consistent: in 2011 it was "as many as half" (we're 8 years in); in 2013 it was 25 percent in the next 10 to 15 years (6 years in); in 2017 it was "as many as half" within a decade (2 years in). In 2019 Christensen and Horn write that "some college and university presidents . . . tell us in public and private settings that they think the 50 percent failure prediction is conservative -- that is, the number of failures will be far higher." Names? Places? And does executives' saying so make it true (is the presumption of their predictive competence warranted)?

But the reasons for this prediction keep changing, and there's the ploy: save the effect but change the cause. One is reminded of the late Sydney Brenner's Occam's Broom: "sweep under the carpet what you must to leave your hypotheses consistent." The reason for such precipitous closures, foreseen (in 2011) for the period 2021-2026, was disruption by innovative technologies: the subtitle of The Innovative University is Changing the DNA of Higher Education from the Inside Out. The main idea is that "in the DNA" of American colleges and universities is a desire to be like Harvard: wealthy and comprehensive. By contrast Brigham Young University/Idaho exhibits a completely different, innovative strand of DNA: "in how it serves students by a combination of distance learning, on-site learning, and lower-cost alternatives to residential college" (quoting my review from November 2012).

Implicit in the metaphor of DNA is a certain determinism: you cannot change your own DNA, after all (without extremely powerful, still-developing technologies that occasion many moral questions). Your DNA determines, in this view, what you will be: such "Harvard" DNA will be a fatal flaw in many colleges and universities, according to Christensen and Eyring. Can you really change your DNA from the inside out? It's a clumsy metaphor: no wonder Christensen has abandoned it.

That 2011 book also ignored several inconvenient facts about BYU/Idaho. Mormons get a considerable price break there that "Gentiles" do not receive --pointing obviously to a hefty subvention from LDS sources. BYU/Idaho is, after all, a satellite of a powerful, wealthy, comprehensive mother ship in Provo, Utah: the satellite campus can hardly serve as a synecdoche, as though what is true of the part were true of the whole. The economics of BYU/Idaho and the considerable technological subsidy its online instruction receives from the mother system are simply left out. Apparently Occam's Broom works well in Idaho and Utah.

Is Christensen's central claim even true --that the DNA of American colleges and universities propels them to desire to become Harvard? What about those institutions that could have become Harvard (or Michigan), and chose a different path? Did a liberal arts college that chose to remain a liberal arts college necessarily thereby fail its DNA? Ask too many questions, and the whole edifice collapses.

Christensen's account and predictions rely on a very superficial knowledge of the history of higher education. That lack of knowledge allows him to claim that nothing has really changed in American higher education in 150 years. How about women's education (one of Christensen's real blind spots)? How about public community colleges? A comparison: religious groups in the Abrahamic traditions often work by one person with some kind of authority meeting and talking to others who are to be instructed. Isn't the real point what they say? --words with profound differences that a reductive similarity of communication would mask. Closer to home, James McCosh (of Princeton) and Clayton Christensen (of Harvard Business School) both stand or stood in front of others and talked: is that really the sum of development between them? Again: Occam's Broom.

In their 2019 opinion piece, Christensen and Horn move the goal-posts: now the cause of institutional failure will be changing business models driven by implicitly disruptive technologies. Does anyone remember the educational TV boom in the 1960s? That too was a changing business model. Of course business models are changing, and have changed in the past: again, consult a deeper understanding of American higher education. All current debates about business models, missions, curricula, and the needs of students have a very long history; see, for example, Crisis on the Campus by Michael Harrington (ca. 1951), or the famous General Education in a Free Society (1945). Students have never been mere passive recipients (although the contemporary view of them as "consumers" drives them to passivity). From the colonial and denominational colleges to the land-grant colleges and public universities: the business models have evolved. The business model in higher education is changing as we speak.

In other words, Christensen and Horn present the same tired and superficial nostrums of ten years ago. Even the historical examples in Christensen's ur-text, The Innovator's Dilemma (1997), are questionable. Predicting the apocalypse is an old business.

I have argued elsewhere that the fearsome language occasioned by "disruptive technologies" has origins in the pre-millennial Restorationist theologies of early 19th-century frontier America, especially the Burned-Over District of upstate New York, showcased pre-eminently by Joseph Smith, Jr. There's a reason that his Latter-Day Saints are latter. Christensen was formed in a community that stands by Smith's proclamations. I do not pretend that he has smuggled theology into business, but rather that there is an elective affinity between such mainstream LDS thinking and business disruption: Joseph Smith Jr. was supposed to put the rest of Christianity out of business (the "great apostasy" from the 1st century to 1830). How has that worked out for the Latter Day Saints? (--and never mind the very current cosmetic name changes).

I continue to wonder whether discussions of disruptive innovation in higher education are in fact a cloak for expediting other changes, less technological but no less disruptive, initiated by senior academic leadership. To be a disciple of "disruptive innovation" means you're a member of that club. Is this called groupthink? Has it served GE well?

So will somewhere around 50% of American colleges and universities fail in the next decade? This might be the case, but not for Christensen's and Horn's (and Eyring's) reasons. In their recent post in Inside Higher Ed, they cite situations in New England. I live in Connecticut: this is daily reality for me. Demographics are shifting: colleges and universities in the northeast quadrant of the continental US are going to have a hard time on that basis alone. Some have already closed, and more probably will --but not because of changing business models driven by disruptive technologies. Demographic change exerts a constant pressure not unlike climate change. The real question is: who can adapt, and how well? The population probably will not achieve replacement rate, even in the southwest.

American higher education may be entering a "perfect storm" of demographic change, economic turmoil, and moral and cultural drift or outright corruption (e.g. the recent admissions scandals). All of this is cause for deep concern; none of it depends upon the snake-oil of "innovative disruption" via technologies that power changing business models. For many institutions, strategies of differentiation based upon price point, purpose, and location will matter a great deal. Strategies based on pure number crunching accompanied by credulous faith in technologies will probably not work. Online education is here to stay, but it will not disrupt on-ground education as much as non-technological demographic trends will. This is not the stuff of disruption, but of long-term anticipated and unanticipated consequences of historical change, about which Harvard Business School professors have no more particular expertise than anyone else. When disruptive innovation gets dumbed down, it isn't disruptive anymore, but just change.

Change is hard. Adaptation to changing conditions is hard enough without the burden of misplaced, scary, apocalyptic language.

The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media.


Typewriters, mechanical watches, vinyl recordings, newspapers, printed books --obsolete technologies, right? Get with the program: countless incumbent industries and professions have been rendered pointless: disrupt or be disrupted --right? This has been the dominant cultural narrative --right?

I first heard about the obsolescence of librarians 35 years ago, at the start of my career. Columbia University soon afterward accepted the dominant cultural narratives and closed its graduate library school, college of pharmacy, and departments of geography and linguistics. Pharmaceuticals? Digital and print librarians? Linguistics and languages? Geographic information systems? --all obsolete (whoops!). Since those who proclaim a profession's demise have usually been selling some replacement, cynicism follows fast. Another day, another prediction of demise.

Entirely outside of libraries, a counter-narrative has grown. David Sax popularized one version in The Revenge of Analog: Real Things and Why They Matter (PublicAffairs, 2016): we interface with the world in tactile, communal ways. At Harvard Business School, Prof. Ryan Raffaelli studies organizational behavior using field research, and contributes much more sophisticated thinking about re-emergent technologies. He has found that "incumbent" technologies and industries can make a comeback. This story has important implications for libraries.

Some technologies re-emerge from disruption and destruction, especially those with a long history. Count out VHS tapes and punch cards: those were transitional. Typewriters have had a long enough history, as have fountain or nib pens (which have extended the dip, quill-type pen since 1827).

Printed books, like other technologies, brought whole occupations and kinds of work with them: not just printers, but also binders, sellers, retailers, and of course librarians. The printed book was a prime candidate for "innovative disruption" by digital books, and its demise, so loudly proclaimed ten years ago, mandated the demise of bookstores, libraries, librarians, publishers, and editors. Now anyone can write a book (see Amazon); who needs editors? Who needs libraries or bookstores?

Some disruptions are truly innovative; others are just disruptions, and others just hype shouted as real (see previous post). The disruption narrative itself is not incorrect (although it can be applied poorly), but its corollary --that incumbent industries are necessarily unable to adapt and certain to die-- is less well-founded. Raffaelli's research shows that technologies can re-emerge through a cognitive process in two phases: the first a largely cultural, temporal, and narrative process; the second a competitive process in a re-defined market with distinctive values not strictly established by price. His leading example is the Swiss mechanical watch-making industry; his second is the return and rise of independent booksellers in the USA.

Both the watchmakers and the booksellers lost substantial market share when disruptive, good-enough technologies moved upmarket and claimed their most profitable customers: the watchmakers with the rise of cheaper, more accurate quartz watches in the 1970s; the booksellers with the rise of major chain bookstores in the 1990s, followed by Amazon. They keenly felt their losses: numerous Swiss firms closed or discontinued manufacturing, and from 1995 to 2009 around 1,400 bookstores closed. Enough hung on, however, to rebound. How did they do it?

Raffaelli identifies the terms of competition: old terms such as price, availability, and quality change when disruptive technologies enter a market. The survivors re-defined the competition: how they want to compete, and what value proposition they offer their customers. He traces a complex process of de-coupling product from organizational identity, and of renegotiating foundational concepts and business roles. The process is both bottom-up (from the "factory floor" of fundamental, front-line production or service) and top-down (from industry alliances, design thinking, and organizational management).

In the Swiss mechanical watch industry, he has identified entrepreneurs and guardians. Entrepreneurs are alert to market signals, cultural currents, and emerging narratives that suggest that new communities are forming new values. Guardians, by contrast, preserve older technologies and enduring values and counterbalance the entrepreneurs; both are necessary for the process of cognitive re-emergence. When the industry drew near to complete collapse, collectors began to purchase mechanical watches at high prices at auctions, signaling that their small community found genuine value, expressed monetarily in price. Entrepreneurs realized that the market for mechanical watches had not disappeared, but changed: the value lay not in keeping time for a price, but in expressing a cultural signal. Guardians, meanwhile, had preserved enough of the technology that recovery was possible; veteran employees preserved crucial tools and skills that enabled a recovery. Each needed the other; the leadership necessary for re-emergence arose not just from the top level of the organization and industry, but from the commitment and wisdom of key skilled workers. Mechanical watches were then marketed as high-end, luxury items that "said something" about their owners. As new customers entered or moved up-market, they adopted such watches as a sign of cultural status and belonging.

Independent booksellers successfully re-framed their market as primarily about community and only secondarily about inventory. First the chain stores (Borders, Barnes & Noble) out-competed them on price, then Amazon on price and inventory availability. Independent booksellers have focused instead on 3 Cs: Community and local connections; Curation of inventory that enhances a personal relationship with customers; and Convening events for those with similar interests: readings, lectures, author signings, and other group events. The booksellers' trade association (the American Booksellers Association, or ABA) supports booksellers' connections with local communities through skills, best practices, effective use of media, and outreach to other local businesses and organizations (--even libraries, once considered the booksellers' competitors). The re-emergent market was defined by entrepreneurial booksellers, front-line service guardians, a growing social movement committed to localism, and industry-scale cooperation. Between 2009 and 2017 the number of ABA member bookstores grew from 1,651 to 2,321 nationwide --roughly 40% more independent booksellers. A sign of the booksellers' integration with community spaces: 2017 sales were up 2.6% over 2016.
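As a quick sanity check on that growth figure, the arithmetic is simple percent change over the store counts cited above (a minimal sketch in Python; only the two counts are taken from the ABA figures):

    # Percent growth in ABA member bookstores, using the counts cited above
    stores_2009, stores_2017 = 1651, 2321
    growth = (stores_2017 - stores_2009) / stores_2009
    print(f"growth 2009-2017: {growth:.1%}")   # -> growth 2009-2017: 40.6%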

Like independent bookstores, the "library brand" remains strongly bound to printed books --after all, the English name derives from the Latin liber (book), just as "bibliotheca" derives from the Greek biblos. The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media. This brand identity will persist even though libraries offer many kinds of resources in many formats --including millions of digital books.

What does such technology and market re-emergence have to do with libraries? These cases suggest that the emerging re-definition of libraries (as both old and new) is analogous to much of what Raffaelli describes, and that the narrative frame of "disruptive innovation in higher education" can be --should be-- challenged by this more useful counter-narrative: "new and re-emergent technologies in higher education."

While libraries' role as mere "book providers" has been challenged by disruptive technological entrants such as the Internet, Amazon, and social media, libraries' role as a channel for trusted, stable information is stronger than ever. Pew Research Center survey data from fall 2016 found that 53% of Millennials (those 18 to 35 at that time) said they had used a library --a whole generational cohort, not just college students (the study focused on public libraries). This compared with 45% of Gen Xers, 43% of Baby Boomers, and 36% of the Silent Generation. In 2016 Pew also reported growth in the share of those who say libraries help "a lot" in deciding what information they can trust: from 24% in 2015 to 37% in 2016. Women held that opinion more strongly (41%). Recent anecdotes suggest that such opinions have not changed direction.


Libraries are regarded as very strong assets to a community: the high values placed on pleasant space, safety, and community events also emerged in the Pew studies. Coupled with bottom-up initiatives from front-line librarians and individual organizations, the American Library Association has devoted substantial attention and resources to initiatives such as the ACRL Framework for Information Literacy in Higher Education and the Libraries Transform campaign. Libraries' free-to-all traditions (supported by tuition, tax dollars, and other sources) do not track community impact as easily as independent bookstore sales figures do. Their value proposition for their communities becomes clear in usage figures (at SHU, growth in usage has outpaced growth in enrollment) and in faculty's documented turn toward librarians for help in developing undergraduate students' research, critical analysis, and information literacy skills.

As a re-emergent technology, printed books sustain a host of skills, occupations, organizations, and cultural signals that do not boil down to a single, simplistic, marketable narrative. Conceived in the late 20th century as "information resources," books gave way to digital representation; conceived as "documented knowledge," the act of reading books in a library context provides a tangible experience of informed learning, cultural absorption, and community participation. Libraries provide many services. Without the "brand" of reading books, and the sustaining services of librarians, libraries would turn into derelict, zombie storage spaces. Knowledge is a communal good as well as a private act; it is never simply an individual achievement: free to all. We are all culturally embedded in the minds of our predecessors and communities, for weal and woe --and without libraries, bookstores, timekeepers, and printed books, we will not be able to progress from woe to weal.

 

Joseph Smith Jr.'s role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the reception of Clayton Christensen's theory of "disruptive innovation" as a popular idea.

NB This is a very long post, and might be more readable in this .pdf document.

Abstract: Christensen's theory of disruptive innovation has been popularly successful but has faced increasing scrutiny in recent years. Christensen is a Latter-day Saint, and the career and ideas of Joseph Smith, Jr. form a powerful backdrop for Christensen's theory. Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of "disruptive innovation" as an idea. Uncritical embrace of the popular theory, especially in higher education, implies acceptance of its cultural assumptions and suggests that the theory is less useful than has been claimed. This essay is a reflection on history and experience, not a theoretical quid-pro-quo or take-down.

Something always struck me as oddly familiar and decidedly off-key about Christensen's confident claims that innovation would explosively disrupt American higher education.  A previous boss dismissed my views as the reflex of one in a dying profession (librarianship). 

I never accepted her cynical critique, but neither could I put my finger on why I still disagreed so strongly. Then, while teaching a historical survey course on American religion, a new insight came to me about the idea's interesting and problematic deep structure. My insight sprang from some familiarity with both the discourse of innovation in higher education and 19th-century American religion, two widely known but widely separated fields. What I have found will, I hope, give pause to thoughtful educational and academic leaders: uncritical embrace of "disruptive innovation" might implicate them in religious and cultural commitments they do not intend, especially if they lead faith-affiliated organizations.

The first backstory: Christensen first introduced disruptive innovation in 1995, in an article aimed at corporate managers rather than researchers. In 1997 he expanded his ideas in The Innovator's Dilemma, which caught on in the 2000s with managers, business writers, and consultants in a big way, with the aura supplied by Harvard Business School. Since 2011 he has predicted that as many as half of American colleges and universities would close or go bankrupt within ten to fifteen years (so in 2021-2026). He is still at it: in April 2017 he maintained his claim when speaking to Salesforce.org's Higher Education Summit: "I might bet that it takes nine years rather than ten." Sometimes he's been wrong (about the iPhone, for example), but he does not vacillate.

Disruptive innovation has become big business, popularized not only by Christensen (who has come to regret losing control of the term), but by a host of management consultants, pundits, and experts. Among librarians, David Lewis applied Christensen's ideas in 2004 and expanded them into a 2016 book that has on the whole been well received by those in my "dying profession." Predictable pushback came in 2014 from another Harvard professor (the historian Jill Lepore), followed by a detailed reexamination in the MIT Sloan Management Review, among others. Christensen in turn defended his ideas and reformulated some of them in 2015, on the 20th anniversary of his initial publication.

Three years later, if the concept has lost some valence or he's just wrong about higher education, why rehash this now and for the umpteenth time?

That's where the second backstory becomes relevant.  Christensen (of old Latter-day Saint stock) is not just coincidentally Mormon; that identity is central to his person and that background to his work. 

When I teach my historical survey of American religion, in due course we come to the so-called Second Great Awakening in the first half of the 19th century. Scholars give special attention to the "Burned-Over District" of western New York, home of many potent religious and political ideas associated with "Whig Evangelicalism": abolition, temperance, the rights of women, and reforms of public health, education, prisons, orphanages, etc. The District fostered not only mainstream Christian restorationist and evangelical movements (such as the Disciples of Christ or "Campbellites," the Methodists, and the Baptists), but also countless Millennialist, New-Thought, and Spiritualist communes, the Shakers, the Millerites (early Seventh-Day Adventists) --and Joseph Smith Jr.'s Latter Day Saints.

Smith resists casual dismissal: was he truly a prophet of the living God (the mainstream Mormon view)? A womanizing fraud (the view of many of his contemporaries, and critics since)? A self-deluded prophet who eventually bought into his own fabrications and could not extricate himself (a sort of early David Koresh)? Or some kind of mystic or psychic with unusual access to otherworldly regions and the subconscious (a sort of frontier, raw-edged Francis of Assisi)?

Smith promulgated the enormous Book of Mormon (Skousen's critical first edition is 789 pages). He claimed an angel guided him to find ancient plates in a hill near Palmyra, New York, which he translated from unknown languages with the help of seer stones and a hat, dictating on the fly to his wife Emma Hale Smith and others, all in 65 days. Even if he made it up, or shared authorship, or plagiarized part, it is an amazing performative achievement. A densely layered anthology of documents, speakers, and authors, the text can be regarded (if not as Scripture) as an early forerunner of "magical realism." All this from a 20-something farmhand with only a little primary education.

Smith was truly a "rough stone rolling" whom generations of Mormons have never managed to smooth. "No man knows my history," he is reported to have said; "I cannot tell it: I shall never undertake it. . . . If I had not experienced what I have, I would not have believed it myself." His innovative edginess contrasts strongly with the earnest, family-oriented, upbeat, corporate image of contemporary Mormons.

"Innovative" -there's that word.  In matters of religion, innovation is not always a positive idea.  Smith's most famous innovations were the origins of the Book of Mormon, and the "new and everlasting covenant" (revelation) of both eternal marriage, and the principle of plural marriage.  The last innovation directly challenged 19th century American ideas about the family, and occasioned a furious opposition of a scale rarely seen in American history (leaving aside for the moment the historical plague of racism).  Latter Day Saints were opposed in Ohio, persecuted in Missouri (the Governor ordered extermination); Smith was murdered in 1844 in Illinois by a lynch mob acting out of congeries of fears.

The subsequent succession crisis would have led to fatal splintering were it not for Brigham Young. A considerable majority of Mormons followed his leadership from Illinois in the Great Trek to Salt Lake City (1846-1848); Young's organization both preserved and transformed the Latter-day Saints: they lost their prophet but gained a hyphen. The founder's innovations would have perished without Young's tenacity, sheer longevity (he died in 1877), and "courageous leadership" or "iron-fisted rule," depending on your point of view.

These two long backstories are essential for understanding both the meteoric rise of "disruptive innovation" and its recently waning appeal as an explanatory theory in light of qualms about its accuracy.

Joseph Smith, Jr., can be seen as an exemplary disruptive innovator.

"'Disruption' describes a process whereby a smaller company with fewer resources is able to successfully challenge established incumbent businesses." While incumbents focus upon improving their product or service (especially on behalf of their most profitable customers), they tend to ignore less profitable market sectors, or exceed their needs.  Entrants seek out those ignored market segments and can gain customers by providing more suitable products or services, frequently simpler and at lower cost. Incumbents can fail to respond in kind while entrants "move upmarket," improve their offerings, maintain their advantages and early success, and eventually drive out or acquire incumbents when their formerly most profitable customers shift purchases to the entrants. (Christensen, 2015). 

Since Christensen has complained that terms have become "sloppy" and undermine his theory's usefulness, I have tried to paraphrase his core idea carefully, and to present Mormon history even-handedly.  My central claim is that Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring, cultural matrix that later facilitated the popularization of “disruptive innovation” as an idea.  Christensen's religion did not cause him to create his theory, but did contribute a framework that fostered its reception, as well as its swift extension beyond its first cases in manufacturing disc drives, construction implements, and other tangibles.

Christensen's commitment to his Church is beyond question; the intellectual, doctrinal traditions of his faith powerfully molded his character and thinking, by his own admission.  By all accounts he is a charming and sincere man, and would never have smuggled his religion into his professional thinking; he is an honest and forthright broker.  This essay is a reflection on history and experience, not a theoretical quid-pro-quo, or take-down.

Smith's movement "came to market" at a critical time in American religious history. The Constitutional de-coupling of religious institutions from the state apparatus was one of the primary innovations of the new nation. The new "open market" meant that the incumbent, legally established, financially supported Christian churches had to compete for adherents and support. In Connecticut, for example, the "standing order" of political and religious aristocracy came to an end in 1817. Such competition challenged the established churches' previous role as assumed arbiters of social legitimacy. Given a wide field of religious choices, one could also choose "none of the above." National anxieties about the declining status of established forms of Christianity in large part fueled the striking resurgence of Christian groups loosely termed "evangelical."

A significant portion of those evangelical groups can be described as "restorationist," appealing to a New Testament proclamation of God's restitution of all things. This was taken to mean a return to the "primitive church," a re-pristination of Christianity --taking it back to the putative purity of the founders. This led to a proliferation of religious bodies, almost all of which inherited from earlier centuries the binary certitude that each was "correct" and hence the others were necessarily "incorrect." Each group warranted its appeals by the lost purity of the first Christians. For many groups, the path back to purity had been cleared by the disestablishment that curbed the incumbent churches, and they hoped that clearance would level their status.

Since the religious marketplace of early 19th-century America had only recently been opened, doctrinal disputes that now seem arcane often paralleled heated social and cultural divisions. Smith's own family mirrored the national situation: his grandfather was a Universalist (holding that all humans could receive God's corrective grace; Universalists opposed evangelicals); his mother has been identified as Presbyterian (officially, she would have believed that God's eternal decree predestined some to eternal blessedness and foreordained others to eternal damnation; Presbyterians tended to be allied with evangelicals). Joseph Jr. may have caught a "spark of (evangelical) Methodism" at local rural revival meetings. His maternal grandfather and parents experienced visions and voices; like many farmers they used divining rods to find water and looked for buried treasure, a kind of folk magic. They believed in prophecy and vision, tended toward skepticism about "organized religion," and were receptive to new religious ideas. He is reported to have told his mother, "I can take my Bible, and go into the woods, and learn more in two hours, than you can learn at meeting in two years, if you should go all the time."

In the religious marketplace of western New York, Smith's family was typical of a market segment often ignored by the better-established groups, who appealed to more prosperous farmers, townspeople, and entrepreneurs (see Johnson's A Shopkeeper's Millennium). Smith's family, on the other hand, were downwardly mobile recent arrivals from Vermont without a network of support, a consequence both of poor decisions and of environmental strains such as the "Year without a Summer" (1816). They typify the impoverished, rural working class on the inner edge of the frontier, a down-market segment less promising to the more prominent groups, and one for whom the competitive religious marketplace was particularly nettlesome.

The 14-year-old Joseph was confused by the "cry and tumult" of Presbyterians vs. Baptists vs. Methodists, all using "both reason and sophistry" to "establish their own tenets and disprove all others." He asked, "What is to be done? Who of all these parties are right; or, are they all wrong together? If any one of them be right, which is it, and how shall I know?" In other words, his market segment saw that ecclesiastical competition compromised the integrity of all parties. Reading a biblical text that directed him to ask God, he went into the woods (again!) and reported that he experienced a dramatic theophany: the "Personage" answered "that I must join none of them," for their "creeds were an abomination" and their adherents "were all corrupt." His take-away: he realized "at a very early period of my life, that I was destined to prove a disturber and annoyer." Joseph's subsequent innovations certainly disturbed and annoyed other groups.

Care must be taken, however, in simply equating Joseph’s social location with a commercial market position, because the religious “marketplace” differs in important ways from commerce: product differentiation, lock-in, and brand loyalty.

The religious "product" is not a commodity, but a sense of living affiliation with a group that makes doctrinal, moral, and behavioral claims in such a way that simultaneous affiliation with more than one group is either prohibited or discouraged.  The ultimate outcome, some kind of eternal blessedness, in principle excludes other ultimate outcomes.  Today many children in "mixed" families can feel religious differences strongly (and opt for "none"). For example, an individual cannot be a Catholic in good standing and simultaneously an evangelical Baptist in good standing -their claims and ideas simply conflict too much; if both present in the same family, some accommodation must be reached.  Joseph Smith Jr. found such exclusive "product differentiation" troublesome.

Religious adherents' "market lock-in" is high: one might radically change one's affiliation once or twice in a lifetime, but changing more often is unusual and perhaps suspect, and "conversion" can exact high social costs. The religious fervor of New York's Burned-Over District in Joseph Smith, Jr.'s time left religious organizations in flux, so that conversion costs were often much lower than before or after. All early Latter Day Saints nevertheless had to make a clear decision that departed from their inherited religious affiliations.

A religious group's "brand loyalty" involves a constellation of commitments; socialist Fundamentalists and alt-right Episcopalians are vanishingly rare (for example). The brand loyalty of early Latter Day Saints evolved from 1830 to 1844, becoming progressively stronger both in positive response to Joseph Smith Jr.'s continuing revelations and in defensive response to violent persecution. For example, early Saints' constellation of commitments was ambivalent toward slavery: initially, as Northerners, early adherents opposed it; then revelations and teachings evolved to allow some slave-holders to join in Missouri (a slave state). After Smith's murder, his son Joseph Smith III and widow Emma Hale Smith repudiated both slavery and plural marriage in the Reorganized Church of Jesus Christ of Latter-day Saints in Missouri, the "minority" successor group. By contrast, Brigham Young's larger "majority" successor not only retained plural marriage but attempted to legalize slavery in the Utah Territory. Since Republicans, starting in 1854, sought to abolish the "twin relics of barbarism," slavery and polygamy (a jab at Young's group), it is unclear whether that commitment arose from core convictions or from defensive resistance.

"Disruptive innovation" in the religious marketplace has to be treated carefully, because of not only the special nature of the religious market place, but also rigorous examination of the idea of "disruptive innovation:" it does not mean just any disruption.

Whatever the sources of Joseph Smith Jr.'s ideas, he led a movement that "gain[ed] customers (i.e., adherents) by providing more suitable, often simpler products or services, frequently at a lower cost." (Latter-day Saints have never had professional clergy; their commitment to mutual assistance is exemplary.) Market incumbents (more organized and better financed competing groups) were slow to respond in kind, and as Smith's group moved "upmarket," it maintained its "advantages and early success" --high rates of "lock-in," group cohesion, and brand loyalty. Smith's group, however, never quite succeeded in driving the "incumbents" out of the market, or even in acquiring most of their former customers. Its sense of urgency lost its edge.

Why are the Latter-day Saints "latter-day"? This code phrase refers above all to a shared set of cultural and religious assumptions and commitments in early 19th-century America. "Millennialism" was the belief that the coming of the Kingdom of God (promised in the Bible) was imminent, and that America, with its special covenant of religious liberty, would be central to its arrival. Millennialism came in two distinct forms with opposite answers to the question, "Will Christ return before or after the promised millennium (1000 years) of peace?" Pre-millennialists emphasized the troubles (tribulations) that would both precede and signal Christ's return to reign for 1000 years before the Last Judgement. Post-millennialists proclaimed that the millennium of peace would culminate in Christ's return and the Last Judgement; their task was to make the world ready for the Kingdom of God. Both expected the Kingdom very soon: we are all living in the biblical "latter days."

This recondite disagreement has important implications. Post-millennialists were all about social reforms that would make the United States so like the Kingdom of God that American society would usher in the millennium. Pre-millennialists emphasized that Christ would come only after dramatically increasing tribulations. Things getting worse and worse was a sign of his approach --hence they disfavored social reforms as a distraction from the real work of preparation for evil times. (Historical aside: the War of Secession largely discredited post-millennialism, which morphed into the program of social reforms in the Progressive era. Pre-millennialism evolved into dispensational Christian fundamentalism, combining the expectation of tribulation with a belief in the factual, literal inerrancy of the Bible.)

Latter-day Saints' enduring post-millennialism shows, among other ways, in their boundless optimism.  The familiar, earnest young missionaries (think The Book of Mormon, the Broadway show) are a token of the continuing commitment of the Latter-day Saints to usher in the latter days, although they expect them less imminently. Millennialism is common coin no longer.  Despite the popularity of the "Left Behind" series of books and movies, only a small minority of survivalists or "preppers" appeal to Biblical warrants for their expectations of imminent tribulations (disaster).

Detached from Christianity, expectations of imminent disaster and rebirth went rogue in American culture long ago. Silicon Valley today, for example, is home to many who expect a "singularity" in which artificial intelligence outstrips human intelligence and introduces profound changes to civilization as a whole --another sort of secular millennium, in which technology has replaced a Messiah as the anointed power. Popular movies and books have made dystopia a cultural cliché. (What's the disaster this time? Nuclear war, apes, viruses, climate change, or the abrupt disappearance of millions?) How many jokes about "voting in the last American election" (double entendre) play on similar fears?

"Disruptive innovation's" popularity exploded in the 1990s and 2000s exactly because of the numerous hopes and fears raised by the advent of the Internet and its devices and social media.  Josh Linkner warned, "disrupt or be disrupted," (The Road to Reinvention, 2014) and that binary choice spoke in apocalyptic tones to incumbent mass media, libraries, bookstores, journalists, travel agents, financial consultants, university presidents, and anyone else who deals in "information" as a commodity.  Such urgent warnings shout to established corporations, "The end is near: you can't innovate fast enough; you're not even the right people to do it."  Incumbent organizations were counted out simply because of their incumbency: MOOCs would inevitably disrupt brick-and-mortar educational institutions, now denigrated because of their mere physicality. 

The popular version of "disruptive innovation" played on dystopian fears of the collapse of known "incumbent" corporations and the rise of an economy of perpetual disruption --Schumpeter's capitalist creative destruction recast as "disruptive innovation" with a brutalist, binary emphasis: disrupt or be disrupted. The archetypal "creative disruptor" is the technological whiz-kid (I nominate the Mark, "Zuck Almighty") whose revelatory "Book of Faces" and continuing revelations of a "new and everlasting platform" will usher in a thousand-year era of effortless, limitless, and unfailingly upbeat social confabulation. Except when many kinds of terrorists, Russian autocrats, vaccine deniers, and deranged stalkers confabulate as well.

What does this have to do with Clayton Christensen?  Well, both a little and a lot.  He cannot deny his own popularization of his ideas through his books, media appearances, securities fund (the short-lived Disruptive Growth Fund, launched in 2000 at just the wrong time), and army of students, friends, and defenders such as Thomas Thurston in TechCrunch.  He lost control of "disruptive innovation" as a term of art precisely because of its appeal to those who make a living from in-your-face, counterintuitive claims.  Lepore identified circular reasoning in the popular idea of creative disruption ("If an established company doesn't disrupt, it will fail, and if it fails it must be because it didn't disrupt").  This logical circle may or may not characterize highly-disciplined case studies of Christensen's theory, but certainly rings true to the endless popular iterations.

Whether Christensen's theory holds up almost does not matter to "disruptive innovation" as a popular idea. By analogy, in Smith's America, as Terryl Givens has noted, what mattered about the Book of Mormon was not its teachings or particular message: "It was the mere presence of the Book of Mormon itself as an object that . . . served as concrete evidence that God had opened the heavens again." In that era all manner of revelations appeared: the Shakers' Holy, Sacred, and Divine Roll and Book, the visionary experiences of Ellen G. White (one of the originators of the Seventh-Day Adventists), and the visions of rival claimants to Smith's prophetic mantle among the Latter Day Saints after his death. Kathleen Flake has noted, "Henry Ford wanted a car in every home. Joseph Smith was the Henry Ford of revelation. He wanted every home to have one, and the revelation he had in mind was the revelation he'd had, which was seeing God." The heavens, once opened, proved harder to close.

The popular idea "creative disruption" has attached itself, meme-like, to a lot of second- and third-rate scams.  Business theory has fewer brightly defined disciplinary boundaries than physics. King's and Baatartogtokh's conclusion that the theory has limited predictive power does not render Christensen's ideas useless, but does suggest that "disruptive innovation" will not be the "one theory to rule them all," and with the profits or prophets bind them. 

Joseph Smith Jr. claimed that the encounter he had with the Holy in the woods warned him not to join any of the (Protestant) groups in his vicinity, whose creeds were all "corrupt" and "an abomination." Christian restorationists called the very early Christian movement of the short span reflected in the New Testament texts "the primitive church," and regarded all subsequent developments as not merely wrong but apostate: those who knew the truth had deliberately denied it. Joseph Smith, Jr. saw his new revelation as a giant delete key on all of Christian history: Eastern Orthodox, Catholic, and Protestant. All of it had to go.

In a similar manner, popular "disruptive innovation" connotes the destruction of all that is wrong with sclerotic corporate capitalism, and the restoration of the pure, "invisible hand" of the marketplace that allegedly guided early capitalists. This popular view resonates with a great deal of cultural and political libertarianism: giant corporations and government bureaucracy are an apostasy betraying the true faith handed down from the founders (either Jesus or Adam Smith, as you wish). "Move fast and break things," advised the Zuck; what can be disrupted should be disrupted. Including, it would now seem, democracy wherever it might be found.

Disciplined use of the theory of "disruptive innovation" in carefully defined circumstances provides explanatory clarity, but its predictive power is in fact more of a hope than an established fact, despite the protests of Christensen's defenders. This means that it is one theory among other theories: Michael Porter's theory of competitive advantage and multifactorial analyses will likely work equally well in other carefully defined situations. Similarly, The Church of Jesus Christ of Latter-day Saints has found ways of regarding other religious groups positively (even Evangelicals, often the most hostile), and has moved on from the language of "apostasy." Though their founder intended to hit that giant delete key, subsequent Latter-day Saints have found a way to live as active readers of their particular texts in the midst of many other readers of many other texts. This has relevance on the ground. Given official LDS teachings regarding divorce and homosexuality, some LDS families have found informal means to include and tolerate differences among their members, coming to resemble the family life Joseph Smith Jr. knew as a boy. (Others have continued to shun their "apostates.")

Unlike Smith, Christensen never intended to promulgate a "unified field theory" of religion or business development, and he is not completely responsible for losing control of his theory as a popular idea. The close of his twenty-year re-evaluation in 2015, "We still have a lot to learn," acknowledges that "disruption theory does not, and never will, explain everything about innovation specifically or business success generally."

Christensen's modesty still did not inhibit him from doubling down on his claim that half of American colleges and universities would close by 2025. Allowing his claim relative rather than revelatory validity dispels the apocalyptic fears of barbarians at the gates. His primary analogy in The Innovative University (2011) is "changing the DNA of higher education from the inside out" (the subtitle). He claims that all American colleges and universities other than a branch of Brigham Young University in Idaho share the DNA of Harvard: all these institutions want to become, apparently, Research-1 universities if given the money and the chance. What does that really mean, and is it really true? Such a simple analogy does grave injustice to community colleges (vital economic links for countless communities and immigrants), to specialized schools such as Maine Maritime Academy, and even to an elite liberal arts college such as DePauw University. The popular view that higher education has changed little and can change little is flat wrong: ask any productive historian of higher education. Change and innovation (whether disruptive or otherwise) will not appear equally and everywhere overnight. The higher education sector is not (thank heavens) Silicon Valley or McKinsey & Co.

Yet all is not well: the economic model underpinning American higher education is likely unsustainable in the coming decades, for many reasons. Higher education also represents a huge social and financial investment that is unlikely to dissipate. Distance education, information technology, changing social expectations, and shifting demographics will all play a role in whether particular colleges and universities can remain viable. Disciplined use of the theory of "disruptive innovation" will likely hold some, but not all, of the explanatory and predictive keys. The truth is out there, but it will be much more complex.

The striking persistence of the popular "disruptive innovation" in senior management circles (typified by the Salesforce.org higher education event) reveals not only persistent fears and enduring threats, but also short attention spans devoted to keeping up with the swift pace of too many events. I suspect that popular "disruptive innovation" functions in a manner more affiliative than explanatory: "if you don't get it, you're not one of us. --You think Jill Lepore, or King and Baatartogtokh, might be right, eh? Let's see how long you last in the C-Suite" (--especially if you can pronounce the latter's name).

"Disruptive innovation" elicits fears useful for those who want to shake up certain professions in health, law, librarianship, and the professoriate, but by now its been over-used.  At librarians' meetings (ALA, ACRL) I have developed the habit of responding to the expression, "disruptive innovation" with the question, "what are you selling?"  Fear sells "solutions;" its potency as a means of marketing continues nearly unrivaled.  No one ever sold an expensive library services platform with the phrase, "this won't solve all your problems."  Since 1985 I have sat through many presentations that predicted the closure of libraries within ten years - Christensen's remark "I might bet that it takes nine years rather than ten" would find a new audience.  We who are about to be disrupted salute you, Prophet.

Nevertheless: printed books, asking questions, research assistance, and personal relationships with library instructors endure.  They were warned, but they persisted.  It is past time to find a more accurate analysis and predictive theory of the future of libraries and higher education.

An academic library enacts a community of practice so that learners move beyond "standard answers" to understand the real questions, sensibilities, and aesthetics of their disciplines, and why they matter.

Clayton Christensen's impressive work on disruptive innovation (see previous post) arises from his examination of innovative developments in concrete products such as transistors, computer chips, and automobiles. His analysis has both an intellectual plausibility and an on-the-ground sense of touch.

One of his main points (to paraphrase crudely) is that the new innovation frequently is not, in fact, as good as the old, expensive, hard-to-get product --but for the innovation's users it's good enough.

Example: the incursion of foreign automobiles into the USA market in the 1970s, in particular German and Japanese cars. They had a reputation of not being as reliable as your grandfather's Oldsmobile or Buick (and maybe they weren't). I owned a 1967 Volkswagen bug, and it wasn't totally reliable. I later owned two successive Ford Pintos, the cars that exploded on rear-end impact (faulty gas tank). They were terrible, and I've never been persuaded to own another Ford. So while German and Japanese cars were regarded as less reliable (imagine that!), in fact the Big Three automakers were producing glitzy junk. No wonder younger drivers abandoned them in droves.

In that case, not only was the "new" product "good enough"; the former product had deteriorated. Earlier, the first transistor radios were only "good enough" (tinny sound), but they were a huge, portable improvement on the old tubes. These examples are clear and consumer-oriented, although Christensen's analysis also holds up very well in the case of computer chips, which are only secondarily consumer-oriented.

So what do academic libraries produce? The answer is much less clear than for radios and automobiles.

The old language about "the academic library supports the students and faculty" is insufficient. (See Scott Bennett's article.) The support role has been supplemented (if not replaced) by Google and other traffickers in information. That is the true innovative disruption of the academic library: Google (Amazon, etc.) is not "as good as" but is "good enough," and the exchange is not primarily financial (dollars for support) so much as time and effort. For many, using Google (etc.) is good enough: not much work, easy to use, and ubiquitous. Just think about the question, "Why is it so much easier to buy a book than to borrow a book?"

If the old support, service-oriented language is insufficient, what's left for academic libraries?

Real (or deep) learning happens in communities. In a community, learners internalize the implicit practices of a discipline that matter most. That's why they are called disciplines, not just subject matter --learning puts the schaft (schaffen = to create) in the Wissenschaft (wissen = to know) in German, the source of the model of the modern Ph.D. research university.

But the research university "DNA" is just what Christensen claims innovative organizations such as BYU-Idaho disrupt. There are several levels to his claim. Consider that this organization is called BYU (Brigham Young University) Idaho for a reason --it is basically oriented to the "mother ship" BYU in Provo, Utah. All of BYU offers a huge tuition subsidy to LDS students who are "temple worthy" (an LDS status indicating good standing): in 2011-2012, $4,560 vs. $9,120 for non-LDS students. Who teaches at BYU-Idaho? It doesn't produce its own faculty, but depends on other organizations (such as BYU Provo). While traditional faculty may face disruptive innovation in time, some alternative method of demonstrating certified expertise will then have to be found --consider the impact on medical or engineering education.

Whether or not every college has the "DNA" of "Harvard" (roughly, the Ph.D.-granting Carnegie Class One research university), deep learning still occurs in communities of practice. John Seely Brown (.pdf) writes:

Indeed, knowing only the explicit, mouthing the formulas, is exactly what gives an outsider away.  Insiders know more.  By coming to inhabit the relevant community, they get to know not just the "standard" answers, but the real questions, sensibilities, and aesthetics, and why they matter.

Notice Brown's verb inhabit. I'm sure that such a community can be inhabited via distance education modalities, but it takes a lot of work.

Libraries and librarians come to understand how people learn as self-directed, internalizing learners --the library is a learning enterprise without the structure of a direct learning environment (classroom or course management space). Students are intentional learners, not just users whose use of resources librarians facilitate.

The disruptive innovation presented by all kinds of information technology, and finally by Google, Amazon, iTunesU, MOOCs, and their kindred --this disruption forces the clarification of what an academic library produces: an environment where students take responsibility for their own learning.  Librarians enact the institutional mission of the university in the context of that environment.  

An academic library enacts a community of practice so that learners move beyond "standard answers" to understand the real questions, sensibilities, and aesthetics of their disciplines, and why they matter. Libraries are among the places where disciplinary outsiders can become knowledgeable, practicing insiders. The library enacts the schaft in the Wissenschaft.

 

Disruptive innovation in academic libraries can only be understood in the context of disruptive technology and the complex variables of what academic libraries actually produce.

"Disruptive innovation" has become such a buzz-word that it seems to have spawned a minor industry.  Disruptive innovation is "explained" in a video featuring Clayton Christiansen, the guru and (for practical purposes) the inventor of the phrase.  Let it be noted that Christiansen himself sticks to a careful, specific definition.  (The commentary spawned by this phrase, however, has taken it way beyond Christiansen's probable intentions.)  According to that video:

Disruptive innovation is not a breakthrough innovation that makes good products a lot better. . . . It transforms a product that historically was so expensive and complicated that only a few people with a lot of money and a lot of skill had access to it.  Disruptive innovation makes it so much more affordable and accessible that a much larger population have access to it.

Christensen goes on in his writings (such as The Innovative University, which I reviewed here) to add that such new, affordable products may be in fact inferior to the previous, expensive products --but that doesn't matter: the new products are good enough. Common examples that Christensen and others cite are transistors (in particular, transistor radios for consumers), automobiles, and computer chips.

One would be foolhardy --and probably wrong-- to doubt the basic wisdom of Christensen's insights. But how has this really played out in academic libraries? I've read a lot about this, but in the end I use what I have read to understand my own experience.

So what is "the product" that academic libraries produce, much less that universities produce?  This is where Christiansen's concepts get stickier.

So far as disruptive technology goes, my entire career in librarianship has enacted the disruptions.

Digression: In 1981 I began work as a cataloguer's assistant in the Historical Studies Library at the Institute for Advanced Study in Princeton --the original think tank of Albert Einstein, Kurt Gödel, John von Neumann, J. Robert Oppenheimer, George F. Kennan, and Erwin Panofsky, to name only a few of the true luminaries. A great deal of everyday life in the later 20th and 21st centuries is founded on the work of IAS scholars. While I was there I encountered --and listened to-- George Kennan, Sir Isaiah Berlin, Stephen Hawking, Irving Lavin, Albert O. Hirschman, and Clifford Geertz. For a humble cataloguer's assistant, a recent M.Div. from Princeton Theological Seminary, it was a very heady experience. (And this after three years in Princeton, where I remember encountering John Nash, Rudolf Carnap, John Fleming, and Carl Schorske.)

Anyway, what about being a cataloguer's assistant? I actually typed cards for the catalog. That kind of work disappeared by about 1985 (--but not before I stopped doing it in 1982). My first professional job as a cataloger (1986-1993) disappeared after I resigned it, as did my work as a Cataloging Associate at Princeton University and as Editor of a Union List of Serials for academic libraries in Rhode Island (1996-1998). The jobs I have held since --Head of Technical Services in what is now a constituent library of Columbia University, and Systems and Electronic Resources Librarian at Muhlenberg College-- still exist in some fashion, but the actual work --what those people do every day-- has changed dramatically.

In short, I used to do a lot of things that computers do now, mediated and supervised by human beings. Information technology in one sense disrupted that work, but not completely: professional-level care for database quality and consistency (i.e., avoiding garbage-in) is what makes the everyday world of the Internet go. It's built on the work of many, many people from long before "computers" stopped being people and started being things. (That obscure reference explained.)

Work in libraries from 1980 to the present has been a story of disruption, but also of continuity. Granted that the technology is disruptive, what about disruptive innovation? What does an academic library produce? That is less clear --and for that reason, some people argue, academic libraries shouldn't be funded: "Show your productivity in numbers, or be gone," in effect.