
The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media.


Typewriters, mechanical watches, vinyl recordings, newspapers, printed books --obsolete technologies, right? Get with the program: countless incumbent industries and professions have been rendered pointless; disrupt or be disrupted --right? This has been the dominant cultural narrative --right?

I first heard about the obsolescence of librarians 35 years ago, at the start of my career. Columbia University soon after accepted the dominant cultural narratives and closed its graduate library school, college of pharmacy, and departments of geography and linguistics. Pharmaceuticals? Digital and print librarians? Linguistics and languages? Geographic information systems? --all obsolete (whoops!). Since those who proclaim a demise have usually been selling some replacement, cynicism follows fast. Another prediction of demise, another day.

Entirely outside of libraries, a counter-narrative has grown. David Sax popularized one in The Revenge of Analog: Real Things and Why They Matter (PublicAffairs, 2016): we interface with the world in tactile, communal ways. At Harvard Business School, Prof. Ryan Raffaelli studies organizational behavior using field research, and he contributes much more sophisticated thinking about re-emergent technologies: he has found that "incumbent" technologies and industries can make a comeback. This story has important implications for libraries.

Some technologies re-emerge from disruption and destruction, especially those with a long history. Count out VHS tapes and punch cards: those were transitional. Typewriters have a long enough history, as do fountain or nib pens (extending the dip, quill-type pens since 1827).

Printed books, like other technologies, brought whole occupations and kinds of work with them: not just printers, but also binders, sellers, retailers, and of course librarians. The demise of the printed book --a prime candidate for "innovative disruption" by digital books, so loudly proclaimed ten years ago-- mandated the demise of bookstores, libraries, librarians, publishers, and editors. Now anyone can write a book (see Amazon); who needs editors? Who needs libraries or bookstores?

Some disruptions are truly innovative --others are just disruptions, and still others just hype shouted as real (see previous post). The disruption narrative is not entirely incorrect (although it can be applied poorly), but its corollary --that incumbent industries are necessarily unable to adapt and certain to die-- is less well-founded. Raffaelli's research shows that technologies can re-emerge through a cognitive process in two phases: the first a largely cultural, temporal, and narrative process; the second a competitive process in a re-defined market with distinctive values not strictly established by price. His leading example is the Swiss mechanical watch-making industry; his second is the return and rise of independent booksellers in the USA.

Both the watchmakers and the booksellers lost substantial market share when disruptive, good-enough technologies moved upmarket and claimed their most profitable customers: watchmakers with the rise of cheaper, more accurate quartz watches in the 1970s; booksellers with the rise of major chain bookstores in the 1990s, followed by Amazon. They keenly felt their losses: numerous Swiss firms closed or discontinued manufacturing, and from 1995 to 2009 around 1,400 bookstores closed. Enough hung on, however, to rebound: how did they do it?

Raffaelli identifies the terms of competition: old terms such as price, availability, and quality change with the entry of disruptive technologies into a market. The survivors re-defined the competition: how they want to compete, and what value proposition they offer to their customers. He traces a complex process of de-coupling product and organizational identity and renegotiating foundational concepts and business roles. The process is both bottom-up (from the "factory floor" of fundamental, front-line production or service) and top-down (from industry alliances, design thinking, and organizational management).

In the Swiss mechanical watch industry, he has identified entrepreneurs and guardians. Entrepreneurs are alert to market signals, cultural currents, and emerging narratives that suggest that new communities are forming new values. Guardians by contrast preserve older technologies and enduring values and counterbalance the entrepreneurs; both are necessary for the process of cognitive re-emergence. When the industry drew near to complete collapse, collectors began to purchase mechanical watches at high prices at auctions, signaling that their small community found genuine value, expressed monetarily in price. Entrepreneurs realized that the market for mechanical watches had not completely disappeared, but changed: the value lay not in keeping time for a price, but in expressing a cultural signal. Guardians, meanwhile, had preserved enough of the technology that recovery was possible; veteran employees preserved crucial tools and skills that enabled a recovery. Each needed the other; the leadership necessary for re-emergence arose not just from the top level of the organization and industry, but from the commitment and wisdom of key skilled workers. Mechanical watches were then marketed as high-end, luxury items that "said something" about their owners. As new customers entered or moved up-market, they adopted such watches as a sign of cultural status and belonging.

Independent booksellers successfully re-framed their market as primarily community and secondarily inventory. First the chain stores (Borders, Barnes & Noble) out-competed them on price, then Amazon on price and inventory availability. Independent booksellers have focused instead on 3 Cs: Community and local connections, Curation of inventory that enhances a personal relationship with customers, and Convening events for those with similar interests: readings, lectures, author signings, and other group events. The booksellers' trade association (the American Booksellers Association, or ABA) facilitates booksellers' connections with local communities through skills, best practices, effective use of media, and outreach to other local businesses and organizations (even libraries, once considered the booksellers' competitors). The re-emergent market was defined by entrepreneurial booksellers, front-line service guardians, a growing social movement committed to localism, and industry-scale cooperation. Between 2009 and 2017 the ABA reported roughly 40% more independent booksellers: from 1,651 to 2,321 nation-wide. A sign of the integration of booksellers with community spaces: 2017 sales were up 2.6% over 2016.

Like independent bookstores, the "library brand" remains strongly bound to printed books --after all, the name derives from the Latin "liber" (book), confirmed by the Greek "biblos." The printed book, once thought to be an obsolete technology, shows strong signs of re-emergence as a stable cultural experience not apt to be interrupted by digital distractions or the dopamine kicks of addictive social media. This brand identity will persist even though libraries offer many kinds of resources in many formats --including millions of digital books.

What does such technology and market re-emergence have to do with libraries? These cases suggest that the emerging re-definition of libraries (as both old and new) is analogous to much of Raffaelli's work, and that the narrative frame of "disruptive innovation in higher education" can be --should be-- challenged by this more useful counter-narrative: "new and re-emergent technologies in higher education."

While libraries' role as mere "book providers" has been challenged by disruptive technological service entrants such as the Internet, Amazon, and social media, libraries' role as a channel for trusted, stable information is stronger than ever. Pew Research Center survey data from Fall 2016 found that 53% of Millennials (those 18 to 35 at that time) said they had used a library --a whole generational cohort, not just college students (the study focused on public libraries). This compared with 45% of Gen Xers, 43% of Baby Boomers, and 36% of the Silent Generation. In 2016 Pew also reported that the share of Americans saying libraries help "a lot" in deciding what information they can trust rose from 24% in 2015 to 37% in 2016; women held that opinion more strongly (41%). Recent anecdotes suggest that such opinions have not changed direction.


Libraries are regarded as very strong assets to a community: the high values placed on pleasant space, safety, and community events also emerged in the Pew studies. Coupled with bottom-up initiatives from front-line librarians and individual organizations, the American Library Association has devoted substantial attention and resources to initiatives such as the ACRL Framework for Information Literacy for Higher Education and the Libraries Transform campaign. Libraries' free-to-all traditions (supported by tuition, tax dollars, and other sources) do not track community impact as easily as independent bookstores' sales figures do. Their value proposition for their communities becomes clear in usage figures (at SHU growth in usage has outpaced growth in enrollment) and in faculty's documented turn towards librarians to help undergraduate students develop research, critical analysis, and information literacy skills.

As a re-emergent technology, printed books sustain a host of skills, occupations, organizations, and cultural signals that do not boil down to a single, simplistic, marketable narrative. Conceived in the late 20th century as "information resources," books gave way to digital representation; conceived as "documented knowledge," books read in a library context provide a tangible experience of informed learning, cultural absorption, and community participation. Libraries provide many services. Without the "brand" of reading books, and the sustaining services of librarians, libraries would turn into derelict, zombie storage spaces. Knowledge is a communal good as well as a private act; it is never simply an individual achievement: free to all. We are all culturally embedded in the minds of our predecessors and communities for weal and woe --and without libraries, bookstores, timekeepers, and printed books, we will not be able to progress from woe to weal.

 

Burnett and Evans's chapters "How not to find a job" and "Designing your dream job" should be required reading for all college seniors --example: dysfunctional belief: I am looking for a job; reframe: I am pursuing a number of offers.

Designing Your Life: How to Build a Well-Lived, Joyful Life, by Bill Burnett and Dave Evans.

New York: Knopf, 2017. 238 p.  ISBN: 978-1-101-87532-2

I am reviewing a book that the library doesn't own, because you really should buy your own copy of this one. This is a book with directions, exercises, and useful "try stuff" pages that invite margin notations, doodles, objections, and questions.

"Designing your life" sounds strange at first. The book is based by two practitioners of "design theory" at Stanford who have taught one of the most popular courses there in the Design program --the home of "human-centered design." Burnett and Evans ultimately ask: what makes you happy? through the medium of design process: try, prototype, critique, try again:

  • Be curious;
  • Try stuff;
  • Reframe problems;
  • Know it's a process;
  • Ask for help.

They reframe a lot of dysfunctional beliefs. You won't have one perfect life; rather, you can have multiple plans and lives within you. We judge our life by the outcome; rather, life is a process, not an outcome. Their chapters "How not to find a job" and "Designing your dream job" should be required reading for all college seniors --example: dysfunctional belief: I am looking for a job; reframe: I am pursuing a number of offers.

This sounds very like happy-clappy self-help, but it's not: they ask some tough questions. They move in a counter-cultural direction: the things that we say we want (a good grade) are not the things that will really make us happy. It would be really interesting to read this book along with Kahneman's Thinking, Fast and Slow as a way to identify our own dysfunctional beliefs and cognitive fallacies.

Be curious. Go out and get a copy (or download a copy). Try stuff. Reframe your dysfunctional beliefs. Learn that happiness is a process, too. There is no right choice --there is only good choosing.

Joseph Smith Jr.'s role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of Clayton Christensen's theory of "disruptive innovation."

NB This is a very long post, and might be more readable in this .pdf document.

Abstract: Christensen's theory of disruptive innovation has been popularly successful but has faced increasing scrutiny in past years. Christensen is a Latter-day Saint, and the career of Joseph Smith, Jr. and his ideas form a powerful backdrop for Christensen's theory. Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of "disruptive innovation" as an idea. Uncritical embrace of the popular theory, especially in higher education, implies acceptance of cultural assumptions and suggests that the theory is less useful than has been claimed. This essay is a reflection on history and experience, not a theoretical quid-pro-quo or take-down.

Something always struck me as oddly familiar and decidedly off-key about Christensen's confident claims that innovation would explosively disrupt American higher education.  A previous boss dismissed my views as the reflex of one in a dying profession (librarianship). 

I never accepted her cynical critique; neither could I put my finger on why I still disagreed strongly. Then, while teaching a historical survey course on American religion, a new insight came to me about the idea's interesting and problematic deep structure. My insight sprang from some familiarity with both the discourse of innovation in higher education and 19th-century American religion, two widely known but widely separated fields. What I have found will, I hope, give pause to thoughtful educational and academic leaders: uncritical embrace of "disruptive innovation" might implicate them in religious and cultural commitments they would not otherwise accept, especially if they lead faith-affiliated organizations.

The first backstory: Christensen first introduced disruptive innovation in 1995 in an article aimed at corporate managers rather than researchers. In 1997 he expanded his ideas in The Innovator's Dilemma, which caught on in the 2000s with managers, business writers, and consultants in a big way, with the aura supplied by Harvard Business School. Since 2011 he has predicted that as many as half of American colleges and universities would close or go bankrupt within fifteen years (so in 2021-2026). He is still at it: in April 2017 he maintained his claim when speaking to Salesforce.org's Higher Education Summit: "I might bet that it takes nine years rather than ten." Sometimes he's been wrong (about the iPhone, for example), but he does not vacillate.

Disruptive innovation has become big business, popularized not only by Christensen (who has come to regret losing control of the term), but by a host of management consultants, pundits, and experts. Among librarians, David Lewis applied Christensen's ideas in 2004 and expanded them into a 2016 book that has been on the whole well-received by those in my "dying profession." Predictable pushback came in 2014 from another Harvard professor (the historian Jill Lepore), then from a detailed reexamination in the MIT Sloan Management Review, and from others. Christensen in turn defended his ideas and reformulated some of them in 2015, on the 20th anniversary of his initial publication.

Three years later, if the concept has lost some valence, or he's just wrong about higher education, why rehash this now and for the umpteenth time?

That's where the second backstory becomes relevant. Christensen (of old Latter-day Saint stock) is not just coincidentally Mormon; that identity is central to his person, and that background to his work.

When I teach my historical survey of American religion, in due course we come to the so-called Second Great Awakening in the first half of the 19th century. Scholars give special attention to the "Burnt-Over District" of western New York, home of many potent religious and political ideas associated with "Whig Evangelicalism": abolition, temperance, the rights of women, and reforms of public health, education, prisons, orphanages, etc. The District fostered not only mainstream Christian restorationist and evangelical movements (such as the Disciples of Christ ("Campbellites"), Methodists, and Baptists), but also countless Millennialist, New-Thought, and Spiritualist communes, Shakers, Millerites (early Seventh-Day Adventists) --and Joseph Smith Jr.'s Latter Day Saints.

Smith resists casual dismissal: was he truly a prophet of the living God (the mainstream Mormon view)? --a womanizing fraud (the view of many of his contemporaries, and critics since)? --a self-deluded prophet who eventually bought into his own fabrications and could not extricate himself (a sort of early David Koresh)? --or some kind of mystic or psychic with unusual access to otherworldly regions and the subconscious (a sort of frontier, raw-edged Francis of Assisi)?

Smith promulgated the enormous Book of Mormon (Skousen's critical first edition is 789 pages). He claimed an angel guided him to find ancient plates in a hill near Palmyra, New York, which he translated from unknown languages with the help of seer stones and a hat, and dictated on-the-fly to his wife Emma Hale Smith and others, all in 65 days. Even if he made it up, or shared authorship, or plagiarized part, it is an amazing performative achievement. A densely layered anthology of documents, speakers, and authors, the text can be regarded (if not as Scripture) as an early forerunner of "magical realism." All this from a 20-something farmhand with a little primary education.

Smith was truly a "rough stone rolling" whom generations of Mormons have never managed to smooth. "No man knows my history," he is reported to have said; "I cannot tell it: I shall never undertake it. . . . If I had not experienced what I have, I would not have believed it myself." His innovative edginess contrasts strongly with the earnest, family-oriented, upbeat, corporate image of contemporary Mormons.

"Innovative" -there's that word.  In matters of religion, innovation is not always a positive idea.  Smith's most famous innovations were the origins of the Book of Mormon, and the "new and everlasting covenant" (revelation) of both eternal marriage, and the principle of plural marriage.  The last innovation directly challenged 19th century American ideas about the family, and occasioned a furious opposition of a scale rarely seen in American history (leaving aside for the moment the historical plague of racism).  Latter Day Saints were opposed in Ohio, persecuted in Missouri (the Governor ordered extermination); Smith was murdered in 1844 in Illinois by a lynch mob acting out of congeries of fears.

The subsequent succession crisis would have led to fatal splintering were it not for Brigham Young. A considerable majority of Mormons followed his leadership from Illinois in the Great Trek to Salt Lake City (1846-1848); Young's organization both preserved and transformed the Latter-day Saints: they lost their prophet but gained a hyphen. The founder's innovations would have perished without Young's tenacity, sheer longevity (he died in 1877), and "courageous leadership" or "iron-fisted rule," depending on your point of view.

These two long backstories are essential for understanding both the meteoric rise of "disruptive innovation" and its recently waning appeal as an explanatory theory in light of qualms about its accuracy.

Joseph Smith, Jr., can be seen as an exemplary disruptive innovator.

"'Disruption' describes a process whereby a smaller company with fewer resources is able to successfully challenge established incumbent businesses." While incumbents focus upon improving their product or service (especially on behalf of their most profitable customers), they tend to ignore less profitable market sectors, or exceed their needs.  Entrants seek out those ignored market segments and can gain customers by providing more suitable products or services, frequently simpler and at lower cost. Incumbents can fail to respond in kind while entrants "move upmarket," improve their offerings, maintain their advantages and early success, and eventually drive out or acquire incumbents when their formerly most profitable customers shift purchases to the entrants. (Christensen, 2015). 

Since Christensen has complained that terms have become "sloppy" and undermine his theory's usefulness, I have tried to paraphrase his core idea carefully, and to present Mormon history even-handedly. My central claim is that Smith's role as an innovator, the "disruption" of Mormon proclamation, and the America that received or rejected his message all form an enduring cultural matrix that later facilitated the popularization of "disruptive innovation" as an idea. Christensen's religion did not cause him to create his theory, but it did contribute a framework that fostered the theory's reception, as well as its swift extension beyond its first cases in manufacturing disc drives, construction implements, and other tangibles.

Christensen's commitment to his Church is beyond question; the intellectual and doctrinal traditions of his faith powerfully molded his character and thinking, by his own admission. By all accounts he is a charming and sincere man, and would never have smuggled his religion into his professional thinking; he is an honest and forthright broker. This essay is a reflection on history and experience, not a theoretical quid-pro-quo or take-down.

Smith's movement "came to market" at a critical time in American religious history.  The Constitutional de-coupling of religious institutions from the state apparatus was one of the primary innovations of the new nation.  The new "open market" meant that incumbent legally established, financially supported Christian churches had to compete for adherents and support.  In Connecticut, for example, the "standing order" of political and religious aristocracy came to an end in 1817.  Such competition challenged the established churches' previous role as assumed arbiters of social legitimacy.  If you have a wide field of religious choices, you could also choose "none of the above." National anxieties about declining status of established forms of Christianity in large part fueled the striking resurgence of Christian groups loosely termed "evangelical." 

A significant portion of those evangelical groups can be described as "restorationist," appealing to a New Testament proclamation of God's restitution of all things. This was taken to mean a return to the "primitive church," a re-pristination of Christianity --taking it back to the putative purity of the founders. This led to a proliferation of religious bodies, almost all of which inherited the binary certitude from earlier centuries that each was "correct" and hence the others were necessarily "incorrect." Each group warranted its appeals in the lost purity of the first Christians. For many groups, the path back to purity had been cleared by curbing the incumbent churches through disestablishment, and they hoped that clearance would level their status.

Since the religious marketplace of early 19th-century America had only recently been opened, doctrinal disputes that now seem arcane often paralleled heated social and cultural divisions. Smith's own family mirrored the national situation: his grandfather was a Universalist (holding that all humans could receive God's corrective grace; Universalists opposed evangelicals); his mother has been identified as Presbyterian (officially, she would have believed that God's eternal decree predestined some to eternal blessedness and foreordained others to eternal damnation; Presbyterians tended to be allied with evangelicals). Joseph Jr. may have caught a "spark of (evangelical) Methodism" at local rural revival meetings. His maternal grandfather and parents experienced visions and voices; like many farmers they used divining rods to find water and looked for buried treasure, a kind of folk magic. They believed in prophecy and vision, tended towards skepticism about "organized religion," and were receptive to new religious ideas. He is reported to have told his mother, "I can take my Bible, and go into the woods, and learn more in two hours, than you can learn at meeting in two years, if you should go all the time."

In the religious marketplace of western New York, Smith's family was typical of a market segment often ignored by the better-established groups, which appealed to more prosperous farmers, townspeople, and entrepreneurs (see Johnson's A Shopkeeper's Millennium). Smith's family, on the other hand, were downwardly mobile recent arrivals from Vermont without a network of support, a consequence both of poor decisions and of environmental strains such as the "Year without a Summer" (1816). They typify the impoverished, rural working class on the inner edge of the frontier, a down-market segment less promising to the more prominent groups, for whom the competitive religious marketplace was particularly nettlesome.

The 14-year-old Joseph was confused by the "cry and tumult" of Presbyterians vs. Baptists vs. Methodists, all using "both reason and sophistry" to "establish their own tenets and disprove all others." He asked, "What is to be done? Who of all these parties are right; or, are they all wrong together? If any one of them be right, which is it, and how shall I know?" In other words, his market segment saw that ecclesiastical competition compromised the integrity of all parties. Reading a biblical text that directed him to ask God, he went into the woods (again!) and reported that he experienced a dramatic theophany: the "Personage" answered "that I must join none of them," for their "creeds were an abomination" and their adherents "were all corrupt." His take-away: he realized "at a very early period of my life, that I was destined to prove a disturber and annoyer." Joseph's subsequent innovations certainly disturbed and annoyed other groups.

Care must be taken, however, not simply to equate Joseph's social location with a commercial market position, because the religious "marketplace" differs from commerce in important ways: product differentiation, lock-in, and brand loyalty.

The religious "product" is not a commodity, but a sense of living affiliation with a group that makes doctrinal, moral, and behavioral claims in such a way that simultaneous affiliation with more than one group is either prohibited or discouraged.  The ultimate outcome, some kind of eternal blessedness, in principle excludes other ultimate outcomes.  Today many children in "mixed" families can feel religious differences strongly (and opt for "none"). For example, an individual cannot be a Catholic in good standing and simultaneously an evangelical Baptist in good standing -their claims and ideas simply conflict too much; if both present in the same family, some accommodation must be reached.  Joseph Smith Jr. found such exclusive "product differentiation" troublesome.

Religious adherents' "market lock in" is high: one might radically change one's affiliation once or twice in a lifetime, but more often is unusual and perhaps suspect, and "conversion" can exact high social costs.  The religious fervor of New York's Burned Over district in Joseph Smith, Jr.'s time left religious organizations in flux, so that conversion costs were often much less than before or after.  All early Latter Day Saints nevertheless had to make a clear decision that departed from their inherited religious affiliations.

A religious group's "brand loyalty" involves a constellation of commitments; socialist Fundamentalists and alt-right Episcopalians are vanishingly rare (for example).  The brand loyalty of early Latter Day Saints evolved from 1830 to 1844, becoming progressively stronger both in positive response to Joseph Smith Jr.'s continuing revelations, and defensive response to violent persecution.  For example, early Saints' constellation of commitments was ambivalent towards slavery; initially as Northerners early adherents opposed it; then revelations and teachings evolved to allow some slave-holders to join in Missouri (a slave state). After Smith’s murder, his son Joseph Smith III and widow Emma Hale Smith repudiated both slavery and plural marriage in the Reorganized Church of Jesus Christ of Latter-day Saints in Missouri, the "minority" successor group.  By contrast, Brigham Young's larger "majority" successor not only retained plural marriage but attempted to legalize slavery in the Utah Territory.  Since Republicans, starting in 1854, sought to abolish "twin relics of barbarism," slavery and polygamy (a jab at Young's group), it is unclear whether that commitment arose from core convictions or defensive resistance.

"Disruptive innovation" in the religious marketplace has to be treated carefully, because of not only the special nature of the religious market place, but also rigorous examination of the idea of "disruptive innovation:" it does not mean just any disruption.

Whatever the sources of Joseph Smith Jr.'s ideas, he led a movement that "gain[ed] customers (i.e., adherents) by providing more suitable, often simpler products or services, frequently at a lower cost." (Latter-day Saints have never had professional clergy; their commitment to mutual assistance is exemplary.) Market incumbents (more organized and better-financed competing groups) were slow to respond in kind, and as Smith's group moved "upmarket," it maintained its "advantages and early success" --high rates of "lock-in," group cohesion, and brand loyalty. Smith's group, however, never quite succeeded in driving the "incumbents" out of the market or even acquiring most of their former customers. Their sense of urgency lost its edge.

Why are the Latter-day Saints "latter-day"? This code phrase refers above all to a shared set of cultural and religious assumptions and commitments in early 19th-century America. "Millennialism" was the belief that the coming of the Kingdom of God (promised in the Bible) was imminent and that America, with its special covenant of religious liberty, would be central to its arrival. Millennialism came in two distinct forms with opposite answers to the question, "Will Christ return before or after the promised millennium (1000 years) of peace?" Pre-millennialists emphasized the troubles (tribulations) that would both precede and signal Christ's return to reign for 1000 years before the Last Judgement. Post-millennialists proclaimed that the millennium of peace would culminate in Christ's return and the Last Judgement; their task was to make the world ready for the Kingdom of God. Both expected the Kingdom very soon: we are all living in the biblical "latter days."

This recondite disagreement has important implications. Post-millennialists were all about social reforms that would make the United States so like the Kingdom of God that American society would usher in the millennium. Pre-millennialists emphasized that Christ would only come after dramatically increasing tribulations: things getting worse and worse was a sign of his approach --hence they disfavored social reforms as a distraction from the real work of preparation for evil times. (Historical aside: the War of Secession largely discredited post-millennialism, which morphed into the program of social reforms of the Progressive era. Pre-millennialism evolved into dispensational Christian fundamentalism, combining the expectation of tribulation with a belief in the factual, literal inerrancy of the Bible.)

Latter-day Saints' enduring post-millennialism shows itself, among other ways, in their boundless optimism. The familiar, earnest young missionaries (think The Book of Mormon, the Broadway show) are a token of the continuing commitment of the Latter-day Saints to usher in the latter days, although they expect them less imminently. Millennialism is common coin no longer: despite the popularity of the "Left Behind" series of books and movies, only a small minority of survivalists or "preppers" appeal to Biblical warrants for their expectations of imminent tribulations (disaster).

Detached from Christianity, expectations of imminent disaster and rebirth went rogue in American culture long ago. Silicon Valley today, for example, is home to many who expect a "singularity" in which artificial intelligence outstrips human intelligence and introduces profound changes to civilization as a whole --another sort of secular millennium, in which technology has replaced a Messiah as the anointed power. Popular movies and books have made dystopia a cultural cliché. (What's the disaster this time? Nuclear war, apes, viruses, climate change, or the abrupt disappearance of millions?) How many jokes about "voting in the last American election" (double entendre) play on similar fears?

"Disruptive innovation's" popularity exploded in the 1990s and 2000s exactly because of the numerous hopes and fears raised by the advent of the Internet and its devices and social media.  Josh Linkner warned, "disrupt or be disrupted," (The Road to Reinvention, 2014) and that binary choice spoke in apocalyptic tones to incumbent mass media, libraries, bookstores, journalists, travel agents, financial consultants, university presidents, and anyone else who deals in "information" as a commodity.  Such urgent warnings shout to established corporations, "The end is near: you can't innovate fast enough; you're not even the right people to do it."  Incumbent organizations were counted out simply because of their incumbency: MOOCs would inevitably disrupt brick-and-mortar educational institutions, now denigrated because of their mere physicality. 

The popular version of "disruptive innovation" played on dystopian fears of the collapse of known "incumbent" corporations and the rise of an economy of perpetual disruption --Schumpeter's capitalist creative destruction now recast as "disruptive innovation" with a brutalist, binary emphasis: disrupt or be disrupted. The archetypal "creative disruptor" is the technological whiz-kid (I nominate Mark Zuckerberg, "Zuck Almighty") whose revelatory "Book of Faces" and continuing revelations of a "new and everlasting platform" will usher in a thousand-year era of effortless, limitless, and unfailingly upbeat social confabulation. Except when many kinds of terrorists, Russian autocrats, vaccine deniers, and deranged stalkers confabulate as well.

What does this have to do with Clayton Christensen? Well, both a little and a lot. He cannot deny his own popularization of his ideas through his books, media appearances, securities fund (the short-lived Disruptive Growth Fund, launched in 2000 at just the wrong time), and army of students, friends, and defenders such as Thomas Thurston in TechCrunch. He lost control of "disruptive innovation" as a term of art precisely because of its appeal to those who make a living from in-your-face, counterintuitive claims. Lepore identified circular reasoning in the popular idea of creative disruption ("If an established company doesn't disrupt, it will fail, and if it fails it must be because it didn't disrupt"). This logical circle may or may not characterize highly disciplined case studies of Christensen's theory, but it certainly rings true of the endless popular iterations.

Whether Christensen's theory holds up almost does not matter to "disruptive innovation" as a popular idea. By analogy, in Smith's America, as Terryl Givens has noted, what mattered about the Book of Mormon was not its teachings or particular message. "It was the mere presence of the Book of Mormon itself as an object that . . . served as concrete evidence that God had opened the heavens again." In that era all manner of revelations appeared: the Shakers' Holy, Sacred, and Divine Roll and Book, the visionary experiences of Ellen G. White (one of the originators of the Seventh-Day Adventists), and the visions of rival claimants to Smith's prophetic mantle among the Latter Day Saints after his death. Kathleen Flake has noted, "Henry Ford wanted a car in every home. Joseph Smith was the Henry Ford of revelation. He wanted every home to have one, and the revelation he had in mind was the revelation he'd had, which was seeing God." The heavens, once opened, proved harder to close.

The popular idea "creative disruption" has attached itself, meme-like, to a lot of second- and third-rate scams.  Business theory has fewer brightly defined disciplinary boundaries than physics. King's and Baatartogtokh's conclusion that the theory has limited predictive power does not render Christensen's ideas useless, but does suggest that "disruptive innovation" will not be the "one theory to rule them all," and with the profits or prophets bind them. 

Joseph Smith Jr. claimed that the encounter he had with the Holy in the woods warned him not to join any of the (Protestant) groups in his vicinity, whose creeds were all "corrupt" and "an abomination." Christian restorationists called the earliest Christian movement, in the short span reflected in the New Testament texts, "the primitive church," and regarded all subsequent developments as not merely wrong but apostate: those who knew the truth but deliberately denied it. Joseph Smith Jr. saw his new revelation as a giant delete key on all of Christian history --Eastern Orthodox, Catholic, and Protestant. All of it had to go.

In a similar manner, popular "disruptive innovation" connotes the destruction of all that is wrong with sclerotic corporate capitalism, and the restoration of the pure "invisible hand" of the marketplace that allegedly guided early capitalists. This popular view resonates with a great deal of cultural and political libertarianism: giant corporations and government bureaucracy are an apostasy betraying the true faith handed down from the founders (either Jesus or Adam Smith, as you wish). "Move fast and break things," advised the Zuck; what can be disrupted should be disrupted. Including, it would now seem, democracy wherever it might be found.

Disciplined use of the theory of "disruptive innovation" in carefully defined circumstances provides explanatory clarity, but its predictive power is in fact more of a hope than an established fact, despite the protests of Christensen's defenders. This means that it is one theory among others: Michael Porter's theory of competitive advantage and multifactorial analyses will likely work equally well in other carefully defined situations. Similarly, The Church of Jesus Christ of Latter-day Saints has found ways of regarding other religious groups positively (even Evangelicals, often the most hostile), and has moved on from the language of "apostasy." Originally intending to hit that giant delete key, subsequent Latter-day Saints have found a way to live as active readers of their particular texts in the midst of many other readers of many other texts. This has relevance on the ground: given official LDS teachings regarding divorce and homosexuality, some LDS families have found informal means to include and tolerate differences among their members, coming to resemble the family life Joseph Smith Jr. knew as a boy. (Others have continued to shun their "apostates.")

Unlike Smith, Christensen never intended to promulgate a "unified field theory" of religion or business development, and he is not completely responsible for losing control of his theory as a popular idea. The close of his 20-year re-evaluation in 2015, "We still have a lot to learn," acknowledges that "disruption theory does not, and never will, explain everything about innovation specifically or business success generally."

Christensen's modesty still did not inhibit him from doubling down on his claim that half of American colleges and universities would close by 2025. Allowing his claim relative rather than revelatory validity dispels the apocalyptic fears of barbarians at the gates. His primary analogy in The Innovative University (2011) is "changing the DNA of American Higher Education from the Inside Out" (the subtitle). He claims that all American colleges and universities other than a branch of Brigham Young University in Idaho share the DNA of Harvard: all these institutions want to become, apparently, Research-1 universities if given the money and the chance. What does that really mean, and is it really true? Such a simple analogy does grave injustice to community colleges (vital economic links for countless communities and immigrants), specialized schools such as Maine Maritime Academy, or even an elite liberal arts college such as DePauw University. The popular view that higher education has never changed and can change little is flat wrong: ask any productive historian of higher education. Change and innovation (whether disruptive or otherwise) will not appear equally and everywhere overnight. The higher education sector is not (thank heavens) Silicon Valley or McKinsey & Co.

Yet all is not well: the economic model underpinning American higher education is likely unsustainable in the coming decades, for many reasons. Higher education also represents a huge social and financial investment that is unlikely to dissipate. Distance education, information technology, changing social expectations, and shifting demographics will all play a role in whether particular colleges and universities can continue to be viable. Disciplined use of the theory of "disruptive innovation" will likely hold some, but not all, of the explanatory and predictive keys. The truth is out there, but it will be much more complex.

The striking persistence of the popular "disruptive innovation" in senior management circles (typified by the Salesforce.org higher education event) reveals not only persistent fears and enduring threats, but short attention spans devoted to keeping up with the swift pace of too many events. I suspect that popular "disruptive innovation" functions in a manner more affiliative than explanatory: "if you don't get it, you're not one of us. --You think Jill Lepore, or King and Baatartogtokh, might be right, eh? Let's see how long you last in the C-Suite" (--especially if you can pronounce the latter's name).

"Disruptive innovation" elicits fears useful for those who want to shake up certain professions in health, law, librarianship, and the professoriate, but by now its been over-used.  At librarians' meetings (ALA, ACRL) I have developed the habit of responding to the expression, "disruptive innovation" with the question, "what are you selling?"  Fear sells "solutions;" its potency as a means of marketing continues nearly unrivaled.  No one ever sold an expensive library services platform with the phrase, "this won't solve all your problems."  Since 1985 I have sat through many presentations that predicted the closure of libraries within ten years - Christensen's remark "I might bet that it takes nine years rather than ten" would find a new audience.  We who are about to be disrupted salute you, Prophet.

Nevertheless: printed books, asking questions, research assistance, and personal relationships with library instructors endure.  They were warned, but they persisted.  It is past time to find a more accurate analysis and predictive theory of the future of libraries and higher education.

On Tyranny is not only about an American moment, but about a worldwide one.

On Tyranny: Twenty Lessons from the Twentieth Century, by Timothy Snyder. Tim Duggan Books (Crown), 2017. 126 p. ISBN 978-0804190114. List price $8.99.

Yale University professor Timothy Snyder has spent a long time learning the languages, reading the documents, exploring the archives, and listening to witnesses of the totalitarian tyrannies of Europe in the last century --particularly of Nazi Germany and the Stalinist Soviet Union. His scholarship bore particular fruit in books such as Bloodlands: Europe between Hitler and Stalin, and Black Earth: The Holocaust as History and Warning. He came to recognize that certain characteristics in the development of those tyrannies are present in the world today, and in the United States. This book is no partisan screed: Snyder recognizes in the 45th President features he knows from other contexts, and those other contexts underscore the drift towards totalitarianism apparent from Russia to Europe to the USA. On Tyranny is not only about an American moment, but about a worldwide one.

This short book consists of a brief introduction, twenty short chapters, and an epilogue. Each chapter directs an action, such as no. 12, "Make eye contact and small talk," followed by a historical example or an expansion of the point. All the actions can be undertaken or performed in daily life; there is no grand theory here.

In place of a grand theory, there is a fundamental point: respect and value facts, truth, and accurate usage of our common language. In Moment (magazine), he explained: "Once you say that there isn't truth and you try to undermine the people whose job it is to tell the truth, such as journalists, you make democracy impossible." He told Bill Maher (at 2:02) that while "post-fact" postmodernism might connote "Berkeley, baguettes, and France and nice things," it more likely means that "every day doesn't matter; details don't matter; facts don't matter; all that matters is the message, the leader, the myth, the totality" --a condition of Europe in the 1920s. Such disdain for the truth goes hand-in-hand with conspiracy theories that assign blame to a group accused of undermining the purity of the majority. "Rather than facing up to the fact that life is hard and that globalization presents challenges, you name and blame people and groups who you say are at fault." Jews, Mexicans, Muslims, Rohingya, Tutsis, Hutus, globalists, evolutionists, or any other "outsider." The myth: "Make [fill in the blank] great again."

A librarian or researcher might particularly resonate with Snyder's directions "Be kind to our language," "Believe in truth," and "Investigate" (lessons 9-11). These are all ways to prepare to "be calm when the unthinkable arrives" (lesson 18) --when a leader exploits a catastrophic event to urge followers to trade freedom for security, and suspends the rule of law. The Chief Executive may or may not be attempting to stage a coup; it is not yet clear that American democracy survived the dark moment after Charlottesville. Snyder told Salon in August, "We are hanging by our teeth to the rule of law. That was my judgment at the beginning of his presidency and it is still my judgment now. The rule of law is what gives us a chance to rebuild the system after this is all done."

Whether or not current politics results in tyranny and oppression is still (at this writing) an open question. The importance of Snyder's book is that it points beyond this moment to the wider trends and challenges of a world which is global (like it or not), connected (like it or not), and interdependent on both our natural climates and our accrued, hard-won cultural heritages. A University founded on "a rigorous and interdisciplinary search for truth and wisdom" that "forms the cornerstone of all University life and welcomes people from all faiths and cultures" cannot leave our students unprepared. In order to make history, young Americans will have to know some (p. 126). Will that be the twenty-first lesson on tyranny from the twenty-first century?

--Gavin Ferriby

Hartley argues that a liberal arts education widens a student's horizons, inquires about human behavior, and finds opportunities for products and services that will meet human needs. The "softer" subjects help persons determine which problem they're trying to solve in the first place.

The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, by Scott Hartley. New York: Houghton Mifflin Harcourt, 2017. ISBN 978-0544944770. $28.00 list.

Hartley writes that a "false dichotomy" divides computer sciences and the humanities, and extends this argue to STEM curricula as well. For example, Vinod Khosla of Sun Microsystems has claimed that "little of the material taught in liberal arts programs today is relevant to the future." Hartley believes that such a mind-set is wrong, for several reasons. Such a belief encourages students to pursue learning only in vocational terms: preparing for a job. STEM field require intense specialization, but some barrier to coding (for example) are dropping with web services or communities such as GitHub and Stack Overflow. Beyond narrow vocational boundaries, Hartley argues that liberal arts educations widen a student's horizon, inquire about human behavior and find opportunities for products and services that will meet human needs. The "softer" subjects helps persons to determine which problem they're trying to solve in the first place.

That said, the book does not move much further. Hartley never really tries to provide a working definition of a true "liberal arts" education except to distinguish it from STEM or computer science. By using the vocabulary of "fuzzy" and "techie" he encountered at Stanford, he inadvertently extends a mentality that has fostered start-ups notably acknowledged to be unfriendly to women. So far as I could determine, only a handful of Hartley's cited sources were published anywhere but digitally --and although the "liberal arts," however defined, have a very long tradition of inquiry and literature, that tradition is very little in evidence here; Hartley passes it by almost breezily. His book is essentially a series of stories of companies and their founders, many of whom did not earn "techie" degrees.

Mark Zuckerberg's famous motto "move fast and break things" utterly discounted the social and cultural values of whatever might get broken. Partly in consequence, the previously admired prodigies of Silicon Valley start-ups faced intense social scrutiny in 2017, in part as a result of their ignorance of human fallibility and conflict.

Hartley is on to a real problem, but he needs to do much more homework to see how firmly the false dichotomy between the sciences and humanities is rooted in American (and world-wide) culture. The tendency, for example, to regard undergraduate majors as job preparation rather than as disciplined thinking, focused interest, and curiosity is so widespread that even Barack Obama displayed it ("Folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree" --his remark in Wisconsin in 2014; he did retract it later).

Genuine discussion of the values of humanities and STEM degrees can only take place with the disciplined thinking, awareness of traditions, and respect for diversity that are hallmarks of a true liberal arts education.

The recent acquisition of BePress & Digital Commons by Elsevier has occasioned a snowstorm of commentary and opinion.  Some of that has not been helpful, even though well-intended.

The recent acquisition of BePress & Digital Commons by Elsevier has occasioned a snowstorm of commentary and opinion. Some of that has not been helpful, even though well-intended. Sacred Heart University Library belongs to a 33-member group called the Affinity Libraries Group: all private, Masters-1 universities (some with several doctoral degrees), relatively mid-sized between the Oberlin Group of liberal arts college libraries and the Association of Research Libraries (ARL).

Much of the following is going to be discussed at a meeting alongside or outside the coming CNI meeting in December in Washington DC --but since CNI is expensive ($8,200/year), SHU is not a member, nor, I suspect, are other Affinity Libraries. I am hoping that, using one technology or another, the Affinity Libraries can have a conversation as well.

The Affinity Group has changed over the years; we (or they, meaning our predecessor directors) used to meet often, sometimes in quite successful stand-alone events not connected with another event such as ALA Annual. Others have said to me that in some ways the Affinity Group (as it was then) really came down to "professional and personal friends of Lew Miller" (former director at Butler), and while I'm not sure that's fair, it is accurate in the sense that personal relationships formed a strong glue for the group. As directors retired or moved on, group adhesiveness accordingly changed. I'm avoiding the word or metaphor "decline" here because sometimes things just change, and the Affinity Group has been one of them. No one has been sitting around in the meantime.

We do share a strong commitment to the annual Affinity Group statistics. Perhaps now a discussion about institutional repositories, and Digital Commons in particular, could garner some interest, with attention directed to issues for libraries of our size.

Some of the hoopla surrounding Elsevier's acquisition of BePress has simply given contributors occasion to express their intense dislike of Elsevier and its business model of maximizing profits above all else --certainly a justified objection, given the state of all our budgets.

I think the anonymous Library Loon (Gavia Libraria) has pretty well summed up the various points (though I don't agree with every one of her statements), and Matt Ruen's subsequent comment on August 9 is also helpful. Paul Royster at the University of Nebraska-Lincoln wrote on September 7 on the SPARC list:

The staff at BePress have been uniformly helpful and responsive, and there is no sign of that changing. They are the same people as before. They have never interfered with our content. I do not believe Elsevier paid $150 million in order to destroy BePress. What made it worth that figure was 1. the software, 2. the staff, and 3. the reputation and relationships. BePress became valuable by listening to their customers; Elsevier could learn a lot from them about managing relationships--and I hope they do. BePress is also in a different division (Research) than the publications units that have treated libraries and authors so high-handedly. The stronger BePress remains, the better will be its position vis-a-vis E-corp going forward. Bashing BePress over its ownership and inciting its customers to jump ship strikes me as not in the best interests of the IRs or the faculty who use them.

Almost every college library has relationships with Elsevier already; deserting BePress is not a moral victory of right over wrong. The moral issue here is providing wider dissemination and free access for content created by faculty scholars. No one does that better than BePress, and until that changes, I see no cause for panic. Of course there are no guarantees, and it is always wise to have a Plan B and an exit strategy. But cutting off BePress to spite their new ownership does not really help those we are trying to serve.

I share Royster's primary commitment to disseminating freely the content created by faculty scholars. Digital Commons has done that for SHU in spades, and has been a game-changer in this university and library, in my experience. I know that many share such a primary commitment; many also share an enduring and well-grounded suspicion of just about anything Elsevier might do. As a firm, their behavior has often been downright divisive and sneaky (we can all tell our stories…). When I first read of the sale, my gut response was, "Really? Great, here's a big problem when I don't really want another." Digital Commons is one of the three major applications that power my library: 1) the integrated library services platform; 2) Springshare's suite of research & reference applications; and 3) BePress. Exiting BePress would be distracting, distressing, and downright burdensome. As Royster writes, "there are no guarantees." Now we have to have a Plan B and an exit strategy, even if we never use them.

What I fear most is Gavia Libraria’s last option (in her blog post): that Elsevier will simply let “BePress languish undeveloped, with an eye to eventually shrugging and pulling the plug on it.”  I have seen similar “application decay” with ebrary, RefWorks, and (actually) SerialsSolutions, several of which languished (or are languishing) for years before any genuine further development.  I watched their talented creators and originating staff members drift away into other ventures (e.g., ThirdIron).  Were that to happen to BePress, it would be bad news for SHU and other Affinity members.  Royster’s statement that “they are the same people as before” has not always held true in the past when smaller firms became subject to hiring processes mandated by larger organizations (e.g., SerialsSolutions’ staff members now employed by ProQuest).

On SPARC’s list, there has been great discussion about cooperating to build a truly useful non-profit, open-source application suite for institutional repositories, digital publishing, authors’ pages (like SelectedWorks), etc.  Everyone knows that’s a long way off, without any disrespect to Islandora, Janeway, DSpace, or any other application.  Digital Commons and SelectedWorks are pretty well the state of the art, and their design and consequent workflow decisions have benefited the small staff of the SHU Library enormously (even with the occasional hiccups and anomalies). The Digital Commons Network has placed SHU in the same orbit or gateway as far larger and frankly more prestigious colleges and universities, and I could not be happier with that.  I have my own SelectedWorks page and I like it.  I would be sorry to see all this go --unless a truly practical alternative emerges.  Who knows when that will be?

In the meantime, we will be giving attention to Plan B --until now we have not had one or felt we needed one (probably an unfortunate oversight, but it just did not become a priority).  I really don’t yet know what our Plan B will be.

I sense that if OCLC were to develop a truly useful alternative to Digital Commons (one well beyond DSpace as it presently exists), it might have some traction in the market (despite all of our horror stories about OCLC, granted).  Open Science Framework, Islandora, and others hold promise but probably cannot yet compete feature-by-feature with Digital Commons (at least, I have not seen anything that comes even close).  If you think I’m wrong, please say so! --I will gladly accept your correction.

If you know Yewno, and if Yewno, exactly what do you know? That "exactly what" will likely contain machine-generated replications of problematic human biases.

This is the third of my "undiscovered summer reading" posts; see also the first and second.

At the recent Association of College and Research Libraries conference in Baltimore I came across Yewno, a search-engine-like discovery or exploration layer that I had heard about.  I suspect that Yewno or something like it could be the "next big thing" in library and research services.  I have served as a librarian long enough both to be very interested and to be wary at the same time --so many promises have been made by the commercial information technology sector, and the reality has fallen far short --remember the hype about discovery services?

Yewno is a so-called search app; it "resembles a search engine --you use it to search for information, after all--but its structure is network-like rather than list-based, the way Google's is. The idea is to return search results that illustrate relationships between relevant sources" --mapping them out graphically (like a mind map). Those words are quoted from Adrienne LaFrance's Atlantic article on the growing understanding of the Antikythera mechanism as an example of computer-assisted associative thinking (see, all these readings really come together).  LaFrance traces the historical connections between "undiscovered public knowledge," Vannevar Bush's Memex (machine) in the epochal As We May Think, and Yewno.  The hope is that through use of an application such as Yewno, associations could be traced between ancient time-keeping, Babylonian and Arabic mathematics, medieval calendars, astronomy, astrological studies, ancient languages, and other realms of knowledge. At any rate, that's the big idea, and it's a good one.

So who is Yewno meant for, and what's it based on?

LaFrance notes that Yewno "was built primarily for academic researchers," but I'm not sure that's true, strictly. When I visited the Yewno booth at ACRL, I thought several things at once: 1) this could be very cool; 2) this could actually be useful; 3) this is going to be expensive (though I have neither requested nor received a quote); and 4) someone will buy them, probably Google or another technology octopus. (Subsequent thought: where's Google's version of this?)  I also thought that intelligence services and corporate intelligence advisory firms would be very, very interested --and indeed they are.  Several weeks later I read Alice Meadows' post, "Do You Know About Yewno?" on the Scholarly Kitchen blog, and her comments put Yewno in clearer context. (Had I access to Yewno, I would have searched, "yewno.")

Yewno is a start-up venture by Ruggero Gramatica (if you're unclear, that's a person), a research strategist with a background in applied mathematics (Ph.D., King's College London) and an M.B.A. (University of Chicago). He is the first-named author of "Graph Theory Enables Drug Repurposing," a paper (DOI) in PLOS ONE that introduces:

a methodology to efficiently exploit natural-language expressed biomedical knowledge for repurposing existing drugs towards diseases for which they were not initially intended. Leveraging on developments in Computational Linguistics and Graph Theory, a methodology is defined to build a graph representation of knowledge, which is automatically analysed to discover hidden relations between any drug and any disease: these relations are specific paths among the biomedical entities of the graph, representing possible Modes of Action for any given pharmacological compound. We propose a measure for the likeliness of these paths based on a stochastic process on the graph.
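
If the quoted method sounds abstract, a toy version is easy to sketch. The following is my own minimal illustration, not Yewno's or the paper's actual code: entities become nodes in a graph (here using the networkx library), relations become edges, and each candidate drug-disease path is scored with a crude uniform random-walk probability standing in for the paper's stochastic process. Every entity name and relation below is invented for the example.

    # Toy sketch of path-based inference over a knowledge graph.
    # Entities, relations, and the scoring rule are all illustrative.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("aspirin", "COX-2"), ("COX-2", "inflammation"),
        ("inflammation", "colorectal cancer"),
        ("aspirin", "platelets"), ("platelets", "thrombosis"),
    ])

    def path_likelihood(graph, path):
        # Product of uniform random-walk transition probabilities:
        # a crude stand-in for a stochastic process on the graph.
        p = 1.0
        for node in path[:-1]:
            p *= 1.0 / graph.degree(node)
        return p

    # Enumerate and score hidden relations between a drug and a disease.
    for path in nx.all_simple_paths(G, "aspirin", "colorectal cancer", cutoff=4):
        print(path, round(path_likelihood(G, path), 4))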

Yewno does the same thing in other contexts:

an inference and discovery engine that has applications in a variety of fields such as financial, economics, biotech, legal, education and general knowledge search. Yewno offers an analytics capability that delivers better information and faster by ingesting a broad set of public and private data sources and, using its unique framework, finds inferences and connections. Yewno leverages on leading edge computational semantics, graph theoretical models as well as quantitative analytics to hunt for emerging signals across domains of unstructured data sources. (source: Ruggero Gramatica's LinkedIn profile)

This leads to several versions of Yewno: Yewno Discover, Yewno Finance, Yewno Life Sciences, and Yewno Unearth.  Ruth Pickering, the company's co-founder and Chief Business Development & Strategy Officer, comments, "each vertical uses a specific set of ad-hoc machine learning based algorithms and content. The Yewno Unearth product sits across all verticals and can be applied to any content set in any domain of information."  Don't bother calling the NSA --they already know all about it (and probably use it, as well).

Yewno Unearth is relevant to multiple functions of publishing: portfolio categorization, the ability to spot gaps in content, audience selection, editorial oversight and description, and other purposes for improving a publisher's position, both intellectually and in the information marketplace. So Yewno Discover is helpful for academics and researchers, but the whole of Yewno is also designed to relay more information about them to their editors, publishers, funders, and those who will in turn market publications to their libraries.  Elsevier, Ebsco, and ProQuest will undoubtedly appear soon in librarians' offices with Yewno-derived information, and that encounter could likely prove to be truly intimidating.  So Yewno might be a very good thing for a library, but not simply an unalloyed very good thing.

So what is Yewno really based on? The going gets more interesting.

Meadows notes that Yewno's underlying theory emerged from the field of complex systems, at the foundational level of econophysics, an inquiry "aimed at describing economic and financial cycles" utilizing mathematical structures derived from physics. The mathematical framework, involving uncertainty, stochastic processes (random probability distributions), and nonlinear dynamics, came to be applied to biology and drug discovery (hello, Big Pharma). This kind of information processing is described in detail in a review article, "Deep Learning," in Nature (Vol. 521, 28 May 2015, doi:10.1038/nature14539).  An outgrowth of machine learning, deep learning "allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction."  Such deep learning "discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer." Such "deep convolutional nets" have brought about significant breakthroughs in processing images, video, and speech, while "recurrent nets" have brought new learning powers to "sequential data such as text and speech."
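
To make "multiple processing layers" and "backpropagation" a bit more concrete, here is a minimal sketch of my own (not from the Nature review): a tiny two-layer network that learns XOR, with all sizes and constants chosen arbitrarily. The forward pass computes a representation layer by layer; the backward pass pushes the error gradient back through those layers and adjusts every internal parameter.

    # Minimal two-layer network trained by backpropagation; toy example.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: not linearly separable

    W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)   # first-layer parameters
    W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)   # second-layer parameters

    for step in range(5000):
        # forward pass: each layer re-represents the previous layer's output
        h = np.tanh(X @ W1 + b1)
        out = 1 / (1 + np.exp(-(h @ W2 + b2)))
        # backward pass: propagate the error gradient layer by layer
        d_out = out - y                        # gradient at the output
        d_h = (d_out @ W2.T) * (1 - h ** 2)    # gradient at the hidden layer
        W2 -= 0.1 * h.T @ d_out; b2 -= 0.1 * d_out.sum(0)
        W1 -= 0.1 * X.T @ d_h;   b1 -= 0.1 * d_h.sum(0)

    print(out.round(2).ravel())  # approaches [0, 1, 1, 0]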

The article goes on in great detail, and I do not pretend I understand very much of it.  Its discussion of recurrent neural networks (RNNs), however, is highly pertinent to libraries and discovery.  The backpropagation algorithm is basically a process that adjusts the weights a machine uses as it learns from its own errors.  For example, RNNs "have been found to be very good at predicting the next character in the text, or next word in a sequence," and by such backpropagated adjustments, machine language translations have achieved greater levels of accuracy. (But why not complete accuracy? --read on.)  The process "is more compatible with the view that everyday reasoning involves many simultaneous analogies that each contribute plausibility to a conclusion." In their review's conclusion, the authors expect that "systems that use RNNs to understand sentences or whole documents will become much better when they learn strategies for selectively attending to one part at a time."
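
The "next character" claim can also be sketched in miniature. The toy below, again mine rather than the review's, is a drastically simplified character-level recurrent network: a hidden state is carried from character to character, and a one-step truncated form of backpropagation adjusts the weights as the text streams past. The training text and all sizes are arbitrary.

    # Toy character-level RNN predicting the next character.
    import numpy as np

    text = "where is the knowledge we have lost in information "
    chars = sorted(set(text))
    ix = {c: i for i, c in enumerate(chars)}
    V, H = len(chars), 32
    rng = np.random.default_rng(1)
    Wxh, Whh = rng.normal(0, 0.1, (H, V)), rng.normal(0, 0.1, (H, H))
    Why, bh, by = rng.normal(0, 0.1, (V, H)), np.zeros(H), np.zeros(V)

    def onehot(i):
        v = np.zeros(V); v[i] = 1.0; return v

    for epoch in range(300):
        h = np.zeros(H)
        for t in range(len(text) - 1):
            x, h_prev = onehot(ix[text[t]]), h
            h = np.tanh(Wxh @ x + Whh @ h_prev + bh)   # recurrent hidden state
            z = Why @ h + by
            p = np.exp(z - z.max()); p /= p.sum()      # next-character distribution
            # one-step truncated backpropagation: adjust weights on the fly
            d = p - onehot(ix[text[t + 1]])
            dh = (Why.T @ d) * (1 - h ** 2)
            Why -= 0.1 * np.outer(d, h); by -= 0.1 * d
            Wxh -= 0.1 * np.outer(dh, x); Whh -= 0.1 * np.outer(dh, h_prev)
            bh -= 0.1 * dh

    h = np.zeros(H)
    for c in "knowled":                                # prompt the trained net
        h = np.tanh(Wxh @ onehot(ix[c]) + Whh @ h + bh)
    print(chars[np.argmax(Why @ h + by)])              # should print 'g'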

After all this, what do you know? Yewno presents the results of deep learning through recurrent neural networks that identify nonlinear concepts in a text, a kind of "knowledge." Hence Ruth Pickering can plausibly state:

Yewno's mission is "Knowledge Singularity" and by that we mean the day when knowledge, not information, is at everyone's fingertips. In the search and discovery space the problems that people face today are the overwhelming volume of information and the fact that sources are fragmented and dispersed. There's a great T.S. Eliot quote, "Where's the knowledge we lost in information" and that sums up the problem perfectly. (source: Meadows' post)

Ms. Pickering perhaps revealed more than she intended.  Her quotation from T.S. Eliot is found in a much larger and quite different context:

Endless invention, endless experiment,
Brings knowledge of motion, but not of stillness;
Knowledge of speech, but not of silence;
Knowledge of words, and ignorance of the Word.
All our knowledge brings us nearer to our ignorance,
All our ignorance brings us nearer to death,
But nearness to death no nearer to GOD.
Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?
The cycles of Heaven in twenty centuries
Bring us farther from GOD and nearer to the Dust. (Choruses from The Rock)

Eliot's interest is in the Life we have lost in living, and his religious and literary use of the word "knowledge" signals the puzzle at the very base of econophysics, machine learning, deep learning, and backpropagation algorithms.  Deep learning performed by machines mimics what humans do, their forms of life.  Pickering's "Knowledge Singularity" alludes to the semi-theological vision of Ray Kurzweil's millennialist "Singularity": a machine intelligence infinitely more powerful than all human intelligence combined.  In other words, where Eliot is ultimately concerned with Wisdom, the Knowledge Singularity is ultimately concerned with Power.  Power in the end means power over other people: otherwise it has no social meaning apart from simply more computing.  Wisdom interrogates power, and questions its ideological supremacy.

For example, three researchers at the Center for Information Technology Policy at Princeton University have shown that "applying machine learning to ordinary human language results in human-like semantic biases" ("Semantics derived automatically from language corpora contain human-like biases," Science, 14 April 2017, Vol. 356, issue 6334: 183-186, doi:10.1126/science.aal4230). The results of their replication of a spectrum of known biases (measured by the Implicit Association Test) "indicate that text corpora contain recoverable and accurate imprints of our historic biases, whether morally neutral as towards insects or flowers, problematic as towards race or gender, or even simply veridical, reflecting the status quo distribution of gender with respect to careers or first names." Their approach holds "promise for identifying and addressing sources of bias in culture, including technology."  The authors laconically conclude that "caution must be used in incorporating modules constructed via unsupervised machine learning into decision-making systems."  Power resides in such decisions about other people, resources, and time.

Arvind Narayanan, who published the paper with Aylin Caliskan and Joanna J. Bryson, noted that "we have a situation where these artificial-intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from."  The Princeton researchers designed an experiment around GloVe, a program that represents words and phrases by their co-occurrences, replicating the Implicit Association Test in machine-learning form.  Researchers at Stanford had turned GloVe loose on roughly 840 billion words from the Web; the Princeton team then looked for co-occurrences and associations of words such as "man, male" or "woman, female" with "programmer, engineer, scientist" or "nurse, teacher, librarian."   They found familiar biases in the distributions of associations, biases that can "end up having pernicious, sexist effects."
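
The shape of that test is easy to sketch. In the toy below, invented three-dimensional vectors stand in for real GloVe embeddings (which have hundreds of dimensions); the association score is simply a difference of average cosine similarities. The numbers are mine, contrived so that the bias is visible.

    # Toy WEAT-style association test; vectors are invented, not GloVe's.
    import numpy as np

    vec = {
        "man":        np.array([ 0.9, 0.1, 0.0]),
        "woman":      np.array([-0.9, 0.1, 0.0]),
        "programmer": np.array([ 0.7, 0.6, 0.1]),
        "nurse":      np.array([-0.7, 0.6, 0.1]),
    }

    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    def association(word, group_a, group_b):
        # positive: the word sits closer to group_a than to group_b
        return (np.mean([cos(vec[word], vec[a]) for a in group_a])
              - np.mean([cos(vec[word], vec[b]) for b in group_b]))

    for w in ("programmer", "nurse"):
        print(w, round(association(w, ["man"], ["woman"]), 3))

With real embeddings trained on Web text, scores of exactly this shape are what surfaced the career and gender associations the researchers reported.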

For example, machine-learning programs can translate foreign languages into sentences that reflect or reinforce gender stereotypes. Turkish uses a gender-neutral, third person pronoun, "o."  Plugged into the online translation service Google Translate, however, the Turkish sentences "o bir doktor" and "o bir hemşire" are translated into English as "he is a doctor" and "she is a nurse."  . . . . "The biases that we studied in the paper are easy to overlook when designers are creating systems," Narayanan said. (Source: Princeton University, "Biased Bots" by Adam Hadhazy.)

Yewno is exactly such a system insofar as it mimics human forms of life which include, alas, the reinforcement of biases and prejudice.  So in the end, do you know Yewno, and if Yewno, exactly what do you know? --that "exactly what" will likely contain machine-generated replications of problematic human biases.  Machine translations will never offer perfect, complete translations of languages because language is never complete --humans will always use it in new ways, with new shades of meaning and connotations of plausibility, because humans go on living in their innumerable, linguistic forms of life.  Machines have to map language within language (here I include mathematics as kinds of languages with distinctive games and forms of life).  No "Knowledge Singularity" can occur outside of language, because it will be made of language: but the ideology of "Singularity" can conceal its origins in many forms of life, and thus appear "natural," "inevitable," and "unstoppable."

The "Knowledge Singularity" will calcify bias and injustice in an everlasting status quo unless humans, no matter how comparatively deficient, resolve that knowledge is not a philosophical problem to be solved (such as in Karl Popper's Worlds 1, 2, and 3), but a puzzle to be wrestled with and contested in many human forms of life and language (Wittgenstein). Only by addressing human forms of life can we ever address the greater silence and the Life that we have lost in living.  What we cannot speak about, we must pass over in silence (Wovon man nicht sprechen kann, darüber muss man schweigen, sentence 7 of the Tractatus) --and that silence, contra both the positivist Vienna Circle and Karl Popper (who was never part of it) is the most important part of human living.  In the Tractatus Wittengenstein dreamt, as it were, a conclusive solution to the puzzle of language --but such a solution can only be found in the silence beyond strict logical (or machine) forms: a silence of the religious quest beyond the ethical dilemma (Kierkegaard).

This journey through my "undiscovered summer reading," from the Antikythera mechanism to the alleged "Knowledge Singularity," has reinforced my daily, functional belief that knowing is truly something that humans do within language and through language, and that the quest which makes human life human is careful attention to the forms of human life, and the way that language, mathematics, and silence are woven into and through those forms. The techno-solutionism inherent in educational technology and library information technology --no matter how sophisticated-- cannot undo the basic puzzle of human life: how do we individually and socially find the world? (Find: in the sense of locating, of discovering, and of characterizing.)  Yewno will not lead to a Knowledge Singularity, but to derived bias and reproduced injustice, unless we acknowledge its limitations within language.

The promise of educational and information technology becomes more powerful when approached with modesty: there are no quick, technological solutions to the puzzles of education, of finance, of information discovery, of "undiscovered public knowledge."  What those of us who are existentially involved with the much-maligned, greatly misunderstood, and routinely dismissed "liberal arts" can contribute is exactly what makes those technologies humane: a sense of modesty, proportion, generosity, and silence.  Even to remember those at this present moment is a profoundly counter-cultural act, a resistance to the techno-ideology of unconscious bias and entrenched injustice.

In educational technology, we are in the presence of a powerful ideology, and an ideology of the powerful: the neoliberal state and its allies in higher education.

(This is part two of my posts on my summer reading thus far: see parts one and three.)

Another article found in my strange cleaning mania is not so very old: George Veletsianos and Rolin Moe's The Rise of Educational Technology as a Sociocultural and Ideological Phenomenon. Published by (upper-case obligatory) EDUCAUSE, it argues that "the rise of educational technology is part of a larger shift in political thought" that favors (so-called) free-market principles over government oversight, and is also a response to the increasing costs of higher education.  Edtech proponents have (always? often?) "assumed positive impacts, promoting an optimistic rhetoric despite little empirical evidence of results --and ample documentation of failures."  In other words, we are in the presence of a powerful ideology, and an ideology of the powerful: the neoliberal state and its allies in higher education.

The authors frame their argument through assertions.  First: the edtech phenomenon is a response to the increasing price of higher education, seen as a way to slow, stop, or reverse rising prices.  The popular press questions the viability of college degrees and of higher education itself, sometimes with familiar "bubble" language borrowed from market analyses.  Second: the edtech phenomenon reflects a shift in political thought from government to free-market oversight of education: reducing governmental involvement and funding, along with an increasing emphasis on market forces, "has provided a space and an opportunity for the edtech industry to flourish." Although set to accelerate vastly under Donald Trump and Betsy DeVos, funding reductions and a turn to "private sector" responses have long been in evidence, associated with the "perspective" (the authors eschew "ideology") of neoliberalism: the ideology that free-market competition invariably results in improved services at lower costs.  Outsourcing numerous campus services supposedly leads to lower costs, but also "will relegate power and control to non-institutional actors" (and that is what neoliberalism is all about).

The authors (thirdly) assert that "the edtech phenomenon is symptomatic of a view of education as a product to be packaged, automated, and delivered" --in other words, neoliberal service and production assumptions transferred to education.  This ideology is enabled by a "curious amnesia, forgetfulness, or even willful ignorance" (remember: we are in the presence of an ideology) "of past phases of technology development and implementation in schools."  When I was in elementary school (late 1950s and 1960s), the phase was filmstrips, movies, and "the new math," which worked hand-in-glove with Robert McNamara's Ford Motor Company, and subsequently his Department of Defense, to "scale" productivity-oriented education for obedient workers and soldiers (the results of the New Math were in my case disastrous, and I am hardly alone).  The educational objectivism implicit in much of edtech sits simultaneously and oddly with tributes to professed educational constructivism --"learning by doing"-- which tends then to be reserved for those who can afford it in the neoliberal state.  I have bristled when hearing the cliché that the new pedagogy aims for "the guide on the side, not the sage on the stage" --when my life and outlook have been changed by carefully crafted, deeply engaging lectures (but remember: we are in the presence of an ideology).

Finally, the authors assert that "the edtech phenomenon is symptomatic of the technocentric belief that technology is the most efficient solution to the problems of higher education."  There is an ideological solutionism afoot here. Despite a plethora of evidence to the contrary, techno-determinism (the belief that technology autonomously shapes its emerging society) and techno-solutionism (the belief that technology will solve societal problems) assume the aura of the "naturally given," a sure sign of ideology.  Ignorance of edtech's history and impact "is illustrated by public comments arguing that the education system has remained unchanged for hundreds of years" (by edX CEO Anant Agarwal, among others), when the reality is that academia has constantly developed and changed course.  Anyone who thinks otherwise should visit a really old institution such as Oxford University: older architecture meant to serve medieval educational practices, retro-fitted to 19th- and early 20th-century uses, and now sometimes awkwardly retro-fitted yet again to the needs of a modern research university.  The rise and swift fall of MOOCs is another illustration of the remarkable ignorance that ideological techno-solutionism mandates in order to appear "smart" (or at least in line with Gartner's hype cycle).

The authors conclude, "unless greater collaborative efforts take place between edtech developers and the greater academic community, as well as more informed deep understandings of how learning and teaching actually occur, any efforts to make edtech education's silver bullet are doomed to fail."  They recommend that edtech developers and implementers commit to supporting their claims with empirical evidence "resulting from transparent and rigorous evaluation processes" (!--no "proprietary data" here); invite independent expertise; attend to discourse (at conferences and elsewhere) that is critical of edtech rather than merely promotional; and undertake reflection that is more than personal, situational, or reflective of one particular institutional location.  Edtech as a scholarly field and community of practice could in this way continue efforts to improve teaching and learning that will bear fruit for educators, not just for corporate technology collaborators.

How many points of their article are relevant by extension to library information technology, its implementation, and reflections on its use!  Commendably, ACRL and other professional venues have subjected library technologies to critical review and discourse (although LITA's Top Technology Trends Committee too often reverts to techno-solutionism and boosterism from the same old same old).  Veletsianos' and Moe's points regarding the neoliberal ideological suppositions of the library information technology market, however, are well-taken --just attend a conference presentation on the exhibition floor from any of numerous vendors for a full demonstration.  At the recent conference of the Association of College & Research Libraries, the critical language of the Information Literacy Framework was sometimes turned on librarianship and library technology itself ("authority is constructed and contextual"), such as critique of the term "resilient" (.pdf) and the growing usage of the term "wicked challenges" for those times when we don't know what we don't know, or even know how to ask what that would be.

Nevertheless, it would be equally historically ignorant to deny the considerable contributions made by information technology to contemporary librarianship, even when such contributions should be regarded cautiously.   There are still interesting new technologies that can contribute a great deal even when they are neither disruptive nor revolutionary.  The most interesting (by far) new kind of technology or process I saw at ACRL is Yewno, and I will discuss it in my third blog post.

"Undiscovered public knowledge" seems an oxymoron. If "public" than why "undiscovered" --means the knowledge that once was known by someone, recorded, properly interred in some documentary vault, and left unexamined.

(This is the first of three posts about my semi-serendipitous summer reading; here are links to posts two and three.)

This last week I was seized by a strange mania: clean the office. I have been at my current desk and office since 2011 (when a major renovation disrupted it for some months).  It was time to clean --spurred by notice that boxes of papers would be picked up for the annual certified, assured shredding. I realized I had piles of FERPA-protected paperwork (exams, papers, one-on-one office-hours memos, you name it).  Worse: my predecessor had left me large files that I hadn't looked at in seven years, and that contained legal papers, employee annual performance reviews, old resumes, consultant reports, accreditation documentation, etc. Time for it all to go!  I collected six large official boxes (each twice the size of a paper ream), but didn't stop there: I also cleaned the desk; cleaned up the desktop; recycled odd electronic items, batteries, and lightbulbs; and forwarded a very large number of vendor advertising pens to a cache for our library users ("do you have a pen?"). On Thursday I was left with the moment after: I had cleared it all out; now what?

The "what" turned out to be various articles I had collected and printed for later reading, and then never actually read --some more recent, some a little older. (This doesn't count the articles I recycled as no longer relevant or particularly interesting; my office is not a bibliography in itself.) Unintentionally, several of these articles wove together concerns that have been growing in the back of my mind --and have been greatly pushed forward with the events of the past year (Orlando--Bernie Sanders--the CombOver--British M.P. Jo Cox--seem as distant and similar as events of the late Roman republic now, pace Mary Beard.)

"Undiscovered public knowledge" seems an oxymoron (but less one than "Attorney General Jeff Sessions").  If "public" than why "undiscovered"?  It means the knowledge that once was known by someone, recorded, properly interred in some documentary vault, and left unexamined and undiscovered by anyone else.  The expression is used in Adrienne LaFrance's Searching for Lost Knowledge in the Age of Intelligent Machines, published in The Atlantic, December 1, 2016.   Her leading example is the fascinating story of the Antikythera mechanism, some sort of ancient time-piece surfaced from an ancient, submerged wreck off Antikythera (a Greek island between the Peloponnese and Crete, known also as Aigila or Ogylos).  It sat in the crate outside the National Archaeological Museum in Athens for a year, and then was largely forgotten by all but a few dogged researchers, who pressed on for decades with the attempt to figure out exactly what it is.

The Antikythera mechanism has only come to be understood when widely separated knowledge has been combined by luck, persistence, intuition, and conjecture.  How did such an ancient timepiece come about, who made it, based upon which thinking, from where?  It could not have been a one-off, but it seems to be a unique lucky find from the ancient world, unless other mechanisms or pieces are located elsewhere in undescribed or poorly described collections.  For example, a 10th-century Arabic manuscript suggests that such a mechanism may have influenced the development of modern clocks, and in turn built upon ancient Babylonian astronomical data.  (For more see Josephine Marchant's Decoding the Heavens: A 2,000-Year-Old Computer --and the Century-Long Search to Discover Its Secrets, Cambridge, Mass.: Da Capo Press, 2009: Worldcat; Sacred Heart University Library). Is there "undiscovered public knowledge" that would include other mechanisms, other clues to its identity, construction, development, and influence?

"Undiscovered public knowledge" is a phrase made modestly famous by Don R. Swanson in an article by the same name in The Library Quarterly, 1986.  This interesting article is a great example of the way that library knowledge and practice tends to become isolated in the library silo, when it might have benefited many others located elsewhere. (It is also a testimony to the significant, short-sighted mistake made by the University of Chicago, Columbia University, and others, in closing their library science programs in the 1980s-1990s just when such knowledge was going public in Yahoo, Google, Amazon, GPS applications and countless other developments.)  Swanson's point is that "independently created fragments are logically related but never retrieved, brought together, and interpreted." The "essential incompleteness" of search (or now: discovery) makes "possible and plausible the existence of undiscovered public knowledge." (to quote the abstract --the article is highly relevant and well developed).  Where Swanson runs into trouble, however, is his use of Karl Popper's distinction between subjective and objective knowledge, the critical approach within science that distinguishes between "World 2" and "World 3."  (Popper's Three Worlds (.pdf), lectures at the University of Michigan in 1978, were a favorite of several of my professors at Columbia University School of Library Service; Swanson's article in turn was published and widely read while I was studying there.)

Popper's critical worlds (1: physical objects and events, including biological; 2: mental objects and events; 3: objective knowledge, a human but not Platonic zone) both enable the deep structures of information science as now practiced by our digital overlords and signal their fatal flaw.  They do this (enable the deep structures and algorithms of "discovery") by assuming the link between physical objects and events, mental objects, and objective knowledge symbolically notated (language, mathematics). Simultaneously Popper's linkage also signals their fatal flaw: such language (and mathematics) is used part-and-parcel in innumerable forms of human life and their language "games," where the link between physical objects, mental objects, and so-called objective knowledge is puzzling, as well as a never-ending source of philosophical delusion.

To sum up:  Google thinks its algorithm is serving up discoveries of objective realities, when it is really extending the form of life called "algorithm" --no "mere" here, but in fact an ideological extension of language that conceals its power relations and manufactures the assumed sense that such discovery is "natural."  It is au contraire a highly developed, very human form of life parallel to, and participating in, innumerable other forms of life, and just as subject to their foibles, delusions, illogic, and mistakes as any other linguistic form of life. There is no "merely" (so-called "nothing-buttery") to Google's ideological extension: it is very powerful and seems, at the moment, to rule the world.  Like every delusion, however, it could fall "suddenly and inexplicably," like an algorithmic Berlin Wall, and "no one could have seen it coming" --because of the magnificent illusion of ideology (as in the Berlin Wall, ideology on both sides, as well, upheld by both the CIA and the KGB).

This is once again to rehearse the crucial difference between Popper's and Wittgenstein's understandings of science and knowledge.  A highly relevant text is the lucid, short Wittgenstein's Poker: The Story of a Ten-Minute Argument Between Two Great Philosophers, (by David Edmonds and John Eidinow, Harper Collins, 2001; Worldcat).  Wittgenstein: if we can understand the way language works from within language (our only vantage point), most philosophical problems will disappear, and we are left with puzzles and mis-understandings that arise when we use improperly the logic of our language.  Popper: Serious philosophical problems exist with real-world consequences, and a focus upon language only "cleans its spectacles" to enable the wearer to see the world more clearly.  (The metaphor is approximately Popper's; this quick summary will undoubtedly displease informed philosophers, and I beg their forgiveness, for the sake of brevity.)

For Wittgenstein, if I may boldly speculate, Google would only render a reflection of ourselves, our puzzles, mis-understandings, and mistakes. Example: search "white girls," then clear the browser of its cookies (this is important), and search "black girls."  Behold the racial bias. The difference in Google's search results points to machine-reproduced racism that would not have surprised Wittgenstein, but seems foreign to Popper's three worlds.  Google aspires to Popper's claims of objectivity, but behaves very differently --at least, its algorithm does.  No wonder its algorithm has taken on the aura of an ancient deity: it serves weal and woe without concern for the fortunes of dependent mortals. Except . . . it's a human construct.

So, Swanson's article identifies and makes plausible "undiscovered public knowledge" because of the logical and essential incompleteness of discovery (what he called "search"): discovery signals a wide variety of human forms of life, and no algorithm can really anticipate them.  The Antikythera mechanism, far from an odd example, is a pregnant metaphor for the poignant frailties of human knowledge and humans' drive to push past their limits. Like the Archimedes palimpsest, "undiscovered public knowledge" is one of the elements that makes human life human --without which we become, like the Q Continuum in Star Trek: The Next Generation, merely idle god-like creatures of whim and no moral gravitas whatsoever.  The frailty of knowledge --that it is made up of innumerable forms of human life, which have to be lived by humans rather than algorithms-- gives the human drive to know its edge, and its tragedy.  A tragic sense of life, however, is antithetical to the techno-solutionist ideology of the algorithm.

(Continued in the second post, Undiscovered Summer Reading)

More than a month ago, Joshua Kim asked eleven questions of his colleagues at the Association of College and Research Libraries Conference 2017 (Baltimore, March 22-25). I don't have answers --but after an embarrassingly long delay, here are my responses to his eleven questions.

More than a month ago, Joshua Kim asked eleven questions of his colleagues at ACRL 2017 (Association of College and Research Libraries, Baltimore, March 22-25).  I simply cannot keep up the pace of writing and work that apparently characterizes Dr. Kim (I ask, as does Barbara Fister: "how does he do it?").  I don't have answers --but after an embarrassingly long delay, here are my responses to his eleven questions:

Question 1: What is keeping academic librarians up at night?

Allergy to book mold?  Seriously: the mis-match between 1) the transformations of libraries and librarians now in process, 2) unrealistic expectations, both within and outside academe, about the transformation of higher education by digital technology, and 3) available financial resources.  There is barely enough money to fund our members' needs day-to-day, much less the future needs of members who will bring very high, very different expectations to their education and research.  Training and re-training librarians, even younger, newer ones, will also be not only a significant expense, but an unavoidable one if they are to remain relevant and aligned with their institutional mission.

Question 2: What will the academic library look like in 2025?

Libraries will still be living in a both-and world: some members will continue to want printed books, but will use them differently than in the recent past (2015).  Some printed books will explicitly engage digital resources as supplements and complements.  Some members will never want to see or open a printed book.  Some members will be working far more with data sets and non-textual (or ostensibly non-textual) information visualizations.  Libraries will have fewer printed books on site, more workspaces (and more different kinds), less reader or member privacy, and more commercialization by monetized information organizations.  Members will still want a physical library space to be a place to get and stay "on task."  Some members will never interact physically with a library, or personally with librarians, but will use library services every day through the information configurations that librarians will tend and troubleshoot.

Question 3: How is the academic librarian profession changing?

Already there is far more emphasis upon communication, instructional, and design skills than ten years ago (even than five years ago).   Technical, back-office skills are rapidly changing from the provision and editing of information to aligning interactive and interoperable information systems.  Library leadership is especially challenged to visualize for our stakeholders what could be, might be, or will be, to figure out how to achieve all that, and yet to negotiate present-day campus political, financial, and legal arrangements that often reflect patterns and processes that are already obsolete.

Question 4: What is the role of the academic library in leading institutional transformation?

Preface: If universities are like the proverbial elephant as regarded by the visually impaired, libraries are very well-positioned (almost uniquely) to see a great deal of the elephant. The daily life of a library interacts with faculty who are teaching and doing research right now, academic leadership, policy and planning, campus operations of all kinds, public safety and security, university financial offices, alumni/ae relationships, enrollment retention, student recruitment, faculty recruitment, information technology, instructional design, construction and facilities management, and sometimes even the food service.  As a library director, I have almost all of those people on speed-dial.

To answer this question: libraries are almost uniquely well-positioned to act as change agents by partnering with a wide variety of interests to achieve a cumulative social, educational, and intellectual impact on campus beyond the abilities or purview of any one campus organization.  This requires vision, street smarts, and an ability to listen.  It requires seeing that library priorities are not always the institution's priorities; when they differ, the two need to come into some kind of symbiosis.

Question 5: How do academic librarians think about learning innovation?

It's a broad term.  Librarians would love to collaborate with instructional designers, faculty members, and others to create pathways for learning that transcend previous classroom, lab, and practice settings.   Organizations, consultants, and academic specialists will all be part of that --but many academic librarians I know are increasingly suspicious of the corporate interests in the phrase "learning innovation."  The innovations to learning in higher education that will be most productive are those that will not be packaged and sold by corporate interests, but will be far more local, ad-hoc, and malleable by teachers and learners themselves.  When I view a video purporting to be about Learning Innovation and the talking head is Thomas Friedman's, I begin to wonder whose interests are really going to be served.

Question 6:  What is the role of the academic library in leading institutional efforts [that will] drive progress in the iron triangle of costs, access, and quality?

Academic librarians have a lot of experience with the trade-offs of costs, access limitations, and quality (both of information per se, and of presentation and interface).  Part of my daily life is putting budget numbers and academic ways-and-means together.  I believe the academic library's role can be incubator, initiator, and assessor of costs for access and quality of outcomes, but that role is not guaranteed.  I believe that academic librarians will also want to challenge the oft-encountered (perhaps dominant) idea that instructional quality is simply a cost that limits institutional net income.  The recent ACE paper Instructional Quality, Student Outcomes, and Institutional Finances (.pdf) points at research that needs to be done, and assumptions that should be interrogated.

Question 7: What does the academic library leadership pipeline look like?

I heard some real concerns at ACRL about how the field will mentor future leaders who will need the financial, political, academic, and social skills necessary to lead a complicated organization on a complicated campus (physical or digital).  The relative slow-down in professional movement, promotion, and retirements in the years after 2009, coupled with either outright downsizing or less (immediately) drastic holds on hiring, has produced a situation in which there are not sufficient opportunities for rising leaders to learn their craft.  The profession is greying, and I cannot blame recent college graduates who bypass library and information science programs in favor of fields in which they will be able to pay off their substantial student debts more readily.  Yet we really need those people, and we need creative, competent new professionals of every age who will contribute their perspectives and learn how business actually gets done in many institutions.

Question 8:  How is the academic library addressing challenges around diversity and inclusion?

This was also a major theme of the ACRL conference, and built up to the simply fabulous closing keynote by Carla Hayden, Librarian of Congress.  These challenges play out differently in different contexts: large academic library systems can pursue strategies and mentorships that may not be practical for much smaller libraries --where the challenges and real needs for diversity of perspectives and persons, and inclusion of all kinds of persons, are still very much present and felt.  I think this is a real opportunity for ACRL: to lead a multi-sided approach with library and information schools, foundations and grantors, large and small academic libraries, national, state, and regional library associations (in particular with library technology, and leadership & management divisions), and academic administrative organizations (AAC&U, ACAD, and HERC), for a cumulative impact on the profession, the libraries, and the universities.  I think that the professional leadership education offered by the Harvard Graduate School of Education is also a vital and viable venue to put together the efforts of many organizations.

Question 9:  What are the big arguments and debates within the academic library discipline?

I find so many that I will inevitably leave some out of even a very long list.  But here's my short list:

  • Privacy, user security, and trust: maintaining the academic library's digital and physical space as a non-commercial zone of exception, as much as possible; the way that users' searches and downloads can become monetized data points for commercial services that offer a false equivalent to a real library--and whether librarians can really do anything about that, or respond to it usefully;
  • Evolving understandings or interpretations of the Information Literacy Framework: what it brings in, leaves out, interrogates, and strengthens, and the sometimes yawning gap between the aspirations of the Framework and the sometimes frightening realities of many young students' lack of curiosity and joy;
  • The persistent tensions between "resilience" as a good term for a kind of creative flexibility in the face of adversity, and "resilience" as a substitution of a personal response for a solution to a structural problem.  I heard one speaker use the word and then immediately apologize for it after a standing-room-only presentation called Resilience, Grit and Other Lies: Academic Libraries and the Myth of Resiliency (.pdf).

Other attendees are welcome to point out all the good fights I missed.

Question 10:  How is the relationship between academic libraries and centers for teaching and learning (CTLs) evolving?

I'm not sure there is a consensus, since there are so many variables in academic contexts. I know of one case where a beautifully renovated CTL in fact combined about 10 other services formerly located elsewhere in a large university.  That set of new partnerships has required so much team-building and re-negotiation that librarians in the same building have not had as much contact as they previously anticipated (this may be changing recently).  Where the CTL and librarian partners have sufficient contact and are not completely frustrated by funding limitations (or funding's non-existence), I have heard of enormously fruitful partnerships evolving.  Open Educational Resources, Open Textbooks, and so many other hot topics really call for multi-sided collaborations.  My favorite anecdote is of an information technologist with a strong secondary background in instructional design who exclaimed, "Wow, there's a lot of information technology in the library!"  I believe that many libraries, especially on undergraduate-oriented campuses, were attempting to be centers for teaching and learning before the phrase was invented, and in those places where turf is not a source of conflict, creative partnerships are forming.

The recent Ithaka S+R survey of library directors found that "while [many] library directors agreed that librarians at their institutions contribute significantly to student learning in a variety of ways, only about half of the faculty members for the Ithaka S+R Faculty Survey 2015 recognized these contributions." (pages 3-4) I suspect that both Centers for Teaching and Learning and academic libraries face a common challenge to communicate what they can (and do) contribute to faculty who are genuinely skeptical or worried about maintaining their turf.

Question 11:  What questions should I be asking about the changing academic library?

The fact that you are asking any questions at all is remarkable to many academic librarians, who have so often felt marginalized (for reasons good and bad) by campus technology and technologists over the past couple of decades.  You don't take anything for granted.

I can only really respond by suggesting the question that I'm asking as I lead my organization through a process of listening, thinking and planning together: what is our core mission in plain language?  What is our value proposition for our institution?  How do we show that we are doing that?  How does our mission and value proposition align with our institution's proclaimed commitments and priorities?

In this process I have spoken with many people on my campus, and (following advice from a mentor) I asked each of them simply:  "What is your job?  What difference does your office make here?  What's your biggest challenge?"  Their responses were amazing, and almost all of them pointed in one direction: "how do we communicate to a skeptical world what an amazing difference real learning can make in a student's life?"  To the extent that our library can respond to that question with grace and authenticity, we can also state our value proposition and our mission, and our alignment with our university.

Even with all the challenges, controversies, and constraints, this is the best time ever to be an academic librarian.