
Digital Minimalism, by Cal Newport. Random House, 2019. 256 pages. ISBN 9780525542872. (Other sources cited in this article are listed at the end.)

Minimalist: anything spare or stripped to its bare essentials. Minimalism became a cultural movement, then a social commitment: live with fewer than 100 things, or keep only Marie Kondo's things that spark joy. Is digital minimalism a contradiction in terms? Is it relevant to higher education, or is it another so-called luxury like "liberal arts education"? Read on.

  1. Digital Minimalism

The present is a digitally maximal time, but the maximalism is all very vague. What's good is great, and best in huge quantities: maximum use of digital media (and social media in particular) to give "people the power to build community and bring the world closer together" (Facebook's mission statement). How? Just connect, share, and like, and somehow good things will happen. "You never know, maybe you'll find this useful": one of the weakest sales propositions ever.

Anything or anyone that seems to resist digital maximalism risks the label "Luddite," a dismissive reference to the 19th-century weavers who destroyed machines to save, they thought, their way of life. Newport is no Luddite: he is a professor of computer science at Georgetown, author of erudite papers on distributed algorithms as well as popular works such as So Good They Can't Ignore You (2012). In Deep Work: Rules for Focused Success in a Distracted World (2016) he explored how to maintain focus for optimal, cognitively demanding work in a world of distractions: high-value, high-impact undertakings rather than low-impact tasks. Digital Minimalism sprang from his readers' struggles with the role of new technologies in their personal lives. The insight came to him while walking on a deserted beach in the Bahamas (Newport strongly recommends walking as an analog practice), a lovely location for "deep work!"

Digital minimalism is "a philosophy of technology use in which you focus your online time on a small number of carefully selected and optimized activities that strongly support things you value," without the infamous Fear of Missing Out (FOMO). Living this philosophy successfully means engaging in long-term cost/benefit analyses: is the benefit worth the time? Time is the most truly limited resource. Clutter is costly; optimizing your time is crucial; intentionality, consistently following through on your commitments, is satisfying.

Newport unpacks all this lucidly (he is, after all, a computer scientist). His first chapters lay the foundations: why declutter your digital life? What can you gain? How do you do it and stick to it? His later chapters focus on practices: how to do a digital declutter; how to grow comfortable again with spending time alone; how to reclaim real leisure; how to avoid digital rabbit holes such as clicking "like"; and how to find community support among other digital minimalists. He seeks to answer Andrew Sullivan's plaintive essay, "I Used to Be a Human Being" (2016): to help upend Sullivan's lament "by providing a constructive way to engage and leverage the latest innovations to your advantage," to be able to "say with confidence: 'Because of technology, I'm a better human being than I ever was before.'"

Wait: isn't this the point of an education? Newport acknowledges the depths here (Aristotle, Thoreau, Abraham Lincoln) but avoids getting pulled off-task. The book is a readable length, but its shadow stretches very far indeed: becoming a better human being goes far beyond dispelling the enchantments of technology.

Back, for a moment, to Sullivan: his moment of insight came after illness, sleeplessness, the demands of a profitable media business (his blog), and dwindling friendships. "Multi-tasking was a mirage. This was a zero-sum game. I either lived as a voice online or I lived as a human being in the world that humans had lived in since the beginning of time." Why zero-sum? He had (has) only so much time to pay attention. The ceaseless wind-tunnel of distraction "denies us the deep satisfaction that comes with accomplishing daily tasks well, a denial perhaps felt most acutely by those for whom such tasks are also a livelihood —and an identity."

Many university teachers have noticed that students (especially undergraduates) now seem even less prepared to engage in serious thinking, research, writing, and lab work than a decade ago. Their observations dovetail with major shifts in student mental health observed by counselors in the past few years, validated by Jean Twenge's research on those born 1995-2012, who grew up with constant access to social media. "Rates of teen depression and suicide have skyrocketed since 2011. . . . Much of this deterioration can be traced to their phones. . . . The effect of screen activities is unmistakable: The more time teens spend looking at screens, the more likely they are to report symptoms of depression. . . . This trend has been especially steep among girls." Twenge's teenage research subjects of 2015-2016 are enrolling (or will enroll) in university classes in 2018-2021.

Can this be blithely dismissed: That’s progress, you can’t stop it?  “Progress” hides a more sinister reality: the social media apps these young people use so often have been specifically engineered to encourage maximal use through intermittent positive reinforcement and the drive for social approval.

  • Apple engineers Justin Santamaria and Chris Marcellino developed the iPhone push-notification technology that affects the same neurological pathways as gambling and drug use: “reward-based behavior that activates the brain’s dopamine pathways.”
  • Tristan Harris ("Design Ethicist" at Google) notes that humans crave approval, and companies tweak their apps to hook users with the power of unpredictable positive feedback, sprinkling "intermittent variable rewards [likes, tags, tweets, etc.] all over their products because it's good for business." Getting a reward is like winning at a slot machine, and "several billion people have a slot machine in their pocket" (a toy simulation of this reward schedule follows this list).
  • Sean Parker (Facebook's founding president) remembers, "The thought process that went into building these applications, Facebook being the first of them, ... was all about: How do we consume as much of your time and conscious attention as possible?"
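What an "intermittent variable reward" schedule looks like is easy to simulate. Here is a minimal, hypothetical sketch (the reward probability and feed length are invented for illustration): each check of a feed pays off unpredictably, so rewards arrive in bursts and droughts, and no single check is ever provably pointless.

```python
# Toy simulation of an intermittent variable-reward schedule.
# Each "check" of the feed pays off with a small, fixed probability,
# so rewards (X) arrive in unpredictable bursts and droughts.
import random

random.seed(42)
P_REWARD = 0.15  # invented: chance that any single check pays off
feed = "".join("X" if random.random() < P_REWARD else "." for _ in range(60))
print(feed)  # no fixed rhythm, so every check tempts like a lever pull

# The gaps between rewards vary wildly, which is exactly what makes
# "just one more check" so hard to resist.
print("gaps between rewards:", [len(gap) for gap in feed.split("X")])
```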

The combination of phones and social media apps is specifically designed to hook users, especially young people, into prolonged use, because the business model is to expose them to paid advertising, political, and entertainment content intended to shape their behavior and gather their votes and dollars. A great many users (especially the young) are compulsively on their phones because they have been hooked: exactly what the phones were designed to do. Sean Parker fears that social media "literally changes your relationship with society, with each other ... It probably interferes with productivity in weird ways. God only knows what it's doing to our children's brains." Sullivan suggests that this enslavement is merely "new antidepressants of a non-pharmaceutical variety."

These intentions are not new, but digital engineering has taken them to new extremes. Tim Wu writes that newspapers were drastically changed by the introduction of advertising in the 19th century: readers became not just subscribers, but also an audience the newspapers delivered to advertisers. Matthew Crawford notes that the first industrial assembly lines, by demanding concentration on repetitious tasks, so altered the experience of work that Henry Ford's workers simply walked out in 1913. When Ford wanted to add 100 workers to the line, he had to hire 963, and was forced to double the daily wage to keep the line staffed. In broader social terms, Crawford writes elsewhere that advertising delivered through social media apps claims a large portion of the "attentional commons" for private purposes in the attention economy, with a resulting surfeit of messages and enervated users. Just as Ford's innovations in the long term fomented a powerful labor union, could a "user union" come to counterbalance corporate attention engineering?

Resisting such claims on wage labor or attention is not new. The 19th-century Arts and Crafts movement inspired by John Ruskin and William Morris grew from their revulsion against mechanized production and the Dickensian, oppressive division of sweatshop labor in Victorian England. Newport advances Thoreau's famous axiom in Walden: "The cost of a thing is the amount of what I will call life which is required to be exchanged for it, immediately or in the long run." Rather than the standard account of cost in money, Thoreau counts the cost in life: attention, connection, his pleasure in living deliberately. In the first chapter, "Economy," Thoreau gives a very straightforward, New England accounting of his life on the pond, replete with tables (his kind of spreadsheet) to show his point that frequently more is actually less. By contrast, do not our students, like Thoreau's Concord neighbors, crushed and smothered under their load of distraction and debt, come to lead lives of quiet desperation?

Is there any solution or alternative?

A hint of a solution has been given, ironically, by Facebook itself. Its own David Ginsberg and Moira Burke ask, "is spending time on social media bad for us?" After reviewing a good deal of research, they conclude, "it really comes down to how you use the technology." This gives the game away: reflective, intentional use (in Newport's words) "punctures the myth of Facebook as a foundational technology that everyone should just 'use' in some generic sense . . . . [they] are encouraging people to think critically about exactly what they want to get out of this service."

Newport realizes the potential of Ginsberg's and Burke's admission. "This mind-set is potentially disastrous" for Facebook, because it could result in far less time spent on the service, dramatically decreasing its value to advertisers and investors. Any explicit comparison of the real costs of time and attention with the real benefits of social media threatens Facebook's business model.

Reflective and intentional, "critical use is a critical problem for the attention economy." By developing minimal and deliberate use of digital technology, users might "front only the essential facts of life," to see if they can learn what it has to teach: to choose a focused life. Have universities, by so catering to students' and parents' anxieties, unreflectively accepted their students' distraction by social media? The "attentional commons" of higher education has always faced competition, but now faces determined competitors armed with the specific agenda to "consume as much of [their students'] time and attention as possible" (Sean Parker).

Universities can reclaim their cultural relevance when they come to understand that the greatest threat to education today is not careerism, financial instability, or political hostility, but distraction. If higher education is ever to offer a coherent alternative to a depressed and frazzled generation, it will have to engage the powerful corporate and cultural forces that want to hold students, faculty, and staff hostage to engineered, hyper-palatable mental pseudo-stimuli. This engagement will have to be smart, flexible, subtle, and persistent if we are to challenge the fast food of social media with the slow cookery of a strenuous education.

The past few months have shown that Facebook and other social media sites are hardly invincible and certainly not foundational, as they face sharp-eyed scrutiny from the public, governments, and investors alike. Now is the time for higher education to step up to the challenge of distraction.

Where is this wisdom to be found, and where is the source of this understanding?

  2. Critical Attention, Disruptive Humility, and Analog Reality

Much fuller responses come from a varied handful of writers. Gary Rogowski and Matthew Crawford return to the attention required by the physical world of objects, activities, and pleasures that live on despite the blandishments of the digital. Rogowski's title and subtitle reveal his point: Handmade: Creative Focus in the Age of Distraction. Examining what his life has to teach, the essential facts of his life (like Thoreau), Rogowski finds a correspondence between our hands and our thoughts: "Long ago we learned to think by using our hands, not the other way around." The sheer physicality of the "analog" asks us "to give good evidence of yourself. Do good work."

Craftsmanship "must reckon with the infallible judgment of reality, where one's failures or shortcomings cannot be interpreted away." Crawford believes "the mechanical arts have a special significance for our time because they cultivate not creativity, but the less glamorous virtue of attentiveness. Things need fixing and tending no less than creating." Sheer physicality, the recalcitrance of the real, is a way of learning the humility of attention. As Rogowski puts it, "situated among our fellows in norms and practices that shape a life, the environment matters."

The humility of attention is not new. When Alexander Langlands inquired into the origins and true meanings of traditional crafts, he bumped into the embeddedness of craeft, an Old English word connoting an amalgam of knowledge, power, and skill, extended to a sense of wisdom and resourcefulness. "We can't put our finger on exactly what craeft was." As an archaeologist, he constantly studies material culture, the "deep time signatures" of so many traditional crafts. They remind him that "we are makers, and that we have always lived in a world of making." The cognitive contemplation of making exercises the mind in silence and solitude. Craeft is a form of intelligence, an ingenuity through which we can think, contemplate, and be: powerful, resourceful, and knowledgeable through the medium of making in a world of diminishing resources and increasing environmental instability. To practice craeft is all about a mindful life achieved through beautiful simplicity: the humility of attention, failure, and persistence.

Craeft provides a high-touch physical higher education.  It pays attention to the reality of physical, tangible materials as they both shape and respond to human desires, needs, and intentions.  Langlands’ “deep time signatures” are written with the ink of character on the paper of experience, an in-this-moment contact with forms of past life and future practice that transcend any individual’s encounters.  In the 21st century, is this mere nostalgia, or wishful thinking?

Jaron Lanier, one of the primary inventors of "virtual reality," practices these deep signatures of time through his music and writing. Understanding the virtual imitation of reality so deeply, Lanier wrestles acutely with actual reality: difficult-to-play ancient instruments such as the begena (Ethiopian harp), suling (flute), or esraj (a bowed Indian string instrument); he has collected over a thousand. He doubles down on being human by re-learning the ancient crafts of playing, the deep signatures of time that learning such ancient skills requires.

Lanier's craeft of music extends to his craeft of writing, and to his understanding of books. His Who Owns the Future? (2013) concludes with entwined meditations, "The Fate of Books" and "What Is To Be Remembered?" Having watched the disruption of the music business and the unintentional (but devastating) impoverishment of musicians, Lanier sees a similar pattern developing for reading and writers. Six years later, his hopes, fears, and expectations for books are fascinating: some have already come true. Among them:

  • Readers will be second-class citizens (because ownership of a printed copy has become a contract of access to digital content; the reader is left with no capital, nothing to resell: a rejection of a true market economy);
  • Many books will be available only via a particular device, such as a particular company’s tablet (think: Kindle, Nook, or digital rights protection such as Adobe Digital Editions);
  • Readers will spend a lot of time hassling with forgotten passwords, expired credit cards, or being locked into the wrong device . . . for years at a time;
  • People will pay less to read . . . while people will earn still less from writing (repeating a pattern seen in music and other media in which “software swallows everything”);
  • By the time books have mostly gone digital, the owners of the top Internet servers . . . will be more powerful and richer than they were before.

What does Lanier see about a book that is worth saving? A book is not an artifact, but a synthesis of fully realized individual personhood with human continuity (my italics). To Lanier, the pattern of devaluation and exploitation of "content" (music, literature) feels off-kilter and short-sighted. A network that fails to recognize and preserve human continuity will serve only its masters, just as digital music has fostered a dehumanization of music. When the human role is reduced to producing "output" (or "content") just like a collective or algorithm, and when a winner-takes-all marketplace ideology becomes the sole means of valuation, the very qualities that make music or literature, reading or writing, humanely worthwhile are simply sidelined as irrelevant externalities. To paraphrase the famous declaration from Vietnam: "we had to burn this art form to price it."

Lanier proposes this (potentially offensive) thought experiment: "Would you want to send a collectively programmed robot to have sex on your behalf because it was better than you, or would you want to have the sex yourself and get better by doing?" Too often Lanier has heard in response, "I'd prefer to have the best available robot," silently passing over the nexus of data, servers, technological finance, and market ideology that makes that choice seem preferable. By analogy: "If market pricing is the only legitimate test of quality, why are we still bothering with proving theorems? Why don't we just let the market determine whether a theorem is true?" Content-by-algorithm or -collective is slow suicide for creators and researchers, and slow homicide for everyone else. Lanier concludes, "I am arguing that there is more than one way to build an information economy, and we've chosen the self-destructive option."

"Content" at market price seems to be so easily available. For example, printed books are still common. Hyper-focused on "content-creation" and marketplace valuation, the "siren servers" of technology obscure how each book bears the marks of the deep signatures of time. A modern book is a carefully crafted, progressively evolving artifact whose design implies the depth of human continuity and experience. If you doubt this, just look at books printed a century or two ago to see the differences in design and the evolution of writers' and readers' expectations. The book is a synthesis of fully realized personhood with the human continuity signed with time's deep signatures. A printed book simply transposed into a digital reader as some kind of file brings convenience at the cost of interaction (annotating, bookmarking, easy page-to-page comparison or reference; the present software alternatives to those activities are simply dreadful). A time may come when a digital book represents human continuity, but not yet, nor for a long while. We are creatures of touch and smell as well as sight, hearing, and taste.

I believe Lanier's insight (that books and other media are syntheses "of fully realized individual personhood with human continuity") foregrounds the odd and enlightening "revenge of the analog" (David Sax's book). The return of "disrupted" technologies (prominently but not only vinyl recordings, paper books, film photography, and board games) is possible not only because digital versions work so well, but because they work too well. Convenience becomes a trap, a mirage of experience rather than genuine experience. Sax's different narrative shows that technological innovation is not an inevitable and irreversible march, but "a series of trials that helps us understand who we are and how we operate." The overwhelming superiority of digital media ironically leads to a shift of valuing: an older technology can sometimes work better, and its inconvenience and inefficiency become its renewed strength. Print publishing, retail sales, on-site workers in a vibrant community (such as Shinola in Detroit), on-ground education, face-to-face interaction in meetups, book clubs, meditation groups, summer camps: all of these ideas and practices have proven surprisingly resilient. Resilience cannot charm away real difficulties: no analog Dr. Pangloss! But it does suggest that the popular narrative of irresistible disruption might not be all there is; resilience is an alternative to battlefield metaphors of victory and defeat.

The worship of disruptive technologies (and disruption in general) can go too far. "Disruptive innovation" may in fact be not the rule, but the exception. Clayton Christensen's celebrated theory is not beyond criticism (by his Harvard colleague Jill Lepore), and there are equally cogent alternatives (Harvard colleague Michael Porter's theories of competitive strategy and advantage). Very recently Alfred University president Mark Zupan called Christensen's bluff. Christensen has doubled down on his 2011 prediction that as many as half of American universities would close or go bankrupt in 10 to 15 years. Zupan has wagered $1 million, to be given to Christensen's non-profit institute if at least half of all traditional universities fail or merge by 2030; if not, Christensen would contribute the same sum to Alfred University's endowment. Will Christensen be willing to put his money where his mouth is? (See also this previous blog entry.)

The humility of paying careful attention to deeply fallible, human, face-to-face encounters and processes calls the bluff of technologism: the belief that (often disruptive) technology will cure all woes, a new variety of traditional snake-oil wonders (climate change: "there's an app for that!"). Ryan Raffaelli (also of Harvard Business School) has studied how new value can be created for "old innovations": a subtle and delightful riposte. Raffaelli has studied, in particular, how the Swiss watch industry saved itself by reinventing its identity: a technology re-emergence. Swiss watchmakers who survived the competition of cheaper imports saw auction prices for mechanical watches increase dramatically. Suddenly they realized that there could still be value latent in an older technology re-conceived as a social and cultural fashion signal. Just when the Apple Watch might have swept away the Swiss watch industry, the industry instead revived by offering a progression for those who, having worn an Apple Watch, now wanted something up-market that feels and signals "genuine." As went Swiss watches, so have fountain pens (Goulet Pens), notebooks (Moleskine), and independent bookstores, which now number more than in 2012.

Here—at last!—is an opening for liberal arts education.

  3. Analog Liberal Arts Education Against the Grain: Digital Minimalism in Action

The term "liberal arts" now seems so old-fashioned or vague that it can impede further conversation. I once spoke with the president of a well-regarded liberal arts college who claimed that she no longer tried to use the term, because no one knows what it means. Her view discounts the real value of the phrase "liberal arts education": ambiguous and elusive, it has thus resisted facile, ideological characterization. The very ambiguity that impedes conversation also re-conceives value. The word "liberal" here means something far older than the tired liberal-versus-conservative polarization that disfigures (or has destroyed) public discourse: liberal as in liberating, free-making, getting disentangled from the sloppy shortcuts of everyday thinking. "Liberal arts" connotes a narrative of thoughtful practices out of sync with careerism and digital maximalism.

The narrative of the liberal arts, a braid of habits and practices from (inter alia) classical antiquity, medieval universities, American colleges in the New National period, and the rise of the sciences and social sciences in the Progressive era, is an amalgam that is unavoidably counter-cultural. Powerful ideologies, whether arising from crown, state, church, market, or faction, have always arrayed themselves against it. Overt ideologies are the obvious culprits, but equally antagonistic have been the unexamined habits and dispositions that can cloak the subtle, slippery demands of power, position, and fortune. The "liberal arts" have been both the servants of power and privilege and a primary source of their interrogation. This both/and contradiction or ambiguity, with a strong whiff of the impractical, means that contemporary marketplace ideology can hardly avoid patronizing liberal arts higher education as outmoded, expensive, boring, elitist, wasteful, and ideologically suspect. That suspicion shows exactly why liberal arts higher education meets a critical need for pause and reflection in an ideologically riven time. Being a thoughtful, humane person has never been easy.

What are the liberal arts? Many answers have been proposed that, following William Cronon's insight, come down to lists: essential facts, mandatory readings, curricular requirements, and so on. The original artes liberales were a medieval list of seven subjects that set male aristocrats off from everyone else. Corresponding modern lists of competencies are scarcely inspiring and carry the odor of the curricular committee: competencies in communication, major modes of thought (science, mathematics, arts, social sciences, etc.), cultural heritages, and the like. Nevertheless, Cronon cannot resist making another list, ten personal qualities which he associates with people who seem to embody the values of a liberal education (here only several):

  1. They listen and they hear;
  2. They read and they understand;
  3. They can talk with anyone;
  4. They can write clearly, persuasively, and movingly . . .
  7. They practice humility, tolerance, and self-criticism. . .
  10. They follow E.M. Forster’s injunction from Howards End: “Only connect . . .”

(The ones left out here are necessary as well: read Cronon's short essay.)

All of these personal qualities are well associated with the analog, "real" world, contra the critics who see "reality" only as an excuse for performative hard-headedness. Cronon:

A liberal education is not something any of us achieve; it is not a state.  Rather it is a way of living in the face of our own ignorance, a way of groping toward wisdom in full recognition of our own folly, a way of educating ourselves without any illusion that our educations will ever be complete.

Cronon then emphasizes that each of these qualities is also education for human community:

Each of the qualities I have described is a craft or a skill or a way of being in the world that frees us to act with greater knowledge or power. . . . [W]e need to confront one further paradox about liberal education. In the act of making us free, it also binds us to the communities that gave us our freedom in the first place; it makes us responsible to those communities in ways that limit our freedom. In the end, it turns out that liberty is not about thinking or saying or doing whatever we want. It is about exercising our freedom in such a way as to make a difference in the world and make a difference for more than just ourselves.

Cronon's "craft or a skill or a way of being in the world" returns to craeft: a way of living that bears the deep signatures of time, that nurtures the craeft of attention, humility, and human continuity. This is the real point of a liberal arts college education: it cannot be achieved or stated in four years, but is a life-long way of living in the face of our own knowledge and ignorance, wisdom and folly, and inevitably incomplete connection. Only connect: human freedom in the service of human community, human continuity.

Matthew Crawford proposes a fruitful metaphor for connection: the cultural jig. The metaphor's origins lie in carpentry: "a jig is a device or procedure that guides a repeated action by constraining the environment in such a way as to make the action go smoothly, the same each time, without his having to think about it." For example, a jig can guide a saw, so that a carpenter need not measure many boards individually. A cultural jig can contribute to personal character "that is built through habit, becoming a reliable pattern of responses to a variety of situations": such as (historically) thrift, parental authority, personal accountability. Tradition can be a robust cultural jig, fostering a community of practice in which real independence through interdependence does seem to become possible (though never guaranteed). The conundrum is that real freedom and self-mastery require some dependence on and mastery of cultural jigs, such as attentiveness. When a liberal arts education is effective, a student (or graduate) is never averse to seeking genuine cultural jigs to guard against folly, foibles, and simple ignorance.

This line of thinking can get very woolly very fast (and maybe already has). But it can have real-world consequences, such as careful thinking about what makes work meaningful. British business researchers Catherine Bailey and Adrian Madden have done research to discover what does, in fact, make work meaningful, and their conclusions bear a striking similarity to the outcomes of a liberal arts education.

How and why do people find their work meaningful? When:

  • when work matters to more people than just the workers themselves, or their bosses;
  • when it can mean engagement with mixed, uncomfortable, or even painful thoughts and feelings, not just euphoria or happiness, and a sense of coping with sometimes intractable challenges;
  • when meaningfulness in work is allowed to emerge in an episodic rather than scripted, sustained way, in moments that are not forced or managed, but “contain high levels of emotion and personal relevance, and thus become redolent of the symbolic meaningfulness of work;”
  • when workers have the time and space to become reflective, as meaningfulness can be realized over time rather than spontaneously or momentarily;
  • when the feeling of engagement is exactly that, a feeling of personal contribution in an organizational environment of tasks and responsibilities, formed in a narrative of engagement and satisfaction.

By contrast, work becomes meaningless when connections are broken: people from their values; leaders from subordinates; tasks from any real problem; decisions from people's better judgment; workers from personal relationships; safety from well-managed risk (exposure to personal harm). Organizations can cultivate meaningful work by stressing personal connections, well-delegated responsibilities, real-world problems, real-world beneficiaries, and recognized accomplishments.

Such meaningful work can become an occasion for the personal attributes that William Cronon identified in liberally educated individuals (above). These can well be extended, in the context of work, to build the practice of forgiveness, as Rogowski illumines it: "Screwing up is a given. Forgiveness is not. Unless you practice it." A liberal arts education is an ideal way to learn how to fail, since it is a given that one will fail, but also how to recover and go on. An education that teaches how to fail, however, is neither what a college admissions officer wants to represent, nor what the anxious parents of a high-school senior want to hear.

Only connect: human, working freedom in the service of a good greater than mere organizational continuity, a craeft that conserves and strengthens human continuity.  The imperative to connect is, in a vital sense, a powerful cultural jig.

But can you make a living doing this? The predominant narrative in American society now is that a liberal arts education is a frivolous waste or an elite privilege that prepares a young person poorly for the "real world" (always taken to mean: the grimly hyper-competitive world that the writer envisions for everyone else). Is a liberal arts education of little value in "today's world"?

A recent Mellon Foundation study answers no: a liberal arts education incurs costs and confers benefits on a par with pre-professional degrees. This nuanced study looks carefully at what constitutes a liberal arts education, noting that mere attendance at a liberal arts college is not always a good proxy. It tries to control for factors such as parental income and achievement, college selectivity, and typical incomes in various professions. It warns that observed correlations do not prove causation, since in any case no real control group is available for a meaningful (or even ethical) experiment. Nevertheless, "claims that liberal education is of little value because it does not lead to employment is clearly not supported by the existing data." At the same time: more work needs to be done.

All this seems a long way from digital minimalism, or is it? I am convinced that one of the best ways to counter simplistic digital maximalism (vaguely giving "people the power to build community and bring the world closer together," as Facebook has it) is exactly the craeft of connection that bears the deep signatures of time in human continuities, in a variety of media.

A liberal arts education is analog in a digital world: intentionally inconvenient as a strategy for identifying enduring value. The power of a "digital declutter" (Newport) is vastly increased when a liberal arts education can provide genuine alternatives and powerful cultural jigs: engagement with real people, over-arching, important questions, and intractable problems. Screwing up is a given. Forgiveness is not. Practicing craeft is a jig of attention, humility, forgiveness, failing and moving beyond failure, human continuity in a real community. Only connect.

Resources linked or mentioned in this blog entry:

Allen, Mike. “Sean Parker unloads on Facebook, ‘God only knows what it’s doing to our children’s brains,’” Axios, November 9, 2017

Bailey, Catherine, and Adrian Madden. “What Makes Work Meaningful—Or Meaningless.” MIT Sloan Management Review, Summer 2016.

Christensen, Clayton M., Michael E. Raynor, and Rory McDonald. "What Is Disruptive Innovation?" Harvard Business Review, December 2015.

Crawford, Matthew B.  Shop Class as Soulcraft: An Inquiry Into the Value of Work, Penguin Books, 2009,  especially pages 41-42.

Crawford, Matthew B. The World Beyond Your Head: On Becoming an Individual in an Age of Distraction. Farrar, Straus and Giroux, 2015, especially pages 8-20.

Cronon, William.  “’Only Connect’ . . . The Goals of a Liberal Education,” The American Scholar 67, Autumn 1998 (full-text available on his website)

Ginsberg, David, and Moira Burke. "Hard Questions: Is Spending Time on Social Media Bad for Us?" Facebook Newsroom, December 15, 2017.

Harris, Tristan. “How Technology is Hijacking Your Mind—from a Magician and Google Design Ethicist,” Medium: Thrive Global, May 18, 2016

Hill, Catherine B., and Elizabeth Davidson Pisacreta. The Economic Benefits and Costs of a Liberal Arts Education. An essay commissioned by The Andrew W. Mellon Foundation under the auspices of the Mellon Research Forum on the Value of Liberal Arts Education. The Foundation, 2019.

Langlands, Alexander.  Craeft: an Inquiry into the Origins and True Meaning of Traditional Crafts. New York: Norton, 2018.

Lanier, Jaron.  Who Owns the Future? New York: Simon and Schuster, 2013.

Lederman, Doug. "Clay Christensen, Doubling Down," Inside Higher Ed, April 28, 2017.

Lepore, Jill. “The Disruption Machine: What The Gospel of Innovation Gets Wrong,” New Yorker, June 23, 2014.

Lewis, Paul. “’Our Minds Can Be Hijacked’: the Tech Insiders Who Fear a Smartphone Dystopia,” The Guardian October 6, 2017

Porter, Michael. “The Five Competitive Forces That Shape Strategy.” Harvard Business Review, January 2008.

Raffaelli, Ryan, and Carmen Nobel. "How Independent Bookstores Have Thrived in Spite of Amazon.com," Harvard Business School Working Knowledge, November 20, 2017.

Raffaelli, Ryan. "Technology Reemergence: Creating New Markets for Old Technologies, Swiss Mechanical Watchmaking, 1970-2008," Administrative Science Quarterly, May 2018.

Rogowski, Gary. Handmade: Creative Focus in the Age of Distraction.  Fresno, California: Linden Publishing, 2017.

Sax, David. The Revenge of Analog: Real Things and Why They Matter. New York: PublicAffairs, 2016.

Sullivan, Andrew.  "I Used to Be a Human Being."  New York, September 2016

Thoreau, Henry David. Walden: A Fully Annotated Edition, edited by Jeffrey S. Cramer, New Haven: Yale University Press, 2004, especially the first chapter, “Economy”

Twenge, Jean M. “Have Smartphones Destroyed a Generation?” The Atlantic, September 2017.

Wu, Tim. The Attention Merchants: The Epic Scramble to Get Inside Our Heads. New York: Vintage Books, 2017, especially pages 11-17.

Zupan, Mark.  “Betting on (Non-Profit) Higher Education,” Rochester Democrat and Chronicle, February 28, 2019.

If you know Yewno, and if Yewno, exactly what do you know? That "exactly what" will likely contain machine-generated replications of problematic human biases.

This is the third of the "undiscovered summer reading" posts; see also the first and second.

At the recent Association of College and Research Libraries conference in Baltimore I came across Yewno, a search-engine-like discovery or exploration layer that I had heard about. I suspect that Yewno, or something like it, could be the "next big thing" in library and research services. I have served as a librarian long enough both to be very interested and to be wary at the same time: so many promises have been made by the commercial information technology sector, and the reality has fallen far short (remember the hype about discovery services?).

Yewno is a so-called search app; it "resembles a search engine --you use it to search for information, after all --but its structure is network-like rather than list-based, the way Google's is. The idea is to return search results that illustrate relationships between relevant sources," mapping them out graphically (like a mind map). Those words are quoted from Adrienne LaFrance's Atlantic article on the growing understanding of the Antikythera mechanism as an example of computer-assisted associative thinking (see, all these readings really come together). LaFrance traces the historical connections between "undiscovered public knowledge," Vannevar Bush's Memex machine in the epochal "As We May Think," and Yewno. The hope is that through an application such as Yewno, associations could be traced between ancient time-keeping, Babylonian and Arabic mathematics, medieval calendars, astronomy, astrological studies, ancient languages, and other realms of knowledge. At any rate, that's the big idea, and it's a good one.

So who is Yewno meant for, and what's it based on?

LaFrance notes that Yewno "was built primarily for academic researchers," but I'm not sure that's strictly true. When I visited the Yewno booth at ACRL, I thought several things at once: 1) this could be very cool; 2) this could actually be useful; 3) this is going to be expensive (though I have neither requested nor received a quote); and 4) someone will buy them, probably Google or another technology octopus. (Subsequent thought: where's Google's version of this?) I also thought that intelligence services and corporate intelligence advisory firms would be very, very interested --and indeed they are. Several weeks later I read Alice Meadows' post, "Do You Know About Yewno?" on the Scholarly Kitchen blog, and her comments put Yewno in clearer context. (Had I access to Yewno, I would have searched, "yewno.")

Yewno is a start-up venture by Ruggero Gramatica (if you're unclear, that's a person), a research strategist with a background in applied mathematics (Ph.D., King's College London) and an M.B.A. (University of Chicago). He is the first-named author of "Graph Theory Enables Drug Repurposing," a paper (DOI) in PLOS One that introduces:

a methodology to efficiently exploit natural-language expressed biomedical knowledge for repurposing existing drugs towards diseases for which they were not initially intended. Leveraging on developments in Computational Linguistics and Graph Theory, a methodology is defined to build a graph representation of knowledge, which is automatically analysed to discover hidden relations between any drug and any disease: these relations are specific paths among the biomedical entities of the graph, representing possible Modes of Action for any given pharmacological compound. We propose a measure for the likeliness of these paths based on a stochastic process on the graph.
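In outline, the method builds a graph of biomedical entities and then scores paths between a drug and a disease. Here is a minimal sketch of that idea, with invented entities and a uniform random-walk measure standing in for the paper's more sophisticated stochastic one:

```python
# Sketch: rank "hidden" drug-disease relations as paths in a tiny
# knowledge graph. Entities and edges are invented for illustration;
# the paper's actual likeliness measure is more sophisticated.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("aspirin", "COX-2"), ("COX-2", "inflammation"),
    ("aspirin", "platelets"), ("platelets", "inflammation"),
    ("platelets", "thrombosis"), ("inflammation", "colorectal cancer"),
])

def path_likeliness(graph, path):
    """Probability that a uniform random walk follows this exact path."""
    p = 1.0
    for node in path[:-1]:
        p /= graph.degree(node)  # each step picks a neighbor uniformly
    return p

paths = nx.all_simple_paths(G, "aspirin", "colorectal cancer", cutoff=4)
for path in sorted(paths, key=lambda q: -path_likeliness(G, q)):
    print(f"{path_likeliness(G, path):.3f}  {' -> '.join(path)}")
```

Paths through fewer, less-promiscuous intermediate entities score higher, which is one plausible way to surface the "hidden relations" the paper describes.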

Yewno does the same thing in other contexts:

an inference and discovery engine that has applications in a variety of fields such as financial, economics, biotech, legal, education and general knowledge search. Yewno offers an analytics capability that delivers better information and faster by ingesting a broad set of public and private data sources and, using its unique framework, finds inferences and connections. Yewno leverages on leading edge computational semantics, graph theoretical models as well as quantitative analytics to hunt for emerging signals across domains of unstructured data sources. (source: Ruggero Gramatica's LinkedIn profile)

This leads to several versions of Yewno: Yewno Discover, Yewno Finance, Yewno Life Sciences, and Yewno Unearth. Ruth Pickering, the company's co-founder and Chief Business Development and Strategy Officer, comments, "each vertical uses a specific set of ad-hoc machine learning based algorithms and content. The Yewno Unearth product sits across all verticals and can be applied to any content set in any domain of information." Don't bother calling the NSA --they already know all about it (and probably use it, as well).

Yewno Unearth is relevant to multiple functions of publishing: portfolio categorization, the ability to spot gaps in content, audience selection, editorial oversight and description, and other purposes for improving a publisher's position, both intellectually and in the information marketplace. So Yewno Discover is helpful for academics and researchers, but the whole of Yewno is also designed to relay more information about them to their editors, publishers, funders, and those who will in turn market publications to their libraries. Elsevier, Ebsco, and ProQuest will undoubtedly appear soon in librarians' offices with Yewno-derived information, and that encounter could likely prove to be truly intimidating. So Yewno might be a very good thing for a library, but not simply an unalloyed very good thing.

So what is Yewno really based on? The going gets more interesting.

Meadows notes that Yewno's underlying theory emerged from the field of complex systems, at the foundational level of econophysics, an inquiry "aimed at describing economic and financial cycles utilizing mathematical structures derived from physics." The mathematical framework, involving uncertainty, stochastic (random probability distribution) processes, and nonlinear dynamics, came to be applied to biology and drug discovery (hello, Big Pharma). This kind of information processing is described in detail in a review article, "Deep Learning," in Nature (vol. 521, 28 May 2015, doi:10.1038/nature14539). An outgrowth of machine learning, deep learning "allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction." Such deep learning "discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer." Such "deep convolutional nets" have brought about significant breakthroughs in processing images, video, and speech, while "recurrent nets" have brought new learning powers to "sequential data such as text and speech."

The article goes on in great detail, and I do not pretend I understand very much of it. Its discussion of recurrent neural networks (RNNs), however, is highly pertinent to libraries and discovery. The backpropagation algorithm is basically a process that adjusts the weights a model uses while that model is being trained. For example, RNNs "have been found to be very good at predicting the next character in the text," or the next word in a sequence, and by such backpropagation adjustments, machine language translations have achieved greater levels of accuracy. (But why not complete accuracy? Read on.) The process "is more compatible with the view that everyday reasoning involves many simultaneous analogies that each contribute plausibility to a conclusion." In their review's conclusion, the authors expect that "systems that use RNNs to understand sentences or whole documents will become much better when they learn strategies for selectively attending to one part at a time."
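The weight-adjusting loop at the heart of backpropagation can be shown in a few lines. Here is a minimal, hypothetical sketch with toy data and a single layer, where backpropagation reduces to one gradient step per pass; deep networks repeat this, layer by layer:

```python
# Sketch: the core of backpropagation on a tiny one-layer network,
# y = sigmoid(w . x). Data, sizes, and learning rate are invented.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                            # 100 toy examples
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)   # hidden rule to learn

w = np.zeros(3)
for step in range(500):
    pred = 1 / (1 + np.exp(-X @ w))   # forward pass: current predictions
    grad = X.T @ (pred - y) / len(y)  # backward pass: gradient of the loss
    w -= 0.5 * grad                   # adjust weights against the gradient
print("learned weights:", w.round(2))  # approaches the hidden rule's direction
```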

After all this, what do you know? Yewno presents the results of deep learning through recurrent neural networks that identify nonlinear concepts in a text, a kind of "knowledge." Hence Ruth Pickering can plausibly state:

Yewno's mission is "Knowledge Singularity" and by that we mean the day when knowledge, not information, is at everyone's fingertips. In the search and discovery space the problems that people face today are the overwhelming volume of information and the fact that sources are fragmented and dispersed. There's a great T.S. Eliot quote, "Where's the knowledge we lost in information," and that sums up the problem perfectly. (source: Meadows' post)

Ms. Pickering perhaps revealed more than she intended.  Her quotation from T.S. Eliot is found in a much larger and quite different context:

Endless invention, endless experiment,
Brings knowledge of motion, but not of stillness;
Knowledge of speech, but not of silence;
Knowledge of words, and ignorance of the Word.
All our knowledge brings us nearer to our ignorance,
All our ignorance brings us nearer to death,
But nearness to death no nearer to GOD.
Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?
The cycles of Heaven in twenty centuries
Bring us farther from GOD and nearer to the Dust. (Choruses from The Rock)

Eliot's interest is in the Life we have lost in living, and his religious and literary use of the word "knowledge" signals the puzzle at the very base of econophysics, machine learning, deep learning, and backpropagation algorithms. Deep learning performed by machines mimics what humans do, their forms of life. Pickering's "Knowledge Singularity" alludes to the semi-theological vision of Ray Kurzweil's millennialist "Singularity": a machine intelligence infinitely more powerful than all human intelligence combined. In other words, where Eliot is ultimately concerned with Wisdom, the Knowledge Singularity is ultimately concerned with Power. Power in the end means power over other people: otherwise it has no social meaning apart from simply more computing. Wisdom interrogates power, and questions its ideological supremacy.

For example, three researchers at the Center for Information Technology Policy at Princeton University have shown that "applying machine learning to ordinary human language results in human-like semantic biases" ("Semantics Derived Automatically from Language Corpora Contain Human-like Biases," Science, 14 April 2017, vol. 356, issue 6334: 183-186, doi:10.1126/science.aal4230). The results of their replication of a spectrum of known biases (measured by the Implicit Association Test) "indicate that text corpora contain recoverable and accurate imprints of our historic biases, whether morally neutral as towards insects or flowers, problematic as towards race and gender, or even simply veridical, reflecting the status quo distribution of gender with respect to careers or first names." Their approach holds "promise for identifying and addressing sources of bias in culture, including technology." The authors laconically conclude, "caution must be used in incorporating modules constructed via unsupervised machine learning into decision-making systems." Power resides in such decisions about other people, resources, and time.

Arvind Narayanan, who published the paper with Aylin Caliskan and Joanna J. Bryson, noted that "we have a situation where these artificial-intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from." The researchers built an experiment around GloVe, a word-embedding program developed at Stanford, replicating the Implicit Association Test with machine-learning representations of co-occurrent words and phrases. They turned it loose on roughly 840 billion words from the Web, looking for co-occurrences and associations of words such as "man, male" or "woman, female" with "programmer, engineer, scientist, nurse, teacher, librarian." They found familiar biases in the distributions of associations, biases that can "end up having pernicious, sexist effects."

For example, machine-learning programs can translate foreign languages into sentences that reflect or reinforce gender stereotypes. Turkish uses a gender-neutral third-person pronoun, "o." Plugged into the online translation service Google Translate, however, the Turkish sentences "o bir doktor" and "o bir hemşire" are translated into English as "he is a doctor" and "she is a nurse." . . . . "The biases that we studied in the paper are easy to overlook when designers are creating systems," Narayanan said. (Source: Princeton University, "Biased Bots," by Adam Hadhazy.)
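The mechanics behind such findings are simple to sketch. Here is a minimal, hypothetical version of a WEAT-style association test; the two-dimensional vectors are invented for illustration, where the real studies use GloVe embeddings with hundreds of dimensions trained on billions of words:

```python
# Sketch of a WEAT-style association test on toy 2-D "embeddings".
# Vectors are invented; real tests use trained GloVe/word2vec vectors.
import numpy as np

vec = {
    "he": np.array([1.0, 0.1]), "she": np.array([-1.0, 0.1]),
    "engineer": np.array([0.9, 0.5]), "nurse": np.array([-0.8, 0.6]),
}

def cos(a, b):
    """Cosine similarity: how strongly two word vectors point together."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

for word in ("engineer", "nurse"):
    # Positive: closer to "he"; negative: closer to "she".
    bias = cos(vec[word], vec["he"]) - cos(vec[word], vec["she"])
    print(f"{word}: he-vs-she association = {bias:+.2f}")
```

Because the vectors are distilled from human-written text, the asymmetry in these scores is precisely the "recoverable imprint" of historic bias the Princeton paper describes.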

Yewno is exactly such a system insofar as it mimics human forms of life, which include, alas, the reinforcement of biases and prejudice. So in the end, do you know Yewno, and if Yewno, exactly what do you know? That "exactly what" will likely contain machine-generated replications of problematic human biases. Machine translations will never offer perfect, complete translations of languages, because language is never complete: humans will always use it in new ways, with new shades of meaning and connotations of plausibility, because humans go on living in their innumerable, linguistic forms of life. Machines have to map language within language (here I include mathematics as a kind of language, with distinctive games and forms of life). No "Knowledge Singularity" can occur outside of language, because it will be made of language: but the ideology of "Singularity" can conceal its origins in many forms of life, and thus appear "natural," "inevitable," and "unstoppable."

The "Knowledge Singularity" will calcify bias and injustice in an everlasting status quo unless humans, no matter how comparatively deficient, resolve that knowledge is not a philosophical problem to be solved (such as in Karl Popper's Worlds 1, 2, and 3), but a puzzle to be wrestled with and contested in many human forms of life and language (Wittgenstein). Only by addressing human forms of life can we ever address the greater silence and the Life that we have lost in living.  What we cannot speak about, we must pass over in silence (Wovon man nicht sprechen kann, darüber muss man schweigen, sentence 7 of the Tractatus) --and that silence, contra both the positivist Vienna Circle and Karl Popper (who was never part of it) is the most important part of human living.  In the Tractatus Wittengenstein dreamt, as it were, a conclusive solution to the puzzle of language --but such a solution can only be found in the silence beyond strict logical (or machine) forms: a silence of the religious quest beyond the ethical dilemma (Kierkegaard).

This journey through my "undiscovered summer reading," from the Antikythera mechanism to the alleged "Knowledge Singularity," has reinforced my daily, functional belief that knowing is truly something that humans do within language and through language, and that the quest which makes human life human is careful attention to the forms of human life, and to the way that language, mathematics, and silence are woven into and through those forms. The techno-solutionism inherent in educational technology and library information technology, no matter how sophisticated, cannot undo the basic puzzle of human life: how do we, individually and socially, find the world? (Find: in the sense of locating, of discovering, and of characterizing.) Yewno will not lead to a Knowledge Singularity, but to derived bias and reproduced injustice, unless we acknowledge its limitations within language.

The promise of educational and information technology becomes more powerful when approached with modesty: there are no quick, technological solutions to puzzles of education, of finance, of information discovery, of "undiscovered public knowledge." What those of us who are existentially involved with the much-maligned, greatly misunderstood, and routinely dismissed "liberal arts" can contribute is exactly what makes those technologies humane: a sense of modesty, proportion, generosity, and silence. Even to remember those at this present moment is a profoundly counter-cultural act, a resistance to the techno-ideology of unconscious bias and entrenched injustice.

Library work is inherently collaborative: even solo librarians aren’t really solo, but depend on the work of librarians elsewhere. The collaboration of learner and teacher can be deep work, even when that teacher is not formally a classroom instructor. Cal Newport's book pertinently describes and advocates for deep work.

I've been reading Cal Newport's new book Deep Work: Rules for Focused Success in a Distracted World (Grand Central/Hachette, 2016), and it is challenging and invigorating. As a historian of Christianity, I find that much of what he says resonates strongly with writings from religious communities of varying types: those Benedictines who work outside the cloister, Dominicans, Jesuits, and the Society of St. John the Evangelist, for example.

I realize those are all very different emphases of Christian spirituality. In common, however, is a desire to find balance between the active and contemplative life: neither simply to leave the "shallow" world absolutely (in contrast with, for example, Carthusians), nor simply to surrender any meaningful deep work and wonder. Newport writes (briefly) about honing a skill with craft (for example, wheelwrights, blacksmiths, coders, or teachers) and the connection of meaningful, skilled work with the sacred: the world of luminous, shining, wonderful things. He speaks from an intellectual background formed by Matthew Crawford's Shop Class as Soulcraft, Winifred Gallagher's Rapt: Attention and the Focused Life, and Hubert Dreyfus and Sean Dorrance Kelly's All Things Shining.

I’m not yet finished with Newport’s book, but I’m an engaged reader, and with such books I take my time.  Newport has reminded me vividly of the first professional library job I had (at Drew University, 1986-1992), when computers were coming into academic libraries, but e-mail, the Internet (not yet graphical), and the culture of distracted busyness were in an early stage, compared to the present.  Working with less distraction, I did in fact get more done, and more happily —one reason that I remember that job as perhaps the most satisfactory job I have had as a librarian.

I have heard it said that as academic librarians, "our interruptions are our business," and that may be true when fielding requests for help from our students and faculty. But they're asking for help less than they used to, and the days of the reference question that ends with a verification of fact are long past. Now questions have much more to do with process: how do I use this database? How do I cite this in APA? How can I tell if an article is really peer-reviewed? (just to cite facile examples). Academic librarians must admit, I believe, that the principal interruptions we endure most days come from each other: the relentless stream of e-mail, and the distractions of social and news media.

In January I heard Jim Honan of Harvard’s Graduate School of Education reflect on a phrase he took from a librarian in New York State, “Our data does not do justice to our story.”  What is our story as a library, what is our value proposition: how does what we do matter, to whom, and how do we do it?  Responsible and apt answers to those questions have to go beyond the shallow work of day-to-day institutional librarianship to the deep work of the field.

Do academic librarians have “deep work” to do, or is it all in the shallows?  Newport defines deep work (page 3):

Deep work: Professional activities performed in a state of distraction-free concentration that push your cognitive capabilities to their limit.  These efforts create new value, improve your skill, and are hard to replicate.

Do academic librarians do any of that?  I must answer yes —but in metaphors or images that differ from the kinds of deep work that Newport seems to presuppose as a computer scientist and mathematician (his work concentrates on distributed algorithms, designed to work through and among interconnected processors).

Librarians fundamentally connect learners (inquirers) to sources of information and knowledge —learners who are taking responsibility for their own learning.  As such, a learning-centered library is necessarily a polymorphous, polyglot, multifocal place (physical or digital, or both at once).  The new value that librarians create (to use Newport’s words) will reside in the minds of those inquirers with whom the librarians interact.

The value proposition of libraries ultimately lies in improving the skill of independent learners: their ability to set their own terms and extent for learning, and to take responsibility for what they know and want to know —both knowing cognitively and knowing how to do.

The strategies librarians employ —how they are going to do this— involve interactions both with learners and with intellectual resources and tools.  This is the truth behind David Lankes’ contention that “a room full of books is simply a closet, but that an empty room with a librarian in it is a library.”  A library is fundamentally what librarians do more than what they have.  The academic public, of course, usually sees it the other way around.

Academic librarianship suffers in spades from the major distractions that impede deep work (pages 53 et seq.).  Newport’s “metric black hole” afflicts most of the field: not only is it nearly impossible to measure what makes an academic librarian effective or distracted, it is hard to measure the impact of this professional work in the first place.  The ACRL has undertaken significant initiatives to show the value of academic libraries, but none of the strategies or paths so far is completely persuasive.

This difficulty with metrics leads to following the paths of least resistance: absent the clear and compelling evaluative mechanism of a bottom line (or other metric), librarians (among other workers) tend to choose the behavior that is easiest, and easiest to rationalize, at the time.  Being a librarian comes to mean doing what other librarians do, even if that is not very deep; and how would you know, anyway?

Hence, in the absence of clear indicators of what it really means to be valuable and productive, librarians, like other workers, can make busyness a proxy for productivity: do lots of stuff in a visible manner (hey! look at us over here in the library!).  So it has to be valuable and productive, right?

I haven’t yet finished the book, so I don’t want to give the appearance of reviewing it.  I do have a question for Newport, however.  Is his operational concept of deep work in fact overly determined by the kind of deep work he does as a computer scientist?  If he were a linguist, a psychologist, or a performing artist, would he have written the book differently?

By no means do I wish to trivialize his work (either his computer science or this book).  Newport’s leading example of deep work is Carl Jung and the tower he built in the rural village of Bollingen: a two-story stone house with a private study (not very far from Zurich).  Jung would go there to write undistracted, away from his busy practice, family, and cafe life in Zurich.  Without question “the Tower” was crucial for Jung’s thinking and writing, producing the remarkable insights and books that not only took on Sigmund Freud but changed depth psychology and real people’s lives.  His work there was “deep” in every sense.

Newport tends by implication to characterize Jung’s work in Zurich, by contrast, as shallow.  To be fair, Newport sympathetically and consistently characterizes shallow work as significant and unavoidable —the everyday work of professional duties and communications that require attention, but not deep engagement.  Still, in his several examples of Richard Feynman (physics), Adam Grant (business and work behaviors), or Rick Furrer (blacksmith), Newport associates deep work strongly with isolation and often solitary craft, whether of steel, wood, or words (writing), and shallow work with all the other stuff.

Yet much of Jung’s work in Zurich was anything but shallow.  His numerous cases show up all over his writings, and his deep analyst-analysand encounters inform every page.  His challenge to Freud and the Freudians required not only courage and persistence but skill —a skill that cannot be characterized as “shallow” in any sense.  Newport never explicitly characterizes it as shallow, but the implication remains strong; and while he writes explicitly, “don’t work alone,” he runs into conceptual and definitional difficulties when associating deep work with collaboration.  Although Newport describes Jung’s pattern as “bimodal,” his description cannot help but privilege the deep over the “shallow,” even though without Zurich there would have been no Bollingen (and vice versa).  Is it not possible that in each place Jung was engaging in differing and distinct Gestalten, or formulations, of deep work?

How does this pertain to librarians? Library work is inherently collaborative: even solo librarians aren’t really solo, but depend on the work of librarians elsewhere.  The collaboration of learner and teacher can be deep work, even when that teacher is not formally a classroom instructor.

Newport’s concept of deep work is not flawed, but it needs to be broadened and adjusted for other lines and metaphors of work —I’m thinking of librarians and parish clergy, the lines of work that I know personally and best (there are many others, of course).  Such adjustments cannot, and must not, detract from clarity or pertinence.  Librarians almost certainly do spend too much time on e-mail and on connectivity of fairly trivial sorts —for example, the recent rush for librarians to tweet their work, even though the very medium of Twitter tends to trivialize it.  It is very easy for librarians to mistake busyness for productivity.

Telling the library’s story and showing its value proposition and strategy can be deep work.  Deep work requires librarians not to confuse busyness with productivity, and not to follow the safe paths of least resistance and sheer habit.  Librarianship is a craft, a service to both the living and the dead, collaborating with both learners and resources.  It can be a variety of soulcraft.  (I never forget that I hold a degree from Columbia University’s School of Library Service.)  Clearing the mind for this deep work does in fact afford a glimpse of the sacred trust of learners, traditions, and change.  Newport’s book gives librarians’ deep work a robust boost, a clarion recall to mental clarity.  I am privileged and happy to be able to continue reading it.