

My previous post took a brief look at the religious ideas that permeate Jaron Lanier's Who Owns The Future? (whether he explicitly acknowledges those ideas or not). This post considers what he has to say about books, and about the future of books.

(Lanier, author of Who Owns The Future?, appears on the SHU campus on Wednesday, October 9, at 7:00 p.m. in the Schine Auditorium.)

Books have become a cultural flash point that inspires "maniacal scheming" (see pages 352-360) --an unwitting testament to books' enduring iconic, cultural power.  What bothers Lanier is that the current development of networks --the Siren Servers that seek total computational awareness and control-- might lead to losing "the pattern of what a book is in the stream of human life and thought." (353)  After sketching some possible future scenarios about the fate of books, authors, and readers, Lanier offers a definition (one of the very best I have ever read):

A book isn't an artifact, but a synthesis of fully realized individual personhood with human continuity. The economic model of our networks has to be optimized to preserve that synthesis, or it will not serve [hu]mankind.(358)

Lanier here touches upon the emotional salience and cultural power that books evoke.  The uneasiness Lanier shares with many is not just about texts (tomes, bindings), but about human lives.  "Human life is its own purpose," he continues.  "Thinking about people in terms of the components on a network is--in intellectual and spiritual terms--a slow suicide for the researchers and a slow homicide against everyone else."(360)  The ingestion of millions of e-texts into Artificial Intelligence divorces what humans write about from who they are, and from what makes their lives meaningful to them.  "Whether we will destroy culture in order to save/digitize it is still unknown."(353) (Lanier traces that metaphor back to the Vietnam War.)

What makes a liberal education liberal--freeing--is the strong association (synthesis) of particular texts with particular people, moments, events, movements, points of view.  The real intellectual problem with Wikipedia isn't its alleged accuracy or inaccuracy. Rather, it "proposes that knowledge can be divorced from point of view." Note that Lanier writes knowledge --not data, not information, not the "flashes of thought" that might be "inserted meaningfully into a shared semantic structure" (ibid.).  Knowledge is what humans make for other humans.  Strictly speaking, computers can store, locate, index, and transform data, but they can't know in the same sense.

These are my own thoughts, sourced in Lanier's text, which I found enormously helpful in articulating how fundamentally the business model of a library differs from that of a database, even a sort of meta-database (a database of databases --a discovery service, in other words).  What libraries are about is the discovery of knowledge in human communities and continuities, in a symmetrical transaction that celebrates unanswered questions (intellectual risk) and acknowledges the presence of other sources of knowledge --whether living persons, libraries, databases, search engines, or other human syntheses of any and every kind.

This transaction (process, pedagogy) thrusts libraries into an educational process squarely at odds with Siren Servers, which are narcissistic (as though they alone collect data), risk-externalizing (questions and uncertainties never belong to the Server, always to the user), and dependent upon extreme information asymmetry --users can't know what the Server already knows about them, or how it seeks to modify their behavior.

Understanding the cultural term "book" as "a synthesis of fully realized individual personhood with human continuity" respects authors, readers, and the economic and intellectual chain of power and responsibility that connects them.  This also illuminates why some (many?) people care so passionately about books --what they care about is human continuity, personhood, what makes a human life worth living.  What better question could a liberal arts education pursue?  What could be more "relevant" to the challenges of living in a "flat," networked world?


Jaron Lanier's book Who Owns The Future? is (in the words of one of my beloved college professors) "quite a read." (Lanier visits SHU on October 9, 2013.)  It's a wild, occasionally bumpy ride through simultaneous developments in technology, economy, and social thinking occasioned by the massive computing power ("big data") of arrays of servers.  When such an array achieves dominance in such a manner that it can aspire to omniscience, Lanier calls it a "siren server" --a software-mediated social vision that believes it's the only game in town, marked by radical information asymmetry, and bent on outsourcing risk as much as possible.

There are so many elements of this book worth considering that I will pick out several, but one at a time. This piece concerns the religious elements of the social vision that advanced software engineers (called by synecdoche "Silicon Valley") seek to monetize and to compel society to accept.

Lanier does take this on in his fifth interlude, "The Wise Old Man in the Clouds," with the double meaning intact. Silicon Valley (or at least, many there) anticipates The Singularity, when software comes to write itself and computers outstrip human intelligence, when the memories, emotions, and thoughts of an individual can be uploaded into "the cloud" so that when the body dies, the person lives on --when illness, death, scarcity (want), and human limitations of every kind are overcome (pp. 325-331), when robots can provide satisfying sex.  (Really! --see pages 359-360.)

As a technologist, Lanier (who is also a philosopher) wants to skitter away from religious questions.  Speaking for technologists, he writes, "We serve people best when we keep our religious ideas out of our work" (p. 194) --and yet this book is shot through with religious sensibility and ideas, including the non-traditional human development ideas famous in the Bay Area.  The questions of limits and Ultimate Concern, of human closed-in awareness and the unexpected in-breaking of The Other, keep returning.  No one has yet successfully addressed these questions as regards technologists.  (And by "successfully," I don't mean that I would agree with that writer, but that such a writer both acknowledges these questions and moves discourse forward.)

Lanier makes frequent reference to visionaries (H.G. Wells, Alan Turing, Ted Nelson), philosophers (Aristotle, Hobbes, Malthus, Marx), and science fiction writers or characters (Philip K. Dick, Dr. Strangelove, Star Trek, especially the original TV series).  All of these raised questions broadly classed as theological or religious --although apparently in Silicon Valley "religious" means such Concern as narrowly defined by California-flavored evangelicalism, western Mormon sensibilities (whether orthodox LDS or not), and the spectre of fundamentalisms of every stripe.  ("Spiritual" is a very different word, suggesting all the happy feelings of Eastern philosophies mixed in with self-affirming slogans.) No wonder Lanier wants technologists to keep religious ideas out of their work.

But does "religion" have to be defined that way?  (The fact that violence-prone religious fundamentalists share a small bit of thinking in common with "religion" makes other religious people guilty of crimes against humanity to the same but narrowly limited extent that chemists are guilty because they share a small amount of thinking with DKFarben, makers of the poisons used by the Nazis.)  The Silicon Valley amounts to being the paradigm of, among other things, "spiritual but not religious."  But that's a feint, simply deflecting attention.

On the one hand, "what does it mean to be human" (which Lanier re-phrases as whether people are "special," p. 196) is not a technological question and can't be answered within those terms and limitations.  On the other hand, those terms and limitations beg that question.  The adjustment of software and information to reality is imperfect --reality consistently outstrips human ability to encode it.  (For all the hype that information lies at the heart of the universe --DNA encoding, for example-- it takes humans to translate that reality into symbols or code.)  The religious and philosophical questions raised by massive "cloud" computing are inescapable, and only a resolute will to face them for what they are can sing a song over against the Sirens strong enough to modify their behavior.