The Usefulness of Useless Knowledge

The Usefulness of Useless Knowledge, by Abraham Flexner, with a companion essay by Robbert Dijkgraaf. Princeton: Princeton University Press, 2017. 93 pages. ISBN 9780691174761. Sacred Heart University Library: Q126.8 .F54, 2017

This little book is a classic –it appeared as an essay in Harper's in 1939, and achieved iconic status during the war because of Flexner's association with Albert Einstein and the Institute for Advanced Study in Princeton. It is still worth reading, especially now with a companion essay by Robbert Dijkgraaf, Leon Levy Professor at the Institute, a mathematical physicist and string theorist.

The entire point of the essay is that useless knowledge is not useful yet –with all things, it is a matter of time. The period 1939-1945 was a spectacular demonstration of this truth: in 1939 an obscure article, "The Mechanism of Nuclear Fission," appeared in Physical Review on the exact day that World War II began in Europe (September 1). The previous year Alan Turing had completed his Ph.D. dissertation, Systems of Logic Based on Ordinals, at Princeton University, and Institute mathematician John von Neumann wanted to hire him as a postdoctoral assistant, but Turing returned to England. By 1945 esoteric nuclear fission had resulted in two atomic bombs, as well as Turing's Bombe, an electromechanical computer to decipher the German Enigma code, made possible by logic based on ordinal numbers. In both cases that useless knowledge did not remain useless very long.

Flexner freely admits the waste, and appearance of waste, of a great deal of speculation and experiment, because exactly where the next critical insight will arise is never clear and cannot be predicted. "To be sure, we shall waste some precious dollars." It "looks prodigious. It is not really so. All the waste that could be summed up in developing the science of bacteriology is as nothing compared to the advantages which have accrued from the discoveries of Pasteur" and many others. "Science, like the Mississippi, begins in a tiny rivulet in the distant forest. Gradually other streams swell its volume. And the roaring river that bursts the dikes is formed from countless sources."

The critical factor for Flexner (and for us!) is spiritual and intellectual freedom, and a narrow market ideology can threaten this as surely as any other. "An institution which sets free successive generations of human souls is amply justified whether or not this graduate or that makes a so-called useful contribution to human knowledge." Flexner is deeply aware that a market-driven economy can crowd out exactly what nourishes the market. In 1939 Flexner's reflection has "a peculiar poignancy. In certain large areas –Germany and Italy especially– the effort is now being made to clamp down the freedom of the human spirit." Mutatis mutandis, now we see this spirit alive in China, Hungary, Russia, and perhaps in the utter refusal of science and truth by some in the United States. The real enemy is the person "who tries to mold the human spirit so that it will not dare to spread its wings, as its wings were once spread in Italy and Germany, as well as in Great Britain and the United States." What goes around comes around, especially fear as practiced by some politicians.

Dijkgraaf writes in the prologue, "Supporting applied and not-yet-applied research is not just smart, but a social imperative." And yet "our current research climate" is increasingly governed by imperfect "metrics" and policies:

Driven by an ever-deepening lack of funding, against a background of economic uncertainty, global political turmoil, and ever-shortening time cycles, research criteria are becoming dangerously skewed toward conservative short-term goals that may address more immediate problems but miss out on the huge advances that human imagination can bring in the long term.

Nicholas Carr made the case in 2010 that as the Internet encourages rapid, distracted sampling of small bits of often unconnected information, humans are losing capacity for reflection and concentration (a point also made abundantly by Cal Newport’s 2016 Deep Work). Research skewed by current factors of money, turmoil, and the refusal of truth will miss engagement with deep questions –and remain in the shallows without even the awareness that depths exist.

What does this have to do with a private teaching university? We certainly have no funds for research and little time away from the metered productivity of publication, teaching, and departmental governance. Just those priorities can inadvertently obscure the truth that no one really knows where the next scientific discovery, cultural insight, or social movement will come from. Here is as good as anywhere.

The point of Flexner's essay still holds: major advances invariably come from the most obscure corners. Who knew that nearly incomprehensible physics papers by a Swiss patent office worker would still be cited and proven correct more than a century later? We sell our students short if we cave in to pressure simply to prepare them for a job at the neglect of their minds and their spirits. Do our skewed metrics just get in the way? Will they learn the deep respect for truth, and that truth is possible, that is the basis of any real thinking?

Review and recommendation: Timothy Snyder’s On Tyranny: Twenty Lessons from the Twentieth Century.

On Tyranny: Twenty Lessons from the Twentieth Century, by Timothy Snyder. Tim Duggan Books (Crown), 2017. 126 p. ISBN 978-0804190114. List price $8.99

Yale University professor Timothy Snyder has spent a long time learning the languages, reading the documents, exploring the archives, and listening to witnesses of the totalitarian tyrannies of Europe in the last century –particularly of Nazi Germany and the Stalinist Soviet Union. His scholarship bore particular fruit in books such as Bloodlands: Europe between Hitler and Stalin, and Black Earth: The Holocaust as History and Warning. He came to recognize that certain characteristics in the development of those tyrannies are present in the world today, and in the United States. This book is no partisan screed: Snyder recognizes in the 45th President features he knows from other contexts; those other contexts underscore the drift towards totalitarianism apparent from Russia to Europe to the USA. On Tyranny is not only about an American moment, but about a worldwide one.

This short book consists of a brief introduction, twenty short chapters, and an epilogue. Each chapter directs an action, such as no. 12, "Make eye contact and small talk," followed by a historical example or an expansion of the point. All the actions can be undertaken or performed in daily life; there is no grand theory here.

In place of a grand theory, there is a fundamental point: respect and value facts, truth, and accurate usage of our common language. In Moment (magazine), he explained: "Once you say that there isn't truth and you try to undermine the people whose job it is to tell the truth, such as journalists, you make democracy impossible." He told Bill Maher (at 2:02) that while "post-fact" postmodernism might connote "Berkeley, baguettes, and France and nice things," it more likely means that "every day doesn't matter; details don't matter; facts don't matter; all that matters is the message, the leader, the myth, the totality" –a condition of Europe in the 1920s. Such disdain for the truth goes hand-in-hand with conspiracy theories that assign blame to a group accused of undermining the purity of the majority. "Rather than facing up to the fact that life is hard and that globalization presents challenges, you name and blame people and groups who you say are at fault." Jews, Mexicans, Muslims, Rohingya, Tutsis, Hutus, globalists, evolutionists, or any other "outsider." The myth: "Make [fill in the blank] great again."

A librarian or researcher might particularly resonate with Snyder's directions "Be kind to our language," "Believe in truth," and "Investigate" (lessons 9-11). These are all ways to prepare to "be calm when the unthinkable arrives" (lesson 18) –when a leader exploits a catastrophic event to urge followers to trade freedom for security, and suspends the rule of law. The Chief Executive may or may not attempt to stage a coup; American democracy survived the dark moment after Charlottesville. Snyder told Salon in August, "We are hanging by our teeth to the rule of law. That was my judgment at the beginning of his presidency and it is still my judgment now. The rule of law is what gives us a chance to rebuild the system after this is all done."

Whether or not current politics result in tyranny and oppression is still (at this writing) an open question. The importance of Snyder's book is that it points beyond this moment to the wider trends and challenges of a world which is global (like it or not), connected (like it or not), and dependent on both our natural climates and our accrued, hard-won cultural heritages. A University founded on "a rigorous and interdisciplinary search for truth and wisdom" that "forms the cornerstone of all University life and welcomes people from all faiths and cultures" cannot leave our students unprepared. "In order to make history, young Americans will have to know some" (p. 126). Will that be the twenty-first lesson on tyranny from the twenty-first century?

–Gavin Ferriby

Do You Know Yewno, and If Yewno, Exactly What Do You Know?

This is the third of my "undiscovered summer reading" posts; see also the first and second.

At the recent Association of College and Research Libraries conference in Baltimore I came across Yewno, a search-engine-like discovery or exploration layer that I had heard about. I suspect that Yewno, or something like it, could be the "next big thing" in library and research services. I have served as a librarian long enough both to be very interested and to be wary at the same time –so many promises have been made by the commercial information technology sector, and the reality has fallen far short. Remember the hype about discovery services?

Yewno is a so-called search app; it "resembles a search engine –you use it to search for information, after all– but its structure is network-like rather than list-based, the way Google's is. The idea is to return search results that illustrate relationships between relevant sources" –mapping them out graphically (like a mind map). Those words are quoted from Adrienne LaFrance's Atlantic article on the growing understanding of the Antikythera mechanism as an example of computer-assisted associative thinking (see, all these readings really do come together). LaFrance traces the historical connections between "undiscovered public knowledge," Vannevar Bush's Memex machine in the epochal As We May Think, and Yewno. The hope is that through use of an application such as Yewno, associations could be traced between ancient time-keeping, Babylonian and Arabic mathematics, medieval calendars, astronomy, astrological studies, ancient languages, and other realms of knowledge. At any rate, that's the big idea, and it's a good one.

So who is Yewno meant for, and what's it based on?

LaFrance notes that Yewno "was built primarily for academic researchers," but I'm not sure that's strictly true. When I visited the Yewno booth at ACRL, I thought several things at once: 1) this could be very cool; 2) this could actually be useful; 3) this is going to be expensive (though I have neither requested nor received a quote); and 4) someone will buy them, probably Google or another technology octopus. (Subsequent thought: where's Google's version of this?) I also thought that intelligence services and corporate intelligence advisory firms would be very, very interested –and indeed they are. Several weeks later I read Alice Meadows' post, "Do You Know About Yewno?" on the Scholarly Kitchen blog, and her comments put Yewno in clearer context. (Had I access to Yewno, I would have searched "yewno.")

Yewno is a start-up venture by Ruggero Gramatica (if you're unclear, that's a person), a research strategist with a background in applied mathematics (Ph.D., King's College London) and an M.B.A. (University of Chicago). He is the first-named author of "Graph Theory Enables Drug Repurposing," a paper (DOI) in PLOS ONE that introduces:

a methodology to efficiently exploit natural-language expressed biomedical knowledge for repurposing existing drugs towards diseases for which they were not initially intended. Leveraging on developments in Computational Linguistics and Graph Theory, a methodology is defined to build a graph representation of knowledge, which is automatically analysed to discover hidden relations between any drug and any disease: these relations are specific paths among the biomedical entities of the graph, representing possible Modes of Action for any given pharmacological compound. We propose a measure for the likeliness of these paths based on a stochastic process on the graph.
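The abstract's "specific paths among the biomedical entities of the graph" can be illustrated with a short sketch. The toy graph below is my own invention (the drug, targets, and conditions are illustrative only, and the paper's stochastic scoring of paths is omitted); a simple breadth-first search is enough to surface hidden drug-to-disease relations.

```python
from collections import deque

# Toy knowledge graph: nodes are biomedical entities, edges are relations
# extracted from text. All names here are illustrative, not from the paper.
graph = {
    "aspirin": ["COX-1", "COX-2"],
    "COX-2": ["inflammation", "colorectal cancer"],
    "COX-1": ["platelet aggregation"],
    "platelet aggregation": ["thrombosis"],
}

def find_paths(graph, start, goal, max_len=4):
    """Breadth-first search for all short paths linking two entities."""
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            paths.append(path)        # a candidate "mode of action"
            continue
        if len(path) >= max_len:
            continue
        for neighbor in graph.get(path[-1], []):
            if neighbor not in path:  # avoid revisiting entities
                queue.append(path + [neighbor])
    return paths

print(find_paths(graph, "aspirin", "thrombosis"))
```

Each returned path is one candidate explanation of how a drug might act on a disease; the paper's contribution is a stochastic measure that ranks such paths by likeliness rather than merely enumerating them.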

Yewno does the same thing in other contexts:

an inference and discovery engine that has applications in a variety of fields such as financial, economics, biotech, legal, education and general knowledge search. Yewno offers an analytics capability that delivers better information and faster by ingesting a broad set of public and private data sources and, using its unique framework, finds inferences and connections. Yewno leverages on leading edge computational semantics, graph theoretical models as well as quantitative analytics to hunt for emerging signals across domains of unstructured data sources. (source: Ruggero Gramatica's LinkedIn profile)

This leads to several versions of Yewno: Yewno Discover, Yewno Finance, Yewno Life Sciences, and Yewno Unearth. Ruth Pickering, the company's co-founder and Chief Business Development & Strategy Officer, comments, "each vertical uses a specific set of ad-hoc machine learning based algorithms and content. The Yewno Unearth product sits across all verticals and can be applied to any content set in any domain of information." Don't bother calling the NSA –they already know all about it (and probably use it, as well).

Yewno Unearth is relevant to multiple functions of publishing: portfolio categorization, the ability to spot gaps in content, audience selection, editorial oversight and description, and other purposes for improving a publisher's position, both intellectually and in the information marketplace. So Yewno Discover is helpful for academics and researchers, but the whole of Yewno is also designed to relay more information about them to their editors, publishers, funders, and those who will in turn market publications to their libraries. Elsevier, Ebsco, and ProQuest will undoubtedly appear soon in librarians' offices with Yewno-derived information, and that encounter could well prove to be truly intimidating. So Yewno might be a very good thing for a library, but not simply an unalloyed very good thing.

So what is Yewno really based on? The going gets more interesting.

Meadows notes that Yewno's underlying theory emerged from the field of complex systems, at the foundational level of econophysics, an inquiry "aimed at describing economic and financial cycles utilizing mathematical structures derived from physics." The mathematical framework, involving uncertainty, stochastic (random probability distribution) processes, and nonlinear dynamics, came to be applied to biology and drug discovery (hello, Big Pharma). This kind of information processing is described in detail in a review article, "Deep Learning," in Nature (vol. 521, 28 May 2015, doi:10.1038/nature14539). An outgrowth of machine learning, deep learning "allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction." Such deep learning "discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer." Such "deep convolutional nets" have brought about significant breakthroughs in processing images, video, and speech, while "recurrent nets" have brought new learning powers to "sequential data such as text and speech."

The article goes on in great detail, and I do not pretend I understand very much of it. Its discussion of recurrent neural networks (RNNs), however, is highly pertinent to libraries and discovery. The backpropagation algorithm is basically a process that adjusts the weights used in machine analysis while that analysis is taking place. For example, RNNs "have been found to be very good at predicting the next character in the text," or the next word in a sequence, and by such backpropagation adjustments machine language translations have achieved greater levels of accuracy. (But why not complete accuracy? –read on.) The process "is more compatible with the view that everyday reasoning involves many simultaneous analogies that each contribute plausibility to a conclusion." In their conclusion, the authors expect that "systems that use RNNs to understand sentences or whole documents will become much better when they learn strategies for selectively attending to one part at a time."
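To make that "weight adjustment" concrete, here is a deliberately tiny sketch (mine, not the article's): a hypothetical one-weight linear "network" trained by the same error-gradient logic that backpropagation extends through many layers.

```python
# Toy illustration of gradient-descent weight adjustment:
# a single "neuron" y = w * x, trained to fit the rule y = 3x.
def train(samples, lr=0.1, epochs=50):
    w = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = w * x                     # forward pass: compute output
            grad = 2 * (y - target) * x   # gradient of squared error w.r.t. w
            w -= lr * grad                # nudge the weight against the gradient
    return w

w = train([(1.0, 3.0), (2.0, 6.0)])
print(round(w, 3))  # converges near 3.0
```

Backpropagation applies this same chain-rule update simultaneously to millions of weights, layer by layer; RNNs additionally carry it backward through time across a sequence.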

After all this, what do you know? Yewno presents the results of deep learning through recurrent neural networks that identify nonlinear concepts in a text, a kind of "knowledge." Hence Ruth Pickering can plausibly state:

Yewno's mission is "Knowledge Singularity" and by that we mean the day when knowledge, not information, is at everyone's fingertips. In the search and discovery space the problems that people face today are the overwhelming volume of information and the fact that sources are fragmented and dispersed. There's a great T.S. Eliot quote, "Where's the knowledge we lost in information," and that sums up the problem perfectly. (source: Meadows' post)

Ms. Pickering perhaps revealed more than she intended.  Her quotation from T.S. Eliot is found in a much larger and quite different context:

Endless invention, endless experiment,
Brings knowledge of motion, but not of stillness;
Knowledge of speech, but not of silence;
Knowledge of words, and ignorance of the Word.
All our knowledge brings us nearer to our ignorance,
All our ignorance brings us nearer to death,
But nearness to death no nearer to GOD.
Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?
The cycles of Heaven in twenty centuries
Bring us farther from GOD and nearer to the Dust. (Choruses from The Rock)

Eliot's interest is in the Life we have lost in living, and his religious and literary use of the word "knowledge" signals the puzzle at the very base of econophysics, machine learning, deep learning, and backpropagation algorithms. Deep learning performed by machines mimics what humans do, their forms of life. Pickering's "Knowledge Singularity" alludes to the semi-theological vision of Ray Kurzweil's millennialist "Singularity": a machine intelligence infinitely more powerful than all human intelligence combined. In other words, where Eliot is ultimately concerned with Wisdom, the Knowledge Singularity is ultimately concerned with Power. Power in the end means power over other people: otherwise it has no social meaning apart from simply more computing. Wisdom interrogates power, and questions its ideological supremacy.

For example, three researchers at the Center for Information Technology Policy at Princeton University have shown that "applying machine learning to ordinary human language results in human-like semantic biases" ("Semantics derived automatically from language corpora contain human-like biases," Science, 14 April 2017, vol. 356, issue 6334: 183-186, doi:10.1126/science.aal4230). The results of their replication of a spectrum of known biases (measured by the Implicit Association Test) "indicate that text corpora contain recoverable and accurate imprints of our historic biases, whether morally neutral as toward insects or flowers, problematic as toward race or gender, or even simply veridical, reflecting the status quo distribution of gender with respect to careers or first names." Their approach holds "promise for identifying and addressing sources of bias in culture, including technology." The authors laconically conclude that "caution must be used in incorporating modules constructed via unsupervised machine learning into decision-making systems." Power resides in exactly such decisions about other people, resources, and time.

Arvind Narayanan, who published the paper with Aylin Caliskan and Joanna J. Bryson, noted that "we have a situation where these artificial-intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from." The researchers replicated the Implicit Association Test in machine-learning form using GloVe, a word-embedding program developed at Stanford that represents words by their co-occurrences with other words and phrases. Turned loose on roughly 840 billion words from the Web, it was probed for associations of words such as "man, male" or "woman, female" with "programmer, engineer, scientist, nurse, teacher, librarian." The results showed familiar biases in the distributions of associations, biases that can "end up having pernicious, sexist effects."

For example, machine-learning programs can translate foreign languages into sentences that reflect or reinforce gender stereotypes. Turkish uses a gender-neutral third-person pronoun, "o." Plugged into the online translation service Google Translate, however, the Turkish sentences "o bir doktor" and "o bir hemşire" are translated into English as "he is a doctor" and "she is a nurse." . . . "The biases that we studied in the paper are easy to overlook when designers are creating systems," Narayanan said. (Source: Princeton University, "Biased Bots" by Adam Hadhazy.)
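A toy calculation shows how such associations are measured in word embeddings. The four 2-D vectors below are invented for illustration (real GloVe vectors are learned from co-occurrence statistics and have hundreds of dimensions); the score mirrors the differential cosine similarity underlying the researchers' test.

```python
import math

# Invented 2-D "embeddings" for demonstration only; real embeddings
# are high-dimensional and learned from billions of words of Web text.
vec = {
    "man":        (0.9, 0.1),
    "woman":      (0.1, 0.9),
    "programmer": (0.8, 0.2),
    "nurse":      (0.2, 0.8),
}

def cosine(a, b):
    """Cosine similarity: how closely two word vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def association(word, group_a, group_b):
    """Positive: word leans toward group_a; negative: toward group_b."""
    return cosine(vec[word], vec[group_a]) - cosine(vec[word], vec[group_b])

print(association("programmer", "man", "woman"))  # positive in this toy data
print(association("nurse", "man", "woman"))       # negative in this toy data
```

Because the vectors are distilled from human-written text, such differential scores recover exactly the historic imprints of bias the paper describes.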

Yewno is exactly such a system insofar as it mimics human forms of life, which include, alas, the reinforcement of biases and prejudice. So in the end, do you know Yewno, and if Yewno, exactly what do you know? That "exactly what" will likely contain machine-generated replications of problematic human biases. Machine translation will never offer perfect, complete renderings of languages because language is never complete –humans will always use it in new ways, with new shades of meaning and connotations of plausibility, because humans go on living in their innumerable, linguistic forms of life. Machines have to map language within language (here I include mathematics as a kind of language with its own distinctive games and forms of life). No "Knowledge Singularity" can occur outside of language, because it will be made of language: but the ideology of the "Singularity" can conceal its origins in many forms of life, and thus appear "natural," "inevitable," and "unstoppable."

The "Knowledge Singularity" will calcify bias and injustice in an everlasting status quo unless humans, no matter how comparatively deficient, resolve that knowledge is not a philosophical problem to be solved (such as in Karl Popper's Worlds 1, 2, and 3), but a puzzle to be wrestled with and contested in many human forms of life and language (Wittgenstein). Only by addressing human forms of life can we ever address the greater silence and the Life that we have lost in living. What we cannot speak about, we must pass over in silence (Wovon man nicht sprechen kann, darüber muss man schweigen, sentence 7 of the Tractatus) –and that silence, contra both the positivist Vienna Circle and Karl Popper (who was never part of it), is the most important part of human living. In the Tractatus Wittgenstein dreamt, as it were, of a conclusive solution to the puzzle of language –but such a solution can only be found in the silence beyond strict logical (or machine) forms: a silence of the religious quest beyond the ethical dilemma (Kierkegaard).

This journey through my "undiscovered summer reading," from the Antikythera mechanism to the alleged "Knowledge Singularity," has reinforced my daily, functional belief that knowing is truly something that humans do within language and through language, and that the quest which makes human life human is careful attention to the forms of human life, and the way that language, mathematics, and silence are woven into and through those forms. The techno-solutionism inherent in educational technology and library information technology –no matter how sophisticated– cannot undo the basic puzzle of human life: how do we individually and socially find the world? (Find: in the sense of locating, of discovering, and of characterizing.) Yewno will not lead to a Knowledge Singularity, but to derived bias and reproduced injustice, unless we acknowledge its limitations within language.

The promise of educational and information technology becomes more powerful when approached with modesty: there are no quick, technological solutions to puzzles of education, of finance, of information discovery, of "undiscovered public knowledge." What those of us who are existentially involved with the much-maligned, greatly misunderstood, and routinely dismissed "liberal arts" can contribute is exactly what makes those technologies humane: a sense of modesty, proportion, generosity, and silence. Even to remember these at the present moment is a profoundly counter-cultural act, a resistance to the techno-ideology of unconscious bias and entrenched injustice.

Who Owns The Future Of Books?

My previous post took a brief look at the religious ideas that permeate Jaron Lanier's Who Owns The Future? (whether he explicitly acknowledges those ideas or not). This post considers what he contributes about books, and the future of books.

(Lanier, author of Who Owns The Future? appears on the SHU campus on Wednesday, October 9, 7:00 p.m. Schine Auditorium)

Books have become a cultural flash point that inspires "maniacal scheming" (see pages 352-360) –an unwitting testament to books' enduring iconic, cultural power. What bothers Lanier is that current development of networks –the Siren Servers that seek total computational awareness and control– might lead to losing "the pattern of what a book is in the stream of human life and thought" (353). After sketching some possible future scenarios about the fate of books, authors, and readers, Lanier offers a definition (one of the very best I have ever read):

A book isn't an artifact, but a synthesis of fully realized individual personhood with human continuity. The economic model of our networks has to be optimized to preserve that synthesis, or it will not serve [hu]mankind. (358)

Lanier here touches upon the emotional salience and cultural power that books evoke. The uneasiness Lanier shares with many is not just about texts (tomes, bindings), but about human lives. "Human life is its own purpose," he continues. "Thinking about people in terms of the components on a network is –in intellectual and spiritual terms– a slow suicide for the researchers and a slow homicide against everyone else" (360). The ingestion of millions of e-texts into Artificial Intelligence divorces what humans write about from who they are, and what makes their lives meaningful to them. "Whether we will destroy culture in order to save/digitize it is still unknown" (353). (Lanier traces that metaphor to the Vietnam war.)

What makes a liberal education liberal–freeing–is the strong association (synthesis) of particular texts with particular people, moments, events, movements, points of view.  The real intellectual problem with Wikipedia isn't its alleged accuracy or inaccuracy. Rather, it "proposes that knowledge can be divorced from point of view." Note that Lanier writes knowledge –not data, not information, not the "flashes of thought" that might be "inserted meaningfully into a shared semantic structure" (ibid.)  Knowledge is what humans make for other humans.  Strictly speaking, computers can store, locate, index, and transform data, but can't know in the same sense.

These are my own thoughts, sourced in Lanier's text, which I found enormously helpful in articulating the fundamentally different business model of a library from that of a database, even a sort of meta-database (a database of databases –a discovery service, in other words). What libraries are about is the discovery of knowledge in human communities and continuities, in a symmetrical transaction that celebrates unanswered questions (intellectual risk) and acknowledges the presence of other sources of knowledge –whether living persons, libraries, databases, search engines, or other human syntheses of any and every kind.

This transaction (process, pedagogy) thrusts libraries into an educational process squarely at odds with Siren Servers that are narcissistic (as though they alone collect data), risk-externalizing (questions and uncertainties never belong to the Server, always to the user), and dependent upon extreme information asymmetry –users can't know what the Server already knows about them, and how it seeks to modify their behavior.

Understanding the cultural term "book" as "a synthesis of fully realized individual personhood with human continuity" respects authors, readers, and the economic and intellectual chain of power and responsibility that connects them. This also illuminates why some (many?) people care so passionately about books –what they care about is human continuity, personhood, what makes a human life worth living. What better question could a liberal arts education pursue? What could be more "relevant" to the challenges of living in a "flat," networked world?

Is Individual Learning Really Outmoded?

This post refers back to the post (below) of May 14, 2010, and the post of August 25.

In those posts, I mentioned Larry Sanger (co-founder of Wikipedia) and his article Individual Knowledge and the Internet. Sanger analyzes three common strands of current thought about education and the Internet. "First is the idea that the instant availability of knowledge online makes memorization of facts unnecessary or less necessary." The second strand claims that "individual learning is outmoded," and that "social learning" is the cornerstone of "Learning 2.0." (The third strand will be examined more fully in a later post.)

Why do I return to an article published a year ago? I believe that Sanger is on to something: a superficial, misleading claim among certain educationists that learning has become fundamentally different with the advent of social web tools. On the contrary, Sanger sees such tools as fancy, but only as tools toward a very similar end: the content and method of liberal learning, which remains to be done no matter what the technological environment. I agree with him. I think that Sanger's argument is worth continuing, if only because, as the bloom seems to be coming off some Web tools, this is a teachable moment to ask: what does it mean to be truly educated?

But back to Sanger's critique of the second strand of thinking about learning and the Internet: that individual patterns of learning are outmoded, and the new pattern of learning (thoroughly invested in and enabled by Web social tools) is collaborative, social group learning. Just as some educationists' first claim holds that the Web has made memorization unnecessary (in part by caricaturing all remembered content as mere rote, unreflective memorization), so this strand caricatures individual learning as –well, individualistic– as lonely, uncreative, and private to the point of solipsism.

Now group learning and social learning using social web interaction –wikis, online conversations, online fora of all sorts– can certainly be valuable. They can also have problems, and carry costs and benefits which a wise teacher can weigh as time, attention, and the situation suggest. This is to say that these tools are exactly that: tools; and that other tools (reading a book or an article, summarizing a paper, writing a poem, translating a passage, or other traditional activities) might also be useful, or not, as the situation suggests.

John Seely Brown and Richard P. Adler, however, go much further (in the article cited above).  They claim that "collaborative learning" is "the core model of pedagogy," and that of course digital platforms alone enable it.  Asking what is meant by social learning, they write:

Perhaps the simplest way to explain this concept is to note that social learning is based on the premise that our understanding of content is socially constructed through conversations about that content and through grounded interactions, especially with others, around problems or actions. The focus is not so much on what we are learning but on how we are learning.

In other words, social learning shifts the focus from content to process, which "stands in sharp contrast to the traditional Cartesian view of knowledge and learning."  That traditional view (according to Brown and Adler) treats knowledge as a kind of "substance," and pedagogy –the art of teaching– as the means of transferring this substance from those who know to those who do not yet know, i.e. from teachers to students.  This "transfer" contrasts with the "constructed" knowledge students arrive at collaboratively.  How "substance" differs from "construction" is left unsaid.  As Sanger points out:

One could just as easily, and with just as much justification, assert that what is constructed in social learning is a "substance" that is socially shared. One can simply say instead that Cartesian learning involves the teacher causing the student to believe something that is true, by communicating the true thought.

In any case, Brown's and Adler's understanding of "Cartesian" (and by extension, of Descartes) is laughably superficial.  "Substance" is not a prominent term for Descartes, who thought that each person's mind is a substance –not knowledge itself.  Brown and Adler have simply adopted an idea from widely repeated (and vague) academic discourse that knowledge is a social construction (certainly a problematic idea –just ask a physicist).  Knowledge as a kind of "substance" is much more Aristotelian or even Thomist, but those thinkers are too intimidating to serve well as a fashionable foil for social constructivists.  Thankfully Brown and Adler did not drag Kant into this.

The distinction boils down to learning with or without the presence and support of peers.  Certainly some people need peers in order to maximize their ability to learn; others need solitude.  Isn't this obvious?  The view that social learning is therefore superior is easy to claim, but very difficult to verify in any meaningful manner, because "social learning" simply lacks the definitional heft to test rigorously.  The several tools which Brown and Adler present as examples of social learning are interesting, but cannot bear the entire weight of presenting an alternative to a straw-man "Cartesian" view.

Ultimately, you have to do your own reading, no matter how the Decameron or the Divine Comedy come to you (to think of two classic texts with extensive online tools).  You may post your thoughts in essays on a blog or wiki (as I am doing), but the act of writing is still solitary, and needs practice for mastery.  (I certainly don't claim the latter!)  Discussion in any forum, whether face-to-face or online, is a great thing –but I agree with Sanger that a true scholar needs the ability to think independently.  A scholar is not automatically a member of a herd.  You might get a lot of help from peers to learn maths, science, management, economics, or a host of subjects –but if you don't master the material yourself, then you haven't learned it.  If you can't do the problems yourself, you haven't mastered them.  Your peers will not be omnipresent, whether in an examination, or on the job.

I agree with Sanger that those four activities –reading, writing, critical thinking, and calculating– are crucial to liberal education.  A person who can't do them can't really be called educated.  Social learning is an important supplement to, but not a replacement of, individual learning.

Why does this matter to me as a librarian?  I am involved with planning a library renovation –I am making sure that there will be both group and individual spaces for study.  Part of liberal learning includes memorization, reading, writing, independent judgement, calculating –exactly the kind of independently responsible learning so much in demand by knowledge workers today and in the future.  What goes on in a library is individual learning, supplemented by group learning.  Individual knowledge is still necessary in the internet age, and "social learning" without individual knowledge is insufficient to the tasks of reading, writing, critical judgement, and calculating.  At the end of the day, you have to wipe your own nose, say your own prayers, read your own texts, and work your own problems.

Three Jeremiads: a second look

Robert Darnton's The Library: Three Jeremiads (New York Review of Books, December 23, 2010) is a wonderfully written, rather gentle set of jeremiads –for those of us used to reading the real Jeremiah.  He finds research libraries (and by extension, the rest of us) facing three crises, but he ends with hope, not doom.  (In that sense, he is more like the original Jeremiah than many would realize.)

Darnton's three jeremiads are, in compact phrases:

  • Hard times are inflicting serious damage on scholarly publishing.  Scholarly publishers can no longer count on selling 800 copies of a monograph, and many university presses have stopped publishing in some smaller fields (colonial Africa, for example) altogether.  The scholarly monograph is becoming too expensive to sustain, and this backs up the entire line from graduate-student research to publish-or-perish pressure on newer faculty.  The pipeline is very seriously clogged.
  • University libraries face excessive journal pricing as control of critical scientific journals has passed into private hands.  The average price of an annual journal subscription in physics is $3,368; the average price in language and literature is $275.  Publishers impose drastic cancellation fees, written into "bundled" journal subscriptions (sometimes hundreds of titles) over several years.  Publishers seek to keep the terms secret, although a recent case in Washington casts doubt on that ability.  Academics devote time to research, write up the results as articles, referee the articles, serve on editorial boards, and then buy back the product of their labors at ruinous prices.  In order to break the monopolies of price-gouging empires such as Elsevier, scholarship needs open-access journals which are truly self-sustaining.  The Compact for Open-Access Publishing Equity attempts to create such a sustaining coalition of universities.
  • The Google Books settlement offers some hope for breathing new life into monographic publishing, according to Darnton.  (I disagree –see below.)  A Digital Public Library of America (DPLA) could succeed should Google fail, but the primary obstacles are not financial but legal.  Works published between 1923 and 1964 are often in a copyright limbo as "orphan works," because no one knows who actually holds copyright, if anyone.

Darnton's last jeremiad offers hope, but not, I find, a sustaining hope.  Recently I was helping my staff to shift part of our small collection, because our shelving is at 100% of capacity and we still wish to purchase some new monographs in print.  By chance I was shifting our modest collection of books on feminism and its development –and all the essential texts were there, starting with Simone de Beauvoir's The Second Sex (English translation 1953).  All of these titles are in print; the subject remains of great interest to many in the university; all of this material remains in copyright, but much of it is now old enough that the identities of the rights-holders can become difficult to trace.  Given the legal problems, little of this material is likely to be digitized on a large scale any time soon.

There may come a time when the sheer need for digitized texts will overwhelm the vested rights of very numerous rights-holders, and society will enforce an equitable arrangement –the Google Books proposal would assign 63% of profits to authors and publishers, to be held in escrow by a trust pursuant to a Book Rights Registry.  This proposal cuts the Gordian knot: the Copyright Act granted a long-term license which the government in turn never attempted to track, insofar as enforcement was left to a (presumably aggrieved) rights-holder.  It promises, however, endless litigation, and by the time that is ended, interest in almost all texts from the 1923-1964 period (or even later) will have faded further.

The sustainability problems for scholarly writing and publishing are very real, and they remain.  For a smaller, teaching-oriented university, the fact that these problems are first dealt with by the large research universities is little comfort: we all live with the results of the mess society and technology have made of rights, copyrights, and the ubiquitous threat of litigation.  Predatory journal pricing structures remain, and it is little comfort for a teaching university that the prices are so far out of the realm of the possible that only a few mourn the impossibility of major scientific journal subscriptions.  The only way forward, as I see it, is to support organizations such as the Public Library of Science, SPARC (the Scholarly Publishing & Academic Resources Coalition), and the evolving identities and offerings of JSTOR and ITHAKA.  But this is not an answer.  It merely joins Darnton's appeal to change the system.