Back on January 5 (doesn't that seem like another age of Middle Earth by now?), Cal Newport wrote about what he termed "evaluation entanglement":
- Evaluation entanglement. Keeping your productivity commitments all tangled in your head can cause problems when a strategy fails. Without more structure to the productivity portion of your life, it’s too easy for your brain to associate that single failure with a failure of your commitments as a whole, generating a systemic reduction in motivation.
Newport was writing in the context of New Year's "productivity tweaks," or what were once called resolutions. They usually go by the boards in a few days or weeks.
So far as I can tell, Newport borrowed "evaluation entanglement" from the physics of entangled states (Newport is a scientist) and from the description/evaluation entanglement in the work of Hilary Putnam (Newport is also a philosopher) --and both of these clusters of disciplines are pertinent and informative for Newport's primary intellectual interest in computer science: distributed algorithms that help agents work together.
Putnam's work on fact/value entanglement liberated "thick ethical concepts" that lie under sentences such as Nero is cruel from the straitjacket of notions of fact and value judgement, so that Nero is cruel can express a value judgement and a descriptive judgement at the same time. The entanglement of facts and values is a characteristic of those many statements that "do not acquire value from the outside, from the subject's perspective, for example, but facts that, under certain conditions, have a recognizable and objective value." (Martinez Vidal, Cancela Silva, and Rivas Monroy, Following Putnam's Trail, ISBN 9789042023970, 2008, page 291)
Newport's point is simpler, but it lies in the shadow of Putnam's entanglement of facts and values: a person can associate a single failure in one element or commitment (to productivity, in this context) with the failure of his or her commitments as a whole --and this pervasive sense of value can generate "a systemic reduction in motivation." In other words: I want to be productive in manners or projects A, B, C, and D, "and if I fail at C, I fail at the rest, and my life rots." The fact of failure with C generates an evaluative entanglement that describes my whole life.
A life so described needs to be described in much thicker terms, however. Such failure is very rarely simple, straightforward failure.
This is where the work of Robert Kegan and Lisa Laskow Lahey at the Harvard Graduate School of Education is very helpful. They have focused on immunity to change: both individuals' immunity and organizational immunity. In their book Immunity to Change, and in their large online class (I hesitate to call it a MOOC) Including Ourselves in the Change Equation, they explore and describe the significant mismatch of undertaking technical means to solve adaptive challenges --challenges in which change is not simply a matter of altering well-known behaviors and thoughts, but involves adapting one's thinking and finding new mental and emotional complexities at work.
Based upon adaptive theories of mind and organizational theories of change, Kegan and Lahey take their students on a journey of thinking new thoughts, or telling their stories in a new way --literally, changing the narrative in such a manner that both visible commitments to change, and the corresponding hidden, competing commitments that block change, can reveal a person's (or organization's) big assumptions about the world. By holding up those big assumptions to the light of understanding and reflection, persons can question effectively and adaptively whether such assumptions are in fact valid.
I took this course last fall, and found it to be a very rich experience. I won't reveal what my own visible commitments, hidden competing commitments, and big assumptions were --except to say that I was working on a life-long issue that affects every relationship and commitment in my life. My goal for change and understanding was something that definitely passed the "spouse test" --"oh yeah, that's you one hundred percent."
Kegan's and Lahey's metaphor of one foot on the gas, the other foot on the brake pretty well summed up what I had been finding in attempting changes in my life and character. That metaphor neatly invokes an "evaluation entanglement" --both a descriptive judgement and a value judgement in the same phrase. The "thick ethical concept" is a philosophical way of telling a story --telling a narrative of your own life (or your organization's life) that frames the descriptions and the values in a certain way of thinking. Adapting such thinking to new complexities, and changing the story by expanding and deepening it, is the core structure that liberates a person, or an organization, from taking one example of failure as "failure of your commitments as a whole." Such increasing mental complexity and adaptive thinking is critical to avoid "generating a systemic reduction in motivation." There's nothing that defeats a person or an organization quite like the experience of seeking change but blocking change at the same time, one foot on the gas and the other on the brake. What is produced is a great deal of heat, significant atmospheric pollution, very little traction, and no progress.
Navigating the shoals of evaluative entanglement requires complex thinking and a certain level of lived experience. There is no app for this. But there is a course, and I wholeheartedly recommend Including Ourselves in the Change Equation to anyone who really wants to change.