Book Review - How We Learn: The Surprising Truth About When, Where, and Why It Happens

Our ability to learn ranks right up there with our ability to coordinate our activities as the chief weapon that we’ve used to become the dominant species on the planet. As anthropologists John Tooby and Irven DeVore have commented, we’ve carved out the “cognitive niche.” Despite our cognitive capacity being so essential to our survival that it literally drives us to be born before we’re fully prepared to take on the world, science understands relatively little about how we learn, and substantially less of what we have learned has become common knowledge. How We Learn: The Surprising Truth About When, Where, and Why It Happens is designed to change the public’s awareness of what little we do know about learning. (If you want more on our ability to coordinate and its importance, you can see The Righteous Mind or Mindreading.)

Literacy

While learning is essential to our current world, much of what we think of as learning is new in an evolutionary sense. Even reading, writing, and arithmetic are relatively new creations. Consider that before the invention of the printing press, literacy meant the ability to write your own name, and there weren’t many people who were literate. Today, we view literacy differently: it’s the ability to read and write in our native language.

We expect, rightly or wrongly, that our children will have basic fluency in their native language by the time they’re ten. We expect even more from them as they progress through school. Where calculus was the domain of specialized mathematics only a few decades ago, it’s assumed knowledge for most professions today. You’re expected to understand the basics of a branch of mathematics that was until recently a specialty, and much, much more.

Evolution

In evolutionary terms, the human being we know is a newcomer. Written history extends back a few thousand years, and archaeological evidence of settled, agricultural life goes back only ten thousand years or so. That’s a blink in evolutionary time. We evolved from hunter-gatherers into masters of agriculture, and with that we developed a caloric surplus, which allowed us to pursue more abstract thoughts than worrying about the next meal and avoiding becoming one.

During this rapid conversion from a nomadic existence following berries and buffalo to one with deep roots in agriculture, and our subsequent adaptation to a sedentary and highly intellectual existence, we’ve moved faster than our genes can keep up. We’ve moved into a world where our shared knowledge is far greater than anything humans have encountered before.

Some have estimated that we encounter more information and data in a single year than our grandparents did in their lifetimes. If a lifetime spans 50 to 100 years, that works out to an increase of roughly 50- to 100-fold in just the last 100 years, and the pace is only accelerating.

Information Management

When I speak to audiences about information management, I share how our ability to share knowledge has grown at a breathtaking pace. Until Gutenberg’s printing press around 1450, if you wanted something copied, you gave it to a scribe or a monk. Gutenberg made it possible to copy important texts efficiently, thereby reducing the barriers to having books. In the 1870s, we got typewriters, which standardized the appearance and structure of text. In 1959, Xerox introduced the first plain-paper photocopier built on the xerographic process, and suddenly the bar for replication was dramatically lowered. In the 1970s, computers made the processing and replication of data easier. In the 1980s, computers became personal, and suddenly everyone was able to store and share their information. In the 1990s, computers were networked, so sharing between people became automatic. By the 2000s, we shared images as well. In the 2010s, we started delivering video.

The upshot is that it took us thousands of years to get to writing and then a few thousand more to get to the ability to replicate content. Now a new innovation in how we share information arrives roughly every decade. How can you possibly keep up with all the knowledge being created? The answer is that you can’t. However, to keep up with even a portion of what we need to know, we have to be efficient and effective with our learning. Unfortunately, our learning innovations haven’t kept up.

Brain Science

There are two distinct branches of science that study how the brain works. One branch is psychology, which is largely concerned with how our minds function as reflected in behavioral outcomes. The other branch is neuroscience, which is focused on understanding how our brains perform the wonders that they do.

I’ve shared in my reviews of The Cult of Personality Testing, Warning: Psychiatry Can Be Hazardous to Your Health, The Heart and Soul of Change: Delivering What Works in Therapy, and other books how little we actually know about psychology. In truth, the outcomes for people who receive counseling and those who don’t are horrifyingly similar. There are great arguments in this field about what is and isn’t effective. Psychoactive drugs are prolifically prescribed and yet seem to have very little effect.

On the neuroscience front, we’ve got some knowledge about the regions that are active for various kinds of thinking and behavior, but there’s more that we don’t know than we do know. We’re peering into opaque gray matter hoping to tease out how the magic works, and we’ve been largely unsuccessful. (See Incognito, The Rise of Superman, The End of Memory, and Emotional Intelligence for more on neuroscience.)

Along the way, we’ve found some answers from brave and insightful (and sometimes lucky) scientists who stared at a result and scratched their heads until they could come up with a plausible hypothesis about what is going on inside our heads. These answers have not been adopted by those who lead the charge for better education for everyone; they’re marching to the same old beat of the same old drum. (See Helping Children Succeed and Schools without Failure for alternative views.)

Myths and Legends

Old myths about how we learn, garnered from limited experience and observation, sometimes run directly counter to the research coming out of prestigious universities. Good science is showing that some of the things we’re doing aren’t the right things. We’re not optimizing the learning experience. What we thought we knew about how to teach and learn is being turned on its head, and some of it is being validated as fundamentally correct.

Some of the myths, like having to “keep your nose to the grindstone,” are being dispelled by compelling evidence that taking a break can increase retention and free up the cognitive resources needed to generate the innovations that drive the next generation of business leaders forward.

Forgetting is Your Friend

The nemesis of learning has been the forgetting curve. Ebbinghaus documented the decay of memory in exacting detail using nonsense syllables. The forgetting curve has long been the enemy of professional trainers and teachers. It’s seen as a failure of learning. However, it might be the result of an active process in which our brain is trying to cope with an onslaught of information it was never designed to handle. It could be that the mental systems designed to consolidate memories trim them from our consciousness so we can focus on things that are more urgent and more relevant.
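
To make the shape of that decay concrete, here’s a minimal sketch in Python of the exponential form often used to approximate the Ebbinghaus curve, R = e^(-t/S), where R is the fraction retained, t is the time since learning, and S is a stability constant. The specific stability values are illustrative assumptions of mine, not numbers from the book.

```python
import math

def retention(t_hours: float, stability_hours: float) -> float:
    """Approximate retention R = e^(-t/S): the fraction of material
    still recallable t hours after learning, given stability S.
    The exponential form is a common simplification of Ebbinghaus's
    forgetting curve; the stability values below are illustrative."""
    return math.exp(-t_hours / stability_hours)

if __name__ == "__main__":
    # A weakly encoded memory (S = 24 h) vs. one reinforced by review (S = 120 h).
    for label, s in [("single exposure", 24.0), ("reviewed", 120.0)]:
        points = ", ".join(
            f"{t}h: {retention(t, s):.0%}" for t in (1, 24, 72, 168)
        )
        print(f"{label:15s} -> {points}")
```

The steepness of that curve, and how much a review flattens it, is exactly what the techniques in the rest of the book (spacing, testing, sleep) try to exploit.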

Losing memories, forgetting, is a painful experience for all of us. It’s frustrating to forget a name or a word when we feel like we need it most. However, memory isn’t a single measure; it’s two. Moments after we “needed” the information, we may suddenly rediscover what we lost, an annoying but normal aspect of how our memories work.

Memory as Two Separate Measures

One way to consider memory is that it can be measured by two separate attributes. The first is storage: did we encode the memory and keep it in our brains? The second is retrieval: even if we did manage to keep it in our brains, that says relatively little about our ability to recall the information at will. There are things that I know and can recall when prompted, but for which there are few paths in my brain to retrieve them on my own.

This model of memory is the brainchild of Robert Bjork of UCLA and his wife, Elizabeth Ligon Bjork. Their hypothesis is that we evolved with systems that allow us to forget as a natural part of the process. If too much were at the front of our minds, that is, too much with high retrieval strength, we’d never be able to get anything done. The thoughts would constantly be competing with one another. The retrieval paths for some of our memories are trimmed so that they can only be recalled with a very specific stimulus.

Desirable Difficulty

Some research points to a desirable difficulty in learning that causes the brain to link a memory more intensely. This seems to happen with things that were learned once, “forgotten” (that is, dramatically unlinked for retrieval), and then relinked. They were so hard to find that our brain seems unwilling to make the same mistake of unlinking them again. As a result, ideas that are difficult to learn, or relearn, are given special priority for relinking.

In a strange way, forgetting isn’t the enemy of learning; it may be the tool that our brains use to ensure that we’re able to retrieve the right memories at the right times, even if it doesn’t always guess correctly.
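
As a way of picturing the two measures and the desirable-difficulty effect together, here’s a toy sketch in Python. The update rules and constants are my own illustrative assumptions, not the Bjorks’ published model: retrieval strength decays with disuse, storage strength only grows, and a successful recall pays off more when the memory had become hard to reach.

```python
from dataclasses import dataclass

@dataclass
class Memory:
    """Toy model of a single memory with two measures, loosely after the
    Bjorks' framework. The update rules and constants are illustrative
    assumptions, not their published equations."""
    storage: float = 1.0    # how well learned; never decreases
    retrieval: float = 1.0  # how accessible right now; decays with disuse

    def rest(self, days: float) -> None:
        """Retrieval strength fades while the memory goes unused."""
        self.retrieval *= 0.8 ** days

    def recall(self) -> None:
        """A successful recall boosts storage strength, and the boost is
        larger when the memory had become hard to reach (low retrieval)."""
        difficulty = 1.0 - self.retrieval
        self.storage += 1.0 + 2.0 * difficulty
        self.retrieval = 1.0  # fully accessible again

easy, hard = Memory(), Memory()
easy.rest(days=1)    # recalled while still fresh
easy.recall()
hard.rest(days=14)   # recalled after it had nearly vanished
hard.recall()
print(f"easy recall -> storage {easy.storage:.2f}")
print(f"hard recall -> storage {hard.storage:.2f} (the bigger gain)")
```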

Memory is Context Dependent

Have you ever heard that if you study drunk, you should take the test drunk? As crazy as it sounds, it may be correct. Studies with marijuana showed that when someone studied material while under the influence, their performance on a test was better if they were also under the influence when tested. It seems that, somehow, the person’s state got encoded along with the information, and retrieval was linked to that state.

It’s a well-established fact that behavior is a function of both the person and the environment (see Leading Successful Change for more on Lewin’s equation, B = f(P, E)). Research also shows that people’s answers are related to where they’re asked the questions. If you put them in an environment that feels like home, they’ll give more accurate answers about their home life than if they’re placed in an office or at a college. (See Loneliness for more.) It seems that the web of neural connections is shaped by where we are.

The Importance of Sleep

Historically, sleep was viewed as wasted time. However, from an evolutionary standpoint, we find that most animals sleep at times and for lengths that serve them. Koalas survive on a very low-calorie diet of eucalyptus leaves and sleep 20 hours a day. The brown bat similarly sleeps most of the day, waking at dusk and dawn, when its echolocation is most effective for feeding on mosquitoes and it is least likely to be struck down by predatory birds. So, too, there must be an evolutionary reason for our sleep cycle. Some of the evidence points to repair of our bodies; more interestingly, sleep appears essential to the development of long-term memories and learning.

There has been a great deal of research on sleep by now, but it wasn’t always that way. In December 1951, Armond, the son of young graduate student Eugene Aserinsky, was hooked up to an early brain-wave machine, and REM sleep was observed for the first time. Aserinsky thought it was a fluke, but test after test confirmed high levels of brain activity during specific periods of sleep, and more activity overall than anyone expected.

Since then, research has progressed. We now know there are various stages of sleep, and these different phases of sleep seem to be performing different kinds of maintenance. Stage 2 is all about motor memory, stages 3 and 4 are for building retention, and REM helps us build pattern recognition. (If you want more on the research into sleep, see The Rise of Superman.)

Trying It Out – Testing as Studying

One of the challenging things about assessing the efficacy of training (see Efficiency in Learning) is that each assessment changes the learning. Assessing retention after a day increases the probability that someone will remember more when tested two weeks later. That finding is relatively easy to explain: they see greater relevance in the information because they’ve been tested on it. (See The Adult Learner for more on the importance of relevance.) What’s harder to explain is how, after two weeks, average performance can climb compared to the test given just one day later. Even without additional studying, taking an assessment causes students to retain more than they remembered at the first assessment.

There’s no clear consensus on exactly how or why this happens, but it does happen. We don’t know whether the assessment creates desirable difficulty in the learning process, whether it increases awareness and therefore elevates memories of related topics that can be used to navigate back to the original idea, or whether sleep continues to reintegrate old memories. Whatever the cause, we learn, in part, based on the way that we’re tested. The more we’re tested on simple recall, the more we’ll remember things that require simple recall. The more complexity we provide in our testing, the more likely we are to encourage complex storage of facts.
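
One practical way to act on this is to make retrieval itself the study method, for example with a simple Leitner-style schedule: items you recall correctly come back at longer intervals, and items you miss come back soon. The book doesn’t prescribe this particular scheme; it’s a minimal sketch of the general idea, with interval lengths chosen arbitrarily.

```python
from datetime import date, timedelta

# Review interval (in days) for each Leitner box; the lengths are arbitrary choices.
INTERVALS = {1: 1, 2: 3, 3: 7, 4: 14, 5: 30}

def schedule_next(box: int, answered_correctly: bool, today: date) -> tuple[int, date]:
    """Move a card between boxes based on this recall attempt and
    return its new box plus the date of its next test."""
    if answered_correctly:
        box = min(box + 1, max(INTERVALS))   # promote: longer gap, harder retrieval next time
    else:
        box = 1                              # demote: retest soon
    return box, today + timedelta(days=INTERVALS[box])

# Example: a card in box 2 is recalled correctly today.
box, due = schedule_next(box=2, answered_correctly=True, today=date(2024, 1, 1))
print(box, due)  # 3 2024-01-08
```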

The real test is the test of life. What will you retain from How We Learn – and why?
