Why Biology Holds the Key to Improving Learning
Our brains need time to develop and process information.
I'm Scott Carlson, a senior writer at The Chronicle covering higher ed and where it's going. Last week, I attended the Competency-Based Education Network's annual meeting, where many of the conference sessions (including two that I led) and casual conversations focused on measuring the skills of college students and of people in the work force. And much of the time, I was thinking about some of the issues that Marsha Huber raises in her guest essay below.
Attendees at last week's conference grappled with how to define skills and assess them accurately -- and how those definitions and assessments can change according to context and be enhanced by real-world experiences, like apprenticeships. But there was a nagging tension in some of the conversations: On one hand, competency-based education and skills assessment can play a valuable role in surfacing the expertise and abilities that people have built over time but that the formal school system may never have recognized. In that way, such assessments could accelerate someone's path to a degree or certificate.
On the other hand, can skills- and competency-based assessments measure growth that will happen long after a student has concluded a course? To what extent is a college education about seeding ideas and perspectives, then allowing a student to ruminate on them for weeks, months, or years, and be transformed along the way? That slow transformation seems especially relevant to the habits of mind often classified as "durable skills," like critical or creative thinking.
In any case, Huber makes a salient point here: Learning science deserves a more prominent role in how we structure college courses and classrooms -- and perhaps how we think about what college is supposed to do for a student.
Rethinking Learning: Why Biology, Not Technology, Holds the Key
by Marsha Huber
Why are we still debating the skills college students need upon graduation, as if it were a new conversation? For nearly four decades of my teaching career, I have heard the same refrain: Students need critical-thinking skills. So despite all this discussion, why do we continue to miss the mark? Because the answer lies in something fundamental: We don't understand the biology of the brain.
If we genuinely want to improve education, we must first understand how the brain matures and processes information. When I was a visiting scholar at the Harvard Graduate School of Education, I learned a few fundamental truths about the brain from Kurt W. Fischer, the late founder of the mind, brain, and education program. He taught us three main principles: (1) forgetting is part of learning, (2) learning is not linear, and (3) the brain learns in context.
This understanding, however, did not sit well with a colleague on the finance faculty at my previous university. She once accused me of not teaching anything to the accounting students in my principles class, because, she said, "they don't remember anything when they get to my class."
I explained to her that students learn in context: When another professor is standing at the front of the room, the context has changed, and their brains cannot immediately retrieve what they've learned. "It takes time for the brain to figure out that the content it's learning is not tied to the context," I said.
"I don't believe you!" she said as she turned quickly and rushed down the hallway.
This exchange highlights one of the key reasons we struggle to move the needle on critical thinking: We do not acknowledge how long it takes students' brains to mature.