Metacognition training boosts gen chem exam scores


It's a lesson in scholastic humility: You waltz into an exam, confident that you've got a good enough grip on the class material to swing an 80 percent or so, maybe a 90 if some of the questions go your way.

Then you get your results: 60 percent. Your grade and your stomach both sink. What went wrong?

Students, and people in general, tend to overestimate their own abilities. But University of Utah research shows that students who overcome this tendency score better on final exams. The boost is strongest for students in the lower 25 percent of the class. By thinking about their thinking, a practice called metacognition, these students raised their final exam scores by 10 percent on average - a full letter grade.

The study, published today in the Journal of Chemical Education, is authored by University of Utah doctoral candidate Brock Casselman and professor Charles Atwood.

"The goal was to create a system that would help the student to better understand their ability," says Casselman, "so that by the time they get to the test, they will be ready."

Errors in estimation

General chemistry at the University of Utah is a rigorous course. In 2010 only two-thirds of the students who took the course passed it - and of those who didn't, only a quarter ever retook and passed the class.

"We're trying to stop that," Atwood says. "We always want our students to do better, particularly on more difficult, higher-level cognitive tasks, and we want them to be successful and competitive with any other school in the country."

Part of the problem may lie in how students view their own abilities. When asked to predict their scores on a midterm pretest near the beginning of the school year, students across the whole class overestimated their scores by an average of 11 percent. The students in the lower 25 percent of class scores, also called the "bottom quartile," overestimated by around 22 percent.

This phenomenon isn't unknown - in 1999 psychologists David Dunning and Justin Kruger published a paper showing that people who perform poorly at a task tend to overestimate their ability, while those who excel at it may slightly underestimate their competence. This beginning-of-year survey showed that general chemistry students are not exempt.

"They convince themselves that they know what they're doing when in fact they really don't," Atwood says.

The antidote to such a tendency is engagement in metacognition, or thinking about and recognizing one's own strengths and limitations. Atwood says that scientists employ metacognition skills to evaluate the course of their research.

A sample screenshot of homework feedback to help students assess their strengths and weaknesses. Credit: Brock Casselman
"Once they have got some chunk figured out and realize 'I don't understand this as well as I thought I did,' they will adjust their learning pattern," he says. After reviewing previous research on metacognition in education, Atwood and Casselman set out to design a system to help chemistry students accurately estimate their performance and make adjustments as necessary.

Accurate estimation

In collaboration with Madra Learning, an online homework and learning assessment platform, Casselman and Atwood put together practice materials that would present a realistic test, and asked students to predict their scores on the practice test before taking it. They also implemented a feedback system that would identify the topics the students were struggling with so they could make a personal study plan.
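
The article doesn't give implementation details, but the core calibration check it describes (predict a score, take a realistic practice test, then see where the prediction and the per-topic results diverge) can be sketched in a few lines of Python. Everything below is illustrative only; the function name, topic names, and scoring layout are assumptions, not part of the Madra Learning platform or the study's actual feedback system.

```python
# A minimal sketch of the calibration idea described above: compare a student's
# predicted score with their actual practice-test result and flag weak topics.
# All names and data here are hypothetical, for illustration only.

def calibration_report(predicted_score, topic_results):
    """Return (actual score, prediction gap, topics sorted weakest-first).

    predicted_score: the student's self-predicted percentage (0-100).
    topic_results: dict mapping topic name -> (points earned, points possible).
    """
    earned = sum(e for e, _ in topic_results.values())
    possible = sum(p for _, p in topic_results.values())
    actual_score = 100.0 * earned / possible

    # A positive gap means overconfidence, the pattern Dunning and Kruger described.
    gap = predicted_score - actual_score

    # Rank topics by fraction of points earned so the weakest come first,
    # which is where a personal study plan would start.
    weakest_first = sorted(
        topic_results,
        key=lambda t: topic_results[t][0] / topic_results[t][1],
    )
    return actual_score, gap, weakest_first


if __name__ == "__main__":
    predicted = 85  # the student's estimate before the practice exam
    results = {     # hypothetical per-topic breakdown of a graded practice exam
        "stoichiometry": (18, 20),
        "thermochemistry": (9, 20),
        "gas laws": (14, 20),
    }
    actual, gap, weakest = calibration_report(predicted, results)
    print(f"Actual score: {actual:.0f}%; overestimated by {gap:.0f} points")
    print("Study plan should start with:", ", ".join(weakest[:2]))
```

Run on the sample data, the sketch reports a roughly 17-point overestimate and puts thermochemistry at the top of the study list, which is the kind of topic-level feedback the researchers say helps students recalibrate.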

After a few years of tweaking the feedback system, they added the element of weekly quizzes into the experimental metacognition training to provide students more frequent feedback. By the first midterm exam of the 2016 class, Casselman and Atwood could see that the experimental course section's scores were significantly higher than a control section's that did not receive metacognition training. "I was ecstatic!" Casselman says.

By the final exam, students' predictions of their scores were roughly accurate, or even a little low. Overall, the researchers report, students who learned metacognition skills scored around 4 percent higher on the final exam than their peers in the control section. But the strongest improvement was in the bottom quartile of students, who scored a full 10 percent better, on average, than the bottom quartile of the control section.

"This will take D and F students and turn them into C students," Atwood says. "We also see it taking higher-end C students and making them into B students. Higher-end B students become A students."

Atwood adds that the students took a nationally standardized test as their final exam, which means the researchers can compare the U students' performance with that of students nationwide. The bottom quartile of students at the U who received metacognition training scored in the 54th percentile. "So, our bottom students are now performing better than the national average," Atwood says.

"They're not going to be overpredicting their ability," Casselman says. "They're going to go in knowing exactly how well they're going to do and they will have prepared in the areas they knew they were weakest."

A cumulative effect

This study covered students in the first semester of general chemistry. Casselman has now expanded the study into the second semester, meaning some students have had no semesters of metacognition training, some have had one and some have had two. Preliminary analysis suggests that the training may have a cumulative effect across semesters.

"The students who are successful will ask themselves—what is this question asking me to do?" Atwood says. "How does that relate to what we're doing in class? Why are they giving me this question? If there's an equation, why does this equation work? That's the metacognitive part. If they will kick that in, they will see their grades go straight through the roof."

Both Atwood and Casselman say this principle is not limited to chemistry and could be applied throughout campus. It's a principle universally applicable to learning, and has been hinted at for centuries, including in a Confucian proverb:

"Real knowledge is to know the extent of one's ignorance."



More information: Brock L. Casselman et al, Improving General Chemistry Course Performance through Online Homework-Based Metacognitive Training, Journal of Chemical Education (2017). DOI: 10.1021/acs.jchemed.7b00298

Provided by University of Utah
Citation: Metacognition training boosts gen chem exam scores (2017, October 20) retrieved 17 June 2019 from https://phys.org/news/2017-10-metacognition-boosts-gen-chem-exam.html

User comments

Oct 20, 2017
I am sure that most people who have gone to college are aware that it's worth at least a letter grade improvement just learning exactly how to take a timed test.

Oct 20, 2017
That's not what is in question here. This is about misplaced confidence, not realising you're not as good as you thought until a wheel comes off.

Oct 21, 2017
That's not what is in question here. This is about misplaced confidence, not realising you're not as good as you thought until a wheel comes off.


That's exactly the point about taking timed tests: you have to be able to identify which questions you cannot answer and concentrate on the ones that you can in the time allotted.

Depending on the examiner, the test may also award you penalty points for wrong answers, but not for answers you didn't give, so the strategy is to skim through the list of questions and pick what you think you're able to answer. Choose all the easy ones, and then if you have any time left, try to tackle the more difficult ones.

If you just start going through the list in order, thinking you'll be able to answer all of them in time, you're either a fool or a savant. Hence, the improvement they see is not because the students got better at the subject and improved their focus, but because they got better at taking the test.

Oct 21, 2017
Eikka, I think I understand your contention about the strategy students should use to get through timed tests.

That brings up two questions to my mind. First, can we rely on good test scores as a reliable indication that the student is actually learning what they need?

In other words, a high-scoring engineer? How do we trust the safety margins on that bridge they built? How do we trust that ace-student brain surgeon is competent?

The second question is, what method would you suggest for monitoring whether or not the students are actually learning anything?

Oct 22, 2017
In other words, a high-scoring engineer? How do we trust the safety margins on that bridge they built? How do we trust that ace-student brain surgeon is competent?


By the fact that engineers just coming out of school aren't even allowed to draw plans for a bridge without someone higher up checking their calculations and approving the designs. In many places you don't even get to call yourself an engineer without extensive experience in the field.

can we rely on good test scores as a reliable indication that the student is actually learning what they need?


Simple answer to a simple question: no. There are always two ways to pass standard tests: understanding and rote memory. Almost every student employs both, falling back on one where the other fails.

Oct 23, 2017
Thanks E, for your reasonable explanations.

Great, now I'm going to have to worry how much the elder experts have learned during their careers.

Oct 23, 2017
It's a lesson in scholastic humility: You waltz into an exam, confident that you've got a good enough grip on the class material to swing an 80 percent or so, maybe a 90 if some of the questions go your way.

Then you get your results: 60 percent. Your grade and your stomach both sink. What went wrong?

Is this a serious question?
Listen, if you waltz into an exam without enough preparation that gives you a serious shot at 100% then that's what went wrong. You didn't study enough. Period.

Oct 24, 2017
Listen, if you waltz into an exam without enough preparation that gives you a serious shot at 100% then that's what went wrong. You didn't study enough. Period.


1) Prioritization. You can't do 100% on every subject, so you have to pick your battles.

2) The tests are designed so that few if any can get a 100%. If the test score gets saturated and too many people get straight A's, the difficulty is increased for the next class.

So, going back to point 1, trying to get 100% scores in your tests results in diminishing returns for your efforts and you end up spending way too much energy and time studying for one subject, burn yourself out, and then fail the rest. Again, if you try to score 100% on all your tests, you're either a fool or a savant, and chances are you're not a savant.

