
Oct. 1, 2013

$5.7 million NSF grants seek better ways to assess STEM learning

Michigan State University has received two National Science Foundation grants totaling $5.7 million, funds that will support the development of computer software to analyze student writing in science and engineering classes.

The goal is to help retain more students who are enrolled in the so-called STEM disciplines – science, technology, engineering and mathematics.

The two grants include a five-year, $5 million award to develop a website where student exam answers can be analyzed, and a three-year, $718,000 award to help instructors use the software.

In typical large-enrollment STEM courses, multiple-choice exams are given because they are easy to score by computer. But multiple-choice exams hide much of what students actually understand.

The grants will allow a team of MSU researchers to develop computerized tools that analyze students’ written responses to homework, quiz and test questions and predict how an instructor would assess them.

“Students answering questions in their own words is the most meaningful way for instructors to identify learning obstacles,” said Mark Urban-Lurain, principal investigator for the $5 million grant and associate director of MSU’s Center for Engineering Education Research. “The realities of typical large-enrollment undergraduate classes, however, restrict the options that faculty members have for evaluating students' writing.”

“When students express what they know in their own words, it is a deeper and richer view of what students know and have learned,” said John Merrill, principal investigator of the $718,000 grant and director of MSU’s Biological Sciences Program. “It’s a more interactive form of learning and teaching.”

The computer software is based on programs from the business world that analyze surveys. It picks out the words and phrases in students’ writing that reveal how they understand the course material, and it can analyze responses several sentences long.
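As a rough illustration of that kind of analysis, and not the project’s own code, a few lines of Python using the scikit-learn library can pick out the most distinctive words and phrases in short written answers; the sample responses here are invented.

    # A minimal, hypothetical sketch of pulling out the words and phrases
    # in short written answers that carry the most signal. This is not the
    # MSU software; the sample answers are invented.
    from sklearn.feature_extraction.text import TfidfVectorizer

    student_answers = [
        "The cell membrane controls what enters and leaves the cell.",
        "Energy is released when ATP is broken down into ADP and phosphate.",
        "Osmosis moves water across the membrane toward higher solute concentration.",
    ]

    # Weight single words and two-word phrases by how distinctive they are
    # within this small set of answers.
    vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
    matrix = vectorizer.fit_transform(student_answers).toarray()
    terms = vectorizer.get_feature_names_out()

    for weights, answer in zip(matrix, student_answers):
        top = sorted(zip(weights, terms), reverse=True)[:3]
        print(answer)
        print("  key phrases:", [term for weight, term in top if weight > 0])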

A major part of the funding is targeted for completion and public rollout of a fully automated website where instructors around the world can have their students’ open-response answers analyzed automatically.

Merrill said improvements to the current constructed-response assessments would reveal more about student misconceptions, allowing faculty members to make small corrections throughout the semester.

“The software can analyze text and determine correctness,” he said. “But we have to train the computer to do that.”
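A hypothetical sketch of that training step, again not the project’s actual software: fit a text classifier on answers an instructor has already scored, then use it to predict scores for new answers (scikit-learn, with invented examples).

    # A minimal, hypothetical sketch of "training the computer": learn from
    # instructor-scored answers, then predict scores for new ones.
    # Not the MSU software; the answers and labels are invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    scored_answers = [
        "ATP stores energy in its phosphate bonds",           # scored acceptable
        "plants eat sunlight to make food",                   # scored not acceptable
        "hydrolysis of ATP releases usable energy",           # scored acceptable
        "energy comes straight from the mitochondria jelly",  # scored not acceptable
    ]
    labels = [1, 0, 1, 0]  # 1 = acceptable, 0 = not acceptable

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(scored_answers, labels)

    # Predict how an instructor would likely score a new, unseen answer.
    print(model.predict(["breaking ATP's phosphate bond releases energy"]))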

Other collaborators on the project include faculty members at the University of Colorado at Boulder, the State University of New York at Stony Brook, the University of Maine at Orono, the University of Georgia, the University of South Florida and Western Michigan University.

 

By: Tom Oswald