Pandemic Sends Students Struggling With Math Online for Help From UK Research Team

Dr. Brian Bottge

Fayette County Public Schools students began earning credit at home through the state’s non-traditional instruction (NTI) program this week, in an effort to help curb the spread of the coronavirus. Researchers at the University of Kentucky College of Education are using video-conferencing technology to connect with students from a Lexington middle school as they earn credit at home.

When schools began announcing closures due to COVID-19, Dr. Brian Bottge and his colleagues were mid-way through a study to help students who struggle to learn mathematics. To finish, they would need to stay connected with their partnering teacher and students.

Project manager Linda Gassaway, along with doctoral student Megan Jones, worked with the middle school teacher (the teacher is not identified to protect student privacy) to transform the project into something that could be delivered virtually.

Bottge’s research focuses on adapting math lessons to reach learners with disabilities. In his current study, supported by an Institute of Education Sciences supplemental grant, Bottge, Gassaway, and Jones are taking a closer look at how tests can be designed to provide a better representation of what students with disabilities have learned.

“Students with disabilities often do not perform well on traditional tests, but that does not mean they don’t know how to do the mathematics behind the test questions. It is deep inside them and we need to find better ways to get it out of them. If we can uncover what they have learned and have them display it for us, it can lead to more success for these students,” said Bottge, who is the William T. Bryan Endowed Chair in Special Education at the UK College of Education.

Project manager Linda Gassaway, far left, supervises an Enhanced Anchored Instruction lesson involving hovercrafts (file photo).

The need for the new assessment surfaced during two randomized studies supported by a four-year Institute of Education Sciences grant. Based on more than 100 classroom observations, test results, and interviews, the findings suggested that traditional written tests fall short of capturing the problem-solving abilities of students with math disabilities. Under Bottge’s leadership, the research team is working to identify newer forms of assessment that more adequately display students’ problem-solving skills.

Bottge’s initial research focused on designing effective instructional materials. That work resulted in creating and analyzing lessons following a model known as Enhanced Anchored Instruction. The method is based on Anchored Instruction, a learning approach developed at Vanderbilt University. The project-based lessons are designed specifically for adolescents with learning disabilities, but have been effective with students at all achievement levels.

In the video-based unit, students are first tasked with helping the students in a video solve a multi-component problem, such as figuring out where on a ramp to release a model car so it can navigate tricks at the bottom of the ramp. In the hands-on unit, students test their own cars on a large ramp like the one shown in the video. Both units are designed to help students develop an informal understanding of pre-algebraic concepts such as linear functions, line of best fit, variables, rate of change (slope), reliability, and measurement error. The students must then transfer what they learned during the project-based activity to answer questions on a written test.

Doctoral student Megan Jones (standing at left) works with students (file photo).

“It’s probably a shock to students to sit down and do a written test, since it is so different than how they learned the material,” Bottge said. “We see the students picking up the math concepts while working on their projects. On written tests, they perform better than if they had not done the project. However, I don’t think the written test captured how well they understood the problems.”

All along, the research team has been trying to figure out what kinds of errors students make when they arrive at an incorrect answer. They often have to judge whether the student pulled the answer out of thin air or whether there was an attempt at reasoning behind it. However, they felt their error analysis could only get them so far. Now, they are focusing on finding a method for better evaluating students’ learning.

“A lot of students have difficulty solving problems, especially students with disabilities, for a couple reasons,” Bottge said. “One has to do with the disability itself, because a lot of students with disabilities have learning disabilities. But the other issue is how we have taught these students in the past. We have taught them with difficult word problems and they don’t understand because a lot of them have difficulty reading as well.”

The researchers note that students often attend to the wrong stimuli on the test paper. While the word problem may ask them to calculate the miles per hour of a car traveling a certain distance, students may become distracted by the illustrations paired with the question. Rather than performing the calculation, they may note the color of the car and the number of clouds and trees in the drawing because they struggle to filter out extraneous information.

Students build car ramps to perform hands-on portion of Enhanced Anchored Instruction (file photo).

To take a closer look at where students’ test errors occur, Gassaway, the project manager, has been meeting one-on-one with students in the days following their written test. She engages them in a discussion about what the mathematics problems on the test are asking them to solve.

“We sit across from each student with an audio recorder and explain why we are doing this,” Gassaway said. “We ask if they have ever experienced a time when they felt they did well on the lesson, but then the test came and they didn’t do well. That seems to resonate with them.”

Students are given a new copy of the test and are asked to read the questions out loud. Gassaway looks for places where the students’ understanding of the questions could have affected their answers. They may read certain words incorrectly or fail to pause at a comma or period, stumbling through the question. She makes notes and asks them to look at the questions again.

“The intent is for them to start figuring out the problems on their own,” Gassaway said. “I ask a few questions to help pull out the info they need. I meet them where they are stuck, teach them a bit, and see how they respond, walking through the whole problem. Getting inside kids’ brains is enlightening. I have been teaching since 1986 and have seen lots of stuff, but I am still having ‘aha’ moments.”

Now that the project has moved online, Gassaway will talk with students using the Zoom videoconferencing platform.

Following the individual session with Gassaway, students take another written test on their own. If they make mistakes on the second test, the hope is that they are more educated mistakes, rather than a guess. The audio recordings of the sessions with Gassaway are later analyzed by Jones, the doctoral student working on the project.

The researchers admit it would be impossible for teachers to test every child in the U.S. with the same level of one-on-one focus. However, they know if they can get more information about what students are thinking when they answer questions incorrectly, they can start to see patterns that will be useful for influencing how lessons are taught and measured to better meet students’ needs. Ultimately, they will be designing a hybrid form of assessment that combines an oral questioning component with a technology-based platform.

“We have really great teachers. We know the students are understanding a lot of the things they are teaching. Written tests do not draw out of students what they know, which is unfair to the students and unfair to the teachers. We make a lot of judgments based on test scores, and if they are not measuring what they should be measuring, then we are making the wrong assumptions about some of these kids,” Bottge said.

To learn more about research studies and degree programs in the UK College of Education Department of Early Childhood, Special Education, and Counselor Education, visit https://education.uky.edu/edsrc/.