Several of the course modules offered by the University of Liverpool’s Department of Mathematical Sciences have more than 500 students. Such large classes made it difficult for lecturers to provide timely and consistent feedback. Additionally, high marking loads reduced the time available for instructional strategies that would develop students’ higher-order thinking skills.
Dr. Joel Haddley, a lecturer in the Department of Mathematical Sciences and leader of the Mathematics Centre for Enhancement in Education (MathsCEE), saw automated, continuous online assessment as a solution. During the 2017-18 academic year, he rolled out the Möbius digital platform for three math courses. Two were his, and the other was a first-year linear algebra course.
Since the pedagogical implementation of the platform differed across the courses, Dr. Haddley was able to demonstrate Möbius’ flexibility.
For example, the lecturer of the linear algebra course wanted to improve students’ mathematical writing and included gap-filling assessment questions as scaffolding for that writing. After gaining that practice, virtually all the students in the course attempted proper mathematical writing in their final exam, a marked improvement over previous years.
Dr. Haddley used the platform to flip one course module. He made the change gradually, first using instructional videos as supplementary materials. “I was still lecturing a bit in class because I didn’t want to change too much too quickly,” he said.
The following year, Dr. Haddley fully flipped the course. Students prepared for weekly active learning sessions by viewing his videos and completing only diagnostic activities. Once a student had completed each week’s material, the associated summative assessments automatically unlocked. “The students are really engaged,” Dr. Haddley reported. “I’m getting more than 90% every week doing the preparation.”
One Möbius feature that Dr. Haddley finds invaluable is its capacity to generate random questions. Rather than randomly selecting questions from a fixed bank, as a typical virtual learning environment (VLE) tool does, Möbius uses mathematically generated random numbers to parameterize each question. “So, whereas a question bank might have 10 questions, our software could have infinitely many,” Dr. Haddley explained. The same random values feed into the solution, so each student receives a fully worked solution and feedback matched to their particular question.
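The idea can be illustrated with a short sketch (a hypothetical Python illustration, not Möbius’ actual implementation): the same random parameters drive both the question text and the worked solution, so the two always stay consistent, and a single template yields effectively unlimited question variants.

```python
import random

def generate_question(seed=None):
    """Generate one variant of a parameterized differentiation question,
    together with its worked solution.

    Hypothetical sketch only -- the question template, parameter ranges,
    and function name are illustrative assumptions, not Möbius' API.
    """
    rng = random.Random(seed)
    a = rng.randint(2, 9)  # random coefficient
    n = rng.randint(2, 5)  # random exponent

    # The random parameters appear in the question text...
    question = f"Differentiate f(x) = {a}x^{n} with respect to x."

    # ...and the very same values feed into the worked solution,
    # so the feedback always matches the variant the student saw.
    solution = f"f'(x) = {a * n}x^{n - 1}"

    return question, solution

# Each seed produces a different but internally consistent variant.
q, s = generate_question(seed=42)
```

Because the solution is computed from the same parameters as the question, every variant comes with correct, fully worked feedback for free, which is what makes the “infinitely many questions” approach practical at scale.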
The random questions have changed the way students help each other with formative assessments. When everyone received the same question, students asked each other for the answers; now they ask each other for the methods used to arrive at an answer.
In some implementations, students are given multiple chances to take the online summative assessments. Their correct answers can be “locked in” so that they don’t have to answer those questions again, adding a formative element that motivates students to try again. Students also receive automatic feedback, which enables them to diagnose where they need to improve. Finally, students don’t have to be on campus to take the online assessments; that’s now an “anytime, anywhere” part of their learning.
The cost and effort of deploying the digital platform paid off very quickly for the department. “We actually recouped the initial investment in the first year,” stated Dr. Haddley. Based on the number of assignments marked, and the time that staff and teaching assistants would have devoted to grading, he estimates the cost savings to be more than £50,000 a year. “And we’re investing that time back into the student experience in more valuable ways,” he stated.
In 2018, the university awarded Dr. Haddley a Learning and Teaching Fellowship in recognition of his innovative work with Möbius. The platform is now used in 20 modules, providing fully automated assessment for over 40,000 individual assignments a year.
Word about Möbius has spread beyond the mathematical sciences department. A geography course began using the platform in the 2019-20 academic year to give students instant answer-specific feedback on multiple-choice assessments, using Möbius’ ‘How did I do?’ feature.
For Dr. Haddley, one of the biggest rewards has been the capacity to focus on higher-order learning outcomes. “Because we’ve got such high student numbers, our focus had been on the low-hanging fruit,” he said. “Automating that frees up our time, and now there’s interesting work being done on how to really measure deep understanding and how to aim for higher-order learning outcomes with this tool.”