How students' performance improved on Tassomai

At Tassomai, we are always analysing data to measure where the software is most effectively supporting learning. One of our internal metrics analyses how students’ understanding of difficult questions changes throughout their time on Tassomai; this is the statistic which we have published this year in our infographics.

Which questions did we look at?

In our analysis, we looked at how students’ accuracy on certain questions changed between when they started on the course and when they finished using Tassomai. The questions we chose to analyse were what we call “stretch questions” - high-difficulty questions that are frequently answered incorrectly.


When students start on Tassomai, they only see a few stretch questions - it is only after the initial calibration phase, when they have completed 10-15% of the course, that they begin to see stretch questions more regularly. This is because we only want students to unlock these questions when they have demonstrated a solid level of basal knowledge in each topic.

As they progress through the course, the number of stretch questions - and the difficulty of these questions - gradually increases. This is based on principles seen in game design: flow-state building and “Goldilocks-zone” differentiation. Students are given a level of challenge which is just right for them, and this is calculated and reassessed in every topic for each user. So when a student is rapidly improving in a topic, we increase the question difficulty more quickly, whereas the question difficulty is capped in topics where they are struggling. This ensures that students are always challenged, but never disheartened, keeping them engaged - while simultaneously scaffolding their learning, ensuring they always see relevant content.
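To make the idea concrete, here is a minimal sketch of per-topic difficulty adjustment along those lines. The function name, thresholds, and step sizes are all illustrative assumptions - this is not Tassomai’s actual algorithm, just the general principle of stepping difficulty up faster for improving students and capping it for struggling ones:

```python
def next_difficulty(current: float, recent_accuracy: float,
                    cap: float = 0.9, floor: float = 0.1) -> float:
    """Adjust a topic's question difficulty for one student.

    Illustrative only: thresholds and step sizes are assumptions,
    not Tassomai's real parameters.
    """
    if recent_accuracy > 0.8:
        step = 0.10   # rapidly improving: raise difficulty more quickly
    elif recent_accuracy > 0.6:
        step = 0.05   # steady progress: raise difficulty gently
    else:
        step = 0.0    # struggling: hold (cap) the difficulty
    # keep difficulty inside the allowed band
    return min(max(current + step, floor), cap)
```

Because the adjustment is reassessed per topic and per user, a student can be pushed hard in their strong topics while being consolidated in their weak ones at the same time.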

Because difficulty increases as students progress through Tassomai, we expected students to show a similar level of performance on stretch questions at the beginning and end of the course.

Which students did we look at?

We filtered out students who had not passed the initial calibration period - those who had completed less than 10% of the course. These students would have had too few answers to compare between the start and end of their time on Tassomai, so they could not be part of a meaningful analysis.

After removing these students, we looked at every answer given to stretch questions by the remaining students across all 500 UK schools that used Tassomai in the 18/19 academic year. We then compared each student’s first answer to each of these questions with their final answer, to see if there was a net change in accuracy. We calculated this on a per-student basis, and then averaged it for each school.
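The steps above - filter by calibration threshold, take first and final answers per question, average per student, then per school - can be sketched as a small pandas pipeline. The column names here are assumptions for illustration, not Tassomai’s actual schema:

```python
import pandas as pd

def per_school_change(answers: pd.DataFrame) -> pd.Series:
    """Average net change in stretch-question accuracy per school.

    Assumed columns (illustrative, not Tassomai's real schema):
    school, student, question, answered_at, correct (0/1),
    course_completed (fraction of course done, 0-1).
    """
    # keep only students past the calibration threshold (>= 10% completed)
    eligible = answers[answers["course_completed"] >= 0.10]
    # order answers in time so first()/last() pick the right attempts
    ordered = eligible.sort_values("answered_at")
    grouped = ordered.groupby(["school", "student", "question"])["correct"]
    # net change per question: final answer minus first answer
    change = grouped.last().astype(float) - grouped.first().astype(float)
    # average per student, then average those per school
    per_student = change.groupby(["school", "student"]).mean()
    return per_student.groupby("school").mean()
```

Averaging per student before averaging per school stops a handful of very active students from dominating a school’s figure.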

What were the results?

Inevitably, there is some noise at the per-student level - not every student improved compared to when they first answered stretch questions - but this is to be expected. Some students may have been trying harder when they started the course, or may have been focusing on their weakest topics in the run-up to the exam.

However, when we looked at the trends in the data, they told a compelling story: every school with a statistically significant number of students showed an improvement in performance.


The average increase in performance across all schools was 7.8%

As with all averages, some schools far exceeded it - around twenty schools’ students showed an increase in performance of over 15% - while others improved by less than the average; even in those cases, there was still an improvement significantly above what we had predicted.

What this tells us is that this type of regular, personalised practice with feedback - combined with the amazing efforts of teachers to embed Tassomai as homework and as part of their school’s culture - has paid off for students, who proved able to tackle harder material with equal or greater success.

There is more to this story, of course: Tassomai’s data has helped to inform targeted revision sessions and interventions, parents and students have been pushing hard for great results, and teachers have done incredible work in the classroom. We are under no illusions about who the heroes of the story are, but we’re very proud to have helped facilitate that hard work up and down the country, and are excited to see the results next year as we roll out our program to new year groups for more subjects in September.

If you’re interested in learning more about our data and research, please get in touch with us!
