
Third Space Learning

Designing a report to show impact and improve retention

 

IN A NUTSHELL

Third Space Learning (TSL) needed to improve customer retention by proving to schools that their online tutoring was having a positive impact on pupils. I was tasked with developing a report that showed this progress.

Through customer research, multiple design iterations and user testing, I developed a report that met the needs of multiple stakeholders, applying information design techniques to turn data quickly into actionable insights. These insights proved instrumental in retention conversations.

 

THE FULL STORY

THE BRIEF

Third Space Learning (TSL), an online tutoring startup, were unable to demonstrate the impact their service was having on pupils. As a result, schools found it difficult to justify the cost and were not renewing. I was tasked with creating a report that demonstrated impact and supported the customer team in retention conversations.

“I can’t go to the head asking for 2k a term with just
‘I think this works’ ”

Customer feedback summarising the problem

 

 

RESEARCH

SURVEY

Demonstrating impact was challenging because we had no formal tests to give a before-and-after comparison. Without this data, I created a customer survey to gain insights into the different stakeholders' reporting needs and current behaviour.

The following themes emerged from analysing the survey results:

Needs
  1. Seeing impact through a pre- and post-test comparison
  2. Evidencing progress over time to Ofsted
  3. Highlighting where pupils struggled, to help class teachers be more effective in the classroom
  4. Giving SLT (senior leadership team) sight of progress, plus insights to inform decisions on which pupils to select
Existing behaviours
  1. Reports commonly used in termly pupil review meetings 
  2. Current session reports commonly discussed with pupils
  3. Discussed TSL with parents but didn’t have something suitable to share with them

 

DESIGN

SKETCHES

Based on the insights from the survey and a content audit of potential data sources, I sketched a framework for the report and explored different ways of visually encoding the data to make it quick to understand at a glance. This involved stripping back anything that wasn't communicating data, so the visuals wouldn't distract users, and choosing visual encodings that matched each data type.

Sketch of the overview page
A class-level overview comparing pupil progress would meet the needs of the school's SLT, who need to justify spend on the programme to governors and Ofsted. It could also help class teachers decide which pupils should potentially be replaced on the programme.
Sketch of the individual pupil view
An individual view focused on the detail for each pupil, including where they struggled and what they achieved in the context of the curriculum. This would be suitable for teachers, pupils and parents.
Sketches exploring different ways of representing progress
I explored different ways of representing a pupil's progress. Working with educational specialists on the team, I developed the sketch on the left, which communicates the shape of the pupil's progress: a steep gradient marks a week in which the pupil learnt a lot of new content (adjusted to take into account the age appropriateness of the lesson content). The alternative, which summarises how much new content was learnt, how much was reinforced, and how much was not taught or needs more work, is simpler to understand but doesn't tell the story of what's happening over time. (A rough data sketch of this weighting appears at the end of this section.)
Sketches exploring ways of showing age appropriateness
I explored different ways of communicating the age appropriateness of the lessons. Dedicating so many pixels to this info seemed inefficient, so I moved to an approach where the colour of the lesson reference number icon represents age appropriateness, and used the bar charts to represent what happened in the lesson.
Sketches exploring different ways of representing progress
I explored different ways of tying pupils' engagement to how much content they got through. I also thought it would be useful to tie a pupil's rating of how difficult they found a session to how useful they found it, so a teacher could easily spot if a pupil isn't finding lessons useful because they're too difficult or too easy.
Sketches exploring ways of presenting the lessons covered
I explored different ways of presenting info about the lessons covered, including listing them in chronological order vs. grouping them by the categories and topics they belong to. I anticipated the latter would be more useful, as it lets teachers easily assess how a pupil is doing in each topic area.
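To make the "shape of progress" idea concrete, here's a minimal data sketch. The field names and weights are made up for illustration, not TSL's real model: weekly counts of newly learnt curriculum steps are weighted by an age-appropriateness factor and accumulated, so a steep gradient marks a week of heavy, age-appropriate new learning.

# A rough sketch (made-up field names and weights, not TSL's real model)
# of the data behind the progress line.
from dataclasses import dataclass

@dataclass
class WeekRecord:
    week: int
    new_steps: int      # curriculum steps learnt for the first time that week
    age_weight: float   # assumed: 1.0 = age-appropriate, lower = below age level

def progress_points(weeks: list[WeekRecord]) -> list[tuple[int, float]]:
    """Return (week, cumulative weighted progress) points for the chart."""
    total = 0.0
    points = []
    for w in sorted(weeks, key=lambda r: r.week):
        total += w.new_steps * w.age_weight
        points.append((w.week, total))
    return points

# Week 2 would show as a steep gradient on the chart.
print(progress_points([WeekRecord(1, 2, 0.8), WeekRecord(2, 7, 1.0), WeekRecord(3, 1, 1.0)]))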

DIGITAL DESIGN

I then moved into Sketch to mock up the overview page and the individual report page based on the paper sketches.

Mockup of the overview report page, ready for user testing.

 

Mockup of the individual report page, ready for user testing.

 

USER TESTING

INITIAL ROUND

I took the design mockups and conducted user testing with the primary users of the report. The objective of this research was twofold:

1. Understand whether we were presenting the right content to meet the needs of our users, and check that it was at the right level of granularity to provide actionable insights

2. Check the effectiveness of the information design: was it easy to interpret and pull out the actionable insights?

 

Key learnings
Teachers felt that we had come up with a good way for them to understand pupils' progress (in the absence of a pre and post test), though the text explanation needed refinement to aid comprehension.
A summary is only useful if it provides specific, actionable insights. Highlighting gaps at lesson level also wasn't granular enough: teachers needed to know specifically where the pupil was stuck.
Users didn't interpret the orange and green as representing age appropriateness. Another key insight was that the reports needed to work in greyscale, so schools could print them cheaply in black and white to share with stakeholders and to put in pupils' physical progress folders.
In weekly reports, we reported whether a pupil was 'secure' in an area without stating whether they already knew it, though we did capture this data for internal purposes. I hypothesised that making this distinction visible would act as a warning flag to schools that intended to use us to plug learning gaps rather than reinforce what's covered in class, giving them an opportunity to change their approach to lesson selection. In the user testing, I trialled the terminology 'Reinforced existing knowledge' to describe this case, and it was received positively. This reassured the company that revealing these insights would not result in angry teachers calling to ask why we were teaching things a pupil already knew. I also learnt that teachers had a different understanding of what it meant for a pupil to be marked 'secure' in an area of maths, so this needed to be addressed.
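A minimal sketch of the rule behind that wording (the field names here are hypothetical): 'secure' alone hides whether the pupil arrived already knowing the content, so the label has to depend on both the before and after states.

# Hypothetical sketch of the labelling rule; field names are illustrative.
def describe_outcome(secure_after_lesson: bool, knew_before_lesson: bool) -> str:
    if secure_after_lesson and knew_before_lesson:
        return "Reinforced existing knowledge"  # the wording that tested well
    if secure_after_lesson:
        return "Learnt new content"
    return "Needs more work"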

MULTIPLE ITERATIONS

Through multiple design and user testing iterations, I addressed the issues raised, first focusing on displaying the right content and then on ensuring it was easy to comprehend.

Annotated mockup of the iterated report design

 

1-2. Experimented with colours, brightness and saturation to create a set with clear contrast, so the colours are easily distinguishable both in colour and when printed in black and white (a rough luminance check is sketched after this list).
3. Thickened the band representing difficulty, as the previous version was too subtle in black and white.
4. To give more specific feedback on gaps, I added a 'Needs work' feature which pulls in the first step within a lesson that the pupil struggled with. Displaying every item in a lesson that needed more work would be overwhelming and unnecessary, as these are accessible in the individual session reports. The dot draws attention to the item.
5. To highlight the successes, I also decided to pull in data on the furthest point the pupil got to in the lesson.
6. Previously, users were confused when the lesson reference number sat within a coloured circle, with the colour representing the age appropriateness of the lesson. I separated the two items visually so it's clearer that the colour (or the value of the grey when viewed in greyscale) represents age appropriateness.
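As a rough illustration of the greyscale check mentioned in points 1-2 (the hex values below are invented, not the final palette), the standard sRGB relative-luminance formula approximates the grey each colour prints as, which lets you flag pairs that would look the same in black and white.

# Illustrative greyscale check; hex values are invented, not the real palette.
def relative_luminance(hex_colour: str) -> float:
    def linearise(channel: int) -> float:
        c = channel / 255
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_colour[i:i + 2], 16) for i in (1, 3, 5))
    return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b)

palette = {"new content": "#2E7D32", "reinforced": "#FBC02D", "below age level": "#E65100"}
greys = sorted((relative_luminance(colour), name) for name, colour in palette.items())
for (g1, n1), (g2, n2) in zip(greys, greys[1:]):
    print(n1, "vs", n2, "->", "OK" if g2 - g1 >= 0.15 else "too close in greyscale")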

 

FINAL REPORT

As well as refining the report, I added functionality to help users complete reporting-related tasks as quickly as possible. This included filtering and ordering by Pupil Premium (a government grant given to schools to spend on improving outcomes for disadvantaged pupils), which allows SLT to quickly show Ofsted the progress of Pupil Premium children (a rough sketch of this ordering follows below). I also designed a download-PDF feature that lets teachers download the individual reports for multiple children in one go.
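A minimal sketch of that Pupil Premium behaviour, using a hypothetical data model rather than the production code: flagged pupils surface first, so SLT can pull up their progress for Ofsted quickly.

# Hypothetical data model; illustrates the filter/ordering, not production code.
from dataclasses import dataclass

@dataclass
class PupilRow:
    name: str
    pupil_premium: bool
    progress: float  # assumed aggregate score from the overview report

def order_overview(rows: list[PupilRow], premium_only: bool = False) -> list[PupilRow]:
    if premium_only:
        rows = [r for r in rows if r.pupil_premium]
    # Pupil Premium pupils first, then by progress within each group.
    return sorted(rows, key=lambda r: (not r.pupil_premium, -r.progress))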

 

Overview report
Mockup of the final overview report

 

Individual report
Mockup of the final individual report

POST-LIVE FEEDBACK

To get feedback on how the reports were being received, I armed the customer team with questions to ask when speaking to schools. The answers helped validate the need for functionality we hadn't been able to build in time, and showed that the report was meeting the needs of our users.

“Adam was really impressed. Liked being able to see the overview and thought the individual reports were really useful for being able to identify gaps and personalise learning in future sessions.”

Customer team feeding back comments from school

 

CONCLUSION

By understanding the needs of users and the data we had available, I designed a report that gave our schools actionable insights to help them get the best out of the programme and allowed them to better understand the impact it was having. These reports helped the customer team retain existing schools and have become a key part of our pitch to new schools.