
Designing a Dashboard Teachers Actually Use During Lessons

[Image: teacher looking at a colourful student progress dashboard on a laptop in a school]

Before we wrote the first line of dashboard code, we spent three weeks interviewing primary school teachers. Twenty-three interviews. We asked one question to start each session: "What would you want to know about your students during a maths lesson, if you could know anything?"

The answers were not what we expected. We had assumed teachers would want deep analytics — learning trajectories, percentile comparisons, achievement gaps by demographic group. They wanted something much simpler and much more immediate: which students are stuck right now.

The Core Insight: Dashboards Are Not Reports

The distinction that emerged from the interviews was sharp. A progress report is something a teacher reads after school, on a laptop, with time to think. A dashboard is something a teacher glances at while simultaneously managing 27 students, answering a question from a TA, and watching the clock to make sure the class doesn't run over into literacy time.

These are completely different use cases. A progress report can have eight columns, nested filters, and drill-down charts. A classroom dashboard that requires more than 3 seconds to read is useless — because a teacher who has to stop and parse data during a lesson won't use it. They'll handle the class on instinct instead and use the dashboard for nothing more than printing end-of-term reports.

This was the first major design pivot. We had planned a rich analytics view as the primary interface. We changed course to lead with a live session view — a single screen showing each student's current state in a format a teacher can scan in under 5 seconds.

What "Current State" Actually Means

We ran a second round of interviews to understand what teachers needed to know at a glance. The answers clustered into four things:

Who's actively working — Has a student stopped responding? Are they distracted, confused, or is their Chromebook frozen? Teachers need to spot idle students without walking the room.

Who's struggling — Not who got the last question wrong (that's too granular), but who's been wrong consistently in the past 5 minutes. A student who missed one question might have made a typo. A student who's missed four in a row probably needs intervention.

Who's finished and waiting — Students who complete a session segment quickly and sit idle are a classroom management problem. The teacher needs to know to assign extension work before the student becomes a distraction.

Whole-class pace — Is the class broadly where expected, or have half the students not reached the midpoint yet? This tells a teacher whether the lesson is on track and whether the session length was correctly estimated.
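The first three of those states can be derived mechanically from recent activity. Here is a minimal sketch of that classification, with hypothetical field names and illustrative thresholds (the 90-second idle cutoff is an assumption; the four-wrong-in-a-row streak comes from the heuristic above):

```typescript
// Hypothetical shape of the per-student telemetry the live view receives.
interface StudentActivity {
  secondsSinceLastResponse: number; // time since the student last answered
  recentAnswers: boolean[];         // correctness of answers in the last 5 minutes
  segmentComplete: boolean;         // finished the current session segment
  started: boolean;                 // has the student begun the session
}

type LiveState = "progressing" | "needs-attention" | "waiting" | "not-started";

// Thresholds are illustrative, not the product's actual tuning.
const IDLE_SECONDS = 90;
const WRONG_STREAK = 4; // "missed four in a row probably needs intervention"

function countTrailingWrong(answers: boolean[]): number {
  let n = 0;
  for (let i = answers.length - 1; i >= 0 && !answers[i]; i--) n++;
  return n;
}

function classify(s: StudentActivity): LiveState {
  if (!s.started) return "not-started";
  if (s.segmentComplete) return "waiting";
  if (
    s.secondsSinceLastResponse > IDLE_SECONDS ||
    countTrailingWrong(s.recentAnswers) >= WRONG_STREAK
  ) {
    return "needs-attention";
  }
  return "progressing";
}
```

Note that the streak check looks at consecutive wrong answers rather than a raw error rate, matching the distinction above between a one-off typo and a student who genuinely needs intervention.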

Everything else — detailed error analysis, historical trends, percentile standings — is important, but it belongs in a separate analytics view accessed after the lesson. Not during it.

The Colour System and Why We Changed It Twice

Our initial live session view used a standard green/amber/red traffic light system: green for making good progress, amber for slower than expected, red for appearing to struggle or idle.

The first problem: red/green colour blindness affects roughly 8% of males, which means that in a class with 15 boys, the odds are better than two in three that at least one person looking at the dashboard — a student, or the teacher — will struggle to distinguish the states. We switched to a shape-coded system (circle, triangle, square) alongside colour — but then teachers said the shapes were hard to parse quickly.

The second problem, which emerged from classroom observation rather than interviews: "red" carries a stigma in education. One teacher told us directly that she found herself reluctant to display the dashboard on the class projector (some teachers do this) because she didn't want students to see who was "in the red." The dashboard was becoming a social risk for students, not just a tool for teachers.

We moved to a neutral-language, icon-based system. Students are shown as: actively progressing (moving arrow), needs attention (pause symbol), completed and waiting (check mark), or session not started (clock). Warm orange for the pause symbol rather than red. This eliminated the stigma concern and passed informal accessibility checks with colour-blind colleagues.
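The final system amounts to a simple mapping from state to icon, label, and colour. The sketch below uses placeholder glyphs and hex values (the shipped icon set and palette may differ); the point is that no state maps to red or to judgemental language:

```typescript
type LiveState = "progressing" | "needs-attention" | "waiting" | "not-started";

// Illustrative presentation table — glyphs and colours are assumptions.
const PRESENTATION: Record<LiveState, { icon: string; label: string; colour: string }> = {
  "progressing":     { icon: "→", label: "Actively progressing",  colour: "#2e7d32" },
  "needs-attention": { icon: "⏸", label: "Needs attention",       colour: "#e67e22" }, // warm orange, not red
  "waiting":         { icon: "✓", label: "Completed and waiting", colour: "#1565c0" },
  "not-started":     { icon: "🕑", label: "Session not started",   colour: "#757575" },
};
```

Keeping the mapping in one table also makes the stigma question auditable: a reviewer can check every label and colour in one place.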

Reducing Clicks to Zero During a Lesson

The original dashboard required teachers to click on a student icon to see what they were struggling with. In user testing with classroom teachers, this interaction never happened during a live lesson. Teachers would notice an amber student, note it mentally, and investigate during the break. The click-to-expand design assumed a level of interaction that classroom conditions don't support.

We replaced the click with a hover tooltip: hover over a student icon for 400ms and a small overlay shows their current question, their last three answers (correct/incorrect only, no details), and a one-line flag if there's a persistent error pattern ("Consistently missing 7× facts"). No click required. No navigation away from the class view.

The 400ms delay was deliberate: it prevents the overlay from appearing on accidental cursor passes, which is a real problem when a teacher is scrolling quickly. We tested 200ms, 400ms, and 600ms delays. At 200ms the overlay felt jittery and appeared unintentionally. At 600ms teachers found it too slow. 400ms was the consensus sweet spot.
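The mechanism is a standard hover-intent pattern: start a timer on enter, cancel it on leave, and only show the overlay if the cursor dwells past the delay. A minimal framework-free sketch (element and callback names are hypothetical):

```typescript
// Minimal hover-intent helper. The 400ms default reflects the delay
// testing described above; everything else here is an assumption.
function hoverIntent(
  el: { addEventListener(type: string, fn: () => void): void },
  show: () => void,
  hide: () => void,
  delayMs = 400,
): void {
  let timer: ReturnType<typeof setTimeout> | undefined;

  el.addEventListener("mouseenter", () => {
    // Show only if the cursor is still here after the delay, filtering
    // out accidental passes while the teacher scrolls the class view.
    timer = setTimeout(show, delayMs);
  });

  el.addEventListener("mouseleave", () => {
    if (timer !== undefined) clearTimeout(timer);
    hide();
  });
}
```

Because the timer is cancelled on `mouseleave`, a quick pass over a student icon never triggers the overlay at all — the jitter we saw at 200ms is a symptom of the delay being shorter than a typical accidental pass.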

Post-Lesson Analytics: What Teachers Actually Do With Data

Beyond the live view, we needed to understand how teachers use data outside lessons. The interviews produced a consistent finding: teachers check student data most often during planning periods (between lessons), during pupil progress meetings, and when preparing parent consultation notes. The median time spent on the analytics view per teacher per week was 11 minutes, based on session logging in the pilot.

This means the analytics view needs to load fast and surface actionable information quickly — not comprehensive data that requires extensive exploration. Our design principle for the analytics view: every screen should answer a specific question that a teacher actually asks, not just present data that might be interesting.

The questions teachers ask: Which students have made the least progress this week? Which topics does my class as a whole perform worst on? Who is ready to move on to the next unit? Which students might need a parent conversation about their maths progress?

We built the analytics view around those four questions specifically. Each question has a dedicated view that answers it directly, rather than presenting a data table and letting the teacher derive the answer themselves. This is a different design philosophy from most analytics dashboards — "answer the question" rather than "show the data" — and it consistently gets positive feedback from teachers who've used data-heavy platforms before.
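The "answer the question" philosophy shows up directly in the code: each view is a small function that returns the answer, not a data table. A sketch for the first question, under an assumed per-student record of topics mastered (field names are hypothetical):

```typescript
// Hypothetical weekly progress record per student.
interface WeeklyRecord {
  name: string;
  masteredAtWeekStart: number; // topics mastered at the start of the week
  masteredNow: number;         // topics mastered now
}

// Answers "which students have made the least progress this week?"
// directly: a short ranked list, least progress first.
function leastProgressThisWeek(records: WeeklyRecord[], limit = 5): string[] {
  const progress = (r: WeeklyRecord) => r.masteredNow - r.masteredAtWeekStart;
  return [...records]
    .sort((a, b) => progress(a) - progress(b))
    .slice(0, limit)
    .map(r => r.name);
}
```

The teacher-facing screen renders that list and nothing else; anyone who wants the underlying numbers can drill into the raw data separately.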

What We Deliberately Left Out

Teacher interview data is as useful for deciding what not to build as for deciding what to build. Several features we had considered implementing were dropped based on explicit teacher feedback.

Comparison to other classes or schools: Every teacher we interviewed said they did not want to see how their class compared to others. The concern was twofold — it creates competitive pressure based on factors outside their control (class composition, prior attainment levels), and it can be misleading without context. We don't show benchmark comparisons unless a teacher specifically asks for them.

Predictive "at risk" flags: Our data science team proposed an algorithm that would flag students as "at risk of falling below national expectations" based on current trajectory. Teachers rejected the idea in testing. The phrase "at risk" was felt to be stigmatising for students, and several teachers said that a prediction made by software would feel presumptuous without the teacher's own contextual knowledge. We don't surface predictions — we surface current performance data and let teachers draw their own conclusions.

Parent-facing views: Some ed-tech platforms allow parents to see real-time student progress. We don't offer this. The decision was based on teacher feedback — primary teachers wanted to mediate and contextualise data before it reached parents, not have parents receiving raw performance data without any framing. This is a deliberate product choice, not an omission we're planning to fix.

The Ongoing Design Process

Dashboard design is not a one-time problem. We run quarterly teacher feedback sessions with a rotating group of 8–12 pilot school teachers. These sessions have produced meaningful changes in every cycle — from the colour system changes described above to a new bulk-export feature for pupil progress meeting prep that wasn't in our original roadmap at all.

The single most consistent finding across every session: teachers want less, not more. Every feature request is a request to simplify, clarify, or remove something that creates friction. The hardest discipline in product design for education is restraint — the willingness to not build something you could build, because it would make the product harder for the people who actually use it during a lesson.