
The 2014 National Curriculum Statutory Programme of Study for mathematics covers Years 1 through 6. The document includes around 190 specific objectives across six year groups, organised into topics including number and place value, addition and subtraction, multiplication and division, fractions, measurement, geometry, and statistics. Most of those topics recur across multiple year groups, with the objectives building in complexity across the key stages.
Our current lesson library has 412 items. Each item needs to be correctly mapped to one or more curriculum objectives, tagged with the correct year group and topic, and reviewed whenever the curriculum guidance is updated or our understanding of the objective is refined. Doing this well is less glamorous than building adaptive algorithms or designing teacher dashboards — but it's the foundation on which everything else rests.
Why Curriculum Alignment Actually Matters (and Why It's Harder Than It Looks)
Schools adopt Everybody Counts partly on the basis that our content is curriculum-aligned. When a maths lead tells her head that the platform maps to the National Curriculum, she's making a claim that needs to be true. If a teacher assigns a "Year 4 multiplication" session and the questions include content that isn't taught until Year 5 — or that duplicates Year 3 content the students have already mastered — the teacher's confidence in the platform erodes quickly.
The curriculum alignment problem has two parts: initial tagging (making sure every item in the library is correctly attributed on first entry) and ongoing maintenance (ensuring tags remain accurate as our understanding evolves and as items are revised). Both require a process, not just good intentions.
The initial tagging is harder than it appears because National Curriculum objectives are written at a level of generality that doesn't always map cleanly to specific question types. Take this Year 4 objective: "recall multiplication and division facts for multiplication tables up to 12 × 12." This is a single objective, but it covers 144 distinct multiplication facts and their corresponding division facts. A question about 7 × 8 is trivially attributable to this objective. A question about missing factors (__ × 6 = 42) is also attributable to it — but it's testing a different cognitive operation that some teachers treat as Year 5 content and others as Year 4.
Curriculum documents don't resolve this level of ambiguity. Experienced curriculum teachers do — which is why our curriculum review process involves primary teachers, not just our internal team.
The Tagging Taxonomy We Use
We use a four-level tagging taxonomy for every library item:
Year group: The year group for which this content is primarily intended. Some items are tagged as appropriate for two year groups (e.g., Year 3/4 bridging content). Items tagged for multiple year groups appear in both year groups' default sequences.
National Curriculum strand: The broad topic area — number, measurement, geometry, statistics. For the current library, 83% of items fall within the number strand (consistent with the NC's strong emphasis on number across KS1 and KS2).
National Curriculum objective: The specific objective from the statutory guidance. This is a direct reference to the document section, not a paraphrase. Every library item links to the exact text of the objective it addresses.
Cognitive level: Our own internal categorisation of what the item asks the student to do — recall (retrieve a known fact), apply (use a procedure on a new case), or reason (identify a pattern or justify a claim). This isn't part of the NC but it's important for our adaptive algorithm's decision-making: a student who recalls 7 × 8 correctly doesn't necessarily need another recall item; they may be ready for an apply-level task.
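As a minimal sketch of what this four-level taxonomy looks like as a data structure (the class and field names here are invented for illustration, not our production schema):

```python
from dataclasses import dataclass
from enum import Enum

class Strand(Enum):
    NUMBER = "number"
    MEASUREMENT = "measurement"
    GEOMETRY = "geometry"
    STATISTICS = "statistics"

class CognitiveLevel(Enum):
    RECALL = "recall"   # retrieve a known fact
    APPLY = "apply"     # use a procedure on a new case
    REASON = "reason"   # identify a pattern or justify a claim

@dataclass
class ItemTags:
    year_groups: list[int]        # usually one; two for bridging content (e.g. Year 3/4)
    strand: Strand
    nc_objective_ref: str         # reference to the statutory document section (placeholder format)
    cognitive_level: CognitiveLevel

    def __post_init__(self):
        # Items are tagged for one or, for bridging content, two year groups
        if not 1 <= len(self.year_groups) <= 2:
            raise ValueError("items carry one or two year group tags")
        if any(y not in range(1, 7) for y in self.year_groups):
            raise ValueError("year groups must be in the range 1-6")
```

An item tagged for two year groups (the bridging case) simply lists both, which is what makes it appear in both year groups' default sequences.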
The Initial Tagging Process
New items enter the library through a three-stage process. First, they're drafted by our curriculum lead (currently a half-time role held by a former KS2 teacher with seven years of primary experience). Second, they're reviewed for curriculum accuracy by one of three external curriculum reviewers — primary teachers or former teachers who do quarterly reviews under a paid advisory arrangement. Third, they're tagged in our content management system using the taxonomy above, with the tagging done by the drafter but verified by the reviewer.
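In the content management system, those three stages can be modelled as a simple linear workflow; this is an illustrative sketch (the status names are invented for the example), not our actual CMS code:

```python
from enum import Enum

class ItemStatus(Enum):
    DRAFTED = "drafted"                          # stage 1: drafted by the curriculum lead
    REVIEWED = "reviewed"                        # stage 2: checked by an external curriculum reviewer
    TAGGED_AND_VERIFIED = "tagged_and_verified"  # stage 3: tagged by the drafter, verified by the reviewer

# Allowed transitions: the process is strictly linear, with no skipping
NEXT_STAGE = {
    ItemStatus.DRAFTED: ItemStatus.REVIEWED,
    ItemStatus.REVIEWED: ItemStatus.TAGGED_AND_VERIFIED,
}

def advance(status: ItemStatus) -> ItemStatus:
    """Move an item to the next stage; raise if it has already completed the process."""
    if status not in NEXT_STAGE:
        raise ValueError(f"{status.value} is the final stage")
    return NEXT_STAGE[status]
```

The point of enforcing the linear order is that no item reaches the library with tags that only one person has looked at.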
The reviewer's job is not just to confirm that an item is factually correct (that's necessary but insufficient). The reviewer assesses: Is the item's difficulty level consistent with its year group tag? Is the language used appropriate for that year group? Does the item test what the curriculum objective actually requires, or does it test a proxy that's easier to assess but not what the objective is about?
That last question is important and often produces debate. "Recall multiplication facts up to 12 × 12" could be tested with a straightforward "what is 8 × 9?" question. But the NC's intention, according to the non-statutory guidance that accompanies the statutory requirements, is that students can apply these facts flexibly — not just retrieve them in isolation. An item that presents the fact in a different form (e.g., "a rectangle has an area of 72 cm². If one side is 8 cm, what is the other side?") tests closer to the intended objective, but is more complex and therefore harder to target appropriately with the adaptive algorithm. These tensions are real, and we resolve them through discussion rather than by formula.
Maintenance: Keeping Tags Accurate Over Time
Tags become stale for several reasons. Our understanding of how a particular item performs (based on IRT calibration data) sometimes reveals that it's much harder or easier than its year group tag suggests — which prompts a review of whether the tag is correct or whether the item needs revision. Teacher feedback occasionally identifies items that are technically curriculum-aligned but address a sub-objective that teachers typically cover in a different year group than our tag indicates (reflecting local variation in scheme-of-work sequencing). And occasionally we update items to improve clarity, which may affect their alignment.
We run a quarterly maintenance review covering items flagged by three triggers: IRT calibration results showing a difficulty-to-year-group mismatch, teacher feedback logged against a specific item, and a systematic calendar review (every item in the library is reviewed at least once per academic year regardless of other triggers).
The quarterly review is done by the curriculum lead and at least one external reviewer. Items that need revision go back to the drafting stage; items with correct content but potentially miscalibrated tags are re-reviewed and either retagged or retained with a note in the item record explaining the decision.
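The three triggers above are mechanical enough to sketch in code. This is a simplified illustration (field names and the one-year threshold are stand-ins for the real pipeline, and the IRT mismatch check is reduced to a suggested-year-group comparison):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class LibraryItem:
    item_id: str
    tagged_year_group: int
    irt_suggested_year_group: int   # hypothetical: year group implied by calibrated difficulty
    teacher_feedback_count: int     # feedback reports logged against this specific item
    last_reviewed: date

def flag_for_quarterly_review(items, today):
    """Return the ids of items matching any of the three review triggers."""
    flagged = set()
    for item in items:
        # Trigger 1: IRT calibration shows a difficulty-to-year-group mismatch
        if item.irt_suggested_year_group != item.tagged_year_group:
            flagged.add(item.item_id)
        # Trigger 2: teacher feedback has been logged against the item
        if item.teacher_feedback_count > 0:
            flagged.add(item.item_id)
        # Trigger 3: calendar review - not reviewed within the past year
        if (today - item.last_reviewed).days > 365:
            flagged.add(item.item_id)
    return flagged
```

An item can match more than one trigger; it is flagged once either way, which is all the quarterly review needs.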
When the Curriculum Changes
The National Curriculum has been under review since 2023, with the DfE's Expert Panel on Curriculum and Assessment publishing its final report in 2024. Depending on the outcome of that review, there may be meaningful changes to the primary maths curriculum requirements, which would require a full review of our item tags against the new document.
We've built our tagging system with this possibility in mind. Every item's NC objective reference is stored as a structured data field pointing to the current version of the document. When a new curriculum is published, we can generate a report showing every item tagged to objectives that have changed — which gives us a systematic starting point for the required review, rather than having to audit all 400+ items manually from scratch.
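The report itself is essentially a diff of item tags against the two versions of the objective list. A minimal sketch, assuming objective references and their statutory text are held as simple mappings (the reference codes here are invented placeholders):

```python
def changed_objective_report(item_objectives, old_objectives, new_objectives):
    """
    item_objectives: mapping of item_id -> tagged objective reference
    old_objectives / new_objectives: mapping of objective reference -> statutory text
    Returns (item_id, reference, reason) tuples for every item whose tagged
    objective was removed or reworded in the new curriculum.
    """
    affected = []
    for item_id, ref in item_objectives.items():
        if ref not in new_objectives:
            affected.append((item_id, ref, "removed"))
        elif new_objectives[ref] != old_objectives.get(ref):
            affected.append((item_id, ref, "reworded"))
    return affected
```

Items tagged to unchanged objectives never appear in the report, which is what turns a full manual audit of 400+ items into a targeted review of only the affected ones.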
We plan to re-run that review once the new National Curriculum is confirmed, and to implement any required retagging before the academic year in which the new curriculum comes into effect. Schools that rely on our curriculum alignment claims deserve to know this is actively maintained — not set up once and left.
The Teacher's Perspective on This Work
We've had teachers tell us they appreciate that we can point them to the specific curriculum objective a session covers. This comes up most often when a teacher is explaining to a headteacher or a parent why a particular topic is being practised — having a specific reference to statutory guidance is more useful than "the platform says it's Year 4 content."
We've also had teachers find errors in our tagging — items they felt were incorrectly attributed, either too hard or too easy for their year group. These reports go straight into the maintenance queue. A teacher noticing a tagging issue and reporting it is exactly how the process should work: our external reviewers are good but they can't match the ground-level knowledge of a teacher who sees 28 students work through a specific question set in a live lesson.
The curriculum alignment work is the invisible infrastructure of the platform. It doesn't appear in demos. It doesn't generate press coverage. It's the reason a teacher can trust that when she assigns a "Year 4 multiplication" session, she's assigning content that her Year 4 students are actually supposed to be working on — which is the minimum standard the product needs to meet to be genuinely useful rather than just technically functional.