Cling to the main vine, not the
loose one.
Kei hopu tōu ringa ki
te aka tāepa, engari kia mau ki te aka matua
Thoughts on Teaching and
Learning of Mathematics
Measuring and Monitoring #13
At the start of each chapter [or page] are a couple of key
outcomes I see as useful and achievable.
- to increase your understanding of getting good quality data
to improve learning
- to learn new efficient ways of looking at data
Introduction.
Data, data, data. My experience of 40 years in education suggests
that mathematics teachers, especially, collect test data every few
weeks, examination data once or twice a year, e-AssTTLe data, PAT
data, spot test data and then pretty much do nothing with it. The
mark book, paper or electronic, is full of attendance data and
marks out of 10, 20, 100 and 37 or some other strange total. A pig
does not get heavier by weighing it more often! (Thanks Dave).
There is a lot of data not collected as well. Student voice data,
journal data from students and from the teacher's journal,
observation data from another teacher, informal data from early
lesson starters and late lesson "try this" or "next time!" data.
Less is better when it comes to measuring and monitoring a class
full of emotions, attitudes, randomness, choices and events
completely outside the teacher's control, and usually outside the
teacher's knowledge as well.
A 2022 note. The new changes in NCEA and Numeracy expected in
2023/4+ bring attention to this page. It will be essential for all
Y7 to 11 teachers and Math HODs to get a dependable measure of where
students are in relation to the NZC. The new online Numeracy
assessment will credit students with a pass if they are at or
above NZC Level 4.5, so quite multiplicative. I have been harping
on about this essential measure for 20 years! Read this page
carefully. Over time!
Hence Lesson #1
If you do not intend to use the data, do not go to all that
trouble and waste students' learning time doing a test.
What data do you need?
In my project schools I suggest we collect 4 to 6 items of data
each year per student. It is nice to have a math measure from the
previous year, one from each term, and an outside check; total 6.
From these I can create cohort, class and individual profiles and
report accordingly. Learning is haphazard and random, and any
single measure has an MOE (Margin of Error) that confounds making
sense of one test. Deep learning takes time. All the
little Unit Tests that teachers persist in giving students, the
daily tests and the weekly tests are all useless to an HOD who is
trying to see the big picture. The variation is wicked in these
tests and the reliability pretty close to zero. I only want the
over time data using a consistent measure.
Hence Lesson #2
Quality over-time data, one measure per term, reduces the variation
and increases the reliability of sense making.
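Why does one consistent measure per term beat a pile of unit tests? If independent measures of the same student each carry a similar margin of error, the MOE of their mean shrinks by a factor of the square root of the number of measures. A minimal Python sketch (the half-level MOE is an assumed illustrative figure, not a researched value):

```python
import math

def moe_of_mean(single_moe: float, n: int) -> float:
    """MOE of the mean of n independent measures of one student.
    Independent errors shrink the MOE by a factor of sqrt(n)."""
    return single_moe / math.sqrt(n)

SINGLE_TEST_MOE = 0.5  # assumed: one test is only good to +/- half a level

print(moe_of_mean(SINGLE_TEST_MOE, 1))  # 0.5  - one unit test
print(moe_of_mean(SINGLE_TEST_MOE, 4))  # 0.25 - mean of four term measures
```

Averaging four term measures halves the uncertainty of any single test, which is the statistical reason behind Lesson #2.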
How is quality data measured?
Quality data is about keeping variation to a minimum and is a hard
learned lesson. I suggest using one test, or style of test, that
can assess across multiple NZC Levels. NZCER has PAT, the Ministry
has e-AssTTLe, teachers can use the PACT Tool and the Learning
Progression Framework, past tests if you are good at writing such
things, and there is always an OTJ or Overall Teacher Judgement
where the teacher compares several items of student work with NZC
expectations.
I use what I call the LOMAS test. It
was developed around 2007 by Dr Peter Hughes and Dr Grigor Lomas.
All tested and researched. The questions are carefully constructed
and are concerned with the thinking strategies used by a student
with the knowledge he or she has in the Number Strand of the NZC
Levels 1 to 5. There are four parallel tests and I might have a go
at writing another one in the near future. See the link above for
more detail and the actual test components. The test is efficient,
taking only 20 minutes or so for a student to complete, and less
than that to mark the whole class and enter the results into a
prepared spreadsheet. It is possible to have a complete update in one day
for a cohort and many hundreds of students. The LOMAS test does
not have a reading issue nor a writing issue. No calculators are
allowed, and the student can get their previous term's answer sheet
back to check or improve answers. I have used the same test with
the same group of students period by period, day by day and week by
week, and always found the same measured results. Learning can
change over a term, so an increase is expected.
The data I assist HODs and teachers to collect is
- a measure for the previous year
- a LOMAS test result for each of 4 terms
- an e-AssTTLe result.
How is quality data analysed?
Here is an example of a data record for a Year 9 student.
Surname   First Name   Y8   LT1   LT2   LT3   LT4   eA4
(name)    (name)        3     3     4     4     4     4
LT1 = Lomas Test Term 1. eA4
is the e-AssTTLe test in Term 4. "e" means online.
This student made a strong improvement from NZC L3 to NZC L4.
Typical improvement is half a curriculum level in one year, but I
am now often seeing 100% acceleration, that is a full NZC level,
and often better.
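The record above can be modelled directly. A minimal sketch (the field names are my own shorthand for the table headings, and the half-level baseline is the typical yearly improvement just mentioned):

```python
# One Year 9 record: end-of-Y8 level, four Lomas terms, Term 4 e-AssTTLe.
# Field names are shorthand for the table headings above.
record = {"Y8": 3, "LT1": 3, "LT2": 4, "LT3": 4, "LT4": 4, "eA4": 4}

TYPICAL_YEARLY_GAIN = 0.5  # half a curriculum level is typical

gain = record["LT4"] - record["Y8"]                    # NZC levels gained
acceleration = (gain / TYPICAL_YEARLY_GAIN - 1) * 100  # % above typical

print(gain, acceleration)  # a 1-level gain is 100% acceleration
```

On this definition, gaining a full level in a year when half a level is typical comes out as 100% acceleration, matching the figures quoted in the text.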
Here is an example of the gain a class of 26 students made last
year. The year 8 data has been removed. The graph shows a gain of
1.38 NZC levels using the Lomas measure. The end of Year e-AssTTLe
test was not quite as kind as the LT4 test, which I think is
probably due to the different style of test and the reading
(literacy) demands of interpreting the questions.
Class data for 26 mixed ability students,
showing steady gain over 1 year.
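The class gain reported in a graph like this is just the difference between term means. A sketch with invented levels for a small class (the real 26-student data is not reproduced here):

```python
from statistics import mean

# Invented Lomas levels for a small class at Term 1 and Term 4.
lt1 = [2, 3, 3, 4, 2, 3]
lt4 = [4, 4, 4, 5, 3, 5]

class_gain = mean(lt4) - mean(lt1)
print(round(class_gain, 2))  # mean NZC-level gain for this invented class
```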
Here is another view showing the spread of the students.
Same data as above showing the Blue Term 1
data to the Green Term 4 data.
A measure I use to check that everyone is on track, learning is
happening and we are all focused on the same future is the %Mult,
or percentage of the class at or above NZC L4. Term 4 is a shorter
period of time in NZ schools, so the graph would be expected to
slope less. The e-AssTTLe check measure pretty much agreed with
the final result, which is confirming and reassuring. Note by Term
4 the L2 tally had decreased almost to zero. Typically seen is
decreasing L2, a fairly stable L3 as students migrate in and out,
an increasing L4 and L5. If this pattern is evidenced then I am
confident targeted learning to the needs of the different groups
is being enabled, students are engaged, the teacher is being
effective and the class is doing what should be happening.
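The %Mult check and the level tallies described above take only a few lines to compute each term. A sketch with hypothetical levels:

```python
from collections import Counter

def pct_mult(levels):
    """Percentage of students at or above NZC Level 4 (multiplicative)."""
    return 100 * sum(1 for lv in levels if lv >= 4) / len(levels)

term1 = [2, 2, 3, 3, 3, 4, 4, 5]  # hypothetical class, Term 1
term4 = [3, 3, 4, 4, 4, 5, 5, 5]  # same class, Term 4

print(pct_mult(term1), pct_mult(term4))  # 37.5 75.0
print(Counter(term1))  # level tallies: L2 gone by Term 4...
print(Counter(term4))  # ...and L4/L5 growing, the pattern described above
```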
Same class showing the strong trend of becoming
Multiplicative.
The students of the cohort are also shown individually. Student
#9 had a strong growth year, moving from Level 1 through Level 2 to
Level 3. Likewise student #22 gained similarly, moving from Level
3 to Level 5. I call this graph a "Skyscraper Graph". This graph
shows how messy learning can be. Student #18 for example peaked at
L5 and dropped back to L4 in the eA4 measure. Student voice after
this test showed that literacy was indeed an issue for this student
who said that "Some of the questions were a bit hard to understand
and confused me."
This shows a class of 26 students and each
has a story of the year of learning. Every student gained
in this class.
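Each "skyscraper" is one student's term-by-term levels, so the most-improved students fall out of a simple first-versus-last comparison. A sketch with invented student numbers and levels:

```python
# Invented term-by-term Lomas levels keyed by student number.
students = {
    9:  [1, 2, 2, 3],
    18: [4, 4, 5, 4],
    22: [3, 4, 4, 5],
    25: [3, 3, 3, 4],
}

gains = {sid: terms[-1] - terms[0] for sid, terms in students.items()}
most_improved = max(gains, key=gains.get)
print(gains)          # {9: 2, 18: 0, 22: 2, 25: 1}
print(most_improved)  # first of the top gainers
```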
The cohort view is also available.
Here we see the cohort of 130 students
gained over 1 NZC level, or were accelerated by 100%.
Better was the multiplicative level, which moved from near 20%
to nearly 80%, making most of these students ready for
proportional thinking development. The cohort spread
shows a decreasing NZC Level 2 and a decreasing NZC
L3 pattern; the holding pattern is more about NZC L4 as
students move in and out as the year progresses, with
increasing NZC L5 and NZC L6 student numbers.
This cohort performed very well. The class almost mirrored this
performance. Every class accelerated learning and improved the
%Mult measure.
Total test time = 4 x 20 minutes and a 1 hr e-AssTTLe experience.
The rich data that resulted took 5 hrs to compile into a
class-by-class breakdown and a cohort summary with comments. The
report is presented to the teachers, who are asked to comment on
and explain any flat-topped "Skyscrapers", missing test data,
insufficient mean gain, poor %Mult gains and so on. This quality
data and effective analysis is easy to run but very hard to hide
from. Overloaded mark books cannot produce this sort of clear
sense making; this approach also leaves more time for planning
and targeting.
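The class-by-class and cohort compile is essentially a group-by over per-student gains. A minimal sketch using only the standard library (class names and levels are hypothetical):

```python
from statistics import mean

# (class, Term 1 level, Term 4 level) per student -- hypothetical data.
rows = [
    ("9A", 2, 4), ("9A", 3, 4), ("9A", 3, 5),
    ("9B", 2, 3), ("9B", 4, 5), ("9B", 3, 4),
]

def summary(subset):
    """Mean gain and %Mult (share at or above NZC L4) for a group."""
    return {
        "mean_gain": mean(t4 - t1 for _, t1, t4 in subset),
        "pct_mult": 100 * sum(t4 >= 4 for _, _, t4 in subset) / len(subset),
    }

classes = sorted({c for c, _, _ in rows})
report = {c: summary([r for r in rows if r[0] == c]) for c in classes}
report["cohort"] = summary(rows)
print(report)
```

The same two figures, computed per class and over the whole cohort, are all the report above needs.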
Sensible and Informed Target Setting
All too often I see targets in BOT plans plucked out of thin air
for new and existing cohorts. With data as above, sensible targets
can now be set for the following year for the cohort and each
class. I would expect the mean for the cohort to approach NZC L5
and the %Mult to improve further to nearly 90%+. Experience shows
that a Year 10 student at NZC L5 will gain a good selection of
Merit and Excellence credits, enjoy mathematics and open doors
across pathways. This cohort will have many choices.
Hence Lesson #3
Less is better. Analysing and reporting quality data means
it will be used. Maximise the time and energy spent on planning
and targeting.
I once had a computer business and a question I would ask a keen
buyer of a new computer was "How is this computer going to save
you time?" If a new computer did not save time, or make more money
then buying a new computer was going to be a waste of money. I
realised I was in the business of helping people save time and
make more money from the beginning. It was not just about selling
a new computer. The measuring and monitoring above is another
example of better use of time. I once read an article about a
business that was struggling to sell electric drills. Once the focus
had been moved to "selling holes", the company started performing.
So what is the business of a school and every teacher?
BUT IT IS MEASURING ONLY NUMBER!
In the background on this page is a cartoon that presents two
perspectives.
Background cartoon on this page.
There are two perspectives on teaching and learning mathematics as
well. One is that we are teaching MATHEMATICS (and statistics) and
the other is that we are teaching THINKING. As the man with the
wooden leg said, "It's all a matter of a pinion!".
Mathematics teachers use the context of MATHEMATICS to teach the
deeper learning and longer lasting learning called THINKING in all
its different guises. I explain this all in earlier chapters but
briefly, as we move from NZC L1 to NZC L5 our thinking moves from
not connected or organised, to simply connected, well connected,
organised, logical, creative, critical and reflective. Maths has
been taught in schools for so long that teachers no longer recognise
this organised stepping procedure and what is really important.
They just teach maths, hopefully from a text or source that is
organised, and coincidentally also teach thinking. By knowing you
are actually teaching thinking we are now able to focus and put
our energy where it will be most productive.
Hence Lesson #4
Know what your business is about!
But you are only measuring NUMBER!
If you look thoughtfully at the LOMAS test, at the mathematical
concepts introduced at each NZC Level, the types of language and
questions asked, and the projects and tasks expected in texts, I
quite firmly assert that you are actually looking at an
increasingly complex structure of thinking. Number allows us to look at that
thinking and to measure that thinking. Number is quite intuitive
to most people and even animals have rudimentary counting ability.
The complexity with which someone uses NUMBER reflects the
complexity of their thinking. What we are actually doing when we
use the LOMAS test (or most other tests NZC referenced) is to
measure THINKING.
The cohort graphs above could well be re-labelled: Mean NZC =
increasing complexity of thinking; %Mult = the percentage of
students who can think of two things at once; and the spread graph
shows how diverse the thinking actually is.
Lastly
For twenty years I was in a classroom teaching. I wondered from
time to time what I was actually teaching. I became an advisor of
mathematics and have been one for twenty years now. That gave me
time to rub shoulders with Math ED and talk with many people about
the Teaching and learning of Maths. In the last 5 years I have
been more about "Making Dreams Come True" when schools have asked
me for help. It has been a busy time and, as you can see from the
data and analysis above some astonishing results can be realised.
There are other projects in NZ that claim acceleration rates of
100% or so for groups of needy kids. My work has created 100%+
acceleration for whole cohorts, including those needy kids. Knowing
that we are in the business of teaching thinking (read the NZC
blurb about mathematics - strategic, logical, creative and
critical thinking takes centre stage!) means a purposeful team
with the same shared goal.
Somewhere in the last 40 years of being a teacher I learned that I
was actually teaching thinking. Somewhere in the last 10 years I
became aware of this and have shared it with the schools I have
worked in to produce positive outcomes. Much of this chapter is in
other places in this book.
Last Lastly
Data is just data. It is a blurry image of what is actually
happening. Students learn while they are doing a test.
Triangulating data with OTJ, student voice, and what you see and
hear all contributes to making sense of what you are doing.
I am not a fan of 1st in Maths Awards. I am a fan of most improved
and high achievers. See Teacher Task below.

Some questions...
• Do you teach Thinking or
Mathematics?
• Is it better to ask "How did you get that
answer?" or say "Well done, that answer is correct!"
• Is it more challenging to say "The answer is
24, what was the question?" or "What is 2x12?"
• Do you have a test that you know inside and
out that provides NZC Levels of where your students are now
placed?
Teacher TASK
• Find the LOMAS resources on this site, pretty easy, and trial
Test B or Test C with a group of Year 9 students. While they are
sitting the test, write down your considered estimate of the NZC
Level for each student.
• Design a spreadsheet that will produce the graphs displayed
above. You could always email and ask me to send a version or
check yours.
• Which students in the class of 26 above would you give awards to
for High Achievers and Most Improved?
• Which students in the Cohort would you target for High
Achievement?