
Ep 20. Reading and math: parallels and pitfalls with Matthew Burns

This transcript was created with speech-to-text software. It was reviewed before posting but may contain errors. Credit to Jazmin Boisclair.

You can listen to the episode here: Chalk & Talk Podcast.



[00:00:00] Anna Stokke: Welcome to Chalk and Talk, a podcast about education and math. I'm Anna Stokke, a math professor, and your host.


You are listening to episode 20 of Chalk and Talk. Like many of you, I listened to Emily Hanford's podcast, “Sold a Story.” If you haven't listened to it, it's about reading instruction and how an idea about how to teach reading became popular despite conflicting evidence from research in cognitive science. I am not an expert in reading instruction, but as I listened, I couldn't help but notice how many elements of the story seemed similar to things I've observed in math education.


A noticeable shift is happening in reading, with educators and policymakers now recognizing the importance of phonics and other principles from the science of reading. But math has not received the same level of attention as reading, and math is important too. For this episode, I brought in an expert in reading to tell us about the history and to discuss whether we can draw parallels with math.


That's the focus of today's episode. My guest is Dr. Matthew Burns. He is a renowned researcher in the science of reading world. His work was featured on “Sold a Story.” He has also done research on math interventions, making him the perfect person to explore connections between math and reading instruction and to offer suggestions about how we might turn things around for math.


I really hope you enjoy this episode. Now without further ado, let's get started.


I am delighted to be joined by Dr. Matthew Burns today, and he is joining me from Florida. He has a Ph.D. in Educational Leadership. He is the Fein Professor of Special Education at the University of Florida and Assistant Director of the University of Florida Literacy Institute. He has published over 200 articles, books, and book reviews.


He is one of the leading researchers on the use of assessment data to determine individual or small-group interventions. He received the 2020 Senior Scientist Award from Division 16, that's School Psychology, of the American Psychological Association. He's well known in the science of reading world, but he has also published articles on effective math instruction, including articles about automaticity with math facts.


His work in reading instruction was featured on Emily Hanford's podcast, “Sold a Story.” Welcome, Matt. Welcome to my podcast.


[00:02:53] Matthew Burns: Thank you. Thanks for having me. I'm excited to be here.


[00:02:55] Anna Stokke: I'm a mathematician. I know a lot about math and teaching math, but I'm definitely not an expert in reading instruction. Now, I listened to “Sold a Story,” and a lot of what I heard in that podcast reminded me of things that I see going on with math instruction. And I'm hoping we can explore what happened with reading, and maybe we can talk about whether there are similarities with math.


Now there's a lot to say about the history of reading instruction, but I'm hoping you can hit some of the main points for us. My understanding is that phonics, or sounding out words, was the prevailing method for reading instruction for many years. And that some of the modern programs that de-emphasize phonics originated in the seventies.


And those ideas came from a New Zealand researcher named Marie Clay. Can you tell us a bit about that? What was the idea that Marie Clay came up with for teaching kids to read and what happened after that?


[00:03:57] Matthew Burns: Yeah, Marie Clay, she was a fascinating person. She was the first, I only learned this recently, the first person not from North America to be elected president of the International Reading Association. She's, you know, a really well-known, really well-respected researcher. Now, she developed an approach, I don't know if she called it that, based on this MSV: meaning, structure and visual.


And really what that means is looking at the semantics, the syntax, and the graphophonic cues to understand what you're reading. And the idea is that good readers, according to Marie Clay, use all of the information available to them. So they use what the word means. They use other things in the sentence, like grammar, et cetera, to figure out what the word is.


And she used that information to develop what she called “reading recovery,” which was brought to this country in the early 80s, I think around '84, and became really popular. So her first work was on reading recovery, but it was just for kids who she thought were struggling readers, and others after her came in and translated it into just, you know, general reading instruction.

[00:05:08] Anna Stokke: So she came up with an idea for teaching kids to read, which was based on how really skilled readers read. 


[00:05:17] Matthew Burns: Yes. It was an instructional approach based on what she thought good readers did.


[00:05:21] Anna Stokke: What are the names of the most commonly used programs that are based on Marie Clay's approach?


[00:05:28] Matthew Burns: Well, I mentioned reading recovery. That's sort of the most famous and the first. And then Fountas and Pinnell, two researchers, two women who are leading, you know, thought leaders in reading. They took her approach and applied it generally. And they called it “guided reading.” And they said, you can take this three cueing system and apply it to teaching kids how to read in general, not just struggling readers.


Guided reading became the application of reading recovery to just teaching kids how to read. And then Lucy Calkins learned it from them and developed reading and writing workshops, which became “units of study.” So the three most common ones are going to be reading recovery, guided reading, and units of study.


Those are the three names you're going to hear most often using this three cueing system approach.


[00:06:17] Anna Stokke: What are the three cues that they use in the three cueing system approach?


[00:06:22] Matthew Burns: Yeah, it's MSV: meaning, structure and visual, or more accurately semantic, syntax and graphophonic. So it's looking at the whole sentence and seeing what that word means, and trying to use that to figure out what the word is based on the context of the sentence, based on the grammar of the sentence, and based on what the word actually looks like.


[00:06:40] Anna Stokke: And so with these programs, you're not actually supposed to sound out words. Is that correct?


[00:06:46] Matthew Burns: Well, they would say, I think they would say, you do sound out a word, but only if you don't know the word, and only if the rest of the context doesn't make sense. Then you actually look at what the word looks like and try and figure out what it says that way. I don't know that to be true for sure, but my interpretation is that's what you would do if the other things don't work.


And they would argue good readers don't sound out words. They just know what the word is by looking at it.


[00:07:10] Anna Stokke: And so did this now become the prevailing approach used across North America?


[00:07:17] Matthew Burns: It's sort of amazing it did. Reading recovery, for whatever reason, became really popular in this country in the, you know, late 80s. And it became political. By political I mean it was adopted through federal grants, et cetera. There was support to adopt this approach.


So it became really, really popular. And now today, the other approach, which I didn't mention, is called “leveled literacy intervention.” I don't really mention it because that's part of the Fountas and Pinnell guided reading system. It's the intervention. So guided reading, according to Fountas and Pinnell, is just for teaching kids how to read.


And then for the kids who don't do well, you amplify it, amplify the three cues, and that's called leveled literacy intervention. So that's another approach that's commonly used. Right now, the Fountas and Pinnell system and leveled literacy, according to surveys by Ed Week and other outlets, is the most commonly used one in the United States.


I don't know about all of North America, but in the United States, it's the most commonly used one. And I think it was something like over 40% of the respondents said that that's the approach they use.


[00:08:20] Anna Stokke: Now, what was wrong with Marie Clay's idea?


[00:08:24] Matthew Burns: Well, here's the problem. So, we learned shortly after, around the same time frame, around the 80s, early 80s, Keith Rayner, who is a psychologist, does eye tracking. He's a famous eye-tracking researcher. And he was the first one to look and see and say, you know, based on where the eye moves as good readers read, they don't use the context. 


They look at the word and figure it out. They look at the word, they fixate on the word. And so they're sounding it out. It's the struggling readers who look at the context and try and figure out what it is. So the conclusion from his research was that what we're doing is teaching kids to do what kids who aren't good readers do, which makes it, of course, worse.


And he was the first one to really sort of point that out. And that got people thinking, “Maybe this isn't such a good idea.” And then the research started. And of course, by the late eighties, early nineties, there were several studies that showed this approach wasn't really effective.


And by that point, it had become a new term called “balanced literacy.” It was first whole-language, but whole-language, based on the research of Keith Rayner and others in the 80s and 90s, quickly became a term not used. And so balanced literacy became the new branding.


But basically, we're still talking about a basic MSV approach to teaching reading. By the late 80s, early 90s, we knew that, no, that's really what bad readers do, not good readers. And the research had started to show it wasn't very effective.


[00:09:57] Anna Stokke: Okay, so research has been around since the 80s or 90s, showing that this actually isn't an effective approach to use to help struggling readers in particular. So basically, they had taken an idea that actually struggling readers used, which probably didn't work that well, because they were struggling readers, and then they doubled down on it.


Does that sound right?


[00:10:23] Matthew Burns: That's my take on it, yes. And then of course, you throw in the leveled literacy intervention, which basically takes the struggling readers and uses that approach not only as a way to teach them generally, but also as the intervention, and it just didn't see positive effects.


[00:10:38] Anna Stokke: And so the research showed that these types of methods likely weren't that effective. Did the research also show that phonics likely is more effective?


[00:10:48] Matthew Burns: Yeah, so one of the first things to really look at that was the National Reading Panel in 2000. And the National Reading Panel was commissioned by the federal government to look at the research and see what really works. And their conclusion was that systematic phonics instruction, and there are different ways to do systematic phonics instruction, but, phonics instruction was more effective.


We saw a nice, solid, at least moderate effect for phonics, and a negligible effect for “whole-language,” as they called it at the time. And it's been replicated. John Hattie today, you know, famously reports an effect size of around 0.06 for whole language, whereas phonics is around 0.60. In the 2000s, we had nice syntheses of research to show explicit phonics instruction was more effective.


[00:11:37] Anna Stokke: So backing up a bit. If phonics actually is the most effective way we have to teach reading, why did Marie Clay and other researchers feel a change was needed back in the seventies?


[00:11:50] Matthew Burns: You know, I don't know, I'm not a historian, I wish an educational historian would study that. Here's my hypothesis, and I freely admit it's a hypothesis. So phonics instruction, for whatever reason, and I don't know why, was sort of linked to a behavioural psychological approach, and that's still true today; behavioural psychologists still push for phonics as just a basic antecedent-behaviour-consequence approach to teaching reading.


And there was a push in the 70s against behavioural psychology, quite significantly. And so I think phonics just got caught up in that, in this sort of anti-behavioural-psychology thinking, which, of course, behavioural psychology is still around today, and shown to be quite effective.


But there was this big push in the seventies against it. So I think it might've got caught up in that, number one. Number two, in the seventies, well, actually, remember Marie Clay, her research was done, I think her dissertation was in 1966. Don't quote me on that, but certainly in the sixties. But at the time, we hadn't done a lot of research on kids who weren't learning how to read well.


So she was one of the first to really look to see what we do for kids who aren't learning how to read very well. And I think because we didn't have a lot out there on kids who weren't doing well, we just needed to start somewhere, and she was part of that group.


So I don't have a good answer to that. I think it's more of a societal issue than almost anything. But I do think it was caught up in a couple of different movements. And phonics was, not rejected exactly, but people looked for alternatives.


[00:13:30] Anna Stokke: Back to the research. So, in "Sold a Story," you talked about some work that you did to test whether Fountas and Pinnell’s program worked. Can you tell us about that and the results that you got?


[00:13:43] Matthew Burns: Yeah, that was really fun. So I do research in something called instructional level and the idea of an instructional level has been around since the forties and no one really ever studies it all that much. So I do research on that. Anyway, so whenever something comes out that claims to measure an instructional level, I usually take an interest and look into it. 


Well, the Fountas and Pinnell benchmark assessment system, when it came out, supposedly measures an instructional level. So I got it and looked at it, and on one of the pages early in the manual, like page three or four, it said that the Fountas and Pinnell benchmark assessment system can be used to screen the reading skills of kids.


Oh, okay. Nowhere else in that entire book did it provide any evidence to support that claim. So I said, let's test it. So we took about 900 kids in second and third grade and gave them three measures. We gave them the Fountas and Pinnell benchmark assessment system, and we gave them the NWEA Measures of Academic Progress (MAP).


It's a fine screener, it's used quite commonly, and it's got good psychometrics. So we gave that as our criterion. And along with the Fountas and Pinnell, we also gave all the kids reading fluency measures from the Aimsweb assessment system to see if that worked better.


What we found was Aimsweb worked fine, but more importantly for this conversation, the decision-making accuracy of the Fountas and Pinnell benchmark assessment system was 54%, which means you can buy the test for thousands of dollars, train all your teachers, take 20 or 30 minutes per kid, or just flip a quarter.


And you'll get it right just as often. And the worst part is that it identified good readers accurately two-thirds of the time, but it identified struggling readers accurately only 31 percent of the time, which is alarming because that means we're missing potential struggling readers. Kids who need help, we're missing them two-thirds of the time, and that is alarming.


That was one study we did. We did a series of three. We found decision-making accuracy at 54%. We also found that adding Fountas and Pinnell data into other decision-making metrics and data didn't improve accuracy. In fact, you're better off just using Aimsweb and MAP, for example, than you are with F and P.


And in our third study, we found that if we gave kids books that were at their supposed level, Fountas and Pinnell uses a lettering system, so if the kid's an M and they got a book that's an M, we found that good readers read it, no problem. In fact, for the good readers, it might underestimate their skills, but we don't know for sure.


However, two-thirds of the struggling readers couldn't read the book that was at their level. So they're a G and it's a G and they couldn't read the book. To me, that was the most alarming result we saw: the struggling readers, the kids who need this the most, couldn't read the book.


[00:16:32] Anna Stokke: And school districts were spending, you know, millions on this program, right? And you could just flip a coin. And it also wasn't identifying struggling readers. It essentially wasn't helping kids at all.


[00:16:46] Matthew Burns: No. 


[00:16:47] Anna Stokke: When did you publish that research?


[00:16:50] Matthew Burns: Around 2015.


[00:16:51] Anna Stokke: And so what happened after that? 


[00:16:53] Matthew Burns: Not much, until Emily Hanford's Sold a Story. We published those in good journals, Reading and Writing Quarterly, Journal of School Psychology, and Psychology in the Schools, I think. But, you know, if I'm a classroom teacher, I've probably never heard of those three journals. And I went to the International Literacy Association conference, which used to be the International Reading Association.


They renamed it. I don't know the year they did that, but they renamed it. And I went to the conference, I've been there many times, but I went in one time, and they have the convention hall, which is the area where all the vendors come and give away free things, and we all love those. It's fun.


Anyway, I walked into the hall where all the vendors are selling their stuff and giving away free things. And on the other side of this big ballroom-type hall is this giant sign that says “Fountas and Pinnell Benchmark Assessment System.” It's this huge sign. And you go over there, because you can't help but go over there because of this huge inviting sign. And there are these teachers there telling you how wonderful it is.


And they have these handouts of all these narratives of these teachers saying this is the best thing ever. I'm a good teacher now because of this. I think it's naive, if not unfair, to expect data from a journal to compete with that. No teacher out there is going to see these wonderful stories from all their colleagues, and then go read a journal article and say, “Oh, but this journal article says something different.”


So really, honestly, not much happened until Emily Hanford, gave it a little more breath and then people started paying attention to it. But teachers were continuing to use it and schools were continuing to buy it.


[00:18:34] Anna Stokke: Teachers aside though, we've got Lucy Calkins and Fountas and Pinnell and they were selling these programs and there was research back in the 80s that actually showed the approach didn't work. You published papers that showed that it wasn't that effective, so maybe the teachers didn't know about it, but surely those other individuals must have known, and isn't there some sort of ethical requirement in education to maybe stop what you're doing, stop what you're selling, because it doesn't work?


[00:19:06] Matthew Burns: Yes. Here's another point I would take though, which is, the people you just listed, if you go to Google Scholar, which is something I use all the time, I recommend, practitioners use it. Go to Google Scholar and you can search researchers and it'll give you a research profile.


It'll tell you how many articles they've published, et cetera. It's great. If you go and search those names and others, they've published very few studies, if any. And so if you are someone who's creating something to sell, you should test it first. I had an idea.


Oh gosh, I don't know what year that was. Maybe around 2004. I had this idea about targeting reading interventions. I thought, “Well, that's interesting.” Much like Marie Clay, I did an observation; I observed something happening with these kids. But I did a meta-analysis first and wanted to see, is that even worth looking into?


Yeah, we saw large effects. Okay, that's worth looking into. I developed a framework. I think I published three or four studies on it. I always do highly controlled studies and then test whether teachers can just use it. And only then did I start writing about it and saying, yeah, this works, use this. To me, that's an ethical obligation.

And I don't know why that didn't happen in any of these circumstances. Now, Marie Clay would argue she did do a dissertation. She did publish on it. So I can accept that. I think there are problems with it, but I can accept that. But for some of the others, I don't see a single study that they conducted, or anyone else for that matter, to support what they did and what they wrote about.


And to me, that is alarming.

[00:20:36] Anna Stokke: It's extremely alarming, and children are not guinea pigs. I do think we need to hold the education system to a higher standard. And another thing, it's my understanding that Lucy Calkins, for instance, was quite charismatic. Teachers wrote songs about her, and I think sometimes it becomes more about the individual and the product they're selling.


And we'll come back to that because I think it's an important thing to talk about. But for now, I want to shift to math instruction because that's where I want to get to so that we can talk about parallels. Do you see similarities between what happened with reading instruction and what's going on with math instruction? 


[00:21:23] Matthew Burns: I do. It's quite analogous, although it's not a complete one-to-one. It's almost like there's a shift in thinking around what constitutes evidence and what constitutes research-based practices. And I think that's cutting across all aspects of teaching and learning.


So we're almost seeing a distrust of research among practitioners. And so if I go to someone and say, “Well, you know, I've got this study, and others have done this too, that shows the best predictor of how well a child can complete math word problems is how quickly they can do the multiplication facts.”


Seems like people are more likely to try and find fault in that conclusion than they are to change their thinking. So I do see some consistent things where I see things happening in math and I can't help but wonder, “Why do you do that when the research clearly says that's not the best way?”


[00:22:15] Anna Stokke: Don't you think though that this mistrust could come from things like what happened with reading when people claimed they had research supporting the programs that were being sold to schools and then we find out later that in fact the programs weren't actually evidence-based. Maybe that sort of thing causes people to mistrust research.


[00:22:40] Matthew Burns: Well, maybe that's fair, although I would argue that the basis of what we're talking about for math was, if not the same time period, it might even predate it. But yeah, no, I agree. I think you're right, and I think we get confirmation bias, and we get all this cognitive dissonance, where if I'm a teacher, I'm working really hard and doing what I can to help kids.


And I think I'm doing what's in the best interest of the kid, and someone comes along and tells me it may not be. That's when you're going to see this cognitive dissonance, you know, “Well, you've got to be wrong. I'm doing what's best for my kids.” And if my argument is, well, but research says this, that's why they're, in my opinion, more likely to say, “Yeah, the research is wrong.”


[00:23:17] Anna Stokke: So in general, do you think there's actually just a science of learning? So we hear about the science of reading, and now there's this new group called the Science of Math, but is it all sort of the same principles? Is there a science of learning that essentially tells us about the best ways to teach novice learners? I mean, if the best way to teach, say, a novice learner to read is through direct instruction, explicit instruction, doesn't that also hold when you're teaching a novice learner mathematics?


[00:23:52] Matthew Burns: Yeah, it does. There are some basic principles. You mentioned direct instruction. There's a wonderful meta-analysis, a couple of them actually, looking at 50 years of research that sees nice effects, you know, at least moderate effects, for reading, writing, math. Explicitly teaching kids how to do things works.


And we know there are several underlying theories, one of which is the learning hierarchy, which says that when kids are first learning something, they need modelling, explicit instruction, and immediate corrective feedback. And then once they can do it with sufficient accuracy, it's time to get them to do it more quickly.


And so that's lots and lots and lots and lots of practice. And then once they can do it with sufficient speed and accuracy, that's when they can generalize it. And so when teachers say, “Does speed really matter? Does it matter if the kids have their math facts automatized?” Well, it does, because if it's not automatized, you can't generalize it, right?


So we know some basic principles, you know, going from modelling to practice to generalization. That applies no matter what you're learning.


[00:24:51] Anna Stokke: . Kids definitely have to know their math facts. So we'll come back to that. and when I was listening to Emily Hanford's podcast, something that really struck me is there always seems to be this romantic idea behind some of this stuff.  If we immerse kids in books and we make reading fun and we start with meaning and understanding, the reading will just come.


And I hear this sort of thing in math all the time. So it's sort of like a top-down approach to teaching instead of a bottom-up approach. In other words, the idea is to start children with problem-solving. They'll get excited about the problem, and the foundational skills will just come after that. But as a matter of fact, you can't solve a problem if you don't have the foundational skills or the techniques to do it. Does that resonate with you in terms of what happened with reading?


[00:25:45] Matthew Burns: Absolutely. It makes intuitive sense. It sounds fun. Teaching and learning should not be boring. Boy, I couldn't agree with that more. That's yeah, a hundred percent. The idea sounds great, right? You give the kids something exciting, get them excited about it, they'll be motivated, they'll dive in, they'll figure it out.


It just doesn't work. I one time had a colleague who asked me a stats question, and I said, “Yeah, I can help you with that.” She was using a stats program called SPSS, if your listeners aren't familiar, which is a commonly used stats program. I said, “Hey, you know what, let me see your manual, and I'll help you.”


I'll show you where it is. So she handed me a copy of her manual, I opened it up, and it was written in Chinese. And she was kind of embarrassed, like, “Oh, right, right, it's in Chinese.” And I looked at it and thought, you know, I have an interesting problem right now. I want to help this person. I'm sincerely motivated.


And the question she's asking me is interesting to me. But I can't do it. I could spend all day and I'm never going to figure it out. You could give me a hundred bucks as a reward to figure it out. I'm not going to be able to, because I can't read this. And that's how it is, I think, for a kid who can't read.


I mean, all you do is frustrate them and tell them, “Yeah, you know, this is great.” Here's this exciting thing, and they get excited about it, and they can't do it. All they're going to do is experience frustration. In fact, some research I've done has shown, and I think anyone would agree with this, that most kids with behaviour difficulties in schools have reading problems as well.


So this idea that, yeah, this productive struggle, I mean, productive struggle in reading to me is unfortunate because if the kid can't read it, it's not productive struggle. It's frustration. What you're seeing is frustration.


[00:27:23] Anna Stokke: Well, yeah. And it's the same thing with math. That's something that's commonly promoted, productive struggle. It's good to struggle. And I mean, to some extent, yeah, of course, there are times when you struggle with math, and you've got to persevere, right? But to start someone off with this complex problem, and they're just supposed to struggle and it's all supposed to work out for them, what happens is the kids that are already doing well, maybe they're getting outside tutoring or maybe they just excel in math. They go further, but the kids who don't have those foundational skills, they really get left behind.


[00:27:59] Matthew Burns: Yeah, productive struggle and inquiry-based instruction and problem-based learning and all that, that's great for the proficient readers and kids who are good at math. I really have no problem with those. But for all the other kids, it's probably not going to be helpful.


[00:28:13] Anna Stokke: In math, we often hear phrases like “drill and kill.” The idea is really to disparage practice. And then sometimes teachers maybe feel uncomfortable giving their students a lot of practice or practicing times tables, getting proficient at times tables, things like that.


But actually, practice is extremely important in math. Did that sort of thing happen in reading? Is there an equivalent drill and kill phrase that you'd hear in reading?


[00:28:40] Matthew Burns: Yes, 100%. So drill and kill, the reading version of it is basically any phonics-based approach to learning because you have to practice what the letters say, etc. And it's the same thinking that we don't want just kids be practicing. When you look at, again, I'll cite John Hattie. His work, the two single things that teachers can do that led to the best effect were feedback and practice.


Those are so important, and I would argue that one of the single best predictors of how well kids do in math, at least in more advanced math like solving word problems, is the fluency with which they complete the facts and the amount of practice and repetition they had. How well someone learns something, and therefore applies it, is directly related to how often they've practiced it.


[00:29:24] Anna Stokke: And I think people who have learned a lot of math normally know that. But I think sometimes there's the curse of knowledge: you forget the amount of practice you had to do to get to the point where you're at. But I think it's really damaging when people disparage practice. I wish people would quit doing it, and I don't know why they do it.


[00:29:46] Matthew Burns: My stepson, he was in third grade at the time. We got a letter home from his teacher at the beginning of the year saying, “Hey, this year, we're really going to work on multiplication facts. So please practice the facts with your kids at home.” And I was like, “Oh, finally. I love this teacher.”


I hadn’t even met her yet, and I loved her. And we do practice the math facts at home. A lot of the people who, now as adults, are really good at math, or the teachers, etc., they may have learned that with very little repetition needed, right? They may have picked it up pretty quickly.


And so in their own thinking, as you say, they may have forgotten how much practice they did. They may not have had the same level of practice that some kids need, because they didn't need that amount. But boy, kids who are struggling in math and reading, they need lots of practice.


[00:30:26] Anna Stokke: Thank you for pointing that out. I've noticed that too, that just in general, it takes some people a little longer to learn things, you know. Like you say, there are people that maybe were able to memorize times tables really quickly, but then it takes other kids longer. Can any child memorize times tables?


[00:30:44] Matthew Burns: Oh, I think so. Yes. Let's talk about learning times tables and learning the multiplication facts. It's extremely important for kids to know those to the point of automaticity, because if you don't, you can't generalize it. So you can't tell me Johnny went to the store three times and bought three apples each time, whatever, unless you can tell me three times three is nine without thinking about it.


And in reading, you know, it's the same thing: you have to get what is called orthographic mapping, right? It gets to the point of automaticity. You just look at it and know what it is. So, yes, I think any kid can learn that, and there are fun ways to do it. And we start with those fun ways.


So for the teachers out there, think about fun ways to practice multiplication facts and addition facts, et cetera, as long as two things are there: generation and repetition. Repetition means they need practice. Generation means that they have to generate the answer themselves.


The multiplication table, you know, where you look up three times three and the answer is nine, that doesn't help build retention or automaticity at all, because there's no generation. They have to do that from memory.


So I was working with a teacher one time and talked about practicing facts, and she developed a system where every morning her kids would practice facts. She'd call on one kid at a time and say “three times three,” and if the kid didn't know the answer, they'd look it up on the multiplication sheet and say nine, and she would move on to the next kid.


They'd do it for like a minute a day. I was so upset, because she was doing this and that was great, but it wasn't what we need to do. There's no repetition there, because each fact was done just once with each kid, and only like, you know, 10 kids participated at a time, one at a time. And if they didn't know the answer, they could look it up.


And therefore, there was no generation. In order to remember it, you have to see three times three several times and tell me the answer yourself. Every kid has to do it, not just the one called on. And so if you call on a kid, for example, and they don't know the answer and they look it up, that's okay.


But then you've got to come back to them a minute later and ask them again so they can tell you from memory. They have to tell you from memory. So, as long as those two things are there. I've taught kids with math learning disabilities, intellectual disabilities, and we teach them their multiplication facts in a matter of weeks.
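For readers who like to see the mechanics, the two principles described here, repetition and generation, can be sketched as a simple drill loop. This is an illustrative sketch only, not a published protocol; the function name, the pass counts, and the assumption that a fact sticks after one retry are all hypothetical.

```python
from collections import deque

# Illustrative sketch of the two practice principles discussed above:
# repetition (every fact is presented several times) and generation
# (the child must eventually produce each answer from memory, so any
# fact they had to look up comes back around later in the drill).
# Names and the "sticks after one retry" assumption are hypothetical.

def drill(facts, known, rounds=2):
    """Return the order in which facts would be presented.

    facts: list of fact prompts, e.g. ["3x3", "2x4"]
    known: set of facts the child can already answer from memory
    """
    queue = deque(facts * rounds)      # repetition: each fact appears `rounds` times
    order = []
    while queue:
        fact = queue.popleft()
        order.append(fact)
        if fact not in known:          # the child had to look this one up...
            queue.append(fact)         # ...so come back to it later,
            known.add(fact)            # assuming it sticks after one retry
    return order

print(drill(["3x3", "2x4"], {"3x3"}, rounds=1))
# ['3x3', '2x4', '2x4']
```

The key design point is the requeue: a looked-up fact is never left alone, it must come back until the child generates the answer unaided.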


[00:32:54] Anna Stokke: I have actually heard some people say it doesn't matter if kids learn their times tables, it's really not necessary, which you're telling me it definitely is. I also agree that it's definitely necessary. 


I am wondering about another thing, and it's interesting because I got a message from a teacher on the weekend about manipulatives. The teacher said that they're encouraged to use a lot of manipulatives, but their feeling, from what they've noticed in working with their students, is that this actually doesn't help a lot of the time. When you're trying to get kids to a point where they're automatic with math facts, it actually seems to hinder getting to that point.


What are your thoughts on that? Do manipulatives help or do they hinder automaticity?


[00:33:39] Matthew Burns: Had you not said automaticity, I would have had a totally different answer. So I do think they help. Let me explain why. But I do think they can hinder automaticity. Teaching with math manipulatives is how you teach the concept of what you're teaching. And that's really important.


Kids have to know that. So that's really important. However, what I found from my experience and from a couple studies is that most kids who struggle in math understand the underlying concept. In fact, I did research on teaching conceptual understanding and I really struggled to find kids who didn't understand the underlying concept.


And so I worked with a teacher, and if a kid wasn't getting it, she would get out the manipulatives and reteach it. Now, if the kids don't understand the underlying concept, that's not a bad thing to do, but if they do, then that's not going to help. And some of the studies I've done have shown that when kids understand the underlying concept, you really just have to practice: practice the facts and the algorithms. That's what those kids need. So for those kids, manipulatives will hinder automaticity.


But if the kids don't understand the underlying concept, then reteaching with math manipulatives can really help. My point, and it's more of an observation than a study I've ever done, is I just don't think there are a lot of kids for whom that's the case.


I think once you teach it, and they get it, reteaching it with manipulatives usually isn't the answer.


[00:35:00] Anna Stokke: And so, say you're trying to teach a kid that two times three is six, and you're looking at the underlying concept. You might pull out some pennies or something like that, and you've got three groups of two, and you show that that's six. They can't just memorize two times three is six; they need something to fall back on if they're not sure how to get the answer, right?


But you wanna get to a point where it's automatic.


[00:35:22] Matthew Burns: So I had this teacher who was working with elementary school kids, again, kids with intellectual disabilities, so kids with IQs between like 55 and 65. I asked if I could teach the kids the multiplication facts, because I do research on something called incremental rehearsal, which is an intensive intervention.


It's an intensive way to practice things like letters, sounds, and math facts. So I said I wanted to try it with these kids, and I did, and we taught them the math facts in, you know, a couple of weeks. Their single-digit multiplication facts, in just a couple of weeks. And so I was showing the teacher the data, and I said, isn't this great?
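For the curious, the sequencing idea behind incremental rehearsal can be sketched in a few lines: each new (unknown) fact is interleaved with a growing set of already-known facts, so the learner regenerates the new answer repeatedly, with increasing spacing between repetitions. This is a minimal sketch of the interleaving pattern only, under my own simplifying assumptions; the actual intervention specifies ratios and procedures beyond what's shown here.

```python
# Minimal sketch of the interleaving pattern behind incremental
# rehearsal: the unknown item is presented, then rehearsed again after
# 1 known item, then after 2, then 3, and so on, so the learner
# generates the new answer repeatedly with growing spacing.
# This illustrates the sequencing idea only, not the full protocol.

def incremental_rehearsal(unknown, knowns):
    """Return the flashcard presentation order for one unknown fact."""
    sequence = []
    for i in range(1, len(knowns) + 1):
        sequence.append(unknown)       # learner generates the new answer
        sequence.extend(knowns[:i])    # then reviews i known facts
    return sequence

print(incremental_rehearsal("6x7", ["2x3", "3x3", "4x5"]))
# ['6x7', '2x3', '6x7', '2x3', '3x3', '6x7', '2x3', '3x3', '4x5']
```

Note how the unknown fact gets many generation attempts while the known facts keep the overall success rate high, which is the intensive-practice logic described in the conversation.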


Look, the kids are learning it. And she said, “Oh, that's great, but that's not what they need.” “What do you mean?” “They need to understand it better conceptually.” I said, “Well, okay, why do you say that?” She goes, “Here, I'll show you.” So she brought a little boy over and said, you know, again, I'll keep using three times three.


And the kid wrote down three times three and started working on it, and he put up his fingers and counted. You know, I'm saying it more audibly than he did so you can hear it. He counted, he went, “1, 2, 3, 4, 5, 6, 7, 8, 9.” So he had just done three sets of three, he wrote down the answer as nine, and the teacher said, see, he needs to understand it better conceptually.


And I said, wait a minute, isn't telling me three times three is three sets of three, and counting it out, isn't that showing he understands it conceptually? Again, for that kid, reteaching the concept wouldn't have helped. We had to teach him his multiplication facts to automaticity, which we did.


[00:36:43] Anna Stokke: And that makes sense. Another thing I've noticed is there are just a lot of misunderstandings out there about conceptual understanding. It's not a well-defined term.


[00:36:53] Matthew Burns: Yeah. So I found, and this is again just based on my experience, that most teachers, when they say conceptual understanding, are actually referring to generalization. They're saying, I want the kid to be able to do, like, a word problem. I know there are other ways to generalize things than word problems, but that's just a pretty easy, concrete example.


When they say conceptual understanding, what they mean is actually application, generalization. In fact, we did a study of this many years ago, where we found that most of the time in intervention research, people were calling it conceptual understanding but were testing application.


Yes, absolutely. We absolutely need the kids to be able to apply it, but that's different than conceptual understanding. And I think teachers are teaching conceptual understanding, actually teaching underlying ideas, but wanting to see application, and the bridge between conceptual understanding and application is automaticity.


If the kids can't apply it, and that's what we're interested in, but we think the problem is conceptual understanding, so we reteach the concept, that's not going to help most of the kids.


[00:37:58] Anna Stokke: That's very interesting. So is there an equivalent of conceptual understanding in reading?


[00:38:04] Matthew Burns: Yeah, it's reading comprehension. Again, that's the ultimate goal. In fact, to me, it's synonymous to say reading comprehension and reading. Reading is reading comprehension and vice versa. You know, we want kids to be able to comprehend. I did two studies, only two, but others have done similar research and found similar things, which is that if a kid in elementary school is not reading at least about 50 or 60 words a minute, they're not going to comprehend what they read.


Now, it's correlational, not causal. I certainly saw a kid read 11 words a minute who did really well in comprehension, and I saw kids read 130 words a minute and not, you know, not do well. But for most kids, if they weren't fluent in reading the words, they didn't comprehend what they were reading.


So there is absolutely an analogy to reading.


[00:38:49] Anna Stokke: So let's go back to research. So we'll hear the phrase “research shows” a lot in education, but sometimes when you look into it, you'll find that the research actually doesn't show what's being claimed. And a good example I want to give in math is the claim that timed tests cause math anxiety. That's a claim I've heard a lot.


But when you look into the research that's supposed to support this, it's really questionable. In fact, I don't think there actually is evidence that timed tests cause math anxiety. But this isn't an isolated case. So for example, I had Brian Conrad on the show, and we talked about the California Math Framework, which he read, and he would see a statement that seemed questionable, and so he'd look up the citation, and the paper referenced would often have nothing to do with what the statement said.


And then there are other examples where, if you look at the papers, the study maybe lacked a control group, or it wasn't a randomized controlled study, that sort of thing. So is the scientific method sometimes lacking in education research?


[00:40:03] Matthew Burns: Yeah, it is. So for example, there's an article about math anxiety, by Jo Boaler, that's cited quite a bit as saying that research finds timed tests cause math anxiety. It's in a journal called Teaching Children Mathematics. So I went and found it, and strangely, the citation that is commonly used is in a column called “In My Opinion.”


And I'm not making that up. The little headline there literally says “In My Opinion.” It's not a study. It's her opinion, and in it she cites something she has done that showed that it causes math anxiety. So I found that original article, and it was not a study either.


It was an opinion piece. So she's citing one opinion piece as a study inside another opinion piece. And we see that quite a bit in math, but also in reading. In fact, I think we see that quite a bit in reading. I'll give you one example. This is an older one now.


This is why I feel safe to use it. I don't like calling out my colleagues like the way I'm about to, but I'm going to. There's an article published in the Early Childhood Education Journal, which is a good journal, by a woman who I don't know, Anita IaQuinta, I think that's how you say her name.


And it's about how effective guided reading is, and the title of it is “Guided Reading: A Research-Based Response to the Challenges of Early Reading Instruction.” A research-based response, guided reading. Okay, great. So I looked at the citations. There are 22 references; 18 of them were books or position papers, one was a qualitative dissertation, and three were what I would call empirical studies. One was Juul 1998, which just shows the longitudinal effects of being a good reader basically, I'm oversimplifying; one was the National Reading Panel; and one was Torgesen '98.


None of those three studied guided reading. Well, the National Reading Panel reported on guided reading, and they reported a negligible effect. So none of those support guided reading as an effective approach. There's literally not a single citation in that paper that supports it as a research-based practice. Yet the title calls it a research-based response. I think we see that all the time, and it's really unfortunate, because most practitioners aren't trained to look for that, right?


They're not trained to do the analysis I just did. They're going to see the title and think, “Oh, it's a research-based response,” and that's going to be their opinion from then on.


One time I was working with a school, and I was talking about the importance of math facts and learning math facts, and as I'm standing in front of all these teachers, I'm a paid consultant, the assistant superintendent hands me a piece of paper and says, “Well, what do you think about this research that says learning math facts is harmful for children?”


I said, “Well, you're paying me to read this and respond, so I will.” And I did. It was a blog. It wasn't a study. It wasn't research. It was an opinion piece. I read it, and I said, this is a blog, and I showed her that none of the citations were actual studies.


And I pulled up my computer, went to Google Scholar, and showed her five, six, seven studies that agree with what I said. So her challenge to my claim was “research,” but it was just a blog that someone had written. That's one of the downfalls of the science of reading movement: what people accept as science has grown, and we see a lot of blogs, a lot of websites, things that aren't research being accepted as evidence.


[00:43:23] Anna Stokke: It seems like we need a better way to monitor or vet programs. Do you think that there should be some system in place to vet instructional programs? 


[00:43:33] Matthew Burns: It's almost like we need an FDA, you know, the Food and Drug Administration for educational practices, right? It's almost like we need something out there that says, “This is a good approach.”


Now, the What Works Clearinghouse, from the Department of Ed here in the States, that was supposed to do that. And they do it, but there's a couple of problems with it. Number one, the What Works Clearinghouse usually finds what doesn't work, because much of what we do in educational practice doesn't have a good research base.


When we really dive into it, we see that that's the case. So it's the What Doesn't Work Clearinghouse. And secondly, they rate the quality of the research, not the size of the effect. So I'll give you an example, like Leveled Literacy Intervention. I mentioned that earlier. Leveled Literacy Intervention is the intervention aspect of the Fountas and Pinnell guided reading system.


There are three studies I know of that the What Works Clearinghouse reviewed. And they're good studies: random assignment, well done. And so they're rated really well by the What Works Clearinghouse. But what they don't look at is the size of the effect. All three of these studies found very small effects.


So, to me, that says there's a good research base that says this doesn't work. But that's not how it's presented. It's presented as, “there's good research.” And so we see that and think, “Oh, there's good research.” But you need to take it to the next step and say, good research shows it doesn't work.


So we need something like that. We need a system for educational practices to be, you know, vetted or examined, to look and see, “Yes, we think this is an approved practice.” Just like they do in medicine: every practice you see in medicine has to go through some level of screening and vetting to say, yes, this seems to be something we could do, or no, it's not.


[00:45:13] Anna Stokke: Another thing I've always found sort of interesting is, so if you were to decide that you wanted to do research on a particular class or several classes, let's say, you'd have to go through some sort of ethics approval, correct? And you'd have to get signatures from the parents and the teachers and everybody involved.


So that's if you were going to do actual research. But they roll new programs into schools all the time, and they could flip what they're doing on its head. I mean, they could be doing direct instruction one year, and then the next year they're going to do inquiry-based.


And in one of the school divisions in my province, I've seen them switch math instruction programs, I would say, at least five times in the past ten years. There's no parental consent needed. They just go in and switch the program. I find that to be at odds with the fact that if you're going to do research, you have to get consent, but you can just roll in new programs without consent.

[00:46:11] Matthew Burns: Yeah, and here's what makes matters worse. When my children were younger and in elementary school, the district that they attended was considering switching their math curriculum. So they had a parent night to get parent input. I think that's how they think they get around it: getting parent input on these three new curricula they were examining.


They were examining, I think, five, but there were three they were kind of leaning towards. Well, when I saw the list, I immediately went to Google Scholar and saw that there was a federally funded study that compared these three exact curricula and found that one was clearly more effective than the other two.


And so I went to the parent night, and I brought a copy of that with me, and I went to the director of research and evaluation, who was there, and said, well, I assume you talked about this study. And he hadn't seen it. Nobody had seen it. So not only are you, you know, experimenting and trying things out, and I'm not opposed to experimenting and trying things out.


Absolutely. But there should at least be a research base for it, and parents should be part of the process if you're going to adopt a new approach. It's one thing to try something that's not supported by research. It's another thing to have research evidence out there that says one is better than the others and not know about it.


To me, that's really downright unethical. I'll be a little forgiving because research is my job and I understand that. But all I did was Google Scholar. They have access to Google Scholar. They could have looked at it in about 30 seconds just like I did.


[00:47:39] Anna Stokke: The culture needs to change really. We need to be holding the education system to higher standards for sure.


[00:47:44] Matthew Burns: If you asked me what would be the one thing I could change in education, my answer is always this: value for data and research. If teachers were good at consuming data, and administrators were actually concerned about what research says is effective practice, 95 percent of the things we discuss as problematic in education would be fixed.


[00:48:01] Anna Stokke: I want to ask about some of the education thought leaders; sometimes you could even call them education celebrities, I think. And we see this in both reading and math. Do you think maybe they get caught up in the positive attention they're receiving, and that this may be why they don't step back on the claims they've made, even when it comes to light that there are problems with some of the ideas they promote?


[00:48:30] Matthew Burns: I guess I won't comment on what their motivation is. I'll tell you my experience. So when I go to conferences like the Science of Reading Higher Ed Summit that just occurred, you know, I go to these types of conferences and people come up to me and they say thank you for your work and they'll take selfies with me and things.


I actually was the subject once of a scavenger hunt at a conference. One of the items on the scavenger hunt was a selfie with me. And you know, that's really flattering and seductive, and you leave thinking, ah, this is really great. And I think because of that, with people telling you your stuff is so great and that what you say is true, you sort of start to believe it.


And you get to the point where you can say, “Well, I think it's true. Therefore it must be.” So I think us researchers need to be more self-critical and self-reflective. We need to ask, “Is what I'm thinking based on evidence, or is it something I think is true because I think it's true?” So I really challenge other researchers to engage in that level of self-reflection, because you can get caught up in it really easily.


[00:49:33] Anna Stokke: So what lessons can we learn from reading about evidence-based instruction, and what can we take away to improve math instruction?


[00:49:42] Matthew Burns: I think recognizing what the research says. The science of math movement has not taken off, and that makes me sad. There's a small group of us who are pushing it, but there's no Emily Hanford for math, right? Maybe this podcast will be that; that'd be fantastic.


But, you know, there's no real parent push, no popular movement pushing for the science of math. And so one lesson would be to really get the parents involved, let them understand the situation, let them understand there are research-based alternatives. I think parents are powerful, powerful allies.


With the science of reading movement, in my 25 years, whatever it is, doing this, I've seen more change in reading instruction in the past five years than in the 20 years before it. I think that's because parents became really interested and really involved in pushing for change. We need that to happen for math.


If it does, we'll see change. And also, we have to recognize, early in the math movement, the value of actual research-based practices and what that means.


[00:50:46] Anna Stokke: So with the shift you're talking about in reading instruction, some states are now insisting that schools must use evidence-based reading instruction, right? And also, Columbia University recently dissolved the Teachers College Reading and Writing Project, which was Lucy Calkins's organization.


So these are both good things. 


[00:51:08] Matthew Burns: Yes. 

[00:51:08] Anna Stokke: I will make a point, though: here in Canada, I'm not so sure that things are moving as quickly as they are in the U.S. Like in my province, I think that some of the Fountas and Pinnell stuff is still being used in schools. So what was the major turnaround for reading? Was it Emily Hanford's podcast, or was it parents, or a combination?


[00:51:30] Matthew Burns: It was a combination. It was, you know, there was a group in the States called Decoding Dyslexia, and they are a group of parents who became interested in reading and saying, “Look, my kids aren't learning how to read, and we need to know what's going on.” Now, part of their solution was to identify children with dyslexia.


I'm not sure that's the solution. But part of the solution was, no, we need better research-based reading practice. And that predates anything, you know, Emily did, et cetera. In fact, I was working with Decoding Dyslexia in Minnesota back in like, you know, 2004, 2005, certainly by 2006, 2007.


So that grassroots movement happened way before. And then, I would argue, the science of reading movement came out of that. It's an offshoot of that work, that grassroots movement that started in the mid-2000s. Emily's great podcast was an extension, just a continuation of that, and it took this grassroots movement and made it much more widely known.


That's really what the effect of that was. But no, I think it was a grassroots movement that's been going on for about 20 years.


[00:52:36] Anna Stokke: Parents actually can have a huge impact. One thing I will say, though: it's very clear if your kid can't read or is struggling with reading. With math, I feel we've got a more difficult situation, because it can be difficult for parents to see that their kids are struggling with math. Or they might think, “Oh, that's normal.” There's sort of this idea in math that some people just aren't that good at math: “I wasn't good at math, so likely my kid isn't good at math.” And I think that might contribute to the issue with math. But I would also say that it actually makes it worse, right?


Because it's more likely that kids are just never going to get caught up in math because they don't have people advocating for them.


[00:53:30] Matthew Burns: Yeah, in math, we have this idea where it's okay to not be good at math. It's almost like, you know, in elementary school to be cool you can't be good at math, you can't like math. You know, I might be exaggerating to make a point, but it's at least acceptable to say, “I don't like math, I'm not good at math.”


If you were to say, “I don't know how to read, I'm not good at reading,” I mean, that would be, first of all, much more embarrassing probably, and second of all, I mean, red flags would go off, alarms would go off, people would come swooping in to do something, but you could absolutely say, “yeah, I don't like math, I'm not very good at math.”


It's like, “some of us just aren't good at math,” and I think that's really a problem. So your observation, I think, is well taken: part of the reason we saw the grassroots movement is that it is absolutely unacceptable to be struggling in reading, but for some reason we see it as acceptable to say, “Yeah, I'm just not good at math.”


[00:54:19] Anna Stokke: It's unfortunate because, you know, we have lots of careers in technology and AI and data science where you really need a strong math background; you need to be able to get to the point where you can take calculus. And so it's really something that we do have to turn around.


So what can people do? Do you have any ideas? Is there any way to speed up this process? For people listening, what can we do?


[00:54:46] Matthew Burns: Well, I wish I had a good answer to that, but I probably don't. There's a couple of things I would suggest. Of course, as I mentioned earlier, be active consumers, both of educational practice and of what your kids' teachers are doing, etc. I'll recommend a couple of resources.


I've mentioned Google Scholar a couple of times. That's a good one. Also, if you Google the letters IES and “practice guide”: the Institute of Education Sciences, which is the branch of the Department of Education in the States that deals with research and runs the What Works Clearinghouse, gets together researchers around particular topics, and they write these wonderful practice guides. They're written for practitioners, and they'll tell you, there's one on how to help struggling readers.


There's one on how to teach fractions. There are about 40 of them, and they're really well done. They're all free PDFs, so you can simply download one, and it'll tell you, based on the research, we think these four, five, six things are really important. So I encourage listeners to look at those. They're written for non-technical consumers.


They're really, really well done. I highly encourage it. So there are good resources. And also, if you're interested, John Hattie's book I've mentioned is quite good as well. So be an active consumer, be a skeptical consumer. Really be willing to take a look at what is going on, and don't be afraid to question it.


[00:56:04] Anna Stokke: Is there anything else that we didn't cover today that you want to add?


[00:56:08] Matthew Burns: I do wish we would emphasize science more, and by science, I don't mean traditional science. I mean the scientific method. I do wish the scientific method were taught more in elementary school and in high school. I think the scientific method is critically important to understanding human behavior and to knowing what works and what doesn't work.


And sometimes, in education and research, we've gotten away from the scientific method. I'd like to see us get back to it.


[00:56:34] Anna Stokke: That is a good idea, and that's good advice. So, I want to thank you so much for coming on today to talk to me about reading. I've learned a lot, and I think we can see a lot of connections with math. Hopefully there's a way forward, and hopefully we can turn things around for math, just like people turned things around for reading.


So, thank you so much for coming on today. It's been a pleasure to talk to you.


[00:56:58] Matthew Burns: Thank you. This was fun. I enjoyed it. Thank you so much. 


[00:57:01] Anna Stokke: More in just a moment. As we discussed in the episode, it's extremely important that kids develop automaticity with math facts. In the show notes, I've linked to a resource page where you'll find some resources for helping kids memorize times tables. I hope you enjoyed today's episode. I've got another great episode coming on December 15th. If you enjoy this podcast, please consider showing your support by leaving a five-star review on Spotify or Apple Podcasts.


Chalk and Talk is produced by me, Anna Stokke; transcript and resource page by Jazmin Boisclair. Subscribe on your favourite podcast app to get new episodes delivered as they become available. You can follow me on X for notifications or check out my website for more information. This podcast received funding through a University of Winnipeg Knowledge Mobilization and Community Impact grant funded through the Anthony Swaity Knowledge Impact Fund.
