
Ep 70. What to do when “Research Shows” shuts you down: A guide for parents and teachers

This transcript was created with speech-to-text software.  It was reviewed before posting but may contain errors. Credit to Canadian Podcasting Productions.


In this episode, Anna Stokke explores what to do when a math program or education initiative doesn’t seem to work, but you’re told "research shows" that it does.

Drawing on her personal experience as both a parent and educator navigating “research shows” claims, Anna explains how parents, teachers, and advocates can ask for evidence, evaluate what counts as credible research, and respond when weak claims are presented with confidence.  This episode is based on a presentation she gave at researchED Toronto in 2025.

Anna unpacks common tactics used to shut these conversations down, including shifting the burden of proof, overwhelming people with endless references, credential deflection, and denying that poor practice exists at all.  She also offers practical advice for parents and educators on how to counter these tactics and spot and stop the spread of bad ideas in education.

This is an essential conversation for anyone trying to push for better practice in schools and navigate resistance along the way.

This episode is available in video at www.youtube.com/@chalktalk-stokke

A Substack version of this episode, written by Anna Stokke as a guest writer for the Center for Educational Progress, is available at https://www.educationprogress.org/p/what-to-do-when-research-shows-shuts

TIMESTAMPS

[00:00:22] Introduction
[00:01:35] What to do when something doesn’t feel right
[00:02:48] Why Anna got involved in math education advocacy
[00:06:38] Understanding the phrase “Research shows” in education
[00:07:01] The Wildfire Effect: How bad ideas spread
[00:09:25] How to ask for evidence
[00:09:57] Burden of proof fallacy
[00:11:04] Firehose Effect: Overwhelming you with articles
[00:12:16] Overcoming the Firehose Effect
[00:13:01] Credential deflection
[00:14:59] Gaslighting: When you’re told the problem doesn’t exist
[00:16:11] Evaluating the evidence
[00:17:50] Fuzzy terms: Critical thinking, conceptual understanding, number sense, curiosity, differentiation
[00:19:26] Become informed
[00:18:36] Resources that can help
[00:20:16] Final thoughts

[00:00:22] Anna Stokke: Welcome to Chalk and Talk, a podcast about education and math. I'm Anna Stokke, a math professor and your host. Hi, and welcome back to Chalk and Talk.

 

Today I'm doing something a little different, no guest, just me. This episode is based on a talk I gave at ResearchEd Toronto in June 2025, called Research Shows How Bad Ideas Spread in Education. I've also written a version of this episode that's been published on the Center for Educational Progress Substack, which I'll link to in the show notes.

 

Please check it out and share the episode or the Substack article with colleagues or parents who could benefit. I'll also be doing a follow-up episode with some people from the Center for Educational Progress where we'll talk about some examples, things like the failed detracking experiment in San Francisco, updates on the New York math briefs, and some flawed education research that's impacted policy. But today's episode is really meant to be practical.

 

Think of this as kind of a prequel to my episode with Ben Solomon on red flags in education research. That was episode 23. In that episode, we talked about how to spot problems in education research articles.

 

[00:01:35] This one is about what to do when something doesn't feel right, when you're seeing math programs or approaches being introduced in schools that don't seem to work, and you're being told they're research-based. How do you actually ask for evidence? What counts as evidence and what doesn't? And what do you do when people try to shut the conversation down? Because once you start asking for evidence, you will probably find that the pushback often follows very predictable patterns. And if you know what those patterns are, you're in a much stronger position.

 

So that's what we're going to talk about today. Note that this episode is available in both audio and video. The video will come with some helpful on-screen text.

Now, on with the show.

 

[00:02:25] This episode is based on a talk I gave at ResearchEd Toronto in June 2025. And it's a talk I've wanted to turn into audio because the questions it addresses come up constantly, from parents, from teachers, from people who are trying to push for better practice in schools but keep hitting walls.

 

I'll start at the beginning. People often ask me how I got involved in math education and why I'm so persistent about calling out poor practice. And the answer is, it's personal.

 

My husband and I are both mathematicians. We sent our daughter to school expecting she'd be taught math. After all, that's what schools do.

 

They teach kids to read and they teach them to do math. But by grade three, we started to notice something was off. Most days, her class either worked on a problem of the day, which was often something students didn't have the skills to solve, or they were taught confusing, overcomplicated methods for doing basic arithmetic—methods that seemed to make it almost impossible to learn how to do basic math operations.

 

[00:03:40] Then we were invited to a parent math information night at the school. The flyer asked, what should math look and feel like? How do we help children see that math is a subject where thinking, not just remembering, is the main event? Who could be against thinking? Certainly not me. But if I'd known then what I know now, I'd have recognized that framing immediately as code for no remembering at all.

 

But my husband and I were hopeful, and we went to that information night. However, that's where things really took a turn. We were told that the math curriculum discouraged standard algorithms, the traditional vertical methods for arithmetic, in favor of invented strategies and less efficient procedures.

 

[00:04:31] We were told this promotes understanding. As mathematicians, we were skeptical. We looked around the room.

 

Parents seemed satisfied because they trusted the school to do what's best for children. Now at that parent night, they gave us a research paper that was supposed to prove to us that standard algorithms are harmful. But the paper reported on a small case study with children with learning difficulties.

 

There was no control group and no statistical analysis, and the author drew conclusions that didn't follow from the evidence. So that paper didn't support what we'd been told at all. And that was my first real encounter with education research, and I was not impressed.

 

[00:05:22] I wondered how a flawed education paper was allowed to shape the way children were being taught math. And why was no one asking questions? Our daughter's school wasn't an outlier. The same patterns were playing out in classrooms across the country.

 

And that moment set me on a path that I'm still on today to push for better standards in math education. Over time, I've learned to read between the lines, to ask pointed questions, to look closely at what's being presented as evidence, and to never take education claims at face value. I'm going to share what I've learned, and I hope it helps other parents and teachers.

 

[00:06:08] So, the first thing to understand is that the phrase "research shows" doesn't carry the same weight in education as it does in other fields like medicine or science. In education, it might refer to a blog piece, an opinion piece dressed up as evidence, or a small, low-quality study. Even a published journal article in education needs to be scrutinized, because a surprising amount of education research is quite low-quality.

 

But the phrase "research shows" is powerful and persuasive. It makes opinions sound like fact and lends authority to claims that haven't been tested properly. And once a claim gets repeated enough, it starts to feel like established truth.

 

[00:07:01] I call this the wildfire effect. Here's how the wildfire effect works. Number one, a flawed study or opinion piece gets cited by an influential educator.

 

Number two, it gets repeated at education conferences, in professional development sessions, and on social media. Number three, it starts appearing in district documents, in books, and other education papers. Number four, it then gets cited as well-established research.

 

Number five, it becomes justification for education policy. And at no point in that process is the evidence seriously examined. A good example is the claim that timed tests cause math anxiety.

 

[00:07:55] This is not supported by high-quality research. But it traces back largely to an opinion piece by a prominent math educator, and it has been repeated so many times that many educators believe it is settled science. Meanwhile, the Institute of Education Sciences, one of the most rigorous sources of research guidance, actually lists timed activities as a research-informed strategy to support students who are struggling with math.

 

So, we have an opinion piece generating widespread belief and rigorous evidence pointing in the opposite direction, all because of the wildfire effect. But what's at stake? When weak or non-existent evidence drives education decisions, students don't get effective instruction. Struggling students fall further behind.

 

Teachers are misled, sometimes for years. Resources get wasted on programs that do not work. And the high-quality research that does exist gets drowned out.

 

[00:09:06] Remember, a PhD, a position of influence, or a published book is not proof of accuracy. Anyone can write a book, and there are plenty of folks with PhDs who spread pseudoscience. Evidence is what actually matters.

 

And more teachers and parents need to be equipped to ask for it and to recognize when they're being shut down. So, let's talk about how to do that.

 

[00:09:37] So step one is to actually ask for evidence. But when you start asking for evidence, you'll quickly discover that there are some very predictable tactics used to avoid providing it. I want to name four of them so you can recognize them when they happen to you.

 

[00:09:57] So, tactic one is the burden of proof fallacy.

 

This is when someone makes a radical claim but refuses to back it up and instead shifts the burden of proof onto you. So let me give you a real example. Suppose someone says, research shows standard algorithms are harmful.

 

You ask, can you give me the evidence for that? And they respond, you need to show that they're not harmful or something to that effect. That's the burden of proof fallacy. Now as Carl Sagan said, extraordinary claims require extraordinary evidence.

 

[00:10:37] The burden of proof belongs to the person making the claim, especially when that claim challenges established practice or if the claim is really radical. Saying standard algorithms are harmful is a radical claim that goes against long established practice. The onus is on the person making that claim to support it.

 

Don't let the burden of proof be shifted onto you. Now tactic two is something I call the firehose effect. If the evidence for a claim is shaky, one way to avoid scrutiny is to overwhelm you.

 

I've experienced this repeatedly. I ask for evidence and I'm told, read this book and it has 400 references. Or I receive a list of 30 articles.

 

 

[00:12:28] The reasoning is simple. It's impossible to check the validity of hundreds of references. The best example I know of where someone actually overcame the firehose effect is Brian Conrad, a Stanford math professor.

 

When the California math framework was released as a draft, a thousand page document, he went through every claim and every citation. He found repeated misrepresentations of sources, non-peer-reviewed articles, sweeping generalizations. He documented everything in a public critique.

 

Now most of us can't do what Brian Conrad did, but there are things you can do. So first, be specific. Ask for two or three high-quality studies, not books.

 

[00:12:24] Second, if you do receive a large reference list and you have a group of people that can help you, divide up the work. My colleagues and I recently did exactly that when an education professor claimed that requiring teachers to take additional math courses made them worse math teachers. A radical claim.

 

We asked for evidence, and she sent 22 articles. We divided them up, read every one, and wrote a report. Not one of the 22 articles supported her claim, and several contradicted it.

 

[00:13:01] Tactic three is credential deflection. This happens when instead of legitimately engaging with you, someone questions your right to be part of the conversation. I'm a mathematician, I've been teaching math for over 20 years, and this still happens to me.

 

Here's an example. A contract instructor in a faculty of education once wrote this about me when I published an opinion piece about improving math education in my province. I'm going to read it to you directly.

 

Let me stress that her perspective as a mathematician is far different than that of a math educator. Many of the statements she makes are a reflection of her lack of knowledge regarding effective practice. So, he's saying that I'm not the right kind of person to be making recommendations on math education.

 

This is an ad hominem attack, criticizing the person instead of engaging with the argument. And there's an implicit claim buried in it that only people trained in education are qualified to evaluate education research, as though there's something special about education research that the rest of us can't understand. That is preposterous.

 

[00:14:24] So first of all, a mathematician is often better placed to identify weak methodology in education papers: missing control groups, flawed statistical analyses, conclusions that don't follow from the data. And honestly, critical thinking skills are often all that's needed to assess the validity of many education papers. If you're being dismissed because of your credentials, it usually means the person doesn't want to engage with you or is unable to provide the evidence you're asking for.

 

[00:14:59] So if someone attacks your credentials, redirect them to the only question that matters, please provide evidence for your claim. Tactic four is gaslighting. So gaslighting is when someone tells you that the problem doesn't exist, that you're imagining it, that poor practice simply isn't happening in schools.

 

[00:15:24] I've been told more than once that inquiry-based learning is rare, that most classrooms are dominated by direct instruction, that discovery learning is a straw man, and that I'm just making things up. A simple response to this? Look at the professional development. What is being offered to teachers? How often does it focus on explicit instruction, retrieval practice, building fluency with basic facts, or direct teaching of critical math skills? Compare that to how often professional development focuses on building thinking classrooms, growth mindset, or inquiry-based approaches.

 

[00:16:11] Professional development, school newsletters, and school improvement plans reflect what school systems believe teachers should be doing. If you're a parent, your child's schoolwork is also evidence of what's going on in math class in your child's school. Now step two, evaluate the evidence.

 

If you do manage to get actual research articles in response to your questions, which in my experience is less common than you'd think, you will need to assess them. Let me quickly cover what doesn't count as evidence and what red flags to look for. So here are some things that don't count as evidence.

 

Opinion pieces are not evidence. Newspaper and magazine articles are not evidence. Articles that haven't been peer-reviewed are not evidence.

 

And position statements from organizations, including from influential bodies like the National Council of Teachers of Mathematics, that's the NCTM, are not evidence, even if they cite studies. You need to look at the studies themselves. There are five red flags that Ben Solomon provided that we discuss in detail in episode 23.

 

I'll put a link in the show notes but let me highlight the most common one I see in math education research, and that is vague, unmeasurable outcomes. [00:17:51] Math education is full of appealing but fuzzy terms. Critical thinking, conceptual understanding, number sense, curiosity, differentiation.

 

These terms are used constantly. The problem is they often lack clear definitions and are very difficult to measure. So, if you read that a program promotes critical thinking, that's a red flag because there really is no standard definition for what that means, making it impossible to measure.

[00:18:27] Another thing to watch out for is when programs get labeled as research-based, but the underlying studies didn't actually measure whether students learned math. Building thinking classrooms is a great example. It's widely described as evidence-based.

 

But the study most often cited for it measured student engagement, not whether students actually learned math. And engagement is not learning. Students can be highly engaged and learn very little.

 

 

[00:19:01] And I'll link to an episode where Zach Groeschel and I discussed the lack of evidence for building thinking classrooms. If a program is claiming it's evidence-based, the evidence needs to show that students learned more math. That's a minimum bar. Check out the show notes for more red flags.


[00:19:23] Step 3. Become informed. The best defence against bad ideas in education is to build your own knowledge of what the evidence actually says.

 

And there are excellent resources for this. I'd point you to four in particular. The Institute of Education Sciences Practice Guides are freely available. They're online, rigorous, and practical. The National Mathematics Advisory Panel Final Report. The National Center on Intensive Intervention.

 

And the Education Endowment Foundation, which is a UK-based organization that does great synthesis work on what improves student outcomes. I'll put links to all of those in the show notes. And of course, keep listening to Chalk & Talk.

 

[00:20:16] We bring on researchers and practitioners from around the world who are working at the intersection of evidence and classroom practice. That's exactly the kind of knowledge that will help you push back, ask better questions, and recognize when you're being misled or shut down. Now I want to close with something that drives all of this for me.

 

Our daughters have mathematicians as parents. When the schools were failing them, we had the knowledge and the confidence to fill the gaps. Most children don't have that advantage.

 

[00:20:53] Most parents trust that when the school says "research shows," there's actually research behind it. If we want better outcomes for children, we have to stop accepting "research shows" at face value. We have to start demanding evidence, evaluating it carefully, and refusing to be shut down when we ask questions.

 

[00:21:18] It's not easy. People will question your credentials. They'll overwhelm you with citations that you can't check.

 

They'll tell you you're wrong about what's happening in classrooms. Stand firm. Redirect every deflection back to the argument.

 

Our children deserve nothing less than instruction that's actually grounded in high-quality evidence. Thanks for listening. I'll link everything we discussed in the show notes: the Substack version of the article and the resources I mentioned.

 

[00:21:50] If this episode was useful, please do me a favor and share it with someone who could benefit. Thank you. Thank you so much for listening.

 

[00:21:59] If you enjoy this podcast, please consider showing your support by leaving a five-star rating on Spotify or Apple Podcasts. Don't forget to subscribe on your favourite podcast app or on YouTube so you never miss an episode. You can stay connected with me on Instagram, Facebook, TikTok, X, Bluesky, or LinkedIn.

 

All links are in the show notes, and check out my website, annastokke.com, for more information. This podcast is funded by a grant from La Trobe University and from the Trottier Family Foundation through a grant to the University of Winnipeg to fund the Chalk & Talk podcast.

Anna Stokke

Department of Mathematics & Statistics

The University of Winnipeg

515 Portage Avenue, Winnipeg, Manitoba

Canada R3B 2E9

204-786-9059
