Joseph South & Jessica Garner

February 23, 2026

Pedagogy First, Tech Second: What Schools Get Wrong About AI (and How to Fix It)

Two big truths can coexist: AI is powerful, and schools can absolutely make a mess of it.

In this episode of AmpED to 11, Brett Roer and Rebecca Bultsma sit down with Joseph South (Chief Innovation Officer) and Jessica Garner (Managing Director of Innovative Learning) at ISTE + ASCD to talk about what it looks like to do AI properly in K-12 without turning it into a culture war.

They get into:

  • Why most “tech problems” in schools are actually pedagogy problems
  • The ISTE + ASCD merger and why the pedagogy conversation and the tech conversation must become the same conversation
  • What teachers are already doing with AI that’s genuinely useful (and actually saves time)
  • The real reasons some educators resist AI (and why those concerns are not naive)
  • How to vet AI tools beyond shiny marketing, including transparency, privacy, and red-teaming
  • Why engaging parents and families early matters, and how to do it without panic

This is the grounded, no-drama version of AI in education: clear, human, and practical.

Listen now and join the conversation.

Jessica Garner: [00:00:00] They’re really thinking about how do I protect my students? How do I make good financial decisions for our schools and our districts? How do I make sure that these tools are really connected to the pedagogical practices that we want to promote?

Joseph South: You know, how are we engaging parents in these conversations?

Joseph South: How are we engaging families in these conversations? How do we bring the voices to the table that are being left out, that feel left out, and that, if we don’t bring into the conversation, we’re gonna find ourselves with a lot of friction moving forward.

Rebecca Bultsma: Using AI to get you started might actually be sabotaging us.

Rebecca Bultsma: Instead, we need to be thinking through the complexities and the routing and the benchmarks and the streets, and creating that strategic plan in our mind, and then using AI to fill in the side streets.

Brett Roer: Welcome everyone to the AmpED to 11 podcast. I am joined as always by my incredibly brilliant co-host, Rebecca Bultsma. Rebecca, how are [00:01:00] you doing today?

Rebecca Bultsma: I’m good, but I’ve spent the week, uh, just completely immersed in Claude Code. It’s running my whole life in great ways, but I feel like I’ve been “Claude-pilled,” they’re calling it, where it’s like all I do; I’m coming up with new and creative ways to use it and talking about it a lot.

Rebecca Bultsma: So I just kind of feel like I’m down this rabbit hole in Wonderland. So it’s good when I have things like podcasts on my calendar to pull me out of the, uh, the Claude Universe. But other than that, doing great.

Brett Roer: Alright, well we’re glad you’re here. I also started using that and I told someone today that my brain hurts in a good way.

Brett Roer: After spending a lot of time yesterday trying new tools and trying new things with AI. So I think we’re in the same space. But luckily, we are joined by two incredible leaders at the intersection of education and innovation. We are joined today by Joseph South, the Chief Innovation Officer at ISTE ASCD, as well as Jessica Garner, the Managing Director of Innovative Learning at ISTE ASCD.

Brett Roer: [00:02:00] Thank you both for joining us today. How are you doing?

Joseph South: I’m doing great. I wish I was immersed in Claude Code, because I’ve been immersed in Snow Creek, which is, um, the icy snow that’s fallen on the DC area. Um, it’s a crazy town out here.

Jessica Garner: Yes. Across Virginia. Um, this is day seven of no school for my child, and tomorrow will be day eight of no school.

Rebecca Bultsma: Oh my goodness. Wow. Not to pull out my trump card, but you know, you’re talking to a Canadian who lives in the Rocky Mountains, right? So I feel like, uh, I outwin all of you. But I’m sorry you’re having to deal with that, because I’m sure you don’t live where I live by choice. So you have my fullest sympathy for your horrible winters.

Rebecca Bultsma: Know that I have every ounce of empathy for you because I have experienced exactly what you’re going through and right now too. It’s quite the winter.

Brett Roer: Seven days. That is impressive. I think, Rebecca, as always, we know, like, Canada’s ready for this. Many parts of the United States have not experienced something of this [00:03:00] nature.

Brett Roer: So you have our empathy, Jessica, um, and I hope we all get to experience some warmth coming up in the future. But we are so excited to have you both here today. Something we’d love to do for our audience is really let you tell your own story, right? So you’re both leaders at ISTE ASCD, you’re really changing how people are navigating education, um, in the age of AI.

Brett Roer: So if you could both take a moment and just share your journey, your why. How did you really arrive at this moment in the work you’re leading at ISTE ASCD? Jessica, why don’t you kick us off?

Jessica Garner: Okay. Um, so I’ve been with ISTE ASCD almost a year and a half at this point, and I came from 27 years in public education.

Jessica Garner: I worked for 13 years as a teacher, um, and then was the North Carolina Teacher of the Year in 2009-10, which kind of shifted my whole trajectory. Um, I worked for a couple years at the State Department and then led some different district initiatives for about 11 years in two different districts.

Jessica Garner: And so I, you [00:04:00] know, I think I come to this work with less of a technology background and more of a curriculum and pedagogy lens, really with the heart of a teacher looking at all of this. Um, and so I think for me the why is really about, uh, thinking about the learning experience for students.

Jessica Garner: Everything that I did as an educator for 27 years, working in schools and districts and at the state level, was about ensuring a great education for our students. And so I think that that’s my why for this as well. I’m trying to create a world that I want my kids to live in, that I want, hopefully, someday grandkids to live in.

Jessica Garner: And so I think that we are at a really interesting point, not just in education, but in the world right now. And so I’m just trying to help change a little bit of that if I can.

Brett Roer: Amazing. Joseph?

Joseph South: Yeah. So part of my why is working with people like Jessica, um, that was a fantastic answer and it is really great in this seat that you work with people who are so [00:05:00] committed, um, to helping educators and students thrive.

Joseph South: So I, you know, my background, I started out in instructional design. Um, really excited about using technology to design new learning experiences that weren’t possible without technology. And I stayed excited about that. But I also, along the way, found that, um, a lot of what I was designing wasn’t being used.

Joseph South: And part of that was because I was designing it in a way that required technology that was beyond what schools had available at the time. And part of it is because even though I was doing my research and basing it on, uh, you know, research-based pedagogy, those pedagogies weren’t always accepted in schools.

Joseph South: And so I really had to do a reset, um, and ask myself, if I wanna have impact, what do I need to be doing differently? Um, and so ultimately that led me to the US Department of Education, where I was the director of the Office of Educational Technology. Um, and then from [00:06:00] there on to IDEO, a design firm, um, for a year, and then to ISTE ASCD, where I’ve been for almost 10 years.

Joseph South: And I think what changed for me during that time is I realized that, one, we gotta meet educators in schools where they are. Um, we’ve got to, you know, work with what we have in front of us. And I love the people who want to go in and blow everything up and make everything new, and we need people who are doing that, but most students are going to experience public school in a very similar way to the way it’s been for a long time.

Joseph South: And so I really gain energy by working in those settings and helping them get where they need to go.

Rebecca Bultsma: So for our listeners who might not be familiar with your organization, how would you explain, uh, ISTE? How would you explain this merger and your updated name? Joseph, tell us a little bit about, uh, what people need to know about the organization and the changes.

Joseph South: [00:07:00] Yeah, so I am super excited about the merger that we’ve done, um, in part because there’s been a technology conversation happening over here and a pedagogy conversation happening over here, and they need to be one and the same conversation. And so what we find is that when schools bring the chief academic officer in the same room with the chief technology officer and they bring in the leaders and they say, what are we trying to accomplish from a teaching and learning point of view?

Joseph South: Okay, great. Now how can the technology support that? That’s when we have a really interesting conversation. And most of the time when technology is not effective, it’s a pedagogical failure, either because we didn’t consider the pedagogy first, or because we just didn’t consider pedagogy at all, right?

Joseph South: We just completely left it outta the conversation. And so the merging was to bring [00:08:00] those two conversations together. Um, and I think that some of the ways that that impacts, um, the ecosystem is we now have a merged conference. So our annual conference is the last week of June. It’s gonna be in Orlando this year.

Joseph South: And you can find just as many sessions about research-based pedagogy as you can about, um, the powerful use of technology. Um, we have publications that span that whole continuum. We have professional learning, um, online courses. Also, Jessica leads all of our, um, face-to-face work with school districts.

Joseph South: And then we also have certifications, um, where you can really go deep and learn. And so I think the collective conversation has been really powerful.

Rebecca Bultsma: I love that, especially as a researcher. I sometimes struggle at ed tech conferences because it’s a lot of hype about technology. And then you read things by Larry [00:09:00] Cuban, you know, like Oversold and Underused, all of these things.

Rebecca Bultsma: And there is, at a lot of those conferences, a disconnect. And so I think that is great. Jessica, my question for you is, um, what opportunities do you see emerging from this partnership that might not have existed before?

Jessica Garner: Hmm. No, that’s a really good question. Um, so, you know, I actually just spent the entire day today with some educators in Northern Virginia, and they’re thinking about how they innovate in their schools and what this looks like.

Jessica Garner: And so more than ever, the conversation, I think, between what happens with technology and what happens with pedagogy, they’re intersecting right now. Um, so I’ll just give, you know, a couple of examples I heard today: teachers and leaders talking about, um, using AI to analyze, what are the needs of the students in front of me, right?

Jessica Garner: So taking out names, but thinking about how do I put in IEP goals? How do I put in, you know, 504 accommodations, my students who are multilingual learners, like their goals, and then just other kids that I have in my classroom. [00:10:00] Like, what are everybody’s needs? Take off all the names, feed that in and say, okay, what do I have going on in my classroom?

Jessica Garner: Where are there similarities? Where are there differences? Help me meet the needs of these students. And so it’s coming at it from a lens of what are the activities that I need to do with my students to help them meet their goals, to help them reach the standards, but it’s how does the technology support that?

Jessica Garner: And how does the technology enable me to actually have this analysis in a way that would’ve taken me hours to do as a teacher before? So that’s where I see some of this power kind of coming together. And I think we’re also seeing at ASCD some really interesting overlaps. We have, um, a set of authors that have written a book called AI-Enhanced Literacy.

Jessica Garner: Right? So looking at reading and writing in the age of AI. And so it’s this idea of we’re not just gonna talk about the technology for technology’s sake, and we’re not gonna talk about reading and writing and pretend that the technology doesn’t exist. It’s how do we bring these two things together, and how do we help [00:11:00] educators understand that, you know, there are technologies that are in the world that our kids are gonna be working with.

Jessica Garner: Industry has already moved on, so how do we bring that together in the K-12 classroom? I think it’s pretty exciting.

Brett Roer: That’s incredible. And so in that spirit, you know, you’re really focusing on what you talked about in Northern Virginia, but if you and Joseph could expand on that: I’ve been honored; I get to see a lot of the work you’re doing in person with leaders across the country and educators.

Brett Roer: When you have those educators in person and you’re convening these, uh, organizations and educators around the country, what is, right now, getting you most excited, and I’ve seen you all in person, what is filling your cup when you see the kind of movement they’re making? And then what are times where you’re still like, it’s really deflating, or it drains your battery?

Brett Roer: What are things that you see in both of those phases? Um, Joseph, what have you been noticing out there?

Joseph South: I mean, I’ve been really impressed with the degree that educators are willing to experiment and try things that are new to them, things they’ve never done before. You know, [00:12:00] educators are busy. They’ve got enough going on; they don’t need new things in their lives.

Joseph South: Um, yet they understand that this technology is so powerful that they’ve got to get their arms around it. And so when I see them digging in, um, it really heartens me. I also love to see places where students as AI creators is already coming to the forefront. So rather than just thinking about AI as, like, a tutor or something like that, they think of AI as a tool that a student can use to do something amazing.

Joseph South: So for example, we run an innovator challenge for students every year, and they submit an app that they’ve created with AI that addresses one of the UN Sustainable Development Goals. And one of the examples of that is there was a team in Mexico that created a glove that digitizes hand movements and [00:13:00] then translates it into ASL, uh, I guess, uh, Spanish ASL, which can then, by using AI, be translated into any language.

Joseph South: Um, I mean, it’s amazing. We could not do that in the past. I just about knocked over my water, I’m so excited. Um, that’s the kind of thing where I start to really see the potential. The draining side of things comes when we get into all-or-nothing debates around this technology. Either we think everything should be AI and it should be everywhere and it should take over everything, or we think we should just ban it.

Joseph South: Um, I really think that’s a waste of time. Um, none of us want to use technology all of the time for everything, and all of us would be lost without technology some of the time. Um, and so I wish we could get past some of those, uh, extreme positions.

Brett Roer: Absolutely. Jessica.

Jessica Garner: I get really excited when I see educators willing to try [00:14:00] things, and when you hear them say things like, after 24 years of teaching, I feel energized. That is the thing that really, really fills my bucket, right? When I hear people saying, I feel like I’m getting to the place in teaching that I always wanted to be, and I’m 24 years in. So that’s super exciting, and I hear stories like that all the time, you know, across the country, across school districts. When teachers are starting to use these technologies and think differently about their instruction, and using some of the efficiency of AI in transformative ways, that’s what really, really gets me excited.

Jessica Garner: I think for me, what drains my battery, um, is how complicated we’ve made this in many instances, right? So instead of thinking about this as another tool that can support instructional practice, as we always have, it has become this giant thing. And a little bit to [00:15:00] Joseph’s point, right? Like, there’s this debate about whether it should be all in or not.

Jessica Garner: I think that part to me is the discouraging piece, because I see that if one person can pick it up, use it for one small thing, and then change one little piece, it starts, you know, it lights that fire. But if we make it this giant debate, this “oh, we have to have all these things in place.” No, we do have to have safe, responsible, ethical use in place.

Jessica Garner: But it doesn’t take, you know, for lack of a better term, an act of Congress to make that happen. There are small things that can happen, um, that make a really big difference, that get people excited about it.

Rebecca Bultsma: I have a follow-up question to that, because I think in all of our practice, in different ways, we’re encountering people who are kind of all in and people who are hesitant.

Rebecca Bultsma: And I’m curious what you’re seeing when you run into educators who are resisting AI. Is it a resource thing? Is it a time thing? Is it a fundamental belief thing? Is it an [00:16:00] overwhelm? I’m curious what you’re seeing and, um, what your read is on that.

Jessica Garner: I’ll jump in and just start, and then Joseph, you can feel free to share some thoughts.

Jessica Garner: So I think it depends on the person. Some people are very philosophically opposed to AI because they’re very worried about cognitive offloading, which is a real thing. If you don’t design your activities in the right way, you absolutely can use AI as a tool to just do your work for you.

Jessica Garner: Right? I think that is a thing, but it’s not that different from what we might have said Google could do, you know, five years ago. So that’s one piece of it. A lot of people are very, very concerned about the environmental impact of AI and are just kind of opposed to it because they can’t get past that piece of it.

Jessica Garner: Right. Um, some people haven’t had training and so they’re like, I’m just not comfortable with it. I haven’t had time to play with it myself. I don’t know how this could be used as an educational tool. I don’t understand it. And so I’m just not ready to go there yet. Um, and then some people are just more hesitant [00:17:00] thinking about where it could go.

Jessica Garner: Some of the science fiction pieces of it, they’re like, I don’t want to depend on it at all, even if I do understand it and do see the value in it, because I’m scared of what it might be down the road. So those are just some of my, I don’t know, just my observations.

Joseph South: Yeah. And, and those are, you know, those are legitimate.

Joseph South: Right. And I think we just have to be careful that we don’t, um, act as if the objections are naive. Um, there’s a lot of learning science that says that learning happens best when two humans are working together with each other, um, to support our learning process. And if AI is interrupting that, then yeah, that’s concerning, right?

Joseph South: And people should be concerned. And so I think, um, while there’s a range of concerns, um, I feel a lot of, uh, empathy for those who are concerned that we are, [00:18:00] um, watering down the human aspects of teaching and learning.

Rebecca Bultsma: Just to pull on that thread about cognitive offloading, uh, I learned something this week that actually totally changed the way that I talk about using AI for educators that I’ll share here and get your take on some of the cognitive offloading things.

Rebecca Bultsma: Uh, but forever, you know, a bunch of us who do AI literacy and talk to people have told people to use AI to get you started, right? And then you can kind of take your human whatever and work on it, kind of this AI sandwich idea that I used to teach. And I read a study this week about London cab drivers.

Rebecca Bultsma: It’s been around since the early two thousands, about how London cab drivers don’t die of Alzheimer’s because they have to write this exam that takes four years to study for and memorize every route in London, and it makes their hippocampus bigger. But what they recently found out, like just last year, is the reason that works for them is because of how they think through their routes.

Rebecca Bultsma: The human part of thinking through what are the main junctions, what are my main pivot [00:19:00] points, and my strategy to get from here to there. And then they fill in the side streets as they go. And that actual thinking through the junctions, um, and coming up with a plan yourself, with the easy stuff coming later, actually changed the way that I think about cognitive offloading, realizing that using AI to get you started might actually be sabotaging us.

Rebecca Bultsma: Instead, we need to be thinking through the complexities and the routing and the benchmarks and the streets, and creating that strategic plan in our mind, and then using AI to fill in the side streets. Um, so it just goes to show exactly what you were saying, Joseph, about the learning science that we’re learning all the time and how AI is supporting it.

Rebecca Bultsma: But I’m curious on your thoughts about that, Joseph. It looks like you have some.

Joseph South: I mean, first of all, that is a fascinating study, and I really wanna read it, um, because I knew about the cab drivers, but I had not heard this update. Um, so I think that is a potential implication, right? Another way that I think about framing this [00:20:00] is, you know, when we think about AI and how we’re going to use it, um, I think ultimately we’re gonna become AI conductors, right?

Joseph South: So right now we’re sort of, like, we learn, you know, the analogy is you learn to play an instrument, right? You’re learning to do this thing or that thing. And in school it’s, like, divided into math or science or reading or whatever. Um, but then when we use AI tools today, we tend to go to one tool and ask one question and get one answer and then do something with it, right?

Joseph South: Which I think is kind of the, uh, workflow that maybe you’re referring to, Rebecca. Ultimately, I think what we’re going to find is that we have AI agents, and each one of them is very powerful and can do very specialized things. And it’ll be our job to figure out how to apply this problem across the agents.

Joseph South: And in my mind, that’s a very similar planning process to the cab drivers, right? So in the end, we might [00:21:00] still bring in AI fairly early in the process, but we’ll do it once we have a mental map of how we want to bring it into our problem space.

Jessica Garner: I also think it depends on the task that we’re doing with AI, right?

Jessica Garner: So this is something that, um, I’ve been thinking about a lot. And I think that as I’ve heard trainings and been part of trainings and have delivered trainings, I think there’s a real risk in giving, like, “this is how you should do it.” Because I think that it depends on the task; it depends on how we wanna interact with this technology.

Jessica Garner: And there’s so many different uses of it that I think there are absolutely times where you have to have your mental map in place. I also think that there are times that, if you’re completely stuck, getting AI to help you create that mental map and working together with it could also be beneficial.

Jessica Garner: Um, I just think it really depends on your level of expertise that you’re approaching a problem with. It depends on your level of interest. It depends on, [00:22:00] you know, how invested you are. If it’s not a task that really matters to you, then I think you’re gonna use AI in ways that are really different from a task that you feel very passionately about.

Jessica Garner: And so a lot of times when we are doing training with educators, we’re not even talking to them about AI; we’re talking to them about the eight transformational learning principles that we reference often as an organization. And you know, one of those is to create authentic experiences for students.

Jessica Garner: If we wanna shift and think about AI as a tool to really help our kids and help them build academic integrity, they have to care about their assignments; they have to care about the learning that they’re doing. So we have to make it authentic. We have to give them agency with it. And so I really think that, you know, our use of this technology, that’s gonna be on us, for how we design these learning experiences for students.

Brett Roer: Yeah, absolutely. You know, something that we got to experience, uh, in the [00:23:00] fall in Charlotte, which was the first time I had ever done it with many of the educators, you know, in collaboration with ISTE ASCD and Generation AI, which, in a moment, I would love for you to share: this incredible work you’re leading and the impact you’re trying to have.

Brett Roer: We did, for the first time, ask people, these education leaders specifically, to think through a problem of practice, then try to use just an AI tool alone to think through it deeper and ask questions of it. Then just speak to a human, like another member of the cohort. Then utilize AI again, and kind of show them, like, there are different portions, and then ask everyone about that experience.

Brett Roer: Like, what were the benefits of talking to an AI tool specifically, and then what were the benefits of talking to, um, one individual and then a team and then sharing out? And I think what was really rich was, similar to what y’all said, like, the times when they’re talking to an AI tool, it can get you maybe more technical, or if you’re using it correctly, it can have you, like, ask different perspectives about what you’ve already input and know.

Brett Roer: But when they talked to other humans who had been in similar shared [00:24:00] experiences, they asked really nuanced questions that are, like, you had to have kind of had that thing either go wrong or go right, and it really made an imprint on you emotionally, to be like, what about that? That was interesting, because both of those are really useful, and they mentioned, like, two different ways, but by utilizing them together, it really lets you move further along in the process.

Brett Roer: So just totally rationalizing and agreeing that there’s gotta be the right rationale and use case for when you’re using it and when not to use it, and then why. And I think we’re getting there. But I’d love for Jessica, and then obviously Joseph as well: could you share a little bit about what Generation AI is, the work ISTE ASCD is doing, and some of maybe your incredible partners who are, uh, working alongside you?

Jessica Garner: Yeah, sure. So Generation AI is an initiative that we launched, um, gosh, about a year and a half ago, which is crazy. And it is, um, work for which we have multiple funders: Google.org is a funder, General Motors is funding us, and we’ve partnered with six amazing organizations. We have partnered with, um, the Center for Black Educator [00:25:00] Development,

Jessica Garner: Computer Science Teachers Association and digitize, Latinos for Education, the NEA, and PlayLab. And with those coalition partners, we are really trying to reach over 200,000 educators to really help them think about teaching and learning in an age of AI and help them be prepared to lead in this work.

Jessica Garner: So we are doing all different kinds of activities. Uh, we’ve divided the work, which I really love, into deep learning experiences. So Brett, as you kind of mentioned, our convening in Charlotte was kicking off a year-long experience with educators, uh, a community of practice, and we go deep with them.

Jessica Garner: Um, so we have deep learning experiences. We have courses where folks spend several months with us, with facilitators, really digging into this in a deep way. We have some that are foundational, that maybe are not quite so deep, but we’re still getting a foundation with them. So it could be, um, some trainings, some courses, different things that we have to offer.

Jessica Garner: And then [00:26:00] we have what we call sparking engagement. So we know that, you know, this is gonna be the largest group of folks that we reach, because it’s hard to spend enough time with that many people to get a giant reach. So we’re sparking engagement by sharing videos of the work that we’re doing.

Jessica Garner: We’ve interviewed a lot of educators. We are creating videos that kind of summarize the activities we’re doing, just to get people’s thoughts out there. We’ve probably had 300,000 views on our videos to date. And then we’re also working on updating a framework that we have that helps educators evaluate ed tech tools.

Jessica Garner: And it was created before AI was really part of a lot of ed tech tools. And so we’re updating that framework to put some indicators in there: when you have a tool that now all of a sudden has this AI feature pop up in it, how do you make good decisions about ed tech tools? And so we’ve started to publish, um, some insights that we’ve gotten from educators, because we started that work by actually going and asking the field, what do you think should be [00:27:00] included in this work?

Jessica Garner: What are we missing? What needs to be in there? And so we’ve had several hundred thousand people look at those insights. People are very interested in this idea of how do we choose the right tools. And I think that because AI is embedded in so many tools, and schools already have a lot of these tools, and now AI is showing up, they’re really thinking about, how do I protect my students?

Jessica Garner: How do I make good financial decisions for our schools and our districts? How do I make sure that these tools are really connected to the pedagogical practices that we wanna promote? Um, so that’s some of the sparking engagement work that we’re doing, but it’s really exciting. Um, you know, PlayLab has been doing courses and providing folks with tools.

Jessica Garner: The NEA has created some free online courses that folks can take part in. We’ve done webinar series with them. Um, the Center for Black Educator Development, Latinos for Ed, and, uh, the Computer Science Teachers Association have all done these awesome communities of practice with their members and, you know, [00:28:00] the folks that are in their organizations.

Jessica Garner: So it’s been a really great collaborative effort that we have going here.

Rebecca Bultsma: I want to pick up a little bit on something you were talking about: choosing good ed tech tools. The average district has over 2,000 ed tech tools right now, and as an AI ethicist specializing in education, this is something I think about a lot.

Rebecca Bultsma: My work is in the UK, and the ed tech industry works a lot differently here. But there was a report that came out last year, actually one of my favorites: a forecasting report commissioned by the government of Canada about foresight connected to AI.

Rebecca Bultsma: One of the big risks that they flag is that data collected about children will follow them, and impact them, for the rest of their lives. That certainly is one of the biggest risks of a lot of these ed tech tools, right? Student data: how it’s being protected, how it’s being used and sold and traded.

Rebecca Bultsma: You [00:29:00] know, what’s happening in the background, how many hands it changes. Beyond that, though, tell us a little more about what you’re telling people about vetting the tools they use and how they use them, because that’s a question I get a lot, certainly from education leaders who have to sign off on these things and basically assume all the risk under the existing legislation.

Rebecca Bultsma: Right? What advice are you giving educators and education leaders, both of you? Joseph, any insights there?

Joseph South: Yeah, as you point out, this is a really tough area. It was hard enough to vet those 2,000 tools, and I’m not sure that, in a district with 2,000 tools, they were all being vetted anyway. But people who are serious about vetting tools have to look under the hood, right?

Joseph South: And it’s hard to look under the hood of an AI tool, because the bottom line is, we don’t know why generative AI tools work. [00:30:00] We are, as a species, still trying to understand them. That’s a pretty heavy thing to consider, and once again it validates concerns that people have about generative AI.

Joseph South: Right. And I just want to be clear: I hear that. That said, there’s a lot we can do to try to get a read on some of this. Number one, we can ask for, and demand, transparency from the provider. What engine are you using? How is it trained? What’s the source data?

Joseph South: What filters are you using to mitigate bias? Then, when you get to the privacy questions: is what users input being used to train your AI engine or not? Where do you store that data? For how long do you store [00:31:00] it? When do you destroy it? Et cetera, right?

Joseph South: Many of those questions we’d ask of non-AI tools, but they’re even more important in an AI setting. Once you get past the basics of how the thing is made, and the privacy questions, there are of course security questions, but those are pretty analogous to other ed tech tools: where you’re storing it, who has access, and that sort of thing.

Joseph South: But then I think you have to ask questions about what pedagogy is represented by this tool. Did anybody think through the educational theories and philosophies that went into it? Which ones was it trained on? Again, it’s not usually a question you ask about an ed tech tool, but it’s something you need to ask about these tools.

Joseph South: Then, finally, I would say [00:32:00] you really need to do a little bit of red-teaming. This is of course a term from the security field, where you actually try to break the thing, right? If I were bringing an AI tool into my school, I would certainly have some adults spend some time trying to send it off the rails, just to make sure that all the bumpers and safeguards that have been built in actually work.

Joseph South: Mm-hmm.

Jessica Garner: Yeah, those are some of the questions that we’re asking folks to consider. We haven’t yet put our updated framework out there for how to evaluate that; that’s coming in 2026, stay tuned. But we’ve asked some questions and gotten some of these insights. So we’re looking at: what is the impact on teaching?

Jessica Garner: What is the impact on learning? What is the impact on the ed tech design with these AI tools? And [00:33:00] then, what is your implementation strategy for them? Those are the four big areas we’re thinking about with this work. And when Joseph talks about the pedagogy pieces, we’re asking questions like: how does this product allow for greater personalization of the learning experience at an individual student level, right?

Jessica Garner: And how might it lift learning for all students? So we’re asking people: don’t just think about how much it costs or how long it takes to learn to use. Think about what this tool is actually going to do for teaching and for learning, and use that as some of your decision-making points.

Rebecca Bultsma: I’m looking forward to reading that when it comes out. Jessica, keep me posted. I’m always kind of the negative Nancy on this, so I’ll take a plot twist here and just ask: what are you seeing right now that’s genuinely exciting you about AI and the possibilities? Is there a new tool?

Rebecca Bultsma: Is there a new system? What are you seeing that’s genuinely [00:34:00] exciting you out there right now?

Jessica Garner: Thinking specifically about a tool... it’s funny, because this is probably the place where we spend the least amount of our time when we’re working with educators. We purposefully don’t talk a lot about tools.

Jessica Garner: We know that tools are part of this whole thing, and we know that you have to use them, but really this is about a fundamental shift in the way we think about teaching and learning. So I do want to preface it with that. But I’ve been excited lately about all of the different potential that NotebookLM has.

Jessica Garner: It just seems like such a versatile tool. It’s very easy to use, and it keeps changing; every time I open it, I’m like, oh, there’s a new feature. So I’ve personally been excited about NotebookLM.

Joseph South: I’m going to try not to mention any specific tools, because we try to sit that out; sometimes we say ISTE + ASCD is the Switzerland of the ed tech space.

Joseph South: But [00:35:00] one thing I’ve heard that I think is really fascinating: there’s a tool out there designed to capture students’ handwriting while they’re doing math problems (there’s more than one tool that does this) and then analyze it. That alone is already cool, because in the past there really wasn’t a way for a tool to capture that authentic work a student is doing.

Joseph South: But the thing that really captured my attention is that the same tool lets students do a think-aloud protocol and talk through what they did. The teachers were telling me that when the tool shows a student has made an error, the ability to jump to the place where the student is talking about it has been incredibly enlightening for the teacher.

Joseph South: And I just think: what a great application of [00:36:00] AI, connecting an educator to the out-loud thinking of a student on a particular problem. It’s not the sort of thing you think about first when you think about an AI tool, but it’s exactly the kind of thing AI enables that just wouldn’t be logistically feasible otherwise.

Rebecca Bultsma: It’s so interesting that you bring that up, because, I don’t know how closely you follow the news, this is what I do full-time; this is all I hear about and read about and think about. But Jony Ive has teamed up with OpenAI on this mystery device that he and OpenAI are building.

Rebecca Bultsma: He was obviously behind the iPhone, and it’s been this mystery, but there have been some leaks that it’s actually a pen that will sit on your desk, capture what you’re doing, encourage people to write, capture what they’re writing, and work exactly like you’ve described. The word nerd in me laughs at the O-pen, right?

Rebecca Bultsma: Like, O-pen AI. But I can see that really being great in [00:37:00] the way that you’ve described, and I’ll bet that’s exactly what it is: to kind of force us to go back. We know the connection. Well, I do anyway. I don’t learn properly unless I’m taking notes as I read, unless I’m thinking with a pen in my hand.

Rebecca Bultsma: And the way you’ve described it sounds really great. We’ll have to see. I’ll certainly try it out; I try everything out. But yeah, it’ll be fascinating.

Joseph South: Well, I hope you’ve parked “o-pen-ai.com” somewhere, Rebecca, so you can sell it back to them for millions of dollars.

Rebecca Bultsma: Brilliant, brilliant.

Brett Roer: Oh boy. Well, this is an excellent segue, because we can see the puns are flowing, and it sounds like you have the same wit as Rebecca does.

Brett Roer: So we’re now going to do a fan-favorite part of the AmpED to 11 podcast, where Joseph and Jessica, you are now the co-hosts of the AmpED to 11 podcast. You get to ask any burning question that you have for Rebecca and me about anything in the space that you think our listeners would want to learn about. [00:38:00]

Brett Roer: The floor is yours.

Joseph South: I’ve been really excited about this part of the podcast because, let’s face it, both of you are AI experts and you’re right in the middle of it. So the thing I think about a lot is: how are we engaging parents in these conversations? How are we engaging families in these conversations?

Joseph South: How do we bring to the table the voices that are being left out, that feel left out, and that, if we don’t bring them into the conversation, will leave us with a lot of friction moving forward? So what is the right way to do that?

Rebecca Bultsma: I have thoughts on this. My background before this is actually public engagement and working in school communications.

Rebecca Bultsma: Brett and I actually taught a session in the UK [00:39:00] about this, and when we helped a school district in California write a policy, this is the strategy that I employed. I believe in a framework called... oh wow, it just left my mind. Appreciative inquiry. Yeah, the appreciative inquiry framework, which basically involves the whole community, students, educators, everybody involved, and you work through it from a positive lens.

Rebecca Bultsma: What do you love about our school or our district? If everything goes right, what does that look like? What do we absolutely want to keep the same? All these positive questions, there are all these categories, and then you build a shared vision for the future together. And you work...

Rebecca Bultsma: ...towards it from that place of aporia, you know, where we don’t know, but we’re going to figure it out together. Looking to the future with positivity, bringing absolutely every single voice to the conversation, and having regular benchmarks and transparency and communication around that. [00:40:00]

Rebecca Bultsma: I think sometimes we get so used to thinking, oh, we have to know everything and act like we know everything. But going into it as a learner is so important, and it’s uncomfortable: that messy in-between space that all of our favorite theorists talk about. Even in transformational learning, Jessica, you know, that disorienting dilemma everybody’s experiencing at the same time right now that we have to work through.

Rebecca Bultsma: But bringing everybody along for the ride is so important. And this is what I tell districts too, from a public relations standpoint and a relationship-building standpoint: you get less pushback if you’re transparent about what you don’t know, if you can say, we all agreed that this is what we want and this is what we love, so this is what we’re doing, because we want to get here.

Rebecca Bultsma: Charting that course together, I think, is critical, and asking questions and listening the whole way. It’s easier said than done, but I think having those conversations with students, and with parents, matters. [00:41:00] Honestly, we talk so much about the educators, we talk a lot about the students, but we’re often dealing with a generation of parents...

Rebecca Bultsma: ...like our age, right? We didn’t grow up with a cell phone or social media; I got my first email address in college. A lot of people who aren’t digital natives, who, when their kid comes home and says, “I failed this test because they think I used AI,” don’t know what that means.

Rebecca Bultsma: So they don’t know whether to be defensive or not, or to go after the school. But once everybody’s on the same page, and we build that baseline level of understanding and literacy, the shared vision, the shared expectations, it just makes it all so much easier. Easier said than done, but I do really like appreciative inquiry as a starting point and as a framework to work within.

Joseph South: I love it. There’s a lot of wisdom there. Thank you.

Brett Roer: Yeah, that’s a tough one to follow. For the next question, I would prefer to go first, because she’s always intimidating. So, some similarities, right? [00:42:00] Previously, as a principal, without knowing the name of this framework that Rebecca introduced me to...

Brett Roer: ...I often would start with that. Pre-AI, I would literally sit my staff down with an open-door policy: anyone could come into my office during certain times of the year, I would do “five good minutes,” and I would just take freehand notes. You could tell me anything, and I wasn’t going to talk: anything that would help move the school forward in support of the community, our mission, or our values.

Brett Roer: And any stakeholder could do that. Now, a lot of the work that we do, and we’re doing this in a number of states with districts, is building community AI playbooks. While there are frameworks such as the ISTE Standards that we obviously use, it really goes into questions like those from the Rithm Project, and we ask a specific question.

Brett Roer: For example: would you be okay if teachers used an AI chatbot to prepare for a difficult conversation with a parent? Now, we know there’s no right or wrong answer to that per se, right? You don’t need an AI chatbot to do so, but what you hear afterwards will surface whether they think that [00:43:00] strengthens or erodes human connection.

Brett Roer: It will surface people’s values. And what we keep seeing with parents: when I lead parent workshops, I always start with, give me one word about how you feel about AI in education right now. Typically it’s apprehensive, scared, uncertain, curious-but-scared. By the end, we usually get to things like: I’m empowered, I’m fascinated, I’m ready.

Brett Roer: I want to try. What you do in that hour really comes down to just listening, and usually, through pre-work, asking people: what is the most challenging part about being a parent to your child today in the age of AI? So if they bring up homework, I teach them how to take a picture of their child’s homework that day.

Brett Roer: I always use math, because my son is a math wizard, and he just doesn’t think anyone in our family understands math, because we don’t use the same words as his teacher, and we didn’t grow up with Common Core math. So I take a picture of it, prompt any AI tool, and say: imagine you’re a parent; you know [00:44:00] the answer to this, but you don’t know how this child learns it in his school.

Brett Roer: I just take a picture of the front of the book and say: I assume my child’s teacher has the teacher’s edition of this book. What words should I use to guide them? Imagine you’re the best teacher in the world. Don’t give them the answer. That helps me, because sometimes I look at it and I’m like, I don’t know what they actually want as an exemplar.

Brett Roer: So it helps you get the exemplar and then figure out how to use his words, but never give them the answer. A lot of that is what usually comes up on calls when I ask people, what do you want to do in real time with AI? It’s that. So I go: great, someone take a picture of the homework you had an argument with your kid about.

Brett Roer: Let’s do it live. The other thing I do a lot is teach people “present parenting.” It’s simple, and Jessica’s seen me do this many times: take out your iPhone or any device, hit voice memo (it creates a transcript), turn it over, and beforehand, either your college advisor, or I playing the role of a college advisor, has people role-play: what are the questions you should be asking your child about college?

Brett Roer: Or career, or their future. And then [00:45:00] just don’t have any screens out. Don’t Google something in the moment. Just have a free-flowing conversation. Take that transcript, throw it into whatever AI tool you want (don’t say each other’s names, so there’s no PII), and see the map it creates, and then what questions spark from it.

Brett Roer: And I do that in real time. So basically, based on the demographic or age of the people, I ask what their frustration is, then show them a very quick, low-tech solution that actually brings humanity back together. The last thing I try to show people is how to overcome biases in AI.

Brett Roer: The thing I’ve found works best is to have someone describe their child in the future, living their dream, or describe them now in the way that you most love them, and get people to realize the kinds of ways you have to prompt an AI tool to overcome biases: just keep asking it. And if you’re not sure, I literally show in real time:

Brett Roer: maybe I don’t know what questions to ask. So you just ask it: I want to create a culturally relevant picture of my daughter; what questions should I ask myself? And then we just do it live. I take that transcript, put it in, they refine it, and we show people: this is [00:46:00] how you can use AI in a number of different ways.

Brett Roer: It’s really about you and the child, but it starts with what they want to accomplish. Those are low-lift ways to get people excited: okay, if I can do this, then what else can I do? That’s how I’ve been trying to engage people and bring the humanity back.

Joseph South: So there’s a really cool thing here, I think, across both of your answers, that I’m just pondering: you are both saying we need a whole lot more input from people into the conversation, into the tools.

Joseph South: I think a lot of us, when we see an AI tool, we’re like, oh my goodness, it’s like the oracle; it’s supposed to know everything; I just need to ask it a question. And you are giving the opposite advice. You’re saying: bring your ideas, bring your hesitations, bring your problems, bring your perspectives. Bring that first.

Joseph South: Then engage the tool. I just think that’s such a smart reversal. [00:47:00]

Brett Roer: Wow, thank you for encapsulating that. That’s actually what we’ve been trying to say, but you said it much more concisely and better, so thank you. Good hosting. Wow, Jessica: good luck.

Jessica Garner: So I guess my question for both of you is: where do you see the biggest gap right now between the vision for AI that you’re putting out there and classroom reality?

Jessica Garner: Where do you see that gap?

Brett Roer: First I’m going to start with what I’m hearing and seeing, but mostly, you know, we talk about what drains our battery, and I think it’s the uphill battle. For example, and this is a good example of how we use the community AI playbooks: I truly do not want a child on a screen unless there’s a clear rationale that this is a better use of their time, for whatever the outcome is, than not being on a screen.

Brett Roer: That being said, at a certain age level I do think you need to start scaffolding in AI literacy skills, which can start pre-screen, like I just mentioned; you can have conversations. A good example: I just presented to about 20 Bronx principals today, [00:48:00] and I was showing them some of the work I do. One piece is aligned to writing standards, showing students their future.

Brett Roer: All I did was, with a safely approved tool, have students put in an image of themselves, then write about their future goals and the career they want, aligned to New York State fifth-grade writing standards and the tests they have to take shortly. Then we extrapolate that picture into the future, and the child interacts with the adult version of themselves.

Brett Roer: For example, if they’re painters, they’re painting together; or they’re brain surgeons, and they’re in a hospital together helping patients. It’s mind-blowing, and they love it. And then they’re like, well, how could I do that here? What I’m noticing is that everyone has students writing fifth-grade writing assignments.

Brett Roer: But adding this little nuance, which takes virtually no time and is just one additional portion, changes the whole dynamic. So I really like to start [00:49:00] with: what do you want kids to learn? And even if you don’t change that, how do you now think with an AI mindset? Similarly, with AP courses, we did this at a district in California last year, and I still use it anywhere I coach about pedagogy: okay, fine.

Brett Roer: You want to teach the exact same AP Lit essay you taught last year? Great. Take that AP Lit writing sample you were going to do tomorrow in class. Tell the students the day before: I want you to use any AI tool you want; I want you to jailbreak, to cheat on this exam. Green light tonight. Then take the most important things you learned about how to ace this test and put them onto a one-page Google Doc for tomorrow: yellow light.

Brett Roer: You can take all that synthesized information and study it for 10 minutes. Now, that’s no different from when I was in high school with an index card. Same thing: your teacher said, here’s the test tomorrow; open book at night, one index card to study before you go in, and then everything’s off, right?

Brett Roer: Green light, yellow light, red light. But with AI, after red light, you take the 20-minute [00:50:00] writing sample, and now you have time left in class. You pass it up, and now you say, with consent: hey guys, let’s have a Socratic seminar, let’s have an open discussion. How did y’all cheat last night?

Brett Roer: Oh, Rebecca, you made a mind map in NotebookLM and turned it into a podcast; you made a video; I made charts out of it. And I just learned everyone’s personalized learning style: how does AI help their brain work? If you record that and put it into NotebookLM or any tool, you now know how each of your classes learns, and then you can differentiate the assignments.

Brett Roer: Nothing different, the same standards, but by scaffolding it and then listening, the sky’s the limit. And there are tools out there that do that. So that’s where I’m hoping to get people: you don’t want to change, and I understand why; you’re comfortable, you think this is a skill that’s important, and maybe it is, and maybe it’s what the state requires.

Brett Roer: So let’s build around that mindset, and let’s get you to learn more about your kids, or to [00:51:00] make them feel more excited and engaged. As you said before, Jessica, that’s how I’ve been trying to get them to think about moving forward purposefully, without feeling like kids can’t do the same skills they could pre-AI, because there’s validity to that concern.

Brett Roer: So that’s what I’m trying to do: meet people where they are, build the fun stuff around it, and hopefully inspire and unlock some thinking there.

Rebecca Bultsma: I think the biggest gap I’m seeing, and Jessica already brought it up, is that binary mindset people have: that AI is a cheating tool, or that you have to be either for or against AI.

Rebecca Bultsma: My son came home from college this week, and he was working on an assignment. I said, well, let’s use NotebookLM, or a tool I had built to test my own understanding, to have a conversation with you, to help you identify how you feel about this content, or your unique [00:52:00] perspective and angles.

Rebecca Bultsma: It will just engage you in a Socratic line of dialogue. And he said, no, I’m not allowed. I’m like, what do you mean you’re not allowed? In his actual course syllabus, it says AI tools are prohibited for generating ideas, for summarizing, or for helping you in any way with any understanding of texts. And I struggle with that.

Rebecca Bultsma: Obviously, there’s just this huge gap of understanding; that’s essentially what it is. I’m sure his professor believes that AI tools are cheating, or perhaps is fundamentally against putting texts into an AI model, for a variety of reasons, right? That I completely understand.

Rebecca Bultsma: But the why was never established, which is a different thing. And I think conflating AI that writes your assignment for you with AI as a thought partner that engages you in deeper learning, or deeper engagement with primary texts, [00:53:00] misses the point. I think that’s the best possible way to use it.

Rebecca Bultsma: That’s how I use it: to challenge my thinking and my ideas. The taxi-driver paper I read earlier: I give it that, I set up a PhD-level thought assistant, and I have it challenge me, ask me questions to think about things differently, and push back on my worldviews. I find that very, very useful, understanding that college kids and high school kids are programmed to find the path of least resistance, right?

Rebecca Bultsma: So I think that’s the biggest gap I’m seeing: that binary framing. Either it’s cheating or it’s useful, and there’s nothing in between. But I recognize that comes from a fundamental lack of widespread understanding, from people who don’t even know where to start with engaging with the tools themselves.

Rebecca Bultsma: So I understand why it’s there, but that’s the gap that I see.

Jessica Garner: So last week, actually: we have an author, Tony Frontier. He’s written several books, and ISTE + ASCD is his publisher. He wrote a book called AI with [00:54:00] Intention, and we did a webinar with him last week. One thing he talked about that I think is super relevant to what you just said, if we’re trying to create a culture of academic integrity, which is something he talked a lot about, and I totally respect that...

Jessica Garner: ...he says we need to help people understand what types of help they’re allowed to get in general. Are you allowed to talk to an adult and get feedback on an essay? Are you allowed to talk to a peer and get feedback on an essay when you’re at home? Are you allowed to Google something and get feedback?

Jessica Garner: Right? An AI tool is just one more way to get feedback. But if you haven’t outlined upfront the types of feedback that are actually allowed, just in general... because I think there’s a difference between exactly what you said, using an AI tool to just do the work for you, versus choosing how you get feedback, and from whom or what.

Joseph South: Yeah, and I really love this one. You may have heard about it, and maybe even talked about it on the podcast in the past, but I think the school district in Laguna Beach, California, [00:55:00] not Long Beach, though I’m sure Long Beach has awesome stuff too, has a little app on their internal version of Chrome or whatever.

Joseph South: What it does is help the teacher outline their expectations for where the student is or isn’t allowed to use AI. Then, when the student submits the paper, it helps the student document how they did and didn’t use AI.

Joseph South: I just think that’s so smart, right? You’re creating clarity and transparency on both sides, and under those circumstances we can get somewhere. And maybe over time the teacher will feel like, no, I want to change my expectations. That’s fine; they can, and they just need to update [00:56:00] the little app.

Joseph South: I think we’re afraid, right? We feel like we need to protect. And when you’re in a protective mindset, you push away risk, which pushes away innovation. So if we can find mechanisms that help people take baby steps into the riskier world, we can actually overcome some of this.

Jessica Garner: One quick thing about that, Joseph. One of the things I love about that particular app is that, in order to create it, they had to have really tough conversations about the ways they were going to talk about AI usage in the app. It’s a transparency tool, but the students and the teachers pick which ways they’re using AI in relation to specific instructional practices. So they had to have conversations about instructional practice: what they were going to agree on, what they were going to say is [00:57:00] okay in these situations.

Jessica Garner: And if it’s not okay, it doesn’t go in the list on the app. So I love that: yes, it’s a tool for transparency, but with a whole bunch of thought that went into the back end of creating it before they deployed it.

Joseph South: which is probably the most important part of the whole thing, right?

Jessica Garner: Yep, exactly.

Rebecca Bultsma: And if we take it one step back, even to first principles, we have to look at how important it is that the teachers actually have some understanding of these tools and how they work. You can’t dismiss something like NotebookLM, or using AI in this way, as not useful if you’ve never actually seen how useful it can be or experienced its limitations.

Rebecca Bultsma: And so that probably just goes all the way back to us: institutions prioritizing PD, but also us taking sort of individual responsibility for our own exploration as well.

Brett Roer: Yeah. One thing I noticed when I’ve been pushing districts to think about this: when we [00:58:00] create tools that let you generate lessons aligned to all of your instructional frameworks and policies, what’s missing is what you just mentioned.

Brett Roer: Thoughtful conversation around when, how, and why you should use AI ethically in the classroom. So there are two ways I’ve been pushing people to dialogue on this. One, and we’re gonna do this on Thursday at a school here in New York State, is to interview students in middle school and high school and ask them to take you on a ride on the AI roller coaster.

Brett Roer: Just ask one kid: period one, what does your teacher say about using AI? Are you allowed to use AI? If so, how? Why? Just, what is your teacher saying? Or do they say it’s cheating, or whatever they’re saying. Then ask them to go through their whole day, and do that with three or four kids.

Brett Roer: And don’t ask ’em who the teachers are. That’s one where you’ll be like, wow, these kids are truly on a roller coaster. Whereas for everything else in schools, we have all the rules and policies in place. I always say, everyone knows what the cell phone policy is in all these states with new laws, and yet this doesn’t exist.

Brett Roer: It’s just a roller coaster ride. The second one, when I work with instructional coaches, is [00:59:00] to ask your teachers, the same way you encourage or require them to have an objective or a learning standard or whatever else is part of it, to just state: what’s the AI policy?

Brett Roer: What is their acceptable AI use for each assignment? And if you make that standard, it doesn’t matter if it’s different from class to class until you come up with shared language. Just the fact that they have to think about it. Then you can at least push their thinking in coaching sessions, like, well, I see you banned it outright.

Brett Roer: What if we taught them how to do this? And they’re like, oh yeah, that doesn’t change what I wanted them to learn, and it might make it more exciting. That’s how you can gradually get them there, even if they start at a full no. Or maybe they say full yes, and it’s like, hey, that’s not good either.

Brett Roer: These kids can just have AI do that assignment. Then maybe we need to add some nuance here, like use it here but not here. So those are two ways I’ve been really pushing people to think about how difficult it is for students with no guidance and no policy. Start wherever you can, but even the assignments themselves can help.

Rebecca Bultsma: I really like how my institution, the University of Edinburgh, does it, because on every assignment you have to clearly [01:00:00] articulate exactly how you used AI and why. Just as a starting place, thinking through how you’re using it intentionally, and explaining to somebody else why you used it there and in what ways, immediately makes us more cognizant of and accountable for our use of AI.

Rebecca Bultsma: So it’s a good place to start.

Brett Roer: You two are off the hot seat. You did a fantastic job. Might be the best two co-hosts we’ve had yet. So well done. And now we really like to turn it over to you for our final question. We’ve talked a lot about, you know, talking good behind people’s backs, so this is a great opportunity for you to do the following.

Brett Roer: And it’s one that’s great when we have two guests, because one of you can go, and then if someone remembers something else, they can go after that person. So there’s no time limit on this, but this is a great opportunity for our audience. You work with so many innovative districts, leaders, educators. Shout them out.

Brett Roer: And talk good about what work they’re doing. Again, institutions, districts, education leaders, whomever. [01:01:00] And the part that we always try to get to is like, if we’re building this dream, looking to the future of AI and education and innovation, like who needs to come along for the ride with us? So, open question about who do people need to know about that are doing great work and why out there in the space?

Brett Roer: And take as long as you like.

Joseph South: Oh my goodness. There are so many people doing good work. This is one of those things where I’m like, I’m gonna leave somebody out.

Brett Roer: and that’s okay. That’s okay. I promise you listeners, that’s, that’s okay. And if you’re not in it, you know, you get it. This is, this is on the spot.

Joseph South: So first of all, people often ask me, you know, who I’m reading on AI. I get a lot of my information about AI from LinkedIn. There’s some really great content on LinkedIn, so if you don’t think of LinkedIn as a content place, it’s really become one. Ethan Mollick, who is a professor at the Wharton School, is one of the [01:02:00] most intriguing voices on AI in education.

Joseph South: Really thoughtful, also funny, which doesn’t hurt when you, when like, Rebecca, you’ve been reading about AI all day, every day. A little bit of humor can, can help.

Rebecca Bultsma: I knew you were gonna say Mollick when you said the orchestra conductor thing, because that’s a Mollick thing. I loved it.

Joseph South: Yeah. And then also John Bailey. He’s a really interesting person.

Joseph South: He’s, uh, been a policy person, um, on the sort of Republican side of the house in the past, um, and really thinks about AI both from, uh, a learning point of view and an economic point of view. Um, I think he brings a really interesting, um, perspective as well. I also, um, one of the things, another thing, another, uh, source, I read the whiteboard notes.

Joseph South: So Whiteboard puts out a newsletter and they do a pretty good job of pulling up some of the more interesting, um, ai, uh, articles and things that are, that are [01:03:00] going on. Um, and there’s another person that I’m gonna think about why I pass this over to Jessica.

Jessica Garner: Okay. So I’ve already mentioned some of the people I think are doing amazing work in this area, and those are our coalition partners that I mentioned earlier today.

Jessica Garner: I mean, they are thinking about this. And one of the things that I love: Indigitize specifically, the Center for Black Educator Development, and Latinos for Education are really engaging their communities. They’re thinking about this in ways that really impact their communities.

Jessica Garner: I had the opportunity to co-present with Indigitize and some of the fellows from their community of practice. Just the thoughtfulness with which the indigenous community is approaching AI is inspiring to me. So I appreciate folks leaning into their own culture and the impact that their culture has on how they think about this.

Jessica Garner: Um, and, you know, creating [01:04:00] space for, for people to say, I don’t really want my data to be owned by these big companies and so I’m not gonna engage in this right now until we have our own data centers that are on our own lands. You know, this is some of what I’m hearing from, from indigenous folks and it’s fascinating to me.

Jessica Garner: And so I just appreciate the, the care that they come at these conversations with and the openness to listen. So, so, you know, that’s, that’s a group, um, our coalition partners that I, I wanna shout out, you know, I also wanna shout out the folks at the state level that I think are doing this really well. So, Matt Winters in Utah is doing some awesome things.

Jessica Garner: Vera Cubero, North Carolina is doing some amazing things. Um, AJ Cote in Massachusetts is doing awesome things. Um, and so we’ve had the pleasure, you know, of working with some of these folks at the state level and, and you know, just seeing, again, the level of intentionality of how they’re engaging stakeholders.

Jessica Garner: And when you work at the state level, you know, I did for three years, it is a [01:05:00] challenging place to be because you, you know, unfortunately you’re at the place where. You sit between policy and education. And so the, the politics of educ, of, of the, the state get in the way sometimes of education. And so I just have so much respect for our state leaders who are struggling through this work, but thinking, um, thoughtfully and really, I mean, being leaders in how they equip the educators in their state to, to work with and think about ai.

Jessica Garner: There’s also this guy, um, I think his name is Dr. Joseph South, and he has a really cool, um, newsletter that he does on LinkedIn. Is it called Spark? Spark Learning? Spark Learning Spark. I knew I was gonna get it wrong, but, um, also really cool ways just to think differently about learning. And so I appreciate the work that he’s doing in this field too.

Joseph South: You referred to Michelle Culver earlier. [01:06:00] I think the work she’s doing around how AI impacts human relationships and social relationships is really important. Another person that I read, and I realize I’m doing a theme here, these are sort of more out-of-the-box people,

Joseph South: ’cause I think we all know the usual suspects in the field already. Um, but Allison Dulin, Salisbury, um, which may not have come up in your feed, um, she, for a while she was doing the one AI article you should read this week. And I found like she was choosing really great stuff to read. And again, I just found her on LinkedIn.

Joseph South: That’s Alison Dulin Salsbury. Um, and if you read the one thing per week, um, you’ll get a really great, um, overview as well.

Brett Roer: Well, those are some really good finds, and thank you for bringing up both people we’ve all heard of and those hidden gems. That’s really the goal of that question. You both aced it.

Brett Roer: Well done. No one come for them in the comments. Jessica, you did a great [01:07:00] job of shouting out, uh, you know, a great place they can learn about Joseph’s thoughts on LinkedIn and his newsletter. Could you both just kindly end by just sharing with our audience once more ways they can either engage with you all in person online, how to get involved and find all the great resources, um, that ISTE ASCD offers?

Jessica Garner: Yeah, I’ll start with generationai.org, which is a great place to go to learn about all of our grant-funded activities. And then we have some really great membership resources as well, so iscd.org/ai is a great place to go and find some other things. Joseph, are there any other places you’d send people?

Joseph South: I mean, those are great starting points. We’ve also published a number of books from different perspectives: from the perspective of ed tech coaches, from the perspective of instructional leaders, from all different points of view, that I think can be really, really [01:08:00] useful.

Joseph South: Um, and then, you know, at our conference we, um, I think last year I think we had over 200 sessions on AI and teaching and learning. Um, and I mean, it’s just, and by the way, those are 99% of those are from other educators talking about their actual experience. Um, I think that that’s a concentration of, of, of knowledge and experience that just doesn’t exist elsewhere.

Joseph South: Um, so there’s lots of ways to find your way into our community and we hope that you’ll just pick one and, uh, and see where it leads you.

Brett Roer: Well, again, Joseph, Jessica, thank you so much for the amazing work you lead at ISTE + ASCD, and thank you so much for spending time with us today on the AmpED to 11 podcast.

Brett Roer: Thank you both so much. Everyone, thank you for listening. Have a wonderful day.