Thomas Hummel: [00:00:00] Software meeting the hardware of robotics and these AI models, and our whole world can flip upside down, and our education can flip upside down. And what we need as a society could be flipped upside down here too.
Thomas Thompson: Content moderation, testing, evaluation, verification. If I was to make big bets on what’s important going forward, it’s definitely evaluation and verification.
Thomas Thompson: Teachers need to be able to see under the hood to ensure that the content they’re creating is fit for purpose for the classroom. And understand where that might not be the case.
Thomas Hummel: Just because we can doesn’t mean we should. And all these tools, and all the AI use that’s being done in schools, is being used now without data; there’s been zero data to support anything.
Brett Roer: Welcome everyone to the AmpED to 11 podcast. We are honored. It is double trouble today with Thomas Thompson IV, the CEO of Eduaid, as well as Thomas Hummel, the chief product officer and current middle school science teacher. It is an honor and pleasure to have you [00:01:00] both here joining us today. How are you both doing today?
Thomas Thompson: Doing well. Thank you for the invite. It’s great to be here.
Thomas Hummel: Thanks, Brett. Doing awesome.
Brett Roer: My partner to the north, Rebecca Bultsma. How are you doing today, Rebecca?
Rebecca Bultsma: I’m good. I just uh, wanna hear all about Thomas’s new baby actually, please. Yes.
Brett Roer: Yeah. Start us off with joy. Tell us everything.
Thomas Hummel: Yeah, so technically I’m on the longest break I’ve ever had from teaching right now on paternity leave.
Thomas Hummel: So my wife and I just had a baby at the beginning of October, October 7th. And so I’ve been off now for about a month, just being a dad to a little boy, healthy and happy, and just getting absolutely no sleep. He’s actually nocturnal, so my days are all flipped around and it’s just a total disaster on that front, but a beautiful disaster nonetheless, you know.
Thomas Hummel: Thank you for asking. I mean, I appreciate that.
Brett Roer: Parenthood: a beautiful disaster.
Rebecca Bultsma: Sums it up.
Thomas Hummel: Yeah, like I don’t have my hair done. I didn’t have time to do my hair or anything, so I got the skull cap on [00:02:00] here. Probably haven’t shaved in a few days. But, you know, just running Eduaid and trying to manage my own life in certain ways.
Brett Roer: Well, first, really, thank you from us and our listeners. Um, just to give some context here, and then we’re gonna turn it over to our amazing guests. We didn’t know that, uh, Thomas Hummel was on paternity leave, so all the more exciting. But I did have the chance to meet, uh, Thomas a few months back, and the story of him and Thomas Thompson IV was just so incredible that I said we have to have them as guests on AmpED to 11 ’cause they’re currently building AI solutions and iterating and refining them as current practitioners in the field.
Brett Roer: And you really can’t ask for a much more authentic experience than that in the age of AI. So, without further ado, I’m gonna open it up to either Thomas; you all can take turns. For the audience, maybe you wanna introduce yourself each time. But could you maybe tell us: you’re talking across the hall, you’re in real classrooms, and all of a sudden Eduaid AI is formed.
Brett Roer: Can you [00:03:00] kind of just share for everyone your journey, your why, and how that conversation grew into this amazing organization you co-founded and built together?
Thomas Thompson: Yeah. So Thomas Hummel is the science teacher of the two of us. So, you know, he’s plugged into a lot of online circles and whatnot. And, um, it was around the time that GPT was about to be launched publicly.
Thomas Thompson: Um, he starts playing around with the tool. He says, uh, you know, I asked it to create a lesson plan. It’s pretty good. Could be better, but this is pretty good. It saved me some time. It was really fast. Like, this could be an interesting thing. So then we said, okay, it’s not the ideal user interface for a teacher.
Thomas Thompson: It’s not the ideal way for a teacher to interact with a, with an AI tool. So we said perhaps we could do something. That speaks directly to the teacher that couches the AI within the classroom that meets the teacher where they’re at, where their skill level is at, and um, you know, kind of speaks their language.
Thomas Thompson: And from there, uh, started building Eduaid, the two of us. And, uh, our other [00:04:00] co-founder Tyler, he’s the engineer of the group.
Brett Roer: Oftentimes we don’t have the luxury of having two guests. But would Thomas Hummel, is there anything else you’d want to add? Uh, or did he get that story pretty well?
Thomas Hummel: Yeah, I mean, it really just started off from the premise that we were working in these Title I schools on the Eastern Shore of Maryland.
Thomas Hummel: And the job of teaching: when you’re young, you have this ambition about teaching, and how this is gonna be the greatest job ever, and I’m gonna change the world. But then you start to face the actual system of teaching, and while you are making a big difference in the lives of students, there is the system that comes into play that is just insurmountable.
Thomas Hummel: And so we were seeing teacher turnover all the time. We were seeing teachers get hired without teacher certificates that we had to work with. And it was really hard to kind of pick up that slack as the professional teacher in the building and still be positive and still have [00:05:00] energy and still have a life outside of the classroom.
Thomas Hummel: And you know, we set out to sort of solve these bigger issues in education, you know, bigger than just our classroom whenever we tried to develop this product. And so, like Thomas said, we launched it with just us three. There was like nothing else in the space. And what we found was that what we did really resonated with teachers and it just caught on like wildfire.
Thomas Hummel: And our growth really just went off the charts. And it was just affirmation that what we were doing needed to be refined and needed all of the attention that we could possibly give, because the power of AI, used the right way in education, we see as really powerful. But we also see that there’s a negative side that can come about from that too.
Thomas Hummel: And so, you know, we’re just trying to do the best that we can by the teachers that we are trying to help on the mission that we set out to do.
Rebecca Bultsma: [00:06:00] So for people who are unfamiliar with Eduaid, tell us what you do. What is, what does your company do? What do you build? How do you support teachers?
Thomas Thompson: In short, it’s a workspace for teachers
Thomas Thompson: to do all the work of instructional planning: from the creation of materials, assets, and resources that you’d use in the classroom, to ideating, thinking about what you might do in the classroom, exploring different methods and techniques of instruction. Just to give you a concrete example, um, you know, retrieval practice is a very effective tool in the teacher’s toolbox.
Thomas Thompson: However, um, you know, generating and coming up with a number of different retrieval questions can be time consuming, especially if you’re trying to create some variety for different student needs and all of these types of things. Generative AI is terrific at creating just an arbitrary number of retrieval practice questions.
Thomas Thompson: If you can, you know, get the prompting right for the retrieval piece. So a teacher could say, create a short answer worksheet with minimally differentiated question types, [00:07:00] all in that, um, retrieval realm. Or you could say, create a worked example or an elaborated analogy, or a direct instruction script, or a jigsaw activity, or a review game or graphic organizers.
Thomas Thompson: And I could just keep going, because there’s about 125 different educational assets and resources in Eduaid that the teacher could create. It loads into a text editor and you have full control over it. You can, you know, add different types of questions. Then you could differentiate the materials. We have a bunch of one-click tools to do differentiation.
Thomas Thompson: So you could think, maybe, uploading a material you already have created and then being able to translate it, chunk the text into smaller sections, insert check-for-understanding questions. Right? So it’s really a set of limitless possibilities for a teacher to play in this kind of sandbox that we’ve provided them.
Thomas Thompson: We don’t really tell them what to do or how to do it, we just provide a bunch of pieces. The teachers can make what they want, take what they need, and discard the rest if it’s not something that they could use in the classroom or need that day. [00:08:00]
Rebecca Bultsma: Now, part of what I do and what I talk about in my regular life has to do with, you know, responsible AI and AI ethics.
Rebecca Bultsma: How have you made sure that the tools that you’re building are ethical, are safe, are, are responsible to be in schools and for teachers to use?
Thomas Thompson: Rigorous testing and research. So to that end, we’ve partnered with the Chan Zuckerberg Initiative, CZI, and we’re beta testing their evaluator tools and co-designing those with that team; they’re a terrific team over there.
Thomas Thompson: And, um, this gives us access to say if you create something that you state is for a second grader, is this actually at a second grade reading level in terms of sentence structure, vocabulary choice? Is it appropriate for the grade? So am I going beyond, say, content that would be appropriate for that age level?
Thomas Thompson: Um, if I want to scaffold up to that, it’ll provide options for scaffolding. So we’re actually gonna make some of these evaluator tools publicly available on our site so teachers can get transparency into the outputs themselves. But we do a lot of [00:09:00] testing internally for that. Then of course there are the endpoints for content moderation.
Thomas Thompson: That way we’re not, say, bringing student personally identifiable information into the system or providing responses for, um, unethical queries. We moderate on nine different topic areas, and I’d be hard pressed to recall them all off the top of my head, but things around, you know, uh, bias, uh, violence, uh, sexual topics, things like this.
Thomas Thompson: Um, we moderate for all of those as well. So content moderation, testing, evaluation, verification. If I was to make big bets on what’s important going forward, it’s definitely evaluation and verification. Teachers need to be able to see under the hood to ensure that the content they’re creating is fit for purpose for the classroom, and understand where that might not be the case.
Thomas Thompson: So again, in a new update for Eduaid that’s coming out very soon, you’ll have those evaluation tools publicly available, but we use them internally right now.
Thomas Hummel: Yeah, Rebecca, I would say, like, zooming [00:10:00] out too, from a company standpoint, what we do to ensure, you know, ethical practices and ensure that AI is safe is that we’ve actually tried to make those decisions along the way as we were building this.
Thomas Hummel: Like, we could have done a student-facing side of this. We chose not to because of those concerns. We could have, you know, taken a different route. But what we do is we just keep getting feedback from teachers, and whether or not they, the professional educators, think that it is appropriate is, like, one of the best ways for us to make sure what we’re doing stays within that line.
Thomas Hummel: It’s like nobody can really vet this material better than a professional teacher. And we’ve tried to stay within that framework of what we were building too. Like, we don’t take any student data, we don’t allow students to use this thing. And so, despite that probably hurting our growth numbers [00:11:00] or our metrics or time on our platform or any kind of metrics that any other business has to return to somebody, we don’t have to do that.
Thomas Hummel: And so that is how we are trying to do what’s best by our user base.
Rebecca Bultsma: Yeah, no, I love that. That’s part of the things I ask about and that I look for in particular. So I’m a fan so far.
Brett Roer: for our listeners out there, but also bigger picture, so Right, we’re talking about Eduaid, but also just in general, when you partner with either individual teachers or schools and or districts, first we’d love to hear how do you partner with these different uses users, and right now with AI evolving so quickly.
Brett Roer: What are the most important guardrails that you think like schools need to have in place for tools to be successful and ultimately to like, you know, either make education, uh, educators more empowered students, more engaged, increase student outcomes, et cetera. What are some of the like core tenants that you wanna make sure [00:12:00] you see in policies moving forward in guardrails?
Thomas Thompson: Mm-hmm. That’s a great question. So I would say probably some of the most important pillars are, one, that you have an AI policy at all. That is the first hurdle. That sounds obvious, but there are quite a few schools that have been slow to adopt some form of AI policy.
Thomas Hummel: My school doesn’t have an AI policy.
Thomas Thompson: Yeah.
Thomas Thompson: Okay. So there’s one example right there. Um, hey, you need to adopt some policy. Uh, there’s great resources out there. Um, Lance Eaton, uh, he has a terrific Substack where he collates, I think it’s like 125 different, um, AI syllabus policies all in one database, so teachers can see a sampling of different types.
Thomas Thompson: So definitely do the research and explore, because what I call out as maybe the most important may not be so important to other schools, other districts. But there are some baseline things we should all get right. One is what is happening to the teacher’s and students’ data that is being entered into the systems and, um, being returned to them, right?
Thomas Thompson: That’s, that’s the first one. What’s the data [00:13:00] privacy agreement between you and the company? Because a company like Eduaid or, um, MagicSchool, or Diffit, or Brisk, or SchoolAI, or any number of these AI tools for teachers doesn’t have an internal AI model that the company itself owns. They’re making API calls to foundation models.
Thomas Thompson: How are those relationships governed? Uh, what are the data protection, um, practices and, and agreements between these, these companies and these, uh, AI providers and vendors? So there’s a, there’s a vast interplay where data is floating from many different organizations between other organizations. So how are those relationships governed?
Thomas Thompson: Which relationships are there, which subprocessors are there? You need insight and transparency into what the company is using your data for. For some AI response, you may not even know which AI model is being used. So that’s another question: you need to be, um, explicit about which models are being used for which purposes, and, um, whether or not there are content moderation endpoints and guardrails in place.
Thomas Thompson: That’s probably the first pillar. And then the other is [00:14:00] acceptable use policies for teachers and for students. Um, you won’t know if you’re breaking the rules unless they are very clear to you. So when is it okay for a student to use an AI model? Is it ever okay for a student to use an AI model? For what purposes, et cetera?
Thomas Thompson: And same for an educator. How can they use this in their work? Which use cases are okay? Here’s an example. Is it okay for a teacher to use an AI to generate practice problems for review for an upcoming assessment? Most people would probably say, yeah, as long as they’re vetting the questions, ensuring that they’re factually accurate.
Thomas Thompson: Is it okay for a teacher to use an AI to respond to an email from a parent? Now we might have some divided opinions. If the parent is talking to the teacher, perhaps they want a response directly from that teacher, not from an AI model. Are you sharing the parent’s name and email with the AI when you’re copying over the message?
Thomas Thompson: You probably shouldn’t do that. Um, you definitely shouldn’t do that, I should reframe. Um, okay. Third use case. Should an AI be used to grade your student’s assignment and, [00:15:00] uh, then produce the grade that will determine maybe promotion through the class or to another grade? Grades hold a lot of power over student promotion in school.
Thomas Thompson: So, final use case: should an AI be used to write an IEP, where you have a significant amount of personally identifiable information, medical records, et cetera? So there are numerous types of use cases in a teacher’s job and in different types of, um, teaching positions. So being a special educator versus the general educator, in a classroom teaching high school versus teaching elementary school.
Thomas Thompson: Like, there’s so many different populations of teachers that might interact with generative AIs in different ways. So you need a clear acceptable use policy that addresses these types of use cases for the educator and, again, for the students. So data privacy and acceptable use are probably the two first pillars.
Thomas Thompson: And I’ve droned on long enough.
Rebecca Bultsma: You’re saying all the things that I like to hear personally. So what is the interplay, then, between your technology and what you do [00:16:00] and, let’s say, a school’s AI policy? Where do those intersect?
Thomas Thompson: Now, that might very well intersect, one, in the procurement process itself.
Thomas Thompson: When they’re looking at AI tools and investigating, this is our AI policy. Does your tool fall within these parameters? We’ve had a few schools that have reached out with inquiries about a, a school level partnership, and they would lay out clearly like, we’re looking for a tool that does this, this, and this, and here’s our acceptable use policy.
Thomas Thompson: Does your tool, like, adhere to these following things? And of course you need to read it, and if you say you do, you damn well better actually do it. And then, um, the second half is, sometimes it’s reflected in their DPA, um, their data privacy agreement. Um, a lot of schools, however, are using DPAs that were clearly made before, uh, generative AI technologies were released.
Thomas Thompson: So they don’t really have carve-outs about specific areas. Here’s an example. So a DPA policy around teacher-identifiable information, right? So then, [00:17:00] say, the creation of a multiple choice test that will be used in the classroom: does this have any kind of that information? Is it addressed here?
Thomas Thompson: How does that fall under your, um, policy? Because it technically is classwork, but it’s not, uh, say, student-performed classwork; it doesn’t have their name on it or something like this. So there’s a lot of areas where some more, um, clarity is needed on how this DPA will then interact with an AI provider. It’s not so clear.
Thomas Thompson: In some cases, my example wasn’t great, but you get the idea.
Brett Roer: We interviewed a number of students recently, and the amount of anxiety they’re facing is real, and you’re seeing this firsthand, where they’re like, so do I use AI here? Like, it is helping me learn, I’m not cheating, but am I cheating?
Brett Roer: And, like, all of those questions that are going on. So, like, one, how do you guide students? So, guidance. And then, two, how do you protect them from the things they shouldn’t be engaging with on AI, for a number of reasons? So, literally that: what’s the ideal guidance and [00:18:00] guardrails that you’re seeing in the classroom, and then obviously as a founder of an education organization?
Thomas Hummel: Yeah, so my favorite thing that I’ve seen is this continuum out of North Carolina where each individual assignment will have a score that a teacher should look over, just like they would a grading rubric, like how they would grade certain aspects of it. But you could start to assign to different assignments the amount of AI use that is acceptable.
Thomas Hummel: So, like, sort of an acceptable use policy on certain things. As far as where we should be going, I think that’s sort of it: there’s sort of a world where you have to bridge the gap between the old school of education and the new school. And I think just being transparent on each individual assignment and taking it assignment by assignment is probably the best way to go about it.
Thomas Hummel: And the most fair way to go about it for students. Like, if I tell a student, you know, this is gonna be graded, I tell them that before I give them the assignment. That’s just fair, right? And so I think the same thing should probably be true [00:19:00] with AI use. Like, oh, okay, well, you can use a certain amount of this, or, no, you can’t use any of it on this assignment.
Thomas Hummel: And I think you should have clear guidance in that regard. For me personally, as a middle school teacher, my students are too young to be interacting with the foundational models under my supervision in my classroom. So I don’t really see an opportunity for me to be doing that with
Thomas Hummel: 12-year-olds and 13-year-olds. Like, I don’t wanna play in loco parentis and say, hey, get on ChatGPT, here’s an account, you can use a certain amount of AI. That’s just not how I’m operating in my own personal classroom. So I hope that answered your question.
Rebecca Bultsma: It’s a great answer. I spent some time this morning reading, uh, a new report that Google put out about AI and learning and education in the last few days.
Rebecca Bultsma: And I just wanna zoom out for a bit and just [00:20:00] chat a little bit about what your predictions are, how you see AI, you know, the big loose term, in general changing the way kids learn and teachers teach. What do you think the future looks like in five years, 10 years, 20 years in the classroom?
Thomas Thompson: I’m usually quite hesitant to make any predictions because they have a habit of making me look foolish as time goes on.
Thomas Thompson: But, um, the nice thing about this particular question is that while the technologies will evolve very quickly, the way the human brain operates has taken many, many years of evolution over time to get to where we’re at. So I think the most helpful place to start is stripping away all of the technology and asking a basic question, you know: how do humans actually learn?
Thomas Thompson: If you start there, there are some practices, techniques, methods that teachers can employ to create conditions in which learning is more likely to take place. There’s never like a surefire thing, education’s a game of [00:21:00] probabilities. You’re trying to create the most optimal conditions in which learning may most likely take place for particular students.
Thomas Thompson: So there’s this interplay between retrieval, spacing, and interleaving of practice. Um, really what it comes down to is deliberate practice over time with different problem types interleaved throughout. That’s ultimately what I think the killer application for AI in education would be: how to optimally
Thomas Thompson: go about this dance between the three. Uh, what types of retrieval questions to ask, when to ask them over time. Like, um, many teachers might do reviews for a few weeks or cram a bunch of review right before a test, which is not really an effective practice. Whereas if you are consistently referring to prior knowledge, and making the prior knowledge that’s requisite for the to-be-learned material very clear, and you’re relating between the two, and you’re drawing analogies to experience, all of these things:
Thomas Thompson: generative AI might have some role in creating those analogies, those examples, the non-examples, the retrieval questions, the spacing calendar, when to [00:22:00] interleave materials. But it really just goes back to: those were effective practices long before there was generative AI. So can it in some way amplify a teacher’s ability to implement those practices?
Thomas Thompson: Now, the unique problem that it brings about, that we’re going to have to solve for, is cognitive offloading in novice learners. That’s, that’s the biggest problem. Um, cognitive offloading not only for novice learners in terms of the students, but novice teachers as well, who are maybe offloading educational practice to the AI and not forming, like, a firm competency with how teaching actually works and when and why they’re using certain methods in the classroom.
Thomas Thompson: Same for students, right? If your AI tutor just gives you the answer after two “I don’t knows,” then it’s probably not a great AI tutor. Um, again, it’s forming shallow fluency, hollow understanding. Students aren’t really mastering the content in any meaningful way, forming memory traces, any of this stuff that actually contributes to learning.
Thomas Thompson: So strip away the technology. How does learning work? Once you answer that [00:23:00] question, look at how the technology is then interacting with those processes. If it’s amplifying processes that are most likely to create the conditions for solid, durable learning, then all power to it. And if it’s undermining those conditions, then we should be concerned and figure out a way to mitigate that.
Thomas Thompson: So cognitive offloading and effective learning-sciences-based practice, some balance of those two, is going to be the dance we need to do over the next couple of years. Relationships clearly relate to engagement and motivation with the student, and the teacher is central to this. And engagement and motivation are like prerequisites.
Thomas Thompson: If a student is totally disengaged because there’s no buy-in, because they have a poor relationship with the teacher, that can actively undermine learning; that’s, um, a negative externality, a negative direction. What role does AI play in those relationships? My bet would be that AI can free up the teacher’s time from all of that paperwork and the piece of planning all of that quality instruction, such that they might be able to better focus on the needs of the students directly in front of them.
Thomas Thompson: Instead of, [00:24:00] you know, let’s say a student is falling behind, starting to get disheartened. Maybe there’s behavioral outbursts because of this, right? The student feels overwhelmed. They enter class, they feel frustrated. Maybe this goes on for a while. The teacher didn’t catch it. The student associates frustration with this class and they form a negative opinion of the teacher, and all of these things start to occur.
Thomas Thompson: Building out materials that would reach that student where they’re at is a tax on the teacher’s time without a generative AI tool. With generative AI, they might be able to say, okay, here’s the objective, here is the material, the student struggled at this point, I need to create problems and scaffolding to help the student reach this.
Thomas Thompson: The teacher comes to class prepared with those materials and can then address that student where they’re at, work through those frustrations. Because, again, the teacher doesn’t come in having gone, oh my gosh, I was here until 9:00 PM last night creating all of these materials, I didn’t get home until late, this triggered an argument with my, uh, significant other, or something like this. And the teacher comes in with personal problems themselves and they’re not their best self.
Thomas Thompson: This triggered an argument with my, uh, significant other or something like this. And the teacher comes in with personal problems themselves and they’re not their best self. So some combination of those. Um, [00:25:00] we recently did a, uh, pilot program with Teach for America and that was actually one of the pieces of feedback that we got from some of those, uh, educators, which is I was able to enter the classroom feeling more energized because I felt more prepared because I had, you know, multiple pieces of content differentiated for different student populations and needs.
Thomas Thompson: And I was able to go in and, you know, have something for that student where they’re at, and then be able to address the concerns and misconceptions that they were building. And I think ultimately that would go a long way towards that relationship piece you were talking about.
Thomas Hummel: Yeah. I think it’s funny how, when you ask anybody about their learning experience as a student, right,
Thomas Hummel: the relationship with, like, a teacher or a mentor or a leader, like, that’s the number one thing. It’s not some dumb test that you took. It’s not even, like, the content that you learned. It’s like, I was inspired by this person and they helped me fall in love with that. But that never gets taken into account whenever you have decision makers making [00:26:00] decisions in school.
Thomas Hummel: So it’s like the number one thing we value as individuals, but it’s not really taken into account all the time whenever you’re making decisions for schools, right? And it’s kind of interesting. And so you asked the question about where do we see AI going, and when I zoom out, it’s like:
Thomas Hummel: There is a world, like Thomas said, where AI allows teachers to have all this time and have better relationships with their students, and that is some of the feedback that we got. But there is also a world where, if these AI tutoring companies start to really have great data, I think it’s a hard decision to make as a parent to send your kid to a traditional school if money isn’t a factor.
Thomas Hummel: And I think that there’s a world where AI could be the driving force of what changes what school could look like. I just think that our public schools are really fragile. [00:27:00] I think that our workforce of teachers is really undervalued and underappreciated, more than ever before, and they’re uninspired because of that.
Thomas Hummel: And I don’t think that, you know, we have a lot of good people advocating for teachers and for the system that we have currently. So, I don’t know. It’s hard to say. Like, I listened to this podcast the other day with Brad Gerstner talking to Sam Altman, and he had said that anybody that predicts what’s gonna happen three years out is foolish.
Thomas Hummel: And that’s really hard to hear for me as an educator. Three years. Like, we could see the combination of software meeting the hardware of robotics and these AI models, and our whole world can flip upside down and our education can flip upside down. And what we need as a society could be flipped upside down here too.
Thomas Hummel: So it’s just crazy. It’s like we all value our relationships and we love our teachers and students and all [00:28:00] that stuff, but when decisions get made, that’s never really brought into consideration.
Rebecca Bultsma: No, I totally get what you bring up. And I think everyone’s worried and talking about how it’s gonna change education.
Rebecca Bultsma: And then, when I think about the parts of education that are broken, some of the things that you mentioned, I wonder how we can salvage the good parts but leverage the power of AI, and not in a way that’s damaging, so that kids are forming unhealthy relationships with machines versus teachers, and you keep teachers engaged, like you mentioned. Like, we’re seeing... I’m from Canada, but we see the same thing.
Rebecca Bultsma: Yeah.
Thomas Hummel: Yeah. And that’s sort of the thesis behind our product, and sort of back to what Brett was getting at with guardrails: as soon as you start to believe that teachers are no longer the best guardrail for students and no longer the best expert, things are going to change very, very quickly.
Thomas Hummel: You take that power away from the professional educator to make decisions for students, [00:29:00] right? You put that in the hands of AI, or you put that in the hands of other people, CEOs of different school systems, whatever those look like. I think you could see our education system change very quickly.
Brett Roer: Yeah. So I’m gonna try to tie this together, take what I’ve heard from each of you, and ask a question, and I want Thomas Hummel to answer this one as the current practitioner. Thomas Thompson did a great job of sharing what learning is, right? His own theoretical beliefs, grounded in expertise.
Brett Roer: And Rebecca’s really pushing on the ethics and the interplay between AI tools versus relationships. So Thomas Hummel, you’re talking about the low morale right now due to the lack of support teachers are feeling, and the underappreciation. Forget about Eduaid for a second.
Brett Roer: Think about the power of AI. You’re in the teachers’ lounge and you have a teacher, right? They don’t have policy, but they have the [00:30:00] passion. They might just not feel supported. If you were to saddle up next to ’em and be like, hey, let me show you something really cool. What is the thing right now that you really see people light up around? And then simultaneously, not necessarily putting AI in the hands of a student, what’s the thing that translates, when a teacher’s using AI effectively, that brings that joy of learning back to the student?
Thomas Hummel: Ultimately, the way I would frame this, and Thomas and I have had a lot of discussions about it, but the thing that’s most powerful about AI right now in education, to me as a teacher in the teachers’ lounge or however you wanna frame it, is that it lowers the cost of experimenting in the practice.
Thomas Hummel: And while it takes a lot of effort to redo your lessons from scratch, it doesn’t take a lot of effort to use AI to tweak them and make them better in certain [00:31:00] areas. That’s what I use the AI for the most: this didn’t work so great last year, so I’m gonna put a game in here in this aspect, or some questions in this aspect, or a small reading to add context in this aspect.
Thomas Hummel: It’s not even there to replace the content or the curriculum, but it’s there to enhance it. And when you’re a passionate educator and you wanna make your lessons better, like I know some teachers do, not all teachers, but if you truly do, AI’s the best companion for that, because it’s unlimited brainstorming, unlimited ideas, unlimited context that you want to add.
Thomas Hummel: And so it lowers the cost of experimenting. And in the practice of education, experimenting costs you a ton of time and a ton of effort, and a lot of teachers do the same lesson over and over again, even if it’s not the best, because it just takes so much time to really hammer out a high-quality lesson.[00:32:00]
Brett Roer: We might dive deeper into that, ’cause I’d love to hear the actual moves you’re making to empower people specifically around that. So we might put a pin in that and come back to it in a little bit. But we are gonna flip the script now. So, Thomas Thompson IV, Thomas Hummel.
Brett Roer: You are now the co-hosts of the AmpED to 11 podcast. Rebecca and I are on the hot seat. Please, each of you, feel free to come up with one question that you’d like to ask us about what’s happening in the world of AI in education.
Thomas Hummel: Let me ask you this, you’re all over the world, okay? Or you’re at least all over the country giving all these speeches and you’re talking to educators all over the place.
Thomas Hummel: Do you feel like we are heading in the right direction with our implementation of the tools of AI and the use cases that you’re seeing in [00:33:00] schools, or have we just bought into the latest hype cycle and in turn made wrong decisions? That’s how I feel: just because we can doesn’t mean we should.
Thomas Hummel: All the AI use that’s being done in schools is done now without data; there’s been zero data to support anything. And so whether or not we made the right decisions, decisions have been made. Do you feel like people are regretting those initial decisions, or do you think that we’re going to try to storm through with that?
Brett Roer: Rebecca, if it’s possible, do you wanna go first? Only because you are actually doing this on an international level, and this is your area of study, and also you’re just really interesting and I’d like to hear what you say first.
Rebecca Bultsma: It’s an interesting question. I think people move forward the best they can with the information they have at the time.
Rebecca Bultsma: I think, obviously, the jurisdictions who just started with an outright ban probably realized that this wouldn’t be a fad, that this would be [00:34:00] something we needed to do. I don’t think anyone has a choice but to plow forward. As you kind of alluded to, the biggest risk that I don’t think we’re talking about enough is this intersection of private business and EdTech tools, of companies who are making millions of dollars off of underfunded education.
Rebecca Bultsma: Plowing into schools, pulling those dollars out of education and then making even more money when they harvest and sell the data from students and teachers. And that’s something that really bothers me, and I think that’s something that we need to take a breath and look at and get regulated as soon as possible.
Rebecca Bultsma: I like what you’re doing, because you are working directly with teachers. You’re on the ground, you’re giving back to the community that you serve, and you’re a part of it. I don’t love how private industry now has the ability to fully impact and change our education system and take dollars away from it without necessarily giving anything back in return.
Rebecca Bultsma: [00:35:00] So I would say that’s probably the biggest gap for me: not enough people know about it, or talk about it, or realize what’s happening on the back end with all of these tools that magically came out of nowhere and aren’t necessarily grounded in best practice, or ethical practice, or even legal practice, or the fact that we don’t even have laws that protect systems and people from things like this happening.
Rebecca Bultsma: I think that’s a major gap. And combine that with the urgency that especially the school administrators I talk to feel, that they’re behind. Most of them care so much about preparing students for the future. They’re all making decisions the best they can, from a place of caring, based on limited information and based on whatever the tools told them.
Thomas Hummel: It’s unfortunate. I totally agree that it’s from a place of caring, but it’s also from a place of desperation. Schools are not making [00:36:00] decisions from a place of thriving; they’re making decisions from a place of surviving. And it’s like, I’m gonna throw this against the wall.
Thomas Hummel: I hope this sticks, and I hope I can keep my job, and I’m gonna do anything to do that. And we see all kinds of turnover, right? Superintendent turnover is at an all-time high. You could go down the list of all kinds of people making decisions who, when this AI fad came out, really just left their morality or their ethics behind. I guess I don’t wanna say it like that, but there it is.
Thomas Hummel: Just the rush to do has been very scary for me to see.
Rebecca Bultsma: Yeah, making those decisions from a place of fear instead of having the time to actually measure and understand. And a lot of the people who are making those decisions, the superintendents, they’re not trained procurement officers. They don’t know the questions to ask.
Rebecca Bultsma: They don’t understand this technology themselves. And so I think that’s the biggest gap [00:37:00] and risk that I’m seeing. It’s just the sheer speed and desperation.
Thomas Hummel: It’s like, what we are seeing is that people aren’t even seeking information. They allow the information to come to them, and if you do that, you are never gonna get a true story of the space and what’s available and what’s out there.
Thomas Hummel: Right, there’s all these meetings that superintendents go to that take $30,000 to go there and sit down and talk to them. In this whole field of ed tech, there’s such a skewed perspective, from the way I see it. The conversation’s not honest.
Rebecca Bultsma: Yeah. Well, it’s undue influence, right? From for-profit companies in a not-for-profit space, a social space. Exactly.
Thomas Thompson: The intermixing of for-profit and non-profit is a dangerous precedent to set, and something where we’re gonna see how it plays out. I mean, OpenAI started out as a capped for-profit with a non-profit board, and [00:38:00] now it’s for-profit and has a CEO who is not consistently candid.
Thomas Thompson: And what kind of precedent does that set, then, as well? A little shaky. The ideal is educational technology developed within the confines of the school itself, right? Among the teachers and among the staff and the students. There’s a school district in Maine doing something really interesting.
Thomas Thompson: You know, they bought the racks themselves. They’re hosting their own local AI model. It’s a collaborative project between the students, the staff, and the community to build it. It’s something that they will own instead of paying out a subscription to a rent seeking digital landlord. Um, and, you know, they’ll own the data.
Thomas Thompson: They’ll own the hardware, they’ll own the software. It’s all in-house. That’s the ideal version. Now, does every school have the bandwidth, time, staff, and budget to do all of this? Probably not. I think, as Hummel was getting at, working from a place of desperation mixed with the [00:39:00] rent-seeking incentives of, you’re missing out on this tool,
Thomas Thompson: you need to come on board and sign a licensing agreement with us. All of these impulses, the institutional forces, a lot of convergent factors that can definitely lead to negative outcomes for US public education, which has been burned by educational technology before: overpromised and underdelivered.
Thomas Thompson: I mean, how many times? MOOCs. Personalized learning. Hearing that AI is going to be the thing that delivers personalized learning, when we’ve been promised personalized learning since 1992, is pretty frustrating.
Rebecca Bultsma: Just reading that term actually bothers me. It’s a terrible term. It still implies there’s one linear way to get from here to there, and it’ll just meet you wherever you are on that one path. You might be behind or ahead, but you have to be on that one path.
Thomas Thompson: Or collecting massive amounts of learning analytics for adaptive instructional platforms, which are just [00:40:00] reactive,
Thomas Thompson: from some arbitrary starting point that they picked. I work in educational technology now, but I generally dislike most educational technology. And I don’t mean that as a selfish kind of business thing, like, oh, we’re the ones. No, Eduaid has clear room for improvement.
Thomas Thompson: But the general field being so far removed from the classroom, and from how learning and teaching actually works, is clearly disheartening.
Rebecca Bultsma: But it’s good to hear that you guys are talking about doing it differently and doing it better, right? That, to me, gives me hope, because I think this is the way it should be, and everything should be: people with pure motivations.
Thomas Hummel: Unfortunately, though, Rebecca, with the mission that we set out on, these are the things that are getting in our way, rather than the actual product or actually helping the teacher. There’s a whole system of LMSs and pay-to-play and [00:41:00] these conferences. It is really an ecosystem not designed to have the best rise to the top all the time.
Thomas Hummel: And it’s been very frustrating.
Brett Roer: Whew. Okay. That’s a tough one. You all get such great opinions here. I’m gonna be contrarian, not for the sake of contrarianism, but I’m gonna point out some of the things that I’ve noticed in the five years since I’ve been a school leader.
Brett Roer: So one, I started in education in 2005, and now it’s 2025. I’d venture to say at probably no other time in American public education has 20 years looked so different and yet stayed so much the same. Like, how much has authentically changed in what you’re doing day to day as a teacher, student, administrator, parent? And I go to all these things you’re talking about, and I get time to talk to people like you all.
Brett Roer: And right after this, I get to [00:42:00] reflect on what I’m listening to. So I know the place of privilege I’m saying all this from, because I’m not making a lesson plan tonight; I’m absorbing this. And I often think, when I’m at conferences or events, how are other people taking this information and actually applying it tomorrow to the things they’re responsible for, whether it’s in the classroom or in leadership?
Brett Roer: I don’t have to do that. So that’s the first thing: I think that is a luxury that allows me to have a certain perspective. I also have empathy, knowing I don’t think I would’ve gone to those conferences; I would’ve stayed in my bubble. So the first thing is, you do need those intersections, because I have seen people leave, let’s assume they’re leaving with a good tool, and I’ve gone to really rural places where people have said, we wouldn’t even have known about this if we hadn’t brought an AI summit to rural Indiana or Pennsylvania or Ohio.
Brett Roer: So there’s strong validity to why we do that. I mean, I met Rebecca when she was keynoting an AI summit [00:43:00] where people’s minds were blown, because she was showing them real use cases for personal and professional use. And if that was in different hands, that could have had a really negative or different outcome.
Brett Roer: And luckily there are people like Rebecca, and people like you leading this work. So that is important, that access to innovation. But to what you all have said, the part that I hope we start to get right is, one, let’s put the focus back on giving everyone concrete, practical implementation.
Brett Roer: Listen to the teachers who are telling you why students are struggling. Listen to the students, their families, the leaders. To me, that’s an exceptional use of AI: listening and making an action plan once you collect evidence and wisdom. The part I still struggle with, and I don’t have an answer to because there’s not enough research on it, is that when I talk to students and they tell me the ways they’re using it, I find it to be brilliant.
Brett Roer: They are saying it’s helping them, right? They’re [00:44:00] not saying they’re using it to cheat, they’re saying it’s helping them learn the way their brains work best, and there’s no way a teacher could differentiate the instruction to that level. There’s just not enough time. AI tools are helping with that, and I hope it continues to go that way from a pedagogical like, this is good instructional practice based on what we know.
Brett Roer: This is a better way to continue to differentiate based on what we know. That I think is evident, like you can do that if applied. Well, the first thing is how are you training them? Is the tool, the tool’s, the vehicle. So what are you training them on? Like how are you making sure it aligns to good pedagogy?
Brett Roer: How are you getting that metacognition? To me, that’s the biggest thing. I found, when I was an instructional leader, the hardest part to teach teachers was: students don’t need to hear you talk about facts anymore, at any point really since 2005. They need you to explain how you’re thinking.
Brett Roer: If you’re the quote-unquote subject expert and you’re leading students, that was always the hardest shift: if you’re gonna talk, it has to be you explaining how or [00:45:00] why, not a fact, and not for the sake of you speaking. I feel like AI is similar in that regard. If we can get people who understand how to use AI and leverage it to personalize and differentiate based on something that’s valid, then that’s the skill we need to be teaching people.
Brett Roer: And again, I don’t know when the right time to put it in kids’ hands is, or when it’s not. That’s probably step two. But if teachers understood there’s a rationale for when to use it, how to use it, and why to use it, I think that would help a lot with them starting to grapple with: okay, then, if I’m learning this way now, maybe it’s applicable for students in a certain scenario or use case. But those are all much bigger questions, and it’s happening in very small pockets.
Brett Roer: So when I do see it, I’m so excited. But to answer, I think the final point is, I don’t know how to scale that yet. But that’s the power of yet.
Thomas Thompson: I would say I agree with almost everything you said, but I’ll be contrarian once more back, since it’s a podcast, and a podcast is a place to be contrarian,
Brett Roer: double [00:46:00] contrarian.
Thomas Thompson: My one concern, out of everything you said, and again, broad agreement, is that in educational settings, learning experiences that minimize effort and increase the appearance of fluency, engagement, and enthusiasm in students lead those very students to inflate their own estimates of their learning.
Thomas Thompson: There are a couple of papers on this. Carpenter, 2020, on student misjudgment of their own learning, is a great place to start. I worry that AI will create the appearance of fluency and engagement and enthusiasm while they’re not putting in that much effort, and they feel like they’re getting it.
Thomas Thompson: They’re misjudging what they’re actually learning, and they’re not forming those durable memory traces and broad conceptual structures. The one piece of advice I’d give to any student is that the most important memory is still the one in your head. It’s the one that relates experiences and connects those experiences to other ones, and it’s much more relational than what you get just looking something up or asking ChatGPT for it.
Thomas Thompson: And I do worry that [00:47:00] any use of generative AI at the student level will in some way undermine that process of learning. I could be totally wrong, but I think a lot of the research so far points to that being a possibility we should probably test more thoroughly as we use AI.
Thomas Thompson: Hopefully we do it before we put it in front of students, but the nature of reality is such that we’ll probably have to do it while we’re putting AI in front of students. Not us at Eduaid, we don’t do that, but society in general, I mean.
Rebecca Bultsma: I’m just wondering if maybe skills of the future involve quickly synthesizing and adapting, and the illusion of learning in some contexts. Maybe that is the skill, right? Maybe how kids internalize and learn information in some contexts will need to be different in the future, and it will just be: how quickly can you skim, adapt, and synthesize what you’ve learned in this job to make it good enough?
Rebecca Bultsma: Maybe there’ll be [00:48:00] certain things that will really, really matter that matter now. There may also be skills that we are dead set on teaching now, that we feel strongly kids need and AI can never help them with, but it feels like the calculator conversation from however many years ago, you know? So that’s just another way to look at it, too.
Rebecca Bultsma: I think the required skillset for successful humans in the future may look different than we think it does now, and there’s no way to fully prepare for that. It’s just part of an interesting conversation, which is why we invited you here.
Brett Roer: And as a guest of the AmpED to 11 podcast right now, in this hypothetical scenario, the contrarianism is gonna continue.
Brett Roer: So I definitely agree; I’m a big believer that frictionless learning isn’t learning. So one is, you’re right: if you’re gonna teach students to go home and use AI, with the things I’m hearing them tell me, I don’t know if they’re actually learning. Their grades would validate that.
Brett Roer: Meaning, if they were asked to be assessed on a certain standard and skill, and they’re able to demonstrate that, [00:49:00] and they’re saying, this works how my brain works and allows me to take information that I know I’m gonna be graded or assessed on and manipulate it in a way that helps me learn it.
Brett Roer: So whether they like visualizations or podcasts or relating it to Gossip Girl, whatever it is, hearing students do that is very similar to how I learn. So I do internalize that as a yes: there were so many things I realize now I wish I’d had extra scaffolds and supports for as a learner that I didn’t have access to.
Brett Roer: So when kids take agency like that, that to me is a good use of AI, and they’re still held accountable: did you learn this thing that the teacher said you’re being assessed on? But to your point, and what Rebecca just said, I think one of the skills that we never had to grapple with is discernment.
Brett Roer: When to use that tool. When to use your own power, your brain, versus an AI tool. When to use them in combination, when to use it as a teammate. Knowing when is gonna be such a huge [00:50:00] skill moving forward. And if you don’t have some of those skills that you’re referring to, yeah, you’re definitely at a disadvantage if you don’t know how to access that and apply it.
Brett Roer: And that is a fear I have: that everyone’s gonna become overreliant on certain things. So I definitely agree with you on that. And that’s only one question. How long is this interview?
Rebecca Bultsma: I do wanna point out, too, that not once in this conversation have we talked about this big gap with parents, right?
Rebecca Bultsma: Brett was meeting with students this week, and the kids were telling him about how their parents kind of use AI like Google, and they don’t have that discernment or know when to use it and how to use it. So there’s another big gap of support for students at home that is kind of being missed in the conversation.
Rebecca Bultsma: That’s an interesting opportunity to bring them along for the ride so that they can model good learning and good behavior, so we all can, right? For students. That’s the goal.
Thomas Thompson: That’s actually the perfect segue to my question for both of you, which is that most [00:51:00] commentary treats AI in education as this single, kind of monolithic thing.
Thomas Thompson: But I’d like for us to spell out a kind of taxonomy. It won’t be complete, but we can start. What are the maybe three most distinct use cases that deserve entirely different strategies of thinking, or entirely different policies? A teacher use case might be different than students working with an AI tutor, which might be different than an AI that creates a learning analytics report and sends it back to parents, or something like that, just to throw out an idea.
Thomas Thompson: But what do you think the use cases are? What areas maybe get underrated or overrated? Where does this discussion not go, or where should it go? It’s a pretty open question.
Rebecca Bultsma: I have the interesting opportunity to sit as both a teacher and a learner, because I’m doing master’s degree work in artificial intelligence ethics specifically.
Rebecca Bultsma: So I get to use it and experiment with it as a learning tool, and I find it highly valuable, just as a learner and with [00:52:00] my kids in college, to be able to, at the risk of using the word we just agreed we don’t like, personalize how I consume information. So I was on the Google beta site today, playing with some of the new AI tools they’re experimenting with in the lab.
Rebecca Bultsma: And one of them is kind of like NotebookLM, but it takes a research paper or a document and almost turns it into a kind of gossip session between friends. It’s more of a conversation you’re listening in on than a formal podcast. I’ve done that with my kids before, and it’s interesting for connecting random ideas.
Rebecca Bultsma: ’Cause I also did it with a teacher once who was teaching the Book of Exodus in a Catholic school, but her teenagers weren’t into it, so she gave it an Outer Banks movie trailer and created something that was highly engaging. So I think that’s probably one of the most interesting ways: to make learning interesting and unique, and to put it in a context that’s actually engaging for an individual.
Rebecca Bultsma: Same with [00:53:00] Google Opal, where you can build these personalized apps instantly for every student if you want. I think those very specific use cases that speak to the individual are the best possible things. If students could learn to leverage those tools to put this accounting information in basketball analogies, I think that is super underutilized.
Rebecca Bultsma: For teachers and leaders in school, I think one of the best use cases I’ve found is pressure testing: here’s my lesson; I want you, the AI tool I’m working with, to tell me 10 ways that a 10th-grade student would think this is super boring, or tell me 10 ways that you would cheat on this assignment using AI.
Rebecca Bultsma: I think those kinds of gap-finding and pressure-testing opportunities are massive. Use it for that. Use it to hyper-personalize what you’re doing and to find the gaps in what you might be missing. Those are the use cases that I default to over and over and over.
Brett Roer: The hardest [00:54:00] part for me as an educator, when I had co-teachers or when I was coaching or leading teachers, was always the why. Why is this what you’re doing? And I’m not saying that as judgment; it’s actually, can you express to me why this is the best thing? And sometimes it’s really hard to get them to that point.
Brett Roer: But one thing I hope to see now that there’s AI is kind of what I heard Thomas say.
Brett Roer: I’m a huge proponent of going assignment by assignment: right, what’s the skill you actually want them to use? If AI didn’t exist, you’d want them to learn how to do what? Okay, great. And let’s all assume that’s a good standard, that all the standards are what we actually want kids to be learning and that we can maintain rigor.
Brett Roer: Now the next question, since we live in 2025, is: should a student use AI for this assignment, and in what specific way? And then how do you explain to a student, kind of going back to what we said before about not losing the intrinsic ways your brain works, [00:55:00] great, you still need to meet this benchmark and this bar and this level of rigor, so here’s how AI could help you reach that level of rigor.
Brett Roer: To Rebecca’s point, right, have it work the way your brain works. At the end of the day, you still have to produce an outcome that, assumedly, if it’s good, is valuable. We want kids to be able to reach certain benchmarks and goals. But step one is: when is AI a hindrance?
Brett Roer: And when is it a helper? And actually have that conversation at the teacher level, because many teachers are great at giving explicit directions and instructions, verbal, nonverbal, on each sheet of paper. And yet this feels like a bridge too far for many. But it’s only because we haven’t actually stopped and just unpacked it.
Brett Roer: What is it? What are you afraid of? Great, let’s name the fear: I’m afraid kids are going home and having AI write the essay. Okay, now what? What parts of the essay would it actually be helpful for? When I had students in Title [00:56:00] I schools who weren’t always at grade or reading level, they could articulate history to me in such an interesting way.
Brett Roer: The gap was the writing. Okay, so how are we gonna bridge that gap? The thoughts are there; what’s the skill? And I’m not pretending I’m a writing expert. AI could help me help that kid figure out how to bridge that gap, because it could help me analyze what I have in front of me in a way I couldn’t have done before.
Brett Roer: I wouldn’t have the skill or the time. So that’s a good use case to me, and I hope that would be a good use of pedagogical PD: what’s the skill, and what’s the way we can make AI a helper in this scenario, when applicable? And the end goal is you want them to be better at reaching these standards that, again, hopefully are the right ones.
Brett Roer: And if they are, then let’s use it to help us. So that’s my favorite use case right now. Students, families, and the rest of it, we’ll figure out.
Thomas Hummel: What is the biggest push, though, in education right now with AI?
Rebecca Bultsma: That is the question.
Thomas Hummel: [00:57:00] Yeah, what are we seeing as the push? What do you think is getting pushed into schools? I know what it is, but...
Rebecca Bultsma: I just think the vague message, well, it depends on the school, but it’s: you have to be using AI, with no real direction or benchmark or instructions along with it.
Rebecca Bultsma: Or, you know, you should be using it to plan your lessons. It’s very, very loose. I have seen very few schools that say, here are specific requirements, and it’s probably better if you don’t have specific requirements. But the big push is to use it for lesson planning and differentiation, and maybe to organize yourself personally, but it’s very hard to do those things while staying within some of the guardrails that exist, too.
Rebecca Bultsma: So I would say there’s not a lot of specific push or instruction or guidance to go along with it. What were you gonna say? What are you seeing?
Thomas Hummel: No, I was gonna say, what do you think the goal of these AI companies is? What [00:58:00] is the push? What are they after?
Rebecca Bultsma: Well, we know what the goal is for them: to be in all the schools, getting all the data, and having the subscription model that you’re then tied to almost forever, right? And then, yeah.
Thomas Hummel: But yeah, for students,
Rebecca Bultsma: For students, I think the goal is to, well, I don’t know, you tell me.
Rebecca Bultsma: What are you seeing?
Thomas Hummel: Yeah, no, well, that’s where the money is. Per-student models, the district-paying model, and what companies are after is student time spent on chatbots. That’s the metric that is being pushed.
Rebecca Bultsma: I have some great research for you and some great people to follow, just because I do work in the UK, right?
Rebecca Bultsma: It’s very, very different over there than it is in the United States. But I don’t think it has to be like that. Honestly, talking to you guys has given me so much more hope, just that there are people out there doing good things for the right reasons in schools and giving these supports to teachers.
Rebecca Bultsma: And I’m really looking forward to amplifying your company and your message and what you are doing, because I [00:59:00] think that’s what we should be doing.
Brett Roer: Yeah, I’m grateful for that. How fortuitous was that? We reconnected, you reached out to me on LinkedIn, we circled each other, and I’m glad we had a chance to talk, ’cause your story really did inspire me.
Brett Roer: I can’t imagine being a classroom teacher at any point in my life. You have a newborn, you’re still teaching when you return from paternity leave, and you’re also trying to shape the face of AI, the right reasons for AI, how to bring AI to educators. So I wanna commend both of you, Thomas Thompson IV and Thomas Hummel, and the work you’re leading at Eduaid.
Brett Roer: The last question we are gonna go out on is this: we understand the amazing work you’re leading, and we appreciate you coming on. If we’re gonna move education in the direction you hope it goes, you need other champions too, right? We call it the Oceans 11 question.
Brett Roer: Who’s gonna help you pull off the ultimate mission of enhancing and improving education? Free flowing, just: who are some folks out there that you [01:00:00] wanna make sure others know about? You’ve named some great people, but who are some last folks you might wanna make sure get their flowers as we close our show?
Thomas Hummel: I’d like to shout out my guy, Jacob Kantor. Love that guy. Yeah, I know you know him. He’s just been super kind to us; we have no business relationship with him, but he’s connected us with all kinds of people. Love that. We love our friends at CZI; they have been amazing with the Learning Commons.
Thomas Hummel: Everybody needs to go check that out; it’s an incredible resource that we are gonna bring to people as well. As far as other people: selfishly, an AI tool that I use in my own classroom, CuriPod. If you haven’t heard of them, they’re amazing. Love that product. What they do, getting student feedback while still being engaging and leaving the teacher in power, is the way things should be going.
Thomas Hummel: So, who else can I shout out? I don’t know, that’s it. We’re bootstrapped, so we don’t really have any real friends; we just have teachers, a whole bunch of teachers that like our products. Shout them out, you know. And I wanna shout you [01:01:00] guys out. Thank you for having us on. So appreciate that.
Thomas Thompson: Definitely talk to educators, that’s the biggest takeaway. But if you’re a teacher and you’re listening to this: Daniel Willingham, and the book is Why Don’t Students Like School? It’s a terrific read. He is a terrific thinker. Read his articles, read his research.
Thomas Thompson: He’s a cognitive scientist who has really just turned his lens to school. Terrific book, terrific person, still alive, so I’m sticking to mostly alive people. Next, Paul Kirschner and Carl Hendrick. Carl Hendrick has a Substack called The Learning Dispatch. I wish I was getting kickbacks on this recommendation, but I’m not, but you should all go read it.
Thomas Thompson: It’s terrific. The book is called How Teaching Happens; the sequel is called How Learning Happens, both written by the same authors. Carl Hendrick is terrific, again. A couple other Substacks: Dylan Kane, he’s a math teacher in Colorado. He has a Substack called Five Twelve [01:02:00] Thirteen, I believe. You should go read that.
Thomas Thompson: It’s again terrific, in the classroom, on the ground. And now for the dead people: Daniel Dennett, cognitive scientist, philosopher. Very useful. John Dewey, go back, reconnect with the history that we all have sprung from. Mortimer J. Adler, alright, The Paideia Proposal. It’s a terrific idea for the future of schools, back in the eighties, and the Great Books program.
Thomas Thompson: The idea that there are great ideas, that there’s a great conversation, that there are great books we should read, is really important, especially with AI. And I’ll leave it there. That’s probably a good list: Mortimer J. Adler, John Dewey, Daniel Dennett, Paul Kirschner, Carl Hendrick, and Daniel Willingham.
Thomas Hummel: This is why I brought him on board. There’s no ed nerd like Thomas Thompson.
Rebecca Bultsma: Oh, I love it. I love it. The goal is to raise kids who are curious, thinking outside the box, and trying to give back. I think everybody [01:03:00] should read all those books if it gives us even half a shot of raising good people who end up doing the great work that you two are doing.
Rebecca Bultsma: So thank you so much for taking the time to join us today.
Thomas Thompson: Of course. Thank you so much for the invite. Thank you for the great questions.
Brett Roer: Thomas and Thomas, keep up the amazing work you’re doing. Keep raising the next generation of curious learners, keep educating them, and keep providing educators with safe, effective tools in their classrooms.
Brett Roer: And on behalf of Rebecca and I, I just wanna thank our audience for taking time to listen to the AmpED to 11 podcast. You all have a wonderful day and we’ll see you on our next episode.