Julia Fallon: [00:00:00] You know, I think policy actually, it's how values get operationalized at the end of the day. That's really where I see it. And it's one thing to say that we care about ethics. It's another thing to actually design systems that uphold those things when things get complicated.
Rebecca Bultsma: Some people feel very, very strongly that it’s important that kids learn traditional ways, traditional methods, and not use AI.
Rebecca Bultsma: And so that ends up being really hard. Um, and it’s hard to find spots that everybody agrees on.
Julia Fallon: The model isn't working, and AI is just showing it even more so. And do we need to have more project-based learning types of environments for kids, you know, where they can explore things too?
Julia Fallon: Maybe, you know, then they can learn where their, their passions are and they can go off.
Brett Roer: We are live on the AmpED to 11 podcast. Thank you to our millions of listeners for joining us today. Rebecca, how is everything going in your homestead of Canada?
Rebecca Bultsma: Uh, you know what? Just living the [00:01:00] winter wonderland dream up here, and just reflecting on all the stories Brett was telling us right before we went live.
Rebecca Bultsma: So we might have to start another podcast, uh, for the real stuff. But, uh, the, uh, AmpED to 11 after dark podcast potentially. We’ll, uh, see how that goes with all the good stories. But other than that, I’m, I’m doing great. How about you?
Brett Roer: I’m excellent. Today is Versha Munshi-South, the Chief of Staff of Amplify Elevate’s birthday.
Brett Roer: We were just in our favorite school district in Westchester. We just interviewed students and while I truly am so excited to introduce our guests in a moment, there’s nothing better than being in person with students. It was the best thing ever until this podcast right now. So, without further ado, we need to introduce this rockstar, I guess you could say.
Brett Roer: She's also a DJ star. She is Julia Fallon, who goes by many names and has many titles. So, Julia Fallon, I wanna first introduce you by reading your LinkedIn profile: Julia Fallon, she/her, and we're a first-degree [00:02:00] connection, Technology and Learning Alchemist, a Friday night DJ, and the executive director of the State Educational Technology Directors Association.
Brett Roer: Julia, thank you for joining us on the AmpED to 11 podcast. How are you?
Julia Fallon: I’m good, thank you. I’m really glad that I’m here.
Brett Roer: We are as well. And did you ever think that would be your introduction in a professional setting like a podcast appearance?
Julia Fallon: No, but I love it. I love that you’re using my entire title across, across the way.
Brett Roer: Yes. So first, for those that don't know, we are gonna go deep cuts in your journey. We're gonna talk about, you know, getting Ukrainian cured pork sent to you halfway around the country. That's all to come. But could you first start by just sharing with people what is the actual incredible work you're leading in the world of education?
Brett Roer: You have such a really important seat at the table. Could you share with everyone what it is you do and how you serve education today?
Julia Fallon: Sure. A brief kind of overview is that I have spent, I think, the majority of my career working at the intersection of education, technology, [00:03:00] and policy. And often, as I like to say, in the unglamorous middle, where good ideas either fail quietly or they become real, which is great.
Julia Fallon: But I have been in the EdTech space, both on the higher ed side and in the K-12 space, now for 30 years, so people can kind of figure out how old I am from that whole thing. But I spent 17 and a half years working at the Office of Superintendent of Public Instruction here in Washington State, which is where I'm based, in various ed tech roles, from career and tech ed, to the pure ed tech, more of the boxes-and-wires side, you know, tech plans, things like that, to working in the federal program, Title II, Part A, which is around supports for teachers and administrators, and really helping our folks understand what tech integration looks like and what good professional learning looks like in that space.
Julia Fallon: About five years ago, I joined the staff of SETDA, where I was a member for 17 and a half years prior to that. Um, we are a small and mighty professional association of state-level ed tech [00:04:00] leaders from across the country. We also have affiliates that work in that state kind of space as well. Those are either state- or regionally focused.
Julia Fallon: And we also have corporate members that share our vision and mission. And I always say we're small and mighty because compared to our counterparts, like a CoSN or an ISTE, we do not have the numbers. We don't have a conference that has thousands of folks at it. We are very small. Um, if you think about folks that work in departments of education, there's only a handful.
Julia Fallon: You know, there’s some states that actually have dedicated even more staff, but typically it’s been a handful. And to find people that are doing the same type of work that you’re doing in a state bureaucracy or a state government has been a really great experience just being able to go, Hey, who knows about this?
Julia Fallon: Because that's not maybe my wheelhouse, and, and be able to do all that kind of stuff. So really the work that we do at SETDA has really been helping state leaders move from reaction to intention. And, you know, I know we're probably gonna dip into the AI conversation here as well, but it's really about how do we [00:05:00] help each other so that we can make better informed decisions, and do it in a way that cares for the folks that we serve at the end of the day. So I'll leave it at that.
Brett Roer: That is impressive. Rebecca is gonna be chomping at the bit, because she's gonna have so many more incredible questions to really get in the weeds about the work you're leading.
Brett Roer: Here’s something I’d love to ask you. Sometimes we ask people, you know, if you had a magic wand or if you had the right levers to make change at scale, what would you do? And you’re one of the few people we’re talking to who like actually do have some of those levers. What is the coolest part about your job?
Brett Roer: Like, what’s something you really get to do or you can’t believe you have a seat at the table or the ability to influence, you know, change in, in the United States in Education.
Julia Fallon: Enhancing or amplifying, like, human capacity. Again, I feel like my entire career has been about building community, or caring for and nurturing community. And someone asked me the other day, like, what are [00:06:00] your top four books? And The Wisdom of Crowds always comes to mind, about, you know, individual brilliance is one thing, but when you can put a bunch of people together, you can actually go farther than that one person, in essence.
Julia Fallon: And I think that's what SETDA really exemplifies over the years. We turned 25 this year, but the idea is really about how do we continue to connect people in ways that we can then move needles, right? And, and everything else. We have technology that lets us do it. I mean, you know, the technologies today, like, we're on a podcast.
Julia Fallon: Remember, imagine doing this like 20 years ago; it would've been like, we had to go to a studio, we had to do some other stuff. So the barriers to entry are much lower. But I think we're getting better at connecting virtually and making it really engaging and everything else. And I think, again, it helps just support communities. And affiliation, I think, is one of those types of skills that the internet brought to everybody.
Julia Fallon: Like how do you affiliate and how do you mobilize and advocate and, you know, collaborate and communicate with one another. So I think the [00:07:00] magic wand is, I wish more people had more positive experiences about that, including even online learning experiences. You know, it was really sort of disheartening to see what happened during the pandemic in some ways, because I know the power of technology and the, the types of experiences that students could have, you know, participated in, and not everybody got that experience.
Rebecca Bultsma: I’m really glad that you brought that up because you were talking about, um, before we jumped on here, one of your kids who’s getting ready to go to college, maybe in Canada, which fingers crossed, but that kind of leads me to think a little bit. We have kids the same age who kind of were in school during the pandemic.
Rebecca Bultsma: What, what did we miss there that we can get right here, do you think? Like, what are the opportunities? What did we miss that we have a second chance at, and what do we risk if we don't get it right this time?
Julia Fallon: Well, what's interesting is, someone the other day also asked me this question.
Julia Fallon: Like, we've had the internet in classrooms for 30 years, right? I wanna say 1996 is really [00:08:00] when, in the United States, the internet was brought into everybody's space, both commercially, and schools started getting connected, right? I think under the Clinton administration, you know, like, let's get all schools connected, and E-Rate started to come together as a program to help schools.
Julia Fallon: We're seeing sort of ripples from that, right? But at the time, I think some people were like, well, why do we need it? What do we need to do with it? Some teachers, like, well, I don't need to know about it in teacher prep programs and we're not gonna, you know, address it. And I think what started to happen is, when the pandemic hit, we made the joke.
Julia Fallon: I made the joke when I was a state leader, like, we should get "I told you so" shirts. We already knew. Well, we could say, you know, check the box that, you know, 96% or 98% of schools were connected and kids had devices. We weren't all the way there, right? We still had that little bit to go. But what happens is, now everybody's home during the pandemic, right?
Julia Fallon: And that access was not necessarily 98% or whatever. We found communities that weren't connected at all, et cetera, [00:09:00] et cetera. And then what we tend to do in education is we always tend to go to, like, the lowest common denominator. Like in my home district, and I'm not gonna name them, you know, they were probably ready to go for 80% of the school population, but they waited until they had a hundred percent.
Julia Fallon: Meanwhile, for three weeks, everybody's kind of putzing around, versus trying to figure out, how do I serve what I can serve right now, and then get those folks that aren't connected or need devices up and running? So professional learning, you know, teachers not having the skills for what it means to teach online and virtually, parents not understanding what's also happening.
Julia Fallon: You know, that sort of thing. Kids maybe not having the skills, trying to figure out developmentally, like, kindergarteners on Zoom is a lot different than having high school students on Zoom at the time. And I think what's happening now with AI, and I'm hoping that we've learned the lessons of the last five years, is to be really intentional about how we implement the technology.
Julia Fallon: I'm not saying we have to slow down and wait until everybody's on board, but I think we need to be a little bit more intentional about it. It's gonna transform the [00:10:00] classroom, you know, more, just like the internet did. I think we had technologies in there that helped, but it's really gonna, I think, change how we do things, right?
Julia Fallon: It's, it's accelerating. And I had posted on LinkedIn recently, I think it's actually glaring now where we have gaps in the system, not just technology. Technology seems to be the scapegoat here, but it's really showing the gaps in the systems of this model that we've been kind of holding on to nostalgically as well.
Julia Fallon: Um, one of the things I said during the pandemic was nobody missed the bell schedule, right? Or chem lab. They didn’t miss that part. They missed ritual and they missed community, right? They were talking about when are we gonna have a football team? When are we gonna have the prom? When can the kids get together because they miss each other?
Julia Fallon: It was about that human connection. And I wish that we could also look at technology as, like, how do we help that human connection happen? Um, how do we have more of those conversations? You know what I mean? How do you free a teacher up with, like, let's say, AI, so they can spend more time with their students, right?
Julia Fallon: [00:11:00] And build that connection, because that's where the learning takes place. It's not, you know, technology dumping something into my brain. So I think it's just forcing us to ask harder questions, right? And especially, I'm hoping, it's around governance, ethics, and what public education is really for. You know, what is it actually for, and having that conversation.
Julia Fallon: I believe it’s a civic thing. I think as a country, I think that helps us. But I mean, other people have other ideas about it being a workforce, you know, development kind of thing, which it does too. But I think for me, civics comes first. You know, being able to participate in my community. And I think schools should reflect the values of their community.
Julia Fallon: So I think sometimes when things come down from up high, I feel like, well, where's student voice? Where's parent voice? Where's the community voice? And if they've all decided they wanna go in a certain direction, we should honor that in some way, as long as they've kind of gone through a process that reflects all of those things.
Rebecca Bultsma: Let's talk a little bit more about that policy part, because that's where everyone's struggling right now. Like, I'm a huge policy nerd, I help with policies, but [00:12:00] you're right. Uh, the local ones are different from the state ones, which are different from, uh, what the federal government envisions. And how do you see that all playing out?
Rebecca Bultsma: Like, what do you think the biggest disconnects are between, like, let's say local, state, and, like, national, um, policy ambitions around AI, and, like, the implementation realities, and how do we ever start reconciling that? You probably see a lot of this at every level, and I know that's a wicked problem, but I think it starts by talking about it a little bit, and I'm curious what you think.
Julia Fallon: Well, there's always the stuff that has to happen between policy and implementation, right? Like, I do believe, and I know it's hard to say that in probably 2025, 2026, is I do believe that there's good policies out there, right? The intent is good. When you look at legislative intent, I don't think anybody is trying to be a jerk here, right?
Julia Fallon: Even No Child Left Behind, which I know a lot of people have, you know, feelings about. Ultimately the intent [00:13:00] was to make sure that no kid was left behind. How it got implemented, though, and how it got measured? Completely different experience, right? So from a policy lens, I think for me it's like, who's carrying the risk?
Julia Fallon: Who sets the rules, who lives with the consequences? We have a Congress that isn’t necessarily reflective of all the people in this country, you know, in the United States in particular. Uh, how do we make sure that voice gets in there? And they’re also not very young, I’ll put it that way. So they have ideas about how things were and how things should be.
Julia Fallon: And I think we have a generation of kids, I think about my own kids and the ones that are coming behind, that they are looking at the world a little bit differently. And I think that they're actually more community focused than previous generations. We tend to be more individualistically focused here in the United States; I think they're more community focused because they realize they're gonna have to work together in order to be successful in life, and that sort of thing.
Julia Fallon: But policies, like, you know, what gets approved, but also what [00:14:00] quietly sneaks in. You know what I mean? Like, there's some things about, like, hey, this is the law. But then sometimes people have workarounds, because it's not gonna work, maybe, for their local community or their regional community. We always have questions about who's accountable when something breaks.
Julia Fallon: So we can write the best privacy laws out there in the world, but what happens when it fails, right? How do we, how do we hold companies accountable? You know what I mean? We're not even getting into cybersecurity, I hope, because that's another bucket of things, but, like, who's accountable at the end of the day when all that data is out there?
Julia Fallon: Right? You had said something on another podcast about asking who does it harm and who does it... you know what I mean? Who's left behind? I read some language about that. And does the guidance arrive before or after harm, right? Is it done in a proactive way or is it done in a reactive way? And then, in terms of K-12 education policy, are educators supported, right, or are they left on their own to figure it all out?
Julia Fallon: Like, I think these are the kinds of questions I think about. You know, without policy, [00:15:00] innovation relies on individual heroics, like somebody having an idea and getting it out there. But I think with policy you can have more sustainable and more equitable implementations. And when I say equitable implementation, it doesn't mean the same prescription for every single community, but they get the resources they need in order to implement with some fidelity.
Julia Fallon: You know, I think policy actually, it's how values get operationalized at the end of the day. That's really where I see it. And it's one thing to say that we care about ethics. It's another thing to actually design systems that uphold those things when things get complicated.
Brett Roer: I would love, again, we're so fortunate that we have someone like you who gets to see this at a national scale. You're really at that intersection where you're living with government policy, seeing the direct impact at the top. And you know, most of the work I'm leading and most of the people I'm in community with, we're seeing the end result, the end-user result and impact.
Brett Roer: So I heard two things today that I would love to get your opinion on, Julia, and obviously Rebecca as well. So one of them was, I was [00:16:00] moderating, uh, a listening session today, and there was this question around, like, you know, it's a community and I'm the only external member of the community. Like, well, you know, you're helping us as an AI expert.
Brett Roer: And obviously I said, that's not the case. But they were like, well, where should we be seeing outside? Like, what else is happening outside that we should know about? And because we'd already been talking, I said, to Julia's point, like, you've already brought up how community is the thing that fills your cup.
Brett Roer: Like, everything you've mentioned brings humanity back. It's not AI, you know. But then I said, like, when we've been discussing how you feel about AI and these things, we can always adjust a framework or attach a framework to your community values, as you said before, but, like, the answers are really right here in this room.
Brett Roer: Like, you all can have targeted conversations that would allow the wisdom in this room to shape what your AI guidance and policy is. So Julia, if you were somebody in that room, first, what would be the kinds of things you'd want that community to do if they've already made the decision to bring people together?
Brett Roer: What should they be talking about in those rooms? And then [00:17:00] two, if they wanted to find external frameworks or organizations that would be really useful to take those values and align them to best practices, who should they be turning to?
Julia Fallon: Oh, those are good questions. I think coming up with some maybe shared principles, you know what I mean? That you can actually implement, right? Like, it's one thing to be really, like, 30,000-foot, but at the end of the day, I think communities are trying to figure out, in particular, teachers, right? Because at the end of the day, I mean, they're the ones that are impacted the most by what's happening. But they don't really need mastery of everything.
Julia Fallon: I think there's this idea like, oh, the teacher needs to know everything there is, and dah, dah, dah. But I think it's more like they need judgment, so they need help trying to figure out, like, when do I use it? When do I not use it? You know, how do I talk about these choices with students? One of the things that we called out in the 2024 National Ed Tech Plan was, you know, universal design for learning, right?
Julia Fallon: You should be able to design stuff for all students, and if you use it, it doesn't just benefit those students with disabilities, right, or who need accessibility supports. It actually [00:18:00] benefits all students. So that can help you determine, again, that judgment, right? When do you use technology? When do you not use technology?
Julia Fallon: There might be some students you don't use technology for, because it's probably not appropriate for them to be able to get to a learning goal, and things like that. But also, while you're doing that, you can have those conversations with students, because students need a safe place to learn how to use technology in a way that lets them be responsible citizens, responsible when they become employees, responsible employers when they're starting businesses, and doing all of that kind of stuff.
Julia Fallon: And it's, again, it goes back to what shared principles are you all talking about. Do you want all your students to be, you know, like, post-secondary success, whatever that looks like: college, you know, certificate, apprenticeship programs, like, what does that look like? Do they have a set of standards that they all are gonna know?
Julia Fallon: And there's some good stuff that's already out there, right? Like, you know, the ISTE standards are out there for students and educators. AI literacy can be coupled with that type of conversation as well. Uh, you know, we're still gonna [00:19:00] talk about cyberbullying and how you treat people kindly. We're still gonna talk about stranger danger.
Julia Fallon: All of those things that you would talk to kids about, even if technology wasn't there. It's still a conversation. But those are principles that I think you need to land on, some shared principles. And I think where districts and states can align is, what do those principles also look like, right? I don't think we need one national rule book, again, you know what I mean?
Julia Fallon: But I think we should have some shared principles about what we see. And then I feel like accountability should be building capacity, not fear mongering, right? So when stuff becomes compliance-driven, innovation completely shuts down and people aren't creative, because they're fearful they're gonna get in trouble.
Julia Fallon: So what does it look like if accountability built capacity? Do you create spaces, you know, as a community, for people to try things? You know what I mean? And not be fear-based and everything else. So I think that's where I would start them. I'm trying to think of the people that you would talk to.
Julia Fallon: That’s a harder question for me, Brett, because I’m [00:20:00] still trying, I mean, especially in the AI space, I’m trying to figure out who I can trust, you know what I mean? Who is authentic? Who’s not coming at this, where they’re just trying to make some cash and get out, you know, like you’re trying to assess all of that kind of stuff.
Julia Fallon: But just like any other emerging technology, our field is really small. I mean, I don't think people realize, especially if you're not an ed-tech person that's listening to this, our field is actually quite small. And I think reputation and community and connection actually can tell a lot about who you can trust in the space as well when it comes to all of this stuff.
Julia Fallon: So to lean on that and see who they’re connected to, and I’m very particular about my LinkedIn, like I just don’t accept things from everybody because I don’t want people to necessarily leverage my reputation, right? If I don’t know them, if I, if, if it’s somebody that I share 78 connections with, I’m more apt to say, yeah, let’s do this.
Julia Fallon: Because obviously there’s an alignment there. But random people that send me stuff, I’m like, if I don’t know you or if I haven’t met you at a conference and I haven’t had a conversation with you, I’m a little bit more skeptical about connecting. So that’s my own sort [00:21:00] of philosophy about LinkedIn. I know other people have other ways to use LinkedIn, but that’s my, that’s my way that I use it.
Brett Roer: Yeah. And I wanna get Rebecca's insight on that question as well. So Rebecca, I'll let the wheels churn there. But Julia, importantly, for us collectively, all three of us, you know, like you said, there's kind of a circle of people that we get to actually interact with as humans, ascertain their values, and see their true motivations.
Brett Roer: And that's one of the things I love about the fact that we get to bring on guests like you onto this podcast. Like, I might be directly impacted as a teacher whose district has adopted X standards or, you know, X alignment to AI policy or guidance, and I'll never know the difference. So that's why I'm so grateful that people like you come on and share some of those best practices, because you wouldn't be amplifying those and lifting those up unless you truly believed in them.
Brett Roer: So I just wanna say thank you. Now it’s your turn. Rebecca, what would you say to that? Like what are, what would be your feedback and thoughts around the questions that surface on my end?
Rebecca Bultsma: I think it's hard, there's so many [00:22:00] different levels to come at it. Obviously, like, there are a lot of issues that I'm running into with my kids, even just recently at university, where every teacher has a different ethical philosophy on AI. Which is, like, as an AI ethics researcher, how you even define what is ethical and what is ethics, especially connected to AI, is so hard, because we live in such a fragmented society where we can't even agree on whether tipping is ethical. You know, like, all of these really, really kind of simple things. So something like this, um, you know, we may feel like it's ethical to prepare kids for a future with AI, whereas some people feel very, very strongly that it's important that kids learn traditional ways, traditional methods, and not use AI.
Rebecca Bultsma: And so that ends up being really hard, um, and it’s hard to find spots that everybody agrees on. So I agree with the idea of those shared principles. Brett knows this, but I advocate for a, an approach called appreciative inquiry where you talk to students and parents and all the staff and you agree on these [00:23:00] shared principles and what’s going well and the direction you wanna go together and build around that.
Rebecca Bultsma: But it's so much easier said, uh, than done. And that's why it's hard to do it in bigger communities, and the difference between K-12 and post-secondary, and then kids enter the workforce where they do or don't have the skills that they need. So it's messy. There's not a lot of, uh, clean answers, but the best thing you can do is listen and talk to people and find out what works best.
Rebecca Bultsma: But I think we have to remember that kids are getting a lot of disconnected guidance right now, like even from class to class. Like, absolutely not in this class. Okay, yes, you can use it in this way, in this class. And it’s just a lot for them to manage and, and they’re getting a lot of mixed information about competing priorities.
Rebecca Bultsma: And for the generation that our kids are in, Julia, I think, um, AI was really demonized for them early on. So there's actually a very specific generation of about six years here that was told from day one that it was cheating and it was bad and it was wrong. And they really have deep-seated, like, feelings [00:24:00] about that, that were kind of given to them very, very young.
Rebecca Bultsma: And so it's just interesting how influential teachers and leaders are. So we just, we have to get this right, whatever right looks like. Um, it looks different for everybody in their mind. For me, it's having kids prepared to exist and get jobs in the real world, which I think involves learning and understanding AI, which starts with teachers learning and understanding AI, and the people making decisions about it.
Rebecca Bultsma: But again, chicken, egg, hard, complicated. What’s the kind of the biggest disconnect that you’re seeing out there? Like are you seeing this kind of polarization when it comes to ai?
Julia Fallon: You just said something that is interesting, because I think sometimes I don't know if ethical AI is about getting it right, necessarily.
Julia Fallon: Because I don't know if we're ever gonna get it right. Like, you know what I mean?
Rebecca Bultsma: And what is right?
Julia Fallon: Yeah. And I wonder sometimes if it’s more about building systems that can notice harm and respond before it scales, right? Like that’s a different way to look at it, right?
Rebecca Bultsma: That's kind of where your accountability thing comes in, though, right?
Julia Fallon: Yes.
Rebecca Bultsma: [00:25:00] That who's responsible for...
Julia Fallon: Well, if everybody's responsible, then no one's accountable, right? That's...
Rebecca Bultsma: Exactly.
Julia Fallon: ...kind of thing. So.
Rebecca Bultsma: Do we have individual responsibility to use it in ways that don’t cause harm? Is it the school’s job to teach us?
Rebecca Bultsma: Is it the ed tech company’s job to build that in? Is it the foundational tech company’s jobs to make sure that their systems don’t cause harm? Does the government, are they responsible? It’s almost like the gun debate, but like on a different scale of who is accountable, what is accountable, and what is the right way to do this?
Rebecca Bultsma: And it’s just like you said, right now we’re living in a world where there is no real accountability for any of the harm that AI is causing short and long term. And that’s part of the problem. There’s not a lot of, um, accountability.
Julia Fallon: I do wanna do a shout-out, though, because I do think that educators who slow down and ask better questions are really underrated.
Julia Fallon: Does that make sense? I'm not saying that they're blockers or the laggards, as we call them in that, you know, adoption cycle. But the ones that are going, I need to ask better questions. And I think we also fail to really listen to students. I [00:26:00] think they are the clearest voices right now asking for transparency and some agency, which we don't always give them.
Julia Fallon: And, you know, building that trust. And sometimes I wish that we would listen to those folks that are like closest to implementation because I think they see things a lot earlier than the rest of us. And I feel like we need to listen to them a little bit more. Um, ’cause there’s always folks that are gonna go first, right?
Julia Fallon: They are like our scouts and then say, okay, so what did you learn? You know, what, what should we work out? You know, can we get around this pothole here? Or is this not the way to go? And I had the fortunate luck of sitting next to somebody on a, you know, cross country flight who actually, now mind you, I was also trying to look up his credentials while I was talking to him to make sure he was like, legit, like, you know, oh, you run AI for the Department of Defense?
Julia Fallon: Oh, wait a second. You know, like, I wanna make sure this guy is really legit. And he truly was. And we happened to not have anybody in that middle seat. And I was like, this is like just as ChatGPT is hitting, right? August of ’23 is what I keep thinking about in my head in terms of dates.
Julia Fallon: And I said, what, [00:27:00] you’re running it for the Department of Defense, what would you say to K-12? We’re gonna be grappling with this. And he’s like, you have to build those use cases. You’re never gonna be able to cover all the scenarios, but also be clear about where you don’t use it, and then hold people accountable.
Julia Fallon: Like, if you used it in a space it’s not supposed to be used, it’s not good, right? Be very clear about where not to use it as much as you are clear about where to use it. So I keep thinking about him and that conversation that I had, you know, across the country from Seattle to DC, about where do we not use it. I don’t think we have a lot of conversation about where not to use it.
Julia Fallon: You know, everybody’s like, let’s just use it everywhere. I’m like, well then maybe not, you know, like, maybe we need to slow down here a little bit, and, uh, that sort of thing. But what’s happening right now in the United States too is all this rhetoric about taking tech outta the classroom. Right. So it went from cell phone bans to device bans.
Julia Fallon: So now we don’t want tech at all and there’s just this air of privilege in some ways that drives me a little crazy too. Like, well, that’s nice for you ’cause your kid’s still gonna probably get some sort of tech [00:28:00] experience and then be successful post-secondary. But I, I made the case in a, in a conversation, we were talking about cybersecurity.
Julia Fallon: I’m like, let’s say we, let’s say the scenario happens and we get rid of tech in the classroom, just we’re gonna do that. It happens. And that’s the worst case scenario. You still need it for the front office. You know, we have lunch room, we have bus schedules, we have security systems, we have assessment data.
Julia Fallon: Are we all gonna start doing pencil and paper assessments again, because that’s where we’re gonna move to if you’re not gonna have any tech in the classroom, because if you give a third grader a test, do they fail it because, you know, they didn’t, didn’t do well in the assessment because they don’t know how to use a computer or they don’t know the stuff.
Julia Fallon: Like you won’t be able to ferret that out if they don’t have those experiences. So I, I think people are kind of extreme because they’re just overwhelmed, and there’s a lot of fear out there right now and everything else. But how do we get them to think about, okay, there’s instructional tech, there’s, I don’t wanna get to a situation either where some kids that need tech in order to be successful, right?
Julia Fallon: Because they have a learning disability and everything else, where they’re wearing a scarlet letter because they’re walking around with a laptop, because no one [00:29:00] else has a laptop. Like, right now, when kids are learning, they all look, you know, they all look like a regular student, and they don’t get called out for their learning disability or something else that’s happening.
Julia Fallon: So I’m, I’m trying to figure out how we get back to that moderate, you know, what works kind of space versus these extremes like no tech, all the tech and everything else. So that’s been an interesting, interesting conversation that’s been going on. And I think it’s actually, frankly distracting us from the real issue, right?
Julia Fallon: About agency. Are we gonna give kids agency and choice with their stuff? And the system that we have now is not working.
Rebecca Bultsma: Isn’t it interesting just that we’re seeing at every level of society that kind of polarization to all or nothing, or one side or the other? I think that’s so interesting because, you know, all of this and all of ethics, everything lives in the gray, in the nuance. And I think even the idea of labeling all AI,
Rebecca Bultsma: no AI in the classroom? Well, are you kidding me? Like, what about, like, you mentioned, like, versions of it that are [00:30:00] specifically designed to support learning. What you mean is no AI to do kids’ homework, right? Like, if that’s what you mean, say that. But, and maybe, like, no AI for these specific things. But we’re just trying to make everything all or nothing and put things into broad categories when it’s just a million little categories, and we have to actually think through nuance.
Rebecca Bultsma: Not only in this but in every aspect of everything we do. And that’s part of the problems I think we’re running into largely, labeling people as good or bad, or this as good or bad, or technology as good or bad, and trying to go all or nothing. I think, uh, that’s not the world we live in. As you mentioned, this generation sees that, knows that, but the world is being run by people, our generation and a lot older than us, who really see the world in binaries, and it suits their needs and purposes often to keep things that way.
Rebecca Bultsma: And uh, it’s not helping very many people.
Brett Roer: Yeah. Speaking of gray areas, this was another question that the superintendent whose school I visited today, he said this, and I would love to know, Julia, like, [00:31:00] and obviously Rebecca as well. So, we talked about today, some of the students we interviewed were very anti-AI, but, like, from a very ethical, moral standpoint of, like, if this is what we’re supposed to use it for, great.
Brett Roer: Like, if we’re not supposed to use it here, you know, like she had a very strong opinion that like she morally would never do that and she knows many others do. And it, you know, it goes on, it goes, um, unpunished. So she’s like, and I said, you know, that’s a, that’s, she’s navigating that every day. Like obviously everyone has that moral imperative.
Brett Roer: You’re allowed to cheat or not cheat, but, like, she’s so ethical, but she’s seeing other people turn things in with no effort because the teacher doesn’t notice it and won’t change their assignments. She’s indirectly being punished by that, and her values have to be, you know, pushed every day. But then we came up with this idea, which is, around the table.
Brett Roer: We used either a teacher use of AI or a student use of AI and was like, well, what’s the gray area? Like, we all know, literally going to ChatGPT, typing this out, getting the answers, turning that in to your teacher, we know that’s wrong. So we’re like, we have that extreme, we all, that’s easy. [00:32:00] And we have the other one where it’s like just shutting it down.
Brett Roer: These students talked about physics. It’s like having a second teacher, and they’ve learned so much by being able to rely on an AI-generated physics support. So we said, great, what if we just went around and everyone said, like, well, what’s the gray area? Like, where do you just personally feel, like, I don’t know when I’m supposed to, or not supposed to, in the absence of guidance?
Brett Roer: If we all just did that, could that kind of help us as a community kind of shape, like, where we actually are, and then, like, adopt policy around that?
Julia Fallon: I don’t, I don’t know because I think about, let’s say that, I mean in the world, right? Like let’s say, you know, like we’re, we’re adults, right? We’re in there in the spaces.
Julia Fallon: No one’s telling me that I can’t use it. You know what I mean? No one’s saying that I can’t use it for something. And a big question no one’s really talking about is assessment. Because I’d rather the teacher go to the kid: whatever you punch in there and you get the report, we’re gonna know about, you know, how cells divide, right?
Julia Fallon: That is not disputed, in essence, right? That’s a thing that we all [00:33:00] agree upon in science. But I wanna know what kind of queries did the kid ask? Like, you know, where did they go? And can I see how their learning unfolded, the learning process, versus just, how do I know that’s the right answer?
Julia Fallon: How did they confirm that that was like a reputable source where I could really rely on that? I guess, I guess it goes back to, you know, when we can Google the 50 state capitals, do we need to teach state capitals anymore? I mean, so for me, that gray area isn’t, you know, is the teacher rewarding the, the, the output at the end of the day, or are they rewarding the process that the student took to get there?
Julia Fallon: And I keep saying, if a student can choreograph a dance that shows me how DNA binds, is that any different than a kid that, you know, puts together a PowerPoint? They’re still showing me that they understand how DNA binds, and that’s the knowledge I want the kid to walk away with regardless. Right? So I think it’s a big conversation about assessment, and that is a rabbit hole, you know, a [00:34:00] can of worms that I don’t think people wanna necessarily get into.
Julia Fallon: And that looks different from one school to the next. It looks different than when our legislators went to school. And it also cuts off this instant access for grades, right? That parents want, right? Like they wanna know that the kid took the test, and I can check the grade book at the end of the day, and they got an A. That might get broken down a little bit ’cause teachers are gonna need more time to actually assess whether or not a student understands, or understands how to look for the information.
Julia Fallon: Those are the skills that you’re gonna need in your work life later on. You know what I mean? Unless you’re gonna be a biologist or, you know, a stem cell scientist, you’re gonna need to get in there a little bit more, right? About how cells divide and everything else. But I think for us, the model isn’t working, and AI is just showing it even more so.
Julia Fallon: And do we need to have more project-based learning types of environments for kids? Do you know where they can explore things too? Maybe, you know, then they can learn where their, their passions are and they can go off, do the [00:35:00] thing. You know, a kid that’s gonna inherit their family farm is gonna use technology.
Julia Fallon: You know, anybody that’s been in a combine these days, they’re doing TikToks while the equipment’s running the thing. I mean, I see that all the time, right? But they also understand what that information is getting them, so they can make adjustments on the fly and everything else. So that gray area is, I mean, how do we do assessments?
Julia Fallon: How do we help parents understand what’s happening? I think when they’re not invited to the conversation, they become even more fearful because it doesn’t look like what they know. But if they could say, Hey, no, my kid is gonna learn this, I think they would be all in. So I don’t know if that answers your question, Brett.
Julia Fallon: I feel like, people, the cheating thing is the cheating thing. Kids have always
Brett Roer: gray area questions, gray area answers. Rebecca, your thoughts?
Rebecca Bultsma: I was just thinking about how it’s just, this, again, all comes back to the policy thing. ’Cause it’s not even necessarily about what the parents want. The parents can demand that, but the legislators need a way to categorize
Rebecca Bultsma: kids, categorize schools, allocate funding, and they need something that they can grade, [00:36:00] and, uh, how kids are doing on a pass or a fail. They need numbers, they need metrics. And so that’s part of a broader system-wide, um, rethink of education that’s gonna need to happen. Because you’re right, the future of work is going to require a very specific set of what we used to call soft skills, which I prefer to call durable skills now.
Rebecca Bultsma: Collaboration, communication, you know, cooperation. All of those things that are harder to put into metric categories that then help us make binary decisions about kids and their futures and SATs and all of these things. And it’s, it’s just not gonna be as neat and tidy into big buckets like we’ve always had.
Rebecca Bultsma: Like we, you know, want at certain levels, and I don’t know how we even start to come at something like that. I do think AI can help with that somehow, but the goal is to have it help support how we think and that process, rather than just give us outputs faster. Right? Having AI do stuff for you is not the solution.
Rebecca Bultsma: Uh, we all know that, [00:37:00] but it’s, it’s all very, very nuanced and complicated. So as policy, as ethics, as all of it always is, so where do we, where do we start is the question, I guess, Julia, like what’s the, what’s the work you are doing? What does a typical day look like for you and what are you seeing?
Julia Fallon: What we’re trying to do is make sure that state leaders have, so again, I think the conversation’s going to, like, research and evidence, where we can make really good decisions based on what we know at the time, right?
Julia Fallon: We’re not gonna always have everything ready to go, but can we help state leaders figure out where they can go get information, right? Research, talk to districts, whatever, to make a policy decision or help inform policy as it gets written, because eventually it gets implemented and everything else.
Julia Fallon: But we do an annual survey of our state leaders. We’ve been doing it for the last four years; we’re working on our fifth, one around trends. You know, like, what’s hot and everything else. That helps when they’re going to legislatures and can say, Hey, listen, AI is now [00:38:00] topping the list.
Julia Fallon: Cybersecurity is still up there. Professional learning is still an unmet need. How do we invest in professional learning? Because regardless of all this other stuff, you have to help teachers. You have to give them the space. And again, in the 2024 national ed tech plan, we talk about the design divide, right?
Julia Fallon: Like we have access almost worked out. We still have to work on home access, and that’s one thing. But teachers need time and space, right? To be able to figure out how to use technology effectively in their instructional practices. And this is an equity issue. And I know that is a dirty word sometimes here in the United States at the moment, but it’s really about equitable access in a school.
Julia Fallon: And I often use my kid as an example. You know, she has to take biology, right? At some point in her high school career. And her last name starts with F and her best friend’s last name starts with M, and they will be in a section of Biology 101, and they will have two different types of experiences. Now, I’m not saying that they have to look exactly the same, but one teacher might be tech-averse and one teacher might be more tech-forward.
Julia Fallon: [00:39:00] And it’s an equity issue in a building, in the same section of a thing. And as a parent, I wanna make sure that my kid’s prepared, right? So hopefully I would want my kid to be in the class where the teacher’s using technology to help inform, or letting them use technology to demonstrate their knowledge or, you know, look up stuff or whatever they need to do.
Julia Fallon: So that’s the design divide, right? And then we have the use divide, which is, I want to make sure that my kid is using it in an active way, not in a passive way. Right? They’re not just doing test prep or, you know, remedial stuff. She’s actually using it, like you said, to create, to collaborate, to do research.
Julia Fallon: Like they’re using it in an active way to be able to demonstrate knowledge and understanding. So I think for me, yeah, it’s just trying to figure out what kind of system did I create in my space? But as a state leader, we wanna make sure that we are getting dollars for professional learning, whether it’s from the feds or from state budgets, or, you know, cybersecurity.
Julia Fallon: You need to have actually [00:40:00] secure and safe access, right? You can’t just have it open to everything. How do we make sure that we have infrastructure? As a sector, we have been woefully under-resourced, right? In terms of infrastructure, both the physical infrastructure, but also, like, human capacity.
Julia Fallon: So how do we, if you want students to come out with skills, then you better be investing, right, in the ed system to do all of that kind of stuff. But our leaders are really coming together right now. Uh, you know, many states have put out AI guidance; now they’re like, okay, now I need to implement.
Julia Fallon: Right? I need to, now that districts have the guidance and they have sort of a container to work with what’s next? So what, what’s the state role? We often ask that question all the time. What’s the state’s role? Because the state can’t do it all. Right? And again, we want it to reflect a local community. So how do you help a small rural district in Washington state be able to do this when they only have, you know, 600 kids total?
Julia Fallon: And that’s even big for some of the schools that we have, but they only have 600 kids. So how is the community, how do we support them? Maybe they don’t [00:41:00] have an IT, you know, instructional tech person. Can they use a regional one? Can the state provide some sort of support? How do we do that? Because the larger districts obviously have more resources when it comes to that, but smaller districts don’t.
Julia Fallon: And, you know, I don’t remember what the stat is, in terms of, like, these 20 million students are in rural and smaller communities, right? And that’s a whole untapped market. And I know the ed tech sector, and developers, we haven’t talked about developers, they overlook that because it’s not a volume buy, right?
Julia Fallon: But you could actually go in and do a lot of fidelity work on your product and impact, and contribute to research on whether or not it works in a smaller school district with, you know, a hundred kids. Does your reading curriculum work, you know, or your platform and everything else? So our group, our folks, we try to use the state trends report and also just things that are coming top of mind.
Julia Fallon: We have a very active slack space, so thinking about how we use technology to connect everybody to one another. We’re at conferences as well and we bring our community together. But trying to have our ears to the ground, like what’s coming [00:42:00] down and how can we support research and evidence is one of those things.
Julia Fallon: AI is one of those things. Cybersecurity. And then we have some state-level programs like E-Rate and Title II that we have, you know, our hands in. We help states be able to come together to ask those types of questions so they can support their districts a lot better at the end of the day.
Brett Roer: Julia, you said so many things I wanna go deeper on, but I also wanna make sure we give you some equity of voice at the AmpED to 11 podcast.
Brett Roer: You know, we always allow our guests to, you know, reverse it. It’s your turn to ask my Rebecca and me any questions you might have. You get one shot at this. You know how to master a stage and rock a crowd. So let’s go. What do you got?
Julia Fallon: I think my question is, how do we keep sort of AI leadership from becoming another compliance exercise instead of a capacity-building one, right?
Julia Fallon: Versus, I’m gonna check this off ’cause it’s the shiny thing, and if I can show that I have AI and smart boards, I am a modern school. [00:43:00] I wanna move beyond that, right? I wanna move beyond, like, you’re actually creating spaces to build that capacity, right? You’re building the capacity of your teachers, you’re building the capacity of your school leaders and everything else to be thinking about how to use technology in a way that, I don’t wanna walk in and see a smart board being used the same way as a blackboard, in essence, right?
Julia Fallon: Like, how do we get beyond that? With AI, how do we make sure it’s not another compliance exercise?
Brett Roer: Rebecca, we checked the box. You go first or second, you choose.
Rebecca Bultsma: I usually go first, Brett, so I’ll let you go
Brett Roer: first on this front. I always say it’s really hard to follow and now it’s gonna be even harder to lead off.
Brett Roer: Okay, so I think you just said it, Julia. We were actually, I was just talking with someone and they were like, for their parents, for example, they’re like, well, we have some parents who, you know, think one thing, we have others who don’t want it at all. And I said, why don’t we create a space that’s called “I hate AI, but I’m willing to try,” and, like, just show them why it’s relevant to how to be either a better parent, or just what’s the thing in your personal life as a busy adult.
Brett Roer: And [00:44:00] so one thing that’s a little different about compliance with AI leadership, and this is if we get it right, obviously what I’m saying is, it’s so impactful, if a leader already has the right mindset for leading, how much it could complement and empower them. And it’s easy for me to say that, I’m sitting here, but when you teach someone that, and then you explain that it is a replicable process and mindset, and the technology exists.
Brett Roer: That you have to get them to that moment where immediately, it’s not like you just taught me one magic trick. You taught me something that I can replicate and do in different problem sets that I have. So regardless, that’s with anyone with AI, whether it’s an AI leader or not. And then the last thing is, everyone should take Rebecca’s amazing course called, like, Gen AI:
Brett Roer: Everything the K-12 Educator Needs to Know, because everyone that takes it says, wow, I understand blank, and now I feel empowered to do X so I can serve Y. So like, those are the two things you need to do, and that’s it. And [00:45:00] then we solved everything. Rebecca, good luck following that one.
Rebecca Bultsma: Oh, I think like you’ve gotten to the core of it, I think you need to figure out, um, you need to identify people’s pain points and speak to exactly what their actual needs are instead of just trying to ram it down their throats.
Rebecca Bultsma: Right. Like it, in my experience with the thousands and thousands of people I’ve, I’ve talked to about this, it’s just a matter of figuring out one little way. That it can be useful even in one little aspect of your life, even if it’s your personal life. And that is almost like a, a little magical gateway into being a little bit more open-minded about it.
Rebecca Bultsma: And it’s like that with everyone, right? We look at differences in society and you find one thing in common or something, and it opens the door. And so I think, but also I think we have to, like, acknowledge people’s hesitations. They’re usually grounded in very, um, real and valid, uh, reasons for hesitation and reasons for maybe not wanting to dive in headfirst. As you mentioned, [00:46:00] the best people we see doing this are the ones that are taking a minute and, um, really evaluating all the options.
Rebecca Bultsma: And so I think, uh, acknowledging those things and really, um, not buying into the hype and trying to push the hype at people, going in very, very measured with, yes, here’s some opportunities, here are some risks. And what are a couple of the biggest pain points we hear about consistently, and how can we solve just one little thing?
Rebecca Bultsma: I’d say the other thing, uh, is, as you mentioned, just bringing students into the conversation. Like, really, just, uh, some of the best guests we’ve had and conferences we’ve been to have this, like, one kind of teacher or EA who’s just, like, passionate about AI, who starts a little club with the students, and then the students end up telling the board about it or teaching the parents about it, and teaching the superintendents about it in a way that makes it engaging and fun and approachable and helps that light bulb moment happen.
Rebecca Bultsma: So I think we just have to stop just like [00:47:00] everything else, thinking there’s one way to do it, making it a compliance thing and that it’s great full stop period, end of story. And just going into it all a little bit more open-minded and maybe in non-traditional ways. ’cause as you mentioned, like things rolling out technology and rolling out policies is just always been done in a very specific compliance kind of checkbox way.
Rebecca Bultsma: And I just don’t think that’s gonna work here.
Julia Fallon: I have another question. I have another question though I wanna ask because it’s related. What does responsible un-adoption look like when the tool doesn’t align to our values? Because I think that in education, we tend to carry things for a long time.
Julia Fallon: When we probably could have put them down. And let’s say a community, and maybe it’s students, because I’m hearing a lot from students, like they don’t wanna use it. So maybe the community decides they don’t wanna use it. So what does un-adoption look like when it doesn’t align with our values?
Rebecca Bultsma: Well, that’s a great question, because, you know, Larry Cuban, I don’t know if you’ve ever read any of Larry Cuban’s stuff, but he talks about how, uh, oversold and [00:48:00] underused this technology has been, shoved at people, and it was all supposed to revolutionize it. Like the whiteboard was supposed to change education forever, the smart board, right?
Rebecca Bultsma: Like, uh, one-to-one laptops were gonna revolutionize education forever, and people just aren’t seeing that. But it’s not like we’re getting rid of computers. Maybe some people are, but I think that needs to be a conversation too. And it probably starts with being very intentional about the tools we even bring in, in the first place.
Rebecca Bultsma: Like there’s over 2,000, an average of 2,000 ed tech tools in every school district, which is crazy, and all the risk is assumed by the school district, which is also crazy. And the amount of money that’s being made off of education and taken out of education is crazy in my mind. So I just think we need to be very, very intentional about what we’re adopting, how we’re rolling it out.
Rebecca Bultsma: We’re not just grabbing things because this is the trend, or we are feeling behind, or it promises through hype to solve all of our problems. ’cause historically we have not seen that work. And so I think we need to be mindful of that. And I [00:49:00] think that’s why you have to go in with very clear benchmarks and expectations that you are grading regularly if you’re getting the outputs that you want.
Rebecca Bultsma: And if it’s not working, don’t make it 3000. Like start cutting your roster. You know, just start actually just sticking with what works because the money that’s being hemorrhaged out of education right now into private industry makes me ill. And so I just think we need to be very, very intentional about what we’re adopting and why.
Julia Fallon: I do, I do see that sentiment out there and everything else. Along with CoSN and ISTE and Digital Promise and CAST and InnovateEDU, about two years ago we started a conversation about how could we help. Because, and I’ll use this as a bird walk, I know on my own, when I walk the floor, I have a certain set of questions I always ask, you know what I mean, about privacy and about accessibility.
Julia Fallon: And I’m like, how do I help the average person that’s walking a conference floor ask those questions and get beyond the marketing hype, right? Or beyond the, you know, the superintendent got invited to a steak dinner and now all of a sudden we have a $2 million deal. And I’m not saying that you [00:50:00] can’t still have those sales practices as part of your thing, but I would say to the ed tech developer community, like, really listen, figure out what my pain points are. And I’ve been on record already.
Julia Fallon: Like, we don’t need another LMS, please don’t build another LMS because right now no one has the wherewithal to actually migrate to another LMS, but try to figure out what the pain points are for the one that somebody might be using, and then solve for that. Right? Or what kind of things are you trying to solve for and come in from that.
Julia Fallon: I share the same sort of thing. We’ve wasted a lot of money, and it’s hard to ferret out exactly where technology has impact. It’s just part of the infrastructure. I don’t say, well, how does the bell schedule impact the kids’, you know, learning and everything else? That’s what drives me crazy, you know what I mean?
Julia Fallon: At the end of the day. So I share that sentiment as well and everything else, but I do know that we put out those quality indicators. We’re hoping that people are starting to get them, and then hopefully products can start to validate against some of those indicators to show that they actually meet the standards that we put out there.
Rebecca Bultsma: It’s just [00:51:00] a lot for school leaders to worry about, right? They have to really vet these tools, and like you said, in small districts, they’re not trained to be procurement officers, but they have to take on all this FERPA risk and all of this liability risk as school leaders who are not trained to do this, and then they’re expected to do these metric follow-ups to even know, uh, whether the tools are working or not, um, and pay out millions of dollars potentially to these companies and keep track of all 2,000 of those subscriptions.
Rebecca Bultsma: Like, that’s not working. They need support and they need help with that too, right? Like it’s a lot to ask of people.
Brett Roer: The adoption methodology I would take on, if we wanted to think of AI as a way to build community and empower people, right? When used correctly. So, like, when I have supported ed tech companies as a consultant and outcomes weren’t matching intentions, the first thing I would do, you know, pre-AI, is type and take low-inference notes and reach out to the end users, the teachers and/or students.
Brett Roer: Now with AI, you know, you have your metrics, you know how much someone’s using a [00:52:00] product, and you know what the outcomes it was supposed to solve for should have been with whoever the end user was. Let’s say students. So if it’s meeting expectations, great, it helped move the students where you needed to. But if it’s not, I would just interview those students and say, like, it looks like you’re not activating this enough.
Brett Roer: Why? Especially if it’s an anomaly with that product and a student’s performance versus another one. Or especially about, like, usage. ’cause then there might be a specific reason why students aren’t engaging with that tool, and you have that answer. And then knowing this as a, you know, parent on a school board, people do not like waste.
Brett Roer: And they would love to hear, hey, this product didn’t deliver on metrics, and our kids are explaining why. Well, now it’s an easy answer. And like, that’s what would get people up in arms. Our own kids see this doesn’t work. That’s a very common-sense way, and you can get that and you can get buy-in from your community.
Brett Roer: ’cause then they’ll see you’re not about that steak dinner. Sure, maybe it got you to try something, and you’re not gonna just continue to do it because there’s a steak dinner on the table, which is, you know, the more ethical thing. Um, [00:53:00] so something like that could really be a, a clean, easy way to do it. And I do wanna shout out the guidelines that you provided.
Brett Roer: I remember last year at ISTE, it was the first time I’d ever seen like a very clear vendor expo hall checklist of questions you should ask based on your role. And I was so impressed by that. So thank you for bringing that to this kind of scale because yeah, that’s, that’s not what educators go to school for, right?
Brett Roer: They have, they believe in stuff that people say will help kids. So thank you for building that.
Julia Fallon: You’re welcome. Procurement. Not very sexy, but it gets the job done. So we’ve been also working on, like, we put out a guide in relation to those quality indicators around what kind of questions can you be asking around those indicators as part of the procurement process.
Julia Fallon: Because we know procurement officers also are not, you know, well versed in the tech space either, but we wanna be sure those questions come in the RFP or the, the proposal so that whoever is evaluating it then does have answers in one place.
Rebecca Bultsma: And I think there’s a lot of things that historically, um, haven’t been [00:54:00] as much of an issue in procurement, like for old tech things. But the amount of student data that has been collected, bought, sold, and shared by the ed tech industry, when students can’t consent to that, um, that doesn’t exist in a lot of the procurement practices as a consideration.
Rebecca Bultsma: And so I just think there needs, again, like for context, I do my research in the UK, right? Where there’s, you know, the GDPR and all of these data privacy protections. And so this is something I think about a lot. But, uh, yeah, I do think there’s, there’s a lot to think about, and it’s a lot on school leaders, and there needs to be more.
Rebecca Bultsma: Like you’re providing resources to help support them. So thank you for that.
Julia Fallon: But, but also giving school leaders, like, the voice too, to say, I’m all about data minimization. Like, I, and I always use my kid’s example, when she was in, you know, third grade, where field trip forms come home like every five minutes ’cause they’re taking ’em out, right?
Julia Fallon: And I’m like, why am I constantly giving you all my health card information? Emergency contacts, like, that probably hasn’t changed very much since the start of school. At least have me verify it. You just need to know if she can go to the zoo, and I can check something. I don’t need to have [00:55:00] it on a piece of paper that God knows where that piece of paper ended up after I filled it out with all of this, like, sensitive PII on it, you know?
Rebecca Bultsma: Yeah. Well, what I’m, what I’m thinking about is these ed tech tools that collect all the student data, uh, collect all of the data of the student learning process, all of this information about kids, that some of them are shared with like 2000 partners. And then you can’t just get rid of that tool unless you know upfront that you get that data back or what happens to that data or how are we protecting student data like on a broad scale.
Rebecca Bultsma: Because without data, there’s no ai and it’s like the most valuable currency that exists right now. And we’re paying ed tech companies in money and in data, and then they’re making money on the backend of it, a lot of them too. And so it’s just a, a bigger consideration that didn’t used to be a consideration as part of the procurement process, but now that, that is the number one currency right now, you know, for any tech company in ed tech is how do we get data?
Rebecca Bultsma: How do we get more data? Um, it’s just another thing, you [00:56:00] know, broader ethical issues to be thinking of. Boring for everybody but me, I’m sure
Brett Roer: not at all. That’s why people tune in.
Rebecca Bultsma: For the data, for the data talk
Brett Roer: for the rich discourse. I mean, this is what it’s for. You, you aren’t afraid to dive into the tough challenges ahead.
Brett Roer: We’re almost at time, but our last and final question, Julia, this is really important. We’re gonna end on a very high note, okay? And afterwards, I’m gonna unveil my new closing line for the podcast. And Julia, it was not inspired by you, but you’ll appreciate more than anyone. So Julia, our last question is, right, we’ve talked about the challenges.
Brett Roer: Let’s leave on some bright spots, right? You truly get to work across the nation with some of the top leaders. Take a moment and just, who do you wanna shout out? Give flowers to? These are the people that are helping to build the next generation to get an education right? Give some flowers to people that just other folks need to know about, should reach out to, should research, should study, should listen to their podcast.
Brett Roer: Who do people need to be learning from today to move the work forward that you wanna [00:57:00] shout out right now?
Julia Fallon: I will always shout out my state leaders collectively as a whole. So if you don’t know who your state ed tech director is, contact myself or look at our website or our, you know, you can do a little research yourself on Google, but you probably should know who those folks are.
Julia Fallon: They are often doing the unsung work of trying to move the needle forward and there’s good stuff happening in all states. So I will say that, you know what I mean? I know people, it’s, it’s politically charged right now, but there’s good work happening in all states and we wanna make sure that everybody gets their flowers for that type of unsung.
Julia Fallon: It’s government work. It’s not very sexy. It’s, you know, not very recognized as, as well. But I think of folks like Bre Urness-Straight out of the state of Washington here. You know, leading both the development of AI governance, but also really trying to understand how to create an ecosystem from a broadband perspective and access to everything, right?
Julia Fallon: Just trying to move the needle here in Washington state. I think about folks like the folks outta Utah, both Rick Gaisford, [00:58:00] who I joke around as my ed tech dad. There’s a whole story behind that. But he has really been one of those longtime state leaders that has really laid the foundation for what Utah is able to do.
Julia Fallon: And I know that Utah right now is having its own political moments, um, with technology. But they hired Matt Winters, who’s an AI specialist that sits at the state level. And you think about Utah, and I often say that Utah is leading the country in ed tech strategy, and people are very surprised by that.
Julia Fallon: But they have continued those investments for lots and lots of years around how to make sure that, you know, communities are connected, that educators are supported, that students are gonna get the skills and they’re gonna get the experiences using technology as a whole. I think about folks like Dorann Avey, who’s probably wearing, you know, multiple hats in Nebraska.
Julia Fallon: Also our current SETDA board chair, where they have really focused on professional learning and how they can provide a state role to get educators what they need. And, and they have this huge Canvas instance where it’s, it’s top-down, [00:59:00] bottom-up, and they have some instructional designers that help, you know, vet for quality.
Julia Fallon: So it’s high-quality stuff, but this is a way, it’s an innovative way. Very not sexy though, but it definitely helps, uh, build the capacity of all of those teachers that are out there. I think about, like, Brad Hagg out of Indiana, who’s leading as the ed tech director. He’s very well versed in the cybersecurity space.
Julia Fallon: You know, can, can speak about that both on the national and the state level, and how states can actually help provide cover for an entire state, you know, smaller districts all the way there and things like that. Um, I think about some other folks that are not necessarily in my immediate circle or my membership community, but I think about, like, a Rob Dixon out of Wichita Public Schools,
Julia Fallon: who blows me away with all the AI stuff that he is doing, where I’m just like, Rob, I really just need a couple minutes of your time, like, just to get some tutorial kind of stuff about how he’s using it. Not only for productivity, right? Like, I think a lot of us are using it for productivity, but really thinking about how to use it from a systems capacity, and making sure [01:00:00] that schools in his district are gonna be, you know, well equipped to be able to use it across the board and from a system standpoint, right?
Julia Fallon: And, and that sort of thing. So I love talking to him. I love talking to, like, Mike Conner, who’s obviously left a district but now has started his own company. Just, just my own handful of folks that I like to just, not test ideas with, but to be able to be like, hey, I’m cranky about this. And they challenge you, either your thinking, or they give you something new to think about, but it’s in, it’s in a very collaborative and kind of safe space to have those conversations as well and everything else.
Julia Fallon: So yeah, those are some of the folks that I would, like, call out. Jacob Kantor’s been a great connector in getting the ed tech developer community to really understand where states are coming from, and to try not to keep building stuff that we don’t need.
Julia Fallon: Like really listen to what we do need and then build for that so that we can spend less money and things like that. But yeah, those are some of the folks that I am keeping eyes [01:01:00] on in the space and everything else.
Brett Roer: Bouquets. Bouquets. And I think you have found a kindred spirit in Rebecca when it comes to AI and society, really, probably more.
Brett Roer: She needs someone to vent to. You two would be kindred spirits, I think, in, like, moving forward and then being like, okay, and now what do we do? So hopefully this is another member you reach out to. That was incredible and a whole lot of flowers. All right, so Julia, this is my new sign-off. So we were just at a school and they are famous.
Brett Roer: The superintendent brought crumb cake to one of their first superintendent’s days when they were a new superintendent and it was a hit. So I went back and it was Versha’s birthday. And this crumb cake truly was like one of those things that you’re like, how does this exist on this planet? So I tried to find it so I could bring it to the school because I know the superintendent likes to end.
Brett Roer: We could celebrate Versha’s birthday. It didn’t work out, because they were like, we don’t sell pound cake, the place that gave me crumb cake. And as we left, I told Versha, you know, I [01:02:00] almost bought a jumbo cannoli, but it would’ve been kind of crazy ’cause there was like eight people, and she said, I don’t know if this aged birthday is really, like, deserving of a giant cannoli.
Brett Roer: And I said. Every day could be jumbo, cannoli worthy if you try. So that’s my message for all the AmpED to 11 listeners out there. When the world gets you down and you’re like, I don’t deserve a jumbo cannoli in life. No. Make every day a jumbo cannoli day, and you’re gonna have a pretty interesting life.
Julia Fallon: I’d say I, I wholeheartedly agree with that.
Julia Fallon: Sign off. I, I even just think a regular cannoli is there’s reason for a regular cannoli every single day. Brett and I share our love of cannolis, so this is where that’s coming from. But jumbo for sure, but a regular cannoli. I could eat one every day.
Rebecca Bultsma: Just outta curiosity. How big is a jumbo cannoli? Like, don’t the
Brett Roer: I will send pictures.
Rebecca Bultsma: Right.
Brett Roer: You know, I’ve put it on LinkedIn, but it usually is about, it’s a really good question. I dunno.
Julia Fallon: I feel like it’s two football-.
Brett Roer: No, I’d say it’s arm’s, arm’s length. Yeah, it’s like 30-something in there. It’s [01:03:00] awesome. Yeah, like it’s awesome. And I’ll send you pictures. I’ve done it now like four times in 2025.
Brett Roer: I hadn’t. There’s a whole backstory, Rebecca. We’ll film a whole thing about that, and Julia, you can be on the pod. It’s called Holy Cannoli. Anyway, that was a long-winded way of saying you deserve a jumbo cannoli every day. You don’t have to have one every day, but make your life that interesting. Signing off on the AmpED to 11 podcast.
Brett Roer: Take care.