Improving Teaching Outcomes & Classroom Mental Health Using AI

Description: This week we’ll be diving into the role of AI in education with Craig Jones, CEO of Formative. Formative is a web app for classrooms that gives teachers the virtual tools necessary to engage, instruct, and assess students to optimize teaching outcomes. In this episode, Deep and Craig discuss how COVID-19 helped facilitate the adoption of automation in the education space, students' mental health in the context of digital learning, and how platforms like Formative may help mitigate some of COVID-19's impact on the education world.

Craig begins by discussing how he became passionate about education, what motivated him to found Formative, and how Formative improves on traditional learning through asynchronous instruction, teacher-student interaction, and longitudinal analysis of student growth.

Learn more about Craig here: https://www.linkedin.com/in/craigcarterjones/
and Formative here: https://www.formative.com/

Listen on your preferred platform: https://podcast.xyonix.com/1785161/11436844-improving-teaching-outcomes-and-classroom-mental-health-using-ai-with-craig-jones-of-formative

[Automated Transcript]

Deep: Hi there. I'm Deep Dhillon. Welcome to Your AI Injection, the podcast where we discuss state-of-the-art techniques in artificial intelligence with a focus on how these capabilities are used to transform organizations, making them more efficient, impactful, and successful.

Welcome back to Your AI Injection. This week we'll be diving into the role of AI in education with Craig Jones, CEO of Formative. Formative is a web app for classrooms that gives teachers the virtual tools necessary to engage, instruct, and assess students to optimize teaching outcomes. I thought you could get us started by telling us a little bit about your background in education and, you know, what inspired you to start Formative.

Craig: Thanks first, uh, for having me. Really excited to, to get to talk today. So my name's Craig. Um, I'm the co-founder and CEO of Formative, and, uh, right outta college I joined a program called Teach for America, which placed me as a middle school teacher in South Los Angeles with pretty minimal training, you know, a summer's worth of how to become a teacher and a lot about the institutional problems that I, I'd be facing.

But, but really nothing could prepare me for, for what I saw when I was in, um, my placement school in Los Angeles. I taught there for four years and, uh, really loved it, loved every minute of it, but it was very challenging. To put it in context, one year only 14% of my students were at grade level in mathematics, and the majority were probably far below grade level.

And that was the best year of students that I ever had. Um, wow. And so it was definitely an eye-opening experience. Those students were incredible, and I, I built some things in my classroom that helped me make sure that they had really incredible growth: 86% of those same students ended up scoring at or above grade level in science.

So 14% in math were at or above grade level, 86% in science. And I think one of the most, uh, eye-opening things for me was just how much the system was failing these kids, who were otherwise really great kids, and that propelled me to wanna do more in education. Um, after my fourth year of teaching, I, I finished with the last three years having 99th percentile academic growth, as measured by the district, based on how the students were projected to score versus how they scored each year.

Deep: And was that with the same kids each year, or was it just that your methodology improved?

Craig: Uh, each year you get new kids. Uh, my first year wasn't super good, um, but still was above average. But years two, three, and four were, were special. And I probably would've kept teaching, to be honest, but I was, ironically, pink-slipped, which is the district's way of saying you've been let go just 'cause of budget cuts. And, um, I was the youngest teacher, and that was the whole equation. It was just seniority-based.

And so I ended up being very fortunate that I had some connections at the district, and I, I became a fellow for the superintendent in Los Angeles Unified. And because of that opportunity, I also ended up getting to help launch some products in my first ever real private sector job, which was at Belkin. They just brought me in to kind of be the voice of the teacher.

I was only showing up every Friday, but giving a lot of feedback on each of their products, the software and hardware that they were building in the education space. And that was just so amazing. They, they had such a wonderful setup, and they're so affiliated with Apple that everything they'd do has incredible design.

They just know how to do things really professionally. And doing that for a year really gave me the confidence that I could start my own company. I was fortunate enough during that time to be going to UCLA, getting an MBA, while also, uh, being fortunate enough to, to be roommates with an incredible co-founder, who at the time I didn't know would be my co-founder, but my partner in crime, Kevin McFarland. And we just talked every single day

about what we were thinking and seeing in education. And he ended up joining the board of a charter school. And, uh, and, and that's really how Formative came to be: these nightly conversations about the future of education, and my experience and his experience growing up as the son of a coal mining family and how education kind of shaped his, his whole trajectory.

Yeah. And that, that's kind of how I got passionate about education.

Deep: Yeah, I mean that's, uh, that's a, that's a great founding story. So maybe it would help if you walk us through your product experience, maybe from a student's vantage, and then, um, from a teacher's vantage, and then we can start to talk a little bit about, you know, how AI might help things.

But I think if we first understand like, what does a student see maybe without your product and then with it, you know, pre and post, that might help us get anchored in the conversation. 

Craig: Absolutely. When you're teaching, you know, back in the old days, you would print out assignments for your students, have the students complete the assignments, then they'd pass them in to you maybe in the middle of class or at the end of class, and you'd try to see, in that assignment, if you can identify some misconception that the student has. Then maybe you give feedback to the student, write some comments on the piece of paper, hand it back to them the next day.

And then that was kind of the, the cycle.

Deep: And just, just for, um, our listeners' benefit, what, what grade levels are we talking about here?

Craig: I, I think that's universal education. That's kindergarten through beyond, you know, secondary school. That's, that's usually the loop of education: you, as a teacher, assign things to students, you teach, you lecture, and then you give feedback. They do the work, you correct it.

Deep: Yeah. Assignments, or you give corrections on exams or quizzes.

Craig: Exactly. I think that's the status quo that's been going on for a while. And then in my classroom, when I was teaching, some new technology was starting to come out. Um, one thing was these systems called clickers.

They'd probably been out for a while, but I had some pretty fancy ones that students could actually, you know, type on, so you could, you know, get them to input more than just, like, an A, B, C, D type response. And, and that, that started to change the game to a certain extent: you're now collecting live data, you can kind of auto-grade things immediately.

You can present visuals to the class in real time, color-coding each student by whether they're getting things right or wrong. Also, with the advent of laptops in the classroom and students bringing in their own devices, you start to use things like, you know, Google Forms or different surveying tools. There was a moment there, uh, where you could picture the old school assignment dying down and feedback becoming more real time.

But it didn't take off as fast as I think people thought it would. Um, and that's largely just because the internet wasn't super ubiquitous in schools for, for quite some time, and devices were very expensive. And then all of a sudden, you know, Chromebooks started to become really cheap in schools. Cheaper than textbooks.

Deep: Yeah, I think the price got down below a hundred bucks at some point, maybe eight, nine years ago.

Craig: Something like that. And, and then laws started to be rewritten where you could actually use devices, and the internet started to get pretty good, to be honest. And then things slowly changed. I, I will say there was a, a solid three, four year period where devices did exist in classrooms,

the internet was good enough, but teachers were still going to the copy machine, printing out assignments, and doing kind of the old school model. Then, probably largely due to COVID, right at that same time, everything kind of flipped upside down. Different platforms made it really easy to give assignments digitally.

Students kind of started to become digital natives and expected it.

Deep: Yeah. So is that the kind of crux of the experience: the teacher's logged in and the students are logged in, the teacher can formulate a problem, students can work on it, and as they're working through stuff, the teacher can sort of see their interim work and maybe see aggregations,

like things that are common across the students in the classroom? Is that the context for the product?

Craig: That's where we started. I mean, most specifically, when my co-founder and I were observing classrooms, we would see that they were still printing sheets of paper. So the very first thing we did was the ability to have teachers upload documents.

And then you could add input fields onto the document, somewhat like a DocuSign, uh, experience. So you could just digitize that work that the teacher was otherwise gonna take to the copy machine. That really triggered a lot of people to go, "Wait a second, okay, I can use the devices that were in the back of my classroom now."

"I don't have to start from scratch." So that was the first thing we did, you know, helping people digitize these assignments. And the real innovation we had there is that when they would send a link out to their students and the students would fill out those documents, the teacher would see, in real time, each student's response broken down by question number.

Deep: Yeah. And so one of the things that we read about and hear about are all these studies that say just the mere presence of a smartphone on a desk, um, let alone an open laptop, you know, takes away our ability to focus and direct our attention to where it needs to be directed.

Did you see that in the classroom? And what did you do to mitigate some of those effects that are pretty widely published now? 

Craig: Yeah, that's a good point. I think there are certain grade levels where it's still a little questionable whether or not you need the devices as, as frequently, but, but definitely for the middle and high school students, and even upper elementary, I think they're, they're quite capable of balancing it.

There were a couple years there where I felt like if you were to tell a student to shut down the device, they would look at you, secretly wanting to open it up. But I think kids have become pretty mature over the years. They're so used to having devices at home that when the teacher says, "Turn it off," they just turn it off.

It's not like a luxury item anymore. The opposite is usually still true in most classrooms; I don't think, until total virtual learning happened, that students were being given an extreme amount of device time in schools. To be honest, the first classrooms we started to sell into were the opposite.

You would walk into schools and show them our product, and they would demo it out with their, their students. And I remember walking around the room, making sure things were working correctly, and a teacher would come up to me because they're looking at the live sheet of results, like the live view that they can project to show how all the students are doing.

And this one teacher looked at me and said, "Your site's broken." And I, I remember just feeling kind of really embarrassed for a moment and then saying, "Okay, well, what's broken? Let me go check it out." She's like, "These two students, see them? They're, like, my best two students, and there's no data for them."

And I, I remember feeling just, like, okay, well, let's go to those students and find out what's going on. And you walk up to the students and they look at you like they just saw a ghost, because it's the first time that the teacher had ever really checked on those two students. They probably came in with very high test scores.

You know, really top-tier students who got their work done, but they weren't used to being held accountable in class. And so when the assignment was sent via Formative and they weren't working at the start of the class when everybody else was, the students were shocked when they realized that the teacher could tell on our site that they hadn't even started yet.

The teacher didn't realize that; in the moment she thought the site was just broken. But because everything is real time on our website, it was a wake-up call for the students. So I don't think those students were screwing around just because they use computers every day; they're just used to it.

I don't think the computers were the problem in education. I think the lack of transparency in terms of learning, and the fact that when you're teaching you're usually only able to focus on a few students at a time, means that a lot of students slip through the cracks. Usually the middle-performing and top-performing students don't get as much attention as the students who need the most attention. It's kinda like that 80/20 rule: you spend 80% of your time on the 20% of students that maybe need it the most.

Deep: So is that kind of the, the main thrust, though: to optimize that real-time, in-class assessment of a set of students and, uh, provide that real-time feedback that historically maybe came from the teacher wandering around and looking over shoulders, and now can happen a little more efficiently? Is that the crux of the improvement that you guys are going for, or are you touching on other aspects of the student-teacher interaction, like when they go home, you know, in the homework arena, or other areas as well?

Craig: Yeah, we, we do two things. We do the in-class or asynchronous kind of instruction, where you give an assignment to students, the students respond on their devices, you see live student work, and they get live feedback. So that's kind of bucket one of what we focus on: really just making sure that anytime you want to give your students an assignment, a quiz, homework, a little check for understanding, that process is just a little bit more interactive and the teacher ends up with more data.

And then the second thing that we do is more of a longitudinal analysis of student growth over time. So with each assignment that you give to your students, we start to notice trends in their learning. You aggregate each of those daily assignments, and then you can start to predict how that student might perform by the end of the year.

Our real innovation, I'd say, is that second part. But in order to get that daily classroom learning data, we had to build a platform that teachers used. And then the newer element for us is actually helping school districts, now that their teachers really like this core platform, to also run some of their more traditional assessments in the same platform, to kind of calibrate whether or not the daily learning is leading to end-of-year outcomes.
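To make the longitudinal idea concrete, here is a minimal sketch of how daily assignment results might be aggregated into a growth trend and extrapolated toward an end-of-year estimate. This is purely illustrative and is not Formative's actual model: the data shape, the linear trend, and the clamping to a 0-1 scale are all assumptions.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean

@dataclass
class AssignmentResult:
    student_id: str
    day: date           # when the assignment was completed
    standard: str       # learning standard the items were tagged to
    pct_correct: float  # 0.0 - 1.0

def growth_slope(results: list[AssignmentResult]) -> float:
    """Least-squares slope of pct_correct over time (change per day)."""
    results = sorted(results, key=lambda r: r.day)
    xs = [(r.day - results[0].day).days for r in results]
    ys = [r.pct_correct for r in results]
    x_bar, y_bar = mean(xs), mean(ys)
    denom = sum((x - x_bar) ** 2 for x in xs) or 1.0
    return sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / denom

def forecast_end_of_year(results: list[AssignmentResult], days_remaining: int) -> float:
    """Extrapolate the current trend to an end-of-year estimate, clamped to [0, 1]."""
    latest = max(results, key=lambda r: r.day).pct_correct
    return max(0.0, min(1.0, latest + growth_slope(results) * days_remaining))

# Hypothetical daily results for one student on one standard.
daily = [
    AssignmentResult("s-001", date(2022, 9, 6),  "7.EE.B.4", 0.40),
    AssignmentResult("s-001", date(2022, 9, 20), "7.EE.B.4", 0.55),
    AssignmentResult("s-001", date(2022, 10, 4), "7.EE.B.4", 0.60),
]
print(forecast_end_of_year(daily, days_remaining=30))  # rough projection, not a guarantee
```

A production system would obviously weight item difficulty, standards coverage, and far richer signals than a single linear slope; the sketch only shows where the daily classroom data would plug in.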

Deep: So, kind of taking the conversation into the AI arena: on the predictive side, for a given student, being able to predict, um, you know, their longitudinal progression, I imagine that's helpful, particularly if you can anchor it in some kind of, uh, understanding of why they're likely to maybe underperform, or perform suboptimally relative to their personal optimum.

Yeah. Is, is that one place that you're looking at AI usage? Like what are the areas where you're seeing the inefficiencies, where gathered data can really benefit from some machine learning? 

Craig: Well, I think what you just described is the holy grail: to collect data every single day and use that data to really give feedback to all the key stakeholders in a child's education, whether it be the students themselves, parents, educators, school leaders, even potentially content publishers or other players in the space, about, you know, what's working, what's not working, and then be able to showcase or recommend things that lead to accelerated learning.

I don't think we're at the stage yet where that's how we're using AI specifically, but that is the holy grail. There's a lot that goes into that, and we hope to be the player that comes out on top of that in the long run. But, but we're still at the data aggregation stage.

Um, we're trying to collect billions of student data points per year on, on learning, and only some of that is probably going to be labeled in a way that will be useful. I think that's the next frontier. I think the first place that we were looking at using AI was more on helping teachers give faster feedback to students, um, more the instructional

or the real-time side, the things they do every day, as opposed to forecasting and providing long-term recommendations on student learning.

Deep: Need help with computer vision, natural language processing, automated content creation, conversational understanding, time series forecasting, or customer behavior analytics? Reach out to us at xyonix.com. That's X-Y-O-N-I-X dot com. Maybe we can help.

Yeah, let's dig in there. I mean, that, that sounds good. So, thinking off the top of my head, you've got a teacher, you've got 20, 30, however many students in the room; it's still a challenge to efficiently give feedback to, you know, all 30 students. I mean, maybe the mistakes they make cluster, and there are certain common mistakes, and you can sort of jump in with an example to the class.

So that would be kind of a scenario where you could sort of learn from the students all at once and intercede in the moment. I mean, is that a legit scenario? Are there others that kind of come to mind?

Craig: Yeah, I think that's probably the most straightforward use case that we have for AI.

I mean, we, we recently participated in a competition where we presented a proposal that would've looked at students solving mathematical drawing problems. Mm-hmm. So we were just gonna do something very simple, like a solve-for-x type mathematical problem, and, and the students basically have a canvas, like a whiteboard, where they can sketch out how they would solve for x.

There's really never been, to date, any way to take student drawings or student work, not just the final answer but the actual scratch work that got them to the final answer, and give them live feedback while they're working on it. We've calculated that that's the majority of what students do in math anyway.

Yet all that input is usually thrown away, and the only thing that most traditional models look at is whether they got the right answer at the end. We think there's a lot of opportunity on that front, and there are a lot of really smart people who have done pretty amazing things paving the way for some great, great innovation to, to take place there.

Um, and we think we're in a pretty good spot to be the best platform to, to advance that, because we already have millions of student drawing responses; the largest group of our users are mathematics students. We just have a lot of this already happening on our platform. And on something like a drawing, where you can never give feedback unless the teacher looks at every single response to coach the student up, I think it's perfect for, for a learning model.
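To illustrate the kind of step-level feedback Craig is describing, here's a minimal sketch that checks a student's solve-for-x scratch work line by line and flags the first step whose solution set no longer matches the original equation. This is only an illustration, not the model Formative proposed; it assumes the handwritten work has already been recognized into equation strings, and it uses sympy's solveset for the equivalence check.

```python
from sympy import Eq, S, symbols, solveset
from sympy.parsing.sympy_parser import parse_expr

x = symbols("x")

def parse_equation(text: str) -> Eq:
    """Turn a string like '2*x + 3 = 11' into a sympy equation."""
    left, right = text.split("=")
    return Eq(parse_expr(left), parse_expr(right))

def first_bad_step(steps: list[str]) -> int | None:
    """Return the 1-based index of the first step that changes the solution set, else None."""
    target = solveset(parse_equation(steps[0]), x, domain=S.Reals)
    for i, step in enumerate(steps[1:], start=2):
        if solveset(parse_equation(step), x, domain=S.Reals) != target:
            return i
    return None

# Example: the student drops a sign going from step 2 to step 3.
work = ["2*x + 3 = 11", "2*x = 8", "x = -4"]
print(first_bad_step(work))  # -> 3, so a coach could point at that exact line
```

A real system would also have to handle the drawing-to-text recognition, multi-variable work, and partial credit, which is where the machine learning actually lives; the sketch just shows the feedback hook.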

Deep: Yeah. I mean, it's almost like that's been a problem for a while. I mean, you know, you have a multi-step math problem, and, just to kind of dig in a little bit, a lot of the feedback you get is just right or wrong, like, you know, the final step is right or wrong, and there's sort of a need for granular feedback from step to step.

I mean, I remember, I'm just thinking back to, like, you know, elementary school or middle school, I remember getting dinged constantly for just skipping all the steps, or skipping a good chunk of steps, or not following whatever methodology somebody was teaching for, I dunno, long division or some algebraic sequence of steps. I mean, I remember being really frustrated with the need to show work at a level, and in a way, that was sort of prescribed, when there were alternates, you know, where you can just sort of jump ahead or whatever. And I, I remember being really frustrated with that as a kid. Like, I don't know what you want, but here's, here's how I get to an answer.

It feels almost like, in order to assess students, we force them into a methodology for solving basic math, and maybe not even so basic math, but we, you know, we force them into a methodology just so we can get a glimpse into their mind. And I'm wondering how you think about that.

Craig: Well, I mean, first, your example is a really interesting one.

I think both of us were probably pretty good at math growing up, and hopefully you were just being slowed down by the problems that you were being given, and maybe a sophisticated system would already be advancing you to questions where you're not just, you know, able to say the answer off the top of your mind, where you actually have to go through some steps, and then it might be able to help figure out that, that learning gap.

But definitely that's not the case for most people in, in the US. I mean, to put it bluntly, scores in the United States for mathematics are absolutely abysmal, and so any system that can help lead to better outcomes is a worthwhile endeavor. There's a lot of research that shows that the two most effective ways to improve learning are giving feedback and tutoring.

Tutoring, yeah, is very difficult to scale, so that's where kind of an auto-coach could come into play. Giving feedback you would think would be easy, but if you can't give live feedback on student drawings or mathematical work, which is, again, where a huge chunk of their, their responses lie, then basically you can't give feedback and you can't tutor; you're not doing the two things that research proves are most effective.

Yeah, so I, I think that in general there are just such abysmal results right now in the United States, and honestly around the, the world, that any system that could provide better feedback and better coaching or tutoring would probably lead to monumental outcomes.

Deep: Digging in a little bit on the tutoring front, because I think a lot of folks just jump, you know, right to machine learning or automated solutions. But, you know, just kind of going back to your scenario: teachers, you know, looking at a real-time view of students, students are inputting and getting feedback. It seems like there's an opportunity there to augment the teacher with, you know, maybe a population of tutors, for example, uh, who aren't in the classroom.

It also seems like you might augment the teacher with prior known lessons and aggregations from this particular exercise that's being done. So in other words, on the one hand, you can get efficiencies from humans that aren't necessarily sitting there, and maybe asynchronously they're providing some assessment.

And if you can maybe combine that with the fact that most, you know, sets of 30 students learning a particular lesson in a particular demographic are generally gonna make the same kinds of mistakes and get stuck at the same kinds of places, you can sort of bootstrap that. And it does feel like you could offer a lot of functionality to sort of augment the teacher,

so that they're not literally staring at the output of every student and having to think it all through from scratch, but they're maybe being flagged in near real time.

Craig: Yeah, I, I hope that's what our platform already does really well, and that's why our platform is really popular with teachers.

We have hundreds of thousands of teachers who use Formative for that very purpose. But I, I do think we're at this stage where, you know, you've got 30-plus students in your classroom, you kind of have a flag happening that these students are the ones you need to focus on. But even then you're, you're already overwhelmed in terms of, okay, I know I have a lot of students that need my help.

I could use some help giving them extra coaching, uh, or feedback or more timely interventions. To your point about tutoring, there are some really great companies on the tutoring front; it's just expensive and hasn't proven to be super scalable. But Sal Khan spun off, out of Khan Academy, another company called Schoolhouse.world that's doing some really exciting stuff on the tutoring front.

Free tutoring, basically, for anyone. And honestly, their, their biggest innovation is probably being able to vet tutors in a scalable manner, because if you're gonna provide tutors, real human beings, for anyone who wants to come in anytime, there's a lot of liability in just making sure that you're presenting people who are appropriate and also capable of tutoring the subject to students.

And they've done some world-class stuff on that front. But in general, I think these are top-tier places of focus, and you don't necessarily need to use AI yet for everything here. But we're in a pretty sweet spot now with AI, and I do think very little of that has trickled into the classroom, and it's, it's time that it does.

Deep: Yeah, it seems like, um, if you think about the types of things that you can sort of, in an automated fashion, help correct and maybe classify and, and give feedback on, there's math, and there are also things that take a lot of a teacher's time, like long-form content, you know, reading essays for example, or even short, paragraph-style answers.

Those are things that are time-consuming for a teacher to read through. There are also examples where, you know, we tend to excel on the natural language processing front: we can maybe not think at such a high level with respect to, say, arguments that are not well formed, but there are a lot of, you know, correlated variables, um, that have predictive value.

You know, for example, if you've got consistent grammatical mistakes, consistent punctuation mistakes, you know, and, and you combine that with maybe significant spelling mistakes, maybe you still formed a very strong essay, but most likely you didn't. And so there's a risk of these algorithms, you know, grading accurately, let's say, but not really providing much value

other than honing in and saying you spelled things incorrectly, your grammar's off. All of those are important things to get right, but that's not necessarily the same thing as somebody really having an in-depth read of an essay and having substantive things to say, as opposed to just detail-oriented things.

Not to diminish the need for proper grammar or spelling and, you know, punctuation, but, you know, those are things that, as computer scientists, we handle because we can.
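As a toy illustration of the correlated-signal point Deep is making, here's a minimal sketch that "scores" an essay purely from cheap surface features. It's deliberately simplistic and not how any particular grader works: the features and weights are invented, and, exactly as Deep warns, such a score can correlate with quality without saying anything substantive about the argument itself.

```python
import re

def surface_features(essay: str) -> dict[str, float]:
    """Cheap, surface-level signals that tend to correlate with essay quality."""
    words = essay.split()
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "long_word_ratio": sum(len(w) > 6 for w in words) / max(len(words), 1),
        "repeated_word_ratio": 1 - len({w.lower() for w in words}) / max(len(words), 1),
    }

def rough_score(essay: str) -> float:
    """Combine the signals with hand-picked weights into a 0-1 'quality' guess."""
    f = surface_features(essay)
    score = 0.4 * min(f["avg_sentence_len"] / 20, 1.0) \
          + 0.4 * min(f["long_word_ratio"] / 0.3, 1.0) \
          + 0.2 * (1 - f["repeated_word_ratio"])
    return round(score, 2)

print(rough_score("The cat sat. The cat sat. The cat sat."))  # low: short, repetitive
print(rough_score("Compulsory education reshaped economic mobility "
                  "because literacy became a prerequisite for skilled work."))  # higher
```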

Craig: I, I find, in long-form writing, there are kind of two different things currently happening. There's your Grammarly type: you know, give me the red squiggle or the green squiggle, mm-hmm, as I'm typing something, real-time feedback really designed for the student. And then you've got your more "I'm the ACT and I need to grade a couple million writing passages, so I need a model that can give me 99% accuracy scoring the next writing passage to this prompt."

Those are probably the two main things that I see there. Maybe a little bit falls in between, but there really isn't much of a corollary in the math world that I've seen yet; math is still just, did you get it right or wrong? There isn't a red-squiggle system when a student's solving a math problem.

That's, that's the part that we were trying to innovate on, which is a challenging thing, but it definitely can be done on, like, a solve-for-x problem, something where you kind of know where the student's supposed to be getting to. I don't know what the mathematical equivalent of writing essays is, but proofs, probably.

Deep: I don't either, but that, that doesn't usually happen in the younger years.

Craig: Yeah, I'm not super, super knowledgeable on that front, but, but the neat part with, uh, with math is that all those little inputs really do tell a pretty clear story. And also in education, most of the items are tagged to learning standards, so there have been some very, very big winners in terms of platforms that, if a student uses them enough, will probably lead to really good outcomes in learning. But the teacher is so essential, in US K-12 education especially. There are companies like Rocketship Learning, I don't know if you've heard of them, but basically their model is to put a kid in front of a testing booth, you know, put blinders on both sides of their, like, almost cubicle, where they can only focus on the computer.

Give them a headphone set and have 'em stare at the computer and work all day, and you'd get good math outcomes at the end of the year. But, man, you were basically turning a kid into a... There's a strange balance in education where you have to really involve the teacher, not just for the simple psychological value of a teacher, but also for the motivation of the student, for the long-term love of learning.

Deep: Well, I mean, I think we saw a lot of those side effects during COVID. I mean, like, you know, we took, well, however many millions of students, probably a billion students if you look globally, and we stuck 'em home and we stuck 'em in front of a glass box and said: be motivated, uh, get your work done.

Meanwhile, we have just a massive number going into depression, you know, developing anxiety, all kinds of problems, exacerbated by the fact that they're just not in the social setting they need to be in. And, uh, they're just alone. And we've all seen the, you know, the meme videos of the depressed kid just sitting around in front of their computer all day.

I'm wondering, you know, when you think about the role of the teacher, what lessons did we get from the COVID era? Here in Seattle, for example, for, we don't know the exact reasons, but for whatever reason, the school district banned the use of video cameras. Uh, I think it was an equity issue or something, and I don't know if that was district-wide or, you know, just my, uh, kids in, in their school.

But it struck me as a spectacularly stupid thing to do, I mean, because all of a sudden you take a kid that normally would've woken up, taken a shower, gotten ready, uh, just 'cause they know someone's gonna look at them, and now they're just, like, lying in bed till three in the afternoon, barely moving. It just seems like a complete, colossal failure.

Like, I don't really see anything positive that came from that experiment, other than it just highlighted how horrible, at least, the public school system was at adapting. But I don't know: did you see something different? Did you see areas where, with remote learning, we learned a lot from COVID? Or, all I saw was a lot more prescriptions for Prozac.

Craig: I will say that our teachers seemed to be fairly happy. I mean, our numbers were insanely strong during COVID, and all we were getting was feedback from people basically saying, "I would've quit if I didn't have your product." And I'm not really saying this to endorse our product in particular, but we built something that was really good at getting student responses.

If you think of the worst teacher you've ever had, it's probably one who just lectured at you the whole day, and Zoom is very conducive to talking and kind of turning your microphone on mute and just being there, and maybe having homework. It's not a very interactive, engaging model. What we built is pretty much the opposite.

It's all about getting student feedback. The teacher can either present something where they're moving the students through some kind of activity, but at each step there are questions and places for feedback, or the teacher can give things to students asynchronously, where the students work through at their own pace, but the teacher's constantly giving feedback because they're noticing students moving in the wrong direction.

They're getting alerts on our site that something's not right. So I think that we, we personally did not have that same issue during COVID, but, but nationally, the data's all there that there was tremendous learning loss. And I think the worst part of this is that it usually most affects areas that already have a lot of issues to begin with.

I, I think we're gonna be seeing this for decades.

Deep: Oh, yeah. It's like the python swallowing the, the deer or whatever, you know. The data impacts of COVID, not just in education but in healthcare, everything; we're gonna be looking at this for 30 years, 40 years probably, you know, until that whole, until that generation ages out.

Craig: The biggest, the only part I'll add to that is that it really weighed on teachers. And then, to your statement about, like, Prozac, there were a lot of just national issues with depression, and, oh, they're still there.

Deep: I mean, we have just a massive dearth of therapists.

Craig: And that's now leading to a lot of educators leaving the industry.

I, I talked to someone from the University of Florida; they had a single person enrolled in the secondary education program that they were looking at.

Deep: Oh my gosh. Wait, unpack that for me a little bit. Like, why are they leaving? Because, um, the students are so depressed or something? I, I missed the link there.

Craig: I mean, it wasn't just students that were depressed, I think.

Deep: Oh, the teachers are depressed. Yeah. Okay.

Craig: Students are depressed, teachers are depressed, you throw in a whole bunch of political weight; it's amazing how much crazy stuff is going on. Florida has all kinds of whackadoodle politics going on there.

Think about all the different board meetings, I mean, just on every single imaginable topic. It has not been very fun to be a teacher. And then, a lot of teachers really do like to see their students have academic outcomes.

Deep: Yeah, of course. I mean, I imagine that's the most exciting part.

Craig: It can be, but, but if you look at the last couple years, it's been about going a step below that, you know, in terms of, like, Maslow's hierarchy of needs.

You're, you're really focused on survival; you're focused on making sure the psychological needs of your students are met. And a lot of teachers, I mean, they, they do an incredible job, they're amazing, but I'm sure they didn't sign up to also be therapists and psychologists and, yeah, deal with the drama.

So it's not a very glamorous role to be in. And then they're kind of like police officers in a way.

Deep: Like, teachers are sort of the front lines of interaction with the bulk of our youth population. I mean, you know, a lot of these kids probably, you know, don't get a lot of other interaction.

Craig: But, but if you look at it, there are so many statistics that show that we're losing a lot more teachers than we're putting into the system, and I don't see that slowing down.

Deep: Oh, so you feel like that's one of the forces driving some of the automation and sort of facilitating some of the scaling properties in your business? We have to do something to get the outcomes that we need with fewer resources, in essence.

Craig: I wouldn't have made that connection yet. I'm sure that's inevitable in some cases, but we just want to keep teachers from feeling like they need to leave the profession. If you just help them have a happier day, support them, then we can potentially slow the bleeding and keep more teachers in the schools.

I don't, I don't wanna automate away a teacher, ever. I wanna make sure that the teacher gets the necessary tools so that they aren't, you know, at home grading for four hours.

Deep: Oh, I see where you're going. So some of the undesirable functions of their job, at least maybe what you're hearing, is the stuff they do when they leave the office: they've gotta go grade, and you're helping pick up some of that slack for them, basically.

Craig: Yeah. We think teachers are superhuman. They really are. I, I'm just blown away by how many amazing teachers I've seen over the years. But ideally, you can take away the least enjoyable parts; you can automate all the parts of their job that they shouldn't have to waste time on, so that they get to spend as much time as possible just being that super teacher that students, you know, remember 30 years later as the best teacher of their life. But if, if they're spending a lot of time on other things, then, yeah...

Deep: Like, like non-student-interaction things, basically.

I'm gonna kinda, like, switch gears slightly. One of the criticisms of the modern education system is that it's not modern at all. It's a model that came about in, like, you know, the 1800s, at a time when we didn't have the ability to, you know, optimize learning for individuals, as opposed to

trying to move a chunk of students, like 30, let's say, or 50 or whatever, through a system where, where, you know, there are levels set, and you're trying to move all 30, you know, through a set of levels at once. And so it's one of the critiques that you, you read a lot about. I think one of the visions that maybe Sal Khan puts together is, you know, if you've got a student that's, that's really advanced, then they shouldn't be stuck sitting there learning things that are very straightforward for them.

And if you have a student who's on the other end of the spectrum, who's really struggling with basic concepts, you shouldn't keep shoving them through so that they just continue to be bewildered, because they haven't gotten the thing they're stuck on. Meanwhile, you have a teacher who's having to teach to, you know, everything in between these two extremes, and maybe the students cluster to some extent.

Like, you've got a bunch at the low end and a few at the high end and, like, you know, a bunch in the middle. But the general argument is, if we fast forward, let's say, 10 or 20 years, and we want this, like, optimal view of learning, students should be able to move at their own pace in this worldview.

They should be able to not be alone when they do it. They should have coaching and guidance that feels very meaningful to them, where they're motivated, but ultimately we're, we're not trying to see how they perform relative to, you know, some standard body of the population; we're trying to see how well we can get them to perform relative to their potential.

Do you agree with that vision of the space? And, like, if not, why not? And if so, why? And, and what does that world look like, you know, if we fast forward?

Craig: Yeah. It's such a great question, and a deep question. I mean, to be honest, I, I can foresee a future where you literally can just download stuff to your brain, and all of a sudden you walk out and you know Spanish.

Like, I, I do think that that could be a reality in our, in our world, with how much technology will advance. So when I think long-term vision, maybe that is the world we live in. But realistically, in the, in the next, you know, 10, 15, 20 years, you're looking at a lot more systemic problems than where your, your question immediately, uh, was posed. First of all, Sal Khan:

I think he's an incredible visionary in our space who met people where they needed to be met. Like, think about all the people in the world who skipped the textbook era, immediately had their first, you know, cell phone or device, and were able to find basic, high-quality algebra instruction, and they maybe couldn't even afford a teacher, couldn't afford to go to school, or lived too far away from a school, or were in a country that didn't even have one, or just had a bad teacher.

Deep: I mean, like, with my kids this is a recurring theme: they'll have a bad teacher who can't explain something, and they spend their energy trying to figure out the appropriate lesson in, in Khan Academy, uh, that will match it. Like, that seems like a really basic tool that should be there.

You know, where you can just take whatever your teacher's trying to teach and map it to Khan Academy's. Because, you know, Sal Khan, the guy's impressive; his ability to communicate both very simple and very complicated topics with an incredible amount of breadth is, is really mind-boggling. And, suffice it to say, whoever your kid's sitting in front of, he's better than them at explaining whatever that topic is. I almost just feel like they need to be routed to him in an appropriate context.

The teacher they're sitting in front of should be the one who's, like, guiding their education, helping them with motivation: like, why do they even care?

Craig: So when I was teaching, I told you how bad the math scores were, which made me feel, even though my students were doing great in science, not really very proud, because the students were going into high school with extraordinarily low math scores, which is probably more important in terms of their ability to get into college or whatever next, you know, direction they're trying to go. So I tried to augment some of their math instruction, 'cause it wasn't happening at the school, and so I would assign Khan Academy videos, even though I wasn't the math teacher, and

And so I would subscribe Khan Academy videos even though I wasn't the math teacher and. Give them homework for math cuz it felt like it was progressing. And Khan Academy for the longest time didn't even have tools for teachers. It wasn't really their philosophy, it was really about the learner. But I think they realized that, yeah, you'll get the students who wanna learn, but you gotta facilitate things happening through the current system if you want to get widespread adoption.

So then they, they built a lot of tools for teachers and for schools to more easily assign, you know, the videos or the lessons to students, let the students work through them, and, and let the teacher actually get that data. That was a big innovation for them. I, I have to imagine that the vast majority of their usage actually happens through a teacher pushing the materials to the student, because there just aren't students that go home every day saying, "You know what,

I'm really bad at this specific trigonometry problem, so I'm gonna go watch YouTube videos to figure this out and do practice problems." That's just not how most people work. So by using the teachers as the, the layer to reach the students, I think that was very successful for them. But then I, I actually talked to him a couple months ago, and he, he said that their usage went up by two and a half X during COVID,

but then decreased this last year to lower than their 2017 number. That was very eye-opening.

Deep: That's fascinating. Why would that be, I wonder? You would think everybody has kind of discovered it and they'd stick around.

Craig: His response, I thought, was that it was COVID fatigue, in terms of, like, Zoom fatigue: you've been on devices too much, let's disconnect for a year.

But he actually made a statement that he thought it was more that teachers just weren't teaching as much, that there were so many things going on in the school that, you know, an Algebra 1 teacher wasn't finishing Algebra 1 anymore. And, and there's a certain element of truth to that, in terms of there not being very much accountability for the last couple years.

Deep: Oh yeah. I mean, we took, like, you know, my daughter, you know, we just augmented her with, I think we went from, like, one hour of tutoring up to, like, five or six, because there was, like, no instruction happening during the, you know, COVID time. And it just wasn't working at all.

So we, we had to, like, augment it completely with tutoring. Virtually nothing was being covered by teachers. Now, filter that through the lens of a 16-year-old, right? Like, that's what I'm hearing from a 16-year-old, which could or could not be true.

Craig: I, I know that things really went off the rails. We've witnessed so many amazing teachers doing things way above and beyond and having growth even despite COVID.

But I do think that if you average out everybody, there probably was a lot less learning. I think what's wild, though, is that this last school year was mostly an in-person school year, but it also had a lot of challenges.

Deep: Well, you got masking, you got COVID shutting things down, and, like, teachers getting sick and going out, students getting sick.

A lot of anxiety just being in that setting, you know.

Craig: I think anxiety's the biggest one, because, for the most part, it was kind of a back-to-reality moment. There was a lot of optimism about that, but then the year itself was still very challenging. It wasn't like you could just undo what had just happened.

You're coming back to students now that have a lot more going on.

Deep: Well, plus you're behind. You know, a lot of the students are just behind.

Craig: But we are seeing that outcomes seem to be better this year, from what I've heard from most of the districts that we survey and talk to. Yeah, there are definitely a lot of challenges.

I just think you have to keep meeting people where they are and keep moving them forward. That's what we try to do. We just observe teachers every single day, watch what they're doing, and try to make their lives five to 10% better, you know, every month.

Deep: How do you measure whether teachers' lives are getting five to 10% better every month? Do you, do you take data?

Craig: I mean, a lot of it's based on usage data and making sure that they continue to refer us to more colleagues. And we do things like net promoter score and customer satisfaction, so we do a lot of things that are more high-level. We also do an efficacy study every year to see if using our site leads to better outcomes for the students.

There are different ways to, to make sure that you're, you're getting better as a company. A lot of that, though, is just building what the teachers really need, being real with where they are, not trying to build for the future. There was another company, Knewton, that built these incredible computer-adaptive testing models where basically students, again, could sit in front of a computer, answer five questions, and then the assessment would adapt based on their learning:

give them a harder question or an easier question. Mm-hmm. And so this company was building something really amazing, but not meeting the market where it is, and almost kind of trying to cut the teacher out of the equation. But along the way, while we're meeting teachers where they are, our goal is to continue to collect learning data.

To continue to be able to provide information to the, the key stakeholders with the learning data, we really have to do just two things: be able to measure where the student is each day, and then be able to show some kind of acceleration of learning, you know, forecast where they would've been without us, just based on prior test scores and prior outcomes,

and then show where they actually are trending based on using our platform more and more. And so if you can accelerate learning, that, that would be the real way to prove that you're, you know, leading to five to 10% better outcomes every single month.

Deep: Have data? Have a hypothesis on some high value insights that, if extracted automatically, could transform your business? Not sure how to proceed? Bounce your ideas off one of our data scientists with a free consult. Reach out at xyonix.com. You'll talk to an expert, not a salesperson.

Um, so what are, you know, like, what are some of the specific features of Formative that, you know, have improved teaching outcomes, and how are they made possible or potentially improved with, you know, AI?

Craig: Anytime we've got so many variables in the equation, I don't know if we'd say a specific feature is what led to more learning.

Yeah. Normally what we do is we just look at some pre- and post-assessment and then see if using Formative more led to better outcomes, or if using certain aspects of Formative more led to better outcomes. Right now we're about to do an efficacy study on the types of feedback that teachers give to students.

Students do a pre-assessment, maybe on a certain number of questions on a standard, then we give feedback in various different ways: do you give audio feedback, give written feedback, give feedback within five minutes of the student work, or feedback within one hour of the student work? You know, different ways.

We measure that feedback as the kind of catalyst, to then see if students who were given feedback faster ended up having 10% better outcomes on the post-assessment. We're trying to figure out stuff like that, but it's tricky. I mean, every one of these efficacy studies in education is inevitably just really, really difficult.

For each setting that you place the students in, you have to expect you'd have different outcomes.
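Here is a minimal sketch of how a comparison like the one Craig describes might be run: per-student gains from pre- to post-assessment, compared across two feedback conditions. This is not Formative's actual study design; the data shape, the invented scores, and the choice of a Welch t-test are all assumptions for illustration.

```python
# Compare post-assessment gains for students who got fast vs. slow feedback.
from scipy import stats

def gains(pre: list[float], post: list[float]) -> list[float]:
    """Per-student improvement from pre- to post-assessment (same student ordering assumed)."""
    return [b - a for a, b in zip(pre, post)]

# Hypothetical scores (0-100): feedback within 5 minutes vs. within 1 hour.
fast_gain = gains(pre=[55, 60, 48, 70, 62], post=[72, 75, 63, 84, 78])
slow_gain = gains(pre=[57, 59, 50, 68, 64], post=[66, 65, 58, 74, 70])

# Welch's t-test: is the gain under faster feedback significantly larger?
t_stat, p_value = stats.ttest_ind(fast_gain, slow_gain, equal_var=False)
print(f"mean gain (fast): {sum(fast_gain) / len(fast_gain):.1f}")
print(f"mean gain (slow): {sum(slow_gain) / len(slow_gain):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

As Craig notes, the hard part isn't the arithmetic; it's controlling for the many classroom-to-classroom differences that a simple two-group comparison like this can't see.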

Deep: It feels almost like you gotta do double-blind studies like they, like we do in healthcare.

Craig: And then the downside there is your product. I mean, we release updates every single day of the product.

Deep: Which is good, you know, because then, you know, you've got, you've got a nice rate of evolution.

Craig: But, but then that efficacy study is instantly out of date the next day.

So I think it's difficult to do efficacy studies, but you gotta keep trying and you gotta keep getting smarter about it. So far we've had good outcomes there, which has been really nice to see: statistically significant correlations between our product usage and student results. But realistically, I mean, there's a lot going on in this space.

We're kind of a Swiss Army knife for teachers. They can use us for the really informal, very fun, live checks for understanding, almost use our site as a more interactive, engaging presentation, or almost like a game. And then you can also use it for the more rigorous common assessments, things that are standards-aligned, the things where, you know, large bodies of the student population in a school district are taking the same thing.

So you can actually look at that data. But the fact that all of that's happening on our own platform is really what we think is the most exciting. If you collect all this data and then run it through some pretty sophisticated modeling, you might notice things that have never been discovered before: that actually, you know, doing X is more important than you thought it was.

We hope to be able to find some cool innovations or cool discoveries in education as we collect billions upon billions of student data points and use them in a very mature way. And we take it very seriously that we, we get all these student responses and we're good stewards of that data. But we, we do think that's gonna really unlock some, some key innovations in the future.

Like, that's my favorite part with AI: you know, I was pretty inspired by AI solving heads-up poker, something that I never thought could be solved, and just watching how it actually plays the hand versus what you thought was the best way to play. And I, I hope some more things like that will be discovered in education.

Deep: Thanks so much for, for coming on the show, Craig. It's been a, it's been a great conversation. I, I'm just gonna ask one final question: let's fast forward 10 years out. Paint for us a vision of the best possible, you know, AI in the educational arena, maybe from a student's vantage, and, and tell us what it looks like.

Like, what are they learning about? How is it, and how does it maybe differ from our current sort of standards-only-based approach?

Craig: I would hope that your children, when they come home, will know exactly, every day, which concepts, which learning standards they 100% know, which they know with 80% confidence, with 65% confidence, and see that every day: like, okay, I definitely know this,

I do not know this, or I knew this, but the confidence we have that I know this concept is degrading. And so they can kind of see the entire breadth of what they're supposed to know, and a good visualization of, you know, where there's a potential misconception. Currently, the best that we have for that is, at the end of every summer, you as a parent probably get a report with last year's test scores, and you probably don't even get it broken down by each of the learning standards, but you might get the big concepts, and, and that's your, your barometer on student understanding.

Yeah. You need to be getting that in real time, every day. And then as a parent, you should probably be able to get some access to that pretty frequently, hopefully ultimately on a similar timeline. Obviously we don't want parents helicopter parenting on every detail of this, but I think each of the stakeholders should have some visibility into this learning data that allows them to be more targeted.

So you should be able to come home and say, "I need a private tutor on this specific thing, because that's the thing we have the least confidence that my student knows." And then if they're doing really well, they should be moving way beyond, you know, whatever is currently being taught in that grade and subject; if, if they've proven competence on all of that, they should be moving forward.

So I think that's definitely in the foreseeable future; I think AI can help with that. But one of the biggest challenges right now is that a lot of different data sets are floating out there. Teachers may be using Formative to do some of their assignments with their students, and they may be, you know, using 15 other platforms in their classroom.

We need to start to aggregate all that data together. Really start to, I mean, even first, know what you're learning every day as a concept: like, what, what were you supposed to learn today? What standards were you supposed to be taught? Communicate that as a, as a baseline, and then start to build up the case of what percent mastery you have for each of these different standards.

I think that that's what we hope to do. We think that with billions upon billions of student data points, and as they start to become more and more tagged to these standards, you'll have so much knowledge of each student's learning every day that you won't even need to take the test at the end of the year. It would be purely redundant and a waste of everyone's time.
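To make the per-standard confidence idea concrete, here's a minimal sketch that keeps a running mastery estimate per learning standard from daily item results and lowers the confidence when a standard hasn't been practiced recently. It is purely illustrative, not Formative's model: the exponential-moving-average update, the decay rate, and the example standard codes are all assumptions.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class StandardMastery:
    estimate: float = 0.5          # current belief that the student has mastered the standard (0-1)
    last_seen: date | None = None  # last day an item tagged to this standard was answered

@dataclass
class StudentProfile:
    standards: dict[str, StandardMastery] = field(default_factory=dict)

    def record_item(self, standard: str, correct: bool, day: date, alpha: float = 0.3) -> None:
        """Update the mastery estimate with an exponential moving average of item results."""
        m = self.standards.setdefault(standard, StandardMastery())
        m.estimate = (1 - alpha) * m.estimate + alpha * (1.0 if correct else 0.0)
        m.last_seen = day

    def report(self, today: date, weekly_decay: float = 0.02) -> dict[str, float]:
        """Daily report: mastery estimates, decayed for standards not practiced recently."""
        out = {}
        for name, m in self.standards.items():
            weeks_idle = 0 if m.last_seen is None else (today - m.last_seen).days / 7
            out[name] = round(max(0.0, m.estimate - weekly_decay * weeks_idle), 2)
        return out

# Hypothetical usage with example standard codes.
profile = StudentProfile()
profile.record_item("CCSS.MATH.7.EE.B.4", correct=True, day=date(2022, 10, 3))
profile.record_item("CCSS.MATH.7.EE.B.4", correct=True, day=date(2022, 10, 4))
profile.record_item("CCSS.MATH.7.NS.A.1", correct=False, day=date(2022, 9, 12))
print(profile.report(today=date(2022, 10, 5)))
```

A daily report like this is the kind of artifact a student or parent could look at; the end-of-year forecast Craig mentions would then be built on top of these per-standard estimates rather than a single annual test.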

Deep: Oh, you'll be able to forecast the, uh, scores quite accurately.

Craig: Yeah. And then maybe for a few years you'll be using the test just to calibrate whether or not the forecasts were true, but we think 10 years from now the test should be just purely redundant, and hopefully the forecasting will be sophisticated enough.

Deep: That's all for this episode. I'm Deep Dhillon, your host, saying Check back soon for your next AI injection. 

Deep: In the meantime, if you need help injecting AI into your business, reach out to us at xyonix.com. That's X-Y-O-N-I-X dot com. Whether it's text, audio, video, or other business data, we help all kinds of organizations like yours automatically find and operationalize transformative insights.