What Will Happen to Educators With More Classroom AI Technology? A Better Question Is Probably: What Will Happen to Students?
More often than not, when we think of the looming age of automation at work, we imagine the faces of tired truck drivers and frustrated low-skilled retail workers. An image less easily rendered by our minds is the face of the highly trained educators currently teaching children all across the United States. As artificial intelligence (AI) technology makes its way into the classroom, teachers and futurists alike will be forced to ask a crucial question: Can advanced AI begin to address the ontic educational disparities that predate today’s use of such educational technology?
Recently, the New York Times (NYT) ran an article focused on the expansion of Summit Learning, a primarily automated teaching tool developed in Redwood City, California, which is meant to put students “in charge” of their own education. The article prompted me to visit Summit’s website. Once there, one quickly sees the essence of the goal behind the technology. The up-front marketing blurb states that “Summit Learning uses personalized teaching and learning to empower students to harness their inner drive for success.” I don’t know about you, but that sounds a lot like a statement we could have pulled off a corporate online-training website.
The NYT article on the Summit process highlighted some of the issues that have come into focus with the expanded use of this technological approach to classroom activities. Supporters of Summit Learning see it as a righteous and future-minded step that will optimize education and better engage students. However, detractors, among whom many parents, students, and educators list themselves, fear it’s a dangerous path toward automating away a profession that until now has been quintessentially human.
Though the merits of self-directed learning have been discussed for years, within the 380 schools where the Summit Learning program has been implemented, feelings regarding the program’s effectiveness have been somewhat mixed. The program touts an ability to better prepare 72,000 students for the rigors of postsecondary education by having them set and follow through on short- and long-term goals with less oversight from the educator. In other words, students are expected to work with more independence. Concerns lobbed against the program’s impersonal nature only seem to highlight the fact that Summit Learning tools can’t really address any of the underlying, and ultimately limiting, disparities that the student users may face. So while encouraging classroom independence through Summit may be a fine intention, it is difficult to foster students’ independence without attention to the personal issues that may actually restrict their ability to set and reach individual goals. Having tools that make teachers potentially even more disengaged from their students doesn’t inspire confidence that we are moving in the right direction.
The first indication of Summit’s limited ability to address ontic disparity within education is that teachers complain the tool has made them more bystanders than educators. As one educator at a Summit school told the NYT, teachers are encouraged to have only about 10 minutes of face-to-face time with each student every day. Some students told the publication that they miss interacting with their instructors. One doesn’t need to do much research to determine that the quality of interaction between teachers and students is a vital component in the latter’s social and academic development, both early and later in life.
So you might be thinking that if the teacher isn’t doing much face-to-face work, then maybe he or she is doing more group instruction. Not so. “In one school, we did not observe a single Summit math teacher engage in whole-class or small-group math instruction,” wrote researchers from the Johns Hopkins Institute for Education Policy. “Instead, teachers either completed work at their desks, and/or answered questions when students raised their hand. Finally, the lack of teacher surveillance of student progress in some Summit classrooms meant that students worked very slowly through the material.” Someone should have predicted that K-12 students, given the chance to act totally independently with technology in the classroom, might be tempted to move at their own pace.
In Kansas, the home state of the Summit schools featured in the NYT article, 50 percent of the kids using Summit Learning at school are on free or reduced-price lunch. This suggests that the students on whom the Summit tool is being guinea-pigged are those already potentially at risk. Couple that with the fact that Kansas ranks 29th nationwide in educational funding and that the state’s teachers are generally underpaid. Experimenting with Summit Learning on unsupported teachers and poor students could seem a lot like fashioning a tourniquet out of fishing wire and wondering why the bleeding won’t stop.
Granted, Summit Learning is still in its relative infancy, and it’ll be hard to know how things will shake out until they actually do. But as technology stands right now, there is still a laundry list of vital teaching tasks that even sophisticated AI can’t complete. Educating children full time may be one of them. Turning such a significant amount of pedagogical control over to a system that has yet to demonstrate a high level of uncontested effectiveness may not be a good idea. That is to say, many of the statistics that indicate any success coming with Summit Learning can also be countered by data highlighting exactly how the issues are far larger than a single learning platform.
For example, the Johns Hopkins researchers were actually examining the effectiveness of the Summit Learning program on students in some Providence, Rhode Island, public schools. Not unlike Kansas students, public school students in Providence are struggling: only 10 percent of them score at a proficient level in math, and only 14 percent score proficient in language arts. Despite Summit’s claims of success, the overall educational landscape in Providence still includes high schools with graduation rates below 70 percent. Given that, how much confidence is one supposed to have in Summit’s higher graduation rates when the Johns Hopkins study noted that many students in Summit classrooms were stalling on one screen while others were doing “well below grade-level work,” with only a small percentage of them doing work “at, or close to, grade level”?
The NYT article also observes that “Summit [technology] demands an extraordinary amount of personal information about each student.” However, even with all that data, the computer can’t teach a hungry child better than a human being can. That’s because the technology can’t account for the child’s existence beyond the keyboard clicks. For example, when the Summit AI technology is grading quizzes, it won’t say, “Hey, you’ve answered B four times in a row and your handwriting is totally illegible; maybe we should talk about whether you have eaten today.” Even if it did, the technology doesn’t address the fact that the kid will eventually leave school to go home to a house where there may be no technology or a full supper. It doesn’t address the needs of underpaid teachers trying to handle classroom challenges they know exist. It also doesn’t address the fact that the Kansas public school system (and yes, Summit schools are “public”) is “more segregated today than it was in the 1990s.” The technology is not addressing any of the human challenges that could stymie classroom success.
At this point, one can reasonably question whether it’s right to critique the program based on a handful of goals it wasn’t designed to meet. But these questions should be considered in light of the fact that Summit Learning has received more than $70 million in grant money.
Understandably, not every grant is meant to fund everything, but in a case like this, one should still mount something of an ethical inquiry into the overall nature of the funding. That is to ask: should a private company receive millions in funding to push an experimental program that can’t fix extremely tangible problems that could be solved with, you guessed it, more funding? One could almost be forgiven for thinking that this is an example of the grant gods blessing a curious new tech initiative with millions rather than investing more directly in the families that Summit Learning is supposed to help.
Moreover, Summit Learning has been cartoonishly dishonest about the number of schools that disenroll from its programming. The learning platform claimed that only 10 percent of its schools across 38 states quit using it each year, when in actuality that figure was much closer to 20 percent. Entire school districts in Connecticut, Pennsylvania, and Indiana have even dropped the program based on concerns that it’s essentially a technology-based social experiment on school-aged children.
Has anyone behind Summit Learning really asked whether it’s the disparities that seem almost inherent to public schooling in 2019 that are holding students back?
Currently, the long-term results of Summit Learning are unknown. But that still leaves one to question whether it can really help kids be more prepared for college or the workforce. Isn’t that the goal of school? If Summit Learning isn’t even addressing the issues that have contributed to millions of kids graduating unprepared for decades, then what can it really get done right now?
Those behind Summit Learning note that one of its main goals is to allow “teachers to do what they do best — mentor students.” How does the recommended 10 minutes of face-to-face time, which the NYT article confirms some students never get, qualify as good, or even adequate, mentorship? Nothing on Summit Learning’s website addresses those particular and tangible concerns or shortcomings. That makes the whole effort to supplant face time with human educators in favor of AI look like another way to sell Chromebooks and score big grants.
Automating any profession or process is an ordeal that will invariably come with potential ethical hiccups. That’s because AI requires testing. But do we as a society have the luxury to test on kids who already start at a disadvantage? Should the vulnerable be forced to have the quality of their education tested in this way?
To quote Jurassic Park, perhaps Summit Learning was “so preoccupied with whether or not they could that they didn't stop to think if they should.”