Convenient AI and Higher Ed
By Eric Chown

Eric Chown is the Sarah and James Bowdoin Professor of Digital and Computational Studies and the director of the Hastings Initiative for AI and Humanity. He has been at Bowdoin since 1998, when he was hired as a computer scientist. In 2018, he co-founded Bowdoin's Digital and Computational Studies program.
When I was in graduate school, the AI lab ran an "evil AI" movie series. Even then, there was never a shortage of movies to choose from, and there are many classics in the genre, from 2001 to The Matrix to the Terminator movies. Even the latest Mission: Impossible movies feature an AI antagonist. For my money, though, the single most chilling portrayal of AI in any movie comes in WALL-E. It isn't so much that the AI in the movie is particularly scary; it's what the humans have become in its presence. The movie asks the question, "When AI can do everything for you, what will you choose to do?" The answer is that the humans in WALL-E spend literally all of their time sitting in chairs amusing themselves. Having spent the last 10+ years helping to build the Digital and Computational Studies program during the smartphone revolution, I find WALL-E's depiction depressingly true. Time and time again, we see that convenience trumps everything, and AI is the ultimate convenience.
I believe that the threat of AI, especially in higher education, is less about "evil AI" and more about "convenient AI." At every turn the modern student is confronted by a dilemma: Do they do the right thing, which often requires a lot of work? Or do they do the easy thing, which requires little to no work and offers lots of short-term rewards? All of this is happening at a time when survey data increasingly shows that students are taking a more transactional approach to higher education, worrying more about preparing for the job market than about what they actually learn. So our challenge as educators is selling students on the idea that the payoff of putting in the work to learn is worth the effort.
So far, conversations around AI and higher education have mainly focused on issues like preventing cheating, restructuring assignments, and properly harnessing AI. These are all critical, but I think we also need to add motivation, and an understanding of why education matters, to the mix. A student who is purposeful in their learning is not likely to cheat and will be willing to take on even difficult assignments. For such students, AI can be an educational boon. But for students without such grounding, AI merely represents an endless temptation.