When ChatGPT was announced, many in the education community suggested that it would end education as we know it. Students would be able to cheat almost effortlessly. Over time, educators have begun to see the limitations of ChatGPT. We now see that ChatGPT lacks the capacity for critical evaluation, higher-order reasoning (or any reasoning, for that matter), or empathic engagement.
Becoming aware of these limitations has brought some calm to the crisis. It has given educators some confidence that they will be able to find human or technological ways to detect cheating and address similar concerns. In questioning their initial panic, educators now have the room to ask a different question, namely, “Might ChatGPT actually be good for education?”
Some think so. In the MIT Technology Review, Heaven (April 3, 2023) writes: “Advanced chatbots could be used as powerful classroom aids that make lessons more interactive, teach students media literacy, generate personalized lesson plans, save teachers time on admin, and more.” A variety of educators have advanced similar arguments: ChatGPT is here to stay – get used to it. From this point of view, the question is not, “How can we fight ChatGPT?”, but instead, “How can we best use it?”
The Wrong Question
I suggest, however, that “How can we use [insert new technology here] in education?” is the wrong question. In fact, even asking this question is one of the problems of education. It is the exact opposite of what we should be asking.
We do not say, “Here is a hammer. How can I best use it?” Instead, we say, “I am building a house. What tools and materials do I need to build it?” And of course, the answer to this question depends upon what type of house we want to build and what we take to be a good house.
Asking “How can we best use ChatGPT?” is like being given a screwdriver and then looking for a screw to turn. When we think this way, we find ourselves looking for screws when we should be thinking about the types of edifices we want to create.
That is, we should be asking, “What type of person and society do we wish to create? What do we want students to know and do?” Only after addressing these questions should we ask what tools and technologies we can call upon to advance our goals.
Some Truly Frightening Examples of What Happens When We Ask the Wrong Question
I want to show how the arguments made in Heaven’s article lead to what I regard as some frightening consequences. These are not consequences of ChatGPT – but instead consequences of believing that education can be enhanced by finding uses for ChatGPT. Here is a near-exhaustive list of the types of positive uses that Heaven and his interlocutors see for ChatGPT.
Note that the benefits suggested for ChatGPT are not answers to the question, “How can we better educate students?” Instead, they are responses to problems created by the introduction of ChatGPT and similar technologies themselves. Each proposed use is reactive rather than proactive.
Heaven notes that ChatGPT is forcing educators to rethink already existing problems in education. He writes:
Take cheating…[I]f ChatGPT makes it easy to cheat on an assignment, teachers should throw out the assignment rather than ban the chatbot (emphasis added).
This is an astonishing statement. Consider, for example, the essay. Should I not assign an essay because it is easy to cheat on it? Why assign an essay at all? We don’t assign essays simply to assess knowledge of a topic. We assign essays because the act of researching and writing them forces students to articulate their thinking, to reflect on the structure of language, to understand how to use language to create and express thought, and more. If I “throw out” the essay, I eliminate the very activities in which students must engage in order to learn how to think. What assignment, using the chatbot, will fulfill these goals?
Heaven notes that:
using ChatGPT to generate a first draft helped some students stop worrying about the blank page and instead focus on the critical phase of the assignment

This suggests that the first draft is not a critical phase of the assignment.
This is not so. The first draft is an essential part of the assignment. It is the time when active efforts to identify relevant points and organize them into a set of coherent thoughts take place. If the chatbot writes the first draft, the student is robbed of the opportunity to engage in these essential components of learning to think.
Still further, Heaven states:
Information that was once dispensed in the classroom is now everywhere: first online, then in chatbots. What educators must now do is show students not only how to find it, but what information to trust and what not to, and how to tell the difference. ‘Teachers are no longer gatekeepers of information, but facilitators’.
All new knowledge results from the refinement of existing knowledge. This is a basic developmental principle. To learn something new, one must assimilate novel experiences to existing knowledge, and thereupon modify that knowledge accordingly. Education is not a mere matter of helping students “find information”. To reduce the teacher to a “facilitator” is to ignore the central role that the teacher plays in identifying the types of knowledge that students need in order to build new knowledge.
It is tempting to think that chatbots will be effective tools to help students study.
In March, Quizlet updated its app with a feature called Q-Chat, built using ChatGPT, that tailors material to each user’s needs. The app adjusts the difficulty of the questions according to how well students know the material they’re studying and how they prefer to learn. “Q-Chat provides our students with an experience similar to a one-on-one tutor,” says Quizlet’s CEO, Lex Bayer.
This might sound promising. Flashcards are very popular with students (and even teachers). However, flashcards are not the best way to foster learning. They are tools to promote memorization. Indeed, as of this writing, Quizlet’s website features a video called “Memorize Anything with Quizlet Flashcards.” The accompanying text reads, “Digital flashcards make studying easier so you can memorize anything.”
While memorization is indeed an important first step for some learning purposes, learning requires the fostering of understanding. Memorizing is not understanding. To understand, it is necessary to connect ideas into a larger whole. If students are able to pass assessments using mere memorization, there is something wrong with the assessment. And any tool that treats memorization as a primary vehicle of learning does not serve students well.
But there is more. Heaven continues:
In fact, some educators think future textbooks could be bundled with chatbots trained on their contents. Students would have a conversation with the bot about the book’s contents as well as (or instead of) reading it.
Reading is an active process of breaking down a text, grasping its meanings, and constructing a structured understanding of a set of arguments, principles, or themes. In college, most problems in understanding are reading problems – the result of a failure to read deeply or to know how to read deeply. The skills and knowledge acquired through reading cannot be replaced by a conversation with a chatbot. Students need to learn to read.
And if that were not enough, consider this:
And, of course, some students will still use ChatGPT to cheat. In fact, it makes it easier than ever. With a deadline looming, who wouldn’t be tempted to get that assignment written at the push of a button? “It equalizes cheating for everyone,” says Crompton [a professor interviewed for the article]. “You don’t have to pay. You don’t have to hack into a school computer.”
And so, if students will inevitably cheat, it’s best to ensure that all students have equal access to cheating.
Should We Stop Worrying and Just Learn to Love the Bomb?
If you can’t beat ‘em, join ‘em. Well, that certainly would be the path of least resistance.
It is true: technology drives cultural change. It is neither possible nor desirable to eliminate technology. Yes, AI is here to stay. To be sure: we must teach our students to use it appropriately and for good.
To do this, our goals must come first. ChatGPT – like any technology – is a tool. A tool is never simply good or bad. It is only as good as the purpose for which it is used. We cannot let the means (our technologies) drive our goals. Currently, we are letting the technological tail wag the educational dog. In education and in life, the juggernaut of technology seems to be outstripping our capacity to regulate it and use it for good.
Technology will always be a central part of the solution to any cultural problem. The question is, can we develop the wisdom to use technology for good? Or will it use us?