ChatGPT is Here to Stay — How Will AI Impact International Education? A Conversation with Mary Curnock Cook and Nick Hillman


Every day, technology occupies a slightly bigger role in our lives than it did the day before. In 2023 we’re seeing that take shape swiftly with artificial intelligence (AI) in several fields. One area that will be greatly impacted by this development is education. Generative AI tools, such as ChatGPT, have the potential to transform the way we learn, teach, and access information by providing personalized and interactive learning experiences.

At ApplyBoard, we’re working with AI tools to provide our school partners, recruitment partners, and students with an experience that plugs them into a seamless, information-rich interaction that’s simple to understand. From AI chatbot-enabled engines that are driven by natural language processors, to our enhanced search engine that uses machine learning techniques to point users toward relevant information, AI is driving some of our most forward-thinking product offerings.

We sat down with Nick Hillman and Mary Curnock Cook of ApplyBoard’s UK Advisory Board to explore the impact of ChatGPT and other generative AI tools on education. How will these tools impact student-teacher relationships, assessments, and the ethical considerations that guide each of them? Join us as we delve into the exciting possibilities and challenges of using AI in education.

AI has played a much bigger role in all of our lives lately, but that really jumped up a level when ChatGPT came on the scene this past November. When you first heard about this tool, what was your gut reaction to how it could impact education?

Mary Curnock Cook: Well, I’m an optimist, so I was excited by it, but also fearful that the education sector would find it very difficult to react quickly to such a powerful new technology. Some people have said it’s similar to the reaction when calculators, Google, and spellcheck came in, but I think ChatGPT is fundamentally different because it mirrors human interaction with information.

I was recently in the room for a higher education seminar where I asked the moderator to use ChatGPT to generate multiple choice questions for a subject we were discussing. Of course, it did so in a matter of seconds, spitting out very plausible questions you could test students with.

It then started to become clear to me how powerful a tool this could be for educators. If you can use generative AI to help you write topic guides, assessment items, and quizzes, you’re freeing up time to engage in the personalized aspects of learning that students want, need, and possibly don’t get.

Nick Hillman: Even though the material in ChatGPT isn’t always accurate, it consistently sounds human. It’s not what we’re used to hearing or reading when computers talk to us. So for me, that brought about three emotions. One was fear of how this tool could be used for plagiarism and how it could harm learning outcomes. When I was a teacher, it was easy to tell when students were using Encarta to get the answers for assignments. It’s not so easy to detect the use of these AI tools, even with software.

There’s also this excitement and novelty factor that comes with new technology. Many people in my life, from my son to my sister, started experimenting with ChatGPT in different ways to improve productivity.

Then you think of the massive advantages that this technology could facilitate. Research applications, process efficiencies, rediscovering information the world has forgotten about, and changing the way we learn.

There’s a huge difference between having ChatGPT write an essay that you hand in as your own and using ChatGPT as part of the learning process. Just like reading a textbook or using a calculator, this tool could allow students to come to their own conclusions in a more efficient way.


Leveraging ChatGPT is going to be a skill. It’s about how to use it properly to squeeze the most benefits and mitigate the risks. What’s the best way for students and teachers to develop that skill?

Mary Curnock Cook: Whether it’s in the faculties of education or online, I don’t know where it starts, but the people who know what they’re doing need to start using it. There needs to be a great sharing of potential use cases so people know what they need to practice and get better at. It’s just a matter of time before informal groups begin to form where people are sharing information and knowledge on all of this.

Nick Hillman: The easy answer is we don’t know yet. But as long as we think of this tool as being part of the learning process and not a shortcut to the outcome, I think that’s largely a positive. I suspect students and teachers are going to find their way on this together.

This is the worst version of ChatGPT that’s ever going to exist. There are so many developments that are going to extend the benefits of AI tools beyond words—image and data entry to name a couple of major areas.

But this is a shiny new tool and people are excited about using it. In order to keep up with the interest on the student side, educators are going to need to commit to understanding it. If students and educators have a common understanding of ChatGPT benefits and risks, they’re going to be able to integrate it into the learning process.


In what ways can AI tools like ChatGPT improve accessibility and inclusivity for students from diverse backgrounds?

Mary Curnock Cook: This is one of the most important potential outcomes of generative AI tools for me. ChatGPT and other tools like it have the power to democratize access to education. If the next generation of ChatGPT can answer students’ questions more effectively than human tutors can, then there’s a huge opportunity to bring down the cost of education.

This could be a game changer for the quality of the online education experience. Beyond the highly personalized content for students that AI enables, its ability to personalize the feedback and support we can provide students with is incredibly exciting.

Plus, this tool is going to grow exponentially in its features and use cases. What it’s going to open up for the world of online learning is making education not only more accessible, but also a deeper, more enriching experience that students will want to stick with.

Nick Hillman: I think it’s important that international students develop a strong understanding of the English language. If students write an inferior-quality essay and feed it through ChatGPT, they’re not necessarily learning the language better.

Different educational and cultural norms mean different students have varying definitions of what is or isn’t appropriate. So it’s going to be crucial to define what is or isn’t acceptable in terms of using these AI tools in the classroom and on assignments. But if used in the right way, AI absolutely has the power to make education less intimidating and more accessible for students around the world.


Rather than trying to fight AI tools, it seems that we will have to be proactive about managing the role they play in the lives of educators and students. Do you think schools could benefit from working with students on the best methods to use this software in a way that is helpful for students but not detrimental to the educators instructing them?

Mary Curnock Cook: Of course, there are some die-hards who say ban it and go back to pen and paper, but the majority of higher education professionals I’ve spoken to are thinking, “How do we take this immense technological progress and turn it to the advantage of educators, students, institutions, and learning?”

This is just a fantastic turning point, and a lot of people in higher education are trying to figure out how we can not just live with this, but use it to everyone’s advantage. Largely because our graduating students are going to have this be a part of their everyday life in the same way pen and paper were decades ago.

Nick Hillman: You can’t uninvent something. Once nuclear energy was invented, we couldn’t reverse that discovery, but we could think about how to spread that technology for productive uses and not destruction.

Tools like ChatGPT now exist, and I think we have to mine the benefits of this software and focus on the positives. One of the biggest areas I think could be enhanced is critical thinking. One of my colleagues asked ChatGPT to write a story about Christmas, and the story said children got loads of presents on Christmas Eve and Christmas morning. It combined the traditions of multiple cultures to create a false reality.

That’s a simple example, but it points out the need for students to always possess the ability to sniff out misinformation. Conducting exercises where students attempt to fool ChatGPT and analyze the areas where it needs human assistance—that’s a way to give control back to the student.


How can educators ensure that AI tools are being used in a way that complements and enhances their teaching, rather than replacing it?

Mary Curnock Cook: The workload management applications around administration, teaching, learning, and assessment are incredible. A huge amount of personal tutoring could be done through AI, which would allow student-teacher interactions to focus on the most essential items. Summarizing class sessions and providing comprehensive notes are time-consuming tasks that AI could relieve instructors of.

The real worry comes from the potential for students to cheat. It’s about getting students to use AI tools in an ethical way, not only within the world of higher education, but also because these tools are going to become part of their working lives after they graduate. They’re going to be using these tools for all sorts of ways of working and interactions with their colleagues and customers.

So those ethics around how to use AI in a positive, proactive, and helpful way are incredibly important. My worry is that universities will spend far too long talking about this and developing a framework for how to navigate it over the next several years. By that time, ChatGPT will be on its 40th iteration and that framework will no longer be relevant.

Nick Hillman: Education is a human-to-human endeavour. ChatGPT is just another tool alongside Google, your library, and academic journals. It’s a tool to help you come to the best possible answer for any existing question. And hopefully, because it exists, you might arrive at that answer in a quicker and more detailed way than you could have before.

Ultimately this is a tool for human beings. We’re not giving over the world to ChatGPT. We have the ability to write the narrative for this story, so long as we’re proactive about it.


How do you think the use of AI tools in education will impact student-teacher relationships?

Mary Curnock Cook: If generative AI means that the relationship between students and educators is at a different level, in terms of not having to do housekeeping-type interactions, that’s probably a good thing. If students and teachers are using their energy and focus to address the items that truly require human interaction, that’s a good thing.

We’re progressing towards a model where students and teachers are going to be able to use their time more effectively. They’re not going to have to put in the foundations every single time before getting to those higher-level conversations.

If we see this evolution as a boost to the target level of cognitive agility that students and educators engage in, that’s probably the most positive way of looking at things.

Nick Hillman: I think we have to be very careful here, because it’s much harder when using ChatGPT than other tools to prove if someone has plagiarized something. It’s producing unique copy and answers for every question it gets asked. If you’re an instructor and you’ve got a mediocre student who starts handing in brilliant work, there’s a suspicion that there’s foul play.

And then there’s the opposite. A student who’s been doing well who suddenly starts coasting because ChatGPT is doing all the work for them. So there’s a question of how this bears out in examinations and practical applications of curriculum. I do worry slightly about heightened tension between learners and instructors.

That’s why I think it’s important to have these conversations out in the open. Students are naturally going to warm up to new technology at a faster rate than a 50-something-year-old professor might. We have to have collaborative, mature conversations with students about the use of these tools.

I think that’s an easier conversation to have with university-level students, but it becomes harder to do in a room full of 14-year-olds. There’s going to be an inevitable gap in understanding between students and teachers early on. There has got to be a partnership between academics and their students where they’re learning from each other around the use of this technology.

The academic will still know more about the content of the course, but they might not know more about the medium in which their students are learning it. They are on a journey to understanding this technology together.


When it comes to new technology, there’s always going to be an imbalance of expertise between students and teachers. How can institutions manage that gap and prevent it from getting too wide?

Mary Curnock Cook: There’s a load of tech coming down the pipeline very fast, and my feeling is that higher education is not particularly well placed to pivot to what’s likely to be required—a very different approach to teaching, learning and assessment.

If you leave it to individual educators to teach themselves how to use these tools, uptake will be very patchy. This is a topic for leadership teams to get on top of. It’s urgent for our students’ sake, but also for institutions. We need to develop frameworks for how universities are going to deal with this. What it means for redeveloping assessments, who’s going to do that, who needs to be trained, and who needs support. We need to consider all these things at the university staff level, and then obviously how this is going to play out for students as well.

Nick Hillman: ChatGPT is clearly on a whole different level than any other sort of technological advancement we’ve seen over the last several years in education. But it’s not so much the complexity of the technology that matters, it’s the intuitiveness of it.

One of the biggest benefits of this tool is that you don’t really need an instruction manual for how to use it. We learn how to master these tools by doing. Even though the tech is more sophisticated, people can still become familiar with it quickly. It’s just that instructors sometimes have less of an appetite for change.

Ultimately, we don’t give much thought to changing the way we teach in post-secondary education. This sector’s resistance to change is going to be tested by the introduction of ChatGPT and other AI tools. Institutions are going to be forced to adapt and reconsider what the best way to do things is.


How might AI tools change the role of teachers in international education? Will they become more facilitators of learning, or will they still have a prominent role in direct instruction?

Mary Curnock Cook: We’ve been on that journey towards facilitating learning for a while now. Part of the promise of higher education is this community of academics and students who create new knowledge together.

I’ve spoken to some industry professionals who are genuinely relieved that the essay as an assessment tool has been blown out of the water by AI. It’s very encouraging that the higher education sector is full of innovative thinkers and open minds. It’s easy to characterize professors as old and stuck in their ways, but there are a lot of people with an appetite for innovation.

Nick Hillman: We saw this in part with MOOCs. Everyone was excited about them, and we saw massive uptake in the early days followed by a massive drop-off. Nobody stuck with it, because they didn’t have anyone checking in with them on deadlines and scheduled quizzes or tests. There was no sense of a team environment.

Ultimately, I think humans want validation from other humans. AI has the potential to improve the efficiency of grading and assessing work, but a human is always going to value the opinion of another human over a computer. The real-life experience and critical thinking skills that instructors bring to the table are always going to matter.

The human element of education is really important to motivate students to do the work in the first place and for the evaluation at the end. AI technology comes in the middle of that, helping students write better essays, digest more information, or get better at critical thinking. It’s not the end in itself. Technology can be the end result in some respects, but I don’t think it’s the end result in learning because learning is a human process.


In what ways might AI-powered tools change the way we approach assessments and evaluations in education?

Mary Curnock Cook: Often, we think about assessment as posing a question and asking for an answer in a certain format. Essay, short answer, multiple choice, etc. What’s going on now is that people are thinking about much more authentic ways to assess knowledge levels.

We’re likely going to see more instances of verbal assessment—requiring students to articulate their arguments in the moment. This is a format that requires you to have absorbed and analyzed a fair amount of knowledge in order to excel. There will be a lot more problem-solving and scenario-based learning as well.

These are formats that mimic the sorts of issues you might face in real life. When you’re in the room with someone who’s really bright, you know immediately. It’s communicated through the speed and articulation of their comments. Moving forward, tools like ChatGPT are going to respond so quickly that we’ll likely be able to use them in the moment. But the quality of the information you put in will dictate the quality of the information you get out, so maybe that will be the skill that matters most.

Nick Hillman: The possibilities for how this tool could impact assessments are so wide-ranging. You could administer an exercise where students have to do something on ChatGPT and then self-evaluate it, so the end result is not the thing that ChatGPT has produced, it’s the student’s assessment of the quality of it.

Some say tech will reduce human involvement. More in-person or oral evaluations could be a route certain institutions look at as well. That’s an expensive thing to scale, but for smaller class sizes and certain graduate programs, it might ensure students are truly drawing on their own knowledge.

The examination format will likely end up depending on the course. If ChatGPT allows a marketing student to produce a more effective campaign, that’s likely a good thing. But for students writing an essay on literature, instructors will want them to construct their own arguments. Sometimes it’s the instruction you put into your tools that matters the most. The goal isn’t always to be original, it’s to be effective. And if ChatGPT is helping students become more effective in what they do, it’s a win-win.


About the ApplyInsights Team

Led by ApplyBoard Co-Founder and CEO Meti Basiri, the ApplyInsights Team analyzes the latest government, third-party, and ApplyBoard internal data to provide a complete picture of trends in the international education industry. They also work with industry experts and ApplyBoard team members to gather local insights across key source and destination countries, where ApplyBoard has helped more than 600,000 students around the world.


APPLYINSIGHTS DATA BLOG

The most important stories in international education, backed by data