
An academic's perspective on the use of AI within academia and education

Oliver Back
8 min read

ChatGPT has sparked a debate over the place AI has in education and research. Many people question whether AI will change the direction educators take their classes in, and how they conduct in-class and at-home assignments. There is a growing sentiment online that this change is already happening.

In a recent interview, Anthony Z, a university lecturer, spoke about his perception of AI in higher education and academia.

In the year since OpenAI launched ChatGPT, have you been aware of more students talking about, and/or using this technology, or other AI-driven tools such as writing or paraphrasing tools?

Absolutely, yes. I think ChatGPT was officially launched at the end of 2022, right? So immediately afterwards I realised students were talking about it. Students are often early adopters of new technology, so it is unsurprising that as soon as ChatGPT became publicly available, students would experiment with it in their education. But how much is too much? One of my PhD students from China tells me he's using AI for proofreading, and I've seen positive sides to that.

Professors don’t want to correct grammar or spelling mistakes. Using an AI tool to bring writing quality up to the point where students can focus on the content is a much better use of their time. In this way, writing assistants can add tremendous value to any student’s workflow.

Overall, what is your attitude to AI research tools helping students with the following: Literature searches; reading and assimilating literature; drafting essays, dissertations, or other coursework?

My experience with ChatGPT is that it’s not very good with summarising literature. It kind of paraphrases the literature and for me that wasn’t a good experience, but I'm really positive and open about students using this technology as long as they tell me in advance they've used AI tools to help them.

In your understanding of generative AI, do you think these tools are up to a standard where you could be fooled?

Do I think the current ones would be able to generate a credible essay? No, there is a lot of repetition; lots of the same information written in different ways. However, I’ve recently been involved in writing a kind of application form in Chinese and I've used ChatGPT to help me with that. It’s good at generating content that you can add your own input and context to.

What if your students used ChatGPT or another paraphrasing tool to give them the structure or outline of a piece of written work, or even to reword something they'd written? What's your opinion on the use of AI there?

I would totally support that, especially for second-language users, because the advantage of some AI tools is that they can help you generate grammatically correct sentences that sound more like what a native speaker would write. For example, I know there are students who use AI tools to check the emails they send to us beforehand, to bring them closer to native-level English. So these tools can be really helpful, especially with paraphrasing. We get a lot of questions from international students about how to paraphrase well, and we also run academic English sessions to help them do just that.

So it seems like using AI as a way of enhancing or improving what you’ve already written is valid?

Yes, it's something that can provide additional help, but you shouldn't rely on it purely to generate written work from scratch. Obviously, when you do a literature review you need to apply critical thinking and develop your own arguments, and that can only come from being human. At this stage, I'm not sure this technology is powerful enough to replicate that.

Academics are more inclined to see AI as one tool in a wider toolkit rather than the entire solution. Humans still need to be involved in the process. It seems likely that the next big skillset to be taught will be writing and refining AI prompts, and learning how to enhance AI-generated writing rather than starting from scratch.

In the next five years, do you think it's inevitable that tools such as ChatGPT will be used by students of all levels, and how do you feel about the impact this may have on both teaching and learning? If you think there are both positive and negative ramifications, for yourself and for students, could you talk about these?

I was in a talk about OpenAI in academia, and the speaker said: imagine 20 or 30 years ago, when the internet was starting to become popular and search engines were emerging. There were a lot of people who were against using them, but you can't prevent these developments from becoming popular. Generative AI is the new trend. I would say that in the next five years it's inevitable this technology will become embedded, and people will have to adjust their ideas and opinions. That's the reform new technology brings, and at some point you just need to go with the flow; there's no way of preventing it from rising up. ChatGPT is gathering opinions and ideas from all over the world and continually learning, which is a positive thing, and I definitely think it will become more and more popular among students of all levels. With teaching and learning, I wouldn't expect it to fully replace educators and academics in universities. I don't see that happening any time soon, to be honest.

The introduction of AI induced fears amongst many that robots would start replacing workers, and in some ways it has. But academics like Anthony posit that teaching will never fully be replaced by AI, owing to the need for human-to-human contact within an educational environment, something AI is unlikely ever to replicate.

Do you think that using AI could help you to teach higher level and more complex topics more effectively?

As lecturers, we also need to adopt more of this kind of technology to help us prepare sessions, teaching materials and so on, while students can use it to help them with a literature search, for example. The main thing is that generative AI moves forward in a positive way and becomes more accurate, because in my limited experience the accuracy wasn't very good. And, you know, it's not just about literature searches and paraphrasing, to be honest. When you do statistical analysis, for example, running R code and asking ChatGPT to help you interpret the results is really useful, and it's very accurate compared to doing a literature search with ChatGPT.
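
For illustration only, here is a minimal sketch of the kind of workflow Anthony describes, assuming base R and the built-in mtcars dataset (neither is mentioned in the interview): fit a simple model, then paste the summary output into ChatGPT and ask it to explain the coefficients and model fit in plain language.

    # Fit a simple linear model in R (illustrative example only)
    model <- lm(mpg ~ wt + hp, data = mtcars)
    # The summary output is what you might paste into an AI tool and ask it to interpret
    summary(model)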

In the future, it may become the norm for AI to help shape and structure the teaching process. At the same time, allowing students to use AI tools could be the modern equivalent of the ‘cheat sheets’ handed out in past years. Students may find that, alongside their required reading list, their library offers a ‘top tools for students’ section, included within their tuition package.

Do you currently or do you expect to adopt AI-powered tools as part of your own work as an educator in the next few months?

Officially using it in the classroom with students? I don't think so. I wouldn't open ChatGPT in front of the whole class and type in something to show them. In the next few months? That's a bit too quick for me, but I certainly would use it myself for my own learning.

Would you expect teaching to adjust to the use of AI in the same way that teaching mathematics adjusted to the invention of the calculator?

Certainly the invention of the calculator made mathematical calculations much easier and quicker. Using AI should make things much easier and quicker too, although there's potential for it to make research and learning more complicated, because it can provide insights and perspectives you've never thought of. It would certainly make things more colourful and offer views from different angles. It's like a search engine, right? It can communicate with you, and you can then build on its answer with additional information.

Do you think that AI tools help to increase accessibility to higher education for disabled students, and for international students studying in their second language?

Definitely. I saw this even with a small improvement to the LMS we use, Blackboard. We use this platform a lot for teaching, and there's now Blackboard Ally, a small function that helps you make content more inclusive. The content you upload to Blackboard can be converted into different formats, so it can be read aloud, for example, which is particularly useful for students with disabilities such as dyslexia. At the beginning people didn't really see how it was going to be useful, but now, when we upload material to Blackboard, it can tell us how inclusive that material is and what changes need to be made to make it more accessible. It gives a percentage score to indicate whether the material is inclusive enough; usually we would expect to score higher than 80%. Having this accessibility indicator is really useful for instructors because it helps us adapt material to be as inclusive as possible. It can be especially useful when teaching non-native English speakers.

Tools designed to increase the inclusivity of complex academic teaching materials are already becoming the norm, so it's likely that AI will slot right into the academic toolkit, becoming part of a lecturer's daily life.

AI platforms such as ChatGPT and Scholarcy are being adopted by more and more students and lecturers, as AI tools become the standard solution to turn to when approaching complex academic texts.
