ChatGPT in Academia: How Scholars Integrate Artificial Intelligence into Their Daily Work

A recent survey conducted by De Gruyter’s Insights team has revealed that academics view artificial intelligence tools such as ChatGPT with a mixture of concern and curiosity. What’s more, they are keen to see more transparency and education about the ethical use of these disruptive technologies.

Whether it’s scare stories about our jobs being replaced by algorithms or worries about deepfakes, the past year has seen countless media headlines about the potential impact that Artificial Intelligence (AI), and particularly ChatGPT, could have on our lives, professions and wider society.

Standing for Chat Generative Pre-trained Transformer, ChatGPT is a large language model-based chatbot developed by OpenAI. Launched on November 30, 2022, and now on its fourth iteration, ChatGPT is both revered and feared for its ability to mimic human intelligence and for its general knowledge and problem-solving abilities.

In academia, ChatGPT has gained notoriety as a tool students may (and perhaps do) use to write their papers. Concerns about training data, copyright and plagiarism exist too. However, while researchers agree that ChatGPT and similar AI tools pose many threats and challenges, they also see major advantages.

We Want to Know

Between June and July 2023, the De Gruyter Insights team surveyed 748 humanities, social science and STEM academics across 82 countries to get behind the headlines. We wanted to understand whether AI was routinely being used in their scholarly work – and, if so, how.

Check out the full research report, Cautious but Curious: AI Adoption Trends Among Scholars.

The research investigated what role (if any) ChatGPT was playing in scholars’ workflows and what attitudes they held towards this emerging but potentially highly disruptive technology.

As a publisher, we wanted to know more about what scholars consider the main benefits and drawbacks of AI to be, so that we can respond better to emerging technologies and provide our own authors with appropriate support.

Not Now – But Not ‘Never’

The research suggests that while academics have adopted some AI tools, such as DeepL and Grammarly, into their scholarly work, they are a long way from incorporating ChatGPT as a standard part of their research processes. Findings suggest that academics consider ChatGPT to be a technology still in its infancy.

“While current usage and adoption may be low, researchers have plans to take advantage of the tool at some point soon.”

Just 10 per cent of respondents said they use ChatGPT weekly, and only four per cent use it daily. Indeed, many scholars (39 per cent) had never used ChatGPT at all.

However, while current usage and adoption may be low, researchers have plans to take advantage of the tool at some point soon. In fact, of those surveyed who are aware of ChatGPT, the majority (62 per cent) support the use of the technology in the future, with many considering it something that is ‘here to stay’. The research also found a high level of awareness of the technology, with 71 per cent saying they are either ‘quite familiar’ or ‘very familiar’ with the tool. Overall, STEM scholars are more familiar with AI workplace tools than their humanities and social sciences (HSS) counterparts.

The Alignment Problem

The research reveals that currently, ‘trust’ is the top factor limiting the wider adoption of ChatGPT by academic researchers for research purposes within their institutions.

Of those who have used the technology, a high proportion (78 per cent) have experienced it returning partial or unreliable results, which means researchers must spend time double-checking its output. Many feel that ChatGPT’s output is not always aligned with their goals.

“Many researchers are finding their feet with ChatGPT by first using it in non-work settings.”

However, while scholars might not yet want to use ChatGPT as a formal element of their research processes, they are more willing to use it at home and in non-research environments. In fact, nearly a quarter (24 per cent) of respondents who have used ChatGPT have only ever used it outside work.

This suggests that many researchers are finding their feet with ChatGPT by first using it in non-work settings. Whether this will lead to the adoption of the technology more widely for research purposes could be the subject of further research.

Ethical Concerns

While those who use ChatGPT have some practical concerns about the reliability of the tool, other researchers have ethical worries that could be limiting use. Plagiarism and academic misconduct were also key concerns voiced by the researchers surveyed, as was the dissemination of ‘fake news’.

“Academics also raised concerns that ChatGPT might exacerbate existing inequalities …”

Academics also raised concerns that ChatGPT might exacerbate existing inequalities for researchers operating in underfunded disciplines and developing nations. Many believe that if ChatGPT were only available as a paid-for tool, researchers in these settings might not be able to afford the costs.

However, respondents were split on this issue. Many other academics suggested that ChatGPT could be vital in ‘levelling the playing field’ for non-native English speakers as it could make translation faster and easier.

Either way, the fact that researchers consider that ChatGPT could have such an impact on research practices globally is a mark of how influential they believe the technology could become. This is perhaps why the respondents were so keen to understand more about how the technology works and what data it has been trained on.

Standing Out from the Crowd

The research found that academics are most often using AI tools and ChatGPT to correct, translate or simplify text as well as to find meanings and definitions.

What appears to set ChatGPT apart from the crowd, in terms of its ability to help with research-related tasks, is its capacity to synthesize large amounts of text and/or data and produce summaries of published papers.

Scholars see this functionality as being unmatched – at least, so far. In many cases, researchers are excited about ChatGPT’s potential to save research time and to assist with the speedy compilation of literature reviews – often a laborious process.

Watching and Waiting

It’s normal for any new and emerging technology to take time to bed in and gain the trust and confidence of its users. This research suggests the same is true of how research communities are approaching their adoption of AI tools, and specifically ChatGPT.

“Perhaps scholars are watching and waiting to see what the next iteration of ChatGPT can do.”

The research indicates that, currently, there are only a few early adopters of the technology. Most academics are cautious, with understandable worries and fears, but many are intrigued by the potential of the tool. Perhaps scholars are watching and waiting to see what the next iteration of ChatGPT can do.

How De Gruyter Is Responding

De Gruyter takes seriously the impact – both positive and negative – that AI could have on scholarly research and publishing. This survey marks a first step for us in learning more about the needs of our authors regarding the use of AI tools.

Guided by the recommendations of the Committee on Publication Ethics (COPE), we have added a note on the use of AI tools such as ChatGPT to our author guidelines. It states that while AI tools can be used to help write scholarly work, authors must disclose exactly which AI tools they have used and how they have used them. We would not accept a publication solely authored by AI and are currently working on further guidelines as new technologies develop.

In addition, we will always respond to requests for more specific guidelines and guidance on the uses, opportunities, limitations and risks of AI tools for academic work. To this end, we are working closely with academic and scientific communities. We believe that together, we can work to make the most of the opportunities of artificial intelligence for scientific progress while minimising the risks.

[Title image by Pixdeluxe/E+/Getty Images]

Ramlah Abbas

Ramlah works as Manager Insights & Analysis at De Gruyter.

Alexandra Hinz

Alexandra works as Digital Communications Editor at De Gruyter. If you would like to pitch her a blog idea, get in touch with her via email!

Chris Smith

Chris is a writer, researcher and communications consultant working with De Gruyter and other scholarly publishers. He’s also co-founder of Prolifiko which runs coaching programs for academic writers.
