New research from Plain English Foundation reveals that while AI uptake is gathering pace, training is not keeping up
Plain English Foundation, Australia’s leading provider of plain English training and editing, has found that while generative AI can fast-track writing, the output needs close oversight and writers need more training to get the best outcomes.
The Foundation surveyed 289 Australian professionals from government, private sector and not-for-profit organisations.
The aim was to better understand how people are using generative AI at work and the challenges it brings. The survey identified the most popular AI tools, what they’re being used for, and how they’re changing ways of working.
AI at work is taking hold
Generative AI is becoming ubiquitous, with almost 90% of respondents using it at least once a month. One in three use it every day. And it seems that Australian workplaces are embracing the shift.
Most respondents reported that their organisation already has a gen AI policy (57.1%), while a further 16.3% had a policy in development.
The most popular gen AI tool was Copilot (67.8%) followed by ChatGPT (35.6%). Users were less likely to report using Gemini, Grammarly and Claude.
Around 1 in 13 people said their workplace had its own in-house AI tool, and we expect this to increase as more organisations invest in bespoke AI solutions.
The three most common writing tasks where people are leaning on gen AI are: refining their written expression (63.3%), summarising long texts (58.1%) and brainstorming (47.1%). A third of respondents reported using AI to do research and to draft emails and other documents.
AI outputs are falling short
Respondents reported a range of problems with AI-generated output, with clichéd language (71.3%) and hallucinations (58.8%) topping the list of common issues. Respondents also cited inappropriate tone, overly long prose and summaries that miss key information.
Dr Elizabeth Beach, Editor at Plain English Foundation, said these results highlight the ongoing need for human oversight when using AI-generated content.
“As AI becomes embedded in everyday writing tasks, human-driven plain language principles are more important than ever,” she said.
“Right now, we can’t rely on gen AI to produce high-quality output that’s error-free and sounds human, no matter how good the prompting. To do that, we need humans.”
Workers are sceptical about AI
The survey asked how confident respondents felt about AI accuracy and reliability. The results ranged from very low (0% confidence) to very high (92%), with an average score of 45%.
One participant commented that “AI is still far too inaccurate and [unfortunately] real-world situations are being used as a testing ground.”
Looking to the future, the survey asked how optimistic people felt about the potential impact of AI in the workplace. This time, the results were skewed towards a slightly more positive outlook, with the overall average sitting at 53.3%.
Another respondent expressed mixed thoughts, commenting, “I am both concerned and excited. I am excited because I believe it is the future and has so much potential.”
“For me it is like having an always-positive assistant with incredible knowledge… But I am also concerned … for how it might undermine the learning stages of upcoming professionals.”
“How do we pass on the cognitive skills and mindset necessary for our work when we might just outsource the work to AI? And how do we ensure we keep the nuance and humanity in the work?”
“And if AI generates writing based on what is already available to it, how do we manage the bias when it is then using other AI-generated work as a basis?”
More training is needed
Despite the widespread uptake of gen AI, only 21.1% of respondents had completed training on how to use it. Two in five wanted to do formal training, but about one-third said they were happy to experiment themselves.
As a skills-focused organisation, it’s no surprise that Plain English Foundation supports the call for more training! But that training needs to be relevant. It needs to focus on the fundamentals of clear communication and help people understand what good looks like.
And perhaps most importantly, writing training now needs to include critical and analytical thinking to ensure that humans stay in control of the writing process and take responsibility for what they produce.
“Once people know the fundamentals of clear communication, it’s much easier to know how to work with AI to improve the quality of the output,” said Yusuf Pingar, GM at Plain English Foundation.
“Clients tell us they’re struggling to get quality outputs from AI, so we show them how to improve their prompts, how to critically assess their AI outputs and how to polish the expression to ensure a top-quality final document,” Pingar said.
“AI tools have sped up the writing process, and the pressure to write even faster is building every day. But that brings the risk of errors, poor quality writing and damaged reputations.
“Our philosophy is to slow down, take the time to get your team trained in clear communication, and start getting better outputs from AI. That’s when you’ll really start to see the productivity gains kick in,” he said.
