Artificial Intelligence Machines Can Now Write Medical Articles

Automation exists in many aspects of our lives, even in some of the articles you read. With a bit of input from humans, certain apps are designed to use artificial intelligence (AI) to write articles and other content on their own. We call these “AI writing assistants” or “AI writers”. An AI writing assistant uses language models to create written content that sounds human. The content can be a blog post, a press release, a newspaper article, or even a speech. The options are limitless.

Automation of all types, including AI writing assistants, comes with pros and cons. From the author’s perspective, the primary benefit of using AI writing assistants is that they can write content faster than a human can and thus produce large volumes of content in a short amount of time. An AI writing assistant can also give a creative nudge to a writer experiencing writer’s block or generate new ideas that a human had not previously considered.

The main disadvantages of using AI writing assistants are that the generated text often does not flow well from beginning to end and can mislead the reader with inaccurate or outdated information.

AI assistants can benefit authors, but they are not a replacement for human writers. They are exactly what their name suggests: assistant writers. As such, AI writing assistants can generate content at scale, and they do not need to be trained on a specific topic or niche before they start working on it. During the writing process, all the AI writer needs is some guidance from the human copywriter, who works closely with the tool to ensure that the generated content is factual, timely, complete, and relevant for the audience. Once the text is complete, the human author must review, correct, cross-reference, update, and edit it to minimize the risk of spreading misinformation.

For those of us in the medical field, inaccurate or outdated information can have dangerous results. Patients and healthcare professionals make health and medication decisions based on information they read, provided they trust the source. The problem is that articles written by an AI writing assistant are not labeled as such, and they can sound very human-like. Therefore, the reader must determine whether a human or an AI writing assistant wrote the content. This is often harder than it sounds.

To test people’s ability to recognize healthcare-related articles written by either a human or an AI writing assistant, an international survey asked 164 people to read ten texts, each written by either a human or an AI writing assistant. The participants included members of the general public, healthcare professionals, and medical writers. All were blinded to the source of the texts and had to guess whether each was written by a human or an AI writer.

How easy was it for them to correctly identify whether a human or an AI writer wrote the texts they read? Not easy at all. Of the ten texts in the survey, more than 50% of the general public could correctly identify the source for only six. The results were similar for healthcare professionals and medical writers.

Statistical analysis showed no difference in the ability of the general public, healthcare professionals, or medical writers to identify healthcare-related texts that humans wrote. However, when it came to healthcare-related texts written by AI writers, medical writers were significantly better at correctly identifying them as AI-written than the general public and healthcare professionals.

See for yourself. Two of the following texts were written by an AI writing assistant, whereas a human wrote the other two. Can you identify which is which? You can find the answers at the bottom of the page.

  1. Gastroesophageal reflux disease (GERD) occurs when stomach acid frequently flows back into the tube connecting your mouth and stomach (esophagus). This backwash (acid reflux) can irritate the lining of your esophagus.
  2. GERD can be caused by many things, including lifestyle factors, food choices, and medications. It can also be worsened by pregnancy or obesity.
  3. At the simplest level, obesity is caused by consuming more calories than you burn. Obesity, however, is a complex condition caused by more than simply eating too much and moving too little.
  4. Obesity is a growing problem in the United States. It is estimated that almost 40% of adults are obese, which can lead to health problems such as cardiovascular disease, diabetes, cancer and even death.

What does this mean for patients who read medical articles? Expect an increasing amount of information to become available to you. This is good when the articles are vetted and corrected, as needed, by a trained professional. It also means that you may have access to more healthcare-related articles written by an AI writer that contain errors or outdated information, without knowing it. This issue gets even worse when such misinformation is shared on social media.

There is no straightforward method to identify whether written information came from a human author with expertise or from an AI writing assistant. As a reader, you can look for articles from recognized and trusted organizations, such as hospitals, medical non-profit organizations, or governmental websites. Certain health information websites publish the name of the professional who reviewed the article, which assures you that a human expert has vetted it for accuracy and timeliness. Moreover, AI medical writers will sometimes include statistical figures but, at the moment, they are not able to identify the sources of those figures. So, if you see a statistical figure with no reference, it is possible that an AI writer produced it. However, some authors do not follow strict writing principles, or they may have accidentally omitted the reference, so a missing reference is not proof that an AI writer wrote the article, but it should prompt you to question the information.

AI writing assistants are changing the publishing world by generating content for articles, blog posts, and other types of digital media. If not appropriately managed by a human expert, an AI writer can generate misinformation. As a reader, take care to identify trusted information sources and look for articles that name the expert who reviewed them.

Results of the AI vs Human Quiz

  1. Human
  2. Artificial Intelligence
  3. Human
  4. Artificial Intelligence


Natalie Bourré
Health Communications Consultant
Natalie Bourré is a Healthcare Communications Consultant and founder and owner of Marketing 4 Health Inc. She is studying the impact of automation in healthcare communications as part of her Doctorate of Business Administration degree.
First published in the Inside Tract® newsletter issue 222 – 2022
Source: Bourré, Natalie. LIGS University, Doctorate of Business Administration. February 2022.
Editor’s Note: We do not use AI assistants in preparing content for the Inside Tract® newsletters.
Photo: © ismagilov | Bigstockphoto.com