The Pros and Cons of Using AI Generative Tools in Your Content Strategy

25 April, 2023

For better or worse, AI generative tools are here. Since the debut of ChatGPT, OpenAI has released GPT-4 – an even more advanced model. Other generative AI tools (including Google’s Bard) are also available. Earlier this year, we looked at why using AI for content may be riskier than you realize. As AI continues to develop at breakneck speed, it’s time for a look at the pros and cons of using AI generative tools in your content strategy.

The Appeal of Generative AI

Content creation is a time-consuming process. Generative AI tools speed it up considerably.

Whereas other AI tools can fine-tune existing content – for example, by spell-checking an article or tweaking the lighting in a photograph – generative AI tools create new content from scratch. All you have to do is enter a short prompt telling the program what you want, and the AI will spit out your content in mere seconds.

For businesses that need a steady stream of content and have a tight budget, this sounds pretty miraculous.
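
To illustrate, here’s a minimal sketch of that prompt-driven workflow using OpenAI’s Python package (the 0.x-series API available as of this writing). The API key, model name, and prompt are placeholders for illustration, not recommendations:

    # Minimal sketch: generate a content draft from a short prompt.
    # Assumes the openai Python package (0.x series); key, model, and prompt are placeholders.
    import openai

    openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # assumed model; use whichever model you have access to
        messages=[
            {"role": "user",
             "content": "Write a 300-word blog introduction about cyber insurance for small businesses."},
        ],
    )

    print(response["choices"][0]["message"]["content"])  # the generated draft

A few lines like these can return a full draft in seconds – which is exactly why the budget appeal is strong, and why the drawbacks below deserve a careful look.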

The Dangers of Generative AI

According to Reuters, Elon Musk and various AI experts have called for a pause on generative AI development, citing potential risks to society.

AI advancements have been moving incredibly fast. Society hasn’t had time to consider all the possible drawbacks or to work out the various legal issues associated with generative AI – and there are a lot of possible issues:

  • You can’t copyright AI-generated content. The U.S. Copyright Office has declared that AI-generated content can’t receive copyright protection. The decision involved a comic book whose copyright registration for AI-generated art was revoked. This has big implications for all AI-generated content.
  • AI programs are facing accusations of copyright violations. Generative AI programs seem to create content out of nothing. In reality, they are trained on existing, human-created content. Many creators have objected to AI developers using their work for this purpose without permission and without compensation. According to Insider, Stability AI (the company behind the generative art tool Stable Diffusion) is facing two lawsuits alleging that the company infringed on people’s copyrights by scraping art to train its algorithms. These lawsuits could have major implications for AI art and other AI content.
  • Your data may not be safe. Wired says “ChatGPT has a big privacy problem.” Italy has already banned the program over data privacy concerns. Companies should also be careful about the data they provide to generate content. CNN says some companies (including JPMorgan Chase) have clamped down on employee use of the tool due to compliance concerns.
  • AI content is leading to defamation claims. Generative AI tools don’t always tell the truth. They often “hallucinate” – a term for when AI tools make up information, often with a convincing level of detail. If you ask ChatGPT for sources to make sure you’re receiving correct information, there’s a good chance the AI will simply fabricate references. According to Duke University, the resulting citations can even look legitimate. As a result, companies that depend on AI generative tools may publish misinformation. Even worse, they may end up committing libel. Legal action is already brewing – Reuters reports that an Australian mayor is preparing to file a defamation lawsuit over ChatGPT content that falsely accused him of involvement in a bribery scandal.

How Can Brands Leverage AI Tools?

AI generative tools have amazing potential, but there are many issues to work out before companies can embrace these tools without reservations.

Companies can leverage tools like ChatGPT and Bard in several ways:

  • Article Development – This may be the riskiest application because your company likely won’t own the content you create. Even worse, the content may be filled with misinformation. If your article leads readers astray or provides incorrect information about other people or companies, you could even face legal liability. Google says “Bard may give inaccurate or inappropriate information” and warns users not to “rely on Bard’s responses as medical, legal, financial or other professional advice.”
  • Landing Page and Email Copy – You can use tools like ChatGPT to create website and email copy. Once again, a potential drawback is that your company may not own the content. You also need to be careful about the data you’re providing to produce the content.
  • Editing – Writers have been using computer programs to help with editing for decades. You might give a tool like ChatGPT an article you’ve written and ask it to edit and improve it (see the sketch after this list). However, you need to review any changes the AI makes – since some of the information may not be factually correct, you shouldn’t accept every edit without question. In addition, if the AI tool does significant rewriting, you may not hold the copyright for the content.
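
As a rough sketch of that editing workflow, the example below passes a human-written draft to the model with instructions to edit rather than rewrite. The system prompt wording, file name, and model are assumptions for illustration – any AI-suggested changes still need human review for factual accuracy:

    # Rough sketch: AI-assisted editing of a human-written draft.
    # Assumes the same openai 0.x-series package; file name and prompts are illustrative.
    import openai

    openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

    with open("draft_article.txt") as f:  # an article a human has already written
        draft = f.read()

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # assumed model
        messages=[
            {"role": "system",
             "content": "You are a copy editor. Tighten grammar and clarity, "
                        "but do not add facts, quotes, or statistics."},
            {"role": "user", "content": draft},
        ],
    )

    print(response["choices"][0]["message"]["content"])  # edited draft for human review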

Proceed with Caution

AI generative tools are exciting, but a cautious approach is necessary. Before companies embrace these tools, they need to consider the possible implications. The law around AI-generated content is still being decided – it’s possible that AI usage could create a liability nightmare for your business in the future.

Do you need help with content creation? You can trust the human writers at Inbound Insurance Marketing to create the articles, social media posts, marketing emails, white papers, web copy, and other pieces of content your company needs. If you’re unsure about what you need, use the Content Roadmap.