Risks and Guidance: Using GenAI in Marketing Materials

“When people and businesses use a generative AI machine, there is a set of terms and conditions they are inherently opting in to—terms and conditions they typically haven’t read.” – Thomas Huthwaite
As generative artificial intelligence (AI) tools continue to advance, they become increasingly attractive to businesses looking to cut costs when developing marketing materials. With just one or a few prompts, whole suites of content can be generated quickly and at low cost (often for free).
However, businesses must weigh a number of considerations when using AI in content creation, including the risks to their input materials and the risks inherited through use of the generated outputs.
Businesses must be wary of the terms they agree to when using AI tools, including how their data may be used to train those tools.
Knowing the terms—ownership and risk
When using generative AI to create content, it is crucial to understand exactly who owns the relevant materials (both the inputs and the generated outputs) and who carries the risk—including what happens to the inputs, and what risks attach to using the outputs.
For example, will the input materials be used only to train the AI model, or might they also appear in future generated outputs? If an infringement claim is made against a piece of AI-generated material, who is taken to have infringed—could it be your business that carries the liability?
When a user submits a prompt to a third-party AI tool, they will generally be taken to have agreed to that platform’s terms and conditions. Those terms often set out the details of ownership and risk. Understanding them is crucial—and a step that is often neglected, particularly by businesses without clear AI policies.
“When people and businesses use a generative AI machine, there is a set of terms and conditions they are inherently opting in to—terms and conditions they typically haven’t read. Businesses should ensure they are engaging with these terms should they wish to use AI in marketing materials to best understand what they are signing up for, where their input materials are going, and the care that should be taken with the generated outputs,” says AJ Park Litigation Practice Group Leader, Thomas Huthwaite.
Before the days of easy-access AI tools, businesses relied on in-house resources or creative agencies to create their materials. When a creative agency is engaged, the lines of ownership and risk are typically negotiated and clearly laid out. However, a creative agency might now itself be using an AI tool, complicating the terms further.
“The owner of the generated output and the risk that has been inherited may be inconsistent. Before AI, creative contracts would generally set out that the commissioning business owned the generated output, and their creative agency would provide certain reassurances, indemnities and liabilities relating to the outputs that they prepared. However, AI complicates this with additional layers of terms, which are often far less favourable to the commissioning business,” says Huthwaite.
“If your marketing agency is using AI, the contractual situation becomes even more complex. There may be an additional set of terms that you aren’t even aware of, and that contradict the terms you thought you agreed to. It’s important to understand who is actually assuming the risk,” adds Blake Carey, Senior Associate at AJ Park.
It’s also important to recognise the level of dependency on AI when developing materials. A fully AI-generated image carries a different level of risk from a minor tweak or a thought starter. The former is more likely to expose a business to liability, potentially including a copyright infringement claim.
“When considering use of an AI tool, a business should ask itself in the beginning how much they are planning to rely on AI. Basing your entire brand or marketing efforts on AI outputs is different from using AI to spruce up your work, spell check, or get the ball rolling on creative ideas,” says Huthwaite.
“There are a number of questions to consider when determining copyright in an AI-generated output. Who authored it, and who owns it? The AI machine, the AI company, the person who inputted the prompt, or someone else entirely? Can the work then be assigned? Who can be sued over the creation, if it is indeed an infringing work? Has the output copied a substantial part of an existing work, such as a training input? This is an area destined for litigation, which can quickly become expensive,” Carey warns.
Understanding the terms is crucial when using an AI tool. Establishing transparency around who owns what, where data is going, and how it is being used is as vital as it would be with a human contractor.
“It’s difficult for businesses to pin down risk. If you think of AI tools as effectively a contractor commissioned to produce work at the click of a button, are you happy engaging that contractor without the typical transparency, understanding of process, reassurances, indemnities and liabilities that come with employing a human contractor? Are you happy sharing your business-related prompts and other data or confidential information, not knowing whether those inputs will be treated confidentially, how they will be stored, or whether they might be used to train the AI tool or in the generation of a future output?” asks Huthwaite.
AI as a contractor, not a magic wand
A useful way to implement cautious practice when engaging generative AI models to create business content is to treat the AI as a contractor, rather than as a quick, easy, and readily accessible output machine.
“Businesses should treat AI tools as they would any other contractor. You wouldn’t just approach someone and say ‘here’s a bunch of our data and information, create something for us’, without expecting a paper trail of what they have done to your materials, what they’re going to do with it in the future, how the output will be created, what the reference sources are, and so on. This process is often lost with the use of AI—you get this magical machine to create things for you seemingly out of thin air, with no paper trail or documented process,” says Huthwaite.
“You don’t want your data sitting with a third party without appropriate access or use controls. You want data integrity for the health of your business. To manage and understand the risks, you need the paper trail,” Huthwaite adds.
As with any contractor, it is important to be mindful of the information being provided to the generative AI tool, and particularly how confidential that information is.
“Businesses aren’t always aware of what exactly is being input into the AI tool producing the materials they have asked for. There are potential issues with the control of information being fed in, and the subsequent output. You would never hand over control of business materials to someone who you hadn’t reached a very clear understanding with about what they’re going to do for you—so why would you do so with a machine? Treating this tool as a contractor, rather than something quick, cheap and easy that you are entitled to use, is important,” Huthwaite advises.
Brand reputation at risk
In addition, businesses should consider the social risk involved in using AI-generated imagery as part of their marketing strategy.
“There is certainly a social stigma when it comes to generative AI materials in marketing. On social media platforms, materials identified by people as being AI generated, particularly those that are poorly produced, are labelled as ‘AI slop’,” says Carey.
“The key point, though, is the materials need to be recognised as AI generated first. If a business produces a poster featuring a model with six fingers, then it’s likely that customers will think you are sloppy. There’s a risk of brand harm, and even media coverage if the AI’s mistake is significant,” says Carey.
A recent controversial example is Will Smith’s alleged use of AI to create fake crowds at his concerts for use in promotional videos. The risk depends on the public recognising the imagery as AI generated in the first place—and those skilled at using generative AI as a tool may be able to avoid detection.
“On the other hand, AI-generated content can be well produced, and it really comes down to recognising that AI is a tool. A skilled user of generative AI can create incredible work,” Huthwaite adds.
AI tools offer businesses a cheap and easy way to produce creative materials, but they must be recognised as tools—dependent on the skill of the user, and carrying their own sets of terms and conditions, data storage practices, processes, use limitations, risks, and liabilities.
Thomas Huthwaite and Blake Carey are qualified lawyers and experts in IP litigation based in Wellington, New Zealand. For further information, visit the AJ Park website.