Introduction
Artificial intelligence has become the marketer’s best friend and, in some cases, their biggest legal risk.
In this Marketing Meetup session, Ryan Lisk, founder of Hybrid Legal, joined Joe Glover to unpack the complex legal side of AI. From intellectual property ownership to client contracts and data protection, Ryan explained how creative teams can protect themselves without stifling innovation.
Below is a summary of his main lessons and real-life examples, written with the help of AI (so please excuse any tiny errors).
Table of Contents
- Introduction
- The Angry Photographer and the Nano Banana Incident
- Be Honest About AI in Contracts
- The Perils of AI-Drafted Contracts
- Confidentiality and the Untrained Team
- Scraping, Data and the Law
- How to Protect Yourself as a Creative or Agency
- AI Tools, Liability and the Fine Print
- Ryan’s 3-Step Checklist for Marketers
The Angry Photographer and the Nano Banana Incident
Ryan began with a story that could keep any agency owner awake at night.
An agency commissioned a freelance photographer to shoot images for a supplement brand. Everything went smoothly until the agency used an AI tool nicknamed “Nano Banana” to edit the images into fun social media backgrounds. Months later, a legal letter landed.
The photographer’s lawyers claimed the agency had used and modified the images outside the original contract. The agency thought it owned the photos because it had paid for them. But since the photographer was a freelancer, not an employee, the intellectual property (IP) remained with them unless explicitly assigned in writing.
To make matters worse, the agency’s client contract promised that all creative assets would be owned by the client once the invoice was paid, effectively promising IP the agency didn’t own. And because the edited images had been uploaded to an AI tool, parts of that work could now appear in other users’ outputs, compounding the issue.
Ryan’s advice was clear:
- Always have written contracts with freelancers that explicitly assign IP rights, not just license them.
- Be transparent with clients about whether AI tools were used in creative production.
- If a client demands ownership of the IP, explain that AI-generated work cannot currently offer a guaranteed copyright transfer.
“Right now, no AI system can give you a copper-bottomed guarantee that what it produces is completely unique,” Ryan said. “If your client wants full ownership, you need to create the work the old-school way.”
Be Honest About AI in Contracts
Ryan explained that the safest approach is to include clear AI disclosure clauses in contracts.
If an agency uses AI tools such as ChatGPT, Canva or Runway to edit, write or generate assets, the contract should reflect that. Clients should understand that these outputs may not be entirely original, and therefore may carry some IP risk.
He encouraged marketers to use the word “assign” (which transfers ownership) rather than “license” (which only grants permission). It’s a small but vital distinction that can prevent future disputes.
The Perils of AI-Drafted Contracts
Ryan then moved to his second story: a company that asked ChatGPT to write a referral agreement.
The AI produced a polished-looking contract that seemed perfect at first glance. But when Ryan reviewed it, he discovered major gaps. The document didn’t define what counted as a “referred opportunity,” failed to clarify payment triggers, and guaranteed commission even when leads didn’t convert.
“The problem wasn’t the language,” Ryan said. “It was the lack of context. AI didn’t ask the right follow-up questions.”
His advice:
- If you use AI to draft a contract, have it reviewed by a qualified lawyer before anyone signs.
- Remember that AI is a pattern generator, not a legal professional.
- Use AI to help draft ideas or structure, but never to finalise binding agreements.
“It’s like me trying to create a full marketing campaign in AI,” he said. “I can do it, but I shouldn’t.”
Confidentiality and the Untrained Team
The third story was the most serious.
A healthcare agency uploaded a client’s confidential white paper into ChatGPT to create a campaign. Later, parts of that same white paper appeared in other ChatGPT outputs online. The client’s legal team launched an investigation into the breach of confidence, which quickly escalated into an insurance issue.
The cause was simple: employees were using personal AI accounts without company oversight. No policy existed to define which tools were approved or how to handle sensitive data.
Ryan’s lessons were sharp and practical:
- Run an AI amnesty: ask your team which tools they use and for what.
- Risk-assess every tool. Read the end-user licence agreements carefully to see whether they train on your data.
- Draft a clear AI usage policy for both internal and client-facing work.
- Never upload confidential or personal data into public AI tools.
He warned particularly against connecting AI tools to email. “That’s a huge risk,” he said. “Your inbox is full of personal and confidential data. If you plug AI into email, it could end up training the model.”
Scraping, Data and the Law
During the Q&A, one attendee asked whether using AI to collect public data was legal. Ryan’s answer: “It depends.”
If the data includes personal information, it falls under GDPR and the Data Protection Act 2018. Businesses must be able to show where the data came from and whether consent was given.
Even with company data, scraping can be risky. Many websites now include clauses in their terms of use prohibiting AI-based data scraping without permission. He cited The Times and Tripadvisor as examples that have already introduced such restrictions.
“Just because data is public doesn’t mean it’s fair game,” he said. “If a website says no scraping, that’s the law.”
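For teams that do scrape, one practical first check is whether a site’s robots.txt even permits automated access to the page in question. The Python sketch below illustrates the idea using the standard library; the may_fetch helper and the user-agent name are invented for this example, and a robots.txt allowance is only one signal. It is no substitute for reading the site’s terms of use or complying with GDPR.

```python
# A minimal sketch (not legal advice): check a site's robots.txt before
# scraping a URL. robots.txt is one machine-readable signal of a site's
# wishes; its terms of use and data protection law apply regardless.
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser


def may_fetch(url: str, user_agent: str = "ExampleResearchBot") -> bool:
    """Return True only if the site's robots.txt allows this user agent to fetch the URL."""
    parsed = urlparse(url)
    robots_url = f"{parsed.scheme}://{parsed.netloc}/robots.txt"
    rp = RobotFileParser()
    rp.set_url(robots_url)
    rp.read()  # downloads and parses the site's robots.txt
    return rp.can_fetch(user_agent, url)


if __name__ == "__main__":
    url = "https://www.example.com/some-page"
    print(f"Allowed to fetch {url}: {may_fetch(url)}")
```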
How to Protect Yourself as a Creative or Agency
When asked how photographers, designers and marketers can protect their own work from unauthorised AI editing, Ryan suggested introducing clear licensing tiers.
For example:
- A basic licence that allows limited use.
- A premium licence that transfers ownership rights.
- A special licence that allows AI editing or manipulation at a higher cost.
“The contract is your safety net,” he said. “It defines what can and can’t happen to your work, and at what price.”
AI Tools, Liability and the Fine Print
Finally, the group discussed whether tools like Adobe Firefly are as safe as they claim.
Ryan praised Adobe for building a more ethical model trained on licensed stock images but warned that marketers should still read the fine print. “Even the most responsible AI companies have disclaimers,” he said. “They’ll tell you it’s safe, then add a clause saying they’re not liable if it isn’t.”
His golden rule: pass risk down the chain. If you use AI to create client work, make sure your contracts clearly state that any residual risk is accepted by the client, not the agency.
Ryan’s 3-Step Checklist for Marketers
Ryan closed with a three-step process to stay compliant and confident:
- Run an AI amnesty. Find out what your team is using and risk-assess every tool.
- Lock down your contracts. Protect yourself with clauses covering AI use, IP ownership, and third-party tools.
- Stay transparent. Train your teams, educate your clients and never assume the tech is risk-free.
His final words were optimistic: “AI will be great for marketing. Just go into it with your eyes open.”