A Conversation on AI with faith & VCCP’s Morten Legarth

Get ready for an insightful conversation with Morten Legarth, Creative Director at faith and VCCP. With a background in Computer Science and Business Administration, he is passionate about leveraging technology to push the boundaries of creativity. From training diffusion models to pioneering AI-generated debates, Legarth is at the forefront of exploring AI’s impact on the advertising industry.

In this interview with IAA Benelux, Legarth shares his views on creativity, AI, and how faith and VCCP are navigating this new frontier.


IAA Benelux: Has the recent evolution of AI changed anything within your organization, and if so, what?

Legarth: A great deal, yes. Firstly, it has meant the launch of faith, our AI creative agency, back in 2023. We work with every part of VCCP as a catalyst for using AI creatively throughout the group and for our clients.

Most of the agency has embraced AI independently as well, which we love. Most creatives, for example, are using AI tools daily for anything from brief research and pre-visualisation to full-on storyboards for their ideas.

And we are rolling out our own internal tools as well, to help creatives get more creative control over things like image generation.

Overall it is quite incredible how quickly some of these tools have become an invaluable part of our daily work, and how useful creating our own tools has been.

IAA Benelux: What do you see as the risks of AI for your organization and for the industry as a whole, and what is your role / your organization’s role in mitigating those risks?

Legarth: I think it is very easy to be too risk-averse. AI is a huge opportunity for our industry and I think, ironically, the main risk for many is being too afraid to jump in. Of course we need to be responsible, but the reality is that AI is evolving very quickly, and the only way to learn about the limitations and boundaries of the technology is to lean in.

We’ve recently released our first real AI products with O2, but there are other products that have been scrapped or temporarily shelved because the models currently available aren’t good enough yet, or they have issues with things like bias or hallucinations. Some projects fail, and that’s totally fine. We only really learn what the risks are by trying different things and building models and tools.

Our approach at faith is based on an experimentation mindset; this allows us to test and learn effectively and then act as a guide for the rest of VCCP and our clients. We work very closely with our clients on Generative AI projects and let them know up front where the potential risks are and how we plan to mitigate them. That way we’re all on the same page if we have to make changes to a project, for example.

As long as you’re serious about using AI responsibly, don’t think of it only as a shortcut, and aren’t afraid of having to cancel a project because the tech just isn’t ready yet, the best thing you can do is get involved and experiment. If we creatives don’t get involved, these models and tools will just be developed without our input, and then we won’t be able to influence their development.

IAA Benelux: How do you ensure responsibility over AI-systems used within your organization?

Legarth: We founded faith with the belief that AI used responsibly will be an accelerator of human creativity, and with a commitment to our four core principles: be transparent when AI is being used; be authentic and fact-check generated content; be compliant and break no laws; and be ethical, only using AI for good. They guide everything we do. VCCP’s legal team is also heavily involved in faith, and we have written a full Gen-AI policy that we adhere to across the business. It’s really important to us.

Beyond the business, we are working with industry bodies such as the Advertising Association to help inform regulation in the industry based on our learnings.

IAA Benelux: What do you see as the opportunities for the use of AI?

Legarth: A lot of the time the conversation turns to efficiency and cost cutting far too quickly. I think that’s way too limiting and, as a creative, it isn’t very attractive to me.

Creatively, I think there is a huge opportunity in personalised, bespoke content and non-linear brand storytelling. Traditionally, creating highly curated content is cost-prohibitive, but AI can help remove those limitations. This means we can create more engaging experiences for our audiences that still stem from the same brand world.

The Bubl Generator we created for O2 is a simple example of that. It allows us and our client to put O2’s beloved brand character, Bubl, in countless scenarios in a way that just wasn’t possible even a few months ago. This helps us create more salience for Bubl and be more playful with how the character is used in display and social media.

I think AI will help create much more immersive brand experiences, because not every part of the experience has to be pre-scripted and made by human hands. We will still be setting up the brand world and the overall creative direction for these experiences, but some of the details will change depending on the experience of the user.

IAA Benelux: Do you see the industry using AI for the collective good, and if so how?

Legarth: There is a lot of scrutiny on AI at the moment, and rightfully so, and I think most people in the industry are looking at campaigns and ideas that use AI for good. We’ve seen a few cases, like Cadbury’s Shah Rukh Khan-My-Ad, that used AI for social good, and I think we will continue to see more of it. We have a few projects on the way in this space, and it is very fulfilling to show how AI can actually be used for good, especially considering there is a lot of nervousness around the topic.

IAA Benelux: How are the regulations around AI, such as the EU AI Act, impacting your business in the EU and abroad? Do you see these regulations as opportunities or threats?

Legarth: I think the regulations are very reasonable and necessary; they mainly focus on banning malicious and risky uses of AI, which I think we can all agree on. Neither we nor any of our clients would want to engage in that sort of behaviour anyway, so it’s not really impacting us that much. The most relevant regulation to the work that we do is about transparency and disclosing when things are AI generated. And we are fine with that; it’s already part of our founding principles.

But it will be interesting to see how this evolves, as I am expecting at least partial use of Gen AI to become the norm in the industry very quickly. Just last year Adobe introduced Generative Fill, and it has been part of the official release of Photoshop since September. It allows you to inpaint or outpaint using AI, so technically that requires disclosure. Soon AI in some form will be involved in the creation of most assets, so we might have to modify this requirement.