Global Marketing
Management Consultants

Managing Marketing: How To Assign Value To Marketing With AI Models


Managing Marketing is a weekly podcast hosted by TrinityP3. Each one is a conversation with a marketing thought-leader, professional, practitioner or expert on the issues and topics of interest to marketers and business leaders everywhere. In this special series, TrinityP3’s Anton Buchner discusses the rise of Artificial Intelligence and the impact it is having on marketing.

Henry Innis is the Chief Strategy Officer and Founder of Mutiny Group, a team of data scientists, engineers and strategists that helps put the rigour and measurability back into marketing. He talks about how cloud computing and advances in deep learning models that sit within a neural network now help marketers to look forward and predict results, rather than viewing data as a retrospective exercise.

You can listen to the podcast here:

Follow Managing Marketing on Soundcloud or iTunes

Transcription:

Anton:

Welcome to Managing Marketing, a weekly podcast where we sit down and talk with marketing thought leaders and experts on the issues and topics of interest to marketers and business leaders everywhere.

I’m Anton Buchner with a special conversation on the rise of artificial intelligence and the impact it’s having on business and marketing. To discuss this I’m sitting down today with Henry Innis. Henry is the chief strategy officer and founder of Mutiny. Welcome, Henry.

Henry:

Thanks, Anton.

Anton:

Now before we jump in I know your background a little bit. We met I think first when you were at Edge.

Henry:

It was probably my first advertising job.

Anton:

In the world of content marketing—a big buzz and trend. You’ve been an angel investor and advisor. You’ve been through a couple of different agencies, VML, Y&R. I think you were on the STW High Performers Programme—hotshot—years ago.

Henry:

You’re making me feel my age now.

Anton:

But most importantly, I notice you’re the director of the Dangar Island Bowling Club.

Henry:

Ex-director, I think.

Anton:

Firstly, where is Dangar Island? And how does one at your age (you don’t look completely grey yet) get into bowling?

Henry:

When you’re on somewhere like Dangar Island there is not a lot to do so you kind of get caught. It’s a funny place where if you don’t play bowls you don’t play the local sport.

Anton:

That’s the social integration aspect?

Henry:

That’s the social integration aspect of Dangar Island, which is a little island, a slice of paradise on the Hawkesbury River about an hour north of Sydney. It’s absolutely fantastic. I try to go back there as much as work permits but it’s very much my home.

Anton:

That’s given you the skill set for marketing—aiming at a ball, bowling down and trying to knock others out of the way?

Henry:

I think it’s aiming the ball that’s going on a curve away from everything you want it to—that is probably a good analogy for marketing.

Anton:

Let’s follow that trajectory—sorry for the pun. I’d love to know your view on AI. We’re interviewing different business leaders and we’d love to know what is your view on artificial intelligence in the business or marketing space?

Henry:

I think it’s worth noting that a lot of the techniques we’re talking about today have been around for 30 or 40 years. A lot of this stuff isn’t brand new. It’s the type of techniques, mathematical and statistical, that we have used for decision analytics for years and years.

The difference over the past three years has been two things. One has been the rise of the cloud, which has changed this from a very time-consuming, laborious process to one where we can speed up the processing time of these models very quickly.

The second thing has been advances on another frontier: because we’re able to speed up that process, try those combinations and combine lots of different models very quickly, we can enable things like deep learning, where the technology selects the model and the statistical boundaries being used within it.

Whereas prior, we kind of had to retrofit and guess which technique was going to give us the best answer and try to set or test the statistical boundaries manually.

Anton:

It was a slower test-and-learn process, wasn’t it?

Henry:

Correct. What AI is linked to is less the techniques than the application of the cloud, which has sped up the time to insight so dramatically. Suddenly, we have access to a lot more insights and a lot more fluidity around them.

A great example of that is the world of market mix modelling, something we do extensively. The world of market mix modelling used to be based around a multivariate regression model where you had to set your own parameters: how long an ad’s effect would last and all those kinds of things.

That process would take, as you would know, a year, maybe 18 months to get that view. They were very long projects. And even then to take that view and make that view look forward was very difficult. It was a retrospective benchmarking exercise not a predictive exercise.

So, what we’re able to do now because of the cloud, as an example (and I don’t want to harp on about us too much) we can take all that client data in the cloud and build that into a deep learning model that sits in a neural network. And we’re able to process all of that client’s data. The model will set the statistical boundaries for us and then we can use that to forward look for clients.

As new data comes in the model updates and the client can then look at hypothetically mapping out the different points. So if I invest $500,000 what will the result be? If I reduce my budget by 100,000 in television what will the result be then and what will the impact on sales and brand metrics and other target variables be?

You could have done that kind of modelling 30 years ago but it would have been very, very hard and time consuming to do and you probably would have needed 400 FTE headcounts.
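The what-if queries Henry describes can be sketched in miniature. Here a plain linear regression stands in for the deep learning model he mentions, and all spend and sales figures are invented for illustration:

```python
import numpy as np

# Illustrative weekly data: TV spend ($k) and sales ($k) -- made-up numbers,
# and a simple linear fit standing in for the neural-network model described.
tv_spend = np.array([100, 200, 300, 400, 500], dtype=float)
sales = np.array([520, 610, 700, 780, 870], dtype=float)

# Fit sales = a * spend + b by least squares.
A = np.vstack([tv_spend, np.ones_like(tv_spend)]).T
(a, b), *_ = np.linalg.lstsq(A, sales, rcond=None)

def what_if(spend_k):
    """Predicted sales ($k) for a hypothetical spend level."""
    return a * spend_k + b

# "If I invest $500k, what will the result be?"
base = what_if(500)
# "If I reduce my TV budget by $100k, what then?"
cut = what_if(400)
print(round(base), round(cut), round(base - cut))  # 870 783 87
```

In practice the fitted model would be far richer, but the client-facing question is the same: evaluate the model at a hypothetical input and compare scenarios.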

Anton:

And that really started either in the research department or came out of the data/CRM department, where they said ‘let’s go away and do some market mix modelling amongst a bunch of other techniques’, which is fascinating because at the bottom of all of that it ends up being data.

So, it’s only as good as the data coming in to the model as to what you’ll get out of the model.

Henry:

100%—that’s one of the things that’s always overlooked in data and in whether or not businesses are AI-ready. Is your data verified? Is it trusted? Is it in the correct time series? Is it structured in a way that’s readable and understandable? Is it labelled correctly?

Is there consistency so that if you have one vendor executing something for you versus another, are they bringing that into the same consistent data set so that you at least have a clear view of your business?

Anton:

Which often you’re not seeing are you? You’re seeing completely different schemas or structures.

Henry:

That’s right. A great example of that is the schema a media agency will run versus a creative agency. Their schemas and their ways of looking at the world are probably completely divergent.

Anton:

So how are you solving that problem, or seeing a solution to that problem, through AI?

Henry:

That’s not an AI problem; that’s a human problem. AI is only as good as what you put in. If you’re a business looking to get into the AI space, you first of all have to go through the process of asking: is my data ready to have that applied to it?

And the way that you solve that is by taking an enterprise view on data and what you do and don’t want to solve for, what businesses you do and don’t want to look at, and what are the attributes you are looking at, and the factors and the variables?

You have to clean all of that up before you can even start to think about AI, which is, funnily enough why I think the best businesses that do this really, really well, have a very clear view on their enterprise level data and then they’re applying AI.

The businesses that get it wrong go ‘I want AI’ before they have an enterprise view on data.

Anton:

It has traditionally been very hard—the legacy data systems, databases, the lack of a centralised data mart for most silo-based companies—it has been very difficult to pull that into a single customer view.

Henry:

And again, I come back to the cloud here. The cloud is making great strides in that space. I look at the rise of AWS, Google Cloud (GCP as I call it), and Microsoft Azure, amongst others. They have an incredible capacity to unify data and be fairly agnostic about the data lake that is coming in, which is quite unique.

If you go back 3 years you didn’t have that. And that made it very hard to break down silos and things like that. Again, you’re talking about building blocks of AI. Firstly, it’s having an enterprise view on data. Secondly, it’s having a very strong cloud platform where you can bring those enterprise views on data together so they can be manipulated and you can put these models on the top of them.

You might notice I’m talking about AI as models. I’m not talking about AI in this generalist sense I think the industry likes to.

Anton:

I just want to cover that off. I’m glad you’re starting on data because we hear a lot of stories about starting with the tech and leading with technology discussions, which, as you’re saying, is really completely the wrong way around. You can have great technology but it’s only as good as the data coming in and if the cloud is allowing a better interpretation of all the myriad inputs then you can make sense of them.

Henry:

Exactly. And you have to be able to make sense of something to be able to apply something to it. The analogy I give is if your data strategy is not right it’s a bit like asking someone who speaks English to then go and translate Chinese or Latin with no prior knowledge. It’s almost impossible for any machine or human for that matter to make sense of it.

If your humans can’t make sense of your enterprise data how do you expect a model to make sense of it?

Anton:

So, there’s learning number one: still invest in the data strategy. We often ask clients whether they have a data strategy; sometimes it’s tumbleweeds, and others have one. So, it’s critical to identify, at a basic level or a complex level, what data you want to utilise and how you want to utilise it to achieve an objective.

Henry:

And the schema in which it is kept. The thing I find that is most underestimated in getting that data strategy schema right is what time series are you collecting in?

Anton:

What do you mean by time series?

Henry:

The frequency with which data is collected. The analogy I give here (and I don’t want to pick on media too much) is that if you’re only getting your media data back once a quarter, it’s very, very difficult to then go and construct complex models to see cause and effect. We always say to clients, be very conscious of that time series.

Make sure you’re getting things weekly if not daily. The reason why is obviously that variance is absolutely critical to understanding and modelling the different impacts of things. If you don’t have a lot of variance in data it’s very hard to understand cause and effect.
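Henry’s point about time series can be illustrated with made-up numbers: aggregating the same daily series to quarters leaves only four observations and almost no relative variation for a model to work with:

```python
import numpy as np

# Illustrative daily sales series with a promotion spike every second week.
# All numbers are invented; the point is what aggregation does to them.
days = 364
daily = np.full(days, 100.0)
daily[::14] += 400.0                          # fortnightly promotion bump

weekly = daily.reshape(52, 7).sum(axis=1)     # 52 observations to model on
quarterly = daily.reshape(4, 91).sum(axis=1)  # only 4 observations

cv = lambda x: x.std() / x.mean()             # relative variation
print(weekly.size, quarterly.size)            # 52 4
print(round(cv(weekly), 3), round(cv(quarterly), 3))  # 0.222 0.017
```

The totals are identical either way, but the weekly view keeps the variance that lets a model separate cause from effect; the quarterly view averages it away.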

Anton:

That goes to the age-old problem—we were talking about this earlier—of looking at data as looking backwards. Data is just a point where something has happened; an interaction pops up as a piece of data that can tell you something, but it’s looking backwards.

It’s saying someone has done something, clicked, commented, bought or whatever that shows up in the data base.

Henry:

Correct.

Anton:

The great challenge therein is: could I predict, based on all this interaction and data coming in daily or hourly or near real time, what may happen as the next logical step?

Henry:

That’s right, and that’s where these models become incredibly powerful if you get them right. The one thing we look for is whether a model has a low error rate—typically we’ll train a model on a subset of the data and then see if it predicts the rest with a low error rate.

And if it does that you know you’ve got a statistical likelihood of getting things right in the future. Models can then anticipate what will happen. That allows you to start to test hypotheses, which gives you licence to take more calculated risks.
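The holdout check Henry describes—train on a subset of the data, then measure the error rate on data the model has not seen—might look like this in miniature (synthetic numbers, with a simple linear fit standing in for the real model):

```python
import numpy as np

# Synthetic spend/sales history: sales = 400 + 0.9 * spend + noise.
rng = np.random.default_rng(42)
spend = rng.uniform(50, 500, 100)
sales = 400 + 0.9 * spend + rng.normal(0, 20, 100)

# Train on the first 80 weeks, hold out the last 20.
train_x, test_x = spend[:80], spend[80:]
train_y, test_y = sales[:80], sales[80:]

a, b = np.polyfit(train_x, train_y, 1)           # fit on the training subset
pred = a * test_x + b                            # predict the holdout
mape = np.mean(np.abs(pred - test_y) / test_y)   # mean absolute % error

print(round(mape * 100, 1))  # low error on unseen data => usable forward
```

A low holdout error is what gives the “statistical likelihood of getting things right in the future” that makes forward-looking scenario testing defensible.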

Another thing I would say about the marketing landscape in general is that science, and data science in particular, has always been about testing a hypothesis. It seems to me that a lot of the marketing industry uses data reductively rather than to test hypotheses: well, if we did this, what would the consequences be?

It comes back to that other age-old question of whether you are asking the right questions of your data. I think most people look at the data and go ‘what can we kill?’ rather than ‘what can we build?’ Other industries do this well—shipping, logistics, even finance to a degree (hedge funds are fantastic at this). The hedge fund industry is built off ‘if we make this calculated bet, what’s our risk profile?’

I think marketers have to look at hedge funds as the model for how to use data in a way that complements humans—let’s just take calculated risks and actually accelerate creativity. It doesn’t inhibit it. There is this great myth that data and creativity are enemies.

Actually, data can be the perfect argument—I know I’m meant to be talking about AI but data and AI are very linked. Data and the processing of it, which is AI, allows us to go to people like the CFOs and say, ‘if we make this bet this is your risk profile’. And having that more aggressive number-driven conversation which allows the industry to be bolder and braver is something I get really excited about.

Anton:

Yeah, I can hear it which is great; I’m sure our listeners can too. It touches on defining marketing as well before we get too deep into AI. A lot of people touch on the Comms side of it, which is really just the outputs, media.

Henry:

I’m so glad you’re talking about this by the way.

Anton:

But market sizing, potential or new market opportunity as well.

Henry:

And also things like pricing. Mark Saraph was a person who was very, very generous to me early on in my career, and he always used to talk to us about the 4 Ps in marketing: product, price, place, and promotion. A lot of where we focus our efforts is on promotion.

But understanding promotion in the context of place, price and product is also quite critical, because if we don’t understand whether it was the discount or the creative that drove the behaviour change, it’s going to be very hard for businesses to make better decisions around marketing.

Anton:

And we’ve seen quite a bit in pricing AI tools or models and solutions being used for pricing—price testing maybe.

Henry:

Price optimisation, yes. But the impact on price of say creative performance and things like that—linking those two things together is really, really important because if you go to market with a huge amount of discounts and also a brilliant emotional storytelling ad that’s hard-hitting that you put $100 million of TV behind—was it the $5 off that gave you the result or was it the beautiful ad?

It is very hard, unless you’re combining those data sets into a comprehensive view, to model that effectively. What tends to happen is that everyone is discounting down here and everyone is saying spend more money on the TV and ad production over here, and no one is going ‘how do these things work together?’

Anton:

Is that attribution modelling at the end of the day?

Henry:

I hate the word attribution.

Anton:

I’m glad you do too.

Henry:

Attribution is (I’m trying to be polite to your listeners) fudge mathematics. The one thing we always talk about at Mutiny is propensity, not attribution. It’s about modelling the propensity of something, or the probability that it had an impact. That’s why you use things like Bayesian statistics in your model, whereas attribution is all about touch.

It’s all about what someone touched and assigning credit based on that. To my mind, when we’re looking at how to link the probability that something had an impact, we like to talk about it in the statistical sense—propensity or impact within the model—rather than what someone touched. SEM and retargeting are classic examples of that.
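The contrast between touch-based credit and a propensity view can be shown with a toy example (all conversion numbers are invented): last-touch credits the channel with every exposed conversion, while a propensity view asks how much the probability of converting actually moved:

```python
# Toy contrast: last-touch attribution vs. an incremental/propensity view
# of the same hypothetical retargeting channel. Numbers are illustrative.
exposed_conv, exposed_n = 90, 1000    # users who saw retargeting, then converted
control_conv, control_n = 85, 1000    # similar users never exposed

# Last-touch attribution credits retargeting with every exposed conversion.
last_touch_credit = exposed_conv

# A propensity view asks: how much MORE likely was conversion, given
# exposure, than it would have been anyway?
p_exposed = exposed_conv / exposed_n
p_control = control_conv / control_n
incremental = (p_exposed - p_control) * exposed_n

print(last_touch_credit, round(incremental, 1))  # 90 5.0
```

In this toy case the touch-based view credits the channel with 90 conversions, while the propensity view suggests only about 5 were incremental, which is exactly the kind of gap Henry is pointing at with SEM and retargeting.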

Anton:

It’s the trade-off between do I spend on a big brand ad or do I do some sort of promotional offer down at the end of the funnel—which one converted? And we all know you need to build some awareness to start with before anyone knows about a product or service or a cause.

But it has been the great marketer’s challenge—not being quite sure how much to spend where—and then of course the digital world has made it so easy to see that last touch and action, so people can go ‘I can prove my search works’.

Henry:

I think Google and Facebook have been marking their own homework. There’s a reason they like attribution and that they haven’t pushed the statistical modelling side of things. Google could have come into this market mix modelling space and blown everyone away. The reason they haven’t is because 97% of their business and revenue comes from search, and it’s convenient for them to have an attribution model.

They could blow our business out of the water in 10 minutes if they wanted to. They never will because it will compromise what they’re trying to do.

Anton:

Let’s not get too Bayesian-statistical. Tell us a little about what you’re doing in the space.

Henry:

Mutiny—we call ourselves a predictive growth agency. We’re kind of two businesses mixed together, which is the best way to explain us. There’s the product side, which we’re very focused on, where we build products to process data effectively, and then the services side, where we help people get the right structure to leverage products and AI and build that capability for themselves.

I’m very much obsessed with capability in this space for clients, rather than doing campaigns, projects and things like that, because I think embedding this capability for clients is going to mean two things long term.

What we’re seeing with clients is if you get the AI prediction stuff right they might be able to reduce FTE headcount on the boring stuff. Data analysts who are just sitting there compiling spreadsheets and crunching. We can start to shift that into far more thinking and creativity and push the dial in that kind of direction.

We do a lot of that kind of work with clients; getting the structure right, we help them with market sizing, we help them understand where opportunities are, all leveraging data.

Anton:

Capability is a really interesting area. We could probably do a whole podcast around it but I’d love to know a little bit about your capability and how you got your knowledge in the space. Because it’s early days. Then let’s talk about the marketer’s side where there is a lack of capability and skills.

Henry:

I’ve always been interested in this space. To go back a long way, the first problem I solved with AI was in World of Warcraft, when World of Warcraft first came out.

Anton:

You’re a WoW guru, are you?

Henry:

I wouldn’t say a WoW guru, but I got so bored with levelling up the character—it was one of the most boring and frustrating things to do—that I got into learning Python scripting to automate the character to follow the best path to level up. I ended up levelling up all of these characters and getting banned from World of Warcraft. We got away with it for about five weeks before they realised what we were up to.

Anton:

Any kids under the age of 18 listening to this, please do not do this. That’s not our target audience.

Henry:

That was my first interest in the space. Obviously, I’ve come through the agency background. What I really loved was understanding how you could leverage data to make really smart decisions for outcomes for the clients.

As a general rule I saw the opportunity to use data to build smarter and more predictive programmes for clients to be more precise about their investment and how they grow. I guess, from a personal angle, I’ve always seen agencies as an angle to grow my own ambitions in this space so perhaps I was never perfectly aligned to the core business of the agencies I’ve come through.

But I think that’s been a blessing and a curse when I’ve worked with them. I’ve always brought a very new perspective on things but, at the same time, I’m never going to be the creative director making the big TV ad, nor have I wanted to be.

Anton:

And are you looking at working closely with the tech vendors, off the shelf, or are you looking at building your own tools and solutions?

Henry:

We build a lot of our own models but they plug in natively to the AWSs, the Azures, GCPs and that kind of thing so that’s never an issue for us. We work very closely with cloud vendors. We know and understand how to work with those cloud vendors very, very well.

I think when it comes more specifically to the Salesforces and Adobes of the world, we do do a bit of work on their stacks. We don’t work with them as partners because we don’t see ourselves as needing to operate in that space; we see them as holding and hosting data for us.

Fundamentally, those tech, martech vendors have effectively standardised a lot of the tables, measurements, and structures that data is held in that martech space.

Anton:

We’re also seeing this challenge, or opportunity, of the tech companies leading the conversation and selling in a great solution which, to your point about capability, marketers may not be quite ready for yet. So, we have this over-hype or over-selling of what the opportunity is.

Henry:

We have this great saying at Mutiny, one of our little trademarks: ‘we don’t like martech; we like smart tech’, which is when the people match the technology and processes and they’re getting the best out of each other and complementing each other.

And going back to that issue of capability, that’s having the right resources and people who understand not just how to implement technology but how to strategically align it to the business and its needs. If I look at most of the technology landscape, I see there’s a lot of implementation happening. People are doing that really well.

But understanding how to align that data stack—what the technology’s capability is with regard to data, activity, learnings and insights—with a business’s needs is much rarer. So, how can we test our brand ad before it goes out to make sure it’s hitting the mark? How can we make sure something is communicating effectively? How can we make sure that we’re capitalising on our audiences in an effective way? I don’t see a lot of people articulating those strategic pieces of technology beyond the existing feature sets. And that’s a real opportunity for marketers.

Anton:

I agree. Maybe share some of the conversations you’re having with the marketers you interact with who have huge varying degrees of capability in this space.

Henry:

It ranges. We’ve done things like enterprise data strategies; we’ve done a number of those. Aligning data across the business, whether it be supply chain and operations all the way through to marketing, HR, people and compliance—we’ve worked with a lot of marketers to get that right.

We’re working at the moment with one very large marketer to use 1st party data to activate their promotional engines in a very interesting and unique way—putting a very clear model in place to learn when certain internal factors are having an impact and when certain things are happening in the market, and getting the right promotional offer into market. That’s had huge impacts very early on and has been very effective.

The other good thing about that model is it’s reducing headcount significantly around tactical campaigns, which means they have to invest less in that and focus more on the big great creative things. We’ve done a lot of market sizing. One of the big areas is how do we better understand the data and the profiles of people that are in market, the yield, price points, profitability and the EBIT contribution that might exist and then using systems to prioritise, build up and cluster different profile groups.

Anton:

So, that would typically sit in a business. Typically a marketer or business would use an agency because of the capability gap—let’s use an external supplier to plug that gap. What you’re talking about sounds to me like a lot of business challenges and opportunities now being solved by agencies (external partners). Is the conversation coming out of marketing or CFO?

Henry:

We typically have that conversation with the C-Suite. I wouldn’t say that Mutiny is a particularly marketing-led business. We’re a data-led business. We focus on understanding data in relation to business problems and growth, not data as it relates to advertising and marketing comms.

There are a lot of businesses who do plug these capability gaps. These capability gaps have historically been filled not by agency groups but by McKinsey, Bain and BCG. But the challenge with those businesses is that the talent profile they hire is typically MBAs—really smart people.

One of my favourite quotes is ‘software is eating the world’ and the models that those MBAs used to construct and spend a lot of time analysing and building can now be built faster, better, and cheaper by software engineers. That’s the approach we take. We’re competing in their capability gap but rather than competing with MBAs, we compete with software engineers.

Anton:

You’re getting a much more real market life application as opposed to theoretical.

Henry:

Correct. And one of the things we like to do, which leads to the technology part of the business and something I think the market needs to shift towards, is to invent models and shift them to an always-on functionality, not a once-off functionality.

So, you’ve got an engine there constantly producing decision analytics, and that’s a really critical part of getting this mix right. If you build a model once, say in 2019, and it’s still working 5 years later, you can amortise that cost over those 5 years for the CFO or CMO. And it’s still delivering value to you 5 years on.

That’s an incredible benefit of taking an AI-led approach: you can start to build a lot of the decision engine in a way that it becomes a capital asset rather than an operating asset.

Anton:

Yeah, and then seen as real business value not just a shiny, new flashy toy of marketing.

Henry:

Exactly and that’s where we try to shift the conversation to—don’t focus on implementing this new technology thing that’s going to be flashy and things like that. Focus on building a model that’s robust enough to last for 10 years on a data set that’s going to be able to predict and anticipate decisions for you and help you run your business at greater speed.

One of the other key things I love to talk about is what a good AI-enabled organisation looks like, and I put it all down to decision-making. There are two axes: speed of decision and accuracy of decision. If you’re high on both, typically the machines are making lots and lots of the decisions for you, and the critical decisions are being made really, really quickly by humans.

If you think about what does AI deliver to the wider organisation in terms of value, it is speed to decision and accuracy of decision and getting those things better and better. When you go out into the market that then gives you a decisive competitive edge because you can move faster than your competition.

Anton:

Fast and accurate—that’s been the ultimate challenge hasn’t it? There’s been a huge promise of AI crunching massive amounts of data, whether it’s machine learning or deep learning to bring out better patterns. And those patterns need to tell you something so there’s got to be an insight.

Applying those insights to some sort of strategy that can then be tested quickly and in real time, whether it’s comms or some other marketing or business opportunity, is almost nirvana. That’s what you’re aiming for: speed and accuracy. I’d love to know some of the stumbling blocks. What’s failing? What have you learnt along the way?

Henry:

The number one thing is the underlying data. That is the number one underlying issue we run into every time. Is the underlying data correct? Is it accessible? Is it collected in the right time series? Rarely is the model the problem—I’d say only 10% of the time.

Anton:

So how do you handle that conversation with business leaders and the C-Suite when—even if it’s in the cloud—the data structure is wrong, or you can only utilise it to 80%, or whatever it might be? How are you overcoming those conversations?

Henry:

We say, ‘you have to take an enterprise view on data’. You have to restructure your data. Data transformation isn’t necessarily like tech transformation in the cloud. It can be done: as long as you have the data in there, there are ways to restructure it relatively quickly.

We try not to make it a 50-person conversation. The other way to do it is to chunk it down by department. Once it’s in the cloud, getting alignment by department, getting that data right and understanding how it all fits together is an easier way to do it, and then stitching it together through the tech.

And letting the tech come over through the silos is sometimes a simpler way to do it than trying to get everyone to agree what the master data table will look like.

Anton:

Fascinating. An old acronym—SHISHO—shit in, shit out. It hasn’t changed since the 90s.

Henry:

The only thing that’s changed is pace. That’s where we try to get to market and where the opportunity lies. If you can speed up the pace and accuracy of your marketing decisions, that’s going to help a lot of people.

It’s also worth noting that sometimes data tells you what to keep consistent. There is a concept called creative decay: the more you change out big brand creative messages, the worse it gets, so the half-life of your creative keeps shrinking.

You can model that in the data and sometimes data can be used to test a different hypothesis. So, rather than us having to constantly optimise and change things, which seems to be what the AI cowboys are always saying, sometimes it will tell you that you don’t actually have to change things. And that’s an important part of it too; it’s about testing hypotheses.
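One common way to put a half-life on media or creative effects in this kind of modelling is a geometric carry-over (“adstock”) term; a minimal sketch, with the decay rate assumed purely for illustration, looks like this:

```python
import math

# Geometric adstock: each week's effective pressure is this week's spend
# plus a decaying share of last week's pressure. The 0.7 decay is assumed.
def adstock(spend, decay=0.7):
    out, carry = [], 0.0
    for s in spend:
        carry = s + decay * carry
        out.append(carry)
    return out

# A one-week burst keeps working for weeks afterwards...
burst = adstock([100, 0, 0, 0, 0])
print([round(x, 1) for x in burst])   # [100.0, 70.0, 49.0, 34.3, 24.0]

# ...and the implied half-life falls straight out of the decay rate.
half_life_weeks = math.log(0.5) / math.log(0.7)
print(round(half_life_weeks, 2))      # 1.94
```

Fitting the decay parameter from the data, rather than assuming it, is what lets a model say whether a message is wearing out or still carrying its weight.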

Anton:

So, if you’re always on, almost in real time with these powerful machines, there is a great tension between long-termism and short-termism—the pressure marketing and business leaders are under just to get sales. I wonder whether AI is helping solve this or actually making it worse, so that we focus far too much on immediacy.

Henry:

It’s an interesting one. I think that’s driven by the digital economy and how that’s structured more than anything. It’s driven by what I call the click economy. We’re all looking at CTRs and last click attribution and all this other stuff that simply doesn’t matter that much.

The classic example is the one Brent Smart spoke about: he switched off a bunch of his retargeting and nothing happened. That doesn’t surprise me in the slightest. I suspect that if the right data model had been applied, it would have come out very quickly and clearly that it wasn’t actually making a huge impact.

Certainly, once he’d switched it off, the model would have picked up that it made no impact and would have started to report a negative ROI on SEM. I think there is an important component of having the right model to understand the things you’re talking about.

A lot of the target variables (the things you measure yourself against when you’re talking statistics) have been very focused on a click or last-click attribution, so that’s where the market naturally focuses. And then you put AI on top of that, it accelerates the problem, because you’re making those decisions faster based on that flawed data.

So, it’s a faster but more inaccurate decision; it speeds up the inaccuracy, and that’s why it’s so important to get your target variables right. One of the things we solve for is looking at the key brand metrics across a three-year period and building them into a model.

You can actually then start to say to a model ‘I want to get an optimum amount of sales whilst maintaining my brand metrics above this. What’s my best mix?’
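That question to the model is a constrained optimisation: maximise predicted sales subject to a brand-metric floor. A minimal sketch, assuming hypothetical two-channel response curves (the coefficients, function names and budget are invented for illustration, not taken from any real client model):

```python
# Grid-search a two-channel budget split that maximises predicted sales
# while keeping a predicted brand metric above a floor.
import math

BUDGET = 100_000  # total spend to allocate, in dollars (illustrative)

def predicted_sales(brand_spend: float, perf_spend: float) -> float:
    # Diminishing-returns curves; coefficients are made up for illustration.
    return 400 * math.sqrt(brand_spend) + 650 * math.sqrt(perf_spend)

def predicted_brand_metric(brand_spend: float) -> float:
    # Assume a brand-health score that rises with brand spend, with
    # diminishing returns. Entirely hypothetical.
    return 20 + 0.12 * math.sqrt(brand_spend)

def best_mix(min_brand_metric: float, step: int = 1_000):
    """Return (sales, brand_spend, perf_spend) for the best feasible split."""
    best = None
    for brand_spend in range(0, BUDGET + 1, step):
        perf_spend = BUDGET - brand_spend
        if predicted_brand_metric(brand_spend) < min_brand_metric:
            continue  # this mix violates the brand-health floor
        sales = predicted_sales(brand_spend, perf_spend)
        if best is None or sales > best[0]:
            best = (sales, brand_spend, perf_spend)
    return best
```

Raising the brand-metric floor forces spend toward the brand channel and can only reduce the achievable sales number, which is exactly the trade-off a marketer would put in front of a CFO.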

Anton:

It’s being very clear about not only what are your objectives but what are your success measures.

Henry:

What are your success measures? How are they tracked in the data? How are they tracked in a regular time series? Then you leverage that to accelerate your understanding: ‘I need to keep these elements consistent and strong in order to succeed, and I need to change this stuff down here quite frequently, whether that’s promotion, conversion, discounting or price.’

It’s understanding how those elements work together, which is going to give us the defence and also the confidence in the industry that we don’t need to change everything all the time. Actually, sometimes the data is going to tell us to keep something for 5 years and we’ve seen that with a client very recently.

Anton:

Yeah, confidence is a great word. There seems to be a lot of distrust in the AI space because we don’t know how the algorithms work.

Henry:

One of the great things AI can start to address is this: when a CEO sees sales flagging, the first thing they do is look towards marketing and say, ‘we need a new brand campaign’. If marketers had an AI model sitting there with the right data, they could easily respond, ‘if we change our creative like this, it’s going to have this impact’.

Anton:

You can pull the levers and play with different scenarios.

Henry:

Exactly. And then the CEO will go, ‘OK, I don’t want to do that, so what do I do?’ ‘Well, if you give me another $100,000 in this particular channel and $20,000 to fix the performance of this channel’s offer, then we’re going to be able to lift sales by this much.’

So, it starts to change the conversation. With a lot of these tools we can start to equip CMOs to confidently have discussions with CFOs and CEOs.

Anton:

It’s powerful. It gets the CMO back at the table with the C-Suite.

Henry:

100%. We believe the opportunity in data and AI around marketing is to get that seat at the table back, and to use it to accurately forecast growth in a way that shows the CFO and CEO how best to invest in it.

Anton:

Fantastic. Yazz and the Plastic Population sang a song called ‘The Only Way Is Up’. I get that sense with you. Henry, thank you for coming in; we’re out of time. I just have one final question: when you’re playing bowls on Dangar Island against a robot, who would win?

Henry:

The robot.

Ideal for marketers, advertisers, media and commercial communications professionals, Managing Marketing is a podcast hosted by Darren Woolley and special guests. Find all the episodes here


Anton is one of Australia's leading customer engagement consultants, with an eye for discovering greater marketing value and a love for listening to what customers are really saying about a brand. Anton has helped take global and local businesses, including Microsoft, Nestlé, P&G, Gloria Jean's, Foxtel and American Express amongst others, to the next level. Check out Anton's full bio here

We're Listening

Have something to say about this article?
Share it with us on Twitter, Facebook or LinkedIn
