Managing Marketing: Navigating The Complexity Of The Marketing Technology Swamp


Three guests from TrinityP3 join to help navigate the marketing technology swamp: Dr Kate Gunby, Senior Consultant; Anton Buchner, Business Director; and Adrian Treahy, Senior Consultant.

There is no denying that technology continues to disrupt business, particularly marketing, media, and advertising.

The explosion of digital advertising platforms, the rise of e-commerce, programmatic advertising, the associated big data, and now artificial intelligence are all providing marketers with challenges and opportunities in an increasingly complex world.

We discuss how marketers and their agencies can navigate this complexity and what emerging practices we see to address the increasing demands and expectations being placed on marketers.

You can listen to the podcast here:

Follow Managing Marketing on Soundcloud, Podbean, TuneIn, Stitcher, Spotify, Apple Podcasts and Amazon Podcasts.

Seventy-two percent of people are fooled by statistics, but sixty-eight percent of statistics are made up on the spot.

Transcription:

Darren:

Hi, I am Darren Woolley, founder and CEO of TrinityP3 Marketing Management Consultancy, and welcome to Managing Marketing, a weekly podcast where we discuss the issues and opportunities facing marketing, media, and advertising with industry thought leaders and practitioners.

If you enjoy the Managing Marketing Podcast, please like, review, or share this episode to spread the words and wisdom of our guests each week.

Now, there’s no denying that technology continues to significantly disrupt business, and particularly marketing, media, and advertising. The explosion of digital advertising platforms, the rise of e-commerce, programmatic advertising, the associated big data, and now, artificial intelligence are all providing challenges and opportunities to marketers in an increasingly complex world.

How can marketers and their agencies navigate this complexity, and what are the emerging practices we see to address the increasing demands and expectations being placed on marketers?

To help us navigate this marketing technology swamp, please welcome to the Managing Marketing Podcast from TrinityP3, Kate Gunby, Senior Consultant. Hi, Kate.

Kate:

Hey.

Darren:

Thanks for joining us. Anton Buchner, Business Director. Welcome, Anton.

Anton:

Hi, Darren. Thanks for having me back.

Darren:

And last but never least, Adrian Treahy, Senior Consultant. Welcome, Adrian.

Adrian:

Hi, Darren. Hi, team. Good to be back. Good to have a good conversation about this topic today. Looking forward to it.

Darren:

Well, it’s interesting, isn’t it? Because I think it’s human nature that when you’re confronted by something that’s complex, the first thing we want to do is either run away or try and reduce it down to its simplest form.

If someone can give me an A or B option, and I have to choose, that’s much easier than actually dealing with complexity. Has anyone seen examples of that recently? Of falling for the A or B simple choice? Yeah, Anton?

Anton:

I think it’s a great question, Darren, and I love that you started with the swamp as a word there, I noticed. I think it’s never been a bigger swamp. I think for all of us, this shift is pretty seismic.

The only comparable shift I can think of is probably the digitization that happened back in the early nineties, where we went from paper to digital. That was a seismic shift, everyone got excited, and there was a truckload of technology that came out of it. But that was quite one-dimensional.

I think now we’re at a totally seismic shift in the industry, so the appeal of just an A and B might be a result of it all being too bloody complex. So, I’m going to fall for this solution or that solution just to do something. But what we’re also seeing is paralysis. I think people are starting to go, I just don’t know what to do these days because it’s become too complex.

Darren:

It is a very good sales technique though, isn’t it? And Adrian, you’d see that in technology. If I can take a complex issue that my customer’s facing and reduce it to a simple solution, the silver bullet of technology makes it an easier sale, doesn’t it?

Adrian:

It certainly does, and I think probably the biggest trap is organizations feeling like they can just buy something in a nice shiny packet in a box, put it in, and it will have all the answers. And particularly with AI now, it’ll do everything; they don’t even need to think, particularly. They just need to tell it what to do and it’ll deliver.

I think it’s interesting, particularly coming from the Government sector and some of the bigger areas, people just underestimating the cost and complexity of legacy. It’s the legacy stuff that’s really killing organizations and causing the issues, because even if you go out and buy the great shiny thing, somehow, you have to integrate it in with all your sources of data, which are all siloed and that continues.

The smaller organizations are the lucky ones because they can manage that complexity much more easily. The bigger the organization, the more brands they’re trying to manage, the more they’ve done through M&A processes, the more complex the whole environment becomes. And I think it’s just getting worse, not better.

Darren:

Yeah, Kate, we’ve seen the rise of marketing science and a lot of that’s been driven by this huge amount of data that Adrian was referring to. But it’s not as simple as one plus one equals two, is it?

Kate:

It never is. And if you think about the fragmentation we’ve got, all the choices we have, the ways you can measure it have exploded even more. So now, trying to make decisions from that data, and even understanding whether that data makes sense in the first place, is a task that is just phenomenally large compared to what it used to be.

Darren:

Yeah. Anton, you made a reference before to digitization. I’m old enough to remember when the promise of the paperless office was the big thing that digitization was going to deliver. And I sit here surrounded by bits of paper, so there’s one promise that was never delivered.

What’s the other promise that you think is not being delivered?

Anton:

Well, I think the great promise was that we’d have all this data, what Kate touched on. That data would be powerful, so the more information you have, the more powerful you are. What we’ve seen is the more data you have, often the more difficult it is to find a really clear insight or really interesting piece of information that’s relevant to act on.

I love telling this story that if you look at all the data in the world, it’s a big number (whether we know this number or whether it’s exact or not, I’m not sure) – but it’s 150 zettabytes of data. That means absolutely nothing to me. And then apparently in a couple of years’ time, that’s going to double, so 300 zettabytes.

So, I put into ChatGPT: how many Olympic swimming pools would that fill? And it came out with 12 million Olympic-sized swimming pools in terms of volume. So, that’s the amount of data we are capturing in this footprint globally, 12 million Olympic-sized swimming pools, soon to be 24 million swimming pools.

So, we’ve got a truckload of data, but there’s a big question mark over what’s most relevant to your organization and how you can use it, how marketers can use it better.

Darren:

And we had that period of time where all of the big consulting firms and tech companies were talking about this promise of big data. But you can capture data — how do you know which data is worthwhile, how do you filter through it? Kate?

Kate:

I think it’s a really interesting point. When I first started capturing data from a media agency perspective, I was told by some of the people I was working with that that’s the quickest path to failure. If you spend all your time trying to capture data and get everything perfect, you are on a thankless task that you will never succeed at.

Darren:

But we have worked with clients where one of the problems is they’re blaming the technology they’ve bought (and Adrian, you probably have a view on this). But in actual fact, it’s the old technology saying: garbage in, garbage out. Often, it’s actually the data that’s being poured in there.

I remember an insurance company that said they were getting their data cleansed, and under address line one, which had been entered in one of the retail outlets, it said, “Woman in red dress by door” as the first line of the address panel. So, how do you get clean data? How do you get data that you can actually use?

Adrian:

Well, I think that’s the problem: people, picking up on Kate’s point, are at some level looking for perfection. And I think the other issue is that people are looking at technology as the thing that’s going to solve this, and they’re not putting a human lens across things.

And it’s really important to have that interaction, that human experience side of things, overlaying with the hypothesis that you might be making, and then looking for evidence that will back that up. You don’t actually need to have perfect data to make that work, but you need different ways of working.

So, it’s the cultural shift and the ways of working shifts that organizations need to get their head around, stop thinking that you can just go and buy a piece of tech that’s going to actually achieve this.

So, it’s like I’m going to go and buy the fastest car in the world and therefore, I’ll win Le Mans. Well, it’s just not going to happen. The machine’s there as a prerequisite, but it’s all of the other pieces of the puzzle that need to actually go in to make that a success.

And that’s the hardest thing, I think, for many organizations: to really look at how they build multidisciplinary teams, how they bring all of the things together to have matrixed business models, get away from hierarchical structures and from just saying, “It’s someone else’s role to give me data,” and when the data’s no good, “Well, it’s their problem that it didn’t work.”

It’s too simplistic. Back to your point, Darren, they’re looking for simplistic answers to complex things, and it’s just not the right way to achieve it, I think.

Anton:

Yeah, I think you’re right, Adrian. I think it’s looking in the wrong window. We often talk about that.

So, the three of us worked on a project recently with one client where they were trying to prove commercial value. So, we helped them look in that lens and said, if you look at all the activity, the marketing activity you’re doing, which areas can you track through to commercial impact?

And that was a great first step, to start to go: well, all this social content in walled gardens is going in one direction. And if it’s driven through to e-commerce, it’s one tiny percent of the commercial value. But you have no idea how much content’s being created, what the impact of that content is, the teams you’ve allocated both internally through insourcing and in-housing, versus the agency resources they’re using.

So, it begged the question: can we use commercial value as a guiding measurement? And that allowed them to prioritize some of the activity they were doing, and to deprioritize and stop some of it.

And to Adrian’s point, there was another project we were working on, Adrian and I, where data quality was the issue. They started by saying, “We need a new CRM system, we need a new digital experience platform. We’re thinking of Adobe.” So we started with a tech discussion, and it soon came back to the quality of the data flowing through to the different systems actually being quite poor.

So, for them it was a question of how you prioritize the key elements of data. And there were probably only five or six key elements they actually had to prioritize to do marketing. But back to your point, I think people are getting swamped with the mass of data, the silver bullet, and that’s very outward looking versus the customer.

As you touched on, Adrian, it flipped the whole discussion into the type of customers you want to win and the type of customers you want to keep, and how we could communicate better with them.

Darren:

Adrian?

Adrian:

Yeah, I think it’s also the size of the problem, back to the 24 million swimming pool swamp. I think that’s a nice analogy between swamps and swimming pools. It’s such a big thing. I think part of it is to reframe it so that people can understand how they can build evidence at a small scale, with proofs of concept.

Build up the proof of concept, show that it works, show how you can take the data that you have and cleanse it as best as possible. Stop looking for perfection; it’s never going to happen, you’re always going to have errors and issues.

But it’s about how you build up those case studies that can then be used to evangelize through an organization, and build up the business cases for the tech and the ways of working and all the things that have to happen, without having to invest huge amounts of money and effort and energy to make it happen.

So, I think it’s really about how you actually … and I think some of the things we were doing, Anton, with those clients, were really about finding small nuggets and then building a business case around them, where they could actually start to feel that there’s some change.

Nobody likes feeling that they’re up against this monumental task and nothing ever changes no matter what they do. So, I think you have to reframe that at some level. And then think of it in bite-sized pieces.

Darren:

Yeah. Kate, one of the issues is that we get data from lots of different sources. All of the platforms offer data to clients, to advertisers and their agencies, and clients are capturing data through their own websites and other places. But how do you actually get all of that to line up? Before you can analyze data, you need to find some way of aligning it, don’t you?

Kate:

I think you do, and it’s again another great point. And there’s been lots of discussion of the dashboard era we’ve been through, the time people have spent, and then, “Oh, here’s another incorrect dashboard.” And you can go down that mine, because different technology will register in different ways and it won’t line up.

And I think you have to sort of prioritize and understand what you believe in and what you use to make what decisions. And it’s data, so we want it to be right. And there’s a natural instinct to feel, “Ah, that’s correct. I can trust that now.” But we have to sort of start feeling a bit comfortable with the fact that as long as I’m making a better decision and I’m heading in the right direction, it’s useful data even if it’s not perfect.

Back to Adrian’s point, if you start trying to line all of those things up again, you can end up solving no problem at all. And you can spend a lot of time in that phase. I think one of the biggest problems I’ve seen over the years is you can install a new system, spend millions of dollars, six months in the implementation stage, another six months kind of onboarding everyone; two years in, you’ve still not made any decisions with it.

And so, in terms of the business case, going back and saying, “Look what my shiny new tool did,” it’s not got you anywhere. I mean, often a CMO at that point has said, “Look what I installed,” and run off to their next job, and we start the process again. So, it does take discipline and rigor, which is something not everyone’s got the patience for.

Darren:

Yeah, but one of the problems seems to be that these technological solutions for marketing come on the basis of “implement this.” I think it was you, Adrian, who mentioned the test and learn: do a small-scale test, and then scale it up.

But most of these tech platforms are about plug it in, pour everything in, and suddenly you’ll start getting results. There’s not a lot of opportunity in that approach of applying technology to actually do a small test. I can’t see it. Yeah, Adrian.

Adrian:

You’re spot on there, Darren. Again, it’s when you’re implementing a platform. I’m thinking of a client I worked with recently who was smart enough to know: look, we want this new AI analytics platform that’s going to do all this wonderful stuff for us, but in the meantime, how do we start small?

It was actually making sure the vendor had the resources on the ground, the teams on the ground, to bring in the people who could coach the teams and develop small business cases that were actually successful. Rather than just a sales rep selling you something and dumping it at the door, which is then your problem.

So again, I think part of it is making sure that when you’re selecting platforms and pieces of tech, you have a realistic view of how you can build the pace slowly, how you can get the knowledge to use it. Because we’ve all seen platforms that come in.

In fact, Anton and I were looking at a client just this week going, well, yeah, you’ve got a problem, you bring in a platform, but does just having a platform mean that people have the capability to use it? Do they understand what it can do, how they can integrate it into their ways of working, into their data and their tech stack?

So, there’s so many pieces to that puzzle that they need to consider in an effective way. But the one thing that’s really important is making sure that when they’re talking to vendors, they make sure that vendor is there holding their hand and accountable for results, not just delivering a box.

Darren:

I was just thinking while you were saying that, they perhaps should embrace the printer business model: make the technology cheap and then make the support cost, i.e. the toner, horrendously expensive.

Here’s the technology, which is cheap, but you’ll need all this training, which you pay for, to work out how to use it properly and actually implement it, because that’s so important. Anton?

Anton:

I think that’s probably the key point, isn’t it? The work we do is shifting the conversation away from the tech and asking, “What is your strategy?” And in that discussion alone, we’re unearthing: what do you actually mean by AI? What do you mean by customization? What do you mean by personalization?

So, I think one of the challenges in the industry that we’re seeing is that all these buzzwords are thrown around. And everyone has a slightly different view of what they actually mean, which is fine, but for businesses to align, it’s getting that definition on the table and saying, “Okay, for our lead generation, we generate leads off our webpage or off a mobile app or off something else, and this data goes into a structured system, and that system talks to some sort of engine that can automate a message out, whether it’s by mobile, by email or something else, or on the fly in the platform you’re in.”

So, just that discussion starts to get to: right, we understand what you’re talking about, and we’re aligning it back to your strategy. Aligning to a strategy sounds really simple, sounds 101, but that’s probably one of the biggest areas and challenges all three of us are seeing: what is the actual strategy? Because tech’s only the enabler.

I think everyone says that, and we read it in most articles: AI is just another enabler. It’s steroids, it’s awesome. And look, there are a couple of companies doing it really well. I’ll throw one out, because my daughter is 12 and into beauty.

MECCA, just as an example: their POS system, the data they’re capturing and how they’re utilizing it for marketing is next level. And they’re creating experiences a bit like what Apple did with the Genius Bar, which is 10, 15 years old now.

But bringing girls and women in for beauty sessions, keeping that data and information on them in a database, and then targeting and drawing you back in for VIP events and exclusive events, with totally relevant, tailored, customized personal information. That’s a great example of a company doing it really well. And to your question to Kate, they’ve seamlessly integrated the systems in the back end.

Darren:

Yeah, Kate?

Kate:

I was just going to say I’ve also noticed recently people doing it very badly.

I’m not big on subscription services, but I have recently subscribed to Binge, and my cheap rate has come to an end, and I keep getting personal “offers just for you,” which want me to sign up to Kayo to watch sport.

Well, anyone who knows me would know I have never watched sport on their platform. It’s been more The Great British Bake Off, truth be known. But they think it’s a personalized offer, because why wouldn’t I want that?

Darren:

Yeah. Solidarity sister. I absolutely agree, that’s why I dropped Binge. But Amazon in the early days was considered a great example of capturing data, and then offering you things.

But I found this huge flaw in it: I’ll go online and buy a gift for one of the boys’ female friends at school, and then suddenly they have me down in a group as having sons and daughters, and keep offering me lots and lots more female-gendered toys that I wanted as a one-off.

There is an element of all this that technology still doesn’t seem to have embraced, which is we are human beings, and we are complex, and we are unpredictable in many ways. And we do things that don’t neatly fit into algorithms. Kate?

Kate:

I find it really interesting from the data science perspective, looking at those recommendation algorithms. We see a lot of people for whom, because the programming’s so complex, it’s all about optimizing the algorithm rather than thinking about the true human decision behind it.

And just in that sense, the segmentation – or even, I’ve bought a new Camelbak water bag, I don’t want another one, so saying “you might like this” and recommending me more and more doesn’t work. It’s not like a book, where you like this author. And I find that a lot with Amazon; it’s all built on these standard recommendation algorithms, and you don’t get that difference in there.

And I think it’s going to be really interesting to see where AI can help us, because it’s got huge potential to do that, to improve those algorithms and to learn. But if you don’t put that in place, you’re never going to come across it unless you still understand that human behavior and categorize things accordingly. And we’re just not seeing that level at the moment.

Darren:

Yeah, Adrian.

Adrian:

It reminds me of the early work we were doing trying to build experimental design models into marketing. The first person who did all the study around this, McFadden, back in the seventies, who won the Nobel Prize for economics for his work on choice modeling, found that the one thing is it’s a combination of utility and randomness.

And it’s the randomness that’s really difficult to deal with, not the utility. We can deal with the utility: you’re going to go out and buy a pair of gumboots because it’s raining, and you spend all day researching gumboots and come back with a pair of Jimmy Choos. Where did that come from?

Darren:

I think the Jimmy Choos looked very good on you, Adrian. I don’t see why that’s a problem.

Look, one of the problems, it seems to me, hovering around this conversation is the unrealistic expectations we seem to have around technology and particularly data, and even data science.

I mean, I’ll admit I’ve got a science degree in what we’d call a pure science. And we keep looking down on the social sciences because it’s virtually impossible to set up a true experiment with a proper control group.

It’s always going to be a bit fuzzy, they’d say; it’s a fuzzy science. But there is an expectation, and Kate, you implied it before (I think you explicitly said it, not even implied it), that we do seem to get carried away by data as somehow being always right, when it’s often an interpretation rather than empirical proof.

And one of my favorite sayings is that 72% of people are fooled by statistics, but 68% of statistics are made up on the spot. It’s one of those sorts of proofs, because there are going to be people listening to this going, “Is that right? 72%?” Because for some reason, we’re wired that way as human beings: somehow data, something empirical, has more credibility than just an idea or a thought.

Kate:

Absolutely. I think that’s always been the case. Back in the day, if people didn’t like the presentation, I used to say, “Okay, I’m going to put it in an Excel spreadsheet; are you going to believe me then?”

Darren:

Yeah. But that expectation is actually creating problems because we’re … so, well, you’ve just spent X amount of dollars on some technology, why can’t you tell me exactly what our growth projections are going to be? Or what do we need to do?

And we’re seeing, especially with MMMs and predictive modeling, a move into a world of informing decisions. But is there an expectation that it’s foolproof? And is that a trap for us?

Kate:

I think it’s a huge trap and it is something that really worries me in that space. We’ve seen sort of time and time again that it doesn’t meet the expectations, but then people lose all faith in approaches, and they look for the next silver bullet.

And I think it comes back to what you were saying earlier, people are looking for simplicity and silver bullets rather than putting in the hard yards and the discipline and kind of going through this test and learn, interpret what works. And we don’t have to be perfect, we just have to do more of the good stuff and less of the bad stuff.

Darren:

Yeah. No, I think a good example of that is if your current decision-making process is based on a toss of a coin, you’ve got a 50/50 chance of getting it right. So, anything that can move you into 51% or better is an improvement.

And I think that’s the attitude we need to bring to this, rather than the idea that it’s somehow empirical truth. If it were a religion, AI would be the god proclaiming how the world really is. Adrian, what’s your view?
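[Editor’s note: Darren’s coin-toss framing can be made concrete with a toy simulation. Everything here — the function name, the decision count, the seed — is illustrative, not from the episode; the point is only that a one-point edge over a coin toss shows up reliably across many decisions.]

```python
import random

def simulate(accuracy: float, decisions: int = 10_000, seed: int = 42) -> int:
    """Count correct calls when each decision is right with probability `accuracy`."""
    rng = random.Random(seed)
    # Each draw is one decision; it counts as correct when the draw
    # falls under the decision-maker's accuracy threshold.
    return sum(rng.random() < accuracy for _ in range(decisions))

coin = simulate(0.50)  # pure coin toss
edge = simulate(0.51)  # data nudges you to 51%
print(coin, edge, edge - coin)
```

With the same seed, every decision the coin toss gets right the 51% decision-maker also gets right, plus the extra sliver in between, so the small edge compounds into roughly a hundred extra correct calls over ten thousand decisions.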

Adrian:

I think it’s that point of also looking for proof points, looking for something to say, “That’s the world, that’s how we know things behave,” and therefore thinking you can run with that forever. But when you’re actually running experimental design, you know it changes in real time. It isn’t constant; you can’t just do one A/B test.

So, we did a test last month and we found out that people like the red ones, so therefore we’re going to market like this and sell all the red ones. But that was a month ago. And it changes every day, with every segment, time of day, weekday or weekend, who you’re talking to.

And that’s where the complexity becomes overwhelming. But there are simple ways of starting to build confidence, and it’s really about building confidence, back to your point about the 51%: stop looking for perfection and look for an incremental gain instead, because it’s about continual optimization. It’s about always improving the baseline of what you know.

And the data’s very good at that, even if it’s only partial. It only has to look at what’s happening in the moment, how people are behaving and interacting. If we can optimize how they’re interacting in that process, we are moving them down the decision path, down the purchase path, rather than trying to build a model that’s perfect with all the data at the back end.

Because, as we said before, it’s very hard for organizations to even approach any level of perfection (perfection’s probably the wrong word; even accuracy at any level). And I think they need to, at some level, start to think about how to market within the parameters of what they can control.

What you can control is what you can work with, so build a model that works with that. And there are tools and ways they can achieve that in a very incremental, optimization-led framework.

Darren:

Yeah. My father, who was a tech person, used to say, “An expert’s someone that gets it right 51% of the time,” which is why I use that. Kate?

Kate:

A hundred percent on board with what Adrian’s saying. I think the reality, when it comes to actually implementing it for marketers, is getting the balance right between what we’ve started to call brand and performance. We can do experiments on small tactical elements, which I think people are becoming more and more comfortable with. But the risk is over-optimization to a smaller pool of people.

And what we’re not so good at is the bigger picture: how do I grow, how do I build that pool, what are those longer-term impacts? Because that’s not a real-time optimization. That’s where you really need to bring in the big picture, which is where MMM comes in.

But we saw last year, and we’re seeing lots of work coming out of the UK and the IPA at the moment, in terms of consistency: keeping that same brand message, which people get bored of and want to keep changing. But if we look at something-

Darren:

Sorry, I just want to correct you. When we say “people,” we mean marketers, not the actual general public. By the time the marketer’s completely bored with their own brand, the general public are going, “Oh, what was that?”

Anton:

But Kate, that’s a really good point. I think it’s also this balance between marketers need proof, whether it’s to the CFO or to the board or within the C-suite. They’ve got to be giving some proof that their investment is actually working.

And I think that’s the area where we’re seeing that they’ve got too many numbers, so many reports, too many different areas tugging in different directions, that actually it’s unclear. So, it’s unclear what they’re saying within those big meetings. And part of the work we’re doing with some clients is narrowing down to what proof you can give.

So, echoing what both Adrian and Kate just said: the data is inconsistent, the data quality is not great, but the trend might be this. So, if the trend is that, then we can roll that out through this particular state, or on this local marketing initiative, or this particular brand initiative.

But I think that’s part of the trap in all of this discussion: the shiny toy, the AI and now agentic AI, the tech building its own tech, is just going to explode into more and more data reports. It’s about the ones you look at and go, “Okay, this is what’s really going to help me make a business decision.”

And Adrian, spot on. If you’re not A/B testing, not testing something, then I think the risk in all of this is you go: there was no incremental gain. The only reason you are spending money on AI or on tech or on something else should be, from our perspective, that it’s going to drive some incremental result; otherwise, keep doing it the way you’re doing it.

So, if you can’t prove that increment, that I tested in a local area or in a state or with a cohort and drove some sort of incremental commercial return, then you’re really not doing yourself justice as a marketer with the CFO and the C-suite in proving you’re doing a good job.
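[Editor’s note: Anton’s point — only keep spending where a held-out cohort or region shows an incremental return — comes down to a simple comparison. A minimal sketch follows; the function, the indexing-to-baseline approach, and all the figures are hypothetical illustrations, not numbers from the episode.]

```python
# Minimal incrementality readout from a holdout test.
def incremental_lift(test_sales: float, control_sales: float,
                     test_base: float, control_base: float) -> float:
    """Lift of the exposed cohort over the holdout, with each side
    indexed to its own pre-period baseline to control for existing
    differences between the regions."""
    test_growth = test_sales / test_base
    control_growth = control_sales / control_base
    return test_growth / control_growth - 1

# Exposed region grew 10%, holdout grew 4% -> ~5.8% incremental lift.
lift = incremental_lift(test_sales=110.0, control_sales=104.0,
                        test_base=100.0, control_base=100.0)
print(f"incremental lift: {lift:.1%}")
```

The point of indexing each region to its own baseline is that the holdout absorbs everything that would have happened anyway (seasonality, macro conditions), so only the difference in growth is attributed to the activity being tested.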

Darren:

And Anton, it’s a good point, because one of the expectations of marketers is to be able to explain to senior management exactly what the investment is and what the lessons are … lessons can be as valuable as actual results, because they move you along the way to those results. Kate?

Anton:

Kate and I see it in engagement reports. I mean, how many times have we seen engagement reports based on pixels? And you go, that’s a fantastic thing, engagement’s up 16.73%, and you go, “What on earth does that mean? On 50 likes?”

Kate:

It means you’ve targeted people who already liked you and were going to buy from you anyway, so you sold-

Anton:

Correct. Not incremental.

Darren:

Kate, your view on that.

Kate:

I mean, incrementality is obviously key, but I think what we sometimes forget as we go down that path is that how much we sell overall and how much profit we make is still what the CEO and CFO are actually looking at.

So, if there’s a disconnect between you saying, “We’ve got so much more incrementality from our advertising spend,” and them saying, “But our investment’s the same and our sales have gone down, I don’t believe you,” that’s still a problem.

So, I think our tests and all of the measurements we can do are fantastic, but ultimately, one of the numbers that we do actually trust in all of this craziness is how much I’ve sold as a business and how much my profit is. So, we still have to take it back to that overall, “Did I do better this year?”

Anton:

Which is the pressure on the marketer. So, putting the lens back on the marketer: what challenges are they really facing in proving marketing’s working and deciding where to focus investment?

Darren:

It’s also a challenge for the technology, particularly in an AI world, because there is only so much the marketer can directly impact. There are so many things that can be done, but often it’s the competitive set that is actually having the biggest impact. And then you have to be able to take all of that on board.

If you think managing your brand and your customer base engagement is difficult enough or complex enough, when you start to have to take in a whole macro and micro economic and competitive set, then we’re talking about hugely complex, some would argue in the current environment, chaotic sets of data that we’re trying to make sense of. So, I think there’s definitely an expectation that needs to be managed.

I think we’ve done a really thorough job of looking at the complexity and the expectations that come out of it. Let’s switch to focusing on some of the emerging practices, and I don’t want to call them best practice, because best practice implies that that’s the practice you have to follow forever. But what are the emerging practices around the various areas that we’ve been talking about?

Anton:

I might jump in quickly. I always love Scott Brinker, the Chief MarTech. He and his team talk about there now being 14,000 marketing technology tools available to marketers. And that’s exploding with AI.

So, they’re losing control of being able to track it all. We’re now heading, directionally, towards a million different types of technologies within a few years, because people are building their own and computers are building their own. It’s becoming this explosive landscape.

But one I just wanted to quickly touch on, which I’ve seen, and which I think is a really interesting summary of almost everything we’ve talked about here to date, is a company called Hive Science, out of America. They’re taking a psychology angle, using AI and machine learning. But as we’ve touched on, it’s not about the tech or the data; it’s about the psychology of decision-making.

And it’s something we’ve all tried to prove, or all seen marketers try to prove, over the decades: why did someone act on our advertising? And as we all know, and Darren, you just touched on, it’s not one homogenous group. And Adrian, as you said, because it’s not one homogenous group, we can’t just put one version out there. I think we all understand that.

But now with the power of AI, they can look at the different psychology triggers of a person within a particular category, within a particular product.

So, what that means is Kate would think one way about buying a car, versus Adrian, versus you, Darren, versus me. We might all look exactly the same demographically, psychographically, even attitudinally, but the actual, almost unconscious trigger that made Kate act versus Adrian versus Darren versus myself is completely different.

And with that, they’re A/B testing and saying to marketers: test the different hierarchies of headline features or benefits or content, and the imagery that gets served up. And I like this one because it’s a highly practical case for marketers to say, “Yeah, we can scale out different versions of creative, we can change the headlines around, we can change the support points to really resonate with someone.”

And I was sitting next to someone at lunch who was about to buy a car. She said, “My husband wants to buy it because of the engine and the towing capacity, et cetera. You probably can’t see, but I’m five foot one; I just need a step and a good handle to hold onto to pull myself up into the truck.”

So, she would act very differently to her husband in buying exactly the same car. So, I really like this new level of psychology based, which we’ve all talked about over the decades, but now could become a reality in decision making for marketers.

Darren:

Look, and you’re right, focusing on the psychological drivers of choice. But I think, to Adrian’s earlier point, there’s still an amount of randomness in there that we have to accept is built into any human system. We are inclined to act in ways that appear irrational. But behavioral economics says that if you study it enough, it starts to become a predictable irrationality.

Anton:

I love that. And even putting the irrational statement up, that’s one of the points they’ve talked about. They’ve done it in the auto industry: put an irrational statement up and someone will go, “Ooh, I like that idea.”

Darren:

Kate, what are some of the practices that you’re seeing emerging that are really helping marketers navigate this swamp?

Kate:

I think one of the things that I’m seeing work really well is marketing teams bringing experienced marketing analytics specialists into the team to help them on this journey. And it is complex. In-house market research teams started to decline as we became able to automate and run our own surveys, or as they became cheaper.

But what we’re seeing now is an increase in people getting experienced marketing analytics people on board to help with some of this complexity, design the experiments, and help explain those situations when not all data tells the same story.

And I think you need that at the moment to help build those programs: to make sure you’re on the right track, you’re not going down a rabbit hole of over-optimization, you’re trying to solve some of the big questions, and you are constantly learning. I think that’s the key to succeeding at the moment.

Darren:

Oh, I think that’s really interesting, because quite a few years ago a marketing team invited me to an offsite they were doing, and they had a consultant come along who does that personality profiling into quadrants.

And everyone had pre-done the work, and then they were given an instruction, and they laid out this huge map with red, yellow, green, and blue, with the idea that green and blue were very analytical, logical, and process-driven, and red and yellow were more about people and relationships.

The whole marketing department was right down one end at the red and yellow, the two procurement people were green, blue. And I was sort of wandering around the middle going, I’m not quite sure from the instructions where I’m meant to stand.

But I use it from the point of view that traditionally, marketing was not a profession, or occupation, that people went into with very much of a STEM focus. They went into it more from the humanities.

So, it’s interesting how we are now, like many areas of business, trying to bring together analytics and rationality with relationships, humanities, and gut instinct. And often it’s two totally different languages; it’s certainly two different views of the world. It’ll be interesting to see how it comes together.

Kate:

And I think that’s been sort of one of my things: I fit in the red and blue, so I’m at kind of opposing ends. That’s sort of where I’ve gone with my career; I like the creativity and the numbers, but not too many of them.

Darren:

Yes, I can relate to that as well. Adrian?

Adrian:

That just makes me think of Steve Jobs famously saying he believed the future of humanity and economic growth is at the nexus between the sciences and the humanities, and how those things come together. And I think he was spot on. From a best practice perspective, I want to focus a little bit on-

Darren:

No, no emerging practice.

Because I don’t want to focus on best practice, because that invariably is just what everyone else is doing.

Adrian:

Yeah. Good point. But from a tech perspective, I think some of the exciting stuff is organizations starting to move away from the silver bullet, the one platform that does everything, towards tech stacks that (to use a jargon term) are a little bit headless. And there are technologies like something as simple as Zapier, for example.

A technology that allows simple integration between things, which can be done almost in human-readable form. So, getting out of deep coding, deep projects where it takes six months to scope out the requirements and then millions to build out, and getting to a much more agile process where you can almost say, “For this one-off campaign, we need this integration. Yeah, we can do that integration just for a six-week period.”

It’s only there to facilitate that one thing, because the technology is so easy to implement and doesn’t actually need huge tech teams. So, moving away from those siloed pieces of technology that had interoperability issues into a more open-source, data-lake view of the world will empower the solving of some of the challenges that Kate was referring to.
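As an illustration of the kind of lightweight integration Adrian describes, a Zapier “Catch Hook” webhook accepts a plain JSON POST and can append the payload as a row in a spreadsheet. The sketch below is hypothetical: the hook URL, field names, and CRM record shape are placeholders invented for this example, not a real integration from the episode.

```python
import json
import urllib.request

# Hypothetical Zapier "Catch Hook" URL - replace with your own Zap's endpoint.
ZAP_HOOK_URL = "https://hooks.zapier.com/hooks/catch/123456/abcdef/"

def crm_record_to_row(record):
    """Flatten a CRM record into the simple key/value payload a
    spreadsheet-bound Zap expects (one key per sheet column)."""
    return {
        "email": record["email"],
        "segment": record.get("segment", "unknown"),
        "lifetime_value": record.get("ltv", 0),
    }

def push_to_sheet(record, url=ZAP_HOOK_URL):
    """POST one row to the webhook; the Zap appends it to the sheet."""
    payload = json.dumps(crm_record_to_row(record)).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:  # network call - fires the Zap
        return resp.status

row = crm_record_to_row({"email": "jo@example.com", "segment": "loyalty", "ltv": 420})
print(row)
```

The point of the design is that the whole “integration” is one mapping function and one HTTP call: something a marketing team can stand up for a six-week campaign and retire afterwards, with no platform project behind it.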

Anton:

Yeah, I think that’s really exciting, Adrian. And I think that’s something we’re seeing. That’s a big shift from building big legacy systems and a marketing stack stacked neatly on top of itself, to this idea, which is a very different way of thinking: maybe I’m just going to build bespoke little customized apps. That’s where those words come in: customization, personalization, contextual.

So, for the Olympics, I can build something specifically to cohorts that are coming to Brisbane, for example.

Adrian:

Sorry, just on that note, Anton: Zapier can even work with a spreadsheet. So, you can have a spreadsheet you’re working with that’s being populated from the CRM system with a particular data field, because that’s an important field you want. And that can be implemented in a few hours.

Anton:

I had a conversation with a client yesterday, they said, “What’s your favorite piece of technology?” I said, “Well, Excel’s amazing.”

Kate:

I was just about to say, does that mean we’re going back to Excel?

Anton:

But it’s interesting, isn’t it? I think-

Darren:

I think he meant Google Sheets, didn’t you?

Anton:

I didn’t want to drop the G name, sorry. But that’s a different way of thinking, isn’t it? It’s saying, be much more adaptable and be ready for the market. And I think with the next generation coming through, to come back to the consumer, the generations have shifted.

Our generation was very time-structured, as everyone knows; the next generation coming through is very fluid, very mobile, very in the now. So, I think that’s where tech will start to build out. And you’re right, it’s not a million-dollar project; for $10,000, I can build something that works very well locally, captures the right data, and contextually helps people have a better experience.

And then I go as the recipient and buy something or engage in something or do something. That’s the big shift.

Darren:

Now, I would normally have done this upfront, to really give the listener an opportunity to get to know the three of you better. But what would be the type of project that you would be doing with a marketing team that you would find really exciting and really want to get your teeth into as a way of demonstrating the types of things that really get you excited about this area. Let’s start with you, Anton.

Anton:

Well, that’s a big question. I’m not sure whether there’s ever one project. I think that’s the point we talk about, every project is different.

Darren:

The type of project.

Anton:

But what we’re seeing the most, and what really turns me on, is when you see a challenge for a marketer. They’re trying to say, “We’re trying to prove our brand investment is working.” Or, “We want to improve the capture of data on new customers,” so it’s an acquisition focus. Or, “We’ve got a customer base loyalty program; we want to improve the repurchase rate or drive incremental return.”

Whatever that challenge is, for me, it tends to come back to the same area where you say, “Okay, you’re very clear on your objective. But then the strategy of how to get there may not be clearly articulated.” So, helping them clearly articulate that and align it across the different divisions is a great start.

And then you can start from there and go, okay, which customers are we talking about? Or which audiences are we talking about? Which gets us for me into things like the media, the channel selection.

Darren:

So, customer-centric.

Anton:

Customer, yeah. But where I’m landing here is that this leads you to customer experience. I think that’s the bit that’s missed in so many discussions about the brand or the commercial return. And we go, “You’ve got a million people in your database. The value they generate is $500 million (make up the numbers). The incremental opportunity is X. So, why don’t you invest money there? The data’s all there. Why are you buying third-party lists? How is market mix modeling helping you decide on that channel choice, because you already have a channel.”

So, I guess weighing up where they’re currently at. I’m a big believer as well, just to throw in that this is all new territory. No one has the silver bullet, but I think some pragmatic logic and helping people get the objectives, the strategy, then look at data and channels.

And then most importantly (and Kate will probably touch on this), look at the measurement, because back to your point, it’s an art and a science. Don’t get hung up on the science; that will help you tell a bit of the story, but there’s going to be some art there to say, “Let’s run A/B tests.” I think it’s a full circle of opportunity, and you dial up a few areas depending on the client.

Darren:

Thanks, Anton. Kate?

Kate:

Yeah, Anton’s quite right. I’m going to go straight into the measurement. The area I love working on is performance measurement programs. So, rather than a single tool, it’s all about bringing together the macro, the micro, and what we’ve learned from our experiments.

I’m still a big believer in brand tracking and looking at that from a customer perspective, and our market mix modeling, but how we use them together to actually make decisions and make what we’re doing better.

And what I often see, again, is people getting one tool and thinking, “Okay, I’m going to use market mix modeling for all my budget allocation now.” And then they see it doesn’t line up with other results, people lose faith, and it’s, “Oh, what do I do?”

Or they get given a budget and then someone says, “But that isn’t enough to cover the 52-weeks-a-year search budget we need,” because they’ve done it from a bottom-up perspective. And you end up in arguments, and what’s actually implemented doesn’t look like either of the recommendations on paper.

So, I love just working through the different sources, how we use them to actually make decisions, and what our program and process is around that.

Darren:

Fantastic. Thanks Kate. And Adrian, knowing your extensive history building technology solutions for particular needs, what’s your focus?

Adrian:

I won’t use the word focus. What’s exciting, what I get enthusiastic about when I go in, is a slightly different twist on both Kate’s and Anton’s views, but really in the same theme. And it’s helping build clarity on how you can deliver results that are meaningful.

So, to me, a lot of the time people know what they would like. So, Anton, they know the objective, the strategy, but sometimes they’re just overwhelmed by the internal problems with the architecture, the technology, the choice of technology. Things are too expensive, how do you implement it?

And the thing I’m really passionate about is the choice modeling stuff, the stuff Anton was talking about, which we’ve been working on. The tech has been around through various companies, some of them Australian, like Memetrics, which started in Australia and ended up getting bought by Accenture. But that thought process started in Australia: how do you take choice modeling and experimental design and build marketing tools and methodologies that enable us to start to understand what is effectively human choice?

How do you make sense of that utility and that randomness in a way that lets us actually start to predict, get measurable outcomes, and incrementally improve the baseline? And Kate, to your point before, ultimately that should be about a hard objective. So, if the objective is to stop credit card rollover, then how do you actually build an experiment that shows you are addressing credit card rollover defaults, or you are addressing the …

Another great one is offers. We have done work in the past where a bank might be offering a hundred dollars of value to retain a customer, and you go, “Well, how do you know that shouldn’t be $80 or $56 or $47? At what point do you get diminishing returns on that?”

And we’ve done experiments with banks where we found that, effectively, the incremental value stopped in the low twenties, but they were giving above $50 to every customer they were trying to retain. And you go, that’s a lot of money over a year if you can save it with the same net result.

So, the net result here came nearly purely from asking, “What’s your marketing offer? How do you optimize that offer so you’re not giving away money that makes no incremental difference to the result of that marketing activity?”
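The bank-offer experiment Adrian describes, finding where extra offer dollars stop buying extra retention, can be illustrated with a small sketch. The function name, the retention figures, and the threshold below are invented for illustration; in a real program these would come from controlled tests like the ones he mentions.

```python
def diminishing_returns_point(offers, retention_rates, min_gain_per_dollar=0.0005):
    """Find the smallest tested offer beyond which the marginal retention
    gain per extra dollar falls below a chosen threshold.

    offers: tested offer amounts in ascending order (dollars)
    retention_rates: measured retention rate at each offer
    """
    for (a, ra), (b, rb) in zip(zip(offers, retention_rates),
                                zip(offers[1:], retention_rates[1:])):
        marginal = (rb - ra) / (b - a)  # extra retention per extra dollar
        if marginal < min_gain_per_dollar:
            return a                     # paying more than this buys almost nothing
    return offers[-1]

# Illustrative test data: retention plateaus in the low twenties,
# echoing the pattern Adrian describes
offers = [0, 10, 20, 30, 40, 50]
rates  = [0.60, 0.68, 0.74, 0.744, 0.746, 0.747]
print(diminishing_returns_point(offers, rates))  # → 20
```

With data shaped like this, a bank handing every at-risk customer $50 is spending $30 per head past the point where the offer changes behavior, which is exactly the saving Adrian’s experiments surfaced.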

So, those types of challenges I find really exciting because it’s a combination of the teams and how they work, how they think about their marketing, and also how they utilize the tech, and is the tech there to do it.

And the tech today is so much more capable. AI is a great tool for helping in this process, because to get statistical validity, we used to have to run so many experiments.

Now you can use a large language model to start to understand that and go, “Well, we can take that data and feed it into an AI,” and without having to have 200,000 customers engaged, we can get statistical certainty really, really quickly. And that’s super exciting.

Darren:

Thanks, Adrian. Well, look, time’s got away from us. I want to thank you, Adrian Treahy, Dr. Kate Gunby (sorry I left your doctorate off), and Anton Buchner. Thanks for coming and sharing with us today; it’s been a great conversation.

Anton:

Thanks, Darren.

Adrian:

Thanks, Darren.

Kate:

Thanks, Darren.

Darren:

Question before we go: if we picked one brand in the world, who’s doing it best?