Managing Marketing: Realising The Full Value Of Customer Experience With AI (Artificial Intelligence)

Brad_Bennett

Brad Bennett is the Executive Strategy Director at MercerBell – Australia’s leading Customer Experience Agency™. Hailing from the USA, Brad talks about his tech background and how it is helping him shape customer-focussed marketing solutions for the new Wild West arena of Artificial Intelligence.

You can listen to the podcast here:

Follow Managing Marketing on Soundcloud, TuneIn, Stitcher, Spotify and Apple Podcast.

Transcription:

Anton:

Welcome to Managing Marketing, a weekly podcast where we sit down and talk with marketing thought leaders and experts on the issues and topics of interest to marketers and business leaders everywhere.

I’m Anton Buchner with a special conversation on the rise of artificial intelligence and the impact it’s having on marketing. To discuss this I’m sitting down today with Brad Bennett. Brad is the executive strategy and analytics director at Mercer Bell in Sydney. Welcome, Brad.

Brad:

Thank you.

Anton:

Good to see you. I’m excited to hear your point of view today. For those who don’t know you might want to tell us about Mercer Bell, their rise and who they are and a little bit of background about you.

But I’m interested in the customer experience angle and what’s evolving in artificial intelligence.

Brad:

Sure, so, background quickly. Mercer Bell is a customer experience agency. Technically, we were the first in this market as far as being a trademark CX agency. What does that mean nowadays? Nowadays it’s a really big complicated broad church of things we do for our clients, including working with aspects of AI.

That is everything from deploying it for our clients on an ongoing basis, helping clients message features of artificial intelligence to their clients, and then actually building bespoke things, particularly in the machine learning space for our clients on an ongoing basis. Lots of interesting stuff in that space nowadays.

Anton:

I think stuff is the operative word; it’s evolving. I’d love to know a bit about your background. From my research, you’ve come through as a producer with a level of technical knowledge, which is great.

Give us a bit of your background. What’s your passion? What got you through to this CX level?

Brad:

My personal background—the common thread throughout all the things I’ve done over the last 17 or so years in New York and Sydney—has been technology. If we wind back far enough, I was a developer working at tech consulting agencies, then went into the interesting space of boutique crisis communications—a very strange space to be in (kind of interesting)—which led me to bringing a messaging influence into the tech products I was making for clients.

And then eventually I made the leap into marketing about 10 years ago, and now I’m in Sydney doing more of a strategy role with a data focus. My background is very much infused with that tech and development space, but now I’m looking at how, on the strategic side, we connect our clients’ products and services with their customers.

Anton:

I think that’s great because when you talk to lots of different people, you find some who lack strong technical knowledge and have come through marketing, maybe via communications or a traditional brand perspective. Others are coming through a pure data window, more of a CRM or customer engagement type background.

But as the world has become so digital—we’re just naturally walking out with our smartphone and maybe a key (if we still have keys these days)—there is so much technology infused in marketing.

Brad:

There is and, frankly, that’s the confusing part. There have been endless things written about how the scope of the CMO has increased to now be so tech-heavy. For most of our clients, the CMO has a bigger tech budget than the CIO. It’s a hugely interesting challenge.

And when we start talking about the space of AI, it exposes one of the biggest issues we see: most people don’t even know what it is. It’s very buzzy, it’s cool, you can read articles about it, watch a couple of YouTube videos, but actually having anything more than a surface-level understanding is one of the biggest challenges marketers face.

Because when you’re trying to figure out how it’s possible to deploy this type of AI tool or machine learning or deep learning (whatever flavour you’re going for)—the challenge is it exposes a lack of knowledge as soon as you really get into the details of what happens.

And that lack of knowledge is what we saw as the starting point when we realised that people were deploying things incorrectly, using things in the wrong way, or just not realising the full value.

I don’t claim to be an expert, but as a dabbler, I have built my own AI just in very simple processes.

Anton:

What have you built?

Brad:

Just basic stuff like character recognition and the basics you can do around neural networks and image recognition, so I can get my head around it and be able to talk with a bit more specificity on how it may be used and what the opportunities and challenges are from the marketing perspective.
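
(For readers who want to try the same kind of dabbling Brad describes, a minimal character-recognition sketch might look like the following. It is purely illustrative—not MercerBell’s code—and assumes Python with scikit-learn, using its bundled 8x8 handwritten digit images.)

```python
# A toy character-recognition model: a small neural network trained on
# scikit-learn's bundled 8x8 handwritten digit images.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()  # 1,797 labelled 8x8 digit images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=42
)

# One hidden layer of 64 units is enough to get a feel for how a
# neural network "learns" to recognise characters.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=42)
model.fit(X_train, y_train)

print(f"Hold-out accuracy: {model.score(X_test, y_test):.2%}")
```

(A few dozen lines like this are enough to see, hands-on, what “a neural network learning to recognise characters” actually involves.)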

Anton:

Marketers are great for a buzz word or a trend. As we know, artificial intelligence has been around for decades, since the start of the computing era—right back to World War II and the cracking of the Enigma Code, which was based on some intelligent engineering.

And through the decades we’ve seen robotics obviously advance. But a couple of years ago we saw marketing embrace the buzz word, whether that’s through chat bots in customer service. What do you see as artificial intelligence? Do you have a term or a way of looking at it (through a CX, customer experience lens)? Tell us about your view.

Brad:

The first thing to do is just Google ML vs AI (machine learning versus artificial intelligence) and go to the image tab. There are a variety of standard academic models that will explain the differences between these things—deep learning, machine learning and AI—because those are subsets of each other at varying degrees of complexity.

To your next question around what the opportunity for CX is: when I talk about AI, most business applications are still machine learning. Because machine learning is a subset of AI, I might use those terms interchangeably, but generally I mean machine learning.

From a CX perspective, we’re now seeing it help in two major areas. One is the classic, insight-driven stuff: making sense of large, complex sets of data, whether it’s structured, unstructured, or whatever else, and looking for those patterns we might not be able to see with the human eye.

For example, we recently helped a client with transactional data spanning a number of years across a variety of products and services they offer to their constituents—in this case their members (it’s a membership-based non-profit organisation)—looking for patterns across a variety of variables as to the types of segments we could find.

They had done their persona work, built a website and the kind of stuff you normally do. We found it didn’t line up with the actual behaviours we could see in the data. Those behaviours were spread across 19 different variables and were a bit hard to see as a human so you can deploy machine learning to pull that type of insight out.
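
(As a rough illustration of that kind of segmentation work, a clustering sketch over many behavioural variables might look like this. The file name and columns are hypothetical, it assumes Python with pandas and scikit-learn, and a real engagement would involve far more data preparation.)

```python
# Illustrative behavioural segmentation: cluster members on many
# behavioural variables that are hard to eyeball as a human.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hypothetical table: one row per member, ~19 behavioural columns
# (spend per product line, event attendance, learning-module usage, ...).
members = pd.read_csv("member_transactions_summary.csv")
features = members.drop(columns=["member_id"])

# Put every variable on the same scale so no single one dominates.
X = StandardScaler().fit_transform(features)

# Try a few cluster counts and keep the one with the best silhouette score.
best_k, best_score = None, -1.0
for k in range(3, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_score = k, score

members["segment"] = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
print(members.groupby("segment").mean(numeric_only=True))  # profile each behavioural segment
```

(The interesting part is the profiling step at the end: comparing the data-driven segments against the personas the client already believed in.)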

Anton:

So, deeper profiling—getting a richer pattern or profile based on what sounds like a lot of structured data. Was there some level of unstructured data, or was it mainly structured?

Brad:

In this particular example, it was a series of structured data sets. Thankfully, we were able to get transactional data over the years across a variety of systems, from learning modules to conference data. But that insight work is one of the major things machine learning is good for.

And the other one is what you see in a lot of the products the MarTech players are trying to build, deploy and find value in, which is still using a lot of machine learning or AI techniques, but to automate and scale a lot of the low-level tasks.

I call them low level (there are examples of people going above and beyond), but the proof points in a lot of business cases are still send-time optimisation, A/B testing at scale across messaging, and creative optimisation.

Some things do scale up to a bit more complexity, but there’s still a lot of value to be grabbed at that level—just that classic thing of having a range of messages or ads and getting the right one in front of the right person at the right time. It can still be a wildly complex problem, and a lot of machine learning can help with that.
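
(One common way that “right message, right person” optimisation gets automated is with a multi-armed bandit rather than a fixed A/B split. A minimal Thompson-sampling sketch, with made-up message variants and conversion rates, might look like this—a toy model, not any particular vendor’s implementation.)

```python
# Minimal Thompson-sampling bandit: keep serving the message variants
# that appear to convert best, while still exploring the others.
import numpy as np

rng = np.random.default_rng(0)
variants = ["offer_a", "offer_b", "offer_c"]   # hypothetical creative variants
successes = np.ones(len(variants))             # Beta(1, 1) priors
failures = np.ones(len(variants))

def choose_variant() -> int:
    """Sample a plausible conversion rate per variant and pick the best."""
    samples = rng.beta(successes, failures)
    return int(np.argmax(samples))

def record_result(variant_idx: int, converted: bool) -> None:
    """Update the chosen variant's posterior with the observed outcome."""
    if converted:
        successes[variant_idx] += 1
    else:
        failures[variant_idx] += 1

# Simulated campaign: offer_b secretly converts best, and the bandit finds it.
true_rates = [0.02, 0.05, 0.03]
for _ in range(10_000):
    idx = choose_variant()
    record_result(idx, rng.random() < true_rates[idx])

print(dict(zip(variants, successes / (successes + failures))))
```

(The appeal over a classic A/B test is that traffic shifts towards the winning message while the test is still running, which is essentially what the productised tools do at much greater scale.)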

Anton:

So more of the mundane, production-based part of someone’s job or marketing work they’re doing.

Brad:

Yeah, it’s the part that’s so hard to scale—the profiling. Similar outcomes can be delivered through an Adobe stack, Google AdSense, MarTech, AdTech, whatever you want, but it means using more data points than a human could look at to know the difference between Anton and Brad, and the things that might move us and produce a commercial outcome.

You could have a million marketers sitting in a room looking at all those data points and figuring it out or you can automate that to a degree using models. We’re not even in the predictive space yet; at that point we’re just churning through it to get to that optimisation.

Anton:

More an analytics task?

Brad:

Yeah.

Anton:

What about in the predictive space? We’ve had predictive modelling around for decades as well. But are you seeing anything around the machine learning aspect—machines being able to predict a pattern in behaviour that maybe a human eye can’t, simply because the data sets, as you said, are too big, whether that’s structured or unstructured data?

Are you seeing anything around that where machines are adding greater value to marketers to unearth better insights or patterns?

Brad:

Yeah. For us, as an agency, we see predicting customer behaviour as the edge, the area that will matter over the next five-plus years. That is a wildly complicated thing. When you move from measuring to analysing to predicting, it’s crazy complicated, especially when you look at the time scale.

To a degree, you might be able to predict what somebody might click on. You might even be able to predict what somebody might pull off a shelf in a store at that moment. But to move into a longer time frame can make things really challenging.

We have some clients who are certainly using advanced predictive modelling for that next best product or content. That can be quite complex but those are models that you can build and there is a lot of academic rigour behind how to get that going and that’s cool.

One of our clients in the travel space is on the journey of building out very complex bits of AI to do next best product and next best action across a whole variety of suites and offerings. You can imagine it gets quite complex in the travel space. We think that’s going to be the battleground over the next couple of years.

As off-the-shelf tools normalise the current state—which again is optimisation, measurement, putting the right thing in front of the right person in the right context at the right time—that kind of capability will become easier to use and more and more normal. The step beyond that is asking ‘what’s next?’
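
(A very stripped-down flavour of a “next best product” model—nothing like the scale of the travel client’s system described above—could start from simple product co-occurrence. The booking data below is invented for illustration and uses only pandas.)

```python
# Toy next-best-product model: recommend the product most often bought
# alongside what the customer already holds, based on co-occurrence.
import pandas as pd

# Hypothetical purchase history: one row per (customer, product).
purchases = pd.DataFrame({
    "customer": ["c1", "c1", "c2", "c2", "c3", "c3", "c4"],
    "product":  ["flight", "hotel", "flight", "insurance",
                 "flight", "hotel", "hotel"],
})

# Count how often each pair of products appears for the same customer.
pairs = purchases.merge(purchases, on="customer")
pairs = pairs[pairs["product_x"] != pairs["product_y"]]
co_occurrence = pairs.groupby(["product_x", "product_y"]).size()

def next_best_product(owned: set) -> str:
    """Suggest the unowned product most associated with what's owned."""
    scores = {}
    for have in owned:
        if have in co_occurrence.index.get_level_values(0):
            for candidate, count in co_occurrence[have].items():
                if candidate not in owned:
                    scores[candidate] = scores.get(candidate, 0) + count
    return max(scores, key=scores.get) if scores else None

print(next_best_product({"flight"}))   # 'hotel' in this toy data
```

(Real next-best-action systems layer timing, channel and propensity on top of this kind of signal, which is where the multi-year complexity Brad describes comes from.)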

Anton:

I’m always fascinated by anyone talking about predicting human behaviour because we are irrational beasts. Dan Ariely said, ‘we are predictably irrational’. In that sense, maybe he is saying we can be predictive and whilst we are irrational we can still predict the irrationalities—is that what you’re seeing?

Brad:

Absolutely. If you look at the academic research behind this, there is the System 1, System 2 model of the mind. Most people make 95% of their daily decisions with the System 1 mind—the irrational, emotional, biased one. That is the majority of our interactions with the world and with advertising and corporate communications.

So, for AI to help understand that, we think it needs to be paired with a level of behavioural science so that you can start to understand those irrational patterns.

A general understanding of AI is one major issue. The second major issue we see is around control.

We see examples of clients within certain scopes deploying AI and actually getting better at measuring those irrational things that customers do. Again, it doesn’t have to be terribly complex. It can be as simple as understanding what types of emotional triggers people use in an unconscious way to interact with ads and make purchases—that’s cool.

But when it comes to handing governance structures and control over to those things—I think the biggest aspect of handing control to a machine, this AI black box, is less about control itself than about people not intuitively understanding how irrational people actually are.

When these patterns are discovered by technology and AI they just look weird, because if you actually think about them they don’t make any sense—but that’s how it is.

Anton:

Which I think gets us to the level of trust. Would you trust a machine to spit out some sort of finding or insight when you don’t know what the working was? With a human you can talk it through, whether they’re a creative or a technologist. You can ask them: what was your working? How did you get to that? What was the leap you made?

So you can understand how a human got there, but one of the barriers with machines is that they’re spitting out answers and you don’t necessarily know how the machine learnt—what it did in the black box to get to that answer—which is a little scary, because then we lose trust in the machine.

Brad:

Yes, it’s a strange point when you look at AI development and deployment frameworks. Auditability is one of the key issues because the more complex you get, the more you eventually need an auditable process, particularly in certain industries. Depending on the message you’re delivering, that’s a legal requirement.

But beyond that it’s a good practice just because somebody’s going to ask eventually and it’s nice to have the receipt at the end of the decision so you can actually prove why you did something.
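
(In practice, that “receipt” often comes down to disciplined logging of every automated decision. A minimal sketch, using only the Python standard library and hypothetical field names, might look like this.)

```python
# Minimal decision "receipt": log enough about every automated decision
# that it can be explained and audited later.
import json
import hashlib
from datetime import datetime, timezone

def log_decision(model_version: str, customer_id: str,
                 features: dict, prediction: str, score: float,
                 path: str = "decision_audit_log.jsonl") -> None:
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        # Hash the ID so the audit trail doesn't become another PII store.
        "customer_hash": hashlib.sha256(customer_id.encode()).hexdigest(),
        "features": features,        # the inputs the model actually saw
        "prediction": prediction,    # what the model decided
        "score": score,              # how confident it was
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

# Example: record why a customer was shown a particular offer.
log_decision("propensity_v1.3", "CUST-00042",
             {"recency_days": 12, "avg_spend": 87.5, "email_opens_90d": 6},
             prediction="show_upgrade_offer", score=0.81)
```

(Regulated industries typically need far more than this—versioned models, retained training data, human sign-off—but even a simple log makes the “why did we do that?” conversation possible.)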

Anton:

To bring it back to the work you’re doing. What are you talking about with clients? What are you seeing from the agency/ client relationship point of view around AI? Are you seeing hurdles, opportunities? What are some of the conversations?

Brad:

Because of this lack of understanding I mentioned, we see a variety of expectations and approaches.

On one side of the fence, marketers are really comfortable with AI, just like humans in general are super comfortable with AI in the right context. As consumers, we absolutely expect Netflix to know what’s next. We expect Google, Amazon and all these companies to literally know, and that’s really complex stuff.

And then on the marketing side, we expect similar AI to be deployed by Google and Facebook to optimise our digital advertising. We’ve become used to it and can see how it works, and to a degree good marketers are structuring their testing and creative optimisation around it.

So, we see that, but then on the other side of the fence, there is a bit of a reticence around building your own stuff for value and there is a lack of understanding around how you actually get that done.

We’ve had instances where we’ve created machine learning models and realised that while the model will actually get the greatest outcome for our clients, it doesn’t necessarily even have to become part of the sales process for the services we’re offering, because it might confuse things and muddy the waters too much—so we would rather focus on the business outcomes.

And if they ever ask, we can talk about some of the tools behind the curtain. We’re also seeing, outside of the Googles and Facebooks, which are more supremely productised, people are deploying AI for the wrong things. We see a lot in the MarTech space.

They may have been sold something by a MarTech vendor, but when it comes to their ability to use it and actually apply an out-of-the-box AI solution to their business needs, there is a big gap between what they were sold and what the reality is.

And that becomes a really interesting conversation to have, because two years ago somebody made a stand for a big MarTech stack, and now it’s on a roadmap—‘we’re going to get to the advanced part and turn on the AI’—and it just doesn’t magically deliver value.

And there’s actually a lot of complexity: custom configuration, off-platform work and whatever else you’ve got to do—massaging the data until you get something structured the AI can learn from. There’s a whole level of complexity there.

People are getting weird about it: after years and years of dealing with all these other tech issues, you’ve now got super complicated things popping up on top. We’re helping clients work through that to a degree.

Anton:

So to that point, is there a capability gap—something missing, whether at agencies or with marketers?

Brad:

I think it’s a capability gap in the market, yes. It’s a huge capability gap and we’re all going after the same people, because this is a strange space that requires lots of different types of skills—you have, of course, the coders and developers, which overlaps to a degree with the data folks, and then all the other techy types such as architects.

So, it’s a very strange but specialised area and clients are trying to keep that in-house because the smart ones realise the value they can get out of that and they don’t want to outsource that value.

Agencies are trying to get that capability because if they get those smart folks they can sell them endlessly to clients. And frankly, we just don’t see enough of these people in the Australian market. For example, our data team—the ones who look after AI products internally, whether they’re deployed for clients or our own machine learning models—the entire team was born overseas.

I’m from New York, and my entire analytics and strategy team was born overseas—and that’s simply a result of finding the right people.

Anton:

Yeah, as we see the tech giants come on—the Googles, Facebooks, Adobes, Salesforces etc.—as they have taken the lead through this digital era, they’ve probably skewed the conversation for business too heavily towards tech.

As you say, the tech is now in place, but then what do you do with it? Especially if you’re talking customer experience—it’s only as good as the customer’s experience. And a marketer or agency is trying to map that experience and capture data points to tell the whole story and then predict the next behaviour.

Now, if that doesn’t all join up, which we all know is one of the biggest problems, that systems don’t talk to each other, data doesn’t flow, data becomes dirty, we can’t tell the whole story, the whole thing starts to break down.

Brad:

Exactly. You often see at big conferences about this kind of stuff that there is still so much focus on the implementation of technology. You look at the outputs and the years of work going into these things, and sometimes you’ll get an email or a banner and it’s so lame.

There’s a huge opportunity to move past this stuff and not just try to automate a single channel, a couple of minor touch points. You can put your human hat on and actually deliver something that matters.

In the short-term there is probably still some value to be extracted by using machine learning and AI to coordinate some more mundane channels and interactions with the customers. We’ve got to keep that at the forefront. At least that’s got to be our goal.

We’ve got to be able to sell something in an emotional sense and just create those brilliant customer experiences that really get people to like your company.

Anton:

Yeah, deeper love, deeper engagement and deeper advocacy. What about measurement? What are you seeing around a return on artificial intelligence (another buzzword to add to the lexicon, ROIAI)? Is anyone talking about a return on artificial intelligence or machine learning or tech investment? Or is it still grounded in customer metrics?

Brad:

Yeah, I think we see both sides. We see a lot of business cases come out and we help clients write those and develop them internally for what is expected. And you do have to be rooted in something particularly because there is still a lot of investment behind this.

Outside of the Googles and Facebooks—which are actually quite cheap and normalised; every cafe across the street has access to some pretty powerful tools—if you’re building your own customer stack, whether through a MarTech stack or a purpose-built propensity model on your own platform, there is a lot of investment, because the tech is new and specialised. It hasn’t come down in cost yet. Some along the way will still talk about the return.

The trick is there are not a lot of standard ways to infer value—set metrics that are probabilistic or guaranteed—so it becomes quite challenging. And we have seen a handful of clients stall on certain aspects of AI projects just given the uncertainty of the return or the value against the initial investment required.

Anton:

Or was value ever defined up front?

Brad:

Sometimes you can build a propensity model—great, really useful as far as segmentation goes—but that might be part of a complex customer journey. While it might inform insights or automate some portion of the journey, it’s a classic attribution issue: how do you take that insight and attribute a sale and a financial outcome to it? That’s where it becomes really difficult if you’re doing anything beyond bottom-of-funnel work.

When you’re not doing that and moving further away from the purchase, that’s actually where these things can really sing and put out a lot of value.
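
(For a sense of what a simple propensity model looks like in code, here is an illustrative sketch with a hypothetical campaign file and feature names, using scikit-learn. Scoring and decile-based segmentation are the straightforward part; attributing a later sale to the score, as Brad notes, is the hard part.)

```python
# Minimal propensity model: estimate each customer's likelihood to buy,
# then use the scores for segmentation rather than hard attribution.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical file: one row per customer with behavioural features
# and a 'purchased' label from a past campaign.
df = pd.read_csv("campaign_history.csv")
features = ["recency_days", "frequency_12m", "monetary_12m", "email_opens_90d"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["purchased"], test_size=0.3, random_state=1,
    stratify=df["purchased"]
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Hold-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score everyone and bucket into deciles for targeting/segmentation;
# whether a later sale is *attributable* to this score is a separate question.
df["propensity"] = model.predict_proba(df[features])[:, 1]
df["decile"] = pd.qcut(df["propensity"], 10, labels=False, duplicates="drop")
print(df.groupby("decile")["propensity"].mean())
```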

Anton:

It’s the same challenge all digital has had, whether digital media or social media: it became shiny, so it didn’t necessarily need a business case, and we’re seeing the same in artificial intelligence. As you say, it’s a very broad church of ideas.

But take things like chat bots for customer service. Is a chat bot a good idea or a bad idea? You could argue it’s a good idea, but then the metrics blow that out: are you trying to reduce call time, or to improve the caller experience—and is it actually increasing call time to make the experience better?

Whatever the metric is, the challenge in all that is deciding what you’re trying to achieve. What’s the objective?

Brad:

Exactly. It’s got to be rooted in all that stuff. It’s funny, one chat bot I worked on a while ago—it was very interesting, for its industry it was relatively new, so the actual metrics attached to it were awareness and engagement level stuff. Which I thought was really fascinating; it was actually a reason to go to market and talk to people in a B2B.

In the background there was stuff around reducing call volume and having 24 hour service but the real play was the fact that it just supercharged the whole communications plan, which was an interesting way of looking at it.

Anton:

Well, it’s a part of people’s brand.

Brad:

Exactly. They were a tech leader in a relatively traditional B2B space so it gave them a shiny badge which was interesting.

Anton:

We’ve seen plenty of disasters in the chat bot space as well—getting the text wrong in the very early days of trying to interpret human text.

Brad:

Yeah, there are all those horror stories around chat bots and semantics, verbal and text recognition. I always love watching those Gartner Hype Cycles—I was chatting with Nick Mercer, our Chairman, last night—and I get excited when something goes down into the trough of disillusionment, because that’s when there is the maximum value left to extract.

A lot of this stuff, if it does play out and get to the plateau of enlightenment—if you start doing it at that point, you’re in a position to get value. You’re not over-paying or over-promising at the top of the curve. And certainly chat bots and some aspects of the AI world are probably sitting in the trough.

Anton:

Let’s keep them in the trough for the moment. What are some of the failings or hurdles you’ve come across?

Brad:

One of the classic things we see right now is the gap between the sales decks—the value you were promised and sold a couple of years ago, with AI supposedly built in—and the value you’re actually realising. So how do you close that gap and build the value back up? That’s one of the big failings now.

There’s also the lack of understanding—that’s a fundamental failing. It’s complex, and it actually inhibits business conversations. You don’t have to have a deep, intimate knowledge of exactly how neural networks work to understand the business implications, but because there is such a lack of understanding, you can hardly even discuss them.

It’s hard to understand what’s even realistic in a general sense—how can we provide value, what kind of value, what the general use cases are for extracting value from these types of technologies—so it’s a huge thing, one of the big pitfalls.

Anton:

Let’s not stay in the trough too long. I’m sure our listeners want to hear about some successes as well. Can you pinpoint one example where you’ve seen a great success or at least a move in the right direction?

Brad:

As I said, my travel client is on the journey of building some amazing home-grown Australian AI designed to coordinate a number of customer-facing activities across the organisation. It is deployed in certain use cases now and is seeing some business value, which is great.

They’re on a multi-year journey of rolling that out across a whole variety of things—that’s interesting. To be clear, we’re talking months and months and dozens and dozens of people—it wasn’t just a five-person team who did this over a quarter. We’re talking multi-year processes and 80-plus people, but their business plan has them extracting millions of dollars of value—it’s very cool to have such a great Australian home-grown version.

There’ll be more in the press about who that might actually be. And there are more controlled versions—the classic stuff of using ML models for advanced segmentation that informs corporate strategy when we go to market and can drive incremental revenue. That stuff also works, so the use cases don’t have to be complex.

Anton:

And who’s driving it on the marketer’s side? Are you seeing it rooted in the marketing department or is it coming out of the IT department? Where are you seeing the discussion internally?

Brad:

We still see it originating in marketing because it still takes a bit of a visionary to follow these things through properly—somebody who understands the potential and gets their head wrapped around it—because the process of deploying this within an organisation often involves educating a lot of people and dealing with issues of control and governance. It’s really complex.

You have to have somebody with a bit of a desire to make a name for themselves and push that change through if you’re doing it at any level of scale. Those types of people tend to be found more often in marketing than IT.

Anton:

Yeah, I think that plays to culture as well—what’s the business’ appetite for change firstly, and then what degree of radical change in the tech space do they want to jump on?

Brad:

With two of our clients, while the vision for an AI and machine learning future originated in marketing, teams were extracted out of marketing and dropped into IT to lead the process, because you still need all the tech skills—so it became a cross-functional team.

Anton:

It’s an agile working team. And we’re seeing that time and time again where people are hived off out of marketing, IT, data and analytics, research, business finance as a team in a hub.

Brad:

Is that what you’re seeing in the broader section of clients?

Anton:

Yeah, especially, as you said, it’s a muddy, murky area. It’s not well defined, and even if you look at what intelligence is in the human space, it’s still not clearly defined apart from very specific IQ-type tests. What is intelligence, creativity, innovation? They’re quite poorly defined when you actually scratch the surface.

So, it’s no wonder that when you get to artificial intelligence, we’ve suddenly leapt into ‘the machine can do this’—but what is ‘this’? What is creativity? How does the brain work? How do neurons connect, and how do we have lateral thinking? How do we find patterns in unstructured data sets that we can’t normally see?

Once you start getting to that level it is just so murky. So, many clients are hiving it off into teams—you guys all work together, focus on it. But quite often we’re seeing it as an endpoint discussion rather than a clear objective; it’s a solution to something.

Let’s build a chat bot, let’s do better predictive modelling or let’s do X, Y, Z or whatever with not a lot of thought around what’s going to be achieved for the business. You can see the obvious impact but let’s focus on what the impact for the business will be.

That just means it’s early days. This whole series is about uncovering what different clients, agencies, tech vendors are doing and it’s been fascinating because I don’t think anyone’s cracked it nor should they. It’s early days in testing, in seeing what can be achieved, which is obviously, for marketers, exciting.

Brad:

Yeah, it’s one of those Wild West type spaces at the moment.

Anton:

Well, good luck. Thank you. Great to have a little chat. Thanks for joining Managing Marketing today. I’ll just leave you with one quick question: when the robot takes over your job, what will you do?

Ideal for marketers, advertisers, media and commercial communications professionals, Managing Marketing is a podcast hosted by Darren Woolley and special guests. Find all the episodes here.