Managing Marketing: Ten Years of Google SEO Updates

Mike Morgan is the Founder and Director of High Profile Enterprises, a company that has been improving business results for companies like TrinityP3 for more than a decade. Mike shares many of the Google search changes that have occurred over the past decade and their impact on SEO. We also discuss the latest changes rolling out now, driven by Google’s desire to improve user experience, and the impact this will have on organisations small and large.

You can listen to the podcast here:

Follow Managing Marketing on Soundcloud, Podbean, Google Podcasts, TuneIn, Stitcher, Spotify, Apple Podcasts and Amazon Podcasts.



Welcome to Managing Marketing, a weekly podcast, where we discuss the issues and opportunities facing marketing, media, and advertising with industry thought leaders and practitioners.

Today, I am sitting down with Mike Morgan, Founder and Director of High Profile Enterprises, a company that improves business results for New Zealand and Australian businesses using SEO, content marketing and social media strategy.

And that is certainly what they’ve been doing for us at TrinityP3 for the past decade. So, it’s with great pleasure, I welcome you, Mike.


Thanks Darren, good to be here.


Look, it’s been a decade since we first had that conversation around what TrinityP3 could be doing with High Profile Enterprises and how High Profile Enterprises could help TrinityP3. It’s gone quickly, hasn’t it?


Yeah, it sure has, 10 years. I think it was January, wasn’t it? So, a lot has happened in 10 years.


Well, a lot has happened and particularly, in the space of SEO. I mean, we’re at the moment going through another of Google’s many updates, search updates, aren’t we?


That’s right. And this is something that they’ve been striving for, for a long time now. They have always wanted to present the best possible results for their customers, and their customers are the people using organic search. Keeping in mind, too, that if they have accurate results, it’s easier for them to sell advertising, which is what really makes them money.

So, yeah, it’s been 10 years of some turmoil, I guess, because every time they make a major change, there’s always the unintended collateral damage that happens. It’s such a complex space and the algorithms that they use are incredibly complex, so there’s almost always someone who’s hit by one of these major updates.

And as you said, the big one that’s rolling out at the moment is an update called Page Experience. And this is just another step toward what they’ve been pushing for since 2015, and that’s for all devices to offer a good user experience to people using search. And a lot of this is around speed and how pages load for the user.

The Panda Update


It’s been interesting, hasn’t it? Because I remember it was around the time that we started working together, that they had the first of what I’d call the animal updates. You know, they had Panda in 2011, which was all about eliminating the black hat SEO, wasn’t it?


Yeah. It was identifying poor content and penalizing sites that were indulging in mass duplication. And back in those days, there was a whole industry based around hacking search results. And these people found they could develop software that automatically created thousands of pages with slight variations and the content was all machine-generated.

And that would still be effective enough that it would bring tons of people to their site and allow them to sell whatever they wanted because they were able to dominate search results.

So, yeah, Panda was a big one because it was one of the first of their major updates that were seen to be punitive and to target people who were involved in malicious behaviour. And it was quite welcome, to be honest.


Well, that was the year when it was just about identifying search keywords and filling pages up with those keywords.


Yeah, very much. All of those black hat techniques, which gave a terrible experience to people actually visiting these sites, were gradually targeted over a period of several years before Google moved away from the punitive type of update to one that was more about encouraging people to get up with the play and give customers a good experience.


Well, I remember at the time Mike, because you looked at our content — I think I’d been writing a blog irregularly. And I say irregularly since 2006. But you were surprised because I hadn’t actually used any of those SEO black hat techniques, but then again, I hadn’t used any of the white hat techniques and had quite low traffic and authority, didn’t I?


Yes. I think everyone at that time was finding their way. There were a lot of people who realised that this was a really good way to communicate with people. But there were certain broader aspects that had to be addressed. And the content had to be created for a demand that was already out there.

As we found, there’s no point in having these small blog updates that didn’t really serve a purpose. Social media is fine for that, but if you’re actually creating content it needed to be a little bit more in-depth, a little bit more targeted with what was going on.

And so, as the search engines were changing the way that they ranked content, this is about the same time that we started working on ensuring that the content met all of those standards.

The Penguin Update


And then the next animal off the list after Panda was Penguin. 2012, I think it was, when they targeted, you know, spammy links or backlinks.


Yeah. And that was a major one. It caused huge consternation because a lot of people had been using a strategy that assumed all backlinks were good, that it didn’t matter where they came from. And so, there were all sorts of mechanisms to generate thousands of backlinks. And the majority of these were not of any reasonable quality, coming from sites that had been set up purely for links.

And some of the big sites that were hit by the release of Penguin, were the big article marketing directories, and these had been abused for several years with people who were publishing literally thousands of articles to generate backlinks.

Now, these have pretty much faded into obscurity because of the Penguin update. But there were also other ways. You could get links from all sorts of dubious places. And the downside of the whole Penguin thing was once people realised what it was all about and they had to play by the rules, it also became a way to actually attack your competitors.

Because sites were being penalised for poor quality backlinks, what people started to do was to use negative SEO which was generating thousands of low-quality backlinks from really dubious sites and aiming them at competitor sites. So, it got quite rough there for a while.

And so, in the end, to try to counter this, Google created a disavow tool where you went through a fairly complicated process of identifying all of the links that you didn’t actually build yourself and you wanted to get rid of, and you created these files and it allowed Google to review your site, and hopefully, to put you back on to your previous standing.

The problem with this was that it took several months for it to actually happen. So, for a lot of people, they lost a lot of business while this was all going on.
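For anyone curious, the disavow file itself is quite simple. This is a sketch of the plain-text format Google’s documentation describes, with hypothetical domains standing in for real ones:

```text
# Disavow file uploaded via Google Search Console.
# Lines beginning with # are comments and are ignored.

# Disavow a single spammy page:
http://spam.example.com/cheap-links/page1.html

# Disavow every link from an entire domain:
domain:link-farm.example.net
```

Each line is either a full URL or a `domain:` directive, and as Mike notes, the hard part was never the file format but identifying which links to include and then waiting months for Google to process the review.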


Having said that, I mean, it hasn’t really eliminated the backlinks, because I still get flooded with emails as I’m sure you are, from people saying to me, “Oh, could you just add this link into your article, which you wrote in 2012 with a do-follow, and I’ll put one in back to you.” There’s still this trading in backlinks going on, isn’t there?


Yeah. And links still are really important. And people will look at the authority of a particular domain and they will try and find some way to actually get a little bit of that authority whether that’s through a backlink from an older post, which seems to be a fairly common strategy. Or to send notes and say, “You’ve got a link broken on this post. It’s going nowhere. I’ve got a piece of content that would suit this perfectly.”

So, there’s a whole range of ways that people are still doing this … it’s an almost grey area. They’re using these strategies to get links. And in most cases, they don’t actually look to see what the site is talking about, so it’s not relevant.

And likewise with the guest post emails that we get all the time as well. All of these people saying they’ll pay for the post if it has a do-follow link. And of course, that’s totally against guidelines. There’s no way we’ll do that. But it’s right out there and there are a lot of people who are breaking the rules and generating links from wherever they can.

And most of them are from reasonable sites. They are staying out of the crosshairs, yeah, but they’re just using spammy tactics.

The Hummingbird Update


And look, in the same year, they also brought out another update called Page Layout, which was about advertising, but it didn’t fit the animal theme.

So, we’ll move on to the following year, 2013, I think it was Hummingbird, wasn’t it? Hummingbird was where they started moving away from simply counting the keywords that were found and started looking at the context. So, starting to almost use AI in a way that would look at the context those words were delivered in.


Yes. And this is again, the algorithms getting smarter and smarter. Up until this point, it was fairly clunky and people were using all sorts of techniques to get the exact right number of keywords onto a page. And that really created a whole lot of content that was hard to read because it was more focused on the SEO than it was on the actual end-user.

And so, things like Hummingbird were really welcomed because it gave us a focus on context as you said. This meant that again, we’ve got a machine analysing words but it just made it a lot smarter.


Well, I guess it was a smarter way of picking up where people were just seeding in keywords to get search results on queries rather than actually providing them in a context of overall information, I guess.

A good example from our own experience would be the number of times you see either names or things included in websites, just purely to get those search terms. I can’t believe the number of times TrinityP3 appears in very dubious websites when you bother to go down the long list of search results.


Yeah. And that’s been going on for years as well. That’s why it’s confusing, this whole thing is about keywords and context and tags. You only have to visit any one of a large number of SEO consultancies or agencies. And you can almost guarantee that you can go to one of the SEO pages and they all have exact match phrases linked within their content.

I saw one the other day from a fairly reputable company and there on the SEO services page, was the really uncomfortable, clunky use of the phrase “SEO company in Auckland”. And it didn’t fit the context. It didn’t read naturally in the sentence. And then that was linked to another page on the site to refer to that anchor text keyword.

And I look at that and think it is appalling. You’re not even thinking about the people that are reading about what you’re selling. You’re trying to manipulate search engines, but that’s at the cost of your visitors, your users and again, that shows the confusion.

The Pigeon Update


Yeah. And look, then the next year, 2014, it was interesting; Pigeon, when Google went back to their animals starting with P, after Hummingbird seemed to break the pattern. But it was quite well-named because it was all about location, wasn’t it? It was an update that allowed you to search based on your geography and use in your search queries things like “near me” to find the best results geographically.


Yes, and this was really useful for local businesses and it complemented what they’d been doing with Google My Business, and with the maps results that were already appearing in search results. Again, it made it a little bit smarter. With any of these, there are always issues that come up after the fact that they’ve had to correct. And this was no different.

But I think with local businesses, if you’re in a specific location and you’re doing a search for a particular type of food, takeaway food, for instance, it makes sense that you’re going to be shown someone who’s relatively close to you. And then there is a range of other factors around what the reviews are like for the business, and what its authority and popularity are.

And all of these things tie into, again, getting the best possible result in the local area. So, I think it was an important update and it affected the small businesses in particular.


Do you think it has a potentially negative impact on companies like TrinityP3 where we operate globally and yet, don’t necessarily have offices in every single market?


That’s always a challenge. And I mean, as far as those maps results, they do have some prominence on pages, and particularly with mobile, they take up quite a bit of real estate at the top of the page. And that’s the problem, virtual offices in multiple locations are just not going to cut it.

You really do need to … if you want to be able to rank for those local results, you really do have to have a presence in that market. And the presence in that market is usually more than the office — all of those different relationships, and some of those are through the backlinks and different site versions or even different domains for different markets.

If you have a dot com site that is global and you offer services, let’s say you’re a software service provider, services that can be accessed from anywhere, then you’re not going to be seen as a credible result for local business. And that kind of makes sense as well because people can use those sorts of services from anywhere.


Yeah, I guess I just have to rely on people putting in a “global marketing management consultants near me” to actually pop up on the top page.


Yeah. Well, I mean, if you’ve got a really strong presence, you’ll still be in the mix. And as far as the sort of searches that generate map results, they are very location-oriented. And people who type in a phrase like a particular product and say something like in Auckland, if someone’s typing heat pumps in Auckland or heat pump installers, that’s what they want.

So, as you’re based in Auckland, you’ve got a pretty good chance, if you’ve got the right SEO strategy, of showing up in those map results. Outside of that, if you’re looking at global marketing management consultants, that’s probably not something that you’re going to need a map result for.

The Mobilegeddon Update


Well, then 2015 was (and I love this one) Mobilegeddon. So, this is Google flexing their muscles about mobile-first. But they introduced an update that focused on how well a particular page on a website was displayed rather than the whole site. I mean, they had an evaluation of mobile-friendliness for the whole website, but particular pages were starting to be either up-weighted or down-weighted based on their mobile-friendliness.


Yes. A lot of this information was shared through Google Search Console and with the individual pages, a lot of people would go to the mobile-friendly test and put their home page into it and feel quite happy that they got a green result and everything was good.

But if they had actually gone through and tested each individual URL on their site, then, in most cases, they would have found they were struggling with other layouts on their site not meeting the standard: clickable elements being too close together, the site not fitting the screen dimensions correctly, a whole range of things that came from some pretty average responsive design in themes and so on as well.

And so, what we had here was a report within Search Console that said how many pages didn’t meet the standard. And it was a part of the sort of encouragement or coercion to bring sites up to standard. We still come across sites far too often where you have to scroll and zoom in, in order to read something.

And of course, people just don’t have the patience for that anymore. And it’s only if it’s absolutely essential reading that you’d actually bother to do all that sort of zooming and scrolling that you need to do with those poor sites.

So, this is again about the experience, that user experience and making sure that these things all worked well for the person who’s using Google search to find the websites. So, that’s pushing the web toward making it a better place.


Well, it’s interesting, isn’t it? Because this is when mobile responsive sites became the word of the day. Everyone was talking about making their site responsive to the user.

I know from our own experience, what we’ve seen is the growth of users accessing the site from mobile. The desktop is still popular, but the tablet, the iPad and the like, have actually dropped significantly, haven’t they?


Yeah, that is interesting. And most people have got at least one of those devices in their homes and I’m sure they all still get used as well. But compared to mobile and desktop volume, it’s just nowhere near. And the difference from niche to niche is really interesting as well.

There are some niches that are a little bit more industrial, where the desktop is still way up there, around 80% of visitors. And there are others that are more in the consumer market, where mobile is up at around 70%, particularly where you’re generating a lot of traffic from Facebook or from other social media platforms.

Facebook is very mobile-dominated these days. And that tends to be reflected in analytics. If you’ve got a big Facebook presence for a particular brand, then you’ll find that your mobile visitors to a website will be a lot stronger than desktop.

The RankBrain Update


Also in 2015, there was a RankBrain update, which was again, an AI refinement on making sure the content was useful and that it was actually cohesive rather than just a collection of keywords. Wasn’t it?


Yeah. And this was really the big AI change and the last major one. It was eliminating some of the poor practice from the old days; keyword stuffing and those context tricks were pretty much shown to be a thing of the past, which is a great thing.

So, yeah, RankBrain really ramped up AI and again, the intelligence of the algorithms.

The Fred Update


Then it was a couple of years and it’s never been confirmed officially by Google, but there was the Fred update in 2017. Do you remember that one?


I do. It was almost like a joke, because Google hadn’t named many of the previous updates; those had been named within the SEO communities, which gave nicknames to these things. And so, from memory anyway, I think it was Gary Illyes from Google who joked that he was going to call it Fred. And I think it just kind of stuck.


But he had a great title; Chief of Sunshine and Happiness. What a title to have, Gary.


Yeah, I mean, Fred again, was more refining. It wasn’t directly targeted at one specific malicious element or any optimisation, specific optimisation technique. It was pushing things all in the right direction, but it did shake up search results. And again, there was the usual collateral damage and people who’ve gained from it as well.

But all of these updates after the punitive ones are all about gradually refining, and making the algorithms more intelligent to understand what people are really looking for. And by putting their searches in context and by putting logical progressions and what people are looking for, the search engine algorithm can predict as much as possible how they can help the person without them having to work so hard themselves.

And all of these different things like “People also ask” or some of those suggested searches in the dropdown. Some of those are hilarious by the way, but others can be really helpful. And from the SEO perspective, they are really useful tools because we can see what Google sees as being relevant behaviour and that helps us to find out if we’ve got any gaps in what we’re doing.


Yeah. And a part of that was also the refinement of natural language processing. So, starting to look at words in the context they’re used in rather than in isolation. It was refined even further during that period, wasn’t it?


Yes. And that becomes a bit tricky when you’ve got several different versions of English even before you start looking at other languages because everyone has their own idioms and usage of languages that are highly localised.

You just have to look at the difference between New Zealand and Australian English — there’s a lot that we have in common, but there are certain things that will come up in certain usages that are pretty confusing to search engines.

And we certainly find that with dealing with other languages like Te Reo Māori. In New Zealand, the search results in that language are relatively substandard at the moment.


Yeah. It is one of the criticisms about the internet that so much of the content is in English, in particular American English. I think there was a figure that about 70% of all the indexed content is in American English.


Yeah, it’s interesting, isn’t it? Obviously, these platforms, these massive tech companies, the majority of them have come out of places like Silicon Valley. And so, it follows on from that. And I guess the interesting thing to watch in the next few years, is we’ve seen a lot of work coming out of other areas.

And some of the most innovative platforms are coming from Eastern European countries, from countries like the Netherlands. And from places that you don’t think of as being big tech hotspots.

So, that’s going to be an interesting transition. And as that happens more and more, I think the approach to other languages and other versions of English will move forward as well. I think it’s lagged, but simply because of the power of Silicon Valley.

The Core Updates


Okay. So, then we get to the coronavirus pandemic and Google gives us two updates last year, the Core updates in May and December. What was the main focus of those?


It’s multi-faceted, and they haven’t given a lot of detail about what these Core updates have all been about. They’re a little bit reactive to changes in behaviour, but also it’s another step along their path toward constantly improving what they’re offering.

And to some degree, they’ve played around with the monetisation of what they’re doing, which is key to them being as successful as they are. And so, you do see these features moving around in the Google Ads area, with different uses of shopping, layouts, and numbers of ads at the top and bottom of the page; this is a constant process of change.

There were some particular niches such as tourism, where there was almost nothing organic on the first page of results. And they felt I think that they could actually comfortably do that because of the type of people who were searching for, say a hotel room in another country.

But of course, with everyone not moving around, it’s shifted things a little bit. So, there’s a lot of reaction going on. There are a lot of engineers working the whole time to try and solve problems. And when you have something this big, covering the entire globe and handling massive numbers of searches every hour, there’s a lot of data they have to go through to work out where they’re going with this.

And these unnamed Core updates and the lack of information around them are prone to speculation about what the purpose of some of these updates is. And in the end, we find that there are multiple areas that have been changed or moved. Whereas Google used to have almost a daily update of what they were changing in search results, they don’t do that anymore because people became fixated on it.

The thing that you need to continually do is keep on watching those organic results and if something is moving in a slightly negative direction, then work out a way to identify what caused that and make changes to reverse that trend.

And what I’m finding more and more, particularly in the last couple of years, is that the technical performance of websites has become critical. And it’s got to the stage where smaller businesses really aren’t able to keep up with the technical demands simply because to have that sort of development expertise can be really expensive, and it’s not on the radar for them.

And also, with things like security, if people aren’t investing and making sure that plug-ins are updated and themes are updated and versions of whatever platform they’re using are updated, the amount of hacking out there at the moment is extreme. And there are a lot of sites that are getting compromised.

And because of that, their businesses are being compromised, because Google won’t have a site with malware or any sort of malicious files on it in their search results.

The Core Web Vitals Update


And this has got us to this latest update, which is the Core Web Vitals around improving the user experience by increasing the responsiveness of the content on websites, isn’t it?


Yes, and there are a few things they look at pretty closely with this user experience update. When they first started offering these tests, PageSpeed Insights and then Lighthouse and other tools, people looked at them and thought, “If I can get up to 50-ish, I’m probably doing quite well for mobile speed.” And it was quite difficult to get up to that level.

And it was only really when we were in the lead up to the Page Experience update that we started to see what the real intention was. And the real intention was to get every site in the nineties for mobile and desktop, which is the green zone which was quite daunting. Very few people in the world actually had the technical expertise to be able to do that with sites.

And we also had the tests for Core Web Vitals, and those measure some really important things as well, such as Cumulative Layout Shift, which is that really frustrating thing where you go to a site, you’re just about to click on something and just as you do, it shifts, it moves, and you end up clicking on the wrong thing. And it happens all the time.

So, sites will be punished for not fixing that issue, whether it’s intentional or not. In some cases, it is intentional because they want you to click on ads and they make money. So, then the other tests are things like how many seconds it takes or milliseconds before you can actually interact with a page.

So, when you first arrive at a site and you want to click on something, so you can have a look at it, it’s measuring the time before that page is able to be interacted with. Other things like the largest piece of content on the page, how long it takes that to load, and then the first loading of the page.

And so, all these things have brought a lot more focus on hosting and on servers, particularly server response time and how quickly things load. And what we’re finding generally is that people are having to have some pretty interesting discussions with their hosting companies about what capacity and resources their sites have.

Because if you want a really fast site, you’re going to have to have really fast servers and lots and lots of capacity. And so, for a lot of businesses, it’s costing them more money for that hosting environment, but at least, they’re able to get those results for Core Web Vital tests.
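As a rough guide to the metrics Mike describes, Google publishes “good” and “poor” thresholds for the three Core Web Vitals. This small sketch uses those publicly documented threshold values (they are not from this conversation) to show how a measured value maps to a rating:

```python
# Google's published thresholds for the three Core Web Vitals,
# as (good_up_to, poor_above) pairs. Values between the two are
# rated "needs improvement".
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # Largest Contentful Paint, seconds
    "FID": (100, 300),   # First Input Delay, milliseconds
    "CLS": (0.1, 0.25),  # Cumulative Layout Shift, unitless score
}

def rate(metric: str, value: float) -> str:
    """Classify a measured Core Web Vital as good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

# Example: a page whose largest element paints in 3.1 seconds
print(rate("LCP", 3.1))   # needs improvement
print(rate("CLS", 0.05))  # good
```

So a page that shifts its layout mid-load (CLS above 0.25) or that can’t respond to a first tap within 300 ms lands in the “poor” band, which is exactly the behaviour the Page Experience update penalises.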


Mike, I think it’s sort of whether you see your website as actually an essential sort of customer interface or whether it’s just … I think a lot of companies still think of their website as a brochure on the internet. But if you think of it as an essential part of your business, you’re willing to invest in that.

I know all the work that we’ve been doing with you and your team around increasing the performance of the site has paid off for me, because I notice now my own experience of going to our site, how quick and responsive and easy it is to use, compared to other sites where you get frustrated and abandon the site because it takes so long to load.


Yeah, I think that’s going to be an interesting transition. Google has said with this update that there’s going to be a gradual rollout because what they don’t want is to have a whole lot of essential sites, media sites, and government, universities — all of those sort of sites that it’s critical they show up well on search results. They don’t want them demoted to page two or page three.

And so, they are testing and learning the whole way through this rollout. It started in June and they’re making changes the whole time. And by the end of August, it will be the main way that the algorithms work. And so, even at that point, there’s some debate about how much change that will mean and where the data is coming from about those changes.

And so, there’s a bit of wait and see, but from all of the communications from Google so far, it’s not going to be really dramatic, don’t worry too much. But sites that are really advanced and have a good user experience are going to perform better, and sites that don’t will perform worse; it can’t get much clearer than that.

The E-A-T Principle


I think from day one, one of the things that you kept talking about was that the core focus is not simply about improving the number of search results. That’s a measure of actually getting the core right. And I know Google focuses on this. It’s the EAT principles: expertise, authority, and trust, isn’t it? If you can develop a website that demonstrates and delivers expertise, does it with authority and builds trust, then you’ve got a good thing going.


Yes. And all of the things we’ve been talking about are a part of the EAT thing. So, backlinks; who’s linking to your site? It matters if you’ve got a whole bunch of major sites around the world linking to you, and they’re not ones that you can just submit content to.

So, people are talking about you, and if you have links from government sites or university sites, whatever they happen to be, referencing your website, then that’s a huge amount of reputation for your presence. The expertise as well: who are the people involved with the company, what’s their history? And are they producing content? What’s the quality of the content? What’s the quality of the links to their content?

So, it’s this massive ecosystem that takes into account who’s communicating with your site, who the people are within your organisation, and what you’re actually doing. Anything that breaches that trust line, and that’s why security is so important, can take a little while to recover from.

So, if you do end up with malicious files, you’ve got to deal with it immediately, and then make sure you go back and ask for a review and get back out there again because it can take months to recover from something like that. Particularly, if it’s a malicious one.


Yeah. Mike, I’ve just noticed the time, it’s been a terrific conversation going through the 10 years that we’ve been working with you guys, and you and the team have done an amazing job over that time. But unfortunately, I’m going to have to wrap it up.


Well, thank you, Darren. It’s been a pleasure working with TrinityP3. And we’ve seen so much change and we’ve learned a lot and it doesn’t slow down. In fact, it seems to speed up now with the new requirements.

The TrinityP3 team are amazing to work with because you guys are at the top of your game, and the information that you share with the world is remarkable. The sheer quantity, as well as the quality, of that information is really unmatched, not just in your own niche but across marketing in a more general sense as well. You guys are doing an amazing job.


Well, thank you very much. Look, I’m very aware that we’ve only spoken about SEO. I’d love to have you back, Mike. And we can talk about the other parts, which is content marketing and inbound marketing as well, and the role of social media. So, perhaps that’s for another day, but thanks for your time.

I’ve just got a question before you go; just from your perspective, what if anything, could TrinityP3 be doing better?

Ideal for marketers, advertisers, media, and commercial communications professionals, Managing Marketing is a podcast hosted by Darren Woolley and special guests. Find all the episodes here