Keyword Data Just Got Murkier (And What To Do About It)

The march toward keyword less-relevance (I can’t really say irrelevance, so let’s go with this instead) took a big step this week with Google changing the way it reports estimated keyword traffic in its somewhat maligned Keyword Planner tool. The SEM Post did a write-up and so did Search Engine Watch, so if you want more detail on exactly what changed than I’m covering here, check them out.

What, Exactly, Is Happening?

In a nutshell, Google is aligning its Keyword Planner data with the post cannot-opt-out-of-close-variants world: the data shown for a specific term now actually combines that term’s data with data from its close variants. Historically, if you entered variations on a keyword, the AdWords tool would show you data specific to that particular term and that term only. You could actually see differences between the singular and plural of a term, for instance. That is no longer the case.

Does This Even Matter?

In my opinion, this is a major bummer. Ask a PPC pro about whacked-out close variant matches and you will probably get quite an earful. I don’t even think that is the worst part, though. The worst part is that being able to get a sense of the volume (relative though it was) for slight variations on a term was really valuable, especially because, at least in my experience, clients are not always correct in their assumptions about which variation of a term is searched more frequently. Now we won’t be able to know this in a larger sense either, since all the related variants are lumped together.

On the one hand, you could argue that since Google is freely matching these things to each other anyway, being able to see discrete data on the terms really doesn’t matter. And that is kind of true. But no longer being able to see this kind of macro data makes strategy creation more challenging, I think. Not impossible, of course, but harder. Why? Well, I think most PPC pros would agree that user intent is key to creating effective strategy. And what determines intent? In my mind, the search query more than anything.

But if the queries for plumber and plumbing, for example, are going to get matched to the same content now, what does it matter which one you target? I guess it matters less now, but when I’m working on strategies for clients, part of what I want to develop for them is a deeper understanding of what their potential customers are specifically looking for. How I have historically done this is to look at their analytics (before the “not provided” era), the data in their AdWords and/or BingAds accounts, the data in Google Search Console (formerly Webmaster Tools), manual searching AND the Google keyword tool (in whatever incarnation it was at the time).

Unfortunately, this is part of a continuing trend over the past few years of providing less and less access to search term data for website owners and/or paid search advertisers. There was a bit of a hullabaloo last week when it seemed, for a few days, that Google was not going to let anyone access the Keyword Planner tool without an active AdWords account. After the story started floating around Twitter, and people were freaking out a bit about it, Google announced that the lack of access was a bug, not a policy change (story via Search Engine Land is here).

SEOs are in a worse boat on this whole thing than PPCs, although I suspect the close variant mashup experience is headed in their direction too… I am hoping that as RankBrain gets more entrenched in this whole process, Google’s general ability to parse queries and match them to intent will only improve. Time will tell on that one.

How Can You Work Around This?

Step one, keep calm.

Keep Calm And Get Creative

Mine Data You Have

It is a shame to lose access to more detailed macro data for search terms. Having a larger perspective on user behavior is really helpful, because data from any single site or account is always skewed in some way and does not necessarily represent a snapshot of the landscape at large. Keep this in mind as you work around the change: it is critically important not to conflate actual user behavior on a specific site with potential behavior for a set of keywords. Doing so can set you, and your clients, up for a world of hurt because of the myopic nature of this data.

There is still data available that will give you more specifics than the close variant frankendata, but you’re going to have to piece it together and do some extrapolating. I’d start working on new disclaimer language to include with your keyword targeting analysis and strategy work.

Data Source – AdWords or BingAds: If you’re running AdWords or BingAds, dig into your search query reports to find trends. You should be doing this anyway, of course, but these reports are now one of the only places where you can see the specific variant that produced a click.

Disclaimer: These reports show a limited view of the total activity for a term. They only include queries that resulted in a click on one of your ads, and they do not even show you all of the terms that generated clicks, at that. So, while helpful for seeing what is CURRENTLY driving traffic to your site, they will not show you terms that are not, but could be, driving traffic to your site.
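If you export those search query reports regularly, even a small script can surface variant-level trends. Here is a minimal sketch in Python, assuming a CSV export with “Search term” and “Clicks” columns (the column names and file name are placeholders; adjust them to match your actual export):

```python
import csv
from collections import Counter

def top_query_variants(report_path, min_clicks=1):
    """Tally clicks per search term from an exported search query report."""
    clicks = Counter()
    with open(report_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                # Column names are assumptions; rename to match your export
                clicks[row["Search term"].strip().lower()] += int(row["Clicks"])
            except (KeyError, ValueError, AttributeError):
                continue  # skip summary rows or malformed lines
    return [(term, n) for term, n in clicks.most_common() if n >= min_clicks]

if __name__ == "__main__":
    for term, n in top_query_variants("search_query_report.csv")[:20]:
        print(f"{n:>6}  {term}")
```

Even a simple tally like this makes it easy to spot, for example, whether the singular or the plural of a term is actually earning the clicks in your account.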

Data Source – Google Search Console (Formerly Webmaster Tools): If you’re not running Google Search Console for your sites, start right now. If you are, you can also find data there on the terms that caused your site to appear in search results, and which of those were clicked by searchers.

Disclaimer: Again, this offers another piece of the total pie. It is also limited to terms that are CURRENTLY triggering your site to appear in search results.
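One useful cut of that Search Console data is the set of queries where your site appears in results but earns no clicks, since those hint at terms that could be driving traffic. A quick sketch along the same lines, assuming a CSV export with “Query”, “Clicks”, and “Impressions” columns (again, placeholders; rename to match your export):

```python
import csv

def zero_click_queries(export_path, min_impressions=50):
    """List queries where the site appeared in results but got no clicks."""
    gaps = []
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                clicks = int(row["Clicks"].replace(",", ""))
                impressions = int(row["Impressions"].replace(",", ""))
            except (KeyError, ValueError, AttributeError):
                continue  # skip malformed or summary rows
            if impressions >= min_impressions and clicks == 0:
                gaps.append((row["Query"], impressions))
    # Highest-visibility misses first
    return sorted(gaps, key=lambda pair: pair[1], reverse=True)
```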

Use Other Sources To Find Trends In Search Terms Or Behavior

Third Party Tools

Now, I’m sure many of you are thinking, “Shouldn’t I just use a third-party tool where I can still see this data, like SEMRush or SpyFu?” That is a perfectly fine option, although I’ve always held that third-party tools should only be used to rank terms relative to each other, not in any kind of absolute sense. The same could be said for the previously available Google data as well. Time will tell whether their data morphs into what Google’s has become. Moz recently revamped its keyword tool, Keyword Explorer, and it is worth checking out too.

Manual Labor

Without a change in the way all of this works, using tools based on actual search behavior is probably a better bet for defining or refining a search term targeting strategy. Whether you use something like Ubersuggest or Answer The Public, or just start entering queries into search engines and note what pops up in the auto-suggestions, this kind of manual work will still show you whether variation A of a term is searched more frequently than variation B. You can create your own spreadsheet or matrix for the data.
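If you go the manual route, a small script can take some of the tedium out of collecting auto-suggestions. This sketch hits the unofficial Google suggest endpoint (the same one tools like Ubersuggest have been built on); it is undocumented, may be rate-limited, and could change at any time, so treat it as a convenience rather than a dependable pipeline:

```python
import json
import urllib.parse
import urllib.request

# Unofficial, undocumented endpoint; subject to change without notice
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q="

def autosuggest(seed):
    """Return Google's auto-suggestions for a seed term."""
    with urllib.request.urlopen(SUGGEST_URL + urllib.parse.quote(seed)) as resp:
        payload = json.loads(resp.read().decode("utf-8", errors="replace"))
    return payload[1]  # response shape: [seed, [suggestion, ...]]

# Compare variants side by side to seed your own spreadsheet or matrix
for variant in ("plumber", "plumbing"):
    print(variant, "->", autosuggest(variant))
```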

What To Do With The Data Once You’ve Compiled It

I like to look at the two halves separately – data for terms that are driving traffic to the site, and terms that are not driving traffic but could be. Looking at the full landscape makes for better strategy. For the terms currently driving traffic, hopefully you also have data on how they perform beyond the click – are they converting? Armed with that information, you can then look at the list of potential terms and try to find ones that share some characteristic with a currently successful term. I’d call those the low-hanging fruit. Looking at the potential term lists also generates ideas for new campaigns and ad groups.
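As a crude first pass at that matching, shared vocabulary can stand in for “shares some characteristic.” A hypothetical sketch (the term lists are illustrations, and the token-overlap heuristic is deliberately simple; refine it with stemming or categories as needed):

```python
def low_hanging_fruit(converting_terms, potential_terms):
    """Flag potential terms that share a word with a term that already converts."""
    winning_words = {word
                     for term in converting_terms
                     for word in term.lower().split()}
    return [term for term in potential_terms
            if winning_words & set(term.lower().split())]

# Hypothetical example lists
print(low_hanging_fruit(
    ["emergency plumber", "24 hour plumber"],
    ["emergency drain repair", "plumber near me", "bathroom remodel"],
))
# -> ['emergency drain repair', 'plumber near me']
```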

Then you have to decide how methodical you want to be about it all. Since Google is clearly trying to push us all toward using more general terms and letting it decide how many things to match to them, you could go that route and do frequent SQR reviews and negative list building. If you want to be more precise about which terms trigger which ads, that takes more work, particularly in sculpting negative lists by ad group. If you want to be sure, for instance, that only the singular version of a term triggers, you will need to add the plural and any other variants you can (a) think of and (b) find in your query reports to the ad group negative lists. This can get exhausting and time consuming, so it is important to carefully evaluate the benefit of this practice and not push past the point of diminishing returns.
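A script can also take some of the drudgery out of generating those variants. A minimal sketch that builds naive plural negatives in exact-match bracket syntax (English pluralization is messy, so eyeball the output and supplement it from your query reports before adding anything to a live list):

```python
def plural_negatives(singular_terms):
    """Build naive plural variants formatted as exact-match negatives."""
    negatives = set()
    for term in singular_terms:
        words = term.lower().split()
        last = words[-1]
        # Covers common -s / -es endings only; irregular plurals need manual work
        if last.endswith(("s", "x", "z", "ch", "sh")):
            words[-1] = last + "es"
        else:
            words[-1] = last + "s"
        negatives.add("[" + " ".join(words) + "]")  # exact-match syntax
    return sorted(negatives)

print(plural_negatives(["plumber", "pipe wrench"]))
# -> ['[pipe wrenches]', '[plumbers]']
```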

To some extent, creating sound strategy gets a little harder as the search term data gets muddier. But like everything else that has changed in the platforms, we will all learn to work within whatever constraints are thrown at us!

What about you – do you have a favorite keyword research tool or method? Does this move unnerve you at all? As always, share your thoughts in the comments or hit me up on Twitter (@NeptuneMoon).
