The Google Shuffle and Keyword Vanity
by Geoff Hineman
It has been common knowledge for more than a decade that Google makes hundreds of changes to its algorithm every year. Some are large-scale changes that arrive with much fanfare, but most are small day-to-day changes that are more about testing than about exerting significant influence on the search engine results pages (SERPs). Google maintains that these changes exist to give users a better search experience, and that motivation is easy to understand: if users have a better search experience, they keep using Google... and Google keeps selling more ads. Indeed, the foundation of Google's revenue stream is its free product offering: organic search results.

When algorithm changes happen, a group of blackhat SEOs often tries to reverse engineer the change and cram websites full of exactly the things the change rewards, regardless of the user experience. In the past, this has included:
- Doorway pages
- Interlinked microsite networks
- Amassing thousands of spammy links
- Creating hundreds of pages of garbage content
- Pointing spammy links at competitors' sites
Oddly enough, the one strategy most of these practitioners avoid is actually creating substantive, meaningful content. Go figure. So, when the results start getting gummed up with poor sites, a new change is in order. Then, like clockwork, huge sites such as Amazon, Walmart, Wikipedia, and eBay return to the top of the SERPs.

What is a site in the middle supposed to do? You know, the site that is not a huge commercial enterprise, but is also not chasing and exploiting algorithm changes. What about the sites that briefly rise from number six to number one on a keyword before one of those giants resurfaces and squashes that optimism? Well, Google counts on that. Continued instability in the SERPs is Google's best friend. For organizations that count on being number one in the SERPs for a specific keyword, often to the detriment of the rest of the SEO program, AdWords is waiting in the wings. If Google could adjust those results in a way that would bring in even more ad revenue, beyond the dictum of simply providing a better search experience, where is the incentive not to do just that?
Several years ago, Google announced the Hummingbird change to its algorithm. Since then, it has also rolled out RankBrain. Both changes move away from matching individual keywords and toward discerning the meaning behind searches. For instance, if someone searches for "baseball caps," Google has learned that this is synonymous with "baseball hats" in most cases, so the two are treated similarly, if not identically. This is ideal, unless you really want to be number one for "baseball caps." Too often, marketing managers have an antiquated view of how keywords work in today's Google. The fixation on being number one for certain terms is understandable, even if it is a bit too much tunnel vision. SEO managers need to deal with these expectations, but only after a steady (and continued) effort has been made toward the goal of prolonged exposure at number one for said keyword. The options for managing expectations, then, become:
- suggesting a PPC campaign that will ensure constant exposure for that term, if budget allows, or
- focusing on a breadth and depth of keywords
The first option can be appealing for organizations that have a very narrow focus, or for those that can still maintain significant profitability while running a PPC campaign on a central term. In those situations, a PPC campaign is not only an option, it should be a REQUIREMENT. The second option really comes down to educating organizations, managing expectations, and formulating a more comprehensive strategy. Surely, most organizations focusing on one particular keyword can still make money from other terms. Using the earlier example, a site carrying baseball caps is also likely carrying jerseys, pennants, wrist bands, mugs, equipment, and so on. Expand your program to include these items. Chances are, these items sit in a much less competitive space, and the same effort exerted there will lead to more traffic and revenue. Expanding (and monitoring) the keyword list to include synonyms for the "main term" will give a broader picture of how the "product behind the term" is performing across the board.

These solutions might seem quaint. They might seem simple. They are, however, how keywords need to be viewed in the current Google landscape. That is by Google's design. For those who steadfastly hold on to their keyword vanity, Google loves them, because PPC is the only guaranteed solution... if budget allows. For everybody else, the solution is a much wider look at the business. More keywords. Going further down the long tail. Turning over every stone for opportunity.
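To make the "product behind the term" idea concrete, here is a minimal sketch of how an SEO manager might aggregate traffic by synonym group instead of by individual keyword. All keywords, groupings, and session counts below are hypothetical, invented for illustration; a real program would pull this data from an analytics or rank-tracking export.

```python
# Hedged sketch: measure the product behind the term, not a single vanity
# keyword, by grouping tracked keywords into synonym clusters and summing
# their traffic. Data and groupings here are hypothetical.
from collections import defaultdict

# Hypothetical analytics export: (keyword, monthly organic sessions).
keyword_sessions = [
    ("baseball caps", 1200),
    ("baseball hats", 950),
    ("fitted baseball cap", 400),
    ("team jerseys", 800),
    ("baseball jerseys", 650),
]

# Map each tracked keyword to the product/topic it represents.
synonym_groups = {
    "baseball caps": "caps",
    "baseball hats": "caps",
    "fitted baseball cap": "caps",
    "team jerseys": "jerseys",
    "baseball jerseys": "jerseys",
}

def sessions_by_product(rows, groups):
    """Aggregate sessions per product group instead of per keyword."""
    totals = defaultdict(int)
    for keyword, sessions in rows:
        totals[groups.get(keyword, "other")] += sessions
    return dict(totals)

print(sessions_by_product(keyword_sessions, synonym_groups))
# {'caps': 2550, 'jerseys': 1450}
```

Viewed this way, "caps" traffic is healthy even if "baseball caps" alone never holds the number-one position, which is the broader picture the paragraph above argues for.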
The bottom line is this: an entire business can't be boiled down to a single keyword. Some managers believe it can. Google realizes that, too, and counts on it to make money in AdWords. Most organizations, however, are not that one-dimensional. So, if you want to save money on PPC spend, glean what you can from current AdWords data, if available, and do your keyword research. Learn your industry from a wider perspective, then act accordingly. Remember, only one site can be number one for a given term at any time, and Google shuffles those positions for a reason. Focus on a wider range of terms. Create the best content you can for your products and services (no shortcuts) and future-proof yourself against future algorithm changes. At Lett Direct, we have found that this method always leads to gains with algorithm changes and rarely, if ever, leads to position drops.