
The Seven Habits of Effective SEM: Part II – Keywords

When I first wrote about the seven habits of effective SEM, my primary motivation was to point out that keyword selection is not the be-all and end-all of SEM. I have seen too many people waste too much of their day trying to come up with that killer ‘long tail’ keyword, instead of spending their time more wisely on other equally important aspects of SEM. Indeed, I sometimes wonder whether keyword selection still deserves a spot among the top seven SEM techniques, since the search engines continue to ramp up their broad matching technology (see, for example, Google’s recent “advanced broad match” announcement and Yahoo’s new terms and conditions, which allow them to optimize your accounts for you), making it more and more difficult to find keywords where your competitors are not.

In the end, I concluded that keywords are indeed still important, just not as important as they once were. So here are my best practices for keywords:

1. Create Basic Keywords. Root terms, synonyms, action prefixes and suffixes, run-ons and misspellings, and plurals (Google only). Before you start investing in expensive keyword research tools (some of which can cost up to $30,000 a year!), I recommend that you simply brainstorm a basic set of keywords. There are five types of keyword sets that every campaign should have. These are:

a. Root terms. This is the most basic keyword that relates to your campaign. If you are buying keywords for a mortgage lead campaign, this would include words like “mortgage”, “mortgage rates” and “mortgage quotes.”

b. Synonyms. Alternative words that basically mean the same thing as your root terms. Again, thinking about mortgages, this might include “home loans”, “refinancing”, and “home equity.”

c. Action Prefixes and Suffixes. These are words appended to the front or back of a root term that a user might type in to further qualify their query. There are two types of prefixes/suffixes: general and category-specific. A general prefix would be something like “buy”, “find”, or “best.” A category-specific prefix might include a geographic region, a qualifying phrase like “bad credit”, or a commercial name like “Wells Fargo.” Note that the most generic prefixes and suffixes (like “the”) have now been almost entirely broad-matched out of existence, so if you see a prefix or suffix getting no traffic, this may be the reason (and you should probably delete that keyword to clean up your account).

d. Run-ons and Misspellings. Like generic prefixes and suffixes, the utility of run-ons and misspellings is much less than it once was. Still, you can sometimes get a few cheap clicks by creating words like “mortgagerates” and “refiancing.” You should put these in their own ad groups, especially if you are using dynamic keyword insertion (DKI) in your ad text. I recommend that you don’t get too carried away with run-ons and misspellings – you should limit this practice to the highest volume keywords in your account.

e. Plurals. User behavior can differ significantly between the singular and plural versions of a keyword (see further discussion below). As such, you need to make sure that all of your top keywords include both variations. Note that this is not necessary on Yahoo, as Yahoo does not differentiate between singular and plural.

If you create five root terms, five synonyms, and 10 prefixes and suffixes, and use plurals, this will result in a list of 200 keywords (10 base terms × 10 prefixes/suffixes × 2 for plurals). Add in another 20 misspellings and you are up to 220 keywords. Add in all 50 states, specific cities, and combinations of prefixes and suffixes on the same keyword, and you can see how these five simple rules quickly build a keyword set without your ever touching a fancy keyword tool!
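To make the multiplication concrete, here is a rough sketch in Python. Everything in it is illustrative – the sample terms and the naive pluralizer are stand-ins, not recommendations – but it shows how a few small lists combine into hundreds of keywords:

```python
# Illustrative inputs -- substitute your own category's roots, synonyms, and affixes.
roots = ["mortgage", "mortgage rates", "mortgage quotes"]
synonyms = ["home loans", "refinancing"]
prefixes = ["best", "find", "buy", "cheap", "compare"]
suffixes = ["california", "bad credit", "online", "calculator", "companies"]

def pluralize(term):
    # Naive pluralization for illustration only; a real campaign needs a word list.
    return term if term.endswith("s") else term + "s"

keywords = set()
for base in roots + synonyms:
    for variant in (base, pluralize(base)):
        keywords.add(variant)
        for p in prefixes:
            keywords.add(p + " " + variant)
        for s in suffixes:
            keywords.add(variant + " " + s)

print(len(keywords))  # grows multiplicatively with every new root or affix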

2. Don’t Overdo It. While it may be true that the keyword “Pacifica California Subprime Refinancing Interest Rates Mortgage Companies” will not be specifically purchased by many of your competitors, it is no longer true that you alone will show up on this keyword should you be the only one to buy it. As I have noted numerous times in the past, the search engine “broad matching” algorithms have gotten increasingly better at aggregating tail keywords into the same auctions with head terms.

In the past, there were two advantages to tail terms – first, that you could show up by yourself on that keyword (no longer the case with broad matching), and second, that you could improve your click through rate (CTR) for that specific query and pay less for high position. The second may still be true to a limited degree, but you aren’t going to be able to pay $.10 on a six token (word) keyword phrase and outperform a big competitor paying $5.00 on a head keyword.

Moreover, Google has explicitly stated that keywords beyond five tokens will be automatically considered “low quality” by their Quality Score algorithm. The rationale behind this (which I don’t necessarily buy, by the way) was recently summarized as follows:

very long phrases and very low volume keywords well down the long tail are not necessarily an advantage to a marketer, as they don’t reflect how “real users” normally search. The sweet spot of the long tail is the 2-to-4-word phrase; 5-to-8-word phrases, not so much. Among other things, Google will have such limited data on these that they have no choice but to assign slightly worse quality scores to them.
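Whether or not you buy the rationale, it is worth knowing how much of your list sits beyond five tokens. A quick audit sketch, with a made-up keyword list:

```python
# Flag keywords whose token (word) count falls outside the 2-to-4-word "sweet spot".
keywords = [
    "mortgage rates",
    "bad credit mortgage loans",
    "pacifica california subprime refinancing interest rates mortgage companies",
]

for kw in keywords:
    tokens = len(kw.split())  # a token is just a word
    if tokens >= 5:
        print(f"REVIEW ({tokens} tokens): {kw}")
```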

The other hidden danger of millions of obscure keywords is the risk of either slow bleeds or sudden keyword explosions. A slow bleed occurs when you have 50 or 100 keywords each costing you $2 or $3 a month. Individually these keywords fly below the radar, but collectively they can cost you thousands of dollars a year. Unless you have a bid management system that can cluster similarly-situated keywords, you are unlikely to discover these bleeders.

A sudden explosion occurs when one of your random long-tail keywords is suddenly matched against a major search term, or a news event causes that keyword to get a spike in traffic. As an example, a few years ago I bought the word “Pope mortgage” (Pope happens to be the name of a city, with the word mortgage appended to the end). When Pope John Paul II died, this keyword received a huge rush of unprofitable clicks in a short time.
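Both failure modes are easy to catch if you aggregate daily keyword stats. Here is a rough monitoring sketch, assuming you can export rows of (keyword, date, clicks, cost) from your engine account; the data and thresholds are made up for illustration:

```python
from collections import defaultdict

# Made-up export rows: (keyword, date, clicks, cost)
rows = [
    ("pope mortgage", "2008-06-01", 2, 1.40),
    ("pope mortgage", "2008-06-02", 310, 217.00),   # sudden explosion
    ("mortgage refi az", "2008-06-01", 1, 0.90),    # quiet bleeder
]

cost_by_kw = defaultdict(float)
clicks_by_kw_day = defaultdict(dict)
for kw, day, clicks, cost in rows:
    cost_by_kw[kw] += cost
    clicks_by_kw_day[kw][day] = clicks

# Crude spike test: the latest day's clicks exceed 10x the prior day's.
for kw, by_day in clicks_by_kw_day.items():
    series = [by_day[d] for d in sorted(by_day)]
    if len(series) > 1 and series[-1] > 10 * max(series[-2], 1):
        print(f"SPIKE: {kw} jumped to {series[-1]} clicks")

# Crude bleed test: keywords quietly spending money on almost no clicks.
for kw, cost in cost_by_kw.items():
    if cost > 0 and sum(clicks_by_kw_day[kw].values()) < 5:
        print(f"BLEEDER: {kw} spent ${cost:.2f} this period")
```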

All this being said, there is still value to the long tail. For most non-retailers (i.e., companies that don’t have thousands of products for sale), a good rule of thumb is to have somewhere between 500 and 5000 keywords in your account. If, however, you find yourself patting yourself on the back for having developed three million keywords, you are living in the past and need to start living in 2008!

3. Keep Them Targeted. Although Google allows 2,000 keywords in an ad group, this does not mean you should strive to pack as many keywords into as few ad groups as possible. Indeed, in most instances you will be better served by having a few keywords in each of many ad groups. There are two primary reasons for narrowly targeted ad groups: CTR and Quality Score. Your CTR will increase if your keywords are closely related to your ad text, and segmenting similar keywords into well-defined ad groups enables you to create very relevant ad text.

Your Quality Score will also benefit from well-defined ad groups. Google rewards advertisers who send a targeted keyword to a targeted ad text to a targeted landing page. When you combine a better Quality Score with a higher CTR, you have addressed two of the three factors that determine your position on Google (the third being max CPC). This can enable you to pay a lot less than your competitors for the same keywords.

Just to be clear, you could take this targeting approach to the extreme by literally having a one-to-one relationship between a keyword and an ad group. If you have the ability to automatically create relevant ad text and automatically make bid adjustments, this might make sense. If you are doing most of your work manually, however, the management costs associated with thousands of ad groups may not be worth the effort.
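For those with the automation in place, the one-to-one approach might look something like this sketch. The field names, ad template, and character limits are simplified assumptions, not any engine's actual API:

```python
# One ad group per keyword, with templated ad text (simplified sketch).
def build_ad_group(keyword):
    return {
        "ad_group": keyword.replace(" ", "_"),
        "keywords": [keyword],
        "headline": keyword.title()[:25],  # illustrative headline limit
        "description": f"Compare {keyword} offers. Free quotes in minutes."[:70],
    }

for kw in ["mortgage rates", "bad credit home loans"]:
    print(build_ad_group(kw))
```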

4. Track at the Keyword Level. Keywords are the DNA of your SEM campaigns. As such, you need to measure their performance on a keyword-by-keyword basis. Whenever I see a tracking URL that reads www.domain.com/?campaign=GoogleAdWords, I know that the campaign is not being properly optimized. Individual keywords vary tremendously in performance. I often use the words “mortgage rate” and “mortgage rates” to prove this point: someone who types in “mortgage rate” is most likely looking for today’s current mortgage rate, while someone who types in “mortgage rates” is looking to get multiple mortgage quotes. Depending on your business, the conversion rates of these two keywords can vary dramatically.
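Getting keyword-level data usually just means tagging each destination URL with enough parameters for your analytics package to break out conversions per keyword. A minimal sketch; the parameter names here are arbitrary, not any particular package's convention:

```python
from urllib.parse import urlencode

# Tag the destination URL so analytics can report conversions per keyword.
def tracking_url(base, campaign, ad_group, keyword):
    return base + "?" + urlencode(
        {"campaign": campaign, "adgroup": ad_group, "kw": keyword}
    )

print(tracking_url("http://www.domain.com/", "mortgage", "rates", "mortgage rates"))
# http://www.domain.com/?campaign=mortgage&adgroup=rates&kw=mortgage+rates
```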

5. Test Match Types. Google offers three match types – broad, phrase, and exact – and you should make a point to test keywords on all of these match types. In most cases you will find that your exact match keyword has the highest conversion rate but also costs you the most with the least amount of traffic, and that the exact opposite is true for broad match. But every account is different and you need to test performance for your specific campaign. Note that I don’t count match types as part of the total number of keywords in your campaign – in other words, if you have 5000 keywords but you match each of these three times, your account would have a total of 15,000 keywords, which I still think is acceptable without “overdoing it.”
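Expanding a list across match types is purely mechanical. In Google's bulk-editing syntax, phrase match is quoted and exact match is bracketed, so a helper like this sketch can triple a list in one pass:

```python
# Return the broad, phrase, and exact match variants of one keyword.
def expand_match_types(keyword):
    return [keyword, f'"{keyword}"', f"[{keyword}]"]

for variant in expand_match_types("mortgage rates"):
    print(variant)
# mortgage rates
# "mortgage rates"
# [mortgage rates]
```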

6. Be Negative. The number one mistake I see novice search marketers make is not paying enough attention to negative keywords. As the search engines continue to push the limits of broad matching, your best defense against a flood of unproductive clicks is to add tons and tons of negative keywords. There are two ways to create negative keywords. The first is to build a generic list of negatives that applies to almost any keyword – words like “lawsuit”, “complaint”, “refund”, “scam”, “do it yourself”, “free”, “sex”, and “UK” – though this will vary depending on your business.

The second way to create negative keywords is to use the Google keyword tool to look for words that are semantically related to your product but in actuality not related at all. A funny example: if you sell “night stands” (the bedroom furniture), you will want to exclude the word “one” so your ads don’t show on searches for “one night stands.” I consider the creation of negative keywords to be just as important as the creation of actual keywords.
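One cheap sanity check is to replay recent search queries against your negative list and see what would have been blocked. A minimal sketch, with made-up queries and single-word negatives only (real negative matching also handles multi-word phrases):

```python
NEGATIVES = {"lawsuit", "complaint", "refund", "scam", "free"}

# True if any token of the search query matches a negative keyword.
def blocked(query):
    return any(token in NEGATIVES for token in query.lower().split())

for q in ["mortgage rate lawsuit", "compare mortgage rates", "free mortgage advice"]:
    print(q, "->", "blocked" if blocked(q) else "eligible")
```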

Stay tuned for part three of this series – best practices in ad text.

 

 

Do Keywords Matter Anymore?

Once upon a time, you could make thousands of dollars a month in search engine marketing simply by finding the “long tail” keywords that your competitors had overlooked. This usually meant one of three things:

1. Misspellings: “mortage rates”
2. Run-ons: “mortgagerates”
3. Multi-word phrases: “san mateo county bad credit mortgage rate loan offers”

These words were veritable goldmines. While your competitors fought tooth and nail just to show up on the first page of search results for “mortgage rates”, you often found yourself alone (and with a bid of $.10) on the tail.

As a result of this phenomenon, SEM experts spent a lot of their time focusing on keyword research, perhaps at the expense of ad copy optimization, landing page optimization, and process automation. But who could blame them – they were taking advantage of market inefficiencies and making a lot of profit!

These days, the value of the tail has diminished, if not disappeared entirely. There are three primary reasons for this:

1. The Search Engines are Smarter: All of the search engines now have “broad matching” or “advanced matching” features. This means that a big competitor that bought the keyword “mortgage rates” will still likely show up when a user types in “best mortgage rates” or “discount online mortgage rates.” The rationale behind this – from the search engine’s perspective – is fairly obvious: by populating obscure long-tail searches with ads from very competitive keywords, the engines drive bids up dramatically (no more $.10 clicks on the long tail).

It used to be that the search engines tried to balance their drive for additional revenue with user experience concerns by looking at “token matching.” Think of a token as a word. A phrase like “bad credit mortgage loans” has four tokens. The old rule was as follows: if a generic phrase matches at least 50% of the tokens in a long-tail phrase, the search engine would consider that relevant enough to show the broad match results on the long-tail phrase.

So, in the “bad credit mortgage loans” example above, “mortgage loans” and “bad credit loans” would be broad-matched, but “mortgage” and “loans” would not (because they only have 25% of the tokens).
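In code, the old 50% rule is just a set-overlap test. A sketch, using the same example:

```python
# What fraction of the user's query tokens does the broad keyword cover?
def token_match_ratio(broad_kw, query):
    broad = set(broad_kw.split())
    query_tokens = query.split()
    return sum(1 for t in query_tokens if t in broad) / len(query_tokens)

query = "bad credit mortgage loans"
for kw in ["mortgage loans", "bad credit loans", "mortgage", "loans"]:
    ratio = token_match_ratio(kw, query)
    print(f"{kw}: {ratio:.0%} -> {'match' if ratio >= 0.5 else 'no match'}")
# mortgage loans: 50% -> match
# bad credit loans: 75% -> match
# mortgage: 25% -> no match
# loans: 25% -> no match
```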

Based on what I see in the market today, this is no longer the case. The search engines have basically moved away from the “token-matching” system and are getting closer and closer to a “category-based matching” approach. In this model, a broad phrase like “mortgage rates” could be matched not only on “bad credit mortgage rates” but on “home loan rates”, “find a mortgage”, and “san mateo county loans for new home buyers.”

One other related point is worth noting. In the olden days, even if a broad match had a 50%+ token match and did show up alongside long-tail searches, it had to compete in the same auction for top ranking.

Here’s what I mean: let’s say that you paid $5 CPC on Google for the keyword “mortgage rates” and you elected to be broad-matched across all long-tail keywords Google thinks are relevant to you. Let’s say that I bought the keyword “alabama low mortgage rates” and I was willing to pay $2 CPC for this keyword.

When a user did a search for Alabama low mortgage rates, Google looked at two factors to determine ranking – maximum CPC and click-through rate (CTR). The key point here, however, is that Google looked at CTR on a keyword-specific basis. So, if your broad-matched keyword had an overall CTR of 10%, but only a .5% CTR on the specific keyword “Alabama low mortgage rates”, Google would use the .5% CTR to determine ranking. So, if my $2 bid had a CTR of 10% (due to the fact that I would likely have highly-specific ad text, and I might be an Alabama-specific mortgage lender), my effective cost per thousand impressions (eCPM = CTR × CPC × 1000) would be $200 and yours would only be $25. So, if we went head-to-head against each other, I would clearly show up above you.

That is not the way things work today. As far as I can tell, Google has moved away from the keyword-specific auction model to more of an “overall CTR” auction model. In this new scenario, if your generic keyword has an overall CTR of 5% (but remember, only .5% on the specific keyword) and my specific keyword has a 10% CTR, the eCPMs would be $250 for you and $200 for me – you would show up first. The long and the short of it is that this new system decreases the relevance of the tail by enabling high-CPC generic keywords to outposition targeted keywords.
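If you want to check the arithmetic in both scenarios, the whole model reduces to eCPM = CTR × CPC × 1000. A sketch with the numbers from the example above:

```python
# Effective revenue per thousand impressions.
def ecpm(ctr, max_cpc):
    return ctr * max_cpc * 1000

# Old model: each keyword auction used keyword-specific CTR.
print(ecpm(0.005, 5.00))  # generic advertiser on the tail query: $25
print(ecpm(0.100, 2.00))  # targeted advertiser: $200 -- the targeted ad wins

# New model: the generic keyword's overall CTR carries into the tail auction.
print(ecpm(0.050, 5.00))  # generic advertiser: $250 -- now the generic ad wins
```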

By changing “token matching” to “category matching” and by changing the concept of a “single auction” to an “overall auction”, the search engines may serve slightly less specific ads to the user, but they virtually eliminate market inefficiencies (read: no more low CPCs) for themselves.

Of course, the flip-side of that is that as user queries are consolidated into a few categories, the bulk of ad-generated traffic is consolidated into those advertisers who can pay top position for the most popular searches. Or, to put it another way, instead of entering into 10,000 auctions for 10,000 different keywords, advertisers are now entering into just a few auctions for those 10,000 keywords. If you want to get traffic, you have to win in one of these few auctions.

2. Competitors are Smarter: Even assuming that the search engines weren’t doing everything in their power to move generic keywords into the long-tail results, the advantage of huge keyword lists has been reduced simply because competitors have caught onto the tactic. There are now dozens of companies (Trellian, WordTracker, GoogSpy, BadNeighborhood) that offer reams of keywords for pennies a day. Some of these folks now even offer APIs so that you can directly send the keywords right from their databases to your search engine accounts.

On top of that, the search engines have realized that being open about keywords helps their bottom line. Both Yahoo and Google offer keyword suggestions (taken directly from either user queries or other advertisers’ keyword lists) at multiple points in the account set-up process.

Two years ago, perhaps 30% of any keyword market was relatively devoid of advertisers and presented great arbitrage opportunities. Today, my guess is that less than 10% of the keyword universe isn’t heavily saturated. Combine smarter advertisers, more advertisers, and better tools, and tail keywords are much less valuable.

3. Consumers are Smarter: Finally, let’s not forget about the consumer. Remember the growth of AskJeeves, the “natural language” search engine? Consumers loved the fact that you could type in a question like “Where can I find low mortgage rates?” and get relevant results. Of course, it turned out that this was basically a gimmick – the system simply excluded noise words like “Where can I” and found results based on “find low mortgage rates.” Consumers caught on, the novelty wore off, and now Ask has abandoned the “natural language” concept altogether and is just another search engine.

Apparently, consumers are getting smarter on Google and Yahoo as well. A friend of mine who has pored over the millions of user queries inadvertently released by AOL tells me that user “query length” (the number of tokens in a search) is getting shorter. This suggests to me that users recognize that you can basically get what you want from a search engine with a basic search, rather than a 20-word diatribe.

No doubt searchers will also gradually learn to eliminate noise words like “and” or “where”, as well as to stop typing in searches like www.ebay.com into the search engines (did you know, by the way, that one of the top searches on Google is . . . “Google”? This will decline over time . . . I hope).

In any event, as searchers become smarter, the volume of long-tail searches seems to shrink, further reducing the payoff of heavy keyword research.

Conclusion

Combine search engines maximizing revenue by showing generic ad results, competitors who now understand the long tail, and consumers who have a much better understanding of how search engines work, and I think it’s safe to say that the Holy Grail of search engine marketing is no longer the keyword. No doubt keyword lists are important, but you won’t be able to retire in Aruba just because you concatenated three adjectives, two city names, and two suffixes.

This basically means two things: 1) SEMers are going to have to get a lot smarter about the other elements of SEM – ad text, landing pages, bid prices, analytics, and filtering – and 2) folks who survived on the long tail and market inefficiencies are in trouble (especially when you add in Google’s Quality Score changes).

A few months back, I wrote about the similarities between SEM experts and eBay power sellers, the concept being that both eBay and SEM used to be easy for anyone to do but are now rapidly becoming the domain of specialized experts. The end of keywords is yet another example of this phenomenon for SEM.