The collective SEO and analytics communities have a lot to talk about right now; Google released Hummingbird, its biggest algorithm change in over a decade, and made it clear that it is rapidly moving toward 100% encrypted search (which means no keyword referral data for those of us who work in analytics).
I started this post in response to a thread on LinkedIn, but I realized I had more to say than could fit there. If you searched these terms and found this post, you probably have a job like mine, so feel free to skim, skip to the possible solutions, act like you knew it all along, or share with others…please! :)
Why Semantic Search Is Better than Keywords
Keyword data can be valuable, but let me explain a real-world pitfall: I’d like this site to rank for ‘SEO company columbus ohio’, yet a larger proportion of queries are for ‘SEO companies…’
Which version (or both) should I put in the title of the home page? What about the h1 and that precious first line of the first paragraph? And what about ‘firm(s)’ or ‘consultant(s)?’
‘Companies’ is inaccurate, but has twice the volume of ‘company’…hmm.
I think we all see where this is headed.
What modern inbound marketer hasn’t faced such a dilemma? Who among us hasn’t competed with a large organization with no local presence that has optimized for exactly the most commonly searched product or service phrasing followed by a city or state?
That is keyword search, and it kinda sucks; it’s dumb and easy to manipulate. For another good explanation, check out this Hummingbird FAQ by Danny Sullivan; the examples about halfway down the article are especially worth reading.
Semantic search (linguistic, or Semantic Web) and intent-based search (Hummingbird) seem a little more natural and genuine. Search engines had already gotten pretty good at understanding intent, interests, misspellings…in short, natural language. You know, build sites for users, not for search engines, and all that jazz.
Potential Solutions to the Lack of Keyword Data
First of all, let’s not forget there are other search engines, and I haven’t heard that they plan to stop passing keyword data. Even if Bing and Yahoo! hold a humble 20% of search market share, that is enough volume to provide a meaningful sample, and both have been more aggressive in their marketing lately (recall that Yahoo! recently drew more visitors than Google for the first time since 2011, and Bing’s ‘Bing It On’ campaign).
As for the rest, I’ll spare you the prose; here’s the list:
- Google has announced plans to extend the “top search queries” data from 90 days to 1 year (and you can archive it if you access and save it via the API)
- Google AdWords Keyword Planner and traffic estimator comes to mind
- Bing Webmaster Tools gives specific volumes of searches
- Google Trends
- Use this method to track the rank of organic visitors in Google Analytics with events (note the section about three-quarters of the way down where he discusses analysis of ‘not provided’ visitors)
- For methods of more detailed analysis of (not provided) visitors check out this post by Avinash Kaushik
- Use conversion tracking in Google AdWords to compare to exact match keywords/phrases (and if you deal with offline sales, check out the new offline conversion import tool for end-to-end comparisons)
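Since the “top search queries” report only reaches back a limited window, the data vanishes unless you save it regularly. Here’s a minimal archiving sketch, assuming you have a CSV export of the queries report in hand (column names vary by tool and date, so the script just tags each row with a snapshot date rather than assuming a schema):

```python
import csv
import datetime
import os

def archive_queries(export_path, archive_path="queries_archive.csv"):
    """Append a downloaded top-search-queries CSV export to a running archive.

    Assumes the export has a header row; each data row is stored with the
    snapshot date prepended so you can compare trends across downloads.
    """
    snapshot_date = datetime.date.today().isoformat()
    write_header = not os.path.exists(archive_path)

    with open(export_path, newline="") as src, \
         open(archive_path, "a", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)
        if write_header:
            writer.writerow(["snapshot_date"] + header)
        for row in reader:
            writer.writerow([snapshot_date] + row)
```

Run it on a schedule (cron, Task Scheduler) against each fresh export and you keep a history far beyond whatever window Google offers.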
Note: I am always careful to sort queries by Google property, and then filter out searches likely to have lower ‘buying intent’ (e.g. image searches). If you don’t, pages that receive a lot of image/video traffic can end up with skewed data, making it harder to divine the phrases searchers actually used.
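That filtering step can be sketched in a few lines. The property labels here (‘Web’, ‘Image’, ‘Video’) are placeholders; check what your own export actually uses:

```python
# Hypothetical property labels -- real exports may use different names.
LOW_INTENT_PROPERTIES = {"Image", "Video"}

def filter_buying_intent(rows, property_field="property"):
    """Keep only query rows from properties likely to carry buying intent,
    dropping image/video searches that skew the picture for pages with
    heavy media traffic."""
    return [r for r in rows
            if r.get(property_field) not in LOW_INTENT_PROPERTIES]
```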
Lastly, use controlled experiments. Maybe you simultaneously create two very similar pages, each with a phrase variation to test. Give them a month to get indexed and see which does better (rank, conversions, visitors, etc.). Granted, that seems a little spammy and is a pain in the butt; however, if you are feeling bold you could alternate the phrasing every month or few, and after a few cycles you’d probably have a decent idea of which performs better.
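When you compare the two test pages, eyeballing conversion counts can mislead you on small samples. A standard two-proportion z-score gives a rough read on whether the difference is real; this sketch uses only the standard library, and |z| above about 1.96 suggests a genuine difference at roughly 95% confidence (a rule of thumb, not a guarantee):

```python
import math

def conversion_z_score(conv_a, visits_a, conv_b, visits_b):
    """Two-proportion z-score comparing conversion rates of two pages.

    Positive z means page A converts better, negative means page B does;
    magnitudes near zero mean the data can't distinguish them yet.
    """
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled proportion under the assumption the pages perform the same.
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    return (p_a - p_b) / se
```

For example, 50 conversions from 1,000 visits versus 30 from 1,000 yields a z-score above 2, so the phrasing on the first page probably really is pulling more weight.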