The SEO and analytics communities have a lot to talk about right now: Google released Hummingbird, its biggest algorithm change in over a decade, and made it clear that it is rapidly moving toward 100% encrypted search (which means no keyword referral data for those of us who work with analytics).
I started this post in response to a thread on LinkedIn, but I realized that I had more to say than could fit there. If you searched these terms and found this post, you probably have a job like mine, so feel free to skim, skip to the possible solutions, act like you knew it all along, or share it with others…please! :)
Why Semantic Search Is Better than Keywords
Keyword data can be valuable, but let me explain a real-world pitfall: I’d like this site to rank for ‘SEO company columbus ohio’, yet a larger share of queries are for ‘SEO companies…’
Which version (or both) should I put in the title of the home page? What about the h1 and that precious first line of the first paragraph? And what about ‘firm(s)’ or ‘consultant(s)?’
‘Companies’ is inaccurate, but has twice the volume of ‘company’…hmm.
I think we all see where this is headed.
What modern inbound marketer hasn’t faced such a dilemma? Who among us hasn’t competed with a large organization with no local presence that has optimized for exactly the most commonly searched product or service phrasing followed by a city or state?
That is keyword search, and it kinda sucks; it’s dumb and easy to manipulate. For another good explanation, check out this Hummingbird FAQ by Danny Sullivan; in particular, read the examples about halfway down the article.
Semantic search (linguistic or Semantic Web) and intent-based search (Hummingbird) seem a little more natural and genuine. Search engines had already gotten pretty good at understanding intent, interests, misspellings…in short, natural language. You know, build sites for users, not for search engines, and all that jazz.
Potential Solutions to the Lack of Keyword Data
First of all, let’s not forget there are other search engines, and I haven’t heard that they plan to stop passing keyword data. Not that Yahoo! is close to displacing Google’s dominance of the search market, but even if Bing and Yahoo! continue at a humble 20% of search market share, that is a large enough sample to be meaningful, and both of them have been more aggressive in their marketing lately (recall that Yahoo! got more visitors than Google for the first time since 2011, and Bing’s ‘Bing It On’ campaign).
As for the rest of these, I’ll spare you the prose; here’s the list:
Google has announced plans to extend the “top search queries” data from 90 days to 1 year (and you can archive it yourself if you access and save it via the API; see the sketch after this list)
Use conversion tracking in Google AdWords to compare against exact match keywords/phrases (and if you deal with offline sales, check out the new offline conversion import tool for end-to-end comparisons)
Note: I am always careful to segment queries by Google property, and then filter out searches that are likely to have lower ‘buying intent’ (e.g., image searches). If you don’t do this, pages that receive a lot of image/video traffic can end up with skewed data, making it harder to divine which phrases were searched.
Lastly, use controlled experiments. Maybe you simultaneously create two very similar pages, each with a phrase variation to test. Give them a month to get indexed and see which does better (rank, conversions, visitors, etc.). Granted, that seems a little spammy and is a pain in the butt; however, if you are feeling bold, you could alternate phrasing every month or few, and after a few cycles you could probably get a decent idea of which performs better.
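On the archiving point above: since the query data only covers a rolling window, the simplest habit is to save each export with a date stamp. Here is a minimal sketch in Python, assuming you periodically download the “top search queries” CSV (by hand or via the API); the file paths and the snapshot_date column are my own hypothetical choices, not anything Google provides:

```python
import csv
from datetime import date
from pathlib import Path

# Hypothetical locations: today's download and the long-term archive.
EXPORT = Path("downloads/queries.csv")
ARCHIVE = Path("archive/top_queries.csv")


def archive_export(export_path: Path, archive_path: Path) -> None:
    """Append today's query export to a growing archive, stamped with the date."""
    archive_path.parent.mkdir(parents=True, exist_ok=True)
    new_file = not archive_path.exists()

    with export_path.open(newline="") as src, archive_path.open("a", newline="") as dst:
        reader = csv.DictReader(src)
        fields = ["snapshot_date"] + (reader.fieldnames or [])
        writer = csv.DictWriter(dst, fieldnames=fields)
        if new_file:
            writer.writeheader()
        for row in reader:
            # Stamp each row with the download date so snapshots stay distinguishable.
            row["snapshot_date"] = date.today().isoformat()
            writer.writerow(row)


if __name__ == "__main__":
    archive_export(EXPORT, ARCHIVE)
```

Run it each time you pull a fresh export and you end up with one growing CSV you can pivot on later, instead of losing anything older than the reporting window.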
I have no formal training in analytics. I got started because I had built a site, was figuring out how to market it, and learned that I could set up a Google Analytics account for free; so I did.
One of my first clients was a friend in need of help. The vast majority of their marketing budget was dedicated to AdWords, and (in the off-season) a payment to the company managing the campaign had bounced.
They asked me to look into the account. I did, only to realize it was being completely mismanaged; I quickly started working on it.
Before long, I realized that they had serious bounce rate problems, which I suspected stemmed from page load time (site/page speed) problems. What do I mean by that?
After I had fixed other issues, their AdWords campaigns averaged a 90%-ish bounce rate (with a site average of over 85% bounce rate).
One day, I decided to take all their ‘conversions’ (AdWords only) and add up their cumulative ‘time on site.’ I then multiplied the total number of visitors by the average time on site to get the cumulative time on site for all visitors. Subtracting the converters’ cumulative time from that total and dividing by the number of non-converted visitors gave me the average time on site for visitors who didn’t convert (conversions were minuscule in my eyes; Avinash says that 2% is the industry average, and they were probably in line with that).
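Translated into code, the back-of-the-envelope math looks something like this; the numbers are made up purely for illustration (the real figures came out of Google Analytics), and only the method matches what I did on paper:

```python
# Made-up inputs, purely for illustration.
total_visits = 10_000            # all visits in the period
avg_time_all = 38.0              # average time on site across all visits, in seconds
conversions = 200                # converted visits (roughly the ~2% Avinash cites)
converted_time_total = 60_000.0  # cumulative time on site for converted visits, in seconds

# Cumulative time for everyone = visits x average time on site.
all_time_total = total_visits * avg_time_all

# Strip out the converters, then average what's left over the non-converted visits.
nonconverted_time_total = all_time_total - converted_time_total
nonconverted_visits = total_visits - conversions
avg_time_nonconverted = nonconverted_time_total / nonconverted_visits

print(f"Average time on site for non-converters: {avg_time_nonconverted:.1f} seconds")
# If this number is smaller than your typical page load time, most visitors
# are leaving before the page even finishes rendering.
```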
What I learned was that for the vast majority of visitors (likely the ones who weren’t absolutely certain that this service was right for them), the time it took pages to load was longer than their average time on site.
Ouch.
I advised my clients/friends of this promptly. They opted not to do anything about it (the website was controlled by corporate), but it was eye-opening.
I didn’t know of any way to do this automatically (perhaps a segment or custom report would do it), so I just did it with a calculator and a sheet of scrap paper. And it gave me supremely useful information, just by using a bit of common sense and simple math.