Yes, another article where I’m going against the “traditional” practices of so-called SEO.
Shock statement alert: I don’t agonize over deliberately targeting ONE exact keyword. Or over getting articles published with just the “exactly right” anchor text density. Or, for that matter, over stuffing them full of keywords.
I’m keeping this short and sweet, so I’m giving you a rundown of exactly why I don’t obsess over keywords (and why you shouldn’t either).
1. It doesn’t give a natural link profile
The way Google’s RankBrain is going, it’s essentially becoming more like an extremely fast and accurate human every day. And a human can pretty much tell when an anchor text looks deliberately placed for SEO purposes.
If the site in question has a dozen links, all on different sites, with the same keyword phrases…

Yep, that wouldn’t look too natural, and there’s a good chance of it being penalised (if not now, then a future update will annihilate it).
2. Google RankBrain doesn’t want keyword-stuffed content!
Ever since the Panda update in 2011, Google has been dampening down the relevance and power of keywords, thanks to the bad SEOers who kept over-optimizing their content by stuffing it full of them.
I mean sure, do it in a reasonable way: mention the keyword, and related keywords around it, where it’s contextually relevant.
But you definitely don’t want to be over-doing this.
3. The future of search is voice activated
With voice search queries increasing 35-fold between 2008 and 2016, it’s safe to say voice search optimization is going to become extremely important going forward.
And with devices interpreting whole spoken phrases and users speaking in a more conversational way, individual keywords by themselves, with no other supporting factors, just will not (and probably should not) carry any significant weight going forward.
Search engines will instead favour content that falls in line with the user’s request (as Google’s 2013 Hummingbird update did), bringing semantic context into play. Simply put, they analyse the language naturally and decipher what the search query actually means.
All we need is another update which goes even further than Hummingbird, signalling the final nail in the coffin for unnatural keyword usage!
The conclusion of all this? It’s not a bad thing! It’s a necessary change in how search engines work and, if anything, will improve the quality of content on the internet.
What I mean is, create good content, provide value for the reader/customer and you will naturally mention those “keywords” and a multitude of other associated words. Combine this with some link building and with time, you’ll end up ranking for a multitude of terms. And that’s without having to analyse keyword lists, stuff content with keywords or painstakingly calculate anchor text ratios.
I’m not saying those “traditional” ways don’t work (even though the really old-school way is going to end up exactly like the new way, funny how things work out!) – but why not future-proof your site and links while you still can?