So yes, recently there have been several “new” algorithm updates affecting some sites, and although I hate to be the type of person/writer/blogger who adds more chatter to your RSS feeds or bookmarked pages, I figured I’d chime in a little bit on both sides of the equation.
First off is the EMD update, which isn’t really a penalty: Google is merely trying to remove whatever weight exact-match domains (EMDs) carry as a ranking factor, nudging webmasters toward a branding perspective on things.
There’s no penalty for having an EMD! Poorly structured and designed EMDs will get hit though (and should).
Well-designed sites with good content AND EMDs can and DO rank well on their own, sometimes without any off-page SEO (link building) done to them. That said, this requires more of a build-it-and-they-will-come mindset, and it doesn’t always pay off for white-hatters and G snitches…
So what’s a poorly designed and structured website that’s susceptible to penalties?
Over-use of keywords and anchors, low-quality or minimal content, weird non-user-friendly internal linking, and so on.
I’m going to try and keep this post short, so suffice it to say: if you know when you’re looking at an inferior website, it’s safe to say that the Google geniuses have an algo for that. If you have an EMD targeting EMD keywords, devote time and energy toward developing reader value, and focus on user experience, then your EMD is a good EMD, and in a fair world it shouldn’t get whacked (shouldn’t…)
Google’s Link Disavow Tool: a mess they made and encourage…
If you’re unaware of this tool, it’s basically a mechanism by which webmasters (who don’t know any better?) can report inbound links TO their sites as “unsavory”, undesirable, and unwanted, asking Google NOT to count them as part of the “credited” inbound-links portfolio as seen by G.
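For reference, the file you upload to the tool is just plain text: one URL or one `domain:` entry per line, with lines starting with `#` treated as comments. The domains below are placeholders, not real sites:

```text
# Asked the owner of spamdomain1.example to remove the links
# but got no response, so disavowing the whole domain
domain:spamdomain1.example
# Disavowing a single page rather than a whole domain
http://spamdomain2.example/page-with-bad-link.html
```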
A lot of bloggers and SEO types are saying that this is just another fishing expedition by Google to get link neighborhoods “outed” so they can then go ahead and discredit certain link types, methods, sources etc…
Is the disavow tool really a fishing expedition, à la the bad-links messages in Webmaster accounts, or do they have a more altruistic reason for doing this?
Sometimes you have to look at things from a search engine’s perspective… With recent algorithm updates, Google made negative SEO even easier to do than before, making many people afraid of link building altogether; they either go Extreme White-hat or embrace the “no SEO is the new SEO” school of thought.
Since recent algorithm updates made negative SEO easier, more negative SEO is being done, and so the mess they created needed a counter-measure: enter the Disavow Tool.
In a way, it’s brilliant planning and foresight on their part: they rolled out algorithms that made a lot of people scared to game the search engines with link building, and those same algo updates opened the door for the aforementioned negative SEO to become more commonplace.
People who (mistakenly or otherwise) had bad links built to their sites REALLY don’t know why their rankings dropped, so the safest thing to do is to get rid of whatever links they can by contacting the link-giving site owners, and if that doesn’t work, to just use the Disavow Tool.
What to do?
1- Watch your inbound link counts, track which ones are natural and which ones you self-generated, and don’t disavow ANY of them
2- Watch for link-bombing (negative SEO) done by others to your sites, and disavow those links only if you’re experiencing rankings drops
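The two steps above boil down to bucketing your inbound links by how they were generated. Here’s a minimal sketch of that triage in Python, assuming a hypothetical CSV export from whatever link tracker or Webmaster account you use; the `source` column is a tag you’d maintain yourself as you build links:

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical backlink export -- the column names and "source" tags
# are illustrative assumptions, not a real tool's format.
SAMPLE = """url,anchor,source
http://blogA.example/post,best widgets,self
http://newsB.example/story,Acme Widgets,natural
http://spamC.example/x,cheap widgets now,unknown
http://spamD.example/y,cheap widgets now,unknown
"""

def triage(csv_text):
    """Bucket inbound links by origin and list disavow candidates."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    buckets = Counter(row["source"] for row in rows)
    # Only links you DIDN'T build are disavow candidates, and even then
    # only act on them if rankings are actually dropping (step 2).
    candidates = [row["url"] for row in rows if row["source"] == "unknown"]
    return buckets, candidates

buckets, candidates = triage(SAMPLE)
print(buckets)      # counts per bucket: self / natural / unknown
print(candidates)   # possible negative-SEO links to keep an eye on
```

The point of keeping your own `self`/`natural` tags is step 1: you never disavow those, so anything that shows up untagged is the only pile worth watching.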
Final tip that makes all this worth reading: inbound link velocity should be commensurate with the amount of traffic your sites are getting, so ramp up accordingly… add in some PPC or other traffic sources to bring traffic to your sites UP to a level that looks natural next to the amount of inbounds coming in/being generated… 🙂