Bill Slawski recently posted on a newly granted Google patent designed to “modify” rank for sites Google thinks are “spamming”.  It should be required reading for anyone interested in SEO.  In short, the patent describes Google applying time delays, negative responses, random responses, and unexpected responses to the placement of pages in the search results when the algorithm determines there is a possibility of attempted manipulation.  If the “spammer” reacts to these responses in an observable way, the page or site can then be designated as spam.
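To make the idea concrete, here's a toy sketch of the kind of response the patent describes. Everything in it is hypothetical (the function, the scores, the thresholds); the patent describes behavior, not an implementation:

```python
import random

# Hypothetical sketch of a rank-modifying response. The true rank is
# withheld when manipulation is suspected, replaced by delayed,
# negative, or random placements. All names and numbers are invented.

def displayed_rank(true_rank: int, manipulation_score: float,
                   days_since_change: int) -> int:
    """Return the rank to actually show in the results."""
    if manipulation_score < 0.5:
        return true_rank  # no suspicion: show the real rank

    if days_since_change < 30:
        # Time delay / negative response: recent changes are met
        # with an unexpected push in the wrong direction.
        return true_rank + random.randint(5, 50)

    # Random response: keep the suspected manipulator guessing.
    return max(1, true_rank + random.choice([-3, 0, 4, 12]))
```

If the site owner then reacts in a way that correlates with these artificial movements, that reaction itself becomes the confirming signal.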

Leaving aside the fact that all SEO is an attempt to manipulate or increase the rank of a website or page in the search results, and that entirely legitimate businesses following Google’s guidelines will be wrongly labeled as spammers, this patent points toward a very important distinction: micro vs. macro SEO.

Micro SEO Is Dead

Many years ago, SEO was simple.  You added keywords to all the right places, got your links with the right keyword anchor text, monitored your rank, and adjusted your keywords and links accordingly.  Sometimes you’d cross a threshold and hit a filter by having your keyword repeated too many times either on your site or in your external links.  No big deal.  Remove a few instances or switch them to synonyms, add some new links to diversify the anchor text in your portfolio, and BAM, you’d be back in the game in no time.

Spending hours and hours drilling down into Advanced Link Manager data and analyzing exact-match-anchors-from-unique-linking-domains…worked.  It worked well.

Nothing Lasts Forever

But a few years ago this started to change.  For example, Google began ranking pages based on the authority of the site rather than the authority of the particular page.  So you might find a page ranking well, analyze it, and find very few traditional ranking signals.  It was ranking based on site factors instead of page factors.  This complicated analysis a bit, but not THAT much.  If you understood it, you could still figure out what was going on.

It’s one example of micro-managing SEO getting more complex.  These days, though, micro-managing SEO is more than complex.  It’s a recipe for failure.

Although Google may only recently have been granted the rank-modifying patent, these random fluctuations have been at play for a long time.  Experienced SEOs knew them as growing pains.  Especially with new sites, attempts to move up in rank would cause random result placement for a while, which would eventually settle.  These time-based delays have been common for years, and if you didn’t know about them you could inadvertently screw yourself.  Inexperienced SEOs would get a handful of links, see a small upward or downward movement, and, not realizing the unexpected or inappropriate movement was influenced by a time delay, push harder or make drastic changes.  That pushed them past filter/penalty thresholds without realizing it, and their sites dropped into never-never land for a very long time, with little chance of redemption.

It also created situations where it was easy to spend more than you needed (in time or money) for link building.  For smart SEOs, knowing there was a sandbox or time delay allowed them to play the slow-and-steady game instead of moving too fast, crashing, and burning sites.

Historical/Temporal Data In Play

In addition to the above evolution of the algorithm, new penalties have been appearing for activity that looks unnatural over time.  Link velocity matters.  Link spikes that appear unnatural (potentially because they lack the other signals of natural spikes: mentions, traffic, etc.) can cause penalties.  And link loss can be as bad as link spikes.
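A hedged sketch of how such a spike might be flagged (the signal names and thresholds here are mine, not Google's): a burst of new links that isn't accompanied by the mentions or traffic a natural spike usually brings looks suspicious.

```python
# Toy unnatural-spike check. All inputs and thresholds are invented
# for illustration; the real signals and weights are unknown.

def looks_unnatural(new_links_this_week: int,
                    baseline_links_per_week: float,
                    mention_growth: float,
                    traffic_growth: float) -> bool:
    """Flag a link spike that lacks corroborating natural signals."""
    spike = new_links_this_week > 5 * max(baseline_links_per_week, 1.0)
    corroborated = mention_growth > 1.5 or traffic_growth > 1.5
    return spike and not corroborated
```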

One newer wrinkle hasn’t been in effect as long as the sandbox, the time delays, and the random ranking fluctuations: more recently, attempts to fix losses began leading to even further losses.  This may be the result of the above patent being applied before it was granted.  Here’s how it works: a webmaster gets a bunch of links and his site moves down instead of up.  He freaks out and removes the links, thinking they were the cause of the drop.  They were the cause of the drop.  But removing them looks even more unnatural than getting them, especially if the removal can be tied to the rank loss in time.  If the webmaster had tried to create additional signals, or had simply slowed down, he might have come back.  But by undoing what caused the drop, he confirms Google’s suspicion…and is now labeled a spammer with greater certainty.
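In code terms, the trap might look something like this toy confidence update (the weights and the two-week window are purely illustrative):

```python
# Toy model of "undoing confirms suspicion": link removal that closely
# follows a rank drop reads as a spammer reacting to the probe.

def updated_spam_confidence(prior: float, links_removed: int,
                            days_since_rank_drop: int) -> float:
    if links_removed > 0 and days_since_rank_drop <= 14:
        return min(1.0, prior + 0.3)  # reaction tied to the drop in time
    return prior  # slowing down or adding signals leaves prior intact
```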

Unreliability of Micro SEO

Google’s algorithm is a complex, constantly changing mix of numerous interrelated factors with variable thresholds and multiple layers.  It’s no longer possible to look at isolated data and arrive at actionable conclusions.  One site with 3,567 links may rank #1, while another site with 3,567 links may be completely removed from the results.  The distribution of links to an entire site can and does influence the rank of a single page on that site.  The anchor text profile (keyword vs. generic vs. brand vs. URL, etc.) matters.  The quality of the linking sites matters.  The diversity of linking sites matters.  The placement of a link on a page matters.  The rate of link acquisition and loss over time matters.  And all of these factors are interrelated, changing regularly, and different for different sites.  And there are many more factors!
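To see why an isolated number like a link count tells you nothing, consider a toy scorer with a few interrelated factors (the formula, weights, and thresholds are invented for illustration):

```python
# Two sites with identical link counts land in opposite places once
# diversity, anchor profile, and brand signals interact. Toy math only.

def toy_score(links: int, exact_match_anchor_pct: float,
              unique_domains: int, brand_signal: float) -> float:
    diversity = unique_domains / max(links, 1)
    over_optimized = exact_match_anchor_pct > 0.4  # invented threshold
    penalty = links if over_optimized else 0
    return links * diversity * (1 + brand_signal) - penalty

print(toy_score(3567, 0.10, 900, 0.5))  # diverse, branded: scores well
print(toy_score(3567, 0.65, 40, 0.0))   # same link count, goes negative
```

Same input on the one axis a micro analyst would stare at, opposite outcomes.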

If all of the above isn’t complex enough, add in purposely randomized results over randomized periods.

SEO is not dead.  But micro SEO died a long time ago.  (Unfortunately there are still many people selling it.  But I’ll save that for another post.)  Due to the complexity of the algorithm, analyzing results on a micro level is a waste of time at best.  And at worst (ever more likely), it will create obvious patterns that Google will notice and penalize.

Macro SEO: The Way Forward

Each time Google rolls out a significant change or a massive penalty, there are cries around the interwebs that SEO is dead.  The cries come from individuals whose current methodologies have died.  They don’t realize it’s not SEO that’s dead, but their particular micro tactics and strategies.  Macro SEO has always worked, and it will keep working as long as there are search engines ranking sites without requiring payment for placement.

What Is Macro SEO?

Macro SEO is about understanding the big picture.  What types of sites are ranking?  How big or small are they?  Are they brands?  What does their link profile look like, overall?  What does their anchor text profile look like overall?  Are the search results for a given phrase or niche dominated by big brands?  Are they dominated by Google verticals?

Macro SEO requires you to look at the details only insofar as they help you understand the big picture.  When micro SEO worked, ranking reports could be run weekly and micro changes could be made in response to ranking fluctuations.  With macro SEO, ranking reports are still extremely useful.  But instead of using them to make immediate adjustments, you use them to notice when significant changes have occurred and which direction those changes are pointing.
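A minimal sketch of that macro use of ranking data, with an arbitrary window and threshold, might look like this:

```python
# Detect a sustained directional change in weekly rank data instead of
# reacting to every wiggle. Window and threshold are arbitrary choices.

def rank_trend(weekly_ranks: list[int], window: int = 4,
               threshold: float = 5.0) -> str:
    """Compare the recent average rank to the prior average."""
    if len(weekly_ranks) < 2 * window:
        return "not enough data"
    prior = sum(weekly_ranks[-2 * window:-window]) / window
    recent = sum(weekly_ranks[-window:]) / window
    delta = recent - prior  # lower rank numbers are better
    if delta <= -threshold:
        return "significant improvement"
    if delta >= threshold:
        return "significant decline"
    return "noise: hold course"
```

Anything in the “noise” band is exactly the fluctuation the patent is designed to produce, and reacting to it is the mistake.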

Applying macro SEO means using tactics that go with the current rather than against it, and not freaking out or reacting to unexpected ranking “modifications”.  But you need to feel the direction of the current first, along with understanding the general causes of major penalties, and that does take a combination of experience and analysis.  But it’s not micro analysis.

Knowing that Google is going to mess with you along the way, that you’re going to see random fluctuations and ranking drops, will help you stay on the macro path to success.  Expect a bumpy ride.  Otherwise, you’re going to be outing yourself as the “spammer” you’ve become in Google’s eyes.
