The “Off Page” SEO Strategy Explained

To help folks understand how in-depth the “off page” strategy is, and to help explain why SEO takes time, I decided to do a video on the whole off-page strategy, which is below. Any questions on it, of course, please feel free to shout 🙂

Here’s a copy of the mindmap diagram featured in the video below; click here to grab it.

Warning: please do not simply copy this strategy; each part has multiple layers of strategies within it. Simply copying it could over- or under-optimise, and potentially risk rankings and issues with Google.

The post The “Off Page” SEO Strategy Explained appeared first on Scott SEO & PPC Services.


The WordPress Longtail Blog Strategy

Using WordPress tags and Google+ to target longtail keywords can really turbocharge rankings, and not only for the long tails: your mid-tail and main SEO target keywords benefit too.

In the video below I first go over the strategy in brief, and then slowly make a post with this strategy so you can see every step of the way. Be warned, though: my idea of a “quick video” turned into a 44-minute epic, so you may want to grab some strong coffee. If you’re pretty tech/SEO savvy, the first 5-10 minutes should do the trick.

I’ve also made a little tool below to help speed up making lists for hashtags and WordPress tags for this strategy…
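The tool itself lives on the page, but the core of it can be sketched in a few lines of Python. Everything below is my own illustration (the function name and exact cleanup rules are assumptions, not the tool’s actual code):

```python
def make_tag_lists(keywords):
    """Turn a list of longtail keywords into WordPress tags and hashtags.

    WordPress tags keep the phrase as-is (lowercased); hashtags strip
    spaces and punctuation so they work on Google+ / social posts.
    """
    wp_tags = [kw.strip().lower() for kw in keywords]
    hashtags = ["#" + "".join(ch for ch in kw if ch.isalnum())
                for kw in wp_tags]
    return wp_tags, hashtags

tags, hashes = make_tag_lists(["Best SEO Tips", "long tail keywords"])
print(", ".join(tags))    # WordPress tags, comma-separated for pasting
print(" ".join(hashes))   # hashtags, space-separated
```

From there it’s copy-and-paste into the WordPress tag box and your social post.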

Any questions on it please feel free to contact me 🙂



Google’s “Random Algo” Explained

So what is Google’s “random algo”, also known as its “rank modifier”? The video at the end of this page goes into depth on it, and I highly recommend watching (if you prefer to watch rather than read, scroll down now). In essence, it’s designed to fool and worry the unknowing SEO provider and/or SEO client. Basically, if Google spots anything that could be SEO, it rolls a dice on three metrics…

  1. positive or negative
  2. time frame
  3. position adjustment
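To make the three dice concrete, here’s a rough Python sketch of the idea. This is purely illustrative, my own toy model of the description above, not anything Google has published (the value ranges are made up):

```python
import random

def roll_random_algo():
    """Roll the three dice the random algo is said to use."""
    return {
        "direction": random.choice(["positive", "negative"]),
        "days": random.randint(1, 60),       # how long the modifier lasts
        "positions": random.randint(1, 80),  # how far the rank is shifted
    }

def apply_modifier(true_position, roll):
    """Shift the 'true' rank by the rolled amount, up or down."""
    if roll["direction"] == "positive":
        return max(1, true_position - roll["positions"])  # lower = better
    return true_position + roll["positions"]

roll = roll_random_algo()
print(roll, "->", apply_modifier(8, roll))
```

Once the rolled time frame expires, the modifier falls off and the “true” position shows through.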

Here are some examples based on different scenarios…

An SEO gains a brilliant backlink to the site: the anchor text is natural, the linking site’s content and theme are related to the target site, the page that links out carries high-quality original content, and so on. Google spots this, and the new link gives the target page enough of a boost in trust, authority and power to take the target keyword from position 8 on page 1 to position 5. But the random algo kicks in, rolls the dice on the three metrics, and picks…

  1. positive or negative = negative
  2. time frame = 3 weeks
  3. position adjustment = 10

So instead of jumping to position 5, which this backlink has really earned, the random algo pushes position 8 down to position 18 for 3 weeks. If the link is still there in 3 weeks, the random algo is removed and the page gains what it should have, jumping to position 5. If you didn’t know about the random algo, you might panic that this link had sunk the ranking from position 8 to 18, remove it, and never wait out the random algo, so that position 5 is lost. Worse, you may have just tipped off Google that you’re doing SEO: if this link wasn’t for SEO gain, why was it removed soon after a dip in rankings?

Here’s another example…

An unknowing SEO sees a great gig on Fiverr offering 100,000 backlinks to their site for just $5. They’ve read that backlinks are powerful for SEO, so they order 10 of these for $50: 500,000 backlinks in total. The links are built pointing at the target site, all using the same anchor text, and all from spammy websites. At the moment the site ranks at position 27 for its target keyword. They fire the 500,000 links at the target page; Google spots this, and these links reduce trust enough that position 27 should drop to position 90. BUT the random algo fires up yet again…

  1. positive or negative = positive
  2. time frame = 2 months
  3. position adjustment = 7

So their position 27 moves up to 20. The SEO thinks “wow, this is brilliant” and buys even more, so now there’s another 500,000 spammy links (upon seeing this, Google would fire off another random algo layered on top of the first). After 2 months the random algo falls off, and the page drops to position 90. Google fooled the SEO into thinking these were good links so they would build more.

Now this is where it gets “fun”. Backlinks are not the only thing that fires off a random algo; other metrics such as social signals, page/HTML/site changes and various user metrics do too, and to make it even more complex, every one of these gets its own random algo. For example, let’s take just one metric, backlinks, and a situation where one backlink is added daily to a target site, so a random algo fires every day. The daily random algos could look something like…

day 1 = positive / 8 days / 3 positions

day 2 = positive / 32 days / 1 position

day 3 = negative / 2 days / 76 positions

…etc. So when this is happening on every backlink found and processed, every social signal and every site change, the somewhat “simple” idea Google has come up with here is just beautiful (from their side at least). It makes SEO testing take a lot longer, as you need to stop SEO tests and then wait for the random algos to “fall off”. It makes it easier for an SEO to panic and shoot themselves in the foot. And finally, it makes it harder for clients buying SEO services to see what is and isn’t working.

For example, a client may rank at position 20 for a keyword, start with a great SEO provider who does amazing work, and, out of “bad luck” with the random algos, actually see things get worse for a time. If the SEO and client aren’t a) aware of the random algo and b) able to test and verify that the rank drops aren’t due to other reasons (you can’t and shouldn’t just write off a rank drop as “probably the random algo”, as there could be other causes such as site issues, page issues, content issues, negative SEO, etc.), then the client may think the SEO service isn’t working and stop, which is exactly what Google wants.
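The stacking described above can be sketched as a small simulation. Again, this is my own toy model of the idea (the ranges for duration and shift are invented), not Google’s actual algorithm:

```python
import random

def simulate_stacked_modifiers(true_position, daily_links=1, days=60, seed=None):
    """Simulate one random-algo roll per daily backlink, stacked together.

    Each event gets its own (direction, duration, shift) roll; the rank
    you actually see is the true rank plus every modifier still active.
    """
    rng = random.Random(seed)
    active = []   # list of (expires_on_day, signed_shift)
    observed = []
    for day in range(days):
        for _ in range(daily_links):
            sign = rng.choice([-1, 1])  # -1 improves rank, +1 worsens it
            active.append((day + rng.randint(1, 40),
                           sign * rng.randint(1, 20)))
        # drop modifiers whose time frame has expired
        active = [(exp, s) for exp, s in active if exp > day]
        observed.append(max(1, true_position + sum(s for _, s in active)))
    return observed

ranks = simulate_stacked_modifiers(20, seed=42)
print(ranks[:10])  # the observed rank wanders even though the true rank is 20
```

Run it a few times without a seed and you’ll see why day-to-day rank movement tells you so little on its own.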



Any questions on the above, please feel free to shoot me an email; I’d love to hear from you, and all the best with your SEO, Scott 🙂

This post is also available on our Google+ page.

If you’d like even more information about it, why not check out Google’s patent on this?



How to Protect Your Website Against Google Penalties


Google constantly updates which websites appear in its results, and penalises sites which don’t meet its increasingly high standards. Google penalties are bad news if you’re running an online business, but fortunately, there’s plenty you can do to protect your website.

Penalties are handed out for various reasons, from duplicate content to poor-quality links. They can cause your ranking and traffic to plummet, so it’s important to bear them in mind and act quickly if your site is handed a penalty.

Websites are penalised for malicious backlink campaigns which their competitors have created to reduce their ranking, and for links to websites which Google considers suspicious. Scraped content, bad coding and broken links can also result in a penalty.

Google is uncompromising when deciding which websites appear in its results. Powerful algorithms are regularly run to identify websites that aren’t up to scratch. The search engine’s Panda update hands out penalties for bad content, Penguin for bad links and Hummingbird for sites it considers unnatural. Google penalties are generated by algorithm updates or manual reviews. They reduce a targeted site’s SEO ranking for a specific page, the entire website, or one or more keywords. Google’s assessment is so far-reaching that protecting your website against penalties has become a challenge, but you can boost your site’s immunity with a few simple steps:

Make Sure Your Content is Up to Scratch

Google penalises poor-quality or duplicate content, so it’s not worth cutting corners when it comes to the information on your website. Make sure your content is original, well-researched and error-free. Content which has been ‘spun’ by software is vulnerable to Panda, so taking the time to produce new, manually written content for every section of your site is a smart move. It’s important to include keywords in your content, but don’t overdo it: Google can react badly to too much of a good thing.
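One way to sanity-check “too much of a good thing” is a quick keyword-density calculation. The snippet below is a rough sketch of my own; Google publishes no density threshold, so treat the numbers purely as a smell test:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of the total word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # count non-overlapping-style window matches of the full phrase
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    return 100.0 * hits * n / len(words)

text = "SEO tips for beginners. These SEO tips cover content and links."
print(round(keyword_density(text, "seo tips"), 1))  # 36.4 – far too high
```

If a short paragraph comes back in double digits for one phrase, it probably reads as stuffed to a human too.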

Pay Close Attention to the Quality of Your Links

Links are important if you want to attract visitors to your website, but don’t make the mistake of using poor-quality links. Buying lots of links could leave you particularly vulnerable to Google penalties, and sharing too many links may be seen as an attempt to manipulate results. Make sure your links are clearly visible to your visitors. Hidden links are more likely to be viewed as suspicious. Take the time to monitor your outbound links and remove any which are out-of-date. Broken links worsen users’ experience and increase the likelihood of penalties.
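Monitoring outbound links for breakage can be automated. Here’s a minimal stdlib-only Python sketch of my own (a flagged link still deserves a manual look before you remove it):

```python
import urllib.request
import urllib.error

def check_links(urls, timeout=10):
    """Return the URLs that look broken, as (url, http_status_or_None) pairs."""
    broken = []
    for url in urls:
        try:
            req = urllib.request.Request(
                url, method="HEAD",
                headers={"User-Agent": "link-checker"})
            with urllib.request.urlopen(req, timeout=timeout):
                pass  # request succeeded, link looks fine
        except urllib.error.HTTPError as e:
            broken.append((url, e.code))        # 4xx / 5xx response
        except (urllib.error.URLError, ValueError, OSError):
            broken.append((url, None))          # no response / bad URL
    return broken

for url, status in check_links(["https://example.com/"]):
    print(f"broken: {url} ({status})")
```

Run it over your outbound links on a schedule and fix or remove whatever it flags.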

Act Fast to Repair Your Website

Don’t panic if your site is penalised. Most damage caused by Google penalties can be repaired. By addressing problematic links and other flaws on your site, it’s usually possible to restore a penalised site’s ranking, although the damage is sometimes so severe that starting from scratch is the best option. If you’re losing sales because a Google penalty has caused a sudden fall in ranking, Google penalty recovery services could be your best option. An SEO agency with Google penalty repair experience will assess the damage and take fast action to restore your ranking, while protecting your site against further penalties.



DAS – Domain Authority Stacking, 2 layer strategy

This is a funny post in a way, because I’m writing it as the initial post in a dual-layer DAS (domain authority stacking) strategy. Most SEOs know the power of a DAS strategy: use big authority sites such as Twitter, Facebook, etc., and syndicate content out from your blog to those assets. However, I take it a layer further by also doing DAS on your DAS assets, to really turbocharge the power and trust you get from this strategy.

Using a mash-up of additional content on your second-layer DAS assets also keeps everything natural and messy, ensuring this is something that gives long-term benefits.
