12 March 2017

How You Can Recognize and Stop Negative SEO in 2017

Using one of my sites and its stats, I'm going to show you how to tell whether you are being negative SEO'ed and how to stop it. The game has changed. As much as it's changed for mainstream SEO, the negative SEO guys are already using Google's new algorithm against you.

You can bulletproof your site against it by doing a few things. Let's get into this!

How can you spot Negative SEO?

What was my first sign I was being negative SEO'ed? Believe it or not, it was Twitter. I started to see a tweet about a page I was planning to make on “upcoming Internet Marketing Launches” retweeted at least ten times a day. I had full intentions of making it part of my site, but I never got around to it.

I thought I had put it back into draft mode, but I accidentally left the page half finished, so I 301 redirected it to my blog. You can be negative SEO'ed with tools like Moz or SEMrush: the same way your competition can find your strongest pages, they can identify your weak pages. The first signal was in my Jetpack stats. I use them all the time, almost more than Google Analytics.


The New Negative SEO Attacks Google's Perception Of A User's Experience On Your Site


I was getting a huge spike in visits to my site. I don't get a ton of traffic because I go after micro-targeted terms. I started looking at my referrers and seeing all this traffic from two sites.

There is nothing on either site that has a link to my site or has anything to do with internet marketing. What they were doing was sending traffic through an iframe. I don't know how they were doing it, but I'll show you how I knew.

If I had just left it to Google Analytics, I would never have spotted the bastards, because Analytics gives you too much information to locate simple things quickly. I dug deeper in Google Analytics by going to Acquisition -> Referrals and checking average visit duration, and that is how I found the embed.


That's Right: User Experience Is A Bigger Threat Than 600K GSA Links!


Google Analytics shows the full referral path, so you can go right to the file! All of the bullshit traffic came through that same iframe! In the past, to negative SEO somebody, you would blast them with exact-match keywords and links from GSA or another spam tool. As much as that doesn't work for real SEO, it doesn't work for negative SEO either.

The way the new Penguin 2 algorithm works, there's no penalty for bad links; Google just adds a Penguin tag. If you get enough Penguin tags, you get a human review. That's why Penguin 2 took so long: Google didn't want negative SEO to penalize webmasters who didn't deserve it.

It took about two weeks for the negative SEO guys to figure out a workaround. Instead of attacking with links, they attack the site's quality, and that is an algorithmic penalty.

Here's another one: This traffic is being sent again through an iFrame:

[Screenshot: iframe traffic]

Zero time on site. My conversion goal was 2 page views, which is a small goal. If 100% of that traffic has a 100% bounce rate, there's a good chance it is negative SEO. That is how the new negative SEO game is played! It's about killing the user experience in Google's eyes and deflating your site's metrics.

Back in the day, I used to say “index your tags.” To be honest, I ranked tags all day long, especially in certain niches. When I had lawyers as clients, I would make tags from obscure penal law codes. Those would be the ones that converted, too! If someone searches a particular code, you can rank for that term on a legal blog just by being there, especially with a local modifier. I've always gone against the grain and indexed tags, but I'm changing my opinion.

If you get a manual penalty, it's because a person looked at your site and decided there is manipulation against Google's terms of service.

If you are going to add a post, make it badass or don't do it at all. Meaning: don't slack on anything, including original images, original video, and text. Include other users' quality content. Link out to quality sites; your site shouldn't be a dead end! I “nofollow” all my outbound links. Honestly, I don't think it makes any difference to how you rank or whether it “keeps your link juice”.
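For anyone who hasn't done it before, nofollowing an outbound link is just one attribute on the anchor tag (the URL here is a placeholder, not a site from this post):

```html
<!-- Outbound link with rel="nofollow"; example.com is a placeholder -->
<a href="https://example.com/" rel="nofollow">a quality resource</a>
```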

How can you stop Negative SEO?

What you are trying to fight is the appearance that the traffic coming to your site isn't engaged. If you let that traffic keep coming with a 100% bounce rate, it will drag down your site metrics over time. That is how negative SEO will be done going forward.

The same way you stay in shape, keep your site healthy. Go into Search Console and look for crawl errors. Take a look at your stats; each crawl error represents a web address indexed by Google. I use Yoast SEO Premium for the 301 feature only: you can add a 301, and it will adjust the links throughout the entire site. Eliminating the soft 404 errors is a bit tedious, but you are sealing weak and vulnerable spots on your site. You are building a defense mechanism, a “digital wall”.
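If you're not on WordPress or don't use Yoast, the same 301 can be added at the server level. A minimal Apache .htaccess sketch, with hypothetical paths (swap in your real dead URL and its closest related live page):

```apache
# 301 a dead URL (crawl error) to the closest related live page
# /old-dead-page/ and /related-page/ are placeholder paths
Redirect 301 /old-dead-page/ https://example.com/related-page/
```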

Be Careful Of Disallow Query Strings!
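For context, a query-string disallow in robots.txt looks like the sketch below. Be careful with it: once those URLs are blocked from crawling, Google also can't recrawl them to see any fixes or redirects you've put in place.

```
User-agent: *
# Blocks every URL containing a "?" (any query string)
Disallow: /*?
```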

Redirect errors you find to a related page (if you have one). In my opinion, a 301 redirect is better than returning a 404 (the “not found” server response code that says the page isn't there anymore). To me, a deleted page looks more suspect.

Again, it isn't “RankBrain”, aka the “scary artificial intelligence Terminator”, you have to worry about; it's the human reviewers!

Don't use the disavow. That's stupid. I think Google engineers are laughing when people use that corny tool.

The “Easy Fix”:

Now, how do you fix the biggest problem, the traffic with no time on site? It's very simple: have your host block all requests from those domains. In this example, the two referrer domains from earlier. Just get your host to block traffic from them. It's simple.
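If you run Apache yourself, one way to do this is an .htaccess rule keyed on the Referer header. This is a sketch with a placeholder domain, not the actual attacking sites:

```apache
# Deny any request whose Referer matches a known spam domain
# "spam-referrer.example" is a placeholder, not the real attacker
RewriteEngine On
RewriteCond %{HTTP_REFERER} spam-referrer\.example [NC]
RewriteRule .* - [F]
```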

It's like keeping in shape: if you stay lean and go to the gym, you will live longer. Think of SEO now just like fitness. You've got to keep your site in shape, just like your body. If someone at Google comes to review your site for quality and sees you have proactively been fixing errors, you have no worries!


Last Considerations


If you are using structured data, be careful. A lot of people make mistakes writing it. The only manual penalty I've gotten in years was for structured data, and my structured data was pretty tame IMO. You can check your Schema errors in Search Console as well.
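For reference, here is a minimal, well-formed Article JSON-LD sketch using this post's own details. Treat it as an illustration of the shape, not a drop-in for your site:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How You Can Recognize and Stop Negative SEO in 2017",
  "datePublished": "2017-03-12",
  "author": { "@type": "Person", "name": "Jason Quinlan" }
}
</script>
```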


Lastly, people don't talk about this enough: do your sitemap right! Don't noindex your tags and then put them in your sitemap. That's stupid because you're sending mixed signals. If you don't want your tags indexed, remove them from your XML sitemap!
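The mixed signal looks like this: the tag page carries a noindex while the sitemap still lists it (the tag URL below is a placeholder):

```xml
<!-- In the tag page's <head>: tells Google not to index it -->
<meta name="robots" content="noindex, follow"/>

<!-- Contradiction: sitemap.xml still lists the same (placeholder) URL -->
<url><loc>https://example.com/tag/internet-marketing/</loc></url>
```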

If you follow the advice in this article and don't engage in any risky link building, you can almost bulletproof yourself against how negative SEO will be done in 2017 and beyond!

I hope this post wasn't too long and rambling!

Grab a copy of my personal SEO cheat sheet! I got a lot of these notes from geeking out on the patents! Trust me, it's boring!!


[Image: SEO cheat sheet]


PS: You can have a copy of my personal SEO cheat sheet. I've been adding notes to it for years. Sign up here, and I'll send you a copy.

Jason Quinlan