SEO is an extremely active community, which is one of the best things about it – but because, in theory, anyone with WiFi can offer SEO services, we also tend to see some “amazing” advice being offered.
I’ve reached out to the SEO community via Twitter, asking my fellow SEO experts to share some of the best advice they’ve seen from previous agencies and consultants on accounts they’ve inherited. I’ve also scoured the internet and forums looking for other hidden gems, and I found some… For example:
The developers blocked Googlebot by IP address, because of suspicious crawling activity…
A few of them we will all likely have come across throughout our careers; others, however, are so… unbelievable that you have to double-take.
Also, a quick thank you to everyone who responded and helped make this article possible.
So let’s begin – here are some of the best examples of bad SEO advice and horror stories:
More EMDs, more rankings!
Zack Neary-Hayes (Freelance SEO Consultant)
Previous SEO advisors had set-up a series of one page websites on exact match domains, with duplicate content, so that the content was discoverable at as many locations as possible.
David Iwanow (Global SEO Manager @ Schibsted Media Group)
On an agency’s advice, a business spun up ~4,000 EMD domains containing duplicate content in order to rank for as many niche queries as possible. Believe it or not, they then got a manual penalty notice for pure spam… The agency said it was likely not related.
More traffic? Let’s rank for porn!
Jacque Urick (Director of SEO @ Sears Home Services)
This was circa 2007. The 3rd party SEO firm had changed ALL the image names and image alt text on the website to sex crime and illegal porn-related terms…
The site was getting lots of traffic, just as the 3rd party SEO firm had promised. It was all image search traffic…
Results = Lots of content + No internal links
Andrew Cock-Starkey (aka Optimisey)
Had a client who’d been advised “For SEO, you need more content.” Not strictly bad advice, but they’d been on a content treadmill for months, even years – with zero focus and no internal linking. Nothing.
Sorted just the internal linking issues, and impressions/visibility went up 300%…
HTTPS & SSL don’t matter
This horror advice was covered extremely well by Troy Hunt on his site, but it’s that bad – and with cyber security and data security being so prevalent at the moment, it has to be included…
If you don’t have sensitive information on your site, you’re not selling a product or a service, there is no checkout page, you don’t need a certificate. It doesn’t help, you know, increase security.
And from a different SEO expert in the same article by Troy…
Encrypting all pages on your website will only slow them down.
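For the record, HTTPS has been a (lightweight) Google ranking signal since 2014, and the overhead of modern TLS is negligible. The usual server-side step of an HTTPS migration is a single site-wide 301 from HTTP to HTTPS – a minimal sketch in Apache config, with `example.com` standing in for the real domain:

```apacheconf
# Force HTTPS site-wide with a single 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```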
The more you spend on PPC, the better your SEO!
Zack Neary-Hayes (Freelance SEO Consultant)
Had a client who was told their site would never rank organically, so was recommended some whack £10k per month for PPC campaigns and a landing page creation service.
They did that, had a crap return, and then we worked on the site – it was such a straightforward campaign to improve organic search performance…
Luke McCarthy (Digital Product Lead @ Mayflex)
The more you spend on PPC with Google, the better your site ranks organically in Google. Classic.
Quick, fix your bounce rate!
Gerry White (SEO Consultant @ JustEat)
The business was advised by their SEO agency that bounce rate was definitely a ranking factor and that they had to lower it – so they installed the Google Analytics code twice on the website to do so… (With the tracking code firing twice, every visit registers two pageview hits, so almost nothing counts as a single-hit bounce and the reported bounce rate drops to near zero.)
Talking about bounce rate, I’ve also sat in on a talk at MeasureCamp Manchester where a “senior SEO” tried to explain how he felt bounce rate was a critical ranking factor. In the same talk he also discussed limiting clients’ access to their own Google Analytics, and which metrics you show them. Oh dear.
Let’s manipulate dwell time!
“Dwell time is a ranking factor.” Some SEO consultant told a poor lad working at a dental surgery to go on all the work PCs and leave the homepage open all day to crank up the session duration…
And whilst we’re talking about the wonders of Google Analytics:
One client’s previous agency was charging $225 per month to “maintain Google Analytics”.
Content = slow pages (and Google doesn’t care about content)
Adam Reaney (Sheffield based SEO)
The client’s site was for professional services, which if anything made this suggestion even more shocking, as the content was essentially their service list. It didn’t perform too badly in Google, but there were some site speed issues.
The ‘SEO agency’ told them that cutting the content on their service pages down to 50 words max would speed the site up and improve UX. For content, the site had around 7-8 service pages, each with unique, high-quality content of around 800-1,000 words.
Needless to say they completely vanished from the rankings, still had the site speed issues and experienced a disastrous drop in traffic.
Make sure your comment backlinks are 10 words+
So… Issues here…
- If you read this article, and watch the video on the page on how to improve your DA, both produced by Moz – who invented the DA metric… You’ll quickly see how flawed this is.
- DA is a metric that Google doesn’t take into account anyway
A dangerous way to explain 301s
Christopher J Connor Jr. (SEO Consultant in Portland)
Saw this in an audit a client was given by a previous SEO agency:
301 redirects – Each time a page redirects to another page, there is potential that the site visitor will see a slight delay for the page to render. It is recommended to update old URLs to the correct URL to reduce page delays. Low priority item to fix.
Sorry? What? Remove all redirects?
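For the record, the fix for redirect overhead is to update internal links and collapse redirect chains so each old URL points straight at its final destination in a single 301 hop – not to remove redirects. A minimal sketch in Apache `.htaccess` (the URLs here are hypothetical):

```apacheconf
# Each old URL 301s directly to its final destination in one hop –
# no chains, and the old URLs keep passing their link equity.
Redirect 301 /old-services /services
Redirect 301 /2014/pricing /pricing
```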
Don’t allow people to link to your website
Yosef Silver (Founder of Fusion Inbound)
I was once in an SEO role at a very large organisation and discovered the terms of service (TOS) on the website did not allow people to link to content. This company would actively send cease and desist letters when they got new links.
Unfortunately, websites trying to control how other people link to them through their terms of service isn’t new or uncommon. The below is from the TOS of a large, well-known UK-based consumer product advisory:
If we ask you to remove or change a link to our websites, you do so as soon as possible.
Unless you obtain our express permission, you must not include more than 10 links to our websites on any one of your web pages.
If you want to read these for yourself, the TOS can be found here.
Full service link spam creation & disavow service
Martin Woods (SEO Consultant)
I once found a business paying for SEO services from a well-known SEO agency where, on one side of the room, a team was producing backlinks for the client (and charging for it), while on the other side of the room another team was working on disavowing those links (again, charging the client for it)… A new meaning to “full service agency”!
On the topic of disavowing backlinks…
Harry Dance (Digital Marketing @ Eagle Online)
I was informed by an agency I was working with that
we don’t disavow bad backlinks because that highlights to Google there may be bad links on that site
It’s almost so bad it’s genius.
Google distrusts foreign websites linking to you
David Iwanow (Global SEO Manager @ Schibsted Media Group)
I have seen Google Search Console disavow files including sites I operate in other markets… Yes, disavowing their own websites because they use a different ccTLD and apparently Google distrusts foreign sites linking to you…
Externally link to Wikipedia to improve local rankings
I recently inherited a client from someone who recommended linking to Wikipedia pages for the cities they were targeting for local rankings. Outbound site-wide footer links to Wikipedia. Yup. That.
We don’t need SEO, you just need to let Google know about our site
Peter Nikolow (Mobilio Development)
Had a potential client come to me saying that their website had already been “SEO optimised”, and all I needed to do was let Google know about it – five to ten minutes’ worth of work in their eyes. They had already paid the development agency for a super-special, and expensive, SEO package.
“Knows some SEO stuff”
Jeremy Rivera (Director of SEO and Content Marketing for Raven Tools and TapClicks)
Circa 2006. Guy said that he’d received a recommendation from his relative who “knew some SEO stuff”.
His site lost all rankings. It turns out he had added “Yorba Linda Real Estate” 1,000 times in white text on the white background of every page of his site, and in the meta keywords tags.
Linking back to the agency
Esben Rasmussen (Online Analyst @ Kamstrup)
The former agency had placed a fully transparent (invisible) PNG in the footer of the customer’s website and used it as a backlink, where the alt text was the agency’s name + desired keyword… Just wow.
I’ve also experienced this where a “development and SEO agency” gave a client – a local locksmith – a custom WordPress template with a hardcoded footer link back to their own homepage… But they misspelt their own URL in the href, so it was a permanent 404. When I asked them to fix it for the client, they came back with an invoice for the dev work…
We all love inheriting crap site migrations…
Steve Morgan (Freelance SEO consultant)
Took on a client that’d had 3-4 site redesigns over the years (each time with URL changes), & despite having worked with 2 SEO agencies in the past, none of the old URLs from any old versions of the sites had been redirected – they were all still 404ing.
I also once had a dev let a site update go live without consulting the SEO team, so all site arch., schema, metas, and the whole nine yards were wiped out. That was a NIGHTMARE 😱
I got threatened with being sued, a week after a client signed a contract, because their Google traffic went to zero. I hadn’t actually started, as they had asked me to wait a week until their site redesign was finished.
That redesign included adding Disallow: / to robots.txt…
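For anyone who hasn’t been burned by this yet, that one directive tells every crawler to stay out of the entire site:

```
# robots.txt – this blocks ALL crawlers from ALL URLs on the site
User-agent: *
Disallow: /
```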
Twice, international publicly traded companies pushed new sites live, without telling us they were working on a new site, and just pushed test to production, changing all URLs with no redirects and leaving it all noindexed. Woot.
During a platform migration, a site was launched with no sitemap (or the capability to generate a dynamic sitemap), the database tables containing meta values weren’t carried over to the new platform, whole sections of the site were removed, and the resulting 404s were left as a secondary priority…
No canonical tags were set, the CMS didn’t even have capabilities for multi language deployment, hreflang tags were absent, a whole section of the website containing main pages didn’t even communicate with the CMS… there’s too much more to write…
Do they know what Hreflang is?
I was working with a French travel website that had content in French, German, English and Dutch, and was desperate to rank within the UK for key commercial terms but wasn’t visible at all. The specialist SEO agency they had been working with had been producing content and building links – great, but with no result.
So, we had them implement Hreflang at the end of 2015, and you can see what happened…
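For reference, hreflang annotations tell Google which language/country version of a page to serve to which users; each version lists all of its alternates plus itself, with an optional fallback. A minimal sketch using hypothetical URLs rather than the client’s real ones:

```html
<!-- In the <head> of every language version of the page -->
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/" />
<link rel="alternate" hreflang="nl-nl" href="https://example.com/nl/" />
<!-- Fallback for users matching none of the above -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```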
Where we’re going, we don’t need redirects…
Dan Smullen (SEO Consultant)
Forget that redirect map, we’ll just automate redirects to internal search (post-migration) – that way, no matter what the user searches, they’ll land on the right page… But they’re all blocked by robots.txt.
Unfortunately, I’ve come across a number of SEO and development teams who don’t wholly understand how important redirects are (and how important it is to use the correct redirect codes).
I’ve also heard of internal stakeholders blocking redirect projects in the past because they apparently wanted a clean break from the old brand…
Once came across a dev who used Excel to put together redirects (wait for it), and then used the drag feature in Excel so the 301 turned to 302, 303, 304, etc. for the redirects in the .htaccess file. Took down the site.
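If you’re wondering how that happens: Excel’s fill handle auto-increments numbers, so a dragged column of 301s becomes 302, 303, 304… A hypothetical reconstruction of what the `.htaccess` ended up looking like:

```apacheconf
# Only the first line is a permanent redirect; 302 and 303 are
# temporary, and 304 (Not Modified) isn't a redirect status at all.
Redirect 301 /old-page-1 /new-page-1
Redirect 302 /old-page-2 /new-page-2
Redirect 303 /old-page-3 /new-page-3
Redirect 304 /old-page-4 /new-page-4
```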
Had a new client come to us after their re-platformed e-commerce site went live with new URL structures and no redirects. Thousands of products 🙃
Google was crawling the site too quickly to index it properly
Developer put a Robots.txt no-index on the entire site (Fortune 50 travel site) because he thought “Googlebot crawled it too fast to index it properly”
Googlebot crawled the website too fast to index it properly
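Worth untangling, because this mixes up two different mechanisms: robots.txt’s Disallow controls crawling, not indexing (and robots.txt has no official noindex directive), while keeping a page out of the index is done with a robots meta tag or X-Robots-Tag header – which only works if the page can actually be crawled. A sketch:

```html
<!-- To keep a page out of Google's index, let it be crawled
     and add a robots meta tag in the <head>: -->
<meta name="robots" content="noindex" />
<!-- robots.txt's "Disallow: /" only blocks crawling – blocked
     pages can still end up indexed via external links. -->
```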
We all ❤️ developers
Implementing the GTM container code, using the meta description input field within the CMS… Universally… Across the whole website.
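For context, Google Tag Manager’s container snippet is a piece of JavaScript that belongs as high in the `<head>` as possible (plus a `<noscript>` iframe just after the opening `<body>` tag) – not in a CMS meta description field. The standard snippet looks roughly like this, with GTM-XXXXXXX as a placeholder container ID:

```html
<!-- Google Tag Manager – in the <head> -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');</script>
```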
I’ve been trying for 3 weeks to get an in-house developer on a project I’m working on to remove a dev site – which has been hacked with malware and has tons of spam content – from the index. The entire site is indexed. Fun.
The site was normally hosted on 4 servers, bumped up to 6 for peak season – but I didn’t know about the 2 additional servers.
Robots.txt was Disallow: / on those 2 servers, so pages were constantly dropping in and out of the index. It took a couple of hours to figure out what was going on.
Developers of a video game site thought it would be cool to use broken fragment URLs (with the #) for the 42 game character/figurine pieces (easy to guess who).
This resulted in hundreds of rankings disappearing, and it couldn’t be fixed for a year, until the next year’s game launched.
At my old job we were republishing more than 10,000 pages to a newly optimised layout. The dev published ALL the pages with a small, simple tag:
… it went from 1.5 million page views to 10,000 page views in one day. Fortunately we discovered it within 2 days, and within 2 weeks the views came back.
Want more SEO horror stories?
If that wasn’t enough, here are some other great articles written over the years with some more, truly amazing horror stories: