Senior SEO Consultant


This year’s YMYL SEO horror stories include classics such as “removing written content from pages to improve load speed” and “only allowing one person to handle leads, then blaming the agency for the lack of conversions”.

I’d like to say thank you to everyone who took the time to contribute to this post and share their experiences!

This is one post in a series of Halloween SEO Horror Story blog posts going live this October! Stay tuned for the others, and horror stories from other verticals!

Kris Gunnars, Search Facts

@krisgunnars // Searchfacts.com

I was working with a massive website in the health/medical space. On a Monday morning, the editorial team was crawling a recently published article to check for dead links and realized that the robots.txt file was blocking all robot access to the site. Apparently, the developers had pushed out an update on Friday but forgotten to remove the blocking directive from the robots.txt file.

The problem was fixed on Monday morning, but traffic dropped by about 10% and stayed slightly depressed for a few days. Not a huge deal, but enough to cost several hundred thousand dollars in revenue.
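A blanket `Disallow: /` like the one in this story is easy to catch automatically. As a minimal sketch (the URL is a placeholder, not the client’s site), Python’s standard-library robot parser can flag it as part of a post-deploy check:

```python
# Sketch: detect a blanket robots.txt Disallow with the Python
# standard library's urllib.robotparser.
from urllib.robotparser import RobotFileParser

# The kind of directive accidentally left in by the Friday deploy:
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With "Disallow: /" in place, no crawler may fetch anything:
print(parser.can_fetch("Googlebot", "https://example.com/"))  # False
```

Wiring a check like this into the deployment pipeline, and failing the build when the homepage is disallowed, would have caught the problem on Friday instead of Monday.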

Dmitrii Kustov, Regex SEO

@DigitalSpaceman // RegexSEO.com

A couple of stories:

1) A client of ours refused to track form submissions and phone calls as conversions, because “he knows better if there are more online leads”.

2) A client’s business was going flat while rankings, online leads, and calls were all going up. After digging, we found out that the receptionist wasn’t allowed to give out any information – only the owner (who was always super busy and unavailable) was allowed to talk to potential clients. After we pointed that out, our agency was fired the next day for “not delivering results”.

Gyi Tsakalakis, AttorneySync

@gyitsakalakis // AttorneySync.com

We can’t share GA, GSC, or GMB data because they’re in the “agency account.” 👻

Katherine Ong, WO Strategies

@kwatier // wostrategies.com

Based on my audit of a federal (.gov) website, the developers started implementing every item that was mentioned, including the finding that some pages didn’t have strategic H1s. They took that issue to mean the entire page should be blocked via a meta noindex tag in the header – telling Google to drop pages that were ranking in the Featured Snippets for high-volume terms.

Luckily, I spotted the issue before Google had a chance to recrawl those pages, and we were able to remove the meta noindex tag.
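An accidental noindex like this can also be caught in an automated crawl before Google sees it. A minimal sketch, using only Python’s standard-library HTML parser (the sample markup is hypothetical), might scan each page’s head for a robots noindex directive:

```python
# Sketch: flag pages carrying a <meta name="robots" content="noindex">
# tag, using Python's standard-library html.parser.
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Sets self.noindex to True if a robots noindex meta tag is seen."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if tag == "meta" and name == "robots" and "noindex" in content:
            self.noindex = True

# The kind of tag the developers added site-wide:
html = '<head><meta name="robots" content="noindex, follow"></head>'
detector = NoindexDetector()
detector.feed(html)
print(detector.noindex)  # True
```

Running a detector like this over a list of high-value URLs after every deploy turns a silent deindexing into a loud alert.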

I had another client who brought me on board due to a drop in organic traffic. They had launched their redesign on an intentionally public staging URL while keeping their old site up, with the intention of getting stakeholder feedback. Google indexed the new design on the staging URL and started ranking some of that content over their public URLs. It took them months to fix the issue and win back the lost traffic.
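The usual safeguard here is to keep staging environments out of the index in the first place, for example by serving an `X-Robots-Tag: noindex` response header on staging hosts. As a minimal sketch (the hostnames and the `robots_header` helper are hypothetical, not from the story), a deploy-time guard might look like:

```python
# Sketch: serve an X-Robots-Tag noindex header on staging hosts so
# search engines never index a pre-launch redesign.
STAGING_SUFFIXES = (".staging.example.com", ".dev.example.com")

def robots_header(host: str) -> dict:
    """Return extra response headers for the given hostname."""
    if host.startswith("staging.") or host.endswith(STAGING_SUFFIXES):
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}  # production hosts stay indexable

print(robots_header("staging.example.com"))  # {'X-Robots-Tag': 'noindex, nofollow'}
print(robots_header("www.example.com"))      # {}
```

A header-based approach (or HTTP authentication on staging) avoids the trap of a robots.txt block, which prevents crawling but can still leave indexed URLs in the results.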

Adam Reaney, SEO

@100percentapr

The client’s site was for professional services, which made this suggestion even more shocking, since the content was essentially their service list. The site didn’t perform too badly in Google, but we knew it had some speed issues.

The ‘SEO agency’ told them that cutting each service page down to a maximum of 50 words would speed the site up and improve UX. The site had around 7-8 service pages, each with 800-1,000 words of unique, high-quality content.

Needless to say, the pages completely vanished from the rankings, the site speed issues remained, and traffic dropped disastrously.


👻🎃

Need more SEO Horror Stories?

Read the complete 6-post 2019 SEO horror stories series:

👻🎃
