
This year’s eCommerce SEO horror stories include classics such as “They sold out of stock, so they turned off the website” and “they used to send bot traffic to their site to influence Google rank metrics”.

I’d like to say thank you to everyone who took time out to contribute to this post and share their experiences!

This is one post in a series of Halloween SEO Horror Story blog posts going live this October! Stay tuned for the others, and horror stories from other verticals!

Joe Balestrino, SEO Consultant

@Joeybalestrino // JoeBalestrino.com

they reluctantly added back the menu navigation and traffic returned to normal

The client thought it would be a good idea to remove many important parts of the main navigation. They wanted to simplify it. However, with no planning and without consulting an SEO, they moved forward with the change. Within about seven days, traffic dropped by 85%.

When I spoke with the client and asked whether any changes had been made to the site, they left that part out. Needless to say, once I discovered the problem, I was faced with the issue of them not wanting to add the full navigation back.

They wanted to move forward using just an XML sitemap. I then had to explain that a sitemap helps get pages indexed but doesn’t help with ranking them. Long story short, after small baby steps and showing them why sub-navigation and menu navigation are necessary, they reluctantly added back the menu navigation and traffic returned to normal.

Lama AlHaqhaq, SEO Manager

@LamaHQ // RBBi

I read a lot about Robots.txt Disallow rules being pushed to the production site – but with one of our clients, it was the exact opposite. I had provided an updated Robots.txt file for them to upload to the live website, and they added it to the staging website for me to validate… The worst part is that it took them weeks to take it down.

John Vantine, SEO Consultant

@w00t // JohnVantine.com

but even after the fix was applied it took several months to recover

We were moving a large section of the site over to a new tech stack. I had previously audited everything to verify that the new code checked out from an SEO perspective, but a set of breadcrumbs was added right before the final push.

One single section of the breadcrumbs did not force a trailing slash, and I failed to notice that this new stack didn’t 301 non-trailing slash URLs to their trailing slash counterparts.

Shortly after the release, organic traffic started to tank. It didn’t take long to realize that the crawl budget was being diverted to hundreds of thousands of new non-trailing slash URLs, but even after the fix was applied it took several months to recover.
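For illustration, here’s a minimal sketch in Python (the helper name and the file-extension heuristic are my own, not the stack’s) of the normalization that was missing – any request for a non-trailing-slash URL should get a 301 to its trailing-slash counterpart:

from urllib.parse import urlsplit, urlunsplit

def trailing_slash_redirect(url: str):
    # Return the 301 target for a non-trailing-slash URL, or None if no redirect is needed.
    scheme, netloc, path, query, fragment = urlsplit(url)
    last_segment = path.rsplit("/", 1)[-1]
    if path.endswith("/") or "." in last_segment:
        return None  # already canonical, or a file-like URL (e.g. sitemap.xml)
    return urlunsplit((scheme, netloc, path + "/", query, fragment))

assert trailing_slash_redirect("https://example.com/shoes/mens") == "https://example.com/shoes/mens/"
assert trailing_slash_redirect("https://example.com/shoes/mens/") is None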

Klaudio Fejzaj, eCommerce SEO

@klaudiofejzaj // KlaudioFejzaj.com

The whole website was not mobile-friendly

While onboarding a new client, I was told that they’d previously had someone do SEO on their eCommerce website, and the proof was a lot of additional text on category pages. However, a light investigation of the website revealed:

  • The previous SEOs had no access to GA (they were looking at sales via the CMS dashboard)
  • There was no Google Search Console setup
  • No HTTPS (on a transactional website)
  • The whole website was not mobile-friendly
  • 40% of the website had multiple redirect chains
  • Each product had multiple variations (no canonicals used)

David Iwanow

@davidiwanow // travel-network.co

They sold out of stock, so they turned off the website

Story 1

They sold out of stock, so they turned off the website (404 status) as they couldn’t handle the customer service inquiries asking when things would be available again.

Story 2

During the website relaunch, the site was taken offline for 3 days while the new website, running on a different server, was launched… it should have been 15 minutes of downtime to swap the DNS.

Marcus McConkey, Technical SEO Executive

@MarcusMcConkey // Glass.Digital

Story 1

An “SEO Specialist” at one of the largest eCommerce platform suppliers in the UK once told my client that disallowing a URL in the robots.txt file can effectively be used as an alternative to the ‘noindex’ robots meta tag to take that page out of the SERPs instantly 😕

Story 2

I audited a prospect’s site that, when crawled, had around 100,000 pages. For the kind of site it was, I didn’t expect it to have that many crawlable pages, so I had a look at their robots.txt file.

They had a well-formed file with several lines of Disallow rules.

However, it turned out that they also had the following as one of the first rules:

User-agent: Googlebot
Disallow:

Because of this, every section of the site that they wanted to be disallowed (as listed in the file) was still being crawled! After fixing this, we have seen the total number of crawlable pages reduced by <90%, and we are starting to see the first signs of life for the site, with great growth in clicks and impressions!
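The mechanics are easy to reproduce: a crawler only follows the rules in the most specific group that matches its user-agent, and an empty Disallow means “allow everything”, so that group effectively cancelled every other rule in the file for Google. A quick sketch with Python’s built-in robots.txt parser (the paths here are illustrative, not the client’s) shows the effect:

import urllib.robotparser

robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /search/
Disallow: /checkout/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# The empty Disallow gives Googlebot its own "allow everything" group,
# so none of the rules under "User-agent: *" apply to it.
print(rp.can_fetch("Googlebot", "https://example.com/search/red-shoes"))      # True
print(rp.can_fetch("SomeOtherBot", "https://example.com/search/red-shoes"))   # False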

Paul Lovell, Founder & SEO Consultant

@_PaulLovell // AlwaysEvolvingSEO.com

I was working with a client that was about to go through a very large migration across multiple countries. Despite my advice and assistance in creating a large-scale redirection project, the CTO decided it was not important enough for that amount of time to be spent on that part of the project.

The funniest thing is that a year later they implemented it. I think they may have realized that, as soon as their site migration was complete, their visibility dropped and the old pages were still indexed – and they are still indexed now. You just can’t help some people!

Sachin Shaji K, Senior Marketing Analyst

@sachinshajik // SearchEngineNation.com

My client’s previous SEO agency used to send bot traffic to their site to influence “Google rank metrics”!

One day the site got hit by a penalty and all the keyword rankings dropped. So much for influencing Google rank metrics!

Alex Quaye‏

@DigitalWhat

Had a new client come to us after their re-platformed e-commerce site went live with new URL structures and no redirects. Thousands of products 🙃

(From the 2018 collection)

Emma Knightly, Digital Marketing Institute

@dmigroup // DigitalMarketingInstitute.com

adding NoFollow tags to all internal links

I’ve witnessed an experienced London-based in-house SEO adding NoFollow tags to all internal links (including menu navigation buttons) on a popular nationwide pet store’s main domain – and the subsequent 80% loss of organic traffic to those critical category pages, followed shortly by his removal from the role and an agency being hired to tidy up the mess.

Chris Green, Head of Search

@chrisgreen87 // Strategiq.co

you know because uncurated landing pages dynamically created by user input will never go wrong

A tool provider was trying to sell in a search system that created static landing pages for the most-searched-for products – you know because uncurated landing pages dynamically created by user input will never go wrong… That wasn’t the horror story, though. When the vendor was demonstrating the software on a test site, we delved into the dynamically created landing pages.

Unfortunately for the rep, he hadn’t checked the results for a while, and in the meantime someone had tried multiple MySQL injections, which had created hundreds of junk pages as the search functionality was spammed into oblivion.

The guy tried to cover it up, but too late – the PERFECT reason not to use their outrageously priced tool had been demoed quite effectively for us 🙈

Roey Skif, Founder

@roeyskif // tldrSEO.com

The first thing I always ask a new client is “what are your online assets?” Maybe there are other old websites with no redirects – this is easily one of the top low-hanging fruits on my list.

However, they didn’t mention it. Only by accident did I find an old domain that was linking to them. It had thousands of referring domains and, after asking them again, it turned out it was practically their old domain.

Not only was it not redirecting when I asked, but they also said they no longer owned it – someone had already bought it and redirected it to their competitor… <huge facepalm>

Stephen Kenwright, Founder

@stekenwright

40% of our main website’s organic traffic came directly to product listings in a single /search/ subfolder (the rest came via category pages and the homepage, with a small amount arriving on the blog).

We had a new version of the website weeks away from deployment where all the product listings had been moved to a subdomain built with JavaScript that couldn’t be rendered. We’d have lost more than 300,000 users/month if we’d gone live. We’d already launched a new brand with the same setup that wasn’t getting any traffic to its product pages after 6 months.

Gianluca Fiorelli, ILoveSEO.net

@gfiorelli1 // iloveseo.net

It’s an International SEO horror story.

An e-commerce website tried to literally target the entire world… going so far as to create hreflang annotations for “countries” like the South Sandwich Islands, a group of islands close to Antarctica where only 30 people live!

Moreover, they duplicated the English version so as to have it in every country, even where English was not an official language. Almost 1,000 hreflang annotations, many contradicting each other. Obviously, Google went crazy, exclaimed “f%%|k you” and started doing whatever it wanted.

Adam Brown

@adammartinbrown // Tai Web Design

An international luxury company with stores across the world finally decided they wanted to implement hreflang, after we had suggested it a long time ago… They decided the best course of action was to implement it for every single location and language in the world, not just their focus markets.

This caused over a thousand different websites to be generated in a matter of a day and submitted to Google. Let’s just say it didn’t go well for them.

Dan O’Leary, Digital Strategist

Overit Media

Late one Friday, a client pushed out a site redesign that had been in a staging environment without the proper SEO team signoffs. When it went live, every page was set to meta robots=noindex, as it had been in the staging environment.

Over the weekend, the site’s search engine traffic began to plummet, and by Monday morning our answering machines and inboxes were flooded with panicked and desperate client stakeholders demanding to know what went wrong.

It was a very quick problem to spot and remedy, but it cost them a full week and a half to recover all of the lost traffic and sales.

Gil Gildner, Discosloth

@gilgildner // Disco Sloth

Over the course of a year, not only did his traffic decrease instead of increase, but they wanted to rebuild his WooCommerce site at a cost of $90,000.

Last year we on-boarded a new client, a great guy who manufactures small parts for a very popular hobby. He’s got a nice little e-commerce operation with profits of over $300,000 a year.

He had hired an SEO agency based in Texas to help increase his organic sales. Over the course of a year, not only did his traffic decrease instead of increase, but they wanted to rebuild his WooCommerce site at a cost of $90,000.

He fired them and hired us. The first thing I did was take a look at his link profile.

The previous agency had built over 11,000 backlinks on trash sites (including porn and terrorist sites), which I disavowed immediately, among other things. A year later, his YoY organic traffic is around 300% of what it was previously.

John Hancock

Anon // Anon

This is a lesson in how automation for SEO can be dangerous.

Worked in-house for a very well-known eCommerce company that got bought by another company. This other company had a solution to get more SEO traffic by automating the creation of category landing pages. They used our product data feed and then scraped Keyword Planner and other websites to find the most-searched phrases that contained keywords from our data feed.

The SEO team was against this, but the feature rolled out nonetheless on a subsection of one of the sites’ subdomains.

Traffic increased on the subdomain as landing pages were rolling out. Some pages where just products with sale type modifier in the URL:

  • xxxxx.com/mickey-mouse-wholesale
  • xxxxx.com/mickey-mouse-cheap
  • xxxxx.com/mickey-mouse-deals

Some were like super-specific categories or filters:

  • xxxxx.com/red-mickey-mouse-cartoon-mousmats
  • xxxxx.com/big-mickey-mouse-cartoon-mugs

One day we noticed some odd keywords coming in via Search Console – lots of porn-related keywords. However, these were very, very XXX-rated terms, the kind of keywords that would get you investigated and that are illegal in most countries.

Their solution was crawling all sorts of sites and clearly had no blacklist of keywords that should be dropped before pages were created. We caught this very early on and did a URL removal on the entire subfolder, so it got removed from Google before anyone reported the site or went to the press.
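For what it’s worth, the missing safeguard is tiny – a blocklist check between “scraped phrase” and “new landing page”. A rough sketch (the term list, slug format and function names are hypothetical):

import re

BLOCKED_TERMS = {"porn", "xxx"}  # illustrative; a real blocklist would be far longer

def slugify(phrase: str) -> str:
    return re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")

def safe_landing_pages(scraped_phrases):
    pages = []
    for phrase in scraped_phrases:
        if any(term in phrase.lower() for term in BLOCKED_TERMS):
            continue  # drop the phrase instead of generating a page for it
        pages.append("/" + slugify(phrase))
    return pages

print(safe_landing_pages(["Mickey Mouse wholesale", "xxx mickey mouse deals"]))
# ['/mickey-mouse-wholesale']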

Axel Hansson, Head of SEO

@buffertse // viseo.se

A large eCommerce store where the developers by “accident” managed to block all bots (including Google), then acted gobsmacked about it when we contacted them.

This resulted in -40% organic traffic and -100% paid traffic over the course of a single month.

Roman Adamita, SEO Manager

@AdamitaRoman // medium.com/@RomanAdamita

Have you ever thought that Googlebot’s switch to Googlebot-Smartphone could decrease organic traffic for specific pages (in this case, brand categories)? It happened!

Our client’s e-commerce website is absolutely mobile-friendly/responsive, but the mobile and desktop versions are a little different. Nobody could have guessed that if the indexable filters (optimized brand categories) from the faceted navigation are missing from the mobile version, Googlebot-Smartphone will not see or crawl them. So after the Googlebot update, these indexable brand categories dropped in organic visibility.

We solved that problem, but we didn’t add the brand filters back to the main categories. Instead, we added the brand name, with an anchor link to similar products from that brand, on every product page. After that, we gained 2.5x the clicks and 4x the impressions to the brand categories.
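A crude way to catch this kind of desktop/mobile gap before Google does is to fetch the same category page with a desktop and a smartphone user-agent and diff the links each version exposes. A standard-library sketch (the URL and user-agent strings are placeholders, and it only helps where the served HTML actually differs by user-agent – it won’t catch content hidden purely with CSS or JavaScript):

import urllib.request
from html.parser import HTMLParser

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ExampleDesktopBrowser"  # placeholder
MOBILE_UA = "Mozilla/5.0 (Linux; Android 10) Mobile ExampleMobileBrowser"  # placeholder

class LinkCollector(HTMLParser):
    # Collects every href found in <a> tags.
    def __init__(self):
        super().__init__()
        self.links = set()
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

def links_for(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

url = "https://example.com/category/shoes"  # placeholder category URL
missing_on_mobile = links_for(url, DESKTOP_UA) - links_for(url, MOBILE_UA)
print("Filter links missing from the mobile version:", sorted(missing_on_mobile))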

Pete McAllister, OutreachPete

@1petemcallister // outreachpete.com

I was asked to consult on strategy at an independent eCommerce company after a new graduate had started. The head of marketing (who wasn’t an SEO expert) was puzzled as to why traffic was declining after active SEO work had taken place. After a couple of conversations with the new starter, it was uncovered that he had misinterpreted what PageRank sculpting was. Whether PageRank sculpting works even when done correctly is a discussion for another day…

However, instead of nofollowing internal links to non-commercial pages, the employee had been reaching out to webmasters and asking them to nofollow their external links to non-commercial pages. Oops!

 

👻🎃

Need more SEO Horror Stories?

Read the complete 6-post 2019 SEO horror stories series:

👻🎃
