Whether you’re in Silicon Valley, Silicon Hills, or a shared workspace – most SaaS and fintech companies have fallen foul of an SEO faux pas or received bad advice.
This year’s SaaS & Fintech SEO horror stories include classics such as “changing hreflang and URLs killed off traffic from a whole region” and “adding all code to a /js/ folder, then disallowing the folder in robots.txt”.
I’d like to say thank you to everyone who took time out to contribute to this post, and for sharing their experiences!
This is one post in a series of Halloween SEO Horror Story blog posts going live this October! Stay tuned for the others, and horror stories from other verticals!
Sam Taylor, Technical SEO Consultant
During the on-boarding month, I performed a backlink audit and found that just under 50% of links contained exact-match keyword anchor text.
The previous agency had spat out so many of these links that they made up nearly 50% of the entire profile, not just of the actively built links. Worst of all, most of these were unnaturally placed into editorial pieces, and the content wasn’t even unique to each site. Syndication is completely fine – say, a set of local newspapers syndicating a national news story. However, when you have 30+ unrelated websites posting the exact same article, same links and everything, that is NOT great!
I originally thought that our strategy could be to simply have the links changed to something a bit more natural until I noticed the duplicate content. Straight into the disavow file, such a shame!
Montse Cano, Digital Manager
“SEO isn’t as important, because we don’t get any business from the web. 🤦🏻‍♀️ But the SEO we have now needs to be extended to the new section. Hreflang is necessary in case someone is in Germany and wishes to read the page in English.”
Edward Bate, SEO Consultant
The client (a billion-dollar company) had H1 headings in the pages’ HTML, but they were either visually hidden or indented off-screen by negative pixel values so they wouldn’t show on the screen.
Apparently, the headings didn’t fit their layout, but they still wanted them for SEO purposes.
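For illustration, the pattern looked something like this (a hypothetical sketch – the heading text and styling are made up, not from the client’s actual site):

```html
<!-- The H1 exists in the HTML for crawlers, but is pushed
     thousands of pixels off-screen so users never see it -->
<h1 style="text-indent: -9999px; position: absolute;">Target Keyword</h1>
```

Hidden text like this is exactly the kind of thing Google’s guidelines warn against: if the heading matters, it should be visible to users too.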
Dan Taylor, Technical SEO Consultant
Working with a global SaaS company, their internal “international expert” decided that they would move from targeting “es” to “es-es”, and changed the subfolder URLs to reflect this, as well as the hreflang annotations.
I warned them very quickly that “es” was also catering for the LATAM (Latin America) market, and that switching to targeting Spanish for Spain only could be problematic. They said it would be fine.
Three weeks later, traffic, signups, and leads from the LATAM region had pretty much flatlined and the business was panicking. The quick fix was to create a global Spanish version as the Globalization head didn’t want to revert the mistake.
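For context, a hreflang cluster that keeps LATAM covered by a global Spanish version might look something like this (a simplified sketch with placeholder URLs, not the client’s actual setup):

```html
<!-- Spain-specific Spanish -->
<link rel="alternate" hreflang="es-es" href="https://example.com/es-es/" />
<!-- Generic Spanish: catches LATAM and any other Spanish speakers -->
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<!-- Fallback for all remaining locales -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

The point being: “es” with no region code is the catch-all for Spanish speakers everywhere, so narrowing it to “es-es” leaves LATAM users with no matching version.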
Peter Nikolow, MobilioDevelopment
I was assisting a company in identifying the cause of a post-migration traffic loss.
Since it was a complex migration – a transition from HTTP to HTTPS, a new URL structure, a server change, and a CMS change – it was normal for traffic to be down.
But in this case traffic was falling too fast, and even the homepage couldn’t be found in the SERPs, even for branded searches. So we checked redirects, log files, and every other possible culprit.
And then I saw it… the X-Robots-Tag HTTP header was set to “none”. I was freaking out when I saw that, and immediately (at midnight) phoned the admins to fix that mess.
By morning, traffic was slowly recovering, but then I saw another issue: the sitemaps seemed to be returning a 500 error to Google. I checked a few times and they were OK on my side.
But when I started hitting them more frequently, I began receiving 500 errors too. I phoned them again. It turned out one of the servers wasn’t properly configured, and the load balancer rarely used it – but whenever the main servers were overloaded and the LB routed requests to that misconfigured server, we got the 500 errors.
And then I hit a new issue. Because the site’s HTML templates had changed too, someone had made heavy use of CSS display:none on links inside some of the divs. Surprisingly, those divs were visible in desktop and mobile browsers, but Google couldn’t see the links. That fix has taken two years and is still “in progress…”.
I had to move on.
Steven van Vessum, VP of Community
This specific company is absolutely massive, and everyone knows it. It’s a company that’s been making lots and lots of acquisitions for years, gathering thousands of domains with millions of links.
That’s great and all, but if you forget to renew those domains, you quickly lose a lot of links too. And that’s what happened. Because of the sheer number of domains this company was managing, and the lack of oversight, they forgot to renew a bunch of domains (each with anywhere from 1,000 to 50,000 referring domains).
This is a horror story because it’s not something that you become easily aware of.
Even when you start to see a steady decline in rankings, this issue isn’t easy to diagnose. On top of that, redirecting domains with dodgy links can be very tricky as well. Therefore, always keep track of which domains are being redirected where.
Jacek Wieczorek, Technical SEO
It was long ago, but this SEO fail taught me a lot!
It was a huge site suffering from a sharp visibility decline after the first Panda update (see, I said it was a long time ago!).
I had just started as a junior SEO and was curious about all the things I could do to help this client. I identified that the website had an enormous amount of duplicate content, mostly because of sorting/filtering options. The most suitable solution was to implement canonicals.
That’s what we did.
I remember sitting and checking the cache of these pages, hoping to see that the canonicals had started working. I checked on Monday. Then Tuesday. Wednesday. The next Wednesday… Nothing happened. Until one day (I’m not sure if it was the Wednesday), I decided to check the canonicals in the pages’ source code… and instead of:
<link rel="canonical" href="URL" />
I found:
<link rel="canonical" hfer="URL" />
HREF vs HFER! This minor typo made by the developers cost the client money! We had it fixed ASAP, and Google picked up the canonicals within a few days. Eventually, the website got its visibility, traffic, and sales back – and even increased them.
Tad Miller, Marketing Mojo
Developers of a video game site thought it would be cool to use fragment URLs (with the #) for the 42 game character/figurine pieces (easy to guess who). Since Google ignores everything after the fragment, this resulted in hundreds of rankings disappearing, and it couldn’t be fixed for a year, until the next year’s game launched.
Harry Dance, Kayo Digital
I was informed by an agency I was working with that:
we don’t disavow bad backlinks because that highlights to Google there may be bad links on that site
It’s almost so bad it’s genius.
Oliver Mason, SEO Consultant
I was asked to review an SEO agency’s recommendations – the audit had a section on site speed, with the main advice being to use Dynamic Serving (different HTML, same URL) in order to reduce image sizes for mobile devices.
Although this would have achieved the desired outcome, it would have been a drastically over-engineered change (a second website?) for something that could be achieved by making the images responsive using srcset. It was a particularly bold recommendation given how clued-up and agile the client was.
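For comparison, the responsive-image approach needs only a few lines of HTML (a sketch with placeholder filenames and widths, not the client’s actual assets):

```html
<!-- The browser picks the smallest adequate file for the viewport;
     no second website or Dynamic Serving setup required -->
<img src="hero-800.jpg"
     srcset="hero-400.jpg 400w, hero-800.jpg 800w, hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Hero image" />
```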
Alizée Baudez, SEO consultant
The CEO and founder of an SEO SaaS tool didn’t know what Google I/O was, believed Google Search would disappear within 3 years, and didn’t care about headings on his pages.
Black-hat methods were his main focus.
Muhammad Roohan, Growth Marketer
When we came back on Monday, the traffic was null.
We were optimizing our homepage, which was already ranking for 400+ high-volume keywords, with visits in the thousands. Somehow, during the optimization, the developer put a “noindex, nofollow” robots meta tag on the homepage.
It was the weekend. When we came back on Monday, the traffic was null.
The dashboard was empty. We were shocked to see that for every single keyword we had been ranking for, we were OUT! We thought it was some algorithm change, but soon realized that our website had been de-indexed.
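The tag in question would have looked something like this – two words that are enough to wipe a homepage out of the index:

```html
<!-- noindex tells Google to drop the page from its index;
     nofollow tells it not to follow any links on the page -->
<meta name="robots" content="noindex, nofollow" />
```

It’s worth adding this exact tag to any pre-launch or post-deploy checklist – it routinely sneaks into production from staging environments.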
Orit Mutznik, Head of SEO
In a previous role, the company wanted to force-redirect all US users to a one-page US site due to regulation (the global site had 20+ languages and tens of thousands of pages).
Despite all of my objections and all the proposed solutions in my arsenal, they went ahead with the forced redirect anyway. I prayed that having rel-alternate (hreflang) annotations in place would somehow save the site from the horror I’d predicted and warned about, but soon enough – within minutes in some cases – the US site started to replace all of the local sites across all locales, taking them down one by one.
I ran to the Devs immediately, showing them that the redirect resulted exactly as I predicted and tried to warn them about.
This convinced them, and they rolled back immediately. Luckily, Google caught on to the rollback just as quickly, and everything went back to normal. No more forced redirects were ever attempted, and the episode definitely proved the importance of working better together.
Svante Hansson, Founder
We had implemented a caching solution the previous year which had performed above expectations. One week during the summer we realized that we were getting about 1/100th of our visitors compared to normal.
After investigating, we realized we hadn’t set up cache expiration, so the cache was slowly but surely eating up all the available space on the drive.
Once it hit 100% usage, the website decided it didn’t want to live anymore. The real horror story is that we didn’t have any monitoring in place, and it went unnoticed for over a week before we realized it.
Need more SEO Horror Stories?
Read the complete 6-post 2019 SEO horror stories series:
- Ecommerce SEO Horror Stories
- Travel SEO Horror Stories
- YMYL SEO Horror Stories
- iGaming SEO Horror Stories
- SaaS SEO Horror Stories
- Local SEO Horror Stories