SEO & Digital Marketing Blog | Dan Taylor

Getting Drive-Thru Hours Visible On Google My Business Listings

Last Update: Friday November 9th, 2018

Google My Business is a must-have tool in a local SEO’s arsenal. Actually, scratch that, in any SEO’s arsenal – I use it to influence brand real estate on SERPs, and I use Google Posts to show off where I’m speaking next, new blog posts, and other cool things.

However, one thing that has started to appear in Google My Business listings (notably in America) is split opening hours, such as this example from Velocity Credit Union in Austin, TX:

If you want to see this for yourself:

https://www.google.com/search?q=Velocity+Credit+Union+Austin%2C+TX+78701%2C+USA&rlz=1CAACAG_enDE634DE634&oq=Velocity+Credit+Union++Austin%2C+TX+78701%2C+USA&aqs=chrome..69i57.3071j0j7&sourceid=chrome&ie=UTF-8

But as you probably know, in Google My Business you can only edit one set of opening hours – so how is Google discovering these split business hours for different services at the same location?

How to add drive-thru hours to your Google My Business Listing

From analysing how Velocity Credit Union and Anchor Bank are achieving this, it turns out the answer is something very simple.

Tables.

Looking at the Velocity Credit Union page, the Google My Business listing used in the link example is for 610 East 11th Street in Austin. The URL on the GMB listing just goes to the homepage, and all locations are listed on a single page in an expandable <div> – which isn’t amazing, but Google is still processing the relevant information well enough to trust it and pull it through to the Google My Business listing.

On this branch’s entry, the lobby and drive-thru hours sit in a table. Google understands this table – even the “M-F” abbreviation – well enough to trust the data and pull it through to the GMB listing.

Using simple HTML for a table, Google is able to understand it, much like it understands ordered and unordered lists and accordions well enough to use them for featured snippets.
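
To make this concrete, here’s a minimal sketch of how a parser can lift split hours out of plain HTML using only Python’s standard library – the markup and hours below are invented for illustration, not Velocity’s actual code, and the point is simply that no special schema markup is required:

```python
from html.parser import HTMLParser

# Hypothetical markup resembling a branch's hours table -- plain HTML,
# no schema.org or other structured-data attributes required.
HOURS_HTML = """
<table>
  <tr><th>Lobby Hours</th><th>Drive-Thru Hours</th></tr>
  <tr><td>M-F 9:00am - 5:00pm</td><td>M-F 7:30am - 6:00pm</td></tr>
</table>
"""

class TableExtractor(HTMLParser):
    """Collects the text of each <th>/<td> cell, row by row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

parser = TableExtractor()
parser.feed(HOURS_HTML)
# Zip the header row with the data row to get a per-service hours mapping.
hours = dict(zip(parser.rows[0], parser.rows[1]))
print(hours["Drive-Thru Hours"])  # M-F 7:30am - 6:00pm
```

Anything a ~40-line stdlib parser can extract, Google’s far more sophisticated processing certainly can.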

Industry Focus: SEO For Safari Tour Operators

In my career I’ve worked with more than 50 travel companies, ranging from global airlines to boutique turnkey trip operators. All of these experiences have given me great insight into the travel industry as a whole.

Understanding the industry, and consumer behaviour within it, is key to putting together a strategic travel SEO plan that brings success in the short, medium and long term.

One niche I’ve worked with within the travel industry is safari tour operators. Whether it’s Big Five safaris, trips across the Okavango Delta, or a more luxurious experience in the Seychelles – I’ve helped safari holiday agencies optimise for them all.

How SEO for Safari Companies differs from normal travel SEO

The safari niche is very different to the average travel niche; whilst the safari-goer blends luxury hotels, fine food and cultural experiences into the trip, they only tend to care about one thing. Animals.

Unlike holidaymakers looking for beach breaks or city breaks, the typical safari goer is influenced by the migration patterns of the animals they want to see, and where they can see them.

This means that, as an SEO, campaigns need to be hyper-relevant and not focused solely on the commercial money phrases, because a lot of research goes into booking an African safari holiday. It’s important that the various stages of the safari-holidaymaker booking cycle are targeted through different areas of the website to generate brand awareness, trust, and leads.

My past experience in Safari SEO

Below is an example of a safari tour operator I worked with from September 2015 to January 2017.

When I took over the campaign, they were ranking for ~284 keywords in the Google UK database; by the time I stopped working with them (they moved SEO in-house), they were ranking for ~2,276.

SEMrush graph of a safari tour operator I used to work with at the end of 2015, through to the start of 2017.

This was achieved through:

  • Undertaking an initial technical audit of the website and working with their developers to improve site architecture, internal linking and improving site speed.
  • A lot of their content was duplicated from other websites – we identified this before the campaign began, so we coupled fixing it with in-depth keyword research and keyword/user-intent matching to produce in-depth guides to a safari holiday in their target countries, as well as tourist hot spots such as the Okavango Delta and the Maasai Mara.
  • I identified that they had previously migrated their URL structure (to the one I had inherited) but not put in place redirects – so I mapped these from an old XML sitemap that was cached in WayBack and recovered a lot of lost backlink equity pointing to the site.
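
The redirect-mapping step in that last bullet can be sketched roughly as follows – the URLs below are invented, and in practice every mapping needs manual review rather than trusting slug matching alone:

```python
# Toy sketch of redirect mapping: pair each legacy URL (e.g. recovered from
# an archived XML sitemap in WayBack) with the current URL sharing the same
# slug. All URLs here are hypothetical examples.
old_urls = ["/holidays/okavango-delta.html", "/holidays/maasai-mara.html"]
new_urls = ["/safaris/okavango-delta/", "/safaris/maasai-mara/",
            "/safaris/serengeti/"]

def slug(url: str) -> str:
    """Extract the final path segment, without trailing slash or .html."""
    return url.rstrip("/").rsplit("/", 1)[-1].removesuffix(".html")

new_by_slug = {slug(u): u for u in new_urls}
redirects = {old: new_by_slug[slug(old)]
             for old in old_urls if slug(old) in new_by_slug}
print(redirects)
```

Each old/new pair then becomes a 301 redirect rule, so backlink equity pointing at the old URLs passes through to their replacements.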

As it was deemed a quiet period, the new content and URL structures were pushed live in the two weeks covering Christmas and New Year, and we were rewarded almost instantaneously – I also think we benefitted from the algorithm update on December 15th 2016 (as the search landscape had changed within that vertical), as well as the subsequent update on January 24th 2017.

Safari holidays and tours vary greatly, so it’s important that any SEO really understands the product, the brand, and the end consumer.

How you can achieve these results yourself

For a lot of independent travel companies, the cost of employing an external SEO vendor can be a significant monthly outgoing – and in a lot of niches and verticals you can do the basics of good SEO and achieve results yourself.

Technical SEO remains the same

The best practices for technical SEO remain the same; your website needs to have:

  • Good information architecture and URL structures, with reasonable click-depths
  • Internal linking that makes sense for users, so that associated content can be easily discovered
  • An acknowledgement of the “reasonable surfer” model – users click about and follow non-linear paths, so don’t try to funnel them.

If you’re just starting off in SEO, I’d recommend that you read this blog post here on the basics and fundamentals of what SEO is.

Content marketing and adding value

User value is one of the keystones of great content. There is no disputing this, and content is no longer about 500 words on a page with X links and Y mentions of keyword Z.

Companies that blog generate 67% more leads per month than those that don’t.

HubSpot

It’s also no longer about focusing only on the sell. You need to provide a lot of value and supporting content to back-up the main content (commercial content) to create “clusters” and areas of the site that hold authority.

For example, if you’re optimising for safari holidays in Tanzania, your core keyword base will probably look something like this:

Keyword/Search Phrase           UK Search Volume
Tanzania safari                 2,400
Tanzania safari holidays        320
Tanzania safari tours           170
Tanzania tours and safaris      20
Tailor made safaris Tanzania    10

Which is great, but users need – and want – to know more; this is your chance to establish yourself as an expert in the field (and saying “we are experts in Tanzanian safari holidays” blah blah blah isn’t enough). Also, don’t get too focused on search volume: it’s a PPC metric representing the average number of monthly searches that contained a paid ad, not total searches.

If I was writing this resource for Tanzania, I would include:

Where to go in Tanzania

  • Serengeti National Park
  • Selous Game Reserve
  • Ngorongoro Crater
  • Katavi National Park/Lake Victoria
  • Ruaha National Park
  • Pemba Island
  • Mnemba Island
  • Lake Manyara National Park

And then with these, go even further and expand into:

  • What can I do there?
  • What animals will I see?
  • When’s the best time of year to go?
  • Is it suitable for families?
  • Should I do this in conjunction with something else?
  • Is there a certain order I should do these in?

Answer these questions – also known as interrogative searches – and you can create a really powerful resource, and subtly introduce commercial CTAs. You can discover these questions through free tools such as Answer The Public.

Answer The Public – Tanzania Safari Prepositions & Interrogative Search Phrases

DIY keyword research for safari tour operators

Or through tools such as Serpstat, which has a very reasonable starter package priced at $19 a month, as well as a free account limited to 30 queries per day – if you’re focused in your research, that limit should be fine.

It’s also important to look at the search results and see what kind of results Google is bringing back for certain terms, and the quality of those results – those are the benchmarks you need to beat.

Get in touch

If you’d like to talk about improving the organic search performance of your safari holiday agency, or your travel company in general, please get in touch.

The Basics Of SEO: What Is SEO?

Having worked in SEO for a number of years, and having been able to work with some of the biggest brands from across the globe, I’ve experienced a few things. I’ve also read a lot of content, some good and some… Left me speechless. There is a lot of noise in the SEO community, and unfortunately well publicised does not always correlate with expertise.

However, if I was starting SEO for the first time tomorrow – this, in my opinion, is the stuff I’d want to know straight off the bat. In this guide/article/very long blog post, I’m not intending to reinvent the wheel, I’m going to link out to a lot of content and stuff other people have done, because it’s great and me rewriting it or reinventing it would be pointless.

Contents

In this article, I’m going to cover:

  • What is SEO?
  • How do we make websites “Google friendly”?
  • How does Google work?
  • What are spiders/crawlers?
  • What are Google’s algorithms?
  • What is an XML sitemap?
  • What are meta titles and meta descriptions?
  • Understanding keywords
  • What is black hat SEO?
  • Basics of local SEO
  • How a local SERP differs from a normal SERP
  • Google’s local algorithms

Before we get started, I’d strongly recommend that you sign up to this email newsletter from DeepCrawl (a software provider). Almost every week Google’s John Mueller hosts a “webmasters hangout”, and as sure as the sun sets, the team at DeepCrawl summarise it and email it out!

What is SEO?

search engine optimization
noun COMPUTING
noun: search engine optimization;
plural noun: search engine optimizations
the process of maximizing the number of visitors to a particular website by ensuring that the site appears high on the list of results returned by a search engine.
"the key to getting more traffic lies in integrating content with search engine optimization and social media marketing"

As well as generating traffic and leads, SEO in a practical sense is also about working with other stakeholders, such as developers and other non-marketing people within a business, to instil “SEO best practice” across all activities.

An example of this would be working with the developers and the PPC team to remind them – and double-check – that they are preventing Google from indexing PPC landing pages (through the use of a page-level robots tag), so Google doesn’t end up with two conflicting landing pages for the same group of keywords.
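
A check like this is easy to automate. Below is a rough sketch – the pages are invented, attribute order can vary in real markup, and a production check should also inspect the X-Robots-Tag HTTP header:

```python
import re

def has_noindex(html: str) -> bool:
    """Rough check for a page-level robots noindex directive."""
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

# Hypothetical pages: a PPC landing page that should be noindexed,
# and an organic landing page that should not be.
ppc_page = '<head><meta name="robots" content="noindex,nofollow"></head>'
organic_page = '<head><title>Safari Holidays</title></head>'
print(has_noindex(ppc_page))      # True
print(has_noindex(organic_page))  # False
```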

In my time I’ve also found myself working with companies on ensuring they’re tracking correctly through Google Analytics, and some of this scope even stretches to making sure other channels such as email and social are tracked correctly – it may technically be out of scope, but making sure all channels are measured correctly is ultimately in your favour.

How do we make websites “Google friendly”?

When I first started out in SEO properly I bought a book, The Art of SEO, and it, along with a lot of other sources in the industry, classifies SEO into three segments: on-page, off-site and technical. However, given changes in Google’s algorithms and SERP (search engine results page) features, for me modern SEO is made up of four segments:

The SEO Tetrahedron. Devised by Dan Taylor 2018.

Technical SEO

  • Ensuring that search engines can effectively crawl, process, and index web-pages across a website
  • Making sure that the website responds and forms correctly across all devices and browsers
  • Ensuring that the website provides a strong, technically excellent foundation for on-page, off-site, and user experience efforts

Off-site SEO

  • Real businesses do marketing – not just for PR and backlinks; real businesses are active
  • Having a business presence (as well as backlinks and citations) on industry relevant websites
  • Having members of the business and associated with the business active within the industry and business community, establishing themselves as industry leaders

On-page SEO

  • Content, how well it satisfies a user query (main content) and then goes on to either link to, or provide additional value around the topic (supporting content)
  • The structure of content ontologies – nesting appropriate subfolders and categories to group related content together
  • Not spreading content thin and producing multiple URLs with minor content differences

User Experience

  • Site speed, and how quickly content loads for users on desktop, mobile and tablet
  • The mobile usability of the website, such as its responsiveness
  • How content is presented (above the fold, clearly visible on load)

How does Google work?

Honestly, there are many ways to answer this question, but for me one of the best starting places is to watch the below YouTube video from SMX West 2016, in which Paul Haahr (Google software engineer) gives great insight into how Google determines its rankings and algorithm changes.

Google pros Gary Illyes (Webmaster Trends Analyst) and Paul Haahr (Software Engineer) give SMX West attendees an inside view of how Google determines its rankings and algorithm changes.

What are spiders/crawlers?

A search engine spider does the search engine’s grunt work: It scans Web pages and creates indexes of keywords.

Once a spider has visited, scanned and categorized a page, it follows links from that page to other sites. The spider will continue to crawl from one site to the next, which means the search engine’s index becomes more comprehensive and robust.
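
That crawl behaviour can be illustrated in a few lines of code – the link graph below is made up, but the breadth-first pattern is how a crawl frontier grows from a single seed URL:

```python
from collections import deque

# Invented internal link graph: page -> pages it links to.
link_graph = {
    "example.com/": ["example.com/safaris/", "example.com/blog/"],
    "example.com/safaris/": ["example.com/safaris/tanzania/"],
    "example.com/blog/": [],
    "example.com/safaris/tanzania/": ["example.com/"],
}

def crawl(seed: str, graph: dict) -> list:
    """Visit pages breadth-first, following each discovered link once."""
    seen, queue, order = {seed}, deque([seed]), []
    while queue:
        page = queue.popleft()
        order.append(page)
        for link in graph.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

print(crawl("example.com/", link_graph))
```

This is also why internal linking matters so much: pages a crawler can’t reach through links may never make it into the index at all.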

What are Google’s algorithms?

Google’s algorithms, some more famous than others, have shaped the modern Google and much of the SEO industry as we know it. Rather than reinvent the wheel, Search Engine Journal and a number of authors put together this great resource covering pretty much all of Google’s algorithms. Specifically I recommend reading the below articles:

Some SEO Basics

What are XML sitemaps?

XML sitemaps can serve two functions: 1) to provide Google, Bing and the other search engines with a list of the URLs that you want indexed, and 2) to implement hreflang (we’ll come across this when we look at international SEO).

XML sitemaps are also great when combined with Google Search Console, as you can review the URLs you want indexed and use the reports to determine any issues.

Google Search Console: Coverage Report

Unfortunately, XML sitemaps also come with an associated myth: that you need to “submit your website to Google”. Uploading a sitemap is not submitting your site to Google; it’s making use of a feature to better guide and advise search engine crawlers, as well as a way to get useful data and insights into your site’s performance.

You’ll typically find an XML sitemap by typing in domain.com/sitemap.xml, and this Yoast guide will show you how to submit it to Google Search Console.
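
If you’re curious what a sitemap actually contains, here’s a sketch that builds a minimal one with Python’s standard library – the URLs are invented, and in practice a CMS plugin such as Yoast generates this for you:

```python
import xml.etree.ElementTree as ET

# Build a minimal urlset document following the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for path in ("/", "/safari-holidays/", "/safari-holidays/tanzania/"):
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = "https://www.example.com" + path

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```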

What are meta titles and meta descriptions?

Meta Titles / Title Tags

Meta titles, or title tags, are a very potent and undervalued weapon in an SEO’s arsenal. When optimised, they form an important ranking factor and an important piece of correct information architecture.

There is no hard and fast rule as to “best practice” when producing meta titles, but as a rule of thumb:

  • Don’t make them longer than 60 characters, including spaces, special characters and brand. Google technically has a limit on pixel width, so a W is wider than an I. To check this you can use a tool like this, or a =LEN formula in Excel/Google Sheets.
  • Put the user first rather than trying to keyword stuff: I always try to explain the page first, then think about the terms users use when searching, and the final consideration is including brand (especially if the page is a core commercial/money page).
  • Don’t repeat title tags across multiple pages. This is sometimes ok, if the page is a paginated page of the blog (i.e. blog page 2), or on taxonomy pages such as category and tag.
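
The 60-character rule of thumb is trivial to script. A rough sketch – remember the real limit is pixel width, so character count is only a heuristic, and the titles below are made up:

```python
MAX_CHARS = 60  # heuristic only: Google's true cutoff is pixel width

def check_title(title: str) -> str:
    """Flag title tags likely to be truncated in the SERPs."""
    if len(title) <= MAX_CHARS:
        return f"OK ({len(title)} chars)"
    return f"Too long ({len(title)} chars) - likely truncated"

print(check_title("Tanzania Safari Holidays & Tours | Example Brand"))
print(check_title("Tanzania Safari Holidays, Tours, Trips, Packages "
                  "and Tailor Made Itineraries | Example Brand"))
```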

Meta Descriptions

Meta descriptions aren’t a ranking factor, and Google can (and will) overwrite them if it feels content on your page better meets the intent and needs of a searcher, but this doesn’t mean you should discount them.

Using Google Search Console, you can look for pages not performing as well as they could be in terms of clicks from the SERPs, as well as see the majority of keywords that those pages (URLs) are appearing for.

Google Search Console: Search analytics report

Good meta descriptions should focus on explaining to the user why the page is relevant to their query – this isn’t just about matching search phrases and keywords, but also the intent behind them.

Further Reading

Honourable mention, Meta Keywords

Enough said. (However, meta keywords are useful in Yandex and Baidu)

Understanding Keywords

For me, keyword strategy is still one of the more misunderstood areas of SEO, and this has been made worse by a lot of the larger industry tools that try to be all things to all people.

When selecting keywords for tracking or optimisation, there are three things (in my opinion) you should take into consideration.

What’s currently ranking for that keyword?

Leave the tools and perform a search for that specific keyword and see what kind of results Google is showing:

  • Are they commercial/e-commerce results?
  • Is it showing Google shopping ads?
  • Is it showing other SERP features such as featured snippets, knowledge graph panels, or other items?
  • Is the query returning “branded” results?
  • Is it showing blogs/guides and other information resources?

From here, you can determine if the query is right for your site and business model, and ultimately whether it’s worthwhile optimising for – if you’re not selling online and Google is pretty much only returning sites that do, you’re going to struggle.

What’s the intent behind the keyword?

This relates to the checks you have already done by looking at the search results themselves: defining the intent behind a keyword.

Again, rather than reinventing the wheel – here is a guide to user intent I wrote for Search Engine Journal:

Search volume is almost meaningless

A lot of people fall into the trap of using search volume as a “be all and end all” metric when choosing which keywords to track and optimise for.

Search volume is a paid search metric, and relates to PPC. Average monthly search volume actually means “the average number of monthly searches in which a paid advert appeared”. We used to get the data from Google Keyword Planner (a PPC tool), and subsequently this PPC metric has made its way into a lot of “SEO” tools.

It’s great for helping identify the “big keywords” – which, to be fair, should be obvious if you know your industry – but tools like Google Search Console, Bing Webmaster Tools and Yandex Metrica all have search analytics reports that detail the search phrases you’re appearing for.

This, combined with the bigger tools and competitor research, will enable you to deliver some great keyword research and identify the phrases that really will grow your client’s organic presence (I’ll cover keyword research in detail in a future post).

What is black hat SEO?

Black hat SEO refers to a set of practices used to increase a site or page’s rank in search engines through means that violate the search engines’ terms of service. The term “black hat” originated in Western movies to distinguish the “bad guys” from the “good guys,” who wore white hats.

Source: WordStream

From experience, SEO is not as straightforward as black and white; it’s ultimately about delivering results – and sometimes that does mean stepping into grey areas. Recognised black hat tactics include:

  • Content Automation
  • Doorway Pages
  • Hidden Text or Links
  • Unnatural Keyword Stuffing
  • Cloaking
  • Link Schemes
  • Guest Posting Networks
  • Link Manipulation (including buying links)
  • Article Spinning
  • Link Farms, Link Wheels or Link Networks
  • Rich Snippet Markup Spam
  • Automated Queries to Google

The difference between grey and black, however, is that grey hat SEO is often done in response to competitor analysis, matching what Google is currently ranking for certain queries (a sort of “can’t beat them, join them” mentality).

Black hats, however, set out to bring about results “quickly” and often without longevity, which can often lead to a Google penalty.

A Google penalty, or an adverse reaction to an algorithm change, can be costly to businesses in both the short and long term.

For me SEO is about delivering long term benefits to clients, and working with them for a number of years.

Also, from experience, as clients get bigger (and verticals more competitive), black hat techniques have little to no impact and are more reserved for smaller niches.

There are instances where tactics such as doorway pages (if implemented correctly) can work, but in my opinion you should learn about black hat techniques so you can avoid them and be a better SEO.

For further reading on black hat SEO, I recommend Padraig O’Conner’s An Introduction to Black Hat SEO, on HubSpot.

TechSEO Boost: Call For Research Finalist

I’m really pleased to announce that the research paper we (the team at SALT.agency) submitted as part of TechSEO Boost’s call for research has been selected as a finalist.

We’ve been chosen from the submissions alongside fellow SEO professionals; the three finalists are:

Eric Enge, Perficient (formerly Stone Temple Consulting), with the talk: Do Links Still Matter for SEO in 2017

Dan Taylor, SALT.agency, with the talk: Utilising Cloudflare Workers to Overcome the Challenges of Legacy Tech Stacks and High DevOps Costs

Vincent Terrasi, OVH, with the talk: Build a Machine Learning Model Capable of Predicting a Webpage Ranking in SERP with a Reliability of 92%

All research papers submitted went through a peer vetting process by a panel of judges and well-known industry experts. The judging panel consisted of:

  • Russ Jones, Moz
  • Clark Boyd, Clickz
  • Alexis Sanders, Merkle
  • Neil Martinsen-Burrell, Moz
  • Aleyda Solis, Orainti
  • Ruth Burr Reedy, UpBuild
  • Rhea Drysdale, Outspoken Media
  • Geoff Kenyon, TechSEO Boost

The winners of the research competition shall be revealed live at the TechSEO Boost conference in November.

Introducing Hreflangchecker.com

After a few months of working on the idea, its first phase has become a reality: hreflangchecker.com.

Why do we need another Hreflang checker?

Honestly… we don’t. There are some great ones out there, including this tool by Merkle, and the Dejan SEO Flang tool (I feel as though that tool has been around forever).

However, this hreflang checking tool is just the first phase of what will be a wider international SEO and hreflang project, including a brand new way to generate hreflang tags on a page.

HreflangChecker works by auditing the entered URLs to verify the accuracy of the hreflang code, and if the right codes are being used.
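
The core of that audit is the hreflang “return tag” rule: every alternate URL a page references must reference it back, or search engines ignore the pair. A minimal sketch of the check – URLs and language codes are invented:

```python
# page -> {language code: alternate URL} annotations found on that page.
hreflang_map = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    # The German page is missing its return tag to the English page:
    "https://example.com/de/": {"de": "https://example.com/de/"},
}

def missing_return_tags(annotations: dict) -> list:
    """Return (page, target) pairs where the target doesn't link back."""
    errors = []
    for page, tags in annotations.items():
        for target in tags.values():
            if page not in annotations.get(target, {}).values():
                errors.append((page, target))
    return errors

print(missing_return_tags(hreflang_map))
```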

What’s next?

HreflangChecker.com is part of a wider project we’ve dubbed Project Sloth, which looks at improving hreflang and making it easier and more accessible for developers (and businesses) to implement.

37 SEO Horror Stories From The Frontlines

SEO is an extremely active community, which is one of the best things about it – but because, in theory, anyone with WiFi can offer SEO, we tend to see some amazing advice being offered.

I’ve reached out to the SEO community via Twitter, asking my fellow SEO experts to share some of the best advice they’ve seen from previous agencies and consultants on accounts they’ve inherited. I’ve also scoured the internet and forums looking for other hidden gems, and I found some… For example:

The developers blocked Googlebot by IP address, because of suspicious crawling activity…

A few of them we will all likely have come across throughout our careers; others, however, are so… unbelievable you have to double-take.

Warning: some of these SEO advice stories are shocking – you will hide behind your sofa, sleep with the light on, and never look under the bed again.

Also, a quick thank you to everyone who responded and helped make this article possible.

So let’s begin – here are some of the best examples of bad SEO advice and horror stories:

More EMDs, more rankings!

Zack Neary-Hayes (Freelance SEO Consultant)

Previous SEO advisors had set up a series of one-page websites on exact match domains, with duplicate content, so that the content was discoverable in as many locations as possible.

David Iwanow (Global SEO Manager @ Schibsted Media Group)

Saw a business that, on advice, spun up ~4,000 EMD domains in order to rank for as many niche queries as possible, all containing duplicate content. Believe it or not, they then got a manual penalty notice for pure spam… The agency said it’s likely not related.

Manual penalty for spam? Definitely not related to the 4,000 duplicate content EMD domains we told you to put up.

More traffic? Let’s rank for porn!

Jacque Urick (Director of SEO @ Sears Home Services)

This was circa 2007. The third-party SEO firm had changed ALL the image names and image alt text on the website to sex crime and illegal porn related terms…

The site was getting lots of traffic, as the third-party SEO firm had promised. It was all image search traffic…

Results = Lots of content + No internal links

Andrew Cock-Starkey (aka Optimisey)

Had a client who’d been advised “For SEO, you need more content.” Not strictly bad advice, but they’d been on a content treadmill for months/years… but with zero focus and no internal linking. Nothing.

Sorted just the internal linking issues, and impressions/visibility went up 300%…

HTTPS & SSL don’t matter

This horror advice was covered extremely well by Troy Hunt on his site, but it’s that bad and with cyber security and data security being so prevalent at the moment – it has to be included…

If you don’t have sensitive information on your site, you’re not selling a product or a service, there is no checkout page, you don’t need a certificate. It doesn’t help, you know, increase security.

And from a different SEO expert in the same article by Troy…

Encrypting all pages on your website will only slow them down.

The more you spend on PPC, the better your SEO!

Zack Neary-Hayes (Freelance SEO Consultant)

Had one client who was told their site would never rank organically, so they were recommended some whack £10k-per-month PPC campaigns and a landing page creation service.

They did that, had a crap return, and then we worked on the site – it was such a straightforward campaign to improve organic search performance…

Luke McCarthy (Digital Product Lead @ Mayflex)

The more you spend on PPC with Google, the better your site ranks organically in Google. Classic.

Quick, fix your bounce rate!

Gerry White (SEO Consultant @ JustEat)

The business was advised by their SEO agency that bounce rate was definitely a ranking factor, and they had to lower it – so they installed the Google Analytics code twice on the website to do so…

Dan Taylor

Talking about bounce rate, I’ve also sat in a talk at MeasureCamp Manchester where a “senior SEO” tried to argue that bounce rate is a critical ranking factor. In the same talk he also spoke about limiting clients’ access to their own Google Analytics and which metrics you show them. Oh dear.

Let’s manipulate dwell time!

Zack Neary-Hayes

Dwell time is a ranking factor. Some SEO consultant told this poor lad working at a dental surgery to go on all the work PCs and leave the homepage open all day to crank up the session duration…

And whilst we’re talking about the wonders of Google Analytics:

Seth Steinman

One client’s previous agency was charging $225 per month to “maintain Google Analytics”.

Content = slow pages (and Google doesn’t care about content) 

Adam Reaney (Sheffield based SEO)

The client’s site was professional services, which if anything made this suggestion even more shocking, as the content was essentially their service list. It didn’t perform too badly in Google, but there were some speed issues, which were a problem.

The ‘SEO agency’ told them that cutting the content on their service pages down to 50 words max would speed the site up and promote UX. For content, the site had around 7–8 service pages, each with unique, high-quality content of around 800–1,000 words.

Needless to say, they completely vanished from the rankings, still had the site speed issues, and experienced a disastrous drop in traffic.

Make sure your comment backlinks are 10 words+

Sam Palmen

seo-comment-backlinks

So… Issues here…

  • If you read this article and watch the video on how to improve your DA – both produced by Moz, who invented the DA metric – you’ll quickly see how flawed this is.
  • DA is a metric that Google doesn’t take into account anyway

Dangerous way to have explained 301s

Christopher J Connor Jr. (SEO Consultant in Portland)

Saw this in an audit a client was given by a previous SEO agency:

301 redirects – Each time a page redirects to another page, there is potential that the site visitor will see a slight delay for the page to render. It is recommended to update old URLs to the correct URL to reduce page delays. Low priority item to fix.

Sorry? What? Remove all redirects?

Don’t allow people to link to your website

Yosef Silver (Founder of Fusion Inbound)

I was once in an SEO role at a very large organisation and discovered the terms of service (TOS) on the website did not allow people to link to content. This company would actively send cease and desist letters when they got new links.

Dan Taylor

Unfortunately, websites trying to control how other people link to them through their terms of service isn’t anything new, or uncommon; the below is from the TOS of a large, well-known UK-based consumer product advisory site:

If we ask you to remove or change a link to our websites, you do so as soon as possible.

and

Unless you obtain our express permission, you must not include more than 10 links to our websites on any one of your web pages.

If you want to read these for yourself, the TOS can be found here.

Full service link spam creation & disavow service

Martin Woods (SEO Consultant)

I once found a business paying for SEO services from a well-known SEO agency where, on one side of the room, a team was producing backlinks for the client (and charging for it), while on the other side of the room another team was working on disavowing those links (again, charging the client for it)… A new meaning to “full service agency”!

On the topic of disavowing backlinks…

Harry Dance (Digital Marketing @ Eagle Online)

I was informed by an agency I was working with that

we don’t disavow bad backlinks because that highlights to Google there may be bad links on that site

It’s almost so bad it’s genius.

Google distrusts foreign websites linking to you

David Iwanow (Global SEO Manager @ Schibsted Media Group)

I have seen Google Search Console disavow files including sites I operate in other markets… Yes, disavowing their own websites because they use a different ccTLD and apparently Google distrusts foreign sites linking to you…

Externally link to Wikipedia to improve local rankings

Yosef Silver

I recently inherited a client from someone who recommended linking to Wikipedia pages for the cities they were targeting for local rankings. Outbound site-wide footer links to Wikipedia. Yup. That.

We don’t need SEO, you just need to let Google know about our site

Peter Nikolow (Mobilio Development)

Had a potential client come to me saying that their website had already been “SEO optimised”, and all I needed to do was let Google know about it – five to ten minutes’ worth of work in their eyes. They had already paid the development agency for a super-special, and expensive, SEO package.

SEO package deals should be treated with caution at the best of times, and IMO, even more so if it means asking an agency to critique its own development.

“Knows some SEO stuff”

Jeremy Rivera (Director of SEO and Content Marketing for Raven Tools and TapClicks)

Circa 2006. Guy said that he’d received a recommendation from his relative who “knew some SEO stuff”.

His site lost all rankings. It turned out he had added “Yorba Linda Real Estate” 1,000 times in white text on the white background of every page of his site, and in the meta keywords tags.

Linking back to the agency

Esben Rasmussen (Online Analyst @ Kamstrup)

The former agency had placed a fully transparent (invisible) PNG in the footer of its customer’s website and used it as a backlink, with the alt text set to the agency’s name + desired keyword… Just wow.


Dan Taylor

I once did some work for a local locksmith whose development agency had hardcoded their agency backlink in the footer… The only issue is they misspelled their own agency URL in the href, so it was a permanent external 404 in the footer. Then when I raised it to be fixed, they wanted to charge the client for the dev time!

We all love inheriting crap site migrations…

Steve Morgan (Freelance SEO consultant)

Took on a client that’d had 3-4 site redesigns over the years (each time with URL changes), and despite having worked with two SEO agencies in the past, none of the old URLs from any of the old versions of the site had been redirected – they were all still 404ing.

I also once had a dev let a site update go live without consulting the SEO team, so all site arch., schema, metas, and the whole nine yards were wiped out. That was a NIGHTMARE 😱

Motherwell

I got threatened with being sued a week after a client signed a contract, because their Google traffic went to zero. I hadn’t actually started, as they had asked me to wait a week until their site redesign was finished.

That redesign included adding Disallow: / to robots.txt…
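A simple pre-launch check could catch this. The sketch below (a hypothetical helper, not any real tool) flags a robots.txt that disallows the entire site – the classic staging leftover:

```python
# Hypothetical sketch: a launch-checklist test for the classic
# "Disallow: /" left over from staging in robots.txt.

def blocks_entire_site(robots_txt):
    """True if any line disallows the whole site (Disallow: /)."""
    return any(
        line.strip().lower().replace(" ", "") == "disallow:/"
        for line in robots_txt.splitlines()
    )

staging = "User-agent: *\nDisallow: /\n"
production = "User-agent: *\nDisallow: /admin/\n"

print(blocks_entire_site(staging), blocks_entire_site(production))  # True False
```

Wire something like this into the deploy pipeline and a full-site disallow never reaches production unnoticed.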

Twice, international publicly traded companies pushed new sites live, without telling us they were working on a new site, and just pushed test to production, changing all URLs with no redirects and leaving it all noindexed. Woot.

Martin Kelly

During a platform migration, a site was launched with no sitemap (or the capability to set a dynamic sitemap), database tables containing meta values weren’t carried over to the new platform, whole sections of the site were removed, and the resulting 404s were left as a secondary priority…

No canonical tags were set, the CMS didn’t even have capabilities for multi-language deployment, hreflang tags were absent, and a whole section of the website containing main pages didn’t even communicate with the CMS… there’s too much more to write…

Do they know what Hreflang is?

Dan Taylor

I was working with a French travel website that had content in French, German, English and Dutch, and was desperate to rank within the UK for key commercial terms but wasn’t visible at all. The specialist SEO agency they had been working with had been producing content and building links – great, but with no results.

So, we had them implement Hreflang at the end of 2015, and you can see what happened…

Hreflang implemented correctly led to… performance (I can’t really say an increase, as there was nothing before it).
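For reference, a correct hreflang set-up means every language version carries the full, reciprocal set of annotations, including a self-reference – annotating only the “other” versions is a common implementation mistake. A minimal sketch (hypothetical URLs) that generates the tag set to place in the `<head>` of every version:

```python
# Hypothetical sketch: build the full, reciprocal hreflang tag set for a page.
# The *same* complete set (self-reference included) goes into the <head>
# of every language version listed.

def hreflang_tags(versions):
    """Build <link rel="alternate"> tags from a {lang_code: url} mapping."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(versions.items())
    ]

versions = {
    "fr": "https://example.com/fr/",
    "de": "https://example.com/de/",
    "en-gb": "https://example.com/en-gb/",
    "nl": "https://example.com/nl/",
}

for tag in hreflang_tags(versions):
    print(tag)
```

Generating the set from one mapping, rather than hand-editing each template, is what keeps the annotations reciprocal as languages are added.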

Where we’re going, we don’t need redirects…

Dan Smullen (SEO Consultant)

Forget the redirect map, we’ll just automate redirects to internal search (post-migration) – this way, no matter what the user searches, they’ll land on the right page… But the internal search results are all blocked by robots.txt.

Unfortunately, I’ve come across a number of SEO and development teams that don’t wholly understand how important redirects are (and how important it is to use the correct redirect status codes).

David Iwanow

I’ve also heard of internal stakeholders in the past blocking redirect projects because they apparently wanted a clean break from the old brand…

 

Once came across a dev who used Excel to put together redirects (wait for it), and then used Excel’s drag-fill feature, so the 301 turned into 302, 303, 304, etc. for the redirects in the .htaccess file. Took down the site.
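That one is easy to catch mechanically: before deploying, flag any `Redirect` directive whose status code isn’t 301. A rough sketch against a hypothetical .htaccess snippet:

```python
# Hypothetical sketch: a pre-deploy sanity check that would have caught the
# Excel drag-fill mistake -- flag every Redirect directive whose status
# code isn't the permanent 301.
import re

HTACCESS = """\
Redirect 301 /old-page-1 /new-page-1
Redirect 302 /old-page-2 /new-page-2
Redirect 303 /old-page-3 /new-page-3
"""

def find_bad_redirects(htaccess_text):
    """Return (status, old_path) pairs for every non-301 Redirect line."""
    pattern = re.compile(r"^Redirect\s+(\d{3})\s+(\S+)", re.MULTILINE)
    return [(int(code), path)
            for code, path in pattern.findall(htaccess_text)
            if int(code) != 301]

print(find_bad_redirects(HTACCESS))  # [(302, '/old-page-2'), (303, '/old-page-3')]
```

A handful of lines like this in CI is far cheaper than recovering a de-indexed site.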

Had a new client come to us after their re-platformed e-commerce site went live with new URL structures and no redirects. Thousands of products 🙃

Google was crawling the site too quickly to index it properly

Tony N. Wright

Developer put a Robots.txt no-index on the entire site (Fortune 50 travel site) because he thought “Googlebot crawled it too fast to index it properly”


We all ❤️ developers

Enroll Media Group

Implementing the GTM container code, using the meta description input field within the CMS… Universally… Across the whole website.

George Danny Murphy

I’ve been trying for three weeks to get an in-house developer on a project I’m working on to remove a dev site – which has been hacked with malware and has tons of spam content – from the index. The entire site is indexed. Fun.

Barry Adams, Brighton SEO (September 28 2018). Picture Tweeted by @adoubleagent. To be fair, if the developer is calling me a dick because I’m asking them to fix things, I feel like I’m winning.

Joseph Klok

Site normally hosted on 4 servers. Bumped up to 6 servers for peak season, but I didn’t know about the 2 additional servers.

Robots.txt was Disallow: / on the 2 extra servers. Pages were constantly dropping in and out of the index. Took a couple of hours to figure out what was going on.

Tad Miller

Developers of a video game site thought it would be cool to use fragment (#) URLs for the 42 game character/figurine game pieces (easy to guess who).

This resulted in hundreds of rankings disappearing, and it couldn’t be fixed for a year, until the next year’s game launched.

Guilherme Kawasaki

At my old job we were republishing more than 10,000 pages to a newly optimised layout. The dev published ALL the pages with one small, simple tag:

isfamilyfriendly=”no”

… it went from 1.5 million page views to 10,000 page views in one day. Fortunately we discovered it within 2 days, and within 2 weeks the views came back.

Want more SEO horror stories?

If that wasn’t enough, here are some other great articles written over the years with some more, truly amazing horror stories: