Google My Business is a must-have tool in a local SEO’s arsenal. Actually, scratch that: in any SEO’s arsenal. I use it to influence brand real estate on SERPs, and I use Google Posts to show off where I’m speaking next, new blog posts and other cool things.
However, one thing that has started to appear in Google My Business listings (notably in America) is split opening hours, such as this example from Velocity Credit Union in Austin, TX:
But as you probably know, in Google My Business you can only edit one set of opening hours – so how is Google discovering these split business hours for different services at the same location?
How to add drive-thru hours to your Google My Business Listing
I’ve analysed how Velocity Credit Union and Anchor Bank are achieving this, and it’s something very simple.
Looking at the Velocity Credit Union page, the Google My Business listing used in the example is for 610 East 11th Street in Austin. The URL on the GMB listing just goes to the homepage, and all locations are listed on a single page in an expandable <div>, which isn’t amazing, but Google is still processing and understanding the relevant information enough to trust it and pull it through to the Google My Business listing.
Using simple HTML for a table, Google is able to understand it, much like it understands lists (ordered and unordered) and accordions enough to use them for featured snippets.
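As a rough sketch of the kind of markup involved, here is a small Python snippet that renders a split-hours table as plain HTML. The department names and times are made up for illustration, not taken from Velocity Credit Union’s site:

```python
# Hypothetical split opening hours for one location; the departments
# and times below are invented for illustration only.
hours = {
    "Lobby": "Mon-Fri 9:00am - 5:00pm",
    "Drive-thru": "Mon-Fri 7:30am - 6:00pm",
}

def hours_table(hours):
    """Render opening hours as a simple HTML table, the kind of
    plain markup search engines can parse from a location page."""
    rows = "".join(
        f"<tr><td>{dept}</td><td>{times}</td></tr>"
        for dept, times in hours.items()
    )
    return f"<table><tbody>{rows}</tbody></table>"

print(hours_table(hours))
```

The point is the simplicity: no scripts, no fancy widgets, just a plain table that crawlers can read.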
In my career I’ve worked with more than 50 travel companies, ranging from global airlines to boutique turnkey trip operators. All of these experiences have given me great insight into the travel industry as a whole.
Understanding the industry, and consumer behaviour within the travel industry, is key to putting together a strategic travel SEO plan that brings success in the short, medium and long term.
One niche I’ve worked with within the travel industry is safari tour operators. Whether it’s safaris to see the big 5, trips across the Okavango Delta, or a more luxurious experience in the Seychelles, I’ve helped safari holiday agencies optimise for them all.
How SEO for Safari Companies differs from normal travel SEO
The safari niche is very different to the average travel niche; whilst the safari-goer blends luxury hotels, fine food and cultural experiences into the trip, they only tend to care about one thing. Animals.
Unlike holidaymakers looking for beach breaks or city breaks, the typical safari goer is influenced by the migration patterns of the animals they want to see, and where they can see them.
This means that, as an SEO, campaigns need to be hyper-relevant and not focused solely on the commercial money phrases, because a lot of research goes into booking an African safari holiday. It’s important that the various stages of the safari-holidaymaker booking cycle are targeted through different areas of the website to generate brand awareness, trust, and leads.
My past experience in Safari SEO
Below is an example of a safari tour operator I worked with from September 2015 to January 2017.
When I took over the campaign, they were ranking for ~284 keywords in the Google UK database; by the time I stopped working with them (they moved SEO in-house), they were ranking for ~2,276.
This was achieved through:
Undertaking an initial technical audit of the website and working with their developers to improve site architecture, internal linking and improving site speed.
A lot of their content was duplicated from other websites, which we identified before the campaign began. We coupled this with in-depth keyword research and keyword/user-intent matching to produce in-depth guides to a safari holiday in their target countries, as well as tourist hot spots such as the Okavango Delta and the Maasai Mara.
I identified that they had previously migrated their URL structure (to the one I had inherited) but not put in place redirects – so I mapped these from an old XML sitemap that was cached in WayBack and recovered a lot of lost backlink equity pointing to the site.
As it was deemed a quiet period, the new content and URL structures were pushed live in the two weeks covering Christmas and New Year, and we were rewarded almost instantaneously. I also think we benefitted from the algorithm update on December 15th 2016 (as the search landscape had changed within that vertical), as well as the subsequent update on January 24th 2017.
How you can achieve these results yourself
For a lot of independent travel companies, the cost of employing an external SEO vendor can be a significant monthly outgoing, and in a lot of niches and verticals you can do the basics of good SEO and achieve results yourself.
Technical SEO remains the same
The best practices for technical SEO remain the same. Your website needs to have:
Good information architecture and URL structures, with reasonable click-depths
Internal linking that makes sense for users, so that associated content can be easily discovered
A structure that acknowledges the “reasonable surfer” model, in which users click about and follow non-linear paths, so don’t try to funnel them
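Click depth from the first bullet can be measured with a simple breadth-first search over the internal link graph. A minimal sketch, using a made-up site structure:

```python
from collections import deque

# Hypothetical internal link graph: each page maps to the pages it links to.
links = {
    "/": ["/safaris/", "/blog/"],
    "/safaris/": ["/safaris/tanzania/"],
    "/blog/": ["/blog/okavango-guide/"],
    "/safaris/tanzania/": [],
    "/blog/okavango-guide/": [],
}

def click_depths(links, start="/"):
    """Breadth-first search from the homepage, recording the minimum
    number of clicks needed to reach each page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths(links))
```

Pages sitting at a depth of four or more clicks from the homepage are the ones worth reviewing first.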
If you’re just starting off in SEO, I’d recommend that you read this blog post here on the basics and fundamentals of what SEO is.
Content marketing and adding value
User value is one of the keystones of great content. There is no disputing this; content is no longer about 500 words on a page with X links and Y mentions of keyword Z.
Companies that blog generate 67% more leads per month than those that don’t.
It’s also no longer about focusing only on the sell. You need to provide a lot of value and supporting content to back up the main content (the commercial content), creating “clusters” and areas of the site that hold authority.
For example, if you’re optimising for safari holidays in Tanzania, your core keyword base will probably look something like this:
Keyword (UK search volume):
Tanzania safari holidays
Tanzania safari tours
Tanzania tours and safaris
Tailor made safaris Tanzania
Which is great, but users need to know, and want to know, more. This is your chance to establish yourself as an expert in the field (and saying “we are experts in Tanzanian safari holidays” blah blah blah isn’t enough). Also, don’t get too focused on search volume: it is a PPC metric, representing the average number of monthly searches that contained a paid ad, not searches in total.
If I was writing this resource for Tanzania, I would include:
Where to go in Tanzania
Serengeti National Park
Selous Game Reserve
Katavi National Park/Lake Victoria
Ruaha National Park
Lake Manyara National Park
And then with these, go even further and expand into:
What can I do there?
What animals will I see?
When’s the best time of year to go?
Is it suitable for families?
Should I do this in conjunction with something else?
Is there a certain order I should do these in?
Answer these questions, also known as interrogative searches, and you can create a really powerful resource – and subtly introduce commercial CTAs. You can discover these through free tools such as Answer The Public.
DIY keyword research for safari tour operators
Or through tools such as Serpstat, which has a very reasonable starter package priced at $19 a month, as well as a free account limited to 30 queries per day; if you’re focused in your research, you should be fine with this limit.
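The interrogative searches mentioned above can also be roughed out without a tool. A quick sketch, assuming a handful of common question stems (this is a simplification of what tools like Answer The Public do, not a replacement for them):

```python
# Common question stems; a real tool uses a much larger set,
# plus prepositions and comparisons.
STEMS = ["what", "where", "when", "why", "how", "is", "can"]

def question_seeds(keyword):
    """Generate interrogative search permutations around a seed keyword."""
    return [f"{stem} {keyword}" for stem in STEMS]

for q in question_seeds("tanzania safari"):
    print(q)
```

Each stub then needs human judgement: “when tanzania safari” becomes “when is the best time of year for a Tanzania safari?”, and so on.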
It’s also important to look at the search results and see what kind of results Google is bringing back for certain terms, and the quality of those results, as these are the benchmarks you need to match.
Get in touch
If you’d like to talk about improving the organic search performance of your safari holiday agency, or your travel company in general, please get in touch.
Having worked in SEO for a number of years, and having been able to work with some of the biggest brands from across the globe, I’ve experienced a few things. I’ve also read a lot of content: some good, and some that… left me speechless. There is a lot of noise in the SEO community, and unfortunately being well publicised does not always correlate with expertise.
However, if I was starting SEO for the first time tomorrow – this, in my opinion, is the stuff I’d want to know straight off the bat. In this guide/article/very long blog post, I’m not intending to reinvent the wheel, I’m going to link out to a lot of content and stuff other people have done, because it’s great and me rewriting it or reinventing it would be pointless.
In this article, I’m going to cover:
What is SEO?
How do we make websites “Google friendly”?
How does Google work?
What are spiders/crawlers?
What are Google’s algorithms?
What is an XML sitemap?
What are meta titles and meta descriptions?
What is black hat SEO?
Basics of local SEO
How a local SERP differs from a normal SERP
Google’s local algorithms
Before we get started, I’d strongly recommend that you sign up to this email newsletter from DeepCrawl (a software provider). Almost every week Google’s John Mueller hosts a “webmasters hangout”, and as sure as the sun sets the team at DeepCrawl summarise it and email it out!
What is SEO?
search engine optimization (noun, COMPUTING): the process of maximizing the number of visitors to a particular website by ensuring that the site appears high on the list of results returned by a search engine. “The key to getting more traffic lies in integrating content with search engine optimization and social media marketing.”
As well as generating traffic and leads, SEO in a practical sense is also about working with other stakeholders, such as developers and other non-marketing people within a business, to instil “SEO best practice” across all activities.
An example of this would be working with the developers and the PPC team to remind them, and double check, that they are preventing Google from indexing PPC landing pages (through the use of a page level robots tag) to prevent Google from having two conflicting landing pages for the same group of keywords.
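As a sketch of the check described above, here is how a page-level robots tag can be detected with Python’s standard html.parser; the PPC landing page HTML below is hypothetical:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots.append(attrs.get("content", "").lower())

def is_noindexed(html):
    """True if the page carries a page-level robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in content for content in parser.robots)

# Hypothetical PPC landing page that should be kept out of the index:
ppc_page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(is_noindexed(ppc_page))
```

Run against a list of PPC landing page URLs, a check like this catches pages the developers forgot to tag.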
In my time I’ve also found myself working with companies on ensuring they’re tracking correctly through Google Analytics, and some of this scope even stretches to making sure they’re tracking other channels, such as email and social, correctly. It may be technically out of scope, but making sure all channels are measured correctly is ultimately in your favour.
How do we make websites “Google friendly”?
When I first started out in SEO properly I bought a book, The Art of SEO, which, along with a lot of other sources in the industry, classifies SEO into three segments: onpage, offsite and technical. However, given changes in Google’s algorithms and SERP (search engine result page) features, for me modern SEO is made up of four segments:
Technical:
Ensuring that search engines can effectively crawl, process, and index web pages across a website
Making sure that the website responds and renders correctly across all devices and browsers
Ensuring that the website provides a strong, technically excellent foundation for on-page, off-site, and user experience efforts
Offsite and brand:
Real businesses do marketing; they are active, not just for PR and backlinks
Having a business presence (as well as backlinks and citations) on industry-relevant websites
Having members of the business, and those associated with it, active within the industry and business community, establishing themselves as industry leaders
Content:
How well content satisfies a user query (main content), and how it then goes on to either link to or provide additional value around the topic (supporting content)
The structure of content ontologies: nesting appropriate subfolders and categories
Not spreading content thin and producing multiple URLs with minor content differences
User experience:
Site speed, and how quickly content loads for users on desktop, mobile and tablet
The mobile usability of the website, such as its responsiveness
How content is presented (above the fold, clearly visible on load)
How does Google work?
Honestly, there are many ways to answer this question, but for me one of the best starting places is the below YouTube video from SMX West 2016, in which Paul Haahr (Google Software Engineer) gives great insight into how Google determines its rankings and approaches algorithm changes.
What are spiders/crawlers?
A search engine spider does the search engine’s grunt work: It scans Web pages and creates indexes of keywords.
Once a spider has visited, scanned and categorized a page, it follows links from that page to other sites. The spider will continue to crawl from one site to the next, which means the search engine’s index becomes more comprehensive and robust.
What are Google’s algorithms?
Google’s algorithms, some more famous than others, have shaped the modern Google and much of the SEO industry as we know it. Rather than reinvent the wheel, Search Engine Journal and a number of authors put together this great resource covering pretty much all of Google’s algorithms. Specifically I recommend reading the below articles:
XML sitemaps can serve two functions: 1) to provide a list of URLs to Google, Bing and the other search engines that you want indexed, and 2) to implement Hreflang (we’ll come across this when we look at international SEO).
XML sitemaps are also great when combined with Google Search Console, as you can review the URLs you want indexed by the search engine and using reports determine any issues.
Unfortunately, XML sitemaps also come with an associated myth: that you need to “submit your website to Google”. Uploading a sitemap is not submitting your site to Google; it is making use of a feature to better guide and advise search engine crawlers, as well as a way to get useful data and insights into your site’s performance.
You’ll typically find an XML sitemap by typing in domain.com/sitemap.xml, and this Yoast guide will show you how to submit one to Google Search Console.
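A minimal XML sitemap covering the first of the two functions above (a plain list of URLs) can be generated in a few lines. A sketch using Python’s standard library, with example.com as a placeholder domain:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (URL list only; hreflang entries
    would need additional xhtml:link alternate elements)."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about/"]))
```

Most CMSs and SEO plugins generate this for you; the point is that the format itself is very simple.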
What are meta titles and meta descriptions?
Meta Titles / Title Tags
Meta titles, or title tags, are a very potent and undervalued weapon in an SEO’s arsenal. When optimised, they form an important ranking factor and important piece of correct information architecture.
There is no hard and fast rule as to “best practice” when producing meta titles, but as a rule of thumb:
Don’t make them longer than 60 characters, including spaces, special characters and brand. Google technically has a limit on pixel width, so a W is wider than an I. To check this you can use a tool like this, or a =LEN formula in Excel/Google Sheets.
Put the user first rather than trying to keyword stuff. I always try to explain the page first, then think about the terms users use when searching, and the final thought is on including brand (especially if the page is a core commercial/money page).
Don’t repeat title tags across multiple pages. This is sometimes ok, if the page is a paginated page of the blog (i.e. blog page 2), or on taxonomy pages such as category and tag.
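The 60-character rule of thumb above can be checked with a simple character count as a first pass (a true check would measure pixel width, which needs a font-aware tool):

```python
def title_too_long(title, limit=60):
    """First-pass check against the ~60 character rule of thumb.
    The count includes spaces, special characters and brand; Google's
    real limit is pixel width, so a W costs more than an I."""
    return len(title) > limit

# Hypothetical title tag for a commercial page:
print(title_too_long("Safari Holidays in Tanzania | Example Brand"))
```

This is the programmatic equivalent of the =LEN formula mentioned above, handy when auditing thousands of titles from a crawl export.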
Meta descriptions aren’t a ranking factor, and Google can (and will) overwrite them if it feels content on your page better meets the intent and needs of a searcher, but this doesn’t mean you should discount them.
Using Google Search Console you can look to see pages “not performing” as well as they could be in terms of clicks from the SERPs, as well as see the majority of keywords that the pages (URLs) are appearing for.
Good meta descriptions should focus on explaining to the user why the page is relevant to their query; this isn’t just about matching search phrases and keywords, but also the intent behind them.
For me, keyword strategy is still one of the more misunderstood areas of SEO, and this has been made worse by a lot of the larger industry tools that try to be all things to all people.
When selecting keywords for tracking or optimisation, there are three things (in my opinion) you should take into consideration.
What’s currently ranking for that keyword?
Leave the tools and perform a search for that specific keyword and see what kind of results Google is showing:
Are they commercial/e-commerce results?
Is it showing Google shopping ads?
Is it showing other SERP features such as featured snippets, knowledge graph panels, or other items?
Is the query returning “branded” results?
Is it showing blogs/guides and other information resources?
From here, you can determine if the query is right for your site, business model, and ultimately worthwhile optimising for – if you’re not selling online and Google is pretty much only returning sites selling online, you’re going to struggle.
What’s the intent behind the keyword?
This relates to the checks you have already done by looking at the search results themselves: defining the intent behind a keyword.
Again, rather than reinventing the wheel – here is a guide to user intent I wrote for Search Engine Journal:
A lot of people fall into the trap of using search volume as a “be all and end all” metric when choosing which keywords to track and optimise for.
Search volume is a paid search metric that relates to PPC. Average monthly search volume actually means “the average number of monthly searches in which a paid advert appears”. We used to get the data from Google Keyword Planner (a PPC tool), and subsequently this PPC metric has made its way into a lot of “SEO” tools.
It’s great in helping identify the “big keywords”, which to be fair should be obvious if you know your industry – but tools like Google Search Console, Bing Webmaster Tools and Yandex Metrica all have search analytics reports that detail the search phrases you’re appearing for.
This, combined with the bigger tools and competitor research, will enable you to deliver some great keyword research and identify the phrases that really will grow your client’s organic presence (I’ll cover keyword research in detail in a future post).
What is black hat SEO?
Black hat SEO refers to a set of practices used to increase a site’s or page’s rank in search engines through means that violate the search engines’ terms of service. The term “black hat” originated in Western movies to distinguish the “bad guys” from the “good guys”, who wore white hats.
From experience, SEO is not as straightforward as black and white; it’s ultimately about delivering results, and sometimes that does mean stepping into grey areas. Recognised black hat tactics include:
Hidden Text or Links
Unnatural Keyword Stuffing
Guest Posting Networks
Link Manipulation (including buying links)
Link Farms, Link Wheels or Link Networks
Rich Snippet Markup Spam
Automated Queries to Google
The difference between grey and black, however, is that grey hat SEO is often done in response to competitor analysis and matching what Google is currently ranking for certain queries (a sort of “can’t beat them, join them” mentality).
Black hats, however, set out to bring about results “quickly”, often without longevity, and this can often lead to a Google penalty.
For me SEO is about delivering long term benefits to clients, and working with them for a number of years.
Also, from experience, as clients get bigger (and verticals more competitive), black hat techniques have little to no impact and are more reserved for smaller niches.
There are instances where tactics such as doorway pages (if implemented correctly) can work, but in my opinion you should learn about black hat techniques so you can avoid them and be a better SEO.
After a few months of coming up with an idea, the first phase of the idea has become a reality, hreflangchecker.com.
Why do we need another Hreflang checker?
Honestly… We don’t, there are some great ones out there, including this tool by Merkle, and the Dejan SEO Flang tool (I feel as though this tool has been around forever).
However, this Hreflang checking tool is just one phase of what will be a wider international SEO and Hreflang project, including a brand new way to generate Hreflang tags on a page.
HreflangChecker.com is part of a wider project we’ve dubbed Project Sloth, which looks at improving Hreflang and making it easier and more accessible for developers (and businesses) to implement.
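As an illustration of what generating Hreflang tags involves, here is a hedged sketch. The language codes and URLs are invented, and a real implementation needs the same reciprocal set repeated on every alternate page:

```python
# Hypothetical alternates for one page: language-region code -> URL.
# Every version should list all alternates, including itself, plus an
# x-default fallback, and each alternate page needs the identical set.
alternates = {
    "en-gb": "https://example.com/uk/",
    "fr-fr": "https://example.com/fr/",
    "x-default": "https://example.com/",
}

def hreflang_tags(alternates):
    """Render the <link rel="alternate" hreflang="..."> tags for a page."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags(alternates))
```

The hard part in practice is not generating the tags but keeping the reciprocal sets consistent across thousands of URLs, which is exactly what checkers exist to verify.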
SEO is an extremely active community, which is one of the best things about it, but because, in theory, anyone with WiFi can offer SEO, we tend to see some “amazing” advice being offered.
I’ve reached out to the SEO community via Twitter, asking my fellow SEO experts to share some of the best advice they’ve seen from previous agencies and consultants on accounts they’ve inherited. I’ve also scoured the internet and forums looking for other hidden gems, and I found some… For example:
The developers blocked Googlebot by IP address, because of suspicious crawling activity…
A few of them we will all likely have come across throughout our careers; others, however, are so… unbelievable you have to double-take.
Also, a quick thank you to everyone who responded and helped make this article possible.
So let’s begin – here are some of the best examples of bad SEO advice and horror stories:
So, on advice, a business spun up ~4,000 EMD domains in order to rank for as many niche queries as possible, all containing duplicate content. Believe it or not, they then got a manual penalty notice for pure spam… The agency said it’s likely not related.
Had a client who’d been advised “For SEO, you need more content.” Not strictly bad advice, but they’d been on a content treadmill for months/years… but with zero focus and no internal linking. Nothing.
Sorted just the internal linking issues, and impressions/visibility went up 300%…
HTTPS & SSL don’t matter
This horror advice was covered extremely well by Troy Hunt on his site, but it’s that bad, and with cyber security and data security being so prevalent at the moment, it has to be included…
If you don’t have sensitive information on your site, you’re not selling a product or a service, there is no checkout page, you don’t need a certificate. It doesn’t help, you know, increase security.
And from a different SEO expert in the same article by Troy…
Encrypting all pages on your website will only slow them down.
Talking about bounce rate: I’ve also sat through a talk at MeasureCamp Manchester where a “senior SEO” tried to argue that bounce rate was a critical ranking factor. In the same talk he also discussed limiting clients’ access to their own Google Analytics and which metrics you show them. Oh dear.
The client’s site was professional services, which if anything made this suggestion even more shocking, as the content was essentially their service list. It didn’t perform too badly in Google, but there were some speed issues which were a concern.
The ‘SEO agency’ told them that removing the content on their service pages so it was just 50 words max would speed the site up and promote UX. For content, the site had around 7/8 service pages each with unique, high-quality content on them of around 800-1000 words.
Needless to say they completely vanished from the rankings, still had the site speed issues and experienced a disastrous drop in traffic.
Saw this in an audit a client was given by a previous SEO agency:
301 redirects – Each time a page redirects to another page, there is potential that the site visitor will see a slight delay for the page to render. It is recommended to update old URLs to the correct URL to reduce page delays. Low priority item to fix.
I was once in an SEO role at a very large organisation and discovered the terms of service (TOS) on the website did not allow people to link to content. This company would actively send cease and desist letters when they got new links.
Unfortunately, websites trying to control how other people link to them through their terms of service isn’t something new, or uncommon; the below is from the TOS of a large, well-known, UK-based consumer product advisory:
If we ask you to remove or change a link to our websites, you do so as soon as possible.
Unless you obtain our express permission, you must not include more than 10 links to our websites on any one of your web pages.
If you want to read these for yourself, the TOS can be found here.
I once found that a business was paying for SEO services from a well-known SEO agency; on one side of the room one team was producing backlinks for the client (and charging for it), while on the other side of the room another team was working on disavowing those links (again, charging the client for it)… A new meaning to “full service agency”!
I have seen Google Search Console disavow files including sites I operate in other markets… Yes, disavowing their own websites because they use a different ccTLD and apparently Google distrusts foreign sites linking to you…
Externally link to Wikipedia to improve local rankings
Had a potential client come to me saying that their website had already been “SEO optimised”, and all I needed to do was let Google know about their website: five to ten minutes’ worth of work in their eyes. They had already paid the development agency for a super-special, and expensive, SEO package.
“Knows some SEO stuff”
Jeremy Rivera (Director of SEO and Content Marketing for Raven Tools and TapClicks)
Circa 2006. Guy said that he’d received a recommendation from his relative who “knew some SEO stuff”.
His site lost all rankings. It turns out he had added “Yorba Linda Real Estate” 1,000 times in white text on the white background of every page of his site, and in the meta keywords tags.
The former agency had placed a fully transparent (invisible) PNG in the footer of their customer’s website and used it as a backlink, with the alt text set to the name of the agency plus the desired keyword… Just wow.
I’ve also experienced this, where the “development and SEO agency” gave the client a custom WordPress template with a hardcoded footer link back to their homepage… But they misspelt their own homepage URL, so it 404’d. I asked them to update it for the client, and they came back with an invoice to do the dev work…
I once did some work for a local locksmith whose development agency had hardcoded their agency backlink in the footer… The only issue is they misspelled their own agency in the href link, so it was a permanent footer external 404. Then when I raised it to fix it, they wanted to charge the client the dev time!
Took on a client that’d had 3-4 site redesigns over the years (each time with URL changes), & despite having worked with 2 SEO agencies in the past, none of the old URLs from any old versions of the sites had been redirected – they were all still 404ing.
I also once had a dev let a site update go live without consulting the SEO team, so all site arch., schema, metas, and the whole nine yards were wiped out. That was a NIGHTMARE 😱
I got threatened with being sued, a week after a client signed a contract, because their Google traffic went to zero. I hadn’t actually started, as they had asked me to wait a week until their site redesign was finished.
That redesign included adding Disallow: / to robots.txt…
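For anyone wondering why that single line is so catastrophic, Python’s standard urllib.robotparser demonstrates the effect:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from the story above: a blanket Disallow that
# blocks every compliant crawler from the entire site.
parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow: /"])

print(parser.can_fetch("Googlebot", "https://example.com/"))
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))
```

Both checks come back negative: no compliant crawler may fetch any URL on the site, which is why organic traffic collapses soon after such a file goes live.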
Twice, international publicly traded companies pushed new sites live without telling us they were working on a new site, just pushing test to production, changing all URLs with no redirects and leaving it all noindexed. Woot.
During a platform migration, a site was launched with no sitemap (or the capability to set a dynamic sitemap), database tables containing meta values weren’t carried over to the new platform, whole sections of the site were removed, and 404s were left as a secondary priority…
No canonical tags were set, the CMS didn’t even have capabilities for multi language deployment, hreflang tags were absent, a whole section of the website containing main pages didn’t even communicate with the CMS… there’s too much more to write…
I worked with a French travel website, which had content in French, German, English and Dutch, and was desperate to rank within the UK for key commercial terms but wasn’t visible at all. The specialist SEO agency they had been working with had been producing content and building links, great, but with no result.
So, we had them implement Hreflang at the end of 2015, and you can see what happened…
Forget that redirect map, we will just automate redirects to internal search (post migration); this way, no matter what the user searches, they will land on the right page… But they’re all blocked by robots.txt.
Unfortunately, I’ve come across a number of SEO and development teams who don’t wholly understand how important redirects are (and how important it is to use the correct redirect codes).
I once came across a dev who used Excel to put together redirects (wait for it), and then used the drag feature in Excel, so the 301 turned to 302, 303, 304, etc. for the redirects in the .htaccess file. It took down the site.
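A trivial sanity check would have caught that drag-fill error: scan the redirect map for anything that isn’t a 301 before it goes anywhere near .htaccess. A sketch with made-up rules:

```python
# Hypothetical redirect map, as it might look after Excel's drag-fill
# has incremented the status code column (the bug from the story above).
redirects = [
    ("/old-page-1", "/new-page-1", 301),
    ("/old-page-2", "/new-page-2", 302),
    ("/old-page-3", "/new-page-3", 303),
]

def bad_redirects(redirects):
    """Return every rule that isn't a permanent (301) redirect."""
    return [(src, dst, code) for src, dst, code in redirects if code != 301]

for src, dst, code in bad_redirects(redirects):
    print(f"{src} -> {dst} uses {code}, expected 301")
```

Thirty seconds of validation against an outage: for permanent URL migrations, every rule should carry the same 301 status code.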
I’ve been trying to get an in-house developer on a project I’m working on to remove a dev site, which has been hacked with malware and has tons of spam content, from the index for 3 weeks now. The entire site is indexed. Fun.