The Basics Of SEO: What Is SEO?

Having worked in SEO for a number of years, and having been able to work with some of the biggest brands from across the globe, I’ve experienced a few things. I’ve also read a lot of content, some good and some… well, it left me speechless. There is a lot of noise in the SEO community, and unfortunately being well publicised does not always correlate with expertise.

However, if I were starting SEO for the first time tomorrow, this, in my opinion, is the stuff I’d want to know straight off the bat. In this guide/article/very long blog post I’m not intending to reinvent the wheel; I’m going to link out to a lot of content other people have produced, because it’s great, and rewriting or reinventing it would be pointless.

Contents

In this article, I’m going to cover:

  • What is SEO?
  • How do we make websites “Google friendly”?
  • How does Google work?
  • What are spiders/crawlers?
  • What are Google’s algorithms?
  • What is an XML sitemap?
  • What are meta titles and meta descriptions?
  • Understanding keywords
  • What is black hat SEO?
  • Basics of local SEO
  • How a local SERP differs from a normal SERP
  • Google’s local algorithms

Before we get started, I’d strongly recommend that you sign up to the email newsletter from DeepCrawl (a software provider). Almost every week Google’s John Mueller hosts a “Webmaster Hangout”, and as sure as the sun sets, the team at DeepCrawl summarise it and email it out!

What is SEO?

search engine optimization
noun COMPUTING
noun: search engine optimization;
plural noun: search engine optimizations
the process of maximizing the number of visitors to a particular website by ensuring that the site appears high on the list of results returned by a search engine.
"the key to getting more traffic lies in integrating content with search engine optimization and social media marketing"

As well as generating traffic and leads, SEO in a practical sense is also about working with other stakeholders, such as developers and other non-marketing people within a business, to instil “SEO best practice” across all activities.

An example of this would be working with the developers and the PPC team to remind them, and double check, that they are preventing Google from indexing PPC landing pages (through the use of a page-level robots tag), so that Google doesn’t end up with two conflicting landing pages for the same group of keywords.
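As a concrete (and purely hypothetical) illustration of that page-level robots tag, something like the following would sit in the <head> of each PPC-only landing page; the exact directives depend on the individual setup, so treat this as a sketch rather than a drop-in fix:

```html
<!-- Keeps this PPC landing page out of Google's organic index,
     so it can't compete with the equivalent organic landing page -->
<meta name="robots" content="noindex">
```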

In my time I’ve also found myself working with companies on making sure they’re tracking correctly through Google Analytics, and some of this scope even stretches to making sure they’re tracking other channels such as email and social correctly – it may technically be out of scope, but making sure all channels are measured correctly is ultimately in your favour.

How do we make websites “Google friendly”?

When I first started out in SEO properly I bought a book, The Art of SEO, which, along with a lot of other sources in the industry, classifies SEO into three segments: on-page, off-site and technical. However, given changes in Google’s algorithms and SERP (search engine results page) features, modern SEO for me is made up of four segments:

The SEO Tetrahedron. Devised by Dan Taylor, 2018.

Technical SEO

  • Ensuring that search engines can effectively crawl, process, and index web-pages across a website
  • Making sure that the website responds and renders correctly across all devices and browsers
  • Ensuring that the website provides a strong, technically excellent foundation for on-page, off-site, and user experience efforts

Off-site SEO

  • Real businesses do marketing: they are genuinely active, not just chasing PR and backlinks
  • Having a business presence (as well as backlinks and citations) on industry-relevant websites
  • Having people from, and associated with, the business active within the industry and business community, establishing themselves as industry leaders

On-page SEO

  • Content: how well it satisfies a user’s query (main content) and then goes on to either link to, or provide additional value around, the topic (supporting content)
  • The structure of that content: nesting appropriate subfolders and categories to create content ontologies
  • Not spreading content thin by producing multiple URLs with only minor content differences

User Experience

  • Site speed, and how quickly content loads for users on desktop, mobile and tablet
  • The mobile usability of the website, such as its responsiveness
  • How content is presented (above the fold, clearly visible on load)

How does Google work?

Honestly, there are many ways to answer this question, but for me one of the best starting places is to watch the YouTube video below from SMX West 2016, in which Paul Haahr (Google Software Engineer) gives great insight into how Google determines its rankings and algorithm changes.

Google pros Gary Illyes (Webmaster Trends Analyst) and Paul Haahr (Software Engineer) give SMX West attendees an inside view of how Google determines its rankings and algorithm changes.

What are spiders/crawlers?

A search engine spider does the search engine’s grunt work: It scans Web pages and creates indexes of keywords.

Once a spider has visited, scanned and categorized a page, it follows links from that page to other sites. The spider will continue to crawl from one site to the next, which means the search engine’s index becomes more comprehensive and robust.
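In concrete terms, the “links from that page” are simply the URLs in the page’s HTML anchors; a spider indexes the page’s content and then queues up links like these (the URLs here are made up) to crawl next:

```html
<!-- A crawler parses the page, indexes its text, then follows the href URLs it finds -->
<a href="https://example.com/category/widgets/">Widgets</a>
<a href="https://example.com/blog/how-to-choose-a-widget/">How to choose a widget</a>
```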

What are Google’s algorithms?

Google’s algorithms, some more famous than others, have shaped the modern Google and much of the SEO industry as we know it. Rather than reinvent the wheel, I’ll point to the great resource that Search Engine Journal and a number of authors put together covering pretty much all of Google’s algorithms; I’d specifically recommend reading the individual update articles it links to.

Some SEO Basics

What are XML sitemaps?

XML sitemaps can serve two functions: 1) to provide Google, Bing and the other search engines with a list of the URLs you want indexed, and 2) to implement hreflang (we’ll come across this when we look at international SEO).
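As a minimal sketch of both functions (the domain, URLs and dates here are purely illustrative), a single-URL XML sitemap with hreflang annotations looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <!-- Function 1: a URL you want the search engines to index -->
    <loc>https://example.com/services/</loc>
    <lastmod>2019-01-15</lastmod>
    <!-- Function 2: hreflang annotations pointing to language/region alternates -->
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://example.com/services/"/>
    <xhtml:link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/services/"/>
  </url>
</urlset>
```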

XML sitemaps are also great when combined with Google Search Console, as you can review the URLs you want indexed by the search engine and use its reports to identify any issues.

Google Search Console: Coverage Report

Unfortunately, XML sitemaps also come with an associated myth: that you need to “submit your website to Google”. Uploading a sitemap is not submitting your site to Google; it is making use of a feature to better guide and advise search engine crawlers, as well as a way to get useful data and insights into your site’s performance.

You’ll typically find an XML sitemap by typing in domain.com/sitemap.xml, and this Yoast guide will show you how to submit it to Google Search Console.

What are meta titles and meta descriptions?

Meta Titles / Title Tags

Meta titles, or title tags, are a very potent and often undervalued weapon in an SEO’s arsenal. When optimised, they are an important ranking factor and a key part of correct information architecture.

There is no hard and fast rule as to “best practice” when producing meta titles, but as a rule of thumb:

  • Don’t make them longer than 60 characters, including spaces, special characters and brand. Google actually truncates titles by pixel width rather than character count, so a W takes up more space than an I. To check this you can use a pixel-width checker tool, or a =LEN formula in Excel/Google Sheets (see the example after this list).
  • Put the user first rather than keyword stuffing: I always try to explain the page first, then think about the terms users actually search with, and only then think about including the brand (especially if the page is a core commercial/money page).
  • Don’t repeat title tags across multiple pages. This is sometimes OK if the page is a paginated page of the blog (i.e. blog page 2), or a taxonomy page such as a category or tag.
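To make those rules of thumb concrete, here’s a hypothetical title tag (the product and brand are made up) that describes the page first and puts the brand last:

```html
<!-- 50 characters including spaces, so comfortably within the ~60 character rule of thumb -->
<title>Men's Running Shoes | Free UK Delivery | BrandName</title>
```

If you drop that into Google Sheets, =LEN("Men's Running Shoes | Free UK Delivery | BrandName") returns 50; a pixel-width checker would then confirm it also fits within Google’s display limit.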

Meta Descriptions

Meta descriptions aren’t a ranking factor, and Google can (and will) overwrite them if it feels other content on your page better meets the intent and needs of a searcher, but this doesn’t mean you should discount them.

Using Google Search Console you can identify pages that aren’t performing as well as they could be in terms of clicks from the SERPs, as well as see the majority of keywords those pages (URLs) are appearing for.

Google Search Console: Search analytics report

Good meta descriptions should focus on explaining to the user why the page is relevant to their query; this isn’t just about matching search phrases and keywords, but also the intent behind them.
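As an illustration (the copy and page are hypothetical), a meta description written for the intent behind the query rather than just the keywords might look like this:

```html
<!-- Tells the searcher why this page answers their query, not just which keywords it contains -->
<meta name="description" content="Compare men's running shoes by fit, terrain and budget, with sizing advice, free UK delivery and easy returns.">
```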

Further Reading

Honourable mention, Meta Keywords

Google has ignored the meta keywords tag for years, so don’t spend time on it. Enough said. (However, meta keywords are still useful in Yandex and Baidu.)
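For completeness, this is the tag in question (the keywords here are made up); Google ignores it, but it may still carry some weight in Yandex and Baidu:

```html
<!-- The meta keywords tag: ignored by Google, still read by some other search engines -->
<meta name="keywords" content="running shoes, trainers, men's running shoes">
```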

Understanding Keywords

For me, keyword strategy is still one of the more misunderstood areas of SEO, and this has been made worse by a lot of the larger industry tools that try to be all things to all people.

When selecting keywords for tracking or optimisation, there are three things (in my opinion) you should take into consideration.

What’s currently ranking for that keyword?

Leave the tools and perform a search for that specific keyword and see what kind of results Google is showing:

  • Are they commercial/e-commerce results?
  • Is it showing Google shopping ads?
  • Is it showing other SERP features such as featured snippets, knowledge graph panels, or other items?
  • Is the query returning “branded” results?
  • Is it showing blogs/guides and other information resources?

From here, you can determine whether the query is right for your site and business model, and ultimately whether it’s worthwhile optimising for – if you’re not selling online and Google is pretty much only returning sites that sell online, you’re going to struggle.

What’s the intent behind the keyword?

This builds on the checks you’ve already done by looking at the search results themselves: defining the intent behind a keyword.

Again, rather than reinventing the wheel, here is a guide to user intent I wrote for Search Engine Journal.

Search volume is almost meaningless

A lot of people fall into the trap of using search volume as a “be all and end all” metric when choosing which keywords to track and optimise for.

Search volume is a paid search metric, and relates to PPC. Average monthly search volume actually means “the average number of monthly searches in which a paid advert appears”. We used to get this data from Google Keyword Planner (a PPC tool), and subsequently this PPC metric has made its way into a lot of “SEO” tools.

It’s great for helping identify the “big keywords”, which to be fair should be obvious if you know your industry – but tools like Google Search Console, Bing Webmaster Tools and Yandex Metrica all have search analytics reports that detail the search phrases you’re appearing for.

This, combined with the bigger tools and competitor research, will enable you to deliver some great keyword research and identify the phrases that really will grow your client’s organic presence (I’ll cover keyword research in detail in a future post).

What is black hat SEO?

Black hat SEO refers to a set of practices that are used to increase a site or page’s rank in search engines through means that violate the search engines’ terms of service. The term “black hat” originated in Western movies to distinguish the “bad guys” from the “good guys,” who wore white hats.

Source: WordStream

From experience, SEO is not as straightforward as black and white; it’s ultimately about delivering results – and sometimes that does mean stepping into grey areas. Recognised black hat tactics include:

  • Content Automation
  • Doorway Pages
  • Hidden Text or Links
  • Unnatural Keyword Stuffing
  • Cloaking
  • Link Schemes
  • Guest Posting Networks
  • Link Manipulation (including buying links)
  • Article Spinning
  • Link Farms, Link Wheels or Link Networks
  • Rich Snippet Markup Spam
  • Automated Queries to Google

The difference between grey and black, however, is that grey hat SEO is often done in response to competitor analysis, matching what Google is currently ranking for certain queries (a sort of “if you can’t beat them, join them” mentality).

Black hat tactics, however, set out to bring about results “quickly” and often without longevity, and can often lead to a Google penalty.

A Google penalty, or an adverse reaction to an algorithm change, can often be costly to businesses in both the short and long term.

For me SEO is about delivering long term benefits to clients, and working with them for a number of years.

Also, from experience, as clients get bigger (and verticals more competitive), black hat techniques have little to no impact and are more the preserve of smaller niches.

There are instances where tactics such as doorway pages (if implemented correctly) can work, but in my opinion you should learn about black hat techniques so you can avoid them and be a better SEO.

For further reading on black hat SEO, I recommend Padraig O’Conner’s An Introduction to Black Hat SEO, on HubSpot.

Published by

Dan Taylor

I'm Dan, an award-winning SEO consultant and technical lead based in the United Kingdom. I work with brands around the world, ranging from SaaS, fintech and retail to travel brokers, agencies and airlines.
