Introducing Hreflangchecker.com

After a few months of mulling over an idea, the first phase of that idea has become a reality: hreflangchecker.com.

Why do we need another Hreflang checker?

Honestly… We don’t – there are some great ones out there, including this tool by Merkle, and the Dejan SEO Flang tool (which I feel has been around forever).

However, this Hreflang checking tool is just one phase of what will be a wider international SEO and Hreflang project, including a brand new way to generate Hreflang tags on a page.

HreflangChecker works by auditing the entered URLs, verifying that the hreflang markup is accurate and that valid language and region codes are being used.
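For context, a minimal, valid set of hreflang annotations – the kind of markup the checker validates – might look like this (URLs illustrative):

    <link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/" />
    <link rel="alternate" hreflang="fr-fr" href="https://example.com/fr-fr/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />

The language portion must be a valid ISO 639-1 code, and the optional region a valid ISO 3166-1 Alpha-2 code – getting these wrong (e.g. "en-uk" instead of "en-gb") is one of the most common hreflang mistakes, alongside missing return tags.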

What’s next?

HreflangChecker.com is part of a wider project we’ve dubbed Project Sloth, which looks at improving Hreflang and making it easier and more accessible for developers (and businesses) to implement.

37 Tales Of Bad SEO From The Frontlines

SEO is an extremely active community, which is one of the best things about it – but because, in theory, anyone with WiFi can offer SEO, we tend to see some amazing advice being offered.

I’ve reached out to the SEO community via Twitter, asking my fellow SEO experts to share some of the best advice they’ve seen from previous agencies and consultants on accounts they’ve inherited. I’ve also scoured the internet and forums looking for other hidden gems, and I found some… For example:

The developers blocked Googlebot by IP address, because of suspicious crawling activity…

A few of them we will all likely have come across throughout our careers; others, however, are so… unbelievable that you have to double-take.

Warning: some of these SEO advice stories are shocking – you will hide behind your sofa, sleep with the light on, and never look under the bed again.

Also, a quick thank you to everyone who responded and helped make this article possible.

So let’s begin – here are some of the best examples of bad SEO advice and horror stories:

More EMDs, more rankings!

Zack Neary-Hayes (Freelance SEO Consultant)

The previous SEO advisors had set up a series of one-page websites on exact match domains, with duplicate content, so that the content was discoverable in as many locations as possible.

David Iwanow (Global SEO Manager @ Schibsted Media Group)

Saw a business that, on agency advice, spun up ~4,000 EMDs in order to rank for as many niche queries as possible, all containing duplicate content. Believe it or not, they then got a manual penalty notice for pure spam… The agency said it’s likely not related.

Manual penalty for spam? Definitely not related to the 4,000 duplicate content EMD domains we told you to put up.

More traffic? Let’s rank for porn!

Jacque Urick (Director of SEO @ Sears Home Services)

This was circa 2007. The third-party SEO firm had changed ALL the image names and image alt text on the website to sex crime and illegal porn related terms…

The site was getting lots of traffic, as the third-party firm had promised – it was all image search traffic…

Results = Lots of content + No internal links

Andrew Cock-Starkey (aka Optimisey)

Had a client who’d been advised: “For SEO, you need more content.” Not strictly bad advice, but they’d been on a content treadmill for months/years with zero focus and no internal linking. Nothing.

Sorted just the internal linking issues, and impressions/visibility went up 300%…

HTTPS & SSL don’t matter

This horror advice was covered extremely well by Troy Hunt on his site, but it’s so bad – and cyber security and data security are so prevalent at the moment – that it has to be included…

If you don’t have sensitive information on your site, you’re not selling a product or a service, there is no checkout page, you don’t need a certificate. It doesn’t help, you know, increase security.

And from a different SEO expert in the same article by Troy…

Encrypting all pages on your website will only slow them down.

The more you spend on PPC, the better your SEO!

Zack Neary-Hayes (Freelance SEO Consultant)

Had one that was told the site would never rank organically, so was recommended some whack £10k-per-month PPC campaign and a landing page creation service.

They did that, had a crap return, and then we worked on the site and it was such a straightforward campaign to improve organic search performance…

Luke McCarthy (Digital Product Lead @ Mayflex)

The more you spend on PPC with Google, the better your site ranks organically in Google. Classic.

Quick, fix your bounce rate!

Gerry White (SEO Consultant @ JustEat)

The business was advised by their SEO agency that bounce rate was definitely a ranking factor, and they had to lower it – so they installed the Google Analytics code twice on the website to do so…

Dan Taylor

Talking about bounce rate, I’ve also sat in a talk at MeasureCamp Manchester where a “senior SEO” tried to argue that bounce rate was a critical ranking factor. In the same talk he also discussed limiting clients’ access to their own Google Analytics and being selective about which metrics you show them. Oh dear.

Let’s manipulate dwell time!

Zack Neary-Hayes

“Dwell time is a ranking factor.” Some SEO consultant told this poor lad working at a dental surgery to go on all the work PCs and leave the homepage open all day to crank up the session duration…

And whilst we’re talking about the wonders of Google Analytics:

Seth Steinman

One client’s previous agency was charging $225 per month to “maintain Google Analytics”.

Content = slow pages (and Google doesn’t care about content) 

Adam Reaney (Sheffield based SEO)

The client’s site was professional services, which if anything made this suggestion even more shocking, as the content was essentially their service list. It didn’t perform too badly in Google, but there were some site speed issues.

The ‘SEO agency’ told them that cutting the content on their service pages down to 50 words max would speed the site up and promote UX. For content, the site had around 7-8 service pages, each with unique, high-quality content of around 800-1,000 words.

Needless to say, they completely vanished from the rankings, still had the site speed issues, and experienced a disastrous drop in traffic.

Make sure your comment backlinks are 10 words+

Sam Palmen

A screenshot of the advice: comment backlinks should be at least 10 words long.

So… Issues here…

  • If you read this article, and watch the video on the page, on how to improve your DA – both produced by Moz, who invented the DA metric – you’ll quickly see how flawed this advice is.
  • DA is a metric that Google doesn’t take into account anyway.

A dangerous way to explain 301s

Christopher J Connor Jr. (SEO Consultant in Portland)

Saw this in an audit a client was given by a previous SEO agency:

301 redirects – Each time a page redirects to another page, there is potential that the site visitor will see a slight delay for the page to render. It is recommended to update old URLs to the correct URL to reduce page delays. Low priority item to fix.

Sorry? What? Remove all redirects?
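For the record, the sensible fix here is the opposite of removing redirects: keep the 301s in place so legacy URLs and backlinks still resolve, and update internal links so visitors and crawlers hit the destination URL directly. A minimal sketch in Apache .htaccess terms (paths illustrative):

    # Keep the permanent redirect - old bookmarks and external backlinks still resolve
    Redirect 301 /old-page/ https://www.example.com/new-page/

Then update the internal links in your templates to point straight at /new-page/, so the “slight delay” the audit worried about never applies to your own navigation.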

Don’t allow people to link to your website

Yosef Silver (Founder of Fusion Inbound)

I was once in an SEO role at a very large organisation and discovered the terms of service (TOS) on the website did not allow people to link to content. This company would actively send cease and desist letters when they got new links.

Dan Taylor

Unfortunately, websites trying to control how other people link to them through their terms of service isn’t something new, or uncommon. The below is from the TOS of a large, well-known UK-based consumer product advisory site:

If we ask you to remove or change a link to our websites, you do so as soon as possible.

and

Unless you obtain our express permission, you must not include more than 10 links to our websites on any one of your web pages.

If you want to read these for yourself, the TOS can be found here.

Full service link spam creation & disavow service

Martin Woods (SEO Consultant)

I once found that a business was paying for SEO services from a well-known SEO agency where, on one side of the room, one team was producing backlinks for the client (and charging for it), while on the other side of the room another team was working on disavowing links (again, charging the client for it)… A new meaning to “full service agency”!

On the topic of disavowing backlinks…

Harry Dance (Digital Marketing @ Eagle Online)

I was informed by an agency I was working with that

we don’t disavow bad backlinks because that highlights to Google there may be bad links on that site

It’s almost so bad it’s genius.

Google distrusts foreign websites linking to you

David Iwanow (Global SEO Manager @ Schibsted Media Group)

I have seen Google Search Console disavow files including sites I operate in other markets… Yes, disavowing their own websites because they use a different ccTLD and apparently Google distrusts foreign sites linking to you…

Externally link to Wikipedia to improve local rankings

Yosef Silver

I recently inherited a client from someone who recommended linking to Wikipedia pages for the cities they were targeting for local rankings. Outbound site-wide footer links to Wikipedia. Yup. That.

We don’t need SEO, you just need to let Google know about our site

Peter Nikolow (Mobilio Development)

Had a potential client come to me saying that their website had already been “SEO optimised”, and all I needed to do was let Google know about their website – five to ten minutes’ worth of work in their eyes. They had already paid the development agency for a super-special, and expensive, SEO package.

SEO package deals should be treated with caution at the best of times – and IMO, even more so when it means asking an agency to critique its own development work.

“Knows some SEO stuff”

Jeremy Rivera (Director of SEO and Content Marketing for Raven Tools and TapClicks)

Circa 2006. Guy said that he’d received a recommendation from his relative who “knew some SEO stuff”.

His site lost all rankings. It turns out he had added “Yorba Linda Real Estate” 1,000 times in white text on the white background of every page of his site, and in the meta keywords tags.

Linking back to the agency

Esben Rasmussen (Online Analyst @ Kamstrup)

The former agency had placed a fully transparent (invisible) PNG in the footer of their customer’s website and used it as a backlink, where the alt text was the name of the agency + desired keyword… Just wow.

Dan Taylor

I once did some work for a local locksmith whose development agency had hardcoded their agency backlink in the footer… The only issue is they misspelled their own agency in the href, so it was a permanent external 404 in the footer. Then, when I raised it to fix it, they wanted to charge the client the dev time!

We all love inheriting crap site migrations…

Steve Morgan (Freelance SEO consultant)

Took on a client that’d had 3-4 site redesigns over the years (each time with URL changes), and despite having worked with two SEO agencies in the past, none of the old URLs from any old versions of the site had been redirected – they were all still 404ing.

I also once had a dev let a site update go live without consulting the SEO team, so all site arch., schema, metas, and the whole nine yards were wiped out. That was a NIGHTMARE 😱

Motherwell

I got threatened with being sued, a week after a client signed a contract, because their Google traffic went to zero. I hadn’t actually started, as they had asked me to wait a week while their site redesign was finished.

That redesign included adding Disallow: / to robots.txt…

Twice, international publicly traded companies pushed new sites live, without telling us they were working on a new site, and just pushed test to production, changing all URLs with no redirects and leaving it all noindexed. Woot.

Martin Kelly

During a platform migration, a site was launched with no sitemap (or the capability to set a dynamic sitemap), database tables containing meta values weren’t carried over to the new platform, whole sections of the site were removed, and 404s were left as a secondary priority…

No canonical tags were set, the CMS didn’t even have capabilities for multi-language deployment, hreflang tags were absent, a whole section of the website containing main pages didn’t even communicate with the CMS… there’s too much more to write…

Do they know what Hreflang is?

Dan Taylor

I was working with a French travel website that had content in French, German, English and Dutch and was desperate to rank within the UK for key commercial terms, but wasn’t visible at all. The specialist SEO agency they had been working with had been producing content and building links – great, but with no result.

So, we had them implement Hreflang at the end of 2015, and you can see what happened…

Hreflang implemented correctly led to… performance (I can’t really say an increase, as there was nothing before it).

Where we’re going, we don’t need redirects…

Dan Smullen (SEO Consultant)

Forget that redirect map, we will just automate redirects to internal search (post-migration) – this way, no matter what the user searches, they will land on the right page… But they’re all blocked by robots.txt.

Unfortunately, I’ve come across a number of SEO and development teams who don’t wholly understand how important redirects are (and how important it is to use the correct redirect status codes).

David Iwanow

I’ve also heard of internal stakeholders blocking redirect projects in the past because they apparently wanted a clean break from the old brand…

 

Once came across a dev who used Excel to put together redirects (wait for it), and then used the drag feature in Excel, so the 301 turned into 302, 303, 304, etc. for the redirects in the .htaccess file. Took down the site.
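If you’ve never worked with an .htaccess redirect map, each rule carries its own status code, which is why Excel’s fill-drag incrementing that column is so destructive. Every line should have the code fixed at 301 (URLs illustrative):

    Redirect 301 /old-product-1/ https://www.example.com/new-product-1/
    Redirect 301 /old-product-2/ https://www.example.com/new-product-2/
    # The fill-drag produced 302, 303, 304... - and 304 (Not Modified) isn't even a redirect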

Had a new client come to us after their re-platformed e-commerce site went live with new URL structures and no redirects. Thousands of products 🙃

Google was crawling the site too quickly to index it properly

Tony N. Wright

A developer put a robots.txt no-index on an entire site (a Fortune 50 travel site) because he thought “Googlebot crawled it too fast to index it properly”.

We all ❤️ developers

Enroll Media Group

Implementing the GTM container code, using the meta description input field within the CMS… Universally… Across the whole website.

George Danny Murphy

I’ve been trying for 3 weeks to get an in-house developer on a project I’m working on to remove a dev site – which has been hacked with malware and has tons of spam content – from the index. The entire site is indexed. Fun.

To be fair, if the developer is calling me a dick because I’m asking them to fix things, I feel like I’m winning. – Barry Adams, BrightonSEO (September 28 2018), pictured in a tweet by @adoubleagent

Joseph Klok

The site was normally hosted on 4 servers, bumped up to 6 servers for peak season – but I didn’t know about the 2 additional servers.

The robots.txt on those 2 servers was Disallow: /. Pages were constantly dropping in and out of the index, and it took a couple of hours to figure out what was going on.

Tad Miller

Developers of a video game site thought it would be cool to use broken fragment URLs with the # for the 42 game character/figurine game pieces (easy to guess who).

This resulted in hundreds of rankings disappearing, and it couldn’t be fixed for a year, until the next year’s game launched.

Guilherme Kawasaki

At my old job we were republishing more than 10,000 pages to a newly optimised layout. The dev published ALL the pages with one small, simple tag:

isfamilyfriendly=”no”

… it went from 1.5 million page views to 10,000 page views in one day. Fortunately we discovered it within 2 days, and within 2 weeks the views recovered.

Want more SEO horror stories?

If that wasn’t enough, here are some other great articles written over the years with some more, truly amazing horror stories:

Featured: DeepCrawl’s Ultimate Guide To International SEO

I’ve been featured in DeepCrawl’s Ultimate Guide To International SEO, both as a quoted contributor and through previous work from my Search Engine Journal articles being included.

DeepCrawl’s Ultimate Guide To International SEO (a 50-page monolith of knowledge).

Also featured in the guide are SEO experts Aleyda Solis, Eoghan Henn, Gianluca Fiorelli, Bill Hunt, Glenn Gabe, Bastian Grimm, David Iwanow… And many more!

My quote from the DeepCrawl international SEO whitepaper

EKM SEO: How SEO Friendly Is EKM? A Technical Review

For small to medium-sized businesses, there is a lot of choice when it comes to selecting an ecommerce platform. In recent months, one in particular has caught my eye through a lot of advertising on Twitter and YouTube: EKM.

Having worked with a lot of ecommerce websites, I’ve only ever personally come across the platform once, despite its high volume of ratings and the claim that they have more than 50,000 customers.

Interestingly, EKM also have a support article titled “What Can An SEO (Search Engine Optimisation) Company Do For Me?”, and the opening line of the article reads:

If you’re using ekmPowershop the answer is usually not much, as most of the SEO work has already been done for you.

This is quite a claim for any ecommerce platform – that it’s 100% SEO friendly out of the box – especially given the issues faced by enterprise-level platforms like Salesforce Commerce Cloud, Magento, Hybris… and even WooCommerce/WordPress (which is typically one of the more SEO friendly platforms OOTB).

Interesting User Feedback

If you do some Googling, there are also some interesting forum posts written by EKM users regarding SEO, including this post (extract) from MoneySavingExpert in 2014:

Now they are sending me details of adding on their ekm healthcheck for search engine optimisation for £350 + VAT but they will knock me £100 off. Saying they can get me to the top of google and direct traffic to my site.

This is what is included:

Page Titles
Meta Keywords
Meta Description
Introduce relevant H1 tags
Consultation Call
Before / After Report
Link Building Suggestions
Knowledge transfer

So in 2014, EKM were apparently offering an SEO package to users that contained meta keywords. Wow.

Also, link building suggestions and a before/after report? How do those two work together when studies have shown it can take up to 22 weeks for the effects of link building to be felt? I’m curious to know how the before/after report can show any kind of causation from these recommendations – unless a) the site is super niche, and b) it was so under-optimised previously that changing the title tag to something relevant was revolutionary.

How I Conducted This Review

With any review, it’s important that methodology is transparent. This review has been conducted based on:

  • A review of a list of 13 websites sent to me via email by Tony Rushe, an Account Manager at EKM, as examples of live sites.
  • Conversations with EKM live chat executives and their responses to technical related questions.

This review is not a review of whether or not EKM websites rank within organic search; it’s possible for all platforms to rank. This is a review of how close to best practice these websites are, and any common issues they have.

So, without further ado…

Technical SEO Review Of 13 EKM E-commerce Stores

First, I’m going to look at the claims of SEO proficiency highlighted in EKM’s previously linked support article. These claims are that EKM benefits from the below SEO factors OOTB:

  • Search engine friendly design
  • H1 tags for product names
  • Matching page titles
  • Meta Keywords and Meta Description
  • Standard SiteMap
  • Google Sitemap (I’ve literally no idea what this would be?!)
  • XML Sitemap, using the sitemaps.org standard
  • Automatic Google Product Feed generation

So let’s start from the top.

EKM SEO Friendly Design

Top line: the websites provided as examples all had issues on mobile, with elements being too close together and elements not being readable… Rather than responding to smaller viewports, page elements just appear to get smaller.

For example, this is the homepage on desktop for Mon Michelle, an example site provided by Tony:

EKM desktop homepage example.

And on mobile you can see that the header banner just gets smaller, rather than the elements responding to the mobile screen size:

The same EKM homepage on mobile: elements resize and get smaller, rather than respond.

Given Google has now moved pretty much all sites over to the mobile-first index, issues like this can be key to maintaining performance.

At a top level, all of the sites were missing viewport <meta> tags in the <head>.
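For reference, the standard responsive viewport tag they were all missing is a one-liner in the <head>:

    <meta name="viewport" content="width=device-width, initial-scale=1" />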

H1 Tags for Product Names

OK, so an H1 on a product page – that’s standard. What’s interesting, however, is that the code contains two H1 elements: one for mobile and one for desktop.

An H1 is also used to style the “search” text. Whilst this isn’t a major issue, a lot of value can be placed in maintaining sound information architecture.
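If the “search” text only needs to look like a heading, the styling can hang off a class rather than a heading element, keeping the document outline clean. A minimal sketch (class name illustrative):

    <!-- Instead of <h1>Search</h1> used purely for styling: -->
    <span class="search-label">Search</span>
    <style>.search-label { font-size: 1.5em; font-weight: 700; }</style>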

Matching Page Titles

Back in 2015, Google said that page titles (aka title tags, meta titles) and H1s should be consistent for rankings. However, there is a difference between something being consistent and something being the same.

Consistency is key. One thing we always try to get right is extracting your headline. And if there are different places on the page that point to different headlines, that’s very confusing for the bot.

And that is why we get publishers sometimes writing in – “oh, you guys got my headline wrong!” And we say, “well, there are different parts of your page that say different things.”

So really try to be consistent, it is the best way for us to correctly index your headline, index that snippet below. – John Mueller

Across all the sites I looked at, all of the title tags and H1 tags matched, meaning you had category pages appearing in search results with title tags of literally just “Jackets” or “Dresses”.

Not being able to define separate H1s and title tags is a big optimisation opportunity cost.

And in instances where ALL CAPS works for the on-page H1, couple that with a long product name and the resulting title tag just looks spammy:

ALL CAPS title tags aren’t user friendly.

Using the example of dresses and jackets, I’d have wanted to optimise the title tags for better information architecture, better user experience (and CTR from SERPs), and inclusion of search phrases to match a variety of search intents.

We also know from various studies and experiments that title tags carry a reasonable amount of weight as a ranking factor in Google.
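As a hypothetical example for the “Dresses” category above, the H1 can stay short for on-page scanning while the title tag carries the brand and the qualifiers searchers actually use – consistent, but not identical:

    <title>Dresses | Evening, Party &amp; Maxi Dresses | Mon Michelle</title>
    ...
    <h1>Dresses</h1>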

Meta Keywords and Meta Description

It’s awesome that you can edit the meta descriptions of your pages, but looking at the thirteen sites using a site: command, very few are optimised beyond the homepage.

Some of the meta descriptions also reveal that some hygiene pages, such as Terms & Conditions and other policies (key user trust signals for any ecommerce or YMYL site), are templated:

Privacy Policy This privacy policy sets out how ekm sitename ekm sitename uses and protects any information that you give ekm sitename ekm sitename.

Now to the second point here: meta keywords.

At the time of writing, Google is publicly on record telling webmasters that they don’t use meta keywords in web ranking – and haven’t for 9 years.

Matt Cutts went on record in 2009 in various videos and blogs talking about how Google doesn’t use meta keywords.

For reference, here are the meta tags that Google does support, and here is a post from their official webmasters blog on the subject.

Our web search (the well-known search at Google.com that hundreds of millions of people use each day) disregards keyword meta tags completely.

So it’s disappointing that a modern, forward-thinking cloud-based platform still includes this field as something for its users to get distracted by.
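For clarity, this is the tag in question – harmless to leave in the <head>, but completely ignored by Google’s web ranking (values illustrative):

    <meta name="keywords" content="dresses, jackets, womenswear" />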

Standard SiteMap (HTML Sitemap)

By this, they mean an HTML sitemap – which is good, as it helps with crawling, and helps other search engines such as Bing.

However, on some HTML sitemaps on the example stores, such as this one:

  • https://www.bankruptfashion.com/sitemap.asp

There are 1,766 links on the page in total. That’s huge. Internal linking structures are vitally important, and whilst resolving them can reap huge SEO benefits, having a single page like this on the site is detrimental.

With an HTML sitemap, not every single product URL needs to be included; as EKM is an ecommerce platform, this is a bit of an oversight, and it’s creating pages that damage crawl efficiency.

Why are these large HTML sitemaps an issue?

They’re an issue because of internal linking. As SEOs, we tend to use a rule of “100 links per page”, and this in part goes back to the days of PageRank sculpting.

Dynamical systems point of view (source: Cornell University)

With PageRank sculpting, you wanted to sculpt internal links to pass authority to key pages within the site.

During this time period in SEO, Matt Cutts said the following in an interview with Rand Fishkin:

The “keep the number of links to under 100” is in the technical guideline section, not the quality guidelines section. That means we’re not going to remove a page if you have 101 or 102 links on the page. Think of this more as a rule of thumb.

Whilst this is a rule of thumb, hundreds (or 1,000+) links on a page is detrimental.

XML Sitemap, using the sitemaps.org standard

They do, but they also contain tags such as:

  • <lastmod>
  • <changefreq>
  • <priority>

Tags I personally wouldn’t include based on experience.
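To illustrate, a typical entry versus what I’d trim it down to (URL illustrative) – Google has said it largely ignores <priority> and <changefreq>, and <lastmod> is only worth keeping if it’s consistently accurate:

    <url>
      <loc>https://www.example.com/product-category/</loc>
      <lastmod>2018-09-01</lastmod>
      <changefreq>daily</changefreq>
      <priority>0.8</priority>
    </url>

versus the trimmed version:

    <url>
      <loc>https://www.example.com/product-category/</loc>
    </url>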

Speciality Sitemaps

None of the thirteen example stores had speciality XML sitemaps; all pages (products, hygiene pages) are bundled into a single sitemap.

Will this affect Google crawling the website? Not at all. Will it affect an SEO’s ability to use Google Search Console to its full potential and identify indexing issues? Yes.

XML Sitemap Issues

I also crawled the thirteen XML sitemaps, and found that a lot of them contained non-200 status code URLs.

These URLs should be excluded from the XMLs to prevent wasted crawl resource.

Automatic Google Product Feed generation

This is actually really useful, and out of the box this can instantly make a small/medium sized business more competitive within organic search.

Other EKM SEO Observations

That’s the end of the list included in EKM’s SEO support article. The rest of the analysis is based on reviewing the thirteen example websites.

Do EKM products have an issue with being indexed by Google?

To test this, I chose 10 products at random from each of the 13 websites, and ran a site: query on each URL to establish whether Google was indexing it.

Out of the 130 products, only 103 were indexed and returned through a site: command – meaning ~20% of the products I tested weren’t being indexed by Google.

This is common on a number of SME-level ecommerce platforms, and is something I’ve come across on other platforms such as Shopify (notably during this webinar, where we reviewed Lauren Moshi). It can be resolved through better internal linking.

Does EKM have native blog functionality?

Not that I can see. Some websites I’ve found on EKM do have blogs, but they are WordPress installs reverse-proxied to a /blog/ subfolder. The WordPress blog feature is actually promoted to users, with instructions on how to install it.

This isn’t a bad thing, as a WordPress blog is a great thing to have – however, every one I’ve come across hasn’t been covered by the SSL certificate covering the main EKM platform. It also doesn’t appear that EKM’s customer support and “evolution mode” extend to helping customers get the most out of their WordPress blogs, even at a basic level.

This is concerning, as an out-of-date or insecure WordPress install can lead to serious security breaches, even if it’s not the primary platform. This is a security risk that I’d love to see EKM address for its users, especially in a post-GDPR era.

Internationalisation with EKM

Can EKM support hreflang? In short… No.

During a live chat with one of their representatives, I asked whether I could implement hreflang from my online store – and I was advised that I should set up individual EKM stores (one for each target language), although none of them would be connected by hreflang or share the same database.

I can see this working (of sorts) if you’re targeting, say, Great Britain (English) and Spain (Spanish), but if you’re targeting multiple countries with the same language (UK, Ireland, US…) this would just cause duplicate content issues.
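This is precisely the scenario hreflang exists to solve. With separate stores for the UK, Ireland and the US, reciprocal annotations on each page tell Google which same-language variant belongs to which market (URLs illustrative):

    <link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/dresses/" />
    <link rel="alternate" hreflang="en-ie" href="https://www.example.ie/dresses/" />
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/dresses/" />

Without the ability to add these tags (or connect the stores), the near-identical English content across the stores is left competing with itself.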

URL Structures

All of the EKM stores have a flat URL structure, with everything sitting on the root.

From an information architecture and information retrieval perspective, there could be some gains here if users could implement a simple, and standard e-commerce URL structure such as:

  • example.com
  • example.com/product-category/
  • example.com/product-category/product-subcategory/
  • example.com/p/product-url

This structure will not only provide better architecture, but also enable better analysis in analytics platforms, making for better data-led decisions.

Site Speed

When measuring site speed, I love to use Google’s official PageSpeed Insights tool; however, twelve of the thirteen websites returned “unavailable” when checking speed. The exception actually scored the best on the optimisation scores (mobile and desktop combined), clocking in with a fast 1.3s FCP and 1.2s DCL.

The tool did, however, return the optimisation score (out of 100) for all thirteen sites.

EKM performed worse on desktop than it did on mobile in terms of speed optimisation for the sites sampled.

All of the thirteen websites also shared a number of speed optimisations, including:

  • Compression enabled
  • CSS minified
  • HTML minified
  • JavaScript minified
  • Images had been optimised for load and file size
  • Critical render path content (content above the fold) had been optimised for first interactive paint

Robots.txt Errors

Similar to Shopify, the robots.txt for EKM stores appears to be standardised across all EKM websites. Whilst nothing in the Disallow: rules looks out of place, there is one interesting inclusion at the bottom of the file: a crawl-delay directive.
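I won’t reproduce the exact file here, but the directive in question looks like this (the delay value below is illustrative, not EKM’s actual figure):

    User-agent: *
    # ...site-wide Disallow: rules...
    Crawl-delay: 10

Crawl-delay asks compliant crawlers to wait the specified number of seconds between requests – a hypothetical 10 seconds would cap a crawler at fewer than 9,000 URLs per day.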

Crawl Delay & SEO

Simply put, crawl-delay is not a search engine friendly directive, and I strongly doubt any of the EKM websites (seeing as they have annual revenue limits of £1 million on their highest paid plan) attract the kind of excessive crawling that would warrant it.

However, this does make more sense if all (or a lot of) the sites are sharing the same servers, and it’s there to prevent successive requests causing downtime.

As ContentKing put it in their academy article:

Avoid using the crawl-delay directive for search engines as much as possible.

It’s also important to note that Google ignores the crawl-delay: directive in robots.txt files:

Is a crawl-delay rule ignored by Googlebot? Official Google Webmasters SEO Snippets video with John Mueller

So all this is doing is harming other prominent search engines in the UK, such as Bing (which, in my opinion, has difficulty crawling the deeper pages of a site anyway) – and I’ve never, ever known Bing to be an aggressive crawler.

Looking at the latest data, Bing holds a UK market share of around 12%, so more than 1 in 10 searchers use it – a significant number when you’re an SME looking to attract business online.

JavaScript

Simply put, do EKM websites work with JavaScript disabled?

Impressively, yes. Aside from the menu drop-downs, I was able to navigate the websites without restriction or loss of content.

Whilst Google can crawl JavaScript, it costs more in terms of resources (and resources cost money), so the fact that these sites appear to have plain HTML links as fallbacks and work without JS – without compromising user experience – is a huge win.
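The pattern that makes this work is straightforward progressive enhancement: navigation rendered as plain anchors with real href values, which JavaScript can enhance but never replaces (markup illustrative):

    <!-- Crawlable with or without JS - the href is a real, followable URL -->
    <a href="/dresses/" class="nav-link">Dresses</a>
    <!-- The fragile alternative seen on many JS-heavy sites: no URL without JS execution -->
    <!-- <span onclick="navigate('dresses')">Dresses</span> -->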

Schema Markup/Structured Data

The only schema I was able to find was BreadcrumbList, which is a shame, as there are opportunities (out of the box) to include things like Product schema on product pages, and other options such as Organisation schema.

I also know from EKM’s documentation that users can inject custom HTML into the <head>, so Organisation/LocalBusiness markup could be included – unless the EKM team haven’t advised this for some reason.
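As a sketch of what that injection could look like – all values illustrative, and note that schema.org uses the US spelling “Organization” for the type:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Store",
      "url": "https://www.example.com/",
      "logo": "https://www.example.com/images/logo.png"
    }
    </script>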

Conclusion

If you’re a small/medium-sized business looking to sell online, enterprise-level solutions such as Magento, Salesforce and Hybris come with hefty price tags and development challenges.

EKM offers a good solution for SMEs who want to get ahead in the digital world and establish an ecommerce foundation, and it works well as a starter platform for initial growth and brand establishment.

That being said, from working with a number of ecommerce platforms over the years, I would still choose Shopify ahead of EKM. On out-of-the-box issues, I feel Shopify is a much more workable platform, and a lot of the SEO issues I’ve come across on it can be resolved with some tweaking – even on sites pushing £2 million+ in annual online revenue.

I would, however, place EKM in my top 3 ecommerce solutions for SME businesses on a budget, ahead of BigCommerce, SquareSpace and OpenCart. My new top 3 being:

  1. Shopify
  2. EKM
  3. SquareSpace

EKM has some great potential, and with some relatively minor amends to its technical SEO capabilities, it could, in my opinion, easily rival Shopify for OOTB proficiency from an SEO standpoint.

I also love that EKM are attempting this “evolution mode” idea and trying to add value for users – but I can’t yet see any SEO value from it on any of these sites.