I’ve been explaining technical SEO to non-technical audiences for the better part of 10 years. Before launching my own consulting business this year, I was the lead SEO for a SaaS company. Educating internal clients, small businesses (while moonlighting as an SEO consultant) and executives on the technical specifics of SEO requires a deft touch, one I think I’ve developed fairly well. Make no mistake, SEO is alive and well, in large part thanks to these tactics. Here are my top ten, top-of-mind technical tips:
1. Duplicate Content
Anyone interested in SEO has likely been warned about duplicate content, but many don’t realize just how easy it is to create duplicate content internally. Many CMS platforms (especially .NET CMSs) will generate multiple versions of a page. While I don’t believe search engines down-rank sites for internal duplicates like they might have in the past – or like they can for content republished from external sites – internal duplicate content forces the search engine to determine which version is the preferred page (see #2: Canonicalization below). If your homepage, or any page, is accessible at multiple locations like fakesite.com/home, fakesite.com/index.html or fakesite.com/, then you likely have CMS-driven duplicate content.
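A quick way to check is to request each variant and compare the status codes. Here’s a minimal sketch using curl, with fakesite.com standing in for your own domain:

    curl -I http://fakesite.com/
    curl -I http://fakesite.com/index.html
    curl -I http://fakesite.com/home

If more than one of those returns a 200 OK, rather than a 301 redirect to your preferred URL, you have live duplicates of your homepage.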
2. Canonicalization
Related to duplicate content, canonicalization – selecting the preferred version of a domain or a webpage – is critical to technical SEO. As you develop a website, have your developers ensure the site is only available at one location. Pick either the www or non-www version, and have the alternate permanently (301) redirected to the preferred one. So http://www.fakesite.com/ redirects to http://fakesite.com/. From the example in the duplicate content section above, those two homepage duplicates, http://fakesite.com/index.html and http://fakesite.com/home, should redirect to the preferred, or canonical, http://fakesite.com/. It is possible to have completely unique content at all three of those URLs, so search engines must treat them as potentially unique pages even if the same content is displayed. There’s a meta tag (technically it’s a link element) that helps identify your preferred page. The alternate pages above can use the rel=”canonical” tag to point to the preferred page, instructing search engines to treat them as copies and index only the canonical page.
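On an Apache server, for example, the host-level redirect might look like this in .htaccess (a sketch, assuming mod_rewrite is enabled and fakesite.com stands in for your domain):

    # 301 redirect the www host to the non-www canonical host
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.fakesite\.com$ [NC]
    RewriteRule ^(.*)$ http://fakesite.com/$1 [R=301,L]

And the page-level hint is a single link element in the <head> of each alternate page:

    <!-- In the <head> of /index.html and /home -->
    <link rel="canonical" href="http://fakesite.com/" />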
3. Robots Crawl Settings
I’ve seen many small business sites, and some mid-market sites, redesign only to shoot themselves in the foot at launch with the Robots meta tag. Quite simply, the Robots meta tag instructs search engines whether or not they should crawl and index a webpage. During the development process it’s a good idea to block search engines from indexing your site while it isn’t finished or lives on a development server. Forgetting to “re-open” your site to search engines can be catastrophic for a successful site launch. If you think your site has indexation issues, I’d recommend sampling a few pages by viewing the source code and searching for “robots”. If you see something that looks like <meta name="robots" content="noindex,nofollow" />, you might have issues, but rest assured it’s an easy fix.
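For reference, here’s the tag in both states (a sketch; removing the tag entirely also works at launch, since indexing is the default behavior):

    <!-- During development: keep search engines out -->
    <meta name="robots" content="noindex,nofollow" />

    <!-- At launch: remove the tag, or explicitly allow indexing -->
    <meta name="robots" content="index,follow" />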
4. Robots.txt
Similar in purpose to the Robots meta tag, this is a more powerful way of instructing search engines how they should and should not crawl your site. Robots.txt is a file that sits at the root of your server, and search engines request it when they crawl your site. Just like with the meta tag, it’s all too common to forget to open up your new website when you launch. Just one line of code can set your organic traffic back weeks.
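That one line looks like this. As a sketch, here is a development robots.txt that blocks everything, next to the live version that permits all crawling:

    # Development server: block all crawlers from the entire site
    User-agent: *
    Disallow: /

    # Live site: an empty Disallow permits everything
    User-agent: *
    Disallow:

The difference between blocked and open is a single slash, which is exactly why this mistake slips through at launch.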
5. Page Speed
Everyone wants a beautiful site with pretty images of the highest quality. Unfortunately, those images are usually a bigger drain on your site’s load time than anything else. Search engines not only reward faster sites with better search positions, they will also crawl more of your great content if they can navigate through your site more quickly. Just like a web browser, a search engine requests pages from your server, then waits for responses and the returned assets (HTML, images, scripts, etc.). The faster those responses arrive, the more pages it can crawl in the allotted time. Find a good developer who will minify your code and scripts and compress your images for performance.
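Server configuration helps too. Here’s a minimal sketch for Apache (assuming the mod_deflate and mod_expires modules are available) that compresses text responses and lets browsers cache images:

    # Compress HTML, CSS and JavaScript before sending them
    AddOutputFilterByType DEFLATE text/html text/css application/javascript

    # Let browsers cache images for a month
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"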
6. Site Architecture
The way you organize your site content into categories, sub-categories and detail pages is, essentially, site architecture. No content on your site should take more than 3-4 clicks for a user to find. Organizing logical categories that make sense to your site visitors also helps silo your related content for search engines. Once your content is organized logically, a search engine can crawl through your navigation and find as much of your great content as possible without hitting roadblocks or getting stuck with nowhere to go. Much like page speed above, a proper architecture helps search engines crawl and index more of your content.
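As an illustration (hypothetical URLs), a clean hierarchy keeps every detail page within a few clicks of the homepage:

    fakesite.com/                                   <- homepage
    fakesite.com/widgets/                           <- category, 1 click
    fakesite.com/widgets/blue-widgets/              <- sub-category, 2 clicks
    fakesite.com/widgets/blue-widgets/widget-9000   <- detail page, 3 clicks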
7. Blog Indexation
Ok, I’ll admit this is more a personal preference (pet peeve) of mine than anything else, but I see far too many blogs that don’t lock down the way search engines crawl their content. Blog categories, tags and archives pose two issues for technical SEO. First, they often display duplicative content in the form of summaries or complete copies of blog post content. Second, if left open to search engine indexing, they waste your site authority on low-value pages. This might sound counterproductive, but it isn’t: you should prevent search engines from indexing categories, tags and archives by applying a “noindex,follow” directive in the Robots meta tag, as shown below. To clarify, that tag instructs search engines to ignore the categories but follow (crawl) through to the posts that are listed. Assuming a search engine enters at your homepage, it might crawl a category URL next. We want it to bypass that duplicative, non-unique page and pass all your blog authority (Google’s PageRank) into the blog posts. This tactic gives your blog posts the best chance to rank well for your target key phrases.
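On a category, tag or archive page, it looks like this (a sketch; most blog platforms and SEO plugins can add it for you):

    <!-- In the <head> of category, tag and archive pages only -->
    <meta name="robots" content="noindex,follow" />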
8. HTML Organization
This might be the least important item on the list, but I believe it still matters in technical SEO. Your HTML (code) should be organized logically, following a few best practices. Headline tags (H1, H2, H3, etc.) help users scan your content for important topics, and they also help search engines understand the most important (H1) content and follow through to the next most important (H2), and so on. Your main page headline – the content at the top of your page – should be an H1, and each heading after it should follow suit in descending order. A general rule of thumb is to use only one H1, but it’s fine to have multiple H2s or H3s, which also makes design easier.
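A sketch of that hierarchy in practice (hypothetical content):

    <h1>Technical SEO Basics</h1>          <!-- one H1: the page's main headline -->
      <h2>Duplicate Content</h2>           <!-- major sections -->
        <h3>CMS-Driven Duplicates</h3>     <!-- sub-topics within a section -->
      <h2>Canonicalization</h2>            <!-- multiple H2s are fine -->

(The indentation is just to show the outline; it isn’t required in your HTML.)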
9. XML Sitemaps
An XML Sitemap is a file on your server intended only for search engines; in fact, you can use Google and Bing Webmaster Tools to alert them to your sitemap locations. Sitemaps are lists of the pages on your site and have traditionally been used to inform search engines of new content, but many SEOs prefer to use them a little differently. Using multiple sitemaps for different sections of your site allows you to monitor the indexation of those site sections. If you submit a sitemap of 50 pages and only 23 are indexed (data available in the Webmaster Tools), you know you have a site architecture or other technical issue.
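The pattern looks like this: a sitemap index file pointing to one sitemap per section (a sketch with hypothetical file names), so each section’s indexation can be tracked on its own:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One sitemap per site section -->
      <sitemap><loc>http://fakesite.com/sitemap-blog.xml</loc></sitemap>
      <sitemap><loc>http://fakesite.com/sitemap-products.xml</loc></sitemap>
    </sitemapindex>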
10. On-Site Technology
Please, no Flash sites. I still see them today, especially in the restaurant industry. Flash websites are SEO killers; the search engines just haven’t evolved to index the content contained within Flash files. The engines have made significant progress indexing URLs and content within JavaScript, but your site navigation should not rely on JavaScript to display links. Use CSS and HTML instead. If you offer any content in PDF form, try to make sure you offer an HTML version as well, and block search engines from the PDFs. Does anyone really need a PDF, anyway?
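For the navigation, plain HTML links are all a crawler needs (a sketch with hypothetical pages), and one robots.txt rule keeps the PDFs out of the index (assuming they live in a /pdfs/ directory):

    <!-- Plain HTML links: no JavaScript required to discover these pages -->
    <nav>
      <ul>
        <li><a href="/menu/">Menu</a></li>
        <li><a href="/locations/">Locations</a></li>
      </ul>
    </nav>

    # robots.txt: keep crawlers away from the PDF copies
    User-agent: *
    Disallow: /pdfs/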
Assessing your technical standing requires a comprehensive SEO site audit. Once complete, you’ll have a roadmap for locking down your technical SEO so your content marketing can shine and help build your brand. If I missed anything you think is important, please drop your ideas in the comments below. Thanks!