The world of search engine optimization is a complicated one, and if you’re a “non-techie” business owner who’s been thrown into webmaster duties for the simple sake of having a business web presence, just learning the basics of SEO might seem overwhelming.
Unfortunately, the web’s search engine spiders don’t care about how skilled you are in the technical arena. What they care about is the content they can read on your pages, so if you aren’t utilizing both basic and advanced SEO techniques on your website, you risk being indexed and ranked for the wrong words – or none at all!
For this reason, it’s important to familiarize yourself with SEO techniques and to implement SEO best practices to the best of your abilities. Fair warning – the following three topics may seem complex. However, if you invest a little time in reading more about them, you should be able to implement them in a way that makes sense for you and your website.
Let’s get started!
Technique #1 – Canonicalization
Canonicalization sounds tricky, but it’s not that difficult to understand. Essentially, there are a number of situations that can cause content on your website to appear on multiple URLs. These instances can cause your site to be indexed improperly or to trigger duplicate content filters, in which the search engine spiders must determine which version of your content (if any) to display in the SERPs.
The easiest example to understand is that your website can be accessed from both “http://www.mysite.com” and “http://mysite.com”. If you have inbound links pointing at both of these URLs, they’ve likely both been indexed by the search engine spiders, which can lead to complications in terms of how link juice is passed and how your site appears in the SERPs.
Similarly, if you use a platform like WordPress (which dynamically creates category and tag pages that display your articles in multiple locations) or a service that adds tracking or session IDs to your URLs (for example, “http://www.mysite.com/file.php?var1=value&mysession=123”), the search engines could be indexing multiple copies of your original articles.
To prevent these situations from influencing where and how your site is ranked in the SERPs, consider implementing the following best practices: choose one preferred version of each URL, link to it consistently throughout your site, 301-redirect the alternate versions to it, and add a rel="canonical" link element that points any duplicate pages back at the original.
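For example, a duplicate page can point the search engine spiders at the preferred version of its content with a rel="canonical" link element in its head section. This is only a sketch – the page path shown here is hypothetical:

```html
<!-- Placed in the <head> of any page that duplicates the original article, -->
<!-- e.g. a tag or category page, or a URL carrying a session ID. -->
<link rel="canonical" href="http://www.mysite.com/original-article" />
```

The spiders then consolidate ranking signals from the duplicates onto the one URL you named.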
For more recommendations on how to handle canonicalization issues, the webmaster documentation published by the major search engines is a good place to start.
Technique #2 – Redirects
Deploying proper redirects on your site follows a similar principle as canonicalization. That is, if you move content on your website (or from one website to another), you’ll want to be sure the search engine spiders are properly informed of the move and able to find and access your content in its new location.
There are two types of redirects that are commonly used in web development: 301 and 302.
301 redirects are permanent redirects. Using this specific code tells the search engines that your content has been moved permanently and should be indexed at its new location. 301 redirects have the advantage of passing link juice and accumulated SEO authority to your content’s new home, which makes them incredibly valuable from a search perspective.
302 redirects, on the other hand, are temporary redirects. A 302 redirect tells the search engines, “I’ve moved this content temporarily, but it will be back. Please don’t attempt to redirect SEO authority away from my original URL.”
In most cases, you’ll want to use 301 redirects to control how the search engines index moved or deleted content. To learn how to deploy this type of redirect correctly, consult your web server’s documentation or your hosting provider’s help pages.
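As an illustration, on an Apache server both kinds of redirect can be declared in an .htaccess file. This is a sketch – it assumes the mod_alias and mod_rewrite modules are enabled, and the file paths shown are hypothetical:

```apache
# Permanent move (301): the old URL passes its link juice to the new one.
Redirect 301 /old-page.html http://www.mysite.com/new-page.html

# Temporary move (302): the spiders keep the original URL indexed.
Redirect 302 /holiday-sale.html http://www.mysite.com/coming-soon.html

# Canonical host: permanently send non-www traffic to the www version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

The final rule also resolves the www/non-www canonicalization issue described earlier, since every request for the bare domain is permanently forwarded to the preferred hostname.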
Technique #3 – Schema.org microdata
Microdata – which sounds significantly more complicated than it really is – is a set of supplementary attributes added to your site’s HTML in order to provide more data to the search engine spiders, and it can result in the creation of “Rich Snippets.”
When you think about the tags included in traditional HTML, it’s easy to see where some major deficiencies lie. Typically, the only tags found in your site’s code include the body tag, title tag, meta description tag, heading tags and a few others. While the search engines are able to capture the data stored in these tags, they occasionally run into challenges analyzing this information qualitatively.
For example, suppose you built a website reviewing the popular movie, “Avatar.” Your initial code might look something like this:
<span>Director: James Cameron (born August 16, 1954)</span>
And while the search engine spiders will be able to tell that you’ve written a page about the word “Avatar”, they can’t conclusively determine from this limited text whether your content is about the movie or about online profile pictures.
By adding Schema.org microdata, we can include extra information in our website’s code that instructs the search engines on how to process and index this content. In the following example, the sample code shared above is modified with the Schema.org “Movie” item type, which informs the search engines that what follows is content about a movie called “Avatar”:
<div itemscope itemtype="http://schema.org/Movie">
<span>Director: <span itemprop="director">James Cameron</span> (born August 16, 1954)</span>
<span itemprop="genre">Science fiction</span>
</div>
Not only does this Schema.org microdata help our sites get indexed more appropriately; when integrated correctly, it also makes them eligible for “Rich Snippets” (basically, SERPs listings with additional information) in the search results pages. These snippet enhancements can increase clickthrough rates from the SERPs, making the time needed to mark up a website with standard microdata well worth the effort.
The following image shows two SERPs listings that are fully marked up with Schema.org microdata and two that are not. If you wound up on this results page, which result would you be more likely to click through to?
To learn more about the different Schema.org microdata types, as well as how to implement them on your website, the documentation at Schema.org itself is the definitive reference.
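To give a sense of how Rich Snippet data might be marked up, here is a hypothetical review-page sketch that nests the Schema.org “AggregateRating” type inside “Movie” – the rating figures are invented for the example:

```html
<div itemscope itemtype="http://schema.org/Movie">
  <h1 itemprop="name">Avatar</h1>
  <!-- A nested item describing the page's collected review scores. -->
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 based on
    <span itemprop="reviewCount">120</span> reviews
  </div>
</div>
```

Markup along these lines is what allows the search engines to display star ratings and review counts directly in a listing’s snippet.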
Again, although these concepts may initially seem overwhelming, they’re worth learning (or outsourcing to a web development professional) in order to prevent negative SEO impacts on your site. By being proactive about managing more complex SEO issues, you’ll avoid penalties or the incorrect indexation that could hinder your website’s rankings in the natural SERPs.
Image: Lynn Friedman