Oct 08 2008

The benefits of using subdomains to enhance SEO

There are many advantages to using subdomains (e.g. searchengineoptimising.retiarius.com is a subdomain of the retiarius.com primary domain) to enhance the ranking of the primary domain.

Google and MSN both class a subdomain as a separate site from its primary domain (see this article).
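
As a toy illustration of this separate-site treatment (not how any engine actually works), consider a crawler that keys its index on the full host name; the two example URLs below would be filed as two distinct sites:

    # A toy sketch: an index keyed on the full host name treats a subdomain
    # as a separate site from its primary domain. URLs are examples only.
    from urllib.parse import urlparse

    urls = (
        "http://retiarius.com/page.html",
        "http://searchengineoptimising.retiarius.com/page.html",
    )
    for url in urls:
        print(url, "-> site key:", urlparse(url).netloc)
    # The host names differ, so an engine using this key indexes them separately.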

Barry Schwartz points to a thread at Search Engine Watch Forums discussing the subject:

“Subdomains work very well at the moment. No doubt about that. I can take a prominent, old domain, set up a brand new subdomain, add one link from the original domain’s front page, throw up whatever content I want and within days have plenty of traffic. These days it seems that almost all linkpop value from the original domain is transferred – and I see this happening in both MSN and Google.”

Another advantage is that subdomains are usually free to implement on commercial servers – you own the domain and can therefore apply unlimited subdomains to it.
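
Once a subdomain has been configured, a quick sanity check is to confirm it resolves. A minimal sketch (host names are illustrative examples; the subdomain itself is created in your DNS or hosting control panel):

    # Check that a newly configured subdomain actually resolves.
    # Host names below are illustrative examples.
    import socket

    for host in ("retiarius.com", "searchengineoptimising.retiarius.com"):
        try:
            print(host, "->", socket.gethostbyname(host))
        except socket.gaierror as err:
            print(host, "-> does not resolve:", err)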

I think that search engines will have to continue treating a subdomain as separate from its primary domain, because free blogging services such as Blogger issue each of their clients a subdomain such as yourname.blogspot.com. The engines cannot exclude subdomains from their search results when ALL the blogs on Blogger are owned by separate individuals.

By using a subdomain for a blog, and cross-linking a new primary site to it, you automatically set up some linking value to the new primary site. If the blog is used as ‘link bait’, with articles likely to interest a lot of readers, then you can increase the cross-traffic to the (probably less frequently updated) primary site.

But you must not go mad producing endless referring subdomains, because Google has set a bar on the number of subdomains it will reference.

I would like to quote Vanessa Fox, an ex-Googler and contributor to Search Engine Land:

“Google is no longer treating subdomains (blog.widgets.com versus widgets.com) independently, instead attaching some association between them. The ranking algorithms have been tweaked so that pages from multiple subdomains have a much higher relevance bar to clear in order to be shown.”

So, with care, the use of subdomains will enhance your site at low cost – but don’t over-egg the pudding … it could backfire on your search engine ranking.


Oct 06 2008

W3C compliance – is it important for SEO?

The W3C (World Wide Web Consortium) sets the internationally agreed standards for the languages used to construct web sites. The primary coding language still in use is HTML (HyperText Markup Language), invented by Tim Berners-Lee, the ‘father’ of the World Wide Web.

HTML was developed through several versions until it reached 4.01, at which point XHTML (Extensible HyperText Markup Language) became the standard to follow.

All browsers are backwards compatible, i.e. they can read and process all the variants of HTML and XHTML up to the browser’s implementation date.

By creating an international standard for web coding, the W3C have enabled browser designers to build systems capable of reading (almost) any website.

The problems arise when certain browser producers have tried to introduce proprietary coded operations into their browsers. A prime example is Microsoft, which consistently tried to impose its ASP (Active Server Pages) systems by making Internet Explorer capable of working outside W3C standards, in an attempt to monopolise the server market. Microsoft attempted the same by modifying the Java programming language with proprietary codes until it was successfully sued by Sun Microsystems (the originators of Java). At that point Microsoft unilaterally, under the guise of a security update and without users’ permission, modified all existing copies of Internet Explorer to remove any form of Java support, immediately crippling many web sites until Sun could implement a rescue strategy.

Unfortunately for Microsoft, and very fortunately for all other web designers, the majority of web sites are run on Linux/Unix servers and the free Apache web server. This has effectively minimised the ‘damage’ Microsoft has done to the concept of a free and open web design language standard, as ASP can only run on Microsoft servers.

What has all this to do with search engine optimisation, you may ask?

The answer is that search engine designers are moving more and more towards implementing search criteria which mimic human search behaviour. They run semantic checkers over pages to try to ‘understand’ the true content of each page. To do this they need to be able to read, unambiguously, every statement on the page.

The use of non-standard coding (we shall be dealing with the use of Javascript, Java, Flash and Shockwave in a later article) makes this difficult for the search engine algorithms. So, to save processing power and time, and to encourage standards-compliant coding, if they detect non-standard or poorly written coding they tend to rank your site lower, because a ‘sloppily’ coded site implies a lack of quality.
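
As a toy illustration of the ambiguity cost (real search engines use far more forgiving parsers), a strict XML parser accepts well-formed XHTML outright but rejects ‘tag soup’:

    # Strict parsing as a stand-in for unambiguous machine reading.
    # Well-formed markup parses cleanly; sloppy markup raises an error.
    import xml.etree.ElementTree as ET

    valid = "<html><body><p>Well-formed paragraph.</p></body></html>"
    sloppy = "<html><body><p>Unclosed paragraph.<p>Another one.</body></html>"

    for label, markup in (("valid", valid), ("sloppy", sloppy)):
        try:
            ET.fromstring(markup)
            print(label, "-> parsed unambiguously")
        except ET.ParseError as err:
            print(label, "-> parse error:", err)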

A well written, W3C-compliant site can be fully read by the search engine systems. As such, if two sites of equal stature are to be ranked, one compliant and one not, the compliant site will rank higher. Which would you choose if you were employing a designer?

So always check that your sites are compliant. The W3C have an excellent free HTML/XHTML Validator you can use.
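
The validator can also be driven programmatically for automated checks. A minimal sketch, assuming the JSON interface of the W3C’s current Nu HTML Checker at https://validator.w3.org/nu/ (consult the validator’s own documentation before relying on it):

    # POST a page to the W3C Nu HTML Checker and print its messages.
    # Endpoint and response shape assumed from the checker's JSON interface.
    import json
    import urllib.request

    html = (b"<!DOCTYPE html><html lang='en'><head><meta charset='utf-8'>"
            b"<title>Test</title></head><body><p>Hello</p></body></html>")

    req = urllib.request.Request(
        "https://validator.w3.org/nu/?out=json",
        data=html,
        headers={"Content-Type": "text/html; charset=utf-8"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)

    for msg in result.get("messages", []):
        print(msg.get("type"), "-", msg.get("message"))
    if not result.get("messages"):
        print("No problems found")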

We at Retiarius Internet Design understand the importance of this and we produce fully W3C-compliant code for our clients as standard.


Oct 04 2008

Google spell checking – and how it affects SEO

It has been shown by our experience and from our research that Google’s spidering and indexing systems set great store by the way a site has been constructed. Google appear to penalise sites with spelling errors and sites with non-compliant HTML/XHTML coding.

It can be advantageous to check the spelling of some words using Google’s own spell checker. This uses the same lexicon as the search engine itself, so any misspelled words or word variants that Google thinks are accurate could be used as keywords in your site.

Accurate spelling prevents your site being penalised and increases your keyword density, because a misspelled word obviously will not be recognised as the keyword you intended.
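
As a first line of defence before publishing, page copy can be screened against a word list. A minimal sketch, assuming a Unix word list at /usr/share/dict/words (use a locale-appropriate dictionary – an American list will flag British spellings such as ‘optimisation’):

    # Flag words in page copy that are absent from a local dictionary.
    # The word-list path and the sample text are illustrative.
    import re

    with open("/usr/share/dict/words") as f:
        dictionary = {line.strip().lower() for line in f}

    page_text = "Search engine optimisation for smal businesses"

    for word in re.findall(r"[A-Za-z']+", page_text):
        if word.lower() not in dictionary:
            print("possible misspelling:", word)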
