
Successful search engine optimization relies on getting search engine spiders to visit your website. A website needs to be built in a way that attracts these spiders. Anything a visitor sees on a page is only visible because of coded instructions added during development. These instructions are enclosed in tags, which begin with < and end with >, and it is these tags that tell the browser how to display the content.
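
For example, the two lines below are enough for a browser to display a heading followed by a paragraph. Only the text between the opening and closing tags is shown to the visitor; the tags themselves stay hidden and simply describe how the content should appear:

    <h1>Welcome to our site</h1>
    <p>This sentence is displayed as a paragraph because it sits between an opening and a closing p tag.</p>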

The World Wide Web, or WWW as we know it, is the brainchild of one man. Tim Berners-Lee understood that the WWW could be designed in a way that let data be stored, accessed, updated and shared amongst a vast audience. He was originally working on hypertext systems when he created it. He is credited with creating the first version of HTML, and it is this code that still underpins the way web pages are designed today.

In those early days, web page links did not always take a user where they wanted to go because of the different formats being used in hypertext documents. Not all protocols matched up. To rectify this, he founded the World Wide Web Consortium, more commonly known as the W3C.

Since then, the W3C has been responsible for setting standards for how HTML is used in website building. These standards are voluntary, so web designers can choose whether or not to make use of them. If implemented, however, they produce websites that are accessible from every computer operating system in the world. We can thank the acceptance of these standards for the internet as we know it today: a highly usable and highly reliable source of information.

How does W3C compliance help with SEO?

If you want to get good rankings, your website must be visible to search engine spiders. To keep track of the billions of web pages on the WWW, search engines use spiders, or web robots. Google alone has eight different spiders skittering around, whose job it is to decide which web pages are worth adding to the index and which aren't.

These spiders are built to understand and recognise W3C-compliant HTML. If they find non-standard code, or code that contains errors, they aren't going to waste their time and will look elsewhere. If you want to keep these spiders happy, you have to give them what they are looking for.

If that means complying with W3C standards, it is worth the effort. What you want is to keep the spiders coming back each time you update or add content to your web pages. SEO and the W3C are a match made in heaven.
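
A quick way to check whether a page meets those standards is the W3C's free markup validator at validator.w3.org. As a rough sketch, a minimal page that should pass validation looks something like this; the content itself is only an example, but every element is opened, closed and nested correctly:

    <!DOCTYPE html>
    <html lang="en">
      <head>
        <meta charset="utf-8">
        <title>Example page</title>
      </head>
      <body>
        <h1>Welcome</h1>
        <p>Every tag here is opened and closed in the right order, so a spider can parse the page without any guesswork.</p>
      </body>
    </html>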

Spiders do not see what we see, and if you build web pages with only the human visitor in mind and pay no attention to the markup underneath, your search engine optimization may take a slump. Spiders instantly recognise badly written code or code that is not W3C compliant.
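
To illustrate, the fragment below (with placeholder content) contains the kind of mistakes a validator flags and a spider stumbles over: the bold and italic tags are closed in the wrong order, and the image is missing the alt attribute the standard requires:

    <p>This text is <b>bold and <i>italic</b></i>, but the tags are closed out of order.</p>
    <img src="logo.png">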

