Creating SEO friendly websites using the latest web development technologies


In recent years, the internet has seen a major change in trends with regard to functionality. Both hardware and software have improved greatly, accompanied by improvements in the technologies that websites use to actually function.

Technology that has become popular in recent years includes AJAX and XML, most notably used to add real-time interaction to web pages. And with broadband now widely available in developed countries, Search Engine Optimisation, or simply "SEO", is a vital component of any web development endeavour. For reference, AJAX stands for "Asynchronous JavaScript and XML", and XML stands for "Extensible Markup Language".

Whilst the possibilities offered by new technologies may be exciting, there is an issue: almost all websites strive to receive traffic and so turn to SEO techniques, yet technologies such as AJAX and XML are widely believed to affect the optimisation process negatively. This may seem to be a situation where you can only have one or the other: if you wish to rank well in the search engines, you should avoid using such technologies to add functionality; conversely, if you want the functionality, you may not fare well in the rankings. However, there are a number of ways in which you can have the best of both worlds.

To help understand the potential issues, it is easiest to look at what the search engines can "see". By default, most search engines index plain text only; JavaScript within a web page is ignored. AJAX uses JavaScript to perform instant actions in the browser, and it uses JavaScript and XML to essentially bring multiple pages into one. Consider a website selling multiple products. For effective SEO, you would hope that each individual product page is visited by search engine spiders and indexed, increasing the chance that your website turns up in search results. AJAX negates the need for these additional product pages: when a user clicks on a product they aren't taken to a separate page; the XML information is simply read by JavaScript code and rendered onto the current page. This makes the page function far more quickly and interactively, as mentioned earlier, yet it hugely compromises SEO. From the search engine spider's point of view, all of your products exist on the same page, so the engine indexes one page where in reality you should hope for many more. To carry out the most effective web development, the way search engines view a website must always be taken into account.
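To make this concrete, here is a minimal, hypothetical sketch of the client-side rendering an AJAX catalogue relies on. The product data and function names are invented for illustration; the point is that this markup only exists after JavaScript runs, so a spider that ignores scripts sees a largely empty page.

```javascript
// Hypothetical sketch: turn product data (originally fetched as XML
// via AJAX) into HTML entirely on the client. None of this markup is
// present in the HTML the server originally sent to the spider.
function renderProducts(products) {
  return products.map(function (p) {
    return '<div class="product"><h2>' + p.name + '</h2>' +
           '<p>' + p.description + '</p></div>';
  }).join('\n');
}

// In a browser this would follow an XMLHttpRequest/fetch call, e.g.:
//   document.querySelector('#catalogue').innerHTML = renderProducts(data);
```

Because the server's response contains only an empty container element, every product "page" is invisible to a text-only indexer.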

So we have identified the problem, and it is a fairly simple one: AJAX affects SEO by reducing the number of pages a user needs to visit to get the same amount of information. Because JavaScript is ignored by many search engines, any information requested through AJAX is retrieved from an XML source and rendered by JavaScript, and is therefore ignored too.

One potentially effective way to achieve the desired SEO effect is to use scripts that detect whether a visitor to your website is a search engine spider. By looking at the HTTP request header information, this is arguably relatively easy to determine and can be achieved in most prominent languages used for web development. If the visitor is a bot, it can be redirected to a version of the site that does not use AJAX or XML functionality, instead serving standard HTML output with no XML involved at all. This way, the spider will be able to index everything in an unhindered manner.
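A minimal sketch of that detection, using the User-Agent request header. The crawler tokens below are real, but the list is illustrative rather than exhaustive, and the server-side helper names in the comment are hypothetical:

```javascript
// Classify a visitor from the User-Agent header. Any production list
// of crawler tokens would need ongoing maintenance.
function isSearchEngineBot(userAgent) {
  return /googlebot|bingbot|slurp|duckduckbot|baiduspider/i.test(userAgent || '');
}

// A server (Node.js here, purely as an illustration) could then choose
// which version of a page to send:
//   if (isSearchEngineBot(req.headers['user-agent'])) {
//     serveStaticHtmlVersion(res);  // hypothetical helper
//   } else {
//     serveAjaxVersion(res);        // hypothetical helper
//   }
```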

The issue here, however, is that you run the risk of a "cloaking" penalty. Presenting one version of a web page to visitors and another to search engines mirrors the spammy behaviour of a few years ago and may incur significant, if not catastrophic, search engine ranking penalties.

Many consider that simply moving AJAX script into external files and triggering it via href links, as opposed to the traditional onclick event, will improve spidering and reduce the negative SEO effect. You will likely still need to combine AJAX and XML with other methods of web development, revolving around server-side languages such as PHP and ASP rather than solely client-side languages such as JavaScript. The reason is that server-side languages deliver static content: the pages may be dynamically generated, but the only thing reaching the browser is HTML, which is (if well structured) entirely SEO friendly. Others advise ensuring that a great deal of content is available when the page initially loads, even if you subsequently use AJAX to switch it out for the user. Of course, sometimes the same functionality just isn't attainable using server-side features alone, but most of the time you will be able to come surprisingly close.
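The href-over-onclick idea is sometimes called "hijax": links point at real, crawlable URLs, and JavaScript upgrades them to AJAX only when it actually runs. A minimal sketch, with invented URL paths and element ids:

```javascript
// Build a link whose href is a real server-rendered page, so a spider
// (or a visitor without JavaScript) can simply follow it.
function buildProductLink(product) {
  return '<a href="/products/' + product.id + '" class="ajax-link">' +
         product.name + '</a>';
}

// In the browser, a script like this intercepts the click and fetches
// the same URL asynchronously instead of navigating. The guard lets
// the file load outside a browser without error.
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (event) {
    var link = event.target.closest('a.ajax-link');
    if (!link) return;
    event.preventDefault();          // stay on the current page
    fetch(link.href)                 // load the same crawlable URL via AJAX
      .then(function (res) { return res.text(); })
      .then(function (html) {
        document.querySelector('#content').innerHTML = html;
      });
  });
}
```

Because the fallback is the href itself, the spider and the JavaScript-enabled visitor end up consuming the very same pages.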

All in all, when carrying out web development on a website that requires Web 2.0 interactivity such as AJAX, simple XML or plain JavaScript, but which also needs SEO, ensure that when your web page reaches a browser, all important content (including navigation links) can be read as stand-alone static HTML. Webmasters shouldn't have to choose between SEO and Web 2.0, so carry out web development with this understanding in mind.
