Search Engine Optimization

Importance of Sitemaps

There are many SEO tips and tricks that help in optimizing a site, but one whose significance is sometimes underestimated is the sitemap. A sitemap, as the name implies, is simply a map of your site: on a single page you show the structure of the site, its sections, and the links between them. Sitemaps make navigating your site easier, and keeping an updated sitemap on your site is good both for your users and for search engines. Sitemaps are also an important channel of communication with search engines. While in robots.txt you tell search engines which parts of your site to exclude from indexing, in your sitemap you tell search engines where you would like them to go.
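For illustration only, a minimal robots.txt along these lines (the directory names are made up for the example) tells crawlers which areas to skip:

    # Hypothetical example: ask all crawlers to skip two private areas
    User-agent: *
    Disallow: /admin/
    Disallow: /temp/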

Sitemaps are not an innovation. They have always been part of best Web design practice, but with the adoption of sitemaps by search engines they have become even more important. However, it should be stressed that if you are interested in sitemaps mainly from an SEO point of view, you cannot rely on the conventional sitemap alone (though at present Yahoo! and MSN still stick to the standard HTML format). Google Sitemaps, for example, uses a special XML format that is different from the ordinary HTML sitemap meant for human visitors.
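As a rough sketch of that XML format (the URL and dates below are placeholders, not taken from any real site), a one-page sitemap for Google might look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- One <url> entry per page you want the crawler to visit -->
        <loc>http://www.example.com/</loc>
        <lastmod>2006-12-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>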

One may ask why two sitemaps are necessary. The answer is obvious: one is for humans, the other is for spiders (for now mainly Googlebot, but it is reasonable to expect that other crawlers will join the club shortly). In that connection it is worth clarifying that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to a penalty for your site.

Why Use a Sitemap

Using sitemaps has several benefits beyond easier navigation and better visibility to search engines. Sitemaps give you the chance to inform search engines promptly about any changes on your site. Of course, you cannot expect search engines to rush out immediately to index your modified pages, but the changes will certainly be indexed faster than they would be without a sitemap.

Also, when you have a sitemap and submit it to the search engines, you rely less on external links to bring search engines to your site. Sitemaps can even help with messy internal linking - for instance if you accidentally have broken internal links or orphaned pages that cannot be reached any other way (though it is without doubt far better to fix such errors than to rely on a sitemap).

The World Of Search Engines - Past... Present... Future

The world of search engines is changing at a fast rate... what we saw in the past few months was consolidation among numerous search engines... major players like Yahoo acquired the smaller ones with a view to improving their own search engine technology and capturing more of the market share.

A year back we had Yahoo using Google's results, while Inktomi was the provider for MSN. Google also had AOL using its results. Search engines like Overture used Inktomi as their free-results provider. Lycos used results provided by FAST, and Altavista had its own engine.

The initial phase of consolidation started with Altavista and Alltheweb being bought over by Overture. At that point Overture seemed to be in a very dominant position, being the market leader in pay-per-click advertising and having just bought over two major natural search engine technology providers.

The scenario suddenly changed again when Overture itself was bought over by Yahoo. With Overture, what Yahoo got was not only the world's leading pay-per-click engine but also the search technology of Altavista and Alltheweb. And to add icing to the cake, Yahoo went ahead and acquired Inktomi.

This complicated the picture from Microsoft's perspective. With Microsoft's own search engine technology still in the development phase and not yet ready for rollout, MSN was still completely dependent on Inktomi to provide its results... and Inktomi now belonged to MSN's major rival Yahoo.
To top it all, even Overture, MSN's partner for sponsored results, was now owned by Yahoo.

So what is the scenario at present?

We have Google providing results on google.com as well as AOL; Yahoo using its own search technology, which seems to be a combination of Inktomi, FAST and Altavista; and MSN still using Inktomi's database.
This is in sharp contrast to the recent past, when Google served about two-thirds of the market (being the search engine technology provider to Yahoo) and Inktomi had only one-third (MSN's share) of the search engine market.

Google's New SEO Rules

Google has recently made some pretty major changes in its ranking algorithm. The most recent update, dubbed "Allegra" by Google forum users, has left several web sites in the dust and propelled others to top positions. Major updates like this can happen a few times a year at Google, which is why picking the right search engine optimization company can be the difference between online success and failure. It becomes an even more difficult decision when SEO firms themselves are suffering from the Allegra update.

Over-optimization may have played the major part in the drop of seo-guy.com from the top 50 Google results. Filtering out web sites that have sacrificed readability for optimization is a growing trend at Google. It started with the Sandbox Effect in late 2004, where comparatively new sites were not being seen at all in the Google results even with excellent keyword placement in content and incoming links. Many thought it was a deliberate effort by Google to punish sites that had had SEO work done. A few months later, we see many of the 'sandboxed' web sites finally ranking well for their targeted keywords.

With 44 occurrences of 'SEO' on the comparatively short home page of seo-guy.com, many of them in close proximity to each other, the content reads like a page designed for search engine robots, not for the visitor. This position shift should come as no surprise to SEO professionals, as people have been saying it for years now: sites should be designed for visitors, not search engine robots. Unfortunately, some of us don't listen, and this is what happens when search engines finally make their move.

One aspect of search engine optimization that is also affected, indirectly, is link popularity development. After observing the effects of strictly relevant link exchanges on many of our clients' sites recently, we have noticed very fast ranking gains on Google. It seems Google may be looking out for links pages created for the sole purpose of inflating link popularity, and diminishing the relevance of such sites. After all, if a links page on a real estate site has 100 reciprocal links to pharmacy sites, there is a lot of content on that page totally unrelated to real estate. Not until now has that been so detrimental to a site's overall relevance for its search terms. It goes back to the old rule of thumb: make your visitors the top priority.

Build Your Website with a Search Engine Friendly Design

To become successful on the internet you should have a good-looking, user-friendly web site. Still, even a well-designed website cannot produce results if it doesn't get good traffic. The best websites are those that are both attractive and usable for your human visitors and, at the same time, convenient for the search engine robots that are trying to find and collect data from your site.

Often a site that looks good to your eye has some design flaw that impairs its search engine friendliness. Here are a few things to look for when designing new sites or optimizing an existing site.

Where does your first line of text begin?

You might think, "That's easy, the first line of text is right at the top." But if you view your web page in Notepad or in the HTML view of popular editors, you might be astonished to find that the first line of your actual searchable text is pushed down, 100 lines or more, by long strings of JavaScript and by the HTML code that defines your tables.

The higher your text appears in this HTML view of the site, the easier it is for the robot to find it and put it in the search engine database. You can save space in your HTML code by bundling up your JavaScript and placing it in an external file uploaded to your server. Instead of having 50 lines of JavaScript commands in your HTML code, there will be only one line pointing to the separate file with the JavaScript.
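A minimal sketch of that change (the file name scripts/menu.js is just a placeholder):

    <!-- Before: dozens of lines of inline script push the searchable text down -->
    <script type="text/javascript">
      /* ... 50 lines of menu and rollover code ... */
    </script>

    <!-- After: a single line pointing to an external file on your server -->
    <script type="text/javascript" src="scripts/menu.js"></script>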

Likewise, if you simplify your table structure, your searchable text will become more prominent. The left-hand navigation bar, for instance, with its separate graphic elements each in its own row, might be a place where you can trim your code by consolidating the rows into one cell, as sketched below.
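A simplified illustration of that idea, with hypothetical link names:

    <!-- Before: one table row per navigation item -->
    <table>
      <tr><td><a href="home.html">Home</a></td></tr>
      <tr><td><a href="services.html">Services</a></td></tr>
      <tr><td><a href="contact.html">Contact</a></td></tr>
    </table>

    <!-- After: the same links consolidated into a single cell -->
    <table>
      <tr><td>
        <a href="home.html">Home</a><br>
        <a href="services.html">Services</a><br>
        <a href="contact.html">Contact</a>
      </td></tr>
    </table>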

Is your website graphics-heavy, at the expense of searchable text?

If your site begins with a splash page, for instance a lovely page-filling picture of the ocean with no text except "enter here", then you are wasting a big opportunity. Search engines consider your main page, the one you arrive at when you land on www.yourcompany.com, to be the most important page. Your key text with its important keywords should be on that first page. If you already have a splash page, you should consider scrapping it entirely, or at least adding a paragraph with a compelling capsule description of your business.

If your site has a Flash-only first page, then the text on that page cannot be seen, apart from what you are able to put in your title and description tags. Search engine robots cannot read text embedded inside a Flash movie. If you would like to use Flash and still do well in search engine rankings, it is better to make a hybrid page where the Flash is surrounded by a normal HTML page with text. The text around the Flash movie should be optimized so that the page ranks well in search engine queries for your important keywords.
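A bare-bones sketch of such a hybrid page, assuming a movie file called intro.swf and placeholder business text:

    <html>
    <head>
      <title>Acme Widgets - Hand-Made Widgets and Widget Repair</title>
      <meta name="description" content="Acme Widgets sells hand-made widgets and offers repair services nationwide.">
    </head>
    <body>
      <h1>Hand-Made Widgets and Widget Repair</h1>
      <p>Crawlable text around the movie carries the important keywords.</p>
      <!-- The Flash movie sits inside an ordinary HTML page instead of replacing it -->
      <object type="application/x-shockwave-flash" data="intro.swf" width="600" height="400">
        <param name="movie" value="intro.swf">
      </object>
      <p>More optimized text can follow the movie as well.</p>
    </body>
    </html>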

Anatomy of a Search Engine

For some unlucky souls, SEO is just a collection of tricks and techniques that, according to their understanding, are supposed to propel their site into the high rankings on the major search engines. This view of the way SEO works can be effective for a time, but it has one fundamental flaw: the rules change. Search engines are in a constant state of evolution in order to keep up with SEOs, in much the same way that McAfee, Norton, AVG or any of the other anti-virus software companies are constantly trying to keep up with the virus writers.

Basing your whole website's future on one simple set of rules (read: tricks) about how the search engines will rank your site has an additional flaw: there are more factors being considered than any SEO is aware of or can verify. That's right, I will freely admit that there are factors at work that I may not be aware of, and even for those I am conscious of, I cannot with 100 percent accuracy tell you the exact weight they are given in the overall algorithm. Even if I could, the algorithm would change a few weeks later, and what's more, hold on to your hats for this one: there is more than one search engine.

So if we cannot base our optimization on a set of rigid rules, what can we do? The key, my friends, is not to memorize the tricks but rather to understand what they accomplish. Thinking back on my high school math teacher Mr. Barry Nicholl, I remember a silly story that had a huge impact. One weekend he had the whole class watch Dumbo the Flying Elephant (there was in fact going to be a question about it on our test). Why? The lesson we were to take from it is that formulas (like tricks) are the feather in the story. They are pointless, and yet we hold on to them in the false conviction that it is the feather that works and not the logic. Without a doubt, the tricks and techniques are not what works but rather the logic behind them, and that is their shortcoming.

And So What Is Necessary?

To rank a website highly and keep it ranking over time, you should optimize it with one key understanding: a search engine behaves like a living thing. This is not to say that search engines have brains - I will leave those tales to Orson Scott Card and the other science fiction writers - but their very nature makes them resemble a living being, one with far greater storage capacity.

Consider for a moment how a search engine functions: it goes out into the world, follows the road signs and paths to get where it is going, and collects all of the information along its path. From there, the information is sent back to a group of servers where algorithms are applied in order to determine the importance of specific documents. How are these algorithms created? They are crafted by human beings who have a great deal of knowledge of the ground rules of the Internet and the documents it contains, and who also have the capacity to learn from their mistakes and update the algorithms accordingly. Fundamentally we have an entity that collects data, stores it, and then sorts through it to decide what is important, which it is happy to share with others, and what is unimportant, which it keeps tucked away.
