Saturday, May 14, 2005

Search Engine Optimization that Works in the Long-Term by Hristo Hristov



Search engines constantly tweak their ranking algorithms, and when that happens some pages lose their top positions. One such event was the infamous Florida Update, when many pages were practically kicked out of the top 1000 results for competitive keywords.

After recent updates, webmasters have begun to think that Google does not use PageRank, because low-PR pages can get very good rankings. Before that, everyone was saying that PageRank was THE factor for top positions. Now everyone is saying that keyword-rich anchor text links from many different sites are the key to the top ranks.

All these recent events seem to indicate that search engine algorithms are totally unpredictable, right? Wrong!

All search engines are moving in the very same direction. The scientific literature on information retrieval and recent search engine patents reveal the not-so-distant future of search engine ranking algorithms.

Introducing Topic Specific Link Popularity

For the last few years, search engines relied on General Link Popularity to assess the importance of every page. Relevancy was based on a combination of General Link Popularity (importance) and keyword matches on-page and off-page (anchor text of links for specificity).

General Link Popularity is measured by summing the weight of ALL incoming links to a page. With General Link Popularity, ANY link improved the importance of a page. Webmasters started to buy high-PR links from totally unrelated sites. Pages were getting unrelated votes.
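
To make "summing the weight of all incoming links" concrete, here is a toy sketch of the PageRank-style iteration behind General Link Popularity. The link graph, damping factor, and iteration count are illustrative assumptions, not Google's actual data or implementation:

    # Toy sketch of General Link Popularity (PageRank-style). Illustration only:
    # the link graph and constants below are made up, not Google's real values.
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],  # D votes for C; with General Link Popularity, ANY link counts
    }

    DAMPING = 0.85
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal weight

    for _ in range(50):  # power iteration until the ranks settle
        new_rank = {}
        for p in pages:
            # Each page passes its weight on, split evenly among its outgoing links.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - DAMPING) / len(pages) + DAMPING * incoming
        rank = new_rank

    for p, r in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{p}: {r:.3f}")  # C comes out on top: it collects the most votes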

To combat this problem, Google implemented a Topic Specific Link Popularity algorithm. When a user specifies a query, Google determines the importance of a page by the Link Popularity it gets from pages RELATED to the keywords.

A link from a page will give you considerable Topic Specific Link Popularity when:

1) the page itself is optimized for your keywords

2) the page has a high General Link Popularity (PageRank)

3) the page is from a site owned by someone else (you can't vote for yourself)

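To make the three conditions concrete, here is a toy model of how they might combine into a single per-link score. This is purely illustrative: the relevance measure, the weights, and the formula are assumptions of mine, not a published algorithm.

    # Toy model of Topic Specific Link Popularity. Illustration only: real
    # engines use far more signals; the numbers and formula here are assumptions.
    def topical_relevance(page_title, page_text, keywords):
        """Crude 0..1 score: does the linking page mention the keywords?"""
        score = 0.0
        if all(k.lower() in page_title.lower() for k in keywords):
            score += 0.5  # condition 1: the page is optimized for your keywords
        hits = sum(page_text.lower().count(k.lower()) for k in keywords)
        score += min(hits, 10) / 20.0  # cap it so keyword stuffing can't dominate
        return min(score, 1.0)

    def link_value(linking_page, keywords, your_domain):
        if linking_page["domain"] == your_domain:
            return 0.0  # condition 3: you can't vote for yourself
        relevance = topical_relevance(linking_page["title"],
                                      linking_page["text"], keywords)
        return linking_page["pagerank"] * relevance  # condition 2: weighted by PR

    # A PR6 page optimized for the keywords vs. a PR6 unrelated page:
    keywords = ["seo", "guide"]
    related = {"domain": "other.com", "pagerank": 6,
               "title": "SEO Guide Directory", "text": "A guide to seo resources."}
    unrelated = {"domain": "other.com", "pagerank": 6,
                 "title": "Cheap Flights", "text": "Book your flights online."}
    print(link_value(related, keywords, "mysite.com"))    # high value
    print(link_value(unrelated, keywords, "mysite.com"))  # near zero

In this toy model the unrelated link contributes almost nothing even at the same PageRank, which is exactly the behavior described above.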

From a search engine's point of view, implementing a Topic Specific Link Popularity algorithm is a very tough task when the queries need to be answered in less than a second.

All you need to know is this: the top ranked pages for competitive keywords are the ones with the highest Topic Specific Link Popularity.

You need links from pages that have high PageRank, are optimized for YOUR keywords and are owned by someone else.

How do you get these links?

1. Search for your keywords on Google and look at all pages that rank for your keywords. Seek links from these pages.

2. Reciprocal Links. Swap links with sites that can give you a link on a page optimized for your keywords. Look for pages with high PageRank that have your keywords in their title and in their incoming links. Reciprocal links work provided that they come from pages optimized for your keywords (related pages).

3. Buy links from some of the pages that rank at the top for your keywords.

4. DMOZ and Yahoo's directory usually have pages that rank very well for your keywords. You absolutely must get links from these pages. If you have a commercial site, don't hesitate: buy a link from Yahoo immediately. It is well worth the $299.

5. Find out who links to the top ranked pages for your keywords. Many of their links will not be topic specific, but many WILL be. Try to get links from the related ones. A page is related when it has your keywords in its title, text, and so on.

6. Form a link exchange ring with some of your competitors. That's a brutally effective strategy. Basically, you link to your competitors from your main optimized page (usually the home page) and they link to you from their most optimized page! Such rings can dominate the top positions and are very difficult to outrank (it is difficult to accumulate that amount of topic specific links). The caveat: the exchanged links must sit on the main page, not buried somewhere deep.

One more very important tip.

Increase the relevancy of the page that links to you by using your keywords in the anchor text and the description of your site! Yes, having keywords in the links pointing to your page increases your rankings not only by associating the keywords with your page but also by increasing the relevancy of the page that gives you the link! That's why SEOs think anchor text is the most important factor. It is NOT. You can get a monstrous ranking boost from a link that does not use your keywords in the anchor text, provided that the linking page has high PageRank and is optimized for your keywords (a DMOZ listing is one example).

What about getting unrelated links?

Let's say you buy a high-PR unrelated link. The page that links to you does not have your keywords in its title and text. The only factors that make the link relevant to your keywords are the anchor text to your site and your description. You'll still get some benefit, but that's nothing compared to a link from a page optimized for your keywords.

Your site can't get into Google's top 1000 results?

If your site lacks Topic Specific Links, it may get filtered out of the results even if it has a good amount of PageRank (from non-related or affiliated sites). You need some threshold amount of Topic Specific Link Popularity to get into the top 1000 pages for very competitive keywords.

Two Final Points

1. Only one link per site can give you a Topic Specific ranking boost. Look for a link from the page most optimized for your keywords.

2. If you find a page that ranks well for your keywords, go for the link EVEN if that page has a lot of links on it.

To recap: the more optimized a page is for your keywords (measured by PageRank and by keywords found on-page and off-page), the more Topic Specific Link Popularity boost you will get from its link.

Topic Specific Link Popularity is, and will remain, the key to top rankings. Anchor text plays a major role, but it is not THE factor. PageRank is still very important, especially the PageRank of the pages that link to you.
About the Author
Hristo Hristov, owner of the Search Engine Optimization Guide

Why valid HTML code is crucial to SEO by Roseanne van Langenberg



Copyright 2005 Marketing Defined. All Rights Reserved
Why valid HTML code is crucial to your web site's search engine optimization efforts and subsequent high rankings:
Many webmasters and newcomers to web page design overlook a crucial aspect of web site promotion: the validity of the HTML code.
What is valid HTML code?
Most web pages are written in HTML.
As for every language, HTML has its own grammar, vocabulary and syntax, and each document written in HTML is supposed to follow these rules.
Like any language, HTML constantly changes, and it has become a relatively complex language, so it is very easy to make mistakes ... and we do know by now the favorable weight the new msn.com beta search engine places on proper coding practice ... see a recent article on msn.com coding requirements.
HTML code that does not follow the official rules is called invalid HTML code. Why is valid HTML code important to search engine optimization and your whole marketing effort?
Search engines have to parse the HTML code of your web site to find the relevant content. If your HTML code contains errors, search engines might not be able to find the content on the page, and that is the end of your SEO efforts and your quest for high rankings for that page.
Search engine crawler programs obey HTML standards. They can only index your web site if it complies with the HTML standard. If there is a mistake in your web page code, they might stop crawling your web site, and they might lose what they have collected so far because of the error.
Although most major search engines can deal with minor errors, a single missing bracket in your HTML code can be the reason your web page cannot be found in search engines.
If you don't close some tags properly, or if some important tags are missing, search engines might ignore the complete content of that page.
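A toy illustration of why one missing bracket matters so much: many simple indexers strip out anything that looks like a tag before reading the text. With a naive tag-stripper (an assumption here -- real crawlers are more forgiving), a single missing '>' swallows visible content along with the markup:

    import re

    def naive_extract_text(html):
        """Strip anything that looks like a tag, the way a crude indexer might."""
        return re.sub(r"<[^>]*>", " ", html)

    good = '<h1>Best widgets in town</h1><p>Buy now.</p>'
    # The same page, but the <h1> start tag is missing its closing '>'.
    bad = '<h1 Best widgets in town</h1><p>Buy now.</p>'

    print(naive_extract_text(good))  # " Best widgets in town  Buy now. "
    print(naive_extract_text(bad))   # "  Buy now. " -- the heading text vanished

The broken start tag is only "closed" by the '>' of the end tag, so the whole heading is treated as markup and dropped.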
How can you check the validity of your HTML code? Fortunately, there are free services that do exactly that.
The search engine optimization community's standard HTML validator is the free W3C HTML Validator.
It checks HTML documents for conformance to the W3C HTML and XHTML recommendations and other HTML standards.
To correct the errors the W3C validator reports, enter the address of your web page at the NetMechanic invalid-HTML repair page, where you have the option of fixing the errors on that page automatically. The NetMechanic resource also has a demo evaluation mode that lets you apply the suggested alterations yourself.
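The W3C validator can also be scripted, which is handy if you have many pages to check. Here is a minimal sketch using only Python's standard library, assuming the W3C "Nu" checker's JSON interface (the endpoint and its doc/out parameters are an assumption to verify against the validator's current documentation):

    import json
    import urllib.parse
    import urllib.request

    def validate(url):
        """Ask the W3C checker to validate a page and print any errors."""
        api = ("https://validator.w3.org/nu/?out=json&doc="
               + urllib.parse.quote(url, safe=""))
        req = urllib.request.Request(api,
                                     headers={"User-Agent": "validity-check/0.1"})
        with urllib.request.urlopen(req) as resp:
            report = json.load(resp)
        errors = [m for m in report.get("messages", []) if m.get("type") == "error"]
        for m in errors:
            print(f"line {m.get('lastLine', '?')}: {m['message']}")
        print(f"{len(errors)} error(s) found")

    validate("http://example.com/")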
Although not all HTML errors will cause problems for your search engine rankings, some of them can keep web spiders from indexing your web pages and spoil your search engine optimization efforts.
Valid HTML code makes it easier for search engine spiders to index your site, so you should make sure that at least the biggest mistakes in your HTML code are corrected.
Run your web pages through the W3C validator and make the recommended alterations, and the new MSN.com beta search will love you: the MSN search engine places a high value on proper coding practice.
Entire article available at:
Marketing Defined: Valid HTML code crucial to SEO

This article may be reproduced in its entirety, with no alterations. The resource boxes, live URLs and Author Bio must be included.

Google's New SEO Rules by John Metzler



Google has recently made some pretty significant changes in its ranking algorithm. The latest update, dubbed by Google forum users as "Allegra", has left some web sites in the dust and catapulted others to top positions. Major updates like this can happen a few times a year at Google, which is why picking the right search engine optimization company can be the difference between online success and failure. However, it becomes an increasingly difficult decision when SEO firms themselves are suffering from the Allegra update.

Over-optimization may have played the biggest part in the dropping of seo-guy.com from the top 50 Google results. Filtering out web sites that have had readability sacrificed for optimization is a growing trend at Google. It started with the Sandbox Effect in late 2004, when relatively new sites were not being seen at all in the Google results even with good keyword placement in content and incoming links. Many thought it was a deliberate effort by Google to penalize sites that had SEO work done. A few months later, many of the 'sandboxed' web sites are finally ranking well for their targeted keywords.

With 44 occurrences of 'SEO' on the relatively short home page of seo-guy.com, and many of them in close proximity to each other, the content reads like a page designed for search engine robots, not the visitor. This ranking shift should come as no surprise to SEO professionals as people have been saying it for years now: Sites should be designed for visitors, not search engine robots. Alas, some of us don't listen and this is what happens when search engines finally make their move.
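
A quick way to spot that kind of over-optimization on your own pages is to count keyword occurrences against total words. A minimal sketch (the 5% threshold is an arbitrary sanity check of mine, not a known Google cutoff):

    import re

    def keyword_density(text, keyword):
        """Return (count, share) of a keyword within a block of page text."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        count = words.count(keyword.lower())
        return count, count / max(len(words), 1)

    page_text = "SEO tips from the SEO guy: our SEO services make SEO easy."
    count, share = keyword_density(page_text, "seo")
    print(f"'seo' appears {count} times ({share:.0%} of all words)")
    if share > 0.05:  # arbitrary threshold, not a published Google number
        print("Reads like it was written for robots -- consider rewriting.")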

One aspect of search engine optimization that is also affected, in a roundabout way, is link popularity development. After observing the effects of strictly relevant link exchanges on many of our clients' sites recently, we've noticed incredibly fast #1 rankings on Google. It seems Google may be looking out for links pages designed for the sole purpose of raising link popularity and devaluing the relevance of such sites. After all, if a links page on a real estate site has 100 outgoing links to pharmacy sites, there has to be a lot of content on that page completely unrelated to real estate. Not until now has that been so detrimental to a site's overall relevance to search terms. It goes back to the old rule of thumb: make your visitors the top priority. Create a resources page that actually contains useful links for your site users. If you need to do reciprocal linking, keep it relevant and work those sites in with other good resources.

Keeping up with the online search world can be overwhelming for the average small business owner or corporate marketing department. Constant Google changes, MSN coming on the scene in a big way, and all the hype around the new Become.com shopping search function can make heads spin. But just keep things simple and follow the main rules that have been around for years. Google, as well as other search engines, won't ever be able to ignore informative, well written content along with good quality votes from other web sites.
About the Author
John Metzler is the co-creator of Abalone Designs, Inc. www.abalone.ca, a Search Engine Optimization company in Vancouver, Canada. He has been involved in web design and web marketing since 1999 and has helped turn Abalone Designs into one of the top SEO companies in the world.