Thursday, April 06, 2006

Google Algorithm Problems

by Rodney Ringler

Have you noticed anything different with Google lately? The Webmaster community certainly has, and if recent talk on several search engine optimization (SEO) forums is an indicator, Webmasters are very frustrated. For approximately two years Google has introduced a series of algorithm and filter changes that have led to unpredictable search engine results, and many clean (non-spam) websites have been dropped from the rankings. Google updates used to be monthly, then quarterly. Now, with so many servers, several different sets of search results seem to be rolling through them at any given time. Part of this is the recent Big Daddy update, which is as much a Google infrastructure update as an algorithm update. We believe Big Daddy uses a 64-bit architecture. Pages seem to go from a first-page ranking to a spot on the 100th page, or worse yet, to the Supplemental index. Google's algorithm changes started in November 2003 with the Florida update, which now ranks as a legendary event in the Webmaster community. Then came updates named Austin, Brandy, Bourbon, and Jagger. Now we are dealing with Big Daddy!

The algorithm problems seem to fall into four categories: canonical issues, duplicate content issues, the Sandbox, and supplemental page issues.

1. Canonical Issues: These occur when a search engine treats www.yourdomain.com, yourdomain.com, and yourdomain.com/index.html as different websites. When Google does this, it flags the extra copies as duplicate content and penalizes them. Worse, if the copy left unpenalized is http://yourdomain.com but other websites link to you using www.yourdomain.com, then the version left in the index will have no ranking. These are basic issues that other major search engines, such as Yahoo and MSN, have no problem dealing with. Google is possibly the greatest search engine in the world (ranking themselves as a 10 on a scale of 1 to 10). They provide tremendous results for a wide range of topics, and yet they cannot get some basic indexing issues resolved.
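To see whether your own site is affected, a quick spot check helps. Below is a minimal sketch in Python (example.com is a placeholder for your domain) that requests the common URL variants and reports whether they all resolve to one canonical address:

```python
# Minimal sketch: check whether the common variants of a site's URL
# all resolve (after redirects) to a single canonical address.
# "example.com" is a placeholder domain, not a real site to test.
from urllib.request import urlopen

VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "http://www.example.com/index.html",
]

final_urls = set()
for url in VARIANTS:
    try:
        response = urlopen(url, timeout=10)
        final_urls.add(response.geturl())  # address after any redirects
    except OSError as err:
        print(f"{url}: request failed ({err})")

if len(final_urls) == 1:
    print("All variants resolve to one canonical URL:", final_urls.pop())
else:
    print("Potential canonical issue; variants resolve to:", final_urls)
```

If the script reports more than one final URL, a permanent (301) redirect to your preferred version is the usual fix.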

2. The Sandbox: This has become one of the legends of the search engine world. It appears that websites, or links to them, are "sandboxed" for a period before they are given full rank in the index, a kind of maturing time. Some even think it is applied only to a set of competitive keywords, because those were the ones being manipulated the most. The Sandbox's existence is debated, and Google has never officially confirmed it. The hypothesis behind the Sandbox is that Google knows someone cannot create a 100,000-page website overnight, so it has implemented a type of time penalty for new links and sites before they fully make it into the index.
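No one outside Google can confirm any of this, but the hypothesis is easy to picture as a dampening curve that grows with age. Purely as an illustration of the idea (the curve shape and the 270-day figure below are invented for the example, not Google numbers):

```python
import math

def sandbox_dampener(age_in_days, maturity_days=270):
    """Toy model of the hypothesized Sandbox: new sites and links
    count for little at first and approach full weight as they age.
    The 270-day maturity period is an arbitrary illustration."""
    return 1.0 - math.exp(-age_in_days / maturity_days)

for age in (7, 30, 90, 365, 730):
    print(f"{age:4d} days old -> {sandbox_dampener(age):.0%} of full link weight")
```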

3. Duplicate Content Issues: These have become a major issue on the Internet. Because web pages drive search engine rankings, black hat SEOs (search engine optimizers) started duplicating entire sites' content under their own domain names, thereby instantly producing a ton of web pages (an example would be downloading an encyclopedia onto your website). As a result of this abuse, Google aggressively attacked duplicate content abusers with its algorithm updates, but in the process knocked out many legitimate sites as collateral damage. One example occurs when someone scrapes your website. Google sees both sites and may determine the legitimate one to be the duplicate. About the only thing a Webmaster can do is track down the scraper sites as they appear and submit a spam report to Google. Another complication is that there are many legitimate uses of duplicate content. News feeds are the most obvious example: a news story is covered by many websites because it is content the viewers want. Any filter will inevitably catch some legitimate uses.
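Google has never said how its duplicate filter works, but a standard textbook way to measure near-duplication is to compare overlapping word "shingles" between two documents. A toy sketch of that idea:

```python
def shingles(text, k=3):
    """Break text into overlapping k-word 'shingles'."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of two documents' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "black hat SEOs started duplicating entire sites under their own domain"
scraped  = "black hat SEOs duplicated entire sites under their own domain name"
print(f"similarity: {similarity(original, scraped):.2f}")
```

A filter built on a score like this has exactly the problem described above: it can tell that two pages match, but not which one is the original.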

4. Supplemental Page Issues: Webmasters fondly refer to this as Supplemental Hell. This issue has been reported on places like WebmasterWorld for over a year, but a major shake-up around February 23rd has led to a huge outcry from the Webmaster community. This recent shake-up was part of the ongoing Big Daddy rollout that should finish this month. The issue is still unclear, but here is what we know. Google has two indexes: the Main index that you get when you search, and the Supplemental index that contains pages that are old, no longer active, have returned errors, and so on. The Supplemental index is a type of graveyard where web pages go when they are no longer deemed active. No one disputes the need for a Supplemental index. The problem, though, is that active, recent, and clean pages have been showing up in the Supplemental index. Like a dungeon, once they go in, they rarely come out. This had been reported at a low noise level for over a year, but the February upset has led to a lot of discussion around it, and no one can seem to find a common cause.

Google updates were once fairly predictable, with monthly updates that Webmasters anticipated with both joy and angst. Google followed a well-published algorithm that gave each website a PageRank, a number assigned to each web page based on the number and rank of the other pages pointing to it. When someone searches on a term, all of the web pages deemed relevant are then ordered by their PageRank.
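That published idea is compact enough to sketch in a few lines. Here is a toy PageRank iteration over a three-page web, using the 0.85 damping factor from the original PageRank paper (the tiny link graph is invented for illustration):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank over a small link graph.
    `links` maps each page to the pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages pass no rank in this sketch
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy web: A and B both link to C; C links back to A.
toy_web = {"A": ["C"], "B": ["C"], "C": ["A"]}
for page, score in sorted(pagerank(toy_web).items()):
    print(f"{page}: {score:.3f}")
```

C, with two incoming links, ends up with the highest score, which is the whole intuition behind link-based ranking.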

Google uses a number of factors, such as keyword density, page titles, meta tags, and header tags, to determine which pages are relevant. The original algorithm favored incoming links and their anchor text: the more links you got with a given anchor text, the better you ranked for that keyword. As Google gained the bulk of internet searches in the early part of the decade, ranking well in their engine became highly coveted. Add to this the release of Google's Adsense program, and it became very lucrative. If a website could rank high for a popular keyword, it could run Google ads under Adsense and split the revenue with Google!

This combination led to an avalanche of SEO'ing like the Webmaster world had never seen, and the whole nature of links between websites changed. Websites used to link to one another because the link was good information for their visitors. But now a link to another website could reduce your own rankings, and if it pointed to a competitor, it might boost his. In Google's algorithm, links coming into your website boost the site's PageRank (PR), while links from your pages to other sites reduce your PR. People started creating link farms, doing reciprocal link partnerships, and buying and selling links. Webmasters started linking to each other for mutual ranking help or money instead of quality content for their visitors. This also led to the wholesale scraping of websites: black hat SEOs will take the whole content of a website, put Google's ads on it, get a few high-powered incoming links, and the next thing you know they are ranking high in Google and generating Adsense revenue without providing any unique content.

Worse yet, as Google tries to go after this duplicate content, they sometimes get the real company instead of the scraper. This is all part of the cat and mouse game that has become the Google algorithm. Once Google realized the manipulation that was happening, they decided to aggressively alter their algorithms to prevent it. After all, their goal is to find the most relevant results for their searchers. At the same time, they also faced huge growth with the internet explosion. This has led to a period of unstable updates, causing many top ranking websites to disappear while many spam and scraped websites remain. In spite of Google's efforts, every change seems to catch more quality websites. Many spam sites and websites that violate Google's guidelines are caught, but there is an endless tide of more spam websites taking their place.

Some people might believe that this is not a problem. Google is there to provide the most relevant listings for what people are searching on, and for the most part the end user has not noticed an issue with Google's listings. If Google only drops thousands of listings out of millions, the results are still very good. These problems may not be affecting Google's bottom line now, but having a search engine that cannot evolve without producing unintended results will hurt them over time in several ways.

First, as the competition from MSN and Yahoo grows, having the best results will no longer be a given, and these drops in quality listings will hurt. Next, to stay competitive Google will need to keep changing its algorithms, which will be harder if it cannot make changes without producing unintended results. Finally, having the Webmaster community lose faith in them will make them vulnerable to competition. Webmasters provide Google with two things: they are the word-of-mouth experts, and they run the websites that use Google's Adsense program. Unlike with other monopolies, it is easy to switch search engines. People might criticize Webmasters for relying on a business model that requires free search engine traffic, but fluctuations in ranking are part of the internet business, and most Webmasters realize this. Webmasters are simply asking Google to fix bugs that cause unintended issues with their sites.

Most Webmasters may blame ranking losses on Google and its bugs, but the truth is that many Webmasters do violate some of the guidelines that Google lays out. Most consider it harmless to bend the rules a little, and assume this is not the reason their websites have issues. In some cases, though, Google is right and has just tweaked its algorithm in the right direction. Here is an example: Google seems to be watching the incoming links to your site to make sure they don't all use the same anchor text (the text used in the link on the website linking to you). If too many links use the same anchor text, Google discounts them. Some people originally did this to inflate their rankings; others did it simply because one anchor text usually makes sense. It is not really a black hat SEO trick, and it is not called out in Google's guidelines, but it has caused some websites to lose rank.
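If you have a backlink report, auditing your own anchor-text spread takes only a few lines. A quick sketch (the anchor list and the 50% warning threshold are made up for illustration):

```python
from collections import Counter

# Hypothetical backlink report: the anchor text of each inbound link.
inbound_anchors = [
    "blue widgets", "blue widgets", "blue widgets", "blue widgets",
    "blue widgets", "Acme Widget Co", "www.example.com", "click here",
]

counts = Counter(inbound_anchors)
total = len(inbound_anchors)
for anchor, n in counts.most_common():
    share = n / total
    flag = "  <-- suspiciously uniform?" if share > 0.5 else ""
    print(f"{share:5.0%}  {anchor}{flag}")
```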

Webmasters realize that Google needs to fight spam and black hat SEO manipulation. And to their credit, there is a Google engineer named Matt Cutts who has a blog and participates in SEO forums to assist Webmasters. But given the revenue impact that Google rankings have on companies, Webmasters would like to see even more communication around the known issues, and help with identifying future algorithm issues. No one expects Google to reveal its algorithm or what changes it is making. Rumor on the forum boards speculates that Google is currently looking at items like the age of the domain name, websites on the same IP, and frequency of fresh content. It would be nice from a Webmaster standpoint to be able to report potential bugs to Google and get a response. It is in Google's best interest to have a bug-free algorithm, which will in turn provide the best search engine results for everyone.

About The Author
Rodney Ringler is President of Advantage1 Web Services, Inc., which owns a network of Web Hosting Informational Websites including Hostchart.com, Resellerconnection.com, Foundhost.com and Resellerforums.com.

Tuesday, April 04, 2006

Can duplicate content influence your rankings?

Every few months, webmaster forums discuss whether search engines penalize duplicate content.
Duplicate content can happen if web pages publish the same articles, if different domains point to the same web space, or if webmasters steal the content of other pages. If two shops sell the same item and use similar shop systems, some product pages can also look like duplicated web pages.

Is there really a penalty for duplicate content?
There are many opinions in the discussion forums but there's no proof that search engines really penalize duplicate content.

If there really was a duplicate content filter, then many news web sites that publish AP or Reuters news would be banned from search engines. For example, you can find many web pages with exactly the same article, and all of them can be found on Google.
However, many people insist that a duplicate content filter exists.

Why do people think that there's a duplicate content penalty?

Some people think there is a duplicate content penalty because one web page carrying a particular article might have a Google PageRank of 0 while another page with the same article has a PageRank of 5.
Not all web pages with the same content get the same search engine rankings. If a web site is older than another, has better inbound links, and has more content, then it's likely to rank better than another page that lists the same article.

That doesn't mean that the web site with the worse ranking has been penalized. It just means that the other web site probably has more links and that the page is more trustworthy to search engines.

Some people think that there must be a duplicate content filter because additional domain names that point to the same web space as the main domain name are usually not listed on search engines.

This is not due to a duplicate content filter. It's an issue with canonical URLs. Google has addressed that problem with their latest ranking algorithm update.

Whether there is a duplicate content penalty or not has yet to be proved. If you want to outperform your competition on search engines, make sure that your web site has unique content that cannot be found on other sites.

If your web site has unique content, you don't have to worry about potential duplicate content penalties. Optimize that content for search engines and make sure that your web site has good inbound links.

It's hard to beat a web site with great optimized content and many good inbound links.

Wednesday, March 29, 2006

SEO For MSN

This is article one of a four-part series on optimizing your website for the "Big Three". Part two will focus on Yahoo!, part three will focus on Google, and part four will explain how to perform SEO on your website to attain high rankings across all three major engines. We are beginning with MSN because rankings are generally attained faster on this engine, which makes it a good place to begin, especially if you have a new site that is likely still in the sandbox on Google or are just at the beginning stages of link building.

Like all of the major search engines, MSN builds its index of sites using spiders to crawl the web finding new and changed information. This information is then processed by the MSN servers using complex algorithms to determine which sites are most relevant to the search query entered.

This may seem like an extraordinarily complex process, and it is; however, the resulting environment is simple: all search engine algorithms are mathematical, and thus there is a fixed set of rules and factors which, if addressed correctly, will result in a high ranking. In short, because it's math we have the benefit of knowing that if we take action x and action y we will get results.

The Rules For MSN

Assuming that you are following the right rules, the results you can achieve on MSN can be fast and solid. MSN does not apply the same types of aging delays that the other two engines do; thus, when you change your content, the change in results can be realized as quickly as they reindex your site and as quickly as your incoming links get picked up. This differs greatly from Google and Yahoo! in that those two engines age both domains and links, requiring a longer period of time before the full effects of your efforts are realized.

As an additional note on MSN, users of MSN are 48% more likely to purchase a product or service online than the average Internet user according to a comScore Media report.

So what are the rules for MSN that can help us get top rankings? As with all the major engines, there are two fundamental areas that need to be addressed to attain top rankings: the onsite factors and the offsite factors. Because they are fundamentally different we will address them separately.

Onsite SEO Factors

The problem with writing an article about the onsite factors is that, by the time many of you read this, some of the weight these factors hold and the optimal levels noted may well be out of date. Thus, rather than listing overly-specific-and-sure-to-change factors, we will focus on how to know what the factors are, how to get a handle on what you need to adjust and by how much, and how to predict what will be coming down the road. And so we'll begin:

How To Know What The Factors Are:

Unfortunately there's no one over at MSN Search calling us up weekly to let us know the specifics of their algorithm, so we have to figure it out for ourselves with research, reading and playing with test sites. From all of this there is only one conclusion that an SEO can make: the details matter. When we're discussing onsite factors this includes:

* the content of the page including keyword density
* the internal linking structure of the site (how the pages of your site are linked together)
* the number of pages in your site and the relevancy of those pages to your main topic and phrases
* the use of titles, heading tags and special formats

There are a number of lower-weight factors; however, the ones noted above, if addressed correctly, will have a very significant effect on your rankings, provided the offsite factors noted below are also addressed.

Page Content:

The content of your page must be perfect. What I mean by this is that the content must appeal to both your visitors and the search engine algorithms. In order to write properly for the visitors you must be able to write clearly and in language that is both appealing and understandable to your target market. While there is much debate about whether the keyword density of your page is important, I am certainly one who believes that it is. It only makes sense that a part of the algorithm takes into account the use of the keywords on your page. Unfortunately the optimal keyword density changes slightly with each algorithm update and also by site type and field. For this reason it would be virtually impossible for me to give you a density that will work today and forevermore. Instead, you will need a keyword density analysis tool, which you will want to run on your own site as well as the sites in the top 10 to assess what the optimal density is at this time. You may notice a variation in the densities of the top 10. This is due to the other factors, including offsite ones, which can give extra weight to even a poorly optimized site. I recommend getting your site to a keyword density close to the higher end of the top 10 but not excessive. Traditionally this percentage will fall somewhere near 3.5 to 4% for MSN.
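If you don't have a commercial tool handy, the underlying calculation is simple enough to script yourself. A minimal sketch (the sample copy is invented):

```python
import re

def keyword_density(text, phrase):
    """Words belonging to matches of `phrase` as a share of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    hits = sum(
        words[i:i + len(target)] == target
        for i in range(len(words) - len(target) + 1)
    )
    return (hits * len(target)) / len(words) if words else 0.0

page_copy = ("Our widget cleaning kit makes widget cleaning simple. "
             "Each kit ships with brushes, solvent, and a step-by-step "
             "guide so your widgets stay spotless all year round.")
print(f"{keyword_density(page_copy, 'widget cleaning'):.1%}")
```

Run the same function over your page and over the copy of the current top 10 to see where you sit.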

Internal Linking Structure:

The way your pages link together tells the search engines what each page is about and also allows them to easily (or not-so-easily) work their way to your internal pages. If your site has image- or script-based navigation, it is important to also use text links, either in your content, in a footer, or both. Text links are easy for a spider to follow and, perhaps more importantly, they allow you to tell the spiders what a specific page is about through the anchor text and, in the case of footers, let you add more instances of the targeted phrases outside of your general content area.
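A useful exercise is to look at your page the way a spider does: only plain text links surface, while image and script navigation contribute nothing. A small sketch using Python's standard HTML parser:

```python
from html.parser import HTMLParser

class TextLinkExtractor(HTMLParser):
    """Collect (href, anchor text) pairs that a spider can follow;
    image- or script-only navigation never shows up here."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "a":
            if self._href:
                self.links.append((self._href, " ".join(t for t in self._text if t)))
            self._href = None

parser = TextLinkExtractor()
parser.feed('<a href="/widgets.html">Blue widget catalogue</a>')
print(parser.links)  # [('/widgets.html', 'Blue widget catalogue')]
```

If a page's navigation is pure JavaScript, a parser like this comes back empty, which is roughly what the spider sees too.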

The Number Of Pages & Their Relevancy:

MSN wants to please its visitors. For this reason it wants to ensure the highest likelihood that a searcher will find what they need once they get to your site, so a larger site with unified content will rank higher than a smaller site or a site with varying content topics. (Note: this assumes that all else is equal in regards to the other ranking factors.)

When you are optimizing your site for MSN, be sure to take some time to build quality content. Do a search on your major competitors to see how large their sites are; over time you will want to build yours to the same range through general content creation or the addition of a blog or forum to your site.

Titles, Heading Tags & Special Formats:

Titles are the single most important piece of code on your entire web page, for two reasons. The first is that the title holds a very high level of weight in the algorithm. The second is that it is your window to the world: when someone runs a search, the results will generally show your page title. This means that a human visitor has to be drawn to click on your title, or ranking your site is a futile effort (this isn't about bragging rights, it's about return on investment).

Heading tags are used to specify significant portions of content. The most commonly used is the H1 tag, though there are obviously others (or they wouldn't bother numbering them, would they?). The H1 tag is given a significant amount of weight in the algorithm provided that it is not abused through overuse (it should only be used once per page). Try to keep your headings short and sweet. They're there to tell your visitor what the page is about, not your whole site.
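A quick self-audit for heading overuse is easy to script. A rough sketch (regex-based, so treat it as a spot check rather than a full HTML parser):

```python
import re

def heading_report(html):
    """Count h1-h6 tags; more than one h1 is flagged as likely overuse."""
    counts = {f"h{n}": len(re.findall(rf"<h{n}[\s>]", html, re.I))
              for n in range(1, 7)}
    if counts["h1"] > 1:
        print("Warning: multiple h1 tags on this page.")
    return counts

print(heading_report("<h1>Widgets</h1><h2>Cleaning</h2><h2>Storage</h2>"))
```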

Special formats are, for the purpose of this article, any text formatting that sets a group of characters or words apart from the others. This includes such things as anchor text, bold, italics, different font colors, etc. When you set content apart using special formats, MSN reads this as a part of your content that you want to draw attention to and which you obviously want your visitors to see, and it increases the weight of that content. Now don't go making all your keywords bold or the like; simply make sure to use special formats properly. Inline text links (links in the body content of your page) are a great way to increase the weight of specific text while actually helping your visitors by providing easy paths to pages they may be interested in.

Offsite SEO Factors

With MSN, the offsite factors are much simpler to deal with than with either Google or Yahoo! MSN will give you full credit for a link the day they pick it up, so link building, while time consuming, is rewarded much more quickly on MSN. When dealing with MSN and offsite SEO there are two main factors we must consider when finding links:

* Relevancy. The site must be relevant to yours to hold any real weight.
* Quality is better than quantity. Because PageRank is Google-specific, we can't use it as the grading tool for MSN; however, upon visiting a website it's generally fairly clear whether we're visiting a good site or not. Spending extra time to find quality is well rewarded. Also, finding one-way links as opposed to reciprocal links is becoming increasingly important, and I'd recommend utilizing both in your link building strategies.

You will have to begin your offsite optimization by running link checks on your competitors to see what you're up against. This is also a good place to start hunting for potential link partners, though those of you using a tool such as Total Optimizer Pro or PR Prowler will find those far faster and more effective.

Conclusion

This entire article may seem fairly simplistic, and there's a reason for that: what we've noted above is a list of the more important areas. However, to save you frustration (and me from receiving hundreds of emails a few months from now noting that the keyword densities don't work, etc.), I've tried to keep it general. Below you'll find a list of recommended resources. These are tools and SEO resources to help keep you updated and on top of the rankings.

Next week we will be covering Yahoo!

Resources
Total Optimizer Pro - A keyword density and backlink analysis tool. This tool breaks down a variety of onsite and offsite factors giving you a full snapshot of how the top 10 got their positions.
Microsoft Press Room - Read the latest press releases from Microsoft. This may not give you the algorithm but it will tell you the direction they're going. Understand this and you'll be better equipped to deal with changes down the road.
SearchEngineWatch's MSN Forums - Read the latest news, feedback and discussions on the SearchEngineWatch forums. A great way to keep updated, but beware: not everyone in there is offering a qualified opinion.

About This Author:
Dave Davies is the CEO of the SEO services firm Beanstalk Search Engine Positioning, Inc. Beanstalk provides guaranteed search engine positioning and SEO consulting services to clients from around the world. If you're more interested in doing it yourself, please visit our SEO news blog (http://www.beanstalk-inc.com/blog/) to keep updated on the latest goings-on in the search engine world.

Sunday, March 19, 2006

Rant and Rave About Google

The Search

I recently purchased a book called "The Search" by John Battelle that explores how Google and its rivals rewrote the rules of business and transformed our culture. What really caught my attention was Chapter 7, titled The Search Economy. It relates how a small e-commerce store got yanked by the short hairs when an unexpected Google algorithm change in 2003 virtually wiped out its business.

Google did it again in late 2005, and will no doubt do it again and again and again. Rumor has it that it will happen again in March 2006.

Google Giveth, Google Taketh Away

Those of you who are webmasters already know how this happens. Some eager beaver group of Google engineers laid waste to thousands of mom and pop businesses by tweaking Google’s indexing algorithm. These businesses depended on their Google listings for their income and livelihoods. Google giveth and Google taketh away.

Knowing that the Google paradigm will always change puts you ahead of the pack; not putting all your marketing eggs in one basket will keep you there.
Your unpaid or "organic" rankings in search engines are free. But how many times have you heard the axiom "there's no such thing as a free lunch"?

Starting to get the picture now?

Google Still Likes Links

I've been getting tons of automated requests for two-way and three-way linking. I can't believe what these people are thinking. The rules are displayed in black and white on Google's web site. Allow me to paraphrase: "Build pages for users, not search engines."

Here's a typical email I get every day:

Hello Sir/Madam,I'm mailing you for exchanging three-way link with your site. Though we are accustomed with reciprocal link exchange, the fact is that three-way link is always better than reciprocal link exchange, as all search engines give more attention to three-way links.When search engines can't trace a link back from one site to another it thinks that site is very important so other site is linking to it just like we use google or yahoo in our website.

Do these bozos really believe this?

Fact of the matter is, you should link to a website that you believe will be of value to your website's viewers. That's it. No schemes, no tricks, no 2, 3, 4, 5 or 6 way linking. Provide your visitors good content and good links. Period.

Vertical Channels and Directories

Web sites like Global Spec (for engineers) and Find Law (for lawyers) are quasi-vertical advertising channels and global directories. Vertical marketing is a great way to target business in the same genre in which you practice and participate. It's also a great way for regional companies to inexpensively obtain entry into new markets or regions.

For web guys like myself, there are similar verticals or directories such as Marketing Tool, Xemion, and Top Web Designers. But here’s the rub. These directories have an unfair advantage in the linking scheme of Google. But who said life is fair.

Directories have "muddied the Internet waters." With so many links from so many sites across so many states, it is the equivalent of being a 500-pound linking gorilla. I keep hoping that the next Google dance will place these linking monsters accordingly, but that has yet to happen. Once everyone catches onto this flawed "link ranking" scheme, a search for any term or phrase in Google will provide you nothing but a page full of directories.

Pure Search

This leads to my next point. Say I have an xyz disease (God forbid). If I search "xyz disease nutrition" in Google, I want to find web sites about xyz disease nutrition, or battling xyz disease with proper diet and so on. I don’t want to see a directory full of re-packaged information, filled with ads, newsletters and other useless directory fluff.

I want my search engine to emulate the Library of Congress. Let's say the librarian says, "Books on xyz disease nutrition are located on aisle 700b, row 3." I stroll over to 700b, row 3, and pull out a book that systematically lists all the books on 700b, row 3. Wait a second! This isn't a book about xyz disease nutrition, it's a "directory" or reference book that belongs in the "reference section" of the library.

Are you listening Google?

Maybe Mr. Gates is.

Or better yet, maybe Mr. Jobs has a trick or two left up his sleeve. iPod.com today, iSearch.com tomorrow.

Wednesday, February 15, 2006

The Dark Side Of Google

By Christine Stander
The way search marketers dream up conspiracy theories, you'd think we're all paranoid with nothing better to do.

Is there a true reason for concern? I think not, but reading other people's paranoia is always entertaining. We all know search engines are "out to get webmasters". They have nothing better to do than to think of new ways to infringe on websites' rankings or play hide and seek with site PageRank.
Google is at the forefront of the theorists' attention. And it's not very hard to see why.

It's Tough Being At The Top

Google's market share is certainly growing. It handled 60 percent of Internet search queries in November 2005, up from 47 percent a year earlier, according to ComScore Networks. Google's chief officers have expressed that they are committed to growing the company itself in a sustainable way.

Quoting CFO George Reyes: "Google would be spending more on research and development, and will invest heavily in its computing infrastructure."

Google's motto "do no evil" has been analyzed and debated many times. Forum posts and articles are always met with "Google does this" or "Google does that". The fact of the matter is that none of us knows what Google's intentions are, except Google themselves of course... but it's still nice to enter the guessing game to see exactly how close, or how far off, you are.

Enter the Conspiracies

Everyone has their opinion on the matter - which makes for entertaining reading at least.

Jagger Update


The conspiracy: Google is out to destroy all the organic listings so that everyone will move over to PPC.

The real deal: Google updates its algorithm from time to time to make search results more relevant. Each update is usually given a name by the SEO community - somewhat like naming hurricanes. The most recent update was called "Jagger". Many scraper directory sites, and sites that bought links from them, were removed in the update.

If you had made use of any shady techniques, it is most likely that your site was caught in Jagger. It was quite a harsh update if you had not employed solid SEO techniques, so needless to say there are a lot of angry webmasters out there. A good example is the German BMW site (bmw.de), which was recently removed for making use of spam techniques. Just goes to show SEO is SEO no matter what the language.

Google Adsense

The conspiracy: Google Adsense sites get priority in rankings so that Google can make more money. Google is also trying to take dominance and force webmasters to use Adsense rather than outbound links (link building).

The real deal: If this were true, then regardless of how hard Google tried, they couldn't force a greater number of people to Adwords by preventing them from achieving a favorable ranking.

Besides, when Adwords was first released, several SEOs tested this theory by purchasing paid listings over varying lengths of time. The results? There was absolutely no correlation between purchasing an Adwords account and your organic search ranking.

IP Recording / Privacy Infringement

The conspiracy: Search engines log IP addresses. The data collected can be used against you.

The real deal: There have been many theories that Google logs searchers' IP addresses to track their search behaviour, but the situation has gotten much bigger than that. With all the hype stemming from the Department of Justice requesting logs from the Big Shots of search to see what searches were conducted, the talk has shifted to the legal implications should the court find in favour of the government.

Every bit of network traffic you use is marked with your IP address; it can be used to link all of those disparate transactions together.
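To see why that worries people, consider how trivially a log groups searches together once every line carries the same address. A toy demonstration (the log lines are invented):

```python
from collections import defaultdict

# Hypothetical search-log entries: (ip_address, query)
log = [
    ("198.51.100.23", "cheap flights"),
    ("203.0.113.7", "xyz disease nutrition"),
    ("198.51.100.23", "divorce lawyer"),
    ("198.51.100.23", "symptoms of xyz disease"),
]

profiles = defaultdict(list)
for ip, query in log:
    profiles[ip].append(query)

# Disparate searches collapse into one per-address profile.
for ip, queries in profiles.items():
    print(ip, "->", queries)
```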

Filtering Results

The conspiracy: If Google can filter the results for China, what stops them from filtering the rest of the world?

The real deal: Well, this is still very much a hot topic at the moment and I have not really made up my mind on this one quite yet. I can only refer to the Google "Human Rights Caucus Briefing" on their blog.
Excerpt from blog: "In deciding how best to approach the Chinese - or any - market, we must balance our commitments to satisfy the interests of users, expand access to information, and respond to local conditions. Our strategy for doing business in China seeks to achieve that balance through improved disclosure, targeting of services, and local investment."

And "In order to operate Google.cn as a website in China, Google is required to remove some sensitive information from our search results. These restrictions are imposed by Chinese laws, regulatïons, and policies. However, when we remove content from Google.cn, we disclose that fact to our users."

This is nothing new; in fact Google has altered their search results to comply with local laws in France, Germany, and the United States previously. Also, is it not better to have censored information than none at all? At least this way Google has a starting point from which to fight the censorship.

Do No Evil
According to Larry Page: "Google's goal is to provide a much higher level of service to all those who seek information, whether they're at a desk in Boston, driving through Bonn, or strolling in Bangkok."

The Google philosophy:
1. Focus on the user and all else will follow
2. It's best to do one thing really, really well
3. Fast is better than slow
4. Democracy on the web works
5. You don't need to be at your desk to need an answer
6. You can make money without doing evil
7. There is always more information out there
8. The need for information crosses all borders
9. You can be serious without a suit
10. Great just isn't good enough

Excerpt from site: Full-disclosure update: When we first wrote these "10 things" four years ago, we included the phrase "Google does not do horoscopes, financial advice or chat." Over time we've expanded our view of the range of services we can offer -- web search, for instance, isn't the only way for people to access or use information - and products that then seemed unlikely are now key aspects of our portfolio. This doesn't mean we've changed our core mission; just that the farther we travel toward achieving it, the more those blurry objects on the horizon come into sharper focus (to be replaced, of course, by more blurry objects).

Some psychologists say that the closer one becomes to a person (or something) the harder it is to see the good stuff. Has Google become so intertwined in our daily lives that we no longer recognize the good stuff that it has brought us?

Let me remind you of a few:

1. Relevant Search Results: A source to find information faster. Every update gets rid of the "clutter".

2. Gmail: As far as free web based email goes, this must be the most user-friendly with the largest amount of storage space to boot. You can also tie in any other email accounts you may hold and use Google's interface as the "one stop shop" so to speak.

3. Gtalk: Google's free IM and Voice Chat service. Now also tying in with your Gmail interface. This means that it's accessible from wherever you have internet - you don't need to have the program installed on the machine that you're working from.

4. Leader of other search engines: There is no doubt that Google is at the forefront of "great new ideas" for search engines. Google leads and the rest follow. One example is Gmail - more storage space for free. Yahoo! was soon to follow with a similarly sized email account for Yahoo! Mail users at no cost. MSN, however, charges for an increased mailbox.

5. Google Earth: Geographic information at your fingertips. Get driving directions and location information for just about anywhere on the globe, and because they use satellite imagery intertwined with maps you get a pretty good idea of what any place looks like.

6. Google Video: A selection of homemade clips, TV shows, movies and viral clips *freely available on the net. (*some TV shows and movies need to be purchased of course)

7. Google Alerts: Need to know when someone has mentioned you, your company or any topic of interest to you on their website? With Google Alerts you are notified *as it happens. (*as Google spiders that site)

These are but a few of the things that Google has brought into our lives, so to speak.

So ask yourself again - is there really any concern for their progress, or are we benefiting from it at the end of the day?

Forget About It

It's a typical situation where a good company gets too big and people start getting a little uncomfortable about its dominance in society.

So I say forget about all the clutter and focus on the good stuff, of which 2006 will no doubt bring plenty: many new innovations, and a whole bunch of new conspiracy theories.

About The Author
Christine Stander is a professional search engine optimization and online marketing strategist with experience in many facets of search marketing, user behaviour analysis and brand management. For more information please refer to: http://www.altersage.com.

Wednesday, February 08, 2006

Google Big Daddy SearchQuake

Running ranking reports for clients is a standard part of an SEO's job. This week I created a position report for a client - one for which we'd made significant gains in ranking for their targeted search phrase - and proudly sent off the report to them before a scheduled conference call to discuss our progress and status.

The client sent an email upon receiving the report saying "There is something wrong with your report - we rank higher than this report claims." I went back to Google and typed in the search phrases to find rankings exactly where the report showed them the previous day.
I explained to that client that Google has (at last count) nine data centers which serve up search results, and that he was getting results from a data center in the Eastern US which differed from the results shown to us here in California.

"Take a look at this link where Google datacenter IP addrresses are listed in detail."

http://www.webworkshop.net/seoforum/viewtopic.php?t=548

"Here is an overview of a coming update to all Google datacenters expected in February or March of 2006."

http://directmag.com/searchline/1-25-06-Google-BigDaddy/

"So you ARE ranking better from your area of the country and that particular data center which returns results to you. Things usually update to match in all data centers, but sometimes you may do better in one data center than in others. If you search from each individual IP address in that list discussed in the forum linked above, you'll see different rankings and may find datacenters where you rank at the bottom of page two of results."

You might also search from that new "Big Daddy" data center referenced in that article above, which discusses upcoming Google ranking algorithm changes due soon.

http://66.249.93.104

Where I'm seeing you ranked at #17 (bottom of page two).

It's a measure of where you might expect to be when Google moves to that new algorithm for all data centers in February or March. (Of course we continue to work to achieve better results before then.)

This upcoming change in algorithm and the interestingly named server "Big Daddy" were publicly posted on Matt Cutts' blog for beta testing by SEOs (and other Google watchers) who read him regularly. (For those who don't know, Cutts is a software engineer at Google and shares SEO tips on his blog.)

http://www.mattcutts.com/blog/

Of course this news was a bit much for the client to digest in one chunk, and he had little time to read the articles I referenced in my note above, but it was enough to assure him that I knew what I was talking about and to explain the differences between my report and his own keyword searches at his end of the country. It's a bit odd to try to explain to a client that "there are different Googles." Few know or understand this.

Another issue cropped up later in the day when I was doing further research for a different client and found, while we were speaking on the phone, that his results differed from my own on specific query operator searches. We were using the "site:businessdomain.com" query operator and the "allinurl:pick-your-own-URL" query operator to limit search results and got vastly different numbers of results and rankings for the same searches.

The first stunning thing in this example was that we are less than 25 miles apart in Southern California. The second shocker was that I tried simply hitting the "Search" button a second time after getting the first results page and things changed again! All of this happening in a single day makes me believe that some percolating of results is going on as Google eases into an algorithm change.
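If you want to run the same kind of spot check yourself, comparing what two data centers return for one query can be as simple as hashing the result pages. A rough sketch (this assumes the datacenter IPs still answer plain /search?q= requests as they did at the time; the second IP below is purely illustrative):

```python
import hashlib
from urllib.request import urlopen

QUERY = "/search?q=blue+widgets"
DATACENTERS = [
    "http://66.249.93.104",  # the "Big Daddy" data center noted above
    "http://66.102.7.104",   # illustrative second data center IP
]

for host in DATACENTERS:
    try:
        html = urlopen(host + QUERY, timeout=10).read()
        # Identical hashes mean the two data centers served the same
        # page; different hashes mean the results are diverging.
        print(host, hashlib.md5(html).hexdigest()[:12])
    except OSError as err:
        print(host, "request failed:", err)
```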

Perhaps this is not all that unusual, but in seven years of this work, I've not seen the volatility noted in January of 2006. Are we about to have a major SearchQuake? Is Google about to split the earth and spew volcanic new results? Stand by for the BigDaddy SearchQuake sometime this month or next.

About The Author
Mike Banks Valentine blogs on Search Engine developments from http://RealitySEO.com and can be contacted for SEO work at: http://www.seoptimism.com/SEO_Contact.htm. He operates a free web content distribution site at: http://Publish101.com

Thursday, February 02, 2006

The Google Conspiracy Theory

By Mark Daoust

In December, I published an article on the effects of purchasing links for pagerank. Much to my surprise, I got quite a bit of feedback – most of which was negative. The feedback echoed a sentiment that I have seen from more than one person involved in the SEO industry: that Google is happily manipulating the entire SEO and webmaster community for its own profitable gain. The whole idea seemed like a conspiracy.

I generally do not like conspiracies.

What Was Said

The article on purchasing links for page rank was supposed to look simply at whether link buying was a good practice for website owners. The conclusion I reached and tried to prove was that any website owner who wants to take a long-term approach to SEO should avoid buying links. The primary reason behind this conclusion is Google and Yahoo's adamant stance against purchasing links for search engine gain. Although several website owners are currently purchasing links and seeing a positive effect, this does not mean that Google is not actively trying to detect purchased links and devalue them. Website owners who may be successful now with this strategy may find themselves with a not-so-successful ranking if Google detects that their ranking is the result of purchased links.

The responses I received against Google were numerous. However, the idea that Google was trying to make SEO more difficult by discouraging link exchanges and link purchasing for website owners in order to force more people into their Adsense program was a theme that ran throughout all the responses.

So is this true? Is Google looking to undermine the honest efforts of honest webmasters who are just looking for a decent ranking in the world's most popular search engine? Did the Googleplex devise a grand and sinister plan to force open the wallets of small business owners?

If Google Is Against Link Buying, Then Why Do They Sell Links Through Adwords?

Jim Tarabocchia of Just Binoculars was quick to point out that it would be hypocritical of Google to tell website owners not to purchase text links. After all, as Jim put it, "if this is the case, why does Google sell Adwords?"

This is a good point.

It is obvious that Google believes in the power of link advertising – it represents the largest share of Google's revenue. If Google were indeed against text link advertising, there would be only one conclusion that we could draw: Google does not like text link advertising because they want to be the only ones to sell text links. Therefore, Google is using the power of their network and the desire that every website owner has to get a top ranking in Google to get more people to buy Adwords, and force any text-link competition out of business.

The problem with this conclusion is that Google is not penalizing websites for text link advertising if it is done in a certain manner. I will concede that Google probably does want to gain as much market share as possible in the text link advertising industry, but so does every other text link ad network. This does not mean with any certainty that Google is changing their SEO requirements to eliminate the text link advertising industry.

In fact, one could even argue that Google has protected the industry. The introduction of the "nofollow" attribute found its birth in a need to curb blog comment spam. Whenever a link carries this attribute, Google and a few other search engines will not pass on any pagerank to the site being linked to. This has served as a way for website owners to sell text links as advertisements without being mistaken as participating in a program to artificially raise a website's ranking in the search engine.
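Checking which of a page's links pass PageRank and which are discounted by the nofollow attribute is straightforward. A small sketch using Python's standard HTML parser:

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Sort a page's links into those that pass PageRank and those
    marked rel="nofollow", which search engines discount."""
    def __init__(self):
        super().__init__()
        self.passing, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href")
        if href is None:
            return
        if "nofollow" in (a.get("rel") or "").lower():
            self.nofollow.append(href)
        else:
            self.passing.append(href)

audit = NofollowAudit()
audit.feed('<a href="/about.html">About</a>'
           '<a rel="nofollow" href="http://sponsor.example/">Sponsor</a>')
print("passes PageRank:", audit.passing)
print("nofollow (discounted):", audit.nofollow)
```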

Google is not against text link advertising – they are against purchasing text links for the purpose of manipulating your search engine rankings. It is these purchased links that they are trying to detect and that their engineers have warned webmasters about.

The Argument Against Google: You Have No Choice But Adwords

Jim continued with his points in a follow up email:

"In my opinion, Google does not want this done because sites that begin to rank well no longer need to purchase text links through adwords or adsense. This means less revenue for Google. Let's face it, in order to receive traffïc through the engines you need to rank well, if you don't then your only other option is to purchase your position through adwords or adsense. And in my opinion if you are buying adwords then it is EXACTLY the same thing if you were to buy text links from someone else to get your PR to boost up and achieve better results in the engines."

Jim responded directly to my defense of Google. As I stated in the previous section, Google is not against purchasing text links for advertising purposes, they are only against purchasing links for the purpose of getting a top ranking. Jim makes the point that buying links for pagerank to get a top ranking is essentially the same thing as buying a top ranking through Adwords.

The problem is that it is not the same thing. The first problem with this idea is that it treats natural rankings as equal in value to paid listings. Paid listings have shown time and time again that they are not nearly as effective as an organic ranking. Users are much more likely to trust a website if they find it through an organic listing.

Yet Jim is not alone in his point. Many website owners believe that Google wants to keep sites from ranking well in order to turn them to Adwords. Bruce from A1 Web Design had this to say:

"How on earth does a new website online get ranked? Mmmmm... PPC and Adsense! Nöw there's a good topic... Google frowns upon links but has created its own linking affïliate scheme!"

The idea that Google wants every website owner to participate in Adwords is not a new idea, and it probably is not far from the truth. After all, I don't know any business that would turn down an offer to sell their product to every person in their industry. But the idea that Google is somehow trying to force website owners into purchasing an Adwords campaign puts Google into a sinister light.

Google Cannot Prevent Websites From Ranking Well

There are only a limited number of websites that will achieve a top ranking for any keyword. In fact, we know exactly how many websites will receive a top ranking: on the first page there will be 10 ranked websites, on the second page there will also be 10 websites, and so on. For any given keyword, regardless of how many people are competing for it, there is a fixed number of top rankings available.

Both Jim and Bruce implied that Google wants to keep websites out of the top 10 to force them to Adwords. The problem with this is that Google cannot keep websites out of the top 10. There will always be at least 10 websites ranked in the top 10, as well as 10 more in the top 20, and so on. It does not matter if Google discounts link exchanges or purchased links, or if they turn traditional SEO practices on their head; they are still forced to rank a fixed number of websites well for any given keyword. Regardless of how hard they try, they cannot force a greater number of people to Adwords by preventing the achievement of a favorable ranking.

The theory that Google's organic listings and paid listings are linked in some way is not a new theory. In this scenario, we can see that it is an impossibility for Google to turn more people to paid listings by making organic listings more difficult to attain. Regardless of how difficult they make the ranking criteria, we will always have a fixed number of websites that rank well.

Does Google Reward Adsense Users With Favorable Rankings?

Another conspiracy theory that seems to have a lot of believers is that Google somehow rewards its Adsense advertisers or even publishers with more favorable rankings. That is, if you spend a regular amount of money on paid listings, Google will then treat you more favorably in their natural search results. The theory would make Pavlov's dog drool.

Once again, however, we have a problem. This time we simply fail to see any empirical evidence to support this theory. When Adwords was first released, several SEOs tested this theory by purchasing paid listings over varying lengths of time. The results? There was absolutely no correlation between purchasing an Adwords account and your organic search ranking.

Back to What Was Said

So in the article that spawned this mini-debate, I came to the conclusion that purchasing links for the purpose of attaining a better organic listing in the search engines was not a good idea. The reason it was not a good idea is that the search engines do not like purchased links. The criticism of this article seemed to want to establish a link between Adwords and Google's organic listings – that somehow Google was trying to encourage more users to use Adwords rather than aspiring after an organic listing.

But we do not see any evidence that Adwords and Google's organic listings are linked in any way. In fact, it is fairly well known that Google has separated their Adwords department entirely from their organic search listing department in an effort to keep the two from influencing each other.

So if Google is not going after link purchasers for their own personal financial gain, why are they so much against link purchasing and even some forms of link exchanges? This is the question that I tried to answer in the last article. Evidently I did not answer it as well as I could have, but you may want to go back and read it.

If I were to summarize the article, however, I would simply say that Google discourages purchasing links for the purpose of getting a higher pagerank as well as exchanging links only for the purposes of pagerank because it is usually done as an attempt to manipulate their rankings.

So What Should You Do?

So if purchasing text links for pagerank is not a good idea, and since it seems as if Google is now trying to devalue links that are a part of a planned link exchange program, what should website owners do? What is the plan to get a top ranking?

You should still try to get inbound links to your website. You can even do so through link exchanges, although you should try to do so as naturally as possible. What does this mean? It means only linking to sites that are of value to your visitors, and being willing to link to a website without a link in return. It means getting rid of that enormous directory on your website that leads to tens or hundreds of websites that are really only there for the sake of getting a higher pagerank. It means that you should also engage in activities outside of direct SEO that could garner you free links. Press releases and news stories as well as writing exclusive articles are all powerful ways to get free links without having to do anything in return.

Whether you agree or disagree with Google's approach to link exchanges, if for no other reason than for the sake of your users, you should always approach link exchanges as a way to offer more value to your users. What you will find when you take this approach is that your traffic will increase more than any link exchange program can bring, and your search engine rankings will increase as well.

About The Author
Mark Daoust is the owner of Site Reference. If you want to reference this article, please reference it at its original published location.

Wednesday, January 18, 2006

15 Shades of SEO Spam

Spam, in almost any form, is somehow bad for your health. The vast majority of web users would agree with that statement, and nobody would even think of the finely processed luncheon meat-product made by Hormel. Even the word itself is infectious in all the worst ways, being used to describe the dark and often deceptive side of everything from email marketing to abusive forum behaviour. In the search engine optimization field, Spam is used to describe techniques and tactics thought to be banned by search engines or to be unethical business practices.

While writing copy for our soon-to-be-revised website, the team put together a short list of the most outrageous forms of Spam we had seen in the last year, with a short explanation of each technique. Please note, we do not encourage, endorse or suggest the use of any of the techniques listed here.

We don't use them and our clients' sites continue to rank well at Google, Yahoo, MSN and Ask. It is also worth noting Google has been the dominant search engine for almost five years. Most of the spammy tricks evolved in order to game Google and might not apply to the other engines.

1. Cloaking
Also known as "stealth(ing)", cloaking is a technique that involves serving or feeding one set of information to known search engine spiders or agents while displaying a different set of information on documents viewed by general visitors. While there are unique situations in which the use of cloaking might be considered ethical in the day-to-day practice of SEO, cloaking is never required. This is especially true after the Jagger algorithm update at Google, which uses document and link histories as important ranking factors.

2. IP Delivery
IP delivery is a simple form of cloaking in which a unique set of information is served based on the IP number the info-query originated from. IP addresses known to be search engine based are served one set of information while unrecognized IP addresses, (assumed to be live-visitors) are served another.
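To make the mechanics concrete (and only to explain the trick, not to recommend it), IP delivery boils down to a branch on the requesting address. A toy sketch; the spider prefix below is illustrative:

```python
# Toy illustration of IP delivery, shown to explain the technique,
# not to endorse it. The spider IP prefix is an invented example.
KNOWN_SPIDER_PREFIXES = ("66.249.",)

def serve_page(client_ip):
    if client_ip.startswith(KNOWN_SPIDER_PREFIXES):
        return "<html>keyword-stuffed copy shown only to spiders</html>"
    return "<html>the page real visitors actually see</html>"

print(serve_page("66.249.93.104"))  # a recognized spider gets one version
print(serve_page("203.0.113.7"))    # everyone else gets another
```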

3. Leader Pages
Leader pages are a series of similar documents each designed to meet requirements of different search engine algorithms. This is one of the original SEO tricks dating back to the earliest days of search when there were almost a dozen leading search engines sorting less than a billion documents. It is considered SPAM by the major search engines as they see multiple incidents of what is virtually the same document. Aside from that, the technique is no longer practical as search engines consider a far wider range of factors than the arrangement or density of keywords found in unique documents.

4. Mini-Site networks
Designed to exploit a critical vulnerability in early versions of Google's PageRank algorithm, mini-site networks were very much like leader pages except they tended to be much bigger. The establishment of a mini-site network involved the creation of several topic or product related sites all linking back to a central sales site. Each mini-site would have its own keyword enriched URL and be designed to meet specific requirements of each major search engine. Often they could be enlarged by adding information from leader pages. By weaving webs of links between mini-sites, an artificial link-density was created that could heavily influence Google's perception of the importance of the main site.
In the summer of 2004, Google penalized several prominent SEO and SEM firms for using this technique by banning their entire client lists.
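
To see why the early PageRank formula was vulnerable, consider a toy calculation. The sketch below runs textbook power-iteration PageRank (the published formula, not Google's production code) over two invented four-page graphs; the only difference is whether the three satellite pages point their links at the main site.

def pagerank(links, d=0.85, iters=50):
    # links: dict mapping each page to the list of pages it links to
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += d * rank[p] / len(outs)
        rank = new
    return rank

# Four pages either way; only the link targets differ.
plain = {"main": ["a"], "a": ["b"], "b": ["c"], "c": ["a"]}
ring = {"main": ["a"], "a": ["b", "main"],
        "b": ["c", "main"], "c": ["a", "main"]}
print(round(pagerank(plain)["main"], 3))  # ~0.038, nobody links to main
print(round(pagerank(ring)["main"], 3))   # ~0.325, the mini-site ring inflates main

The mini-site network is exactly the second graph built out at scale, with each satellite dressed up as an independent topical site.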

5. Link Farms
Link farms emerged as free-for-all link depositories when webmasters learned how heavily incoming links influenced Google. Google, in turn, quickly devalued and eventually eliminated the PR value it assigned to pages with an inordinate collection or number of links. Nevertheless, link farms persist as uninformed webmasters and unethical SEO firms continue to use them.

6. Blog and/or Forum Spam
Blogs and forums are amazing and essential communication technologies, both of which are used heavily in the daily conduct of our business. As with other Internet-based media, blog and forum posts are easily and often proliferated. In some cases, blogs and certain forums have also established high PR values for their documents. These two factors make them targets of unethical SEOs looking for high-PR links back to their websites or those of their clients. Google in particular has clamped down on blog and forum abuse.

7. Keyword Stuffing
At one time, search engines were limited to sorting and ranking sites based on the number of keywords found in their documents. That limitation led webmasters to put keywords everywhere they possibly could. When Google emerged and incoming links became a factor, some even went so far as to stuff keywords into anchor text.
The most common continuing example of keyword stuffing can be found near the bottom of far too many sites in circulation.
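
For a sense of the metric those early engines leaned on, here is a small keyword-density calculator in Python; the sample text is invented, and any notion of a "spam threshold" is ours for illustration, not a published engine rule.

import re

def keyword_density(text, keyword):
    # Fraction of the words in `text` that are `keyword`, case-insensitive.
    words = re.findall(r"[a-z']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

stuffed = "cheap widgets cheap widgets buy cheap widgets cheap widgets now"
print(f"{keyword_density(stuffed, 'cheap'):.0%}")  # 40%, an obvious stuff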

8. Hidden Text
It is amazing that, as evidenced by the number of sites we find it on, a lot of webmasters and SEOs still use hidden text as a technique. They shouldn't.
There are two types of hidden text. The first is text that is coloured the same shade as the background thus rendering it invisible to human visitors but not to search spiders. The second is text that is hidden behind images or under document layers. Search engines tend to dislike both forms and have been known to devalue documents containing incidents of hidden text.
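
For a crude sense of how the first kind might be caught, the Python sketch below flags inline styles whose text colour matches an assumed page background. Real detection is far more involved (external CSS, layers, images); this only scans inline style attributes as an illustration.

import re

def find_same_colour_text(html, background="#ffffff"):
    # Collect inline "color:" values that match the assumed background.
    pattern = re.compile(r'style="[^"]*color:\s*([^;"]+)[^"]*"', re.I)
    return [m.group(1).strip() for m in pattern.finditer(html)
            if m.group(1).strip().lower() == background.lower()]

page = '<p style="color: #ffffff">widgets widgets widgets</p>'
print(find_same_colour_text(page))  # ['#ffffff'], text matches the background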

9. Useless Meta Tags
Most meta tags are absolutely useless. The unethical part is that some SEO firms actually charge for the creation and insertion of meta tags. There seems to be a meta tag for virtually every possible factor, but for the most part they are not considered by search spiders.
In general, StepForth only uses the description and keywords meta tags (though we are dubious about the actual value of the keywords tag), along with relevant robots.txt files. All other identifying or clarifying information should be visible on a contact page or included in the footers of each page.

10. Misuse of Directories
Directories, unlike other search indexes, tend to be sorted by human hands. Search engines traditionally gave links from directories a bit of extra weight by considering them links from trusted authorities. A practice of spamming directories emerged as some SEOs and webmasters hunted for valuable links to improve their rankings. Search engines have since tended to devalue links from most directories. Some SEOs continue to charge directory submission fees.

11. Hidden Tags
There are a number of different sorts of tags used by browsers or website designers to perform a variety of functions: comment tags, style tags, alt tags, noframes tags, and http-equiv tags. For example, the alt tag is used by screen readers for the blind to describe visual images. Inserting keywords into these tags was a technique used by a number of SEOs in previous years. Though some continue to misuse these tags, the practice overall appears to be receding.

12. Organic Site Submissions
One of the most unethical things a service-based business can do is charge clients for a service they don't really need. Charging for, or even claiming to perform, submissions to the major search engines is an example. Search engine spiders are advanced enough that they no longer require site submissions to find information; search spiders find new documents by following links. Site submission services or SEO firms that charge clients a single penny for submission to Google, Yahoo, MSN or Ask Jeeves are radically and unethically overcharging those clients.

13. Email Spam
Placing a URL inside a "call-to-action" email continues to be a widely used form of search marketing spam. With the advent of desktop search appliances, email spam has actually increased. StepForth does not use email to promote your website in any way.

14. Redirect Spam
There are several ways to use redirects to fool a search engine or even hijack traffic destined for another website. Whether the method used is a 301, a 302, a meta refresh or a JavaScript redirect, the end result is search engine spam.
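
One way to see what a suspect URL is actually doing is to walk its redirect chain hop by hop instead of following it blindly. A small Python sketch using only the standard library; the URL is a placeholder, and absolute Location headers are assumed.

import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise instead of follow

def trace(url, max_hops=10):
    # Print each hop's status code until a non-redirect response.
    opener = urllib.request.build_opener(NoRedirect)
    for _ in range(max_hops):
        try:
            resp = opener.open(url)
            code, headers = resp.getcode(), resp.headers
        except urllib.error.HTTPError as e:
            code, headers = e.code, e.headers
        print(code, url)
        if code not in (301, 302, 303, 307, 308):
            break
        url = headers["Location"]

trace("http://example.com/")  # placeholder URL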

15. Misuse of Web 2.0 Formats (i.e. wikis, social networking and social tagging)
An emerging form of SEO spam is found in the misuse of user-input media formats such as Wikipedia. Like blog comment spamming, the instant live-to-web nature of Web 2.0 formats provides an open range for SEO spam technicians.

Many of these exploits might find short-term success, but it is only a matter of time before measures are taken to devalue the efforts.
Search engine optimization spam continues to be a problem for the SEO industry as it tries to move past the perceptions of mainstream advertisers. When unethical techniques are used, trust (the basis of all business) is abused and the efforts of the SEO/SEM industry are called into question.

Fortunately, Google’s new algorithm appears to be on the cutting edge of SEO Spam detection and prevention. Let’s hope 2006 is the year the entire SEO industry goes on a Spam-free diet.

Thursday, January 12, 2006

SEO Tips In a Sea of Change

Advanced SEO 2006

Waves of change have cascaded over the search marketing sector in the past year, prompting changes in the methods, business and practice of search engine optimization. Though many things have been altered, expanded or otherwise modified, the general search engine market share has not. Google remains the most popular search engine and continues to drive more traffic than the other search engines combined. Another thing that has not changed is that organic search placement generates a greater volume of site traffic than any other form of online advertising.
There are six or seven advanced public search engines out there, but the vast majority of SEO attention is naturally given to Google. Many of the tips offered in this piece, while useful on the other search engines, are written with Google in mind. We are also thinking about alternative file formats and other ways visitors might find websites aside from pure search.

The most visible changes can be seen in the variety of search formats and in the search results returned by the major search engines, but the greatest changes are taking place in the philosophies and practices of search engine optimizers. As the search environment has changed, so too have the techniques and tools used by search marketers. More time is focused on improving website content and navigation in order to appeal to both live visitors and search spiders. There are also new metrics for measuring the success of a search marketing campaign, all of which are far more complicated than simple search engine rankings.

Since the introduction of the Jagger Update at Google, we have been doing a number of things slightly differently and have updated our expectations of our clients and ourselves.

Organic search engine placement now requires a lot more work on our part and on the part of our clients or their webmasters. Content needs to be updated regularly, navigation needs to be simplified, and shared analysis of on-site traffic is increasingly important. Top10 websites, especially around their main entry points, have become production pieces requiring a greater degree of strategic planning than the general, annually updated brochure site does. Creation of that content needs to be considered a standing business expense, though that expense should be more than made up for in long-term advertising savings.

Along with that greater effort, we strongly advise our clients to integrate their PPC campaigns with their SEO campaigns, though not necessarily in the hands of the same person. SEO and PPC are two distinct arms of search engine marketing. Many SEOs split their time crafting both paid and organic campaigns for clients, though each requires a unique and highly developed skill set. PPC offers guaranteed placement for a fee but requires greater attention and monitoring, along with different levels of analysis. We have set caps on the number of PPC campaigns we can run in conjunction with organic placement campaigns and have taken measures to outsource any overload via recommendation. The key here is to have the PPC and organic SEO teams working together on several aspects of the client's web documents.

That said, we need to stop thinking of search engines as the main show in website marketing. This might sound like a self-defeating statement coming from a search engine optimization specialist, but search, as a tool, is no longer confined to the search engines as we know them. Think about a paid ad generating site visits from a third-party website. The transactions that brought those visitors were not conducted on a search engine, but one or more search engines, in conjunction with that third-party website, facilitated them.

Now, think about social commentary and viral marketing. Internet users, as is true with most of us offline, tend to rely on first-person recommendations. I tell a friend about a service that worked particularly well for me. They try that service and tell their friends as well. It works that way with almost any industry from restaurants to airlines, moving companies and magazines. Now, try to imagine your personal network of friends and contacts. How many of them know each other or might connect through a third or fourth party?

Imagine the impact of giving users the ability to tag their search experience with comments. During the Christmas sales rush, Yahoo Shopping experimented with user-compiled shopping lists, sort of a global gift-guide that used social networking and comment tagging to cross-reference for search results. (If you are interested in Stereo Speakers, you might also be interested in StacyB's Audiophile Shopping List.) Yahoo's Flickr photo sharing service has seen amazing growth through global networks of friends exchanging images they have tagged with their comments.

Similarly, the appearance of Blogs has substantially expanded the online marketing environment. It is estimated that by the year 2010, there might be as many as one billion Blogs published online. While most are personal diaries, blogs appear to have lasted long enough to be more than a fad and are evolving rapidly as users learn to modify and improve on them.

Businesses are increasingly turning to blogs to communicate with customers or respond to inquiries. Newsgathering organizations are using blogs to fill the gap between TV broadcast and the Internet by posting everything from breaking news, information podcasts, video clips, and reporters' notebooks to recipe ideas, shopping tips and paid-search advertising.

There are two major advantages blogs offer search marketers. The ability to link blog entries together to form an information-thread network provides search marketers with a number of tools beyond the improvement of the knowledge base. We are able to help clients establish communication centers from which they can link to information supplied by suppliers, distributors and clients on their websites or blogs. An important goal for search marketers is to help our clients provide users with a clear path to the information they need. Clear paths tend to get followed by many people, a trait today's search spiders look for and account for. Blogs, if maintained properly, can be an important component in a winning website structure.

The second important feature of blogs is RSS, or really simple syndication. Anyone who expresses interest can subscribe to your blog, getting instant notification of updates or new messages.
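
Consuming a feed is trivial, which is much of RSS's appeal. A minimal Python sketch using only the standard library; the feed URL is a placeholder.

import urllib.request
import xml.etree.ElementTree as ET

def feed_titles(feed_url):
    # Fetch an RSS 2.0 feed and return the entry titles.
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.fromstring(resp.read())
    # RSS 2.0 nests entries at rss/channel/item, each with a <title>.
    return [item.findtext("title") for item in root.iter("item")]

for title in feed_titles("http://example.com/blog/rss.xml"):  # placeholder
    print(title)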

Search is going to be a facet of all information applications and many electronic appliances moving forward into the next decade. The major search engines are each working to make deals with the major appliance and electronics manufacturers in order to provide search results to users in planes, trains and automobiles, along with your kitchen, living room, mobile phone and quite possibly display screens appearing in shopping carts.

In other words, search will be a greater part of our daily lives, which brings us back to search engine optimization for websites. That's still important, even if the traditional search engine rankings pages are less important.

Building a good website structure is critical. Search engines have changed radically over the past ten years, to the point that we are now in a period of what appears to be constant change and evolution. The most important elements of SEO today, more important than writing the perfect keyword-enriched title tag, are ease of navigation, clarity of purpose, and relevant links (think of links as information threads). Keywords are important, make no mistake about that, but search engines have moved far beyond simple keyword/context measurements.

Search engines have significantly improved their ranking algorithms over the past two years and, in particular, the past few months. From the earliest years until about five years ago, search engines looked for keywords in several areas or elements of a website, including incoming and outgoing links. Rankings were determined by the arrangement of keywords and the number of incidents of those keywords found on or around the site.

For the past five years, Google has set the standards SEOs work to achieve, but over the last six months those standards have subtly changed, and they will continue to change long into the foreseeable future. What made Google different five years ago was its method of using a standard keyword-based spider that also factored in the number of incoming links to each site. That led to a number of techniques based on creating artificial link densities through link networks, portal sites and other tricks aimed at gaming Google. After a series of algorithm updates aimed primarily at preventing “black-hat” manipulation of its rankings, Google has moved well past the basic premise of PageRank and its simple, democratic explanation.

We believe the Jagger Update is only one of many algorithm shifts that are leading Google away from pure link-context to include shared incidents of semantic intention found between linked documents.

Where we used to look at a website as a collection of similar documents, often of a common file type, found within a distinct URL, we are now examining far more complex layers of differing web documents strung between several URLs. Again, think of links between documents as information threads being followed by the spiders. As much as possible, these threads should be more than useful links between relevant sites; they should help complete whatever story the live user is experiencing. Your site visitors are looking for something; at least, that's what Google, Yahoo and the rest want to think. Google is especially interested in how visitors use your site, how often they return and how often they use links leaving your site.

Google has just reopened Google Analytics on a limited, invitation-only basis after being overwhelmed by massive user interest when it first released the service, a modified version of its Urchin site-statistics program. Google Analytics provides a detailed look at how visitors use your site. We are strongly urging clients to sign up for Google Analytics as it becomes available and will be offering assistance in interpreting the extracted data. One of the features of the free software package is integrated AdWords/AdSense support showing how your ad campaigns are performing and how the ads displayed on your site are doing.

While Google is making it easier for search marketers and advertisers, its goal is obviously to make itself more money by increasing click-through rates while collecting user data from the millions of websites signing up with the service. It has also provided SEOs with a dashboard view of critical factors involved with how it ranks sites.

The practice of search engine optimization has in some ways become more difficult but in others has actually gotten easier. SEO has come a long way since its early days in the mid 1990s. A decade ago, SEOs were considered secretive and manipulative cowboys, roughneck mercenaries who would (because they could) do just about anything to get a site ranked in the Top10 on the major engines of the time. There were more search engines then, along with a variety of directories and spidered databases, such as Inktomi, that sold results to other engines.

That shift in the search landscape, combined with the rapid growth of the Web, necessitated better search algorithms and a crackdown on manipulative search marketers. At the same time, the SEO and SEM sectors have seen tremendous growth, due mostly to a shift towards paid-search marketing by major advertisers and the attendant growth of interest in Google, Yahoo and MSN. The search marketing sector has doubled or perhaps tripled in size in just twenty-four months as new practitioners were hired by established SEO firms or formed their own businesses. Many of those new practitioners have spent that time absorbing and adding to the huge volume of information that makes up the SEO sector's knowledge base.

Those SEOs are coming of age, professionally speaking, and are very good at what they do. Their skills are going to be an important asset to the sector in the coming year as the business of search expands way beyond the desktop and into everyday life. Change is good.

Wednesday, January 11, 2006

Pulling Google

by Mark Daoust

Admittedly, I have a bit of a childish mind. I often see things as more animated and fantastic than they really are. Search engine optimizers, whether professionals or casual SEOs working on a personal website, often remind me of a room full of school children all waving their hands in the air, holding their breath, grunting, and whimpering for the chance to have the teacher call on them (they have the best answer, after all).

It's true. Most website owners would gladly spend a day outside the Googleplex jumping up and down, hoping, praying, and whimpering for Google to take notice of their website if they thought for a moment that it would give them a chance at a top ranking. We are absolutely obsessed with search – it is the ultimate ego stroke for a website owner.

Most modern SEO theories find their genesis in trying to push a website to the front of Google's rankings. They start with the idea that your website is the one that should be called on by the teacher and give you methods for getting the teacher's attention. They teach you how to raise your hand higher, how to squirm just a bit more, how to sigh with extreme disappointment when the teacher picks the website that is obviously the teacher's pet.

This is push SEO, and it does work for many people. The problem with push SEO is that our 'classroom' is huge. We are asking Google to pick our site out of literally thousands, if not millions, of websites that all have something to offer on the subject at hand. We may believe that we have the best thing to offer, but Google does not know that.

Lately, however, a theory (or method) seems to be arising that counters the idea of push SEO. Rather than asking you to change your website to fit Google's standards of a 'good result', this theory is supposed to literally change Google's standards.

Google Has a Confidence Issue

I have already admitted to having a childish mind that creates fantastic visions of how the world works, but I really think that Google has a confidence issue. It is the ultimate know-it-all. Most of us are annoyed by the person who is quick to correct us on a small detail or who seems to have an answer to just about every question, but Google does just that.

Think about it – if you do a search for 'amazen', Google will respond with “Did you mean: amazon?”. How arrogant and rude can a search engine be? How can they assume that they know what I am looking for?

All joking aside, they usually do know what we are looking for. They are so supremely confident because they have successfully responded to millions of questions daily for the past several years. But like most people with confidence issues, if they feel that they are being left out on a particular topic, they grow a feeling of inadequacy. As a result, Google is constantly trying to know everything about everything. The idea behind pull SEO is to tell Google that they are wrong or that they do not know something – and that you have the website they need to know about.

Mike Grehan on Pull SEO

I was first introduced to the idea of pull SEO by Mike Grehan, a man who, in my opinion, understands real SEO rather than just a bunch of SEO tricks. Although I do not know the man personally (I would be happy to make his acquaintance), he is the one person who most closely echoes my thoughts on SEO.

Just recently he posted an interesting article on his blog about how an in-progress event can affect search results. For example, take a tragedy such as Hurricane Katrina. When the hurricane hit, it was all that was on our minds and hearts, and as a result, it was what people searched for in Google. Consequently, the search results of the major search engines changed.

Think about it – any time a major disaster hits, it becomes the major subject of the search engines. When Pope John Paul II died in 2005, searches for his name topped most search engine charts. After Janet Jackson's right breast obfuscated the Super Bowl halftime show of 2004, search engines were quickly used as a resource to relive the questionable moment. After September 11th, the world flocked to a younger Google to find information on the World Trade Towers and Osama Bin Laden.

If you think like a search engine, being able to present up-to-date information based on the news of the day gives you a distinct competitive advantage. If you have the results people are looking for faster than the others, you suddenly become the trusted resource everyone looks to.

Mike discusses in several other posts the idea of pull marketing and how he actually uses it in his professional SEO consultations. I am not sure if Mike is the originator of the idea of pull SEO, but he is the first person that I learned this theory from.

Marketing in a Bathroom

I read an interesting comment at Threadwatch that gives a great example of how pull SEO can actually work. The comment related a story that seems to be fairly commonplace in the website owner world. A new website owner, completely unfamiliar with search engine optimization and website marketing, was looking for help. In an effort to market the website, the owner was instructed to place post-it notes with his website address in several bathrooms.

The result of this marketing activity? Within a few months his website rose to the top of the search engine rankings, he started to see a good amount of traffic, and his search engine woes were quickly taken care of.

What SEO work did this person actually do? In all reality, there was no SEO work at all – just plain viral marketing.

Making a Splash Big Enough To Notice – The Real Payoff

Allow me to be overdramatic for a moment, but if you want to get to the top of Google, you not only have to be the website that shows all the information possible on Hurricane Katrina, you also have to be the website that causes Hurricane Katrina. In other words, if you want to get to the top of the rankings – make enough noise that people start searching for your website independent of 'just finding' you in the search engine results pages.

If Google's user base is hammering its search results wanting to know more about BlueWidgets.com, then Google will ultimately serve BlueWidgets.com as a result to its users. If it fails to do this, it will lose its users' trust.

Mike Grehan often talks about the effect of a client launching a major television commercial campaign and the immediate effect it has on that client's rankings in the search engines. This is not a coincidence, but a direct result of raising awareness of a website and Google responding to that new awareness.

The Reality – Small Businesses Have Trouble Making Big Splashes

Pull SEO is good in theory, and it is very good for a Fortune 500 corporation, but a small company will certainly have trouble utilizing it. Making a big publicity splash is either very expensive or requires something so unique and revolutionary that the splash practically makes itself. And for the small company that is able to grab a lot of attention independent of the search engines, a top ranking really becomes ancillary to all the news coverage it is probably receiving.

But maybe this is the way it should actually be. Is it possible that the way to get to the top of the rankings is to develop an actual plan on how we will make our websites popular - independent of the search engines? If we are able to create enough buzz about our website, then search engine rankings, although nice, suddenly become less of a focus.

Put Your Hand Down – Get Your Marketing Geared Up

Google asks us millions of questions every day: which website should it rank first for every topic that people ask about? Naturally, we want to raise our hands, hoping that Google will call on us to answer its users' needs. But in all reality, we need to put our hands down and start working.

Relying on a single entity, such as Google, is a bad strategy. Google, as I mentioned earlier, is the ultimate stroke to a webmaster's ego. It is the 'icing on the cake', the affirmation of a job well done. It is not, however, the goal in and of itself.

Your goal is to be successful independent of Google. Make your website buzz worthy and Google will eventually take notice. Google cannot ignore the demands of thousands of users.

ALSO SEE:
Understanding Search Engines Generates Higher Web Rankings

Saturday, December 24, 2005

Google SEO : Sandbox, TrustRank, Jagger Update

Jagger – Tying it all together
The Jagger update seems to have combined the aging factor and the TrustRank factor into one, forming a new age for Google.
In other words, sites have to reach a certain age AND acquire relevant links from authoritative sources. Further, those links must also age before they are credited to your site.

As you can see, Jagger is quite the update – forcing not only quantity but quality.
You need both a sufficient number of quality aged content pages and a sufficient number of properly aged relevant links. Jagger is doing what Google has been striving for ever since the Florida update two years ago: to make the index more relevant than ever.

By forcing sites to endure an aging delay (also called the “Sandbox”), Google is attempting to ensure that a site is indeed worthy of joining its regular index.

Also, it is assuming that related sites will want to link to each other without reciprocating links. In other words, you will want to link to another site because it offers more relevant information to your site visitors, and not because it will artificially boost your rankings. Further, the update assumes that sites will only link to yours if it is genuinely worthy, and treats those links as evidence of that worth.
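
The aging hypothesis is easy to express as a toy model. The Python sketch below is purely illustrative of that hypothesis; the thresholds are invented, and nothing here is Google's actual formula.

from datetime import date, timedelta

# Invented thresholds, purely to illustrate the hypothesis above.
SITE_MIN_AGE = timedelta(days=270)  # time before a site leaves the "sandbox"
LINK_MIN_AGE = timedelta(days=90)   # time before a discovered link is credited

def credited_links(site_first_seen, link_first_seen_dates, today):
    # A sandboxed site gets no credit at all; afterwards, only links
    # discovered long enough ago are counted.
    if today - site_first_seen < SITE_MIN_AGE:
        return 0
    return sum(1 for seen in link_first_seen_dates
               if today - seen >= LINK_MIN_AGE)

site = date(2005, 1, 1)
links = [date(2005, 3, 1), date(2005, 11, 20)]
print(credited_links(site, links, date(2005, 12, 24)))  # 1: only the older link counts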

How to work in this new age of Google

Obviously a new site will probably take much longer to get out of this age-delayed “sandbox”. There are ways to make the delay shorter, however, including building links from highly authoritative, relevant sites. Now, before you ask “how the heck do I find those”, remember that there are a couple of directories which Google still considers important – the Yahoo! Directory and the Open Directory Project, also known as DMOZ.org.

Granted, there are issues with both. With Yahoo!, you have to pay to be reviewed, but that doesn’t guarantee you’ll be included. You could spend $300 and never get your site into the Yahoo! Directory.

The ODP also has its problems, ranging from a slow update schedule to corrupt editors.
Granted, the group is trying to clean up its image, but I’ve found in the past that unless you know someone on the inside, it can take months if not years to get included. Finally, even when you do get included in these and other related directories, you are still subject to that aging delay. Aging starts when Google finds the link, not when the link is added to the directory page.

In other words, it could be a few days or weeks after the link is added before Google discovers it.

On the content side, you run into similar issues.
While it’s great to have an always-growing site, you must architect your navigation so that new content is discovered early enough to work through the aging delay.
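
One practical way to get new content discovered early is to publish a sitemap file for spiders to fetch; Google launched its Sitemaps program in 2005. A minimal Python sketch with placeholder URLs follows (the namespace shown is the sitemaps.org 0.9 one; early versions of the protocol used a Google namespace instead).

import xml.etree.ElementTree as ET

def build_sitemap(pages):
    # pages: list of (url, last-modified date) pairs.
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [("http://example.com/", "2005-12-24"),
         ("http://example.com/articles/new-article.html", "2005-12-23")]
print(build_sitemap(pages))  # save the output as sitemap.xml at the site root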

I know this sounds like an overwhelming task, and for an e-commerce site it can be particularly troubling, especially when the bills begin to pile up and your site still isn’t found in Google. But remember that Google isn’t the only engine out there. Granted, it can account for between 35% and 50% of your traffic, but the other engines combined will account for the remaining 50% to 65%.

Yahoo! and MSN are much less picky about who they allow into their indexes.
So if you handle your content development and link building properly – that is, staying away from link exchanges and link farms – your site will eventually get out of the “sandbox” and into the index. Also, keep your eyes open for related sites. Perhaps you’ve written an article on something new, or provided commentary on the current state of your industry. You could always submit that article, or a link to it, to other industry-specific sites.

Consider reviewing industry blogs as well. They can be a great source of links.

Finally, make sure that the site is always growing and improving. A growing site increases your internal links, which has a positive effect on link popularity, PageRank and TrustRank.
Also, be sure to keep building those links. Even if you aren’t entirely sure of a link’s quality, take the time to request it anyway.

After all, if it isn’t relevant, Google will filter it.

Tuesday, December 06, 2005

What is Google Base?

Got something you want the world to see?

Google Base is Google's database into which you can easily submit all types of content. Google Base will host your content and make it searchable online.

You can describe any item you post with attributes, which will help people find it when they search Google Base. In fact, based on the relevance of your items, they may also be included in the main Google search index and other Google products like Froogle and Google Local.
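
As a rough illustration of what an item-plus-attributes listing amounts to, here is a hypothetical record in Python; the field names are invented for illustration, not Google Base's documented schema.

# Hypothetical record, not taken from Google Base's actual documentation.
listing = {
    "title": "Handmade Christmas wreath",
    "description": "Fresh balsam wreath with pine cones and ribbon.",
    "attributes": {
        "price": "24.99 USD",
        "location": "Virginia Beach, VA",
        "delivery": "ships nationwide",
    },
    "link": "http://example.com/wreaths",  # placeholder URL
}
for name, value in listing["attributes"].items():
    print(f"{name}: {value}")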

You'll need a Google Account to use Google Base.

A Google Account lets you sign into Google Base and most other Google services (like Froogle Shopping List and Groups). Once you've created your account, sign in everywhere with just your email address and a password of your choosing.

This is how it worked for me

I do not plan to use Google Base for my SEO (search engine optimization), but I was pretty curious how it works. I set up a listing and added a graphic, which took about 10 minutes. After I published it, my listing appeared 30 minutes later. To see my test, go to Google Base (http://base.google.com/) and type in ‘web design Virginia’ or ‘Virginia web design’. Presto, my Visionefx mini-listing appears!

This is how it can work for you

I have clients who sell a wide variety of products. Some are e-commerce and some are not, but that does not matter. You can link to your website or post images right from your own computer. You don't need to post images, but a thumbnail is worth a thousand words.

Google Base is very applicable to half of our Visionefx client base. Whether it’s replacement windows, real estate, jewelry, frozen fish, electronics, Christmas wreaths, music, t-shirts or credit cards, Google Base is a great way to generate interest in a particular product or service. A Google Base listing will also generate traffic back to your main website, thus generating more sales and leads.
Even if you don’t sell a particular item, it doesn’t matter. A lawyer could publish an 'important brief' or a mom could publish a 'favorite cookie recipe'.

Is this the eBay-killer, Monster-killer, Craigslist-killer that some expect?

Let's take eBay as one example. eBay's growth is built on the community that surrounds it. There are rules, plus buyers and sellers evaluate each other. It's easy to decide if I want to risk purchasing something from a seller based on their ratings. Google Base lacks any such functionality for the moment. Potentially, it could come – but it's not there yet.

How about Monster?

I don't do any hiring, but the disadvantage of Google Base for job search is immediate. Looking for a job? Google Base gives you one single box – that's it. Perhaps entering something like ‘web design jobs in Virginia Beach’ will ultimately return a listing of all appropriate jobs, but it might come back with false matches. In one example, Google Base brought back matches for ‘freelance web designers’.

Google Base Summary

Google Base is a way for Google to let anyone upload information about anything. That's the master plan. Exactly how that master plan will unfold isn't clear. Maybe no particular data type will dominate the uploads. Maybe it really will turn into a great place for classified listings and lead to a dedicated spin-off service. The overall goal seems to be to put this tool out there and see what people make of it.
