Saturday, December 24, 2005

Google SEO : Sandbox, TrustRank, Jagger Update

Jagger – Tying it all together
The Jagger update seems to have folded the aging factor and the TrustRank factor into one, ushering in a new age for Google.
In other words, sites have to reach a certain age AND acquire relevant links from authoritative sources. Further, those links must also age before they are credited to your site.

As you can see, Jagger is quite the update – forcing not only quantity but quality.
You need both a sufficient number of quality, aged content pages and a sufficient number of properly aged, relevant links. Jagger is doing what Google has been striving for ever since the Florida update two years ago: making the index more relevant than ever.

By forcing sites to endure an aging delay (also called the “Sandbox”), Google is attempting to ensure that a site is indeed worthy of joining the regular index.

Also, the update assumes that related sites will want to link to each other without reciprocating links. In other words, you will want to link to another site because it offers information relevant to your visitors, not because it will artificially boost your rankings. It also judges whether your site is worthy on the assumption that other sites will only link to you because your content deserves it.

How to work in this new age of Google

Obviously a new site will probably take much longer to get out of this age-delayed “sandbox”. There are ways to shorten the delay, however, such as building links from highly authoritative, relevant sites. Before you ask “how the heck do I find those”, remember that there are a couple of directories which Google still considers important – Yahoo! and the Open Directory Project, also known as DMOZ.org.

Granted, there are issues with both. With Yahoo! you have to pay to get reviewed, but that doesn’t guarantee you’ll be included. You could spend $300 and never get your site into the Yahoo! Directory.

The ODP also has its problems, ranging from a slow update schedule to corrupt editors.
Granted, the group is trying to clean up its image, but I’ve found in the past that unless you know someone on the inside it can take months, if not years, to get included. Finally, even when you do get included in these and other related directories, you are still subject to that aging delay. Aging begins when Google finds the link, not when the link is added to the directory page.

In other words, it could be days or even weeks after the link is added before Google discovers it.

On the content side you run into similar issues.
While it’s great to have an ever-growing site, you must architect your navigation so that new content is discovered early, helping it work through the aging delay.

I know this sounds like an overwhelming task, and for an e-commerce site it can be particularly troubling when the bills begin to pile up and your site still isn’t found in Google. But remember that Google isn’t the only engine out there. Granted, it can account for between 35% and 50% of your traffic, but the other engines combined account for the remaining 50% to 65%.

Yahoo! and MSN are much less picky about who they allow into their indexes.
So if you handle your content development and link building properly – that is, staying away from link exchanges and link farms – your site will eventually get out of the “sandbox” and into the index. Also, keep your eyes open for related sites. Perhaps you’ve written an article on something new, or provided commentary about the current state of your industry. You could always submit that article, or a link to it, to other industry-specific sites.

Consider reviewing industry blogs as well. They can be a great source of links.

Finally, make sure the site is always growing and improving. A growing site increases your internal links, which has a positive effect on link popularity, PageRank and TrustRank.
Also, keep building external links. Even if you aren’t entirely sure of a link’s quality, take the time to request it anyway.

After all, if it isn’t relevant Google will filter it.

Tuesday, December 06, 2005

What is Google Base?

Got something you want the world to see?

Google Base is Google's database into which you can easily submit all types of content. Google Base will host your content and make it searchable online.

You can describe any item you post with attributes, which will help people find it when they search Google Base. In fact, based on the relevance of your items, they may also be included in the main Google search index and other Google products like Froogle and Google Local.
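
To make the idea of attributes concrete, here is a minimal sketch of how a single listing and its attributes might be organized before posting. The field names are my own illustration, not an official Google Base schema:

# Illustrative sketch only: these field names are hypothetical,
# not an official Google Base schema.
item = {
    "title": "Handmade Christmas wreath",
    "description": "Fresh evergreen wreath, 24 inches, ships from Virginia.",
    "item_type": "products",
    "price": "29.99 USD",
    "location": "Virginia Beach, VA",
    "link": "http://www.example.com/wreaths",  # points back to your own site
    "labels": ["christmas", "wreath", "holiday decor"],
}

# The more descriptive the attributes, the easier it is for searchers
# to find the item with a query like "christmas wreath virginia".
for name, value in item.items():
    print(name, ":", value)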

You'll need a Google Account to use Google Base.

A Google Account lets you sign into Google Base and most other Google services (like Froogle Shopping List and Groups). Once you've created your account, sign in everywhere with just your email address and a password of your choosing.

This is how it worked for me

I do not plan to use Google Base for my SEO (search engine optimization), but I was pretty curious how it works. I set up a listing and added a graphic, which took about 10 minutes. My listing appeared about 30 minutes after I published it. To see my test, go to Google Base at http://base.google.com/ and type in ‘web design Virginia’ or ‘Virginia web design’. Presto, my Visionefx mini-listing appears!

This is how it can work for you

I have clients who sell a wide variety of products. Some are e-commerce and some are not, but that does not matter. You can link to your web site or post images right from your own computer. You don't need to post images, but a thumbnail is worth a thousand words.

Google Base is applicable to at least half of our Visionefx client base. Whether it’s replacement windows, real estate, jewelry, frozen fish, electronics, Christmas wreaths, music, t-shirts or credit cards, Google Base is a great way to generate interest in a particular product or service. A Google Base listing will also send traffic back to your main web site, generating more sales and leads.
Even if you don’t sell a particular item, it doesn't matter. A lawyer could publish an 'important brief' or a mom could publish a 'favorite cookie recipe'.

Is this the eBay-killer, Monster-killer, Craigslist-killer that some expect?

Let's take eBay as one example. eBay's growth is built on the community that surrounds it. There are rules, plus buyers and sellers evaluate each other. It's easy to decide whether I want to risk purchasing something from a seller, based on their ratings. Google Base lacks any such functionality for the moment. Potentially, it could come -- but it's not there yet.

How about Monster?

I don't do any hiring, but the disadvantage of Google Base for job searching is immediately apparent. Looking for a job? Google Base gives you one single box -- that's it. Perhaps entering something like ‘web design jobs in Virginia Beach’ will ultimately return a listing of all the appropriate jobs, but it might also come back with false matches. In one example, Google Base returned matches for ‘freelance web designers’.

Google Base Summary

Google Base is a way for Google to let anyone upload information about anything. That's the master plan. Exactly how that master plan will unfold isn't clear. Maybe no particular data type will come to dominate the uploads. Maybe it really will turn into a great place for classified listings and lead to a dedicated spin-off service. The overall goal seems to be to put the tool out there and see what people make of it.

Wednesday, November 23, 2005

Google's Jagger Update Completing Cycles

Ever since Google introduced its latest algorithm update in September, a fair amount of column space has been dedicated to telling webmasters and small business owners to wait until the update is complete. Insofar as it can be said that the Jagger Update will ever be complete, the final cycle of the immediate update appears to be playing out.

Jagger was a different sort of algorithm update for Google. Its infamous predecessors, Florida and Hilltop, were generally limited shifts in the values Google assigned domains based on content and links. After the immediate punch of previous updates, the search engine results pages (SERPs) would generally return to a stable and predictable state. SERPs generated by Jagger are expected to keep updating themselves, with a greater degree of flux and change.

So, what exactly happened during the Jagger Update and what might it mean to your website? Quite a bit as it turns out.

The Jagger Update was introduced for three main reasons. The first was to deal with manipulative link-network schemes, sites generated with scraped content and other forms of search engine spam. The second was to allow and account for the inclusion of a greater number of spiderable documents and file types. The third was to allow and account for new methods of site acquisition beyond the Googlebot spider.

The update made its first public appearance in late September but had its greatest impact in early October. At that time, hundreds of thousands of websites that enjoyed previously strong listings were suddenly struck and sent to the relative oblivion found beyond the second page of results.

Most of those sites lost position due to participation in what Google obviously considers inappropriate linking schemes. This was actually one of the first conclusions we came to in late September based on the experience of a few clients who joined link-networks that had not been recommended or vetted by our link-experts. This is now backed up by discussion in various search engine forums. While most of those hurt by this part of the update are good people running honest businesses, Google put out notice that irrelevant link-networks, no matter how simple or complex, are unhealthy additions to what might otherwise be a good website.

The problem Google faced was that some webmasters misunderstood what links are for and how Google uses them to rank documents. For whatever reason, many webmasters or site administrators engaged in wholesale link mongering, bulking up on as many inbound links as possible without considering the factor most important in Google’s estimation: the relevance of those inbound links.

Now, Google appears to be applying filters based on historic data it has collected about all sites in its index over time. In other words, Google likely knows a lot more about the documents linking to a particular website than the person who placed or requested the link in the first place. SEOs and webmasters should brush up on Google’s “Information Retrieval Based on Historical Data” patent application, published on March 31, 2005, for highly detailed information.
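
As a toy illustration of how historic link data might be used (my own sketch, not Google's actual algorithm; the nine-month ramp and the relevance figures are invented), consider weighting each inbound link by how long Google has known about it and by a rough topical relevance score:

from datetime import date

# Toy illustration only -- not Google's algorithm. Each inbound link carries
# the date it was first discovered and a 0..1 topical relevance estimate.
links = [
    {"from": "industry-directory.example", "first_seen": date(2005, 1, 10), "relevance": 0.9},
    {"from": "random-link-farm.example", "first_seen": date(2005, 10, 20), "relevance": 0.1},
]

def link_value(link, today=date(2005, 11, 23), ramp_days=270):
    """Discount a link until it has aged for roughly nine months."""
    age_days = (today - link["first_seen"]).days
    age_factor = min(age_days / ramp_days, 1.0)  # 0.0 when brand new, 1.0 once aged
    return age_factor * link["relevance"]

total = sum(link_value(link) for link in links)
print(round(total, 2))  # the aged, relevant link dominates the score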

Google is judging sites on who they link to along with who links to them. Before the update, a link from your site to an irrelevant site was more a waste of time than a waste of opportunity. Today irrelevant links seem to be both. Google’s desire to offer stable and highly relevant SERPS while preventing outright manipulation of those SERPS was the biggest cause of the shift.

The second and third reasons for updating the algorithm at this time are to allow and account for indexing documents or information obtained through alternative sources such as Google Base, Froogle, blogs and other social networking tools. Google’s stated goal is to grow to include reference to all the world’s information. That information is expressed in multiple places using several unique file formats, some of which are difficult to weigh against others. By checking the file or document in question against the long-term history of documents linking to it, Google is better able to establish its theme and intent.

Mass adoption of blogs, while promoted by Google, gave the search engine a number of problems. Webmasters and search marketers will take almost any opportunity to promote their sites, by any means available. Blogs provided ample opportunities, and soon issues ranging from comment spam to scraped-content splogs started to gum up the SERPs. By comparing document content with the history of other related documents in its index, Google has become much better at spotting blog-enabled spam.

Google also faced other forms of search engine spam, such as fake directories and on-page techniques like hiding text in CSS. The Jagger Update seems designed to deal with these issues by applying Google’s vast knowledge about items in its index to every document or file it ranks. A site that scrapes content, for example, might be weighed against the documents on which that content was originally published and the intent of the republisher. One that hides information in the CSS file will similarly trip Google’s memory of how the same domain looked and operated before the spam content was inserted.
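
As a rough illustration of the kind of on-page pattern such a filter might look for (again my own sketch, not Google's implementation), here is a trivial check for text hidden with common CSS tricks:

import re

# Toy sketch: flag inline styles commonly used to hide keyword-stuffed text.
# Real spam detection is far more sophisticated; this only shows the idea.
HIDDEN_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"text-indent\s*:\s*-\d{3,}px",  # text pushed far off screen
]

def looks_hidden(html_fragment):
    return any(re.search(p, html_fragment, re.IGNORECASE) for p in HIDDEN_PATTERNS)

page = '<div style="display:none">cheap widgets cheap widgets cheap widgets</div>'
print(looks_hidden(page))  # True -- the kind of markup that now invites a penalty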

The third reason for the algo update comes from the expansion of Google itself. Google is now much larger than it was when the Bourbon update was introduced in the early summer. Audio and video content is spiderable and searchable. Google’s comparison shopping tool Froogle is starting to integrate with Google Local, just as Google Local and Google Maps are beginning to merge. There is some speculation in the SEO community that Google is preparing to integrate personalized data into the search results served to specific individuals. A strong assumption is that Jagger is part of Google’s movement towards personalization, though there is little firm evidence to support the idea.

If your website is still suffering the lagging effects of the Jagger Update, your SEO or SEM vendor should be able to offer good advice. Chances are, the first thing he or she will do is a point-by-point inspection of the inbound and outbound links associated with your website. Next, they will likely suggest making it easier for Google to spider the various document and file types on your site by providing an XML sitemap to guide Googlebot's crawl cycle. Lastly, they will likely suggest a look at how visitors behave when visiting your site. Visitor behaviour will play a part in Google’s view of the importance and relevance of sites in its index. The introduction of Google Analytics gives webmasters a lot of free information about site visitors, along with other information on how the site fares in Google’s search results. It also provides Google with a lot of information about the sites running it. More on the effect of Google Analytics on the SERPs next week.
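
On the sitemap suggestion, here is a minimal sketch of generating one from a list of pages. The URLs are hypothetical, and you should check Google's own Sitemaps documentation for the exact schema version it currently expects:

from datetime import date

# Hypothetical page list for illustration; replace with your own URLs.
pages = [
    ("http://www.example.com/", date(2005, 11, 20)),
    ("http://www.example.com/products.html", date(2005, 11, 22)),
]

entries = "\n".join(
    "  <url><loc>{0}</loc><lastmod>{1}</lastmod></url>".format(url, modified.isoformat())
    for url, modified in pages
)

# The namespace below is the standard sitemap protocol namespace; adjust it
# if Google's documentation specifies a different schema version.
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)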

About the Author Jim Hedger
Jim Hedger is a senior editor for ISEDB.com. He is also a writer, speaker and search engine marketing expert working for StepForth Search Engine Placement in Victoria, BC. He has worked as an SEO for over 5 years and welcomes the opportunity to share his experience through interviews, articles and speaking engagements. Hedger can be reached at jimhedger@stepforth.com.

Monday, November 21, 2005

Jagger, Google Analytics, and the Future of Search and SEO

Two big things have just happened in Google-land: Jagger and Google Analytics. Together, these two events may have changed the face of search forever.

Jagger

First, let's discuss Jagger... Just like hurricanes, Google updates have names. (A Google update is a change to the way Google determines its rankings. Google makes these changes periodically, and they're universally feared because they can dramatically impact a website's ranking.) The latest update is called Jagger, and it has search engine optimizers (SEOs) all around the world in a state of panic.

Why was Jagger such a fearful update? Simple... With Jagger, Google once again outsmarted huge numbers of SEOs. You see, many/most SEOs spend their time (and their clients' money) trying to trick Google into thinking that their websites are more relevant and important than they really are. They do this mostly by swapping links, buying cheap links, and placing links on free directories. While there's nothing wrong with these sorts of links (i.e. they're not considered 'black-hat'), they don't really show that the site is relevant or important. All they really show is that the site owner has made a deal with another site owner. In these deals, the incentive for the linking site owner is a reciprocal link, money, or increased link volume. Google much prefers it when the linking site adds the link simply to enhance the value of their content or to increase their own credibility and authority.

In other words, Google wants its search results to contain relevant, important sites, not sites that merely appear to be relevant and important. To this end, Google invests millions of dollars and employs the world's smartest mathematicians to create algorithms which identify sites that are trying to trick them. And that's exactly what Jagger did; and when it found those sites, it simply adjusted their ranking to more accurately reflect their true importance. (Unfortunately, it also demoted some sites which actually deserve a high ranking. It is hoped that these mistakes will be ironed out with future minor updates, but that's a topic for another article...)

From a technical standpoint, Jagger was well described by Ken Webster in his article, http://www.webpronews.com/topnews/topnews/Jagger . To summarize, Jagger:

1) Increased importance placed on IBL (Inbound Links) Relevancy?
2) Increased importance placed on OBL (Outbound Links) Relevancy?
3) Promotion of relevant Niche Directories (related to #1 & #2)?
4) More weight thrown back to PR @ top domain?
5) Increased importance on AdSense placement relevancy?
6) Possible introduction of CSS Spam filtering?
7) Overall Blog demotions?
8) New and unresolved "canonical" issues?

Some more interesting effects were reported by WG Moore (http://www.sitepronews.com/archives/2005/nov/9.html) who runs a number of test sites for SEO purposes. By monitoring the links to his test sites as reported by Google, he established that:

"all reciprocal links had vanished. We think that this is because Google is down-grading or eliminating reciprocal links as a measure of popularity. This does make sense, actually. Reciprocal links are a method of falsifying popularity. Sort of a cheap method of buying a link, if you want to think of it that way... During the second week of the Jagger Update, a few of our reciprocal links did come back up. However, we also noticed that these were from places where we had highly relevant content. They came from articles where we discussed our area of expertise: Web Analytics, or from forums where we had relevant threads. So we feel that these links came back because of content, not linking.

The other group that came back up was one-way inbound text links, regardless of the originating web site. These links also had strong relevance to our web analytics business. In other words, they contained keywords and/or phrases related to our site and its business."

In short, Jagger undid the hard work of thousands - if not millions - of people! As a result, hard-won high rankings and revenues plummeted.

Interestingly, article PR (article submission) came through Jagger seemingly unscathed. My SEO copywriting website http://www.divinewrite.com , for example, went from no.4 to no.1 worldwide for "copywriter", and I've employed article PR almost exclusively. Whether it was promoted or the sites around it were demoted, one thing is clear: article PR is one of the best ways to obtain a high ranking.

Google Analytics

The second monumental event to occur recently was Google Analytics - http://www.google.com/analytics/index.html . Google Analytics is a free web-stats solution which not only reports all the regular site stats, but also integrates directly with Google AdWords, giving webmasters an insight into the ROI of their pay-per-click ads. According to Google, "Google Analytics tells you everything you want to know about how your visitors found you and how they interact with your site."

Why is this such a landmark move? Because for the first time ever, Google will have access to your real web stats. And these stats will be far more accurate than those provided by Alexa - http://www.alexa.com . Furthermore, Google's privacy statement says: " We may also use personal information for auditing, research and analysis to operate and improve Google technologies and services." - http://www.google.com/intl/en/privacy.html . Now let's put two and two together:

1) Google is 'giving' every webmaster in the world free access to quality web-stats.
2) Millions of webmasters will accept this 'gift', if only because it integrates directly with their Google AdWords campaigns.
3) Google will then have full access to the actual web stats of millions of commercial websites.
4) Google will have the right to use these stats to develop new technologies.
5) What's the next logical step? Google will use these statistics to help determine its rankings, of course!

It should come as no surprise. It's been on the cards - and frequently discussed - for a long time. For example, Jayde Online CEO Mel Strocen recently published an article on this very topic, 'The Future of WebSite Ranking'. He quite rightly asserts that:

"Google's "democratic" vision of the Web will never be achieved by manipulating algorithm criteria based on content. It will only be achieved by factoring in what is important to people, and people will always remain the best judge of what that is. The true challenge for search engines in the future is how to incorporate web searcher input and preferences into their ranking algorithms."

In fact, the Jayde Online network already owns and operates a search engine, ExactSeek (http://www.ExactSeek.com) which incorporates user popularity statistics in its rankings.

The Future of Search & SEO

To date, ExactSeek is the only search engine which uses visitor stats as criteria for its rankings. But Google isn't far behind. We all know that Google specializes in taking a good idea and implementing and adapting it brilliantly. This is exactly what we'll see in this case. By combining link popularity and user popularity statistics, Google will be the only major search engine to consider both what other sites think of your website and what your visitors think of your website. And because they have the most advanced algorithms for assessing link popularity, and will soon have access to the farthest reaching, most accurate web stats to assess user popularity, its competitors will be a long time catching up.
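
As a back-of-the-envelope illustration of what blending those two signals could look like (purely a sketch; the weights and scores are invented, and no search engine publishes such a formula):

# Toy illustration of blending link popularity with user popularity.
# The weights and inputs are made up; no search engine publishes these.
def combined_score(link_popularity, user_popularity, link_weight=0.7):
    return link_weight * link_popularity + (1 - link_weight) * user_popularity

# Two hypothetical sites, each signal scored on a 0..1 scale.
print(round(combined_score(0.9, 0.20), 3))  # strong links, weak visitor engagement -> 0.69
print(round(combined_score(0.6, 0.95), 3))  # weaker links, loyal visitors -> 0.705

A term like this would let a site with loyal, engaged visitors edge past one that has merely accumulated links.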

So if that's the future of search, what's the future of SEO? The future of SEO is undoubtedly one where:
• One-way text links from relevant pages continue to be the most valuable links
• Reciprocal linking continues to decline
• The 'shotgun' approach to link buying declines
• Mass email link requests decline
• Free directory submission declines
• Niche directory submission increases
• Article PR (article submission) increases
• Article submission sites (e.g. EzineArticles - http://www.ezinearticles.com , GoArticles - http://www.goarticles.com , and ArticleBlast - http://www.articleblast.com ) play a much bigger and more important role in helping online publishers locate quality articles (due to the increasing article volume)
• User popularity is just as important as link popularity, which means:
  - The quality of article PR improves in order to increase site traffic, credibility, and loyalty
  - The quality of website content improves in order to convert traffic and encourage repeat visits

Clearly, the choices for SEOs will be pretty much limited to paying for links at niche sites and/or engaging in article PR. Being an SEO copywriter, I may be a little biased, but for mine, article PR is the hands-down winner in this comparison:
• It satisfies Google's criteria for relevance and importance. Linking site owners include your article and link because, in doing so, their site becomes more useful to visitors, and their business gains credibility and authority.
• It generates hundreds of free links quickly enough to make it worth your while, but not so quickly as to raise red flags at Google (in the form of link dampening).
• Links are permanent and you don't have to pay to keep them there.
• You get a lot of qualified referred traffic who already trust you and your expertise. This satisfies Google's visitor popularity criteria, while at the same time bringing you a lot of extra customers.

(For more information on article PR, read How to Top Google with Article PR .)

Conclusion

The lesson from Jagger is: don't try to trick Google. They've got more money and more brains than virtually any other company in the world. It'll only end in tears! Don't spend time and money trying to make your site look important and relevant; spend that time and money actually making it important and relevant. Content - the real content behind the optimization - is the answer. After all, whether it's an article or a web page, it's the content that keeps 'eyes on paper', and that's what it's all about.

Happy optimizing!

Wednesday, November 16, 2005

On the Google Jagger Algo Update - Part 1

There has been a major update of Google’s ranking algorithm, changing the way the search engine orders search results. Atul Gupta of RedAlkemi discusses the consequences for webmasters.

By Pandia Guest Writer Atul Gupta

Google does minor algorithm updates almost on a monthly basis. Once in a while, it implements a major algorithm update.

If there is one thing search engine marketers and website owners fear, it is a major algorithm update, especially by Google. Well, much as we may like it not to happen, it's here. Google has recently done a major algorithm update, nicknamed the “Jagger” update series.

The last major Google algorithm update, called the Florida update, happened in November 2003 and created quite a stir with website rankings.
Big changes in rankings
Like the Florida update, the Jagger update has done the much feared “blender” act. It has churned the top-ranking websites and turned them into a list of unrecognizable pulp.

Google has been the favorite amongst the web community searching for information. Most feel that the search results have always been highly relevant. It would therefore be safe to assume that whatever algorithm Google has, works just fine.

So why does Google need to re-engineer its perfect-looking algo so drastically? Has it not heard the saying “if it works, don’t fix it”?

Beating the spammers
From Google’s standpoint, the reason is simple and valid. Well, for starters, the web is ever-evolving and the algo always needs to be adjusted in order to provide the best results. Google has engineered an algorithm which they believe will reward good sites and rank them well for its viewers.

Google, like most other search engines, keeps this algorithm a closely guarded secret to prevent it from being exploited.

However, the SEO community is constantly at work trying to rank their sites well. Using calculated guesswork, logical thinking, special tests and extensive trial-and-error methods, they gradually figure out what the algorithm likes and dislikes.
Armed with this knowledge, it is not difficult to work on websites to rank them high in SERP (Search Engine Result Pages), irrespective of whether the site deserves to rank at the top or not. This kind of algorithm abuse results in ‘less than desirable’ websites displacing good sites from the top ranks, contaminating the Google index.

Consequently, following the Kaizen philosophy, Google needs to re-engineer its algorithms to keep what it believes are bad sites out of its top ranks. Naturally, major algorithm updates upset the current high-ranking websites and send a lot of SEO professionals back to their workbenches to start all over again.

The timing
What is interesting to note is the timing of the algorithm update. When Google updated its algorithm in November 2003, there were large scale allegations by website owners that Google intentionally upset the rankings of popular websites just before the Christmas shopping season to force them into buying Google AdWords paid advertising in order to sustain the visitor traffic.
While Google claims that its algo update decisions are not influenced by the AdWords team, it is difficult to understand why they would once again choose a critical time just before the Christmas shopping season to update their algorithm.
The stakes are very high, and this is business after all. Google earned $1.57 billion in Q3 of 2005. If the effect of the 2003 pre-Christmas algorithm update is any indication, I estimate that Google will record revenues of over $2.05 billion in Q4 of 2005.
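
For context, that estimate implies roughly 30% quarter-over-quarter growth:

q3_revenue = 1.57   # billions of dollars, Q3 2005, as cited above
q4_estimate = 2.05  # billions of dollars, the author's Q4 estimate

growth = (q4_estimate - q3_revenue) / q3_revenue
print("{:.0%}".format(growth))  # roughly 31% quarter-over-quarter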

Jagger history
The Jagger 1 pre-shocks actually started with a string of backlink updates that began in September 2005 and continued into the middle of October 2005.
In mid October, Google updated its PageRank database for public view. Usually updated once a quarter, the PR update always creates a stir.

While most SEO professionals heavily play down the importance of PR in ranking, the legacy of its importance is so deep-rooted in the minds of most webmasters that it is difficult to shake it off as an insignificant ranking parameter.
[PageRank is Google’s measure of the “popularity” of a web page, based on the number and quality of incoming links. The Editor.]

It is believed that the second phase of the Jagger update — Jagger 2 — is now complete and replicated to all the data centers of Google. However, you may still notice some fluctuations in the rankings as things stabilize for each update.

We are now at the threshold of the third phase of the Jagger update, which is expected to begin sometime in the second week of November 2005.

The changes
From what we have studied so far, Google has re-engineered several aspects of its algorithm. Amongst other aspects we will know as things roll out, we believe it has altered the impact of the following:
1. Value of incoming links
2. Value of anchor text in incoming links
3. Content on page of incoming links
4. Keyword repetitions in anchor text
5. Age of the incoming links
6. Nature of sites linking to you
7. Directory links
8. Speed and volume of incoming links created
9. Value of reciprocal links
10. Impact of outbound links / links page on your website
11. Sandbox effect / age of your site, domain registration date
12. Size of your site’s content
13. Addition and frequency of fresh content update
14. Canonical / sub domains, sub-sub domains
15. Multiple domains on same IP numbers
16. Duplicate content on same site or on multiple domains
17. Over-optimization, excessive text markup
18. Irrational use of CSS

We are studying various aspects of the Jagger algo update and are closely monitoring the impact of changes in each of the above mentioned parameters and many more not mentioned here.
We shall discuss the impact of each of these aspects in the next parts of this article, which will likely be written once the Jagger 3 update and our study of it are complete.

In the meanwhile, we’d like to offer a word of caution: if you have suffered a drop in your website rankings, do not make any drastic changes to your website until the Jagger 3 update is fully implemented and has stabilized.

There is a delicate balance and inter-dependence of all these parameters that can bring back your ranks once the Jagger 3 update is completed.

About the Author: Atul Gupta is the founder and CEO of RedAlkemi.com (formerly known as SEOrank.com & PugmarksDesign.com), an internet marketing, e-commerce, graphic design, web & software development services company.

He has about 20 years of experience in the fields of graphic design, visual communication, web development and search engine marketing. He has spent the last nine years of his career devoted solely to pursuing search engine marketing and web development activities.

Friday, November 11, 2005

Google's Jagger Update - The Dust Begins To Settle?

Ken Webster Expert Author Published: 2005-11-10

What happened? Webmasters, site owners, online businesses and SEO companies everywhere have been desperately trying to decipher the fallout from the longest and most grueling algorithm update in the history of the Internet.

Relevancy and Revenue Generation are the two top goals of any SE (search engine). As the Internet and associated technologies mature, search engine algorithms have become much more complex. This was demonstrated in Google's 3-4 week long 3 phase "Jagger" update.

The initial response was very negative, and Google received more bad press from every conceivable corner than could have been imagined going in. Many sites fell completely out of Google's SERPs (search engine results pages) overnight, seemingly inexplicably. Some have recovered, many haven't, and others have seen their traffic improve.

Compounding prognostication, Yahoo initiated a much milder Index Update during the latter phase of the Jagger update.

Google had several issues to deal with:
1) Scraper Sites
2) Faux Adsense Directory Sites
3) CSS Spamming Techniques
4) Growing "Generic" SERP Irrelevancy
5) Reciprocal Linking Abuse
6) Ballooning BlogSpam

Google had no choice but to act decisively and convincingly.

The following list is how we believe Google has handled these issues in the Jagger update:

1) Increased importance placed on IBL (Inbound Links) Relevancy?
2) Increased importance placed on OBL (Outbound Links) Relevancy?
3) Promotion of relevant Niche Directories (related to #1 & #2)?
4) More weight thrown back to PR @ top domain?
5) Increased importance on AdSense placement relevancy?
6) Possible introduction of CSS Spam filtering?
7) Overall Blog demotions?
8) New and unresolved "canonical" issues?

Let's look at each action separately:
1) Increased importance placed on IBL Relevancy

Reciprocal linking abuse was growing out of hand; even "organic" SERPs were losing relevancy because the majority of active site administrators were link-mongering anywhere and with anyone they could, regardless of relevant value. Google created that monster by throwing its weight behind quantity over quality for a long time. It appears they simply started applying several published relevancy measurement factors (see US Patent Application #2005007174), which seem to have become more noticeable during the "Bourbon" update.

2) Increased importance placed on OBL Relevancy?

The patent application mentioned above is ripe for OBL relevancy algorithm application. The "Bourbon" update ushered in a marked hit on irrelevantly linked and broader-based directories, while promoting "niche" or more focused, topically relevant directories. It makes perfect sense to cut spam at its source. This move was subtle but at the same time an engineering masterpiece, because it addressed every form of link spam to some degree, including CSS-spammed links.

Theoretically, if a link can't be seen, it won't be selected by visitors and no measurable time is spent there, so its "Relevancy Rating" starts to diminish immediately. Some even hypothesize that those kinds of links can affect the overall "Relevancy Ranking" for the entire site and have the potential to affect the page's and site's PR (PageRank). We definitely saw a promotion of "relevant" directories almost across the board with Jagger.

3) Promotion of relevant Niche Directories (related to #s 1, 2 & 5)?

We began seeing a Directory SERP shift in the "Bourbon" update and definitely saw a promotion of "relevant" directories almost across the board with Jagger. Based on those facts, no one can deny that there has been a significant algorithm reemphasis in and about "linking" issues.

4) More weight thrown back to PR @ top domain?

Google had seemed to stray from the value it earlier ascribed to PageRank for some time, in its quest for content, content freshness and other goals. After Jagger 3 I was surprised to find PR0 pages highly placed in important topic SERPs with a great deal of code and two sentences of content. One example is prominent just below Matt Cutts' blog when doing a Google search for "Jagger Update".

This particular example is mostly JavaScript, AdSense and intra-site links. On further inspection, the site is well done, contains a good deal of relevant information and has a top-domain ranking of PR6. Based on these observations, one might conclude that more emphasis has been placed on top-domain PR. This observed return of focus to "authoritative" sites, or sites holding "trusted" status, should hold no real surprise in the quest for relevancy improvement.

5) Increased importance on AdSense placement relevancy?

Google has declared all-out war against spam AdSense sites of every kind. Many of these are or were faux directories, scrapers, or other mega-sites utilizing auto-content and auto-link generation technologies and services. Matt Cutts, in his blog, openly asked for and gave specific instructions on how to report these sites to help augment the overall effect of the algorithm changes targeting those raging atrocities.

The war rages on against all kinds of spam, but you can always bet that relevancy, revenue protection and growth will be at the top of the list.

6) Possible introduction of CSS Spam filtering?

Matt Cutts issued an unusually stern warning about using CSS spam techniques, coinciding (strangely enough) with the Jagger update, on Oct 19, 2005. Here is a link to the article in Threadwatch entitled "Google Engineer Hammered over CSS Spam Comments". There is a great deal of controversy over this issue, but it has been a growing cancer for a long time.

Some almost seem to speculate that Google couldn't yet figure out the algorithms to combat these issues beyond the OBL relevancy implementation, all but dismissing Matt's warning as "huff and puff" to scare CSS spam abusers into compliance. Google always addresses serious spam issues eventually, and this one has been on the table for around a year that I know of! It just doesn't make sense to ignore a warning from a top Google engineer, does it?

7) Overall Blog demotions?

Blog spam became a growing problem after blogging gained prominence in 2004. Google had to backtrack on blog SERP prominence because many blogs were not managed well, or at all, losing topical relevancy. Jagger seems to have ushered in a SERP sweep of blogs that were not topically focused or managed with purpose, or that contained AdSense and link spam. It had gotten to the point where half the top SERPs for almost any topic seemed to be blog listings; many have fallen in Jagger.

8) New and unresolved "canonical" issues?

Many are complaining of incorrect indexing issues, especially for sites that were indexed for the first time during Jagger. The problem seems to stem from Google treating the abbreviated site URL (without www) and the complete URL as two different sites. I'll use one of my own as an example: www.precisioncompletion.com is a new, unranked launch during Jagger and comes up correctly.

Do a Google search for precisioncompletion.com and look at the cache - A PR7 and the wrong website!

Half of the listings are correct and the other half pertain to that other site. Google is aware of these canonical issues being reported, and I believe they are planning to address them as the dust settles a little more on this update. Maybe I need to do a 301 permanent redirect to the full URL before I lose that PR7, and see if I can get it to transfer and magically pump up that PR0!
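
As an aside, a quick way to see how a server currently answers the bare (non-www) hostname is to inspect the status code and Location header it returns. A rough sketch, using example.com as a stand-in for the domain being tested:

import http.client

# Quick check: does the bare hostname issue a 301 to the www version?
# "example.com" is a stand-in; substitute the domain you are testing.
conn = http.client.HTTPConnection("example.com", timeout=10)
conn.request("HEAD", "/")
response = conn.getresponse()
print(response.status)                 # 301 means a permanent redirect is in place
print(response.getheader("Location"))  # ideally the full http://www.example.com/
conn.close()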

What to expect next? A large number of sites saw crippling SERP demotions, including clean-coded, relevant, W3C-validated, completely "white hat" sites that have never even engaged in link exchange programs. I know, I had one that got hit, my first time ever in a Google update.

Many of us in that position hope that the effect is temporary "collateral damage" which will be rectified by subsequent algorithm tweaking as the dust continues to settle on the "Jagger" update. I don't see that Google has deviated from their widely expressed intentions and historical path in the Jagger update.

They will continue to fight Spam at any level that protects the footsteps in their expressed intended path: Relevancy - Market Share - Revenue Generation (Maintenance & Growth) - Fiscal and Community Responsibility.

View All Articles by Ken Webster

Saturday, November 05, 2005

So You Want to Trade Links ?

We see them every day in our inboxes: "I would like to link to your site."
I suggest you choose your links and create your links pages with as much care as you would your homepage.

Go to the web site requesting to link with you. Using the Google toolbar, check for the following:

(If you don't have the Google toolbar, you should. Go to www.Google.com and search for 'Google toolbar'.)

1. Has the homepage been indexed by Google?
2. When were they last indexed?
3. Has the link page where they placed your web site hyperlink been indexed by Google?
4. How many links are on the links page? Less than 100 - over 100?
5. Are the links relevant to your business or are they mixed links?
6. Finally, pretend for a moment that search engines do not exist and ask yourself, 'Is this a useful place or resource for a web site visitor?' (A rough script for automating a couple of these checks appears below.)
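
For the mechanical parts of that checklist (counting the links on the page and eyeballing their relevance), a small script can save some time. This is a rough sketch under obvious assumptions: the links-page URL and topic words are hypothetical, it only counts anchors and flags those that never mention a topic word, and it does not query Google's index:

from urllib.request import urlopen
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect (href, anchor text) pairs from a links page."""
    def __init__(self):
        super().__init__()
        self.links, self._href, self._text = [], None, ""

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")

    def handle_data(self, data):
        if self._href:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            self.links.append((self._href, self._text.strip()))
            self._href, self._text = None, ""

# Hypothetical inputs; substitute the real links page and your own topic words.
page_url = "http://www.example.com/links.html"
topic_words = {"web", "design", "seo"}

parser = LinkCollector()
parser.feed(urlopen(page_url).read().decode("utf-8", errors="ignore"))

print("Total links on the page:", len(parser.links))  # check 4: under or over 100?
off_topic = [text for _, text in parser.links
             if not topic_words & set(text.lower().split())]
print("Anchors with no topic word:", len(off_topic))  # a rough proxy for check 5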

You might also want to download the Alexa toolbar. I use both Alexa's and Google's toolbars.

When you are looking at web sites requesting links, you can see whether their site ranks in the top 100 ~ 100,000 ~ 500,000 ~ 1,000,000 and so on. With the number of web pages numbering in the billions, this can be a useful metric for deciding whether or not to link with a particular web site.

Too many link pages are built for search engines and not human visitors.

The new Google update detects this. It probably will not get your web site banned, but it certainly won't help your rankings. I'm not trying to tell you how to run your link campaign, but I am very careful not to link to a page built for search engines rather than for humans. In light of this update, and feedback from other SEO pros, I have lightened my link load, added thorough descriptions for each link, and removed links not closely related to my business. When time permits I may even add thumbnails of my link partners' web sites.

I want my links page to be a resource for users not spider food for search engines.

The Google Jagger Update
Google has made some major changes to their search engine algorithms. However, the update has not yet fully run its course. Some of the most widely discussed elements of this Google update include:

• Hidden text, especially text that is hidden in CSS or DIV layers
• Paid linking or reciprocal linking that is considered outside of “Google Quality Guidelines”
• Using internal links or anchor text as one’s sole source of optimization

THE BOTTOM LINE IS: BUILD YOUR LINKS PAGES FOR PEOPLE
- NOT SEARCH ENGINES!

This should improve your position. Create thoughtful pages with links and detailed link descriptions. "Build it for the user" not the "spiders".

See my link partners page: http://www.visionefx.net/partners.htm.
It's not perfect, but I'm striving to create a better page that will interest a casual or professional visitor.

More info about the Google update here:
http://groups.google.com/group


About the Author
Ricardo Vidallon Site Owner and Designer http://www.visionefx.net

Friday, November 04, 2005

Jagger or Jäger? Google’s Update Unraveled

After the past week, you may feel like you need a bottle of Jägermeister (Jäger) to digest the recent Google update. There’s even been some naming discussion by Danny Sullivan, Brett Tabke, Matt Cutts and others. While each has provided ample reasoning for their proposed name, I find Brett’s reasoning most compelling, so I’ll use Jagger.

What does the Jagger Update really mean? Matt Cutts has been providing regular “weather updates” on his blog, and based on that, reading I’ve done, and experience with our clients, Jagger seems to be an effort to increase the proportion of relevant content in the Google SERPs by removing some of the spam.

Some of the most widely discussed elements include:

• Hidden text, especially text that is hidden in CSS or DIV layers
• Paid linking or reciprocal linking that is considered outside of “Google Quality Guidelines”
• Using internal links or anchor text as one’s sole source of optimization

For more commentary, try SearchEngineWatch and WebMasterWorld, but keep in mind this is all just speculation. Only Google has all the answers.

As for my personal take, I’ve investigated the impact Jagger has had on our clients so far, and what I’ve found definitely supports the commentary I’ve been reading.

Very few of our clients have seen any impact to their rankings as a result of this update, and we’ve identified one or more of the above mentioned techniques in use for those clients that have been affected. While we screen clients’ programs carefully to eliminate spam techniques, they sometimes slip by, or are added after we initiate the program.

In one particular situation, a client participated in a link building effort they believed would enhance their SEM campaign, not hinder it - and found it was quite the opposite when Jagger hit.

All that being said, the update isn’t over yet. So while we’ve certainly made it through the eye of the storm, the hurricane’s still a-blowin’. GoogleGuy, engineer at Google and frequent poster to WebmasterWorld, wants us to think about Jagger as three updates in one:

“I believe that our webspam team has taken a first pass through the Jagger1 feedback and acted on a majority of the spam reports. The quality team may wait until Jagger3 is visible somewhere before delving into the non-spam index feedback. If things stay on the same schedule (which I can’t promise, but I’ll keep you posted if I learn more), Jagger3 might be visible at one data center next week.”

So should you panic? Not as long as you’re implementing best-practice SEO techniques. Notice that almost all of the techniques listed above are considered “spam practices”? Sure, internal linking and anchor text aren’t spam, but overusing them, or using them as the only method of optimization, is certainly not a best practice.

If you’re an SEO, what do you tell your clients or VP of Marketing about a shift like this?

The answer’s easy. If you’ve been following best practices and aren’t engaging in any spammy link practices, you’re probably fine.

If you have noticed a shift in your rankings and are sure that you don’t have any of the above tactics implemented in your program, it’s best to just wait it out. Since the update isn’t over yet, it’s very possible that your site will go back to where it was – and that includes dramatic increases in rankings as well.

If you or your clients’ rankings have fallen dramatically, ask them if they’re engaging in any of the practices listed above. If they are, it’s a good idea to go ahead and remove the offending content, as the Jagger 3 update might pick up the change faster than normal indexing will later.

Here at WebSourced, we’re also riding out the changes, and so far very few of our clients have been affected. For those that have, we’re employing the strategy outlined above, and continuing to optimize with best practices in the meantime.

If you’re an algoholic, you’ve just gotten the lowdown. Go and relax with a little Jäger.

- Jenny “Weather Analyst” Halasz

Wednesday, November 02, 2005

Jagger Update rattles SEO world

In yet another of its constant efforts to improve the site-ranking search mechanisms currently in place, Google has embarked on an algorithm update that has caught the attention of webmasters everywhere.

The search engine giant is believed to be in the second phase of a three-part overhaul that is seen by some in the field as a major update. With many webmasters seeing big changes in their rankings, the SEO experts are being inundated with phone calls of alarm from those who have seen drops in their current status. But SEO experts caution that it is necessary to let the new algorithm run its course. Those experts indicate that a sudden change in rankings this week is by no means a new status quo for a site and that search results for the current week will likely change yet again in the weeks ahead.

Citing the Florida update of two years ago, experts insisted that it might take upwards of three months for the search results to settle out properly.

In his blog, Google engineer Matt Cutts appears to categorize the changes as less substantial than those witnessing the updates suggest. Says Cutts, "These days rather than having a large monolithic update, Google tends to have smaller (and more frequent) individual launches."

As with other such updates, Brett Tabke from WebMaster World has given the new Google algorithm a name, in this instance Jagger. Among the rumored changes in the Google update are greater penalization for hidden text spam because the overhaul now also recognizes text that is hidden in invisible CSS layers.

In addition, some web blogs have speculated that links from automated exchanges and from text link advertising systems now have a less positive impact in SEO rankings while still others suggest that changes may stem from a series of back-link updates that began in early September.

By Thomas Hanson, Nov 1, 2005, 11:33. © Copyright 2004-05 SEOProject.com

Thursday, October 27, 2005

Google Jagger Update

Googler Matt Cutts – who’s becoming a sort of weather man for the SEO crowd – has some info on the current Google rankings update (which has been dubbed “Jagger” by WebmasterWorld). Matt stresses the point that this is not one single large update, but rather a series of smaller and medium-sized ones. Matt says there might be some noticeable PageRank (and backlink) changes to expect in the coming days.

Other great resources about the Google Jagger Update:
Jagger1 - Included changes in displayed PR and BL
Jagger2 - Has started and will probably end next week.
Jagger3 - Will start next week hopefully (Wednesday probably at the earliest)
See: http://www.ysearchblog.com/archives/000095.html and http://www.jimboykin.com/45/

My friend and SEO mentor, who shall remain nameless (the less Google knows about him the better), explained it to me from his point of view. 'Mr. K', as I will call him, became involved in the Internet when Cobalt, dBase and FoxPro were all the rage.

This is Mr. K's take on the recent Google Dance:
When Google rolls out a radically new 'algo' they don't feed this bad boy to all their server units. Some server IPs are blocked so they can test the new 'algo' against the group of blocked servers. The updated servers, or the 'Google feed' you and I see, were rolled back (I'm guessing 6-8 months) so they can test the indexing speed of the new 'algo' against the old 'algo' on the blocked IPs.

Hmmm- This makes sense!

Fast, greasy, blinding indexing speed means faster Google caching of data. In light of the growing paradigm of new data being posted to the World Wide Web - speed is what it’s all about. I'm also guessing that 'Google Base' figures into this equation, but that's food for another blog thought.

So my friends, keep the faith, and next time don't put all your marketing eggs in one basket. There's never been such a thing as a free lunch!

Rick Vidallon
www.visionefx.net

Thursday, October 13, 2005

Google Inc. and Sun Microsystems Partnership

Last week, Google Inc. CEO Eric Schmidt and Sun Microsystems CEO Scott McNealy announced a distribution partnership.

What did they decide?
Google's toolbar will be bundled into downloads of the Java Runtime Environment and Sun's Java will be used to power new software developed and released by Google.

Google might also include links to Sun software that directly competes with Microsoft software such as the Open Office suite in future updates of its toolbar.

What does this mean for search?
This is probably only the first step in Google's and Sun's battle against Microsoft. Google wants to win more market share on the desktop of computer users and it wants to move computer applications from the desktop to the Internet.

Google has also recently filed a new patent that indicates that Google is working on a way to constantly monitor all of your actions in order to build personalized search queries.
According to the patent specification, Google aims to monitor whatever you type in your word processor, the things you copy to your clipboard, the position of your mouse, the content of your emails, instant messenger messages and more.

If Google has access to Sun's free Open Office suite, it might be easier to do that. By gathering as much information about you as possible, Google can offer you personalized search results and - more important to Google - personalized ads.

What does this mean to you? It seems that many of Google's recent "free" applications mainly serve the purpose of gathering more data about you so that Google can monetize that information through targeted ads.

If you use many different Google services, you share a lot of information with Google. It's up to you to decide if you're willing to exchange private information for "free" software and services.

This distribution partnership is probably only the start. It's likely that we can expect a lot more from this alliance between these two online giants.

Monday, October 10, 2005

Chasing the Search Engines' Algorithms... Should You or Shouldn't You?

Chasing the Search Engines' Algorithms... Should You or Shouldn't You?
By Robin Nobles

It’s a common occurrence. SEOs often spend countless hours trying to "break" a search engine's algorithms. "If I could just crack Google's algorithm, my pages would soar to the top of the rankings!" Let's look at some flaws in this way of thinking.

1. Picture the Google engineers and tech folks turning the algorithms dial as soon as you "think" you have "cracked" the algorithms. Your rankings may fall, and you would have to figure out what's working with the engine right now. In other words, your rankings may never be long term.

2. Instead of spending all of this time trying to impress a search engine with a perfect page, why not impress your true target audience... your customers. Has Google, MSN, or Yahoo! Search ever bought anything from you? They're not your target audience. Your customers are your target audience. Write your pages and content for them.

3. When you expend so much of your energy chasing algorithms, you often focus on only a few elements that influence ranking – those elements that are working right now and that you hope will give your pages the best chance for success. It is said that Google weighs over 100 elements when determining ranking and relevancy. Some are more important than others. But focusing on just one or two "main" elements and discounting the rest can prove disastrous to a Web site.

A different approach . . . Wouldn't you rather achieve top rankings and keep them there, and have those rankings equate to sales and money in your back pocket? After all, isn't it ultimately the sales you're after, as opposed to just the rankings? If those rankings don't equate to traffic that equates to sales, you lose, any way you look at it.

Five Basic Steps for Achieving Top Rankings without Chasing Algorithms
1. Forget about the search engines. Yes, you heard me correctly. The search engines aren't and never will be your "ideal target audience." They don't buy your goods and services. They're not who you should be trying to please with your Web pages and site. Instead, write your Web page content for your target audience.

2. Don't ever forget the basics. No matter what's happening in the algorithms, continue using your main keyword phrase prominently in your title tag, META description and keyword tags, link text, body, heading tags, and so forth. That way, when the algo dial is turned, you won't have to make changes to all of your pages. You'll always be ready. (A brief markup sketch follows this list.)

3. Focus your keyword-containing tags and body text on one keyword phrase only. Each page should be focused on one keyword phrase, and each page should have its own unique tags.

4. Write well-crafted content for your Web pages, and add new content on a regular basis. If content is king, context is queen. Focus on your keyword phrase, synonyms and related words, and surrounding text. Use a program like Theme Master if you need help determining those supporting words.

5. Remember that both on-page and off-page factors are important. Don't sacrifice one for the other. On-page factors are your tags, body text, prominence, relevance, etc. Off-page factors are link popularity (quality and number of your inbound links) and link reputation (what those inbound links "say" about your Web page when they link to you).
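As a rough illustration of those basics, here is a minimal sketch of a page focused on a single phrase. The company name, keyword phrase and file name are invented for the example; any one focused phrase would be handled the same way, with the phrase appearing in the title, META tags, heading, body copy and link text.

<!-- Hypothetical page built around one keyword phrase: "handmade leather wallets" -->
<html>
<head>
<title>Handmade Leather Wallets - Acme Leather Goods</title>
<meta name="description" content="Handmade leather wallets crafted from full-grain hides.">
<meta name="keywords" content="handmade leather wallets">
</head>
<body>
<h1>Handmade Leather Wallets</h1>
<p>Our handmade leather wallets are cut and stitched by hand from full-grain hides ...</p>
<!-- Links pointing to this page should carry the phrase in their link text as well -->
<a href="handmade-leather-wallets.html">Handmade leather wallets</a>
</body>
</html>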

What about search engine research? Isn't it important? It's crucial.
Let me give you an example. At the beginning of this year, pages began falling out of Google's index. The forums were alive with speculation about what to do about it. Through research, we determined this was a compliancy issue. With compliant code, the search engine spiders are more easily able to spider the content. The solution? Make sure you use a DOCTYPE tag and an ISO Character Set Statement at the top of every Web page.

For example:
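A minimal sketch of those two lines, using an HTML 4.01 Transitional DOCTYPE and the ISO-8859-1 character set (both common at the time; any valid DOCTYPE paired with a matching character set statement serves the same purpose):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<!-- The character set statement tells browsers and spiders how the page is encoded -->
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Your page title</title>
</head>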


If you didn't know about the compliancy issues, you could have made changes to your Web pages that didn't need to be made, wasted countless hours trying this or that, all to come up dry. Research helps to make sure you remain on top of what's happening in the search engine industry. It's what sets you apart from other SEOs. You make your decisions based on research and facts, versus speculation and theory.

In Conclusion...
"Take it from someone who has been in this business for nine years and studies the algorithms closely - don't chase the algorithms. You say that you have a #2 ranking for a certain keyword phrase that alone is bringing your site 550 visitors per day? Great. In the time that you have spent gaining that ranking, I have written 285 pages of unique content, obtained 821 links, etc., and collectively I am getting over 1,300 visitors per day," says Jerry West of WebMarketingNow.

In other words, by focusing on more than just chasing algorithms, you have the potential of having a much more successful Web site.

About The Author
Robin Nobles conducts live SEO workshops in locations across North America. She also teaches online SEO training and offers the Workshop Resource Center, a networking community for SEOs. Localized SEO training is being offered through the Search Engine Academy. Copyright 2005 Robin Nobles. All rights reserved.


POST NOTES FROM VISIONEFX
Also See Google Sitemaps
Visionefx is partnered with Hostmysite, a hosting company that provides a free Google Sitemap generator as part of all its web hosting packages. It is easy to use and takes the guesswork out of manually configuring a Google Sitemap file.
More about Hostmysite Google Sitemap Auto-Generator!

Saturday, October 08, 2005

Ten reasons to redesign your web site

Ten Reasons to Redesign your Website
Published: 14 August 2003 in Design. By: Nick Tatt

1. Beat the competition
The power of the web allows people to find information at the drop of a hat. It is possible for new customers to find your web site from anywhere in the world. The down side is that they can also find your competitors just as easily. It is important to make a good first impression and stay ahead of the competition. Failure to do so could lose you valuable customers. If your web site is comparable to your competitors’, consider a timely redesign to make sure you are leading the pack rather than following.

2. Present your organisation’s current market position
Organisations need to evolve to ensure they can deliver what customers need today. If your web site was designed a couple of years ago and has not been updated since, it is possible that it does not reflect your organisation’s current market position. A web site redesign is a great opportunity to evaluate where you are today and where you want to be in the future.

3. Out with the old, in with the new
Websites date. It is an unfortunate fact for organisations that websites show their age if left unattended. Considerable damage can be done to your reputation if customers discover that information or products on your web site are out of date or, worse still, incorrect. If your website is out of date, consider using some kind of content management system, appropriate for the job, to keep it fresh.

4. Self service
The beauty of the web is its immediacy.
You don’t have to wait for the current batch of printed brochures to run out. With a website a quick update can get the latest product information out to a global audience. The theory is great but in practice many websites forget about their content and it soon becomes out of date. There are a few reasons why this might be but often it is because the website can only be updated by one person, probably the original designer. If you struggle to keep your website content fresh it might be time to consider a redesign, allowing people within your organisation to keep it up to date.

5. Make the site more usable and give the client what they want
How many times have you visited a site and not been able to find what you needed, or tried to buy something online only to find that it is out of stock? Problems like these can be avoided with some thought about what your customers are trying to do on your website. Resolving the problem might be as simple as improving the signposts to key sections or pages, but a redesign allows you to listen to your customers and create something that will be easier to use.

6. Reach a wider audience
Just because you have a website doesn’t mean people will automatically find it. Competition for the top slot in search results is fierce. Making your website ‘search engine friendly’ will improve the chances of it being found. Building a ‘search engine friendly’ website that conforms to web standards from scratch is more effective than trying to adapt an existing one. In doing so you can ensure that it appeals to customers and search engines alike.

7. Increase your sales
The Holy Grail for all websites. Websites are often designed with no thought about how organisations can harness the power of the web effectively. The wrong message in the wrong place can result in a website failing to meet your organisation’s expectations. Developing a web presence is relatively straightforward, but developing one that meets your organisation’s expectations and goals is a little more complex.
Are visitors being channelled to the right sections of the site to make a purchase? Does the design promote your current brand? Are people signing up to your newsletter? Is the copy right for the site? If you suspect they may not be, then it is time to get your website to work a little harder for your organisation.

8. Create appropriate content for the web
Web content is different to print content. With the immediacy of the internet it is not appropriate to use the same copy as your print material. When reading web pages users scan for information relevant to them. If your content is not presented in a way that delivers that information swiftly you run the risk of losing customers to your competition.

9. Promote an event or product launch
The launch of a new product or event might be the catalyst to consider a website redesign. If your current website fails to do an event or product launch justice you could damage its success. A website tailored to the needs of the event/product launch is far more effective than trying to shoehorn it into your website.

10. Communicate with your customers
What better way to raise your organisation’s profile than by announcing a new, improved website delivering what your customers want in a clear, usable fashion? The launch of a new website is a great excuse to contact your customers and strengthen your relationship with them.

Friday, October 07, 2005

W3C Compliance & SEO

W3C Compliance & SEO
By Dave Davies

From reading the title, many of you are probably wondering what W3C compliance has to do with SEO, and many more are probably wondering what W3C compliance is at all. Let's begin by shedding some light on the latter.

What Is W3C Compliance?

The W3C is the World Wide Web Consortium and, basically, since 1994 the W3C has provided the guidelines by which websites and web pages should be structured and created. The rules they outline are based on best practices, and while websites don't have to comply to be viewed correctly in Internet Explorer and other popular browsers that cater to incorrect design practices, there are a number of compelling reasons to ensure that you or your designer follows the W3C guidelines and that your site is brought into compliance.

In an interview with Frederick Townes of W3 EDGE Web Design, he mentioned a number of less SEO-related though very compelling arguments for W3C compliance. Some non-SEO reasons to take on this important step in the lifecycle of your site are:

* Compliance helps ensure accessibility for the disabled.

* Compliance helps ensure that your website is accessible from a number of devices, from different browsers to the growing number of surfers using PDAs and cellphones.

* Compliance will also help ensure that, regardless of the browser, resolution, device, etc., your website will look and function in the same or at least a very similar fashion.

At this point you may be saying, "Well that's all well and good, but what does this have to do with SEO?" Good question.

We at Beanstalk have seen many examples of sites performing better after we brought them, or even just their homepage, into compliance with W3C standards. When I discussed this with Frederick, he explained it very well:

"Proper use of standards and bleeding edge best practices makes sure that not only is the copy marked up in a semantic fashion which search engines can interpret and weigh without confusion, it also skews the content-to-code ratio in the direction where it needs to be while forcing all of the information in the page to be made accessible, thus favoring the content.
We've seen several occasions where the rebuilding of a site with standards, semantics and our proprietary white hat techniques improves the performance of pages site-wide in the SERPs."

Essentially what he is stating is a fairly logical conclusion: reduce the amount of code on your page and the content (you know, the place where your keywords are) takes a higher priority. Additionally, compliance will, by necessity, make your site more easily spidered and also allow you greater control over which portions of your content are given more weight by the search engines.
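As a rough, hypothetical illustration of that content-to-code point (the heading text and class name are invented), compare a presentational fragment with a semantic, standards-based one:

<!-- Presentational markup: the copy is buried in layout code that carries no meaning -->
<table width="100%"><tr><td align="center">
<font face="Arial" size="5" color="#003366"><b>Widget Cleaning Tips</b></font>
</td></tr></table>

<!-- Semantic markup: far less code, and the heading tells spiders exactly what the page is about -->
<h1 class="page-title">Widget Cleaning Tips</h1>

The copy is identical in both cases; the second version simply has far less code wrapped around it, with the styling handled in the stylesheet, so the content-to-code ratio shifts in the content's favor.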

Examples
The Beanstalk website and the W3 EDGE site themselves serve as good examples of sites that performed better after complying with W3C standards. With no other changes than those required to bring our site into compliance, the Beanstalk site saw instant increases. The biggest jumps were on Yahoo!, with lesser though still significant increases being noticed on both Google and MSN.

As we don't give out client URLs, I can't personally list client sites on which we've noticed the same effect; however, we can use W3 EDGE as another example of a site that noticed increases in rankings based solely on compliance.

So How Do I Bring My Site In Compliance With W3C Standards?
To be sure, this is easier said than done. Obviously the ideal solution is to have your site designed in compliance to begin with. If you already have a website, you have one of two options:

1. Hire a designer familiar with W3C standards and have your site redone,

(or)

2. Prepare yourself for a big learning curve and a bit of frustration (though well worth both).

Resources
Assuming that you've decided to do the work yourself, there are a number of great resources out there. By far the best that I've found in my travels is the Web Developer extension for FireFox. You'll have to install the FireFox browser first and then install the extension. Among other great tools for SEO, this extension provides a one-click check for compliance and provides a list of where your errors are, what's causing them and links to solutions right from the W3C. The extension provides testing for HTML, XHTML, CSS and Accessibility compliance.

Other resources you'll definitely want to check into are:
CSS Zen Garden ~ A List Apart ~ Holy CSS ZeldMan!

(Frederick lists this one as one of the best resources for the novice to find answers. I have to agree.)

Where Do I Get Started?
The first place to start would be to download FireFox (count this as reason #47 to do so as it's a great browser) and install the Web Developer extension. This will give you easy access to testing tools. The next step is to bookmark the resources above.

Once you've done these you'd do well to run the tests on your own site while at the same time keeping up an example site that already complies so you can look at their code if need be.

To give you a less frustrating start I would recommend beginning with your CSS validation. Generally CSS validation is easier and faster than the other forms. In my humble opinion, it's always best to start with something you'll be able to accomplish quickly to reinforce that you can in fact do it.

After CSS, you'll need to move on to HTML or XHTML validation. Be prepared to set aside a couple hours if you're a novice with a standard site. More if you have a large site of course.

Once you have your CSS and HTML/XHTML validated, it's time to comply with Accessibility standards. What you will be doing is cleaning up a ton of your code and moving a lot into CSS, which means you'll be further adding to your style sheet. If you're not comfortable with CSS, you'll want to revisit the resources above. CSS is not a big mystery, though it can be challenging in the beginning. As a pleasant by-product, you are sure to find a number of interesting effects and formats that are possible with CSS that you didn't even know were so easily added to your site.
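For instance, here is a hypothetical before-and-after (the class name and stylesheet file name are made up) of moving presentation out of the markup and into the style sheet:

<!-- Before: presentation mixed into the markup -->
<p><font face="Verdana" size="2" color="#333333">Our spring catalogue is now online.</font></p>

<!-- After: the markup carries only content and a class name -->
<p class="body-text">Our spring catalogue is now online.</p>

/* ...while the presentation lives in the style sheet (e.g. styles.css) */
.body-text { font-family: Verdana, sans-serif; font-size: 0.8em; color: #333; }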

But What Do I Get From All This?
Once you're done you'll be left with a compliant site that not only will be available on a much larger number of browsers (increasingly important as browsers such as FireFox gain more users) but you'll have a site with far less code that will rank higher on the search engines because of it.

To be sure, W3C validation is not the "magic bullet" to top rankings. In the current SEO world, there is no one thing that is. However, as more and more websites are created and competition for top positioning gets more fierce, it's important to take every advantage you can to not only get to the first page, but to hold your position against those who want to take it from you as
you took it from someone else.

About The Author
Dave Davies is the CEO of Beanstalk Search Engine Positioning, Inc. He writes with years of experience in SEO and Internet Marketing. A special thanks goes out to Frederick Townes of W3 EDGE for his help with this article. W3 EDGE provides W3C-compliant web site design for their clients. To keep updated on new SEO articles and news, be sure to visit the Beanstalk blog regularly.

POST NOTE from VISIONEFX:
If you validate your web site (or your client's website), display the W3C badge of honor! Link to the W3C validation page and display their logo on your site or your client's site!

I did this on my company web site, VISIONEFX.
Did it help my business and SEO? You bet!

Wednesday, October 05, 2005

DMOZ in 2005

(a.k.a. The Open Directory Project)

By Phil Craven (c) 2005 WebWorkShop
The original concept of DMOZ was excellent for its time. The DMOZ site's About page makes these statements about the concept, and about the reasons for the directory's creation: "Automated search engines are increasingly unable to turn up useful results to search queries. The small paid editorial staffs at commercial directory sites can't keep up with submissions, and the quality and comprehensiveness of their directories has suffered. Link rot is setting in and they can't keep pace with the growth of the Internet."

"The Open Directory follows in the footsteps of some of the most important editor/contributor projects of the 20th century. Just as the Oxford English Dictionary became the definitive word on words through the efforts of volunteers, the Open Directory follows in its footsteps to become the definitive catalog of the Web."

But things have changed a lot since DMOZ began in the late 1990s. Since then, Google came along with very relevant search results, and they were kind enough to show the other engines how to produce such relevant results. That caused dramatic improvements, to the extent that the top search engines have been able to provide very relevant search results for some time, and they provide a lot more of them than DMOZ is able to do.

The small paid editorial staffs at commercial directory sites still can't keep up with submissions, but their backlogs are small when compared with DMOZ's massive backlog. According to reports, there are over a million site submissions that are waiting to be reviewed, and delays of several years between submitting a site and it being reviewed are not uncommon. The backlog problem is so huge that many editors have redefined the problem so that it no longer exists. To them there is no backlog, because the submitted sites are not there to be reviewed. They are merely a low priority pool of sites that they can dip into if they want to, and some of them prefer to find sites on their own.

Link rot (dead links) has become widespread in DMOZ through the years, and they certainly can't "keep pace with the growth of the Web". There isn't a single reason for the creation of DMOZ that DMOZ itself doesn't now suffer from. So how come such an excellent original concept ended up with a directory that has the same problems that it sought to solve, and on a much larger scale?

One reason is that the Web has grown at a much faster pace than was perhaps anticipated, and the DMOZ editors simply can't keep up. Another reason is that there are simply not enough editors who are adding sites to the directory. At the time of writing, the DMOZ front page boasts 69,412 editors, but that is the number of editors that they've had since the beginning, and most of them are no longer there.

A recent report stated that there are currently about 10,000 editors who are able to edit, and that only around 3,000 of those are active in building the directory. The word "active" is used to describe editors who actually edit quite often, but as little as one edit every few months is acceptable. The word doesn't mean "busy", although some of them are. With so few people doing anything, it isn't even possible for them to keep up with the link rot in such a huge directory,
and there's the ever increasing problem of listings that link to topics other than what they were listed for. It simply isn't possible for them to maintain the directory as they would like.

The idea of becoming "the definitive catalog of the Web" was a fine one, but it turned out to be an impossible dream. The purpose of DMOZ is dead. Today's search engines produce excellent results in large quantities, and much more quickly than drilling down into a directory to find something.

So is there any value at all in the DMOZ directory? As a useful catalog of the Web, and when compared with the major search engines, the answer is no, although a few people do find it to be a useful research resource. For website owners, the links to their websites that a listing in DMOZ creates are useful for search engine ranking purposes, but even those are becoming less useful as search engines improve, and seek to block out unwanted duplicate content from their indexes.

It was a fine concept, and it looked promising for a while, but the idea of DMOZ becoming the definitive catalog of the Web is gone. Improvements in the search engines eclipsed its value, and the growth rate of the Web meant that it could never achieve its goal. It began with an excellent concept, and they gave it a good shot, but it didn't work. The continuing growth rate of the Web ensures that it can never work. It continues as a good directory of a large number of web sites, but that is all. And not many people use directories when the search engines produce such good results, and so quickly.

About The Author
Article by Phil Craven of WebWorkShop. Phil is well-known in the world of webmasters and search engine optimization and his views have been sought and published by various online and offline publications.

Tuesday, October 04, 2005

Using Content Hubs To Promote Your Web site

Using Content Hubs To Promote Your Web site
By David Risley

We've all heard it before: content is king. And it is true. If you own a site, you need to post something interesting that people want to read before you can expect people to stop by. If your site is a content-based website, then you've already taken a huge step.

However, if your website is a business website whose only purpose is to talk about your services, then you really should make an effort to post some content onto your website which is helpful to readers, free, and relevant to your services or website. If you do this, your site will attract traffic from people looking for information, not just to purchase something. And with increased traffic in general comes increased attention and better statistics.

Writing content for your own website is only half the battle, though. You have got to get people to read it. Just posting a website is not going to get people to come to it. It would be like building a business in the middle of the mountains: nobody knows it's there and you won't get any customers. If you get your articles out there for people to read, and the articles are written correctly, you can position yourself as an expert in your field and promote your own website. One way to do this is by publishing on content hubs rather than limiting your articles to your own website.

A content hub is a site which publishes articles on all topics (usually categorized). Those articles are freely available to anyone to use on their own website, newsletter, blog, etc. So, many publishers or site owners in need of fresh content for their website can go to one or more of these content hubs, find an article they like, and use it. They have to maintain proper credit to the author and publish the small author bio which accompanies the article.

Let's look at this, though, from the author's viewpoint - your viewpoint. Let's say you are selling consulting services for search engine optimization. You have a site for your services, but you blend in with all the other such services. So, you write a series of articles giving tips to webmasters on how they can optimize their website. With your article you include a short bio of yourself.

You include a mention of your services and a link to your website, and you publish your article on a bunch of content hubs. Other websites, newsletters and blogs grab your article off those sites and use it on their own. Your article therefore spreads throughout the internet. Because your site is linked with the article, and is therefore on all of these other websites now (including the content hubs themselves), search engines that are constantly spidering the internet pick up on your article and index it in association with your website.
This, in turn, raises your ranking in the search engines. And you get increased traffic to your website not only from search engine searches but also from your article.

Now, let's say you have done some research on keywords and you interlace your article with certain keywords. When the search engines spider your article all over the internet and associate it with your website, your search engine rankings will rise even more. There is a real science to this, and if done correctly it can drastically raise your internet presence in a short time. I recently had a meeting with the CEO of In Touch Media Group, a Clearwater, FL based company which is in the business of internet marketing.

They use content hubs as part of their strategy for clients and they couple this with their vast archived data regarding keywords. They showed me the stats of one site which they have, in the course of just a few months, taken from essentially no traffic to a VERY respectable level of traffic. After getting an article out in the content hubs, they will follow up a few weeks later with a press release.

So, how can you publish some of your articles on content hubs? Well, the first step is to find and visit them. There are many of them out there, but below are some of the better ones:

GoArticles.com
ISnare.com
SubmitYourArticle.com - a service to send your article to a bunch of hubs at once
ArticleCity.com
ExchangeNet.com
Article-Directory.net
FreeZineSite.com

There are services to help you distribute to a large collection of publishers at once. I have used Isnare's distribution service and it seems to work well. There are also distribution groups on Yahoo.

Here are a few of them:
Free-Content
Article Announce List
Article Announce
Articles4You2Use4Promotion
Article Submission
Free Reprint Articles

With that, I wish you the best of luck in your promotion efforts. Start writing!

About The Author
David Risley is a web developer and founder of PC Media, Inc. He specializes in PHP/MySQL development, consulting and internet business management. He is also the founder of PC Mechanic, a large website delivering do-it-yourself computer information to thousands of users every day.

Saturday, October 01, 2005

Why Should I Bother With Optimized Online Copywriting?

Why Should I Bother With Optimized Online Copywriting?

It's no good having a creative, individual website with brilliant, informative copy if customers can't find you on the internet. On the other hand, it's also detrimental if you have a website that can be easily found (has a high ranking) but people become bored and alienated reading it. Producing effective online copywriting is a creative process blending art and science in a balanced technique combining many different elements. This integration of disciplines is required to satisfy both the technical and the aesthetic objectives of a website.

Optimized online copywriting should ensure that your website is:
• highly readable to your viewers
• highly visible to the search engines, and thereby
• commercially successful for you.

Many people and businesses don't have the time to actually write web copy themselves. A professional freelance copywriter can furnish you with keyword-rich, highly original web content to enhance and improve the quality of your website with the aim of transforming more of your visitors into customers.

Rarely will you get a second chance to engage your customer's attention, so your first shot must be formatted for maximum sales potential, catching the eye of the search engine robots as well. But not too much… If your copy goes overboard in favor of the search engines it earns you a penalty from Google that will negatively affect your rankings. Your website must always have the reader as priority. This makes more business sense anyway.

Search engines provide a way for potential customers to find you on the internet. People type a key phrase or keyword into a search engine, such as Google, Yahoo or MSN (or one of the many other popular engines) and this returns a page of listings - web page suggestions for that particular phrase or word. Obviously, you want your website to feature highly in this list.

Optimized online copywriting specifically targets the words and phrases people are using when searching for a product on the internet (Search Engine Marketing (SEM), keyword research). You want to make sure your website stays at the top of the listings so people go to your website before others. With targeted copy in place, search engines are more likely to index your web site on page one than if it does not include keyword-rich copy. This is an ever more important
issue when dealing with Google, the leading search-engine today.

To rank highly in the search engines the words on your web pages should never be an afterthought, but should be included right at the beginning in the original design of your website. Content development is the most valuable asset web developers can utilize in the bid for productive, successful search engine optimization and Search Engine Marketing (SEM).

Hiring a professional copywriter is a wise investment in your business future. Even if you don’t want to optimize your site you should make sure that the words on your site are reasonable, enticing, spelled correctly and artfully arranged to engage attention. Just because you can type letters or write some emails doesn't mean you can write the copy for your website. The writing on your homepage is often how people determine whether the website is a scam or the genuine article. Your website’s credibility takes a nose-dive if the spelling is wrong, or the grammar is incorrect, or it just reads like bad, clumsy English. People will be disinclined to trust your content.

Within the search engines new technologies and algorithms are being developed all the time to make search methodologies smarter, more astute. It's never a coincidence when someone types in a search phrase and your website is indexed highly on the page. Keyword rich online copywriting is a significant and critical component in gaining high rankings on the search engines.

IMPORTANT!
Google has been pioneering a new trend of intelligent search engines which are not attracted by mere repetition of words throughout the text, but which look for meaning, attempting to make grammatical sense of the information, trying to understand what the web page is actually saying. This is forcing webmasters to improve the content on their web pages or suffer the consequences.
The old saying has never been more relevant: 'content is king.'