Wednesday, November 23, 2005

Google's Jagger Update Completing Cycles

Ever since Google introduced its latest algorithm update in September, a fair amount of column space has been dedicated to telling webmasters and small business owners to wait until the update is complete. Insofar as the Jagger Update can ever be said to be complete, the final cycle of the immediate update now appears to be playing out.

Jagger was a different sort of algorithm update for Google. Its infamous predecessors, Florida and Hilltop, were generally limited shifts in the values Google assigned domains based on content and links. After the immediate punch of previous updates, the search engine results pages (SERPs) would generally return to a stable and predictable state. SERPs generated by Jagger, by contrast, are expected to keep updating themselves with a greater degree of flux and change.

So, what exactly happened during the Jagger Update and what might it mean to your website? Quite a bit as it turns out.

The Jagger Update was introduced for three main reasons. The first was to deal with manipulative link-network schemes, sites generated with scraped content and other forms of SE-spam. The second was to allow and account for the inclusion of a greater number of spiderable documents and file types. The third was to allow and account for new methods of site acquisition beyond the use of the spider Googlebot.

The update made its first public appearance in late September but had its greatest impact in early October. At that time, hundreds of thousands of websites that enjoyed previously strong listings were suddenly struck and sent to the relative oblivion found beyond the second page of results.

Most of those sites lost position due to participation in what Google obviously considers inappropriate linking schemes. This was actually one of the first conclusions we came to in late September based on the experience of a few clients who joined link-networks that had not been recommended or vetted by our link-experts. This is now backed up by discussion in various search engine forums. While most of those hurt by this part of the update are good people running honest businesses, Google put out notice that irrelevant link-networks, no matter how simple or complex, are unhealthy additions to what might otherwise be a good website.

The problem Google faced was that some webmasters misunderstood what links are for and how Google uses them to rank documents. For whatever reason, many webmasters or site administrators engaged in wholesale link mongering, bulking up on as many inbound links as possible without considering the most important factor (in Google's estimation): the relevance of those inbound links.

Now, Google appears to be applying filters based on historic data it has collected about all sites in its index over time. In other words, Google likely knows a lot more about documents linking to a particular website than the person who placed or requested the link in the first place. SEOs and webmasters should brush up on the “Information retrieval based on historical data” patent application Google filed on March 31, 2005 for highly detailed information.

Google is judging sites on who they link to along with who links to them. Before the update, a link from your site to an irrelevant site was more a waste of time than a waste of opportunity. Today, irrelevant links seem to be both. Google's desire to offer stable and highly relevant SERPs while preventing outright manipulation of those SERPs was the biggest cause of the shift.
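To make that idea concrete, here is a minimal sketch (written in Python, with invented weights and scores; nothing in it comes from Google's actual algorithm) of how a ranking system might discount a page whose inbound and outbound links point at off-topic documents.

    # Hypothetical illustration only: the 0..1 "relevance" scores and the
    # weights below are invented for this sketch, not taken from Google.

    def link_profile_score(inbound_relevance, outbound_relevance,
                           inbound_weight=0.7, outbound_weight=0.3):
        """Blend the average topical relevance of inbound and outbound
        links into a single 0..1 score; irrelevant links drag it down."""
        def average(scores):
            return sum(scores) / len(scores) if scores else 0.0
        return (inbound_weight * average(inbound_relevance) +
                outbound_weight * average(outbound_relevance))

    # A site with mostly on-topic links...
    print(link_profile_score([0.9, 0.8, 0.85], [0.9, 0.7]))      # ~0.84
    # ...versus one bulked up with irrelevant reciprocal links.
    print(link_profile_score([0.9, 0.1, 0.2, 0.1], [0.2, 0.1]))  # ~0.27

The toy numbers only illustrate the principle described above: a handful of on-topic links outweighs a pile of irrelevant ones once relevance, rather than raw volume, is what gets counted.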

The second and third reasons for updating the algorithm at this time are the allowance for indexing documents or information obtained through alternative sources such as Google Base, Froogle, blogs and other social networking tools. Google's stated goal is to grow to include reference to all the world's information. That information is being expressed in multiple places using several unique file formats, some of which are difficult to weigh against others. By checking the file or document in question against the long-term history of documents linking to it, Google is better able to establish its theme and intent.

Mass adoption of blogs, while promoted by Google, gave the search engine a number of problems. Webmasters and search marketers will take almost any opportunity to promote their sites, by any means available. Blogs provided ample opportunities, and soon issues ranging from comment spam to scraped-content splogs started to gum up the SERPs. By comparing document content with the history of other related documents in its index, Google has become much better at spotting blog-enabled spam.

Google faced problems with forms of search engine spam such as fake directories and on-page spamming techniques such as hiding information in CSS files. The Jagger Update seems designed to deal with these issues by applying Google’s vast knowledge about items in its index against every document or file it ranks. A site that scrapes content, for example, might be weighed against the documents that content was originally published on and the intent of the republisher. One that hides information in the CSS file will similarly trip Google’s memory of how the same domain looked and operated before the spam-content was inserted.

The third reason for the algo update comes from the expansion of Google itself. Google is now much larger than it was when the Bourbon update was introduced in the early summer. Audio and video content is spiderable and searchable. Google's comparison shopping tool Froogle is starting to integrate with Google Local, just as Google Local and Google Maps are beginning to merge. There is some speculation in the SEO community that Google is preparing to integrate personalized data into the search results served to specific individuals. A strong assumption is that Jagger is part of Google's movement towards personalization, though there is little firm evidence to support this idea.

If your website is still suffering the lagging effects of the Jagger Update, your SEO or SEM vendor should be able to offer good advice. Chances are, the first thing he or she will do is a point-by-point inspection of the inbound and outbound links associated with your website. Next, they will likely suggest making it easier for Google to spider the various document file types in your site by providing an XML sitemap to guide Googlebot's spidering cycle. Lastly, they will likely suggest a look at how visitors behave when visiting your site. Visitor behaviour will play a part in Google's view of the importance and relevance of sites in its index. The introduction of Google Analytics provides webmasters with a lot of free information about site visitors, along with other information on how the site fares on Google's search engine. It also provides Google with a lot of information about the sites running it. More on the effect of Google Analytics on the SERPs next week.
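As a concrete example of the sitemap suggestion, here is a minimal sketch using Python's standard library that writes a basic sitemap file. The URLs and dates are placeholders, and the schema URL should be checked against Google's current Sitemaps documentation before you rely on it.

    import xml.etree.ElementTree as ET

    # Placeholder pages: substitute the URLs of your own site.
    pages = [
        ("http://www.example.com/", "2005-11-23"),
        ("http://www.example.com/products.html", "2005-11-20"),
    ]

    # The 0.84 schema was the one Google's Sitemaps program used at the
    # time of writing; verify the current one in Google's documentation.
    urlset = ET.Element("urlset",
                        xmlns="http://www.google.com/schemas/sitemap/0.84")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod

    # Write sitemap.xml to the site root, then submit it through your
    # Google Sitemaps account so Googlebot knows where to find it.
    ET.ElementTree(urlset).write("sitemap.xml",
                                 encoding="utf-8", xml_declaration=True)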

About the Author
Jim Hedger is a senior editor for ISEDB.com. He is also a writer, speaker and search engine marketing expert working for StepForth Search Engine Placement in Victoria BC. He has worked as an SEO for over 5 years and welcomes the opportunity to share his experience through interviews, articles and speaking engagements. Hedger can be reached at jimhedger@stepforth.com

Monday, November 21, 2005

Jagger, Google Analytics, and the Future of Search and SEO

Two big things have just happened in Google-land: Jagger and Google Analytics. Together, these two events may have changed the face of search forever.

Jagger

First, let's discuss Jagger... Just like hurricanes, Google updates have names. (A Google update is a change to the way Google determines its rankings. Google makes these changes periodically, and they're universally feared because they can impact dramatically on a website's ranking.) The latest update is called Jagger, and it has search engine optimizers (SEOs) all around the world in a state of panic.

Why was Jagger such a feared update? Simple... With Jagger, Google once again outsmarted huge numbers of SEOs. You see, many/most SEOs spend their time (and their clients' money) trying to trick Google into thinking that their websites are more relevant and important than they really are. They do this mostly by swapping links, buying cheap links, and placing links on free directories. While there's nothing wrong with these sorts of links (i.e. they're not considered 'black-hat'), they don't really show that the site is relevant or important. All they really show is that the site owner has made a deal with another site owner. In these deals, the incentive for the linking site owner is a reciprocal link, money, or increased link volume. Google much prefers it when the linking site adds the link simply to enhance the value of their content or to increase their own credibility and authority.

In other words, Google wants its search results to contain relevant, important sites, not sites that merely appear to be relevant and important. To this end, Google invests millions of dollars and employs the world's smartest mathematicians to create algorithms which identify sites that are trying to trick them. And that's exactly what Jagger did; and when it found those sites, it simply adjusted their ranking to more accurately reflect their true importance. (Unfortunately, it also demoted some sites which actually deserve a high ranking. It is hoped that these mistakes will be ironed out with future minor updates, but that's a topic for another article...)

From a technical standpoint, Jagger was well described by Ken Webster in his article, http://www.webpronews.com/topnews/topnews/Jagger . To summarize, Jagger:

1) Increased importance placed on IBL (Inbound Links) Relevancy?
2) Increased importance placed on OBL (Outbound Links) Relevancy?
3) Promotion of relevant Niche Directories (related to #1 & #2)?
4) More weight thrown back to PR @ top domain?
5) Increased importance on AdSense placement relevancy?
6) Possible introduction of CSS Spam filtering?
7) Overall Blog demotions?
8) New and unresolved "canonical" issues?

Some more interesting effects were reported by WG Moore (http://www.sitepronews.com/archives/2005/nov/9.html) who runs a number of test sites for SEO purposes. By monitoring the links to his test sites as reported by Google, he established that:

"all reciprocal links had vanished. We think that this is because Google is down-grading or eliminating reciprocal links as a measure of popularity. This does make sense, actually. Reciprocal links are a method of falsifying popularity. Sort of a cheap method of buying a link, if you want to think of it that way... During the second week of the Jagger Update, a few of our reciprocal links did come back up. However, we also noticed that these were from places where we had highly relevant content. They came from articles where we discussed our area of expertise: Web Analytics, or from forums where we had relevant threads. So we feel that these links came back because of content, not linking.

The other group that came back up was one-way inbound text links, regardless of the originating web site. These links also had strong relevance to our web analytics business. In other words, they contained keywords and/or phrases related to our site and its business."

In short, Jagger undid the hard work of thousands - if not millions - of people! As a result, hard-won high rankings and revenues plummeted.

Interestingly, article PR (article submission) came through Jagger seemingly unscathed. My SEO copywriting website http://www.divinewrite.com , for example, went from no.4 to no.1 worldwide for "copywriter", and I've employed article PR almost exclusively. Whether it was promoted or the sites around it were demoted, one thing is clear: article PR is one of the best ways to obtain a high ranking.

Google Analytics

The second monumental event to occur recently was Google Analytics - http://www.google.com/analytics/index.html . Google Analytics is a free web-stats solution which not only reports all the regular site stats, but also integrates directly with Google AdWords, giving webmasters an insight into the ROI of their pay-per-click ads. According to Google, "Google Analytics tells you everything you want to know about how your visitors found you and how they interact with your site."

Why is this such a landmark move? Because for the first time ever, Google will have access to your real web stats. And these stats will be far more accurate than those provided by Alexa - http://www.alexa.com . Furthermore, Google's privacy statement says: "We may also use personal information for auditing, research and analysis to operate and improve Google technologies and services." - http://www.google.com/intl/en/privacy.html . Now let's put two and two together:

1) Google is 'giving' every webmaster in the world free access to quality web-stats.
2) Millions of webmasters will accept this 'gift', if only because it integrates directly with their Google AdWords campaigns.
3) Google will then have full access to the actual web stats of millions of commercial websites.
4) Google will have the right to use these stats to develop new technologies.
5) What's the next logical step? Google will use these statistics to help determine its rankings, of course!

It should come as no surprise. It's been on the cards - and frequently discussed - for a long time. For example, Jayde Online CEO Mel Strocen recently published an article on this very topic, 'The Future of WebSite Ranking'. He quite rightly asserts that:

"Google's "democratic" vision of the Web will never be achieved by manipulating algorithm criteria based on content. It will only be achieved by factoring in what is important to people, and people will always remain the best judge of what that is. The true challenge for search engines in the future is how to incorporate web searcher input and preferences into their ranking algorithms."

In fact, the Jayde Online network already owns and operates a search engine, ExactSeek (http://www.ExactSeek.com) which incorporates user popularity statistics in its rankings.

The Future of Search & SEO

To date, ExactSeek is the only search engine which uses visitor stats as criteria for its rankings. But Google isn't far behind. We all know that Google specializes in taking a good idea and implementing and adapting it brilliantly. This is exactly what we'll see in this case. By combining link popularity and user popularity statistics, Google will be the only major search engine to consider both what other sites think of your website and what your visitors think of your website. And because Google has the most advanced algorithms for assessing link popularity, and will soon have access to the farthest-reaching, most accurate web stats for assessing user popularity, its competitors will be a long time catching up.
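As a purely speculative illustration of what "combining link popularity with user popularity" could look like, here is a small Python sketch. The signal names and weights are invented for the example; Google has published nothing of the sort.

    # Speculative sketch: neither the signals nor the weights come from
    # Google; they only make the blended-score idea concrete.

    def combined_score(link_popularity, visitor_signals,
                       link_weight=0.6, user_weight=0.4):
        """Blend a link-based score (0..1) with user-behaviour signals."""
        user_popularity = (0.5 * visitor_signals["repeat_visit_rate"] +
                           0.3 * visitor_signals["pages_per_visit_norm"] +
                           0.2 * visitor_signals["time_on_site_norm"])
        return link_weight * link_popularity + user_weight * user_popularity

    print(combined_score(0.72, {"repeat_visit_rate": 0.4,
                                "pages_per_visit_norm": 0.6,
                                "time_on_site_norm": 0.5}))   # ~0.62

A site with strong links but visitors who never return would score lower under a blend like this than under links alone, which is exactly the shift the article is predicting.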

So if that's the future of search, what's the future of SEO? The future of SEO is undoubtedly one where:
• One-way text links from relevant pages continue to be the most valuable links
• Reciprocal linking continues to decline
• The 'shotgun' approach to link buying declines
• Mass email link requests decline
• Free directory submission declines
• Niche directory submission increases
• Article PR (article submission) increases
• Article submission sites (e.g. EzineArticles - http://www.ezinearticles.com , GoArticles - http://www.goarticles.com , and ArticleBlast - http://www.articleblast.com ) play a much bigger and more important role in helping online publishers locate quality articles (due to the increasing article volume)
• User popularity is just as important as link popularity, which means:
- The quality of article PR improves in order to increase site traffic, credibility, and loyalty
- The quality of website content improves in order to convert traffic and encourage repeat visits

Clearly, the choices for SEOs will be pretty much limited to paying for links at niche sites and/or engaging in article PR. Being an SEO copywriter, I may be a little biased, but for mine, article PR is the hands-down winner in this comparison:
• It satisfies Google's criteria for relevance and importance. Linking site owners include your article and link because, in doing so, their site becomes more useful to visitors, and their business gains credibility and authority.
• It generates hundreds of free links quickly enough to make it worth your while, but not so quickly as to raise red flags at Google (in the form of link dampening).
• Links are permanent and you don't have to pay to keep them there.
• You get a lot of qualified referred traffic who already trust you and your expertise. This satisfies Google's visitor popularity criteria, while at the same time bringing you a lot of extra customers.

(For more information on article PR, read How to Top Google with Article PR .)

Conclusion

The lesson from Jagger is: don't try to trick Google! They've got more money and more brains than virtually any company in the world. It'll only end in tears! Don't spend time and money trying to make your site look important and relevant. Instead, spend that time and money actually making it important and relevant! Content - the real content behind the optimization - is the answer. After all, whether it's an article or a web page, it's the content that keeps 'eyes on paper', and that's what it's all about.

Happy optimizing!

Wednesday, November 16, 2005

On the Google Jagger Algo Update - Part 1

There has been a major update of Google’s ranking algorithm, changing the way the search engine orders search results. Atul Gupta of RedAlkemi discusses the consequences for webmasters.

By Pandia Guest Writer Atul Gupta

Google does minor algorithm updates almost on a monthly basis. Once in a while, it implements a major algorithm update.

If there is one thing search engine marketers and website owners fear, it is a major algorithm update, especially by Google. Well, much as we may wish otherwise, it's here. Google has recently undertaken a major algorithm update, nicknamed the "Jagger" update series.

The last major Google algorithm update, called the Florida update, happened in November 2003 and created quite a stir with website rankings.

Big changes in rankings

Like the Florida update, the Jagger update has done the much-feared "blender" act. It has churned the top-ranking websites and turned them into a list of unrecognizable pulp.

Google has long been the favorite among web users searching for information. Most feel that its search results have always been highly relevant. It would therefore be safe to assume that whatever algorithm Google has works just fine.

So why does Google need to re-engineer its perfect-looking algo so drastically? Has it not heard the saying “if it works, don’t fix it”?

Beating the spammers
From Google's standpoint, the reason is simple and valid. For starters, the web is ever-evolving and the algo always needs to be adjusted in order to provide the best results. Google has engineered an algorithm which it believes will reward good sites and rank them well for its users.

Google, like most other search engines, keeps this algorithm a closely guarded secret to prevent it from being exploited.

However, the SEO community is constantly at work trying to rank their sites well. Using calculated guesswork, logical thinking, special tests and extensive trial-and-error methods, they gradually figure out what the algorithm likes and dislikes.

Armed with this knowledge, it is not difficult to work on websites to rank them high in the SERPs (Search Engine Result Pages), irrespective of whether the site deserves to rank at the top or not. This kind of algorithm abuse results in 'less than desirable' websites displacing good sites from the top ranks, contaminating the Google index.

Consequently, following the Kaizen philosophy, Google needs to re-engineer its algorithms to keep what it believes are bad sites out of its top ranks. Naturally, major algorithm updates upset the current high-ranking websites and send a lot of SEO professionals back to their workbenches to start all over again.

The timing
What is interesting to note is the timing of the algorithm update. When Google updated its algorithm in November 2003, there were large-scale allegations by website owners that Google had intentionally upset the rankings of popular websites just before the Christmas shopping season to force them into buying Google AdWords paid advertising in order to sustain their visitor traffic.

While Google claims that algo update decisions are not influenced by the AdWords team, it is difficult to understand why it would once again choose a critical time just before the Christmas shopping season to update its algorithm.

The stakes are very high and this is business after all. Google earned $1.57 billion in Q3 of 2005. If the effect of the 2003 pre-Christmas algorithm update is any indication, I estimate that Google will record revenues of over $2.05 billion in Q4 of 2005.

Jagger history
The Jagger 1 update pre-shocks actually started with a string of back-link updates that began in September 2005 and continued into the middle of October 2005.

In mid-October, Google updated its PageRank database for public view. Usually updated once a quarter, the PR update always creates a stir.

While most SEO professionals heavily play down the importance of PR in ranking, the legacy of its importance is so deep-rooted in the minds of most webmasters that it is difficult to shake off as an insignificant ranking parameter.
[PageRank is Google’s measure of the “popularity” of a web page, based on the number and quality of incoming links. The Editor.]

It is believed that the second phase of the Jagger update — Jagger 2 — is now complete and replicated to all the data centers of Google. However, you may still notice some fluctuations in the rankings as things stabilize for each update.

We are now at the threshold of the third phase of the Jagger update, which is expected to begin sometime in the second week of November 2005.

The changes
From what we have studied so far, Google has re-engineered several aspects of its algorithm. Among other things we will learn as the update rolls out, we believe it has altered the impact of the following:
1. Value of incoming links
2. Value of anchor text in incoming links
3. Content on page of incoming links
4. Keyword repetitions in anchor text
5. Age of the incoming links
6. Nature of sites linking to you
7. Directory links
8. Speed and volume of incoming links created
9. Value of reciprocal links
10. Impact of outbound links / links page on your website
11. Sandbox effect / age of your site, domain registration date
12. Size of your site’s content
13. Addition and frequency of fresh content update
14. Canonical / sub domains, sub-sub domains
15. Multiple domains on same IP numbers
16. Duplicate content on same site or on multiple domains
17. Over-optimization, excessive text markup
18. Irrational use of CSS

We are studying various aspects of the Jagger algo update and are closely monitoring the impact of changes in each of the above-mentioned parameters, and many more not mentioned here. We shall discuss the impact of each of these aspects in the next parts of this article, which will likely be written once the Jagger 3 update and our study of it are complete.

In the meanwhile, we'd like to give a word of caution: if you have suffered a drop in your website rankings, do not make any drastic changes to your website until the Jagger 3 update is fully implemented and has stabilized.

There is a delicate balance and interdependence among all these parameters, which may bring back your rankings once the Jagger 3 update is completed.

About the Author: Atul Gupta is the founder and CEO of RedAlkemi.com (formerly known as SEOrank.com & PugmarksDesign.com), an internet marketing, e-commerce, graphic design, web & software development services company.

He has about 20 years of experience in the field of graphic design, visual communication, web development and search engine marketing services. He has spent the last nine years of his career devoted solely to search engine marketing and web development activities.

Friday, November 11, 2005

Google's Jagger Update - The Dust Begins To Settle?

By Ken Webster | Published: 2005-11-10

What happened? Webmasters, site owners, online businesses and SEO companies everywhere have been desperately trying to decipher the fallout from the longest and most grueling algorithm update in the history of the Internet.

Relevancy and revenue generation are the two top goals of any SE (search engine). As the Internet and its associated technologies mature, search engine algorithms have become much more complex. This was demonstrated in Google's three-to-four-week, three-phase "Jagger" update.

The initial response was very negative, and Google received more bad press from every conceivable corner than could have been imagined going in. Many sites fell completely out of Google's SERPs (search engine results pages) overnight, seemingly inexplicably. Some have recovered, many haven't, and others have actually seen improved traffic.

Further complicating any prognostication, Yahoo initiated a much milder index update during the latter phase of the Jagger update.

Google had several issues to deal with:
1) Scraper Sites
2) Faux AdSense Directory Sites
3) CSS Spamming Techniques
4) Growing "Generic" SERP Irrelevancy
5) Reciprocal Linking Abuse
6) Ballooning BlogSpam

Google had no choice but to act decisively and convincingly.

The following list is how we believe Google has handled these issues in the Jagger update:

1) Increased importance placed on IBL (Inbound Links) Relevancy?
2) Increased importance placed on OBL (Outbound Links) Relevancy?
3) Promotion of relevant Niche Directories (related to #1 & #2)?
4) More weight thrown back to PR @ top domain?
5) Increased importance on AdSense placement relevancy?
6) Possible introduction of CSS Spam filtering?
7) Overall Blog demotions?
8) New and unresolved "canonical" issues?

Let's look at each action separately:
1) Increased importance placed on IBL Relevancy

Reciprocal linking abuse was growing out of hand; even "organic" SERPs were losing relevancy because the majority of active site administrators were link-mongering anywhere and with anyone they could, regardless of relevant value. Google created that monster by throwing its weight behind quantity over quality for a long time. It appears the company simply started applying several published relevancy measurement factors (see US Patent Application #2005007174), which seem to have started becoming more noticeable during the "Bourbon" update.

2) Increased importance placed on OBL Relevancy?

The patent application mentioned above is ripe for OBL relevancy algorithm application. The "Bourbon" update ushered in a marked hit on irrelevantly linked and broader-based directories, while promoting "niche" or "focused", more relevant, topically based directories. It makes perfect sense to cut spam at its source. This move was subtle but at the same time an engineering masterpiece, because it addressed every form of link spam to some degree, including CSS-spammed links.

Theoretically, if a link can't be seen, it won't be selected by visitors and no measurable time is spent there, so its "relevancy rating" starts to diminish immediately. Some even hypothesize that those kinds of links can affect the overall "relevancy ranking" for the entire site and have the potential to affect the page's and site's PR (PageRank). We definitely saw a promotion of "relevant" directories almost across the board with Jagger.
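Here is a small Python sketch of that theory, and nothing more than the theory: the decay and boost factors are invented, but they show how a link nobody ever clicks could steadily lose whatever value it started with.

    # Hypothetical model of the "unseen link" idea above; the 0.8 decay
    # and 1.05 boost are arbitrary numbers chosen for illustration.

    def decay_link_value(value, clicks_in_period, decay=0.8, boost=1.05):
        """Shrink a link's value when it attracts no clicks; nudge it up
        (capped at 1.0) when visitors actually use it."""
        if clicks_in_period == 0:
            return value * decay
        return min(1.0, value * boost)

    value = 0.5
    for clicks in [0, 0, 0, 12]:    # three idle periods, then some use
        value = decay_link_value(value, clicks)
        print(round(value, 3))      # 0.4, 0.32, 0.256, 0.269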

3) Promotion of relevant Niche Directories (related to #s 1, 2 & 5)?

We began seeing a Directory SERP shift in the "Bourbon" update and definitely saw a promotion of "relevant" directories almost across the board with Jagger. Based on those facts, no one can deny that there has been a significant algorithm re-emphasis on "linking" issues.

4) More weight thrown back to PR @ top domain?

Google had seemed to stray from the value it earlier ascribed to PageRank for some time, in its quest for content, content freshness and other goals. After Jagger 3 I was surprised to find PR0 pages highly placed in important topic SERPs with a great deal of code and two sentences of content. One example is prominent just below Matt Cutts' blog when doing a Google search for "Jagger Update".

This particular example is mostly JavaScript, AdSense and intra-site links. On further inspection, the site is well done, contains a good deal of relevant information and has a top-domain ranking of PR6. Based on these observations one might conclude that more emphasis has been placed on top-domain PR. This observed return of focus to "authoritative" or "trusted" sites should hold no real surprise in the quest for improved relevancy.

5) Increased importance on AdSense placement relevancy?

Google has declared all-out war against spam AdSense sites of every kind. Many of these are or were faux directories, scrapers or other mega-sites utilizing auto-content and auto-link-generation technologies and services. Matt Cutts, in his blog, openly asked for and gave specific instructions on how to report these sites to help augment the overall effect of the algorithm changes targeting those raging atrocities.

The war rages on against all kinds of spam, but you can always bet that relevancy, revenue protection and growth will be at the top of the list.

6) Possible introduction of CSS Spam filtering?

Matt Cutts issued an unusually stern warning about using CSS spam techniques, coinciding with the Jagger update (strangely enough), on Oct 19, 2005. Here is a link to the article in Threadwatch entitled "Google Engineer Hammered over CSS Spam Comments". There is a great deal of controversy over this issue, but it has been a growing cancer for a long time.

Some almost seem to be speculating that Google hasn't yet figured out algorithms to combat these issues beyond the OBL relevancy implementation, all but dismissing Matt's warning as "huff and puff" meant to scare CSS spam abusers into compliance. Google always addresses serious spam issues eventually, and this one has been on the table for around a year that I know of. It just doesn't make sense to ignore a warning from a top Google engineer, does it?
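For what it's worth, catching the most obvious CSS hiding tricks is not technically difficult, which is one reason the "Google can't do it yet" argument seems weak. The Python sketch below is my own illustration, not Google's method: it flags text sitting inside elements whose inline style hides them. A real filter would also have to resolve external stylesheets, off-screen positioning and other variations.

    # Illustrative hidden-text check; the list of suspect style values
    # is mine and covers only the crudest techniques.
    from html.parser import HTMLParser

    SUSPECT_STYLES = ("display:none", "visibility:hidden", "font-size:0")

    class HiddenTextFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.depth = 0          # nesting level inside hidden elements
            self.hidden_text = []

        def handle_starttag(self, tag, attrs):
            style = (dict(attrs).get("style") or "").replace(" ", "").lower()
            if any(s in style for s in SUSPECT_STYLES):
                self.depth += 1
            elif self.depth:
                self.depth += 1     # children of hidden elements stay hidden

        def handle_endtag(self, tag):
            if self.depth:
                self.depth -= 1

        def handle_data(self, data):
            if self.depth and data.strip():
                self.hidden_text.append(data.strip())

    finder = HiddenTextFinder()
    finder.feed('<p>Visible copy.</p>'
                '<div style="display: none">keyword keyword keyword</div>')
    print(finder.hidden_text)       # ['keyword keyword keyword']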

7) Overall Blog demotions?

Blog spam became a growing problem after blogging gained prominence in 2004. Google had to backtrack on blog SERP prominence because many blogs were not managed well, or at all, losing topical relevancy. Jagger seems to have ushered in a SERP sweep of blogs that were not topically focused or managed with purpose, and that contained AdSense and link spam. It got to the point that half the top SERPs for almost any topic seemed to be blog listings; many have fallen in Jagger.

8) New and unresolved "canonical" issues?

Many are complaining of incorrect indexing issues, especially for sites that were indexed for the first time during Jagger. The problem seems to stem from Google treating the abbreviated site URL (without www) and the complete URL as two separate sites. I'll use one of my own as an example: www.precisioncompletion.com is a new, unranked launch during Jagger and comes up correctly.

Do a Google search for precisioncompletion.com and look at the cache - A PR7 and the wrong website!

Half of the listings are correct and the other half pertain to that other site. Google is aware of these canonical issues being reported, and I believe they are planning to address them as the dust settles a little more on this update. Maybe I need to do a 301 permanent redirect to the full www URL before I lose that PR7, and see if I can get it to transfer and magically pump up that PR0!
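In practice a redirect like that is normally configured in the web server itself, but purely to illustrate the idea, here is a minimal Python sketch (the hostname is a placeholder) that answers any request arriving on the bare domain with a 301 pointing at the www host, so only one version of each URL is ever presented to Google.

    from wsgiref.simple_server import make_server

    CANONICAL_HOST = "www.example.com"   # placeholder: your preferred host

    def app(environ, start_response):
        """301-redirect requests on the bare domain to the www host so a
        search engine sees a single canonical version of each URL."""
        host = environ.get("HTTP_HOST", "")
        path = environ.get("PATH_INFO", "/")
        if host != CANONICAL_HOST:
            start_response("301 Moved Permanently",
                           [("Location", "http://%s%s" % (CANONICAL_HOST, path))])
            return [b""]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Canonical host."]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()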

What to expect next? A large number of sites saw crippling SERP demotions, including clean-coded, relevant, W3C-validated, completely "white hat" sites that have never even engaged in link exchange programs. I know: I had one that got hit, my first time ever in a Google update.

Many of us in that position hope that the effect is temporary "collateral damage" which will be rectified in subsequent algorithm tweaking as the dust continues to settle on the "Jagger" update. I don't see that Google has deviated from its widely expressed intentions and historical path in the Jagger update.

Google will continue to fight spam at any level that protects the footsteps in its expressed intended path: Relevancy - Market Share - Revenue Generation (Maintenance & Growth) - Fiscal and Community Responsibility.


Saturday, November 05, 2005

So You Want to Trade Links?

We see them every day in our inbox: "I would like to link to your site."
I suggest you choose your links and create your links pages with as much care as you would your homepage.

Go to the web site requesting to link with you. Using the Google toolbar, check for the following:

(If you don't have the Google toolbar, you should. Go to www.Google.com and search for "Google toolbar".)

1. Has the homepage been indexed by Google?
2. When were they last indexed?
3. Has the link page where they placed your web site hyperlink been indexed by Google?
4. How many links are on the links page - fewer than 100 or more than 100? (A rough automated check for this and the next point is sketched after this list.)
5. Are the links relevant to your business or are they mixed links?
6. Finally, pretend for a moment that search engines do not exist. Now ask yourself, 'Is this a useful place or resource for a web site visitor?'
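As mentioned in point 4, here is a rough Python sketch that automates checks 4 and 5. The 100-link threshold and the keyword test are my own rules of thumb rather than anything Google publishes, and the URL in the commented-out call is a placeholder.

    # Counts the links on a prospective partner's links page and guesses
    # how many look on-topic; crude, but it catches the worst offenders.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []          # (href, anchor text) pairs
            self._current = None

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._current = [dict(attrs).get("href") or "", ""]

        def handle_data(self, data):
            if self._current is not None:
                self._current[1] += data

        def handle_endtag(self, tag):
            if tag == "a" and self._current is not None:
                self.links.append(tuple(self._current))
                self._current = None

    def review_links_page(url, topic_keywords):
        parser = LinkCollector()
        parser.feed(urlopen(url).read().decode("utf-8", "replace"))
        total = len(parser.links)
        relevant = sum(1 for href, text in parser.links
                       if any(k in (href + text).lower() for k in topic_keywords))
        print("%s: %d links, %d look on-topic" % (url, total, relevant))
        if total > 100:
            print("Warning: more than 100 links on this page.")

    # review_links_page("http://www.example.com/links.html", ["seo", "search"])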

You might also want to download the Alexa toolbar. I use both Alexa's and Google's tool bar.

When you are looking at web sites requesting links, you can see whether their web site is ranked in the top 100 ~ 100,000 ~ 500,000 ~ 1,000,000 and so on. With the number of web pages numbering in the billions, this can be a useful metric for deciding whether or not to link with a particular web site.

Too many link pages are built for search engines and not human visitors.

The new Google update detects this. It probably will not get your web site banned, but it certainly won't help your rankings. I'm not trying to tell you how to run your link campaign, but I am very careful not to link to a page built for search engines rather than for humans. In light of this update and feedback from other SEO pros, I have lightened my link load, added detailed descriptions for each link and removed links not closely related to my business. When time permits I may even add thumbnails of my link partners' web sites.

I want my links page to be a resource for users not spider food for search engines.

The Google Jagger Update
Google has made some major changes to its search engine algorithms. However, the Google update has not yet fully run its course. Some of the most widely discussed elements of this Google update include:

• Hidden text, especially text that is hidden in CSS or DIV layers
• Paid linking or reciprocal linking that is considered outside of “Google Quality Guidelines”
• Using internal links or anchor text as one’s sole source of optimization

THE BOTTOM LINE IS: BUILD YOUR LINKS PAGES FOR PEOPLE
- NOT SEARCH ENGINES!

This should improve your position. Create thoughtful pages with links and detailed link descriptions. "Build it for the user" not the "spiders".

See my link partners page: http://www.visionefx.net/partners.htm.
It's not perfect, but I'm striving to create a better page that will interest a casual or professional visitor.

More info about the Google update here:
http://groups.google.com/group


About the Author
Ricardo Vidallon, Site Owner and Designer - http://www.visionefx.net

Friday, November 04, 2005

Jagger or Jäger? Google’s Update Unraveled

After the past week, you may feel like you need a bottle of Jägermeister (Jäger) to digest the recent Google update. There’s even been some naming discussion by Danny Sullivan, Brett Tabke, Matt Cutts and others. While each has provided ample reasoning for their proposed name, I find Brett’s reasoning most compelling, so I’ll use Jagger.

What does the Jagger Update really mean? Matt Cutts has been providing regular “weather updates” on his blog, and based on that, reading I’ve done, and experience with our clients, Jagger seems to be an effort to increase the proportion of relevant content in the Google SERPs by removing some of the spam.

Some of the most widely discussed elements include:

Hidden text, especially text that is hidden in CSS or DIV layers
Paid linking or reciprocal linking that is considered outside of “Google Quality Guidelines”
Using internal links or anchor text as one’s sole source of optimization

For more commentary, try SearchEngineWatch and WebMasterWorld, but keep in mind this is all just speculation. Only Google has all the answers.

As for my personal take, I’ve investigated the impact Jagger has had on our clients so far, and what I’ve found definitely supports the commentary I’ve been reading.

Very few of our clients have seen any impact to their rankings as a result of this update, and we’ve identified one or more of the above mentioned techniques in use for those clients that have been affected. While we screen clients’ programs carefully to eliminate spam techniques, they sometimes slip by, or are added after we initiate the program.

In one particular situation, a client participated in a link building effort they believed would enhance their SEM campaign, not hinder it - and found it was quite the opposite when Jagger hit.

All that being said, the update isn’t over yet. So while we’ve certainly made it through the eye of the storm, the hurricane’s still a-blowin’. GoogleGuy, engineer at Google and frequent poster to WebmasterWorld, wants us to think about Jagger as three updates in one:

“I believe that our webspam team has taken a first pass through the Jagger1 feedback and acted on a majority of the spam reports. The quality team may wait until Jagger3 is visible somewhere before delving into the non-spam index feedback. If things stay on the same schedule (which I can’t promise, but I’ll keep you posted if I learn more), Jagger3 might be visible at one data center next week.”

So should you panic? Not as long as you’re implementing best-practice SEO techniques. Notice that almost all of the techniques listed above are considered “spam practices”? Sure, internal linking and anchor text aren’t spam, but overusing them or using them as the only method of optimization is certainly not a best practice.

If you’re an SEO, what do you tell your clients or VP of Marketing about a shift like this?

The answer’s easy. If you’ve been following best practices and aren’t engaging in any spammy link practices, you’re probably fine.

If you have noticed a shift in your rankings and are sure that you don’t have any of the above tactics implemented in your program, it’s best to just wait it out. Since the update isn’t over yet, it’s very possible that your site will go back to where it was – and that includes dramatic increases in rankings as well.

If your or your clients' rankings have fallen dramatically, ask whether any of the practices listed above are part of the program. If they are, it's a good idea to go ahead and remove the offending content, as the Jagger 3 update might pick up the change faster than normal indexing will later.

Here at WebSourced, we’re also riding out the changes, and so far very few of our clients have been affected. For those that have, we’re employing the strategy outlined above, and continuing to optimize with best practices in the meantime.

If you’re an algoholic, you’ve just gotten the lowdown. Go and relax with a little Jäger.

- Jenny “Weather Analyst” Halasz

Wednesday, November 02, 2005

Jagger Update rattles SEO world

In yet another of its constant efforts to improve the site-ranking search mechanisms currently in place, Google has embarked on an algorithm update that has caught the attention of webmasters everywhere.

The search engine giant is believed to be in the second phase of a three-part overhaul that is seen by some in the field as a major update. With many webmasters seeing big changes in their rankings, SEO experts are being inundated with alarmed phone calls from those who have seen their positions drop. But SEO experts caution that it is necessary to let the new algorithm run its course. Those experts indicate that a sudden change in rankings this week is by no means a new status quo for a site, and that search results for the current week will likely change yet again in the weeks ahead.

Citing the Florida update of two years ago, experts insisted that it might take upwards of three months for the search results to settle out properly.

In his blog, Google engineer Matt Cutts appears to categorize the changes as less substantial than those witnessing the update do. Says Cutts, "These days rather than having a large monolithic update, Google tends to have smaller (and more frequent) individual launches."

As with other such updates, Brett Tabke from WebMaster World has given the new Google algorithm a name, in this instance Jagger. Among the rumored changes in the Google update is greater penalization of hidden text spam, because the overhaul now also recognizes text that is hidden in invisible CSS layers.

In addition, some web blogs have speculated that links from automated exchanges and from text-link advertising systems now have a less positive impact on SEO rankings, while still others suggest that the changes may stem from a series of back-link updates that began in early September.

By Thomas Hanson, Nov 1, 2005, 11:33
© Copyright 2004-05 SEOProject.com