Friday, November 04, 2005

Jagger or Jäger? Google’s Update Unraveled

After the past week, you may feel like you need a bottle of Jägermeister (Jäger) to digest the recent Google update. There’s even been some naming discussion by Danny Sullivan, Brett Tabke, Matt Cutts and others. While each has provided ample reasoning for their proposed name, I find Brett’s reasoning most compelling, so I’ll use Jagger.

What does the Jagger Update really mean? Matt Cutts has been providing regular “weather updates” on his blog, and based on that, reading I’ve done, and experience with our clients, Jagger seems to be an effort to increase the proportion of relevant content in the Google SERPs by removing some of the spam.

Some of the most widely discussed elements include:

Hidden text, especially text that is hidden in CSS or DIV layers
Paid linking or reciprocal linking that is considered outside of “Google Quality Guidelines”
Using internal links or anchor text as one’s sole source of optimization

For more commentary, try SearchEngineWatch and WebMasterWorld, but keep in mind this is all just speculation. Only Google has all the answers.

As for my personal take, I’ve investigated the impact Jagger has had on our clients so far, and what I’ve found definitely supports the commentary I’ve been reading.

Very few of our clients have seen any impact to their rankings as a result of this update, and we’ve identified one or more of the above mentioned techniques in use for those clients that have been affected. While we screen clients’ programs carefully to eliminate spam techniques, they sometimes slip by, or are added after we initiate the program.

In one particular situation, a client participated in a link building effort they believed would enhance their SEM campaign, not hinder it - and found it was quite the opposite when Jagger hit.

All that being said, the update isn’t over yet. So while we’ve certainly made it through the eye of the storm, the hurricane’s still a-blowin’. GoogleGuy, an engineer at Google and a frequent poster to WebmasterWorld, wants us to think about Jagger as three updates in one:

“I believe that our webspam team has taken a first pass through the Jagger1 feedback and acted on a majority of the spam reports. The quality team may wait until Jagger3 is visible somewhere before delving into the non-spam index feedback. If things stay on the same schedule (which I can’t promise, but I’ll keep you posted if I learn more), Jagger3 might be visible at one data center next week.”

So should you panic? Not as long as you’re implementing best practice SEO techniques. Notice that almost all of the techniques listed above are considered “spam practices”? Sure, internal linking and anchor text aren’t spam, but overusing them or using them as your only method of optimization is certainly not a best practice.

If you’re an SEO, what do you tell your clients or VP of Marketing about a shift like this?

The answer’s easy. If you’ve been following best practices and aren’t engaging in any spammy link practices, you’re probably fine.

If you have noticed a shift in your rankings and are sure that you don’t have any of the above tactics implemented in your program, it’s best to just wait it out. Since the update isn’t over yet, it’s very possible that your site will go back to where it was – and that includes dramatic increases in rankings as well.

If your rankings or your clients’ rankings have fallen dramatically, check whether any of the practices listed above are in use. If they are, it’s a good idea to go ahead and remove the offending content, as the Jagger3 update might pick up the change faster than normal indexing will later.

Here at WebSourced, we’re also riding out the changes, and so far very few of our clients have been affected. For those that have, we’re employing the strategy outlined above, and continuing to optimize with best practices in the meantime.

If you’re an algoholic, you’ve just gotten the lowdown. Go and relax with a little Jäger.

- Jenny “Weather Analyst” Halasz

Wednesday, November 02, 2005

Jagger Update rattles SEO world

In yet another of its constant efforts to improve the site-ranking search mechanisms currently in place, Google has embarked on an algorithm update that has caught the attention of webmasters everywhere.

The search engine giant is believed to be in the second phase of a three-part overhaul that is seen by some in the field as a major update. With many webmasters seeing big changes in their rankings, SEO experts are being inundated with alarmed phone calls from site owners who have watched their positions drop. But SEO experts caution that it is necessary to let the new algorithm run its course. Those experts indicate that a sudden change in rankings this week is by no means a new status quo for a site, and that search results for the current week will likely change yet again in the weeks ahead.

Citing the Florida update of two years ago, experts insisted that it might take upwards of three months for the search results to settle out properly.

In his web blog, Google engineer Matt Cutts appears to categorize the changes as less substantial than those witnessing the update believe. Says Cutts, "These days rather than having a large monolithic update, Google tends to have smaller (and more frequent) individual launches."

As with other such updates, Brett Tabke from WebMaster World has given the new Google algorithm a name, in this instance Jagger. Among the rumored changes in the Google update is greater penalization for hidden text spam, because the overhaul now also recognizes text that is hidden in invisible CSS layers.

In addition, some web blogs have speculated that links from automated exchanges and from text link advertising systems now have a less positive impact on SEO rankings, while still others suggest that the changes may stem from a series of back-link updates that began in early September.

By Thomas Hanson, Nov 1, 2005, 11:33. © Copyright 2004-05 SEOProject.com

Thursday, October 27, 2005

Google Jagger Update

Googler Matt Cutts – who’s becoming a sort of weather man for the SEO crowd – has some info on the current Google rankings update (which has been dubbed “Jagger” by WebmasterWorld). Matt stresses the point that there is not one single large update, but rather a series of smaller and medium-sized ones. Matt says there might be some noticeable PageRank (and backlink) changes in the coming days.

Other great resources about the Google Jagger Update:
Jagger1 - Included changes in displayed PR and BL
Jagger2 - Has started and will probably end next week.
Jagger3 - Will start next week hopefully (Wednesday probably at the earliest)
See: http://www.ysearchblog.com/archives/000095.html and http://www.jimboykin.com/45/

My friend and SEO mentor, who shall remain nameless (the less Google knows about him the better), explained it to me from his point of view. "Mr. K", as I will call him, became involved in the Internet when Cobalt, dBase and FoxPro were all the rage.

This is Mr. K's take on the recent Google Dance:
When Google rolls out a radically new 'algo', they don't feed this bad boy to all their server units at once. Some server IPs are blocked off so they can test the new 'algo' against that group of blocked servers. The updated servers, or the 'Google feed' you and I see, were rolled back (I'm guessing 6-8 months) so they can test the indexing speed of the new 'algo' against the old 'algo' on the blocked IPs.

Hmmm- This makes sense!

Fast, greasy, blinding indexing speed means faster Google caching of data. In light of the growing volume of new data being posted to the World Wide Web, speed is what it’s all about. I'm also guessing that 'Google Base' figures into this equation, but that's food for another blog post.

So, my friends, keep the faith, and next time don't put all your marketing eggs in one basket. There's never been such a thing as a free lunch!

Rick Vidallon
www.visionefx.net

Thursday, October 13, 2005

Google Inc. and Sun Microsystems Partnership

Last week, Google Inc. CEO Eric Schmidt and Sun Microsystems CEO Scott McNealy announced a distribution partnership.

What did they decide?
Google's toolbar will be bundled into downloads of the Java Runtime Environment and Sun's Java will be used to power new software developed and released by Google.

Google might also include links to Sun software that directly competes with Microsoft software, such as the Open Office suite, in future updates of its toolbar.

What does this mean for search?
This is probably only the first step in Google's and Sun's battle against Microsoft. Google wants to win more market share on the desktop of computer users and it wants to move computer applications from the desktop to the Internet.

Google has also recently filed a new patent that indicates that Google is working on a way to constantly monitor all of your actions in order to build personalized search queries.
According to the patent specification, Google aims to monitor whatever you type in your word processor, the things you copy to your clipboard, the position of your mouse, the content of your emails, instant messenger messages and more.

If Google has access to Sun's free Open Office suite, it might be easier to do that. By gathering as much information about you as possible, Google can offer you personalized search results and - more important to Google - personalized ads.

What does this mean to you?
It seems that many of Google's recent "free" applications mainly serve the purpose of gathering more data about you so that Google can monetize that information for targeted ads.

If you use many different Google services, you share a lot of information with Google. It's up to you to decide if you're willing to exchange private information for "free" software and services.

This distribution partnership is probably only the start. It's likely that we can expect a lot more from this alliance between these two online giants.

Monday, October 10, 2005

Chasing the Search Engines' Algorithms... Should You or Shouldn't You?

By Robin Nobles

It's a common occurrence. SEOs often spend countless hours trying to "break" a search engine's algorithms. "If I could just crack Google's algorithm, my pages would soar to the top of the rankings!" Let's look at some flaws in this way of thinking.

1. Picture the Google engineers and tech folks turning the algorithm dial as soon as you "think" you have "cracked" the algorithms. Your rankings may fall, and you would have to figure out what's working with the engine right now. In other words, your rankings may never be long-term.

2. Instead of spending all of this time trying to impress a search engine with a perfect page, why not impress your true target audience... your customers. Has Google, MSN, or Yahoo! Search ever bought anything from you? They're not your target audience. Your customers are your target audience. Write your pages and content for them.

3. When you expend so much of your energy chasing algorithms, you often focus on only a few elements that influence ranking – those elements that are working right now and that you hope will give your pages the best chance for success. It is said that Google has over 100 ranking elements that influence ranking and relevancy. Some are more important than others. But focusing on just one or two "main" elements and discounting the rest can prove disastrous to a Web site.

A different approach . . . Wouldn't you rather achieve top rankings and keep them there, and have those rankings equate to sales and money in your back pocket? After all, isn't it ultimately the sales you're after, as opposed to just the rankings? If those rankings don't equate to traffic that equates to sales, you lose, any way you look at it.

Five Basic Steps for Achieving Top Rankings without Chasing Algorithms
1. Forget about the search engines. Yes, you heard me correctly. The search engines aren't and never will be your "ideal target audience." They don't buy your goods and services. They're not who you should be trying to please with your Web pages and site. Instead, write your Web page content for your target audience.

2. Don't ever forget the basics. No matter what's happening in the algorithms, continue using your main keyword phrase prominently in your title tag, META description and keyword tags, link text, body, heading tags, and so forth (a short sketch follows this list). That way, when the algo dial is turned, you won't have to make changes to all of your pages. You'll always be ready.

3. Focus your keyword-containing tags and body text on one keyword phrase only. Each page should be focused on one keyword phrase, and each page should have its own unique tags.

4. Write well-crafted content for your Web pages, and add new content on a regular basis. If content is king, context is queen. Focus on your keyword phrase, synonyms and related words, and surrounding text. Use a program like Theme Master if you need help determining those supporting words.

5. Remember that both on-page and off-page factors are important. Don't sacrifice one for the other. On-page factors are your tags, body text, prominence, relevance, etc. Off-page factors are link popularity (quality and number of your inbound links) and link reputation (what those inbound links "say" about your Web page when they link to you).
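To make point 2 concrete, here is a minimal sketch of the on-page basics built around a single phrase (the phrase "handmade leather briefcases", the file name and the company name are all invented for this illustration):

<title>Handmade Leather Briefcases - Example Leather Co.</title>
<meta name="description" content="Handmade leather briefcases crafted from full-grain leather.">
<meta name="keywords" content="handmade leather briefcases">
...
<h1>Handmade Leather Briefcases</h1>
<p>Our handmade leather briefcases are cut and stitched by hand.</p>
<a href="briefcases.html">handmade leather briefcases</a>

The same phrase shows up in the title tag, META tags, heading, body copy and link text - the kind of consistency that survives any turn of the algo dial.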

What about search engine research? Isn't it important? It's crucial.
Let me give you an example. At the beginning of this year, pages began falling out of Google's index. The forums were alive with speculation about what to do about it. Through research, we determined this was a compliance issue: when your code is compliant, the search engine spiders are more easily able to spider the content. The solution? Make sure you use a DOCTYPE tag and an ISO Character Set Statement at the top of every Web page.

For example:
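(HTML 4.01 Transitional and the ISO-8859-1 character set are shown below as one common combination; use whichever DOCTYPE and charset your pages are actually written to.)

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">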


If you didn't know about the compliance issue, you could have made changes to your Web pages that didn't need to be made and wasted countless hours trying this or that, all to come up dry. Research helps to make sure you remain on top of what's happening in the search engine industry. It's what sets you apart from other SEOs. You make your decisions based on research and facts, versus speculation and theory.

In Conclusion...
"Take it from someone who has been in this business for nine years and studies the algorithms closely - don't chase the algorithms. You say that you have a #2 ranking for a certain keyword phrase that alone is bringing your site 550 visitors per day? Great. In the time that you have spent gaining that ranking, I have written 285 pages of unique content, obtained 821 links, etc., and collectively I am getting over 1,300 visitors per day," says Jerry West of WebMarketingNow.

In other words, by focusing on more than just chasing algorithms, you have the potential of having a much more successful Web site.

About The Author
Robin Nobles conducts live SEO workshops in locations across North America. She also teaches online SEO training and offers the Workshop Resource Center, a networking community for SEOs. Localized SEO training is being offered through the Search Engine Academy. Copyright 2005 Robin Nobles. All rights reserved.


POST NOTES FROM VISIONEFX
Also See Google Sitemaps
Visionefx is partnered with Hostmysite, a hosting company that provides a free Google sitemap generator as part of all its web hosting packages. It is easy to use and takes the guesswork out of manually configuring a Google Sitemap file.
More about Hostmysite Google Sitemap Auto-Generator!

Saturday, October 08, 2005

Ten reasons to redesign your web site

Published: 14 August 2003 in Design. By: Nick Tatt

1. Beat the competition
The power of the web allows people to find information at the drop of a hat. It is possible for new customers to find your web site from anywhere in the world. The downside is that they can also find your competitors just as easily. It is important to make a good first impression and stay ahead of the competition. Failure to do so could lose you valuable customers. If your web site is comparable to your competitors’, consider a timely redesign to make sure you are leading the pack rather than following.

2. Present your organisation’s current market position
Organisations need to evolve to ensure they can deliver what customers need today. If your web site was designed a couple of years ago and has not been updated since, it is possible that it does not reflect your organisation’s current market position. A web site redesign is a great opportunity to evaluate where you are today and where you want to be in the future.

3. Out with the old, in with the new
Websites date. It is an unfortunate fact for organisations that websites show their age if left unattended. Considerable damage can be done to your reputation if customers discover that information or products on your web site are out of date or, worse still, incorrect. If your website is out of date consider using some kind of content management system, appropriate for the job, to keep it fresh.

4. Self service
The beauty of the web is its immediacy.
You don’t have to wait for the current batch of printed brochures to run out. With a website a quick update can get the latest product information out to a global audience. The theory is great but in practice many websites forget about their content and it soon becomes out of date. There are a few reasons why this might be but often it is because the website can only be updated by one person, probably the original designer. If you struggle to keep your website content fresh it might be time to consider a redesign, allowing people within your organisation to keep it up to date.

5. Make the site more usable and give the client what they want
How many times have you visited a site and not been able to find what you needed, or tried to buy something online only to find that it is out of stock? Problems like these can be avoided with some thought about what your customers are trying to do on your website. Resolving the problem might be as simple as improving the signposts to key sections or pages, but a redesign allows you to listen to your customers and create something that will be easier to use.

6. Reach a wider audience
Just because you have a website doesn’t mean people will automatically find it. Competition for the top slot in search results is fierce. Making your website ‘search engine friendly’ will improve the chances of it being found. Building a ‘search engine friendly’ website that conforms to web standards from scratch is more effective than trying to adapt an existing one. In doing so you can ensure that it appeals to both customers and search engines alike.

7. Increase your sales
The Holy Grail for all websites. Websites are often designed with no thought about how organisations can harness the power of the web effectively. The wrong message in the wrong place can result in a website failing to meet your organisation’s expectations. Developing a web presence is relatively straightforward, but developing one that meets your organisation’s expectations and goals is a little more complex.
Are visitors being channelled to the right sections of the site to make a purchase? Does the design promote your current brand? Are people signing up to your newsletter? Is the copy right for the site? If you suspect the answer may be no, then it is time to get your website to work a little harder for your organisation.

8. Create appropriate content for the web
Web content is different to print content. With the immediacy of the internet it is not appropriate to use the same copy as your print material. When reading web pages users scan for information relevant to them. If your content is not presented in a way that delivers that information swiftly you run the risk of losing customers to your competition.

9. Promote an event or product launch
The launch of a new product or event might be the catalyst to consider a website redesign. If your current website fails to do an event or product launch justice you could damage its success. A website tailored to the needs of the event/product launch is far more effective than trying to shoehorn it into your website.

10. Communicate with your customers
What better way to raise your organisation’s profile than by announcing a new, improved website delivering what your customers want in a clear, usable fashion. The launch of a new website is a great excuse to contact your customers and strengthen your relationship with them.

Friday, October 07, 2005

W3C Compliance & SEO

By Dave Davies

From reading the title, many of you are probably wondering what W3C compliance has to do with SEO, and many more are probably wondering what W3C compliance is at all. Let's begin by shedding some light on the latter.

What Is W3C Compliance?

The W3C is the World Wide Web Consortium and, basically, since 1994 the W3C has provided the guidelines by which websites and web pages should be structured and created. The rules they outline are based on "best practices", and while websites don't have to comply to be viewed correctly in Internet Explorer and other popular browsers that cater to incorrect design practices, there are a number of compelling reasons to ensure that you or your designer follows the W3C guidelines and that your site is brought into compliance.

In an interview with Frederick Townes of W3 EDGE Web Design, he mentioned a number of less SEO-related though very compelling arguments for W3C compliance. Some non-SEO reasons to take on this important step in the lifecycle of your site are:

* Compliance helps ensure accessibility for the disabled.

* Compliance helps ensure that your website is accessible from a number of devices: from different browsers to the growing number of surfers using PDAs and cellphones.

* Compliance will also help ensure that, regardless of the browser, resolution, device, etc., your website will look and function in the same or at least a very similar fashion.

At this point you may be saying, "Well that's all well and good, but what does this have to do with SEO?" Good question.

We at Beanstalk have seen many examples of sites performing better after we had brought them, or even just their homepage, into compliance with W3C standards. While discussing this with Frederick he explained it very well with:

"Proper use of standards and bleeding edge best practices makes sure that not only is the copy marked up in a semantic fashion which search engines can interpret and weigh without confusion, it also skews the content-to-code ratio in the direction where it needs to be while forcing all of the information in the page to be made accessible, thus favoring the content.
We've seen several occasions where the rebuilding of a site with standards, semantics and our proprietary white hat techniques improves the performance of pages site-wide in the SERPs."

Essentially what he is stating is a fairly logical conclusion - reduce the amount of code on your page and the content (you know, the place where your keywords are) takes a higher priority. Additionally, compliance will, by necessity, make your site easily spidered and also allow you greater control over which portions of your content are given more weight by the search engines.

Examples
The Beanstalk website and the W3 EDGE site themselves serve as good examples of sites that performed better after complying with W3C standards. With no other changes than those required to bring our site into compliance the Beanstalk site saw instant increases. The biggest jumps were on Yahoo! with lesser though still significant increases being noticed on both Google and MSN.

As we don't give out client URLs, I can't personally list client site examples where we've noticed the same effect; however, we can use W3 EDGE as another example of a site that noticed increases in rankings based solely on compliance.

So How Do I Bring My Site In Compliance With W3C Standards?
To be sure, this is easier said than done. Obviously the ideal solution is to have your site designed in compliance to begin with. If you already have a website, you have one of two options:

1. Hire a designer familiar with W3C standards and have your site redone,

(or)

2. Prepare yourself for a big learning curve and a bit of frustration (though well worth both).

Resources
Assuming that you've decided to do the work yourself there are a number of great resources out there. By far the best that I've found in my travels is the Web Developer extension for FireFox. You'll have to install the FireFox browser first and then install the extension. Among other great tools for SEO this extension provides a one-click check for compliance and provides a list of where your errors are, what's causing them and links to solutions right from the W3C. The extension provides testing for HTML, XHTML, CSS and Accessibility compliance.

Other resources you'll definitely want to check into are:
CSS Zen Garden ~ A List Apart ~ Holy CSS ZeldMan!

(Frederick lists this one as one of the best resources for the novice to find answers. I have to agree.)

Where Do I Get Started?
The first place to start would be to download FireFox (count this as reason #47 to do so as it's a great browser) and install the Web Developer extension. This will give you easy access to testing tools. The next step is to bookmark the resources above.

Once you've done these you'd do well to run the tests on your own site while at the same time keeping up an example site that already complies so you can look at their code if need be.

To give you a less frustrating start I would recommend beginning with your CSS validation. Generally CSS validation is easier and faster than the other forms. In my humble opinion, it's always best to start with something you'll be able to accomplish quickly to reinforce that you can in fact do it.

After CSS, you'll need to move on to HTML or XHTML validation. Be prepared to set aside a couple hours if you're a novice with a standard site. More if you have a large site of course.

Once you have your CSS and HTML/XHTML validated, it's time to comply with Accessibility standards. What you will be doing is cleaning up a ton of your code and moving a lot of it into CSS, which means you'll be further adding to your style sheet. If you're not comfortable with CSS, you'll want to revisit the resources above. CSS is not a big mystery, though it can be challenging in the beginning. As a pleasant by-product, you are sure to find a number of interesting effects and formats that are possible with CSS that you didn't even know were so easily added to your site.
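As a small sketch of the kind of cleanup involved (the class name is invented for this example), presentational markup like this:

<font color="#cc0000" size="4"><b>Contact us today</b></font>

can be replaced with a plain element plus one rule in your style sheet:

<p class="callout">Contact us today</p>

.callout { color: #cc0000; font-size: 1.2em; font-weight: bold; }

The page markup gets lighter and the presentation moves into the style sheet, which is the general direction the validation work takes you.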

But What Do I Get From All This?
Once you're done you'll be left with a compliant site that not only will be available on a much larger number of browsers (increasingly important as browsers such as FireFox gain more users) but you'll have a site with far less code that will rank higher on the search engines because of it.

To be sure, W3C validation is not the "magic bullet" to top rankings. In the current SEO world, there is no one thing that is. However, as more and more websites are created and competition for top positioning gets more fierce, it's important to take every advantage you can to not only get to the first page, but to hold your position against those who want to take it from you as
you took it from someone else.

About The Author
Dave Davies is the CEO of Beanstalk Search Engine Positioning, Inc. He writes with years of experience in SEO and Internet Marketing. A special thanks goes out to Frederick Townes of W3 EDGE for his help with this article. W3 EDGE provides W3C-compliant web site design for their clients. To keep updated on new SEO articles and news be sure to visit the Beanstalk blog regularly.

POST NOTE from VISIONEFX:
If you validate your web site or your client's web site, display the W3C badge of honor! Link to the W3C validation page and show their logo on the validated site.

I did this on my company web site, VISIONEFX.
Did it help my business and SEO? You bet!

Wednesday, October 05, 2005

DMOZ in 2005

(a.k.a. The Open Directory Project)

By Phil Craven (c) 2005 WebWorkShop
The original concept of DMOZ was excellent for its time. The DMOZ site's About page makes these statements about the concept, and about the reasons for the directory's creation:- "Automated search engines are increasingly unable to turn up useful results to search queries. The small paid editorial staffs at commercial directory sites can't keep up with submissions, and the quality and comprehensiveness of their directories has suffered. Link rot is setting in and they can't keep pace with the growth of the Internet."

"The Open Directory follows in the footsteps of some of the most important editor/contributor projects of the 20th century. Just as the Oxford English Dictionary became the definitive word on words through the efforts of volunteers, the Open Directory follows in its footsteps to become the definitive catalog of the Web."

But things have changed a lot since DMOZ began in the mid 1990s. Since then, Google came along with very relevant search results, and they were kind enough to show the other engines how to produce such relevant results. That caused dramatic improvements, to the extent that top search engines have been able to provide very relevant search results for some time, and they provide a lot more of them than DMOZ is able to do.

The small paid editorial staffs at commercial directory sites still can't keep up with submissions, but their backlogs are small when compared with DMOZ's massive backlog. According to reports, there are over a million site submissions that are waiting to be reviewed, and delays of several years between submitting a site and it being reviewed are not uncommon. The backlog problem is so huge that many editors have redefined the problem so that it no longer exists. To them there is no backlog, because the submitted sites are not there to be reviewed. They are merely a low priority pool of sites that they can dip into if they want to, and some of them prefer to find sites on their own.

Link rot (dead links) has become widespread in DMOZ through the years, and they certainly can't "keep pace with the growth of the Web". There isn't a single reason for the creation of DMOZ that DMOZ itself doesn't now suffer from. So how come such an excellent original concept ended up with a directory that has the same problems that it sought to solve, and on a much larger scale?

One reason is that the Web has grown at a much faster pace than was perhaps anticipated, and the DMOZ editors simply can't keep up. Another reason is that there are simply not enough editors who are adding sites to the directory. At the time of writing, the DMOZ front page boasts 69,412 editors, but that is the number of editors that they've had since the beginning, and most of them are no longer there.

A recent report stated that there are currently about 10,000 editors who are able to edit, and that only around 3,000 of those are active in building the directory. The word "active" is used to describe editors who actually edit quite often, but as little as one edit every few months is acceptable. The word doesn't mean "busy", although some of them are. With so few people doing anything, it isn't even possible for them to keep up with the link rot in such a huge directory,
and there's the ever increasing problem of listings that link to topics other than what they were listed for. It simply isn't possible for them to maintain the directory as they would like.

The idea of becoming "the definitive catalog of the Web" was a fine one, but it turned out to be an impossible dream. The purpose of DMOZ is dead. Today's search engines produce excellent results in large quantities, and much more quickly than drilling down into a directory to find something.

So is there any value at all in the DMOZ directory? As a useful catalog of the Web, and when compared with the major search engines, the answer is no, although a few people do find it to be a useful research resource. For website owners, the links to their websites that a listing in DMOZ creates are useful for search engine ranking purposes, but even those are becoming less useful as search engines improve, and seek to block out unwanted duplicate content from their indexes.

It was a fine concept, and it looked promising for a while, but the idea of DMOZ becoming the definitive catalog of the Web is gone. Improvements in the search engines eclipsed its value, and the growth rate of the Web meant that it could never achieve its goal. It began with an excellent concept, and they gave it a good shot, but it didn't work. The continuing growth rate of the Web ensures that it can never work. It continues as a good directory of a large number of web sites, but that is all. And not many people use directories when the search engines produce such good results, and so quickly.

About The Author
Article by Phil Craven of WebWorkShop. Phil is well-known in the world of webmasters and search engine optimization and his views have been sought and published by various online and offline publications.

Tuesday, October 04, 2005

Using Content Hubs To Promote Your Web site

By David Risley

We've all heard it before: content is king. And it is true. If you own a site, you need to post something interesting that people want to read before you can expect people to stop by. If your site is a content-based website, then you've already taken a huge step.

However, if your website is a business website whose only purpose is to talk about your services, then you really should make an effort to post some content onto your website which is helpful to readers, free, and relevant to your services or website. If you do this, your site will attract traffic from people looking for information, not just to purchase something. And with increased traffic in general, you will get increased attention, and your traffic statistics will grow.

Writing content for your own website is only half the battle, though. You have got to get people to read it. Just posting a website is not going to get people to come to it. It would be like building a business in the middle of the mountains. Nobody knows it's there and you won't get any customers. If you get your articles out there for people to read and the articles are written correctly, you can position yourself as an expert in your field and promote your own website. One way to do this is by publishing on content hubs rather than limiting your articles to your own website.

A content hub is a site which publishes articles on all topics (usually categorized). Those articles are freely available to anyone to use on their own website, newsletter, blog, etc. So, many publishers or site owners in need of fresh content for their website can go to one or more of these content hubs, find an article they like, and use it. They have to maintain proper credit to the author and publish the small author bio which accompanies the article.

Let's look at this, though, from the author's viewpoint - your viewpoint. Let's say you are selling consulting services for search engine optimization. You have a site for your services, but you blend in with all the other such services. So, you write a series of articles giving tips to webmasters on how they can optimize their website. With your article you include a short bio of yourself.

You include a mention of your services and a link to your website. You publish your article
on a bunch of content hubs. Other websites, newsletters and blogs grab your article off those sites and use it on their own. Your article therefore spreads throughout the internet. Being that your site is linked with the article and is therefore on all of these other websites now (including the content hubs themselves), search engines who are constantly spidering the internet pick up on your article and index it associated with your website.
This, in turn, raises your ranking in the search engines. And you get increased traffic to your website not only from search engine searches but also from your article.

Now, let's say you have done some research on keywords and you interlace your article with certain keywords. When the search engines spider your article all over the internet and associate it with your website, it will raise your search engine rankings even more. There is a real science to this, and if done correctly, it can drastically raise your internet presence in a short time. I recently had a meeting with the CEO of In Touch Media Group, a Clearwater, FL based company which is in the business of internet marketing.

They use content hubs as part of their strategy for clients and they couple this with their vast archived data regarding keywords. They showed me the stats of one site which they have, in the course of just a few months, taken from essentially no traffic to a VERY respectable level of traffic. After getting an article out in the content hubs, they will follow up a few weeks later with a press release.

So, how can you publish some of your articles on content hubs? Well, the first step is to find and visit them. There are many of them out there, but below are some of the better ones:

GoArticles.com
ISnare.com
SubmitYourArticle.com - a service to send your article to a bunch of hubs at once
ArticleCity.com
ExchangeNet.com
Article-Directory.net
FreeZineSite.com

There are services to help you distribute to a large collection of publishers at once. I have used Isnare's distribution service and it seems to work well. There are also distribution groups on Yahoo.

Here are a few of them:
Free-Content
Article Announce List
Article Announce
Articles4You2Use4Promotion
Article Submission
Free Reprint Articles

With that, I wish you the best of luck in your promotion efforts. Start writing!

About The Author
David Risley is a web developer and founder of PC Media, Inc. He specializes in PHP/MySQL development, consulting and internet business management. He is also the founder of PC Mechanic, a large website delivering do-it-yourself computer information to thousands of users every day.

Saturday, October 01, 2005

Why Should I Bother With Optimized Online Copywriting?


It's no good having a creative, individual website with brilliant, informative copy if customers can't find you on the internet. On the other hand, it's also detrimental if you have a website that can be easily found (has a high ranking) but people become bored and alienated reading it. Producing effective online copywriting is a creative process blending art and science in a balanced technique combining many different elements. This integration of disciplines is required to satisfy both the technical and the aesthetic objectives of a website.

Optimized online copywriting should ensure that your website is:
• highly readable to your viewers
• highly visible to the search engines, and thereby
• commercially successful for you.

Many people and businesses don't have the time to actually write web copy themselves. A professional freelance copywriter can furnish you with keyword-rich, highly original web content to enhance and improve the quality of your website with the aim of transforming more of your visitors into customers.

Rarely will you get a second chance to engage your customer's attention, so your first shot must be formatted for maximum sales potential, catching the eye of the search engine robots as well. But not too much… If your copy goes overboard in favor of the search engines it earns you a penalty from Google that will negatively affect your rankings. Your website must always have the reader as priority. This makes more business sense anyway.

Search engines provide a way for potential customers to find you on the internet. People type a key phrase or keyword into a search engine, such as Google, Yahoo or MSN (or one of the many other popular engines) and this returns a page of listings - web page suggestions for that particular phrase or word. Obviously, you want your website to feature highly in this list.

Optimized online copywriting specifically targets the words and phrases people are using when searching for a product on the internet (Search Engine Marketing (SEM), keyword research). You want to make sure your website stays at the top of the listings so people go to your website before others. With targeted copy in place, search engines are more likely to index your web site on page one than if it does not include keyword-rich copy. This is an ever more important issue when dealing with Google, the leading search engine today.

To rank highly in the search engines the words on your web pages should never be an afterthought, but should be included right at the beginning in the original design of your website. Content development is the most valuable asset web developers can utilize in the bid for productive, successful search engine optimization and Search Engine Marketing (SEM).

Hiring a professional copywriter is a wise investment in your business future. Even if you don’t want to optimize your site you should make sure that the words on your site are reasonable, enticing, spelled correctly and artfully arranged to engage attention. Just because you can type letters or write some emails doesn't mean you can write the copy for your website. The writing on your homepage is often how people determine whether the website is a scam or the genuine article. Your website’s credibility takes a nose-dive if the spelling is wrong, or the grammar is incorrect, or it just reads like bad, clumsy English. People will be disinclined to trust your content.

Within the search engines new technologies and algorithms are being developed all the time to make search methodologies smarter, more astute. It's never a coincidence when someone types in a search phrase and your website is indexed highly on the page. Keyword rich online copywriting is a significant and critical component in gaining high rankings on the search engines.

IMPORTANT!
Google has been pioneering a new trend of intelligent search engines which are not attracted by mere repetition of words throughout the text, but which look for meaning, attempting to make grammatical sense of the information, trying to understand what the web page is actually saying. This is forcing webmasters to improve the content on their web pages or suffer the consequences.
The old saying has never been more relevant: 'content is king.'

Thursday, September 22, 2005

Does Google Penalize Innocent Websites?

Whether you like it or not, Google is the place to be ranked well. Yahoo! and MSN can offer their share of traffic, but nothing serves up traffic like a top ranking in Google. Unfortunately, no search engine is quicker to hand out a penalty either.

As the clear leader in the search engine market, it is hard to blame Google for being quick to hand out a penalty on a website. There are hundreds of 'black-hat' SEO techniques and tricks that all aim to 'crack' Google and give a website owner a top ranking without them doing as much work to achieve that ranking. If one person discovers a hole in Google, it takes very little time for an entire drove of website owners to start changing their sites to take advantage of this hole.
But is Google too quick to hand out a penalty? They have claimed in the past that it would be unlikely that a legitimate site would receive a penalty. However, with all the confusion on the Internet about what good SEO really is, is it possible that a legitimate site owner accidentally employs a technique that is shared by spammers? The site owner may have no intention of defrauding Google, but they may receive the penalty all the same.

Google Plans to Alert Site Owners of Potential Problems
There is some great news for website owners who fear they may have been penalized by Google. Matt Cutts, the owner of a quickly growing blog and an employee of Google, confirmed on his website that Google is piloting a new program which will proactively alert website owners of potential problems on their website.

This is definitely exciting for website owners who do not know if they have been penalized, but it should not be taken for something that it is not. Keep in mind the following points:

1. This is a pilot program. It is not a full-fledged program that guarantees everyone will be contacted who has been negatively affected. Chances are, you will not be contacted at all.

2. It is an automated program. Google will not have any one person sending out these emails, but a bot that will have to 'discover' your email address. If it can't find one, it will try to guess an email address. If you are good at protecting yourself from spam, you may not get a message from Google even if they want to contact you.

There may be a day in the not-so-far future where Google is able to contact legitimate website owners who made an honest (or maybe not so honest) mistake. That day is not here yet, so the responsibility is still that of the individual website owner to make sure they have a legitimate website in the eyes of Google.

The Many Ways to Get Penalized by Google
There are many ways to get accidentally penalized by Google. Preventing your site from being penalized takes a lot of attention to detail. Even if you have hired a professional SEO firm, you should be mindful of the problems that can arise from a simple mistake. Below are several things to look out for on your site.

Duplicate Pages
This is a common problem, and a problem that can be difficult to avoid, especially if you have a large website. Duplicate pages are pages that have essentially the same content; it is an old trick employed by search engine spammers. Search engine spammers would use the same page over and over again, but change keywords at the bottom of the page to create some variance and to focus in on different niches.

Accidentally recreating this spam technique can be very easy to do. Below are a few ways in which you could have duplicate pages without even knowing about it:

* If you use different landing pages in your advertising campaigns to measure ad effectiveness, you are essentially building duplicate pages. If Google discovers these different landing pages, they may think that you are using duplicate content.

* Sites that offer the ability to print pages often create two pages that have essentially the same content.

* Using mod_rewrite to create search engine friendly URLs can also create duplicate pages: when you use mod_rewrite, the server will serve up the same page regardless of whether the visitor uses the search engine friendly URL or the regular URL (see the sketch below).
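For instance (the directory and file names here are purely hypothetical), a single rule in your .htaccess file can make one script answer at two addresses:

RewriteEngine on
RewriteRule ^products/([a-z-]+)$ /product.php?cat=$1 [L]

With that rule in place, /products/leather-briefcases and /product.php?cat=leather-briefcases both return the same content, so a spider that discovers both URLs sees two duplicate pages.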

These are just a few examples of how duplicate pages can creep into your website. Look for other ways that duplicate pages could have crept into your own site.

If you find that you do have duplicate pages within your website, you should use the robots.txt file to exclude the duplicate pages. We published an article last week about the robots.txt file which should be helpful: How to Prevent Duplicate Content with Robots.txt and Meta Tags
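As a rough sketch (the /print/ and /landing/ directories are hypothetical stand-ins for wherever your duplicates actually live), the robots.txt entries could look like this:

User-agent: *
Disallow: /print/
Disallow: /landing/

An individual duplicate page can instead be excluded with a robots meta tag in its head section:

<meta name="robots" content="noindex,follow">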

Redirecting Users
Another favorite technique of search engine spammers is to use redirects to create doorway pages (otherwise known as cloaking). The idea here is to present one page to a search engine spider that is optimized for the search engine and present an entirely different page to the user.

Search engine spammers use all different types of redirects, from complicated JavaScript redirects to simple http-refresh commands.

There are many valid reasons to redirect users on your website to a different page. Whether you are changing the name of your website or changing the structure, your website pages may not always be in the same place, and you never want to lose a visitor to an ugly 404 page (even Google does not like 404 pages).

Google does recognize that you may need to throw in a redirection from time to time. If you need to do so, you should use a 301 redirect. There are several ways to employ a 301 permanent redirect. Below are two examples:

Example 1 - Using mod_rewrite
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^yourdomain\.com
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=permanent,L]

Save this in a file called .htaccess and upload it to your server.

Example 2 - Using an Apache Redirect
Redirect 301 / http://www.yourdomain.com/

Save this in a file called .htaccess and upload it to your server.

Keyword Stuffing
Keyword stuffing is the oldest search engine spamming technique known. All this entails is using your targeted keywords over and over and over again on your website. Keyword stuffing can happen throughout the content of your website, in hidden text, in the alt property of your images, in the meta tags of your website, in HTML comments, or a variety of other ways. To see an example of keyword stuffing, take a look at this thread over in our SEO Tips and Tricks portion of our forums.

The example above is an exaggerated example of keyword stuffing, but it happens a lot with website owners. The desire to rank high in the search engines often leads a person to put their keywords in their site much more often than they would do so normally. As a general rule, if the text on your page appears unnatural to you, it will appear unnatural to the search engines.

Be Vigilant and Be Natural
So far Google has done a decent job of keeping spam out of their index. It still finds its way into their results, especially for less competitive keywords, but when Google does find spam they tend to develop new methods to detect that spam and remove it from their index. Unfortunately, those new methods will inevitably affect some website owners who really do not know that they are doing something wrong.

Google has taken a very positive step in starting their pilot program aimed at notifying website owners who may be innocently doing something wrong, but the responsibility ultimately will always reside with the website owner. If you are having troubles ranking well for your targeted keywords, take the time today to review your website. Ask yourself if you have duplicate pages, if you have any hidden text or are possibly stuffing keywords on your page. Do you have any redirects which could be misinterpreted? Take the time to re-read Google's webmaster info and familiarize yourself with it.

Getting to the top of Google is hard work, but it is well worth it when you reach the top.

About The Author
"Does Google Penalize Innocent Websites?" was written by Mark Daoust, the owner of Site-Reference.com.

Sunday, September 18, 2005

Keywords, Competition, Being Number One Uncovering the Algorithm

By John Krycek (c) 2005

By following these steps you will see that most closely guarded secret-- the search algorithm. Remember the movie "The Matrix"? The Matrix is there, you just can't see it. So is the search algorithm. It's easy to pay a Search Engine Optimizer to give your pages some ranking power. Unfortunately, given the inherent time factor involved in climbing the ranks, your money may be long gone before you know if it was well spent.

There Is No Magic Pill
Forget any advertisement you see for instant number one search results or automated this or that. Most are scams, and the ones that aren't might get you positioned, but it will be very short-lived.

Search engine optimization is an ongoing process. Achieving and maintaining a high rank, especially on highly competitive keywords, requires constant maintenance. If you do find a legitimate SEO firm, it is well worth the money to pay their monthly maintenance fee and let them continue to help you after the initial project. At least for 6 months or a year as you establish yourself.

In this article we'll look at some of the intricate and complex tasks of optimizing a page for long term ranking power. You will learn how to read between the code and the content to find what is necessary to bring you to the top. Being number one is easy to say, but it is quickly overwhelming when you stare at tens of thousands of pages you want to outrank. So how do you begin?

The starting line on the road to that first page SERP (search engine results page) ranking is not as blurry as you might think. In fact, you can uncover the starting line, the route, and all the scenery along the way to the finish line without knowing the search engine algorithm.

STEP 1-
Your Keywords Are The Crowning Achievement Of Grueling Days Of Work

If you have investments in the stock market you know how much research and thought goes into choosing those securities. Now take that same effort and multiply it by three. That's how much planning and revision your keywords should take.

A simple, broad key phrase like "shoes" could hypothetically bring you up in a countless stream of different searches. Women's shoes, baby shoes, sneakers, high heels, etc. If somehow you manage to settle into a good ranking (which would be difficult) you would have more traffic on your site than you could handle. But traffic is worthless if it doesn't get to its destination.

Chances are, you weren't that destination.

Your keywords must be focused and precise, specific to what you are selling. Using a key phrase like "Gucci mens black leather loafer" will bring a targeted lead to your site. You may not reach as many people as the more generalized keyword, but the people that do come to you have a much deeper interest in the specific product you are selling.

Therefore you have a much greater chance of converting that targeted lead to a sale. Your keywords are your magic beans, your winning lotto numbers, your energizer bunnies, your sales force, whatever you want to call them. They must be perfect.

STEP 2-
Want To Be Number One? Look At Who Already Is

Competition Analysis- no SEO book can give you this information.

Now take your keyword list and type the phrases into a search engine. Who comes up in the first ten results? The company that is number one is there because it has most closely matched what the search engine algorithm says should be number one. You can learn a great deal from them.

A. Internal Factors
Take that number one page, and the other top 9 pages and study them, look at the code, break them down. You are looking at the first half of what is needed to rank in the top 10 pages for your key phrases on that particular search engine. The list of what to look for is enormous.
Studying the Internal Factors on a page is taking it apart to see how it's put together. Not how it works, but statistical research into the precise construct and layout of keywords and phrases in relation to each other within the page.

Start with these areas:
URL address, Page Title, Meta description, Meta Keywords, First sentence on the page, Body copy, Bold or Emphasized Phrases, H1 or other tags, Alt Tags, Navigation system

In each of those sections, look at:

* Keyword densities- the number of times your phrase and each word in your phrase appears
compared to the text around it
* Where, and how many times, the same phrase and words appear in different sections
* The word and character position of each phrase in each section
* The total number of characters
* The total number of words

Beginning with these comparisons should keep you quite busy for a while. A spreadsheet is quite useful. Some commercial products are also available that can make this daunting task much more feasible. Keep looking for other patterns and differences. You want to duplicate them in your own page. NOT copy and steal. You want to mimic the patterns that are bringing that page to the position it's in. Then move on to examining the external factors of these pages.
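To make the density idea concrete (the numbers are invented for this example): if a competitor's body copy runs 250 words and the key phrase appears 5 times, one common way to express that section's density is 5 / 250 = 2%. Run the same calculation for the title, description and each heading, note where in each section the phrase first appears, and you can compare your page against each of the top 10 column by column in that spreadsheet.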

B. External Factors
External factors of a web page deal with the links to, from and within a web page, both inside the same site, and out into the web. This analysis usually takes more time because it involves more dissection of pages beyond the one you're trying to optimize. In this analysis, as with Internal Factors, you want to compare and contrast your page versus the top 10 competitors, find similarities and differences. Below is a list of criteria to get you started.

* Number of internal links (to the same site) on that page
* Number of external links
* Number of links pointing TO that page* (see below for details)
* The link/anchor text- which keywords are used and where
* Google Page Rank value of incoming links
* Alexa Rank of incoming links
* The quality and thoughtfulness of the content

To get a listing of the links that point to a site, type the following into Google, MSN and Yahoo searches: "link:www.domainname.com". Google tends to only show a small portion of the links back, but MSN and Yahoo will give you much more pertinent data.

Now you want to compare the content on each of these pages to the one they point to. Is it of a similar theme? In what context does the link back appear, and where? Though the subject of much debate, the consensus is that Google Page Rank does not mean what it used to. However, if it is in some fashion a measure of how significant or "important" a site is, it is worth looking more closely at the sites that link back that have a high Page Rank.
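
For the link counts in the list above, a short script can save a lot of manual tallying. Here is a minimal, hypothetical Python sketch that fetches a page and counts internal versus external links (the URL and domain are placeholders; respect robots.txt and terms of use when fetching other people's pages):

# minimal sketch: count internal vs. external links on a page (hypothetical example)
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    def __init__(self, site_domain):
        super().__init__()
        self.site_domain = site_domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        domain = urlparse(href).netloc
        # relative links and links to the same domain count as internal
        if not domain or domain.endswith(self.site_domain):
            self.internal += 1
        else:
            self.external += 1

url = "http://www.example.com/"   # placeholder: a top-ten competitor page
html = urlopen(url).read().decode("utf-8", errors="ignore")
counter = LinkCounter("example.com")
counter.feed(html)
print("internal links:", counter.internal, "external links:", counter.external)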

Even A Surgeon Uses Tools
Now, this is definitely a ton of work to do all by hand. There are software programs that can help do some of the digging and mathematical computations for you, figuring out densities and organizing information.
Tools like this are definitely ones a professional SEO will have in their arsenal. But remember, these are tools, not miracle workers. It takes a human being to evaluate the data, recognize connections and similarities, draw conclusions and interpret the results. Then you have to extrapolate from that data.
Remember, you want to do one better than every site you just examined. To do that you have to draw some conclusions and make some educated guesses and link to even better sites.

Final Thoughts
You have access to the inner workings of every page that you want to beat. Learn from them and do one better. This process is not a one-time shot. It is ongoing. Check your key phrases every week. Do the same people still rank in the top ten?
Some have probably moved. Remember that they're going to adapt to maintain their positions, too. If you want the rankings, you have to spend the time, and not just once, or pay someone to do it for you.

Don't ever believe anyone who says they can guarantee any kind of results. And ask them how they will optimize your pages. If they explain to you something like the above, then you've probably got yourself someone experienced and honest. Your money will be well spent and you'll quickly recover it.

About The Author
John Krycek is the owner and creative director of theMouseworks.ca web design in Toronto. Learn more about search engine optimization and internet marketing in easy, non-technical, up front English!

Saturday, September 10, 2005

LOGO DESIGN 101

A logotype, commonly known as a logo, is a design, a graphic representation/image/trademark symbolizing one's organization. Designed for instant identification, a logo can appear on company letterhead, advertising material and signs as an emblem by which the organization can easily be recognized.

The value of a logo should be based on a few important criteria:

1. Experience of the logo designer
2. Size & budget of the company using the logo
3. Scope and usage of the logo
4. Difficulty of the design

Does your current logo represent the three key elements that make up a credible, high-quality logo design?

a. Does the logo portray your company in a manner which says that you are an expert in this field?
b. Is the logo "contemporary", symbolizing a "forward-thinking" look?
c. Is the message that you are trying to convey to the consumer clear?

If you answered 'yes' to all of these questions, then why change your logo? By revamping your company image, you may risk losing your supporters, clients who are already familiar with your products and services, your popularity, respect, as well as your market share. You can, however, clean up your logo or update it with a lot less risk.

Important Points To Consider:

• A logo should:
- Attract attention and leave an impression
- Create a look that is unique
- Reflect the personality of the company

• Reproduction costs: The more detailed and colorful the logo design, the more difficult it is to reproduce, meaning a higher cost.

• The size: The perfect logo design will look great on a sign board as well as on a business card or a pen.

• Logo design companies come by the dozen. Take your time, research different companies and designers, and compare packages in order to select a logo design company suited to your needs.

• Check your competition. What designs, graphics, and colors do they use? Remember that you need to be competitive.

• Trademark your logo. If your logo is trademarked, this prevents competitors and other third parties from stealing it.

• And last but not least, when in doubt, try the janitor test.
More about the janitor test tomorrow!

Rick
www.visionefx.net

Friday, September 09, 2005

SEO According to Google's Webmaster Guidelines

By Mark Daoust (c) 2005

Every SEO expert has their own set of rules as to how they believe Google ranks websites. Most of the time, these rules tend to vary slightly from one another. But what does Google tell us? Fortunately they gave us all the guidelines we need. This article will look at these guidelines in depth.

Make a Site With a Clear Hierarchy and Text Links. Every Page Should Be Reachable From at Least One Static Text Link.
Your website's navigation is the foundation of your entire website. It binds all the pages together into a common theme and provides a clear vision to your website visitors. Clear navigation is extremely important for search engine spiders. Spiders want to find as many pages on your website as possible. However, if your navigation is confusing, or worse, incomplete, search engine spiders will not discover your entire website. It is also important to remember that search engines learn what a page is about partially by the text used to link to that page. In this regard, the anchor text you use in your navigation can help you rank better for your targeted keywords. This is why Google also heavily recommends that you use text links on your website.

Offer a Site Map to Your Users With Links That Point to the Important Parts of Your Site.
If the Site Map is Larger Than 100 or So Links, You May Want to Break the Site Map Into Separate Pages.

Having a sitemap is crucially important. When a spider visits your website, one of the things it looks for is a sitemap. Search engine spiders have millions of pages to visit in a very short amount of time, so they want to work as efficiently as possible. A sitemap will allow the spider to know which pages are most important to your visitors, and therefore the most important to the search engines.
Even if you don't have a sitemap you can still get your pages indexed by Google if you have a solid navigation system. Placing a sitemap on your website will simply make life easier on the spider, which is something you should strive for.
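
A user-facing sitemap doesn't have to be fancy. Something as plain as the hypothetical page below (the page names are placeholders) already gives the spider one static text link to every important section:

<!-- sitemap.html - minimal hypothetical example -->
<h1>Site Map</h1>
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/products/mens-leather-loafers.html">Men's Leather Loafers</a></li>
  <li><a href="/articles/shoe-care-guide.html">Shoe Care Guide</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>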

Create a Useful, Information-Rich Site, and Write Pages That Clearly and Accurately Describe Your Content.

You've certainly heard everyone talk about how content is king. It is important to note, however, that not just any content is king...unique content is king.
Undoubtedly you have heard experts touting hundreds of sources for free content, from RSS to free reprint articles. Although RSS and free reprint articles can be very useful for your website, nothing will ever replace content that is developed by yourself or your company.

It is said that nearly half of all searches performed on a daily basis include combinations of words that have never been searched on before. If users are searching with unique combinations of words, what are the search engines looking for? Unique content to serve them.
Of course, RSS and free reprint articles are useful, especially if the information in the articles or feed is pertinent to the subject matter of your website. You should not be afraid to use free reprint articles or RSS feeds (in fact you probably should use them), but you should also be sure to have a significant amount of unique content on your site as well.

Try to Use Text Instead of Images to Display Important Names, Content, or Links. The Google Crawler Doesn't Recognize Text Contained in Images.

This is an extremely important point that many, many website owners tend to ignore. They miss great opportunities to place heavy hitting keywords in great spots within their website.

Instead, they place an image in that spot solely for the purpose of appearance. Fortunately the web and most browsers have adopted CSS. With CSS you can format the style of your links and present important text in a stylish manner. Using CSS you can also present an image to your visitors while keeping important text in your code for the search engines.
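
As a small, hypothetical illustration (the class name, colors and image file are made up), here is a navigation link styled with CSS so that it looks like a graphical button while the keyword-rich anchor text stays in the code for the spiders:

<style>
/* hypothetical sketch: a text link dressed up as a button */
a.nav-button {
  display: block;
  width: 180px;
  padding: 6px 10px;
  background: #003366 url(button-bg.gif) repeat-x;  /* decorative image only */
  color: #ffffff;
  font: bold 12px Arial, sans-serif;
  text-decoration: none;
}
</style>

<a class="nav-button" href="/mens-leather-loafers.html">Men's Leather Loafers</a>

The visitor sees a polished button; the spider sees a plain text link with descriptive anchor text.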

Make Sure That Your TITLE and ALT Tags Are Descriptive and Accurate.

The title tag is the single most important part of SEO that you can optimize. The title tag is included in the head portion of every HTML page. The tag is intended to tell your visitors and the search engine what the subject of that specific page is.
Many website owners make a number of mistakes with their title tag. Many times they'll stuff the title with a list of keywords. Other times they will not change the title from page to page. All of these things will hurt your rankings and drive the search engines away from your website.

A good title tag should be relatively short and highly descriptive. It should contain your most important keywords for that page, and it should make sense to a human reader. If you use images on your website, you should always include an alt tag. When choosing your alt tags you should try to be descriptive and brief. Alt tags are not an opportunity to stuff your page with your targeted keywords. Search engines are smart enough to know when a website owner is trying to take advantage of image alt tags.

Of course, using CSS, it is possible to design a good looking website without using any images.
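
Putting the two together, a descriptive title and ALT tag for a hypothetical product page might look something like this (the file names and wording are placeholders):

<head>
  <title>Men's Black Leather Loafers - Acme Shoe Company</title>
</head>
<body>
  <!-- the ALT text simply describes the picture; the keyword appears once, naturally -->
  <img src="black-leather-loafer.jpg" alt="Men's black leather loafer, side view" width="300" height="200">
</body>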

Check For Broken Links and Correct HTML.

This is a simple step to take, and extremely important. This goes back to the concept that we want to make life easier on the search engine spiders. If a spider comes across a website with broken links, it may think that your website is incomplete. Worse yet, it may consider your website not worth visiting, since broken links waste the spider's time.

Fortunately there are several link checkers available. You can use the Site Reference site link checker.

Correct HTML can be a little more difficult, especially if you use an HTML editor such as FrontPage and do not know HTML. As the owner of a website, however, you should take the time and effort to make sure the site is written in valid HTML. This will make it easier for a spider to find out what your site is really about.
To validate your website, visit the World Wide Web Consortium's HTML Validator. If you have any mistakes in your HTML, the validator will tell you what they are and what you need to do to correct them.
Once you have validated your web site, you may also place the Consortium's image link right on your site. Look at the bottom right of the following page: http://www.visionefx.net and click on the W3C logo!

If You Decide to Use Dynamic Pages (i.e., the URL Contains a '?' Character), Be Aware That Not Every Search Engine Spider Crawls Dynamic Pages As Well As Static Pages. It Helps to Keep the Parameters Short and the Number of Them Few.

The larger a website gets, the greater the need for it to get some sort of content management system. If you own a shopping based website or a content based website that generates its pages dynamically, you should be aware of the implications to the search engines. Dynamic websites often employ the use of query strings. Below is an example of a query string:
http://www.somesite.com/index.php?query_string=this-is-the-query-string


Query strings can employ a wide range of characters, which can make it difficult for search engines to spider your website. Search engines are able to spider sites with simple query strings, but it is generally not a good idea to rely on them.

You can easily change your website URLs to be search friendly by using an Apache module called mod_rewrite. Using mod_rewrite you can turn a URL from looking like this:
http://www.somesite.com/index.php?query_string=2932&name=some+name
to this:
http://www.somesite.com/2932/some-name.html
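
As a rough sketch of how that might be set up on an Apache server, the .htaccess rule below maps the friendly URL back onto the real script. The pattern is hypothetical, assumes URLs shaped exactly like the example above, and should be tested carefully; your application also has to understand the hyphenated name it receives.

# hypothetical .htaccess sketch: /2932/some-name.html -> index.php?query_string=2932&name=some-name
RewriteEngine On
RewriteRule ^([0-9]+)/([a-zA-Z0-9-]+)\.html$ index.php?query_string=$1&name=$2 [L]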

Conclusion

Most SEO experts have their own set of rules to rank well in the search engines. If you travel from one SEO to another, you will find that they differ slightly in what they believe should be emphasized in your SEO strategy. Even though they may all have their differences, the most basic and most important aspects of quality SEO are all the same.

Fortunately, Google provides us with these rules.

About The Author
SEO According to Google's Webmaster Guidelines was written by Mark Daoust. Visit Site-Reference.com for more articles on search engine marketing.

Monday, September 05, 2005

Logo Design

Corporate and commercial firms spend thousands of dollars on creative consultants to find that special combination of color, style and shape.

Case Study
A consulting study for a new Fox 43 television affiliate logo design involved analysis of the existing logo colors and designs of all the other local television affiliates in the region. Preliminary analysis showed an overuse of 'blue' in the other station logos. A Fox 43 logo design and color palette of gold, black and red was developed that provided a powerful stand-out against the local competitors.

Branding
A company logo is the basis for your 'brand identity'.
Think about it... A logo is featured on:
* Letterhead
* Business cards
* Brochures, reports, flyers, newsletters
* Auto and truck window stickers
* Auto and truck bumper stickers
* Billboard signs
* Sales promotion materials - clothing, hats, pens and so on.
* Broadcast graphics
* Web site
* Emails

If you don't have a huge budget for logo design - DON'T DO THIS YOURSELF.
Go online and Google:
'online logo design services' - 'cheap logo design' - 'affordable logo design'.

You will at least be in the hands of a knowledgeable professional who can design something better than chicken scratch on a cocktail napkin.
I find that individuals who actually believe they can design a professional logo fall into the category of tone-deaf people who love to sing.
Let us create that professional 'brand presence' for your business.

Your logo reflects how serious you are about doing business.
So be serious about your logo.

Rick Vidallon
www.visionefx.net

Tuesday, August 23, 2005

Google Inc. is set to introduce its own instant messaging system

Photo: The logo of Google Inc. seen outside the company's headquarters in Mountain View, California (August 18, 2004 file photo, REUTERS/Clay McLachlan).

SAN FRANCISCO (Reuters) - Google Inc. is set to introduce its own instant messaging system, the Los Angeles Times reported on Tuesday, marking the expansion by the Web search leader into text and also voice communications.

Citing unnamed sources "familiar with the service," the Los Angeles Times said that Google's Instant Messaging program would be called Google Talk and could be launched as early as Wednesday.

Google Talk goes beyond text-based instant messaging using a computer keyboard to let users hold voice conversations with other computer users, the newspaper quoted a source as saying.
A Google spokeswoman declined to comment on the company's product plans.
If confirmed, the combined computer text and voice-calling service would put Google in competition with a similar service pioneered by Skype, which has attracted tens of millions of users, especially in Europe, to its own service.

Separately, independent journalist Om Malik on his blog at http://gigaom.com pointed to technical clues that suggest Google is preparing to run an instant messaging service based on an open-source system known as Jabber.

Jabber technology would allow Google instant message users to connect with established IM systems that also work with Jabber, including America Online's ICQ and Apple Computer Inc.'s iChat, Malik said.

"This is the worst possible news for someone like Skype, because now they will be up against not two but three giants who want to offer a pale-version of Skype," he wrote. Earlier this week, Google said it was branching out beyond pure search to help users manage e-mail, instant messages, news headlines and music. It introduced a new service called the Google Sidebar, a stand-alone software program that sits on a user's desktop and provides "live" information updates.

Over the past year or so, the company has expanded into e-mail, online maps, personalized news and more.

The product push comes as rivals Yahoo Inc., Microsoft Corp. and Time Warner Inc.'s AOL are all pushing to upgrade existing instant messaging systems and expand into new Internet phone-calling services.

Google's moves take it beyond its roots in Web search and closer to becoming a broad-based Internet media company. With instant messaging, Google would be breaking into a market in which its major competitors boast tens of millions of subscribers to their established instant messaging services.

America Online, with its AIM and ICQ brands, counts more than 40 million IM users in the United States alone. Yahoo has around 20 million and Microsoft's MSN Messenger numbers some 14 million users, according to recent comScore Media Metrix data.

Monday, August 22, 2005

Search Engine Spiders Lost Without Guidance - Post This Sign!

Robots.txt Signpost Warns Trespassers From Private Property,
By Mike Banks Valentine (c) 2005


The robots.txt file is an exclusion standard used by web crawlers/robots; it tells them which files and directories you want them to stay OUT of on your site. Not all crawlers/bots follow the exclusion standard; some will continue crawling your site anyway. I like to call them "Bad Bots" or trespassers. We block them by IP exclusion, which is another story entirely.
This is a very simple overview of robots.txt basics for webmasters. For a complete and thorough lesson, visit Robotstxt.org.

To see the proper format for a somewhat standard robots.txt file, look directly below. That file should be at the root of the domain, because that is where the crawlers expect it to be, not in some secondary directory.

Below is the proper format for a robots.txt file ----->

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /group/

User-agent: msnbot
Crawl-delay: 10

User-agent: Teoma
Crawl-delay: 10

User-agent: Slurp
Crawl-delay: 10

User-agent: aipbot
Disallow: /

User-agent: BecomeBot
Disallow: /

User-agent: psbot
Disallow: /

--------> End of robots.txt file

This tiny text file is saved as a plain text document and ALWAYS with the name "robots.txt" in the root of your domain.

A quick review of the listed information from the robots.txt file above follows. The "User Agent: MSNbot" is from MSN, Slurp is from Yahoo and Teoma is from AskJeeves. The others listed are "Bad" bots that crawl very fast and to nobody's benefit but their own, so we ask them to stay out entirely. The * asterisk is a wild card that means "All" crawlers/spiders/bots should stay out of that group of files or directories listed.

The bots given the instruction "Disallow: /" should stay out entirely, while those given "Crawl-delay: 10" are the ones that crawled our site too quickly and caused it to bog down and overuse server resources. Google crawls more slowly than the others and doesn't require that instruction, so it is not specifically listed in the above robots.txt file. The Crawl-delay instruction is only needed on very large sites with hundreds or thousands of pages. The wildcard asterisk (*) applies to all crawlers, bots and spiders, including Googlebot.

Those we gave the "Crawl-delay: 10" instruction to were requesting as many as 7 pages every second, so we asked them to slow down. The number you see is in seconds, and you can change it to suit your server capacity, based on their crawling rate. Ten seconds between page requests is far more leisurely and stops them from asking for more pages than your server can dish up.

(You can discover how fast robots and spiders are crawling by looking at your raw server logs, which show pages requested with precise times to within a hundredth of a second; they are available from your web host, or ask your web or IT person. If you have server access, your server logs can be found in the root directory; you can usually download compressed server log files by calendar day right off your server. You'll need a utility that can expand compressed files to open and read those plain text raw server log files.)

To see the contents of any robots.txt file, just type /robots.txt after the domain name. For instance, for this web site, http://www.visionefx.net, you would type: http://www.visionefx.net/robots.txt. Neat, huh? If they have that file up, you will see it displayed as a text file in your web browser. Click on the link below to see that file for Amazon.com: http://www.Amazon.com/robots.txt

You can see the contents of any website robots.txt file that way.

The robots.txt shown above is what we currently use at Publish101 Web Content Distributor, just launched in May of 2005. We did an extensive case study and published a series of articles on crawler behavior and indexing delays known as the Google Sandbox. That Google Sandbox Case Study is highly instructive on many levels for webmasters everywhere about the importance of this often ignored little text file.

One thing we didn't expect to glean from the research into indexing delays (known as the Google Sandbox) was the importance of robots.txt files to quick and efficient crawling by the spiders from the major search engines. Nor did we expect the number of heavy crawls from bots that do no earthly good for the site owner, yet crawl most sites extensively and heavily, straining servers to the breaking point with requests for pages coming as fast as 7 pages per second.

We discovered in our launch of the new site that Google and Yahoo will crawl the site whether or not you use a robots.txt file, but MSN seems to REQUIRE it before they will begin crawling at all. All of the search engine robots seem to request the file on a regular basis to verify that it hasn't changed.

Then when you DO change it, they will stop crawling for brief periods and repeatedly ask for that robots.txt file during that time without crawling any additional pages. (Perhaps they had a list of pages to visit that included the directory or files you have instructed them to stay out of and must now adjust their crawling schedule to eliminate those files from their list.)

Most webmasters instruct the bots to stay out of "image" directories and the "cgi-bin" directory as well as any directories containing private or proprietary files intended only for users of an intranet or password protected sections of your site. Clearly, you should direct the bots to stay out of any private areas that you don't want indexed by the search engines.

The importance of robots.txt is rarely discussed by average webmasters, and I've even had some of my clients' webmasters ask me what it is and how to implement it when I tell them how important it is to both site security and efficient crawling by the search engines. This should be standard knowledge for webmasters at substantial companies, but it illustrates how little attention is paid to the use of robots.txt.

The search engine spiders really do want your guidance and this tiny text file is the best way to provide crawlers and bots a clear signpost to warn off trespassers and protect private property - and to warmly welcome invited guests, such as the big three search engines while asking them nicely to stay out of private areas.

About The Author
Mike Banks Valentine operates Publish101.com, free Web content distribution for article marketers, and provides content aggregation, press release optimization and custom web content for search engine positioning. See the Google Sandbox Case Study mentioned above. http://www.seoptimism.com/SEO_Contact.htm

Saturday, August 20, 2005

How Important is ALT Text in Search Engine Optimization?

By Robin Nobles, Professional Writer and SEO

For years, search engine optimizers have included their important keyword phrases in ALT text for images, feeling confident that many of the search engines considered the contents of ALT text when determining relevancy.

The big question is, has this changed?
Yes . . .
None of the Major Engines Considers ALT Text When Determining Relevancy

According to research by SEO expert Jerry West of WebMarketingNow and Search Engine Academy, at the present time none of the "Big Three" search engines (Google, Yahoo! or MSN) considers ALT text when determining relevancy.

West explains, "Over the last six months, we have seen a trend on our testing servers that shows that using ALT text for SEO purposes has not only diminished, but adversely affects the rankings in the SERPs. It is clear that search engines continue to catch up to "SEO tricks" that are intended to improve search engine ranking while damaging the visitor experience. The American Disabilities Act (ADA) has strict guidelines as to wha your site needs to contain in order to be ADA compliant. I guarantee you, they do not look favorably at ALT text that has been keyword stuffed.

"Have you ever witnessed a visually impaired individual use th Web? With a device which reads aloud the contents of a Web page, the impaired individual will be inundated with what I refer to as, ALT Text Sp@m. Sometimes the reader is stuck on one graphic for more than 40 seconds reading all of the keywords that have been stuffed.

According to a Google engineer, what you should do is create an ALT tag that is relevant to the picture, so it gives the user a good experience, including the visually impaired. The ALT text is indexed, but it is downgraded in the algorithm. The reason?

"We see ALT text as relevant as the Keyword Meta tag,' said the engineer. That should say it all as Google has nevër used the Keyword Meta tag due to the high sp@m rate.

"How do we test? I have outlined our testing methodology below," continues West.

"Our Testing Setup:

* We have four servers (Two Apache servers, one Windows, one Sun Solaris);

* Each server is located in a different part of the United States;

* Each test server has 16 test domains;

* Domains are matched in pairs for A/B testing;

* All domains are "dot com"; no testing is done with other extensions for the algorithms;
* The 8 pairs are configured as follows: 3 pages, 8 pages, 25 pages, 50 pages, 100 pages, 150 pages, 300 pages, 500 pages;

* When performing testing, one of the domains in the pair is tested while the other remains constant;

* Due to varying issues within the algorithms, it takes approximately six weeks to see consistent numbers in order to formulate accurate conclusions.

What Does This Mean to SEOs?
Search engine optimizers no longer need to use keyword phrases in the ALT text of images on their Web pages.

However, let's look at a smarter approach.

I've been recommending to my online and offline SEO students for a long time that they needed to use ALT text in the manner in which it was designed to be used by the W3C: to describe the image. Then, they can include the keyword phrase in one or two images on the page, if appropriate.

Continuing with that strategy is still viable. The major engines don't consider the contents of ALT text now, but that doesn't mean they won't six months from now. Always remembering the "basics" is one of the best strategies to follow.

Other ALT Text Tips . . .
1. Remember that the purpose of ALT text is to describe the image for the benefit of those who surf the Web with images turned off and for those who have the contents of Web pages read out loud to them. The W3C highly recommends that Web site owners use ALT text to describe images.

2. Use your keyword phrase in one or two instances of ALT text on the page, no more. Use moderation in everything you do in search engine optimization.

3. Don't use text that is non-relevant to the image. Don't keyword stuff. Jerry West adds, "Give the visitor information that is worthwhile, especially for the visually impaired."

4. "Consider using a description below the graphic. Based on recent test results, this is read often," states West.

West continues, "Basically, remember to be compliant, not just with the W3C but also with the ADA. It all comes down to intent. If your intent is to fool the search engine into giving you a higher ranking, you are performing 'grey or black hat' strategies. Stay on the right side of the path and the engines will bless you.":)

Remember . . . ALT Text is Just One "Piece of the Pie"
Relevancy and ranking are determined by over 100 different factors. ALT text was just one piece of that pie, a sliver at that.

Don't ever focus on just one piece of the pie. Always remember the basics - the SEO foundation - and make sure it's solid.

If you know you're weak in one or two areas, you know you have to beef up on other pieces of the SEO pie.

We'll talk more about the "SEO pie" in future articles. Or, attend our on-location workshops, where the SEO pie is always a topic of conversation.

About The Author Robin Nobles teaches 2-, 3-, and 5-day hands-on search engine marketing workshops in locations across the globe as well as online SEO training courses. They have recently launched localized SEO training centers through Search Engine Academy.

Friday, August 19, 2005

Deep Thoughts From the Googleplex

by Gord Hotchkiss, Thursday, August 18, 2005

IT WAS ONE YEAR AGO that I wrote my first Search Insider column. I remember that by the fact that I wrote about the San Jose Search Engine Strategies Show and now here I am, back in San Jose, going for my semi-regular search marketing total body immersion. Thank goodness this only happens occasionally. It can do strange things to one's perspective to spend four days with thousands of people who live, breathe, and eat search. Compare this to my other life, where my wife is still not exactly sure what I do for a living.

For those of us privileged to live on the inside of this industry, we gain a glimpse into a fantastic and highly illogical world. It's a world where empires can grow from mere ideas overnight and where vast territories can disappear just as quickly. Intellectual capital is the currency here, and it can be redeemed only through the acceptance of the masses. The winners in our world are the ones that pull the gem of an idea, nurture it into life, and find it picked up by the world. It's like throwing little bits of our soul at the public, and hoping one of them sticks.

Case in point: Google. While here in San Jose, I had the opportunity to visit the new Googleplex in Mountain View. I walked through the immense complex (on the morning after the Google Dance, so I was still bleary-eyed) and joined my host for a hot breakfast in Google's gourmet cafeteria, one of many places to grab a meal. I was surrounded by impossibly young, blue jean- and t-shirt-clad Googlites (Googlians?) that were all searching for the next big idea that will resonate with the public. They bellied up to the counter for a custom-made omelet or fruit smoothie, and then gathered around tables to start discussing the future, built in their terms. As my host said, this was the kingdom of the engineers, and Google is still very much an engineer-driven company.

In our world, this is as close to Camelot as it gets. Our society has switched paradigms. Many of us no longer look to our governments or spiritual organizations to make the world a better place. We've put our faith in the raw power of ideas. And if we happen to make a few billion in the process, so be it. Empires like Google no longer need assembly lines or oil wells, smelting plants, or factories to grow and prosper. All you need is people with bright ideas.

It was a telling note that my host told me that the new Google campus was in fact the old Silicon Graphics headquarters. As technology passes on, a new king has come to occupy the castle. The old guard has passed the torch to the new. He acknowledged the irony and said, "Hopefully we'll be able to stay here awhile." Meanwhile, the engineers downed their omelets and smoothies, blissfully unaware of the fact that, more often than not, history is doomed to repeat itself.

As I took in the sheer immensity of the complex, with all its high-tech touches and iconic lava lamps, I couldn't help but think that all this came from one single idea. And it's not even that defensible an idea. The Google Empire has been built from a clever thought, a shard of the souls of Mr. Page and Mr. Brin that has lodged in our collective bosoms. By making "Googling" a verb, they have built an enormous company. And they've done it in seven years. Yet no one seems aware of how ephemeral this all is. The phrase "Castle in the sky" couldn't help but come to mind.

I felt torn between the father in me and the self-acknowledged tech geek. Part of me loves the idea of a world built on sheer intellectual horsepower. I am excited by the constantly shifting challenges and the persistent question: "What's the next big idea? Who could be the next Google?" As I often say, working in this industry is like dancing on quicksand. But the dad in me says: "Be careful. This could all come crashing down tomorrow."

Gord Hotchkiss is the president of Enquiro, a search engine marketing firm. He loves to explore the strategic side of search and is a frequent speaker at Search Engine Strategies and Ad:Tech.

You may reach him at:
gord.hotchkiss@enquiro.com

Enquiro Search Solutions Inc.
www.enquiro.com
800 277 9997