Thursday, October 27, 2005
Other great resources about the Google Jagger Update:
Jagger1 - (Included changes in displayed PR and BL)
Jagger2 - Has started and will probably end next week.
Jagger3 - Will start next week hopefully (Wednesday probably at the earliest)
See: http://www.ysearchblog.com/archives/000095.html and http://www.jimboykin.com/45/
My friend and SEO mentor, who shall remain nameless (the less Google knows about him, the better), explained it to me from his point of view. "Mr. K," as I will call him, became involved in the Internet when Cobalt, dBase and FoxPro were all the rage.
This is Mr. K's take on the recent Google Dance
When Google rolls out a radically new 'algo', they don't feed this bad boy to all their servers at once. Some server IPs are blocked off so they can test the new 'algo' against that blocked group of servers. The updated servers, the 'Google feed' you and I see, were rolled back (I'm guessing 6-8 months) so they can test the indexing speed of the new 'algo' against the old one on the blocked IPs.
Hmmm- This makes sense!
Fast, greasy, blinding indexing speed means faster Google caching of data. Given the ever-growing volume of new data being posted to the World Wide Web, speed is what it's all about. I'm also guessing that 'Google Base' figures into this equation, but that's food for another blog thought.
So, my friends, keep the faith, and next time don't put all your marketing eggs in one basket. There's never been such a thing as a free lunch!
Thursday, October 13, 2005
What did they decide? Google's toolbar will be bundled into downloads of the Java Runtime Environment, and Sun's Java will be used to power new software developed and released by Google.
Google might also include links to Sun software that directly competes with Microsoft software such as the Open Office suite in future updates of its toolbar.
What does this mean for search?
This is probably only the first step in Google's and Sun's battle against Microsoft. Google wants to win more market share on the desktop of computer users and it wants to move computer applications from the desktop to the Internet.
Google has also recently filed a new patent that indicates that Google is working on a way to constantly monitor all of your actions in order to build personalized search queries.
According to the patent specification, Google aims to monitor whatever you type in your word processor, the things you copy to your clipboard, the position of your mouse, the content of your emails, instant messenger messages and more.
If Google has access to Sun's free Open Office suite, it might be easier to do that. By gathering as much information about you as possible, Google can offer you personalized search results and - more important to Google - personalized ads.
What does this mean to you? It seems that many of Google's recent "free" applications mainly serve the purpose of gathering more data about you so that Google can monetize that information for targeted ads.
If you use many different Google services, you share a lot of information with Google. It's up to you to decide if you're willing to exchange private information for "free" software and services.
This distribution partnership is probably only the start. It's likely that we can expect a lot more from this alliance between these two online giants.
Monday, October 10, 2005
By Robin Nobles
It’s a common occurrence: SEOs often spend countless hours trying to "crack" a search engine's algorithms. "If I could just crack Google's algorithm, my pages would soar to the top of the rankings!" Let's look at some flaws in this way of thinking.
1. Picture the Google engineers and tech folks turning the algorithms dial as soon as you "think" you have "cracked" the algorithms. Your rankings may fall, and you would have to figure out what's working with the engine right now. In other words, your rankings may never be long term.
2. Instead of spending all of this time trying to impress a search engine with a perfect page, why not impress your true target audience... your customers. Has Google, MSN, or Yahoo! Search ever bought anything from you? They're not your target audience. Your customers are your target audience. Write your pages and content for them.
3. When you expend so much of your energy chasing algorithms, you often focus on only a few elements that influence ranking – those elements that are working right now and that you hope will give your pages the best chance for success. Google is said to use over 100 elements to determine ranking and relevancy. Some are more important than others, but focusing on just one or two "main" elements and discounting the rest can prove disastrous to a Web site.
A different approach... Wouldn't you rather achieve top rankings and keep them, and have those rankings equate to sales and money in your back pocket? After all, isn't it ultimately the sales you're after, as opposed to just the rankings? If those rankings don't equate to traffic that equates to sales, you lose, any way you look at it.
Five Basic Steps for Achieving Top Rankings without Chasing Algorithms
1. Forget about the search engines. Yes, you heard me correctly. The search engines aren't and never will be your "ideal target audience." They don't buy your goods and services. They're not who you should be trying to please with your Web pages and site. Instead, write your Web page content for your target audience.
2. Don't ever forget the basics. No matter what's happening in the algorithms, continue using your main keyword phrase prominently in your title tag, META description and keyword tags, link text, body, heading tags, and so forth. That way, when the algo dial is turned, you won't have to make changes to all of your pages. You'll always be ready.
3. Focus your keyword-containing tags and body text on one keyword phrase only. Each page should be focused on one keyword phrase, and each page should have its own unique tags.
4. Write well-crafted content for your Web pages, and add new content on a regular basis. If content is king, context is queen. Focus on your keyword phrase, synonyms and related words, and surrounding text. Use a program like Theme Master if you need help determining those supporting words.
5. Remember that both on-page and off-page factors are important. Don't sacrifice one for the other. On-page factors are your tags, body text, prominence, relevance, etc. Off-page factors are link popularity (quality and number of your inbound links) and link reputation (what those inbound links "say" about your Web page when they link to you).
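To make step 2 above concrete, here is a minimal hedged sketch of keyword-prominent markup; the keyword phrase, file name and copy are made up purely for illustration:

```html
<!-- Hypothetical page focused on the single phrase "handmade oak furniture" -->
<html>
<head>
  <title>Handmade Oak Furniture | Custom Tables and Chairs</title>
  <meta name="description" content="Handmade oak furniture built to order: tables, chairs and cabinets.">
  <meta name="keywords" content="handmade oak furniture, custom oak tables">
</head>
<body>
  <h1>Handmade Oak Furniture</h1>
  <p>Our handmade oak furniture is crafted from sustainably sourced timber...</p>
  <!-- The link text carries the phrase as well -->
  <a href="catalog.html">Browse our handmade oak furniture catalog</a>
</body>
</html>
```

Per steps 2 and 3, each page of the site would repeat this pattern with its own unique phrase and its own unique title and META tags.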
What about search engine research? Isn't it important? It's crucial.
Let me give you an example. At the beginning of this year, pages began falling out of Google's index. The forums were alive with speculation about what to do. Through research, we determined this was a compliance issue: with compliant code, the search engine spiders are more easily able to crawl the content. The solution? Make sure you use a DOCTYPE declaration and a character set statement at the top of every Web page.
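For illustration, here is a hedged sketch of what such a page prologue looked like in 2005; the DOCTYPE and charset shown are one common choice of the era, not the only valid one:

```html
<!-- HTML 4.01 Transitional DOCTYPE plus an ISO-8859-1 character set statement -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
    "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
  <title>Example page</title>
</head>
<body>
  <p>Page content here.</p>
</body>
</html>
```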
If you didn't know about the compliance issue, you could have made changes to your Web pages that didn't need to be made and wasted countless hours trying this or that, all to come up dry. Research helps make sure you remain on top of what's happening in the search engine industry. It's what sets you apart from other SEOs: you make your decisions based on research and facts, versus speculation and theory.
"Take it from someone who has been in this business for nine years and studies the algorithms closely - don't chase the algorithms. You say that you have a #2 ranking for a certain keyword phrase that alone is bringing your site 550 visitors per day? Great. In the time that you have spent gaining that ranking, I have written 285 pages of unique content, obtained 821 links, etc., and collectively I am getting over 1,300 visitors per day," says Jerry West of WebMarketingNow.
In other words, by focusing on more than just chasing algorithms, you have the potential of having a much more successful Web site.
About The Author
Robin Nobles conducts live SEO workshops in locations across North America. She also teaches online SEO training and offers the Workshop Resource Center, a networking community for SEOs. Localized SEO training is being offered through the Search Engine Academy. Copyright 2005 Robin Nobles. All rights reserved.
POST NOTES FROM VISIONEFX
Also See Google Sitemaps
Visionefx is partnered with Hostmysite, a hosting company that provides a free Google sitemap generator as part of all its web hosting packages. It is easy to use and takes the guesswork out of manually configuring a Google Sitemap file.
More about Hostmysite Google Sitemap Auto-Generator!
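For reference, a Google Sitemap file is just an XML list of URLs. A minimal hand-written sketch, using a made-up domain and the sitemap schema Google published in 2005:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/services.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

A generator like Hostmysite's simply crawls your site and writes out one url entry per page, so you don't have to maintain the file by hand.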
Saturday, October 08, 2005
Published: 14 August 2003 in Design. By: Nick Tatt
1. Beat the competition
The power of the web allows people to find information at the drop of a hat. It is possible for new customers to find your web site from anywhere in the world. The down side is that they can also find your competitors just as easily. It is important to make a good first impression and stay ahead of the competition; failure to do so could lose you valuable customers. If your web site is merely comparable to your competitors’, consider a timely redesign to make sure you are leading the pack rather than following.
2. Present your organisation’s current market position
Organisations need to evolve to ensure they can deliver what customers need today. If your web site was designed a couple of years ago and has not been updated since, it is possible that it does not reflect your organisation’s current market position. A web site redesign is a great opportunity to evaluate where you are today and where you want to be in the future.
3. Out with the old, in with the new
Websites date. It is an unfortunate fact that websites show their age if left unattended. Considerable damage can be done to your reputation if customers discover that information or products on your web site are out of date or, worse still, incorrect. If your website is out of date, consider using a content management system, appropriate for the job, to keep it fresh.
4. Self service
The beauty of the web is its immediacy.
You don’t have to wait for the current batch of printed brochures to run out. With a website a quick update can get the latest product information out to a global audience. The theory is great but in practice many websites forget about their content and it soon becomes out of date. There are a few reasons why this might be but often it is because the website can only be updated by one person, probably the original designer. If you struggle to keep your website content fresh it might be time to consider a redesign, allowing people within your organisation to keep it up to date.
5. Make the site more usable and give the client what they want
How many times have you visited a site and not been able to find what you needed, or tried to buy something online only to find that it is out of stock? Problems like these can be avoided with some thought about what your customers are trying to do on your website. Resolving the problem might be as simple as improving the signposts to key sections or pages, but a redesign allows you to listen to your customers and create something that will be easier to use.
6. Reach a wider audience
Just because you have a website doesn’t mean people will automatically find it. Competition for the top slot in search results is fierce. Making your website ‘search engine friendly’ will improve the chances of it being found. Building a ‘search engine friendly’ website that conforms to web standards from scratch is more effective than trying to adapt an existing one. In doing so you can ensure that it appeals to customers and search engines alike.
7. Increase your sales
The Holy Grail for all websites. Websites are often designed with no thought of how organisations can harness the power of the web effectively. The wrong message in the wrong place can result in a website failing to meet your organisation’s expectations. Developing a web presence is relatively straightforward, but developing one that meets your organisation’s expectations and goals is a little more complex.
Are visitors being channelled to the right sections of the site to make a purchase? Does the design promote your current brand? Are people signing up to your newsletter? Is the copy right for the site? If you suspect they may not be, then it is time to get your website to work a little harder for your organisation.
8. Create appropriate content for the web
Web content is different to print content. With the immediacy of the internet it is not appropriate to use the same copy as your print material. When reading web pages users scan for information relevant to them. If your content is not presented in a way that delivers that information swiftly you run the risk of losing customers to your competition.
9. Promote an event or product launch
The launch of a new product or event might be the catalyst to consider a website redesign. If your current website fails to do an event or product launch justice you could damage its success. A website tailored to the needs of the event/product launch is far more effective than trying to shoehorn it into your website.
10. Communicate with your customers
What better way to raise your organisation’s profile than by announcing a new, improved website delivering what your customers want in a clear, usable fashion? The launch of a new website is a great excuse to contact your customers and strengthen your relationship with them.
Friday, October 07, 2005
By Dave Davies
From reading the title, many of you are probably wondering what W3C compliance has to do with SEO, and many more are probably wondering what W3C compliance is at all. Let's begin by shedding some light on the latter.
What Is W3C Compliance?
The W3C is the World Wide Web Consortium. Since 1994, the W3C has provided the guidelines by which websites and web pages should be structured and created. The rules it outlines are based on best practices, and while websites don't have to comply to be viewed correctly in Internet Explorer and other popular browsers that cater to incorrect design practices, there are a number of compelling reasons to ensure that you or your designer follow the W3C guidelines and bring your site into compliance.
* Compliance helps ensure accessibility for the disabled.
* Compliance helps ensure that your website is accessible from a number of devices, from different browsers to the growing number of surfers using PDAs and cellphones.
* Compliance also helps ensure that regardless of the browser, resolution, or device, your website will look and function in the same, or at least a very similar, fashion.
At this point you may be saying, "Well, that's all well and good, but what does this have to do with SEO?" Good question.
We at Beanstalk have seen many examples of sites performing better after we brought them, or even just their homepages, into compliance with W3C standards. While discussing this with Frederick, he explained it very well:
"Proper use of standards and bleeding edge best practices makes sure that not only is the copy marked up in a semantic fashion which search engines can interpret and weigh without confusion, it also skews the content-to-code ratio in the direction where it needs to be while forcing all of the information in the page to be made accessible, thus favoring the content.
We've seen several occasions where the rebuilding of a site with standards, semantics and our proprietary white hat techniques improves the performance of pages site-wide in the SERPs."
Essentially, what he is stating is a fairly logical conclusion: reduce the amount of code on your page, and the content (you know, the place where your keywords are) takes a higher priority. Additionally, compliance will, by necessity, make your site easily spidered and give you greater control over which portions of your content are given more weight by the search engines.
The Beanstalk and W3 EDGE sites themselves serve as good examples of sites that performed better after complying with W3C standards. With no other changes than those required to bring our site into compliance, the Beanstalk site saw instant increases. The biggest jumps were on Yahoo!, with lesser though still significant increases noticed on Google as well.
As we don't give out client URLs, I can't list client sites where we've noticed the same effect, but we can use W3 EDGE as another example of a site whose rankings increased based solely on compliance.
So How Do I Bring My Site In Compliance With W3C Standards?
To be sure, this is easier said than done. Obviously the ideal solution is to have your site designed in compliance to begin with. If you already have a website, you have one of two options:
1. Hire a designer familiar with W3C standards and have your site redone, or
2. Prepare yourself for a big learning curve and a bit of frustration (though well worth both).
Assuming that you've decided to do the work yourself, there are a number of great resources out there. By far the best I've found in my travels is the Web Developer extension for Firefox. You'll have to install the Firefox browser first and then install the extension. Among other great tools for SEO, this extension provides a one-click check for compliance and a list of where your errors are, what's causing them, and links to solutions right from the W3C. The extension provides testing for HTML, XHTML, CSS and Accessibility compliance.
Other resources you'll definitely want to check into are:
CSS Zen Garden ~ A List Apart ~ Holy CSS ZeldMan!
(Frederick lists this one as one of the best resources for the novice to find answers. I have to agree.)
Where Do I Get Started?
The first place to start would be to download FireFox (count this as reason #47 to do so as it's a great browser) and install the Web Developer extension. This will give you easy access to testing tools. The next step is to bookmark the resources above.
Once you've done these you'd do well to run the tests on your own site while at the same time keeping up an example site that already complies so you can look at their code if need be.
To give you a less frustrating start I would recommend beginning with your CSS validation. Generally CSS validation is easier and faster than the other forms. In my humble opinion, it's always best to start with something you'll be able to accomplish quickly to reinforce that you can in fact do it.
After CSS, you'll need to move on to HTML or XHTML validation. Be prepared to set aside a couple of hours if you're a novice with a standard site, and more if you have a large site, of course.
Once you have your CSS and HTML/XHTML validated, it's time to comply with Accessibility standards. What you will be doing is cleaning up a ton of your code and moving a lot of it into CSS, which means you'll be further adding to your style sheet. If you're not comfortable with CSS, you'll want to revisit the resources above. CSS is not a big mystery, though it can be challenging in the beginning. As a pleasant by-product, you are sure to find a number of interesting effects and formats that are possible with CSS that you didn't even know were so easily added to your site.
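As a small sketch of what "moving presentation into CSS" means in practice (the element, class name and colors here are made up for illustration), compare old-style presentational markup with its standards-based equivalent:

```html
<!-- Before: presentational markup repeated on every page -->
<font face="Arial" size="4" color="#336699"><b>Our Services</b></font>

<!-- After: semantic markup; the styling lives in the style sheet -->
<h2 class="section-title">Our Services</h2>

<style type="text/css">
  /* Rule moved into the site-wide style sheet */
  .section-title { font-family: Arial, sans-serif; font-size: 1.2em; color: #336699; }
</style>
```

The semantic version tells spiders this text is a heading, and the page carries less code per unit of content, which is exactly the content-to-code shift described above.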
But What Do I Get From All This?
Once you're done, you'll be left with a compliant site that not only will be available on a much larger number of browsers (increasingly important as browsers such as Firefox gain more users), but will also have far less code and will rank higher on the search engines because of it.
To be sure, W3C validation is not the "magic bullet" to top rankings; in the current SEO world, there is no one thing that is. However, as more and more websites are created and competition for top positioning gets fiercer, it's important to take every advantage you can, not only to get to the first page, but to hold your position against those who want to take it from you as you took it from someone else.
About The Author
Dave Davies is the CEO of Beanstalk Search Engine Positioning, Inc. He writes with years of experience in SEO and Internet marketing. A special thanks goes out to Frederick Townes of W3 EDGE for his help with this article. W3 EDGE provides W3C-compliant web site design for its clients. To keep updated on new SEO articles and news, be sure to visit the Beanstalk blog regularly.
POST NOTE from VISIONEFX :
If you validate your web site (or your client's web site), display the W3C badge of honor! Link to the W3C validation page and display their logo on your site or your client's site!
I did this on my company web site, VISIONEFX.
Did it help my business and SEO? You bet!
Wednesday, October 05, 2005
By Phil Craven (c) 2005 WebWorkShop
The original concept of DMOZ was excellent for its time. The DMOZ site's About page makes these statements about the concept, and about the reasons for the directory's creation: "Automated search engines are increasingly unable to turn up useful results to search queries. The small paid editorial staffs at commercial directory sites can't keep up with submissions, and the quality and comprehensiveness of their directories has suffered. Link rot is setting in and they can't keep pace with the growth of the Internet."
"The Open Directory follows in the footsteps of some of the most important editor/contributor projects of the 20th century. Just as the Oxford English Dictionary became the definitive word on words through the efforts of volunteers, the Open Directory follows in its footsteps to become the definitive catalog of the Web."
But things have changed a lot since DMOZ began in the late 1990s. Since then, Google came along with very relevant search results, and showed the other engines how to produce such relevant results. That caused dramatic improvements, to the extent that the top search engines have been able to provide very relevant results for some time, and they provide a lot more of them than DMOZ is able to do.
The small paid editorial staffs at commercial directory sites still can't keep up with submissions, but their backlogs are small when compared with DMOZ's massive backlog. According to reports, there are over a million site submissions that are waiting to be reviewed, and delays of several years between submitting a site and it being reviewed are not uncommon. The backlog problem is so huge that many editors have redefined the problem so that it no longer exists. To them there is no backlog, because the submitted sites are not there to be reviewed. They are merely a low priority pool of sites that they can dip into if they want to, and some of them prefer to find sites on their own.
Link rot (dead links) has become widespread in DMOZ through the years, and they certainly can't "keep pace with the growth of the Web". There isn't a single reason for the creation of DMOZ that DMOZ itself doesn't now suffer from. So how did such an excellent original concept end up as a directory with the same problems it sought to solve, and on a much larger scale?
One reason is that the Web has grown at a much faster pace than was perhaps anticipated, and the DMOZ editors simply can't keep up. Another reason is that there are simply not enough editors who are adding sites to the directory. At the time of writing, the DMOZ front page boasts 69,412 editors, but that is the number of editors that they've had since the beginning, and most of them are no longer there.
A recent report stated that there are currently about 10,000 editors who are able to edit, and that only around 3,000 of those are active in building the directory. The word "active" describes editors who edit fairly often, but as little as one edit every few months is acceptable; the word doesn't mean "busy", although some of them are. With so few people doing anything, it isn't even possible for them to keep up with the link rot in such a huge directory, and there's the ever-increasing problem of listings that link to topics other than what they were listed for. It simply isn't possible for them to maintain the directory as they would like.
The idea of becoming "the definitive catalog of the Web" was a fine one, but it turned out to be an impossible dream. The purpose of DMOZ is dead. Today's search engines produce excellent results in large quantities, and much more quickly than drilling down into a directory to find something.
So is there any value at all in the DMOZ directory? As a useful catalog of the Web, and when compared with the major search engines, the answer is no, although a few people do find it to be a useful research resource. For website owners, the links to their websites that a listing in DMOZ creates are useful for search engine ranking purposes, but even those are becoming less useful as search engines improve and seek to block out unwanted duplicate content from their results.
It was a fine concept, and it looked promising for a while, but the idea of DMOZ becoming the definitive catalog of the Web is gone. Improvements in the search engines eclipsed its value, and the growth rate of the Web meant that it could never achieve its goal. It began with an excellent concept, and they gave it a good shot, but it didn't work, and the continuing growth of the Web ensures that it never can. It continues as a good directory of a large number of web sites, but that is all. And not many people use directories when the search engines produce such good results, and so quickly.
About The Author
Article by Phil Craven of WebWorkShop. Phil is well-known in the world of webmasters and search engine optimization and his views have been sought and published by various online and offline publications.
Tuesday, October 04, 2005
By David Risley
We've all heard it before: content is king. And it is true. If you own a site, you need to post something interesting that people want to read before you can expect people to stop by. If your site is a content-based website, then you've already taken a huge step.
However, if your website is a business website whose only purpose is to talk about your services, then you really should make an effort to post some content that is helpful to readers, free, and relevant to your services. If you do this, your site will attract traffic from people looking for information, not just people looking to purchase something. And with increased traffic in general, you will get increased attention, and your traffic statistics will climb.
Writing content for your own website is only half the battle, though. You have to get people to read it. Just posting a website is not going to bring people to it; it would be like opening a business in the middle of the mountains. Nobody knows it's there, and you won't get any customers. If you get your articles out there for people to read, and the articles are written correctly, you can position yourself as an expert in your field and promote your own website. One way to do this is by publishing on content hubs rather than limiting your articles to your own website.
A content hub is a site which publishes articles on all topics (usually categorized). Those articles are freely available to anyone to use on their own website, newsletter, blog, etc. So, many publishers or site owners in need of fresh content for their website can go to one or more of these content hubs, find an article they like, and use it. They have to maintain proper credit to the author and publish the small author bio which accompanies the article.
Let's look at this, though, from the author's viewpoint - your viewpoint. Let's say you are selling consulting services for search engine optimization. You have a site for your services, but you blend in with all the other such services. So, you write a series of articles giving tips to webmasters on how they can optimize their website. With your article you include a short bio of yourself.
You include a mention of your services and a link to your website, and you publish your article on a bunch of content hubs. Other websites, newsletters and blogs grab your article off those sites and use it on their own, and your article therefore spreads throughout the internet. Because your site is linked with the article, and is therefore on all of these other websites now (including the content hubs themselves), the search engines that are constantly spidering the internet pick up on your article and index it in association with your website.
This, in turn, raises your ranking in the search engines. And you get increased traffic to your website not only from search engine searches but also from your article.
Now, let's say you have done some research on keywords and you interlace your article with certain keywords. When the search engines spider your article all over the internet and associate it with your website, it will raise your search engine rankings even more. There is a real science to this and, if done correctly, it can drastically raise your internet presence in a short time. I recently had a meeting with the CEO of In Touch Media Group, a Clearwater, FL-based company in the business of internet marketing.
They use content hubs as part of their strategy for clients and they couple this with their vast archived data regarding keywords. They showed me the stats of one site which they have, in the course of just a few months, taken from essentially no traffic to a VERY respectable level of traffic. After getting an article out in the content hubs, they will follow up a few weeks later with a press release.
So, how can you publish some of your articles on content hubs? Well, the first step is to find and visit them. There are many of them out there, but below are some of the better ones:
SubmitYourArticle.com - a service to send your article to a bunch of hubs at once
There are services to help you distribute to a large collection of publishers at once. I have used Isnare's distribution service and it seems to work well. There are also distribution groups on Yahoo.
Here are a few of them:
Free Reprint Articles
With that, I wish you the best of luck in your promotion efforts. Start writing!
About The Author
David Risley is a web developer and founder of PC Media, Inc. He specializes in PHP/MySQL development, consulting and internet business management. He is also the founder of PC Mechanic, a large website delivering do-it-yourself computer information to thousands of users every day.
Saturday, October 01, 2005
It's no good having a creative, individual website with brilliant, informative copy if customers can't find you on the internet. On the other hand, it's also detrimental if you have a website that can be easily found (has a high ranking) but people become bored and alienated reading it. Producing effective online copywriting is a creative process blending art and science in a balanced technique combining many different elements. This integration of disciplines is required to satisfy both the technical and the aesthetic objectives of a website.
Optimized online copywriting should ensure that your website is:
• highly readable to your viewers
• highly visible to the search engines, and thereby
• commercially successful for you.
Many people and businesses don't have the time to actually write web copy themselves. A professional freelance copywriter can furnish you with keyword-rich, highly original web content to enhance and improve the quality of your website with the aim of transforming more of your visitors into customers.
Rarely will you get a second chance to engage your customer's attention, so your first shot must be formatted for maximum sales potential, catching the eye of the search engine robots as well. But not too much: if your copy goes overboard in favor of the search engines, it can earn you a penalty from Google that will negatively affect your rankings. Your website must always have the reader as its priority, which makes more business sense anyway.
Search engines provide a way for potential customers to find you on the internet. People type a key phrase or keyword into a search engine, such as Google, Yahoo or MSN (or one of the many other popular engines) and this returns a page of listings - web page suggestions for that particular phrase or word. Obviously, you want your website to feature highly in this list.
Optimized online copywriting specifically targets the words and phrases people are using when searching for a product on the internet (Search Engine Marketing (SEM) and keyword research). You want to make sure your website stays at the top of the listings so people go to your website before others. With targeted copy in place, search engines are more likely to index your web site on page one than if it does not include keyword-rich copy. This is an ever more important issue when dealing with Google, the leading search engine today.
To rank highly in the search engines the words on your web pages should never be an afterthought, but should be included right at the beginning in the original design of your website. Content development is the most valuable asset web developers can utilize in the bid for productive, successful search engine optimization and Search Engine Marketing (SEM).
Hiring a professional copywriter is a wise investment in your business future. Even if you don’t want to optimize your site you should make sure that the words on your site are reasonable, enticing, spelled correctly and artfully arranged to engage attention. Just because you can type letters or write some emails doesn't mean you can write the copy for your website. The writing on your homepage is often how people determine whether the website is a scam or the genuine article. Your website’s credibility takes a nose-dive if the spelling is wrong, or the grammar is incorrect, or it just reads like bad, clumsy English. People will be disinclined to trust your content.
Within the search engines new technologies and algorithms are being developed all the time to make search methodologies smarter, more astute. It's never a coincidence when someone types in a search phrase and your website is indexed highly on the page. Keyword rich online copywriting is a significant and critical component in gaining high rankings on the search engines.
Google has been pioneering a new breed of intelligent search engines which are not impressed by mere repetition of words throughout the text, but which look for meaning, attempting to make grammatical sense of the information and to understand what the web page is actually saying. This is forcing webmasters to improve the content on their web pages or suffer the consequences.
The old saying has never been more relevant: 'content is king.'