Tuesday, September 30, 2008

Finding Targeted Keyword Phrases Your Competitors Miss

Author: Daria Goetsch

Finding keyword phrases your competition is missing is easier than you might think. Competitors vying for the top competitive terms often overlook combinations of two and three words. That missed opportunity gives you a chance to overtake them in the search engine rankings.

Think Like A Searcher - Study Your Target Audience

Really look at the audience you want to bring to your website. Are there terms you might not ordinarily use that would work for a portion of your visitors? Remember that single words tend to be more competitive. Find two- and three-word phrases that would work for a searcher looking for your website topic. If your visitors usually search on "vertical widgets", look at "horizontal widgets" as well. Dig deep to find terms that might not be obvious to you. Be sure to focus on terms that match the actual topic of your website and that people would really search for. Have another person compile a list of keyword phrases they would use to find your website or product. You'd be surprised at how many more variations two minds can come up with than one. Think like a searcher - not a website owner.

View Your Competitor's Source Code And Content For Keyword Phrases

Viewing your competitor's source code is very easy and a good way to see what keyword phrases (if any) they are using. Using your browser, view the source code of their page. If the competitor's website is optimized, the title and meta tags should contain the same keywords or variations of those keyword phrases. Look over the web page content as well for keyword phrases worked into the text, image alt text, headings and hyperlinks of the pages. If their pages are not optimized, you may gain an even bigger edge on the competition by optimizing your own web pages.
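As a rough illustration, this is the kind of head section and markup you might find when viewing the source of a well-optimized competitor page. The domain, phrases and file names here are invented for the example:

<html>
<head>
<!-- Hypothetical competitor page: title and meta tags repeat the targeted phrases -->
<title>Left-Handed Blue Widgets - Widget World</title>
<meta name="description" content="Left-handed blue widgets in every size, with specifications and ordering information.">
<meta name="keywords" content="left-handed blue widgets, blue widgets, widget specifications">
</head>
<body>
<h1>Left-Handed Blue Widgets</h1>
<img src="blue-widget.jpg" alt="left-handed blue widget">
<p>... page copy working in the same keyword phrases ...</p>
</body>
</html>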

Using Keyword Tools To Find Variations Of Keyword Phrases

The Overture Suggestion Tool will provide keyword variations. You can find the tool at http://www.content.overture.com/d/USm/ays/index.jhtml

Clicking on the suggestion tool link will bring up a window that allows you to search for terms and variations of terms. Begin with your list and see how many variations come up in the results. You might be surprised at the popularity of some of the search variations you see. Be sure to add your new keyword phrases to your list.

WordTracker is a keyword tool as well; you can purchase a yearly subscription or even a one-day subscription. Learn more about it here: http://www.wordtracker.com/

Search On Keyword Phrases In The Search Engines

Using your expanded list of keyword phrases, search for those terms in the search engine databases. Note the number of search engine results: the more results, typically the more competitive the term. Compare the number of results for plural versions of your keywords with the singular versions in each engine. Note the descriptions that the search engine results bring up - are there any keyword phrases there that might apply to your website? Don't forget the ads Google displays in its search results; study the ads that come up with your search terms as well. While you are searching on your keyword phrases, check your competitors' rankings, along with the new keyword phrase variations you come up with through the Overture Suggestion Tool and WordTracker.

Add Keywords Reflecting Your Local Cities And State

You can also target local areas by including them in your title/meta tags and in the text of your web pages. List only the cities and state you reside in and/or provide services to. You never know who will be looking for a local producer of "blue widgets" in your city or state, and some people prefer to work with a local company. Adding those kinds of specifics, even on your contact page with your local information, can pull in traffic your local competitors are missing.
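For instance, a contact page for a hypothetical widget maker in Springfield might work the locality into its tags and copy like this (the company, city and phrases are made up for the example):

<title>Blue Widgets in Springfield, Illinois - Acme Widget Co.</title>
<meta name="description" content="Acme Widget Co. makes custom blue widgets for customers in Springfield, Decatur and throughout Illinois.">
<!-- ...and in the visible page copy: -->
<p>Visit our Springfield, Illinois workshop, or call us for delivery anywhere in the state.</p>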

Check Your Site Statistics

Last but certainly not least, check your search engine stats program or raw server logs to see what terms your visitors are using to find your website. There may be combinations of words your visitors are using that you have not thought of, or that are not yet in the content of your pages.

Incorporate Keyword Phrases Into The Content Of Your Web Pages

Once you have your list of varied keyword phrases, work them into your web pages. Incorporating these terms should "make sense": the copy should read well and not sound "spammy". Most of all, the phrases should realistically be part of the content of the page, not placed there only because you need them in the content. Have another person read your copy to see if it sounds reasonable to them.

Keyword Variations Make A Difference

Don't miss out on the keywords your competitors might miss. Those extra keywords could translate into profits and increased traffic from visitors who might otherwise never find you.

# # #

Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O'Reilly & Associates, a technical book publishing company.

Copyright © 2003 Search Innovation Marketing. All Rights Reserved.

Permission to reprint this article is granted as long as all text above this line is included in its entirety. We would also appreciate your notifying us when you reprint it: please send a note to reprint@searchinnovation.com.

Sunday, September 21, 2008

Introduction to DIY search engine optimisation

Author: Rachael Sankey

A lot is made of the importance of search engine optimisation and rightly so. If you are serious about getting your website noticed, you will have to consider how your site ranks in search engines. If you decide to optimise your site, you will either do it yourself, or pay someone to do it for you. Doing it yourself can be time consuming but will save you money. If saving money is your priority and you have the time, DIY page optimisation IS possible.

You will want to look at three aspects:

Search Engine Submission - This involves visiting search engine websites, finding the 'submit URL / site' section and keying in the appropriate information. Some of them offer a free option, but this doesn't guarantee inclusion and it can take time to see results. The most important one is the Open Directory, which will help your ranking with Google, among others. You can also submit your site to Google, Lycos, Yahoo, AltaVista, MSN and many others. Before you do, make sure you know what information you want to give them. You may be asked for a description, so have one ready. No search engine will accept sales-style copy in their listings - so keep it plain. Most importantly, read the guidelines for each service - it may seem like a drag but it could help your submission.

Page Optimisation - Now the search engines know you are there, how easy will it be for them to index your site? This stage actually comes first, of course - so once you register your site, you can sit back and wait. If you do it for free, you WILL have to wait! The easier it is for the SEs to index your site, the faster it will get noticed. So what do you need to take into consideration? Here are a few tips:

Meta Tags and Description: These are tags at the top of your page (in the HTML code) which provide SEs with information about your site. Meta keyword tags list keywords and phrases targeted to the subject of your site - make sure yours are relevant to your site content. The Meta Description should reflect what your site is about.

Page Title: This shows at the top of a browser when your page is viewed. Make sure it's relevant, has a few good keywords - and is different on every page of your site.

Site content: This should go without saying - make your site content relevant, useful and accurate, and reflect this in your keywords and title.

Links: Any internal links should have relevant link text, or Alt-text tags if they are graphics.

Page Rank & Back Links: This relates mainly to Google - possibly the most important search engine on the 'net. Once indexed, Google assigns each page a 'page rank'. You can see what page rank a page has by downloading the Google toolbar. PR is given as a rank of 0 - 10, 10 being high. Page rank relates to the number of links (votes) Google can detect that point to your site. You can right click (IE6) on a website and select 'backward links' to see who links to it. As a result, many sites now engage in the sport of link building. This can have its drawbacks if you end up linking to a 'bad neighbourhood'. If you want to build links, follow a few guidelines:

Don't swap links with 'link farms' - or sites that may have a penalty*. All links aren't equal: the value of a link decreases as the number of links on the page increases, and the Page Rank of a site will determine how valuable its link is to your site. When submitting your link to another site, try to get them to use meaningful link text with relevant keywords. Ensure that links to and from your site are relevant to your site content.

*A penalty can be indicated by a grey Page Rank bar or a white 'PR-0' bar. However, a grey bar can also indicate that a site has not been indexed, and a white bar can indicate a site has been indexed only recently - or has no recognised back links. Penalties are incurred when a site breaks the rules set by Google - SEO crimes include:

Cloaking - a page is made to grab SE attention and then redirect to the URL of the site in question.
Keyword stuffing - keywords are 'stuffed' into image tags and other places where they shouldn't be, or aren't relevant.
Hidden text - text on a page that matches the background colour, so it's invisible to site visitors but not to search engines.
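For illustration, here is roughly what two of these 'crimes' look like in markup - shown only so you can recognise and avoid them; the phrases and file names are invented:

<!-- Keyword stuffing: an alt attribute crammed with repeated, irrelevant phrases -->
<img src="logo.gif" alt="cheap widgets best widgets buy widgets widgets widgets free widgets">

<!-- Hidden text: text coloured to match a white background, invisible to visitors -->
<body bgcolor="#FFFFFF">
<font color="#FFFFFF">widgets cheap widgets discount widgets widget sale</font>
</body>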

There are many resources on the 'net to help you optimise your site safely. Search Engine Ethos has resources and links to get you started.

About the author: Rachael is a web designer who has just started her own company - Moneytooth Web Services.

Saturday, September 20, 2008

Steal Traffic From Your Competitors

Author: Terence Tan

This interesting idea is likely to drive your competitors nuts. It is published purely for its entertainment value.

Some might consider it unethical and even illegal, so USE AT YOUR OWN RISK. (We take no responsibility for any conflict, legal or otherwise, that you may get into.)

Basically, the approach is to choose your most successful competing web sites, especially those that get top 10 listings in search engines, and to create doorway pages using their web site addresses and names as keywords. So when people who have previously visited them, but cannot remember their exact domain name, use search engines to search for them, your page gets displayed beside theirs and diverts traffic away from them. To avoid breaking the law, some people suggest avoiding trademark names and not hogging the top spot in the search listings. It does seem like a very clever, fun idea, but remember:

USE AT YOUR OWN RISK!

About the author: Terence Tan is the founder of HugeAffiliates.com, a website dedicated to the development of Multi Level Affiliate Programs as an alternative system of business. Visit http://mlap.net to learn how MLAPs can multiply your affiliate referral commissions.

Friday, September 19, 2008

Link Building For Top Search Engine Placement - StepForth Search Engine Placement

Author: Dave Davies

For many, the idea of optimizing a website for top search engine placement means entering some META tags, maybe titling the page appropriately, and then you're done. A long time ago, in an SEO galaxy far FAR away, this tactic worked. Unfortunately for those optimizing their websites, and fortunately for those using search engines to find information, this is no longer the case.

There are now some 80+ factors taken into consideration when determining the ranking of your website. Everything from titles and META tags to content and ALT tags is weighed and analyzed when your placement on the search engines is determined. In a recent article, Ross Dunn, CEO of StepForth Search Engine Placement, addressed the fundamentals of optimizing your web pages. The article, entitled "A Ten Minute Search Engine Optimization", can be found on the StepForth website at http://news.stepforth.com/2003-news en-minute-optimization.shtml.

This article addresses many of the internal factors taken into account in determining your ranking. Another factor which has to be taken very seriously is the external links to your website. Links to your site are not the most important factor in determining your ranking - you will still have to have a well-optimized site to rank well. However, when all else is equal (i.e. when your competitors also have well-optimized sites), links can be the determining factor between being found and being buried in the search engine rankings.

Links That Work

The first consideration you have to make in your link-building efforts is who should be linking to you and whom you should link to. These are two separate considerations, and despite the fact that you will be working on both at the same time, they must be considered independently.

Who Should Link To You? (Incoming Links)

When you are looking for sites to link to you, there are five questions that you must ask yourself:

1) Do they compete with you? While you can try to request a link from a site that provides the same or similar products and services that you do, this is generally a waste of time that could be spent finding legitimate links from sites that would like to promote your product or service.

2) Does their site relate to your content? If you have a site promoting carpet cleaning products, a link from a hair salon will not be of much benefit. Google and the other major search engines look for content relationship when determining the value of a link. If the content of the two sites is totally unrelated, the link is given very little weight, if any. Focus only on attaining links from sites relevant to your own.

3) How does Google rate the site? Google has come out with a fantastic tool called the Google Toolbar. The advanced version of the toolbar includes the PageRank of the site you are currently visiting. Without getting into a long description of PageRank (see Google's definition), the higher the number the better (it is a ranking out of 10, where traditionally anything above 4 is good and anything above 6 is excellent). If Google rates the site well, then the link will be more valued than one from a site that Google rates poorly. When looking for links, give more time and attention to those with PageRanks of 4 or higher. The Google Toolbar is a free download available from Google at http://toolbar.google.com/.

4) Will they require a reciprocal link? Whether the site will require a reciprocal link or not is a serious consideration. The more non-reciprocated links to your site you have, the better; these links are given added weight. This area is addressed further below.

5) How many links are on the page? How many links are on the page that will link to you, and where your link will be placed, are other serious considerations. If your link from their site will be on a page with 100 other links, then the value of the link itself is greatly reduced. Whether your link sits at the top of the page or the bottom will also affect its value.

This may be a lot to consider; however, it can save you enormous amounts of time and frustration. People will often work for hours to attain a link from a site they like when in reality the site has a low PageRank and the link won't carry much weight as far as search engine placement is concerned.

It is only responsible to note that, as a general rule, any relevant inbound link will help somewhat. If, in your travels, you find a related site with a PageRank of 2 that is very simple to get a link from, it's well worth your time to do so, given that the time taken is only about 5 minutes. Not all link building is this simple, and it's in the more advanced efforts (email communications with the webmaster, for example) that you will want to apply the above-noted "rules".

Who Should You Link To? (Outgoing Links)

The question "who should you link to?" is a very serious one and can have significant repercussions on your search engine placement. When you link to a site, you are effectively saying, "This site is highly relevant to mine, and my visitors will enjoy the content on it." For this reason there are a number of considerations to make when determining whether reciprocal links are in your best interest. Factors to consider when deciding whether to link to a website are:

1) Is the site's content related to yours? Like incoming links (sites linking to you), the relevancy of the content on both sites should be high. If you have a number of links from your site to websites that are completely unrelated to yours, the value of these links is negligible and, further, they will reduce the perceived value of your site.

2) Does the site compete with you? In this case it is your interests, not those of the other webmaster, which must be taken into account. Do you want to link to a site that provides the same or similar products/services as you? Unless the site is willing to reciprocate the link and has a very high PageRank, it is probably not wise to give your visitors the opportunity to go to the site of a competitor.

3) What is their PageRank? Many people falsely believe that any outbound link will hurt your placement. This is simply not the case; poor link-building is the cause of this misconception, not the link itself. When you are determining whether to link to another site, take a look at the PageRank it has been assigned by Google. Just as links from well-regarded sites boost your incoming-link value, outbound links to highly regarded, relevant sites can also have a positive effect. If all of your outbound links are to sites the search engines regard highly, and whose content is relevant to yours, then these links will help, not hinder, your rankings.

Finding The Links

Since you're looking for links to boost your search engine placement, the best place to start is… the search engines. A few searches should produce hundreds of potential links. There are a few tactics that work better than others. The first tactic provides the best links in terms of relevancy and PageRank. The second provides the best results for getting many links quickly and easily.

Getting High Quality Links – The easiest way to get high quality links that will be well regarded by Google and the other search engines is to perform a search on the major search engines for your targeted keyword phrases. The supplied results will provide you with a list of those sites that the engine rates as the top sites for that phrase. If the engines believe a site to be of value for searchers looking for a particular phrase, then likewise they will view it as a valuable link to your site, which obviously deals with the same subject.

You don't have to stick to your main targeted keyword phrase either. In this stage of link building you can run searches on all the keyword phrases that you are targeting and request that they link to your site. You will have to obey the above-noted guidelines and this will mean that there will be many sites you will have to skip, as they are competitors of yours.

Getting Many Links – Getting many links is not as difficult as getting high-quality links. Some of the same rules apply here. You will want the site to be related to yours, you will want it to be well-regarded by the search engines, and you will want it to be easy to submit to. To accomplish this, the easiest way is to once again turn to the search engines. This time however, the search will be a little bit different.

Rather than typing in just the keyword phrase you are targeting, type in the keyword phrase followed by the words "submit" or "add url". What this will give you is a listing of sites related to your keywords, but with an added bonus: a submission page. Sites that advertise their submissions are traditionally easier to submit to (i.e. they probably have a simple form to fill out rather than you having to email webmasters, etc.).

You'll be surprised at how many of these sites will link to you without the need for a reciprocal link. If the form is easy then submit to it. If the form will require significant efforts to fill out (requiring information you don't have on hand for example) or if they require a reciprocal link you will have to use the above-noted guidelines to determine if the effort is worth your time and/or outbound link.

Build Quality – And They Will Link

Why would anyone link to your website without requiring a reciprocal link? What benefit do they possibly get out of this? The answers to these questions depend greatly on the website, its design, and the content it carries.

The most significant factor that will affect your ability to attain incoming links to your website is the quality of the site itself. If you have a well-designed website that contains a significant amount of useful content it will be much easier to get other webmasters to link to you as your site is a valuable resource. If, however, your site is poorly designed and/or does not contain any useful information then you have provided nothing that the other site would need to link to, and thus, probably won't.

If you have a website on Tea Tree Oil, for example, and in it you provide a great deal of information on the oil, its benefits, and its medicinal uses, without cluttering it with a glaring sales pitch, you stand a very good chance of attaining links from other sites, as the content you have provided will be useful to their visitors.

An important thing to remember is this: If you want people to link to you without having to link to them you have to provide valuable information for their visitors and present that information in an attractive format.

Where To Start

The easiest place to start, when building non-reciprocating incoming links, is the directories. There are thousands of directories out there focused on a variety of different fields. Find the directories related to your industry and submit your site to them.

After you have submitted to all the directories related to your website it's time to move on to other sites. Now you will have to apply the rules noted above and determine how much time each link is worth and how to allot your valuable time in attaining them.

Best Practices For Outbound Links

There are a few considerations you will want to make in regards to how you organize the outbound links from your website. The most important thing to do is to create a "Resources Page". You should call it a "Resources Page" or something similar, rather than a "Links Page", for both search engine considerations and for your visitors.

Placing the majority of your outbound links on one page will avoid inadvertently affecting the optimization and search engine considerations taken with the rest of your website, and it gives you a place to put new links as they come in the future.

Each outbound link should look something like the following example, linked from an adventure tour web site:

Tea Tree Oil Exposed
Everything you wanted to know about Tea Tree Oil! From its history to its many uses, Tea Tree Oil is a requirement for any home first aid kit.

Each link should have descriptive text within it (not something ambiguous like 'click here'), and there should be a quality description of the web site below the link. If you don't know what to include as the description, just ask the site owner; they are often very pleased that you are putting so much care into the reciprocal link.

Something you will also want to do is have the outbound links open in a new window. It's surprising how many websites don't do this. If you can keep a visitor in your site, even if your site is now in a browser beneath the one being looked at, you stand a higher chance that the visitor will return than if they have completely left your site and you're now relying on them to go back.
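A minimal way to do this in plain HTML is the target attribute; the URL and link text below are placeholders:

<!-- Opens the outbound link in a new browser window, leaving your site open beneath it -->
<a href="http://www.example.com/" target="_blank">Example Resource</a>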

Conclusion

With these practices put in place, your link-building efforts, while time-consuming, will be well worth the effort. As mentioned above, however, link building, like META tags, is not the be-all and end-all of attaining top search engine placement. First you will have to build a marketable and optimized web site that provides your visitors valuable content for the search terms they are entering. Link building is the icing; without the cake it amounts to nothing.

About the author: Dave Davies is the Marketing Manager for StepForth Search Engine Placement and a knowledgeable search engine optimization expert with many years and sites to his credit.

Thursday, September 18, 2008

Optimizing Dynamic Pages - Part I

Author: Dale Goetsch

The Widget Queen

You are the Widget Queen. You eat, breathe, and live widgets. You sell more widgets than anyone. You want to reach more widget customers, so you have decided to sell widgets on the web. You have spared no expense in designing and building the ultimate widget website. You have widget descriptions; you have widget specifications; you even have widget movies. The only thing your widget website does not have is visitors.

Off to the search engines you go. You type in the phrase "left-handed blue widgets" and look at the results. All of your major competitors are listed. There are even competitors you have never heard of. But you, the Widget Queen, do not have a listing there.

What's up with that? What follows is some very basic introductory material followed by some advanced technical details on dynamic sites and SEO.

What is a search engine?

First of all, you need to understand what a search engine actually searches. When a potential visitor does a search in a search engine, such as Google or AllTheWeb/FAST, she is not really searching the web; rather, she is looking at a database compiled by that search engine. This database consists of the text and links from the web pages that have been visited by the search engine's robot.

How is a search engine database compiled?

Search engines compile these databases automatically using software programs called "robots" or "spiders". These automatic programs visit pages on the World Wide Web, much as humans visit web pages using browsers, by starting at some arbitrary location and following links. When a website owner "submits" a page to a search engine, in most cases she is supplying the search engine's robot with a starting point for its automatic journey. Starting in that location, the robot then follows links and thus "discovers" other pages in your website, or visits other sites to which your site is linked. (This, by the way, is how search engines can find individual pages or whole sites that have never been submitted to them--if there is a link to one site from another site, chances are good that eventually a search engine robot is going to find that link and follow it.)

Even though robots visit pages like human visitors do, what they can do with what they "see" is quite different. When a human visitor uses a browser to view a web page, that visitor can read the text on the page, look at images, play movies, listen to sounds, submit information in forms, follow hyperlinks, and perform any number of other tasks. The human visitor really interacts with the site. The search engine robot, on the other hand, can only do a few of these things. It is this difference that can keep your dynamic page from being included in the search engine database.

What does a robot do?

Search engine robots are very simple creatures. They can "read" text, and they can follow links. That's it. Robots cannot view a Flash movie, they cannot fill in a form, and they cannot click a "submit" button. What that means is that no matter how much great information your web page may contain, if a visitor has to select it from a list, or type a password, or submit a form full of information to get there, no robot will ever visit that page.

The origins of dynamic pages

Most dynamic web pages are generated in response to queries run against databases. Behind your widget website there is a large database of widgets. When a visitor comes to your site and looks for left-handed blue widgets, it is this database that supplies the response. Typically the visitor checks a box, selects from a list, or even types text onto the page, and then presses a "submit" button. Once she jumps through those hoops, your visitor gets her page full of left-handed blue widgets.

I can't see you

Unfortunately, when a search engine robot visits this page, it cannot check that box, it cannot select from that list, and it cannot click the "submit" button. Put simply, the robot cannot get to the page of widgets. If the robot can't get there, the page will not be included in the search engine database. If it's not in the database, searchers cannot find it.

So how do you get there?

So how do we attract other visitors to our dynamic page of left-handed blue widgets? There must be some way to get there without having to click on that "submit" button.

Next month we will look at several ways to get search engine robots to visit dynamic web pages. Stay tuned.

About the author: Dale Goetsch is the Technical Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses and non-profits. He has over twelve years experience in software development. Along with programming in Perl, JavaScript, ASP and VB, he is a technical writer and editor, with an emphasis on making technical subjects accessible to non-technical readers.

Wednesday, September 17, 2008

Optimizing Dynamic Pages - Part II

Author: Dale Goetsch

The Widget Queen Revisited

You have the world's finest collection of widgets. You created the world's best widget website. You have no traffic.

You check in the search engines and find that your site does not appear at all, even though all your competitors' sites do. Perhaps the search engine robots cannot get to your pages to index them.

Search Engine Robots

Search engine robots are simple creatures. They can "read" text to add to their databases, and they can follow "normal" links--those links that are coded to look like

<a href="bluewidgets.html">blue widgets</a>

or the slight variation

<a href="http://www.mycompany.com/bluewidgets.html">blue widgets</a>

That's it. Search engine robots cannot select items from lists; search engine robots cannot type text into boxes; search engine robots cannot click "submit" buttons. That means that no matter how important our dynamically-generated page of blue widgets is, if the only way to access that page is to select it from a list or click on a button, the robot will never be able to visit it. That, in turn, means that it will never appear in the search engine results.

So how do you get your dynamic information to show up in non-dynamic ways?

The Painful Solution

One of the reasons that dynamic pages exist is the difficulty involved in constantly updating -- adding and deleting -- pages from your site, based on which widgets you are offering this season. If you have a separate page for each make and model of widget, each of those pages can be spidered. They can all be reached through links that look like

<a href="bluewidget-1.html">blue widgets style 1</a>
<a href="bluewidget-2.html">blue widgets style 2</a>
<a href="redwidget-1.html">red widgets style 1</a>
<a href="redwidget-2.html">red widgets style 2</a>
<a href="newwidget-1.html">new widgets style 1</a>
<a href="newwidget-2.html">new widgets style 2</a>

The bad news here, of course, is that you now have to create all of those pages. This loses the benefit of drawing the widget information from a database.

A Better Solution

A better solution is to create only a "shell" of each page, and then to dynamically populate the page from our database. By creating a "real" file, you can assign a fixed URL, but still use the database to fill in the page, using any of various server-side techniques (HTML server-side includes, Perl, Active Server Pages, Java Server Pages, PHP, etc.). A simple page like this, using a server-side include to call a script, might suffice:

<html>
<head>
<title>Blue Widgets style 1</title>
</head>
<body>
<!--#include virtual="/cgi-bin/myscript.pl" -->
</body>
</html>

Save this page as "bluewidget-1.html" and you're good to go, assuming that "myscript.pl" will actually return the content you want for the body of the page. True, you will have a discrete page for each item in your inventory, but at least you only need to hard-code the bare bones of that page.

Another Way To Go

There is yet another way to go. This method does not require creating dozens of static pages, or having to include exotic scripts in your web pages. It also may not work for all search engines!

Some search engine robots just will not follow links that include a "querystring" as part of the URL. You have seen a querystring if you have ever looked at the URL of a page of search results in Google. For example, if you look for "blue widgets" on Google, not only do you get page after page of blue widgets, you also see that these pages have very complicated-looking addresses:

http://www.google.com/search?hl=en&lr=&ie=UTF-8&oe=UTF-8&q=blue+widgets

In this address, everything after the question mark ("?") is a querystring. This is used to pass additional information to the web server. While some search engines can follow a complicated address like this, many simply will not follow such a link. That means that if you use a URL like

http://www.mycompany.com/catalog.html?item=widget&color=blue&model=1

the robot may not be able to follow it. This is bad.

On the other hand, an increasing number of search engine robots will follow such links. Usually, links like this are created "on the fly" by filling out forms and clicking a "submit" button, but that doesn't have to be the case. You can grab that address, querystring and all, and put it into a "normal" link, like this:

<a href="http://www.mycompany.com/catalog.html?item=widget&color=blue&model=1">blue widgets style 1</a>

Put several of these on a page and the search engine robot can now visit your dynamic pages from links that require no button-clicking. Remember that not all robots will follow these links, so your mileage may vary.

As long as the link to the page exists in a form that does not require human intervention to get to it (unlike pulldown menus, search results, form submits, and the like), a bot will follow it.

Widgets Out The Door

Using any of these methods will help search engine robots to find the dynamic pages on your site. This means that the important content on those pages is more likely to be included in the search engine databases, and that people will be better able to find you. That, of course, means that the Widget Queen will reign supreme, knowing that widget customers the world over will now be able to find you and buy your widgets.

# # #

Dale Goetsch is the Technical Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses and non-profits. He has over twelve years experience in software development. Along with programming in Perl, JavaScript, ASP and VB, he is a technical writer and editor, with an emphasis on making technical subjects accessible to non-technical readers.

Copyright © 2003 Search Innovation Marketing. All Rights Reserved.

Permission to reprint this article is granted as long as all text above this line is included in its entirety. We would also appreciate your notifying us when you reprint it: please send a note to reprint@searchinnovation.com.

Tuesday, September 16, 2008

Leading Your Web Designer to SEO

Author: Christian Nielsen

Many SEO projects involve taking a site that has already been built and changing or adding optimization elements to help the site rank well in the search engines. For a site that has already been built, the web designer is usually not involved in the process.

However, there are two situations where the designer should be very much involved in the SEO process: when a new site is constructed, or when an old site is being redesigned.

When building a new site, the SEO consultant should be involved as the site concept is being developed. The perspective of the SEO consultant is much different from that of the design team and the site owner. The SEO understands that a balance of keyword phrases and well-written marketing copy is what helps the site attract traffic and convert visitors into buyers. The SEO can also provide advice on the best way to add new content. For example, a site that is about food will draw many visitors if it has a section that offers recipes, and a site about music will see many more visitors if it also offers MP3 music files.

More importantly, the SEO can provide advice on how to best construct the site. SEO consultants understand that sites which have 100% of their content in Flash, or which use frames, will pose problems during the optimization process. The consultant will also understand how to overcome some of the SEO limitations that dynamic sites pose.

When the design team is aware and involved in the optimization process from the beginning, a site can be optimized "from the ground up" as it is being built, which will involve less time spent by the SEO consultant later. This can also provide a level of optimization that is often not practical after a site has already been constructed.

When keyword research is completed before site construction has started, designers can use the keywords when they create new pages and graphics files, in order to derive the benefit of keyword-rich file names. For example, instead of using words like "logo" and "header," the designer can choose file names built from more descriptive keywords. The SEO can also provide a basic Meta tag set which can be used for the site. By including the Meta tags in the site templates, the work of editing each page later can often be avoided.

What about redesigns?

The first and foremost role the SEO has when a site is being redesigned is to ensure that the web designer doesn't change all the page names when performing a site update. Otherwise, when the new site is launched, the traffic to the site may substantially drop off when people click on search engine listings that no longer exist.

To combat this phenomenon, the consultant will advise the designer to re-use the old page names as much as possible. If that's not possible, there are two ways to deal with this problem:

1) A custom 404-error page can be installed to inform the visitor that the page(s) no longer exist and present them with several options to continue into the site.

2) Redirection pages can be set up which use the old page name and display an informative message for the visitor. The pages should offer one or more clickable links, and might include a timed redirect that waits at least 30 seconds before taking the visitor to the page with the new name.

The second option is more desirable, since it allows the search engine to note the change, follow the link to the new page and add it to its index at some later point. And of course, if new pages are added, or the filenames have to change, it provides a chance to use filenames that can be optimized with keyword phrases.
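As a rough sketch of the second option, a page kept at the old name could carry something like the following; the file names are placeholders and the 30-second delay simply follows the guidance above:

<html>
<head>
<title>This page has moved</title>
<!-- Timed redirect: waits 30 seconds before sending the visitor to the renamed page -->
<meta http-equiv="refresh" content="30;url=new-page-name.html">
</head>
<body>
<p>The page you are looking for has a new address. You will be taken there in about 30 seconds, or you can follow the link now:</p>
<a href="new-page-name.html">Continue to the new page</a>
</body>
</html>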

Just as you are unlikely to turn an SEO consultant into a web designer, you can't turn a web designer into an SEO. However, SEO consultants and designers should work together to bring all of their skill sets to the table. The SEO needs to understand that graphic images can be very important in the visual appeal of a well-designed web site, just as the designer needs to understand the benefit of having a home page title that says more than just "Home."

About the author: Christian Nielsen is the owner of Nielsen Technical Services, which provides Internet consulting and SEO services, including the optimization of dynamic database-driven sites and Blog optimization. The company maintains a policy of answering questions from clients and potential clients, even if they are just "tire kickers", and has a Blog at NielsenTech.Blogger.com which deals with (mostly) search engine related issues.

Monday, September 15, 2008

SEO contest - prize to charity

Author: Patrick Altoft

A charity entry in a Google search engine optimisation contest could win thousands of dollars for Oxfam. The aim of the contest is to rank number one in a Google search for the term v7ndotcom elursrebmem.

www.v7ndotcom-elursrebmem-contest.com has been created with the sole aim of winning the contest for charity. Several high profile webmasters and search engine optimisation experts have already pledged their support, and the site's creators believe that more will follow.

The designers of www.v7ndotcom-elursrebmem-contest.com are hoping that a large number of webmasters will be interested in this contest but won't have the time or resources to take part themselves. By putting their weight, or back links, behind this charity entry, everybody can take part and win a large sum of money for Oxfam.

www.v7ndotcom-elursrebmem-contest.com is not just relying on people's generosity to win this contest - it also uses all the available knowledge of search engine optimisation to create a site that is clear and easy to navigate, with lots of useful information.

Key features of the site include a blog, which will be updated throughout the contest, and a live v7ndotcom elursrebmem leader board which can be easily integrated into any third party website.

Controversy and a certain amount of hype have surrounded this contest from the start. The official press release for the contest stated that "John Scott, editor and administrator of v7n.com, announced the contest on December 20th, with a $1,000 grand prize. Soon afterwards Greg Boser, a web page optimizer, announced a $1,000 reward for not playing by v7n rules. Two of Greg Boser's friends - Mike Grehan and Todd Friesen - then contributed to the reward pot, bringing the anti-contest grand prize to $3,000.

Many in the SEO community were outraged by what they saw as an attack on the popular v7n community, and John Scott in particular. The blogging community rallied and added $3,000 plus an iPod to the v7n contest, putting the total at $7,000 and an iPod for the grand prize winner, with smaller cash amounts to those who place 2nd through 5th."

Whatever the motives of the organisers, it is clear that this contest, and v7ndotcom elursrebmem, is set to be one of the most discussed topics of the next few months.

If you wish to help www.v7ndotcom-elursrebmem-contest.com win this contest for charity, or are interested in the methods being used to promote the site, please look at our homepage or the FAQ section.

About the author:

www.v7ndotcom-elursrebmem-contest.com is donating all the prize money to charity.

Sunday, September 14, 2008

KEYWORD POWER! Stop Fumbling Around In the Dark! Get Qualified Traffic Now!

Author: Jeff Smith

If you had to take a stress test right now, while someone asked you how effective you are at selling a product online, how would you do?

Would you show moderate frustration?

Register an ""above average"" score on the liklihood you will strangle the next person who mentions getting rich online?

Or are you ready to pull the plug on the whole freaking thing and start your own meat carving business?

You want to sell a product online, and everyone wants to tell you HOW to sell a product online....BUT....

What they DON'T tell you is that it all depends on 2 simple factors -- knowing what people want to buy AND what they search for when they are looking.

Listen: it's taken me over a year to learn the TRUE meaning of this one single word.

Everyone say KEYWORD.

Say it again -- KEYWORD.

Ok - with that out of our systems, let's look at the way that keywords can make your life simpler, massively increase your ability to sell products online AND...provide a formula you can repeat again and again -- until you're tired of making money.

KEYWORD Tip #1: Your Product Idea

Keywords tell you what people search for on the internet. Lots of searches for information on a topic means there is urgent demand - demand you can provide answers to.

Study keyword combinations and discover demand-and-supply ratios -- made faster with a tool like Adword Analyzer (http://www.infoproductcreator.com/part/adwanalyze) -- and you will have a window into product demand online.

KEYWORD Tip #2: Product Names and Titles That Sell

You know what people are looking for with keywords.

Figure out WHY they are using the keywords and you have the perfect high-demand title that will sell you a ton of products.

KEYWORD Tip #3: Sales Pages That Sell Products Online

Your sales page is only converting 1 in 1,000 visitors into buyers?

Usually two main reasons - your copy sucks, or you are attracting the wrong people to your website.

Guess what?

Either way - KEYWORDS come to the rescue.

Optimize your site for keywords that make sense for your product or service. Don't go general here; get really specific -- you want less traffic but more sales, not a TON of traffic and virtually no sales.

Here's a little known secret.

People look for information online - they don't look for products.

So - what you need to do is attract your visitors with content, the content they are looking for AND THEN have links and a menu that lead them to your products.

If you need help converting visitors into buyers - look no further than the Hypnotic Marketer himself -- http://www.infoproductcreator.com/hypnotic.html

KEYWORD Tip #4: Keywords Help You Find Super Affiliates

When you arm yourself with the list of most popular keywords your most qualified visitors use to find information related to your products -- you can use that information to form SUPER affiliate joint ventures which will let you sit back and watch the money roll in.

Seek out those sites that are performing best and optimized best for your target market's most searched keywords - then make really good friends FAST!

Because they can make you a truckload of sales in next to no time.

KEYWORD Tip #5: It's the ONLY Thing We Really Know About Online.

Keywords don't lie.

The fact that ""travel book"" gets roughly double the searches as ""travel information"" tells you exactly how people are searching for your information (assuming you've written a travel book)

When it comes right down to it, the comings and goings of people on the internet are far from an exact science, no matter what the "gurus" tell you.

But...KEYWORDS put you back in the driver's seat, give you the tools to sell products online and bring your stress down to a level where you may not even know who you are anymore.

------------------------------------------------------- WARNING! Not All Infoproducts Are Created Equal. With These 7 Simple Lessons, You Really Can Turn Your Knowledge Into Income-Generating eBooks, Books, Special Reports or Any Infoproduct Fast! Find Out How In This Complimentary Minicourse -- http://www.infoproductcreator.com/minicoursenew.html

About the author: Jeff Smith is the author of too many articles to list on the topic of Creating eBooks, Special Reports and Internet Marketing. Visit his site at: http://www.infoproductcreator.com

Saturday, September 13, 2008

How to improve your search engine position

Author: Polly Nelson

When promoting your website, the importance of submission to major search engines cannot be overstated. "46% of web-users use search engines to first find a website." They won't find yours unless it features in these engines. Getting a good ranking, however, is not as easy as it seems. To better your chances, you need to make sure that you do the following things before submitting your site.

Start by writing a descriptive title for each page of your site. This should accurately describe the services offered on the page in around 5-8 words, avoiding words such as 'the' and 'and'. When your website comes up on the search engines, this title will be the thing that first grabs the attention of the user. Next, you need to write your 'meta-tags'. Featured on every page of your site (although hidden to the user), these are simply either 'keyword tags' (10-12 important keywords / phrases that describe your services) or 'description tags' (just a short description of what's on the page - designed to attract users from the search engines). If you are trying to build your own meta-tags and are unsure of how to do it, here's a free service to help you: www.bcentral.com/products/metatags.asp

When deciding on your keywords, imagine your audience and how they would search for your services. Your keywords need to be the search terms that they will type into the engines. Will they search using industry-standard terms, or will you need to put some layman's terms in too? You should also think about spelling mistakes. Is it likely that customers will search for you using mis-spellings or typos? If so, you should consider including these in your meta-tags. You can also have a look into which search terms are currently most in use: try www.wordtracker.com for this. Also consider having a look at your competition - type your keywords / phrases into Google with "inverted commas around them". The results that come up are your direct competition for these search terms, so if there are too many, you should think of alternative terms.

When you have finalised your keywords/phrases, write a 'description' meta-tag for each page of 200-250 characters. This doesn't need to include any words from your title, but should be based around your keywords.

Next, you need to optimise each page of your site by using your keywords throughout the content and in alternative text for the images. If your keywords do not accurately reflect the content on your pages you are unlikely to get a good listing. You should however bear in mind that unnecessary repetition of keywords in your content will also damage your ranking. For optimum results, mix the words and phrases up in different combinations rather than using the same one over and again.

Although you can expect it to take up to (and sometimes exceeding) six weeks to get listed in a search engine, you should make sure that your site is 'search engine friendly' before submitting it. Ensure you have done the following: read the rules/listing requirements of each search engine and make sure your site complies; bear in mind that frames and flash-based pages on your site make it hard for the search engines to index your pages; ensure your meta-tags are complete and your content accurately reflects them; make sure all your pages link to each other (especially if you have a 'cover page' that redirects to the rest of your site) so the engines can find all the pages; never have a link that points back to the current page (i.e. if your logo is a link to your home page, make sure it is not a link on the home page itself); and finally, make sure that you do not over-submit your site to each search engine. Most have some sort of 'spam trigger', which this will set off, and you will not be listed. Submitting your home page once to each engine should be all that is necessary.

Apart from meta-tags, there are a number of things you can do to improve your ranking on the search engines. If you are willing to have links to other sites on your website, link exchanges can be a great way to promote your site and improve its position on the search engines. If you have a large number of incoming links to your site from other sites, search engines are likely to rank you much higher up the list. Links from high traffic sites will be of the most use, so ask for a link from any associations you are a member of. You can also request reciprocal links from complementary sites - consider an 'on topic' link, i.e. "website design and custom software development", which includes important keywords (see the sample link below) - and develop an out-of-the-way page to put links to their sites on.
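A reciprocal link with keyword-rich, 'on topic' link text might look something like this; the URL is a placeholder:

<!-- Link text carries the important keyword phrase rather than "click here" -->
<a href="http://www.example.com/">Website design and custom software development</a>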

Another thing you can do is make sure that you provide useful and plentiful content on your site, for good indexing and good inbound links. You could include a press page on your site for all your media releases and mentions (you can submit press releases online at www.prweb.com, amongst other places), or you could write useful articles to put on your site and submit them to relevant places online (good places to start include www.GoArticles.com - there are many others). Try to ensure you include a link to your site at the end of each article.

Other online places to promote your site (if you have the time) include newsgroups and discussion lists. Use Google Groups to find appropriate discussions, and make sure your 'signature' includes your website link.

When submitting your site to the search engines, avoid the plethora of free programs on the internet that promise to submit you to 300+ engines - this is largely a waste of your time. Most search engines are powered by the few major ones, which usually have at least some sort of a free submit service. (If you are thinking of paying for a high search engine ranking, the pay-per-click options offered by search engines such as https://adwords.google.com/select/main can be a cost-effective option.) Make sure you submit to the major engines and directories, including www.google.com, www.yahoo.com, www.altavista.com, www.dmoz.org, www.lycos.com, www.overture.com, www.askalix.com/uk/index.php3 and www.thomsonlocal.com.

Also, look for specialised directories for your sector. There are hundreds of these to be found, so take the time to search the web for them (www.google.com is a good place to start) and get yourself listed, with a link to your site if at all possible, as this will help with your search-engine positioning. Similarly, there are specialised search engines you can submit to: try www.searchengineguide.com/searchengines.html for this.

Of course, you should make sure that your website address features on all your company stationery and literature.

Finally, a few words about the use of email to promote your site... Ensure all the emails you and your staff send have a signature which includes your website url; put a feedback form on your site and encourage users to leave their email to receive updates and offers from you; but whatever you do, don't send out bulk, unsolicited, untargeted emails - it will do more to damage your credibility than almost anything else.

There are, of course, many more strategies that can be used to better your search engine position and you should bear in mind that website promotion has to be an ongoing process. However, hopefully the methods described above will help you start your quest for a top search engine ranking.

This article is copyright Fire Without Smoke Software 2003. Please contact www.fwoss.com or info@fwoss.com for permission to reproduce.

About the author: Polly is FWOSS' research director and does a lot of search engine optimisation for clients. She writes many technological articles for the company, which can be viewed online.

Friday, September 12, 2008

Big Site? Make the most of it on Google

Author: John Saxon

For many people a five or six page web site is all they need or want, but for others, selling services and products on the internet, a hundred page site is barely adequate. If you're one of those companies, here are some tips on making the most of your site on Google and other deep-indexing search engines.

One of the sites we manage is 520 pages packed with content and informative articles. It has some 10 to 12 levels of pages in its structure, and we became aware that Google had only indexed 126 of those 520 pages. What was going on?

Maximising each page

We worked diligently with the web site owner to optimise each page, ensuring it had a unique title and that the page content was rich in the KEYWORDS for that topic. For instance, if you're trying to get onto Google with 'content management systems' and the phrase 'content management systems' does not appear on the page in HTML text, then you won't hit the top 1000! Similarly, if only the singular phrase 'content management system' appears, you will still fail, because Google sees 'system' and 'systems' as two totally separate words.

Remember, each page has a TITLE, META tags and ALT tags. If you, or the designer, have simply duplicated another page as a template, then as far as the spider is concerned all your pages will have the same attributes.
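As a rough sketch (the page names and wording here are invented, not taken from the site in question), the difference between a duplicated template and properly optimised pages looks something like this:

  <!-- page-one.html -->
  <title>Content Management Systems - Acme Web Services</title>
  <meta name="description" content="Articles and advice on choosing a content management system.">
  <img src="cms.gif" alt="Content management system diagram">

  <!-- page-two.html: note that nothing is simply copied from page one -->
  <title>Search Engine Optimisation Tips - Acme Web Services</title>
  <meta name="description" content="Practical tips for optimising your pages for the search engines.">
  <img src="seo.gif" alt="Search engine optimisation checklist">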

Spider depth

Most spiders do not index below level 3 and therefore they do not find what may be very important pages at all. In addition we noticed with Google Page Ranking that the Index Page was 5/10, a level 2 page on the same site was 4/10 and a level 3 page was 3/10. Presumably pages beyond level 3 are considered so insignificant that the spider has been programmed to ignore them.

In addition, the spider was stopping dead at drop-down menus and graphic links it could not move beyond. Spiders essentially follow HTML text links and that's about it. If you stick to that rule you won't go too far wrong.

Our challenge was therefore to bring every page, no matter where it was in the site, to at least a level 3 position - without changing the structure of the site itself - so the spider would index it and the page ranking would be higher. This would give us 520 marketable, optimised pages rather than 126.

The solution was quite simple: a site map. We simply spent a few hours setting up a site map with a link from the index page (making the site map level 2) and then an HTML text link to every page on the site, making every page on the site at least level 3.
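As a minimal sketch (the file and page names are invented for illustration), such a site map is nothing more than a plain page of HTML text links:

  <!-- sitemap.html, linked from the index page, so it sits at level 2 -->
  <h1>Site Map</h1>
  <ul>
    <li><a href="articles/content-management-systems.html">Content Management Systems</a></li>
    <li><a href="articles/choosing-a-web-designer.html">Choosing a Web Designer</a></li>
    <li><a href="services/site-optimisation.html">Site Optimisation Services</a></li>
    <!-- ...one HTML text link for every page on the site, making each of them level 3... -->
  </ul>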

The next time the site was spidered by Google, there it was: 520 content-rich, optimised pages and an increase in traffic of 1000%.

Big sites, make the most of them, don't keep your content hidden under a bushel!!

About the author: John Saxon is technical director of site-pro limited, a site offering free tips, tools and articles for web site optimisation. The site may be visited at http://www.site-pro.co.uk

Thursday, September 11, 2008

COMPLETE WEB-SITE OPTIMIZATION FOR SEARCH ENGINES (part2)

Author: Pavel Lenshin

COMPLETE WEB-SITE OPTIMIZATION FOR SEARCH ENGINES (part2)

------------------------------------------------------------ copyright (c) Pavel Lenshin ------------------------------------------------------------

Source code optimization.

{title}...{/title}

This tag is the winner. It is the primary spot to include our keywords for SE spiders, bots or crawlers ("spider" hereafter). {title} tags are the best "dainty dish" for SE spiders. They eat them like cakes, so make your title tags tasty for them, about 65 characters long.

{meta name=description content="..."}

An important Meta tag. Very often the description you put here will be shown in the SE search results. In my personal opinion it plays a more important marketing role of attracting visitors than an actual optimization role. The SEs' trust in the "description" tag, as well as in our next "keywords" tag, has been greatly eroded due to fraud and unfair competition. Make it no more than 250 characters long, including, of course, your targeted keywords as well.

{meta name=keywords content="..."}

Another advisable Meta tag. It should include all your targeted and untargeted (but topic-related) key phrases, separated by commas. Note that highly popular, stand-alone keywords like "web-site", "internet", "business" etc. will give you nothing more than an increase in the size of your web-page; I won't be mistaken if I say that several million web-pages have them. Don't overuse your keywords either - spiders don't like to be forced to eat what they don't want.

{meta name=author content="..."} {meta name=copyright content="..."} {meta name=language content="..."} etc.

Subsidiary Meta tags that are more likely used to satisfy webmasters' egos than to bring any real help in rankings.

{h1}...{/h1} {h2}...{/h2} {h3}...{/h3}

In contrast to the previous tags, the importance of, let's call them, "body" tags has risen substantially for a simple reason: they are readable by visitors, and it is harder to cheat an SE with them than with the Meta description or keywords tags, where any webmaster may put anything s/he wants. Given that these tags determine the headers of your web-page from the SE spider's viewpoint, try to include your targeted keywords in them.

{img src="..." alt="..."}

""Alt"" is just a comment for every image you insert into the page. Use this knowledge at your advantage. Include your key phrases where possible and safe. By ""safe"" I mean common sense, don't input comment like ""ebook package"" into the image of the button that leads to your partner, say, ""Pizza ordering"" web-site. On the contrary, if your web-site has graphical menu and buttons, it is very wise to include ""alt"" comments according to directions they lead to, i.e. ""Home"", ""Services"", ""About Us"", ""Contacts"" etc. If for any reason visitors have their browser with images turned off, they won't see any menu if you haven't inserted ""alt"" comments.

Content

Your informational coverage should be keyword/phrase rich, in the same way as your headers. In general, the more relevant key phrases your textual information contains, the better your chances of being "remarked" by the SE spider.

HTML text formatting tags like bold {b}, italic {i} and underline {u} may also carry some weight in SE placement.

Keyword density and frequency are other measures widely used by SEs to rank web-pages. Don't overuse them, though.

Link popularity (page rank)

Another extremely important parameter for your listing position nowadays. In general, the more links on third-party web-sites point to your site, the better. Try to avoid "link farms" or other "clubs" whose only aim is to artificially increase your link popularity, though. These tactics may simply result in the penalization or banning of your web-site.

Link popularity, without any doubt, helps to increase the relevance of searched terms more often than not, but it makes SEO an even more far-reaching target, because establishing quality "incoming" links pointing to your site is beyond your direct control.

In short, your task is to find web-sites that have the highest SE listing positions and/or page rank (determined via the Google Toolbar) and negotiate a link to your site in return for some service or product, or solicit a simple exchange of links. As you can see, this "manual" work is the most time-consuming, but it pays off if you are focused on getting as many relevant links as possible.

You may apply viral strategies by offering some free/paid service that implies putting a link back to your site.

Google has developed its own link popularity evaluation tool called PageRank. It is calculated based on a constantly changing set of rules: the current rank of the site the link to your page is pointing from, its relevance to your web-site topic, the presence of targeted words, etc.
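For reference (this formula comes from Google's founders' original paper, not from this article), the core of the published PageRank calculation for a page A that is linked to by pages T1...Tn is roughly:

  PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where C(T) is the number of outgoing links on page T and d is a damping factor, usually set around 0.85. In plain terms, each linking page passes on a share of its own rank, divided among all the links it carries.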

Fake tactics

"Fake tactics" is what I call them; they are used by webmasters in much the same way some "marketers" use spam to promote their businesses.

Unfortunately, ordinary internet users don't have the ability to "ban" spammers the way SEs can penalize those "smart" webmasters. I don't recommend you use any of these tactics, even on someone's "advice".

They include excessive use of related and totally unrelated keywords, comment tags, hidden layers, text in the same color as the background, artificial link farms, numerous entry pages etc. This game simply won't be worth the candle if your web-site is banned for good.

robots.txt file

A very important file every web-site should have. It allows you to literally direct the SE spider to the "proper" places, explaining what should be scanned and where, instead of blindly waiting for your lucky day. With its help you can also protect your confidential web-pages and/or directories from being scanned and shown in the SE search results - a very important feature many web-masters try to achieve with "tons" of Java or even Perl coding, when a one-line entry in the robots.txt file will forbid scanning of "download" pages, so-called "thank you" pages or anything else you want!

General rules for creating a robots.txt file can be found here: http://www.robotstxt.org/wc/robots.html
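As a minimal sketch (the directory names are examples only, not taken from the article), such a file, placed in the web-site's root directory, might look like this:

  User-agent: *
  Disallow: /download/
  Disallow: /thank-you/
  Disallow: /admin/

Every compliant spider reads this file before crawling and will not scan the listed paths.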

Design & Layout issues

The next point is to have textual info. Simply declaring a content-rich web-site is not enough; SEs need text to scan.

Clear, easy-to-follow links. If you have a Flash or Java applet navigation menu, make sure to duplicate it somewhere and include HTML links as well. Most SE spiders have trouble with dynamically created web-pages generated with ASP, Perl, PHP or other languages. It is also clear that all web-pages to which access has been forbidden (no matter how) by the administrator will be left unnoticed. The same relates to HTML frame sites. What frames actually do is complicate the way a web-site is scanned, no more, no less. When I see a web-site made of frames, it is as if the webmaster were telling me: "I want a lower SE position."

Because of the excessive work spiders have to do in order to scan as many pages as possible, their scanning "accuracy", if we can call it that, has dropped. They will hardly scan each and every one of your pages from the very top to the bottom; selective scanning is more likely. To ease this process, try to arrange the most valuable info, including header tags and text, at the very top of your web-pages. Having a "site map" page with all the link connections of your site helps not only your potential visitors but the SEs as well.

All link names inside your informational content should contain your related keywords or phrases, not just "click here" or "download here".

Avoid a lot of JavaScript, cascading style sheet rules or image tags at the top of the page; they can occupy more than a page of HTML source code with almost no textual info. If you have JavaScript or CSS coding, save it as separate files that are loaded on request, leaving only one line of code in your HTML document. This tactic is also very smart from a general web-page optimization and space-saving point of view.
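In practice this means replacing in-page style and script blocks with single references in the head (the file names here are hypothetical):

  <head>
    <title>...</title>
    <link rel="stylesheet" type="text/css" href="styles.css">
    <script type="text/javascript" src="menu.js"></script>
  </head>
  <!-- the bulky style rules and scripts now live in styles.css and menu.js,
       so the spider reaches the page's real text almost immediately -->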

Let the Internet market get to know your business better.

About the author: Pavel Lenshin is a devoted Internet entrepreneur, founder of ASBONE.com, where you can find everything to make your business prosper. Discounted Internet services, FREE ebooks http://ASBONE.com/ebooks/ FREE reports http://ASBONE.com/reports/

Wednesday, September 10, 2008

Where on Earth is your Website?

Author: Robert McCourty

Where on Earth is your Web Site? by Robert K. McCourty

You've just finished congratulating your marketing team. After six months of concentrated effort you can now actually find your own company web site within the search engines. Everyone is busy handshaking and back patting when a voice from the back of the room rises above the din. "Yeah, this is great! Can't wait until we can find ourselves on wireless devices." All conversation comes to an abrupt halt. Eyes widen. Everyone turns to the fresh-faced intern standing in the corner with a can of V8 juice in one hand and a PALM device in the other. You, being the Department Manager, barely managing to control your voice, not to mention your temper, ask the now nearly panic-frozen intern, "What do you mean, find ourselves on wireless? We just spent thousands on our web site visibility campaign!" "Well..." explains the sheepish intern, "There is no GPS or GIS locational data within our source code. Without it, most wireless appliances won't be able to access our site."

Guess what? The intern is absolutely correct. Anyone interested in selling goods and services via the Internet will soon be required to have some form of geographic location data coded into their web pages. There are approximately 200 satellites currently orbiting the Earth (even NASA won't confirm the exact number). Some are in geosynchronous or geostationary orbit 27,000 miles above your head. The Global Positioning System (GPS) is the name given to the mechanism of providing satellite ephemerides ("orbits") data to the general public, under the auspices of the International Earth Rotation Service Terrestrial Reference Frame (ITRF). Sounds like Star Wars, doesn't it? It's pretty close. The NAVSTAR GPS system is a satellite-based radio-navigation system developed and operated by the U.S. Department of Defense (DOD). The NAVSTAR system permits land, sea, and airborne users to determine their three-dimensional position and velocity, 24 hours a day, in all weather, anywhere in the world, with amazing precision. http://igscb.jpl.nasa.gov/

Wireless devices, WAP, cellular, SATphones and a whole host of newly emerging appliances - and indeed, new software applications - will all utilize some form of GPS or, more likely, GIS data retrieval. GIS stands for Geographic Information System and relies on exact latitude and longitude coordinates for location purposes. Several car manufacturers currently utilize GPS for on-board driver assistance, and the marine and trucking industries have been using it for years. Obviously your web site is a stable beast. It sits on a server somewhere and doesn't move much, so at first glance it seems quite implausible that you'll need GIS locational data within your source code. On the contrary. One aspect your web site represents is your business's physical location(s), and if people are going to try to find your services and products, shouldn't you, at the very least, tell them where it is and how to get there?

Let's look at it from the other end of the spectrum - the end user approach. Let's say you're vacationing in a new city for the first time. Once you get settled into your hotel room, what's the first thing you want to find? Restaurants? Bank machines? Stores? So you pull out your hand-held wireless device, log onto the web and search for "Italian food in San Francisco." Five hundred results come back, so you click the new "location" feature on your hand-held (which knows exactly where you are) and ten Italian restaurants, who were smart enough to code their web sites with GIS data, light up on the screen. Guess which restaurants didn't get selected? The other four hundred and ninety. Starting to get the picture?

How does this affect you and your web site marketing? GIS latitude and longitude co-ordinates will soon be a must-have on every web site operator's and web developer's list, and an absolute necessity for anyone wishing to trade goods and services via the Internet. This data may relate to the physical location of the web site, or where the site is being served from (if applicable), or where the actual business represented by the site is physically located. There may be multiple web site locations and coding involved; if, for example, you have a franchise with multiple locations, each location will probably need a page of its own with the correct corresponding location data. If you run a home-based business, I doubt the co-ordinates to your living room are going to be necessary, but you should provide the latitude and longitude of the closest city or town. Large corporations such as banks may want to code the exact location of every automated teller machine across the country. Industry standards and the methods of serving out this data are still in the development phases, but it's a safe bet to assume there are plenty of people working on the solutions right now, and given the speed of technology, implementation will probably be much sooner rather than later. Give yourself an edge. Find out where in the world your web site is... before your web site is nowhere to be found.
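The article doesn't prescribe a format, but one convention already in circulation at the time was to embed coordinates in meta tags; as a sketch (the coordinates below are for illustration only):

  <meta name="geo.position" content="48.4284;-123.3656">
  <meta name="geo.region" content="CA-BC">
  <meta name="geo.placename" content="Victoria, British Columbia">
  <meta name="ICBM" content="48.4284, -123.3656">

Whether any particular wireless service reads these tags is exactly the kind of standards question the author says is still being worked out.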

About the author: Robert McCourty is a founding partner and the Marketing Director of Metamend Software and Design Ltd., a cutting edge search engine optimization (SEO) and web site promotion and marketing company. Scores of Metamend Client web sites rank near or on top of the search engines for their respective search terms. http://www.metamend.com/

Tuesday, September 09, 2008

Does Your Website Need Search Engine Placement?

Author: Dave Davies

Introduction

For many, the value of their websites can be measured in visitors; for others it is the amount of revenue that the site generates. Regardless of how you measure the success of your website, you are going to have to bring visitors to it that are looking for the information, content and/or products that you are providing. So how do you do this?

For many businesses the yellow pages is a first choice for promotions and marketing. A standard yellow page ad runs for about $1,200+/year for a smaller ad and serves only a local market. You further have to consider the cost of ad development, which can often run into the thousands of dollars as well.

While this may be an essential form of advertising for many businesses which serve only local areas, it is certainly not the only one.

On the Internet there are many forms of advertising that have and are being used to bring traffic to websites. From banner ads and sp@m email to PPC and natural search engines there are countless methods for promoting your website online. So how do you choose which marketing tactic to utilize?

One thing to consider is that over 80% of all Internet traffic comes from search engines with Google currently responsible for the vast majority of that. With such an overwhelming amount of traffic coming from a single identifiable source it makes sense to put a lot of weight on what traffic from this source can mean for your website, and for your business.

In this article we will explore the difference between sites that should consider search engine placement a viable choice in their marketing strategy and those that would benefit little from top placements. As well, we will look at ways to ensure that you are maximizing the effect and potential return on investment of your search engine placement campaign, should you choose to go that route.

Who Should Consider Search Engine Placement

An ethical search engine placement firm will tell a client honestly whether search engine optimization will benefit their website. Of course, not every company is ethical, and further, how do you know, until you have undergone search engine optimization, whether it will benefit your website? By this point you have paid the firm and they have done their job, whether it helped you or not.

There are a few things you should consider before you apply search engine placement tactics to your website, or consider hiring a search engine placement firm. I include "apply search engine placement tactics to your website" because, even though it may be "free" to do it yourself, you can detract from the overall visual appeal of your site if it is not done correctly, and it is also a very time consuming process to both learn and perform.

Questions You Should Ask Yourself

What do I stand to gain from higher search engine traffic? Search engine optimization has a built-in cost, either in time and resources should you choose to do it yourself, or in money should you hire a search engine placement firm. You therefore have to ensure that what you hope to accomplish will be at least equal to, and preferably greater than, the cost in time and money it will take to do successfully. For more information on this you will want to read the section below, "So You're Going To Market Your Website On The Search Engines … Now What?", for tips on establishing whether there is truly enough traffic to be had to make your efforts worth it.

Can I compete? The short answer to this question is generally always "yes" however there are many factors to consider. While any site, with enough work, can rank well you do have to consider whether the effort will be worth it. For example, if you own a small computer store in Michigan that repairs computers and troubleshoots software issues, it is theoretically possible for you to rank your website well for the term "Microsoft Windows". To do so would require an ENORMOUS amount of both time and money. And so you have to consider, is it really worth investing years of time and money into this one ranking? The answer is "probably not" but that doesn't mean that search engine placement would not be beneficial for you, just that those keywords are not worth targeting. There are suggestions for choosing the right keywords below.

Do I make money from my website traffic? This question isn't the end-all-be-all however it's certainly relevant if you're considering hiring a search engine placement firm to optimize your site. If you plan on "going it alone" you may want to put in the effort for no money in return simply for the "fun" of it, however if you're planning on spending your hard earned money on a search engine placement firm you have to make sure that it is in your economic best interest to do so. This may be from either direct product/services sales or from the sale of advertising on your site.

If, after answering these questions, you have determined that search engine placement is indeed a good choice for your marketing strategy, you will now have the task of determining exactly which tactics will produce the greatest return on investment for your efforts.

So You're Going To Market Your Website On The Search Engines … Now What?

Now that you have determined that search engine placement is indeed an avenue of marketing that can produce beneficial results for your site and for your business, you have to decide on a "plan of attack". Many SEO firms will "help" you determine keywords to target, and some will even build links for you from "valuable" sites. In many cases they may be entirely truthful, but how do you know?

The choosing of the keywords to target is probably the most crucial step of the entire search engine optimization process. This will determine the success and/or failure of your promotions. Even if you attain all the top placements that were targeted, if you target the wrong terms these rankings will produce little or no results. So how do you determine for yourself which keywords to target?

Without getting into anything too technical there are a couple of great resources out there to help you isolate the keywords that you should target.

The Overture Search Term Suggestion Tool

Advantages – The Overture Search Term Suggestion Tool is free and produces a large number of results for related searches.

Disadvantages – There are two main disadvantages to the Overture Search Term Suggestion Tool in determining your keywords. The first of these is that it puts everything in terms of the singular, and it will also correct misspellings where you may want to know how many people searched for a misspelled term. If you run a shopping site that sells gifts, you will not be able to determine whether the main searched phrase was "gift" or "gifts". The second major disadvantage to the Overture Search Term Suggestion Tool is that it doesn't give you alternative search phrases that are related but that you might not have thought to punch in ("presents" for example). Which brings us to the second tool.

WordTracker

Advantages – WordTracker addresses all the disadvantages noted about the Overture Search Term Suggestion Tool. It differentiates between singular and plural and will allow you to search for misspellings. Further, it searches a thesaurus and will make a number of suggestions for other terms you may not have thought of but which may be related to your industry. It will then analyze the variety of terms that you have searched and chosen and actually make recommendations on which keywords to target based on the number of competing pages and the specific search engine you are targeting. It will also give you a predicted number of searches per day for each phrase.

Disadvantages – The only real disadvantage to WordTracker is that it has a cost. There is a free trial on the site which you can use, though the results it produces are far lower in numbers and it does not give you information on all of the major search engines. It is certainly worth checking into, even if you only try the demo mode (not a download – this is an online resource).

So you now have a list of possible keyword phrases … now what? The next step is to determine whether you can compete with those currently holding the top positions and, more importantly, whether it will be economical to do so. The first place to look when you are trying to determine this is the search engines themselves. Let's assume for a second that you have determined that there are a good number of searches for your product and that the main keyword phrase you would like to target is "acne cures". The next step is to run a search for "acne cures". Most people are interested primarily in their Google rankings, and so you would run that search on Google, producing the following results: http://www.google.com/search?hl=en&lr=&ie=UTF-8&oe=UTF-8&q=acne+cures.

The site currently holding the #4 position is www.approvedcures.com. Take a look at the site. There are two major things that you will first be looking for:

Is this a large site with a lot of content related to the search phrase? In this case the answer is "yes". They have a very large site and all of it is related to the topic the search was for.

How many incoming links do they have? You will now want to find out what sort of link popularity you will be competing with. Links are not the end-all-be-all of search engine optimization; in fact they are just one of a large number of deciding factors. However, the links help determine your Google PageRank, and this PageRank is a final multiplier in determining your position. Because the number of incoming links is easily determined, it is something you should look into. To determine the number of links, simply enter into the Google search bar "link:www.domain-in-question.com". In this case it would be http://www.google.com/search?hl=en&ie=UTF-8&oe=UTF-8&q=link%3Awww.approvedcures.com . They do not have a large number of incoming links and thus, this is not going to be a significant factor in determining their ranking.

And Now You're Ready For Search Engine Placement

At this point you should have a pretty good idea as to whether search engine placement is a viable choice for your website promotions, which keywords you should target, and what competition you will be facing.

Assuming that your website needs to be optimized, you will now be faced with the choice of doing it yourself or hiring a search engine placement firm.

If you will be doing your own optimization I would highly recommend reading an article by the CEO of StepForth Search Engine Placement entitled "A 10 Minute Search Engine Optimization". It can be found on the StepForth website at http://news.stepforth.com/2003-news en-minute-optimization.shtml.

If you will be hiring on a search engine placement firm I would recommend first submitting your site for a free website review off of our homepage at http://www.stepforth.com. There is no obligation and it will give you a very good idea of what areas need to be addressed.

About the author: Dave Davies is the marketing manager for StepForth Search Engine Placement Inc. Visit them online at http://www.stepforth.com or email him at dave@stepforth.com.

Monday, September 08, 2008

Meta Tags- What Are They and Which Search Engines Use Them?

Author: Richard Zwicky

Meta Tags - What Are They & Which Search Engines Use Them? By: Richard Zwicky

Defining Meta Tags is much easier than explaining how they are used, and by which engines. The reason is that very few engines clearly lay out what they do and do not look at, and how much emphasis they put on any one factor. So, we'll start with the easy part.

Meta Tags are lines of HTML code embedded into web pages that are used by search engines to store information about your site. These "tags" contain keywords, descriptions, copyright information, site titles and more. They are among the numerous things that the search engines look for when trying to evaluate a web site.

Meta Tags are not "required" when you're creating web pages. Unfortunately, many web site operators who don't use them are left wondering why the saying "If I build it they will come" didn't apply to their site.

There are also a few naysayers in the search engine optimization industry who claim that Meta Tags are useless. You can believe them if you like, but you would be wise not to. While not technically "required", Meta Tags are essential.

If you simply create a web site and register the URL with the search engines, their spiders will visit your site and attempt to index it. Each search engine operates slightly differently, and each one weighs different elements of a web site according to its own proprietary algorithms. For example, Altavista places an emphasis on the description tag, and Inktomi states on their web site that:

Inktomi ""(...) indexes both the full text of the Web page you submit as well as the meta-tags within the site's HTML."" Other search engines like Exactseek are true meta tag search engines which clearly state their policy:

""Your site will not be added if it does not have Title and Meta Description tags."" They also use the keywords tag.

Of course, not all search engines work this way. Some place their emphasis on content. The search engines have over 100 individual factors they look at when reviewing a web site. Some of these factors deal with page structure. They check to see that all the 't's are crossed, and the 'i's dotted. They note sites that have omitted basic steps, like missing tags.

One reason so many engines de-emphasized the meta-keyword tag had to do with spam. There was a time when 'search engine promotion specialists' would cram keywords tags full of irrelevant information. The web site would be selling garbage cans, but the keywords tags were chock full of irrelevant terms like "mp3" or "Britney Spears". They figured that if enough people visited their site, some would buy.

So today, to avoid and penalize this kind of abuse, some search engines don't specifically use the keywords tag as part of the scoring of a site, but they monitor the keywords to ensure they match the content in the site. The reasoning being that, if the tags are irrelevant, they must have an alternate purpose. Is it a spam site? When keywords tags are completely irrelevant to the content, some search engines, that don't specifically use keywords tags, will penalize that web site.

Even for those engines that have downplayed the value of Meta Tags, there are situations where Meta Tags gain considerably in importance, e.g. sites with rich graphics, but poor textual content. Unfortunately, a picture is worth 1000 words to you and me, but zero to a search engine. If a site has poor textual content, the engines will be more dependent than ever on the Meta Tags to properly categorize it.

Even if you ensure you have completely relevant Meta Tags, some search engines will still ignore them. But better they ignore them, than they ignore your whole site because they suspect something is less than above board. Never hope that having Meta Tags will make the difference in all the search engines; nothing is a substitute for good content. But in cases where the engine depends on that content, it may be the only thing that does work for your site.

So How To Use The Meta Tags?

Meta tags should always be placed in the head area of an HTML document - the section that opens just after the opening html tag and closes immediately before the page body begins. Here's how the most basic set should look:
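(The markup of the original example was stripped in formatting; the reconstruction below keeps the article's title text and uses placeholder description and keywords values.)

  <head>
  <title>Search Engine Optimization Software - Metamend</title>
  <meta name="description" content="A short, accurate summary of what the page offers.">
  <meta name="keywords" content="search engine optimization, seo software, web site promotion">
  </head>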

Always make sure that your meta tags do not have any line breaks, otherwise the search engines will just see bad code and ignore them. You should also avoid use of capitals in your code (html5 standard) as well as repetition of terms within the keywords tag.

What Goes Into a Meta Tag?

For the Description tag:

Many search engines will display this summary along with the title of your page in their search results. Keep this reasonably short, concise and to the point, but make sure that it's an appropriate reflection of your site content.

For the Keywords tag:

Keywords represent the key terms that someone might enter into a search engine. Choose only relevant keywords. If the terms are going to appear in your keywords tag, they must appear in the content of your site, or be a synonym to a term on your site. Most search engines compare your meta content with what is actually on your page, and if it doesn't match, your web site can get penalized, and suffer in search results.

For the Robots tag: Many web pages have this tag wrong. An example of the wrong usage is content="index, follow, all" - wrong because some spiders can't handle spaces between the words in the tag, or the word "all". Most engines by default assume that you want a web page to be indexed and its links followed, so using the wrong syntax can actually result in the spider coming to the wrong conclusion and penalizing, or worse, ignoring the page outright. If by chance you do not want your links followed, or the page indexed, then you would substitute "nofollow" and/or "noindex" into the tag.
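As a sketch of the syntax described above (not taken verbatim from the article), the correct and incorrect forms look like this:

  <!-- correct: no spaces, no "all" -->
  <meta name="robots" content="index,follow">
  <meta name="robots" content="noindex,nofollow">

  <!-- wrong, per the article: spaces and "all" can confuse some spiders -->
  <meta name="robots" content="index, follow, all">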

With the Internet growing at a rate of over 8,000,000 new pages per day, and the search engines adding a fraction of that number, Meta Tags are a common standard which can reasonably ensure a measure of proper categorization for a web site. So, always ensure that you cover all the bases, and use completely relevant terms in properly structured Meta Tags. Using tags properly will pay dividends in the short and long term. After all, using them properly only helps the search engines, which means they will send you more qualified traffic - customers.

About the author: Richard Zwicky is a founder and the CEO of Metamend Software & Design Ltd., www.metamend.com, a Victoria B.C. based firm whose cutting edge Search Engine Optimization software is recognized as the world leader in its field. Employing a staff of 10, the firm's business comes from around the world, with clients from every continent. Most recently the company was recognized for their geo-locational, or GIS, along with their phraseological and context sensitive search technologies.

Sunday, September 07, 2008

Keyword Ownership: What it is and Where it's Headed

Author: Richard Zwicky

Keyword Ownership: What It Is and Where It's Headed? By: Richard Zwicky Published: August 20, 2003, SiteProNews.

Have you ever got one of those silly emails that offers to let you own a keyword? Silly question. How many such emails do you get every day?

A number of such services regularly email me offering ownership of premium keywords for $300/year. They say that anyone can type the keyword I bought into the address bar of Internet Explorer, instead of typing in a URL, and they will be sent directly to my site. In total, it seems that about 2% of Internet users worldwide have enabled one type or another of this system, spread across a few competing services.

Data shows that between 4% and 7% of search queries are performed by entering something in the address bar. By default for IE users, these searches are automatically routed through to MSN search. Many of us, however, have installed so much software over time that, unknowingly, some of it has re-routed these search queries to other search portals, such as iGetNet or others. This often happens if you've installed any file sharing software - we have all heard or read about how many extra 'features' come with programs like Kazaa. This means that your default search from the address bar may no longer be MSN and may have been rerouted elsewhere, but the basic principle still applies. Of the queries that are actually run from an address bar, at least half are unintentionally instigated by people mistyping the desired URL. This means that between 2% and 4% of Internet users actually search via their address bar.

So how exactly do these address bars work? There are many companies offering this kind of service, with each one of them selling the very same keywords to different and sometimes competing companies. To make things worse, the keywords you might buy will only work with the issuing company's proprietary address bar plug-in. Then, to actually offer search capabilities from the address bar, each of these service providers needs to get individual Internet users to download and install their plug-in, and remember to run searches from the address bar.

How effective can a marketing strategy of this nature be when the various tools are not interchangeable, there are numerous competitors selling the same keywords to different companies, and you are targeting only a small fraction of Internet users? If your ad is being displayed because it's similar to the search query, are you paying for irrelevant results? This can happen; if there is not a perfect match to a search query, the next closest match may be displayed.

Competing with these companies is any search engine that offers its own toolbar. You can download a toolbar from any number of engines and run searches on any keyword or phrase quickly and easily. You then get the search engine's selection of closest matches from all the web sites they have indexed. They offer more than just one choice, and don't cost anything.

Who Started This?

Started in 1998, Realnames was the first company to tie searching via the address bar to a web browser. At the time, it was touted as a value-added solution for businesses around the world who were attempting to get their products found quickly, but didn't want customers to have to wade through a sea of Web addresses to reach their destination.

In part, it was deemed necessary because so few web site operators were search engine savvy, and fewer still knew anything about search engine optimization and promotion. What the Realnames solution did was allow a web site operator to buy a keyword; then, whenever a user of Internet Explorer typed that keyword into the IE address bar, they would get directed to the web site that owned the keyword.

The company hoped to profit from businesses which wanted to reach Internet users who would type keywords into their browser's address bar instead of remembering the URL or going through a standard search interface.

Unfortunately for the company, the service was entirely dependent on Microsoft, and when Microsoft stopped supporting the technology in May 2002, the company was forced to close. The reason it was so totally dependent was simple: unlike the new companies on the market today, Realnames did not depend on an end user downloading and installing a plugin; instead it was essentially integrated into Internet Explorer by Microsoft. Therefore everyone who used IE automatically had the plugin.

The Legal Question

Each of the companies offering these services has a policy designed to ensure that a web site only buys keywords related to their content, and their review process is designed to keep cybersquatters from hijacking popular names and products. Unfortunately, there is no way to guarantee that any one of these keyword ownership services adheres to any naming standard, or even ensures that any purchaser has the legal right to any of the terms they are buying. This means that the rights to copyrighted material like "Pepsi" or generic words like "business" could end up in the hands of the first buyer. While Pepsi is a well-known brand name, there are millions of copyrighted and trademark protected terms, covered in multiple jurisdictions. For these services to police copyright and trademark infringement would not be cost effective or practical.

In the summer of 1999, the U.S. Court of Appeals for the Ninth Circuit denied Playboy's request for an injunction barring a search engine from selling advertising based on the terms "playboy" and "playmate". In the precedent-setting ruling regarding keyword advertising, Judge Stotler of the United States District Court in Santa Ana, California, dismissed a lawsuit brought by Playboy Enterprises against the search engine Excite, Inc. and Netscape. The ruling limited the online rights of trademark holders, as it recognized that a trademark may be used without authorization by search engines in advertising sales practices.

Playboy claimed that the search engines were displaying paid banner ads from pornographic web sites whenever "playboy" or "playmate" were used as a search term. As the owner of the trademarks for both terms, Playboy argued that the use of its trademarks for a third party sales scheme was trademark infringement and branding dilution.

In the ruling dismissing Playboy's case, the Judge found that Excite had not used the trademarks "playboy" and "playmate" in an unlawful manner. This was because Excite had not used the trademarked words to identify Excite's own goods or services, and therefore trademark infringement laws did not apply. It was further determined that even if there was trademark usage, there was no infringement, because there was no evidence that consumers confused Playboy products with the services of Excite or Netscape.

What about within Meta Tags?

Is it illegal to use trademarked terms in your meta tags? Sometimes. The problem lies in how and why you are using the terms. Web sites that have used the tags in a deceptive manner have lost legal battles. However, legitimate reasons to use the terms have resulted in successful defenses.

In a case involving Playboy, the firm was able to prove trademark infringement based on the use of its trademark in the meta tags, URLs and content of the web sites in question. The case was filed by the firm against web site operators for stuffing their web pages with the words "Playboy" and "Playmate" hundreds of times. Furthermore, the defendants were also using the terms Playboy and Playmate in the site names, URLs and slogans. In this case the Judge ruled for Playboy, as there was a clear case of trademark infringement.

In a separate case, Playboy vs. Terri Welles, the court refused Playboy's request. The reason was simple. Terri Welles was Playboy's 1981 Playmate of the Year. She had used the terms "Playmate" and "Playboy" on her web pages and within her meta tags, and the Court felt she had a legitimate right to use them to accurately describe herself, and to ensure that the search engines could catalog her web site properly within their databases. Playboy's appeal was dismissed on Feb. 1, 2002.

In Summary

It is clear that if you have a legitimate reason to use a trademarked word or phrase in your web site, you can. You may also rent ownership of it from one of the keyword ownership companies. Be careful, though; it is possible that you may get sued.

Does the technology work? Yes, but only for some of the approximately 3% of Internet users worldwide who have installed any one of a variety of competing plugins that enable this type of searching. I stress a fraction of the 3%, as you would need to buy the keywords from each individual vendor to ensure reaching all of them.

About the author: Richard Zwicky is a founder and the CEO of Metamend Software, www.metamend.com, a Victoria B.C. based firm whose cutting edge Search Engine Optimization software is recognized as the world leader in its field. Employing a staff of 10, the firm's business comes from around the world, with clients from every continent. Most recently the company was recognized for their geo-locational, or GIS, along with their phraseology technology and context sensitive search technologies.