Monday, July 31, 2006

Build A Website Your Clients Will Love

Author: Robert Warren

You've just spent good money on your first business website. You have invested in search engine optimization, researched your keywords, bought paid inclusions. You have read every article promising unlimited success carried to your front door on the back of mouse clicks. You are confident that you've used every website traffic technique there is.

And you're getting traffic, but it's not boosting business. So what's wrong?

For professional service providers especially, it is not enough to simply direct traffic - web surfers are extremely unlikely to purchase your services based on a single visit to your website. They will research, they will compare. They will only approach you once they have reason to trust you, and to trust their own judgment in choosing you.

Your true website prospects are the return visitors; for marketing purposes, everything else is background noise. Use these techniques to cut through that noise by providing an online resource worthy of repeat traffic - a website that your clients will love:

Don't sell. Provide. It is important to understand that on the Internet, the user is in complete control of the transaction: hard selling will not work, and will probably antagonize your prospects. Skip the pitch, and instead build a website that serves as a true information resource.

Write and post articles that directly relate to your expertise - if you are a CPA, consider writing articles about financial planning or the importance of tax records; a dentist might write articles about the myths of gum care or the differences between common filling types. Provide a public place where you answer the questions of website users. Keep your website content rich and timely.

Write short and lean. Website users don't casually ease themselves into online reading: they want the facts now and they don't want to spend a lot of time finding them. This means that your content must be written in a lean and compact style that can be quickly scanned by the eye.

Keep your text pieces under 500 words, and preferably in the 250-350 word range. Use simple and direct sentences, in the clearest language possible. Don't make your readers wade through a sea of worthless prose, just to arrive at a small island of information: get right to the point and deliver the goods.

Think navigation. The best content on Earth means nothing if it can't be found quickly. Carefully organize your website in hierarchical format, with plenty of internal links - make all of your important pages only a mouse click or two from the top page. Deliver your content with as much convenience as possible to your visitors.

Appreciate context. Strong navigation design helps the left-brain surfers who know what they want, but many of your visitors will browse your site more creatively: they surf by context rather than placement.

Provide links within the content itself, pointing to other related information on your website. Develop clusters of associations in your content that allow readers to find information intuitively as well as logically.

Build community. Savvy Internet marketers are now learning what technologists have known for years: that the Internet is primarily a social medium. The most popular and profitable websites are those that foster community among their visitors. Provide facilities - forums, newsletters, mailing lists - for your clients to communicate with each other.

Be creative - help your clients turn your website into a favorite meeting place, a place to return to, time and again. Develop a website that your clients will love.

About the author: Robert Warren ( www.rswarren.com ) is a freelance copywriter in the Orlando, Florida area, specializing in providing for the marketing and communications needs of the independent professional private practice.

Avoid Search Engine Blacklisting

Author: Kevin Kantola

The best way to avoid being blacklisted by the search engines is to avoid using some questionable techniques that were once popular for gaining high rankings. Even if using the techniques below does not get your website blacklisted, it may be penalized (buried in the rankings), so your traffic will suffer all the same. When a search engine blacklists a website, it throws your listing out of its index and blocks your site from coming aboard again. This can be done by blocking the domain name, the IP address or both.

Here are a few techniques to avoid, so that your site will not be blacklisted:

Mirror Websites

Mirror websites are sites with identical content but different URLs. This was once a method used to gain high rankings in the search engines, but since search engines are smarter now, it will only get you penalized or blacklisted.

Doorway (gateway) Pages

Doorway pages are pages with little real content for your visitors that are optimized to rank highly within the search engines. These pages are designed so that visitors will move deeper into the website where the real content lies. Navigation to the doorway pages is usually hidden from visitors (but not from the SE robots) on the homepage.

Invisible Text and Graphics

Invisible text (text the same or a very similar color to the background) was once used to spam a homepage and some inside pages with non-stop keywords and keyphrases. Links to doorway pages and hidden site maps can also be created with invisible text (or invisible graphics). Some designers will create a graphic link from a 1 pixel by 1 pixel raster image and link it to a hidden inner page such as a hidden site map.
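For illustration only, here is the sort of markup to stay away from - white-on-white keyword text and a one-pixel image link (the file names are invented):

<!-- invisible on a white background: non-stop keywords -->
<font color="#ffffff">cheap widgets best widgets discount widgets</font>

<!-- 1 pixel by 1 pixel image linking to a hidden site map -->
<a href="hidden-sitemap.html"><img src="dot.gif" width="1" height="1" border="0" alt=""></a>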

Submitting Pages Too Often

Submitting the same pages to the search engines more than once within a 24 hour period can get you penalized and may delay your website from being listed in the rankings. Some search engines believe that pages submitted more often than every 30 days are too much. The 30-day rule is a good one to follow when submitting to multiple search engines.

Using Irrelevant Keywords

Using irrelevant keywords in a website's meta tags and/or body copy in order to achieve high rankings will most certainly backfire. Search engines now want to see parity between these two areas, and if your site is thought to be spamming with irrelevant keywords, your site will be penalized or blacklisted.

Automated Submissions to the Major Search Engines

Using an automated service or software to submit your website to the search engines can be extremely counterproductive. Most of the major search engines and directories accept manual submissions but do not like to be spammed with the automated ones.

Cloaking

Cloaking is the practice of deceiving both the search engine and the visitor by serving up different pages for each. The visitor sees a nicely designed and formatted page and the search engine robot scans a page of highly optimized text. Any practice that is deceptive should be avoided and the downfall of cloaking is that, if caught, the website can be banned permanently.

Using a Cheap or Free Web Host

Using a cheap or free web host can hurt your search engine rankings. Frequent downtime and pages taken down for exceeding bandwidth limits deter robots from indexing your site. If a robot cannot access your site often enough, your site will be dropped from the search engines. Hosting is cheap, so if you are serious about your website, get your own domain name and host rather than an address like geocities.com/yoursite.

Sharing an IP Address

Sharing an IP address, even with a legitimate web host, can get your site in trouble. If you have cleaned your website of all the techniques mentioned above and it still does not get relisted by the search engines within a couple of months, check with your host to see if you are sharing an IP address with other sites. If so, consider moving your website to a new host who will give you your own IP address, or at least one that is not shared with another company whose IP address (and yours) has been banned by the search engines.

FAST's Director of Business Development and Marketing, Stephen Baker, has stated that globally there are approximately 30 million crawl-able servers and that approximately two-thirds have been banned by the FAST network for spamming. If these numbers are correct, your site may be blacklisted or penalized for "guilt by association."

About the author: Copyright © 2004 SEO Resource

Kevin Kantola is the CEO of SEO Resource, a search engine optimization company, devoted to achieving high rankings and increased traffic. Visit the SEO Resource website at http://www.seoresource.net to see how your site may benefit.

Guarantees from SEO Companies

Author: James Peggie

Recently there has been a spate of well-publicized court cases against SEO companies involved in dubious practices such as making false claims or false guarantees. That is why it is essential to hire an ethical SEO company.

It has become clear that companies operating within the industry must become accountable for what they claim. In terms of search engine rankings it is impossible for an organic SEO company to guarantee specific placement. This is because organic SEOs do not control the search engine rankings. The rankings are controlled only by the individual search engines. Therefore SEO companies must set realistic claims in their communications with clients.

The truth is that a guarantee by an SEO company relating to rankings means little or nothing. What the SEO company should be offering is a guarantee that it will work with the utmost dedication and diligence towards getting your site ranked.

In reality that is all that they can guarantee.

About the author: James Peggie is the marketing manager for Elixir Systems - a search engine optimization company located in Scottsdale, Arizona. www.elixirsystems.com

Is Your Website Ready For Local Search Engine Traffic?

Author: John Jantsch

I suppose the real reason for a local small business to have a website at all is to provide information for the local market and generate leads from local shoppers. Up until now, too many small businesses have created websites that are more like monuments to their company name. If someone knew the name of the company, they could probably find the website. That was good to a point, but what about those people who just know they need what you sell but don't know anybody who sells it?

What if, instead, local businesses began to think about their websites more like a listing in a phone directory? What if they began to build and optimize their websites with the primary intent of being found in their hometown as the leader in a category? Someone looking for "Farm Fresh Tuna in Upper Cutbank, Montana" is going to enter just such a search, right?

Google and Yahoo both announced this month their models for tapping into local search traffic. In other words, they are now going to make it easier for web surfers who want to find an accountant in their home town to do so.

Everyone knew they would eventually get around to this very lucrative market, so now more than ever you need to prepare your website to be found in your town. More about local search at these sites:

http://www.google.com/lochp - Beta site
http://www.google.com/help/faq_local.html#what - FAQ
http://local.yahoo.com/u_s__states - Yahoo Local

What I'm talking about today is "local" search engine optimization. In one sense the principles are the same as everyday regular search engine optimization, but the way of thinking about them is a bit different.

~~~~~~
gEEk Term definition: Search Engine Optimization (SEO) is the science of making sure web pages are "designed" in such a way that search engines can find, index, and rank them according to the value of their content. For those of you who don't know, there is an entire industry built around this science.
~~~~~~

In the old days the mindset was to create a website and optimize it for anyone looking for a certain topic.

Local SEO focuses first on being found in your town... for a certain topic. Geo targeting is the key. When someone is looking for a veterinarian, they don't (at the moment) search locally for the name of your firm. Think in terms of a Yellow Pages directory: they go to the city they are looking for, then the category, and then the name of the firm to call.

Local search is structured much the same. People who are looking for an auto mechanic online will search "Kansas City auto mechanic". In order to win the local search game you must be able to win that type of search.

There are no hard and fast rules, and even if there were they would change, but here are some things you need to begin thinking about to bury your competition in the local search game.

Title tags - Probably some of the most important information on your page anyway, so make sure your title reads something like "YOURFIRMNAME - Kansas City's Oldest Bakery".

H1 tags - Make sure that the keywords for your site and your geography appear in H1 tags - e.g., The Best Baked Brioche in Peoria, IL.
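In the page's HTML, the two examples above come out looking something like this (the firm name is a placeholder):

<title>YOURFIRMNAME - Kansas City's Oldest Bakery</title>
<h1>The Best Baked Brioche in Peoria, IL</h1>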

Content - Add your address and phone number early and prominently (not a bad thing for every page really)

Meta tags - Opinions vary on the usefulness of these, but there are some tags that may gain usefulness depending on how the search engines refine their methods:

<meta name="zipcode" content="64105,64113,64112,64110,64106,66207,66208,66210">
<meta name="city" content="Kansas City">
<meta name="state" content="Missouri, Kansas">
<meta name="ICBM" content="39.10246, -94.59009">

City, State, and Zip code tags are pretty self-explanatory; the ICBM one is a bit out there, but kind of cool too.

If you go to the GEOUrl Address Server you can locate the exact latitude and longitude of your business. That's what those two numbers after the ICBM tag are. (Of course I think that is the same system they use to target bombs.)

Linking - Make your internal links local-friendly - instead of "Remodeling Projects" use "Omaha Kitchen Projects".
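In the code, such a link might read (the href is invented):

<a href="/projects/omaha-kitchens.html">Omaha Kitchen Projects</a>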

DMOZ - The Open Directory Project is a directory of sites that are listed by human volunteers. It seems that getting listed here gives you high marks with search engines so you need to do it but make sure that you go for the Regional listings all the way down to your town. It is unlikely (and not very useful) that you will get listed for a broad category, particularly if you don't provide world-wide service. Go for the poodle clipping section of your town and you will have better luck.

Other Directories - Another good reason to get listed in DMOZ for your town is that other local directories, like Verizon's Smart Pages and SBC's Yellow Pages, rely on these listings as well. By the way, get listed in as many of these phone-book-type directories, like Smart Pages, as you can. Some are free, and there is speculation that initially the big search engines will rely on these already-built local directories.

There...that should keep you busy

About the author: John Jantsch is a marketing consultant based in Kansas City, Mo. He writes frequently on real world small business marketing tactics and is the creator of “Duct Tape Marketing” a turn-key small business marketing system. Check out his blog at http://www.DuctTapeMarketing.com/weblog.php - get these kinds of killer tips weekly by sending an email to mailto:subscribe@ducttapemarketing.com

Sunday, July 30, 2006

Writing SEO Copy - 8 Steps to Success

Author: Glenn Murray

We all know that the lion's share of web traffic comes through the search engines. We also know that keywords and links to your site are the two things that affect your ranking in the search engines. Your keywords tell the search engines what you do, and the inbound links tell them how important you are. This combination is what determines your relevance. And relevance is what the search engines are after.

There's a lot of information around about how to incorporate keyword phrases into your HTML meta tags. But that's only half the battle. You need to think of these tags as street-signs. That's how the search engines view them. They look at your tags and then at your copy. If the keywords you use in your tags aren't used in your copy, your site won't be indexed for those keywords.

But the search engines don't stop there. They also consider how often the keyword phrase is used on the page.

To put it simply, if you don't pepper your site with your primary keywords, you won't appear in the search results when a potential customer searches for those keywords.

But how do you write keyword-rich copy without compromising readability?

Readability is all-important to visitors. And after all, it's the visitors that buy your product or service, not search engines.

By following these 8 simple guidelines, you'll be able to overhaul the copy on your website ensuring it's agreeable to both search engines and visitors.

1) Categorise your pages Before writing, think about the structure of your site. If you haven't built your site yet, try to create your pages around key offerings or benefits. For example, divide your Second Hand Computers site into separate pages for Macs and PCs, and then segment again into Notebooks, Desktops, etc. This way, you'll be able to incorporate very specific keyword phrases into your copy, thereby capturing a very targeted market. If you're working on an existing site, print out each page and label it with its key point, offering, or benefit.

2) Find out what keywords your customers are searching for Go to www.wordtracker.com and subscribe for a day (this will only cost you about AUD$10). Type in the key points, offerings, and benefits you identified for each page, and spend some time analysing what words customers use when they're searching for these things. These are the words you'll want to use to describe your product or service. (Make sure you read WordTracker's explanation of their results.)

3) Use phrases, not single words Although this advice isn't specific to web copy, it's so important that it's worth repeating here. Why? Well firstly, there's too much competition for single keywords. If you're in computer sales, don't choose "computers" as your primary keyword. Go to Google and search for "computers" and you'll see why... Secondly, research shows that customers are becoming more search-savvy - they're searching for more and more specific strings. They're learning that by being more specific, they find what they're looking for much faster. Ask yourself: what's unique about your business? Perhaps you sell cheap second hand computers? Then why not use "cheap second hand computers" as your primary keyword phrase? This way, you'll not only stand a chance in the rankings, you'll also display in much more targeted searches. In other words, a higher percentage of your site's visitors will be people after cheap second hand computers. (WordTracker's results will help you choose the most appropriate phrases.)

4) Pick the important keyword phrases Don't include every keyword phrase on every page. Focus on one or two keyword phrases on each page. For your Macs page, focus on "cheap second hand macs". For the PCs page, focus on "cheap second hand pcs", etc.

5) Be specific Don't just say "our computers". Wherever you would normally say "our computers", ask yourself if you can get away with saying "our cheap second hand Macs" or "our cheap second hand PCs". If this doesn't affect your readability too badly, it's worth doing. It's a fine balance though. Remember, your site reflects the quality of your service. If your site is hard to read, people will infer a lot about your service...

6) Use keyword phrases in links Although you shouldn't focus on every keyword phrase on every page, it's a good idea to link your pages together with text links. This way, when the search engines look at your site, they'll see that the pages are related. Once again, the more text links the better, especially if the link text is a keyword phrase. So on your "Cheap Second Hand Macs" page, include a text link at the bottom to "Cheap Second Hand PCs". If you can manage it without affecting readability, also include one within the copy of the page. For example, "As well as providing cheap second hand Macs, we sell high quality cheap second hand PCs". TIP: If you don't want your links to be underlined and blue, include something like the following in your CSS file (the class and file names in this example are illustrative):
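a.bodylink {
  color: inherit;        /* match the surrounding text color instead of link blue */
  text-decoration: none; /* remove the underline */
}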

Then format the HTML of each link as follows:

As well as providing cheap second hand Macs, we sell high quality <a class="bodylink" href="pcs.html">cheap second hand PCs</a>.

7) Use keyword phrases in headings Just as customers rely on headings to scan your site, so too do search engines. This means headings play a big part in how the search engines will categorise your site. Try to include your primary keyword phrases in your headings. In fact, think about inserting extra headings just for this purpose. Generally this will also help the readability of the site because it will help customers scan-read.

8) Test keyword phrase density Once you've made a first pass at the copy, run it through a density checker to get some metrics. Visit GoRank's Keyword Density Analyzer and type in the domain and keyword phrase you want to analyse. It'll give you a percentage for all the important parts of your page, including copy, title, meta keywords, meta description, etc. The higher the density the better. Generally speaking, a density of at least 3-5% is what you're looking for - for example, a phrase used nine times in 300 words of copy gives 3%. Any less, and you'll probably need to take another pass.

Follow these guidelines, and you'll be well on your way to effective SEO copy.

Just remember, don't overdo it. It's not easy to find the balance between copy written for search engines and copy written for customers. In many cases, this balance will be too difficult to achieve without professional help. Don't worry, though. If you've already performed your keyword analysis, a professional website copywriter should be able to work your primary keyword phrases into your copy at no extra charge.

About the author: * Glenn Murray is an SEO copywriter and article submission specialist. He is a director of article PR company Article PR and also of copywriting studio Divine Write.

10 Things You Should Expect From Your Website Copywriter

Author: Glenn Murray

As websites and electronic commerce are becoming more and more common, business owners and marketing managers are realising that quality web copy is every bit as important as impressive design. And with the ever increasing importance of search engine presence, the role of web copy has never been more critical.

But in such a relatively new field, customers are still coming to grips with what they can expect of their website copywriter. The question a lot of people are asking is, "How do I know I'll get what I pay for?"

Before engaging a website copywriter for your next project, ask them whether they're able to provide you with the following ten essentials...

1) Fixed Quote A lot of website copywriters will tell you they only work on an hourly rate. They'll cite varying requirements, rapidly changing technologies, greater incentive, the risk of customer indecision, and a host of other reasons why they can't provide a fixed quote. But don't be fooled. You have a right to know what the job is going to cost you. If a website copywriter won't give you a fixed quote, think twice...

2) Contract of Works to be Completed Just as important as a fixed quote is a signed contract. It may not be drawn up by a lawyer, but a written and signed document outlining the works to be carried out and the cost of those works is essential. If a website copywriter is reluctant to provide a written, itemised quote including an estimated number of words, you have to ask yourself why.

3) Timeframe Always ask how long your job is going to take. If you've already had a go at writing your own web copy, you'll know how time consuming it is. Never make the mistake of thinking the job will be done in a day. Granted, a professional website copywriter will be very efficient in crafting your copy, but no matter who the writer, a quality product requires time. And on top of writing time, remember that you'll have to review and provide feedback on everything they write. In a lot of cases, it's the review phase that takes the most time, so make sure you try to set some time aside, otherwise you'll find yourself the bottleneck!

4) Plan of Attack Try to get some idea from your website copywriter about how they plan to approach your project. Don't be fooled into believing you have to hand over the dollars before they'll reveal their plan of attack. You have a right to be comfortable with their approach before you engage their services. Will you receive individual drafts of every page, or a single draft of the entire site? What format will you receive the finished product in? How many review iterations do they anticipate?

5) Samples A lot of ambitious web service providers of all types are calling themselves writers these days. They offer copywriting as a specialist service, but don't engage a specialist to complete the work. Always ask to see samples of their previous copy. Read it thoroughly and ask yourself, "Does this copy convey benefits?" Pretend you're the intended audience and ask, "Does this copy answer the questions I need answered before I'll buy?"

6) CV Most copywriters' websites will give you a very high-level overview of their business and the services they offer. Some even offer samples. But very few offer a professional biography of their writers. If you're not happy relying on their website as your sole source of information, ask for a copy of their CV. The things you're looking for are a professional history in writing, and preferably some tertiary education in the same.

7) Testimonials Perhaps the best indication of a website copywriter's ability is customer satisfaction. Don't be afraid of asking for customer testimonials. A good website copywriter will be proud of their testimonials - so proud, in fact, that they'll offer them without you even asking. Look for testimonials from companies you recognise and/or can verify. Anyone can get their great-aunt to write them a testimonial. Some will even write their own. If you really want to be sure, ask for contact details so you can give the customer a call and hear it straight from the horse's mouth.

8) SEO Copy Skills Approximately 80% of all web traffic comes through search engines, so it's essential that your website copywriter has proven experience in SEO copy. Ask them their general approach to SEO copy. Do they normally perform the keyword analysis themselves? How do they know when they've used enough keywords in enough of the right places? Can they show you a high ranking site they've written the copy for? What steps do they take to avoid diluting the effectiveness of your primary keyword phrases? Will their SEO copy change the text links on your pages? (It should!)

9) SEO Copy at No Extra Charge! Never be fooled into paying more for SEO copy. If you've already performed your keyword analysis, and you know where you want your keyword phrases used, writing of the copy should take no longer than usual. I'll say it again... SEO copy is not an extra - it's how web copy should be written! Do not pay extra for it! The only things you should expect to pay extra for are keyword analyses, adding the HTML code for unmarked text links, providing guidance on site structure, sourcing of inbound links to your site, etc. SEO copy by itself should cost no extra.

10) Writing Experience for Online Media Writing for an online medium is entirely different to writing for print. Readers have different requirements and objectives, and reading conditions are very different. Make sure your website copywriter knows how to cater to these differences. Ask them to recommend a maximum page length or word count per page. The correct answer should include some comment on the trade-off between the problems of scrolling and the need for a high keyword count for SEO. Ask them whether they prefer long sentences or short (and hope to hear "short"). Ask them whether they will include lots of text links within the main body of the copy, and if so, whether those links will appear as regular links (colored and underlined) or be unmarked.

Professionally written copy can mean the difference between a great looking site and a great looking site THAT EARNS YOU MONEY.

Choose your website copywriter carefully.

About the author: * Glenn Murray is an SEO copywriter and article submission specialist. He is a director of article PR company Article PR and also of copywriting studio Divine Write.

Saturday, July 29, 2006

Free Online SEO Tools

Author: Arif Hanid

For anyone wanting to do a bit of their own Search Engine Optimisation, there is an abundance of free online SEO tools available on the internet. Most of them provide some pretty impressive statistics and information to help you optimise your website, analyse search engine positions, research your competitors, plus much more!

There are two ways these free online SEO tools can be used:

(1) For those who are new to the area of search engine optimisation, these tools provide excellent insight into how a website is performing and ranking. They can quickly highlight issues and trends with the current website and show where optimisation work is necessary.

(2) For the more experienced search engine optimisers amongst us, these tools act as a complement to the more specialised SEO tools, like WebPosition Gold or SpyderOpts. They can even be used to supplement an SEO's internal knowledge base and experience.

Here is a selection of some choice tools for both the novice and the experienced search engine optimisers:

Keyword Research Tool http://www.webmaster-toolkit.com/keyword-research-tool.shtml This helps you research appropriate words and phrases to include in your webpage's body text to aid promotion. It's simple to use: enter the sort of word or phrase you wish to be found under, and the tool will suggest additional words and phrases you can think about using. One of the great things about this tool is that it gives you the option to select from a range of top search engines, e.g. Google, Yahoo, MSN, Teoma, etc.

Keyword Analyser Tool http://www.webmaster-toolkit.com/keyword-analysis-tool.shtml This tool reads the body of the page you specify and reports on what words are used and how many times they are used. This is a valuable tool, as most engines will rank your site partly on your keyword density (which typically ranges between 3% and 9% - for example, a keyword used 15 times on a 500-word page gives a 3% density).

Search Engine Position Checker Tool http://www.webmaster-toolkit.com/search-engine-position-checker.shtml This tool checks whether your website appears in the first fifty results in major search engines for your chosen keyword or phrase. If the URL is present, it will output what position it occupies. As an additional feature, the tool also informs you if any other URLs from your domain appear in the search results.

Link Popularity Tool http://www.instantposition.com/link_popularity_check.cfm This tool measures the total number of links or "votes" that a search engine has found for your website. This is a pretty cool tool because, as well as tabulated data, it also produces a nice graph of the resulting data. One final key element of this tool is its ability to compare your website to your competitors' to help you with your overall marketing strategy.

Meta Tag Generator http://www.webmaster-toolkit.com/meta-tag-generator.shtml This automatically generates a Meta Keyword tag by reading the page you specify, removing common words from it, and picking the most used words on the page. Extra weight is given to words in heading tags.

Search Term Suggestion Tool http://inventory.overture.com/d/searchinventory/suggestion/ Displays how many times a certain keyword was searched for at Overture.com and shows all related searches for the entered keyword. A good measure to use in determining frequency of search among related keyword phrases.

Search Engine Optimisation Tool http://www.instantposition.com/seo_doctor.cfm A very impressive tool that tests the performance of a web site by analysing a page on the important elements of web page creation, such as its title and content. It then scores the page against given criteria for the top search engines, followed by some valuable SEO advice to improve overall ranking. The report produced is well laid out and easy to follow for anyone doing their own optimisation. A freebie tool that works better than many costly SEO tools I know!

The area of online SEO tools is an exciting area of growth as SEO developers come up with more and more tools to represent website positions on the internet. No doubt we will be revisiting this area again….

About the author: Arif Hanid, Internet Marketing Manager, Ambleton Computing - experts in all areas of Internet Marketing, including SEO and Web Design.

arif_hanid@ambleton.com www.ambleton.co.uk

Why Articles Are Not The Route To High Search Engine Rankings

Author: Priya Shah

Copyright 2004 Priya Shah http://www.PriyaShah.com

If you have any interest in getting high search engine rankings for your website (and who doesn't) you've probably been sold the idea that writing and publishing your own articles will do it for you.

Here's why that's not entirely true.

Imagine the following scenario...

You write an article around a keyword or keyphrase you want to rank well for.

You submit that article to all the article submission sites and directories and ezines you can find.

Your article gets published in hundreds of places.

You now have hundreds of links pointing back to your main site...

But your own site never shows up in the top ten results for that particular keyword or keyphrase.

Instead you find that there are lots of other sites carrying your article that rank better than yours.

You've completely missed out on an excellent opportunity to get high rankings for your keyword or keyphrase.

Even worse... you just handed your precious keyword-rich content on a platter to possible competitors who happened to publish your article on their website, and may have lost some of your most targeted visitors and sales to them.

So where did you go wrong?

Your mistake lay in using your precious article - the keyword-rich content you toiled for hours to write - for entirely the wrong purpose.

You failed to use the power of the medium of article publishing to give your site an unbeatable advantage over others.

Here's how to use your articles the right way to boost your search engine rankings.

1. Publish Unique Content On Your Website

When you make an article available for reprint, the article, by virtue of it being published on hundreds of other sites, now no longer qualifies as unique content.

In the eyes of search engines, those pages with higher PageRank (and hence greater importance) than yours will now rank better than you for the keywords your article is optimized for.

Instead of making your article the main course, use it as an appetizer to direct search engines and readers to a UNIQUE, keyword-rich, well-optimized report or white paper on your website, and you'll see dramatically different results.

2. Use Your Article As Spider Bait

Think of your articles as simply the conduit that leads search engines to your website.

Publishing your articles all over the web is like leaving scraps for a puppy (a.k.a. the search engines) that follows them all the way back to the kennel (a.k.a. your website) where it can feast on the main course - your UNIQUE content.

3. Use Keyword-Rich Anchor Text In Your Resource Box

Use your main keyword or keyphrase in the anchor text of the article resource box that contains a link pointing back to your unique content.
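As a sketch, with an invented name and URL, such a resource box link might read:

Jane Doe is the author of <a href="http://www.example.com/seo-report.html">a free search engine optimization report</a>.

Here the anchor text carries the keyword phrase instead of a bare "click here".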

This will create hundreds of keyword-rich links pointing back to the well-optimized report on your website, and give your pages a powerful edge over other websites.

Often this factor alone is sufficient to take your website to the top of the search results, especially with search engines like Google and MSN.

The guidelines here include only a few of the steps you need to take to get high rankings for the keywords of your choice.

To learn how to use your articles and unique content to get an edge over your competitors and secure long-term, top rankings for your website, check out the search engine optimization ebook, Number One In Your Niche.

http://www.numberoneinyourniche.com

About the author: Priya Shah is the author of Number One In Your Niche http://www.NumberOneInYourNiche.com and edits the newsletter Be a Whiz at eBiz! http://ebizwhiz-publishing.com

Make The Search Engines Love Your Site

Author: Matt Colyer

Most webmasters have no idea how to make a search engine friendly web site. If you are one of them, this will all change by following the steps below.

1. Research keywords - Before you start to build your web site, you should research your keywords, or your site may get hurt in the short term. Use a keyword research tool like Overture's to research the most popular keywords related to your site. Overture will show you how much traffic each keyword has had in the past 30 days.

2. Create a list of about 50 to 100 keywords that you can include within your web pages. After completing the above research, you should have found the keywords that are searched on most frequently but have few competing sites.

3. Write a paragraph of at least 250 words - better yet, 500 words - of text for the top of each web page. Put your keywords within this text, but be careful: you can repeat your keywords too much. Make sure the paragraph still makes sense with all those keywords; remember, visitors are more important than the search engines.

4. Optimize meta tags - Meta tags have lost their pull with most search engines, but they still help! The most important meta tags are the keyword and description meta tags. Include your keywords within each of these meta tags. Your keyword meta tag should include the most frequently used keywords contained within your web page, but keep it short - about 10 to 15 keywords.

5. Title Tag - The title tag is one of the most powerful pieces of on-site SEO at your disposal, so use it wisely. Put your most important keyword in the title, as close to the beginning as possible, and keep it short and to the point.

6. Optimize your site size - Too many images or very large images on your web page will slow down your server and cause slow loading times for your site. Slice large images into smaller pieces with a graphics editor. Overly long pages with too much text will do the same.

7. Find backward links - Web sites that link to yours raise your link popularity. Search for web sites that are compatible with yours. Write articles that are related to your site and submit them to sites like ezinearticles.com or Goarticles.com.

About the author: Matt Colyer founded the Superior Webmaster in 2004 as a source of articles and tutorials for Web site owners looking to improve their Web sites.

Friday, July 28, 2006

Importance of Meta Tags Optimization

Author: Buniei R. Ahn

In this article, we simply talk about what Meta Tags are, their importance, the important Meta Tags, and useful tips on how to optimize your Meta Tags for a better ranking with the search engines.

"Meta Tag Optimization is an important aspect of your site optimization process. Careful handling can get you great ranking results."

What are Meta Tags?

Meta Tags are the information inserted into the head area of the HTML code of your web pages. Apart from the Title Tag, information inserted there is not visible to the person surfing your web page; it is intended for the search engine crawlers. Meta Tags are included so that the search engines are able to list your site in their indexes more accurately.

Using Meta Tags in HTML is not strictly necessary when making your web pages; many websites don't feel the need to use Meta Tags at all. In short, Meta information is used to communicate to the search engine crawlers information that a human visitor may not be concerned with. Infoseek and AltaVista were the first major crawler-based search engines to support the Meta Keywords Tag, in 1996. Inktomi and Lycos followed thereafter.

Why are Meta Tags used?

Meta Tags were originally designed to provide webmasters with a way to help search engines know what their site was about. This in turn helped the search engines decide how to rank the sites in their search results. Making Meta Tags is a simple process. As the competition increased, webmasters started manipulating this tool through keyword spamming. In turn, most search engines, including Lycos and AltaVista, withdrew their support for the Meta Keywords Tag. Once considered one of the most reliable and important tools, Meta Tags are now often abused. In the present-day scenario, a vital feature that Meta Tags provide to websites is the ability to control, to a certain extent, how some search engines describe their web pages. Apart from this, Meta Tags also offer the ability to specify that a certain website page should not be indexed.

Using Meta Tags, however, provides no guarantee that your website page would rank highly in the search engine rankings. Due to the rampant abuse and manipulation of the Meta keywords Tag by webmasters, most search engines don't support it anymore.

Types of Meta Tags

The more important Meta Tags are discussed below in detail.

The Title Tag

The Title Tag is not a Meta Tag. However, since it's a very important tag, we thought it necessary to discuss it here. The Title Tag is the HTML code that sets the words appearing in the title bar at the top of your browser. The Title Tag is not displayed anywhere else on the page. It is these words, or this phrase, that appear as the title of your page in the hyperlink listings on the search engine results.

The users in turn click on this hyperlink to go to your website from the Search Engine Results Page (SERP). The significance of the Title Tag is therefore evident, as all search engines use the Title Tag to gather information about your site.
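For example, a Title Tag sits in the head section of the page and might read (the wording here is invented):

<title>Acme Design Studio - Affordable Web Design for Small Businesses</title>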

The Meta Description Tag

The Meta Description Tag is an HTML code that allows you to give a short and concise summary of your web page content. The words placed in this Meta Tag, are often used in the SERP, just below the Title Tag as a brief description of your page. In the Search Engine Results Pages, after reading the Title of the page, a user goes through the description of the page and decides whether she wants to go to your site or not. It is therefore important that your Meta Description Tag is nicely composed describing your page offering while enticing the user to click on your listing.
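Continuing the invented example, a Meta Description Tag might read:

<meta name="description" content="Acme Design Studio builds affordable, search-friendly websites for small businesses.">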

The Meta Keywords Tag

Most search engines do not read the Meta Keywords Tag anymore. It is okay to ignore the Meta Keywords Tags. However, if you feel more comfortable using it, you can have about 15 important non-repetitive keywords in this Tag, separated by commas.
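If you do choose to use it, the tag might look like this (the keywords are invented):

<meta name="keywords" content="web design, affordable web design, small business websites, website copywriting">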

Meta Robots Tag

The Meta Robots Tag gives you the ability to specify whether search engines should index a page or follow the links appearing on it. However, there is no need to use the Meta Robots Tag if you are already using a detailed robots.txt file to block specific indexing.

The various commands used in the Meta Robots Tag are:

Index: allows the spider to index that page.

Noindex: instructs the spider not to index the page.

Follow: instructs the spider to follow the links from that page and index them.

Nofollow: instructs the spider not to follow links from that page for indexing.

Note: Use only one of the above given commands.

If you have not specified any Meta Robots Tag on a page, by default, the spiders understand that the page and all the links appearing on that page are open for indexing. Therefore, it makes more sense to use this Meta Tag in case you don’t want certain parts of your web page indexed.
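For example, to keep a single page out of the index, you would place this tag in that page's head section:

<meta name="robots" content="noindex">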

Copyright © Buniei Ahn, The Home Biz Guy http://www.internetmarketpower.com http://www.pluginprofit.com/main-3666

About the author: Buniei Ahn, The Home Biz Guy, can help you launch your very own money-making website today that's 100% ready to take orders and pull in massive profits for you right now... guaranteed! Visit: http://www.internetmarketpower.com http://www.pluginprofitsite.com/main-3666

Gogle Search Engine - Analyzing the Misspelling Strategy

Author: Alec Duncan

To Gogle, Or Not To Gogle? A while back I was posting an article submitted by one of our regular authors on LilEngine.com and I did it a bit faster than I normally would. I was in a hurry to catch an appointment and was already running late. While posting the article I tripped on something that would change my view on mistakes forever.

In my haste I made a mistake in the article's title - yes, a misspelling - just as I have purposely misspelt Google Search Engine as Gogle Search Engine in the title of this article, to clue you in on its content. Weirdly enough, the mistake in the title slipped by unnoticed, and the page eventually got pushed off the homepage and into our archives.

If you follow the course of your content pages after posting them they usually go into hiding for a few days and then resurface with varying placement depending on their content and other variables. Every now and again you will have some page on your site that attracts large amounts of traffic compared to some of your other pages. This page inevitably grabs your attention and this is what happened to me.

I noticed a large increase in the daily traffic to our Search Engine Optimization site www.lilengine.com and started analyzing the logs to find the culprit. Now, I describe it as a large increase in traffic as opposed to a large spike in traffic, as this traffic gain did not suddenly appear then disappear. It was a stable increase in traffic, and it was funneled to our site by the Gogle Search Engine :-). It so followed that the page responsible for this flood was the same page with the mistake, and it was showing up as #1 in Google for this keyword misspelling.

Finding Common Misspellings It is pretty easy to come up with misspellings for your targeted keywords; however, incorporating them into your content may not be as easy. With a little imagination you can come up with several methods to keep your content legitimate for your users and the search engines.

Using the Overture Keyword Suggestion Tool and Google search results, you can decide which misspellings get the most searches and which are highly competitive, and hence which ones would be worth your while to optimize for.

Here’s how you do it. Use the Overture Keyword Suggestion Tool to see how many searches there are for the misspelling. If this number is satisfactory for you then do a Gogle Search for the misspelling and see how many results Gogle has for this keyword. If this number is too high then there may be too much competition for this keyword and you might want to try another.

Summary People will always make mistakes and these will include misspellings. If you can reach out further to your target market by incorporating words that they may misspell to find your site, in a tasteful manner, then gearing pages of your website for misspellings should be considered when optimizing your website.

About the author: Alec Duncan is the founder of LilEngine.com, a Search Engine Optimization resource site. Visit Li'l Engine for search engine optimization tools and strategies, and also check out Developer Tutorials for web development techniques and strategies.

Thursday, July 27, 2006

How Search Engines Work

Author: Matt Colyer

Before you can start optimizing a web site, you must understand how search engines work. Search engine optimization is the hardest thing for a webmaster to do, because there are so many rules and you have to stay up to date with all the new search engine optimization techniques.

Search engines send out what is known as a robot (some people call them spiders) to index your web page. Robots find web pages by following links: when a robot finds a link on a web page, it will follow it to that page (you can join www.linkexchangeit.com to trade links with other members).

Each search engine has its own robot, and each robot acts differently from the others. Some robots will index all of your web site in a day; others will take weeks before they get to all of your web pages. A spider is computer software that moves from web page to web page via links, gathering information.

After a robot has indexed your web page, the page is sent to a database which holds a large number of other web sites. Once in the database, your web page will be part of the search engine results. How a search engine indexes your web page and where it places it in its results depends on a number of factors on your site.

Search engines will rank your web page based on the information robots receive from indexing. The better your web site's SEO, the higher the ranking you'll get in the search results.

About the author: Matt Colyer is the owner of the Superior Webmaster, which provides free webmaster tools and resources. He also is a PHP, CGI and ASP developer.

All About Links -- Interview with link building expert, Bob Gladstein

Author: Julia Hyde

Julia: Welcome, Bob. Thank you for taking the time to answer my questions about link building. I'm going to jump right in and ask you: why do Web sites need links?

Bob: There are a number of reasons to have links pointing to your site. But let’s start with the reason they were created in the first place. The original purpose of the Internet was to enable the sharing of information. For example, if a scholarly paper existed on a server at the University of California, and a professor at Oxford wanted to read it, the Internet made that instantly possible. Now, if the Oxford professor had a paper that referenced information from the UC paper, they could link directly to that other document rather than just quoting from it. So a hyperlink was intended as a way of connecting data, ideas, and references together. It’s like saying, “if you’d like further information on this topic, here’s a place to find it.”

When the Google search engine was created, its developers took this into account and drew the conclusion that a link was an indication that the page being linked to was relevant to some particular subject-matter.

So that’s a rather long introduction to a short answer to your question. Web sites need links because they send traffic that’s already targeted to their subject matter to other sites, and because they help the search engines determine both their theme and what the web as a community deems their importance to be. Basically (although not absolutely), the more links that point to a page, the more relevant that page is determined to be. In addition, links are now considered the most reliable way (apart from paying) to get a site into the search engines in the first place.

While both Google and Yahoo allow you to submit a site to their index, it’s clear that the best way to get the search engines to pay attention to your site is to get a page that their spiders already know about to link to yours. The spiders then follow that link to your site, and add it to their index.

Julia: Thanks, Bob. But there are different types of links, aren't there? Can you explain the differences?

Bob: As we discussed in the previous question, there are text links from other sites. Some of these are reciprocal (that is, they link to you and you link back to them) and others are one-way (the owner of the other site decides, for whatever reason, to link to your site and doesn't expect you to link back). There are also image links: banners, buttons, etc. These have the advantage of standing out visually from the rest of the page, but many people have become immune to the standard banner ad and just ignore them, because it's assumed they're just advertisements, and as such, not necessarily relevant to the page on which they appear.

Then there are directory listings, where a link to your site appears on a page containing links to numerous other sites in what the directory editor has determined to be your particular niche.

An important thing to consider regarding getting a link is the code behind it. If your primary concern is to send traffic to your site, this isn’t important. In that case, what you need to think about is whether the link is going to send the right people to you. But if you want the link to be recognized by the search engines and to contribute to your ranking in searches, you need the link to be in simple HTML, without JavaScript or other code that will hide the link from search engine spiders.
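To illustrate the difference (the URLs are invented), here is a plain HTML link that spiders can follow, and a script-driven link that they may not see:

<!-- spider-friendly: the destination sits right in the href -->
<a href="http://www.example.com/widgets.html">Hand-made widgets</a>

<!-- risky: the destination is hidden inside JavaScript -->
<a href="javascript:void(0)" onclick="window.open('widgets.html'); return false;">Hand-made widgets</a>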

There are also links that won’t help you at all, or will put you in danger of losing your position on the search engines. Guestbook spam, the practice of going to a site’s guest book area and posting a message like “Nice site. Come visit mine, at…” will do you no good. The search engines know that such links carry no value, and just ignore them. The same is true for free-for-all links pages, on which you can immediately add a link to any site, without any editorial oversight.

Link farms are a far more dangerous subject. These are networks of sites that are heavily cross-linked and offer to link to you as long as you link back into the network, or host a page on your site that serves as a directory of sites that the link farm has linked to. The idea here is to abuse the power search engines give to links by exponentially increasing the number of links to your site, without regard for theme or value. You link into the farm, and you have hundreds, perhaps thousands of links pointing back to you. But the links are only there to increase link popularity. The sites on which the links reside are not intended to actually be viewed by people; they’re just intended to give search engine spiders the mistaken impression that your site is extraordinarily popular.

Julia: So, what's the best way to get legitimate and relevant sites to link to yours?

Bob: Before you can get a site to link to yours, you first have to find it. You need to do research on the subject-matter of your site by searching on the keywords you hope people will use to find it. The results of those searches will give you a list of sites that are already performing well for those keywords. You should then study those sites, so that you can write to the webmaster and request a link in a way that demonstrates you understand the purpose of their site, and gives reasons why you think their audience will find your site of interest.

You can buy links from sites as well, sometimes on a single page, and sometimes all across the site. These are just like any other form of advertising. So before you part with your money you need to determine if they’re worth the purchase price by deciding if they’ll send you enough of the right traffic. That’s why sites that offer the opportunity to buy links will make claims about how much traffic they get and how their audience is made up of “decision makers.”

Finally, there are directories, which normally require you to drill down to find the most relevant category for your listing. You can then (depending on the directory) either contact them with your information, or fill out a form on the directory itself and request a listing.

Julia: What would you say to Web site owners who are reluctant to use links because they think it will take people away from their site?

Bob: For one thing, a Web site without any off-site links is a dead end, and there is some evidence to suggest that search engines view sites that don’t link out as being less valuable.

Unless you're willing to pay, you may have a hard time convincing people to link to you if you're not planning on linking back to them. It's still possible, especially if you've got content that's so good people will want to link to you anyway, but it's definitely harder to get one-way links than reciprocal ones. I'm not suggesting that people link directly to their competitors. The idea is to link to sites that complement the content that you're providing. By doing so, you're contributing to the impression that your site is an authority on your theme: not only do you have great information, but you have links to other sources of information. That's another reason for people to come back to your site more often. And if you're still worried about sending people away from your site and never seeing them again, you can set your off-site links to open in a new window, by adding target="_blank" to the code for the link. If you do this, however, it's a good idea for usability purposes to let people know that the link will open in a new window. Otherwise, people who have their browser windows maximized may not realize what's happened, and should they try to get back to your site by hitting their back button, they're likely to be confused when it fails to take them anywhere.
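For example (the URL is invented):

<a href="http://www.example.com/" target="_blank">A complementary resource (opens in a new window)</a>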

Julia: We often hear the term "anchor text". Can you explain what this means and why it's important?

Bob: Anchor text is the part of a text link that's visible on the page. In the code, a link looks something like this (the address itself is left out here):

<a href="...">Search Engine Marketing and Copywriting Services</a>

"Search Engine Marketing and Copywriting Services" is the anchor text. What's important about it is that it tells both the user and the search engine spider what the page the link points to is about. In a search engine optimization project, getting links to your site that use your keywords in the anchor text helps your page rank higher for those keywords. That's why it's important to have something other than "click here" as anchor text.

The power of anchor text can be seen in the practice of "Googlebombing," in which numerous sites link to a particular page using the same anchor text. If enough sites do it, Google will rank that page at the top of its listings for searches on that text. George W. Bush's biography page on the site of the White House is still number one in Google for the query "miserable failure" about half a year after that particular Googlebomb was created. Whether or not you personally agree that those words do a good job of describing Mr. Bush, Google accepts what it sees as the opinion of the general online community. If enough pages tell Google that "miserable failure" means George W. Bush, then as far as Google is concerned, it must be true.
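Coming back to everyday optimization, compare these two links to the same hypothetical page. The first tells users and spiders what the destination is about; the second tells them nothing:

<a href="http://www.example.com/copywriting.html">Search engine copywriting services</a>

<a href="http://www.example.com/copywriting.html">Click here</a>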

Julia: Another thing we hear a lot about is Pagerank™—a tool webmasters often use to determine whether a site is worth linking to or not. What does this mean?

Bob: PageRank (not to be confused with "page rank") is a part of Google's algorithm for ranking pages. There are numerous theories as to how it's calculated, but only Google knows for certain. In any case, that's not important to this discussion. What matters is that PageRank is a measure of the value of a page based on the links pointing to it, the value of the pages on which those links reside, and the number of other links that are on those pages. It's strictly numerical, and has absolutely nothing to do with relevance or value to the reader. In other words, if I have a page about Shakespeare, and I link to two pages, one about Shakespeare, and the other about the care and feeding of parakeets, the same amount of PageRank will be passed to both of those pages. The fact that one of those pages is about the same subject as my page does not enter into the calculation.
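That said, the formula published in the original Stanford paper by Google's founders, which may well differ from whatever Google actually runs today, captures the idea:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1 through Tn are the pages linking to page A, C(T) is the number of links on page T, and d is a damping factor, usually quoted as 0.85. Each linking page divides its contribution evenly among all of its links, and the subject of those links never enters into it, which is exactly the Shakespeare-and-parakeets point.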

You can see an estimation of the PageRank of a given page if you have the Google toolbar installed. But it’s important to keep in mind that PageRank is not everything, nor is it the most important thing. It’s one of many factors Google takes into account when it ranks pages for queries, and it’s not at all uncommon to see that a site that ranks on the top of a SERP (search engine results page) has a lower PageRank than the pages below it on the SERP.

One of the reasons people believe that PageRank is important is that if you do a backlink check in Google by typing “link:www.site.com” in the search box, you’ll generally (though not always) only see pages that link to the URL in question and have a PageRank of 4/10 or higher. People have taken this to mean that a link from a page with a lower PR doesn’t count, and that simply isn’t true. It’s true that, all other things being equal, the higher the PR of a page linking to yours, the more PR it’s going to pass to your page, but as I said, PR is just one aspect of Google’s algorithm, and every link apart from the troublesome ones we spoke of earlier has some value.

It’s also worth keeping in mind that a page that shows a PR of 2/10 in the toolbar today may have a 5/10 or 6/10 a few months from now. When I’m looking for sites from which I may wish to request links, the only time what I see in the toolbar matters to me is when I see that it has no PageRank at all. Assuming the site isn’t new, that can sometimes be an indication that the site has done something which caused Google to demote it. That is, it may be what Google refers to as a “bad neighborhood,” and as such, you should be extra careful in checking it out before you agree to link back to it.

Julia: Thank you so much for sharing your knowledge, Bob! I hope you all will check out Bob's site at: for more information about his company.

About the author: Julia is an independent copywriter and consultant specializing in search engine marketing and copywriting, direct mail, print advertising and other marketing materials businesses need to increase sales. Learn more about how Julia can help boost your profits by visiting www.juliahyde.com. Or email info@juliahyde.com. She'll get back to you right away.

Wednesday, July 26, 2006

.com Not Listed in Regional Yahoo? Don't Despair!

Author: Glenn Murray

If you're a non-American business with a .com web address, and your regional Yahoo ranking is important to you, then my story might interest you.

Recently my copywriting website dropped out of Yahoo's Australian rankings. For quite a while, it had been at number 1 for my primary keywords "advertising copywriter", "copywriter", and "website copywriter". But then it suddenly disappeared. I clicked through about 10 pages of results, and it was nowhere to be seen. I then searched for my domain, and Yahoo couldn't find it.

Something smelt fishy.

I'd done nothing 'naughty' to my site to warrant a ban, and I still had heaps of links to my site (actually, I had more than ever before).

I'm an Australian advertising copywriter. I'm based just north of Sydney and I host my website with a major Australian host. But my web address is a .com, not a .au. I started thinking this might be the problem.

So I emailed Yahoo support, explaining the problem, and sharing my thoughts on the cause.

And all of a sudden, nothing happened.

So I waited. And I waited. And I waited. And finally, after a couple of weeks, I received an email from a Yahoo support representative informing me - incorrectly - that my keyword wasn't featured in my page title or description. I should remedy this shortcoming, they said, and re-submit my site to Yahoo.

Frustrated, I replied. I repeated the important facts from the first email just to ensure they'd listened. They hadn't. They hadn't even searched for my domain to confirm that Yahoo no longer recognised it.

When they got back to me this time, they had started paying a bit more attention. The support rep confirmed my suspicion that Yahoo had excluded my site because of its .com URL. Her very helpful solution was that I should change my domain to .au! She included some ridiculously complex instructions for how to do so, and sent me on my merry way.

As you might expect, I wasn't satisfied. Nor was I merry. I explained to her that this was not an acceptable solution because all the links to my site on the internet are pointing to the .com and my email address uses the .com.

She was unmoved. She asserted that this was the best and only way to solve the problem. Oh... and it might help if I added my primary keyword to my title and description.

My laughter was not good humoured! I wrote back expressing my displeasure at this "solution". I painstakingly explained that Yahoo had made a mistake, and that if Google was capable of recognising my Australian business despite its .com address, it must be technically possible for Yahoo to do the same. I also cited several other .coms in the first couple of pages of Australian results.

No response.

The situation didn't look promising...

If this sounds like a familiar story to you, don't despair. A week or two later, I searched Yahoo Australia for my primary keyword, and surprise, surprise... My site was ranked number 1 again!

The moral of the story? Don't be intimidated by Yahoo. Trust your instincts and don't give up. If you're an Australian business with a .com, and you're not listed in Australian searches, this might be why. In fact, I'd say this story is relevant to all regional Yahoos. (Of course, before making any accusations, it's a good idea to make sure your site is properly optimised and that you have plenty of inbound links.)

Anyway, that's my story. I hope it helps someone.

And they all lived happily ever after. So far at least...

Yahoooooooo!

The End.

About the author: * Glenn Murray is an SEO copywriter and article submission specialist. He is a director of article PR company Article PR and also of copywriting studio Divine Write.

What is the Robot Text File?

Author: Alan Murray

The robots text file is used to deny specific search engine spiders - or all of them - access to folders or pages that you don't want indexed.

Why would you want to do this?

You may have created a personnel page for company employees that you don't want listed. Some webmasters use it to exclude their guest book pages so as to avoid spam. There are many different reasons to use the robots text file.

How do I use it?

You need to upload it to the root of your web site or it will not work - if you don't have access to the root then you will need to use a Meta tag to disallow access. You need to include both the user agent and a file or folder to disallow.
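If you do have to fall back on the Meta tag, it goes in the head section of each page you want kept out of the index, and looks like this:

<meta name="robots" content="noindex,nofollow">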

What does it look like?

It's really nothing more than a "Notepad" type .txt file named "robots.txt". The basic syntax is:

User-agent: [spider name here]
Disallow: /[filename here]

If you use User-agent: *, the * acts as a wildcard and the rule applies to all spiders. You may want to use this to stop search engines listing unfinished pages. To disallow an entire directory, use Disallow: /mydirectory/ and to disallow an individual file, use Disallow: /file.htm. You have to use a separate line for each disallow; you cannot, for example, use Disallow: /file1.htm,file2.htm. Instead you should use:

User-agent: *
Disallow: /file1.htm
Disallow: /file2.htm
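Putting it all together, a complete robots text file might look like this (the directory and file names are examples only):

# Keep Google's spider out of the personnel directory
User-agent: Googlebot
Disallow: /personnel/

# Keep all other spiders away from two unfinished pages
User-agent: *
Disallow: /draft1.htm
Disallow: /draft2.htm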

For a list of spider names, visit http://www.robotstxt.org/wc/active/html/ and make sure you use the right syntax - if you don't, the file will not work. You can check your syntax at http://www.searchengineworld.com/cgi-bin/robotcheck.cgi

For help creating robots text files, there is a program called RoboGen. There is a free version and an advanced version, which costs $12.99: http://www.rietta.com/robogen/

About the author: Alan Murray is a Certified Internet Webmaster Professional who provides SEO services and website design.

Tuesday, July 25, 2006

Find Google PageRank fast and boost your ranking

Author: Ashish Thakkar

Manually going through each and every page of Google results, for every keyword related to your site, to find pages and link partners with high PR is a big hassle. First you have to search Google for the sites, then find the PR of each site, and after that you need to find the backward links to that site. You would have to do this for each and every link, because there is no way you can find enough high-quality link partners in one search (unless you buy a database of high-PR sites worth many dollars). This takes a lot of time and effort. With the Jvw PageRank Finder software you can find high-quality links in just seconds. It is an ideal tool for finding valuable domains. Now you might ask: why PageRank? Because PageRank is Google's way of deciding a page's importance, and it is one of the important factors that determines a page's ranking in the search results. Jvw PageRank Finder helps you find valuable domains with high PageRank.

About the author: Web and software developer at Jimmy's Value World, Inc since 1999.

Optimizing Your Web Site for the Search Engines Using CSS and Javascript

Author: Michael L. White Copyright © 2003-2004 All Rights Reserved.

Two of the greatest techniques to come along for web site refinement are cascading style sheets (CSS) and javascript navigational menus. In this article, I want to show you how to use both of these to ease the strain of site maintenance while defending against at least two problems with using javascript menus.

CSS can make web site maintenance much easier by consolidating a site's style and appearance attributes into one central file which can be edited alone and yet affect the look of the entire site. Just as wonderful, one javascript file can accomplish a similar effect with your site's navigational menu by making it available to every page on your site through a single line of code per page linking that page to the javascript file. By moving all this CSS style and javascript code into two separate files, you will clean up your web pages' textual content, thus making it easier for search engine spiders to crawl and index your site and more effectively rank it according to your actual textual content. These are definitely two techniques worth implementing.

Here are the examples to show you how this is done. First, here's how your web page incorporating both the CSS and the navigational menu javascript file should look:

<HTML>
<HEAD>
<TITLE>Your Page Name</TITLE>
<LINK TYPE="text/css" MEDIA="Screen" REL="stylesheet" HREF="http://www.yourdomain.com/your_css_file.css">
</HEAD>
<BODY>
<DIV ID="center">
<H1>Your Page Name</H1>
Your page's textual content goes here....
</DIV>
<DIV ID="left">
<!-- Your navigational menu is inserted here from your javascript file, using the following line of code. See the next example for sample code for the navigational menu javascript file. -->
<SCRIPT LANGUAGE="javascript" TYPE="text/javascript" SRC="http://www.yourdomain.com/your_nav_menu_file.js"></SCRIPT>
</DIV>
</BODY>
</HTML>

Now, here's how your navigational menu javascript file should look:

document.write('<a href="http://www.yourdomain.com/your_web_file1.html">Page One</a><br>');
document.write('<a href="http://www.yourdomain.com/your_web_file2.html">Page Two</a><br>');

You can add as many menu items as you need, so you get the picture.

Finally, here's the part of the code in your CSS file which gives your site the table-like look without the high-maintenance, cluttered effect of the HTML TABLE code:

...other CSS code, such as font style, etc., can precede the following segment.

The #left and #center blocks of code below correspond to the left and center columns on your web page. You can also add a #right and #top column and section, respectively, if you so desire.

#left {
position: absolute;
top: 0px;
left: 0px;
width: 220px;
padding: 10px;
margin: 5px;
background-color: #f2f2f2;
}

#center {
top: 0px;
margin-left: 230px;
padding: 10px;
}

Hopefully, those examples give you a fairly good idea of the benefit of using these two powerful practices. For more about using CSS, I can recommend downloading the sample chapters from Dan Shafer's book, HTML Utopia: Designing Without Tables Using CSS, at SitePoint.com.

Besides these two optimization techniques, however, we're also hearing about all kinds of ways to optimize our web sites for the search engines these days. The competition for those coveted top placements is fierce, for sure. We've heard all about how important it is to have good, pertinent content in the textual portion of our pages, how effective it can be to include our site's keywords within the alternate attributes (e.g., alt="keyword") of our image tags, and how valuable a link to/from a high-traffic, like-minded web site can be. All this is certainly true and well worth the effort to make our web pages rank higher in the search engines, but with all this improvement to web site maintenance, what is the downside? Well, take note, so you can say you saw it here first.

I've detected two pesky problems in this web page wonderland. One is the absence of navigational links for search engine spiders to follow, and the other is the possibility of javascript-disabled web browsers. That's right; as fabulous as it is to store our navigational menu in one javascript file for easier updating, it removes all the key links from our start page so the search engine spiders have no other pages left to index on our site, and javascript-disabled web browsers can't see a menu at all! What's a webmaster to do? Well, here's how I decided to handle it.

I put my navigational menu with its various links to all my site's other pages on two key pages: the start page and the site map page. This way, when the search engine spiders come calling, they can follow every link from my navigational menu to every other page on my site, and, at least, javascript-disabled web browsers will still have a menu to follow. The same is true of my site map page. For all the rest of my pages, however, I decided to leave intact the line of code calling the javascript file containing my navigational menu in order to take advantage of its centralization benefits. The more pages I add to my site over time, the more beneficial this approach will be, too. I see it as having the best of both worlds: easy site maintenance and search engine optimization.

So, if you want to lighten your web site maintenance load while keeping your site optimized for the search engines, I recommend using CSS to consolidate your site's style attributes (including a tableless, yet table-like, appearance) and a single javascript file to centralize your navigational menu. Just don't remove your navigational links from your start and site map pages.

You can visit either of my two web sites at http://webmarketersguide.com or http://www.parsonplace.com to see how I've done this. You're welcome to email me anytime at info@parsonplace.com with any questions or comments.

About the author: Michael L. White is an Internet entrepreneur who currently manages two web sites: The Web Marketer's Guide http://webmarketersguide.com, which provides resources for Internet entrepreneurs to create, market, and manage a small business on the Internet, and Parson Place http://www.parsonplace.com, which has a more personal bent. Both have subscription-only newsletters to keep you well abreast of news and information.

Google Local Search And The Impact On Natural Optimization

Author: Rob Young

With the advent of Google Local, a service that helps Web users find local businesses by typing in a search term and a city name, many questions arise concerning its impact on Natural Optimization.

Google Local tracks down local stores and businesses by searching billions of pages across the Web, and then cross-checking these findings with Yellow Pages information to locate the local resources Web users wish to access. In addition to local business listings and related Web links, Google Local also provides maps of the desired region and directions made available by MapQuest. This makes Google Local convenient for Web searchers and extremely useful for local businesses, provided their sites are optimized for local searches. If not, some businesses could be missing out on a tremendous increase in local site visibility and traffic.

Case in point: The Home Depot, whose Web site features its own Store Finder with zip code-accessed location listings. Type "Home Depot" into Google Local and while a list of local stores appears, no related local landing pages come up. In fact, none of the related Web links even directs Web users to Home Depot's home page. Most large sites with retail stores have a search feature or an "enter your zip" option, but Google and other search engines will never be able to index content hidden behind such forms. For retailers looking to increase sales and traffic from their Web sites, this could prove to be a big problem.

The Home Depot is not alone. Countless other large and small businesses alike do not have city-oriented pages accessible through local search sites. Many are not listed in the top 15 results for related keywords on Google Local, despite being in the immediate proximity of the search location. Google Local ranks listings based on their relevance to the search terms the user enters, not solely by geographic distance. This means that unless your site has a city- and/or county-oriented landing page for each location, Google will not be able to access your contact page, no matter how relevant your site is to a search term, or how close you are geographically.

Natural Optimization specialists have never really focused on optimizing the contact and location pages of websites, but it's now becoming a vital way to drive more qualified traffic to a site. To make a site local search-ready, start by creating a sitemap that includes every store location, and then build an individual landing page for each location with a brief overview of the store along with a map and detailed directions. Without this, Google does not have a path to index the pages and information. This small step will increase your qualified traffic as well as sales in your retail store or business.
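As a rough sketch, a landing page for a single store need not be complicated. Something along these lines, with the store name, city, and address invented here for illustration, gives Google Local a city-specific page to index:

<html>
<head>
<title>Acme Hardware - Springfield, Illinois Store: Map and Directions</title>
<meta name="description" content="Location, hours, map and directions for the Acme Hardware store in Springfield, Illinois.">
</head>
<body>
<h1>Acme Hardware in Springfield, Illinois</h1>
<p>123 Main Street, Springfield, IL 62701 - open 7am to 9pm daily.</p>
<p><a href="springfield-directions.html">Map and detailed directions</a></p>
</body>
</html>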

By making your keywords city-specific and including more location-specific information on your site, Google Local can access your contact information and, as a result, drive more related traffic to your site.

Take Hard Rock Café. Their Web site is an ideal example of a site that is perfectly optimized for local search engines like Google Local. When you enter it as a search term, Hard Rock Café's number one listing links to the site's restaurant location page. Search users can instantly access information on Hard Rock Café in general, as well as learn more about locations and contacts.

Local search is one of the most hyped areas of development in the Search industry today. Other Search engines including Yahoo!, Ask Jeeves, MSN and CitySearch are hot on Google's tail to perfect their own versions of local Search Engines. Soon, not having your site optimized for local Search Engines will make your business's site obsolete. The impact of local search is already apparent, and it is still only in its infancy.

About the author: Rob Young, Manager of Natural Optimization and Creative Director of full-service interactive marketing and advertising agency UnREAL Marketing Solutions, has been with the company since its inception in 1999. Young oversees the Natural Optimization and Creative departments.

The Road to Better Results

Author: Shawn Campbell

A lot has changed in the way sites are optimized for search engines since last year. For one thing, Google is not the only search engine worth looking into anymore; Yahoo has definitely managed to take away some of Google’s oomph over the past twelve months. Another important change is that the intelligence of the search engine spiders and algorithms has increased dramatically. So without further ado, I will present you with a standard search engine optimizing process.

Keyword Research

Nothing can be done until you know what your target phrases are. Keyword research must be done to find out what people are actually typing into the search engines. For example, do they type in “medical insurance” or “health insurance” more often? Is it worth targeting the keyword “dental insurance”? What do your competitors think their clients type?

Keyword research usually begins by asking the client what they think are good keywords and by looking at your competitors' Meta tags and text. You then have to brainstorm to find new and related keywords that were not previously thought of. The use of Wordtracker, Overture, and Google AdWords' estimates is indispensable. If you use the “KEI” offered at Wordtracker, don't fall into the trap of giving it too much worth. It is a good tool to help discover keywords that have not been exploited by the competition, but the really important number is the amount of traffic each keyword generates. Finally, create a chart to determine the relationship between the keywords used. For example, there is no point promoting dental insurance if your site does not offer it.

Texts

The next step is to write the text. Hire a specialized writer to put the text together, ideally someone who has been trained in Internet writing, Internet marketing, and search engine optimization (SEO), or get advice from professional SEOs, marketing experts, and usability experts. Work with the client to get a feel for what is needed for the site. Then use all these skills to strike the delicate balance needed between selling to people, selling to search engines, and making the text interesting and useful to read.

Domain Selection

Once the text is written, come up with a catchy domain name for the site. Try to include part of the keyword in the domain, and to think ahead so that the domain can be expanded into the title. Our site www.gloriousbahamas.com is a good example of a domain with a keyword in it that is catchy and clearly stated. The keyword for that site is “Bahamas real estate”, so having part of the keyword in the domain will help in the long run.

Title and Meta Tags

From the domain name, you can then create a title with the full main keyword in it (such as Glorious Bahamas Real Estate). The title is the most important text on the site. The Meta tags include the description tag and the keyword tag. The description is what searchers will see in many search engine results, so it must have the keywords in it and, more importantly, it must sell the site. Write a description that is objective, not subjective. Zeal has some good advice for titles and especially description writing at http://zeal.com/guidelines/style/site_titledesc/. The keyword tag is there just in case some engines still use it (though very few still do), so don't pull your hair out over it. Just list 10-15 keyphrases and try not to repeat any single word more than three times.
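Using the Bahamas site as the example, the head of the page might end up looking something like this (the description and keyphrase wording is illustrative, not the site's actual code):

<title>Glorious Bahamas Real Estate</title>
<meta name="description" content="Bahamas real estate listings: homes, condos and land for sale across the islands.">
<meta name="keywords" content="Bahamas real estate, Bahamas homes for sale, Bahamas property">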

New Content

Now we come to the meat of today's search engine optimization. So far, we have not discussed anything new or original; these are the same strategies that have been used since I first got into the business of SEO in 1998. Today, with smarter engines, a site needs to be something that is cared about. A site has to grow, develop, and expand as if it were someone's baby. Gone are the days when you could build a site, get good listings, and then forget about it as it brought in the traffic and the dough. Take care of your site by adding useful content to it on a regular basis, and the site will gradually grow from a few pages to dozens of pages. Not only will this make the site seem more alive - radiating with the healthy glow of a developing child - but it has the added benefit of increasing the amount of content the site contains, and thus the number of keywords found within it. For example, with www.canada-health-insurance.com, we add pages with more details about dental coverage or about government coverage for each province. Every month there are new pages, so that every time the spider comes back to visit, it spends more time at the site reading new content. This is one half of the key to getting good listings in the search engine results pages (SERPs).

Link Campaigns

The second half of the key is getting good sites to link to yours. Go after web sites with related content, sites with good authority in your web site's field, and sites that are “popular”. Getting reciprocal links is not the goal; getting the aforementioned sites to link to you because you have good, valuable content is. Sites that do reciprocal linking usually have hundreds of links on their link pages, and these will add very little value to your site. Don't waste your time with reciprocal linking. Only link to a site if doing so will increase the value of your site in the eyes of your clients.

A link campaign is a lot of work, and it involves a lot of frustration and rejection. You have to approach bigger sites and sell the value that linking to your site will bring them. For every 20 sites you approach, you will be lucky to get one to link to you. You have to be persistent, consistent, and determined.

Conclusions

Optimizing a site is no longer something you can do and then forget about. For a site to succeed in the search engines today, it has to constantly be changing and growing either in content or in links, and ideally in both. It has to appear that the site is the life and soul of its creator, and that somebody cares enough about it to pay attention to it. Because after all, if the creator doesn’t care, why should the search engines?

About the author: Shawn Campbell is an enthusiastic player in the ecommerce marketplace, and co-founded Red Carpet Web Promotion, Inc . He has been researching and developing marketing strategies to achieve more prominent listings in search engine results since 1998. Shawn is one of the earliest pioneers in the search engine optimization field.

SEO for CEOs - Search Engine Optimization Unmasked for CEOs

Author: Glenn Murray

If you're like most other CEOs, the term "search engine optimization" will mean very little. Either that or it means expense! But it doesn't have to be that way... If you feel like you're standing in a dark room handing money to strangers to get you in the search engines, then this article is for you.

This is an article written by a business owner for other business owners and CEOs. It explains Search Engine Optimization (or SEO) in layperson's terms. It won't make you an expert, but it will give you some insight into what you're spending your money on, what you should be spending your money on, and just as importantly, what you shouldn't.

But before launching straight into an explanation of SEO, let's talk a bit about search engines. Approximately 75%-80% of website traffic comes through search engines. What's more, research shows that most people don't look beyond the first 2 pages of search results. This means if your website doesn't rank in the first 2 pages of the major search engines, it's only receiving 20% of its rightful traffic... and revenue. (And remember, being ranked number 1 when you search for your company name or web address doesn't count. You need to rank highly for the words your customers use at search engines.)

The biggest concern for search engine companies like Google, Yahoo, etc., is finding content that will bring them more traffic (and thus more advertising revenue). They do this by using complex algorithms to determine whether a site is useful and should be included in their search results.

This is where SEO comes in.

SEO is the art of ranking in the search engines. Nothing more, nothing less.

SEO means creating your site such that the search engines consider it useful. The two main weapons in your arsenal are:

· Keywords
· Links to your site

KEYWORDS

Figure out what words your customers are looking for at search engines, and use those words at your site. By frequently using keywords that are important to your customers, you tell the search engines what you do. These keywords are used in your copy and in the code behind the page. Generally speaking, the more you use the keywords, the more relevant you are to searches in that field.

Keywords in Your Copy

The use of keywords in your copy is easy to understand. But it's not easy to do. You can't just pepper your site with a meaningless array of words. The trick is using the most important keywords a lot without compromising the readability of your copy. It's a balance between writing for the search engines and writing for your reader.

TIP: If you find this too time consuming, a website copywriter can take care of it for you. And if you know your keywords already, it should cost you no more than normal web copy.

Keywords in Your HTML Code

The use of keywords in your HTML code is harder to understand, but it's easier to do. There are four main places these keywords are used:

· Keywords
· Description
· Alt
· Title

TIP: When you hear people talking about meta tags, this is what they're talking about. To see how meta tags are used in practice, go to Google and pretend you're a customer. Search for something your customers would search for, e.g. if you're in car audio, search for "car audio". Click on the first couple of results to bring up their websites. Right-click on the home page, and select "View Source". You'll see a whole lot of code. You can ignore most of it. What you're looking for are the following...

<meta name="KEYWORDS" CONTENT="keyword 1,keyword 2,keyword 3">

<meta name="DESCRIPTION" CONTENT="Meaningful description of page using the main keywords">

<img src="filename.gif" alt="Meaningful description of picture using the main keywords">

<title>The title of the page using the main keywords</title>

Take a look at the way the creators of the site have used keywords in these areas, and follow their lead. You already know they're ranked highly, so chances are they've done a good job. Alternatively, take a look at my site, http://www.divinewrite.com, to see how I've done mine.

Links to your site

Now that you know how to tell the search engines what you do, let's talk about how to convince them you're important.

Links to your site (or "inbound links") are the most important factor in ranking. The more links you have to your site from other sites, the better your ranking (and links from related sites generate better rankings still).

TIP: Think of the Internet as a big election. All the websites in the world are candidates, and all the links to those websites are votes. The more votes (links) a candidate (website) has, the more important it is, and the higher its ranking.

There are many possible ways to generate links. Some are dubious (like auto-generation software). Others are legitimate, but offer limited results (like asking customers and suppliers to list you on their sites, and adding your site to various business directories). You can experiment with these methods, but I've always found the best way to generate inbound links is article PR - write helpful articles and let publishers of newsletters and e-zines use them for free - on the proviso that they link back to your site.

People who publish e-zines and newsletters are always hungry for quality content. And there are websites out there dedicated to giving them just that. If you submit a well written, relevant, helpful article to one of those sites, you can have thousands of newsletter publishers ready to snap it up. Then you just sit back and watch the links multiply!

TIP: Like traditional PR, article PR is beneficial in other ways too. Readers of your article will see that you know what you're talking about, and because you're published, they'll see you as an authority.

It's impossible to say how much time you'll need to spend generating links. You just have to keep at it until you have achieved a high ranking. Even then, you'll still need to dedicate some ongoing time to the task, otherwise your ranking will drop.

Summary

So to cut a long story short, it comes down to this. If you have a lot of the right keyword phrases, used in real sentences, distributed realistically throughout your site, and a lot of links from other relevant sites, you stand a good chance of being ranked highly.

That's what you're paying your providers for. And that's what SEO is all about.

About the author: * Glenn Murray is an SEO copywriter and article submission specialist. He is a director of article PR company Article PR and also of copywriting studio Divine Write.