Monday, September 28, 2009

What is The Google Sandbox Effect?

In the age of fair competition you may find it hard to believe that a search engine would hinder the appearance of a new website, yet this is what is widely believed to be happening to more and more websites today. Some webmasters have observed that Google seems reluctant to rank newer websites until they have proven they can survive for more than "x" months. Thus the term "Sandbox Effect": the idea that all new websites have their ratings placed in a holding tank until enough time has passed for ranking to commence.

However, it is not the website itself that is hindered so much as the links pointing to it from other sites. Newly created links are put on "probationary" status until they pick up rank from other matured sites or are placed directly by an ad campaign. The idea behind the hindrance is to prevent a new website from ranking quickly. The usual holding period seems to be between 90 and 120 days before a site starts gaining rank from reciprocal links or back links.

One piece of advice that has been given is to have the companies you are going to reciprocate with add your link to their website first. This may help grandfather your site in, reducing the waiting time associated with "new" websites. People have noticed a PageRank of 0 when first launching, followed by a healthy PageRank of 7 after 4 months. Why the delay? The argument goes that if people realized how easy it was to get a high ranking, it would take away the credibility of the engine. It depends on whom you ask, but it does seem to be happening frequently to newer sites. Do not discontinue back linking; your rank will eventually appear.

Jakob Jelling is the founder of http://www.sitetube.com. Visit his website for the latest on planning, building, promoting and maintaining websites.

Sunday, September 27, 2009

What is PageRank?

Chances are you have been on the Internet, surfing in and out of websites looking for valuable information on a favorite topic or researching a subject for school or work. You type the keyword(s) matching the information you are searching for into Google, and you come up with 10,000 pages of results. It's virtually impossible to go through every one, so you refine your search by adding more exclusive keywords. Voila: the number of pages drops to around 1,000. That is still a lot of pages, but you start looking through the information to find what you want.

As you go through the first 10 links on the page, WHAM! The information you needed was in the first or second result, in order of PageRank. You wonder: how did they get such a high rank on Google? You may think it was very expensive to get that site to the top of the heap. The funny thing is, with a little know-how and about $75, you too can go for the top.

Search Engine Optimization, or "SEO", has become a standard in the web design industry. Every customer of a good web designer wants to be number one for their keyword and may be willing to pay extra money to get there. A good web designer will dress up a web site's home page to match the requirements of their client on specific keywords. The client will also pay more for the exclusivity of remaining there untouched. SEO has become a niche for a lot of web companies: they know that if they can get a company to the top fast, the word of mouth will be helpful to their business.

Through specialized META tags (a hidden group of keywords) the web designer will strategically place keywords multiple times in the title bar, in the keyword list, and even as hidden text. Some search engines have figured out these tricks of the trade and have banned certain websites from their indexes. Google has become the engine of choice for a lot of people today. Google uses a different logic to calculate rank, and keywords are only a portion of it.
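
To make that concrete, here is a minimal sketch in PHP of the kind of keyword-heavy head section the article describes; the keyword and description values are invented for illustration, not taken from any real site:

<?php
// Illustrative only: emitting a keyword-focused title and META tags.
// The keyword and description values here are made-up examples.
$keyword = "children's shoes";
echo '<title>' . htmlspecialchars($keyword) . ' - Shop Online</title>' . "\n";
echo '<meta name="keywords" content="' . htmlspecialchars($keyword) . ', kids footwear, shoes">' . "\n";
echo '<meta name="description" content="Hard to find ' . htmlspecialchars($keyword) . ' in every size.">' . "\n";
?>

Note that these same repeated-keyword tricks are exactly what, when overdone, the article says can get a site banned from an index.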

Google actually uses a specialized mathematical equation to place your site in a predetermined order. First things first: if your domain name is a keyword, that does not automatically give you a top spot. It will take time to move up the ranks, and you should register with Google as soon as possible to start driving your rank upwards. But just having the right URL (Uniform Resource Locator) doesn't guarantee the top spot either. You must also be swapping or reciprocating links with other sites. The more links you have from websites already indexed in Google, the faster and higher your site will climb in the ranks.
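
For the curious, the core of that equation is public: the original Brin and Page paper defines PageRank roughly as

PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)

where T_1 through T_n are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor usually set around 0.85. In other words, a link counts for more when it comes from a page that is itself highly ranked and does not scatter its vote across too many outbound links.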

A Google robot will visit your site frequently, so continue to modify your code and keep checking your rank and status. Eventually, your site will climb the ranks and land on top. It may take time and work, but you will get the hang of keeping it there once you employ the right mix of keywords and links. Some companies charge up to $1,000 for the top spot; they employ the same techniques, even though they don't want you to know this. Keep your META tags, title, keywords and content in line with each other and continuously look to optimize them. Under no circumstances should you take another person's keywords from their code; this is potentially dangerous, as you could be violating copyright laws.

Good Luck!!!

Jakob Jelling is the founder of http://www.sitetube.com. Visit his website for the latest on planning, building, promoting and maintaining websites.

Wednesday, September 23, 2009

What is the Google Dance?

For any good web developer, being able to time when Google will update its listing of your website, so you can refresh your content for better SEO (Search Engine Optimization), works in your favor. Welcome to the world of the "Google Dance". The Google Dance is simply the attempt to predict when the next actual index update will commence.

What really happens is that Google sends out spiders to crawl the Internet, usually working from DNS (Domain Name Server) tables; upon spidering all of the available tables it goes through each individual site and updates the content on Google.com. Thus, if you watch your rank on the Google Toolbar, you can tell when your overall PageRank has changed.

Understanding SEO means understanding what is necessary to improve your overall ranking. Webmasters have been looking for ways to improve the odds of guessing when the next spidering will commence. There are various spider versions and servers that go out and crawl thousands of servers at a time, and it takes time to relay and decipher this information back to the web servers that Google.com pulls its results from.

Some vendors have created programs that go out to the data centers themselves to find out approximately when the last index was kicked off. Depending on the information returned from the trace, the exact time and date can be pinpointed. Is there really that much of an advantage to doing it this way? It depends: if you have a vital update you want to optimize your site around, you may want to know when Google last visited your site for content, so you can keep your information fresh and relevant.

There are many data centers that Google uses to spider across the world. Each center covers a specific region, and all the information is gathered in aggregate and returned to populate Google.com. With this many data centers, the chances of continuous indexing are good, but not guaranteed.

Jakob Jelling is the founder of http://www.sitetube.com. Visit his website for the latest on planning, building, promoting and maintaining websites.

Monday, September 21, 2009

Got Spiders?

Many internet marketers blow mountains of start-up cash on their websites just trying to break into search engine rankings. I was one of these internet marketers.

I spent cash on get-rich-quick submission services that claimed they would submit my page to thousands of search engines. . .for a small fee. I spent money to get registered with big-name search engines. I even spent money on search engine optimization services.

And then I waited. . .

I waited to see SOME response--any response--from all of this work I had done. I waited for the spiders to reach my pages and rank my page for everyone to see. I waited for my blown marketing funds to come back to me in profits. . .

But they never did. The spiders took weeks. The page rankings were dismal. I was ready to give up.

And then I received this vital piece of information from a fellow marketer: the more links you have on bigger websites, the faster your page is spidered. . .

That's right: since search engines spider higher-PR sites more often, they will find your page a lot faster if you get your link on a highly-trafficked website!

Not only will they find your site faster, but they will also return to it faster and rank it for FREE. This is all in addition to receiving a huge PR boost for having a link on a high-PR site.

So. . .

If you haven't done it already, slap some meta tags on your site that are optimized for your content. Make sure you use keywords that are repeated multiple times in the text of your website, so that your pages will rank higher for relevancy.

Now begin your hunt for high-PR websites to post your link on. You can start by going to Google and searching for your specific service or product in quotes to find high-PR sites in your category.

You'll find out quickly that there are a number of ways to get your link on these websites. Some of them will agree to do a main-page link exchange for free. Others will allow you to submit articles along with a resource box that will give you a link on their website. And others won't allow you to work with them at all unless you have special qualifications or a high page ranking.

If you have a free product, like an eBook or some software, you can give away free downloads on a high-PR site. Just place an ad on the site with a link back to your website, where they can receive the free download. You can find these sites by googling phrases like "free content directory."

There are plenty of ways to bring in traffic with search engines without paying hefty registration fees. Start searching today. Find the big sites and get your link out there!

You may use this article for reprint if it remains unaltered and includes the author information and resource box. - Isaiah Hull

Isaiah Hull publishes Work At Home Right Now, a fresh and informative newsletter about making money on the internet and using proven methods to increase your site's traffic and profitability. If you're looking for time-saving and money-saving tools, as well as honest business advice, come by and subscribe at http://www.workathomerightnow.net

Search Engine Marketing 101 For Corporate Sites

When most people want to find something on the web, they use a search engine. Millions of searches are conducted every day on search engines such as google.com, yahoo.com, msn.com and many others. Some of those people are looking for your website. So how do you capture people searching for what your site has to offer? Through techniques called search engine marketing (SEM).

This tutorial provides foundational information for anyone looking to implement search engine marketing. It will also help you understand how the search engines work, what SEM is, and how it can help you get traffic.

What is a Search Engine?

All search engines start with a "search box", which is sometimes the main focus of the site, e.g. google.com, dmoz.org, altavista.com; sometimes the "search box" is just one feature of a portal site, e.g. yahoo.com, msn.com, netscape.com. Just type in your search phrase and click the "search" button, and the search engine will return a listing of search engine result pages (SERPs). To generate SERPs, the search engine compares your search phrase with information it has about various web sites and pages in its database and ranks them based on a "relevance" algorithm.

Search Engine Classes

Targeted audience, number of visitors, quality of search and professionalism are what determine a search engine's class. Each search engine typically targets specific audiences based on interest and location. World-class search engines look very professional, include virtually the entire web in their database, and return highly relevant search results quickly.

Most of us are familiar with the major general search engines: google.com, yahoo.com, msn.com. A general search engine includes all types of websites and as such targets a general audience. There are also the lesser-known 2nd-tier general search engines: zeal.com, ask.com, whatyouseek.com. The primary difference is that 2nd-tier engines are lesser known and generate significantly less traffic.

There are also several non-general or targeted search engines that limit the types of websites they include in their database. Targeted search engines typically limit by location, by industry / content type, or both. Most large metro areas will have local search engines that list local businesses and other sites of interest to people in that area. Some are general and some are industry specific, such as specifically listing restaurants or art galleries.

Many other targeted search engines list sites from any location, but only if they contain specific types of content. Most webmasters are familiar with webmaster tools search engines such as webmasterworld.com, hotscripts.com, flashkit.com and more. There are niche SEs for practically any industry and interest.

Search Engine Models

There are two fundamentally different types of search engine back ends: site directories and spidering search engines. Site directory databases are built by a person manually inputting data about websites. Most directories include a site's URL, title, and description in their database. Some directories include more information, such as keywords, owner's name, visitor rankings and so on. Some directories will allow you to control your website's information yourself; others rely on editors who write the information to conform to the directory's standards.

It is important to note that most directories include directory listings as an alternative to the search box for finding websites. A directory listing uses hierarchical groupings, from general to specific, to categorize a site.

Spidering search engines take a very different approach. They automate the updating of information in their database by using robots to continually read web pages. A search engine robot/spider/crawler acts much like a web browser, except that instead of a human looking at the web pages, the robot parses the page and adds the page's content to its database.
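
As a rough sketch of what one crawl step looks like, the toy PHP below fetches a page, extracts its indexable text, and queues the links it finds for later visits. The starting URL is illustrative; real spiders add politeness delays, robots.txt checks and deduplication:

<?php
// Toy crawler step: fetch one page, grab its text, queue its links.
$url  = 'http://example.com/';            // illustrative starting point
$html = file_get_contents($url);          // requires allow_url_fopen

$doc = new DOMDocument();
@$doc->loadHTML($html);                   // suppress warnings on messy HTML

// "Parse the page": the tag-free text is what gets indexed.
$text = $doc->textContent;

// Queue every hyperlink for a later visit.
$queue = array();
foreach ($doc->getElementsByTagName('a') as $a) {
    $queue[] = $a->getAttribute('href');
}
?>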

Many of the larger search engines have both a directory and a spidering search engine, e.g. yahoo.com, google.com, and allow visitors to select which they want to search. Note that many search engines do not have their own search technology and contract services from elsewhere. For example, Google's spider SE is their own, but their directory is the Open Directory; additionally, aol.com and netscape.com both use Google's spider SE for their results.

There are a few other search engine models of interest. Some search engines combine results from other engines, such as dogpile.com and mamma.com. There are also search engines that add extra information to searches, such as Amazon's alexa.com, which uses Google's backend but adds traffic data gathered by its toolbar.

Getting In

One of the most important things to understand about the SE database models is how to get into their database and keep your listing updated. With a search directory, a submission needs to be made providing the directory all the information needed for the listing. It is generally recommended that this be done by hand, either by you or by a person familiar with directory submissions. There are many submission tools available that advertise that they automate the submission process. This may be fine for smaller directories, but for the major directories, manual submissions are worth the time.

Not all search directories are free; many charge a one-time or annual fee for review. Many of the free search directories have little quality control. For free directories you may have to submit your site several times before being accepted.

There are three different methods for getting into spidering search engines: free site submission, paid inclusion and links from other sites. Virtually all spidering SEs offer free site submission. For most, you simply enter your URL into a form and submit. Paid inclusion is normally not difficult either, apart from the credit card payment. With free site submission there are no guarantees: the SE may send a spider to your site in the next few weeks, months, or never. Typically with paid inclusion you get a guarantee that the page you submitted will be included within a short amount of time. The other standard way to get included is to have links to your website from other web pages that are already in the SE's database. The SE spiders are always crawling the web and will eventually follow those links to find your site.

Once you are in a search engine database, you might change your site and need the search engine to update their database. Each directory handles this differently; generally each database will have a form for you to submit a change request. Spidering search engines will eventually find the change and add your updates automatically.

Getting High Rankings

Getting into a search engine database is only the first step. Without other factors working for you, you will not rank in the top positions, a prerequisite for quality traffic. So how do you get top positions? You can pay for placement with sponsored links, which are covered in the next section. To place well in the free, organic SERPs, you will need to perform search engine optimization.

Search engine optimization is one of the most complicated aspects of web development. Each search engine uses a different algorithm based on hundreds of factors, the algorithms are constantly changing, and the engines carefully guard them as trade secrets. Thus no one outside the search engines' employ knows with 100% certainty the perfect way to optimize a site. However, many individuals, called search engine optimizers, have studied the art and derived a set of techniques with a track record of success.

In general, there are two areas to focus on for top rankings: on-page factors and linking. On-page factors means placing your target keywords in the content of your site in the right places; the structure of, and technologies used on, your website also play a role. Linking refers to how other websites link to yours and how your site links internally.

Search Engines' Marketing Offerings

Search engines in the early days of the web were focused solely on serving the visiting searcher. They worked to capture as much of the web as possible in their database and provide fast, relevant searches. Many early website owners learned to reverse-engineer the relevancy algorithms and to make their sites "search engine friendly" to get top rankings. They were the first search engine optimizers, manipulating the search engines' natural or organic SERPs as a means of generating free web traffic.

Oftentimes these optimized sites compromised the integrity of the SERPs and lowered the quality for the searcher. Search engines fought, and continue to fight, to maintain the quality of their results. Eventually, the search engines embraced the fact that they are an important means of marketing websites. Today most search engines offer an array of tools to balance website owners' need to market with maintaining quality for the searcher.

You can generally break search engine marketing tools into free and for-pay. Realize these classifications are from the search engine's point of view; effort and expense are required to set up and maintain any search engine marketing campaign.

Organic rankings are still one of the most important ways to drive quality traffic. Search engines now seek to reward ethical, high-quality websites with top rankings and remove inappropriate "spam" websites. While organic rankings can produce continual free traffic, achieving optimum results takes time from an experienced individual. Additionally, organic placement offers no guarantees: it generally takes months to get listed, and results can be unpredictable once listed.

Some search engines offer services that add more control to your organic campaign. Most of these services will list / update your site faster or will guarantee that all essential content is listed. For integrity reasons, no major search engine offers higher organic rankings for a fee.

If you need top rankings quickly, pay-per-positioning (PPP) is the most popular way to go. PPP rankings appear alongside normal organic SERPs but are usually designated as "sponsored listings". PPP listings use a bidding process to rank sites. If you are the top bidder, i.e. willing to pay the most per click on a given phrase, you will have top placement; the 2nd-highest bidder is number two, the next is number three, and so on. While most PPP works using this model, some search engines offer modifications, such as Google's AdWords, where bid price and click-through rate are both factors in positioning.
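
The plain bidding model is easy to picture in code. Here is a hedged sketch with made-up bids and site names (an AdWords-style engine would weigh each bid by its click-through rate before sorting):

<?php
// Pure-bid auction: the highest bid per click takes position 1.
$bids = array('site-a.com' => 0.75, 'site-b.com' => 1.20, 'site-c.com' => 0.50);
arsort($bids);                            // sort by bid, highest first
$position = 1;
foreach ($bids as $site => $bid) {
    echo '#' . $position++ . ': ' . $site . ' at $' . number_format($bid, 2) . " per click\n";
}
?>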

Search engines have many other marketing tools as well, such as search-specific banner ads, listings on affiliate sites and more.

Getting Started

The majority of websites have sub-optimal search engine marketing. Most sites have no effective search engine marketing at all and are continually missing out on valuable leads. Many other websites are too aggressive, wasting money on low-value traffic or harming the functionality of their site through over-optimization. Too many sites are even paying money and receiving no results because they have trusted unethical or inexperienced search engine optimizers.

All SEM campaigns should start with a strategic evaluation of SEM opportunities based on return on investment (ROI). You need to assess how much each lead is worth for each keyword phrase and determine which SEM tools will achieve the best ROI for the phrase.

You also have to decide how much you want to do in-house vs. retaining an expert. A qualified expert will typically produce better results faster, but the higher expense may destroy the ROI. Often it is best to work with an expert as a team: the expert develops the strategy while internal staff performs implementation and ongoing management.

Tom McCracken is the Director of LevelTen Design, a Dallas-based e-media agency. He has over 14 years of experience in software engineering and marketing. He has developed solutions to improve customer service and communications for some of the world's largest companies. With an education in chemical engineering and economics from Johns Hopkins University, his background includes web and software development, human factors engineering, project management, business strategy, marketing strategy, and electronic design.

Thursday, September 17, 2009

Choosing a good domain name isn't always so simple.

So you need a domain name for your brand new internet business. You may even have some cool ideas for a new domain name combination that will really impress your friends. Question is, is your new domain name going to help your business or hurt it?

What could be simpler than choosing a domain name right? Wrong. There are a number of things you need to consider and research before you register your favorite domain name.

First off, what is a domain name and why would I want one?

A domain name makes our lives much easier when surfing the internet. You see, all computers on the internet are actually referenced by what are called IP addresses. On the internet, IP addresses are four sets of numbers that serve like street addresses, allowing two computers to talk over a network. An example of an IP address is the one for Google.com: 216.239.39.99. If you enter this IP address into the address bar of your browser, it will bring you to Google's home page in the very same way that typing www.google.com would get you there. Unfortunately, we humans have difficulty remembering our phone numbers, let alone so many digits for all kinds of sites. That's one of the main reasons domain names were invented.
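
You can watch this name-to-number mapping happen yourself. In PHP, for instance, a single built-in call performs the lookup your browser does behind the scenes (the address returned will vary over time and by location):

<?php
// Resolve a domain name to the IP address a browser actually connects to.
echo gethostbyname('www.google.com');
?>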

Domain names make it easy for us humans to remember how to find a site. Most people know Google.com, and anyone familiar with the internet knows that to reach Google, you simply type www.google.com in your address bar and you are transported to their website. The same goes for Disney.com, Microsoft.com, CNN.com, etc.

Now you would think that choosing a domain name would simply be a matter of choosing something that is unique and that people would remember. The problem with that approach is that most of us don't have the money needed to turn our name into a brand name on the mass market. Most of us need to rely on our prospects reaching our website through other means. The best of these are search engines.

Choosing a good domain name for your site starts with the main keywords you have chosen to focus on for your website. Before you launch your business, you should conduct some preliminary research online to determine which keywords have the most traffic and the least number of other websites competing for that particular keyword. Some tools that help in this are the Overture keyword suggestion tool and Wordtracker.com. Both of these tools will give you a rough idea of how much traffic each of your chosen keywords will likely get each month. This helps to determine which keywords to focus on.

Should you choose a domain name that includes your main keywords?

In most cases, the answer is yes. Google, and to some degree Yahoo, both give you a small boost for keywords in your domain name. If your domain name happens to contain your targeted keywords, it will help you in your quest for higher search engine rankings. Now, if you do everything else wrong, having your main keywords in your domain name will not magically catapult you to the top of the listings; many other parts of your site must be working for you as well. Other things you can do to improve rankings are beyond the scope of this article.

Choosing a keyword rich domain is a smart business move.

For some sites, it could be the edge they need to move up a few spots in the search engines. When choosing a keyword rich domain name, you may want to consider hyphens between your keywords. An example is cheap-airline-tickets.com. Current research trends for Google and Yahoo suggest that hyphens are the only way to separate keywords within a URL that will give you a rankings boost.

Why not simply choose your company name? Simple. Is your company a household name? Are you so dominant in a category that people have stopped referring to the generic name of your category and use your brand name like Kleenex has for tissue paper? If so, register your company name. If not, register a keyword rich domain wherever possible.

You may be thinking, "But I already own a domain name that is my company name. Should I go and register a new domain and point it to the same site?" The short answer is no. Years ago, you could improve your rankings on search engines simply by setting up lots of doorway pages and having them all link back to your home page under all kinds of domain names. Nowadays that tactic can backfire. You are better off optimizing individual pages within your existing website than creating a whole bunch of "fluff" sites just to increase rankings.

The technique I suggest above is really best suited for brand new business ventures. If you still have not registered your domain name for that special online business you are about to start, then make it keyword rich wherever possible. If you have already launched your business, you'll just have to take advantage of this information next time you start another online venture.

This article was written by Joe Duchesne, president of http://www.yowling.com/ , a budget web hosting company that specializes in helping online business owners increase their website traffic. Copyright 2004 Yowling. Reprint Freely.

Wednesday, September 16, 2009

Your Website Title Could Be Costing You Money

Nothing could be simpler than the title you give to your web pages, right? Unfortunately, the vast majority of the websites I visit these days have absolutely terrible titles that hurt their online business. The title of your website is a very important part of getting good rankings on most of the major search engines. A good title also goes a long way towards getting your prospects to click on your listings.

If you go to Google right now and type in any search phrase you want, you get back a listing of web sites that match the keywords you entered. If you look closely, you'll notice that each search listing's hyperlink is also the title of that website. The title you choose needs to describe to your prospects what your website is all about. It needs to entice your prospects to click on your listing over any other listing. If your title is simply your company name, you are most likely losing lots of traffic. You will also find it difficult to rank highly on keywords relevant to your site.

Here are some things to consider:

1. Make sure you use relevant keywords

Keywords are simply search terms that your web site prospects will type into a search engine in order to find you. The keywords you are targeting need to be included in your title. Your keywords also need to be as close to the beginning of the title as it makes sense to do. For example, if you were selling shoes online and targeting the keyword "children shoes", you could have a title like "Children's shoes for hard to fit children." Notice how the targeted keywords are at the beginning of the title. Putting your keywords at the front of your title speaks to keyword prominence. Prominence refers to the importance of your keyword in the title. If your main keywords are at the very beginning of the title, the title is said to have a prominence of 100%. If they are at the very end of the title, they have a prominence of 0%. As much as possible, you want your main keywords to appear towards the beginning of your title.
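
The prominence calculation is implied by those 100% and 0% endpoints, so a small helper can score any draft title. A sketch in PHP (the function name and sample values are mine, not a standard tool):

<?php
// Prominence: 100% when the keyword starts the title, 0% when it ends it.
function keywordProminence($title, $keyword) {
    $pos = stripos($title, $keyword);     // case-insensitive position
    if ($pos === false) {
        return 0;                         // keyword absent entirely
    }
    $lastPossible = strlen($title) - strlen($keyword);
    if ($lastPossible <= 0) {
        return 100;                       // title is just the keyword
    }
    return (1 - $pos / $lastPossible) * 100;
}

echo keywordProminence("Children's shoes for hard to fit children", 'children'); // 100
?>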

2. Consider using your main keyword twice in your title

If you are optimizing your site to rank well on Google, you should also consider finding a way to include your main keyword twice in the title. The trick is to do this without making the title sound stupid. One way I do this is to use the pipe character | between your main keywords. For example, if I was writing a title for a fishing website and the main keyword I was targeting was 'fishing charter' I could repeat the keywords this way, "Fishing Charter | Are you ready for a fishing charter you won't soon forget?" This example gets my target keyword at the beginning and manages to repeat it again without making it look stupid.

3. Persuade your prospect to click on your link

The link that your prospects will see when they find your website in a search engine will almost always be the title of your website. Even if you get to the first page on a search engine for the keywords you are targeting, you still need to persuade your prospect to click on your link over all the others around it. If your title is not persuasive, or is even non-existent, you won't get the traffic you expect, even if you are number one in the listings. Your title must be persuasive.

4. Say what you want in 65 characters or less.

Almost all search engines limit the length of the title that will appear to the searcher. Google, for instance, only displays the first 60 to 66 characters. Sometimes a webmaster will try to include every one of their keywords in the title in the hope that all of them will be picked up by the search engines. Keep your main keyword prominent in the first 65 characters of your title, while making sure that your title is properly targeted to your market. You can include your secondary keywords in the body of your web page, but keep them out of the title unless it makes sense to keep them in. The rule of thumb for including secondary keywords in your title is to include them only if you can still keep the title persuasive to your website prospects.
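
A quick self-check while writing, using the article's 65-character rule of thumb (the sample title reuses the earlier fishing charter example):

<?php
// Warn when a title is likely to be truncated in the search results.
$title = "Fishing Charter | Are you ready for a fishing charter you won't soon forget?";
$limit = 65;
if (strlen($title) > $limit) {
    echo 'Too long by ' . (strlen($title) - $limit) . " characters; the tail may be cut off.\n";
}
?>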

Your website title is crucial to your success online.

Your title is vital to your efforts to get traffic online. Make sure it is descriptive and persuasive. It needs to include your main keywords as close to the start of your title as it makes sense to do. You also need to avoid repeating the same keywords over and over again. That may work for keywords that have little or no competition, but it won't work for any keyword that gets even a decent amount of traffic.

This article was written by Joe Duchesne, president of http://www.yowling.com/, a budget web hosting company that specializes in helping online business owners increase their website traffic. Copyright 2004 Yowling. Reprint Freely.

The Budget Webmaster's 6 Step Guide to Improving Existing Rankings in Google


You know the scenario. You get an occasional click from Google for a certain keyword. You go to find out why you aren't getting more clicks, and you find out that you're ranked in the 30's, 50's, or heaven forbid, the 300's. "Great", you think, "I finally get ranked for a good keyword and it's a worthless ranking".

Not necessarily.

If you got ranked at all for a keyword you wanted, the game's not over yet. If your site's content is geared towards that subject, you can get your search engine ranking increased at no cost. How?

The first thing you want to do is find out how well you are ranked for this keyword. For Google in particular, this used to be a difficult chore. In the old days of 2003, you'd spend your valuable time doing a search on your desired keyword, then a sub-search for your site, and crawling through pages of listings to find out exactly where you stood.

Now there is hope in the form of the following website. Direct your browser to:

http://www.googlerankings.com/index.php

You can use this site to find out what number you come up for in the Google listings, which can be very powerful information if used correctly. If you're ranked in the top 1000, you have a shot at raising your listing for that page by tweaking the page to be a little more relevant.

So, secondly, you have to know how good a shot you have at getting a better listing. Go to:

http://www.searchguild.com/difficulty/

I posted a tip about this a month ago, and it's also in the free optimization Guide I released the week of March 7th. It tells you how hard it is to rank well for certain keywords in Google. You'll need a free Google API key to use it.

Now that you know your chances, the third piece of information you need is how much traffic you can expect. Digital Point has a free tool that gives an approximation of how many hits per day a good ranking gets. Access it here:

http://www.digitalpoint.com/tools/suggestion/

Okay, let's say everything checks out so far. You rank in the top 1000. The term you want won't be that hard to get, and will get you enough traffic per month to justify your efforts.

Our fourth step is to take the term you chose and optimize your page.

GoRank does periodic reports on the search engines, and its February report gives an analysis of what the best-ranking pages in Google have in common. As a free bonus, it will also tell you what Yahoo wants. Follow this link for details: http://www.gorank.com

Now that you know what to shoot for, you need to know how the page you want to rank will measure up: you need to calculate your keyword density. This is the fifth step, and you can also do it at gorank.com, which has a free tool that will calculate the density for you. Prepare your page with that in mind, re-upload, and you're almost done.
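
If you would rather compute the density yourself, the usual definition is keyword occurrences relative to total word count. A minimal PHP sketch (the function name and sample text are mine):

<?php
// Keyword density: how much of the page's text the phrase accounts for.
function keywordDensity($text, $keyword) {
    $totalWords = str_word_count($text);
    if ($totalWords === 0) {
        return 0;
    }
    $occurrences = substr_count(strtolower($text), strtolower($keyword));
    $phraseWords = str_word_count($keyword);
    return ($occurrences * $phraseWords / $totalWords) * 100;
}

echo keywordDensity('fishing charters and fishing tips for fishing fans', 'fishing'); // 37.5
?>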

Great, you're all set. Now you should submit your site to Google, right?

Wrong. Absolutely not. If you can help it, you should never, ever submit any page of your site to Google. Let it find you. HOW it finds you can affect your page rank. I don't mean that there is a standard penalty for submitting. There's been speculation on that for a while but I have yet to prove it matters.

What I DO know from personal experience and testing on my members' sites is that getting the Googlebot search engine spider to happen upon your site shaves up to 6 weeks off the standard time it takes for indexing. You can show up in Google in as little as 4 days.

Which sites link to you can also affect your Google PageRank. While this is not as important as it once was, it still carries significant weight; my site didn't start getting spidered on a daily basis until my PageRank increased to 5.

So even if the spider comes to your site on a monthly basis, you're better off waiting for the spider to come back by. That's the sixth step: let your page be re-discovered with its great new changes.

And yes, there's an even faster, better way to get Google.com's search engine spider to re-index that page, but that's another article, isn't it?

For more free traffic tips, subscribe to her newsletter at ftdsecrets-subscribe@topica.com or visit her feed enabled blog: http://www.freetrafficdirectory.com/blog

Tuesday, September 15, 2009

If Content is King, then surely Relevance is Queen!

There has been a lot of to-ing and fro-ing in the search
engine world of late and there are lots of conspiracy
theories as to why these things happen.

It is easy as a webmaster to get caught up in these webs of
intrigue.

You get email notes about them, you view so-called experts'
thoughts on bulletin boards - hey, you probably even read
things in newsletter articles!

Well I hope so anyway....

The big driver for webmasters currently appears to be
content and link building. 

While link building is important I don't believe it makes
Queen. Maybe a Prince. Content and links DO go hand in
hand but, without relevance, the Kingdom is doomed. Sorry I
will stop the analogy now! :-)

If your site is about finance, then finance content is best
supported by finance link exchanges. Relevance!

If your site is about finance, then finance content
supported by casino link exchanges from a PR8 site may help
in the short term, but all the signs are saying this is not
a long term strategy.

Okay, so what is the best strategy?

Keep EVERYTHING relevant. It is that simple. 

Make sure that you only swap or link to sites that are
relevant to the content on your pages. Yes, I am suggesting
link exchanging on pages of your site, not on a links page.

Links pages seem to have been abused. There are rumours
that pages called links, resources or partners are not
passing PageRank. You could be wasting your time building
links that are not giving you any benefit!

Delivering relevant links from relevant content is the
future.

Look at sites such as www.bbc.co.uk or
www.independent.co.uk. News sites have the right idea.
They have 2 or 3 relevant internal links to other
articles on the same topic or links to internal tools that
are related. These usually can be found at the right hand
side of the article.

They also then have weblinks or external links to sites of
interest that are related to the topic. These are relevant!

Another benefit of this is that with a content-rich site
you can add hundreds of links quite legitimately and really
add some value, both to your rankings and for your users.

With a content-poor site it is difficult: you have to add
link pages or create a links directory. A five-page site
will need to add 10 or 12 good link pages to compete, and
even then, with algorithm changes, this may not be prudent.

Having a site with 400 pages means you can easily add 3
links per page, so you have 1200 link options straight away.

Hopefully this explains why relevance runs a close second
to content.

Always bear in mind when writing content that relevant
links will not only boost your search engine rankings,
but you will also add a service to your visitors.

2004 © J2 Squared Limited. All Rights Reserved.


Jason Hulott is Director of J2 Squared, leading specialists in Internet consultancy whose specific aim is to drive more revenue to websites. Their main areas of focus are the insurance, finance, and automotive industries.

Get Better Search Engine Rankings with RSS

RSS is the latest craze in online publishing. But what exactly is RSS?

RSS, or Really Simple Syndication (also expanded as Rich Site Summary), is an XML-based file format used by publishers to make their content available to others in a form that can be universally understood.

RSS allows publishers to "syndicate" their content through the distribution of lists of hyperlinks.

It has actually been around for a while, but with the advent of spam filters and online blogging, it is fast becoming the choice of ezine publishers who want to get their message across to their subscribers.

However, not much attention has been given to the advantages RSS provides for search engine optimization.


Why Search Engines Love RSS

Many SEO experts believe that sites optimized around themes, or niches, where all pages correspond to a particular subject or set of keywords, rank better in the search engines.

For example, if your website is designed to sell tennis rackets, your entire site content would be focused around tennis and tennis rackets.

Search engines like Google seem to prefer tightly-themed pages.


But where does RSS figure in all this?

RSS feeds, usually sourced from newsfeeds or blogs, often correspond to a particular theme or niche.

By using highly targeted RSS feeds, you can enhance your site's content without having to write a single line on your own.

It's like having your own content writer - writing theme-based articles for you - for free!


How can RSS improve my Search Engine Rankings?

There are three powerful reasons why content from RSS Feeds is irresistible bait for search engine spiders.


1. RSS Feeds Provide Instant Themed Content

There are several publishers of RSS feeds that are specific to a particular theme.

Since the feed is highly targeted, it could contain several keywords that you want to rank highly for.

Adding these keywords to your pages helps Google tag your site as one with relevant content.


2. RSS Feeds Provide Fresh, Updated Content

RSS feeds from large publishers are updated at specific intervals. When the publisher adds a new article to the feed, the oldest article is dropped.

These changes are immediately reflected on your pages carrying the RSS feed as well. So you have fresh, relevant content for your visitors every hour or every day.


3. RSS Feeds Result in More Frequent Spidering

One thing I never anticipated would happen as a result of adding an RSS feed to my site was that the Googlebot visited my site almost daily. To the Googlebot, the page that had the RSS feed incorporated into it was as good as a page being updated daily and, in its judgement, was a page worth visiting daily.

What this means for you is that your site will be indexed more frequently by the Googlebot, so any new pages you add to your site will be picked up much faster than your competitors'.




How does this benefit you as a marketer?

Well, for example, let's say a top Internet marketer comes out with a new product that you review and write up a little article on, and that your competitors do the same.

Google generally tends to index pages at the start of the month, and if you miss that update, you will probably need to wait until the next month to even see your entry appear.

But, since your site has RSS feeds, it now gets indexed more frequently. So the chances of getting your page indexed quickly are much higher.

This gives you an advantage over the competition, as your review will show up sooner in the search results than theirs.

Imagine what an entire month's advantage could do to your affiliate sales!


Why Javascript Feeds Are Not Effective

Some sites offer javascript code that generates content sourced from RSS feeds for your site.

These are of absolutely no value in terms of search engine rankings, as the Googlebot cannot read JavaScript, and the content is not interpreted as part of your page.

What you need is code that parses the RSS feed and renders the feed as html content that's part of your page.

This is achieved using server side scripting languages like PHP or ASP.

A good free ASP script is available from Kattanweb
http://www.kattanweb.com/webdev/projects/index.asp?ID=7


An equally good PHP script is CARP
http://www.geckotribe.com/rss/carp/
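
If you would rather see the mechanics than install a package, here is a minimal sketch of such a parser using PHP's built-in SimpleXML extension; the feed URL is illustrative, so substitute a real feed for your niche:

<?php
// Fetch an RSS 2.0 feed and render its items as plain HTML links,
// so the content becomes a spider-readable part of the page.
$feedUrl = 'http://example.com/feed.xml';  // illustrative feed URL
$rss = simplexml_load_file($feedUrl);      // requires allow_url_fopen
if ($rss === false) {
    die('Could not load feed.');
}
echo "<ul>\n";
foreach ($rss->channel->item as $item) {
    $title = htmlspecialchars((string) $item->title);
    $link  = htmlspecialchars((string) $item->link);
    echo '  <li><a href="' . $link . '">' . $title . "</a></li>\n";
}
echo "</ul>\n";
?>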


So, in conclusion: besides optimizing on-page and off-page factors, adding RSS feeds to your pages should be an important part of your strategy to boost your search engine rankings.




Satyajeet Hattangadi is the CEO of Novasoft Inc, a software
solutions provider that specializes in affordable,
customized software solutions. http://www.novasoft-inc.com
Get the Free Email Course "RSS Riches" and learn how to use
RSS to get high search engine rankings and monetize your
website at http://www.trafficturbocharger.com

Monday, September 14, 2009

The Other Side of the Search Gods' Abracadabra!

Thousands of servers... billions of web pages... the possibility of individually sifting through the WWW is nil. The search engine gods cull the information you need from the Internet, from tracking down an elusive expert to presenting the most unconventional views on the planet. Name it and click it. But beyond all the hype created about the web heavens they rule, let's attempt to keep the argument balanced. From Google to Voice of the Shuttle (for humanities research), these ubiquitous gods that enrich the net can be unfair... and do have their pitfalls. And considering the rate at which the Internet continues to grow, the problems of these gods are only exacerbated.

Primarily, what you need to digest is the fact that search engines fall short of Mandrake's magic! They simply don't create URLs out of thin air but instead send their spiders crawling across those sites that have rendered prayers (and expensive offerings!) to them for consideration. Even when sites like Google claim to have a massive 3 billion web pages in their database, a large portion of the web nation is invisible to these spiders; they are simply ignorant of the Invisible Web. This invisible web holds content that normal search engines can't index, because the information on many web sites sits in databases that are only searchable within those sites. Sites like www.imdb.com (The Internet Movie Database), www.incywincy.com (IncyWincy, the invisible web search engine) and www.completeplanet.com (The Complete Planet) that cover this area are perhaps the only way you can access content from that portion of the Internet which is invisible to the search gods. Here, you don't perform a direct content search but search for the resources that may access the content. (Meaning: be sure to set aside considerable time for digging.)

None of the search engines indexes everything on the Web (I mean none). Ever tried finding research literature on popular search engines? From AltaVista to Yahoo, they will list thousands of sources on education, human resource development and so on, but mostly from magazines, newspapers, and various organizations' own web pages, rather than from research journals and dissertations, the main sources of research literature. That's because most journals and dissertations are not yet publicly available on the Web. Thought they'd get you everything that's hosted on the web? Think again.

The Web is huge and growing exponentially. Simple searches, using a single word or phrase, will often yield thousands of "hits", most of which will be irrelevant. A layman going to the internet for a piece of information has to deal with a more severe issue: too much information! And if you don't learn how to control the information overload returned by a search result, roll out the red carpet for some frustration. A very common problem results from sites that have a lot of pages with similar content. For example, if a discussion thread in a forum goes on for a hundred posts, there will be a hundred pages, all with similar titles, each containing a wee bit of information. Now, instead of just one link, all hundred of those darn pages will crop up in your search result, crowding out other relevant sites. Regardless of all the sophistication technology has brought in, many well-thought-out search phrases produce list after list of irrelevant web pages. The typical search still requires sifting through dirt to find the gold. If you are not specific enough, you may get too many irrelevant hits.

As said, these search engines do not actually search the web directly but a centralized database instead. And unless this database is updated continually to index modified, moved, deleted or renamed documents, you will land amidst broken links and stale copies of web pages. If they inadequately handle dynamic web pages whose content changes frequently, chances are the information they reference will quickly go out of date. After waging their never-ending war with over-zealous promoters (spamdexers, rather), where do they find time to keep their databases current and their search algorithms tuned? No surprise if a perfectly worthwhile site goes unlisted!

Similarly, many of the Web search engines are undergoing rapid development and are not well documented. You will have only an approximate idea of how they work, and unknown shortcomings may cause them to miss desired information. Not to mention that amongst the first-class information, the web also houses false, misleading, deceptive and dressed-up information produced by charlatans. The Web itself is unstable, and tomorrow they may not find you the site they found you today. Well, if you could predict them, they would not be gods, would they?! The syntax (word order and punctuation) for various types of complex searches varies somewhat from search engine to search engine, and small errors in the syntax can seriously compromise the search. For instance, try the same phrase search on different search engines and you'll know what I mean. Novices, read this line: using search engines does involve a learning curve. Many beginning Internet users, because of these disadvantages, become discouraged and frustrated.

As a journalist put it, "Not showing favoritism to its business clients is certainly a rare virtue in these times." Search engines have increasingly turned to two significant revenue streams. Paid placement: in addition to the main editorial-driven search results, the search engines display a second, and sometimes third, listing that's usually commercial in nature. The more you pay, the higher you'll appear in the search results. Paid inclusion: an advertiser or content partner pays the search engine to crawl its site and include the results in the main editorial listing. So? You're more likely to be in the hit list, but then again, no guarantees. Of course, there are industry leaders like Google which, while publishing paid listings, clearly mark them as 'Sponsored Links.'


The possibility of these 'for-profit' search gods (which haven't yet made much profit) taking fees to skew their searches can't be ruled out. But as a searcher, the hit list the engine provides you should obviously rank in order of relevancy and interest. Search command languages can often be complex and confusing, and the ranking algorithm is unique to each god: based on the number of occurrences of the search phrase in a page, whether it appears in the page title, or in a heading, or in the URL itself, or in the meta tags, etc., or on a weighted average of a number of these relevance scores. E.g. Google (www.google.com) uses its patented PageRank(TM) and ranks the importance of search results by examining the links that lead to a specific site. The more links lead to a site, the higher the site is ranked. Pop on popularity!
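
In the abstract, such a ranking function is just a weighted sum. Illustratively (the signals and weights here are generic placeholders, not any engine's actual formula):

\mathrm{score}(p, q) = w_1 f_1(p, q) + w_2 f_2(p, q) + \cdots + w_n f_n(p, q)

where each f_i scores one relevance signal for page p against query q (phrase in the title, in a heading, in the URL, link popularity) and the weights w_i are the part each god keeps secret.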

AltaVista, HotBot, Lycos, Infoseek and MSN Search use keyword indexes, giving fast access to millions of documents. A large number of sites get indexed, but the lack of an index structure and the poorly known size of the WWW do not make searching any easier, and keyword searching can be difficult to get right. In reality, the prevalence of a certain keyword is not always in proportion to the relevance of a page. Take this example: a search on sari - the national costume of India - in a popular search engine returned, among its top sites, the following links:
- www.scri.sari.ac.uk/ - the Scottish Crop Research Institute
- www.ubudsari.com/ - a health resort in Indonesia
- www.sari-energy.org/ - the South Asia Regional Initiative for Energy Cooperation and Development

Pretty useful sites for someone interested in how to drape a sari or in its tradition?! (Well, no prayer goes unanswered... whether you like the answer or not!) By using keywords to determine how each page will be ranked in search results, and not simply counting the number of instances of a word on a page, search engines attempt to make the rankings better by assigning more weight to things like titles, subheadings, and so on.
Now, unless you have a clear idea of what you're looking for, it may be difficult or impossible to use a keyword search, especially if the vocabulary of the subject is unfamiliar. Similarly, the concept-based search of Excite (which groups the words you enter and attempts to determine their meaning, rather than matching individual words) is a difficult task and yields inconsistent results.

Besides, who reviews or evaluates these sites for quality or authority? They are simply compiled by a computer program. These active search engines rely on computerized retrieval mechanisms called "spiders", "crawlers", or "robots" to visit Web sites on a regular basis and retrieve relevant keywords to index and store in a searchable database. And this huge database often yields unmanageable, over-comprehensive results... results whose relevance is determined by their computers. The irrelevant sites (a high percentage of noise, as it's called), questionable ranking mechanisms and poor quality control may be the result of less human involvement to weed out the junk. Thought human intervention would solve all these problems? Read on.

From the very first directory, Yahoo, to about.com, Snap.com, Magellan, NetGuide, Go Network, LookSmart, NBCi and Starting Point, subject directories index and review documents under categories, making them more manageable. Unlike active search engines, these passive or human-selected search engines don't roam the web directly; they are human-controlled, relying on individual submissions. They are perhaps the easiest to use in town, but their indexes cover only a small portion of the actual number of WWW sites, and thus are certainly not your bet if you intend to research specific, narrow or complex topics.

Subject designations may be arbitrary, confusing or wrong, and a search looks for matches only in the descriptions submitted. Directories never contain the full text of the pages they link to; you can only search what you see: titles, descriptions, subject categories, and so on. The human-labor-intensive process limits the database's currency, size, rate of growth and timeliness. You may have to branch through the categories repeatedly before arriving at the right page, and they may be several months behind the times because of the need for human organization. Try looking for some obscure topic, and chances are the people who maintain the directory have excluded those pages. Obviously, machines can blindly count keywords, but they can't make the common-sense judgements humans can. But then why do human-edited directories respond with all this junk?!

And then there are the meta search engines. A comprehensive search of the entire WWW using The Big Hub, Dogpile, Highway61, Internet Sleuth or SavvySearch, covering as many documents as possible, may sound as good an idea as one-stop shopping. Meta search engines do not create their own databases; they rely on existing active and passive search engine indexes to retrieve search results, and the very fact that they must query multiple keyword indexes slows their response time. Searching several search engines at once sure does save your time, but at the expense of redundant, unwanted and overwhelming results, and, much worse, important misses. The default search mode differs from search site to search site, so the same search is not always appropriate in the different search engines' software. The quality and size of the databases also vary widely.

Weighted search engines like Ask Jeeves and RagingSearch allow the user to type queries in plain English without advanced searching knowledge, again at the expense of inaccurate and undetailed searching. Review or ranking sources like Argus Clearinghouse (www.clearinghouse.net), eBlast (eblast.com) and Librarian's Index to the Internet (lii.org) evaluate website quality from sources they find or accept submissions from, but cover a minimal number of sites.

As a webmaster, registering your site with these biggest billboards of Times Square can get the searcher closer to bingo! Those who didn't even know you existed before are in your living room in no time!

Your URL registration is a no-brainer, considering the traffic it can send flocking to your site. It is certainly a quick and inexpensive method, yet it is only one component of an overall marketing strategy; in itself it offers no guarantees and no instant results, and it demands continued effort from the webmaster. Commerce rules the web. As a notable Internet caveman put it, "Web publishers also find dealing with search engines to be a frustrating pursuit. Everybody wants their pages to be easy for the world to find, but getting your site listed can be tough. Search sites may take a long time to list your site, may never list it at all, and may drop it after a few months for no reason. If you resubmit often, as it is very tempting to do, you may even be branded a spamdexer and barred from a search site. And as for trying to get a good ranking, forget it! You have to keep up with all the arcane and ever-changing rules of a dozen different search engines, and adjust the keywords on your pages just so... all the while fighting against the very plausible theory that in fact none of this stuff matters, and the search sites assign rankings at random or by whim."

"To make the best use of Web search engines--to find what you need and avoid an avalanche of irrelevant hits-- pick search engines that are well suited to your needs. And lest you'd want to cry "Ye immortal gods! where in the world are we?", spend a few hours becoming moderately proficient with each. Each works somewhat differently, most importantly in respect to how you broaden or narrow a search.

Finding the appropriate search engine for your particular information need can be frustrating. To use these search engines effectively, it is important to understand what they are, how they work, and how they differ. For example, while using a meta search engine, remember that each engine has its own methods of displaying and ranking results. Remember, search strategies affect the results: if the user is unaware of basic search strategies, results may be spotty.

Quoting Charlie Morris (the former editor of The Web Developer's Journal): "Search engines and directories survive, and indeed flourish, because they're all we've got. If you want to use the wealth of information that is the Web, you've got to be able to find what you want, and search engines and directories are the only way to do that. Getting good search results is a matter of chance. Depending on what you're searching for, you may get a meaty list of good resources, or you may get page after page of irrelevant drivel. By laboriously refining your search, and using several different search engines and directories (and especially by using appropriate specialty directories), you can usually find what you need in the end."

Search engines are very useful, no doubt, for everything from getting a quick view of a topic to finding expert contact info; yet, verily, certain issues lie in their lap. The very reason we bother about these search engines so much is because they're all we've got! Though there sure is a lot of room for improvement, the hour's need is to not get caught in the middle of the road. By simply understanding what, how and where to seek, you'd spare yourself the fate of chanting that old Jewish proverb: "If God lived on earth, people would break his windows."

Happy searching!

Liji is a postgraduate in Software Science with a flair for writing on anything under the sun. She puts her dexterity to work writing technical articles in her areas of interest, which include Internet programming, web design and development, ecommerce and other related issues.