
Leveraging Small Resources For Traffic


There are hundreds of search resources across the Web that may drive traffic to your sites. Those search resources usually fit into one of these categories:

  1. Algorithmic search engines using their own software
  2. Algorithmic search engines using someone else’s software
  3. Meta search engines
  4. Web directories

Ask, Google, Live, and Yahoo! are the major algorithmic search engines using their own software.

Yahoo! operates some search engines under old brands it acquired (the most notable being AltaVista), but there are a fair number of ISP Web sites (like Comcast and Verizon) that use major search engines’ resources. Also, some search resources like AOL use other search engines’ resources.

The best known meta search engines are probably Mamma, Dogpile, and Webcrawler.

Web directories may number in the thousands but most of them are specialty or niche directories. Of course, everyone knows about DMOZ, JoeAnt, and a few others like them.

The content you create for major search engines works with small search engines

If you have enough content on your site, you can probably draw traffic from many small resources. While that traffic may not be substantial, you should be aware of those resources and cultivate your listings in them. That’s common sense and most SEOs make at least some effort to get clients into resources with a reasonable amount of traffic.

But how much do you optimize for those smaller resources? The meta search engines in particular can drive significant traffic to a Web site that performs poorly on a major search engine like Google. Take Ixquick, for example. You’ll find that their ranking strategy is pretty straightforward. In some queries, you don’t need to perform well on any one search engine in order to hit the top ten or even top five on Ixquick.

Dogpile mixes sponsored listings with the organic search results (fully disclosed with a label). You can buy your way to Dogpile success through one of several advertising services. Dogpile may in fact be the only search engine with technology that tries to match ads from competing services with user queries.

My personal sites receive organic traffic from Dogpile, so despite the vehement criticisms some people have directed at Dogpile for integrating their sponsored listings so closely with their organic listings, the service seems to have a fairly satisfied user base. SEO Theory also receives organic traffic from Dogpile, so Dogpile users have a robust variety of interests.

Maintain reasonable expectations about potential traffic

In a world where people obsess over a keyword that draws 20,000 queries on Google (of which as many as half may be due only to rank-checking), you may find some easy pickings from the alternative search resources. And before you get out your calculator and start figuring out how many clicks Google drives through its search results, stop and remind yourself that a number 1 listing on search engine X will drive more traffic to your site than a number 11 listing on Google.

This is where you separate the SEOs from the trend chasers. Trend chasers focus on Google and obsess over number 1 listings on Google. SEOs optimize for search wherever it comes to them.

An oft-overlooked fundamental principle of search engine optimization

Let me restate that as a Fundamental Principle of Search Engine Optimization: For every search engine that drives traffic to your site, you must ensure that you receive optimum traffic from that search engine.

Implementing this principle is easy. You ignore every search engine on the Web. Even Google. You create a Web site, you let it get spidered by whoever wants to spider it, and then you watch your referral data.

The referral data will show you which search engines have users who are really interested in your content (as those search engines show it to them). At some point you need to do some work for the major search engines, but if you’re going to turn down 100 converting visitors a week because you cannot get them from Google, you don’t belong in this industry.

Every time a new search engine shows up on your radar (it sends you traffic), check it out. Some old-school SEOs used to investigate the search engines as soon as the robots appeared — there is merit to that approach because you have a better chance of trapping rogue robots early on. But if you’re merely optimizing for search, you want to capture all the search referral data you can.
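
Watching your referral data is mostly a matter of tallying referring hosts. Here is a minimal sketch in Python; the referrer strings are hypothetical stand-ins for what you would pull from your own access log (Apache’s combined log format records the Referer header for each request):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical referrer strings as they might appear in an access log.
# In practice you would read these from your own log file or analytics export.
referrers = [
    "http://www.google.com/search?q=seo+theory",
    "http://search.yahoo.com/search?p=seo",
    "http://www.dogpile.com/info.dogpl/search/web/seo",
    "http://www.google.com/search?q=meta+search",
    "-",  # a direct visit: no referrer recorded
]

def referring_host(referrer):
    """Return the hostname of a referrer URL, or None for direct visits."""
    if not referrer or referrer == "-":
        return None
    return urlparse(referrer).netloc.lower()

# Tally visits per referring host, skipping direct visits.
tally = Counter(h for h in map(referring_host, referrers) if h)
for host, count in tally.most_common():
    print(host, count)
```

Any host you have never seen before in that tally is a new search engine showing up on your radar, and worth checking out.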

Evaluate before you decide to ignore a new search engine

You can evaluate a search engine by looking at its estimated traffic from as many data sources as possible (start with Alexa, Compete, and Quantcast). Then do some keyword research on the search engine’s brand name. Are people searching for it on Google, Yahoo!, and Live? Remember that a lot of people now use the major search services as navigational tools. They do use search engine A to get to search engine B. It’s probably easier to let a major search engine figure out where you want to go than to browse a list of bookmarks or remember all those URLs.

If the search engine receives traffic and you’re receiving traffic from the search engine, check your rankings for the query where you got traffic. Can you improve them? Can you expand your visibility in the search engine? Quite often it takes no more effort to improve your visibility in a minor search engine than it does to improve it in a major search engine.

Improving visibility has nothing to do with link building. Improving visibility has everything to do with creating and organizing content. Sometimes you will find that you can submit sub-domains or directories from your Web site (assuming they have very distinct topics) to niche directories. Those niche directory listings may drive some qualified traffic your way. That doesn’t mean go submit your site to 1500 directories. That means, if a small directory is sending you traffic for keyword A, see if you can leverage that directory to get traffic for keyword B.

It’s an option, not a requirement. You don’t always have the option, but if you do get an opportunity to seek multiple listings from a small directory that sends you traffic, you should explore it.

Use referrals from low-traffic search resources for keyword analysis

Meta search tools can help you identify high value mid-traffic keywords. Just because no one is searching for “X Y Z” on Google doesn’t mean it’s not worth optimizing for. Google accounts for less than half of all search users. That means more than 200,000,000 people are using services OTHER than Google to find content they are looking for. You’ll find those 200,000,000 people scattered across many search resources, not just Ask, Live, and Yahoo!.

If you find you obtain more traffic from meta search than from major search, you’re either trying to invade competitive keywords or your optimization needs adjusting. Simply comparing your major-to-meta success rates should give you an idea of how well you’re meeting your optimization goals.
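
The comparison itself is simple bookkeeping. A sketch, using made-up weekly referral counts in place of your own analytics data:

```python
# Hypothetical weekly referral counts; substitute your own numbers.
major_referrals = {"google.com": 420, "yahoo.com": 90, "live.com": 35}
meta_referrals = {"dogpile.com": 12, "ixquick.com": 7, "mamma.com": 3}

major_total = sum(major_referrals.values())
meta_total = sum(meta_referrals.values())

# A natural site should see at least some meta-search traffic; a meta
# total of zero is the warning sign discussed below.
ratio = meta_total / major_total
print(f"major: {major_total}, meta: {meta_total}, meta/major: {ratio:.3f}")
```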

If you find you cannot obtain ANY traffic from meta search, then you’re doing something wrong. A natural site that ranks moderately well for a variety of keywords on major search engines should be able to draw traffic from meta search, niche directories, and secondary search engines that use other search engines’ software. If you’re not getting referrals from anyone other than Google, you’re probably not looking at your referral data correctly. If you’re REALLY not getting traffic from anyone other than Google, chances are good that you’re missing out on a lot of opportunity.

Use existing referrals as footholds to expand your visibility

There are some types of content that index better in the major search engines than in the niche directories and meta search tools. Directory pages, forum discussions, blog posts, and other dynamic content usually do not perform well. But you can often obtain rankings with static content pages that leverage your dynamic content.

Smaller search services have fewer resources to work with. They don’t gorge themselves on billions of Web documents the way major search engines do. The smaller directories will never compete with DMOZ and JoeAnt for listings. You need to pay attention to what these resources like to index and be sure you create relevant, unique content that matches their inventories.

Take advantage of meta search and repurposed search filters and nuances

Although meta search engines don’t build Web document indexes, there are certainly small search engines out there that sample the Web in ways different from how Ask, Google, Live, and Yahoo! do. And some search engines that reuse other search engines’ software and data apply different filters or priorities. For example, AOL’s search results don’t always match Google’s search results.

You may not do well on Google, but in some queries you can do better in AOL simply because AOL filters out some of the higher-ranking competitors. I normally see this happen with mature content in non-adult queries. So family-friendliness definitely works well with the AOL search engine.

Do the math: maximize your qualified referrals

If three dozen search engines each send you 1 visitor per week, do you really want to ignore the opportunity to draw more than 1872 visitors per year from those search engines? Those visitors may turn out to be influencers in ways you never imagined.

If you need to invest only one or two hours’ time to double your small resource referrals from 1872 to 3744 visitors per year, what’s your business case for ignoring that traffic?
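
The arithmetic above checks out in a couple of lines:

```python
engines = 36           # "three dozen" small search engines
visitors_per_week = 1  # one referral per engine per week
weeks = 52

yearly = engines * visitors_per_week * weeks
print(yearly)      # visitors per year from those small engines
print(yearly * 2)  # after doubling your small-resource referrals
```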

Keeping unwanted traffic out of your site

Now, let’s say you’re getting unqualified, non-converting traffic from those search resources. You need to exercise some reverse optimization. Either you need to tweak your content so that it places well in the right queries for the right people or you need to do whatever you can to block that unqualified traffic.

Image search tools are a huge source of unqualified traffic for many Web sites. Some SEOs leverage the hotlinks that image search enables. Other people simply block the image search tools from crawling and indexing their sites.

Using robots.txt, you can tell honest crawlers to stay away. However, you may need to use .htaccess to block hotlinks. You can also use .htaccess to deny referral traffic from specific domains:

RewriteEngine on
# Match requests referred from the listed domains (case-insensitive)
RewriteCond %{HTTP_REFERER} blocked-site-01\.com [NC,OR]
RewriteCond %{HTTP_REFERER} blocked-site-02\.com [NC]
# Return 403 Forbidden for any matching request
RewriteRule .* - [F]
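
The hotlink blocking mentioned above can be sketched in the same .htaccess file. This is a minimal example, not a definitive recipe: example.com stands in for your own domain, and you would adjust the file extensions to match the images you serve.

```apache
RewriteEngine on
# Allow requests with an empty referrer (direct visits, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred from your own site (example.com is a placeholder)
RewriteCond %{HTTP_REFERER} !example\.com [NC]
# Forbid image requests from everyone else (i.e., hotlinks)
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```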

Be very sure that you cannot cultivate useful traffic from a resource before blocking it.

You should look at 2-3 months’ worth of data before blocking a referral source, unless you can determine that the referrals will never be useful (which is the case for hotlinking). The decision to turn your back on referrals from small search services could be catastrophic for you, because if you put all your hope in one or two major search engines where you are struggling to compete, you’re skating on very thin ice.

www.seo-theory.com

published @ September 4, 2008
