
Three Levels of Search Engine Optimization


As I have noted previously, the Searchable Web is a three-part ecosystem consisting of Publishers, Indexers, and Searchers. All three groups can optimize search from their own perspectives, but you enjoy the most flexibility in optimization as a Publisher rather than as an Indexer or Searcher. Here is why.

Searchers cannot see everything

Searchers optimize their queries from a position of ignorance. They don’t know, when they begin searching, whether what they are looking for is available in a Web index. Search engineers undoubtedly have one or more expressions that describe the actions you take from the moment you load a search engine in your browser until you leave it. We’ll call that period a Search Window. A search window may include many specific, unrelated searches, or every query performed within it may relate to a single topic.

There is no continuity from search window to search window. That is, if you search for “eggs” today and again tomorrow, tomorrow’s search results may be influenced by factors that had no effect on today’s search. Each search window is an isolated event. If Searchers improve their queries over a sequence of isolated events, then by the time they develop an ideal query (one that resolves their quest for information in a completely satisfying manner) they may have passed by a great deal of available information that would have been equally satisfying or better.

The information may be lost through de-indexing, or simply left behind on an abandoned query pathway that the Searcher never develops fully. A query pathway is a sequence of evolving queries that refine the searching process: you add terms, subtract terms, and alter terms in the hope of completing your quest for information, as in the sketch below.
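To make that concrete, here is a minimal Python sketch of one possible query pathway. The helper functions and the sample queries are hypothetical, purely to illustrate the add/subtract/alter moves described above.

    # A minimal sketch of a query pathway: a sequence of evolving queries.
    # All helper names and sample queries here are hypothetical.

    def add_term(query, term):
        # Narrow the query by appending a term.
        return query + " " + term

    def subtract_term(query, term):
        # Broaden the query by dropping a term.
        return " ".join(t for t in query.split() if t != term)

    def alter_term(query, old, new):
        # Shift the query's focus by swapping one term for another.
        return " ".join(new if t == old else t for t in query.split())

    pathway = ["eggs"]
    pathway.append(add_term(pathway[-1], "nutrition"))               # "eggs nutrition"
    pathway.append(alter_term(pathway[-1], "nutrition", "protein"))  # "eggs protein"
    pathway.append(subtract_term(pathway[-1], "eggs"))               # "protein"

    for step in pathway:
        print(step)

An abandoned query pathway is simply a branch of such a sequence that the Searcher never extends any further.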

Indexers cannot show you everything

The spammiest Web site may be the most complete, reliable, and authoritative source of information on a particular topic, but because it is spammy, search engines may ban or penalize it. Furthermore, search engines don’t index the entire Web; they index only portions of it. The most authoritative, complete, reliable source of information may therefore not be available to search engines even though it is accessible through the Web (private forums, subscription-based content, and the like are good examples of this kind of unavailable content).

Moreover, whenever you execute a query on a typical search engine, that search engine limits its results to no more than 1,000 listings. The most complete, reliable, and authoritative source of information on your topic may not make the cut. The potential for disconnect between a Searcher’s query and a Publisher’s content, or between an Indexer’s selection process and a Publisher’s content, is immense. You search for “canine” and I write about “man’s best friend”. What’s a poor Indexer to do?

Search engines of the current generation go out of their way to suggest alternative queries and additional query refinements to their users. But the fact remains that they cannot show us everything they know about.

Additionally, search engines occasionally penalize Web sites, depressing their natural search rankings to prevent those sites from capitalizing on suspicious behaviors. The rules that search engines impose on their listings thus detract from (as well as improve) the quality of their results. A search engine must therefore carefully weigh the pros and cons of each limiting rule it imposes, or it risks losing Searchers to another search engine.

So while it may seem that the Indexers have all the power, in reality they have to work within practical limitations and meet Searcher expectations.

Publishers have the most optimizing power

As Web Publishers we control the information that is available, we influence how that information is indexed, and we can even suggest to Searchers how they should search for it. Neither Searchers nor Indexers have that kind of power.

Furthermore, we can exercise our power of optimization in three ways: transparently, discreetly, and stealthily.

Transparent Search Engine Optimization may follow either the so-called “best practices” route or the spam route. Some spammers are quite transparent about what they are doing (perhaps only through naiveté, but I would not assume all transparent spammers are acting naively).

Discreet Search Engine Optimization is subtle but not necessarily deceptive. How do you tell if someone is discreetly optimizing for search? They don’t make a fuss. They don’t draw attention to what they are doing. They may rely more on repetition than emphasis, whereas a transparent optimizer will use emphasis wherever it makes sense.

In other words, Discreet Search Engine Optimization sacrifices some of the options available to Transparent Search Engine Optimization. If you’re being transparent, you don’t care who sees that you’re optimizing for search. If you’re being discreet, you’d rather not have anyone notice, but if they do notice, you want them to think your optimization is subdued.

Discreet Search Engine Optimization can be very powerful, particularly in a competitive query space where everyone is watching what everyone else is doing. It is more likely to be overlooked in a competitive analysis.

Stealth Search Engine Optimization is secretive but not necessarily deceptive, at least not deceptive in the classical SEO spam sense. Now, if someone is cloaking a keyword-stuffed page, they are being both stealthy and deceptive. However, if someone is embedding links in places that competitors don’t know about and cannot get to, they are merely being stealthy.

Content can also be stealthy, following a dual optimization strategy that makes a page look like it’s trying to rank well for one set of expressions when in fact it’s seeking conversions from a different set of expressions. The most common method employed is to direct links at a page with anchor text that doesn’t appear on the page, but there are other ways to be stealthy.

Since secrecy is critical to stealth, you want to hide as many of your resources from prying eyes as possible. Since your content and links have to be indexed, you cannot hide them from the search engines but you can hide almost everything from other Publishers. Links, cached content, even the queries you get traffic from can all be hidden from your competitors. The search engines you rely upon can also be hidden from your competitors (just talk about how great you’re doing on Google and almost no one will notice you dominate on Yahoo!, Live, and Ask).

The more you know about the search environment, the more stealth you can employ. The more stealth you employ, the more risks you take. However, not every risk applies to search engines. That is, not all stealth techniques will get you in trouble with the search engines — just the stealth techniques that search engines feel hurt the quality of their search results.

The difference between Discreet Search Engine Optimization and Stealth Search Engine Optimization can be defined in several ways. For example, your discreet optimization may become the subject of discussion you find complimentary. Your stealth optimization, should it become the subject of discussion, is no longer secretive. It has therefore lost its competitive advantage. Hence, if you risk the loss of competitive advantage when people discover what you are doing, you’re being stealthy; otherwise, you’re only being discreet.

There should be no question about whether Stealth Search Engine Optimization provides a competitive advantage: it does. If your competitors don’t know how you’re optimizing for search, they have to compete in the dark. You can make adjustments they are unaware of, and they will thus likely be slow to respond. Your stealth strategy doesn’t have to cross any lines, because Stealth Search Engine Optimization can be completely transparent to the search engines and totally compliant with their guidelines, yet almost invisible to your competition.

Making all your data available to your competitors leaves you vulnerable to the competition. You cannot hide the pages you want to rank for competitive queries, but you can hide a great deal else: server logs, linking resources, alternative queries, content that optimizes for those alternative queries, your page cache information (including caching frequency), and more. A sketch of one cache-hiding technique follows.
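As one concrete illustration of hiding cache information: a page can remain indexed yet withhold its cached copy from prying eyes by sending the noarchive robots directive. Here is a minimal Python sketch assuming a bare-bones WSGI setup; the app, page body, and port are hypothetical, and in practice you would emit the same X-Robots-Tag header (or an equivalent meta robots tag) from whatever server actually hosts your content.

    # A minimal WSGI sketch: let search engines index the page, but ask them
    # not to expose a cached copy that competitors could inspect.
    # The app, content, and port below are hypothetical.

    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        body = b"<html><body>Optimized content here.</body></html>"
        headers = [
            ("Content-Type", "text/html"),
            # "noarchive" tells compliant crawlers to index the page but
            # suppress the "Cached" link in their search results.
            ("X-Robots-Tag", "noarchive"),
        ]
        start_response("200 OK", headers)
        return [body]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()

The page still ranks either way; the directive merely keeps a snapshot of your on-page optimization off public display, which is the point of hiding cache information in the first place.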

How to choose which level of SEO to use

Even the most hardened “best practices” SEO technician needs to engage at all three levels, if only to protect as many assets as possible from competitive analysis. And hard-core spammers have been known to publish smokescreen content to confuse people about what they are doing. That smokescreen content may not violate search engine guidelines, but it serves some purpose that only the spammers understand completely.

Your business model has to be the determining factor when you select which level of SEO to pursue. A robust strategy will consider all three levels, if only to protect information from competitive analysis. If you haven’t been engaged at the Stealth level already, now is the time to start. It’s never too late to improve your search engine optimization, unless you’ve gone out of business. Don’t let weak search engine optimization be the reason for your business failure.

www.seo-theory.com

published @ October 15, 2008
