
SEO Myths About SEO Myths


In February 2008, Kalena Jordan wrote an article recapping Jill Whalen’s Webstock 2008 45-minute presentation on SEO. Jill is a fundamentalist, eschewing algorithmic analysis in favor of creating solid, informative content that is useful to clients (coupled with strong on-site navigation). There is much to be said for Jill’s point of view and, as Kalena said in reply to a reader, I can agree with most of what Jill tells people.

You can always find something to nit-pick about anyone’s SEO philosophy. For example, I could pick on all the pseudo-blogs that have recently been reprinting Kalena’s article, but why should I fuss about a little Web spam? (NOTE: Not every reprint may be Web spam — I don’t know who actually asked for and/or received permission to reprint the article.)

SEO mythology is an area that has grown by leaps and bounds, partially because of the ongoing debate between people over what works best or which strategy is the most effective. When some opinions are reported in capsule form, a lot of valid context may fall away. But we can also see, through the Echo Effect, which ideas are taking hold or remaining popular.

Let’s take a couple of examples from Jill’s advice (as reported by Kalena):

SEO Myth About SEO Myths No. 1

Under “Link Building Myths”, Kalena reported that Jill included: “that Google’s link: command is accurate. It’s not a useful tool. Use Google Webmaster Tools or the Yahoo link command instead.”

I agree that the Google link: query operator is not accurate. It only shows you a random sampling of backlinks that Google knows about. And this criticism is well-established in the SEO community (although you’ll still find plenty of people using it, some for good reason).

Here is the problem with the conventional SEO wisdom that Jill shared with the Webstock audience: neither Google Webmaster Tools nor Yahoo! Site Explorer is HONEST. That is, they are both DISHONEST LINK TOOLS.

The Google link query operator, on the other hand, has not been shown to mislead people. Hence, of the three options (A. Google link: query; B. Google Webmaster Tools; C. Yahoo! Site Explorer), which would best serve your link research? The correct answer is A. You’ll get more useful information from Google’s link query operator than from the other two sources because you cannot trust what the other two sources report.

Doing your link analysis on the basis of bogus information is equivalent to making up numbers for your income statement and balance sheet. Don’t be surprised if the auditors question your sound business practices.

Why do SEOs cling to the false belief that Yahoo! Site Explorer offers useful link research information? The answer seems to be that Yahoo! will report a plethora of backlinks for most Web sites — it’s satisfying to know, perhaps, that your hard work has not gone unnoticed by Yahoo!. Never mind the fact that many of the links simply don’t exist.
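If you doubt that, the claim is easy enough to spot-check yourself. Here is a rough sketch (in Python) of how you might audit a list of reported backlinks by fetching each reporting page and looking for the link with your own eyes, so to speak. The file name, target domain, and user-agent string are placeholders I made up for the example, not part of any tool mentioned above.

```python
# Sketch: audit a list of reported backlinks by fetching each reporting
# page and checking whether it actually contains a link to your domain.
# Assumes you exported the reported linking URLs to a plain text file,
# one URL per line. All names below are placeholders.
from urllib.request import Request, urlopen

TARGET_DOMAIN = "example.com"                 # the site whose backlinks you exported
REPORTED_LINKS_FILE = "reported_backlinks.txt"

def page_links_to(url, target_domain):
    """Fetch a reported linking page and look for the target domain in its HTML."""
    try:
        req = Request(url, headers={"User-Agent": "link-audit-sketch/0.1"})
        html = urlopen(req, timeout=10).read().decode("utf-8", errors="ignore")
    except Exception:
        return False                          # unreachable pages count as unverified
    return target_domain in html              # crude check; a real audit would parse anchors

def main():
    with open(REPORTED_LINKS_FILE) as f:
        reported = [line.strip() for line in f if line.strip()]

    verified = [u for u in reported if page_links_to(u, TARGET_DOMAIN)]
    unverified = [u for u in reported if u not in verified]

    print("reported: %d, verified: %d, unverified: %d"
          % (len(reported), len(verified), len(unverified)))
    for u in unverified:
        print("link not found on:", u)

if __name__ == "__main__":
    main()
```

A real audit would parse the anchor tags and follow redirects, but even this crude check will show you how many of those reported links lead nowhere.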

And why do SEOs cling to the false belief that Yahoo!’s data is somehow relevant to whatever is happening in Google? That’s equivalent to using U.S. Census statistics to evaluate the demographics of the European Union.

So, is Google’s link query operator accurate? No. It’s a myth to suggest that it is accurate. Is it reliable? To date, I have found no evidence to indicate that it is as unreliable as Google Webmaster Tools and Yahoo! Site Explorer. Take that for what it’s worth.

SEO Myth About SEO Myths No. 2

There was another point in Kalena’s article I wanted to address. Under the “Submitting, Crawling and Indexing Myths” section, she reported that Jill included: “frequent spidering helps rankings. Not true.”

I can almost see how this idea became popular. With all the discussion about PageRank you find in the SEO community, every now and then someone points out the connection between high PageRank and frequent crawling (a point that Matt Cutts seems to have implied on more than one occasion). That is, documents with high INTERNAL PageRank are crawled more often than documents with low internal PageRank.

If those documents are updated often, you’ll most likely see their Google cache data updated fairly often (although Matt has pointed out that Google may fetch pages more often than it will update the cache for those pages). Because Google hid their Supplemental Results Index, people have come to rely upon the cache data as an indicator of which pages may have high PageRank. By clocking your cache update frequency (a technique I have suggested in the past), you may be able to determine if your page is stuck in the Supplemental Results Index or if it is in the Main Web Index.
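If you want to try that clocking technique, the arithmetic is trivial. Here is a minimal sketch that assumes you simply jot down the cache date Google shows each time you check a page; the dates below are placeholders for your own observations.

```python
# Sketch of "clocking" cache updates: record the cache date Google shows
# for a page each time you check it, then estimate how often the cached
# copy actually changes. Replace the placeholder dates with observations.
from datetime import date

observed_cache_dates = [
    date(2008, 9, 2),
    date(2008, 9, 2),     # unchanged since the previous check
    date(2008, 9, 14),
    date(2008, 9, 29),
    date(2008, 10, 12),
]

# Keep only the observations where the cached copy actually changed.
changes = [d for i, d in enumerate(observed_cache_dates)
           if i == 0 or d != observed_cache_dates[i - 1]]

intervals = [(later - earlier).days for earlier, later in zip(changes, changes[1:])]
if intervals:
    print("cache refreshed roughly every %.1f days" % (sum(intervals) / len(intervals)))
else:
    print("not enough observations to estimate a refresh interval")
```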

The only significance I attach to the two Web indexes is that Supplemental pages very rarely rank for anything useful. That is, Google positions less relevant content from the Main Web Index above more relevant content from the Supplemental Results Index.

Matt Cutts has tried to deflect discussion of this failure on Google’s part by pointing out that all the major search engines have dual indexes (he has asked other search engine reps to publicly confirm their indexes). Danny Sullivan has made a point over the years of mentioning the other engines’ dual indexes, too.

Okay, fine. Everyone uses dual indexes. The problem, however, is that pages that rank well in other search engines DUE TO RELEVANCE usually suck in Google’s search results until you point value-passing links at them. That means Google’s Supplemental Index is more than just a “secondary index”.

To be fair to Google, I’ve only recently learned that they may be hashing the text for Supplemental pages (I say “hashing” but I’m using the term loosely to save space here). If that’s the case, then I understand why they cannot allow Supplemental Results Pages to rank highly — they just don’t know what’s on those pages. But that’s a subject for another day.

Here is where the SEO mythology becomes confused: Where Google is concerned, you pretty much MUST get value-passing links for any document you want to rank above other RELEVANT documents. Chasing the long-tail of search is easy, but when you’re competing with PageRank-rich content, just being the most authoritative and relevant information source doesn’t work with Google. You HAVE TO BE IN THE MAIN WEB INDEX to rank above less relevant content.

It doesn’t matter how often your page is crawled. But frequent crawling (and caching) MAY indicate that you have enough value to be able to tweak your on-page optimization.

HOWEVER, in support of Jill’s point of view, let me say this: If you optimize your content first, then you’ll know when your page is in the Main Web Index without looking at cache dates and crawl rates. How will you know? Your page will rank well in the search results (because it’s both optimized AND because you got value-passing links to point to it).

So the choice is yours: You can add to your burden by clocking cache updates or you can just stick to the basics and get the job done more quickly.

If you want or are required to chase algorithms, you need to clock data changes in the SERPs. But most SEO technicians and Web promoters don’t need to clock data changes. If you just do the work, it should pay off. If you don’t see improvement, you either need to adjust your on-page optimization, your link profile, or both.
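For those who do need to clock data changes, the mechanics don’t have to be elaborate. Here is a rough sketch that compares two hand-recorded snapshots of the top results for a query and reports what moved, appeared, or dropped out; the URLs are placeholders for whatever you record.

```python
# Sketch of clocking SERP data changes: compare two snapshots of the top
# results for a query (recorded by hand or with your own tooling) and
# report what moved, appeared, or dropped out. URLs are placeholders.
yesterday = [
    "http://example.com/page-a",
    "http://example.org/page-b",
    "http://example.net/page-c",
]
today = [
    "http://example.org/page-b",
    "http://example.com/page-a",
    "http://example.info/page-d",
]

old_rank = {url: i + 1 for i, url in enumerate(yesterday)}
new_rank = {url: i + 1 for i, url in enumerate(today)}

for url in sorted(set(old_rank) | set(new_rank)):
    before, after = old_rank.get(url), new_rank.get(url)
    if before is None:
        print("NEW at #%d: %s" % (after, url))
    elif after is None:
        print("DROPPED (was #%d): %s" % (before, url))
    elif before != after:
        print("MOVED #%d -> #%d: %s" % (before, after, url))
```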

SEO Myth About SEO Myths No. 3

I’m going to pick on Paul O’Brien’s SEO Myths post now. He starts off with: “SEO is all about secret tactics”.

Is that a myth? Yes. Is there any truth to it? Yes. More precisely, efficient search engine optimization relies upon use of undisclosed resources. Now, you can be completely open about how you do everything and still be effective, but you will NOT be efficient. The reason you lose efficiency through full disclosure of tactics is that any competitor who keeps you in the dark has an advantage over you.

But can there really be “secret techniques”? In this industry, there are relatively few secret methods if any. You have two things to work with: links and content. On the other hand, nature builds some very complex elements out of neutrons, protons, and electrons — and we combine those complex elements into even more complex compounds.

Don’t underestimate the power of using simple ideas in complex patterns. The secrets that the best SEOs keep generally fall into two categories: resources and combinations of tactics and techniques. There are, in fact, proprietary SEO service models. They are proprietary because the details of implementation are not shared openly. But don’t confuse keeping quiet about what everyone else seems to be doing with actually doing things that most people are not doing.

The process of weaving cloth is fairly simple and straightforward; nonetheless, we manage to weave cloth in many different patterns that produce different textures, thicknesses, elasticities, and other qualities beyond color. Some cloths are even bullet-proof.

There ARE SEO secrets, and if you think there aren’t, that just means you don’t have any.

SEO Myth About SEO Myths No. 4

“Keyword Density is about the repetition of keywords.” I’ve found this claim in several places, and it leaves me wondering where people get their information from.

Keyword Density is a valid measurement from old Information Retrieval science. It was used in closed document systems to find which documents were most likely to be relevant to a particular term. Some search engines used Keyword Density in the 1990s. We’re approaching the end of 2008, but a fair number of people still talk about Keyword Density.

On the other hand, Keyword Frequency IS a signal that makes a difference. How do I know this? Because I can point to many search results where value-passing link-rich sites are outranked by relatively link-poor documents that just beat the crap out of the query expressions. And most of the value people are counting on from their links is in the anchor text anyway. The vast majority of SEOs depend on Keyword Frequency for the success of their campaigns.
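To make the distinction concrete, here is a quick sketch of how the two measurements differ: frequency is a raw count, while density divides that count by the length of the document. The sample text and phrase are placeholders, and real engines tokenize far more carefully than this.

```python
# Sketch contrasting the two measurements: raw keyword frequency (a count)
# versus keyword density (count relative to total word count).
import re

def tokenize(text):
    return re.findall(r"[a-z0-9']+", text.lower())

def keyword_frequency(text, phrase):
    """How many times the phrase occurs in the text."""
    words, target = tokenize(text), tokenize(phrase)
    n = len(target)
    return sum(words[i:i + n] == target for i in range(len(words) - n + 1))

def keyword_density(text, phrase):
    """Occurrences of the phrase relative to the total word count."""
    words = tokenize(text)
    return keyword_frequency(text, phrase) / len(words) if words else 0.0

sample = ("Blue widgets are sturdy. Our blue widgets ship fast, "
          "and blue widgets are cheap.")
print("frequency:", keyword_frequency(sample, "blue widgets"))          # 3
print("density: %.1f%%" % (100 * keyword_density(sample, "blue widgets")))
```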

Keyword Frequency works. When a search engine evaluates a document’s relevance to a topic, it only has a few options to consider: placement of words, emphasis of words, rarity of words, and repetition of words. When you’re looking for collections of documents and ordering those collections, you can also look at co-occurrence, but while your analysis is confined to a single document, you have to look at placement, emphasis, rarity, and repetition. There is little else to consider.

Most SEOs cover placement through titles, page URLs, and meta tags (which is a very limited point of view, in my opinion).

Most SEOs cover emphasis through Hx, bold, and italics (a tactic I have often advocated, but there are other ways to provide emphasis).

Most SEOs don’t really consider rarity because they are chasing query patterns established by searchers.

And most SEOs cover frequency (repetition) either through density analysis (a waste of time) or through inbound link anchor text (extremely inefficient). These are safe plays because everyone knows that if you just repeat an expression (repeat an expression, repeat an expression, repeat an expression, repeat an expression, repeat an expression, repeat an expression) many times, it looks spammy and silly and search engines will either filter it out, penalize you, or ban you.

Worse, you may be OUTED for your excess repetition. (Matt — be gentle with me, please)

The bottom line is: If you want to use an expression 100 times on your page, do so. Just be sure the page is worth reading. But that’s not a miracle cure because someone else may get 200 value-passing links to outrank you.

And there are at least 199 other signals that, collectively, may tip the SERPs in favor of another page — but in general Repetition Is The God-Emperor of Search Optimization.

If you cannot do anything else to optimize for search, find a way to repeat what you say.

If you cannot do anything else to optimize for search, find a way to repeat what you say.

Repetition done right may help you overcome obstacles you believed were insurmountable.

www.seo-theory.com

published @ October 21, 2008
