
12 Easy Mistakes that Plague Newcomers to the SEO Field


I've been working with a lot of newcomers to SEO lately thanks to our PRO membership Q+A (BTW - sorry for the delays, the volume's tripled in the last 3 weeks, so we're a bit overwhelmed). It's been a great learning experience, and I've gotten to see many of the struggles and misconceptions that trip up newcomers to the field. As a partial remedy, I thought I'd take some time tonight to cover a few of the worst offenders:

  1. Repetitive Keyword Targeting
    If you're targeting a specific keyword term or phrase, it's not necessary, and often ill-advised, to place that keyword in the title tag, H1 and body text of every page on your site. It's certainly OK to use the term/phrase in passing and when relevant, but remember that rankings are earned by pages, not sites. A good rule is to target one specific keyword term/phrase per page (sometimes more); only in rare circumstances (like when you're trying to get a secondary, indented listing) do you actually want to target the same term on multiple pages.
  2. Splitting Efforts Across Many Domains
    If you've found 10 reliable sources to get links, don't fall into the trap of thinking you can register 500 domains, get them all links from your 10 sources, then link from those sites back to your main domain and suddenly appear to have a diverse domain backlink profile. Splitting up your efforts on multiple domains is, in my opinion, rarely advisable (see this post for more) and can seriously detract from your goal of gaining rankings on a single site. The days of link farms and link islands are long gone - with link acquisition, quality is slowly but surely gaining the upper hand over quantity, so use strategies that will get you the right links, not just any link.
  3. Reciprocal Linking
    I'm not sure what happened, but reciprocal linking seems to be making a comeback (and both Google & Yahoo! have recently been doling out penalties to sites engaging in it). It's not that reciprocal linking is inherently bad - if I link to Aaron's SEOBook (a great site, BTW), and he links to me (which he sometimes does), that's not what I'd call "reciprocal linking." I'm referring to the practice of creating link list pages on websites and then trading with other link list pages on other sites - the "you link to me and I'll link to you" phenomenon. These aren't hard to spot algorithmically, and we see penalty after penalty (or at least, devaluation) hitting sites that leverage this tactic.
  4. Keyword Stuffing
    If you're not ranking for a given keyword, placing a few dozen more instances of it on your page is very rarely the answer. Folks have been asking about modified versions of their keywords, whether they need to add more related text content, and whether they need to use the term more times per sentence or per paragraph - my answer is always the same. Once you've got your keyword in your content a few times, in your H1, title and URL, and maybe in the alt attribute of an image, you're 80-90% of the way there with on-page optimization (there's a bare-bones sketch of this below the list). The content needs to be valuable to a human (so you can earn links, interest, return visits and sharing), not more "optimized" for search engines with repetitions of your keyword.
  5. Blocking Bots Access to Duplicate Content
    I think fear plays a major part in this one. People read that duplicate content causes a penalty, so they block bots from accessing duplicate versions of their content. The problem is that you lose the link juice those pages may accumulate and potentially restrict access to pages that would let the engines better crawl and index your site. If you want to fight duplicate content, try either 301-redirecting the duplicate version back to the original or employing the new canonical URL tag (if & when appropriate) - both are sketched below the list. Don't just go blocking bots from pages unless you're sure you know what you're doing.
  6. Avoiding XML Sitemaps
    Maybe it's my fault for previously recommending that webmasters not submit sitemaps. Let me just go on record again as saying that I have come 180 degrees on this and now completely endorse sitemaps for nearly every kind of site. In every instance we've seen them used properly, they've added significant boosts in search traffic over a very short period. If you want an easy win and haven't yet invested in the sitemaps XML protocol, go do it (there's a sample file below the list).
  7. Blocking Bots Rather than Using Nofollow
    I think there's been some confusion about how PageRank sculpting works. You should NOT block bot access to pages you don't want to send link juice to - in fact, this doesn't even accomplish that goal. If a link points to a page, even if that page is blocked via robots.txt or meta robots noindex, the link still accrues link value. The only way to stop a link from passing juice is to use a nofollow (or to build the link via external JavaScript or embed it in an object the engines can't parse) - see the example below the list. Please be careful about what you noindex or block from bots and why - you don't just hurt that page, you can hurt downstream areas of your site by walling off navigation paths (and even stop link juice that flows in from flowing out).
  8. Paranoia About Registering with Google Webmaster Tools
    Unless you truly are nefarious and have designs around link manipulation and scheming, there's really nothing to fear in registering your site with Google's Webmaster Tools. The value of having their metrics on hand and knowing what they report (especially with non-accessible pages) almost always outweighs the tinfoil hat theories.
  9. Ignoring Non-Google Search Engines
    Why limit yourself to Google!? Just because they're the market leader doesn't mean the additional 15-20% of search traffic that comes from Yahoo!, MSN & Ask isn't worthwhile. Most people would salivate at a 15-20% boost in search traffic, yet they continue to ignore the other engines. As a first step, at least register with Live Webmaster Tools and Yahoo! Site Explorer and send them your sitemap (there's a one-line robots.txt shortcut for this below the list). Beyond that, Ben's research next week will go into a bit more about the ranking algorithm differences between these three engines.
  10. Using Google's Link Command
    The Google link: command (e.g., searching Google for link:www.example.com) presents a (very) small sample of the links Google knows about, and while it does appear to have some degree of accuracy for comparison purposes, the actual links it reports are in no way particularly significant. Just because a link does or does not show via this command doesn't make it better/worse/more interesting than any other link you have (or should try to acquire).
  11. Submitting Articles for Links
    I'm not always against the practice of submitting articles or content to other sites (heck, YOUmoz is one of the best links an SEO can get, IMO), but beware of submitting the same article over and over to dozens of sites in the hope of getting more links. The value passed is not high (at least, from 99% of these sites), and you lose the value of having great content on your own site (and earning the search traffic and links that might have flowed to it naturally). Just be smart about why you're pursuing this strategy and mindful of the benefits and drawbacks.
  12. Chasing DoFollow Blogs
    When did dofollow blogs become the link acquisition practice du jour? Honestly, I have yet to see competitive rankings earned from a dofollow blog link strategy, and the focus on it to the exclusion of other, more valuable and scalable linking tactics is a cause for concern. When you're starting a link building campaign, remember the immortal words "nothing worth having comes easy."
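
A few of the points above are easier to show than tell, so here are some quick sketches (the URLs, domains and keywords in all of them are made up for illustration). First, the on-page picture from #4 (and #1): a bare-bones page targeting the hypothetical phrase "blue widgets" - this handful of placements gets you 80-90% of the way there.

    <!-- Hypothetical page at http://www.example.com/blue-widgets (keyword in the URL) -->
    <html>
    <head>
      <title>Blue Widgets | Example Store</title>            <!-- keyword in the title tag -->
    </head>
    <body>
      <h1>Blue Widgets</h1>                                   <!-- keyword in the H1 -->
      <p>Our blue widgets come in three sizes... (the phrase appears a few times, naturally, in the copy)</p>
      <img src="/img/blue-widget.jpg" alt="A blue widget" />  <!-- keyword in an image alt attribute -->
    </body>
    </html>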
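
For #5, here are the two duplicate-content fixes, again with hypothetical URLs. The first is a 301 redirect, shown as an Apache .htaccess rule (adjust for your server); the second is the new canonical tag, which goes in the <head> of the duplicate version and points engines at the original:

    # .htaccess (Apache) - 301 the duplicate URL back to the original
    Redirect 301 /duplicate-page.html http://www.example.com/original-page.html

    <!-- Or, in the <head> of the duplicate page: -->
    <link rel="canonical" href="http://www.example.com/original-page.html" />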
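
For #6, the sitemaps XML protocol (documented at sitemaps.org) is simple enough to hand-roll for a small site. A minimal file looks like the sketch below - only the <loc> element is required; <lastmod>, <changefreq> and <priority> are optional hints:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2009-03-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/blue-widgets</loc>
        <lastmod>2009-02-15</lastmod>
      </url>
    </urlset>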
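
To make the distinction in #7 concrete: a nofollow lives on the link itself and stops it from passing juice, while robots.txt only restricts crawling of the target page and does nothing to the links pointing at it. Using a hypothetical /login page:

    <!-- This keeps the link from passing juice: -->
    <a href="/login" rel="nofollow">Log in</a>

    # This robots.txt rule does NOT stop links to /login from accruing value -
    # it only keeps compliant bots from crawling the page:
    User-agent: *
    Disallow: /login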
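
And a shortcut for #9: the major engines (Google, Yahoo!, Live Search & Ask) all support sitemap autodiscovery via robots.txt, so a single line covers engines you haven't registered with directly (hypothetical URL again):

    # robots.txt at http://www.example.com/robots.txt
    User-agent: *
    Disallow:

    Sitemap: http://www.example.com/sitemap.xml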

Please feel free to contribute your own frequently-seen misconceptions or mistakes made by new SEOs. And for those reading this who are new - don't feel bad at all! I can't count the number of pitfalls I fell into in my first few years tackling organic search. From 2002-2004, I was as green as they come, and it really took a half dozen campaigns and lots of help from the good people in the SEO industry to pull me out.

www.seomoz.org

Published March 4, 2009
