Eric Nagel

CTO, PHP Programmer, Affiliate Marketer & IT Consultant

ASW11 Session Recap: SEO Site Review

I have been, or can be if you click on a link and make a purchase, compensated via a cash payment, gift, or something else of value for writing this post. Regardless, I only recommend products or services I use personally and believe will be good for my readers.

It’s hard to write a session recap on the SEO Site Review/Clinic, because site owners submit their sites to the panel (Rae Hoffman-Dolan, Michael Gray, Kenny Hyder, and Michael Streko) and the critique is done on-the-fly, with most of the content specific to the site owner. However, there were some useful tips that can be applied to other sites.

The first site reviewed was an aggregator site, which comes with a huge problem: duplicate content. When you’re pulling data from other sites, do something to it to make it unique (more on this in an upcoming post).

A common theme was that links to internal pages (contact page, privacy policy, etc.) shouldn’t be nofollowed; instead, those pages should use the robots meta tag with noindex, nofollow.
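In practice that’s a single tag in the page’s head (note Rae’s clarification in the comments below that the nofollow part isn’t strictly needed):

```html
<!-- On the contact page, privacy policy, etc. -->
<meta name="robots" content="noindex, nofollow">
```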

This site had a lead form on every page, and it was recommended the form be put in an iframe. If the content isn’t changing, and it’s something that can be cached (like a form), why not?
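A sketch of what that might look like — the URL and dimensions here are placeholders, not from the reviewed site:

```html
<!-- The form lives at its own cacheable URL and is embedded on every page -->
<iframe src="/lead-form.html" width="300" height="400" title="Lead form"></iframe>
```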

[Photo: SEO Site Review Panel at Affiliate Summit West 2011]

The next site reviewed had plenty of opportunities for improvement. Without getting into specifics, the one takeaway anyone can use is: if you’re going to buy links, point them to a sub-site, not your main site.

The site after that was an interesting example, with the opposite problem of all the others: all of its problems were on-site. It had plenty of backlinks, but the on-site SEO was terrible.

If you’re running a product-driven site, and the product is unavailable, do not 404 the page. Instead, 301 (permanent) redirect it to the category the product was in.
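On an Apache server, for example, that can be a one-liner in .htaccess — the paths here are hypothetical:

```apacheconf
# Permanently redirect a discontinued product page to the category
# it lived in, instead of letting it 404 (example paths only)
Redirect 301 /products/blue-widget /products/widgets/
```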

So many of these sites had simple diagnostic problems that can be solved in less than 20 minutes, like the XML sitemap throwing a 404 (Not Found) or the robots.txt returning a 403 (Forbidden). Don’t rush these things, and get your site into Google Webmaster Tools to verify the bots can crawl it successfully.

Another site was up next, with long page titles and even longer meta descriptions. Keep page titles (the title tag) under 70 characters, and the meta description at 150 characters or less (that includes spaces).
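Those length limits are easy to check programmatically. Here’s a minimal Python sketch — the function name and warning wording are mine, not from the session:

```python
def check_lengths(title, description):
    """Return a list of warnings for an over-long title or meta description."""
    warnings = []
    if len(title) > 70:
        warnings.append("title is %d chars; keep it under 70" % len(title))
    if len(description) > 150:
        warnings.append("meta description is %d chars; keep it at 150 or less" % len(description))
    return warnings

# An 80-char title and 160-char description both get flagged
print(check_lengths("A" * 80, "B" * 160))
```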

Their blog was a mess, with permalinks set to the default ?p=## and the tag pages being indexed. A blog is a great opportunity to get another listing in the SERP, as a blog post will be indexed, but a tweet won’t. However, it’s believed by the panel that links in tweets have an indirect SEO benefit.
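In WordPress, for instance, the default ?p=## can be changed under Settings → Permalinks to a keyword-friendly structure such as:

```
/%postname%/
```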

As you’re obtaining links, it’s better to get them on a content page, within the relevant content, versus a side-bar or footer link.

If you’d like to get more content on a page, without cluttering the page, use Read More expanding divs.
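A bare-bones sketch of the idea — the id and inline script are illustrative only:

```html
<div id="more-content" style="display: none;">
  ...the additional content goes here...
</div>
<a href="#" onclick="document.getElementById('more-content').style.display = 'block'; this.style.display = 'none'; return false;">Read More</a>
```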

Finally, if you buy a domain that was doing spammy things or blackhat link building, contact Google, explain the situation, and ask them to wipe the slate clean. Just remember that they don’t wipe out only the bad things and keep the good — they wipe it 100%.

This was the first SEO Site Review Clinic I’ve attended, although I’ve watched the videos of previous clinics. If you attended a different session during this time, or otherwise missed this session, I highly recommend watching the video once released in about a month.

  • Vinny O'Hare
    Posted January 15, 2011 3:30 pm

    Great recap of the session. It is my favorite part of Affiliate Summit. Lesson learned is that if you have a questionable site and you are buying links, don’t comment on Matt Cutts’ blog and put the URL in the siggy.

  • Sharon
    Posted January 15, 2011 3:51 pm

    I will watch it again for the pure entertainment value & in the hopes of understanding more! 🙂 Excellent session.

  • Rae Hoffman-Dolan
    Posted January 15, 2011 4:24 pm

    “shouldn’t be nofollow, but rather use the robots meta tag with noindex, nofollow.”

    It was Streko that recommended that in specific, but to be a bit clearer, he meant to noindex the page and not nofollow it. (You don’t need the nofollow in the meta directive, only the noindex: <meta name="robots" content="noindex">)

    Noindex can be done via a meta tag as you mentioned and I’ve shown above or through the robots.txt file (last I checked, it’s only obeyed by Google, but that might have changed):

    Noindex: /page.html

    I prefer to do it via the latter so I always remember what’s been noindexed and what hasn’t. Just my two cents. Thanks for finding the session worth covering!

  • Jeff Zickgraf
    Posted January 15, 2011 6:25 pm

    Nice coverage of the session, especially the good point on 404 errors for product unavailability. Interestingly, one of the sites reviewed had an excellent 404 page where they noted the user could get x% off with the 404 page’s coupon code. Best 404 page idea I’ve seen yet — it would probably get me to at least enter a search on their site, or use their menu to see if I could find what I came in for from Google!

  • Streko
    Posted January 17, 2011 12:54 pm

    Thanks for the coverage.

    And thanks Rae for clearing that up. What she said is what I meant.
