Eric Nagel

CTO, PHP Programmer, Affiliate Marketer & IT Consultant

Feeding the Google Panda Keyword-Rich Subdomains

I have been, or can be if you click on a link and make a purchase, compensated via a cash payment, gift, or something else of value for writing this post. Regardless, I only recommend products or services I use personally and believe will be good for my readers.

There’s been talk lately that the Google Panda loves keyword-rich subdomains, so I decided to test this on my coupon website.

Aside from a WordPress blog, the pages on the coupon website are all driven from a coupon database. I use a system I wrote myself, although you can just as easily use a service like For Me to Coupon.

The merchant pages on the site look like this:

The “m925” part in the filename tells my system this is merchant 925 in the database. If you change that ID, you’ll be shown another merchant’s coupons instead (and the URL will be corrected).
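The post doesn’t show the lookup code, but extracting an ID like “m925” from a filename is a one-line regular expression. Here’s a hypothetical sketch (the function name and sample filenames are made up for illustration):

```php
<?php
// Hypothetical sketch: pull the merchant ID out of a filename such as
// "staples-m925.php". Returns null when no "-m<digits>.php" suffix is found.
function merchant_id_from_filename($cFilename) {
    if (preg_match('/-m(\d+)\.php$/', $cFilename, $aMatches)) {
        return (int) $aMatches[1];
    }
    return null;
}
```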

Previously, merchant pages were served from the main domain, but I wanted to feed the Google Panda by putting the merchant name in the subdomain. Of course, adding all of those DNS entries wasn’t an option, and adding all of those ServerAliases in Apache wasn’t an option, either. After all, there are nearly 3,000 merchants in the database. So to get over this hurdle, I’m using wildcard DNS.

The first step is to add an A-record to your zone file, such as:

A	*

Of course, you want to point this to your server, not mine.
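In BIND zone-file syntax, a wildcard record looks something like the following (example.com and the IP address are placeholders; substitute your own domain and server address):

```
; Wildcard A-record: any subdomain of example.com resolves to the web server.
*.example.com.    IN    A    203.0.113.10
```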

Then, you need to add a ServerAlias to the Apache configuration. I’m using Plesk, so I create a vhost.conf file in /var/www/vhosts/ which contains:

ServerAlias *
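If you’re not on Plesk, the equivalent in a plain Apache configuration is a wildcard ServerAlias inside the site’s virtual host. A sketch, with placeholder domain and paths:

```
<VirtualHost *:80>
    ServerName example.com
    # Wildcard alias: every subdomain of example.com is served by this vhost.
    ServerAlias *.example.com
    DocumentRoot /var/www/vhosts/example.com/httpdocs
</VirtualHost>
```

Under Plesk, the vhost.conf file is included inside the generated VirtualHost block, so only the ServerAlias directive itself belongs there.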

Now, you can go to any subdomain of the site and it will come up.

The next step is to automatically redirect the pages to their new, keyword-friendly subdomain URLs. Here’s how I’m checking to make sure the user is on the right page, and redirecting if not:

$cURIActual = 'http://' . $_SERVER['SERVER_NAME'] . $_SERVER['REDIRECT_URL'];
$cURIExpected = 'http://' . strtolower(simplify($rsMerchantData['cName'], true)) . '' . simplify($rsMerchantData['cName']) . '-m' . $rsMerchantData['nMerchantID'] . '.php';

if ($cURIActual != $cURIExpected) {
	header("Location: $cURIExpected", TRUE, 301);
	exit; // stop processing once the redirect has been sent
} // ends if ($cURIActual != $cURIExpected)

Here’s the simplify function:

function simplify($cString, $bNoDashes = false) {
	$cString = str_replace("'", '', $cString);                 // drop apostrophes entirely
	$cString = preg_replace("/[^A-Za-z0-9]/", "-", $cString);  // every other non-alphanumeric character becomes a dash
	$cString = preg_replace("/-+/", "-", $cString);            // collapse runs of dashes into one
	$cString = preg_replace("/-$/", "", $cString);             // trim a trailing dash
	if ($bNoDashes) {
		return str_replace('-', '', $cString);
	} // ends if ($bNoDashes)
	else {
		return $cString;
	} // ends else from if ($bNoDashes)
} // ends function simplify($cString)

Hopefully you can follow the PHP coding.
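As a quick illustration (the merchant name below is made up), here’s what simplify() produces; the function is repeated so the example is self-contained:

```php
<?php
// simplify() as defined above, condensed; repeated here so this runs standalone.
function simplify($cString, $bNoDashes = false) {
    $cString = str_replace("'", '', $cString);                // drop apostrophes
    $cString = preg_replace("/[^A-Za-z0-9]/", "-", $cString); // non-alphanumerics to dashes
    $cString = preg_replace("/-+/", "-", $cString);           // collapse dash runs
    $cString = preg_replace("/-$/", "", $cString);            // trim trailing dash
    return $bNoDashes ? str_replace('-', '', $cString) : $cString;
}

echo simplify("Joe's Coupons & Deals") . "\n";       // Joes-Coupons-Deals
echo simplify("Joe's Coupons & Deals", true) . "\n"; // JoesCouponsDeals
```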

So if you have a large database-driven site, you can use wildcard DNS to create the appearance of many, many subdomains. Just be sure to put checks in place so you don’t end up with thousands of copies of the same page across all of the subdomains.
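One way to implement such a check is to compare the requested host against the single canonical subdomain for that merchant and 301 everything else. A sketch, where the function name, base domain, and slugs are all illustrative:

```php
<?php
// Sketch: return true when the requested host is NOT the merchant's one
// canonical subdomain, so the caller can issue a 301 redirect. Because the
// wildcard matches every subdomain, this is what prevents duplicate copies
// of the same page from being served at thousands of hostnames.
function needs_redirect($cHost, $cCanonicalSlug, $cBaseDomain) {
    $cExpected = strtolower($cCanonicalSlug) . '.' . $cBaseDomain;
    return strtolower($cHost) !== $cExpected;
}
```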

  • LGR
    Posted August 9, 2011 2:47 pm

    I read that post over at SEOBook as well, and while this might be true right now, Google is not going to keep it that way. Of course, I don’t know the mind of Google, but I’d say that by creating hundreds or thousands of subdomains, Google will flag the root domain as web spam, and then you could have bigger problems to worry about.

    • Eric Nagel
      Posted August 9, 2011 2:51 pm

      Hey Lee – yeah, I agree Google will adjust this soon, but I don’t think they’ll label this any more or less spammy than using mod_rewrite rules to make keyword-rich filenames.

  • Igor
    Posted August 10, 2011 5:07 pm

    “Just be sure to put checks in place…” – You might want to drop subdomains for your Privacy Policy and Disclosure pages. It’s not that important, though. Just a heads up.

    • Eric Nagel
      Posted August 10, 2011 7:54 pm

      Yeah, those should be noindex, even on the main www site. I’ll add it to my list.

  • Joe Zepernick
    Posted August 24, 2011 8:19 am

    Great post, Eric. Are you throwing all those URLs into a sitemap for the Google?

    • Eric Nagel
      Posted August 26, 2011 5:01 pm

      I’m not 100% sure how to do that. I don’t want to submit all of them… and I don’t think I can submit a sitemap index from the root.

      If / once I figure it out, I’ll let you know

  • richard v
    Posted October 15, 2011 7:12 pm

    Eric, Any updates on SERP rankings after making the changes?

    • Eric Nagel
      Posted October 16, 2011 3:04 pm

      Here’s a look at Google Analytics – can you guess when I turned on the subdomains, and when I turned them off?
      [image: Subdomains Google Panda (Google Analytics traffic graph)]

      Funny thing… that’s all Bing traffic – subdomains had no impact on Google.

      I turned them off to sell the site, but it looks like I should turn them back on again!
