Why You Need a Mobile Version of Your Website

If you share links to your website via social media (Facebook, Twitter, Google+), you need to be ready for mobile visitors.

First thing in the morning, as I wait for my first cup of coffee to hit my bloodstream, I’m on my phone. I start with email and answer anything urgent. Then I head over to Facebook to catch up on what happened while I slept. And finally, I open TweetDeck and go through my Twitter feed.

When I see a story I want to read and click through, many times the linked site doesn’t have a mobile version. When that happens, I end up going back and favoriting the tweet to read later. But making your site mobile-friendly isn’t difficult. In fact, if you use WordPress, you can have it done in less than five minutes.
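
The five-minute route is usually a plugin, but even without one, WordPress can tell you when a visitor is on a phone. Here’s a minimal sketch, assuming WordPress 3.4 or newer (which ships with wp_is_mobile()) and a mobile.css stylesheet of your own making:

// In your theme's functions.php: load a lighter stylesheet for phone visitors.
// mobile.css is a hypothetical file; create your own or use a plugin instead.
function my_mobile_styles() {
	if (wp_is_mobile()) {
		wp_enqueue_style('mobile-style', get_stylesheet_directory_uri() . '/mobile.css');
	}
}
add_action('wp_enqueue_scripts', 'my_mobile_styles');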





Google Affiliate Network API PHP Script


I was excited when I read that Google had released an API for their Affiliate Network, since I wanted to automate pulling sales data from GAN. But I quickly became disheartened when I realized how difficult it is to use.

Since this was for my own use, and not a web-based service others would rely on, the Simple API method of authorization seemed good enough. After getting it to work with Picasa, but not GAN, I asked for some help and was introduced to the OAuth 2.0 Playground. It showed me the headers I’d need to send, and how OAuth 2.0 works.

For the record: the Google Affiliate Network API does NOT support Simple API Access. You cannot access the service with an API Key and IP locking.

I now have a working script, and have written up step-by-step instructions on how you can pull orders from GAN automatically each day.

This script is not complete, as it’s up to you to do something with the data once you have it. You can also modify the final call to pull advertisers, instead of orders.
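
To give you a taste of the approach before the full write-up: the heart of it is simply sending the OAuth 2.0 access token in an Authorization: Bearer header. This is only a sketch; the endpoint URL and publisher ID below are placeholders, and $access_token is assumed to come from the token exchange the Playground walks you through.

// Placeholder endpoint: substitute the orders/events URL and your own publisher ID
// from the GAN API docs. $access_token comes from your OAuth 2.0 token exchange.
$url = 'https://www.googleapis.com/gan/v1beta1/publishers/YOUR_PUBLISHER_ID/events';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Bearer ' . $access_token));
$json = curl_exec($ch);
curl_close($ch);

$aOrders = json_decode($json, true);
// ...loop through $aOrders and store whatever you need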




Google Rank Checker API

I read the other day that "It would make life so much easier if Google had a paid rank checker API." Google might not have one, but Raven Tools does.

December 7, 2012 Update

Raven Tools no longer supplies SERP rank data. Use Microsite Masters instead.

"It would make life so much easier if Google had a paid rank checker API."
– Matt Sawyer

Not many people know this, but Raven Tools has an API.
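
I won’t rehash their documentation here, but the general pattern is a simple keyed HTTP request. Treat this as a rough sketch only; the method and parameter names below are assumptions for illustration, so check the Raven Tools (or Microsite Masters) docs for the real ones.

// Hypothetical example of a keyed rank-checker API call; the method name and
// parameters here are placeholders, not a guaranteed match for Raven's actual API.
$params = array(
	'key'     => 'YOUR_API_KEY',
	'method'  => 'rank',        // hypothetical method name
	'domain'  => 'example.com',
	'keyword' => 'green tea',
	'format'  => 'json',
);
$json = file_get_contents('https://api.raventools.com/api?' . http_build_query($params));
$aRankData = json_decode($json, true);
print_r($aRankData);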





Creating an Image Sitemap from a Datafeed

Google just announced an image sitemap format which will help get images from your site indexed. I thought this was a perfect extension to the datafeed series.

The first step is to make Google think we have the images on our server. So inside an “images” folder, create “image.php” like so:


	$cQuery = "select * from products where ProductID=" . (int)$_GET['ProductID'] . " limit 1";
	$oResult = mysql_query($cQuery);
	$rsData = mysql_fetch_array($oResult);

	header('Content-Type: image/jpeg');
	$fp = fopen($rsData['Thumbnail'], "r");

Next, at the root of the site, create images.php:

<?php
header('Content-Type: text/xml');

$aProductsToInclude = array("Green Tea", "White Tea", "Black Tea", "Oolong Tea", "Iced Tea");

$cQuery = "select * from products where (";
foreach ($aProductsToInclude as $cCategory) {
	$cQuery .= "MerchantSubcategory like '%" . $cCategory . "%' or MerchantCategory like '%" . $cCategory . "%' or ";
} // ends foreach ($aProductsToInclude as $cCategory)
$cQuery = ereg_replace(" or $", ")", $cQuery); // trim the trailing " or " and close the parenthesis
$cQuery .= " and Thumbnail<>'' order by Name";
// echo("$cQuery");
$oResult = mysql_query($cQuery);

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
<?php while ($rsData = mysql_fetch_array($oResult)) { ?>
	<url>
		<loc>http://www.greenwhiteandblacktea.com/<?= simplify($rsData['Name']) ?>-p<?= $rsData['ProductID'] ?>.php</loc>
		<image:image>
			<image:loc>http://www.greenwhiteandblacktea.com/images/<?= simplify($rsData['Name']) ?>-i<?= $rsData['ProductID'] ?>.jpg</image:loc>
			<image:caption><?= $rsData['Name'] ?></image:caption>
		</image:image>
	</url>
<?php } // ends while ($rsData = mysql_fetch_array($oResult)) ?>
</urlset>


What this does is create an image sitemap file on the fly.

Finally, add to your .htaccess:

RewriteRule ^images/(.*)\-i([0-9]+)\.jpg$ images/image.php?ProductID=$2 [L]
RewriteRule ^images.xml$ images.php [QSA]

Now you have an image sitemap (images.xml), and when an image URL like http://www.greenwhiteandblacktea.com/images/Bao-Zhong-Royale-Oolong-i469648571.jpg is requested, the .htaccess rule pulls out the ProductID, image.php opens the real thumbnail with fopen, and the image is passed through to the browser.

If you were slick, you’d add some error checking and caching to these scripts to make things go quicker.
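
If you want to go that route, here’s a minimal sketch of what caching might look like in image.php. It assumes the same open MySQL connection as the rest of the datafeed series, and the cache/ folder name is just something I picked:

	<?php
	// Sketch: serve a cached local copy if we have one; otherwise fetch the remote
	// thumbnail once, save it, and serve the saved copy from then on.
	$nProductID = (int)$_GET['ProductID'];
	$cCacheFile = 'cache/' . $nProductID . '.jpg'; // hypothetical cache folder

	if (!file_exists($cCacheFile)) {
		$cQuery = "select Thumbnail from products where ProductID=" . $nProductID . " limit 1";
		$rsData = mysql_fetch_array(mysql_query($cQuery));
		if (!$rsData || $rsData['Thumbnail'] == '') {
			header("HTTP/1.0 404 Not Found"); // basic error checking
			exit;
		}
		file_put_contents($cCacheFile, file_get_contents($rsData['Thumbnail']));
	}

	header('Content-Type: image/jpeg');
	readfile($cCacheFile);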

The tea niche may not be ideal for someone searching for images (think shoes!) but this at least gives you an idea of how to take advantage of this new tool by Google.



Getting Stats from Google AdWords using the API

Previously, I posted about how to automatically get your revenue figures from ShareASale, Commission Junction, and PepperJam. With that done, the next step in automating my P&L report was to get the advertising costs from Google using the AdWords API.

First, I had to set up an MCC account and apply for a developer token. In about 24 hours, my developer token was approved and I was ready to roll.

At the time I’m writing this, the AdWords API is on v2009, but reporting is only available via v13. So my script is written for the v13 service.

AdWords, like some other APIs, transfers data via the SOAP protocol.

Start by downloading the AdWords API PHP Client Library, which includes the SOAP library file. I saved these in a folder called “lib”, knowing I may use the other files at a later time.
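
One note on that folder: the only file this script actually needs from it is the NuSOAP library (that’s where the soapclientNusoap class used below comes from). The filename here is an assumption, so adjust it to match whatever is in your download:

// Pull in the NuSOAP library from the client library download.
// Adjust the path/filename to whatever is actually in your lib/ folder.
require_once('lib/nusoap.php');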

The script starts by defining SoapClientFactory:

class SoapClientFactory {
	public static function GetClient(
		$endpoint, $wsdl = false, $proxyhost = false, $proxyport = false,
		$proxyusername = false, $proxypassword = false, $timeout = 0,
		$response_timeout = 30) {
		if (!extension_loaded('soap')) {
			return new soapclientNusoap($endpoint, $wsdl, $proxyhost, $proxyport,
				$proxyusername, $proxypassword, $timeout, $response_timeout);
		} else {
			// Same NuSOAP client either way; this script only uses the NuSOAP library
			return new soapclientNusoap($endpoint, $wsdl, $proxyhost, $proxyport,
				$proxyusername, $proxypassword, $timeout, $response_timeout);
		}
	}
}
Then define the SOAP headers and set them on the report service:

$headers =
	'<email>' . $email . '</email>'.
	'<password>' . $password . '</password>' .
	'<clientEmail>' . $client_email . '</clientEmail>' .
	'<useragent>' . $useragent . '</useragent>' .
	'<developerToken>' . $developer_token . '</developerToken>' .
	'<applicationToken>' . $application_token . '</applicationToken>';

$namespace = 'https://adwords.google.com/api/adwords/v13';
$report_service = SoapClientFactory::GetClient($namespace . '/ReportService?wsdl', 'wsdl');
$report_service->setHeaders($headers); // attach the headers to every call

Next, I had to define the job I wanted to run. Read through the Report Rules to figure out what you’re looking for; for this example, I did a summary by AdGroup in one Campaign. Note: “Validating a report job costs only 1 API unit, whereas a failed call to scheduleReportJob will cost 500 API units.”

$report_job =
	'<selectedReportType>AdGroup</selectedReportType>' .
	'<name>My Report Name Here</name>' .
	'<aggregationTypes>Summary</aggregationTypes>' .
	'<campaigns>8675309</campaigns>' .

	'<startDay>' . $dYesterday . '</startDay>' .
	'<endDay>' . $dYesterday . '</endDay>' .

	'<selectedColumns>Campaign</selectedColumns>' .
	'<selectedColumns>CampaignId</selectedColumns>' .
	'<selectedColumns>AdGroup</selectedColumns>' .
	'<selectedColumns>AdGroupId</selectedColumns>' .
	'<selectedColumns>AdGroupStatus</selectedColumns>' .

	'<selectedColumns>Impressions</selectedColumns>' .
	'<selectedColumns>Clicks</selectedColumns>' .
	'<selectedColumns>Cost</selectedColumns>';


Of course, use your own Campaign ID – if you omit this, you will get all AdGroups from all Campaigns. My first report omitted this, just so I was able to see the proper value to set in there. In case you were wondering, $dYesterday was set earlier in my script as:

$dYesterday = date("Y-m-d", time()-86400);

The rest is some code I pulled from Google’s sample code, but I had to get it from a cached version of the page, so I’ll put it here, too, so everything’s in one place:

$request_xml =
	'<validateReportJob>' .
	'<job xmlns:impl="https://adwords.google.com/api/adwords/v13" ' .
	'xsi:type="impl:DefinedReportJob">' .
	$report_job .
	'</job>' .
	'</validateReportJob>';

# Validate report.
$report_service->call('validateReportJob', $request_xml);
if ($debug) show_xml($report_service);
if ($report_service->fault) show_fault($report_service);

# Schedule report.
$request_xml =
	'<scheduleReportJob>' .
	'<job xmlns:impl="https://adwords.google.com/api/adwords/v13" ' .
	'xsi:type="impl:DefinedReportJob">' .
	$report_job .
	'</job>' .
	'</scheduleReportJob>';
$job_id = $report_service->call('scheduleReportJob', $request_xml);
$job_id = $job_id['scheduleReportJobReturn'];
if ($debug) show_xml($report_service);
if ($report_service->fault) show_fault($report_service);

# Wait for report to finish.
$request_xml =
	'<getReportJobStatus>' .
	'<reportJobId>' .
	$job_id .
	'</reportJobId>' .
	'</getReportJobStatus>';
$status = $report_service->call('getReportJobStatus', $request_xml);
$status = $status['getReportJobStatusReturn'];
if ($debug) show_xml($report_service);
if ($report_service->fault) show_fault($report_service);
while ($status != 'Completed' and $status != 'Failed') {
	// echo 'Report job status is "' . $status . '".' . "\n";
	sleep(30); // give the job some time before polling again
	$status = $report_service->call('getReportJobStatus', $request_xml);
	$status = $status['getReportJobStatusReturn'];
	if ($debug) show_xml($report_service);
	if ($report_service->fault) show_fault($report_service);
} // ends while ($status != 'Completed' and $status != 'Failed')

if ($status == 'Failed') {
	echo 'Report job generation failed.' . "\n";
	exit; // no point continuing without a report
} // ends if ($status == 'Failed')

# Download report.
$request_xml =
	'<getGzipReportDownloadUrl>' .
	'<reportJobId>' .
	$job_id .
	'</reportJobId>' .
	'</getGzipReportDownloadUrl>';
$report_url = $report_service->call('getGzipReportDownloadUrl', $request_xml);
$report_url = $report_url['getGzipReportDownloadUrlReturn'];
if ($debug) show_xml($report_service);
if ($report_service->fault) show_fault($report_service);
echo 'Report is available at "' . $report_url . '".' . "\n";

At this point, we have the URL where the report is available, so I use wget to download it, then extract it.

`/usr/bin/wget -q --output-document=temp/$dYesterday.xml.gz $report_url`;
`/bin/gunzip temp/$dYesterday.xml.gz`;

(Those are backticks, not single quotes.) Then I use simplexml_load_file to load the report into a variable, which I can then work with:

$xml = simplexml_load_file('temp/' . $dYesterday . '.xml');
$namespaces = $xml->getNamespaces(true);

The script then loops through the rows of data and converts each row into an easily usable array; after that, it’s up to you to actually do something with it:

for ($i = 0; $i < count($xml->table->rows->row); $i++) {
	$rsData = array();
	foreach ($xml->table->rows->row[$i]->attributes() as $var => $val) {
		$rsData[$var] = (string)$val;
	} // ends foreach ($xml->table->rows->row[$i]->attributes() as $var => $val)

	// print_r($rsData) for one row looks something like this:
	//	[campaignid] => 8675309
	//	[campaign] => My Campaign Name
	//	[adgroupid] => 4815162342
	//	[adgroup] => First AdGroup Name
	//	[agStatus] => Enabled
	//	[imps] => 37
	//	[clicks] => 1
	//	[cost] => 300000

	// ...do something useful with $rsData here (insert it into your P&L database, etc.)
} // ends for ($i = 0; $i < count($xml->table->rows->row); $i++)

If you notice, the cost seems really high. I did not pay $300,000 for a single click – you have to massage that figure a bit:
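
The API reports cost in micros (millionths of your account currency), so divide by 1,000,000 to get the real figure:

$nCost = $rsData['cost'] / 1000000; // 300000 micros works out to $0.30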


Finally, remove the XML file from the server:

unlink('temp/' . $dYesterday . '.xml');

And tack some helper functions onto the end:

function show_xml($service) {
	echo $service->request;
	echo $service->response;
	echo "\n";
} // ends function show_xml($service)

function show_fault($service) {
	// print_r($service);
	echo "\n";
	echo 'Fault: ' . $service->fault . "\n";
	echo 'Code: ' . $service->faultcode . "\n";
	echo 'String: ' . $service->faultstring . "\n";
	echo 'Detail: ' . $service->faultdetail . "\n";
} // ends function show_fault($service)
It looks like a lot of programming, but really, if you just put it all together and adjust the variables as necessary, you’ll be able to automate the process of importing your AdWords advertising costs into whatever tracking system you use.
