PHP script to send test emails on a schedule to make sure you are able to receive email

Ever wondered if your email was actually working? I’ve had numerous people over the past few years wonder the same thing. Ideally you would use a consistent, dependable email service provider (think Gmail, Hotmail, Yahoo!, etc.). But sometimes that isn’t an option. If you think you might be receiving email intermittently or not at all, feel free to use this script to send test emails to yourself on a regular basis. If you get the test emails, you know at least someone can get to you.

// Get the current hour of the day in 24-hour format (0-23)
$hour = date('G');

// Do not send the email after hours
// (business hours of 8:00-17:59 are assumed here; adjust to taste)
if($hour < 8 || $hour > 17)
	exit('Not sending an email after hours.');

// Get the current day of the week as an index (0=Sunday, 6=Saturday)
$dayOfWeek = date('w');

// Do not send the email on weekends
if($dayOfWeek == 0 || $dayOfWeek == 6)
	exit('Not sending an email on the weekends.');

// Info of person to receive the tests
define('TO_EMAIL',		'');
define('TO_NAME',		'John Doe');

// Info of person sending the tests
define('FROM_EMAIL',	'');
define('FROM_NAME',	'Email Tester');

// Example: 8:00 am on 1 Nov 2010
$subject = 'Test: ' . date('g:i a \o\n j M Y');

$message = 'This email was automatically generated. Please send an email to ' . FROM_EMAIL . ' if you would like to disable these automated tests.';

$result = mail(TO_NAME . ' <' . TO_EMAIL . '>', $subject, $message, 'From: ' . FROM_NAME . ' <' . FROM_EMAIL . '>');
print($result ? 'Test email handed off to the mail system.' : 'mail() failed; check your server\'s mail configuration.');
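As an aside, the backslashes in the subject's format string ('\o\n') tell date() to emit those characters as literal letters rather than interpret them as format codes. A quick sketch:

```php
<?php
// In date() format strings, a backslash escapes the following character,
// so '\o\n' produces the literal letters "on" instead of format codes.
// (In a single-quoted PHP string the backslashes survive intact.)
print(date('g:i a \o\n j M Y', mktime(8, 0, 0, 11, 1, 2010)));
// → 8:00 am on 1 Nov 2010
```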

And finally, set up a cron job that runs on a regular basis (every 15 minutes in this case):

*/15 	* 	* 	* 	* 	wget

I realize that you could do some of the scheduling being done in the PHP script in the cron job definition, but I wanted to have more control over that for future features (think pulling schedules from a database for a multi-user environment).
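For completeness, here's one way the full crontab entry might look. This is a sketch: the URL is a hypothetical placeholder (yours will differ), and the quiet wget flags keep cron from accumulating downloaded files:

```shell
# m    h    dom  mon  dow  command
# Every 15 minutes: fetch the script quietly, discard the downloaded output
*/15   *    *    *    *    wget -q -O /dev/null http://example.com/emailtest.php
```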


Shorten URLs with Zend Framework and bit.ly

This function uses version 3.0 of the bit.ly API. You'll need to register for an account with them if you don't already have one. Then you can retrieve your API key and begin using it immediately.

function bitlyShorten($url)
{
	// Supply bit.ly's v3 "shorten" endpoint URL here
	$client = new Zend_Http_Client('');
	$client->setParameterGet(array(
		'longUrl' => $url,
		'login' => 'xyz',
		'apiKey' => 'xyz'
	));

	$response = $client->request();

	if($response->isSuccessful())
	{
		$response = Zend_Json::decode($response->getBody());
		if($response['status_code'] == 200)
			return $response['data']['url'];
	}

	return false;
}


$myLongUrl = '';
$myShortUrl = bitlyShorten($myLongUrl);

Force Zend Framework to use the index controller by default


Everyone wants pretty URLs these days—both for convenience and to optimize for search engines. So having URLs with unnecessary information is a major no-no. Over the past year I’ve been slowly absorbing the Zend Framework and its MVC pattern. Historically, projects I created required the user to specify the index controller like so:

…where “index” is the controller and “page” is the action. Since almost all people will be using the index controller, why is it in the URL? Want to get rid of it? Add the following to your application.ini file in the production section:

; Routes
resources.router.routes.default.route = /:action
resources.router.routes.default.defaults.controller = index
resources.router.routes.default.defaults.action = index

Now your application will use the index controller by default so your URLs will be even prettier:

But wait! What if you need to access a different controller? Maybe an admin controller (or module)? Add this beneath the previous addition:

resources.router.routes.admin.route = /admin/:action
resources.router.routes.admin.defaults.controller = admin
resources.router.routes.admin.defaults.action = index

The downside, of course, is that for each additional controller you create, you’ll need to add these three lines to your application.ini file. Luckily for me, I don’t anticipate having very many controllers.

While this technique may seem obvious, I couldn’t find it anywhere on Google. So, if you’re successfully doing this some other way, please share in the comments.

More on the Zend Framework Router


Caching With Zend Framework Using Zend_Cache


Today I taught myself how to use Zend_Cache and implemented it within 20 minutes. It’s super easy and very effective. Take a look at the code sample below and you’ll be up and running in no time.


Step 1: Setup the Cache
$frontendOptions = array(
	'lifetime' => 180, // Cache for 3 minutes
	'automatic_serialization' => true
);

$backendOptions = array('cache_dir' => dirname(__FILE__) . '/cache/');

$cache = Zend_Cache::factory('Core', 'File', $frontendOptions, $backendOptions);

Step 2: Use the Cache
$data = null;
if(!$data = $cache->load('data'))
{
	// Cache miss: hit the web service and store the result
	$service = new Service(API_KEY);
	$result = $service->generateReport();
	$data = $service->getReport();
	$cache->save($data, 'data');
}
else
{
	print("Cache Hit!");
}

The page load time went from about 9 seconds to 0.5 seconds! 18x faster and it only took a few lines of code. Awesome.

My main motivation for caching the data ($data in the code example) was actually to reduce the load on the web service which provides the data. We have a good relationship with the company providing the service but there’s a good chance they would become annoyed if we hammered their system to get the exact same data over and over. The load time improvement was a good side effect, though!
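Under the hood the pattern is simple. Here's a plain-PHP sketch of the same cache-aside idea; cacheSave() and cacheLoad() are hypothetical stand-ins for what Zend_Cache's File backend does (serialize to a file, honor a lifetime):

```php
<?php
// Hypothetical helpers sketching Zend_Cache's File backend behavior:
// data is serialized to a file and considered stale after $lifetime seconds.
function cacheSave($id, $dir, $data)
{
	file_put_contents($dir . '/' . md5($id) . '.cache', serialize($data));
}

function cacheLoad($id, $dir, $lifetime)
{
	$file = $dir . '/' . md5($id) . '.cache';
	if(!is_file($file) || time() - filemtime($file) > $lifetime)
		return false; // miss: never cached, or expired

	return unserialize(file_get_contents($file));
}

// Cache-aside usage: only call the slow data source on a miss
$dir = sys_get_temp_dir();
if(($report = cacheLoad('report', $dir, 180)) === false)
{
	$report = array('total' => 42); // stand-in for the expensive web service call
	cacheSave('report', $dir, $report);
}
else
{
	print("Cache Hit!\n");
}
```

Zend_Cache layers tags, cleaning modes, and swappable backends on top of this, which is why it's worth using over hand-rolled files.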

For more information on Zend_Cache which comes with the Zend Framework, check out the reference guide and API documentation.


How to back up your website


Everyone knows (or should by now) that cheap web hosts (Bluehost, Dreamhost, MediaTemple, etc.) don't back up your data for you. So you'd better do it yourself. If you're on any respectable host, you should have ssh access to the box.

Connect to your box via ssh and run the following commands to create a backup of your site.

cd ~
mkdir Backup
nohup zip -r Backup/YYYY-MM_HH-MM.zip www/ > backup_log.txt &

(Replace YYYY with the 4-digit year, MM with the 2-digit month, HH with the 24-hour format of the hour, and MM with the 2-digit minute)

cd ~ navigates to your home folder

mkdir Backup creates the backup directory in which the backups will be stored

nohup is short for no hangup and allows processes started by users at the terminal to continue running even after the user logs out

zip is a program which combines many files into one and compresses them to make the end result even more portable

-r tells zip to burrow into all subdirectories in order to grab all of the files

Backup/YYYY-MM_HH-MM.zip is the path of the zip file the backup will be written to

www/ is the directory to back up (it may be html, htdocs, httpdocs, etc. on your box)

> backup_log.txt redirects zip's standard output to the backup_log.txt file so you can review it later

& tells linux to run the zip program in the background so that you can logout or perform other tasks without killing the process
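Rather than typing the timestamp by hand, you can let date(1) fill it in. A sketch, assuming the same directory layout as above:

```shell
# Generate the YYYY-MM_HH-MM timestamp with date(1) instead of typing it
stamp=$(date +%Y-%m_%H-%M)

# Same backup command as above, with the timestamp substituted in
mkdir -p Backup
nohup zip -r "Backup/$stamp.zip" www/ > backup_log.txt &
```

Drop that into a small script and you can schedule it from cron, too.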

Now all you need to do is download that zipped file. Use your favorite SFTP client to login to your box and snag it. I recommend FileZilla Client for all platforms. If you’re looking for an FTP server, FileZilla Server is perfect.