Wonderful Pistachios: Should You Buy Them In The Shell?

Pistachios Bag

I’m a sucker for a good deal, so when I see delicious food at Sam’s Club, I have to resist the urge to stuff it into the already-packed shopping cart. But the last time we went to Sam’s, I saw that Wonderful Pistachios (the brand, although they are inherently wonderful) come in a 3-pound bag in the shell or a 1.5-pound bag already de-shelled.

The Question

Which bag has the lower cost per unit of pistachio seeds? To clarify, a pistachio seed is what is inside the shell, so for our purposes moving forward, when I say seed, I mean a shelled pistachio.

The Hypothesis

I hypothesize that the 3-pound bag of unshelled (shells on) pistachios will have a lower cost per unit of seeds than the 1.5-pound bag of shelled (shells pre-removed) pistachios.

  1. Removing shells takes additional processing at the factory.
  2. Additional processing requires additional engineering, factory workers, machinery, and maintenance.
  3. Those extras cost additional money.
  4. Companies pass extra costs on to consumers, especially when it makes the product more convenient for the consumer.
  5. Pistachios that come pre-shelled are more convenient for the consumer.
  6. Therefore, the cost per unit of seeds will be higher for a product that is more convenient for the consumer and takes more resources to produce for the company.

The Experiment

I decided to put an end to my curiosity by removing the shells from an entire 3-pound bag and weighing the resultant seeds. I was extremely careful to fully separate each seed from its shell and “husk” (the paper-like cover that surrounds many of the seeds after you remove the shell). I captured all of the shells and husks in one bin and all of the seeds in another. The entire process took about four solid hours, which I spread over the course of two weeks as I watched TV. My thumbs were so sore from the combination of the salt and the sharp edges of the shells that I had to take at least a day off between each session. Until…my new favorite tool arrived in the mail, after which I promptly finished off the bag and excitedly began the weighing process.

The Results

Turns out my hypothesis was totally wrong!

                          3-pound bag              1.5-pound bag
                          (in shell)               (seeds only)
Cost                      $17.98                   $14.98
Weight                    1361 g                   680 g
Weight of Shells          720 g                    N/A
Weight of Seeds           675 g                    689 g*
Percent Yield             50% (of advertised)      101%*
                          48% (of actual)
Prep. Time                4 hours                  N/A
Cost Per Gram of Seeds    2.664 ¢/g                2.174 ¢/g
Winner                                             ✓

*I assumed that the actual yield of seeds in the 1.5-pound bag would be 10% (68 g) less than the advertised weight (680 g), which matches my typical experience with weights on consumer packaged goods. When my pistachios run out, I’ll buy the pre-shelled bag and weigh it so I have an actual figure. Update: I bought the pre-shelled pistachios. To my surprise, the seeds actually weighed more than advertised on the bag: it was supposed to be 1.5 pounds (680 g) but turned out to be 689 g! So the cost per gram of seeds ($14.98 / 689 g ≈ 2.174 ¢/g) is even lower than I had initially posted.

What should you do?

Buy the pre-shelled pistachios! You will save approximately four hours of your life and also spend less money. Win-win! So unless you are a glutton for punishment, just take the easy way out. For once, it’s the smart thing to do.

Glamor Shots

Shells from the 3-pound bag in a Krispy Kreme box
All of the seeds from the 3-pound bag

PHP script to send test emails on a schedule to make sure you are able to receive email

Ever wondered if your email was actually working? I’ve had numerous people over the past few years wonder the same thing. Ideally you would use a consistent, dependable email service provider (think Gmail, Hotmail, Yahoo!, etc.). But sometimes that isn’t an option. If you think you might be receiving email intermittently or not at all, feel free to use this script to send test emails to yourself on a regular basis. If you get the test emails, you know at least someone can get to you.

<?php
 
// Set this to your timezone
date_default_timezone_set('America/New_York');
 
// Start at 8:00 AM (24-hour time)
$startTime = mktime(8, 0, 0);
 
// End at 5:00 PM (24-hour time)
$endTime = mktime(17, 0, 0);
 
 
$currentTime = time();
 
// Do not send the email if it is outside of the allowed hours
if($currentTime < $startTime || $currentTime > $endTime)
{
	print('Not sending an email after hours.');
	die();
}
 
// Get the current day of the week as an index (0=Sunday, 6=Saturday)
$dayOfWeek = date('w');
 
// Do not send the email on weekends
if($dayOfWeek == 0 || $dayOfWeek == 6)
{
	print('Not sending an email on the weekends.');
	die();
}
 
// Info of person to receive the tests
define('TO_EMAIL',		'me@test.com');
define('TO_NAME',		'John Doe');
 
// Info of person sending the tests
define('FROM_EMAIL',	'webmaster@serversendingtests.com');
define('FROM_NAME',	'Email Tester');
 
// Example: 8:00 am on 1 Nov 2010
$subject = 'Test: ' . date('g:i a \o\n j M Y');
 
$message = 'This email was automatically generated. Please send an email to yourusername@youremailprovider.com if you would like to disable these automated tests.';
 
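// Send the test message. Note that mail() returns true when the message is
// accepted for delivery by the local mail system, not when it actually arrives.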
$result = mail(TO_NAME . ' <' . TO_EMAIL . '>', $subject, $message, 'From: ' . FROM_NAME . ' <' . FROM_EMAIL . '>');
var_dump($result);

And finally, set up a cron job that runs on a regular basis (every 15 minutes in this case):

*/15 	* 	* 	* 	* 	wget -q -O /dev/null http://myserver.com/path/to/script.php
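
Alternatively, if your host lets cron run PHP directly, you can skip wget (and its downloaded copies) and invoke the PHP command-line interpreter instead. The paths below (/usr/bin/php and the script location) are just placeholders, so adjust them for your server:

*/15 	* 	* 	* 	* 	/usr/bin/php /home/youruser/path/to/script.php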

I realize that some of the scheduling done in the PHP script could instead be handled in the cron job definition, but I wanted to keep that control in the script for future features (think pulling schedules from a database in a multi-user environment).


How to back up your website

Hard Drive in Flames

Everyone knows (or should by now) that cheap web hosts (Bluehost, Dreamhost, MediaTemple, etc.) don’t back up your data for you. So you’d better do it yourself. If you’re on any respectable host, you should have ssh access to the box.

Connect to your box via ssh and run the following commands to create a backup of your site.


cd ~
mkdir Backup
nohup zip -r Backup/YYYY-MM-DD-HHMM.zip www/ > backup_log.txt &

(Replace YYYY with the 4-digit year, MM with the 2-digit month, DD with the 2-digit day, HH with the 2-digit 24-hour hour, and MM with the 2-digit minute, or see the automated variant after the explanations below.)

cd ~ navigates to your home folder

mkdir Backup creates the backup directory in which the backups will be stored

nohup is short for no hangup and allows processes started by users at the terminal to continue running even after the user logs out

zip is a program which combines many files into one and compresses them to make the end result even more portable

-r tells zip to burrow into all subdirectories in order to grab all of the files

Backup/YYYY-MM-DD-HHMM.zip is the path to the backup file

www/ is the directory to backup (it may be html, htdocs, httpdocs, etc. on your box)

> backup_log.txt redirects all output from zip to the backup_log.txt file so you can review the file later

& tells Linux to run the zip program in the background so that you can log out or perform other tasks without killing the process
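
If you don’t want to type the timestamp by hand, you can let the shell fill it in with the date command. This is just a variant of the same backup command, assuming the same Backup directory and www/ document root:

cd ~
nohup zip -r Backup/$(date +%Y-%m-%d-%H%M).zip www/ > backup_log.txt &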

Now all you need to do is download that zipped file. Use your favorite SFTP client to log in to your box and snag it. I recommend FileZilla Client for all platforms. If you’re looking for an FTP server, FileZilla Server is perfect.
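
If you prefer the command line to a GUI client, scp can grab the archive in one shot. The username, hostname, and filename here are placeholders, so swap in your own:

scp youruser@yourserver.com:Backup/2010-11-01-0800.zip .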


Speed Up That Cheap Website with Cheap Amazon S3


Do you have an economy-grade website host? Me too. BlueHost is great for only $6.95 per month but its response times and transfer rates are terrible. Fear not — Amazon S3 to the rescue. For pennies a day you can supplement your cheap website host using Amazon’s Simple Storage Service (S3).

Amazon S3 is storage for the Internet. It is designed to make web-scale computing easier for developers.

Amazon S3 provides a simple web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of web sites. The service aims to maximize benefits of scale and to pass those benefits on to developers.

It is simple. So simple.

  • Sign up for an account.
  • Download and install the awesome S3 Firefox Organizer (S3Fox) add-on.
  • Upload the files you want to be served up like hotcakes (a quick way to confirm they are publicly readable is shown after this list).
  • Update the links in your HTML files to point to the new location.
    Example: http://s3.amazonaws.com/jeremy/blog/images/large_bandwidth_sucking_header.jpg
    Note that the example is intended to show the format of the URL and does not point to a valid resource.
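
To confirm that an uploaded file really is publicly readable before you switch your links over, you can send a quick HEAD request with curl; a 200 response means the permissions are right. The URL here is just the example format from above, not a real file:

curl -I http://s3.amazonaws.com/jeremy/blog/images/large_bandwidth_sucking_header.jpg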

Too good to be true? Nope. The S3 files are served up lickety-split, and best of all it takes the load off of your cheap host, which allows it to function much more efficiently. So far I have moved my site’s header and the LightBox JS file. Why didn’t I move the other JS files and images? Because Google hosts all of the popular JavaScript libraries for free.

How much does it cost?
Very little, unless your site becomes wildly popular. One million requests cost one dollar, plus 17 cents per GB transferred. That’s right. 1,000,000 GET requests = $1.00 + $0.17/GB.

Let’s assume the average size of the elements being served from your Amazon S3 bucket is 10KB.
10KB = 0.01MB = 0.00001GB
1,000,000 requests x 0.00001GB = 10GB
10GB x $0.17/GB = $1.70
1,000,000 requests x $0.01/10,000 requests = $1.00
Total Download Cost: $2.70

Your cheap site can now support 1,000,000 requests per month for a whopping $9.65 ($6.95 for BlueHost and $2.70 for Amazon S3). And if your site gets Dugg or lands on the front page of Reddit, Amazon S3 will scale without sweating a drop.


DD-WRT for the win!

DD-WRT Logo

Today I was faced with a difficult wireless networking scenario: looooong house, many thick walls.

 

The topology is as follows:
Comp A <--- 200 ft., 4 walls ---> Router <--- 150 ft., 3 walls ---> Comp B

The house is older, so the walls are very, very solid and RF-absorbing. The old setup involved a Linksys WRT54GX (802.11 b/g) as the router in the middle, a Belkin Wireless-N PCMCIA card on computer A, and a Belkin Wireless-N PCI card on computer B. After many attempts to reposition the wireless adapter’s antennas on computer B with no success, I suggested hooking up a WRT54G in client bridge mode (using DD-WRT) to act as the wireless adapter for computer B. Worked like a champ. The signal is now strong and the connection hasn’t dropped one single time.

The kicker is that the WRT54G I used is version 8.2, which has very little RAM and doesn’t support the standard method of upgrading the firmware to DD-WRT.

The Solution

  1. Download TFTP.
  2. Download the VxWorks killer firmware (vxworkskillerGv8-v3.bin) for the WRT54G v8.2.
  3. Download the dd-wrt.v24_micro_wrt54gv8.bin firmware for the WRT54G v8.2.
  4. Upload the vxworkskillerGv8-v3.bin firmware to the router.
  5. Wait for the router to reboot.
  6. Try to ping the router (192.168.1.1 by default); a continuous ping command is shown after this list.
  7. When you can ping the router, continue to the next step.
  8. Open a command prompt and enter
    tftp -i 192.168.1.1 put dd-wrt.v24_micro_wrt54gv8.bin
  9. If all went well, when the router reboots it will have DD-WRT on it and be accessible via 192.168.1.1.
  10. Username: “root”
  11. Password: “admin”
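
For step 6, a continuous ping makes it obvious the moment the router comes back up. On Windows (where the tftp command above is run), that looks like the line below; stop it with Ctrl+C once you see replies:

ping -t 192.168.1.1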

Source

The connection on computer A is also weak so I’ll be adding a WRT54G to the mix to fix it.

Thank you DD-WRT!
