Saturday, April 28, 2012

Three crucial ingredients for any (zombie apocalypse) survival pack

Now that I've started my zombie apocalypse survival pack, I'm adding some of the most important components next: rope, biners and knives. With these three items, the only limit is yourself.

As with most survival packs, Type III paracord is almost certainly the way to go. It has a 550 pound minimum breaking strength and usually seven inner strands. The strength means you can make an emergency harness, hang a hammock or heavy gear, and so on, and the inner core enables you to make a gill net or otherwise use only small amounts of rope as necessary.

I have a 1000' spool of mil-spec 550 cord from Cabela's, though you can save yourself $45 if you go through Amazon. From my spool, I measured out 200' of cord. I considered going with half of that, thinking that the likelihood of an outbreak is low, and I shouldn't commit more rope to a survival pack than necessary. That rope could go to better use for other things around the house. Then I told pragmatic me to shut up. Two hundred feet it is.

Depending on the nature of the rope usage, you may need a biner for securing items, for low-friction rope slip (i.e., as a makeshift pulley), or as part of a mechanical-advantage system. Much of the use I've had is handled quite sufficiently by $0.80 biners from Fleet Farm. Not only are they cheap and small, but they actually have a pretty high weight limit -- often 400-800 lbs. I'm including a handful of those in my pack, but I will also be picking up two D-shaped non-locking biners from REI. The D-shape helps ensure the load is transferred to the stronger 24 kN axis (roughly 5400 lbs of static force).

Also from REI is my Leatherman Surge, rounding out this current set of additions. It's a little heavy and includes things that won't be relevant for zombie survival (e.g., screwdriver, bottle opener), and therefore it may be swapped out later for something a little more appropriate. Nonetheless, it's still incredibly useful.

My current collection, now pending the purchase of a few larger carabiners:

Also on the topic of rope and knives, I almost never leave home without my Gerber Shortcut or my homemade paracord bracelet (~9' of 550 cord). Zombies or no, it's just good to be prepared.

Friday, April 27, 2012

Preparing for the zombie apocalypse: The Beginning

Every so often, I find myself evaluating where I'm at in life. What kind of person am I, and what kind of person do I want to be? What are my beliefs? Do my actions and beliefs conflict? How can I remedy any dissonance?

I often find that I am far too pragmatic. Case in point: the Foo Fighters were in town in September of last year. Andrea asked me if I wanted to go. Hell yes I did. But I went into cost-cutting mode and decided that we could do better things with our money than spend it on a concert -- even though I already know that buying experiences is much better than buying stuff. Now and then, I need to tell my pragmatic self to shove it.

I have plenty of projects -- things for Babbage Technologies, for work, things around the house, books to read -- but I don't really have many fun projects. Today, I decided to start a project that will cost me money and provide little practical benefit, but it will be a fun journey.

Wednesday, April 25, 2012

Getting worms and being sick

Last Wednesday, as we were getting ready to head into town, the boys found some worms on our driveway. They had to bring the worms in the car so they could play with them. I did bring some dirt in a tupperware container, though, and they survived at least for a while.

And this last Sunday, Reed got sick. As he couldn't go to daycare on Monday, I ran into work to grab my laptop so I could work from home. Reed really wanted to "draw on [my] skateboard." Yes. My skateboard. Awesome.

Tuesday, April 17, 2012

From the vault: Hammocking at Sportsman's Park

Two weeks ago, the weather was beautiful. Sunny, warm, not much wind -- just about the opposite of what we have right now. After work, the boys and I went to Sportsman's Park in Clearwater. I set up my hammock and lay in it while the boys came and went, alternating between the playset, the rocks and the hammock.

Jacob hiding (left) and Reed preparing to hide

That morning (and previous evening) I had made Hungarian coffee cake, and Reed was still eating leftovers from breakfast. There are few things more awesome than hammocking and eating delicious coffee cake.

...Except hammocking, eating coffee cake and relaxing with my boys.

Monday, April 16, 2012

If I never hear your voice again

Frank Warren runs a site called PostSecret, an online community for displaying and discussing secrets people send to him. Warren was recently featured in a TED Talk entitled "Half a Million Secrets."

I don't think I'd ever heard of PostSecret before, or if I did, I didn't pay much attention to it. But watching Warren's TED Talk was very powerful. The last post card he shares is one in which someone wrote:
When people I love leave voicemails in my phone I always save them in case they die tomorrow and I have no other way of hearing their voice ever again
This brought back a memory of mine from many years ago. It's faded, but still intact (arguably so).

When we learned to read, my brothers and I often read - among other books - Frog and Toad. I remember how hard it was to struggle through each word when Mom and Dad could do it so fluidly and with such ease. I don't know if that made me want to work harder so I could get there or if it made me more frustrated that I wasn't there already.

Some time not long after Matt died, we (or I?) found a cassette tape on which he had recorded himself reading Frog and Toad aloud. My memory tells me that I sat there listening quietly in awe of this little piece of plastic and roll of magnetic tape. It had only been - what, maybe a few months or a year? - since Matt died, and that I could hear his voice again was indescribable even though it was the awkward word-by-word reading and occasional sounding-out of syllables in a children's book.

Memories are a funny thing. As Radiolab discussed, every time you revive a memory, your brain writes it back, quite possibly altering it in some way. Details aren't stored exactly; the major points are saved, and when you retrieve it, your brain re-interprets and re-fabricates the details. This memory of hearing Matt's voice again is almost certainly not very accurate.

Besides, I don't remember his voice. I just remember that I heard it.

Tuesday, April 10, 2012

Some musings on the status quo of programming in the professional world

I occasionally get quite frustrated with the status quo of people's programming practices. Lately, the amount of manual work that goes into programming seems a bit excessive. First, some back story.

One of the niches I've carved out for myself at work is in the area of converting statistical models created in SAS. The general process is as follows.
  1. Our statistical modelers create a logistic regression model that predicts something -- credit worthiness, ability/willingness to repay debt, likelihood of fraud, etc. -- and they give us the SAS code for that model.
  2. We convert the model from SAS to ECL.
  3. Once converted, the engineer (me) and the modeler each process a large number of records in ECL and SAS, respectively. The results are compared to find errors in the ECL, which are then fixed.
  4. When the ECL produces the same results as the SAS, the model is fully validated and can be put into a production release.
When I started doing this five years ago, this process was extremely manual. I would go through the SAS code line-by-line, typing up ECL that I created based on my semantic interpretation of the SAS code. Models at the time were usually several hundred lines of code (LOC), maybe a thousand or two on occasion. Since that time, the size of our models has increased dramatically. For one of our flagship products, RiskView, a newer model easily exceeds 10k LOC.

As models started increasing in size, I worked on what became a fairly good 80/20 rule improvement to the process: a sizable portion of the SAS code could be quickly, correctly, and most importantly, automatically, converted to ECL. This allowed me to halve the time it took engineering to convert a model. Since then, we've developed internal tools to improve this process even further.
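To give a flavor of what that automatic conversion looks like, here is a minimal, hypothetical sketch in Python. The real tooling is internal and handles far more (if/else chains, missing values, SAS functions), but a surprising share of model lines are plain assignments like these; the function name and the single operator rewrite are my own illustration, not the actual converter.

```python
import re

def sas_line_to_ecl(line: str) -> str:
    """Convert a simple SAS assignment 'name = expr;' into an ECL
    attribute definition 'name := expr;'."""
    m = re.match(r"\s*(\w+)\s*=\s*(.+?);\s*$", line)
    if m is None:
        raise ValueError(f"unhandled SAS statement: {line!r}")
    name, expr = m.groups()
    # SAS exponentiation uses **; ECL uses POWER() -- one example rewrite
    expr = re.sub(r"(\w+)\s*\*\*\s*(\w+)", r"POWER(\1, \2)", expr)
    return f"{name} := {expr};"

print(sas_line_to_ecl("score_raw = intercept + age_m * 0.5;"))
# -> score_raw := intercept + age_m * 0.5;
```

Anything the rules don't cover gets flagged for hand translation, which is how an 80/20 pass like this halves the conversion time without having to be a full SAS parser.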

It struck me as odd that "the way we've always done it" was so manual, tedious, time-consuming. It seemed no one ever thought to automate away this work.

Even with a fairly mature SAS to ECL converter, we still run into incorrect semantic translation (step two above), so we still require work on the validation (step three). I only recently found out that this process -- which is a burden that is put almost squarely on the modeler -- is about as manual as it can be.

This process is archaeology. For a given input record, SAS calculates a score and ECL calculates a score. If these values match up, great! If not, then there's some ECL code that doesn't quite do what it's supposed to (since we've defined SAS as being correct). Maybe it's an expression that lacks proper grouping, thus yielding different orders of operations. Maybe it's a floating point error. Maybe it's something with SAS' missing values, a notion for which ECL has no analog. Whatever the reason, the modeler then traces back what values go into the final score. Let's say there are a half dozen of them. How many of those values match between SAS and ECL? If only one of those values is off, then what values go into that value? They back-track this problem until they find a variable for which SAS and ECL differ but whose input values match.
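The back-tracking itself is mechanical enough that it can be sketched in a few lines. This is a hypothetical Python illustration of the search (the variable names and dependency map are invented for the example): given the intermediate values each system computed for one record, walk back from the score to the deepest variable that differs even though all of its inputs agree.

```python
TOL = 1e-9  # tolerance for "the two values match"

def find_root_mismatch(sas, ecl, inputs_of, var):
    """Return the deepest ancestor of `var` whose inputs all match but
    whose own value differs between SAS and ECL -- the likely bug site."""
    if abs(sas[var] - ecl[var]) <= TOL:
        return None  # this variable agrees; no bug along this path
    for dep in inputs_of.get(var, []):
        culprit = find_root_mismatch(sas, ecl, inputs_of, dep)
        if culprit is not None:
            return culprit  # a deeper mismatch explains this one
    return var  # differs, but every input matches: translation bug here

# Toy record: age_m is wrong, so the score is wrong too
inputs_of = {"score": ["age_m", "income_m"], "age_m": ["age"], "income_m": ["income"]}
sas = {"age": 0.0, "income": 40000.0, "age_m": 0.0,      "income_m": 1.5, "score": 1.5}
ecl = {"age": 0.0, "income": 40000.0, "age_m": 1.243567, "income_m": 1.5, "score": 2.743567}

print(find_root_mismatch(sas, ecl, inputs_of, "score"))  # -> age_m
```

The modeler does exactly this by hand, except the "dependency map" lives in their head and the values live in spreadsheets.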

A contrived example might be something like this SAS code:
if age <= 0 then age_m = 0.000000;
else if age < 18 then age_m = 1.243567;
else if age < 26 then age_m = 2.345678;
else if age < 42 then age_m = 3.456789;
else if age < 60 then age_m = 4.567890;
else age_m = 5.678900;
And almost semantically equivalent ECL:
age_m := map(
   age < 0  => 0.000000,
   age < 18 => 1.243567,
   age < 26 => 2.345678,
   age < 42 => 3.456789,
   age < 60 => 4.567890,
   5.678900);
The very minor difference between these expressions is the less-than-or-equal-to (<=) comparison against zero in SAS versus only less-than (<) in ECL. In this example, if both languages are given an age of exactly zero, they will report different values for age_m -- 0.000000 in SAS and 1.243567 in ECL.

After much digging, the statistical modeler reports to us that the age_m calculation is problematic, and the engineer looks at the SAS and the ECL, finds the problem and fixes it.

On a 10k LOC model with a few hundred intermediate values, there is a lot of archaeology. Yet for my past five years and probably for years before that, this has been the process. Comparing results for model validations has been painstaking.

These inefficiencies bother me. We have so much work to do that wasting time doing tedious digging into 500MB of raw intermediate data should be out of the question. Unfortunately, it's not; it's the status quo, "the way we've always done it." I've therefore spent a good portion of the last two days at work metaprogramming a solution.

Unfortunately, ECL (at this time) has no introspection. Any kind of metaprogramming is done at compile time. That limits me, but not so much that I can't come up with a solution. In this case, I found a way to do a comparison of SAS and ECL intermediate variables, report on the differences and even provide a metadata summary of those differences. With essentially two LOC, I can see where my ECL differs from the SAS and on what proportion of our sample data. I reduce a half day of statistical modeler manual time to twenty seconds of Thor time.
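The shape of that report can be sketched outside ECL. This is a hypothetical Python version (the real thing is compile-time ECL metaprogramming, and the sample data here is invented): for every intermediate variable, compute the proportion of sample records on which SAS and ECL disagree.

```python
TOL = 1e-9  # tolerance for treating two floats as equal

def mismatch_report(sas_records, ecl_records):
    """Each argument is a list of {variable: value} dicts, one per record,
    aligned by index. Returns {variable: fraction of records differing}."""
    n = len(sas_records)
    counts = {}
    for s_rec, e_rec in zip(sas_records, ecl_records):
        for var in s_rec:
            if abs(s_rec[var] - e_rec[var]) > TOL:
                counts[var] = counts.get(var, 0) + 1
    return {var: cnt / n for var, cnt in sorted(counts.items())}

# Two toy records: the first disagrees on age_m (and hence the score)
sas = [{"age_m": 0.0, "score": 1.5}, {"age_m": 2.345678, "score": 3.8}]
ecl = [{"age_m": 1.243567, "score": 2.7}, {"age_m": 2.345678, "score": 3.8}]

print(mismatch_report(sas, ecl))  # -> {'age_m': 0.5, 'score': 0.5}
```

A report like this points straight at age_m without anyone digging through hundreds of megabytes of intermediate data by hand.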

Now, coming back to the original point: Why are so many of these inefficiencies in place? I occasionally hear the phrase, "that's how we've always done it" -- never in a derisive tone, just usually as a shrugging off of new ideas as too alien or unproven.

Developing, translating and implementing logistic regression models is one portion of what we do, but I've run into this kind of thing many times over. Some of the most fundamental philosophies of computer science seem to get lost in practice: code reuse, valuing developer time over processor time, encapsulation, abstraction -- just to name a few.

There are reasons why we're always so ridiculously busy, and I think one of those reasons -- and not an unimportant one -- is because we don't sit down and think. We should think about how we're doing things, how they should be done, and whether the effort to change is worth it. More often than not, I think it is worth it.