More than 90% of this website is controlled by one Perl script. That script reads content, surrounds it with style sheets and everything else it needs, and spits it all out for you, the reader.
Obviously that script is pretty darn important, and usually.. usually.. I'm very careful about changes I might make to it. I make a copy, edit the copy, and test the new code on a subset of pages. If I have made some horrible mistake, I fix it or revert back to what I had.
OK, that works, but sometimes a change is just so simple, it hardly seems worthwhile to go to all that trouble. Heck, it's just one or two lines I'm changing, I'll just edit the file directly. No problem, right?
Yeah, right. The other night I had a simple little change like that. I vi'd the script, wrote it back out, and poof.. website down.. or at least the 90% that depends on that script. I had made a stupid typing blunder that caused a syntax error in the code and of course that was the end of that. Easily fixed: back into vi, fix the typo, write it back out, and ayup, everything is fine. No real harm done, maybe a half dozen visitors got a momentary server error - not a great thing to do to people, but no lasting damage.
And then not fifteen seconds later the power went out at my home.
Of course the website was still up. That's in a data center somewhere, well protected with generators and redundant internet connections - that runs for years between reboots. But I couldn't get to it because my power was out and my router wasn't working.
If I had my router on a UPS, I actually might have been able to: my Verizon FIOS has a battery backup, so I assume that if I had power to the router (and my switch, etc.) I could still work. But I don't bother with that: power failures aren't all that frequent and I need a break now and then anyway. It's not all that important..
Or is it? Suppose that failure had happened just after I had written out the file with the typo? My site would have been down until the power came back on and I could fix it. In this case, that was hours later.. that would have ticked off a lot of people.. me most of all!
OK, I'm convinced: I probably need a UPS. And I probably should always stage my changes through a test copy. But realistically, that's not going to happen.. I know darn well that if it's "just a little change", I'm not going to do that. So I needed something else to help protect me from my own stupidity.
With Perl, that's not very difficult. I wrote a little script called "vpage" that looks like this:
#!/bin/sh
file=$1
cp ~/cgi/$file ~/stage/$file
vi ~/stage/$file
perl -c ~/stage/$file && cp ~/stage/$file ~/cgi/$file
For those "simple" changes, this protects me from typos that break the script dead: if the "perl -c" fails, the file will not be overwritten.
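The guard works because "perl -c" compiles the script without executing it and exits nonzero on a syntax error, so the "&&" never reaches the copy. Here's a minimal sketch of that behavior - it uses a temporary directory as a stand-in for the real ~/stage and ~/cgi paths, and the file names are made up for illustration:

```shell
#!/bin/sh
# Demonstrate the "perl -c" compile check gating a copy, as vpage does.
# $stage is a temporary stand-in for the real staging directory.
stage=$(mktemp -d)

# A broken script: the dangling "if (" is a compile-time syntax error,
# so "perl -c" exits nonzero and the copy after "&&" never runs.
printf 'print "hello"\nif (\n' > "$stage/broken.pl"
perl -c "$stage/broken.pl" 2>/dev/null && echo "copied" || echo "not copied"
# prints: not copied

# A good script compiles cleanly ("perl -c" exits 0), so the copy proceeds.
printf 'print "hello\\n";\n' > "$stage/good.pl"
perl -c "$stage/good.pl" 2>/dev/null && echo "copied" || echo "not copied"
# prints: copied

rm -r "$stage"
```

Note that "perl -c" only catches compile-time problems - a typo that produces valid but wrong Perl will still sail through, which is why this is a safety net and not a substitute for testing.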
That's no replacement for staging changes. It does acknowledge reality though: I'm not always going to follow protocol even when I know that I should. Now consider this for a moment: I own this website. It's very important to me. I'm not an employee, not hired help: if I can't guarantee that I won't sometimes take shortcuts, how much more likely is it that someone with a lesser interest might do the same? Moral: your systems should anticipate laziness and haste on the part of humans and should try to protect themselves as much as possible.
I'll pick up a UPS soon. I'll also try to be more conscientious about testing my changes, but when I'm not, I'll at least use that script rather than editing directly.
Got something to add? Send me email.
More Articles by Anthony Lawrence © 2012-07-12 Anthony Lawrence
The errors which arise from the absence of facts are far more numerous and more durable than those which result from unsound reasoning respecting true data. (Charles Babbage)