Constellation Games by Leonard Richardson and The Monarch of the Glen by Neil Gaiman


I might be reading a little too quickly. Just two weeks ago, at the end of my Casino Royale post, I mentioned that I was going to choose my next book from one of the various Humble eBook Bundle or StoryBundle collections that I’ve already purchased. I have already finished two ebooks from StoryBundle since then.

My poor organizational skills

When I purchased my first two StoryBundle.com bundles, I made it a point to tag them accordingly when I imported them into Calibre. Then I got lazy or forgetful, and that didn’t happen anymore. Not having the books tagged made it very difficult to choose my next book, so I went through my old emails and tagged books from a dozen different ebook bundles.

While I was tagging books, I noticed that StoryBundle’s Epic Fantasy Bundle was still on sale, and I hadn’t bought it yet! I even lucked out because there was a Neil Gaiman novella in there, but it felt like cheating!

The Monarch of the Glen by Neil Gaiman

This one was a short and easy-to-read novella. I read the first chapter before going to sleep, and quickly finished off the rest during the next evening. It takes place after the events of American Gods and follows Shadow in his travels to Scotland.

I don’t know how much I can say about something so short. I enjoyed The Monarch of the Glen just as much as I enjoyed American Gods. This time I had a better idea of what to expect going in, though.

Constellation Games by Leonard Richardson

I ended up reading Constellation Games from StoryBundle’s Video Game Bundle. The synopsis sounded interesting, and Cory Doctorow thought it was a brilliant novel. That was enough reason for me to give it a try.

Constellation Games is the story of Ariel Blum, a video game developer living in Austin, TX, who finds himself making first contact with a coalition of alien species. It felt like a fresh and novel viewpoint for a first-contact story. Most of the books that I’ve read involving first contact end up being told from the viewpoint of some genius with eight doctorates. A story told from the point of view of a game-dev blogger was much easier to relate to.

It was also fun reading a book that doesn’t seem to take itself too seriously. There’s a lot of humor in here. All of the comedic science fiction that I’ve read has been British, and almost all of it was written by Douglas Adams. I enjoyed reading a more American take on the genre.

“Ah, and the lovely Jenny,” said Tetsuo, pinching her hand carefully in what I guess was a suave gesture. “I didn’t know you had a private car and driver!”

“That was a taxi,” said Jenny.

“That explains why it was so ugly,” said Tetsuo.

Leonard Richardson, Constellation Games

I only read a few pages of Cory Doctorow’s Little Brother. It was overloaded with leet-speak buzzwords. It felt too much like watching Hackers, so I put the book down pretty quickly. Constellation Games suffers from the same problem, but not to the same extent.

I would say that I definitely enjoyed reading Constellation Games, and I look forward to reading anything else Leonard Richardson decides to write. I would also be very interested in playing Caveman Chaos, a fictional game from his book.

zsh-dwim: I Feel Like a Genius, Belatedly


I’ve been stuck on my laptop for the last few months, and I’ve been doing my best to tolerate its old, slow, spinning hard drive. I’ve also been tweaking all sorts of settings in an attempt to make things more tolerable. I installed the preload daemon, and that seemed to help things a bit.

That wasn’t enough, though, so I started tweaking various sysctl settings. I’m pretty old school, and I have old habits. I never use the sysctl command. I always use cat to peek into the files in /proc/sys/ and echo to change their values. On one hand, this gives me tab completion of all those file names that I never remember. On the other hand, it takes quite a few keystrokes to turn those cat commands into echo commands.
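Here’s what that looks like in practice, using vm.swappiness as an arbitrary example; it may not even be one of the settings I was actually tweaking:

Peeking at and changing a kernel setting the old-school way
cat /proc/sys/vm/swappiness          # peek at the current value
echo 10 > /proc/sys/vm/swappiness    # change it (needs a root shell)
sysctl vm.swappiness=10              # the equivalent sysctl invocation

Turning that cat command into the echo command is the tedious part.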

This is exactly what zsh-dwim is made for. Wouldn’t you think I’d realize this right away? I didn’t. I didn’t think of this until a few days after I was done messing around with sysctl settings!

I’m very excited about this new zsh-dwim transformation. It saves a lot of keystrokes, and I wish I’d thought of it sooner! Unlike using sysctl, this transformation works with variables under both /proc/sys/ and /sys/.
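The transformation itself is simple. Here’s a rough, standalone sketch of the idea as a ZLE widget; this is not the actual zsh-dwim code, and the widget name and key binding are just placeholders:

# A minimal sketch, not the real zsh-dwim implementation: turn
# "cat /proc/sys/..." or "cat /sys/..." into "echo  > /proc/sys/..."
# and park the cursor where the new value belongs.
cat-to-echo() {
  if [[ $BUFFER == cat\ /proc/sys/* || $BUFFER == cat\ /sys/* ]]; then
    BUFFER="echo  > ${BUFFER#cat }"
    CURSOR=5          # right after "echo ", ready to type the new value
  fi
}
zle -N cat-to-echo
bindkey '^U' cat-to-echo    # zsh-dwim happens to use control-u by default

The real zsh-dwim does quite a bit more than this, but this is the general shape of the idea.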

This will be very handy the next time I have to tweak a bunch of kernel settings, and it has given me some ideas for the future of zsh-dwim. I keep thinking of zsh-dwim in terms of swapping out parts of the current command, but I should also start thinking in terms of simple cursor placement.

You can find zsh-dwim at GitHub.

Ten-Year Anniversary of Using Darcs and Git for My Emacs Config


I just happened to take a look at the log of my Emacs configuration Git repository yesterday. I don’t know what made me page through the entire log, but I was very excited to see that the oldest log entry was dated March 13, 2004. That’s ten years to the day!

commit 0bfa6223629f72065f65fd44c2741d48a9019f07
Author: thehead <thehead@patshead.com>
Date:   Thu May 13 16:31:21 2004 -0400

    Initial Import

That was the day that I migrated my Emacs configuration files from CVS to Darcs. I didn’t import my CVS history. Having history is handy when I mess something up and can’t figure out what I did wrong. Only recent history is useful for that. I didn’t think that ancient history was worth the trouble of importing one commit at a time into Darcs.

I miss Darcs

At the time, Git didn’t exist yet. It was still over a year from its first release. I had already chosen Darcs as my preferred “next generation” version control system. Back in 2004, I still had a separate laptop and desktop. We didn’t have Wi-Fi hotspots in every coffee shop, and we didn’t have convenient things like Seafile or Dropbox to keep our files in sync.

Distributed version control with Darcs was an amazing upgrade, and storing configuration files in Darcs was very convenient. I didn’t have to hope that I remembered to check out my projects before I left home or worry about finding an Internet connection if I forgot.

I was a late adopter of Git. I didn’t migrate my Darcs repositories to Git until March 2011. In my opinion, Darcs has a much more user-friendly command-line interface, and I also preferred the Darcs concept of “every copy is a separate branch.”

If you switch branches often, Git will be faster and more convenient, but it was handy knowing that each copy of each Darcs repository automatically acted like a distinct branch. Combining that feature with Darcs’s excellent merging made it easy to commit small, host-specific changes to local repositories.

Finally giving in to peer pressure

I had to give up on Darcs. Git may be a pain in the neck in comparison, but there’s just too much friction when the rest of the world has decided to use Git. It looks like I converted my Emacs repository three years ago.

I’m surprised so much time has passed already. I think I’m still less comfortable with Git today than I was just a few months into using Darcs. Git is not only less intuitive than Darcs, but I run into merge conflicts much more often than I ever did with Darcs. Git’s merging feels like a hammer compared to the scalpel of Darcs’s “patch theory.”

Do you keep your configuration files in version control? Are you also using Git?

Casino Royale by Ian Fleming


I was at a bar with my old friend Tim back in January. It seems that if he’s not busy building awesome guitars, he just might be reading a book. He told me that he read about half of all the James Bond books last year. He says they’re all rather short, easy to read, and a lot of fun. I’m sure he told me more than that, but those are the highlights that I’m remembering!

I’m a big fan of the James Bond movies—my favorite Bond would most definitely be Roger Moore—so it seemed like a no-brainer to sneak Ian Fleming’s Casino Royale to the top of my list. I thought I would get through it pretty quickly, but it didn’t work out that way.

I’m trying not to write a “book report” here, but I can’t really explain what was slowing me down without giving away a few serious plot points. There will be spoilers ahead!

The first half of the book had very little action. It started out with a chapter full of dossiers on all the various bad guys, and I knew I’d never remember any of that. Then there was a lot of talk about cash for gambling and strategies for gambling before they finally got down to playing some baccarat. I doubt I was reading more than a chapter each day.

Things started to get more exciting around the halfway point. The “Bond girl” gets kidnapped, and there’s a car chase. It is a very poorly executed chase on Bond’s part, and he is almost immediately captured. This was one of the many things that wouldn’t be likely to happen to Roger Moore or Sean Connery. They then proceed to torture Mr. Bond for at least two chapters.

Le Chiffre spoke.

“That is all, Bond. We will now finish with you. You understand? Not kill you, but finish with you. And then we will have in the girl and see if something can be got out of the remains of the two of you.”

He reached towards the table.

“Say good-bye to it, Bond.”

Ian Fleming, Casino Royale

This wasn’t just the sort of torture where you simply hit someone with a lead pipe until they talk. This involved attacking some important and very sensitive parts. Parts that we’re told will eventually be removed. When Le Chiffre spoke those words at the end of the chapter, I didn’t know if I wanted to turn that page. If I hadn’t been reading Casino Royale on my tablet, I may have wanted to pull a Joey Tribbiani and put the book in the freezer.

I put the book away for a while, and during that time, Season 2 of House of Cards showed up on Netflix. That kept me distracted for a while, and I mostly forgot that I was in the middle of reading a book.

I didn’t pick it back up until last night. Much to my relief, 007 is rescued at the very beginning of the next chapter, and the doctors say he will make a full recovery. I quickly breezed through the rest of the book before I went to sleep. If I’d known I had nothing to worry about, I probably would have finished the book on the same day that I ended up putting it aside.

What to read next?

I’ll definitely continue the series, but I’m going to choose something different to read next. I’ve picked up two or three StoryBundle.com bundles that I haven’t touched yet. Perhaps I’ll choose a book from one of those bundles at random. Maybe I’ll luck out!

I Migrated To Octopress From Movable Type


I started this blog back in 2009, and at the time, I really had no idea what my requirements were in a blogging engine. There was only one thing I knew for certain: I wanted static HTML pages. Static pages are served up fast and are very secure. Other than that, I had no idea what I needed.

I ended up choosing Movable Type. All the blog post pages in Movable Type are static, so it did a pretty good job of meeting my only requirement. It has other features that seemed interesting, too. It has a commenting system, and it will send users an email notification if someone replies to their comments. It has a built-in search system. It will notify various services every time a new post is published. Movable Type also lets you schedule posts to be automatically published in the future.

Four years ago, the fact that Movable Type would announce the existence of my new blog posts to the world sounded like an amazingly useful feature. At this point, I am pretty sure it was completely useless.

What was wrong with Movable Type?

Everything in Movable Type happens in its clunky web interface. The worst part about that is having to compose and edit posts in an HTML textarea. On more than one occasion, something stupid happened that made hundreds of words vanish on me because of this. I eventually started using a Chrome extension that would let me edit textareas in Emacs. This helped, but it was still clunky.

Things also seemed to keep getting slower and slower as I wrote more posts. Sometimes I would need to open a half-dozen older posts to check on things and make some small tweaks. This would involve a lot of waiting for posts to open, and waiting for posts to publish.

I was running Movable Type 5, and version 6 was going to be released very soon. I figured that my choices were to upgrade or find a new blogging engine.

What was I looking to improve?

I definitely wanted to get rid of that web-based editor. Most of the new static blog generators store all of your posts as text files that are easy to keep in Git. This seems brilliant to me. I can’t imagine a faster way to edit blog posts than using Emacs on my local machine. My plan was to end up with a ‘commit → push → publish’ workflow.
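I hadn’t built the publish end of that pipeline when I first migrated (more on that below), but the server side of such a workflow usually amounts to a Git post-receive hook along these lines. The paths here are hypothetical, and rake generate and rake deploy are the standard Octopress tasks:

A hypothetical post-receive hook on the web server
#!/bin/sh
# Check the pushed commit out into the blog's working copy, then let
# Octopress regenerate and deploy the site.
GIT_WORK_TREE=/home/blog/octopress git checkout -f master
cd /home/blog/octopress || exit 1
bundle exec rake generate && bundle exec rake deploy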

I also wanted a more modern theme. I’m not the least bit artistic, and I have absolutely no sense of style. I just wanted something that looked clean, and it definitely had to be a responsive design. Responsive pages adjust their layout to fit whatever screen they’re being viewed on, whether that’s a phone, a tablet, or a desktop.

One small roadblock

Static blog generators are exactly what the name says: completely static. That means they don’t include any sort of built-in comment system, so I was going to have to rely on a third-party comment service. I decided to give Disqus a try.

I wasn’t convinced that this was a good idea. When I was surfing the web, I used to notice Disqus all the time. It was often very slow to load. Sometimes it seemed to make entire pages load more slowly. This had me worried, but by the time I was looking to switch, things seemed to be working a lot better.

I migrated over to Disqus while I was still using Movable Type. I’ve been using Disqus since June, and I am actually very pleased with the results. I’m getting more comments than I did before making the switch, and I’m much happier letting Disqus handle sending out all the various notification emails.

Enter Octopress

I’ve been actively using Octopress since August of 2013. I’m extremely happy with the results. I have a nice, clean, responsive theme. According to Piwik, I shaved over a tenth of a second off my average page generation times. That alone was worth the effort of migrating!

All my blog posts are now happily sitting in a Git repository on my local machine, and Seafile does a great job of keeping them synced up between my laptop and desktop, even when I forget to commit something. That happens more often than I’d like to admit!

One important feature is still missing

I’ve been limping along for the last six months without one of my favorite Movable Type features. I currently have no way to schedule a blog post to publish in the future.

My blog’s web server was a virtual machine running an ancient version of Ubuntu. The OpenVZ host server was still running CentOS 5, and you need the newer OpenVZ kernel that ships with CentOS 6 to run more modern versions of Ubuntu. That meant getting rbenv and Octopress up and running on the server was pretty much out of the question.

I’ve since upgraded the host server and the web server, but I still haven’t gotten around to setting up rbenv on the server for Octopress. This means I still don’t have my commit → push → publish workflow up and running yet. I have a little helper script that makes Octopress a bit more comfortable for me, but it still needs a few more features before I can get my automatic publishing back.

The verdict

I’ve been using Octopress for six months so far, and I am extremely happy with it. My site loads faster, looks more modern, and looks so much better on phones and tablets. My friend Brian also migrated to Octopress last year, and I think he’s almost as pleased as I am.

Are you still using Movable Type? Are you using a static blog generator like Octopress? Are you thinking about migrating to a static blog generator? I’d really like to hear what you’re thinking about, or how it is working out for you!

A Couple of zbell.zsh Bug Fixes


I’ve been using zbell for over a month now, and I’m really starting to rely on it. Old habits are hard to break. I still find myself peeking at long-running processes just to see if they’ve finished, but I’m slowly learning to trust that zbell will let me know when something needs my attention.

A test zbell.zsh notification

Deploying a new blog post using my laptop takes over thirty seconds. I always like to take a look at the live website after it publishes to make sure I didn’t goof anything up. I’ve finally stopped flipping back to the terminal window to check on the progress. Instead, I’m trusting zbell to let me know when it is time to check the blog.

One very, very annoying bug

Some people complained when I chose control-u as the default key for zsh-dwim. Control-u is bound to the unix-line-discard function by default, and I’ve never used that function in my entire life. Control-c, which I use for a similar purpose all the time, has always been my preferred way of canceling a command that I am writing. It has the advantage of leaving the unfinished command on my screen, and that comes in handy if I need to reference my thoughts later on.

Hitting control-c triggered a very annoying bug in zbell: it would immediately notify me about the previous command. This was very loud and quite distracting. The same bug could also be triggered by hitting Enter on a blank line.

Thankfully, this was easy enough to fix.
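I won’t reproduce the exact patch here, but the shape of the fix is easy to sketch: only ring the bell if the preexec hook actually fired, because control-c and empty lines never run preexec. Something roughly like this, with my own placeholder names:

zbell_preexec() {
  zbell_ran_command=1
}
zbell_precmd() {
  [[ -n $zbell_ran_command ]] || return   # control-c or an empty line: nothing ran
  unset zbell_ran_command
  # ...the usual duration check and notification would go here...
}
autoload -Uz add-zsh-hook
add-zsh-hook preexec zbell_preexec
add-zsh-hook precmd zbell_precmd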

A much less intrusive bug

The email notification in zbell also had a small bug. I had decided to include the command’s exit status in the body of the email, but I wasn’t squirreling that status away early enough. When you want to capture the exit status of a command, you have to capture it immediately. If you execute any other command before making a copy of the exit status variable, you will end up overwriting it.

This one was easy enough to fix, but it might still be pretty fragile. I added a line to copy the exit status to another variable right at the very beginning of the precmd hook, but any other functions bound to the precmd hook could mess that up.
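In zsh terms, the fix looks roughly like this; the variable and function names are mine, not necessarily the ones in the gist:

zbell_capture_status() {
  zbell_exit_status=$?   # must run before anything else clobbers $?
}
autoload -Uz add-zsh-hook
add-zsh-hook precmd zbell_capture_status   # fragile: this needs to be the first precmd hook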

I don’t have any other hooks, so this fix is good enough for me.

You can find the zbell.zsh gist on GitHub.

My Gaming Story: The Start of Three Decades of Gaming


I grew up playing videogames. I remember begging my parents incessantly for an Atari when I was six years old. I was too young to know that there were better systems available, and I had no idea that the machine I wanted was actually called an Atari VCS.

I remember one particular Christmas morning, rushing down the stairs as fast as I could to see what Santa Claus had delivered for me. I unwrapped small boxes with game cartridges like Munch-Man, Parsec, and Hunt the Wumpus. These didn’t sound like Atari games to me.

Then I opened a rather large present containing a Texas Instruments TI-99/4A personal computer. It had a metallic case, a keyboard to the left, and a cartridge slot to the right. It didn’t bear much resemblance to an Atari VCS.

My disappointment and confusion vanished pretty quickly once my father got this new piece of equipment hooked up to the TV. By then, I was happily laying down chains in Munch-Man, or shooting down aliens in TI Invaders. Both of these games were superior to the originals from which they were cloned, but I was completely oblivious to that at the time. I was just having fun playing them.

I still enjoy playing some of these games today. I have some of the best TI-99/4A games, like Parsec, Munch-Man, and The Attack, running on my arcade cabinet. It doesn’t quite feel the same playing them like this, but it sure does bring back a lot of memories.

I often wonder where I would be today if my father had gotten me that Atari VCS that I actually wanted. Making a living working with computers has been enjoyable and profitable. I doubt I would be where I am today if my father hadn’t chosen to buy that TI-99/4A.

Thank you, Dad!

Thief of Time by Terry Pratchett


I have been working my way through Pratchett’s Discworld series for quite a few years now. I’ve stuck pretty closely to the Discworld Reading Order diagram. It didn’t take me much more than a year to plow through all the Rincewind and Witches novels. Finishing Thief of Time marks the end of my much slower journey through the Death novels.

Lu-Tze looked impressed, and said so. “I’m impressed,” he said.

Terry Pratchett, Thief of Time

I can’t really explain why, because it doesn’t make a lot of sense, but for some reason I kept hearing the voice of Hermes Conrad from Futurama whenever Lu-Tze was speaking. I’m not sure what sort of similarities there should be between a Tibetan monk and a Jamaican Olympic limbo athlete turned accountant, but this is what my brain decided. Who am I to argue with my brain?

I Really Miss My Solid-State Drive


Way back in July, I assembled a new computer to replace my old primary workstation, which was my laptop. I decided that the easiest and most economical thing to do was to transplant my laptop’s Crucial M4 solid-state drive into the new machine, and then put one of my spare 7200 RPM hard drives into the laptop.

How bad could it be? I hardly ever have to use the laptop. Who cares if it boots slower and applications open slower?

I’ve now been away from home for four long weeks, and I’ve been using my laptop the entire time. Who cares if my laptop is slower? It turns out that I’m the one who cares, and it turns out that I care about it quite a lot!

The downgrade feels more significant than the upgrade

Upgrading to an SSD was very nice. Some programs open instantly instead of taking a few seconds. Some programs open in a few seconds instead of dozens of seconds. Your computer boots up faster. This is all pretty obvious, and it is quite exciting for the first few days or weeks.

Then you forget about it. This is how things should work. Programs should have always started this quickly. Wasn’t it always like this?

It wasn’t. When you go back to a spinning drive, it becomes painfully obvious. It is much worse than you remembered. You don’t get used to how slow it is, either. You get more annoyed as each day passes.

I think about buying an SSD every time I see one go on sale. I’m traveling, though, and I don’t want to deal with swapping drives while I’m on the road, but the idea gets more and more tempting every day.

Mitigating the pain with preload

I installed preload last month. Preload is an “adaptive readahead daemon.” It monitors the files you often access and attempts to keep them in memory. My laptop has plenty of RAM, so I figured this wouldn’t hurt.
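If you want to try it yourself, it is in the standard repositories; on Ubuntu or Debian, installing it should be as simple as:

sudo apt-get install preload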

It did help with my most obvious problem. I am a terminal addict. I probably open and close hundreds of terminal windows throughout the day. It was sometimes taking over one full second for new terminals to appear on my screen. After installing preload, new terminal windows appear and are ready to use almost instantly.

I haven’t noticed much improvement anywhere else using preload.

Booting takes forever

I shut down and boot up my laptop a couple of times a day lately. I’m amazed at how long it takes to boot. With the solid-state drive, it would be ready to use within about ten seconds of entering my passphrase to unlock the encrypted drive.

I haven’t actually taken a stopwatch to it, but it sure takes a lot longer now. I almost always wander off to do things like empty my pockets and plug in my cell phone. Usually, when I get back, it is still booting up.

Some things just feel absolutely glacial

One of the first and most obvious improvements that I noticed after upgrading to a solid-state drive was how much faster the GIMP launches. The splash screen would pop up, the progress bar would fill up almost instantly, and you’d be editing images before you knew it. Now I can listen to the hard drive churn, watch the disk-activity light, and count up a few hippopotamuses while I wait.

Even with the “quickstarter” running, LibreOffice Writer can sometimes take forever to open. I can count quite a few hippopotamuses while I wait for Writer to open the first time. That’s almost twice as many as when waiting for the GIMP.

Conclusion

If you’re thinking about upgrading to a solid-state drive, you need to stop thinking and start shopping. You won’t be disappointed.

I almost ordered an SSD for my laptop while writing this post. I just don’t want to have to shuffle my data around until I get home, and by the time I get home, I won’t even be using the laptop anymore.

Getting Notified When Long-Running Zsh Processes Complete


I’m always on the lookout for neat, new ideas, and back in November I saw one over at Reddit: using Zsh hooks to trigger a notification when a command takes a long time to complete. In the comments there, I found a link to Jean-Philippe Ouellet’s excellent little zbell.sh script.

His script is a great fit for anyone who only uses one machine, but it wasn’t very useful to me on my Linux machine. A bell going off in the terminal didn’t wake up any of the terminal emulators that I tested; that must be the default behavior on Mac OS X.

At first, I wasn’t sure how frequent these notifications might be. I ended up adding some logging to zbell.sh, and I promptly forgot about it for two months. The contents of the log file looked promising, so I decided to put some more work into this.

A test zbell.zsh notification

The goal

I wanted to be notified at the completion of very long-running processes, even if I am away from my desk. I also wanted to keep the notification process very simple because I wanted it to be easy to install on servers. This was a more difficult proposition than I had expected.

Sending an email seemed like a simple idea. I quickly remembered that most residential ISPs block outgoing SMTP. This slowed me down quite a bit. My next thought was Twitter, but their new authentication process looks a bit too complicated for a simple tool like curl.

That set me looking for a simple tool that could authenticate with an SMTP server over an encrypted connection, and it would be a bonus if the tool was available in the default CentOS, Ubuntu, and Debian repositories. That last bit ended up being the difficult part.

I learned something new

It turns out that curl is able to send emails! I can’t even guess how many years I’ve been using curl, and I never noticed this capability before. Not only is it readily available in every Linux distribution’s repositories, but it can also connect to my mail server using SSL/TLS.

Sending an email on port 465 using curl
curl --ssl-reqd \
  --url "smtps://mail.patshead.com:465" \
  --mail-from "zbell@patshead.com" \
  --mail-rcpt "zbell@patshead.com" \
  --user 'zbell@patshead.com:password' \
  --insecure --upload-file - &> /dev/null <<EOF &|
From: "ZSH Notification" <zbell@patshead.com>
To: "Pat Regan" <thehead@patshead.com>
Subject: $HOST - $zbell_lastcmd

Completed with exit status $?


Love,

Zbell

EOF

Where are we now?

My changes to zbell.sh are still pretty quick and dirty, and they are very specific to my own machine.

I decided to set up two different timers. My fork of zbell.sh plays a sound and fires off a desktop notification using notify-send if a process takes longer than 60 seconds to complete. If it takes more than three minutes, it sends off an email instead.
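The logic boils down to something like the snippet below. The names are illustrative rather than copied from my script, and the email helper is just a stand-in for the curl command shown above:

# Illustrative sketch of the two-tier notification, not the exact code.
# zbell_duration is how long the last command ran, in seconds.
if (( zbell_duration >= 180 )); then
  zbell_send_email                      # hypothetical helper wrapping the curl command above
elif (( zbell_duration >= 60 )); then
  paplay /usr/share/sounds/freedesktop/stereo/complete.oga &> /dev/null
  notify-send "zbell" "$zbell_lastcmd finished"
fi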

I also used my two months of zbell.sh logging to pick out commands to add to the zbell_ignore list. This combination has been working out well so far, but I still get notified about a command that I don’t care about every now and then.

The future

This little script is working pretty well so far, but it still needs a lot of cleanup. The notifications and noise-making parts might be pretty specific to Ubuntu. I’d like to make those more generic or detect which relevant utilities exist on the system. The email addresses and login information are hard-coded into the script. I’m going to have to move those out into variables.

Other than that, it is a usable little script already.