Casino Royale by Ian Fleming


I was at a bar with my old friend Tim back in January. It seems that if he’s not busy building awesome guitars, he just might be reading a book. He told me that he read about half of all the James Bond books last year. He says they’re all rather short, easy to read, and a lot of fun. I’m sure he told me more than that, but those are the highlights that I’m remembering!

I’m a big fan of the James Bond movies—my favorite Bond would most definitely be Roger Moore—so it seemed like a no-brainer to sneak Ian Fleming’s Casino Royale to the top of my list. I thought I would get through it pretty quickly, but it didn’t work out that way.

I’m trying not to write a “book report” here, but I can’t really explain what was slowing me down without giving away a few serious plot points. There will be spoilers ahead!

The first half of the book had very little action. It started out with a chapter full of dossiers on all the various bad guys, and I knew I’d never remember any of that. Then there was a lot of talk about cash for gambling and strategies for gambling before everyone finally got down to playing some baccarat. I doubt I was reading more than a chapter a day.

Things started to get more exciting around the halfway point. The “Bond girl” gets kidnapped, and there’s a car chase. It’s a very poorly executed chase on Bond’s part, and he’s almost immediately captured. This was one of the many things that wouldn’t be likely to happen to Roger Moore or Sean Connery. They then proceed to torture Mr. Bond for at least two chapters.

Le Chiffre spoke.

“That is all, Bond. We will now finish with you. You understand? Not kill you, but finish with you. And then we will have in the girl and see if something can be got out of the remains of the two of you.”

He reached towards the table.

“Say good-bye to it, Bond.”

Ian Fleming, Casino Royale

This wasn’t just the sort of torture where you simply hit someone with a lead pipe until they talk. This involved attacking some important and very sensitive parts. Parts that we’re told will eventually be removed. When Le Chiffre spoke those words at the end of the chapter, I didn’t know if I wanted to turn the page. If I hadn’t been reading Casino Royale on my tablet, I might have wanted to pull a Joey Tribbiani and put the book in the freezer.

I put the book away for a while, and during that time, Season 2 of House of Cards showed up on Netflix. That kept me distracted for a while, and I mostly forgot that I was in the middle of reading a book.

I didn’t pick it back up until last night. Much to my relief, 007 is rescued at the very beginning of the next chapter, and the doctors say he will make a full recovery. I quickly breezed through the rest of the book before I went to sleep. If I’d known I had nothing to worry about, I probably would have finished the book on the same day that I put it aside.

What to read next?

I’ll definitely continue the series, but I’m going to choose something different to read next. I’ve picked up two or three StoryBundle.com bundles that I haven’t touched yet. Perhaps I’ll choose a book from one of those bundles at random. Maybe I’ll luck out!

I Migrated To Octopress From Movable Type


I started this blog back in 2009, and at the time, I really had no idea what my requirements were in a blogging engine. There was only one thing I knew for certain: I wanted static HTML pages. Static pages are served up fast and are very secure. Other than that, I had no idea what I needed.

I ended up choosing Movable Type. All the blog post pages in Movable Type are static, so it did a pretty good job of meeting my only requirement. It has other features that seemed interesting, too. It has a commenting system, and it will send users an email notification if someone replies to their comments. It has a built-in search system. It will notify various services every time a new post is published. Movable Type also lets you schedule posts to be automatically published in the future.

Four years ago, the fact that Movable Type would announce the existence of my new blog posts to the world sounded like an amazingly useful feature. At this point, I am pretty sure it was completely useless.

What was wrong with Movable Type?

Everything in Movable Type happens in its clunky web interface. The worst part about that is having to compose and edit posts in an HTML textarea. On more than one occasion, something stupid happened and made hundreds of words vanish on me because of this. I eventually started using a Chrome extension that would let me edit textareas in Emacs. This helped, but it was still clunky.

Things also seemed to keep getting slower and slower as I wrote more posts. Sometimes I would need to open a half-dozen older posts to check on things and make some small tweaks. This would involve a lot of waiting for posts to open, and waiting for posts to publish.

I was running Movable Type 5, and version 6 was going to be released very soon. I figured that my choices were to upgrade or find a new blogging engine.

What was I looking to improve?

I definitely wanted to get rid of that web-based editor. Most of the new static blog generators store all your posts in text files under Git. This seems brilliant to me. I can’t imagine a faster way to edit blog posts than using Emacs on my local machine. My plan was to be able to have a ‘commit → push → publish’ workflow.
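
I haven’t built the publish end of this yet, but it shouldn’t need much. Here is a minimal sketch of what the server side might look like, assuming a bare Git repository on the web server with a post-receive hook; every path here is a placeholder, and the rake tasks are the standard Octopress ones.

A hypothetical post-receive hook

#!/bin/zsh
# Check out the freshly pushed commit into the working tree,
# then let Octopress regenerate and publish the site.
GIT_WORK_TREE=/srv/blog git checkout -f master
cd /srv/blog || exit 1
unset GIT_DIR                 # keep rake's own git calls away from the bare repo
eval "$(rbenv init -)"        # assumes rbenv is already set up on the server
bundle exec rake generate     # build the site into public/
bundle exec rake deploy       # rsync public/ to the document root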

I also wanted a more modern theme. I’m not the least bit artistic, and I have absolutely no sense of style. I just wanted something that looked clean, and it definitely had to be a responsive design. Responsive web pages adjust their layout to fit whatever device you’re viewing them on, so the same page works on a phone, a tablet, or a desktop.

One small roadblock

Static blog generators are, as the name implies, completely static. That means they’re not going to have any sort of built-in comment system, so I was going to have to use some kind of comment service. I decided to give Disqus a try.

I wasn’t convinced that this was a good idea. When I was surfing the web, I used to notice Disqus all the time. It was often very slow to load. Sometimes it seemed to make entire pages load more slowly. This had me worried, but by the time I was looking to switch, things seemed to be working a lot better.

I migrated over to Disqus while I was still using Movable Type. I’ve been using Disqus since June, and I am actually very pleased with the results. I’m getting more comments than I did before making the switch, and I’m much happier letting Disqus handle sending out all the various notification emails.

Enter Octopress

I’ve been actively using Octopress since August of 2013. I’m extremely happy with the results. I have a nice, clean, responsive theme. According to Piwik, I shaved over a tenth of a second off my average page generation times. That alone was worth the effort of migrating!

All my blog posts are now happily sitting in a Git repository on my local machine, and Seafile does a great job of keeping the posts synced up between my laptop and desktop, even when I forget to commit something, and that happens more often than I’d like to admit!

One important feature is still missing

I’ve been limping along for the last six months without one of my favorite Movable Type features. I currently have no way to schedule a blog post to publish in the future.

My blog web server was on a virtual machine running an ancient version of Ubuntu. The OpenVZ host server was still running CentOS 5, and you need the newer OpenVZ kernel that ships with CentOS 6 to run more modern versions of Ubuntu. This meant that getting rbenv and Octopress up and running was pretty much out of the question.

I’ve since upgraded the host server and the web server, but I still haven’t gotten around to setting up rbenv on the server for Octopress. This means I still don’t have my commit → push → publish workflow up and running yet. I have a little helper script that makes Octopress a bit more comfortable for me, but it still needs a few more features before I can get my automatic publishing back.

The verdict

I’ve been using Octopress for six months so far, and I am extremely happy with it. My site loads faster, looks more modern, and looks so much better on phones and tablets. My friend Brian also migrated to Octopress last year, and I think he’s almost as pleased as I am.

Are you still using Movable Type? Are you using a static blog generator like Octopress? Are you thinking about migrating to a static blog generator? I’d really like to hear what you’re thinking about, or how it is working out for you!

A Couple of zbell.zsh Bug Fixes


I’ve been using zbell for over a month now, and I’m really starting to rely on it. Old habits are hard to break. I still find myself peeking at long-running processes just to see if they’ve finished, but I’m slowly learning to trust that zbell will let me know when something needs my attention.

[Image: a test zbell.zsh notification]

Deploying a new blog post using my laptop takes over thirty seconds. I always like to take a look at the live website after it publishes to make sure I didn’t goof anything up. I’ve finally stopped flipping back to the terminal window to check on the progress. Instead, I’m trusting zbell to let me know when it is time to check the blog.

One very, very annoying bug

Some people complained when I chose control-u as the default key for zsh-dwim. Control-u is bound to the function unix-line-discard by default, and I’ve never used that function in my entire life. Control-c, a key sequence that I use for a similar purpose all the time, has always been my preferred way of canceling a command that I am writing. It has the advantage of leaving the unfinished command on my screen, and that comes in handy if I need to reference my thoughts later on.

This action triggered a very annoying bug in zbell. Hitting control-c would cause zbell to immediately notify me of the previous command. This was very loud and quite annoying. This bug could also be triggered if you hit Enter on a blank line.

Thankfully, this was easy enough to fix.
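
The key detail is that zsh runs the precmd hooks after a control-c and after an empty Enter, but it only runs the preexec hooks when a command is actually executed. A flag set in preexec and cleared in precmd filters out the false alarms. Here is a minimal sketch of the guard; the variable names are mine and may not match the Gist exactly.

A sketch of the preexec guard

zmodload zsh/datetime
autoload -U add-zsh-hook

zbell_duration=60             # seconds before a command counts as long-running

zbell_begin() {
  zbell_timestamp=$EPOCHSECONDS
  zbell_ran=1                 # preexec only fires for real commands
}

zbell_end() {
  (( zbell_ran )) || return   # control-c or a blank line: nothing actually ran
  zbell_ran=0
  (( EPOCHSECONDS - zbell_timestamp >= zbell_duration )) || return
  # ...notification logic goes here...
}

add-zsh-hook preexec zbell_begin
add-zsh-hook precmd zbell_end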

A much less intrusive bug

The email notification in zbell also had a small bug. I had decided to include the command’s exit status in the body of the email, but I wasn’t squirreling that status away early enough. When you want to capture the exit status of a command, you have to capture it immediately. If you execute any other command before making a copy of the exit status variable, you will end up overwriting it.

This one was easy enough to fix, but it might still be pretty fragile. I added a line to copy the exit status to another variable right at the very beginning of the precmd hook, but any other functions bound to the precmd hook could mess that up.
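
In sketch form, the fix is just making that copy the very first statement in the hook:

Capturing the exit status before anything else

zbell_end() {
  local zbell_exit_status=$?  # must come first; any other command
                              # executed here would overwrite $?
  # ...the rest of the hook can safely run other commands now...
}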

I don’t have any other hooks, so this fix is good enough for me.

You can find the zbell.zsh Gist at GitHub.

My Gaming Story: The Start of Three Decades of Gaming


I grew up playing video games. I remember begging my parents incessantly for an Atari when I was six years old. I was too young to know that there were better systems available, and I had no idea that the machine I wanted was actually called an Atari VCS.

I remember one particular Christmas morning, rushing down the stairs as fast as I could to see what Santa Claus had delivered for me. I unwrapped small boxes with game cartridges like Munch-Man, Parsec, and Hunt the Wumpus. These didn’t sound like Atari games to me.

Then I opened a rather large present containing a Texas Instruments TI-99/4a personal computer. It had a metallic case, a keyboard to the left, and a cartridge slot to the right. It didn’t bear much resemblance to an Atari VCS.

My disappointment and confusion vanished pretty quickly once my father got this new piece of equipment hooked up to the TV. Before long, I was happily laying down chains in Munch-Man or shooting down aliens in TI Invaders. Both of these games were superior to the originals from which they were cloned, but I was completely oblivious to that at the time. I was just having fun playing them.

I still enjoy playing some of these games today. I have some of the best TI-99/4a games, like Parsec, Munch-Man, and The Attack, running on my arcade cabinet. It doesn’t quite feel the same playing them like this, but it sure does bring back a lot of memories.

I often wonder where I would be today if my father had gotten me that Atari VCS that I actually wanted. Making a living working with computers has been enjoyable and profitable. I doubt I would be where I am today if my father hadn’t chosen to buy that TI-99/4a.

Thank you, Dad!

Thief of Time by Terry Pratchett


I have been working my way through Pratchett’s Discworld series for quite a few years now. I’ve stuck pretty closely to the Discworld Reading Order diagram. It didn’t take me much more than a year to plow through all the Rincewind and Witches novels. Finishing Thief of Time marks the end of my much slower journey through the Death novels.

Lu-Tze looked impressed, and said so. “I’m impressed,” he said.

Terry Pratchett, Thief of Time

I can’t really explain why, because it doesn’t make a lot of sense, but for some reason I kept hearing the voice of Hermes Conrad from Futurama whenever Lu-Tze was speaking. I’m not sure what sort of similarities there should be between a Tibetan monk and a Jamaican Olympic limbo athlete turned accountant, but this is what my brain decided. Who am I to argue with my brain?

I Really Miss My Solid-State Drive


Way back in July, I assembled a new computer to replace my old primary workstation, which was my laptop. I decided that the easiest and most economical thing to do was to transplant my laptop’s Crucial M4 solid-state drive into the new machine, and then put one of my spare 7200 RPM hard drives into the laptop.

How bad could it be? I hardly ever have to use the laptop. Who cares if it boots slower and applications open slower?

I’ve now been away from home for four long weeks, and I’ve been using my laptop the entire time. Who cares if my laptop is slower? It turns out that I’m the one who cares, and it turns out that I care quite a lot!

The downgrade feels more significant than the upgrade

Upgrading to an SSD was very nice. Some programs open instantly instead of taking a few seconds. Some programs open in a few seconds instead of dozens of seconds. Your computer boots up faster. This is all pretty obvious, and it is quite exciting for the first few days or weeks.

Then you forget about it. This is how things should work. Programs should have always started this quickly. Wasn’t it always like this?

It wasn’t. When you go back to a spinning drive, it becomes painfully obvious. It is much worse than you remembered. You don’t get used to how slow it is, either. You get more annoyed as each day passes.

I think about buying an SSD every time I see one go on sale. I’m traveling, though, and I don’t want to deal with the drive swap here, but the idea gets more and more tempting every day.

Mitigating the pain with preload

I installed preload last month. Preload is an “adaptive readahead daemon.” It monitors the files you often access and attempts to keep them in memory. My laptop has plenty of RAM, so I figured this wouldn’t hurt.
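
On Ubuntu, installing it should just be a matter of grabbing it from the standard repositories:

sudo apt-get install preload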

It did help with my most obvious problem. I am a terminal addict. I probably open and close hundreds of terminal windows throughout the day. It was sometimes taking over one full second for new terminals to appear on my screen. After installing preload, new terminal windows appear and are ready to use almost instantly.

I haven’t noticed much improvement anywhere else using preload.

Booting takes forever

I shut down and boot up my laptop a couple of times a day lately. I’m amazed at how long it takes to boot. With the solid-state drive, it would be ready to use within about ten seconds of entering my passphrase to unlock the encrypted drive.

I haven’t actually taken a stopwatch to it, but it sure takes a lot longer now. I almost always wander off to do things like empty my pockets and plug in my cell phone. Usually, when I get back, it is still booting up.

Some things just feel absolutely glacial

One of the first and most obvious improvements that I noticed after upgrading to a solid-state drive was how much faster the GIMP launches. The splash screen would pop up, the progress bar would fill up almost instantly, and you’d be editing images before you knew it. Now I can listen to the hard drive churn, watch the disk-activity light, and count up a few hippopotamuses while I wait.

Even with the “quickstarter” running, LibreOffice Writer can sometimes take forever to open. I can count quite a few hippopotamuses while I wait for Writer to open the first time. That’s almost twice as many as when waiting for the GIMP.

Conclusion

If you’re thinking about upgrading to a solid-state drive, you need to stop thinking and start shopping. You won’t be disappointed.

I almost ordered an SSD for my laptop while writing this post. I just don’t want to have to shuffle my data around until I get home, and by the time I get home, I won’t even be using the laptop anymore.

Getting Notified When Long-Running Zsh Processes Complete


I’m always on the lookout for neat, new ideas. Back in November, I saw a neat idea over at Reddit: using Zsh hooks to trigger a notification when a command takes a long time to complete. In the comments there, I found a link to Jean-Philippe Ouellet’s excellent little zbell.sh script.

His script is a great fit for anyone who only uses one machine, but it wasn’t very useful to me on my Linux machine. A bell going off in the terminal didn’t wake up any of the terminal emulators that I tested. That must be the default behavior on Mac OS X.

At first, I wasn’t sure how frequent these notifications might be. I ended up adding some logging to zbell.sh, and I promptly forgot about it for two months. The contents of the log file looked promising, so I decided to put some more work into this.

[Image: a test zbell.zsh notification]

The goal

I wanted to be notified at the completion of very long-running processes, even if I am away from my desk. I also wanted to keep the notification process very simple because I wanted it to be easy to install on servers. This was a more difficult proposition than I had expected.

Sending an email seemed like a simple idea. I quickly remembered that most residential ISPs block outgoing SMTP. This slowed me down quite a bit. My next thought was Twitter, but their new authentication process looks a bit too complicated for a simple tool like curl.

That set me looking for a simple tool that could authenticate with an SMTP server over an encrypted connection, and it would be a bonus if the tool was available in the default CentOS, Ubuntu, and Debian repositories. That last bit ended up being the difficult part.

I learned something new

It turns out that curl is able to send emails! I can’t even guess how many years I’ve been using curl, and I never noticed this capability before. Not only is it readily available in every Linux distribution’s repositories, but it can also connect to my mail server using SSL/TLS.

Sending an email on port 465 using curl
curl --ssl-reqd \
  --url "smtps://mail.patshead.com:465" \
  --mail-from "zbell@patshead.com" \
  --mail-rcpt "zbell@patshead.com" \
  --user 'zbell@patshead.com:password' \
  --insecure --upload-file - &> /dev/null <<EOF &|
From: "ZSH Notification" <zbell@patshead.com>
To: "Pat Regan" <thehead@patshead.com>
Subject: $HOST - $zbell_lastcmd

Completed with exit status $?


Love,

Zbell

EOF

Where are we now?

My changes to zbell.sh are still pretty quick and dirty, and they are very specific to my own machine.

I decided to set up two different timers. My fork of zbell.sh plays a sound and fires off a desktop notification using notify-send if a process takes longer than 60 seconds to complete. If it takes more than three minutes, it sends off an email instead.
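
The dispatch at the tail end of my precmd hook looks roughly like this; the elapsed-time variable is my own naming, and the sound file is just one that Ubuntu happens to ship.

Choosing between a desktop notification and an email

local elapsed=$(( EPOCHSECONDS - zbell_timestamp ))
if (( elapsed > 180 )); then
  zbell_email                 # wraps the curl invocation shown above
elif (( elapsed > 60 )); then
  notify-send "zbell" "$zbell_lastcmd finished"
  paplay /usr/share/sounds/freedesktop/stereo/complete.oga &!
fi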

I also used my two months of zbell.sh logging to pick out commands to add to the zbell_ignore list. This combination has been working out so far, but I still get notified of a command that I don’t care about every now and then.
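
The ignore list itself is just a zsh array of command names. Mine looks something along these lines, with the specific entries being my own picks:

zbell_ignore+=($EDITOR $PAGER vim less man ssh htop watch)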

The future

This little script is working pretty well so far, but it still needs a lot of cleanup. The notifications and noise-making parts might be pretty specific to Ubuntu. I’d like to make those more generic or detect which relevant utilities exist on the system. The email addresses and login information are hard-coded into the script. I’m going to have to move those out into variables.
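
The detection part shouldn’t be hard; something along these lines for each helper, with the fallback logic still to be figured out:

Detecting an available notifier

if command -v notify-send >/dev/null 2>&1; then
  notify-send "zbell" "$zbell_lastcmd finished"
fi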

Other than that, it is a usable little script already.

My Year in Books 2013


I was hoping to avoid writing another one of these posts this year. While I was writing the last one, I decided that it would be a much better idea to write up a short review each time I finish a book.

That didn’t happen, though. I didn’t read anything at all until March, and I blasted straight through three books in about four weeks’ time. By then it was just too late. I’m going to try to do a better job this year.

I may not have done a good job writing about these books, but I sure did a good job keeping track of them. I meticulously entered all my data into my Goodreads.com account as I was reading.

I’m very pleased with Goodreads. Each month, they do a pretty good job of warning me that new books have been released by authors that I’ve previously read. I might have missed Fate of Worlds and The Long War this year if it weren’t for these monthly alerts.

The list

Lots and lots of John Scalzi

This happens to me all the time. I try my very best not to read books by the same author back to back. Sometimes I just can’t pick out my next book, and I get lazy. It is easy to just pick another book from the same author’s catalog. Old Man’s War happened to be in the first Humble eBook Bundle, and that was a good enough reason for me to read it.

I enjoyed it quite a bit, and the Internet was buzzing about Redshirts, so it seemed like a good choice for my next book. The premise sounded great, and I really liked the first two-thirds of the book. In my opinion, that’s right around where the book probably should have ended. This is probably the reason I didn’t get back to Scalzi’s books until October.

I’m glad I picked back up where I left off. I had a good time reading the next two books in the Old Man’s War series, and I look forward to reading more this year.

I should also mention that I find John Scalzi’s ridiculously large recliner very intriguing.

The Making of Karateka

Karateka was one of the first games I played on my Apple II when I was a kid. I still remember the first time I made it to the end of the game and defeated Akuma. I walked through the door, up to Princess Mariko, and she killed me with one solid kick to the face. I had no idea that I wasn’t supposed to walk up to her in a fighting stance!

I got this book as part of a bundle from StoryBundle.com. I thought it sounded a little interesting, but I didn’t think I would enjoy reading someone’s diary. I was very wrong about that. This was by far my favorite book of 2013.

The book brought back a lot of old memories, and it was very interesting to learn exactly how one of my old favorite games came into existence. It was nice to read about the old days, when one person could single-handedly create a state-of-the-art video game from scratch.

Larry Niven’s Known Space

It is always so easy to just choose another book in a series that you’re already familiar with, and I’ve read just about every one of Niven’s Known Space and Ringworld books. It was nice to see a couple of new books in this universe show up this year.

Just one Terry Pratchett book

I really enjoy Terry Pratchett’s work, and I always devour his books very, very quickly. That’s why I’m surprised that I only read one of his novels last year. The Long War, the sequel to The Long Earth, was written by Terry Pratchett and Stephen Baxter: two authors whose work I enjoy very much.

I’ve read some complaints about this book. They say that the story doesn’t really go anywhere, and that it raises more questions instead of answering the ones from the first book. This didn’t bother me at all, because the journey was a lot of fun.

The Long Earth series feels to me like the work of Stephen Baxter with a little bit of Pratchett’s whimsy thrown in, and that’s just fine by me.

My failures of 2013

I hate to give up on a book, but I did so twice this year. Earlier in the year, I started reading Terry Pratchett’s Dodger. I just couldn’t get into it. I put it away after about 60 pages.

I started reading Kim Stanley Robinson’s Blue Mars back in November. I very much enjoyed reading the first two books in the series, but they are quite long and mostly pretty dry. The chapters are long, and you’ll often spend several pages in a row reading descriptions of Martian geography and weather.

This isn’t necessarily a bad thing, but I knew that if I started reading it, I wouldn’t be finished before the end of the year. I knew I was going to be traveling, so I thought it would be better to be reading something lighter. I only made it a dozen pages in before I decided to read Betrayer of Worlds instead.

Plans for 2014

I decided to start off the year by continuing my journey through Terry Pratchett’s Discworld series. We’re barely a week into the new year, and I’m already down to the very end of Thief of Time. Susan has already saved the day, and the story is winding down.

I’m going to do my very best to write a little about each book I read this year, and I’m going to try to do it shortly after I finish each one. These posts are more for my benefit than anyone else’s, so I’m going to try not to post them too often!

Looking Back at My 2013 New Year’s Resolutions


Earlier in the year, I posted a short list of resolutions for the year ahead. Most of them were really just tasks or projects that I was hoping to work on this year. The year is now rapidly winding down, so I thought this would be a good time to look back and see if I did a good job this year.

Generate new personal crypto keys – PARTIAL SUCCESS

I did generate new ssh private keys this year. Twice. I actually did it the first time specifically to satisfy this particular new year’s resolution. Later in the year, I built a new computer. That meant I needed an additional ssh private key, and I needed to push that out to all the computers that I ssh into.

Pushing out two new keys is just as easy as pushing out a single key, so I generated even more secure ssh keys for each of my machines. I now feel ever so slightly more secure.
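
I don’t remember the exact parameters I used, but “more secure” presumably meant something like a bigger RSA key:

ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa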

I have not yet created a new GPG key. This is a bit embarrassing for me. By the time I got around to thinking about doing this, Edward Snowden had already started leaking information about cryptographic weaknesses. I really need to research which ciphers are safe in a post-Snowden world, but I haven’t done that yet. I’ll just have to add this to next year’s list!

Rework my persist system – FAIL

I love my little persist system. I use it to manage my configuration files and keep them stored away in a Git repository. It works really well, and since that repository lives in one of my Seafile libraries, my random config files get quickly and automatically synchronized between my laptop and desktop. I don’t run the Seafile client on any of my servers, but this has me covered in the two places I care about the most.

The problem is that I built my persist system in place. I manually created the Git repository where the config files are stored, and the scripts are all hard-coded to point to that repository. They also assume that repository exists. It doesn’t help that I decided to make my persist repository a submodule of my Prezto repository.

I need to rework it quite a bit. It needs to create that repository for you automatically. The trouble here is that I’m lazy, and it is already working just fine for me. I seem to always have something else that I’d rather work on, and I don’t want to tear out something that works just to replace it with something almost identical.

This is holding up my short series of posts regarding the cleanup of my shell environment. One of the goals of that cleanup was to have a simple way to push my rather large Zsh configuration out to new workstations and servers.

The utility of that feature has diminished quite a bit since I started using Seafile. These files are now automatically copied around from machine to machine for me, and that has reduced my level of motivation quite a bit.

I haven’t really found anything quite like my persist system. It has been quite a while since I shopped around for something like this, though. Maybe I will luck out and find something even better. If I don’t, maybe I’ll be more motivated to make my own setup usable by the general public!

In any case, I would really like to see my shell cleanup series progress to its third part!

Continue working on zsh-dwim – SUCCESS

There’s not much to say here. I made updates to zsh-dwim often enough that I wrote five short posts about it this year.

I’ve slowed down quite a bit on this in the second half of the year, but I’ve already crossed off almost every item on my zsh-dwim to-do list. The ones that I haven’t crossed off are either very strange, or ended up not being very intuitive.

I think I’ve scratched all of my own zsh-dwim itches. I’m most likely to be doing sysadmin-related tasks. I bet someone could think of some transformations that would help streamline a programmer’s workflow. I just don’t have the data to make that happen.

Build a web interface for the arcade cabinet – FAIL

I probably didn’t think about this one a single time after January. I had this idea late last year when my friend Brian ordered a whole mess of NFC tags. It sounded like a pretty neat idea at the time, but I don’t think I’d get much mileage out of it.

I’m going to scratch this one off the list for now.

[Image: arcade cabinet]

Buy fewer games for my arcade cabinet, spend more time playing them instead – FAIL

I didn’t buy a single game for the arcade cabinet during the first few months of the year. After that, I decided it was alright to buy games for the arcade cabinet if they had native Linux ports.

By summer, I was even buying some Windows-only games again. I buy almost everything that shows up on IndieGameStand that has a Linux port, and I’ll happily pay a dollar for any arcade-style, Windows-only game they put up for sale. If I’m lucky, they’ll run under Wine. If I’m unlucky, I’m just out a dollar.

What’s in store for next year?

I’m not sure, but I still have dozens of hours to think about it! I’d like to do something fun with one of my Arduinos, and I definitely have to build a new stand for my new 27-inch monitors.

I would also like to add another chair to my home office: something comfortable to lounge on when I’m reading, or when I’m watching movies or playing games on my arcade cabinet’s new TV. This is probably the smallest home office I’ve ever had, and it is starting to feel a bit cramped in there. It might be difficult to find something that fits, but I’m going to give it a try.

Emacs: ELPA and use-package


I have dozens of files from various corners of the Internet in my ~/.emacs.d/lisp/ directory. I’ve tried to keep the important ones up to date, but I’ve never done a very good job of it. The ones that are old and outdated are the lucky ones. Most of them are so old that I don’t even use them anymore.

Emacs now has a rather nice package manager called ELPA, and I started using it over the summer. Converting my existing configuration files has been a bit painful, and the apparent gains from the effort have been pretty small. I’ve only converted a very small number of my configuration files because of this.

My problem with ELPA

My Emacs configuration is pretty well organized. I’ve been using my-site-start to help manage my configuration since 2009. It is very simple, and it works a lot like the SysV init system. It automatically executes all the files in my ~/.emacs.d/site-start.d directory. I try to keep all the configuration for each individual package in its own file. That makes it easy to quickly remove configurations that I don’t need.

Most of these files are very simple and quite similar. They usually have a require statement at the top, followed by one or more calls to setq. Some contain key binding assignments, but I mostly keep those to a single file.

~/.emacs.d/site-start.d/90git-gutter.el
(require 'git-gutter)

(global-git-gutter-mode t)
(setq git-gutter:always-show-gutter t)

I assumed that I would be able to reinstall my existing packages using ELPA, and my existing configuration files would just continue to work. The reality wasn’t quite that simple. My understanding might not be entirely accurate here, but it shouldn’t be too far from reality.

The packages installed by ELPA aren’t available until after Emacs is completely finished starting up. This means that all of those require statements become errors. That isn’t a problem on its own, but this also means that any call to functions within these packages also becomes an error. At this point, that call to the global-git-gutter-mode function will fail.

I was able to work around this by setting package-enable-at-startup to nil. This seemed like a bit of a kludge. I’m pretty sure this means that I now have to require every package I install.
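
For reference, the usual incantation for this workaround looks like the following. This is the generic package.el pattern rather than a verbatim copy of my init file.

Initializing ELPA packages immediately

;; Activate installed packages right away, instead of after init
;; finishes, so that plain `require' calls can find them.
(setq package-enable-at-startup nil)
(package-initialize)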

John Wiegley’s use-package

I found John Wiegley’s use-package last week, and I am very pleased with it. It manages to solve all my issues with ELPA, and it sure looks like it is going to lead me towards a much cleaner Emacs configuration.

~/.emacs.d/site-start.d/90git-gutter.el with use-package
(use-package git-gutter
  :init
  (progn
    (global-git-gutter-mode t)
    (setq git-gutter:always-show-gutter t)
    ))

I’ve started my journey the lazy way. I replaced all my require calls with use-package calls. Then I just wrapped up my existing code and stuffed it into use-package’s :init section. This was just enough to eliminate my reliance on setting package-enable-at-startup to nil.

I’ve only just scratched the surface of use-package. It also has options for the configuration of key bindings and for populating your auto-mode-alist.
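
Something like this ought to work; magit and ruby-mode here are just hypothetical illustrations, not packages I have converted yet.

Hypothetical key binding and auto-mode-alist examples

(use-package magit
  :bind ("C-x g" . magit-status)) ; sets up the binding and the autoload

(use-package ruby-mode
  :mode "\\.rake\\'")             ; adds the entry to auto-mode-alist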

I have a feeling that I’m going to be slowly rewriting all of my Emacs Lisp files over the next few weeks!