I just happened to take a look at the log of my Emacs configuration Git repository yesterday. I don’t know what made me page through the entire log, but I was very excited to see that the oldest log entry was dated March 13, 2004. That’s ten years to the day!
That was the day that I migrated my Emacs configuration files from CVS to Darcs. I didn’t import my CVS history. Having history is handy when I mess something up and can’t figure out what I did wrong. Only recent history is useful for that. I didn’t think that ancient history was worth the trouble of importing one commit at a time into Darcs.
I miss Darcs
At the time, Git didn’t exist yet. It was still over a year from its first release. I had already chosen Darcs as my preferred “next generation” version control system. Back in 2004, I still had a separate laptop and desktop. We didn’t have Wi-Fi hotspots in every coffee shop, and we didn’t have convenient things like Seafile or Dropbox to keep our files in sync.
Distributed version control with Darcs was an amazing upgrade, and storing configuration files in Darcs was very convenient. I didn’t have to hope that I remembered to check out my projects before I left home or worry about finding an Internet connection if I forgot.
I was a late adopter of Git. I didn’t migrate my Darcs repositories to Git until March 2011. In my opinion, Darcs has a much more user-friendly command-line interface, and I also preferred the Darcs concept of “every copy is a separate branch.”
If you switch branches often, Git will be faster and more convenient, but it was handy knowing that each copy of each Darcs repository automatically acted like a distinct branch. Combining that feature with Darcs’s excellent merging made it easy to commit small, host-specific changes to local repositories.
Finally giving in to peer pressure
I had to give up on Darcs. Git may be a pain in the neck in comparison, but there’s just too much friction when the rest of the world has decided to use Git. It looks like I converted my Emacs repository three years ago.
I’m surprised so much time has passed already. I think I’m still less comfortable with Git today than I was just a few months into using Darcs. Git is not only less intuitive than Darcs, but I run into merge conflicts much more often than I ever did with Darcs. Git’s merging feels like a hammer compared to the scalpel of Darcs’s “patch theory.”
Do you keep your configuration files in version control? Are you also using Git?
I was at a bar with my old friend Tim back in January. It seems that if he’s not busy building awesome guitars, he just might be reading a book. He told me that he read about half of all the James Bond books last year. He says they’re all rather short, easy to read, and a lot of fun. I’m sure he told me more than that, but those are the highlights that I’m remembering!
I’m a big fan of the James Bond movies—my favorite Bond would most definitely be Roger Moore—so it seemed like a no-brainer to sneak Ian Fleming’s Casino Royale to the top of my list. I thought I would get through it pretty quickly, but it didn’t work out that way.
I’m trying not to write a “book report” here, but I can’t really explain what was slowing me down without giving away a few serious plot points. There will be spoilers ahead!
The first half of the book had very little action. It started out with a chapter full of dossiers on all the various bad guys, and I knew I’d never remember any of that. Then there was a lot of talk about cash for gambling and strategies for gambling before they finally got down to playing some baccarat. I’d be surprised if I was reading more than a chapter each day.
Things started to get more exciting around the halfway point. The “Bond girl” gets kidnapped, and there’s a car chase. It’s a very poorly executed chase on Bond’s part, and he’s almost immediately captured. This was one of the many things that would be unlikely to happen to Roger Moore or Sean Connery. They then proceed to torture Mr. Bond for at least two chapters.
Le Chiffre spoke.
“That is all, Bond. We will now finish with you. You understand? Not kill you, but finish with you. And then we will have in the girl and see if something can be got out of the remains of the two of you.”
He reached towards the table.
“Say good-bye to it, Bond.”
This wasn’t just the sort of torture where you simply hit someone with a lead pipe until they talk. This involved attacking some important and very sensitive parts. Parts that we’re told will eventually be removed. When Le Chiffre spoke those words at the end of the chapter, I didn’t know if I wanted to turn the page. If I hadn’t been reading Casino Royale on my tablet, I might have wanted to pull a Joey Tribbiani and put the book in the freezer.
I put the book away for a while, and during that time, Season 2 of House of Cards showed up on Netflix. That kept me distracted for a while, and I mostly forgot that I was in the middle of reading a book.
I didn’t pick it back up until last night. Much to my relief, 007 is rescued at the very beginning of the next chapter, and the doctors say he will make a full recovery. I quickly breezed through the rest of the book before I went to sleep. If I’d known I had nothing to worry about, I probably would have finished the book on the same day that I ended up putting it aside.
What to read next?
I’ll definitely continue the series, but I’m going to choose something different to read next. I’ve picked up two or three StoryBundle.com bundles that I haven’t touched yet. Perhaps I’ll choose a book from one of those bundles at random. Maybe I’ll luck out!
I started this blog back in 2009, and at the time, I really had no idea what my requirements were in a blogging engine. There was only one thing I knew for certain: I wanted static HTML pages. Static pages are served up fast and are very secure. Other than that, I had no idea what I needed.
I ended up choosing Movable Type. All the blog post pages in Movable Type are static, so it did a pretty good job of meeting my only requirement. It has other features that seemed interesting, too. It has a commenting system, and it will send users an email notification if someone replies to their comments. It has a built-in search system. It will notify various services every time a new post is published. Movable Type also lets you schedule posts to be automatically published in the future.
Four years ago, the fact that Movable Type will announce the existence of my new blog posts to the world sounded like it would be an amazingly useful feature. At this point, I am pretty sure it was completely useless.
What was wrong with Movable Type?
Everything in Movable Type happens in its clunky web interface. The worst part is having to compose and edit posts in an HTML textarea. On more than one occasion, something stupid happened that made hundreds of words vanish because of this. I eventually started using a Chrome extension that let me edit textareas in Emacs. This helped, but it was still clunky.
Things also seemed to keep getting slower and slower as I wrote more posts. Sometimes I would need to open a half-dozen older posts to check on things and make some small tweaks. This would involve a lot of waiting for posts to open, and waiting for posts to publish.
I was running Movable Type 5, and version 6 was going to be released very soon. I figured that my choices were to upgrade or find a new blogging engine.
What was I looking to improve?
I definitely wanted to get rid of that web-based editor. Most of the new static blog generators store all your posts in text files under Git. This seems brilliant to me. I can’t imagine a faster way to edit blog posts than using Emacs on my local machine. My plan was to have a ‘commit -> push -> publish’ workflow.
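That workflow can be sketched as a small wrapper script. This is only an illustration, not a description of my actual setup: the remote name, the branch, and the idea that a server-side post-receive hook handles the Octopress publishing step are all assumptions.

```shell
#!/bin/sh
# Hypothetical 'commit -> push -> publish' helper. The publishing
# itself would happen on the server, triggered by the push.

publish_post() {
    msg=${1:-"Update blog"}
    git add -A
    git commit -m "$msg"
    git push origin master
    # A post-receive hook on the server would then check out the new
    # commit and run Octopress's 'rake generate' to rebuild the site.
}
```

With something like this in place, publishing a post is a single command from the machine where the post was written.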
I also wanted a more modern theme. I’m not the least bit artistic, and I have absolutely no sense of style. I just wanted something that looked clean, and it definitely had to be a responsive design. Responsive web pages detect what sort of device you are on and adjust the layout of the page to fit.
One small roadblock
Static blog generators are, well, completely static. That means they’re not going to have any sort of built-in comment system, so I was going to have to use some sort of comment service. I decided to give Disqus a try.
I wasn’t convinced that this was a good idea. When I was surfing the web, I used to notice Disqus all the time. It was often very slow to load. Sometimes it seemed to make entire pages load more slowly. This had me worried, but by the time I was looking to switch, things seemed to be working a lot better.
I migrated over to Disqus while I was still using Movable Type. I’ve been using Disqus since June, and I am actually very pleased with the results. I’m getting more comments than I did before making the switch, and I’m much happier letting Disqus handle sending out all the various notification emails.
I’ve been actively using Octopress since August of 2013. I’m extremely happy with the results. I have a nice, clean, responsive theme. According to Piwik, I shaved over a tenth of a second off my average page generation times. That alone was worth the effort of migrating!
I’ve been limping along for the last six months without one of my favorite Movable Type features. I currently have no way to schedule a blog post to publish in the future.
My blog web server was on a virtual machine running an ancient version of Ubuntu. The OpenVZ host server was still running CentOS 5, and you need the newer OpenVZ kernel that ships with CentOS 6 to run more modern versions of Ubuntu. This meant getting rbenv and Octopress up and running was pretty much out of the question.
I’ve since upgraded the host server and the web server, but I still haven’t gotten around to setting up rbenv on the server for Octopress. This means I still don’t have my commit -> push -> publish workflow up and running yet. I have a little helper script that makes Octopress a bit more comfortable for me, but it still needs a few more features before I can get my automatic publishing back.
I’ve been using Octopress for six months so far, and I am extremely happy with it. My site loads faster, looks more modern, and looks so much better on phones and tablets. My friend Brian also migrated to Octopress last year, and I think he’s almost as pleased as I am.
Are you still using Movable Type? Are you using a static blog generator like Octopress? Are you thinking about migrating to a static blog generator? I’d really like to hear what you’re thinking about, or how it is working out for you!
I’ve been using zbell for over a month now, and I’m really starting to rely on it. Old habits are hard to break. I still find myself peeking at long running processes just to see if they’ve finished, but I’m slowly learning to trust that zbell will let me know when something needs my attention.
Deploying a new blog post using my laptop takes over thirty seconds. I always like to take a look at the live website after it publishes to make sure I didn’t goof anything up. I’ve finally stopped flipping back to the terminal window to check on the progress. Instead, I’m trusting zbell to let me know when it is time to check the blog.
One very, very annoying bug
Some people complained when I chose control-u as the default key for zsh-dwim. Control-u is bound to the unix-line-discard function by default, but I’ve never used that function in my entire life. Control-c, which I use for a similar purpose all the time, has always been my preferred way of canceling a command that I’m writing. It has the advantage of leaving the unfinished command on my screen, and that comes in handy if I need to reference my thoughts later on.
This action triggered a very annoying bug in zbell. Hitting control-c would cause zbell to immediately notify me of the previous command. This was very loud and quite annoying. This bug could also be triggered if you hit Enter on a blank line.
Thankfully, this was easy enough to fix.
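The shape of the fix can be shown with a pair of plain-shell functions that mimic zbell’s preexec and precmd hooks. This is a sketch of the idea, not zbell’s actual code, and the function and variable names are made up.

```shell
# zbell records a timestamp when a command starts (a zsh preexec hook)
# and compares it when the prompt returns (a precmd hook). Ctrl-c and a
# bare Enter redraw the prompt WITHOUT running preexec, so the guard
# below bails out unless a command actually started.

zbell_timestamp=""

zbell_begin() {              # stands in for the preexec hook
    zbell_timestamp=$(date +%s)
}

zbell_end() {                # stands in for the precmd hook
    # No preexec since the last prompt? Then this is ctrl-c or a blank
    # line; don't re-notify for the previous command.
    [ -n "$zbell_timestamp" ] || return 1
    zbell_elapsed=$(( $(date +%s) - zbell_timestamp ))
    zbell_timestamp=""       # consume it so it can't fire twice
    return 0
}
```

Clearing the timestamp after each use is the important part; the next precmd with no matching preexec simply does nothing.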
A much less intrusive bug
The email notification in zbell also had a small bug. I had decided to include the command’s exit status in the body of the email, but I wasn’t squirreling that status away early enough. When you want to capture the exit status of a command, you have to capture it immediately. If you execute any other command before making a copy of the exit status variable, you will end up overwriting it.
This one was easy enough to fix, but it might still be pretty fragile. I added a line to copy the exit status to another variable right at the very beginning of the precmd hook, but any other functions bound to the precmd hook could mess that up.
I don’t have any other hooks, so this fix is good enough for me.
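The fragile part is easy to demonstrate in plain shell. The hook and variable names here are made up for illustration; the point is only that $? has to be saved by the very first statement.

```shell
# Stand-in for the precmd hook: the first statement saves $?, because
# any other command run first (even a harmless 'date') would reset it.
zbell_precmd() {
    zbell_last_status=$?     # must be the very first line
    # ... the rest of the hook (bell, notification, email) runs after
}
```

Any other function bound to precmd that runs first would clobber $? before this line ever sees it, which is exactly why the fix is fragile.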
I grew up playing video games. I remember begging my parents incessantly for an Atari when I was six years old. I was too young to know that there were better systems available, and I had no idea that the machine I wanted was actually called an Atari VCS.
I remember one particular Christmas morning, rushing down the stairs as fast as I could to see what Santa Claus had delivered for me. I unwrapped small boxes with game cartridges like Munch-Man, Parsec, and Hunt the Wumpus. These didn’t sound like Atari games to me.
Then I opened a rather large present containing a Texas Instruments TI-99/4a personal computer. It had a metallic case, a keyboard to the left, and a cartridge slot to the right. It didn’t bear much resemblance to an Atari VCS.
My disappointment and confusion vanished pretty quickly once my father got this new piece of equipment hooked up to the TV. By then, I was happily laying down chains in Munch-Man, or shooting down aliens in TI Invaders. Both of these games were superior to the originals from which they were cloned, but I was completely oblivious to that at the time. I was just having fun playing them.
I still enjoy playing some of these games today. I have some of the best TI-99/4a games, like Parsec, Munch-Man, and The Attack running on my arcade cabinet. It doesn’t quite feel the same playing them like this, but it sure does bring back a lot of memories.
I often wonder where I would be today if my father had gotten me that Atari VCS that I actually wanted. Making a living working with computers has been enjoyable and profitable. I doubt I would be where I am today if my father hadn’t chosen to buy that TI-99/4a.
I have been working my way through Pratchett’s Discworld series for quite a few years now. I’ve stuck pretty closely to the Discworld Reading Order diagram. It didn’t take me much more than a year to plow through all the Rincewind and Witches novels. Finishing Thief of Time marks the end of my much slower journey through the Death novels.
Lu-Tze looked impressed, and said so. “I’m impressed,” he said.
I can’t really explain why, because it doesn’t make a lot of sense, but for some reason I kept hearing the voice of Hermes Conrad from Futurama whenever Lu-Tze was speaking. I’m not sure what sort of similarities there should be between a Tibetan monk and a Jamaican Olympic limbo athlete turned accountant, but this is what my brain decided. Who am I to argue with my brain?
How bad could it be? I hardly ever have to use the laptop. Who cares if it boots slower and applications open slower?
I’ve now been away from home four long weeks, and I’ve been using my laptop the entire time. Who cares if my laptop is slower? It turns out that I’m the one who cares, and it turns out that I care about it quite a lot!
The downgrade feels more significant than the upgrade
Upgrading to an SSD was very nice. Some programs open instantly instead of taking a few seconds. Some programs open in a few seconds instead of dozens of seconds. Your computer boots up faster. This is all pretty obvious, and it is quite exciting for the first few days or weeks.
Then you forget about it. This is how things should work. Programs should have always started this quickly. Wasn’t it always like this?
It wasn’t. When you go back to a spinning drive, it becomes painfully obvious. It is much worse than you remembered. You don’t get used to how slow it is, either. You get more annoyed as each day passes.
I think about buying an SSD every time I see one go on sale. I’m traveling, though, and I don’t want to deal with the drive swap here, but the idea gets more and more tempting every day.
Mitigating the pain with preload
I installed preload last month. Preload is an “adaptive readahead daemon.” It monitors the files you often access and attempts to keep them in memory. My laptop has plenty of RAM, so I figured this wouldn’t hurt.
It did help with my most obvious problem. I am a terminal addict. I probably open and close hundreds of terminal windows throughout the day. It was sometimes taking over one full second for new terminals to appear on my screen. After installing preload, new terminal windows appear and are ready to use almost instantly.
I haven’t noticed much improvement anywhere else using preload.
Booting takes forever
I shut down and boot up my laptop a couple of times a day lately. I’m amazed at how long it takes to boot. With the solid-state drive, it would be ready to use within about ten seconds of entering my passphrase to unlock the encrypted drive.
I haven’t actually taken a stopwatch to it, but it sure takes a lot longer now. I almost always wander off to do things like empty my pockets and plug in my cell phone. Usually, when I get back, it is still booting up.
Some things just feel absolutely glacial
One of the first and most obvious improvements that I noticed after upgrading to a solid-state drive was how much faster the GIMP launches. The splash screen would pop up, the progress bar would fill up almost instantly, and you’d be editing images before you knew it. Now I can listen to the hard drive churn, watch the disk-activity light, and count up a few hippopotamuses while I wait.
Even with the “quickstarter” running, LibreOffice Writer can sometimes take forever to open. I can count quite a few hippopotamuses while I wait for Writer to open the first time. That’s almost twice as many as when waiting for the GIMP.
If you’re thinking about upgrading to a solid-state drive, you need to stop thinking and start shopping. You won’t be disappointed.
I almost ordered an SSD for my laptop while writing this post. I just don’t want to have to shuffle my data around until I get home, and by the time I get home, I won’t even be using the laptop anymore.
His script is a great fit for anyone who only uses one machine, but it wasn’t very useful to me on my Linux machine. A bell going off in the terminal didn’t get the attention of any of the terminal emulators that I tested; that must be default behavior on Mac OS X.
At first, I wasn’t sure how prolific these notifications might be. I ended up adding some logging to zbell.sh, and I promptly forgot about it for two months. The contents of the log file looked promising, so I decided to put some more work into this.
I wanted to be notified at the completion of very long-running processes, even if I am away from my desk. I also wanted to keep the notification process very simple because I wanted it to be easy to install on servers. This was a more difficult proposition than I had expected.
Sending an email seemed like a simple idea. I quickly remembered that most residential ISPs block outgoing SMTP. This slowed me down quite a bit. My next thought was Twitter, but their new authentication process looks a bit too complicated for a simple tool like curl.
That set me looking for a simple tool that could authenticate with an SMTP server over an encrypted connection. It would be a bonus if the tool was available in the default CentOS, Ubuntu, and Debian repositories. That last bit ended up being the difficult part.
I learned something new
It turns out that curl is able to send emails! I can’t even guess how many years I’ve been using curl, and I never noticed this capability before. Not only is it readily available in every Linux distribution’s repositories, but it can also connect to my mail server using SSL/TLS.
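Here’s roughly what that looks like. The server, port, addresses, and password in this sketch are all placeholders; the real parts are curl’s smtps:// support and its --mail-from and --mail-rcpt options.

```shell
#!/bin/sh
# Hedged sketch of sending a notification email with nothing but curl.
# The ZBELL_CURL variable exists only so a stub can stand in for the
# real curl binary when testing.

send_alert_mail() {
    subject=$1
    body=$2
    msgfile=$(mktemp)
    printf 'Subject: %s\n\n%s\n' "$subject" "$body" > "$msgfile"
    # smtps:// speaks SSL/TLS from the first byte (port 465), and
    # --ssl-reqd makes curl refuse to continue over a plain connection.
    ${ZBELL_CURL:-curl} --silent --ssl-reqd \
        --url 'smtps://mail.example.com:465' \
        --user 'notifier@example.com:app-password' \
        --mail-from 'notifier@example.com' \
        --mail-rcpt 'me@example.com' \
        --upload-file "$msgfile"
    status=$?
    rm -f "$msgfile"
    return $status
}
```

No local mail daemon, no extra packages: curl ships everywhere and handles the encrypted SMTP session on its own.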
My changes to zbell.sh are still pretty quick and dirty, and they are very specific to my own machine.
I decided to set up two different timers. My fork of zbell.sh plays a sound and fires off a desktop notification using notify-send if a process takes longer than 60 seconds to complete. If it takes more than three minutes, it sends off an email instead.
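The two-tier logic amounts to a tiny decision function. The thresholds below match the ones just described, but the function name is made up for illustration.

```shell
# Decide how to notify based on how long a command ran. Under a
# minute: stay quiet. A minute or more: bell plus desktop
# notification. Three minutes or more: send an email instead.
zbell_pick_notifier() {
    duration=$1    # elapsed seconds
    if [ "$duration" -ge 180 ]; then
        echo email
    elif [ "$duration" -ge 60 ]; then
        echo desktop
    else
        echo none
    fi
}
```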
I also used my two months of zbell.sh logging to pick out commands to add to the zbell_ignore list. This combination has been working out so far, but I still get notified of a command that I don’t care about every now and then.
This little script is working pretty well so far, but it still needs a lot of cleanup. The notifications and noise-making parts might be pretty specific to Ubuntu. I’d like to make those more generic or detect which relevant utilities exist on the system. The email addresses and login information are hard-coded into the script. I’m going to have to move those out into variables.
Other than that, it is a usable little script already.
I was hoping to avoid writing another post like this again this year. While I was writing the last one, I decided that it would be a much better idea to write up a short review each time I finish a book.
That didn’t happen, though. I didn’t read anything at all until March, and I blasted straight through three books in about four weeks’ time. By then it was just too late. I’m going to try to do a better job this year.
I may not have done a good job writing about these books, but I sure did a good job keeping track of them. I meticulously entered all my data into my Goodreads.com account as I was reading.
I’m very pleased with Goodreads. Each month, they do a pretty good job of warning me that new books have been released by authors that I’ve previously read. I might have missed Fate of Worlds and The Long War this year if it weren’t for these monthly alerts.
This happens to me all the time. I try my very best not to read books by the same author back to back. Sometimes I just can’t pick out my next book, and I get lazy. It is easy to just pick another book from the same author’s catalog. Old Man’s War happened to be in the first Humble eBook Bundle, and that was a good enough reason for me to read it.
I enjoyed it quite a bit, and the Internet was buzzing about Redshirts, so it seemed like a good choice for my next book. The premise sounded great, and I really liked the first two thirds of the book. In my opinion, this is right around where the book probably should have ended. This is probably the reason I didn’t get back to Scalzi’s books until October.
I’m glad I picked back up where I left off. I had a good time reading the next two books in the Old Man’s War series, and I look forward to reading more this year.
I should also mention that I find John Scalzi’s ridiculously large recliner very intriguing.
The Making of Karateka
Karateka was one of the first games I played on my Apple II when I was a kid. I still remember the first time I made it to the end of the game and defeated Akuma. I walked through the door, up to Princess Mariko, and she killed me with one solid kick to the face. I had no idea that I wasn’t supposed to walk up to her in a fighting stance!
I got this book as part of a bundle from StoryBundle.com. I thought it sounded a little interesting, but I didn’t think I would enjoy reading someone’s diary. I was very wrong about that. This was by far my favorite book of 2013.
The book brought back a lot of old memories, and it was very interesting to learn exactly how one of my old favorite games came into existence. It was nice to read about the old days when one person could single-handedly create a state-of-the-art video game from scratch.
Larry Niven’s Known Space
It is always so easy to just choose another book in a series that you’re already familiar with, and I’ve read just about every one of Niven’s Known Space and Ringworld books. It was nice to see a couple of new books in this universe show up this year.
Just one Terry Pratchett book
I really enjoy Terry Pratchett’s work, and I always devour his books very, very quickly. That’s why I’m surprised that I only read one of his novels last year. The Long War, the sequel to The Long Earth, was written by Terry Pratchett and Stephen Baxter: two authors whose work I enjoy very much.
I’ve read some complaints about this book. They say that the story doesn’t really go anywhere, and it raises more questions instead of answering questions from the first book. This didn’t bother me at all, because the journey was a lot of fun.
The Long Earth series feels to me like the work of Stephen Baxter with a little bit of Pratchett’s whimsy thrown in, and that’s just fine by me.
My failures of 2013
I hate to give up on a book, but I did so twice this year. Earlier in the year, I started reading Terry Pratchett’s Dodger. I just couldn’t get into it. I put it away after about 60 pages.
I started reading Kim Stanley Robinson’s Blue Mars back in November. I very much enjoyed reading the first two books in the series, but they are quite long and mostly pretty dry. The chapters are long, and you’ll often spend several pages in a row reading descriptions of Martian geography and weather.
This isn’t necessarily a bad thing, but I knew that if I started reading it, I wouldn’t be finished before the end of the year. I knew I was going to be traveling, so I thought it would be better to be reading something lighter. I only made it a dozen pages in before I decided to read Betrayer of Worlds instead.
Plans for 2014
I decided to start off the year by continuing my journey through Terry Pratchett’s Discworld series. We’re barely a week into the new year, and I’m already down to the very end of Thief of Time. Susan has already saved the day, and the story is winding down.
I’m going to do my very best to write a little about each book I read this year, and I’m going to try to do it shortly after I finish each one. These posts are more for my benefit than anyone else’s, so I’m going to try not to post them too often!
Earlier in the year, I posted a short list of resolutions for the year ahead. Most of them were really just tasks or projects that I was hoping to work on this year. The year is now rapidly winding down, so I thought this would be a good time to look back and see if I did a good job this year.
Generate new personal crypto keys – PARTIAL SUCCESS
I did generate new ssh private keys this year. Twice. I actually did it the first time specifically to satisfy this particular new year’s resolution. Later in the year, I built a new computer. That meant I needed an additional ssh private key, and I needed to push that out to all the computers that I ssh into.
Pushing out two new keys is just as easy as pushing out a single key, so I generated even more secure ssh keys for each of my machines. I now feel ever so slightly more secure.
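Rotating a key boils down to two commands. This is a generic sketch rather than my exact procedure: the key type, size, filename, and host are all placeholder choices.

```shell
#!/bin/sh
# Generate a new, stronger key alongside the old one, then push the
# public half out to each remote machine.

rotate_ssh_key() {
    keydir=$1
    # 4096-bit RSA; -N '' skips the passphrase purely for illustration
    ssh-keygen -q -t rsa -b 4096 -N '' \
        -f "$keydir/id_rsa_new" -C "rotated-$(date +%Y)"
    # Then, for each machine you log into, something like:
    #   ssh-copy-id -i "$keydir/id_rsa_new.pub" user@host.example.com
}
```

Only after the new public key is everywhere does it make sense to retire the old one, which is what makes pushing two keys at once so painless.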
I have not yet created a new GPG key. This is a bit embarrassing for me. By the time I got around to thinking about doing this, Edward Snowden had already started leaking information about cryptographic weaknesses. I really need to research which ciphers are safe in a post-Snowden world, but I haven’t done that yet. I’ll just have to add this to next year’s list!
Rework my persist system – FAIL
I love my little persist system. I use it to manage my configuration files and keep them stored away in a Git repository that lives inside one of my Seafile libraries, so all my random config files are already quickly and automatically synchronized between my laptop and desktop. It works really well. I don’t run the Seafile client on any of my servers, but this has me covered in the two places I care about the most.
The problem is that I built my persist system in place. I manually created the Git repository where the config files are stored, and the scripts are all hard-coded to point to that repository. They also assume that repository exists. It doesn’t help that I decided to make my persist repository a submodule of my Prezto repository.
I need to rework it quite a bit. It needs to create that repository for you automatically. The trouble here is that I’m lazy, and it is already working just fine for me. I seem to always have something else that I’d rather work on, and I don’t want to tear out something that works just to replace it with something almost identical.
This is holding up my short series of posts regarding the cleanup of my shell environment. One of the goals of that cleanup was to have a simple way to push my rather large Zsh configuration out to new workstations and servers.
The utility of that feature has diminished quite a bit since I started using Seafile. These files are now automatically copied around from machine to machine for me, and that has reduced my level of motivation quite a bit.
I haven’t really found anything quite like my persist system. It has been quite a while since I shopped around for something like this, though. Maybe I will luck out and find something even better. If I don’t, maybe I’ll be more motivated to make my own setup usable by the general public!
There’s not much to say here. I made updates to zsh-dwim often enough that I wrote five short posts about it this year.
I’ve slowed down quite a bit on this in the second half of the year, but I’ve already crossed off almost every item on my zsh-dwim to-do list. The ones that I haven’t crossed off are either very strange, or ended up not being very intuitive.
I think I’ve scratched all of my own zsh-dwim itches. I’m most likely to be doing sysadmin-related tasks. I bet someone could think of some transformations that would help streamline a programmer’s workflow. I just don’t have the data to make that happen.
Build a web interface for the arcade cabinet – FAIL
I probably didn’t think about this one a single time after January. I had this idea late last year when my friend Brian ordered a whole mess of NFC tags. It sounded like a pretty neat idea at the time, but I don’t think I’d get much mileage out of it.
I’m going to scratch this one off the list for now.
Buy fewer games for my arcade cabinet, spend more time playing them instead – FAIL
I didn’t buy a single game for the arcade cabinet during the first few months of the year. After that, I decided it was alright to buy games for the arcade cabinet if they had native Linux ports.
By summer, I was even buying some Windows-only games again. I buy almost everything that shows up on IndieGameStand that has a Linux port, and I’ll happily pay a dollar for any arcade-style, Windows-only game they put up for sale. If I’m lucky, they’ll run under Wine. If I’m unlucky, I’m just out a dollar.
What’s in store for next year?
I’m not sure, but I still have dozens of hours to think about it! I’d like to do something fun with one of my Arduinos, and I definitely have to build a new stand for my new 27-inch monitors.
I would also like to add another chair to my home office: something comfortable to lounge on when I’m reading, or when I’m watching movies or playing games on my arcade cabinet’s new TV. This is probably the smallest home office I’ve ever had, and it is starting to feel a bit cramped in there. It might be difficult to find something that fits, but I’m going to give it a try.