My Year in Books 2013


I was hoping to avoid writing another post like this again this year. While I was writing the last one, I decided that it would be a much better idea to write up a short review each time I finish a book.

That didn’t happen, though. I didn’t read anything at all until March, and I blasted straight through three books in about four weeks’ time. By then it was just too late. I’m going to try to do a better job this year.

I may not have done a good job writing about these books, but I sure did a good job keeping track of them. I meticulously entered all my data into my Goodreads.com account as I was reading.

I’m very pleased with Goodreads. Each month, they do a pretty good job of warning me that new books have been released by authors that I’ve previously read. I might have missed Fate of Worlds and The Long War this year if it weren’t for these monthly alerts.

The list

Lots and lots of John Scalzi

This happens to me all the time. I try my very best not to read books by the same author back to back. Sometimes I just can’t pick out my next book, and I get lazy. It is easy to just pick another book from the same author’s catalog. Old Man’s War happened to be in the first Humble eBook Bundle, and that was a good enough reason for me to read it.

I enjoyed it quite a bit, and the Internet was buzzing about Redshirts, so it seemed like a good choice for my next book. The premise sounded great, and I really liked the first two thirds of the book. In my opinion, this is right around where the book probably should have ended. This is probably the reason I didn’t get back to Scalzi’s books until October.

I’m glad I picked back up where I left off. I had a good time reading the next two books in the Old Man’s War series, and I look forward to reading more this year.

I should also mention that I find John Scalzi’s ridiculously large recliner very intriguing.

The Making of Karateka

Karateka was one of the first games I played on my Apple II when I was a kid. I still remember the first time I made it to the end of the game and defeated Akuma. I walked through the door, up to Princess Mariko, and she killed me with one solid kick to the face. I had no idea that I wasn’t supposed to walk up to her in a fighting stance!

I got this book as part of a bundle from StoryBundle.com. I thought it sounded a little interesting, but I didn’t think I would enjoy reading someone’s diary. I was very wrong about that. This was by far my favorite book of 2013.

The book brought back a lot of old memories, and it was very interesting to learn exactly how one of my old favorite games came into existence. It was nice to read about the old days when one person could single handedly create a state-of-the-art video game from scratch.

Larry Niven’s Known Space

It is always so easy to just choose another book in a series that you’re already familiar with, and I’ve read just about every one of Niven’s Known Space and Ringworld books. It was nice to see a couple of new books in this universe show up this year.

Just one Terry Pratchett book

I really enjoy Terry Pratchett’s work, and I always devour his books very, very quickly. That’s why I’m surprised that I only read one of his novels last year. The Long War, the sequel to The Long Earth, was written by Terry Pratchett and Stephen Baxter: two authors whose work I enjoy very much.

I’ve read some complaints about this book. They say that the story doesn’t really go anywhere, and that it raises more questions instead of answering the ones from the first book. This didn’t bother me at all, because the journey was a lot of fun.

The Long Earth series feels to me like the work of Stephen Baxter with a little bit of Pratchett’s whimsy thrown in, and that’s just fine by me.

My failures of 2013

I hate to give up on a book, but I did so twice this year. Earlier in the year, I started reading Terry Pratchett’s Dodger. I just couldn’t get into it. I put it away after about 60 pages.

I started reading Kim Stanley Robinson’s Blue Mars back in November. I very much enjoyed reading the first two books in the series, but they are quite long and mostly pretty dry. The chapters are long, and you’ll often spend several pages in a row reading descriptions of Martian geography and weather.

This isn’t necessarily a bad thing, but I knew that if I kept reading it I wouldn’t be finished before the end of the year. I knew I was going to be traveling, so I thought it would be better to be reading something lighter. I only made it a dozen pages in before I decided to read Betrayer of Worlds instead.

Plans for 2014

I decided to start off the year by continuing my journey through Terry Pratchett’s Discworld series. We’re barely a week into the new year, and I’m already down to the very end of Thief of Time. Susan has already saved the day, and the story is winding down.

I’m going to do my very best to write a little about each book I read this year, and I’m going to try to do it shortly after I finish each one. These posts are more for my benefit than anyone else’s, so I’m going to try not to post them too often!

Looking Back at My 2013 New Year’s Resolutions


Earlier in the year, I posted a short list of resolutions for the year ahead. Most of them were really just tasks or projects that I was hoping to work on this year. The year is now rapidly winding down, so I thought this would be a good time to look back and see if I did a good job this year.

Generate new personal crypto keys – PARTIAL SUCCESS

I did generate new SSH private keys this year. Twice. I actually did it the first time specifically to satisfy this particular new year’s resolution. Later in the year, I built a new computer. That meant I needed an additional SSH key pair, and I needed to push the new public key out to all the computers that I ssh into.

Pushing out two new keys is just as easy as pushing out a single key, so I generated even more secure SSH keys for each of my machines. I now feel ever so slightly more secure.
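Generating and distributing a fresh key only takes a couple of commands. This is a generic sketch rather than my exact invocation, and 4096-bit RSA is just one reasonable choice:

# Generate a new, stronger key pair with its own file name.
ssh-keygen -t rsa -b 4096 -f ~/.ssh/id_rsa_new

# Append the new public key to authorized_keys on each remote machine.
ssh-copy-id -i ~/.ssh/id_rsa_new.pub wonko@server.example.com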

I have not yet created a new GPG key. This is a bit embarrassing for me. By the time I got around to thinking about doing this, Edward Snowden had already started leaking information about cryptographic weaknesses. I really need to research which ciphers are safe in a post-Snowden world, but I haven’t done that yet. I’ll just have to add this to next year’s list!

Rework my persist system – FAIL

I love my little persist system. I use it to manage my configuration files and keep them stored away in a Git repository. It works really well, and it helps me keep all my random config files safely in a Seafile library. Since they’re then sitting in one of my Seafile libraries, they already get quickly and automatically synchronized between my laptop and desktop. I don’t run the Seafile client on any of my servers, but this has me covered in the two places I care about the most.
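My actual persist scripts are quite a bit more involved, but the general shape of the idea is something like this sketch. The repository path here is just an example, and it is exactly the kind of thing that shouldn’t be hard-coded:

# A sketch of the general idea, not my actual persist scripts.
PERSIST_REPO="$HOME/Seafile/config/persist"

persist() {
    local file="$1"
    mkdir -p "$PERSIST_REPO/$(dirname "$file")"
    cp "$HOME/$file" "$PERSIST_REPO/$file"
    git -C "$PERSIST_REPO" add "$file"
    git -C "$PERSIST_REPO" commit -m "Update $file"
}

# Usage: persist .zshrc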

The problem is that I built my persist system in place. I manually created the Git repository where the config files are stored, and the scripts are all hard-coded to point to that repository. They also assume that repository exists. It doesn’t help that I decided to make my persist repository a submodule of my Prezto repository.

I need to rework it quite a bit. It needs to create that repository for you automatically. The trouble here is that I’m lazy, and it is already working just fine for me. I seem to always have something else that I’d rather work on, and I don’t want to tear out something that works just to replace it with something almost identical.

This is holding up my short series of posts regarding the cleanup of my shell environment. One of the goals of that cleanup was to have a simple way to push my rather large Zsh configuration out to new workstations and servers.

The utility of that feature has diminished quite a bit since I started using Seafile. These files are now automatically copied around from machine to machine for me, and that has reduced my level of motivation quite a bit.

I haven’t really found anything quite like my persist system. It has been quite a while since I shopped around for something like this, though. Maybe I will luck out and find something even better. If I don’t, maybe I’ll be more motivated to make my own setup usable by the general public!

In any case, I would really like to see my shell cleanup series progress to its third part!

Continue working on zsh-dwim – SUCCESS

There’s not much to say here. I made updates to zsh-dwim often enough that I wrote five short posts about it this year.

I’ve slowed down quite a bit on this in the second half of the year, but I’ve already crossed off almost every item on my zsh-dwim to-do list. The ones that I haven’t crossed off are either very strange, or ended up not being very intuitive.

I think I’ve scratched all of my own zsh-dwim itches. I’m most likely to be doing sysadmin-related tasks. I bet someone could think of some transformations that would help streamline a programmer’s workflow. I just don’t have the data to make that happen.

Build a web interface for the arcade cabinet – FAIL

I probably didn’t think about this one a single time after January. I had this idea late last year when my friend Brian ordered a whole mess of NFC tags. It sounded like a pretty neat idea at the time, but I don’t think I’d get much mileage out of it.

I’m going to scratch this one off the list for now.

arcade cabinet

Buy fewer games for my arcade cabinet, spend more time playing them instead – FAIL

I didn’t buy a single game for the arcade cabinet during the first few months of the year. After that, I decided it was alright to buy games for the arcade cabinet if they had native Linux ports.

By summer, I was even buying some Windows-only games again. I buy almost everything that shows up on IndieGameStand that has a Linux port, and I’ll happily pay a dollar for any arcade-style, Windows-only game they put up for sale. If I’m lucky, they’ll run under Wine. If I’m unlucky, I’m just out a dollar.

What’s in store for next year?

I’m not sure, but I still have dozens of hours to think about it! I’d like to do something fun with one of my Arduinos, and I definitely have to build a new stand for my new 27-inch monitors.

I would also like to add another chair to my home office: something comfortable to lounge on when I’m reading, or when I’m watching movies or playing games on my arcade cabinet’s new TV. This is probably the smallest home office I’ve ever had, and it is starting to feel a bit cramped in there. It might be difficult to find something that fits, but I’m going to give it a try.

Emacs: ELPA and use-package


I have dozens of files from various corners of the Internet in my ~/.emacs.d/lisp/ directory. I’ve tried to keep the important ones up to date, but I’ve never done a very good job of it. The ones that are old and outdated are the lucky ones. Most of them are so old that I don’t even use them anymore.

Emacs now has a rather nice package manager called ELPA, and I started using it over the summer. Converting my existing configuration files has been a bit painful, and the apparent gains from the effort have been pretty small. I’ve only converted a very small number of my configuration files because of this.

My problem with ELPA

My Emacs configuration is pretty well organized. I’ve been using my-site-start to help manage my configuration since 2009. It is very simple, and it works a lot like the SysV init system. It automatically executes all the files in my ~/.emacs.d/site-start.d directory. I try to keep all the configuration for each individual package in its own file. That makes it easy to quickly remove configurations that I don’t need.

Most of these files are very simple and quite similar. They usually have a require statement at the top followed by one or more setq calls. Some contain key binding assignments, but I mostly keep those to a single file.

~/.emacs.d/site-start.d/90git-gutter.el
(require 'git-gutter)

(global-git-gutter-mode t)
(setq git-gutter:always-show-gutter t)

I assumed that I would be able to reinstall my existing packages using ELPA, and my existing configuration files would just continue to work. The reality wasn’t quite that simple. My understanding might not be entirely accurate here, but it shouldn’t be too far from reality.

The packages installed by ELPA aren’t available until after Emacs is completely finished starting up. This means that all of those require statements become errors. That isn’t a problem on its own, but this also means that any call to functions within these packages also becomes an error. At this point, that call to the global-git-gutter-mode function will fail.

I was able to work around this by setting package-enable-at-startup to nil. This seemed like a bit of a kludge. I’m pretty sure this means that I now have to require every package I install.
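The other half of that workaround is calling package-initialize yourself near the top of your init file, so the packages get activated before the old configuration files run. Here is a rough sketch, not necessarily my exact init.el:

;; Near the top of ~/.emacs.d/init.el.
(require 'package)

;; Stop Emacs from activating packages after init has finished...
(setq package-enable-at-startup nil)

;; ...and activate them right now instead, so files like the
;; (require 'git-gutter) example above can find their packages.
(package-initialize)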

John Wiegley’s use-package

I found John Wiegley’s use-package last week, and I am very pleased with it. It manages to solve all my issues with ELPA, and it sure looks like it is going to lead me towards a much cleaner Emacs configuration.

~/.emacs.d/site-start.d/90git-gutter.el with use-package
(use-package git-gutter
  :init
  (progn
    (global-git-gutter-mode t)
    (setq git-gutter:always-show-gutter t)))

I’ve started my journey the lazy way. I replaced all my require calls with use-package calls. Then I just wrapped up my existing code and stuffed it into use-package’s :init section. This was just enough to eliminate my reliance on setting package-enable-at-startup to nil.

I’ve only just scratched the surface of use-package. It also has options for the configuration of key bindings and for populating your auto-mode-alist.
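Here is a quick, hypothetical example of those two features; I’m not actually running this configuration:

;; Register the extension in auto-mode-alist and load markdown-mode
;; the first time a matching file is opened.
(use-package markdown-mode
  :mode "\\.markdown\\'")

;; Defer loading magit until the key binding is actually used.
(use-package magit
  :bind ("C-x g" . magit-status))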

I have a feeling that I’m going to be slowly rewriting all of my Emacs Lisp files over the next few weeks!

Six Months with Seafile


Quite a bit has happened since I reported in September. Seafile 2.0 was released, and that release just happened to coincide with a hardware failure. That was actually pretty convenient for me, since I wanted to rebuild my libraries with Seafile 2.0’s improved encryption.

Seafile server bandwidth usage

I was really hoping to have at least a couple months’ worth of good bandwidth reports by now. Unfortunately, I’ve had reason to upload large amounts of data every single month since I started using Seafile. My data went up one month, and Chris’s data went up the next. I had a hardware failure the month after that, and I had to upload my data all over again to the new Seafile 2.0 server. We didn’t get around to uploading Chris’s data again until last month.

My Seafile VPS Statistics for December 2013

We’re not quite two weeks into December, but I haven’t had a reason to upload any libraries from scratch yet. Chris and I have only used about 1.5 GB of bandwidth so far this month. We’ll probably finish the month out with well under 4 GB of bandwidth use. I’ll come back and replace this paragraph in a few weeks when I know the final total.

I’m very happy with that. It is barely a blip on the radar. I could almost fit numbers like these into our tiny, inexpensive mobile data plans.

Update: I’m counting down the final hours of 2013 and updating the Seafile server’s bandwidth screenshot image. It looks like we came in at 2.74 GB of bandwidth usage for the month of December. I will be quite pleased if most months end up looking like this!

Seafile server disk usage

I have a total of 11 GB of data in 69,022 files spread across twelve separate libraries. The largest library holds 22,347 files. These libraries are all kept in sync on both my desktop and laptop.

Chris has a single 31 GB library containing 14,596 files, and she has only one computer.

Seafile server disk usage

All the libraries are configured to expire deleted files after 90 days.

Seafile performance

When I first started using Seafile, I was only using a single computer with a fast solid-state drive. I had no performance issues at all. I could boot my computer up, and Seafile would finish checking my local libraries for changes before I could even think to look.

When I bought my new workstation, I had to put a traditional 7200 RPM hard drive back in my laptop. The laptop is now excruciatingly slow for two or three minutes after booting up.

My start-up scripts are pretty aggressive. They simultaneously launch all the long-running applications that I use all day long: Emacs, Chrome with a bunch of tabs, Pidgin, Thunderbird, and LibreOffice. These would all be ready to use in just a few seconds with the solid-state drive. They take almost a full minute to get up and running on the old, slow, spinning disk.

Now it takes Seafile almost three minutes to scan my libraries. That is certainly not a ridiculous amount of time to check nearly 70,000 files. It isn’t what I’m used to, though.

The server-side requirements are also extremely light. The virtual server that I’m running Seafile on is configured with only 256 MB of RAM and 96 MB of swap space, and half of that is being used as disk cache. Running in such a small virtual machine hasn’t been a problem at all.

Seafile sync speed

I was pleasantly surprised a few weeks ago. I had the same file open on my laptop and desktop. I made some edits on my desktop and walked away. When I got back, I sat down at my laptop and was surprised to see my changes staring back at me!

By default, Emacs prompts you when it notices a file has changed on disk. I had completely forgotten that I had configured it to automatically reload when a file changes on disk. I was bouncing back and forth between computers as though I were accessing a share on a file server!
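If you want the same behavior, Emacs’ auto-revert feature is the usual way to get it:

;; Automatically reload any buffer whose file changes on disk,
;; as long as the buffer has no unsaved changes of its own.
(global-auto-revert-mode 1)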

Of course, it isn’t quite as fast as a file server. It takes about 20 to 30 seconds for the change to be detected, uploaded to the server, and downloaded to the other computer. That is definitely fast enough for me, and it is very nice to be able to pick up working right where I left off, even if I forget to commit my work in Git before I walk away.

Testing my Seafile backups

I’ve been procrastinating. I back up my Seafile server every day, but I haven’t gotten around to restoring any of those backups to a test server. Since an untested backup may not be a backup at all, I figured it was time to stop procrastinating.

Testing the backups was really simple. Each of my daily backups is stored in a btrfs snapshot. All I had to do was chroot into them one at a time, restore the database, and fire up the Seafile service, following along with the instructions in the Seafile backup and restore documentation.

Daily Virtual Machine Backup Snapshots (Temporarily)
wonko@backup1:~$ sudo btrfs subvolume list /mnt/vz-rsync/
ID 439 gen 352 top level 256 path .snapshot/rsync_2013-12-03_23:34:16
ID 442 gen 357 top level 256 path .snapshot/rsync_2013-12-04_14:57:11
ID 448 gen 371 top level 256 path .snapshot/rsync_2013-12-05_16:03:00
ID 480 gen 444 top level 256 path .snapshot/rsync_2013-12-06_15:41:10
ID 509 gen 426 top level 256 path .snapshot/rsync_2013-12-07_07:18:05
ID 515 gen 457 top level 256 path .snapshot/rsync_2013-12-08_01:39:49
ID 520 gen 462 top level 256 path .snapshot/rsync_2013-12-09_01:12:41
ID 526 gen 468 top level 256 path .snapshot/rsync_2013-12-10_01:17:34
wonko@backup1:~$

The first one took the longest to test. Neither Seafile nor Nginx wanted to start up until I mounted the proc file system in the chroot. It was easy to test a few random days once I got that problem squared away.
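The whole procedure boils down to something like this rough sketch. The snapshot path, database names, and the Seafile install location are illustrative rather than copied from my server:

# Bind-mount the virtual file systems the services expect, then chroot in.
SNAP=/mnt/vz-rsync/.snapshot/rsync_2013-12-10_01:17:34
sudo mount --bind /proc "$SNAP/proc"
sudo mount --bind /dev "$SNAP/dev"
sudo chroot "$SNAP" /bin/bash

# Inside the chroot: bring up the database, load the dumps, start everything.
service mysql start
mysql ccnet_db < /root/backup/ccnet_db.sql
mysql seafile_db < /root/backup/seafile_db.sql
mysql seahub_db < /root/backup/seahub_db.sql
/opt/seafile/seafile-server-latest/seafile.sh start
/opt/seafile/seafile-server-latest/seahub.sh start
service nginx start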

A convenient bonus

Cloud storage software like Seafile is a great way to help protect yourself from ransomware like CryptoLocker. If I notice that my library has been corrupted, I can just click a few buttons and restore the entire library to a previous state. I can go back to yesterday, or last week, or as far as I need to go.

I’m using Linux, so I’m not a target of CryptoLocker. Ransomware is some pretty insidious stuff, though, and storing or backing up your data in the cloud is a good way to keep yourself safe.

You don’t have to host your own cloud backups to get a good deal. I’ve heard good things about Backblaze, and I’m looking for an excuse to build one of their awesome, big, red storage servers. They don’t seem to support Linux, though. I’ve also heard good things about Crashplan, and they do support Linux.

Seafile’s own service, Seacloud, has pretty good pricing on their “team” packages. You can sign up for a single account, create libraries for multiple users, and let everyone share the same storage quota. That seems like a pretty good value to me.

There’s a lot of overlap between “cloud storage” and “cloud backup.” “Cloud storage” is generally aimed towards syncing your data between multiple machines, while “cloud backup” is geared more towards moving your data from one computer up to the Internet. If you just want to keep your data safe, then one of the various backup services is probably a better value for you.

The verdict

I had a lot of cloud storage options to choose from six months ago, and I definitely made the right choice. Seafile has just the right combination of encryption, performance, and storage efficiency to meet my needs, and it has made the process of using multiple computers a much more pleasant experience.

The Seafile 2.x Roadmap lists some interesting features due out in the next couple of months. I’m looking forward to the improved syncing speed. Improved performance will most likely let me squeeze a little extra life out of my laptop’s new battery.

Gift Ideas for Geeks


Buying gifts is hard work. It is that time of year again, and I’m working hard trying to figure out exactly what to buy for everyone. I thought I might be able to help someone else out if I made a list of some of my favorite toys and gadgets. My friend Brian also thought this was a good idea, and he posted a list of his own this year.

I already own almost everything on this list. That means that you shouldn’t buy me anything on this list for Christmas, because I already have it. It also means that I’ve actually used all this stuff, and I am giving it my official seal of approval.

There are a few things on this list that I don’t own. For instance, I only have one of the listed USB battery packs. I can only put my seal of approval on the smallest of the three, but I was worried that its size might not be ideal for everyone.

USB Battery Pack Chargers ($9 to $50)

External USB battery packs should come in handy for any geek who needs to keep their gadgets powered up while on the go. I ordered my first external battery pack last week, and I decided to get one of the smallest and most inexpensive models I could find.

It is the Swift Gear Mini 2600 mAh battery pack. It is a light, four-inch tube, small enough that I won’t even notice it in my jacket pocket. It is about the size of a small flashlight. In fact, it is a small flashlight, and a surprisingly bright one at that.

Swift Gear 2600mAh battery pack charging my Nexus 7

I was a bit worried that a 2600 mAh battery pack might not have enough capacity, but I’m happy with the choice I made. It is able to bring my Nexus 4 from 14% charge all the way up to 84% in less than two hours. That is enough to buy me a few extra hours of tethering, which will definitely be helpful next time I’m stuck waiting in an airport. It also manages to bring my 2012 Nexus 7 tablet up to 43% from the low battery warning at 14%.

I also included two larger battery packs in my list. I haven’t used these specific models myself, but they were other models I was considering before I decided what size I wanted. Both of the larger battery packs are capable of charging two devices at a time, and their larger capacity would be handy if you were trying to charge a more power-hungry device like a tablet.

Arduino Starter Kits ($55 to $130)

The Arduino is a nifty little hardware prototyping platform, and it is a great way to dip your toe into the world of hardware development. An Arduino board all by itself isn’t very useful. When my first Arduino board arrived, the first thing I did was program it to blink an S.O.S. on its built-in LED.

This isn’t very exciting at all. You need other electronic components if you want to do something interesting. You need parts like LEDs, resistors, buttons, motors, and buzzers. The easiest way to get going is to buy an Arduino starter kit.

I pieced together my own starter kit, but that wouldn’t make a very good gift. The Official Arduino Starter Kit and the Sparkfun Inventor’s Kit are both good choices, and they’re pretty comparable. The official kit seems to come with a larger printed guide, while the kit from Sparkfun comes with a nice storage case.

Of the two, I think Sparkfun’s Inventor’s Kit is a better gift and a better value. Sparkfun’s carrying case is a nice touch, and their holder for the Arduino and breadboard looks pretty convenient.

If you’d like to save money, you can go with a more generic kit. This Arduino Uno Ultimate Starter Kit is about half the price of the other two kits. It may have fewer components than the other two kits, but it definitely provides a better “bang for the buck.”

My Favorite Multitool ($80)

I have had my Swiss Army CyberTool for at least 12 years now, and I would be lost without it. I actually received mine as a Christmas present, and it is the perfect multi-tool for a geek like me who is always building computers and taking apart servers.

The thing that puts the CyberTool 34 above most other multitools is its bit driver. It comes with Torx bits, which are handy if you run into any HP servers. The bit driver itself is also the correct size to fit those brass motherboard standoffs.

A lot of people prefer the Leatherman-style tools, and I own a similar Gerber multitool. These are also very handy tools, and I’ve used mine quite a bit, but they’re all centered around that giant pair of pliers. I just don’t need pliers very often. If your geek is anything like me, he’ll get a lot more mileage out of the Victorinox CyberTool 34.

Laboratory Beaker Mug ($15)

This is probably one of the best birthday presents that I’ve gotten in a long time. First off, it is just really awesome looking. It is pretty much an actual beaker with a handle molded on to the side, and that is just really cool.

A latte in my laboratory beaker mug

I use it every day when I make myself a latte. It is the perfect vessel for that, too. The thin glass doesn’t instantly cool down my drink like standard ceramic mugs do, and being able to see the layers of espresso, milk, and foam through the side is a nice bonus.

Do you have any ideas for fun, geeky gifts? I’d sure like to hear about them!

The Ipega Bluetooth Wireless Controller (PG-9017)


I’m going to be away from home for a few months, and I’ve picked up a few items to make myself a little more comfortable while I’m gone. My laptop battery was dead, so I thought it would be a good idea to replace that before I left.

While I was shopping for a new laptop battery, I also noticed that Amazon has quite a few Bluetooth gamepads that clip onto your cell phone. I’m a huge fan of old-school video games, and I’m going to be spending a lot of time in waiting rooms next month, so I figured I would pick one up.

I usually pack a Wii remote with me when I travel. They’re pretty small, I can use them with my phone, tablet, or laptop, and I always have a spare Wii remote lying around. The Ipega gamepad is a pretty big upgrade over the old Wii remote.

Expectations

I decided to go with one of the most inexpensive clip-on gamepads: the Ipega PG-9017. They’re selling for right around $20, so my expectations really weren’t all that high. The reviews made it sound like it is cheaply made and incompatible with a lot of Android devices.

My biggest worry was the d-pad. The pictures made it look like it had the terrible, round, Xbox-style d-pad. I figured that even if the d-pad ended up being awful, it would still be a huge upgrade over touch controls.

Bluetooth Pairing
Super Mario Bros. 3 on the NES
Super Mario World on the SNES

Reality

The Ipega PG-9017 is very light, and a lot of people probably assume that means it is shoddy. I don’t think this is necessarily true. I’ve tried bending the controller; when I put the same amount of force on my Sony Sixaxis controller, it starts to creak a bit. The Ipega doesn’t make a sound.

The spring-loaded clip surprised me the most. It seems pretty sturdy, and the spring has no trouble holding my Nexus 4. I did not think it would be a good idea to try to see how much flexing the clip could stand up to. It is just a couple of interlocking pieces of plastic, but it is more than enough to get the job done.

The d-pad is nothing short of excellent. It has a Nintendo-style, plus-shaped d-pad, and it feels great. I’ve been playing a ton of Zanac, and it feels almost exactly like it did on the NES. The round bits that I perceived to be part of the d-pad in the pictures are molded into the shell of the controller. They’re not part of the d-pad at all.

There is no noticeable lag, either. I would never be able to play Zanac so well if there were, and I would be missing all sorts of jumps in Super Mario Bros!

The analog sticks are indeed analog. I used them to play a bit of Mario 64 and Starfox 64. They’re a little stiff, and they’re not terribly accurate, but they work well enough for the most part. The shoulder triggers are a bit hard to use, because they’re right up under the phone.

There’s one massive flaw

The Bluetooth implementation seems to be broken in a very strange way. I’ve paired it with my laptop, a Nexus 4, a Nexus 7, and a Samsung Galaxy S. It works just fine on all of them. I played a bit of Super Mario Bros. on every device.

Once you turn off the gamepad, it just won’t reconnect. I have to go into the Bluetooth settings on the phone, delete the gamepad, and pair it back up again. This is very annoying.

I don’t know what I changed, but I had it working correctly for a few days. All I had to do was hit the “home” button on the Ipega gamepad, and I could immediately start playing. Then I decided that I should pair it up to the laptop before writing this post. It hasn’t worked correctly again since.

The verdict

I was really hoping that I could recommend this device. The d-pad feels great. The buttons feel good, even if they’re a bit too tightly spaced for my big hands. The clip works great. This should be an awesome gamepad for any retro gaming fan.

Unfortunately, the Bluetooth pairing issue seems like a showstopper. I already have the controller, so I’m going to just deal with it, but I wouldn’t want to inflict this on anyone else.

I’ve since learned that there is a newer revision of this controller, the Ipega PG-9018. I’d really like to know if the newer model corrects this problem.

Do you have a gamepad for your phone? Which one is it, and how is it working for you?

Buying a Third-Party Replacement Battery for My HP DV8T Laptop


Update: I’ve had this battery for over 18 months now, and it is still running like a champ.

Update 2: I’ve had this battery for two and a half years. If I’m doing light work, it will still run my laptop for almost four hours. However, if the CPU and GPU are working hard, and I hear the fan working overtime, I’m likely minutes away from randomly losing power. There must be a dying cell in there, because it just can’t supply enough voltage anymore. I’m still pleased with my choice of battery. It got me through a few years, and my laptop is starting to feel pretty ancient now anyway.

The 8-cell battery that came with my HP DV8T laptop has been nearing the end of its life since the beginning of the year. When I was traveling back in March, I was lucky if I could get thirty minutes of use out of it. I can’t get much done in half an hour, but it was more than enough juice to let me move the laptop from one outlet to another.

The battery completely gave out at some point in the last few months. I haven’t been using my laptop all that often since buying my new desktop machine, so I hadn’t given it much thought at all. I’m going to be traveling for a few months, and that seemed like a good reason to pick up a fresh battery.

How do I choose a battery?

There are quite a few choices. The most expensive option is a replacement battery from the laptop manufacturer. This seems like a good idea; the laptop manufacturer is likely to be using high-quality parts in their batteries. The problem is that you don’t know how long those batteries have been sitting on a shelf, and lithium ion batteries deteriorate quite a bit if they’re not cycled regularly.

Then there are really cheap third party replacement batteries. I saw 8-cell batteries for my HP DV8T for as little as $20 shipped. I’ve bought cheap batteries like this before, and the reviews at Amazon seemed to match up pretty closely with my experiences. The ones I bought didn’t have as much capacity as the original battery, and they ended up turning into doorstops within about twelve months.

This worked out just fine for me in the past. Those laptops were nearing the end of their useful life, and I just needed them to limp along until it was time for an upgrade. A battery like that isn’t a good fit for me this time, since I don’t plan on replacing this laptop anytime soon.

There were some more expensive third party batteries at Amazon that came with 18-month warranties. The reviews on these batteries were much better than those on the $20 batteries, and I figure that if they’re willing to stand behind their product for 18 months, then these batteries should last longer than just a few months.

Stock 8-cell Battery, dated about 1 month after purchase
LB1 12-cell battery
The new battery adds a bulge to my laptop

My new laptop battery

I decided that I was going to buy an LB1 brand “High Performance Battery”. They seem to make batteries for other brands of laptops, like Dell, Asus, and Toshiba. I found an 8-cell battery for $35 and a 12-cell battery for $45 for my HP laptop, both of which were eligible for Amazon Prime.

My laptop is very power hungry, so I just couldn’t resist the upgrade to a 12-cell battery. I’m not worried about the extra weight, since this is already a gargantuan nine-pound laptop anyway. It also seemed like a good value; it only cost me 28% more to get 50% more battery!

The larger battery adds a small bulge that raises the back of the laptop off the desk a bit. I’m sure some people will be happy that this gives the laptop better airflow. I’m not one of those people, but the extra height surely won’t cause me any problems.

The early results

The colorful brochure that came with the battery recommended that I fully charge and discharge the battery 2 to 6 times. They obviously don’t know me very well. I would have done this anyway just to see how long it would last!

When I first bought this laptop, with a factory-fresh battery and a solid-state drive still inside, the best I could manage was a bit more than 150 minutes of run time on a charge. I would have been happy to beat that by anything more than an hour. If we can trust the math, I should be able to hit 225 minutes now: the 12-cell pack has 50% more cells than the old 8-cell battery, and 150 minutes times 1.5 is 225 minutes.

I wasn’t expecting to do that well, though. I’ve since swapped the SSD for a 7200 RPM spinner, and the power-saving tweaks I used to have set up are gone. This is just a bone-stock Xubuntu 13.10 install now.

I was extremely surprised by this new battery. I charged it up and ran it dry for the first time yesterday. The laptop had all my usual applications up and running. The screen brightness was set to about half, and I left it sitting idle almost the entire time. It took 268 minutes to empty the battery. That’s nearly two hours more than I ever got out of the original battery!

The second discharge cycle didn’t go quite as well; it only ran for 238 minutes. This was entirely my own fault. I threw off the results by running some disk-intensive operations while the battery was draining.

The results were even better on the third discharge cycle. It managed a whopping 282 minutes. It was good to see that the first discharge wasn’t a fluke.

30-day update

Things are a bit less scientific this time. I’m traveling, and I don’t have my desktop with me, and it isn’t easy to leave my only available computer alone for four solid hours. Even so, I’m not having any trouble breaking the four-hour mark. I’m going to say that the battery is still working just fine.

I am noticing a small physical problem with this 12-cell battery. The larger battery raises the back of the laptop off the desk, and the battery is off center. I can see the screen shaking around a bit while I type. I’m sure that there’s more to blame than just the battery. I am not terribly gentle with the keyboard, and this old particle-board desk from the eighties isn’t nearly as stable as it once was.

18-month update

The battery is still doing quite well after 18 months. I just ran the laptop from a full charge, and it kept running for 259 minutes. This fits in well with the numbers I was getting when the battery was new. I definitely made the right choice.

Will the battery still be good in 18 months? How about in three years?

It is too soon to tell, but I have a lot of confidence in this battery now. The label on the new battery says that it has 50% more capacity than my original battery, while it is able to power my laptop about 70% longer. I think that’s a very good sign, but I’ll only know for sure with time.

Are you using a third-party replacement battery in your laptop? How is it working out for you?

Lightening My Laptop Bag With DriveDroid


I have some rack-mount servers out in the world, and none of them have optical drives. You never know when something is going to go wrong, so I always carry a USB DVD drive and a stack of various discs in my laptop bag. I know that all of these servers should be new enough to boot off of USB flash drives, but I don’t want to run into any weird problems while I’m out in the field. It also doesn’t help that CentOS 5 didn’t seem all that happy about being installed from a USB flash drive.

I don’t do SysAdmin style work very often these days, but you might think otherwise if you looked in my laptop bag. I have all sorts of things in there that I rarely use, like my trusty Swiss Army Cybertool 34, a network cable tester, and an RJ45 crimping tool. I’m trying to cut down on the weight a bit, and a recent hardware failure gave me the perfect opportunity to test out DriveDroid.

What does DriveDroid do?

DriveDroid is an Android app that allows you to use your phone as a virtual CD-ROM drive. DriveDroid can make use of any CD or DVD iso image stored on your phone, and it can even be used to download various Linux and FreeBSD images.

DriveDroid screen shot

My oldest server had no trouble booting from DriveDroid. I didn’t expect to have any problems. I originally configured that server using an external USB DVD drive, and DriveDroid emulates a CD-ROM drive.

My phone isn’t big enough for all these disk images

My Nexus 4 is only the 8 GB model. I’ll never actually fill that under normal circumstances, but it is a little too tight for storing CD and DVD image files. I’m not using my old Samsung Galaxy S anymore, and it has plenty of extra room.

I decided to wipe the old phone clean and install DriveDroid. That way I can keep it in my laptop bag all the time. It is significantly smaller and lighter than the DVD drive and 10-disc case that it replaced.

Old vs. new

I have two small problems with CentOS and DriveDroid. The FAT32 file system on the Galaxy S can’t hold files as large as the CentOS DVD; FAT32 tops out at 4 GiB per file, and the DVD image is bigger than that. I copied the CentOS 6 “netinstall” and “minimal” disk images to the phone instead. Either of those will be more than enough to meet my needs.

I also had trouble with the CentOS 6.4 and 6.3 “minimal” DVD images; they just don’t want to boot for me. Their MD5 checksums look just fine, and I have no trouble booting the live DVD and “netinstall” images.

Carrying a working “netinstall” image will do well enough for my purposes.

DriveDroid isn’t just for servers

I’ve used DriveDroid quite a few times. I used it to reload my laptop after I moved its solid-state drive to my new desktop a few months ago, and I used it shortly after that to set up Chris’s new computer.

Earlier in the year when I was up north visiting my parents, I reloaded their old laptops using DriveDroid. It definitely came in handy that time because one of the DVD drives was acting up.

I don’t expect that I’ll be installing operating systems using optical drives or flash media ever again. DriveDroid is just too convenient.

Upgrading from (X)Ubuntu 13.04 to 13.10


I decided to upgrade my laptop and desktop to Ubuntu 13.10, the Saucy Salamander, this weekend. I don’t run Unity, so I expected things to go pretty smoothly. I’m pretty sure that the operating system on my desktop was installed using the “alternate” Ubuntu 12.10 installation media. If I am remembering correctly, that was a requirement at the time if you wanted to use whole-disk encryption.

The operating system on my laptop was installed more recently using the regular Xubuntu 13.04 installation media, and I performed that installation using DriveDroid. Both machines are running Xfce and using Sawfish as the window manager.

Things went pretty smoothly, but there were a few small obstacles.

The xorg-edgers PPA blocks the upgrade

The first time I tried the upgrade, update-manager gave me the error “Could not determine the upgrade,” and it advised me that unofficial software packages might be blocking the upgrade. I took a look at the logs, and I found a lot of problems with video and x.org related packages mentioned in /var/log/dist-upgrade/apt.log.

I’m using x.org and Nvidia driver updates from the xorg-edgers PPA on both of my machines. This certainly counts as unofficial software, and it is most definitely video related. I used ppa-purge to downgrade to the stock versions of these packages.

Removing the xorg-edgers PPA
wonko@zaphod:~$ sudo ppa-purge ppa:xorg-edgers/ppa

The upgrade went smoothly once the PPA was disabled and all of its software was removed.

The Nouveau driver hates my monitors

The upgrade completely removed the proprietary Nvidia driver and stuck me with the open-source Nouveau driver. My new QNIX QX2710 monitors are very picky; their EDID information is broken, and they will pretty much only run at a resolution of 2560x1440. I have some specific configuration in my xorg.conf to account for this.
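As a rough illustration of the kind of thing that ends up in there, you tell the driver to ignore the EDID and hand it a mode line yourself. This is a sketch, not my exact xorg.conf:

Section "Monitor"
    Identifier "QX2710"
    # Mode line generated with `cvt 2560 1440 60`, since the EDID can't be trusted.
    Modeline "2560x1440_60.00"  312.25  2560 2752 3024 3488  1440 1443 1448 1493 -hsync +vsync
EndSection

Section "Device"
    Identifier "nvidia"
    Driver "nvidia"
    # Keep the proprietary driver from validating modes against the broken EDID.
    Option "UseEDID" "FALSE"
    Option "ModeValidation" "NoEdidModes"
EndSection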

I’m sure some of those configuration options are specific to the Nvidia driver. The simplest thing for me to do was to just install the proprietary drivers again and reboot.

Installing the (slightly outdated) Nvidia drivers
wonko@zaphod:~$ sudo apt-get install nvidia-319

The reboot was required because the Nouveau kernel module was being used to drive the text consoles, so I couldn’t manually remove it from the running kernel.

I’m not sure exactly what I did differently on the laptop, but the proprietary Nvidia driver was already installed on there after the update. It was the old 304.88 version, though.

The best upgrades are the ones you don’t notice

Aside from the small driver issues, which were really my own fault, everything went smoothly. There were no issues with my encrypted root file system, my RAID 1 came up just fine, and no ugly window grips showed up in my terminal windows. In fact, everything looks exactly like it did before the upgrade.

Some Retro Gaming Nostalgia and the Konami Code


One of my oldest friends, from all the way back in elementary school, was in town last weekend. My friend Jimmy wasn’t in Dallas very long this time, but he did stop by the house for a few hours. I didn’t remember to drag him into the home office to see the arcade cabinet until he was fixin’ to leave.

I showed him the menus, and I shuffled through the list of games and systems for him. Then he said to me, “Do you have Contra on this thing?” Of course I have Contra on this thing!

Split screen Contra

I’ve only been playing lonely, single-player games on the arcade machine for the last year or two, so I had to set things back up for two-player cocktail-mode split-screen. The player-one joystick was acting up a bit, too, but some aggressive back and forth action on the controller straightened that right out, and before we knew it, we were playing some co-op Contra just like in the old days!

Back in the day

I specifically remember hauling my Nintendo down to Jim’s house one time, and hooking it up to the TV in his bedroom. This was back when Contra was still relatively new. There were probably four of us crowded around that little old TV that night: me, Jim, Marc, and Chad. I’m certain that we made it to the end of the game that night.

I’m not sure if we had to continue or cheat that time, but we most definitely finished the game. We were much better at this type of game back then.

I was also remembering another time when I was playing Double Dragon in the little arcade down at South Side Bowl with my friend Chad. We must have fed a whole roll of quarters into the machine that day, but we made it right to the end of the game.

We had just finished off the last guy, and I still had my baseball bat. We were just wandering around the screen waiting for more bad guys, but they never showed up. I ended up walking off the bottom of the screen into the spikes, and Chad ended up winning. We didn’t know we were supposed to fight at the end!

Back to last weekend

Jim and I ran out of lives very, very quickly. I punched in the Konami Code and got us our 30 lives, and we were off to try again. Things went much better this time.

We were near the end of the sixth level when Jim ran out of guys, and I didn’t have many left either. I didn’t recall what sort of boss was going to be there to meet us at the end of the stage, but I wasn’t expecting to get past him. I have no idea how I managed to survive, but I just barely killed that big, ugly cyclops.

I sacrificed my last few remaining lives so that we could continue and get Jimmy back into the game. There were just two stages left, and we didn’t have too much trouble getting through them. It only took 45 minutes, the Konami Code, and one continue to get us through an entire game of Contra!

This is exactly what I built my arcade cabinet for, and it is probably the most fun I’ve had with it yet! Jim and I probably played Contra on the Nintendo for years back in the eighties. It was awesome being able to do it again over 20 years later. I hope we get a chance to do it again some time!