The Other Eight by Joseph Lallo

I’ve been doing a terrible job reading books this year. I read a dozen books last year, but I’ve only managed to complete five so far this year. I was doing alright up through March, but then I didn’t complete another book until June. I’ve done even worse since then. I’ve been doing such a terrible job that I received an embarrassing email from Goodreads reminding me that I’d been reading the same book for four months.

It isn’t the fault of the book. I place the blame on some combination of Team Fortress 2, Game of Thrones, and Borderlands 2. When the choices are reading a few chapters or playing one more round of Team Fortress 2, Team Fortress 2 usually wins. I also managed to accidentally watch every episode of Red Dwarf for the fourth or fifth time.

Why I chose to read The Other Eight

I have a long list of books written by well-known authors that I would like to read, but I try to sneak in books from authors I’ve never heard of as well. I usually pick those books from the bundles.

I have already heard of Joseph Lallo, though. I’ve already read two of his science fiction novels, and I enjoyed them both. Those books are very important to me and to this blog, because they are the reason that I follow Joseph Lallo on Twitter.

If I didn’t follow him on Twitter, I would have never found my most excellent editor, Claudette Cruz. She makes sure that I don’t leave out any commas, and fixes all sorts of embarrassing mistakes that I have made. This blog wouldn’t be the same without Claudette’s proofreading!

The Other Eight

The Other Eight by Joseph Lallo

The good news is, superheroes exist. The bad news is, most powers are worthless and most heroes are insane. The Other Eight is the story of the US Army’s latest attempt to find a team of worthwhile heroes and the farcical reality show of a recruitment drive that results from a security leak that makes the secret project public knowledge.

The Other Eight was definitely a fun read, and it is exactly the sort of story you’d expect to find after reading the synopsis. There are plenty of fun characters. My mind kept alternating between hearing Phosphor’s voice as either Zeke from the video game Infamous or the Engineer from Team Fortress 2. The voices are similar enough, and both felt very appropriate.

Phosphor has a unique superpower. He can reach into the bag he carries and pull out a fluorescent bulb, and no matter how many he pulls out, there will always be another one in there.

What’s next on the list?

I’ve decided to reread the Red Dwarf novels. I read both while I was in high school, and I was completely unaware that the books were based on a television series. The books incorporate story lines from the television show, while also providing a much deeper backstory.

I haven’t reread any books in years, so this should be interesting!

The Other Eight at Amazon
The Other Eight at Smashwords

My Upgrade to Ubuntu 14.10 Utopic Unicorn

I don’t tend to get excited when new versions of Linux distributions are released, but I do try to pay some attention to the distributions that I run. This time, though, I managed to completely miss the release of Ubuntu 14.10.

I had some free time over the weekend, so I decided to upgrade my desktop. It was exactly the kind of upgrade I like: the kind of upgrade where nothing went seriously wrong, and I almost can’t tell the difference.

I was lazy this time, and I didn’t bother to remove the xorg-edgers PPA before the upgrade. If you’re using the xorg-edgers PPA, you still might want to roll back those packages before you upgrade from 14.04. I knew what might break, and I know how to manually fix things when they do, so it was worth the roll of the dice to me.
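If you do want to roll back first, the usual tool for the job is ppa-purge, which downgrades every package from a PPA back to the stock Ubuntu versions. A rough sketch of the commands (double-check the PPA name against what’s in your /etc/apt/sources.list.d/ first):

```shell
# Install ppa-purge, then downgrade everything that came from the
# xorg-edgers PPA back to the stock Ubuntu packages before upgrading.
sudo apt-get install ppa-purge
sudo ppa-purge ppa:xorg-edgers/ppa
```

This needs to run before the release upgrade, while the 14.04 versions of those packages are still available.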

I didn’t notice any changes, but your mileage may vary

I don’t run any of the usual desktop environments. I use the Sawfish window manager along with some components from XFCE. Everything tends to look and function exactly the same between upgrades for me. Just about the worst thing I ever have to deal with is recompiling Sawfish, but that only happens every couple of years.

I’m sure there’s all sorts of changes if you’re running the default Unity desktop.

Waiting for network configuration

This message was the only oddity I’ve experienced since upgrading to Ubuntu 14.10. During boot, a message that reads “Waiting for network configuration…” appears, and the system sits there waiting for quite some time. Then another message appears that reads “Waiting up to 60 more seconds for network configuration…”

I found some solutions online. The best answer is almost certainly to make sure that your /etc/network/interfaces file is configured correctly. Unfortunately for me, mine is set up exactly the way I want it. My computer is set up to request a DHCP address on a bridge device instead of the eth0 device, and this seems to confuse the new boot process in Utopic Unicorn.
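For reference, a bridged setup along the lines of mine looks roughly like this. The br0 and eth0 names here are just the common defaults, not necessarily what your system uses, and the bridge_ports stanza requires the bridge-utils package:

```
# /etc/network/interfaces -- DHCP on a bridge instead of eth0
auto lo
iface lo inet loopback

auto br0
iface br0 inet dhcp
    bridge_ports eth0
```

With this configuration, eth0 itself never gets an address, which appears to be exactly what the new boot process keeps waiting for.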

I ended up using the more heavy-handed fix. I commented out the long sleep delays in the /etc/init/failsafe.conf file, leaving the one-second message delay alone.

# failsafe

description "Failsafe Boot Delay"
author "Clint Byrum <>"

start on filesystem and net-device-up IFACE=lo
stop on static-network-up or starting rc-sysinit

emits failsafe-boot

console output

script
    # Determine if plymouth is available
    if [ -x /bin/plymouth ] && /bin/plymouth --ping ; then
        PLYMOUTH=/bin/plymouth
    else
        PLYMOUTH=":"
    fi

    # The point here is to wait for 2 minutes before forcibly booting
    # the system. Anything that is in an "or" condition with 'started
    # failsafe' in rc-sysinit deserves consideration for mentioning in
    # these messages. currently only static-network-up counts for that.

#   sleep 20

    # Plymouth errors should not stop the script because we *must* reach
    # the end of this script to avoid letting the system spin forever
    # waiting on it to start.
    $PLYMOUTH message --text="Waiting for network configuration..." || :
#   sleep 40

    $PLYMOUTH message --text="Waiting up to 60 more seconds for network configuration..." || :
#   sleep 59
    $PLYMOUTH message --text="Booting system without full network configuration..." || :

    # give user 1 second to see this message since plymouth will go
    # away as soon as failsafe starts.
    sleep 1
    exec initctl emit --no-wait failsafe-boot
end script

post-start exec logger -t 'failsafe' -p daemon.warning "Failsafe of 120 seconds reached."

This is not the optimal solution, but it is much better than waiting an extra two minutes for my computer to reboot.
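If you’d rather not edit the file by hand, a sed one-liner can make the same change. This is my own sketch: it assumes the stock file, only touches the three long delays (sleep 20, 40, and 59), leaves the one-second message delay alone, and keeps a backup copy thanks to -i.bak:

```shell
# Comment out only the 20/40/59 second delays in failsafe.conf,
# preserving the indentation and saving the original as a .bak file.
sudo sed -i.bak -E 's/^([[:space:]]*)(sleep (20|40|59))$/\1# \2/' /etc/init/failsafe.conf
```

An upgrade may replace this file, so check it again after moving to the next Ubuntu release.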

Debezeling My QNIX QX2710 Monitors

I have been wanting to remove the bezels from my QNIX QX2710 monitors since they arrived at my door last year. I became even more interested when my dual-monitor mount arrived, because a piece of the base of the QX2710 is a nearly permanent part of the monitor. To remove that piece, you have to open up the monitor. If I have to take apart the monitor, I definitely don’t want to put it back together.

Before and after

There’s one thing that has been keeping this project on the back burner—the downtime. I am using JB Weld epoxy, and it takes 15 to 24 hours to fully cure. I didn’t want to lose the use of my monitors for an entire day. I decided to debezel one monitor at a time. That way I could still use my computer.

Why would you debezel a monitor?

There aren’t many circumstances in which I would go through the trouble of debezeling a single monitor. I did end up removing the bezel from the 24” monitor in my arcade cabinet to get it a few millimeters closer to the glass, but that is a special case. Most monitors are sitting on someone’s desk.

Removing the bezels becomes much more functional when you have two or more monitors on your desk, since you’ll be able to move them closer together. It is much nicer having as small a gap as possible between your monitors. My first pair of LCD monitors were 14” and had HUGE bezels. Had I known how easy it would be, I would have debezeled them as well!

My first LCD monitors

I am hoping to see other improvements specific to these QNIX QX2710 monitors. The VESA mount is not attached directly to the LCD panel. It is attached to the rear plastic cover with four screws. The monitors can be made to wiggle quite a bit on their dual-monitor stand. I’m hoping that attaching the VESA mount directly to the LCD panel will eliminate some of that slop. I’m not terribly hopeful, though, because the mount is made of fairly thin sheet metal.

There is another small deficiency caused by the manner in which they attached the VESA mount. It causes the monitor’s plastic shell to flex and opens a small gap at the bottom between the monitor and the bezel. You can also see this causes some of the outermost pixels to be partially obscured by the bezel. You can’t see this problem unless you look for it, but I know it’s there!

Of course, there’s always aesthetics. I think narrow bezels look better, and that might be enough of an excuse to debezel your monitor.

This is not a how-to

I’m not going to spell out a step-by-step process for you. That’s already been done, and it has been done very well. I referred to some YouTube videos and forum posts while I was working to make sure I wouldn’t do anything stupid. It was a very simple process, and it probably took about an hour to get the first monitor all glued up. The second monitor went quite a bit more quickly. I spent most of the time carefully applying the JB Weld and making sure everything was glued up straight and even.

There is an excellent guide explaining the process in great detail. I did not follow his instructions, because I felt that using a piece of plywood to increase the surface area for the epoxy was overkill.

I will tell you about the things I didn’t expect, and that I didn’t plan for.

Is JB Weld strong enough?

My father has been building golf clubs for himself since I was a kid. He was always tinkering and doing things like changing the shaft on his driver, replacing grips, or changing the length of a club.

I remember one very important detail from my days of watching him work on golf clubs. The head of a golf club is attached to the shaft with epoxy. He didn’t use fancy, expensive epoxy. He used a simple two-part epoxy that he bought at K-Mart. I’d be surprised if there are more than two or three square inches of contact surface there, and I’ve never seen the head fly off of any of his clubs.

I was hoping to supply some fancy mathematics here to show how much centripetal force the head of a golf club experiences during a swing. I found a lot of math that explains how much force is applied to the ball on impact, but that isn’t very relevant. I did find a chart that tells me that I swing a five iron at a speed of 90 MPH, and I also learned that the head of a five iron weighs somewhere around 9 ounces.

I decided that 9 ounces traveling around a 4’ arc in excess of 90 MPH generates “a lot” of force. My father can hit that same club a good 20 yards farther than I can, so his club head is traveling quite a bit faster. That same chart says the club head of his driver is moving at over 120 MPH, but I’m pretty sure drivers have a pin that holds the head in place in addition to epoxy.
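For a rough sanity check, we can plug those numbers into the centripetal force formula, F = mv²/r. The 9-ounce head, 90 MPH, and 4-foot swing radius are all my own ballpark figures, so treat the result as an order-of-magnitude estimate:

```shell
# Back-of-the-envelope centripetal force on a 9 oz club head
# moving at 90 MPH around a 4 foot radius: F = m * v^2 / r
awk 'BEGIN {
    m = 9 * 0.02835      # ounces to kilograms
    v = 90 * 0.44704     # MPH to meters per second
    r = 4 * 0.3048       # feet to meters
    F = m * v * v / r
    printf "%.0f N (about %.0f lbf)\n", F, F / 4.448
}'
```

That works out to roughly 75 pounds of force trying to fling the club head off the shaft, which a few square inches of cheap epoxy handles without complaint.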

The component box I had to attach to the back of the LCD panel of the QNIX QX2710 only attaches around the edges. Those edges add up to a surface area close to 30 square inches. I am quite confident the JB Weld will hold.

I used regular JB Weld instead of the quick-set variety. This is only because I had some on hand, and I think these partially used tubes have been in my toolbox for over a decade. I did have some of Harbor Freight’s knock-off version of JB Weld’s quick-setting epoxy, but the tubes had no information about curing times.

Painting the bezels

Disassembling the metal frame of the LCD panel for painting looks very simple, but I don’t plan on doing it. One of my monitors has a tiny dust mote stuck between the backlight and the panel. I fear that if I take the actual LCD panels apart that I’ll end up with a cat hair stuck in there.

I think the bare metal looks fine except for one minor flaw. There are lines across the frame on opposite corners of the screen. These are the points where they welded two parts together to create a single frame.

The truth is that I don’t notice the gray metal frame at all. The only time I fixate on it is when I’m talking about it. I’ve been repeatedly glancing at the corners of the monitor while writing the last few paragraphs, but I’ll forget about it soon enough.

How long does it take to debezel a monitor?

The first monitor took more time than the second. I was careful when unsnapping the bezel, and I referenced pictures and videos several times to make sure I wouldn’t accidentally rip any wires out.

I also had some confusion. I wasn’t sure which end of the monitor was up, so I didn’t know which way to orient the controller box. You’ll save yourself some time if you take note of which side is up. The innards don’t seem to be identical in all QNIX monitors, so be sure to pay attention.

The second monitor took no time at all. I had the whole thing disassembled and ready for the epoxy in less than five minutes. It probably took another five minutes to apply the JB Weld. I know it takes hours to harden, but I still worry. I wanted to get it coated and attached reasonably quickly.

The second monitor was problematic

After putting the controller box in place, I noticed a significant problem. Even with weight applied to the top, one side wasn’t making good contact with the back of the LCD panel. It just wasn’t straight enough.

The solution was simple, but my execution was poor. I lifted the controller box, and I applied a thick bead of JB Weld on that side. I did a pretty sloppy job, and ended up smearing JB Weld around the back of the LCD panel. It is a much less professional-looking job than the first monitor, but I’m sure it will function just fine.

In my hurry to get that all squared away, I unknowingly dipped my forearm in my mixture of JB Weld. I didn’t notice it at all until I later rested my arm on my desk. I smeared a giant glob of epoxy on my desk.

I thought this was going to be disastrous. I really like this desk. Lucky for me, JB Weld is easy to wipe off!

I don’t always follow directions

The other folks who documented their debezeling always roughed up the surfaces with sandpaper. I probably would have done the same, but I couldn’t find my sandpaper. I’m not worried about it. I’m confident that 30 square inches of JB Weld will hold.

I don’t just ignore important directions. I’m also impatient. The cure time for JB Weld is 15 to 24 hours. I hung the first monitor back up after twelve hours, and I was even more impatient with the second monitor. I think my patience ran out after about eight hours.

If the JB Weld fails and either of these monitors come crashing down, I will be certain to let everyone know.

The results

I have to say that I am extremely pleased with the results. When I disassembled the first monitor, it didn’t look like there was going to be much improvement. The metal frame on the QNIX QX2710 is quite wide compared to the 24” monitor in my arcade table. I measured the metal frame and the plastic bezel with my caliper, and the difference is less than half an inch. That doesn’t seem very impressive.

The difference is more drastic than I expected

Then I hung the debezeled monitor next to the stock monitor. I couldn’t believe how drastic the difference was. It makes these 27” monitors feel so much less bulky.

To my amazement, I managed to glue those VESA mounts on at exactly the same height. The monitors line up on the monitor stand absolutely perfectly. This was one of my biggest worries.

They still wiggle

I was hoping the monitors would be more rigid on their mounts, but it’s still easy to wiggle them about. They might be a little more rigid, but not by much. I think the real problem is that the VESA mounting points are welded to a thin piece of sheet metal. It just has a lot of flex.

I was thinking about 3D printing a small brace to clip the monitors together. It is probably not necessary, but I’m always looking for an excuse to design something new.

Only look at the front

My desk now looks awesome from the front, but don’t try to peek around the back. There are some wires on the back of the QNIX QX2710 that you just can’t hide, and they are all thin, fragile looking wires. The cables that plug in near the top of the monitor are not a problem, because they only reach an inch or two out of the enclosure.

The wires that run to the bottom corner of the monitor to power the backlight are quite thin. I couldn’t do much with that other than tape them to the monitor to keep them from flopping about.

The pair of debezeled QNIX QX2710 monitors

The bigger problem is the board with the power, volume, and brightness buttons. It needs to be there, because there’s always a chance that I’ll need to adjust the brightness. There just isn’t anywhere to neatly attach it. I taped it to the back of the monitor by its wires for now, but I’d like to find a better solution.

It should be simple enough to 3D print some sort of bracket for it. Maybe I can affix it to the monitor’s VESA mount where it can be hidden out of sight.


I’m very pleased with the results, and I’m glad that I finally did it. I’ve been putting off doing this mostly because of the fear that I wouldn’t glue the VESA mounts on at the same height, and that part actually worked out perfectly. When I rest my yard stick across both monitors and measure the gap with my feeler gauge, the distance is less than 0.004”!

I also think it was well worth the time and effort. If you don’t count all the waiting, I spent more time, and A LOT more labor, on wire management over the last two days than on the debezeling itself.

Have you debezeled your monitor? Is it a QNIX QX2710 like mine? I’d like to see pictures and hear about your experience!

Craft Coffee - October 2014 - Revel Coffee Roasters

Craft Coffee sends me 12 ounces of coffee every month, but that only lasts me about two weeks. That means that I have to find other coffee to finish out each month. The beans I’ve been buying the last few months come from Java Pura in Houston, TX. They’re not quite local, but they’re not too far away.

Java Pura beans are very inexpensive at my local Central Market: just $9.99 for a 12-ounce bag. The bags I’ve bought so far were all roasted within the last month, too! So far I’ve tried their Ethiopian Yirgacheffe and their espresso blend.

I’ve been drinking the espresso blend for the last couple of weeks. It tastes pretty good. It is oily, and a little too dark for me, but it is ridiculously easy to pull consistent shots with. The lighter coffees, like their Yirgacheffe, have been much more difficult to tune in.

While I have been enjoying Java Pura’s espresso blend, I’ve been patiently waiting for the next selection from Craft Coffee. The espresso blend tastes good, but it is very boring.

Revel Coffee Roasters, Billings, MT

A lovely nectarine brightness can be detected amongst a body resembling jasmine tea and concord grapes before leveling out with a pleasant nip of grapefruit peel.

The first thing that I noticed about the coffee from Revel Coffee Roasters was the smell. It smelled much more interesting than the espresso blend that I’ve been drinking for the last two weeks. The aroma was more like something you would expect to find in the produce aisle at the grocery store than in a bag of coffee beans.

I don’t know if I’m doing something wrong, but this month I am at a complete loss in attempting to identify any of the flavors mentioned in Craft Coffee’s notes. This is the first time that this has happened in eight months. Half the problem might be that I don’t know what jasmine tea or grapefruit peel taste like.

This is a lighter roast, and I still have difficulty pulling consistent shots on my Rancilio Silvia unless I’m using a dark roast. Craft Coffee allows you to choose whether you prefer light or dark roasts, and even with the difficulty they give me, I still prefer lighter roasts. The flavors are more interesting and easier to pick out.

I’m about halfway through the bag, and I’m starting to pull some pretty consistent and delicious shots. The coffee from Revel Coffee Roasters doesn’t stand out like some of the coffees that Craft Coffee has sent me in the past, like Slate Coffee Roasters’ Yirgacheffe or Oren’s Daily Roast’s Sidamo. It is still delicious, and it has been tasting better every day as I tune in the grind and shot time.

I miss the three coffee sampler pack

I wish I could still make good use of the sampler pack. The sampler pack usually had two good coffees along with one outstanding coffee. I now worry that I’m missing out on that one amazing coffee each month.

I am now able to consistently pull delicious shots of espresso with my Rancilio Silvia, but it takes me a while to get there. The first shot of espresso from a new bag of coffee usually goes horribly wrong. The second shot is usually drinkable, but not delicious. I’d be likely to burn through an entire four-ounce pouch of coffee from the sampler before pulling a really good shot.

Use my referral code “pat1245” and you’ll get 15% off

If you use my referral code (pat1245) when checking out at Craft Coffee, you will get a 15% discount. Not only will you save money, but they tell me I’ll get a free month of coffee with every order. That sounds like a good deal for both of us!

3D Printed Keychain Connector for USB Flash Drives

My friend Brian has been on a mission to reduce the weight and volume of his keychain.

He had some empty space in his fancy new key machine, so he decided to add a USB flash drive. There are a lot of thin USB 2 flash drives that leave out the metal guard around the connector, but Brian wanted his new drive to be a faster USB 3 model. All the USB 3 drives are a little thicker, and they all have the metal guard. We were wondering if it was now a functional requirement.

Brian found some nice USB 3 flash drives, but none of them had large enough keyring holes to fit his fancy keychain. I had an idea, though. I figured it would be easy to print up a little plug that could fit inside the USB connector. That way he could buy any flash drive he wanted.

Even easy prints require three attempts

This is one of the simplest models I’ve designed using OpenSCAD, and it didn’t take more than 10 minutes to have a model that was ready to print. I hadn’t seen this new keychain yet, but I thought the nuts and bolts looked pretty big, so I took a guess and overestimated on purpose. I figured it would be easier to test if the hole was oversized rather than undersized.
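The model really is just a rectangular tongue sized to fit a USB Type-A opening, with a tab and keyring hole hanging off the back. This is a rough OpenSCAD sketch of the idea, not my actual model; every dimension here is my own guess:

```openscad
// Hypothetical keychain plug for a USB flash drive (all dimensions are guesses)
usb_w    = 12;    // width of a USB Type-A opening, in mm
usb_t    = 4.5;   // thickness of the opening
plug_len = 9;     // how far the plug reaches into the connector
hole_d   = 8;     // deliberately oversized keyring hole

difference() {
    union() {
        cube([plug_len, usb_w, usb_t]);                  // the plug itself
        translate([plug_len + hole_d / 2, usb_w / 2, 0])
            cylinder(h = usb_t, d = hole_d + 6);         // tab around the hole
    }
    translate([plug_len + hole_d / 2, usb_w / 2, -1])
        cylinder(h = usb_t + 2, d = hole_d);             // the keyring hole
}
```

The interesting part of this project wasn’t the model at all; it was figuring out which way to orient something this small and thin on the print bed.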

Note the one that snapped on its weak axis

I printed this first model on its side, because it was sized to be precisely the same thickness as a USB connector, and the plug was positioned so that two of the edges would be flush with the USB connector.

This print was a complete disaster. The hole for the keyring was huge, and the walls were all so narrow that the print just didn’t stick together very well.

I printed five parts at once, and each one had a plug 2% smaller than the last. This worked out well, because some of them were sturdy enough to test, and I was definitely on the right track.

The second attempt

The second attempt was a bit better designed. I rotated the object 90 degrees so that it would print with the connector facing up. This allowed me to make the connector the same width and thickness as a USB connector, so the metal would be flush on all four sides.

This seemed like a great idea, but it was a terrible idea. Not only did I have the same problems printing the small, thin parts as I did the first time, but this part was also extremely weak on the most important axis. It would easily snap off at a layer transition, and you’d end up with a chunk of plastic stuck in your flash drive!

Simple is better

Printing the part lying down was a much better way to go. The plug that inserts into the USB drive is printed along more than one axis, so it is much sturdier. Also, the part printed much better because it was so much wider.

The final design

I was even able to make the part flush with the USB connector on three sides, and three out of four sides is good enough for me.

Lessons learned

I’m still new to 3D printing, and I learn something new with each failure. This time I once again learned that orientation is extremely important. In the past, I have had to adjust the orientation of the part to reduce overhangs. I had that covered this time, but I didn’t think about how poorly such a narrow part might print.

I would likely get better results on these narrow parts if my printer was better calibrated. The first two attempts would have been problematic even if they printed perfectly, because they each had a weak axis that might have snapped off in the USB connector. The third part has three walls surrounding the outside edges and solid layers with a 45-degree offset on the top and bottom. That should make it much sturdier!

Craft Coffee - September 2014

This is the second month so far where I’ve opted to receive a single 12-ounce bag of coffee from the fine folks at Craft Coffee. It is definitely a safer choice now that I’m learning to use my Rancilio Silvia espresso machine, and I’ve wasted far less coffee this way, but I really miss tasting three different, delicious coffees every month.

Tasting them is always enjoyable, and it is much easier to write about three different coffees at the same time. I may have to look into increasing the volume of my subscription!

Thirty Thirty, Peoria, IL (Costa Rica)

Tart grapefruit acidity and blueberry jam flavors complement a crisp clean finish in this fruit forward cup.

I really like this coffee from Thirty Thirty. It seems like a pretty well-balanced cup of coffee to me. It is just a little bit tart, but I can easily pick out the blueberry that’s mentioned in Craft Coffee’s notes.

The blueberry flavor reminds me of the Ethiopian Sidamo coffee I received in June from Oren’s Daily Roast. It isn’t quite the same, though, and I think I can understand why Craft Coffee’s description actually says “blueberry jam.”

I’ve actually been pretty successful at making lattes with the Thirty Thirty coffee this month. I only choked my Rancilio Silvia on my first pull. The rest were all drinkable. The best double shots of Thirty Thirty that I’ve pulled were all ristrettos near the 1.5-ounce mark.

Use my referral code “pat1245” and you’ll get 15% off

If you use my referral code (pat1245) when checking out at Craft Coffee, you will get a 15% discount. Not only will you save money, but they tell me I’ll get a free month of coffee with every order. That sounds like a good deal for both of us!

The Nvidia GTX 970, Linux, and Overclocking My QNIX QX2710 Monitors

I’ve been thinking about buying a new video card for a long time. I probably started seriously considering it the day my 1440p QNIX QX2710 monitors arrived. My old Nvidia GTX 460 card was fast enough to run most games at 2560x1440 with reasonably high settings. It wasn’t fast enough to enable antialiasing in most games, but the pixels are pretty small from where I’m sitting anyway.

There was something I was missing out on. The QNIX monitors can be overclocked, which means you can run them at refresh rates higher than 60 Hz. I’m told that some can be run at 120 Hz. That means that a new image is drawn on the screen 120 times every second, twice as fast as most monitors. My old Nvidia GTX 460 couldn’t manage that; it could only drive the panels at 60 Hz.

I’ve been keeping my eye on the Nvidia GTX 760 for the past few months. It would be new enough to overclock my QNIX monitors, and it would be fast enough to justify the upgrade. It just wasn’t going to be enough of an upgrade for me to get excited about, especially considering that I had never even seen the difference between 60 Hz and 120 Hz.

The new Nvidia Maxwell GPUs

Late last week, a friend of mine asked me if I’d seen the news about the GTX 970 and GTX 980 video cards. I hadn’t, so I looked them up. The GTX 980 is way beyond my needs, but the GTX 970 looked very interesting. They hit the shelves the next day, and I ordered a GTX 970 from Amazon almost immediately.

The GTX 760 cards that I’ve been interested in sell for around $230. The GTX 970 costs quite a bit more at $330, but I decided that I’d get a lot of value out of that extra $100. The GTX 970 is significantly faster than the older GTX 760, and it has twice as much video RAM. In fact, the GTX 970 is nearly comparable to the GTX 780 Ti that was selling for well over $500 last week. The GTX 970 provides excellent bang for the buck.

Performance alone should be enough to justify buying a GTX 970, but even the cheaper GTX 760 is more than fast enough for my needs. The GTX 970 provides more than just a performance upgrade. The GTX 970 and GTX 980 both bring two interesting new features to the table: Multi-Frame Anti-Aliasing (MFAA) and Voxel Global Illumination (VXGI).

Multi-Frame Anti-Aliasing (MFAA)

I’m pretty excited about MFAA. Antialiasing is used to smooth out the jagged edges that appear where one 3D object is rendered on top of another. There are a number of different methods used to accomplish this. The most popular method is probably multisample antialiasing (MSAA).

Nvidia is claiming that 4xMFAA looks as good as 4xMSAA while only requiring as much extra processing power as 2xMSAA. I don’t know if these claims are true, but I am looking forward to trying it out for myself. Based on what Nvidia has said about MFAA so far, I was expecting to see an option in the nvidia-settings control panel to force the use of MFAA.

I haven’t been able to find such an option, so I will be keeping my eyes open. It sounds like this will show up in a driver update sometime in the near future.

Voxel Global Illumination (VXGI)

I’m much less excited about VXGI. It is a new way to implement dynamic lighting, calculating things like reflections and shadows in real time. Not only will it make for better-looking visuals, but it sounds like it’s also much easier for artists and programmers to work with.

I’m not very excited about it, because it won’t improve existing games like MFAA will. It will only be available in new games that choose to support it. I still feel that it is a good reason to choose the GTX 970.

You don’t mess with the Zotac

I ended up ordering a Zotac GTX 970. I don’t think I’ve ever owned a product made by Zotac, but this wasn’t a well-planned purchase. I was awake for quite a while that day before I remembered that I wanted to buy a GTX 970. The Zotac video cards tended to be less expensive, and more importantly, there were also three of them left in stock.

In the old days before DVI, HDMI, and DisplayPort, choosing a quality card was very important. We used to rely on the video card’s RAMDAC to generate the analog signal that would drive the VGA port. Low-quality RAMDACs made for fuzzy images and bleeding pixels.

We don’t have to worry about that with modern displays with digital connections, so I’m much less worried about buying a less expensive video card today. I’ll be sure to update this page if I ever experience a problem.

The Zotac GTX 970 came with a pair of adapters for converting 4-pin hard drive power cables into 6-pin GPU cables, and a single DVI to VGA adapter.

60 Hz vs. 96 Hz and beyond

I did two things immediately after installing my new Zotac GTX 970 card. I immediately fired up Team Fortress 2, maxed out all the settings, and ran around a bit to see what kind of frame rates my fancy new graphics card could muster. Then I started monkeying around trying to get my monitors to run at 96 Hz.

It was actually a pretty simple task, but the details are a little out of scope for this particular blog post. I probably had to restart my X server two or three times, and then I was up and running at 96 Hz. Once I verified that it was working, I jumped right back into Team Fortress 2.

99.3 Hz refresh rate

It wasn’t very noticeable at first. When you connect to a server, your view is from a stationary camera, and I was watching RED and BLU guys wander around. Then I spawned and started walking around. That’s when you really notice the difference.

I won’t say that the 60% faster refresh rate is the most amazing thing ever, but it is definitely an improvement. The difference is easy to spot as you aim your gun around in the game. The first thing I said was, “This is buttery smooth.” It really is quite nice. It isn’t nearly as drastic as the jump from jittery, 30-frame-per-second console gaming to solid 60-frame-per-second PC gaming, but it is definitely a nice upgrade.

I did some more work, and I pushed my monitors up to about 100 Hz. I was trying for 120 Hz or 110 Hz, but neither would work for me. I was close at 110 Hz, but I got a garbled, jittery picture on the screen. I’m not unhappy, though. I’m quite happy at 100 Hz.

I should note here that it is only possible to overclock the DVI version of the QNIX QX2710. The QNIX QX2710 TRUE10 models with HDMI and DisplayPort use a different LCD panel and will only run at 60 Hz.

UPDATE: I found some tighter timings and converted them to an xorg.conf modeline, and both of my monitors are running happily at 120 Hz. I plan to compile all my data together and write a comprehensive post on the subject.
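For anyone curious what an overclocked modeline looks like, here’s the general shape of the xorg.conf Monitor section. The timing numbers below are illustrative, not necessarily the exact values I settled on:

```
Section "Monitor"
    Identifier "DVI-0"
    # 2720 x 1525 total pixels per frame at 120 Hz needs a ~497.76 MHz pixel clock
    Modeline "2560x1440@120" 497.76  2560 2608 2640 2720  1440 1443 1448 1525  +hsync -vsync
EndSection
```

The numbers after the pixel clock are the horizontal and vertical display, sync start, sync end, and total values. Pushing the refresh rate higher means either raising the pixel clock or tightening those blanking intervals, which is why “tighter timings” matter.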

Steam for Linux game performance

I don’t really have anything in my Steam library that wasn’t playable with my old GTX 460 at 2560x1440. I had to run most of those games with the quality settings turned down a bit, so I spent some time today opening up games and adjusting their video settings as high as they would go.

Team Fortress 2 settings maxed out

Team Fortress 2 used to run at a little over 100 FPS, but it would drop down into the 60 FPS range when a lot was going on. With the new card, I turned everything up to the max and set it to use 8xMSAA. I haven’t seen the GPU utilization go much higher than 65% while playing Team Fortress 2, but I do still drop down into the 60 FPS range. The CPU is the bottleneck here.

Left 4 Dead 2 was pushing my old video card to its limits at 2560x1440, and that was with most of the settings turned down as low as they would go. It wasn’t running much better than 60 or 80 FPS. At the old settings, the new card didn’t drop below 280 FPS. I’m assuming the game is capped at 300 FPS.

After maxing out every setting, Left 4 Dead 2 is running at a solid 150 FPS with the new card. The game looks so much better, and it feels buttery smooth at 100 Hz!

I had similar success with Portal 2. I can’t say much about how it ran on my old video card because I’m pretty sure I played through the game on my laptop. I can say that with the settings maxed out, it doesn’t seem to drop below about 110 FPS. That was with me running around the room where the final “battle” takes place.

I have quite a few games that don’t report their frames per second—games like Strike Suit Zero, Wasteland 2, and XCOM: Enemy Unknown. I maxed out all their video settings, and as far as I can tell, they’re running quite well.

Shaming two games with bad performance

One of the games that my friends and I have played a lot of is Killing Floor. The Linux port of Killing Floor runs reasonably well for the most part. There are some maps that just run very poorly. One in particular is the Steamland “Objective Mode” map.

Some areas on the map run at less than 30 FPS with my laptop’s Nvidia GT 230M video card. Those same areas run at less than 30 FPS on my desktop with my old GTX 460 card, and those areas of the map run just as poorly with my GTX 970. It is quite a disappointment.

The other game is Serious Sam 3: BFE. I haven’t played this game much because it runs poorly and crashes all the time. It still runs poorly even with my new GTX 970. I was wandering around near the beginning of the game while smashing things with my sledgehammer.

I kept adjusting various video quality settings while I was doing this. The autodetected settings ran at less than 30 FPS. I kept turning things down one notch at a time waiting to see some reasonable performance. The game became very ugly by the time the frame rate got into the 50-to-70-FPS range. That’s when I gave up.


I’m very happy with my purchase of my Zotac GTX 970 card, even if the video card is now the most expensive component in my computer. It is almost an order of magnitude more powerful than the card it replaced, and it even manages to generate less heat. That’s actually a very nice bonus during the hotter days of the year here in my south-facing home office in Texas. This is the warmest room in the house, and long gaming sessions manage to bring the temperature up a couple of degrees past comfortable.

My computer isn’t the only machine in the house that will get an upgrade out of this. I’ll be moving the GTX 460 into my wife’s computer, and her Nvidia GT 640 will be moving into the arcade cabinet. I’m pretty excited about the arcade cabinet upgrade because I will now be able to route the arcade audio through its wall-mounted television. Upgrades that trickle down are always the best upgrades!

The GTX 970 is an excellent match for my pair of QHD QNIX QX2710 monitors. Finally being able to take advantage of their overclockability is awesome, and I should have no trouble running new games at 1440p for at least the next couple of years. Now I just have to hope that they eventually release Grand Theft Auto 5 for Linux and SteamOS!

Self-Hosted Cloud-Storage Comparison - 2014 Edition

| Comments

Early last year, I decided that I should replace Dropbox with a cloud-storage solution that I could host on my own servers. I was hoping to find something scalable enough to store ALL the files in my home directory, and not just the tiny percentage of my data I was synchronizing with Dropbox. I didn’t expect to make it all the way to that goal, but I was hoping to come close.

I wrote a few paragraphs about each software package as I was testing them. That old cloud-storage comparison post is almost 18 months old and is starting to feel more than a little outdated. All of the software in that post has improved tremendously, and some nice-looking new projects have popped up during the last year.

All of these various cloud-storage solutions solve a similar problem, but they tend to attack that problem from different directions, and their emphasis is on different parts of the experience. Pydio and ownCloud seem to focus primarily on the web interface, while Syncthing and BitTorrent Sync are built just for synchronizing files and nothing else.

I am going to do things just a little differently this year. I am going to do my best to summarize what I know about each of these projects, and I am going to direct you towards the best blog posts about each of them.

This seemed like it would be a piece of cake, but I was very mistaken. Most of the posts and reviews I found were just rehashes of the official installation documentation. That is not what I am looking for at all. I’m looking for opinions and experiences, both good and bad. If you know of any better blog posts that I can reference for any of these software packages, I would love to see them!

Why did these software packages make the list?

I don’t want to trust my data to anything that isn’t Open Source, and I don’t think you should, either. That is my first requirement, but I did bend the rules just a bit. BitTorrent Sync is not Open Source, but it is popular, scalable, and seems to do its job quite well. It probably wouldn’t have made my list if I weren’t using it for a few small tasks.

I also had to know that the software exists. I brought back everything I tried out last year, and I added a few new contenders that were mentioned in the comments on last year’s comparison. If I missed anything interesting, please let me know!

Seafile

I am more than a little biased towards Seafile. I’ve been using Seafile for more than a year, and it has served me very well in that time. I am currently synchronizing roughly 12 GB made up of 75,000 files with my Seafile server. The 12 GB part isn’t very impressive, but when I started using Seafile, tens of thousands of files was enough to bring most of Seafile’s competition to its knees.

Seafile offers client-side encryption of your data, which I believe is very important. Encryption is very difficult to implement correctly, and there are some worries regarding Seafile’s encryption implementation. I am certainly not an expert, but I feel better about using Seafile’s weaker client-side encryption than I do about storing files with Dropbox.

Features and limitations:

  • Client-side encryption
  • Central server with file-revision history
  • Web interface with file management and link sharing
  • Dropbox-style synchronization (tens of thousands of files)
  • Won’t sync files with colons in their name

For more information:

ownCloud

ownCloud’s focus is on their web interface. As far as I can tell, ownCloud is leaps and bounds ahead of just about everyone else in this regard. They have tons of plugins available, like music players, photo galleries, and video players. If your primary interest is having an excellent web interface, then you should definitely take a look at ownCloud.

ownCloud has a Dropbox-style synchronization client. I haven’t tested it in over a year, but at that time it didn’t perform nearly well enough for my needs. The uploading of my data slowed to a crawl after just a couple thousand files. This is something the ownCloud team has worked on, and things should be working better now. I’ve been told that using MariaDB instead of SQLite will make ownCloud scale beyond just a few thousand files.

Features and limitations:

  • Central server with file-revision history
  • Web interface with file management, link sharing, and dozens of available plugins
  • Collaborative editing (see the Linux Luddites podcast for opinions)
  • Dropbox-style synchronization, may require MariaDB to perform well

For more information:

SparkleShare

SparkleShare was high on my list of potential cloud-storage candidates last year. It uses Git as a storage backend, is supposed to be able to use Git’s merge capabilities to resolve conflicts in text documents, and it can encrypt your data on the client side. This seemed like a winning combination to me.

Things didn’t work out that well for me, though. SparkleShare loses Git’s advanced merging when you turn on client-side encryption, so that advantage was going to be lost on me.

Also, SparkleShare isn’t able to sync directories that contain Git repositories. I have plenty of clones of various Git repositories sprinkled about in my home directory, so this was a deal breaker for me.

It also looks as though SparkleShare isn’t able to handle large files.

Features and limitations:

  • Central server with file-revision history
  • Web interface with file management and link sharing
  • Dropbox-style synchronization (large files will be problematic)
  • Data stored in Git repo (merging of text files is possible)

For more information:

Pydio (formerly AjaXplorer)

Pydio seems to be in pretty direct competition with ownCloud. Both have advanced web interfaces with a wide range of available plugins, and both have cross-platform Dropbox-style synchronization clients. I have not personally tested Pydio, but they claim their desktop sync client can handle 20k to 30k files.

Pydio’s desktop sync client is still in beta, though. In my opinion, this is one of the most important features that a cloud-storage platform can have, and I’m not sure that I’d want to trust my data to a beta release.

Features and limitations:

  • Central server with file-revision history
  • Web interface with file management and link sharing
  • Dropbox-style synchronization with Pydio Sync (in beta)

For more information:

  • Pydio’s official website

git-annex with git-annex assistant

I first learned about git-annex assistant when it was mentioned in a comment on last year’s cloud-storage comparison post. If I were going to use something other than Seafile for my cloud-storage needs, I would most likely be using git-annex assistant. I have not tested git-annex assistant, though, and I don’t know if it is scalable enough for my needs.

git-annex assistant supports client-side encryption, and the Kickstarter project page says that it will use GNU Privacy Guard to handle the encryption duties. I have a very high level of trust in GNU Privacy Guard, so this would make me feel very safe. It also looks like git-annex assistant stores your remote data in a simple Git repository. That means the only network-accessible service required on the server is your SSH daemon.

Much like SparkleShare, git-annex assistant stores your data in a Git repository, but git-annex assistant is able to efficiently store large files—that’s a big plus.

Dropbox-style synchronization is the primary goal of git-annex assistant. It sounds like it does this job quite well, but it is lacking a web interface. This puts it on the opposite end of the spectrum compared to ownCloud or Pydio.

There is one big downside to git-annex assistant for the average user. git-annex assistant makes use of git-annex, and git-annex does not support Microsoft Windows.

Features and limitations:

  • Central server with file-revision history
  • No web interface on the server
  • Dropbox-style synchronization

For more information:

BitTorrent Sync

I like BitTorrent Sync. It is simple, fast, and scalable. It does not store your data on a separate, centralized server. It simply keeps your data synchronized on two or more devices.

It is very, very easy to share files with anyone else that is using BitTorrent Sync. Every directory that you synchronize with BitTorrent Sync has a unique identifier. If you want to share a directory with any other BitTorrent Sync user, all you have to do is send them the correct identifier. All they have to do is paste it into their BitTorrent Sync client, and they will receive a copy of the directory.

Unfortunately, BitTorrent Sync is the only closed-source application on my list. It is only included because I have been using it to sync photos and backup directories on my Android devices. I’m hoping to replace BitTorrent Sync with Syncthing.

Features and limitations:

  • No central server but can use third-party server to locate peers
  • Very easy to set up
  • Solid, fast synchronization without extra bells and whistles
  • Closed source

For more information:

Pulse (previously called Syncthing)

Syncthing is the Open Source competition for BitTorrent Sync. They both have very similar functionality and goals, but BitTorrent Sync is still a much more mature project. Syncthing has yet to reach its 1.0 release.

I’ve been using Syncthing for a couple of weeks now. I’m using it to keep my blog’s Markdown files synchronized to an Ubuntu environment on my Android tablet. It has been working out just fine so far, and I’d like to use it in place of BitTorrent Sync.

It does require a bit more effort to set a directory up to be synchronized than with BitTorrent Sync, but I hope they’ll be able to streamline this in the future. This is definitely a project to keep an eye on.

Syncthing is now in the process of rebranding itself as Pulse.

Features and limitations:

  • Still in beta
  • No central server but can use third-party server to locate peers
  • Easy to set up
  • Synchronization without extra bells and whistles
  • Like BitTorrent Sync, but Open Source

For more information:


There’s no clear leader in the world of self-hosted cloud-storage. If you need to sync several tens of thousands of files like I do, and you want to host that data in a central location, then Seafile is really hard to beat. git-annex assistant might be an even more secure competitor for Seafile—assuming it can scale up to a comparable number of files.

If you don’t want or need that centralized server, BitTorrent Sync works very well, and it should have no trouble scaling up to sync the same volumes of data as Seafile. I bet it won’t be too long before Syncthing is out of beta and ready to meet similar challenges. Just keep in mind that with a completely peer-to-peer solution, your data won’t be synchronized unless at least two devices are up and running simultaneously.

If you’re more interested in working with your data and collaborating with other people within a web interface, then ownCloud or Pydio might make more sense for you. I haven’t heard many opinions on Pydio, either good or bad, but you might want to hear what the guys over at the Linux Luddites podcast have to say about ownCloud in their 24th episode. The ownCloud discussion starts at about the 96-minute mark.

Are you hosting your own “cloud-storage?” How is it working out for you? What software are you using? Leave a comment and let everyone know what you think!

zsh-dwim: Now With Faster Startup Times, Modular Configuration, and More

| Comments

It has been quite a while since I’ve made any updates to zsh-dwim. A few weeks ago, a feature request from PythonNut showed up in zsh-dwim’s issue tracker on GitHub. He asked me if I wouldn’t mind splitting the implementation and configuration details of zsh-dwim out into separate files.

I thought this was a great idea, and I’ve had similar thoughts in the past. What I was really interested in doing was breaking up the long, monolithic set of transformation definitions into separate files. That way you could easily disable each piece of functionality individually.

I’ve been thinking this would be a good idea for quite a while, but it hasn’t been a priority for me. As far as I knew, I was the only person using zsh-dwim. When the feature request came in, I was happy to see a handful of stars and forks on the zsh-dwim project. That was more than enough of an excuse for me to put in a bit of time on the project.

What is zsh-dwim?

I haven’t written about zsh-dwim in quite a while, so it might be a good idea to explain what it is. The “dwim” in zsh-dwim stands for “Do What I Mean.” I borrowed the term from the many Emacs functions with “dwim” in their names. The idea is that when you’re writing a command, or you just finished executing a command, you can hit one key and zsh-dwim will try to guess what you want to happen.

If you just created a new directory or untarred a file, hitting the zsh-dwim key will attempt to cd into the new directory. Maybe you tried to echo a value into a file in /proc or /sys, and you forgot that only root can do that. Hitting the zsh-dwim key will convert your > into a | sudo tee for you.

If you hit the zsh-dwim key on an empty command line, it will apply its logic to your previous command. If you are already in the middle of editing a command, then zsh-dwim will attempt to apply its logic to it. zsh-dwim will also try to move your cursor to the most logical position.

The commands aren’t executed automatically. You still have to hit the enter key. I’m a big believer in the principle of least surprise. I’d much rather see what is going to be executed before I commit myself to it. I am not a big fan of the sudo !! idiom. That’s why zsh-dwim’s fallback simply adds a sudo to the front of the command. I feel better seeing the actual command that I’m about to run with root privileges.
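As a rough illustration of the redirect transformation described above, here’s a toy version in plain shell. This is only a sketch of the idea—the `transform` helper is hypothetical, and zsh-dwim’s real implementation operates on the ZLE editing buffer and is bound to a key:

```shell
# Toy version of zsh-dwim's "> becomes | sudo tee" idea.
# Hypothetical helper, not zsh-dwim's actual code.
transform() {
  local cmd="$1"
  if [[ "$cmd" == *" > "* ]]; then
    # A redirect that failed with "permission denied" gets rewritten
    # so that only the write happens with root privileges.
    printf '%s\n' "${cmd/ > / | sudo tee }"
  else
    # The fallback just puts sudo at the front of the command.
    printf 'sudo %s\n' "$cmd"
  fi
}

transform 'echo 1 > /proc/sys/net/ipv4/ip_forward'
# prints: echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward
```

Because the rewritten command is only placed on the command line rather than executed, you still get to look it over before you commit to running it.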

The more modular design

First of all, as per PythonNut’s request, the configuration has been separated out from the inner working of zsh-dwim. This will make it easier to choose your own zsh-dwim key—my choice of control-u is surprisingly controversial!

I’ve also grouped the related transformations together and put them into separate files. All the transformations are enabled by default, and each file is sourced in the config file. If you don’t like some of my transformations, you can easily comment them out.

zsh-dwim is a bit smarter

I also added some checks to make sure that useless transformations are never used. To accomplish that, zsh-dwim now checks to make sure the appropriate commands are installed before enabling certain transformations. You don’t want to be offered apt-get and dpkg commands on a Mac, and you don’t want to be offered dstat commands when you only have vmstat available.

I put these checks in the config file. I waffled on this decision quite a bit. The config file would be cleaner if I put these checks in the files that define the transformations. That way you’d only have to comment out a single line to manually disable them, but I thought that might get confusing. I didn’t want someone to enable a set of transformations and wonder why they’re not working.

Speeding things up a bit thanks to PythonNut

The new brains in zsh-dwim made my zsh startup times almost 0.01 seconds slower on my computers. That’s almost 15% slower on my desktop and about 5% slower on my laptop. The slowdown was caused by calling out to the which command to verify whether certain programs were installed.

I wasn’t too happy about this. I open terminal windows dozens, maybe hundreds, of times each day. I like my terminal windows to be ready instantly. When I was using oh-my-zsh, I put in some work to improve its startup time. It was pretty slow back in 2012, and I submitted a patch that shaved 1.2 seconds off of oh-my-zsh’s startup time. By comparison, the extra 0.01 seconds I added to my Prezto startup time was minimal. I still didn’t like it, but it was acceptable for now.
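Measuring this sort of overhead is straightforward: time an interactive shell that exits immediately. A quick sketch, shown here with bash so it runs anywhere—substitute zsh to test your own setup:

```shell
# Time how long an interactive shell takes to start up and exit.
measure_startup() {
  local shell="$1"
  TIMEFORMAT='%R'  # make bash's `time` print only the elapsed seconds
  { time "$shell" -i -c exit 2>/dev/null; } 2>&1
}

measure_startup bash
```

Run it a few times and take the lowest number; the first run is often slower because nothing is in the disk cache yet.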

Not long after I applied my changes, PythonNut sent me a pull request on GitHub that replaced all my subshell calls to which with calls to hash. I had no idea about this functionality in zsh, and his patch completely negated any performance penalties I may have caused.
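The pattern looks something like this—the function name and file layout here are hypothetical, just to show the difference between the two approaches:

```shell
# `hash` is a shell builtin in both bash and zsh, so it can check whether
# a command exists without forking a subshell the way `$(which dstat)` does.
enable_transforms_for() {
  if hash "$1" 2>/dev/null; then
    echo "enabling $1 transformations"
    # source "$HOME/.zsh-dwim/transforms/$1.zsh"  # hypothetical path
  else
    echo "skipping $1 transformations"
  fi
}

enable_transforms_for ls
# prints: enabling ls transformations
```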

It could probably be argued that this isn’t really a speedup, and that zsh-dwim is just back to approximately its previous speed. I am still calling it a speedup. Functionality is slightly improved, and I was even able to replace some calls to which in my own startup scripts with calls to hash, too.

3D Printed Speaker and AC Adapter Brackets

| Comments

I wasn’t sure that I was going to write about this particular project. The finished part is serviceable enough, but it is far from perfect, and I was having a lot of trouble getting behind the desk to take a good “action shot.” The desk and monitors are in the corner of the room, so I thought it was going to require some significant gymnastics to get back there.

I bought some longer DVI cables to clean up my cable management. Climbing and stretching to attach the first cable to the monitor stand and feed it down the vertical mount was a surprisingly difficult task. When I started swapping out the second cable, I realized that I have an articulating monitor stand. All I had to do was swing the monitors out of the way, and I could do almost all of the cable routing from the comfort of my chair!

This also let me take a pretty good photo of the new speaker mounts.

The problem

The power adapters for my QNIX QX2710 monitors aren’t a convenient fit for my monitor stand. They are the sort of adapter with a short length of DC power cable attached to a relatively large power brick. That brick has a port that accepts a standard three-prong PC power cable.

I was hoping to hide both power adapters, but the short DC cable just isn’t long enough to follow the entire length of the monitor mount. My interim solution was to tape a power brick to the top of each arm. This works well enough, and they are hiding between the wall and the monitor. Out of sight and out of mind.

I recently acquired a pair of Altec Lansing VS2421 speakers, and I needed to put them somewhere—preferably hidden out of sight. I thought it would be a good idea to kill two birds with one stone, so I decided to make a combination mount for my speakers and power adapters.

The first design

The first design was not only my largest print so far, but it was also a huge failure. A 150-gram failure—that’s almost 10% of an entire spool of filament!

The speakers have a single keyhole-style mounting hole, and I wanted to be able to lay the speakers down on their backs. That single mounting point wasn’t going to be enough to hold them in place, but it was very easy to leave a hole in the bracket to accept an M3 screw. They also have a small rectangular notch near the bottom, and it was simple to print a block of plastic that fits snugly in that space.

The first design

The distance between these two points was going to make for a very tall mounting bracket, taller than anything I’ve ever printed.

I decided that I was going to design a sort of clamp-shaped bracket that would put the power adapter on top of the monitor arm, and the speaker would sit on top of the mount. I thought this would be a good design because the speakers would be closer to the top of the monitors.

The 12-hour print didn’t go so well. I had some layer separation a couple of hours in, but I let it continue to print anyway. I figured it would be alright if the brackets were a little ugly. I just wanted to see if they would work.

The first failed print

They didn’t work. I don’t know how I measured so badly. The printed part came out to precisely the dimensions I had specified in the OpenSCAD source code, but the channels just weren’t big enough to fit either the power adapters or the monitor arm!

I measured both of them again, and my original measurements were all way off. I don’t know how I did that. I measured everything at least twice the first time, so I have no good excuse here.

The second attempt

I decided not to print another big, chunky, solid bracket. It took too long, and it was a waste of plastic. This time, I designed a pair of brackets for each speaker: one S-shaped bracket for the M3 screw, and a second bracket for the rectangular stub.

The second design

Instead of placing both the speakers and power adapters above the arms, I ended up hanging the power brick directly behind the arms. I noticed when test-fitting the failed brackets that the speakers could impede the travel of the monitors, and this change of layout allowed me to move the speakers back by almost two inches.

This was a simple design, and it only took me about 15 minutes to go from an empty OpenSCAD source file to a working STL file. That even includes the time it took to look up how to create a function in OpenSCAD. I’m pretty excited. It seems like my skills are improving!

I printed two sets of these brackets, and they are currently doing their job.

The brackets aren’t perfect, but they’re good enough

I’m very happy with how the mounting points on the speakers meet up with the brackets. I’m not all that happy about how well they hold up the power adapters.

The brackets doing their job

The spacing for the speaker mounts has to be very precise, and those two mounting points barely fit on one section of the monitor arm. I was lazy. I knew that if I made the brackets narrow enough, the same set of brackets would work for both sides, and I wouldn’t have to mirror the parts.

This wasn’t the best choice. The brackets are a little too narrow and a bit too far apart to comfortably hold the power adapters. The power adapters only have to be nudged a bit before they start to fall through the center.

I’m not terribly worried about this, though. The cable management keeps things from moving around, and all this stuff is behind the monitors where I’ll never see it.

The brackets doing their job