Craft Coffee - September 2014


This is the second month in a row that I’ve opted to receive a single 12-ounce bag of coffee from the fine folks at Craft Coffee. It is definitely a safer choice now that I’m learning to use my Rancilio Silvia espresso machine, and I’ve wasted far less coffee this way, but I really miss tasting three different, delicious coffees every month.

Tasting them is always enjoyable, and it is much easier to write about three different coffees at the same time. I may have to look into increasing the volume of my subscription!

Thirty Thirty, Peoria, IL (Costa Rica)

Tart grapefruit acidity and blueberry jam flavors complement a crisp clean finish in this fruit forward cup.

I really like this coffee from Thirty Thirty. It seems like a pretty well-balanced cup of coffee to me. It is just a little bit tart, but I can easily pick out the blueberry that’s mentioned in Craft Coffee’s notes.

The blueberry flavor reminds me of the Ethiopian Sidamo coffee I received in June from Oren’s Daily Roast. It isn’t quite the same, though, and I think I can understand why Craft Coffee’s description actually says “blueberry jam.”

I’ve actually been pretty successful at making lattes with the Thirty Thirty coffee this month. I only choked my Rancilio Silvia on my first pull. The rest were all drinkable. The best double shots of Thirty Thirty that I’ve pulled were all ristrettos near the 1.5 ounce mark.

Use my referral code “pat1245” and you’ll get 15% off

If you use my referral code (pat1245) when checking out at Craft Coffee, you will get a 15% discount. Not only will you save money, but they tell me I’ll get a free month of coffee with every order. That sounds like a good deal for both of us!

The Nvidia GTX 970, Linux, and Overclocking QNIX QX2710 Monitors


I’ve been thinking about buying a new video card for a long time. I probably started seriously considering it the day my 1440p QNIX QX2710 monitors arrived. My old Nvidia GTX 460 card was fast enough to run most games at 2560x1440 with reasonably high settings. It wasn’t fast enough to enable antialiasing in most games, but the pixels are pretty small from where I’m sitting anyway.

There was something I was missing out on. The QNIX monitors can be overclocked. That means you can run them at refresh rates higher than 60 Hz. I’m told that some can be run at 120 Hz. That means that a new image is drawn on the screen 120 times every second—twice as fast as most monitors. My old Nvidia GTX 460 was too old for that, and it could only drive the panels at 60 Hz.

I’ve been keeping my eye on the Nvidia GTX 760 for the past few months. It would be new enough to overclock my QNIX monitors, and it would be fast enough to justify the upgrade. It just wasn’t going to be enough of an upgrade for me to get excited about, especially considering that I had never even seen the difference between 60 Hz and 120 Hz.

The new Nvidia Maxwell GPUs

Late last week, a friend of mine asked me if I’d seen the news about the GTX 970 and GTX 980 video cards. I hadn’t, so I looked them up. The GTX 980 is way beyond my needs, but the GTX 970 looked very interesting. They hit the shelves the next day, and I ordered a GTX 970 from Amazon almost immediately.

The GTX 760 cards that I’ve been interested in sell for around $230. The GTX 970 costs quite a bit more at $330, but I decided that I’d get a lot of value out of that extra $100. The GTX 970 is significantly faster than the older GTX 760, and it has twice as much video RAM. In fact, the GTX 970 is nearly comparable to the GTX 780 Ti that was selling for over $500 last week. The GTX 970 provides excellent bang for the buck.

Performance alone should be enough to justify buying a GTX 970, but even the cheaper GTX 760 is more than fast enough for my needs. The GTX 970 provides more than just a performance upgrade. The GTX 970 and GTX 980 both bring two interesting new features to the table: Multi-Frame Anti-Aliasing (MFAA) and Voxel Global Illumination (VXGI).

Multi-Frame Anti-Aliasing (MFAA)

I’m pretty excited about MFAA. Antialiasing is used to smooth out the jagged edges that appear where one 3D object is rendered on top of another. There are a number of different methods used to accomplish this. The most popular method is probably multisample antialiasing (MSAA).

Nvidia is claiming that 4xMFAA looks as good as 4xMSAA while only requiring as much extra processing power as 2xMSAA. I don’t know if these claims are true, but I am looking forward to trying it out for myself. Based on what Nvidia has said about MFAA so far, I was expecting to see an option in the nvidia-settings control panel to force the use of MFAA.

I haven’t been able to find such an option, so I will be keeping my eyes open. It sounds like this will show up in a driver update sometime in the near future.

Voxel Global Illumination (VXGI)

I’m much less excited about VXGI. It is a new way to implement dynamic lighting. It calculates things like reflections and shadows in real time. Not only will it make for better-looking visuals, but it sounds like it’s also much easier for artists and programmers to work with.

I’m not very excited about it, because it won’t improve existing games like MFAA will. It will only be available in new games that choose to support it. I still feel that it is a good reason to choose the GTX 970.

You don’t mess with the Zotac

I ended up ordering a Zotac GTX 970. I don’t think I’ve ever owned a product made by Zotac, but this wasn’t a well-planned purchase. I was awake for quite a while that day before I remembered that I wanted to buy a GTX 970. The Zotac video cards tended to be less expensive, and more importantly, there were still three of them in stock.

In the old days before DVI, HDMI, and DisplayPort, choosing a quality card was very important. We used to rely on the video card’s RAMDAC to generate the analog signal that would drive the VGA port. Low-quality RAMDACs made for fuzzy images and bleeding pixels.

We don’t have to worry about that with modern displays with digital connections, so I’m much less worried about buying a less expensive video card today. I’ll be sure to update this page if I ever experience a problem.

The Zotac GTX 970 came with a pair of adapters for converting 4-pin hard drive power cables into 6-pin GPU cables, and a single DVI to VGA adapter.

60 Hz vs. 96 Hz and beyond

I did two things immediately after installing my new Zotac GTX 970 card. First, I fired up Team Fortress 2, maxed out all the settings, and ran around a bit to see what kind of frame rates my fancy new graphics card could muster. Then I started monkeying around trying to get my monitors to run at 96 Hz.

It was actually a pretty simple task, but that’s a little out of scope for this particular blog post. I probably had to restart my X server two or three times, and I was up and running at 96 Hz. Once I verified that it was working, I jumped right back into Team Fortress 2.
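
Getting to 96 Hz is out of scope here, but for reference, the usual recipe on the proprietary Nvidia driver is a custom modeline plus relaxed mode validation in xorg.conf. Treat this as a sketch of the approach rather than my exact configuration; the timing numbers are illustrative, and QNIX overclockers typically tighten the blanking intervals to keep the pixel clock manageable.

    Section "Monitor"
        Identifier "QNIX"
        # Illustrative 2560x1440 @ 96 Hz modeline -- calculate and test your own.
        Modeline "2560x1440_96" 386.72 2560 2608 2640 2720 1440 1443 1448 1481 +hsync -vsync
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "QNIX"
        # The QNIX EDID only advertises 60 Hz, so the driver has to be told
        # to accept modes that the EDID doesn't list.
        Option "ModeValidation" "AllowNonEdidModes, NoEdidMaxPClkCheck, NoMaxPClkCheck"
        SubSection "Display"
            Modes "2560x1440_96"
        EndSubSection
    EndSection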

99.3 Hz refresh rate

It wasn’t very noticeable at first. When you connect to a server, your view is from a stationary camera, and I was watching RED and BLU guys wander around. Then I spawned and started walking around. That’s when you really notice the difference.

I won’t say that the 60% faster refresh rate is the most amazing thing ever, but it is definitely an improvement. The difference is easy to spot as you aim your gun around in the game. The first thing I said was, “This is buttery smooth.” It really is quite nice. It isn’t nearly as drastic as the jump from jittery, 30-frame-per-second console gaming to solid 60-frame-per-second PC gaming, but it is definitely a nice upgrade.

I did some more work, and I pushed my monitors up to about 100 Hz. I was trying for 120 Hz or 110 Hz, but neither would work for me. I was pretty close at 110 Hz, but I got a pretty garbled and jittery picture on the screen. I’m not unhappy, though. I’m quite happy at 100 Hz.

Steam for Linux game performance

I don’t really have anything in my Steam library that wasn’t playable with my old GTX 460 at 2560x1440. I had to run most of those games with the quality settings turned down a bit, so I spent some time today opening up games and adjusting their video settings as high as they would go.

Team Fortress 2 settings maxed out

Team Fortress 2 used to run at a little over 100 FPS, but it would drop down into the 50 FPS range when a lot was going on. With the new card, I turned everything up to the max and set it to use 8xMSAA. I have yet to see it dip below 100 FPS, and it is usually pushing 200 to 300 FPS.

Left 4 Dead 2 was pushing my old video card to its limits at 2560x1440, and that was with most of the settings turned down as low as they would go. It wasn’t running much better than 60 or 80 FPS. At the old settings, the new card didn’t drop below 280 FPS. I’m assuming the game is capped at 300 FPS.

After maxing out every setting, Left 4 Dead 2 is running at a solid 150 FPS with the new card. The game looks so much better, and it feels buttery smooth at 100 Hz!

I had similar success with Portal 2. I can’t say much about how it ran on my old video card because I’m pretty sure I played through the game on my laptop. I can say that with the settings maxed out, it doesn’t seem to drop below about 110 FPS. That was with me running around the room where the final “battle” takes place.
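
As an aside, for anyone trying to compare numbers: the Source engine games above (Team Fortress 2, Left 4 Dead 2, and Portal 2) can display a frame rate counter from the developer console, which can be enabled under Options > Keyboard > Advanced.

    // Source engine developer console commands:
    cl_showfps 1    // simple frame rate readout in the corner
    net_graph 1     // more detailed overlay with FPS and network stats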

I have quite a few games that don’t report their frames per second—games like Strike Suit Zero, Wasteland 2, and XCOM: Enemy Unknown. I maxed out all their video settings, and as far as I can tell, they’re running quite well.

Shaming two games with bad performance

One of the games that my friends and I have played a lot of is Killing Floor. The Linux port of Killing Floor runs reasonably well for the most part. There are some maps that just run very poorly. One in particular is the Steamland “Objective Mode” map.

Some areas on the map run at less than 30 FPS with my laptop’s Nvidia GT 230M video card. Those same areas run at less than 30 FPS on my desktop with my old GTX 460 card, and those areas of the map run just as poorly with my GTX 970. It is quite a disappointment.

The other game is Serious Sam 3: BFE. I haven’t played this game much because it runs poorly and crashes all the time. It still runs poorly even with my new GTX 970. I was wandering around near the beginning of the game while smashing things with my sledgehammer.

I kept adjusting various video quality settings while I was doing this. The autodetected settings ran at less than 30 FPS. I kept turning things down one notch at a time waiting to see some reasonable performance. The game became very ugly by the time the frame rate got into the 50-to-70-FPS range. That’s when I gave up.

Conclusion

I’m very happy with my Zotac GTX 970 purchase, even if the video card is now the most expensive component in my computer. It is almost an order of magnitude more powerful than the card it replaced, and it even manages to generate less heat. That’s actually a very nice bonus during the hotter days of the year here in my south-facing home office in Texas. This is the warmest room in the house, and long gaming sessions manage to bring the temperature up a couple of degrees past comfortable.

My computer isn’t the only machine in the house that will get an upgrade out of this. I’ll be moving the GTX 460 into my wife’s computer, and her Nvidia GT 640 will be moving into the arcade cabinet. I’m pretty excited about the arcade cabinet upgrade because I will now be able to route the arcade audio through its wall-mounted television. Upgrades that trickle down are always the best upgrades!

The GTX 970 is an excellent match for my pair of QHD QNIX QX2710 monitors. Finally being able to take advantage of their overclockability is awesome, and I should have no trouble running new games at 1440p for at least the next couple of years. Now I just have to hope that they eventually release Grand Theft Auto 5 for Linux and SteamOS!

Self-Hosted Cloud-Storage Comparison - 2014 Edition


Early last year, I decided that I should replace Dropbox with a cloud-storage solution that I could host on my own servers. I was hoping to find something scalable enough to store ALL the files in my home directory, and not just the tiny percentage of my data I was synchronizing with Dropbox. I didn’t expect to make it all the way to that goal, but I was hoping to come close.

I wrote a few paragraphs about each software package as I was testing them. That old cloud-storage comparison post is almost 18 months old and is starting to feel more than a little outdated. All of the software in that post has improved tremendously, and some nice-looking new projects have popped up during the last year.

All of these various cloud-storage solutions solve a similar problem, but they tend to attack that problem from different directions, and their emphasis is on different parts of the experience. Pydio and ownCloud seem to focus primarily on the web interface, while Syncthing and BitTorrent Sync are built just for synchronizing files and nothing else.

I am going to do things just a little differently this year. I am going to do my best to summarize what I know about each of these projects, and I am going to direct you towards the best blog posts about each of them.

This seemed like it would be a piece of cake, but I was very mistaken. Most of the posts and reviews I found were just rehashes of the official installation documentation. That is not what I am looking for at all. I’m looking for opinions and experiences, both good and bad. If you know of any better blog posts that I can reference for any of these software packages, I would love to see them!

Why did these software packages make the list?

I don’t want to trust my data to anything that isn’t Open Source, and I don’t think you should, either. That is my first requirement, but I did bend the rules just a bit. BitTorrent Sync is not Open Source, but it is popular, scalable, and seems to do its job quite well. It probably wouldn’t have made my list if I weren’t using it for a few small tasks.

I also had to know that the software exists. I brought back everything I tried out last year, and I added a few new contenders that were mentioned in the comments on last year’s comparison. If I missed anything interesting, please let me know!

Seafile

I am more than a little biased towards Seafile. I’ve been using Seafile for more than a year, and it has served me very well in that time. I am currently synchronizing roughly 12 GB of data spread across 75,000 files using my Seafile server. The 12 GB part isn’t very impressive, but when I started using Seafile, tens of thousands of files was enough to bring most of Seafile’s competition to its knees.

Seafile offers client-side encryption of your data, which I believe is very important. Encryption is very difficult to implement correctly, and there are some worries regarding Seafile’s encryption implementation. I am certainly not an expert, but I feel better about using Seafile’s weaker client-side encryption than I do about storing files with Dropbox.

Features and limitations:

  • Client-side encryption
  • Central server with file-revision history
  • Web interface with file management and link sharing
  • Dropbox-style synchronization (tens of thousands of files)
  • Won’t sync files with colons in their name


ownCloud

ownCloud’s focus is on their web interface. As far as I can tell, ownCloud is leaps and bounds ahead of just about everyone else in this regard. They have tons of plugins available, like music players, photo galleries, and video players. If your primary interest is having an excellent web interface, then you should definitely take a look at ownCloud.

ownCloud has a Dropbox-style synchronization client. I haven’t tested it in over a year, but at that time it didn’t perform nearly well enough for my needs. The uploading of my data slowed to a crawl after just a couple thousand files. This is something the ownCloud team has worked on, and things should be working better now. I’ve been told that using MariaDB instead of SQLite will make ownCloud scale beyond just a few thousand files.

Features and limitations:

  • Central server with file-revision history
  • Web interface with file management, link sharing, and dozens of available plugins
  • Collaborative editing (see the Linux Luddites podcast for opinions)
  • Dropbox-style synchronization, may require MariaDB to perform well


SparkleShare

SparkleShare was high on my list of potential cloud-storage candidates last year. It uses Git as a storage backend, is supposed to be able to use Git’s merge capabilities to resolve conflicts in text documents, and it can encrypt your data on the client side. This seemed like a winning combination to me.

Things didn’t work out that well for me, though. SparkleShare loses Git’s advanced merging when you turn on client-side encryption, so that advantage was going to be lost on me.

Also, SparkleShare isn’t able to sync directories that contain Git repositories, and this was a real problem for me. I have plenty of clones of various Git repositories sprinkled about in my home directory, so this was a deal breaker for me.

It also looks as though SparkleShare isn’t able to handle large files.

Features and limitations:

  • Central server with file-revision history
  • Web interface with file management and link sharing
  • Dropbox-style synchronization (large files will be problematic)
  • Data stored in Git repo (merging of text files is possible)


Pydio (formerly AjaXplorer)

Pydio seems to be in pretty direct competition with ownCloud. Both have advanced web interfaces with a wide range of available plugins, and both have cross-platform Dropbox-style synchronization clients. I have not personally tested Pydio, but they claim their desktop sync client can handle 20k to 30k files.

Pydio’s desktop sync client is still in beta, though. In my opinion, this is one of the most important features that a cloud-storage platform can have, and I’m not sure that I’d want to trust my data to a beta release.

Features and limitations:

  • Central server with file-revision history
  • Web interface with file management and link sharing
  • Dropbox-style synchronization with Pydio Sync (in beta)

For more information:

  • Pydio’s official website

git-annex with git-annex assistant

I first learned about git-annex assistant when it was mentioned in a comment on last year’s cloud-storage comparison post. If I were going to use something other than Seafile for my cloud-storage needs, I would most likely be using git-annex assistant. I have not tested git-annex assistant, though, and I don’t know if it is scalable enough for my needs.

git-annex assistant supports client-side encryption, and the Kickstarter project page says that it will use GNU Privacy Guard to handle the encryption duties. I have a very high level of trust in GNU Privacy Guard, so this would make me feel very safe. It also looks like git-annex assistant stores your remote data in a simple Git repository. That means the only network-accessible service required on the server is your SSH daemon.

Much like SparkleShare, git-annex assistant stores your data in a Git repository, but git-annex assistant is able to efficiently store large files—that’s a big plus.

Dropbox-style synchronization is the primary goal of git-annex assistant. It sounds like it does this job quite well, but it is lacking a web interface on the server. This puts it on the opposite end of the spectrum from ownCloud or Pydio.

There is one big downside to git-annex assistant for the average user: it is built on top of git-annex, and git-annex does not support Microsoft Windows.

Features and limitations:

  • Central server with file-revision history
  • No web interface on the server
  • Dropbox-style synchronization


BitTorrent Sync

I like BitTorrent Sync. It is simple, fast, and scalable. It does not store your data on a separate, centralized server. It simply keeps your data synchronized on two or more devices.

It is very, very easy to share files with anyone else that is using BitTorrent Sync. Every directory that you synchronize with BitTorrent Sync has a unique identifier. If you want to share a directory with any other BitTorrent Sync user, all you have to do is send them the correct identifier. All they have to do is paste it into their BitTorrent Sync client, and they will receive a copy of the directory.

Unfortunately, BitTorrent Sync is the only closed-source application on my list. It is only included because I have been using it to sync photos and backup directories on my Android devices. I’m hoping to replace BitTorrent Sync with Syncthing.

Features and limitations:

  • No central server but can use third-party server to locate peers
  • Very easy to set up
  • Solid, fast synchronization without extra bells and whistles
  • Closed source


Syncthing

Syncthing is the Open Source competition for BitTorrent Sync. They both have very similar functionality and goals, but BitTorrent Sync is still a much more mature project. Syncthing has yet to reach its 1.0 release.

I’ve been using Syncthing for a couple of weeks now. I’m using it to keep my blog’s Markdown files synchronized to an Ubuntu environment on my Android tablet. It has been working out just fine so far, and I’d like to use it in place of BitTorrent Sync.

It does require a bit more effort to set a directory up to be synchronized than with BitTorrent Sync, but I hope they’ll be able to streamline this in the future. This is definitely a project to keep an eye on.

Features and limitations:

  • Still in beta
  • No central server but can use third-party server to locate peers
  • Easy to set up
  • Synchronization without extra bells and whistles
  • Like BitTorrent Sync, but Open Source


Conclusion

There’s no clear leader in the world of self-hosted cloud-storage. If you need to sync several tens of thousands of files like I do, and you want to host that data in a central location, then Seafile is really hard to beat. git-annex assistant might be an even more secure competitor for Seafile—assuming it can scale up to a comparable number of files.

If you don’t want or need that centralized server, BitTorrent Sync works very well, and it should have no trouble scaling up to sync the same volumes of data as Seafile, and I bet it won’t be too long before Syncthing is out of beta and ready to meet similar challenges. Just keep in mind that with a completely peer-to-peer solution, your data won’t be synchronized unless at least two devices are up and running simultaneously.

If you’re more interested in working with your data and collaborating with other people within a web interface, then ownCloud or Pydio might make more sense for you. I haven’t heard many opinions on Pydio, either good or bad, but you might want to hear what the guys over at the Linux Luddites podcast have to say about ownCloud in their 24th episode. The ownCloud discussion starts at about the 96-minute mark.

Are you hosting your own “cloud storage”? How is it working out for you? What software are you using? Leave a comment and let everyone know what you think!

zsh-dwim: Now With Faster Startup Times, Modular Configuration, and More


It has been quite a while since I’ve made any updates to zsh-dwim. A few weeks ago, a feature request from PythonNut showed up in zsh-dwim’s issue tracker on GitHub. He asked me if I wouldn’t mind splitting the implementation and configuration details of zsh-dwim out into separate files.

I thought this was a great idea, and I’ve had similar thoughts in the past. What I was really interested in doing was breaking up the long, monolithic set of transformation definitions into separate files. That way you could easily disable each piece of functionality individually.

I’ve been thinking this would be a good idea for quite a while, but it hasn’t been a priority for me. As far as I knew, I was the only person using zsh-dwim. When the feature request came in, I was happy to see a handful of stars and forks on the zsh-dwim project. That was more than enough of an excuse for me to put in a bit of time on the project.

What is zsh-dwim?

I haven’t written about zsh-dwim in quite a while, so it might be a good idea to explain what it is. The “dwim” in zsh-dwim stands for “Do What I Mean.” I borrowed the term from the many Emacs functions with “dwim” in their names. The idea is that when you’re writing a command, or you just finished executing a command, you can just hit one key and zsh-dwim will try to guess what you want to happen.

If you just created a new directory or untarred a file, hitting the zsh-dwim key will attempt to cd into the new directory. Maybe you tried to echo a value into a file in /proc or /sys, and you forgot that only root can do that. Hitting the zsh-dwim key will convert your > into a | sudo tee for you.

If you hit the zsh-dwim key on an empty command line, it will apply its logic to your previous command. If you are already in the middle of editing a command, then zsh-dwim will attempt to apply its logic to it. zsh-dwim will also try to move your cursor to the most logical position.
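
zsh makes this style of command-line rewriting fairly approachable through ZLE widgets. The following is not zsh-dwim’s actual source, just a minimal sketch of the idea using the sudo tee transformation as an example.

    # A toy "do what I mean" widget -- a sketch, not zsh-dwim itself.
    _my_dwim() {
      # An empty command line means "apply the logic to my previous command."
      [[ -z $BUFFER ]] && BUFFER=$(fc -ln -1)
      if [[ $BUFFER == *'>'* ]]; then
        # Rewrite `echo 1 > /sys/...` into `echo 1 | sudo tee /sys/...`.
        BUFFER=${BUFFER/>/| sudo tee }
      else
        # Fallback: prepend sudo, but show it to me before I run it.
        BUFFER="sudo $BUFFER"
      fi
      zle end-of-line   # move the cursor somewhere sensible
    }
    zle -N _my_dwim
    bindkey '^U' _my_dwim   # control-u is my controversial default; pick your own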

The commands aren’t executed automatically. You still have to hit the enter key. I’m a big believer in the principle of least surprise. I’d much rather see what is going to be executed before I commit myself to it. I am not a big fan of the sudo !! idiom. That’s why zsh-dwim’s fallback simply adds a sudo to the front of the command. I feel better seeing the actual command that I’m about to run with root privileges.

The more modular design

First of all, as per PythonNut’s request, the configuration has been separated out from the inner working of zsh-dwim. This will make it easier to choose your own zsh-dwim key—my choice of control-u is surprisingly controversial!

I’ve also grouped the related transformations together and put them into separate files. All the transformations are enabled by default, and each file is sourced in the config file. If you don’t like some of my transformations, you can easily comment them out.

zsh-dwim is a bit smarter

I also added some checks to make sure that useless transformations are never used. To accomplish that, zsh-dwim now checks to make sure the appropriate commands are installed before enabling certain transformations. You don’t want to be offered apt-get and dpkg commands on a Mac, and you don’t want to be offered dstat commands when you only have vmstat available.

I put these checks in the config file. I waffled on this decision quite a bit. The config file would be cleaner if I put these checks in the files that define the transformations. That way you’d only have to comment out a single line to manually disable them, but I thought that might get confusing. I didn’t want someone to enable a set of transformations and wonder why they’re not working.
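
The upshot is a config file along these lines. The file names and widget name here are made up for illustration, but the layout mirrors what I described above.

    # Choose your own zsh-dwim key here.
    bindkey '^U' dwim-key

    # Each group of related transformations lives in its own file;
    # comment out a line to disable that group.
    source "$ZSH_DWIM_PATH/transforms/general"

    # Guarded groups: only enable transformations whose commands exist.
    if which apt-get > /dev/null 2>&1; then
      source "$ZSH_DWIM_PATH/transforms/apt"
    fi
    if which dstat > /dev/null 2>&1; then
      source "$ZSH_DWIM_PATH/transforms/dstat"
    fi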

Speeding things up a bit thanks to PythonNut

The new brains in zsh-dwim made zsh startup times slower by almost 0.01 seconds on my computers. That’s almost 15% slower on my desktop and about 5% slower on my laptop. This was caused by calling out to the which command to verify whether certain programs were installed.

I wasn’t too happy about this. I open terminal windows dozens, maybe hundreds, of times each day. I like my terminal windows to be ready instantly. When I was using oh-my-zsh, I put in some work to improve its startup time. It was pretty slow back in 2012, and I submitted a patch that shaved 1.2 seconds off of oh-my-zsh’s startup time. By comparison, the extra 0.01 seconds I added to my Prezto startup time was minimal. I still didn’t like it, but it was acceptable for now.
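
Startup times like these are easy to measure. The standard trick is timing how long an interactive shell takes to start up and immediately exit:

    time zsh -i -c exit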

Not long after I applied my changes, PythonNut sent me a pull request on GitHub that replaced all my subshell calls to which with calls to hash. I had no idea about this functionality in zsh, and his patch completely negated any performance penalties I may have caused.
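
The difference looks roughly like this; the variable being set is hypothetical, but the pattern is the point. A $(...) command substitution forks a subshell for every check, while hash is a zsh builtin that consults the shell’s own command table without spawning anything.

    # Before: every check pays for a fork via the command substitution.
    [[ -x "$(which dstat 2> /dev/null)" ]] && dwim_dstat_available=1

    # After: a builtin answers the same question with no new process.
    hash dstat 2> /dev/null && dwim_dstat_available=1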

It could probably be argued that this isn’t really a speedup, and that zsh-dwim is just back to approximately its previous speed. I am still calling it a speedup. Functionality is slightly improved, and I was even able to replace some calls to which in my own startup scripts with calls to rehash, too.

3D Printed Speaker and AC Adapter Brackets


I wasn’t sure that I was going to write about this particular project. The finished part is serviceable enough, but it is far from perfect, and I was having a lot of trouble getting behind the desk to take a good “action shot.” The desk and monitors are in the corner of the room, so I thought it was going to require some significant gymnastics to get back there.

I bought some longer DVI cables to clean up my cable management. Climbing and stretching to attach one of the cables to the monitor stand and feed it down the vertical mount was a surprisingly difficult task. When I started swapping out the second cable, I realized that I have an articulating monitor stand. All I had to do was swing the monitors out of the way, and I could do almost all of the cable routing from the comfort of my chair!

This also let me take a pretty good photo of the new speaker mounts.

The problem

The power adapters for my QNIX QX2710 monitors aren’t a convenient fit for my monitor stand. They are the sort of adapter with a short length of DC power cable attached to a relatively large power brick. That brick has a port that accepts a standard three-prong PC power cable.

I was hoping to hide both power adapters, but the short DC cable just isn’t long enough to follow the entire length of the monitor mount. My interim solution was to tape a power brick to the top of each arm. This works well enough, and they are hiding between the wall and the monitor. Out of sight and out of mind.

I recently acquired a pair of Altec Lansing VS2421 speakers, and I needed to put them somewhere—preferably hidden out of sight. I thought it would be a good idea to kill two birds with one stone, so I decided to make a combination mount for my speakers and power adapters.

The first design

The first design was not only my largest print so far, but it was also a huge failure. A 150-gram failure—that’s almost 10% of an entire spool of filament!

The speakers have a single keyhole-style mounting hole, and I wanted to be able to lay the speakers down on their backs. That single mounting point wasn’t going to be enough to hold them in place, but it was very easy to leave a hole in the bracket to accept an M3 screw. They also have a small rectangular notch near the bottom, and it was very easy to print a block of plastic that would fit snugly in that space.

The first design

The distance between these two points was going to make for a very tall mounting bracket, taller than anything I’ve ever printed.

I decided that I was going to design a sort of clamp-shaped bracket that would put the power adapter on top of the monitor arm, and the speaker would sit on top of the mount. I thought this would be a good design because the speakers would be closer to the top of the monitors.

The 12-hour print didn’t go so well. I had some layer separation a couple of hours in, but I let it continue to print anyway. I figured it would be alright if the brackets were a little ugly. I just wanted to see if they would work.

The first failed print

They didn’t work. I don’t know how I measured so badly. The printed part came out to precisely the dimensions I had specified in the OpenSCAD source code, but the channels just weren’t big enough to fit either the power adapters or the monitor arm!

I measured both of them again, and my measurements were all way off! I don’t know how I did that. I measured everything at least twice, so I have no good excuse here.

The second attempt

I decided not to print another big, chunky, solid bracket. It took too long, and it was a waste of plastic. This time, I designed a pair of brackets for each speaker: one S-shaped bracket for the M3 screw, and a second bracket for the rectangular stub.

The second design

Instead of placing both the speakers and power adapters above the arms, I ended up hanging the power brick directly behind the arms. I noticed when test-fitting the failed brackets that the speakers could impede the travel of the monitors, and this change of layout allowed me to move the speakers back by almost two inches.

This was a simple design, and it only took me about 15 minutes to go from an empty OpenSCAD source file to a working STL file. That even includes the time it took to look up how to create a function in OpenSCAD. I’m pretty excited. It seems like my skills are improving!
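
As a side note, the OpenSCAD GUI isn’t even required for that last step; the STL can be rendered straight from the command line (the file names here are made up):

    openscad -o speaker-bracket.stl speaker-bracket.scad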

I printed two sets of these brackets, and they are currently doing their job.

The brackets aren’t perfect, but they’re good enough

I’m very happy with how the mounting points on the speakers meet up with the brackets. I’m not all that happy about how well they hold up the power adapters.

The brackets doing their job

The spacing for the speaker mounts has to be very precise, and those two mounting points barely fit on one section of the monitor arm. I was lazy. I knew that if I made the brackets narrow enough, the same set of brackets would work for both sides, and I didn’t want to have to mirror the parts.

This wasn’t the best choice. The brackets are a little too narrow and a bit too far apart to comfortably hold the power adapters. The power adapters only have to be nudged a bit before they start to fall through the center.

I’m not terribly worried about this, though. The cable management keeps things from moving around, and all this stuff is behind the monitors where I’ll never see it.

The brackets doing their job

Craft Coffee - August 2014


Things are different this month. My Rancilio Silvia is much more difficult to tune in, and I ended up wasting the first ounce or two in each bag of coffee last month. Between that and the random mishaps, I didn’t get to drink very much of the Craft Coffee last month.

I changed my Craft Coffee subscription so that they’d send me a single 12-oz bag instead of a selection of three 4-oz pouches. I think this worked out pretty well, and I’m getting better at using my new Rancilio Silvia. I only choked up the machine on the first shot, and then I had something very drinkable on my second attempt.

I’m still not very consistent, but I’m getting lots of practice!

Mountain Air, Asheville, NC (Nariño, Colombia)

A brazenly bright and juicy lemon-lime acidity, a rich buttery texture, and flavor dynamics ranging from tropical fruit and maple to hidden savory complexities of edamame and plantain harmoniously delight the palate–all while maintaining a juicy sweetness and balance.

The coffee from Mountain Air is delicious. In fact, I made at least two different lattes using these beans that made me say, “This is the best latte I’ve ever made.” That isn’t just because of the beans, though. I’m also getting better at using my Rancilio Silvia. This is definitely some most excellent coffee, but I can’t make a very direct comparison to any of the past offerings.

The notes from Craft Coffee weren’t very helpful for me this month. I have no idea what edamame tastes like, and I have had very little exposure to plantains. There may be a bit of fruitiness in there, but I’m definitely picking up the maple—it is probably why I keep thinking of breakfast.

I’m drinking my final Mountain Air latte right now. It is not the best latte I’ve made this month, but it is still pretty good.

Use my referral code “pat1245” and you’ll get 15% off

If you use my referral code (pat1245) when checking out at Craft Coffee, you will get a 15% discount. Not only will you save money, but they tell me I’ll get a free month of coffee with every order. That sounds like a good deal for both of us!

One Month With My Rancilio Silvia


My next batch of beans from Craft Coffee should be arriving soon, so I thought this would be a good time to post an update on how things are going with my new espresso machine and grinder. Things were off to a pretty bumpy start, but I’ve learned a lot, and I’m making some pretty tasty lattes.

A latte

Last month’s Craft Coffee assortment

I ruined a lot of shots with last month’s Craft Coffee selection. If I managed to pull one good shot with each of the three small pouches of beans, then I’d say I was doing well. I think it is more likely that I got one good latte out of the entire shipment, and that’s a bit disappointing.

Up until last month, every coffee I’ve received from Craft Coffee has been both interesting and delicious. I modified my subscription, and I will be receiving a single 12-oz bag of coffee this month. I’m hoping this will keep me from screwing things up!

I highly recommend Craft Coffee. They’ve sent me a lot of very delicious coffee for a very reasonable price. If you use my referral code, pat1245, you will save 15% on your order, and I will receive a free month of coffee. That seems like a pretty good deal for both of us.

Central Market in-house roasted Mexico Custapec

The Craft Coffee quickly ran out, so I decided it would be best to stop by the Central Market and get a large quantity of one or two coffees of reasonable quality to experiment with. Chris picked out a Central Market branded coffee. The beans were from Mexico, and the sign on the bin claimed it was low acidity. We bought about a half pound of it.

I did a much better job with this coffee. Things didn’t go well at first, but after about five or six double shots, things were much improved, and I was getting pretty consistent results. They weren’t the best results, but they were definitely drinkable.

Addison Coffee Roasters – Premium Espresso Blend

I definitely preferred lighter coffees with my old espresso machine, but I’ve read that it is easier to tune in a darker roast on a real espresso machine. I’ve used several pounds of Addison Coffee Roasters’ Premium Espresso Blend in the past, so I knew what to expect. It was never my favorite, but I thought it would be a good experiment. We bought about 12 oz.

This was a very oily coffee, and it was a bit harder to dose into the portafilter. The other coffees would level out with a bit of a shake, but this coffee stuck together too much for that. It took three or four shots to get the grind about right, but things were pretty consistent after that.

I tried changing my “temperature surfing” a little to improve the flavor. I went from my usual 40-second wait up to a full minute. This seemed to help a bit, but not enough. I also decided to try “start dumping” the coffee. This made a HUGE difference in the flavor. This is probably cheating, but it definitely works!

I should also note here that no matter what I did, these pucks were always wet. They had a tiny pool of water on top, and they would drop right out of the portafilter. I can’t say that this had any impact on the taste, because the good and bad shots were all equally wet.

I think I prefer a ristretto

The Internet seems to have mixed feelings about what precisely a ristretto espresso actually is. It is definitely a smaller volume of coffee—1 to 1.5 ounces—but how you arrive there seems to be under question. Some say it should still take 25 to 30 seconds to pull a ristretto, while others say you just have to stop the machine sooner. I think I’ve landed somewhere in the middle.

I’ve not weighed anything, but I’ve been happier stopping my machine when my 2-ounce cup is somewhere between ½ and ¾ full. It takes somewhere between 20 and 25 seconds. This seems to be right around where the best lattes have happened. In fact, I think the best ones come in closer to one ounce at 25 seconds.

I thought maybe my coffee-to-milk ratio was just too low at one point, and I pulled out my big 16-oz mug. The extra four ounces of milk mixed with a full-sized 2-oz double shot was not an improvement.

Java Pura – Ethiopian Yirgacheffe

We ran out of coffee over the weekend, so we made another trip to the Central Market. I found a shelf full of 12-oz bags of coffee beans from Java Pura for $10 each. I’d never heard of them before, but each bag was stamped with a roasting date, and that seemed promising. I found a bag of Ethiopian Yirgacheffe that was roasted on July 17—the same day my Rancilio Silvia arrived.

I knew this coffee would be good as soon as I opened the bag. It smelled delicious, and the bag was full of tiny, light brown Yirgacheffe beans.

It took quite a few tries to get the grind right. I had to move up almost three full clicks on the Baratza Preciso before I could get much water to pass through the puck. Once I did, though, it was delicious. This coffee from Java Pura is definitely on par with the brands that Craft Coffee sends me. I most definitely won’t have to “start dump” with this coffee!

I think I made one of the best lattes of my life using this coffee. That was yesterday, and I haven’t been able to quite replicate that success today. I have plenty of beans left in the hopper, and I’m confident that I’ll do it again.

The Baratza Preciso grinder was an excellent choice

I am very pleased with the Baratza Preciso. I may be a beginner, but the microadjustment lever seems like it will be indispensable. The difference between choking the machine and a pretty good ristretto is only two or three clicks of the microadjustment lever. I can’t imagine trying to tune in a shot with only the macro settings.

I know there are other grinders that don’t have fixed settings, and I don’t think I’d like that. I’m making notes for myself, and I know that I was using a setting of 3H for the Premium Espresso Blend. If I ever buy more, I will have a good starting point. I’m using a setting somewhere around 5C with the lighter roast from Java Pura.

3D printing a better tamper

The tamper that comes with the Rancilio Silvia has a convex bottom, and it isn’t big enough to fit the basket. This leads to inconsistent tamping, so I knew I’d want to upgrade the tamper right away. Instead of buying one, I decided to print one, and I’m glad I did.

The first tamper that I printed was just a hair over 58 mm in diameter, which is the size everyone recommends for the Silvia. This seemed a little too big for the Silvia’s tapered double-shot basket. It would get hung up on the sides of the basket unless I dosed pretty heavily—heavily enough that the puck would come into contact with the screen.

This was no good. If I had bought that tamper instead of printing it, I would have had to order another one. Instead, I printed another tamper the same day. This one measures 57.3 mm in diameter, and it has been working out pretty well. This tamper can push the coffee down just below the groove in the filter basket.

I am probably going to print one more that is just under 57 mm, or I might try to add a slight bevel to the edge. I haven’t decided which way to go.

Performance and Efficiency Improvements in Seafile 3.1


I have been using Seafile for my synchronization, backup, and storage needs for more than a year. The near real-time backups of my data that Seafile provides me with have saved me a lot of time on several occasions, and having nearly 100% of my data available and up to date on both my laptop and desktop computers has made traveling for long periods of time much more pleasant.

Bandwidth usage is a little high after recreating my libraries

When I adopted Seafile, it was probably the only Open Source Dropbox-like software available that had enough scalability to meet my requirements. The other Open Source cloud storage platforms have made big strides in this area over the past year, but it looks like Seafile has made a significant leap of its own with regard to synchronization speed and scalability.

Upgrading my old Seafile 2.0 libraries

Seafile 3.1 has improved performance in many areas, but you’ll only see those improvements if your files are stored in the new Seafile 3.0 or newer library format. Most of my libraries were created with Seafile 2.0, so I had to do some legwork if I wanted to see the maximum benefit from my upgrade to Seafile 3.1.

There is no option to convert an old library to the new format, and I don’t know if there is an official “best practice” for updating your libraries. This is what I did for each library:

  • Unsync the library on all my computers
  • Rename the library on the server by prepending 2.0 to the name
  • Create a new encrypted library on my desktop computer
  • Add the new library to my laptop

For most of the libraries, I used the “sync with existing folder” option on my laptop. That way it wouldn’t have to download the files all over again. I did choose to download a few of my libraries from scratch to test the performance.
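
I did all of this through the desktop client. If you’d rather work from a terminal, my understanding is that Seafile’s seaf-cli client can do roughly the same thing; treat these invocations as a sketch, with placeholder server, user, and library ID:

    # Download a library from scratch into a parent folder:
    seaf-cli download -l <library-id> -s https://seafile.example.com \
        -d ~/libraries -u user@example.com

    # Or sync an existing local folder with an existing library:
    seaf-cli sync -l <library-id> -s https://seafile.example.com \
        -d ~/some/existing-folder -u user@example.com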

Bandwidth utilization while uploading a new library to Seafile

Seafile had no trouble utilizing my entire 75-megabit FiOS upload limit, and it had no trouble maxing out my laptop’s Wi-Fi connection. I remember doing these same uploads over a year ago. The Seafile client would max out my Internet connection for a little while, then it would idle for almost as long, and it would repeat that several times before the upload would complete. Those idle periods are much smaller with Seafile 3.1. They’re short enough that you’ll miss them on the bandwidth meter if you blink.

The uploading of large directories has also been drastically improved. If you had a 30-GB directory, the older versions of Seafile would copy all of that data to another location on your local machine before uploading it to the server. This was almost problematic for me when I only had an 80-GB solid-state drive. Also, none of your data would be available for syncing to other machines until the entire library was uploaded.

Seafile 3.1 has eliminated this issue. It now uploads your new libraries in smaller 100 MB chunks. Each of those chunks of data is available right away, and it looks like other clients can sync with the library before the upload is complete. This is a huge improvement.

Syncing directories with rapidly changing files

There is a heading in the Seafile 3.1 release notes that reads “Efficient File Change Detection for Large Libraries,” and I’m very excited about this particular improvement. I have several problematic directories that I synchronize to my Seafile server. These directories have been problematic because they have rapidly updating files.

One of them is my Pidgin directory. Pidgin generates log files for every conversation and chat room, and I idle in quite a few IRC channels. I tried to sync this directory with Seafile 2.0, but every time a log file got updated, Seafile would scan every file for changes. This was very inefficient, and it generated a lot of I/O and CPU overhead.

I had some manual workarounds in place to get around this issue. I had things set up so that Seafile would only sync my .purple and .config directories once every few hours. Since the Seafile release notes talked about improvements in this area, I decided to eliminate those workarounds and sync these directories in real time.

This change has been working out well so far. Seafile has been syncing these busy directories for a few days now without being a noticeable drain on resources. I’m one step closer to the dream of being able to sync up my entire home directory!

Other improvements to syncing

My laptop has been a nuisance ever since I downgraded it from a solid-state drive back to its original 7200 RPM disk. I have it aggressively starting quite a few services during its boot process, and it also logs me right in and launches a handful of programs. With the SSD, this didn’t slow the machine down and saved me some time and effort. With the spinning disk, the laptop doesn’t get to a usable state until the X server has been up for almost an entire minute.

One of the issues here was Seafile. When Seafile fires up, it scans through all of your libraries looking for anything that might have changed since it was shut down. I have tens of thousands of files that need to be checked, and that takes some time and generates quite a few I/O operations.

This seems to have been streamlined quite a bit in Seafile 3.1. My laptop is now ready and responsive immediately after my Pidgin and web browser windows pop up. This is another big improvement for me.

Conclusion

Seafile has served me well. It has always performed well enough for my needs, and it hasn’t lost any of my data. Even so, the performance improvements offered by Seafile 3.1 make for an excellent upgrade, and upgrading my deprecated libraries was well worth the small effort.

Are you using Seafile? Are you using another cloud storage or synchronization solution? Leave a comment and tell us how it is working out for you!

FiiO E07K Andes USB DAC/Amplifier


I have been complaining about computer audio for quite a while. Routing my headphone and microphone cabling around my desk has been a nuisance, and headphone static has been a constant problem whenever my headset’s microphone is plugged in. On the rare occasions when I want to use my gaming headset for some multiplayer PS3 gaming, I have to fish out the headset’s extension cable and amplifier. This is troublesome because the cable is routed all the way around three sides of my desk.

My friend Brian must have gotten sick of listening to me complain, because he bought me a FiiO E07K for my birthday. I’ve been shopping for some sort of USB sound card for quite a while now, and I’ve been having trouble finding one that fits my needs. I was hoping to not just find a quality DAC for my headphones, but also an ADC to plug my microphone into.

Units with both features are hard to find. I found some really cheap hardware with both audio input and output, usually marketed as USB Skype adapters, but I didn’t think those would be much of an upgrade.

FiiO E07K Andes DAC

There are some fairly high-end DAC units that also feature an ADC, but they didn’t fit my needs even at their much higher prices. They are aimed at a market way above what I’m looking for. They have RCA and 1/4-inch line-in and line-out ports, and I’m just looking to plug in a simple 1/8-inch, 4-conductor headset.

I had a lot of trouble deciding to pull the trigger on one of the units in between these two extremes because they don’t quite meet my needs. Brian knew I was looking at the FiiO E07K Andes amplifier, and we can all thank him for pulling the trigger for me!

The hardware

The FiiO E07K is a small aluminum box that’s right around the size of your average smartphone, but just a little shorter and thicker. It has a pair of amplified headphone outputs on one end, and an audio input and mini USB port on the other. It feels like a very solid piece of hardware. It also has a small OLED screen on the front, but that is something I don’t really need.

The E07K is also equipped with a rechargeable battery. The unit ships with heavy-duty rubber bands and a short 1/8-inch audio cable, so you can strap the E07K to your phone and connect the two together. I tried this out with my Nexus 4, and I have to say that there isn’t a significant difference in audio quality over the phone’s own headphone output. It would probably allow you to crank the volume up pretty high, but I’m not one to blast music in my ears. This feature is lost on me.

The FiiO E07K is equipped with a 24-bit, 96 kHz USB DAC (digital-to-analog converter). It can also be used at a more standard setting of 16-bit, 48 kHz. I’ve tried both, and I even downloaded some very high-quality 96 kHz classical music tracks. They sound fantastic, but my ears can’t tell the difference between 96 and 48 kHz. Your mileage may vary!

The FiiO E07K vs. the on-board audio

On-board PC audio has been more than satisfactory for me. I don’t turn my speakers up very high, and the laptops I’ve been using for most of the last decade have been very well shielded. The new desktop machine I built last year has been a bit less pleasant to my ears, though.

When the CPU and video card are working hard, I can sometimes hear a faint buzz from the speakers. I have a similar problem with the microphone on my Monoprice Gaming Headset. When I turn on the microphone on its little USB audio adapter, it generates static on the speakers.

FiiO E07K in a custom mount

The FiiO E07K generates no static that my ears can detect. I didn’t even realize just how much static was coming out of my headphones before. It was very faint, but now I know that it was there! Unfortunately, I don’t have a convenient way to use the FiiO unit with both my speakers and headphones.

I am absolutely amazed at how much better games sound through my headphones when using the FiiO DAC. The first thing I fired up was Team Fortress 2. I tweaked the volume of the FiiO box to match the level that I normally use when playing. The Reserve Shooter has A LOT more bass now, and I can hear low-volume chatter in the distance (“Stand by little wagon!”) much more clearly than before.

I don’t know if this is due to the higher-quality DAC or if the FiiO’s equalizer settings are doing a better job.

Static on the microphone!

My Monoprice Amplified Gaming Headset uses a 4-conductor 1/8-inch headphone jack to connect its speakers and microphone using a single port. This didn’t seem like it would be a problem. I ordered a cable to split it into separate microphone and headset connectors, and I ordered a long extension cable to plug the microphone directly into the computer.

I even designed and 3D printed a custom mount to attach all of this directly to my desk. All of this worked just as I planned except for one small detail: I’m still hearing static in the headphones when I plug the microphone in!

It is a lot less static than I had before. If I connect everything up before putting the headphones on, I won’t even know that it’s there. It is much quieter than my computer’s fans. The static is very obvious if I plug in the microphone cable while the headphones are already on my head.

This seemed very peculiar. I didn’t understand why the separate microphone and its cable should be affecting the headphones. It took me a while, but I finally realized that the headphones and microphone aren’t as separate as I thought. They share a ground wire because of the 4-conductor 1/8-inch connector!
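
For reference, here is what a typical 4-conductor (TRRS) plug carries. The exact assignment varies between the two competing standards (this is the common CTIA layout), but the ground contact is shared either way:

    Tip ....... left headphone
    Ring 1 .... right headphone
    Ring 2 .... common ground (shared by the headphones and the microphone)
    Sleeve .... microphone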

I don’t have a good way to work around this issue. The static is very quiet, and I only have to listen to it if I’m using the microphone. Thankfully, I don’t use the microphone very often.

The verdict

I’m very pleased with the FiiO E07K DAC. It sounds wonderful, makes my games sound a lot better, and it is small enough that I’ll have no trouble taking it with me when I travel. It’d be nice if FiiO would release a DAC anywhere in the $50 to $150 range that also includes an ADC. If the E07K had a microphone input, or better yet, a 4-conductor headset jack, I wouldn’t have anything to complain about.

Craft Coffee - July 2014


I’m doing things a little differently this month. I’ve only been using my new Rancilio Silvia for less than a week, and I’m still trying to learn how to use it properly and consistently. With my old Mr. Coffee espresso machine, I would rotate through the beans one latte at a time.

The pressurized portafilter on the Mr. Coffee machine was very forgiving. I’ve heard it said that machines with pressurized portafilters pull a consistently mediocre shot of espresso every time, and I am inclined to believe that. Miss Silvia is much less forgiving, but she’s a much more capable machine in the right hands. Those hands aren’t yet mine, though.

The first shot I pulled with my fresh beans from Craft Coffee was a complete failure. My grind was just too fine—it almost completely blocked the machine. Things went much better after that but not perfectly.

This month, I am going to use up each pouch of coffee before starting the next. I am going to write about (and post!) my thoughts as I finish each bag. I am going to wait until all the coffee is gone before I send this post to my most excellent editor for proofreading, so please excuse any grammatical errors in the meantime!

Caffe Vita, Seattle, WA (South Kivu, DR of the Congo)

A robust, vegetal bouquet complements full flavors of caramel and dark chocolate with a slightly smoky finish resembling cavendish tobacco and leather.

I have had a lot of trouble with this coffee, and I don’t know if it is my fault or the beans. There are a lot of words in the description that my taste buds won’t be happy about: “vegetal bouquet,” “dark chocolate,” and “tobacco.”

The coffee from Caffe Vita has a strange aftertaste. Some of the lattes I made seem a bit sour at the end, while others seem bitter. I’m drinking the last one right now, and I can’t decide which it is. Whatever this aftertaste is, it is definitely bringing tobacco to mind.

The third or fourth shot I pulled seemed technically perfect. I got right around 2 ounces of espresso in just shy of 30 seconds, but it was quite bitter and far from perfect. I clicked the grinder up one notch, but that double shot only took about 10 seconds. I made a latte out of that shot anyway, and it tasted surprisingly good. Not very flavorful, but there were no offensive sour or bitter tastes.

I split the difference for this final shot—I dialed the Baratza Preciso back to where it was and moved the microadjustment lever to the halfway mark. This is probably the best latte I’ve made with the beans from Caffe Vita, but it still has a strange aftertaste. I’m not ready to blame it on the beans, though!

Metropolis, Chicago, IL (Bruselas, Caturra, Colombia)

Leading off with aromas of nougat, honey, and tobacco, this silky brew offers flavors of roasted almonds and candied yams and finishes with butter notes of caramel and milk chocolate.

The coffee from Metropolis was similar to, but seemed smoother than, the coffee from Caffe Vita. The description on the Metropolis pouch also mentions tobacco, but only in relation to the aroma. I don’t know if my nostrils are just tricking my taste buds, but it sure seemed like it had the same sort of tobacco-like finish as the Caffe Vita coffee—it just wasn’t as strong.

It is entirely possible that the aftertaste in both coffees is related to my lack of skill and experience with my new espresso machine. Some of the lattes I made using the Metropolis beans were the best that I’ve managed to make with the Rancilio Silvia.

This will probably be the norm for quite a while. I won’t be surprised if every pouch of coffee tastes better than the last as my barista skills improve.

Forty Weight, Ithaca, NY (Sidama, Ethiopia)

Tangy aromas of raspberry and pineapple lead into flavors of a summer afternoon in the park–peanut butter and jelly sandwiches, strawberry limeade, and chocolate malted milk balls.

I saved this one for last, and I’ve been looking forward to trying it. I’ve really enjoyed all the Ethiopian coffees that I received, and I thought it would be best to get more practice with my new espresso machine before trying the beans from Forty Weight.

I goobered up more shots of this coffee than any of the others. It is a much lighter roast than the other two, and I ended up having to move the setting on the grinder up by more than two full clicks, and I’ve been making micro adjustments the entire time.

I also used my 3D printer to print a proper 58 mm tamper while I was halfway through these beans. It sure looks like I’ll be able to be much more consistent in my tamping now, but this also led to me wasting two more double shots of the Forty Weight coffee.

I did manage to make one or two pretty good lattes, though, and this is definitely my favorite coffee of the three. It doesn’t have that strange tobacco-like finish, and I can pick out most of the flavors I have come to expect from Ethiopian coffees.

No more sample pouches for me!

I changed my Craft Coffee subscription so that they will send me a single 12-oz bag of coffee each month. I’ve completely wasted at least one quarter of each sample-size pouch just trying to tune in a reasonable shot of espresso, and I haven’t gotten things tasting good until just about the last shot in each pouch.

This should mean that I’ll still have at least a half pound of coffee left after I’ve gotten things tuned in, but it also means that my little coffee review posts will be much shorter. We’ll see how it goes next month!

Use my referral code “pat1245” and you’ll get 15% off

If you use my referral code (pat1245) when checking out at Craft Coffee, you will get a 15% discount. Not only will you save money, but they tell me I’ll get a free month of coffee with every order. That sounds like a good deal for both of us!