My New Monitor: The Gigabyte G34WQC A Ultrawide


I avoided a monitor upgrade for as long as I could. I have been running a pair of QNIX QX2710 monitors at my desk for nearly a decade. These have served me well, and they were a tremendous value. I have had these IPS panels overclocked to 102 Hz almost the entire time, and I only paid $325 for each monitor in 2013. At the time I bought them, you couldn’t get a name-brand 27” 2560x1440 monitor for less than $900.

The QNIX monitors weren’t perfect. Their color reproduction didn’t look all that far off from my new Gigabyte monitor in sRGB mode, but there was more than a bit of backlight bleed around the edges. I knew it was there, but it wasn’t so bad that I couldn’t live with it.

Placeholder Photo

NOTE: This photo needs to be replaced. I am rearranging video lights, camera mounts, computers, and monitors all around my office. Everything is a mess right now. If I knew I was going to use this photo in the blog, I would have turned on more lights and used a better camera!

I am staring at my new monitor while writing this, and I can say for sure that the old monitors looked just fine. Upgrades are always fun, and swapping in this nice new Gigabyte monitor has been an enjoyable experience, but this is kind of a sideways move for me. Going from two 2560x1440 monitors to a single 3440x1440 is both an upgrade and a downgrade.

There is a very specific problem with the QNIX monitors that has been holding me back. They only have a single dual-link DVI port on the back. My Nvidia GTX 970 was probably one of the last few GPUs to sport a pair of DVI ports.

Active DisplayPort to dual-link DVI adapters can be a bit janky. Some are really inexpensive, but a pair of good adapters that would actually work might cost me $50 or $60. That’s almost 20% of the cost of a new monitor.

I am in need of a GPU upgrade, so upgrading my monitor first made a lot of sense.

Are we already too far in for a tl;dr?!

I am quite happy with my purchase. I believe the Gigabyte G34WQC is the best 34” ultrawide monitor you can buy for under $400.

The G34WQC has excellent dynamic range, low latency, a rather fast 144 Hz refresh rate for gaming, FreeSync Premium support, and quite good sRGB color accuracy.

This is definitely the best monitor for me. You can keep reading if you want to find out why!

Why didn’t I buy a 38” 3840x1600 monitor?

I have been drooling over these monitors ever since the first one was announced. These would be a proper upgrade for me!

I should probably say here that my old 27” 2560x1440 monitors, 34” ultrawide 3440x1440 monitors, and 38” ultrawide 3840x1600 monitors all have the exact same pixel density. They all have about 110 pixels per inch. The size of each of their pixels is precisely the same. The bigger monitors just have more of exactly the same thing.
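If you want to double-check that math, the back-of-the-napkin calculation is simple: divide the diagonal resolution in pixels by the diagonal size in inches. Here is a tiny Python sketch of that arithmetic, and all three panels land within a pixel or two of 110 PPI.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

for name, w, h, d in [
    ("27-inch 2560x1440", 2560, 1440, 27),
    ("34-inch 3440x1440", 3440, 1440, 34),
    ("38-inch 3840x1600", 3840, 1600, 38),
]:
    print(f"{name}: {ppi(w, h, d):.1f} PPI")
```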

There are only a few models of 38” ultrawide monitors on the market. They usually cost between $1,150 and $1,300, though you can sometimes find one for $999.99.

I would be willing to pay that much for a monitor. That is about what my old 20” Trinitron monitor cost me if you adjust for inflation, and that monitor was both used and physically damaged!

All the 38” ultrawide monitors are premium products. You can find 34” ultrawide monitors from brands like Sceptre for under $300.

The premium you have to pay for the extra 4” is high, and I am hopeful that someone like Sceptre will add a 38” ultrawide monitor to their lineup in two or three years.

It seemed like a wise move to save $600 or more today. Do you think I made the right decision? Maybe I will be able to add a 38” monitor to my desk in a few years with that $600 that I saved!

Why the Gigabyte G34WQC A?

I can’t believe how much effort I put into shopping for a monitor. I figured that all of the 1440p ultrawide monitors under $400 would be full of compromises, and I assumed those compromises would be pretty equivalent.

At first I had my sights set on the AOC CU34G2X. It usually sells for $340 on Amazon, but it was priced up at $400 on the day I was shopping. I immediately added it to my wishlist, and I said I would shop around until the price dropped back to $340.

Tom’s Hardware has a great review of this monitor. They tested the latency, and it scored pretty well. They said its HDR support was basically phoned in. Overall, though, I was pleased with the test results at the $400 price point.

Then I noticed the AOC CU34G3S, and it was also priced at $400. It seems to be an update to the CU34G2X. They both have similar quality 34” 3440x1440 VA panels. The cheaper CU34G2X supports up to 144 Hz and has a curve of 1500R, while the newer CU34G3S goes up to 165 Hz and has a curve of 1000R.

This is when I stopped, blinked a few times, and said, “Oh, poop!” How much of a curve do I want? That tight 1000R curve sounded like too much curve!

I would gladly pay $400 for the 165 Hz monitor, especially since it means I could order it immediately and have it on my desk in two days. I was more than a little worried about that more extreme curve, though.

I clicked on a lot more monitors, but most of them didn’t have reviews that included latency testing like Tom’s Hardware. There was an Acer Nitro for $360 that looked good on paper, but I couldn’t find a single technical review.

Then I stumbled upon the Gigabyte G34WQC for $380. Tom’s Hardware has a really good review, and all the graphs in the review included the AOC monitors that I was already interested in.

The Gigabyte monitor can only reach 144 Hz, but it still manages to match the latency of the 165 Hz AOC monitor. The Gigabyte has higher maximum brightness, and it has really good dynamic range. Not as much dynamic range as an OLED monitor, but 34” 1440p OLED monitors cost four or five times as much.

All of that data was fantastic, but I was most excited that the Gigabyte G34WQC only has a curve of 1500R.

Is 1000R really too much curve?

I have no first-hand experience with a 1000R monitor. I hit up Google, used my protractor, and did some math. I believe I correctly calculated that my two monitors were set at an angle equivalent to around 650R.

Two flat monitors with an angle in between is probably not directly comparable to a continuous curve, but coming up with such an extreme number makes me think that 1000R wouldn’t be as extreme as I feared.
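For the curious, here is a rough sketch of the sort of geometry I am describing, not my exact protractor work: treat each flat panel as a chord of a circle, and treat the angle you turn between the two panels as the angle each chord subtends at the center. The 55° turn in the example is just a hypothetical figure that happens to land right around 650 mm.

```python
import math

def equivalent_radius_mm(panel_width_mm, turn_angle_deg):
    """Treat each flat panel as a chord of a circle. The angle you turn
    between adjacent panels equals the angle each chord subtends at the
    center, so R = w / (2 * sin(theta / 2))."""
    theta = math.radians(turn_angle_deg)
    return panel_width_mm / (2 * math.sin(theta / 2))

# A 27-inch 16:9 panel is roughly 598 mm wide.
panel_width_mm = 27 * 25.4 * 16 / math.hypot(16, 9)

# Hypothetical example: a 55-degree turn between the two panels
# works out to roughly 650 mm, or "650R".
print(f"{equivalent_radius_mm(panel_width_mm, 55):.0f} mm")
```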

I feel like 1000R would be amazing for first-person shooters, but I was worried that it would be awkward when I have Emacs in the middle of the screen and some terminal windows off to either side.

I am staring at a 1500R monitor while writing this. It hasn’t even been on my desk for a full 24 hours, and it is already making me think I would have been perfectly happy if I bought a 1000R monitor.

I do feel that you need to have some amount of curve on a monitor this size. My friend Brian Moses has two desks in his office. Both have 34” ultrawide monitors. One has a curve, the other doesn’t. I bet you can guess which one he prefers sitting at.

Why did I settle for a VA monitor?

I was already using IPS monitors, so you might assume that a VA monitor would be a downgrade. My IPS monitors were made with LCD panels rejected by the folks at Dell or Apple. Those LCD panels came off the same assembly line as the very best LCD panels of the time, and they were intended to be used in the most expensive monitors. There was just something they didn’t like about these batches, so they ended up in cheap monitors.

That leads to my other point: this VA monitor has 10 years of technological and manufacturing improvements over my old IPS monitors.

Of course I did check the prices on 34” IPS monitors. There was one oddball down at $450, but I couldn’t find any reviews on that one. The majority of 34” IPS monitors were priced at $750 and above, so they cost twice as much as any of the VA monitors.

If I were going to spend more than $750, I would most definitely have waited for a sale on one of the premium 38” monitors. They are all very nice IPS monitors, and sometimes you can find one on sale for $1,000.

Can you believe I am only using one monitor?

I have had at least two monitors on my desk for a long, long time. I used to have two Riva TNT2 graphics cards installed in my dual-Celeron ABIT BP6 machine connected to a pair of 19” CRT SVGA monitors from Sam’s Club. I believe this would have been right around Y2K. Do you remember CRT monitors and Y2K?!

My old 27” monitors are just about as tall as they need to be. I tried mounting a third monitor above the other two once, and that was way too far to be leaning my neck back. It was uncomfortable even just looking at a small terminal window at the bottom of the screen. I know the 38” ultrawide monitors would be 160 pixels taller, but that’s really only 80 more on top and 80 more on bottom. That would still be reasonable.

The most important thing I learned from using a pair of 27” monitors is that I can really only see about one third of the second monitor without turning my head. I know that I will continue to miss some of that extra screen, but a 34” ultrawide is roughly one third wider than one of my old 27” monitors. That is pretty close to the perfect width.

I was a bit worried that a 38” ultrawide might be too wide. Especially when playing full-screen games. I am much less concerned about this after having the 34” ultrawide on my desk, and I should have figured that out with math. A 38” monitor is only 400 pixels wider than a 34” monitor. That is only 200 more pixels to the right and 200 more pixels to the left!

Don’t let me talk you out of spending three times as much on a 38” ultrawide! I would certainly be excited to have one on my desk.

Let’s get back to the Gigabyte G34WQC A!

I was trying to find a compromise that is good for gaming, good for productivity, and easy on my wallet. I think the Gigabyte was a good choice, and it ticked almost all the right boxes for me.

You already know I was shopping for something reasonably priced. All the monitors I was looking at were $400 or less.

Productivity would steer most people towards something with a much higher DPI. 32” widescreen 3840x2160 monitors are quite common. My wife has a very nice 32” Dell 4K IPS monitor on her desk. It looks great, and it is around 140 DPI.

I could write 2,000 words about why I would prefer to stick to the same DPI. The short answer is that Wayland isn’t ready for me, and X11 doesn’t support fractional scaling. Everything is easier for me if I stay at 110 DPI, and I don’t think there are any high-DPI ultrawide monitors.

The 34” ultrawide is working out well so far. I have my screen divided up into three equal columns. Firefox is on my left with the awesome PPI Calculator open. My Emacs window is in the middle with the font enlarged slightly, giving me a little over 90 columns. To my right are a pair of terminal windows that are about 125 columns wide.

DaVinci Resolve Ultrawide

It should definitely be noted that DaVinci Resolve is just a little more comfortable with an ultrawide display. You can comfortably fit two widescreen viewers, the inspector tab, and the media pool on the screen at the same time. I used to have to scroll my media pool from side to side to see timecodes and clip lengths. I won’t have to do that anymore!

I have been firing up older first-person shooters that I am confident will keep up with the Gigabyte’s 144 Hz refresh rate. I wandered around for a bit in Borderlands 2, I played through a level of Severed Steel, and I have also been just generally farting around in Just Cause 3.

I ran the UFO ghosting test, and the G34WQC definitely has some ghosting. If I were smart, I would have run the test on my old monitors before putting them in the closet!

I can most definitely tell that the Gigabyte monitor at 144 Hz feels smoother than my old QNIX monitors at 102 Hz. Part of that is certainly due to the extra 42 Hz, but I suspect both monitors have roughly the same number of frames of ghosting. That probably means that the Gigabyte VA panel’s ghost frames fade away more quickly.

I have no science to back that up. This is how I feel playing the same games with each monitor.

I do have some complaints!

Can I start the complaints with a bit of praise? The G34WQC stand is pretty nice. It feels solid, it can tilt, and the height is easily adjustable. I removed the stand as soon as I made sure my long DisplayPort cable could manage 144 Hz at native resolution, because I always use monitor arms. I was excited to see that the Gigabyte stand is attached using the VESA mounting screws. That means I can attach it to any other monitor. I may wind up using it on one of the old QNIX monitors, since I have no idea where the stock legs went.
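Making sure the cable could keep up was worth a quick sanity check before tearing everything apart. Here is the sort of napkin math I rely on: estimate the uncompressed video bandwidth for the mode, pad it with an assumed ~10% blanking overhead, and compare that against the usual DisplayPort payload rates. The overhead figure is a rough assumption, not a number from the spec, so treat this as a ballpark.

```python
def required_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_overhead=1.10):
    """Rough uncompressed video bandwidth estimate in Gbit/s.
    The ~10% blanking overhead is an assumption, not a measured timing."""
    pixels_per_second = width * height * refresh_hz * blanking_overhead
    return pixels_per_second * bits_per_pixel / 1e9

need = required_gbps(3440, 1440, 144)
print(f"3440x1440 @ 144 Hz, 8-bit: about {need:.1f} Gbit/s")

# Usable payload after line coding: DisplayPort HBR2 carries about
# 17.28 Gbit/s and HBR3 carries about 25.92 Gbit/s.
for name, payload_gbps in [("HBR2", 17.28), ("HBR3", 25.92)]:
    print(f"{name}: {'enough headroom' if need <= payload_gbps else 'not enough'}")
```

By this rough estimate, 8-bit color at 144 Hz needs more than an HBR2 link can carry, so I was glad my long cable turned out to be up to the job.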

Zip Tied Power Cable Cheat

NOTE: Is snipping away ½” of strain relief and zip-tying a 90° bend in the cable cheating? Is it still cheating if it works?

My first complaint is the location of the ports. They all point downwards, and they are all rather close to the bottom. I had to search through my box of power cables to find the one with the smallest end, and I had to get creative with a zip tie to attach the power cable in such a way that it wasn’t hanging below the frame. Who wants to see cables dangling below their monitor?!

I need a long DisplayPort cable to reach my computer, so I am using my own. It has a fairly compact end, and I can still just barely see the cable from where I am sitting. I do have to duck my head down to see it, but ideally I wouldn’t be able to see it from my chair at all. The included DisplayPort cable has even longer ends than the one I am using.

The monitor is too vibrant with the default settings

Everything is rather bright with the monitor set to the standard profile. The reds are crazy vibrant, and browns look a bit orange. Everything is eye-catching, but not in a good way.

I just set it to the sRGB profile, and I think it looks great. I did bump up the brightness a bit to fit better with the lighting in the room. I am assuming Gigabyte’s sRGB profile is calibrated fairly well. I am excited to learn that the color profile I have been using for years on my QNIX monitors wasn’t all that far off!

Conclusion

I believe I made a good decision, but I also don’t feel like there was a wrong choice to be made here. The Sceptre is probably a decent value at $300. Either of the AOC monitors seems fine both on the spec sheet and in the technical reviews on Tom’s Hardware. I don’t expect I would have regretted buying any of them, but I do think the Gigabyte was a better value for me.

I do have some regret that I didn’t splurge on a 38” ultrawide. For productivity work, like writing this blog, the 34” monitor just feels like one bigger monitor. Being 400 pixels wider would almost definitely make the 38” ultrawide feel much like two monitors without a bezel. Then I remember that I can nearly buy an AMD 7900 XT with the money I saved by staying one size smaller.

What do you think? Did I make the right choice with the Gigabyte G34WQC A? Why does every monitor have a terrible name? Are you already using one of those 38” ultrawide monitors? Do you think I should have spent three times as much for those extra four inches? Let me know in the comments, or stop by the Butter, What?! Discord server to chat with me about it!
