You might consider it a stretch to call my gaming PC a workstation. One lazy way to define a workstation could be enterprise server-grade hardware in an office-friendly case, but I’m willing to be more liberal with my labeling. Workstation is an easy word to use in the title that conveys relevant enough information, so I am sticking with it, because this is the machine I sit at when I want to get work done.
Bazzite is the new and popular gaming Linux distro. It is built on top of Universal Blue, which is built on top of Fedora Silverblue, and these are all immutable distros. I hope I got that correct!
I am excited about the idea of immutable distros. I’ve been running Bazzite’s gaming mode in my living room for a few months, and I am impressed with it. They have desktop spins of the installer, so they have me tempted to give it a try.
I usually shy away from the more niche Linux distros. I don’t want to have to reinstall and start from scratch if someone gets bored and the distro goes away.
I could wait until the end to reveal this, but I am already dipping my toe a little deeper into the Bazzite waters. I just installed the KDE Plasma spin of Bazzite on my Asus 2-in-1 laptop. Things are looking promising so far!
I started out using Slackware in the nineties. I tried SuSE for a while, because their network installer was handy when we had our early cable modems.
I settled on Debian before the end of the decade, and that is all I used until 2006.
That’s when I switched to Ubuntu. The appeal for most Debian users in those days was Ubuntu’s release cycle. We got what amounted to a fresh, reasonably stable, and up-to-date Debian build every six months. That was SO MUCH BETTER than dealing with Debian’s testing repositories breaking your machine twice a year.
I had a continuously updating Ubuntu install on this computer from 2009 until 2022. It was originally installed on my old laptop, has been dd’d to new SSD and NVMe drives a few times, and has been paired with one laptop and two different motherboards.
That is when I almost switched back to Debian. Ubuntu has been drifting farther and farther from Debian as the years go by. There are lots of inconsequential things I am grumpy about, but the straw that broke the camel’s back for me was forcing snaps on us. Ubuntu installs the Firefox snap via apt, and in 2022, the snap would refuse to update itself unless I closed Firefox.
It felt like I traveled backwards in time, and it didn’t help that the Firefox snap took so long to open and refused to auto update unless I remembered to close my browser. Who closes their browser?! This felt like a good time to start thinking about where I might move in the future.
I wound up aborting my Debian install. I’m not going to get all of the details right from memory, but I am sure this will be close enough to accurate. Getting a combination of recent enough Mesa and RADV libraries installed for ray tracing to work well, and getting a build of OBS to work with hardware video encoding, while simultaneously having a working ROCm setup compatible with DaVinci Resolve Studio was going to be a massive pain in the butt.
Ubuntu had two out of the three nailed, and working around the third wasn’t a big deal.
Bazzite to the rescue?!
Bazzite prioritizes gaming. Bazzite is built on top of Fedora Silverblue with nearly bleeding edge AMDGPU drivers and Mesa libraries, so my Radeon GPU will always be working great, and I will be running one of the first distros to ship support for whatever the next generation of Radeon GPUs happens to be. That means I won’t have to wait as long after a new hardware release before upgrading!
This is awesome. Gaming is the most demanding thing I use my computer for, and things always improve when you can use the latest and greatest kernels, drivers, and libraries. Shoehorning this stuff into Ubuntu LTS releases can be a pain, and you’re always lagging behind.
Bazzite ships with their ujust system. It isn’t a package manager. It is more like a consolidated set of scripts and magic to help you get certain things going, much like an officially supported set of Proxmox helper scripts.
On my laptop, I ran ujust enable-tailscale to get my fresh Bazzite install connected to my Tailnet, and I ran ujust install-resolve-studio to install DaVinci Resolve.
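If you have never seen ujust before, here is roughly what that looks like in a terminal. Running ujust with no arguments should print the list of recipes available on your image, though the exact list will vary between Bazzite releases:

```
# list the ujust recipes available on this Bazzite image
ujust

# the two recipes I ran on my fresh install
ujust enable-tailscale
ujust install-resolve-studio
```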
It was slightly more complicated than that. I had to download the zip file from Blackmagic’s site myself, but ujust handled the rest for me. It set up a custom distrobox environment with everything Resolve needs to run, and I didn’t even have to click through Resolve’s GUI installation tool. It was just ready to go, and everything seems to work. Though I did have to tweak Resolve’s memory settings to stop it from crashing on my low-end laptop!
I don’t know if it is fair to accuse my laptop of being low end. It was squarely in the mid range when I bought it, but time has gone by, and it is starting to show its age.
The best part is that Resolve is in its own container. It is unlikely that a future update to the Bazzite installation will break things.
It took me a few clicks to install OBS Studio using Bazzite’s new Bazaar frontend for Flatpak. Flatpak correctly installed the required VA-API plugin. I just had to turn on the advanced settings in OBS Studio, and I had my laptop hardware encoding a 1080p screen capture in h.265.
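If you want to sanity-check the host side of the hardware-encoding stack before fiddling with OBS, vainfo from libva-utils is a quick way to do it. This is just a generic VA-API check, not anything Bazzite-specific, and you may need to install vainfo or run it from a container first:

```
# list the VA-API driver and the encode entrypoints the GPU advertises
# (look for VAEntrypointEncSlice next to the H.264 and HEVC profiles)
vainfo | grep -iE 'driver|enc'
```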
Those were the trio of things that were going to be an effort to get working on Debian three years ago. They’re all working, and they’re all in better shape than on my current Ubuntu install on my workstation. I think that is an awesome start!
Living with an immutable distro, and embracing Distrobox
I already mentioned that Bazzite uses Distrobox to containerize DaVinci Resolve, but I didn’t explain what Distrobox is. Let’s see if I can do a good enough job in a paragraph.
Distrobox sits on top of either Docker or Podman, and it handles installing, configuring, and running full Linux distros in these containers. They aren’t containerized for security or to provide any significant separation. The opposite is true! All your Distroboxes are plumbed to have access to most of your hardware and to share your home directory.
This means you can set up separate Distroboxes with Arch, Debian, and Ubuntu. You can set up terminal window shortcuts to open shells in these separate boxes. You can create an AI-generated video in your Debian box, then edit that with DaVinci Resolve in the Ubuntu box, and paste that video into Discord using your Arch box. Each Distrobox has access to your Wayland session, so you can run GUI programs on any Distro.
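Here is a minimal sketch of what that workflow looks like. The box names are just placeholders I made up:

```
# create a few boxes from different distro images
distrobox create --name arch-box   --image archlinux:latest
distrobox create --name debian-box --image debian:12
distrobox create --name ubuntu-box --image ubuntu:24.04

# drop into a shell inside one of them
distrobox enter ubuntu-box

# from inside a box, export a GUI app so it shows up in the host's application menu
distrobox-export --app firefox
```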
I had Distrobox up and running on my aging Ubuntu install in a few minutes. Not long after, I had an Ubuntu 25.04 box going with Steam installed, and I was playing games that were already downloaded to my Ubuntu host. It bind-mounted all my usual file systems exactly where they needed to be to play my existing Steam games.
My plan is to use Bazzite for the stuff that is a pain to maintain or relies heavily on the host’s hardware. Steam, OBS, Resolve, and Firefox will live up there on the host. I expect to do nearly everything else inside one or more Distrobox boxes.
It is possible to export a Distrobox image on one machine, then import it on another. My plan is to get myself an environment that I am happy with on my old Ubuntu workstation, and move all my important work into that box. Once I am happy, I will copy that box over to my laptop.
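As far as I know there isn’t a single Distrobox command that exports a whole box, but the boxes are just Podman containers underneath, so my plan looks something like this hedged sketch. The container and image names are made up:

```
# on the workstation: snapshot the configured box as an image and save it to a tarball
podman container commit work-box work-box-image
podman save -o work-box-image.tar work-box-image

# on the laptop: load the tarball and create a fresh distrobox from that image
podman load -i work-box-image.tar
distrobox create --name work-box --image localhost/work-box-image
```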
If I do things well, I should almost instantly have my working environment fully operational once I get around to installing Bazzite on my workstation. That is awesome!
The core idea here isn’t new. I used to do something similar with work and personal virtual machines two decades ago, but it wasn’t nearly as easy to work with those separate virtual machines at the same time.
Wiping out my workstation and starting from scratch fills me with dread. I always worry that there will be something that I rely on that is missing, or some weird binary in /usr/local/bin that just doesn’t exist anymore. Maybe I will lose a game’s save files that are stored in a weird location and aren’t being synced by Steam. What if an important program refuses to work correctly, or I can’t figure out how to configure something correctly?
Things never ACTUALLY go terribly wrong, but I always miss something important, and migrating to an entirely new Linux distro isn’t something I would do on a whim. I am definitely going to kick the tires on my laptop for a few weeks, and put some work into getting a Distrobox environment well configured on my current workstation before I wipe my NVMe.
What do you think? Are you running Bazzite on a productivity machine? Am I silly for thinking this will be a good idea, or am I a genius and optimizing for exactly the right thing? How long do you think it will take me to get a productive Distrobox image set up so I can start my migration? You should join our friendly Discord community to let me know if I am making a mistake, or to chat with me to see how things are working out so far!
I have had a renewed interest in couch gaming on the TV. I set up an Intel N100 mini PC last year as a Steam box, but I didn’t use it very long. Then I picked up a Ryzen 6800H mini PC to use in the living room, and it is running Bazzite. That machine is roughly comparable to a Steam Deck, and it has been fantastic, so I wound up adding a GameSir Cyclone 2 to my collection, and I absolutely love its mechanical buttons and D-pad.
Not all games are fun for me in our living room. Our aging 70” Vizio doesn’t have fantastic latency. You would think 70” makes for a massive television, but the couch is 14’ away. I had trouble reading the descriptions of the upgrades in Spider-Man Remastered, so I wound up playing at my desk instead!
NOTE: I used Flux Kontext to remove some junk under my desk in the photos in this blog!
I have an ancient 40” 1080p TV in my office that was manufactured in 2009. It actually has decent latency—better than our giant Vizio! It is unfortunate that it suffers from the same problem that I have in the living room. When I sit on my office recliner, I am just too far away to game on a 40” screen.
You can find almost every iota of information you could possibly want about this TV from the excellent review at rtings.com. The trouble is that review is in reference to the TCL Q6/Q651G, and that precise model isn’t in stock anymore. I had to buy the Q6/Q651F. What is the difference?
I didn’t find a good source to confirm this before I ordered, but it sure looks like the only significant difference is that the Q6/Q651G runs Google’s Android TV, while my Q6/Q651F runs Amazon’s FireOS. I would have preferred the Google model, since all our other screens have Chromecast with Android TV devices, but I wasn’t going to pay an extra $100 or more just for the different operating system.
The important features that I wanted were 120 Hz and variable refresh rate (VRR) at 1080p and 1440p. This is available on either model at 55” or above. There are smaller models, but they are always limited to 60 Hz. My model ending in F has both options available in the menus, though I am having trouble with VRR. We’ll talk more about that soon.
The review at rtings.com is chock-full of great facts, but there aren’t many opinions. If you read on, you will see my opinions scattered all over the place. The opinion that I would have most liked to have seen before buying this TV is how good the slightly fake 120-Hz modes look and feel while gaming. To keep it tl;dr: they feel 100% like playing on a proper 120-Hz monitor, but the loss of vertical resolution makes the picture look muddy if you’re sitting too close to the screen.
What was on my wishlist?
This is easy. I wanted the lowest latency, best response time, and highest refresh rate that I could get for cheap. Cheap is relative. I guess I wanted to optimize for price without giving up on latency.
The important thing here is that I don’t need the dynamic range or crazy-fast response time of an OLED display. I don’t need dimming zones. I just want a basic yet fast and clean picture.
The trouble is that I don’t use the TV in my office all that often, so I didn’t want to spend too much money. I thought that $300 for the 55” TCL Q6/Q651F was quite reasonable.
NOTE: I have been using my new office TV every single day since it arrived. That should be just over two weeks by the time this is published.
What got me excited about this particular TV?
One of the things I noticed playing games on my Bazzite box in the living room is just how much of a bummer gaming at 60 Hz can be. Sure, there are a ton of games where a low refresh rate is fine, and there aren’t all that many games that I can run on my Ryzen 6800H Bazzite box that will push far past 60 frames per second anyway.
I get to plug the TV in my office directly into my gaming PC’s Radeon 6700 XT. It isn’t exactly a bleeding-edge GPU, but it can do better than 60 FPS in all the games I want to play with a controller.
The TCL Q6 can run at 120 Hz at 1080p or 1440p as long as you enable Dual Line Gate (DLG) in the settings. There are a lot of comments on the Internet talking about how crummy the interpolated 1440p looks on this native 4K display. They’re not entirely wrong, but they’re most definitely a little off.
Don’t sit this close to a 55” TV!
It looks like butt if you’re sitting right in front of the TV. I played a map of Roboquest at 1440p120 with my mouse and keyboard while sitting way too close to this massive TV. It was buttery smooth and felt as responsive as my Gigabyte G34WQC monitor, but from less than two feet away I could see all those chonky, unsmoothed pixels.
That problem goes away when I am sitting 10’ away in my office recliner. I can’t make out those beefy pixels even when wearing the right glasses. It shouldn’t be a problem from your couch.
I started my testing with my Ryzen 6800H Bazzite box. I learned that you have to force the HDMI port to HDMI 2.0 in the TV settings in order for my PC to pick up the 120-Hz modes, but once I did that, they worked perfectly. However, the Steam Deck sidebar interface kept telling me that VRR wasn’t available on my monitor.
I plugged my gaming PC into another HDMI port, and I had the same problem. In both cases I turned VRR on for the correct HDMI port in the TV settings.
My Gigabyte ultrawide is the only connected display that reports itself as vrr_capable. The review on rtings.com for the Q6/Q651G says that Freesync is supported. Freesync will work over older HDMI ports, but the HDMI specification didn’t add its own VRR implementation until HDMI 2.1, and my GPU doesn’t support HDMI 2.1 with the drivers available on Linux. Perhaps the Q651G supports Freesync and G-Sync, while the Q651F only supports HDMI VRR?
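If you want to check this on your own machine, the kernel exposes each connector’s VRR flag in sysfs. This is how I confirmed that only the ultrawide advertises it; your card number and connector names will obviously differ:

```
# print the VRR capability flag for every display connector the kernel knows about
for f in /sys/class/drm/card*-*/vrr_capable; do
  printf '%s: %s\n' "$f" "$(cat "$f")"
done
```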
I am not going to give up on this, but I am not terribly upset. A high refresh rate with low latency was a hard requirement. I was excited about getting a TV with variable refresh rate, but it is significantly less necessary when plugged in to my desktop PC.
How bad is it to not have working VRR?!
I keep squinting my eyes while looking very closely for tearing, but it is hard to notice. I imagine this is half because a 120-Hz display will only have an individual torn frame visible for around 8 milliseconds, and the DLG that allows for 120 Hz is probably smearing that tear across two frames. The higher your refresh rate, the lower the odds are of two tears appearing in close to the same position.
The only time I can see tears is when they line up in precisely the same place. The Marvel animation when starting Spider-Man 2 flashes things fast and runs at exactly 250 frames per second. I can almost always see a tear right in the center of that fast page-flipping animation.
Yes, you can turn on V-sync to eliminate this, but that adds nearly one entire frame of additional latency and makes things even less smooth if your framerate dips. Since I’ve only noticed tearing a handful of times, I am definitely going to opt to have lower latency here!
The TCL Q6 isn’t a bad monitor unless you need to enable DLG for 120-Hz modes!
The last time I rearranged the desks in my office, I decided to put my L-shaped desk along the same wall as the TV. That let me line things up so the TV could be used occasionally as a second monitor. I would never try to do work on the old 40” 1080p display, and I most definitely won’t be doing any proper work with the 55” 80-DPI TCL TV.
I can fit most of what I need on my 34” ultrawide monitor, but sometimes it is handy to have a second display to drop a small Discord window on while you are gaming in full screen, or to have some extra screen real estate to monitor camera feeds while recording a video podcast interview.
The TCL Q6 is acceptable for these tasks, and I would say that text in my terminal and Emacs windows looks more than clean enough as well. I wouldn’t want to edit blogs sitting two feet from an 80-DPI display, but you could.
Text at 4K60 with DLG disabled (left) and DLG enabled (right), photographed using my Sony ZV-1 on a stationary tripod
The trouble is that small text looks horrible when you enable DLG. You need DLG to game at 120 Hz, and for some reason the TCL Q6 doesn’t disable DLG even when you’re driving it at 2160p60.
The DLG setting is toggled separately for each HDMI port, so if I were running my PC on one port and a PlayStation 5 on another, I would be in good shape. I need to use one port, and I want a clean 2160p60 image while sitting at my desk and a fast 1440p120 when I am gaming from the other side of the room. The only way I can currently do that is slowly navigating through menus on the remote to toggle DLG.
This isn’t as big of a deal as that last paragraph might make it seem. I can just leave DLG disabled most of the time. Games that I would play on the TV where I actually need more than 60 Hz are the exception rather than the rule, though I do also need to enable DLG for latency-sensitive games like Dead Cells, because the rtings tests measured input latency at 10.4 ms at 60 Hz and 6.2 ms at 120 Hz. Dead Cells is the sort of game where that extra quarter of a frame could be the difference between surviving and dying!
Listen. I am fully aware that I bought the cheapest gaming-focused TV I could find. I don’t expect it to be perfect for every situation.
Is DLG actually 120 Hz?
It depends on how you look at it, but from my perspective the answer is “sort of!”
Your 60-Hz panel isn’t magically refreshing twice as fast when Dual Line Gate (DLG) is enabled. Instead, half the lines are physically refreshed on each pass. No individual line is updating more than 60 times per second, but half the lines are being updated every 1/120 of a second.
This has more than a little in common with interlacing on old CRT televisions and monitors. In a way, you are sacrificing half of your vertical resolution for increased update speeds and reduced input lag.
Does that mean you are effectively running at 2560x720? It should be a bit better than that, but as I talked about in the previous section, enabling DLG messes up the vertical resolution even on static images. That makes me think there is something subpar about TCL’s implementation of DLG, but I am not well-versed on what is technically possible to achieve with this.
I think there are two important questions: Does it feel like 120 Hz, and can you ACTUALLY see the reduction in resolution?
I have been using the left corner of my TCL Q6 as extra monitor space when I watch YouTube videos
I sat two feet away from my TV and played Roboquest with my mouse and keyboard. Latency feels comparable to my 144-Hz Gigabyte G34WQC monitor, and flipping back to 60 Hz feels as crummy as I expected. The lower effective vertical resolution is painfully obvious when sitting this close to the TV while playing Roboquest.
What about when I am sitting across the room with a controller in my hand? Maybe I can see the lower effective vertical resolution if I pause the game, squint my eyes, and stare really hard. Otherwise, I don’t think I could tell you if I am playing Spider-Man Remastered at 2160p, full 1440p, or 1440p with DLG enabled from my comfy chair 10’ away.
For what it is worth, I got nearly the same score in Roboquest’s shooting range on my monitor and on the TCL Q6. I had never tried the shooting range before, and I didn’t practice at all. I just made one run on the TV, then another on the monitor. Not a super scientific test, but it was enough to convince me that the TCL Q6 was easily worth $300.
I should note here that DLG looks completely clean at 1080p120, but if you’re far enough away that 1080p is acceptable, then you’re already far enough away that you won’t see the muddiness that DLG causes at 1440p.
I don’t have enough understanding to describe this well. DLG appears to be adjustable separately on each input, but I don’t think it actually is. Sometimes it says DLG is on, but it obviously isn’t.
If I have DLG enabled on HDMI 2 for my PC, and I see that it is actually working, then switch to the Fire TV’s Netflix app, Netflix will usually play back the video with what appears to be a double-thick letterboxing. The aspect ratio seems correct, but the top and bottom of the movie will be cut off.
Popping over to HDMI 2 and disabling DLG fixes this problem.
I don’t know what is really going on here, but I thought I should post a warning. I’m not terribly grumpy about this glitch. I bought a gaming TV, and the TCL Q6 is an excellent gaming TV. I am learning that it does other stuff well, and I am enjoying these other functions. Other use cases are just a bonus for me, so I can’t complain too much if there’s a small settings issue.
Bumping into an FSR upscaling reality
In theory, it is better to run your display at its maximum resolution and let FSR, DLSS, or XeSS upscale your game to match. If your computer can run your game at 1920x1080, it should look better at 3840x2160 than it does at 2560x1440.
My upper limit is 2560x1440, because I want that 120-Hz refresh rate. I was having trouble keeping Marvel’s Spider-Man 2 running at 90 FPS or above. I had to dial down to FSR’s Ultra Performance mode, which means I was rendering the game at 854x480, and that was upscaled to 2560x1440. That is more than a little yucky.
I dropped my TCL Q6 TV’s resolution to 1920x1080, and I could get away with FSR’s Performance mode. That means I was rendering at 960x540, and that was upscaled to 1920x1080. My frame rate wasn’t identical, but it was close enough, and the Performance mode has 30% more pixels to work with before upscaling.
I would be running my display at 2560x1440 if there were a mode in between where I could get a rendering resolution of 960x540, but that just isn’t an option.
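If you want to see where the render resolutions land, the per-axis scale factors that I believe FSR 2 and 3 use are 1.5x for Quality, 1.7x for Balanced, 2.0x for Performance, and 3.0x for Ultra Performance. Give or take a pixel of rounding, that is where my 854x480 and 960x540 numbers come from:

```
# rough FSR render-resolution math for a 2560x1440 output
for mode in "Quality 1.5" "Balanced 1.7" "Performance 2.0" "Ultra-Performance 3.0"; do
  set -- $mode
  awk -v name="$1" -v scale="$2" -v w=2560 -v h=1440 \
    'BEGIN { printf "%-18s %dx%d\n", name, w / scale, h / scale }'
done
```

Swap in 1920x1080 for the output resolution and Performance mode lands right at 960x540.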
I stayed down at 1080p for Spider-Man 2, but I wound up using FSR 3’s automatic scaling with a target of 105 frames per second. I don’t actually need Ultra Performance upscaling the majority of the time. The frame rate mostly only drops when I am swinging across the city as fast as I can go, and things move fast enough there that it barely matters what the render resolution might be.
You’re probably wondering how the game looks scaled up from 854x480. It looks like absolute butt if you are standing right next to the TV. Slightly blotchy weirdness all over the place, and you can see a fringe of odd smearing around the edges of Spider-Man as he moves around.
There is still some of the smearing around the edges of moving objects even with less aggressive upscaling settings, but you can’t see any of it from ten feet away.
How is the TCL Q6 for watching movies and TV shows?
It is for sure adequate. The built-in speakers do the job, but they aren’t amazing. The video output isn’t anything to write home about, but I can’t really complain too much.
I don’t expect the black levels of an OLED or a display with 1,000 individual dimming zones. The blacks tend to be a little gray. The reviews complain that this TV isn’t terribly bright, but I don’t get any sunshine in my office. Full bright is more than bright enough in here, and an entirely white screen lights up the entire room a surprising amount.
If I read and remember the rtings.com review well enough, the backlight is flicker free down to around 30% brightness. I have so far backed the backlight down to 70%, and I expect I will push things a little dimmer before I am done tuning. The TCL Q6 is a little too bright for my dimly lit office.
This is pretty much the view I have when playing games from my home office’s recliner
I picked the Gaming preset for my PC’s HDMI port because that seems to enable all the low-latency stuff. I picked the Movie preset for both the built-in Fire TV and the HDMI port of my Chromecast device. That Movie preset seems pretty comparable to my gaming monitor. I have my monitor set up with fairly natural, not overly saturated color.
NOTE: The Movie preset enabled a bunch of questionable stuff, including HDR Enhancer and Local Contrast Enhancement, but worst of all, it set Motion Processing to low. That last one causes the so-called soap opera effect that most people hate. I didn’t notice that I had accidentally enabled it while watching 60-Hz content, but the first 24-Hz movie I tried to watch looked really broken! I am assuming that I tested Jellyfin’s framerate matching BEFORE switching to Movie mode.
The TCL Q6 does a fantastic job of seamlessly matching the frame rate of content that you are watching. All the movies and shows I have played so far in Netflix and Jellyfin have adjusted the display to 24 Hz, and I have watched YouTube videos play back at 24, 25, 30, and 60 Hz. The review on rtings.com says that the TCL Q6 does a good job of playing back 24 FPS content without judder, and it sure seems as though they are correct.
I think I am going to stick with saying that the TCL Q6 is adequate to a couple of notches above adequate. I most definitely will not be complaining when I get stuck in my office and have to watch a movie.
In fact, if you told me that I had to replace the TV in my living room today, and I wasn’t allowed to upgrade to a $2,500 OLED TV, I would probably just spend $520 on the 75” version of the TCL Q6 and call it a day. It is an upgrade over the 70” Vizio that we’ve had for nearly a decade now, and the TCL Q6 would do an admirable job.
Conclusion
A $300 55” TV like the TCL Q6 is going to involve compromises, and my opinion is that those deficiencies are all in precisely the right places for a gaming-first television.
My instinct here is that I need a few hundred words to summarize what I have written, but that feels like way too much information. The TCL Q6 is inexpensive, is as good as or better for gaming than most TVs at double the price, and it is more than adequate for occasional TV- and movie-watching. What more needs to be said?
I’m not sure how long my quasi-review of the TCL Q6 will continue to be relevant. Televisions that can handle native 4K at 120 Hz will be at this price point in a year or two, and fewer of your computers will be limited to HDMI 2.0, so you won’t need to limit your output to 1440p120. For now, though, the TCL Q6 is a fantastic TV for this specific use case.
If this is in the future, and you want to know how my TCL Q6 has been treating me, you can join our Discord community and ask! We are a community of geeky homelab and NAS enthusiasts who also talk about 3D printing, gaming, and home automation. We are a friendly and pretty well-rounded bunch, so you should stop by and say hello!
That title should be longer. I don’t want to exclude 4-bay and 8-bay USB SATA enclosures, but I didn’t want to waste so many extra words in the front of the title!
If you had asked me this question ten or twenty years ago, I would have laughed.
When I consolidated down from a desktop and a small laptop to a single beefy laptop sometime around 2008, I stuffed all my old 40-gigabyte IDE hard drives into individual USB 2.0 enclosures so I could continue to back up my important data to my old RAID 5 array. It did the job, but even with light use, I would get timeouts and weird disk restarts fairly often. I wish I had a photo of this setup.
The Cenmate 6-bay enclosure is about half as wide as the Makerunit 6-bay 3D-printed NAS case. My router-style N100 mini PC Proxmox server is my only mini PC that is WIDER than the Cenmate enclosure!
A lot of time and many USB upgrades, updates, and improvements have happened since those days. I have had a Raspberry Pi with a 14-terabyte USB drive at Brian Moses’s house since 2021, and I have also had a similar 14-terabyte USB hard drive set up as the storage on my NAS virtual machine since January of 2023. Both have been running without a single hiccup for years.
External USB hard drive enclosures are inexpensive, reasonably dense, and don’t look half bad. They also allow for a lot of flexibility, especially if you want to mix and match your homelab’s collection of mini PCs.
A good way to compare storage servers is using price per 3.5” drive bay. I’ve usually said that anything under $200 per bay isn’t bad, and anything down at $100 per bay is quite frugal.
USB SATA enclosures seem to range from $25 to $40 per 3.5” drive bay. The 6-bay model that I ordered cost $182, which works out to $30 per bay. Of course, that isn’t directly comparable to a full NAS box from UGREEN or AOOSTAR on its own.
The Intel N100 mini PC that I plugged my enclosure into cost me $140. Adding that up works out to a delightfully frugal $53 per drive bay!
My most expensive mini PC is the Ryzen 6800H that I currently use for living-room gaming. Let’s assume we are buying a RAM upgrade to push that Acemagician M1 as far as it will go. We’d be up at about $450 with 64 gigabytes of RAM. That would put our 6-bay Ryzen 6800H NAS at $105 per bay. That is still a really good value.
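The per-bay math is napkin-simple, but here it is spelled out with the prices quoted above:

```
# rough dollars per 3.5" bay using the prices mentioned above
echo $(( 182 / 6 ))           # Cenmate 6-bay enclosure alone: ~$30 per bay
echo $(( (182 + 140) / 6 ))   # plus a $140 Intel N100 mini PC: ~$53 per bay
echo $(( (182 + 450) / 6 ))   # plus a maxed-out Ryzen 6800H:   ~$105 per bay
```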
When you buy or build a purpose-built NAS server, you wind up locking yourself in. If you choose a case with 5 drive bays, then you’re probably going to have to swap all your gear into a new case if you decide you need 8 bays.
As long as I have a free USB port, I can plug in another 6-bay or 8-bay enclosure when I run out of storage next year. Many of the options on the market can be daisy chained, so you can plug one enclosure’s USB cable into the previous enclosure. Even if they didn’t, you could always buy a quality USB hub and have the same flexibility.
Daisy-chaining or using a hub will limit your total available bandwidth.
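However you wind up chaining things together, it is worth verifying the link speed you actually negotiated. A 10-gigabit enclosure hanging off a 5-gigabit port or hub will quietly run at the slower speed:

```
# show the USB topology along with the negotiated speed of every hop
# (look for 5000M versus 10000M next to the enclosure's SATA bridge)
lsusb -t
```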
The Cenmate enclosure is here hanging out with a Ryzen 6800H mini PC, an Intel N100 mini PC, and a Seagate 14-terabyte USB hard disk while I was loading it up with disks. It is roughly the same width as the Acemagician M1.
Outgrow your Intel N100 mini PC? Swap in a new mini PC with a Ryzen 5700U or Ryzen 6800H. The mini PC running your NAS virtual machine is acting up? Migrate the VM’s boot disk to another mini PC and move the USB enclosure. You aren’t locked in to any single configuration.
Not only do I have those options, but all my computers run Linux. If something goes completely wrong, I could carry the USB enclosure from my network cupboard to my desk, plug it into my desktop PC, and have immediate, fast, direct access to all my data. If there’s a fire, I can wander out of the house with my laptop and the drive enclosure, and I will have all my extra data with me in a hotel ready to be worked with.
Piling things on top of a drive enclosure in the homelab is pretty reasonable. The enclosure is roughly as wide as all but my largest mini PC, and that mini PC is 80% heat sink. If you go with an 8-bay enclosure, you should be able to fit two stacks of mini PCs on top if you tip the enclosure on its side.
There is a limit to how many mini PCs, 6-bay hard drive enclosures, and small network switches you can stack in your homelab before it gets unwieldy, but unlike a full 19” rack, you can almost always balance just one more mini PC on top if you have to!
The first enclosure I liked was an older Syba model that is available in 4, 6, or 8 bays. It looks older. It doesn’t support daisy-chaining, and it uses the older 5-gigabit USB 3 standard. It has enough unused space on the bottom that it should be able to fit another drive in the same space, and that feels like a bummer. Syba has some of the lowest pricing per bay, and they also have an eSATA port on some of their enclosures. I used to use eSATA pretty regularly, but USB 3.0 is faster, and I don’t have any eSATA ports available to plug it into.
Then I was looking at various enclosures from Yottamaster. Their enclosures carried the highest price tags. They look attractive, but Yottamaster doesn’t seem to have one model of enclosure available in different sizes. They all look completely different. Some models have daisy-chaining, some have 5-gigabit USB ports, some have 10-gigabit USB ports. My favorite thing about these was that they were easy to find on Aliexpress.
I decided to purchase an enclosure from Cenmate. Their lineup of enclosures with 2, 3, 4, 6, and 8 bays all look identical aside from height. They support daisy-chaining, and their newest models have 10-gigabit USB 3. You could save a few bucks and go with their older 5-gigabit enclosures, but I figured it would be better to start with a newer model.
I was trying to balance the amount of storage I would waste on parity against wasting SATA bandwidth.
Running a RAID 5 with a 4-disk enclosure would dedicate 25% of my storage to parity, while running a RAID 5 with a 6-disk enclosure would only eat up 16%. It also helps that smaller disks tend to cost less per terabyte than larger disks, though I will be spending a bit more on electricity.
The Cenmate enclosure’s tool-free 3.5” trays are easy to use, and the latch mechanism is quite satisfying to operate!
A 7200-RPM hard disk might top out at 250 megabytes per second on the fast end of the platter and something as low as 120 megabytes per second on the inside tracks. A 10-gigabit USB 3 port can theoretically move nearly 1,000 megabytes per second.
That leaves us at an average speed of around 160 megabytes per second when all six drives have to be chooching.
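That number is nothing fancier than dividing the usable bandwidth by the number of drives:

```
# six drives sharing roughly 1,000 megabytes per second of usable USB bandwidth
echo $(( 1000 / 6 ))   # ~166 megabytes per second per busy drive, which I round down to 160
```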
What do you think? Is that a reasonable compromise between maximum speed and wasted parity space? I think it is fine. My NAS virtual machine will be bottlenecked by my mini PC’s 2.5-gigabit Ethernet ports anyway.
Nothing even manages to be apples and oranges
There’s a gap here. An N100 or N150 mini PC from Trigkey or Beelink can be found with a pair of 2.5-gigabit Ethernet ports just like a $500 4-bay UGREEN DXP4800, but the most costly 8-bay UGREEN NAS gets you an upgrade to a pair of 10-gigabit ports. Mini PCs with 10-gigabit ports are the worst combination of rare, large, or expensive.
If you have a need for 10-gigabit Ethernet ports, then a 6-bay or 8-bay UGREEN NAS might work out to a better value. My suspicion is that the Venn diagram of people who need 10-gigabit Ethernet and the people who can get by using slow mechanical hard disks would be two circles that are barely even touching.
What are you planning on doing with your big hunk of bulk storage? I watch the occasional movie, but I can easily stream video to every screen in the house with 1-gigabit Ethernet. Sometimes I dump 30 gigabytes of video off my Sony ZV-1, but the microSD card is also way slower than 1-gigabit Ethernet. I run daily backups both locally and remotely, and my remote backups finish in a reasonable amount of time over my 1-gigabit Internet connection, so I won’t notice the difference between that or a 10-gigabit local backup.
Is the Cenmate enclosure reliable?
At the moment we have no real idea! My Cenmate enclosure only just arrived, but I am working on being as mean to it as I can.
I stuffed it full of spare 4-terabyte SATA disks that I had lying around my office. I plugged the Cenmate enclosure into one of my mini PCs, I set up a RAID 5, and I attached that RAID 5 to a virtual machine. I made sure the virtual machine is light on RAM so not much will be cached.
I fired up tmux, and I have one window continuously looping over a dd job writing sequentially to a big, honkin’ file. I have another window running dd that will be forever reading the RAID 5 block device sequentially. I have a third window running an old-school bonnie++ disk benchmark.
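Here is roughly what that tmux session looks like. The md device, mount point, and file sizes are assumptions; adjust them for your own setup:

```
# a rough sketch of the torture test described above
tmux new-session -d -s burn

# window 0: loop a big sequential write onto the RAID 5 filesystem
tmux send-keys -t burn:0 \
  'while true; do dd if=/dev/zero of=/mnt/raid5/bigfile bs=1M count=100000 oflag=direct; done' C-m

# window 1: endlessly read the RAID 5 block device sequentially
tmux new-window -t burn
tmux send-keys -t burn:1 \
  'while true; do dd if=/dev/md0 of=/dev/null bs=1M iflag=direct; done' C-m

# window 2: an old-school bonnie++ run fighting the other two windows for IOPS
tmux new-window -t burn
tmux send-keys -t burn:2 'bonnie++ -d /mnt/raid5 -u root -x 1000' C-m
```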
I don’t care how fast any of this goes. The two separate sequential tasks will be fighting the benchmark task for IOPS, so it is all going to run very poorly. What I do care about is whether I can make any disks or the USB SATA chip reset or error out.
I will feel pretty good about it when it survives for a couple of days. I will feel great about it after it has been running for more than a week.
How are things going so far?
The follow-up to this blog post will be a more direct review of the Cenmate unit, but it seems appropriate to include what I learned on the first day with the enclosure!
I have a box with five 4-terabyte SATA drives. These used to live in the homelab server I built for myself in 2015. My plan was to stick those in the enclosure along with an underutilized 12-terabyte drive to build a 6-drive RAID 5.
One of those 4-terabyte disks is completely dead, and I haven’t extracted the 12-terabyte drive yet. I was impatient, so for today I set up a quick RAID 5 across the first terabyte of the four good drives.
The enclosure is plugged into a 10-gigabit USB port on my Trigkey N100 mini PC, and mdadm said I was hitting 480-megabyte-per-second reads and 160-megabyte-per-second writes during the initial RAID build. That is as fast as these old hard drives can go while building a fresh RAID 5 array. I also verified that smartctl is able to report on every drive bay.
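For the curious, the setup was approximately this, with the usual caveats that your drive letters will differ and that this is a from-memory sketch rather than a copy-paste recipe:

```
# partition only the first terabyte of each of the four good drives
for d in /dev/sd{b,c,d,e}; do
  sudo parted -s "$d" mklabel gpt mkpart raid 1MiB 1TiB
done

# build the quick four-disk RAID 5 and watch the initial sync
sudo mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sd{b,c,d,e}1
cat /proc/mdstat

# SMART data works through the Cenmate's USB bridge; some bridges need an extra -d sat
sudo smartctl -a /dev/sdb
```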
I pulled my spare 12-terabyte drive from my desktop PC and stuck it into a free bay in the Cenmate enclosure, and running dd to read data sequentially from five drives got me up to 950 megabytes per second. I am just going to call that 10 gigabits per second.
The Cenmate enclosure is louder than I hoped yet quieter than I expected. I usually measure the sound of my office with the meter sitting on the desk in front of me, because I care about what I can hear while working. Usually my idle PC’s quiet fans put me at around 36 dB.
The Cenmate is off to my side just barely in arm’s reach, and its fans push that up to 45 dB. I get a reading of 55 dB when I hold the meter up next to the unit. It isn’t ridiculously loud, but I will be happy to move it to my network cupboard at the end of the day!
The conclusion so far
So far, so good! I paid around $30 per bay for a USB SATA enclosure with six 3.5” drive bays, and it is for sure able to move data four times as fast as a 2.5-gigabit Ethernet port. It is cheap, fast, dense, and it even looks nice and clean. We’ll see if it winds up being reliable.
I have already moved the Cenmate enclosure to my network cupboard. Long-term testing is progressing, but it is progressing slowly. I keep finding out that my old hard disks are starting to have bad sectors or other weird problems, so I won’t be able to start properly beating on a full enclosure for a couple of weeks.
I believe that USB hard drive enclosures are a great way to add additional storage to your homelab, especially if you need space for big video files or more room for backups. The enclosures are inexpensive, extremely dense, and it sure looks like they’re going to wind up being reliable as well.
Have you been using a USB enclosure for your homelab’s NAS storage? Or are you a diehard SATA or SAS user? Join our Discord community! We’d love to hear about your successes or failures with USB storage!
It has been almost a year since I wrote about using a $140 Intel N100 mini PC as a game console and Steam-streaming device in the living room. I don’t know what part of this arrangement tickles so many people, but that post has been in my top 10 most read blogs ever since it was published. This is weird to me, because this is not a gaming blog, and my game-related posts don’t usually get many views.
Last year, I tested a mini PC for gaming that was actually destined to live in my homelab. This year, I tested a Ryzen 6800H mini PC in my homelab before moving it to its permanent home in the living room.
What’s the tl;dr for this? Bazzite is delightful. It functions very much like SteamOS on the Steam Deck, but it also installs things like Decky and EmuDeck for you. Bazzite installed very easily on my Acemagician M1, and my Ryzen 6800H has enough iGPU horsepower to run Grand Theft Auto 5 Enhanced at nearly 60 frames per second with reasonable settings.
I was looking for the sweet spot where performance and price meet for the best value. I wanted enough power to play a good percentage of my existing Steam library without breaking the bank.
The Intel N100 mini PC I tried out for gaming last year was fun! It can play pretty much any 2D game or top-down shooter I could think of, and it could emulate anything up to around the Nintendo Wii. That little $140 box also did a fantastic job at streaming Steam games from my real gaming PC over the network.
I’m not going to be playing any first-person shooters in the living room. I will continue to use a mouse and keyboard for those at my desk. That said, there are a lot of more modern games in my library that would be fun in the living room with a controller. I wanted to be able to run games like Red Dead Redemption 2 and Grand Theft Auto 5.
It also helps that the Ryzen 6800H is in the same league as the Steam Deck. The Steam Deck has faster quad-channel memory giving it a slight boost, but the 6800H pulls ahead due to having 50% more iGPU cores than the Deck. Even better, the Ryzen 6800H mini PC can run at 45 watts, so it can clock a little higher to run games just a little better.
If you see people having success playing a game on the Steam Deck, then the game will run about 20% better on a Ryzen 6800H with dual-channel RAM.
I feel the Ryzen 6800H is the sweet spot between price and performance for a Steam game console
The prices of mini PCs with faster iGPUs go up faster than their performance increases. You can spend an extra $150 on a mini PC with a Ryzen 7840HS or 8845HS to upgrade to the 780M iGPU, but that seems to only be roughly 20% faster.
The most important question to ask yourself is whether the faster mini PC will allow you to play games that you wouldn’t be able to run otherwise. There are a ton of games that wouldn’t run on my $140 Intel N100 mini PC, while the $309 Ryzen 6800H opens up a whole slew of newer games for me to play.
The prices on mini PCs go up rather sharply after the 6800H. Those more expensive mini PCs do come with more than just incremental CPU and GPU upgrades. You get more RAM and more storage.
The trouble is that you will be paying $100, $200, or even $400 more for a 20% or 30% boost in frame rates, but you won’t make use of that extra RAM while gaming.
Things get rather interesting outside the mini PC space once you start pushing past the $500 mark. You could build a mini-ITX gaming PC around a $220 Intel Arc B570 GPU and absolutely blow any mini PC out of the water. It’ll be a little bigger, but it will be upgradeable and oh so much faster!
Who cares about Grand Theft Auto 5! That ran on my PlayStation 3!
This is true. I played through the story, and I played online with my friends on my PlayStation 3. I can tell you that Grand Theft Auto 5 on my mini PC is a very different experience.
The PlayStation 3 could only render the game at 1280x720 and scale that up to 1920x1080 with a basic upscaler, and it couldn’t even maintain 30 frames per second. I am rendering the game at 1920x1080 on my mini PC. I have the settings dialed in to where the frame rate stays at around 60, but I have been in situations where things dip into the low fifties.
There is definitely some room for dialing things up a bit more.
Definitely install Grand Theft Auto 5 Enhanced instead!
I wound up installing the enhanced version of GTA 5 yesterday, and it runs well and looks better than the legacy version. It supports FSR 3 natively, so I don’t have to use Gamescope to upscale using the much worse FSR 1 upscaler.
I keep turning up new knobs that make the game look nicer without noticeably dropping the frame rate. The game stays well above 60 frames per second if I set FSR 3 to performance mode, but that is rendering the game at around 1280x720, and it is extremely obvious that the resolution is so low.
This is about as low as the frame rate tends to go in GTA5 Enhanced on my Ryzen 6800H Bazzite mini PC
The game stays mostly in the mid fifties when FSR 3 is set to balanced, and it looks a lot better. I think that is a good tradeoff.
There are no lighting, shadow, or texture quality settings that I can dial down that bring the FPS up over 60, but I was able to push the lighting to high, the shadows to soft, and enable ambient occlusion without losing any performance. There are probably still settings I can push a notch higher without losing performance.
You might be better served by a Steam Deck!
The Steam Deck can run just about any game my Ryzen 6800H mini PC can run, and the cheapest Steam Deck only costs $100 more. What is the trade off there?
The Steam Deck is portable. It has a screen, a battery, and a built-in controller, so you can play it on an airplane. You can purchase an inexpensive dock to connect the Deck to both power and your TV at the same time, so you can use the Steam Deck just like I am using my mini PC, but you would retain the option to pick up the Steam Deck and walk away.
My mini PC is faster, came with twice as much storage, and I saved $100. Even so, I suspect I might be having more fun with a Steam Deck.
NOTE: While I was writing this, refurbished Steam Decks with 256 gigabytes of storage showed up in Valve’s store for $319.
Be careful choosing your mini PC!
My Acemagician M1 came with a single 16-gigabyte SO-DIMM installed. I wound up buying a 32-gigabyte stick of DDR5 for $72 to upgrade one of my Intel N100 mini PCs, and I moved its old 16-gigabyte stick to my Acemagician M1 to upgrade it to dual-channel RAM.
Using both channels doubles the available memory bandwidth. This isn’t a big deal in my homelab, because most processes aren’t held back all that much by a single channel. Gaming with an iGPU requires every ounce of memory bandwidth you can find.
I literally doubled my frame rates in Grand Theft Auto 5 when I installed the second SO-DIMM. You can buy a 16-gigabyte DDR5 SO-DIMM for $40. That would bring my total investment up to $350. The link leads to the same SO-DIMM I am using in my own Acemagician M1.
Other mini PCs on Amazon specifically list that they ship with two 8-GB SO-DIMMs. That is plenty of RAM for low-end gaming, and two SO-DIMMs is what you want.
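If you are not sure what your own mini PC shipped with, you can check how many SO-DIMM slots are populated without pulling the lid off. This is plain dmidecode, nothing vendor-specific:

```
# list each memory slot with its size and speed; a single populated slot means single-channel
sudo dmidecode -t memory | grep -E 'Locator|Size|Speed'
```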
You don’t have to shop around or think about this at all if you buy a Steam Deck.
I am keeping my eye out for Ryzen 6800H or 6900HX mini PCs that claim to ship with two 8-gigabyte sticks of RAM. Here is what I have so far:
Bazzite is kind of like SteamOS on steroids. Both are immutable Linux distributions—that means you can’t accidentally goober up your base install. Both boot directly to Steam’s big-picture mode. Both run all your games on Linux using Proton under Gamescope. Both have a quick menu that lets you adjust frame-rate limits, FSR upscaling, and power limits.
Bazzite builds on that. While SteamOS now only supports the Steam Deck, Bazzite has images for AMD, Intel, and Nvidia GPUs. Bazzite also lets you click a button during setup to install things like Decky, to inject frame generation into games that don’t support it natively, and EmuDeck, so you can run NES, PlayStation, Wii, and other old console games.
I haven’t played much Mario Galaxy on my Ryzen 6800H yet, but I did verify that it runs at 60 FPS with 3x render resolution for 1080p, and that there is a ton of wiggle room for turning up settings.
I have barely scratched the surface with Bazzite. I suspect it deserves its own blog post, but I’d like to dig a little deeper before I attempt to write it!
The Ryzen 6800H has lower Steam streaming latency than my Intel N100
At this point, I think I am locked in to my test game for measuring Steam Link latency. I used Gunfire Reborn at 1080p to test the first Intel N100 gaming mini PC, so that is what I will continue to use. That will help keep things fair.
| Device | Wired | WiFi |
| --- | --- | --- |
| Ryzen 6800H | 6 ms | 8 to 11 ms |
| Intel N100 | 8 ms | 11 ms |
| Steam Link | 16 ms | 15 ms |
I think it is important to mention that the original Steam Link hardware from 2018 is still fantastic. So many games are just fine with an extra 16 milliseconds of latency, especially if you are using a controller. I played some Red Dead Redemption 2 with 70 milliseconds of latency over T-Mobile’s 5G network. Red Dead is a slow-paced game, so I could only barely tell that there was additional latency. The 16 milliseconds of the Steam Link hardware from seven years ago is imperceptible here.
That said, the 6 or 8 milliseconds of these mini PCs completely goofs up my timing when playing Dead Cells. You’d be likely to have the same problem playing Super Mario Bros. for the NES with similar additional latency.
It sure is hard to read the information on Steam game streaming’s statistics output, isn’t it?!
You shouldn’t entirely trust my latency measurements over WiFi. Is there REALLY something better optimized in the hardware of my Ryzen 6800H mini PC? Does this WiFi chipset just get along better with the access point in my living room? Am I just having a luckier day with interference almost a year after testing the Intel N100? Will the radio situation in your home be equivalent to mine? Probably not.
I am pretty excited to see 8 ms of latency while streaming to the TV in the living room. Every time I fire a game up, it will start at around 11 ms of latency before settling in to 8 ms within the first ten or twenty seconds. It seems to do a good job staying there, too.
As awesome as it is that my Acemagician M1 can run Grand Theft Auto 5 Enhanced, it is the sort of game where I wouldn’t notice an additional 6, 8, or even 16 milliseconds of latency. I can run the game maxed out on my gaming PC and stream it to my living room.
Do you want to stream games from your gaming rig or run them in the living room?
I say you should be prepared to do both.
I can’t be using my desktop PC in my office and gaming in the living room simultaneously, but you may have a situation where you need to do work while your kids play games. They can play a lot of games on a beefy enough mini PC without interrupting your work, but they can still stream the fancier games from your gaming PC at other times.
For my purposes, the games that are the most latency sensitive aren’t the ones that require an overpowered gaming rig. I was able to play Dead Cells and every platformer up to New Super Mario Bros. Wii on the $140 Intel N100.
That isn’t to say that I don’t play heavy, modern games where latency is important. Those games also happen to be the games where I am going to be using a mouse and keyboard. I won’t be playing those in the living room.
Why put a mini PC in the living room instead of a PlayStation or Xbox?
If you are anything like me, you have a huge backlog of unplayed or underplayed games in your Steam library. I have collected over 2,000 games by purchasing bundles. Usually I play one or two games in a bundle, but there are often games that look like they’d be fun with a controller on the couch.
Having a comfortable way to play couch-friendly games is something I have been missing for quite a number of years, and being able to play my back catalog is going to be awesome.
Maybe you don’t have a Steam library at all, and you don’t know anything about Steam. One of the most awesome things about Steam is the sales. You can get deep discounts on older games several times a year, and sometimes those games aren’t even that old. You’ll probably save quite a bit of money buying your games on Steam instead of in Sony’s or Microsoft’s stores.
Don’t I need a controller?!
Yes. I have more than a few DualShock 4 controllers for the PlayStation 4, so I started out using one of those. I have been a fan of first-party PlayStation controllers for a long time. They work great with Linux, and pair right up with Bazzite. Steam understands them. Most importantly for me, though, is that the d-pad on Sony controllers is quite good.
That said, I have been itching to try one of the fancier controllers made by GameSir, and I’ve been having a problem. My DualShock 4 controller kept losing signal and inputs sitting 15’ away from the TV in the living room. This isn’t a new problem. Our Nintendo Switch has trouble, too.
I figured spending $50 on a GameSir Cyclone 2 with its 1,000-Hz USB dongle and fancy microswitch buttons and D-pad would be a good solution to this problem. My connectivity wasn’t perfect, but it was immediately better. Putting the GameSir dongle on a short USB extension and positioning it 6” from the Acemagician M1 solved all my problems.
The GameSir controller is fantastic. I wrote a long-winded blog post about it, but the tl;dr is that it is a nicer controller than anything Sony makes, and it costs $10 to $20 less. If you’re starting from scratch, then I think this is the place to start.
Final Thoughts: The Future of Living Room Gaming is Flexible
So, where does all this leave us? It’s clear the landscape of accessible gaming is shifting. We’re no longer limited to dedicated consoles or expensive gaming rigs to enjoy a great experience in the living room. The Ryzen 6800H mini PC, paired with the magic of Bazzite, offers a compelling blend of power, flexibility, and value. While the Steam Deck remains a fantastic, portable alternative, the mini PC route opens doors for a dedicated, potentially more powerful, and customizable setup.
Ultimately, the “best” solution depends on your needs. Do you prioritize portability? Value a plug-and-play experience? Or crave the freedom to tinker and optimize? There’s a fantastic option out there for everyone.
But don’t just take my word for it! The world of mini PCs, emulation, and Steam streaming is constantly evolving, and it’s much more fun to explore it together. Come join the conversation in our Discord community! Share your own experiences, ask questions, get help with your builds, and discover new gaming possibilities with fellow enthusiasts. We’d love to have you!
Let’s build the perfect living room gaming setup – one mini PC at a time.
I have been keeping my eye on various Ryzen 6000 mini PCs. They aren’t exactly bleeding edge, but they have a rather powerful iGPU, and they have more modern CPU cores than a Ryzen 5700U, like the one in my laptop, or a Ryzen 5800U.
My Ryzen 6800H mini PC while I am installing Proxmox. It is in my home office sitting on top of my off-site Trigkey N100 Proxmox server and its 14-terabyte external USB storage.
I have been waiting to see a nice specimen drop under $300, preferably one with 2.5-gigabit Ethernet and a pair of m.2 slots. If it weren’t for all the upcoming tariffs stirring up trouble for us, I would have expected Ryzen 6600U or Ryzen 6800H mini PCs to take the $250 price point away from the Ryzen 5800U mini PCs before summer ends.
I got everything I wanted except the price. I saw the Acemagician M1 on sale for $309, and I just had to snatch one up. I don’t really NEED to expand my homelab, but that is definitely a good enough price for me to be excited enough to do some testing!
Like with most interesting things, the value proposition of a Ryzen 6800H mini PC can be less than simple.
Let’s compare it to my lowest cost Intel N100 Proxmox node. The Ryzen 6800H CPU is nearly four times faster, can fit two or three times as much RAM, and has a faster Ethernet port, but the N100 manages to transcode around 50% faster. At today’s prices, the Ryzen 6800H only costs a hair more than twice as much as the Intel N100.
Do you value that extra video-transcoding performance? Maybe you should save some cash and add an Intel N100 mini PC to your homelab, especially when you consider that the Ryzen 6800H burns 50 watts of electricity while transcoding. The Ryzen 6800H is just fast enough to transcode 4K 10-bit tone-mapped video in real time for at least two Jellyfin clients.
Maybe that is just the right amount of video-encoding performance for your needs. If you value that extra CPU chooch, then maybe you should splurge for a Ryzen 6600U or 6800H.
Should you avoid buying from Acemagician?
Maybe. There are reports that Acemagician installs spyware on the Windows 11 image that ships on their hardware. I never booted Windows 11 on mine. The very first thing I did was install Proxmox, so this didn’t matter to me at all.
We are still at a point where the handful of Goldilocks mini PCs don’t tend to go on sale at the lower price points. There are a lot of mini PCs with gigabit Ethernet and two m.2 slots, or with 2.5-gigabit Ethernet and one m.2 slot. Sometimes you even get a pair of Ethernet ports, and sometimes BOTH are 2.5-gigabit Ethernet ports. Finding the right combination for a good price can be a challenge!
I could see why you might want to vote against Acemagician with your wallet, but this was the correct porridge for me. It would have been nice if it had a second 2.5-gigabit Ethernet port, but that wasn’t a deal-breaker for me at $309.
Home Assistant says that my power-metering smart outlet reads between 6.1 and 7.5 watts most of the time while my Acemagician M1 is sitting there waiting for a task, but it shoots up to a whopping 50 watts while transcoding video!
There is an oddity, though. Mine shipped with a single 16-gigabyte DDR5 SO-DIMM. I was expecting a pair of 8-gigabyte SO-DIMMs.
On one hand, that means I didn’t acquire a pair of worthless 8-gigabyte DDR5 SO-DIMMs that would be destined for a landfill. On the other hand, I am thinking about using this particular mini PC as a gaming console in the living room, so I could really use that dual-channel RAM for the iGPU. Not only that, but my single-channel RAM might be having an impact on my Jellyfin testing.
OH MY GOODNESS! This isn’t unexpected, but you absolutely need dual-channel memory for good 3D-gaming performance on your Ryzen 6800H. My FPS doubled in most games when I dropped in a second stick of RAM, and some weird regular stuttering in Grand Theft Auto 5 completely went away.
I should also say that taking apart the Acemagician M1 was the opposite of a delight. They hide the screws under the rubber feet, and when you pop the easy side of the lid off you are greeted with the big cooling fan. You have to finesse the entire motherboard out of the shell to reach the memory and NVMe slots underneath.
The rest of this post will be about the Ryzen 6800H and not specifically my Acemagician mini PC.
The Ryzen 6800H is overkill for my personal homelab, but I wanted to be able to see how my new mini PC might handle some light gaming duties in the living room. Something like the 6-core Ryzen 6600U would have been a better fit for my homelab, and you can find those at better prices, but you get twice as many GPU cores out of the 8-core Ryzen 6800U or 6800H.
That isn’t a big deal for your homelab. The smaller iGPU probably has just as much Jellyfin transcoding performance as the bigger one.
I already said some of this in the tl;dr. The Ryzen 6800H is roughly four times faster than an Intel N100 and maybe 25% faster than a Ryzen 5800U.
All mini PCs for your homelab are a good value
This statement is mostly true. You should make sure you’re buying when there is a sale, because there is always at least one mini PC brand running a sale on Amazon on any given day. You may have to wait to get a good deal on exactly the specs you want, but another deal is always right around the corner. We keep an eye on our favorite mini PC deals in our Discord community.
I have been doing a bad job keeping my mini PC pricing spreadsheet up to date. When I last updated it, you could get an Intel N100 mini PC for $134, a Ryzen 5560U for $206, or a Ryzen 6900HX for $439. Each of those is roughly twice as fast as the model before it, and each costs around twice as much. The prices and performance don’t QUITE map out that linearly if you plot them on a graph, but none would stray far from the line.
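If you want to see how close to linear that really is, here is a quick sketch using the sale prices above. The relative performance multipliers are my rough “each tier is about twice as fast” estimates, not measured benchmark scores:

```python
# Rough price-vs-performance check using the prices quoted above.
# The relative_perf values are ballpark "roughly twice as fast" estimates,
# not measured benchmark results.
mini_pcs = {
    "Intel N100":   {"price": 134, "relative_perf": 1.0},
    "Ryzen 5560U":  {"price": 206, "relative_perf": 2.0},
    "Ryzen 6900HX": {"price": 439, "relative_perf": 4.0},
}

for name, specs in mini_pcs.items():
    dollars_per_unit = specs["price"] / specs["relative_perf"]
    print(f"{name}: ${specs['price']} -> ${dollars_per_unit:.0f} per unit of performance")
```

At those particular prices the bigger chips actually come out a hair ahead per unit of performance, which is the sort of wobble I mean when I say the line isn’t perfectly straight.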
We haven’t seen deals that good on an Intel N100 or lower-end Ryzen 5000-series mini PC in a while. You’re going to wind up paying $150 or more today, possibly closer to $200. And there aren’t many 6-core Ryzen 5000-series mini PCs around now, so you have to pay a bit more for an 8-core Ryzen 5800U.
What’s exciting today is that the 8-core Ryzen 6000 mini PCs with 12-core RDNA2 iGPUs are four times faster than an Intel N100 or Intel N150 mini PC while only costing a bit more than twice as much.
Lots of small Proxmox nodes, one big one, or something in between?!
Do you want to save money on your electric bill? All these laptop-grade CPUs consume a similar amount of power when they aren’t doing any serious work, so it might be better to splurge on one overpowered mini PC that idles at 9 watts, because four Intel N100 boxes will each idle at 7 watts.
NOTE: Don’t just assume that someone else’s idle numbers will exactly match your own. Cramming four times as many virtual machines onto a Ryzen 6800H just because it has four times the CPU and RAM of an Intel N100 also means that you have four times as many opportunities for mostly idle virtual machines to keep the CPU awake. We aren’t always comparing apples to apples.
That is 9 watts vs. 28 watts. That difference only adds up to around $20 per year where I live, but that might be enough of a difference in power consumption to pay for a mini PC for someone in Europe over the course of 3 or 4 years.
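If you want to plug in your own electric rate, the math is simple. This little sketch assumes 24/7 operation and an example rate of $0.12 per kilowatt-hour, so swap in whatever you actually pay:

```python
# Back-of-the-envelope cost of idle power draw, assuming 24/7 operation.
# The $0.12 per kWh rate is an example value -- plug in your own rate.
HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.12  # assumed example rate in dollars

def annual_cost(watts: float) -> float:
    """Yearly electricity cost for a device idling at `watts`."""
    return watts / 1000 * HOURS_PER_YEAR * RATE_PER_KWH

one_big_node = annual_cost(9)         # one Ryzen 6800H idling
four_n100_nodes = annual_cost(7) * 4  # four Intel N100 boxes idling
print(f"One 9 W node:   ${one_big_node:.2f} per year")
print(f"Four 7 W nodes: ${four_n100_nodes:.2f} per year")
print(f"Difference:     ${four_n100_nodes - one_big_node:.2f} per year")
```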
On the other hand, you may have some vital services that need to run alongside problematic ones. Maybe your home router runs in a virtual machine, and every once in a while your Jellyfin or Plex container goofs up your iGPU and requires a reboot of the host. You probably don’t want your home network going down just because you had to fix a problem with Plex.
You have the option of moving Plex and some less vital services over to their own inexpensive Intel N100 mini PC while running the rest of your homelab on a mid-range or high-end mini PC. You have a lot of flexibility in how you split things up.
I have been running Geekbench 5 on all my mini PCs and keeping track of the scores, but why Geekbench 5? I didn’t wind up buying Geekbench 6, because I am unhappy that Geekbench no longer includes an AES test. I have been extremely interested in improving my potential Tailscale encryption speeds, so this number has been a good indicator of whether or not a particular CPU would be a good upgrade for me.
It also helps that I have all sorts of historical Geekbench 5 scores in my notes. That makes it easier for me to compare older machines to my current hardware.
| Mini PC | Single Core | Multi Core |
|---|---|---|
| Trigkey Intel N100 DDR4 | 1,053 | 2,853 |
| Topton Intel N100 DDR5 | 1,002 | 2,786 |
| Minisforum UM350 Ryzen 3550H | 955 | 3,215 |
| Acemagician M1 Ryzen 6800H 1x16GB | 1,600 | 7,729 |
| Acemagician M1 Ryzen 6800H 2x16GB | 1,646 | 9,254 |
The multi-core score did improve by about as much as I would have expected, but my multi-core score is lower than many of the mini-PC scores in Geekbench’s database. Other people are near or above 10,000 points.
We should probably also talk about Jellyfin transcoding performance. Unlike the GCN5 iGPU in processors like the Ryzen 3550H or 5800U, the Ryzen 6800H’s RDNA2 iGPU supports hardware tone mapping. This is important today because most content that you download on the high seas will be 10-bit HDR video. If you need to play it back on a non-HDR display, then you will want Plex or Jellyfin to tone map the content down to 8-bit for you. The Intel N100 and Ryzen 6800H can both do that for you.
I played my usual 4K 10-bit test movie, and my Ryzen 6800H was transcoding at between 42 and 56 frames per second. It was also burning 50 watts of electricity as measured at the power outlet while transcoding.
I am not sure why the only Jellyfin encoding screenshot I saved shows 51 FPS!
The Intel N100 can manage 75 frames per second while transcoding the exact same movie. I don’t believe I measured power consumption while transcoding on the N100, but both of my N100 mini PCs top out at around 20 watts maximum. The Intel N100 is faster and more efficient at this task than the Ryzen 6800H.
That isn’t the actual performance limit for either machine. When Jellyfin is transcoding two or more videos, the total throughput of all the videos will exceed the single-video maximum.
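If you want to translate those transcode speeds into simultaneous viewers, here is a quick back-of-the-envelope calculation assuming typical 24-frames-per-second movie content. Keep in mind that it is conservative, since total throughput climbs a bit when multiple streams run in parallel:

```python
# Rough estimate of how many 24 fps movies each box can transcode in real time.
# Dividing single-stream transcode speed by playback frame rate is conservative,
# because total throughput goes up when multiple streams run at once.
PLAYBACK_FPS = 24  # typical movie frame rate

def realtime_streams(transcode_fps: float) -> float:
    return transcode_fps / PLAYBACK_FPS

print(f"Ryzen 6800H at ~50 fps: {realtime_streams(50):.1f} streams")
print(f"Intel N100 at ~75 fps:  {realtime_streams(75):.1f} streams")
```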
So, as you can see, diving into the world of mini PCs for your homelab is a fascinating exercise in balancing power, efficiency, and price. The Acemagician M1 with its Ryzen 6800H offers a significant step up in processing power compared to the Intel N100. While it’s not perfect – the fiddly build and single RAM stick were minor inconveniences – the performance gains are undeniable.
Ultimately, the “best” mini PC truly depends on your specific needs and priorities. Do you prioritize power efficiency and low cost? The N100 is a fantastic choice, especially if your mini PC will spend many hours each day transcoding video! Need a bit more punch for demanding services or memory-heavy workloads? A Ryzen 6600U or 6800H might be the sweet spot.
We’ve only scratched the surface here, and the mini PC landscape is constantly evolving. If you’re building your own homelab, debating upgrades, or just enjoy geeking out over hardware, we’d love to have you join our community!
Come hang out with us on the Butter, What?! Discord server! We share deals, troubleshoot issues, discuss projects, and geek out over all things homelab and DIY NAS. Share your setups, ask questions, and learn from others – we’re a friendly bunch who love to help. We’re always swapping tips and tricks on finding the best hardware and discussing optimal configurations for different homelab services, so you’ll be among the first to know about the next great mini PC deal!
I have been gaming with one of my Li’l Magnum! fingertip gaming mice for the last three months. The model I am currently using weighs 16.4 grams and is built using the internals from a VXE Mad R mouse.
This is my lightest Li’l Magnum! so far, and the VXE Mad R is a fantastic value. For $43 you get a PAW3395 sensor, a 200-mAh battery, and an 8 KHz receiver. It is that last part that I am particularly excited about, but gaming with your polling rate dialed up to 8,000 Hz drains your mouse’s battery fast.
I will give you the tl;dr right here in the intro. I spent $10 on an inexpensive pair of magnetic USB-C charging adapters. I stuck one of the USB-C ends to the back of my monitor, and I installed one of the 0.8-gram magnetic doodads in my mouse. That brought my mouse up to 17.2 grams, but I can just dock it up on my monitor when I’m not using it, so I never have to think about the battery again.
I cheated. The magnetic doodads arrived, and I immediately ran a cable and stuck the doodad behind my monitor with a big glob of blue tack. That was enough to let me try out my new dock, record a video, and figure out if this was a good idea.
It is a good idea. I like it a lot. Everyone who stops by the house seems to get a kick out of it. So I designed a simple universal L-shaped bracket to hold my magnetic charger up on my monitor.
The bracket is pretty big. I wanted to have a good amount of surface area for the double-stick tape to get a solid grip.
I am going to tell you that I haven’t done proper science. I don’t know how many actual hours of use the mouse can manage on a single charge. I just made sure to play some games every day, kept an eye on the battery level, and I’d call it done when the percentage dropped low enough that I figured it would die on me in the middle of gaming the next day.
I don’t use my fingertip mouse unless I am gaming.
My VXE Mad R could make it four days at 8K, eight days at 4K, and significantly longer at 2K or 1K polling.
I think four days is quite reasonable, but it is a weird schedule to remember when I need to plug my mouse in to top it off.
Do you really need 8K polling?!
Probably not, but I say that every little bit helps. A normal gaming mouse polls for changes 1,000 times each second. That means that when you click the button it may take as long as one millisecond for your game to register that you’ve decided to fire your weapon. That is an imperceptible amount of time.
When you dial things up to a polling rate of 8K, you drop that maximum to 0.125 milliseconds. This is also an imperceptible amount of time.
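If you want to check that arithmetic yourself, the worst-case delay added by polling is just one polling interval:

```python
# Worst-case latency added between a click and the next USB poll.
for rate_hz in (1000, 2000, 4000, 8000):
    worst_case_ms = 1000 / rate_hz  # one full polling interval, in milliseconds
    print(f"{rate_hz:>5} Hz polling -> up to {worst_case_ms:.3f} ms added")
```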
This might matter when you are playing against other people. You aim at the other person, and they aim at you. You see that you are staring directly at one another. You have identical hardware. You click at the same time, but their mouse is set to 8K and yours is set to 1K. You died almost one full millisecond before the game registered your click.
This is assuming that your mouse hardware is doing a good job, and that it isn’t lying to you.
High polling rates can be problematic
I am running Linux. Your experience may be different on Windows, but I haven’t encountered a single first-person or third-person shooter that has been grumpy about 8 KHz polling. My frame times are always rock solid.
My troubles have been outside games. Video footage playing on YouTube in Firefox will freeze while the audio continues to play if I jiggle my mouse around. Sometimes the mouse pointer will be jumpy and lag behind when I move it over certain programs.
I didn’t keep track of exactly which things still work well at 2 KHz and 4 KHz, but things are definitely less likely to be problematic down there.
I would be working hard on an automated solution for this if my Li’l Magnum! were my daily driver for productivity tasks. I was already sniffing USB traffic from ATK’s configuration app to figure out which USB commands I might need to send to change my mouse’s polling rate!
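I haven’t finished reverse engineering the actual commands, but for the curious, that automated solution would probably look something like this sketch using the hidapi Python bindings. The vendor ID, product ID, and report bytes below are placeholders, not real values for any VXE or ATK mouse; the real payload is exactly what the USB sniffing is meant to uncover.

```python
# Sketch of switching a mouse's polling rate by sending a HID feature report,
# using the hidapi Python bindings (pip install hidapi).
import hid

# Placeholders -- these must be replaced with values captured from the vendor's app.
VENDOR_ID = 0x0000
PRODUCT_ID = 0x0000
POLLING_8K_REPORT = [0x00, 0x00, 0x00]  # hypothetical feature-report payload

if VENDOR_ID == 0x0000:
    raise SystemExit("Fill in the real IDs and report bytes before running this.")

device = hid.device()
device.open(VENDOR_ID, PRODUCT_ID)
try:
    device.send_feature_report(POLLING_8K_REPORT)
finally:
    device.close()
```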
Using a fingertip mouse to scroll Reddit and Mastodon isn’t comfortable. I just toss my Li’l Magnum! into its magnetic dock when I am done playing games and move my basic Logitech G305 back into place.
That single millisecond may not matter
There are so many places to shave off latency. The difference between enabling V-Sync, completely uncapped FPS, or capping your FPS just below your VRR maximum could be almost 10 milliseconds. Setting that up correctly could be a free boost for you.
Upgrading to an OLED gaming monitor might shave 5 to 10 milliseconds of latency off the already nice IPS monitor you are currently using. That might be an expensive upgrade.
The nice thing is that you get to add these little improvements together. Five milliseconds from tuning your settings, plus 5 milliseconds from a monitor upgrade, plus one millisecond from polling your mouse at 8 KHz adds up to an 11 millisecond advantage.
That last millisecond of latency from your mouse might be a free performance upgrade. You may already have the hardware to do it, but you’re just not excited about charging your battery every three days.
What about adding 0.8 grams to save 2 grams?!
I have stopped chasing grams. I have trouble telling the difference between my 16.4-gram and 25.2-gram Li’l Magnum! mice while gaming. The difference is obvious when you pick them up, but I quickly forget what I am using after gaming for a few minutes. I suspect this is because either of these mice weighs less than my index finger, so they both feel like almost nothing.
That said, I know for certain that some people want to shave every gram off their mouse that they possibly can. One option is to swap out the 200-mAh stock battery for a 25-mAh or 50-mAh battery. The stock battery weighs about 4 grams, and you might be able to shave three of those grams off by switching to a lighter battery.
The trouble is that the 50-mAh battery won’t last you through an entire evening of gaming. You can probably top off a battery that small if you plug the mouse in when you go on a bathroom break, but plugging and unplugging a USB-C cable is a pain.
Tossing your 17-gram mouse onto a magnetic connector under your monitor requires significantly less effort. Maybe it is worth adding back 0.8 grams after saving 3 grams just for the convenience. You’d still save two grams, but charging would be almost effortless.
Why did I choose this model of magnetic charger by NETDOT?
There are some nice looking magnetic charging adapters that support 240-watt USB-C PD for around $20 each. That seems way too fancy. I don’t expect to ever use a mouse that won’t charge using old-school 5-volt USB power.
Many of the older, cheaper 5-volt magnetic chargers have round ends so they can swivel. That is smart if you are charging a phone, but those bulbous ends sure looked like they’d be close to scraping the mouse pad when using my Li’l Magnum!, so I figured they would be a bad idea.
You can save some cash if you aren’t in a hurry. There are magnetic USB-C chargers on Aliexpress that look similar to the ones I bought, but they are only $2 each.
So why did I choose the NETDOT Gen10 magnetic doodads? They were the right shape, a 2-pack cost $10, and they were one of the options that Amazon could deliver to me the next day.
I am sure others work fine. They may be heavier. They may be lighter. They’re probably all similar enough in weight that it doesn’t really matter.
Every single one of my gaming mice that uses a USB-C port for charging has already been converted into a Li’l Magnum! shell, so I can’t ACTUALLY test whether the NETDOT plug clears the plastic of a stock VXE or MCHOSE mouse.
I suspect they’d work just fine. The NETDOT ends are only 0.5 mm larger than the cables that shipped with my VXE and MCHOSE mice.
Conclusion: A Tiny Upgrade for Big Convenience!
Switching to a magnetic charging setup for my Li’l Magnum! fingertip mouse has changed the game, literally. No more worrying about battery life at 8K polling, no more fumbling with cables, and just a tiny 0.8-gram trade-off for effortless charging. Whether you’re chasing every millisecond of latency or just love a clever quality-of-life hack, this simple mod is worth trying.
If you’re curious about lightweight mice, high-polling-rate gaming, or 3D printing, come join the conversation in our Discord community! We’d love to hear your thoughts and see your mods and experiments. Drop by, share your setup, and let’s geek out over the little things that make gaming better.
I was a little worried that the 70’ of Cat 5e running through the attic might not manage to connect at 10 gigabit, but it worked just fine. It continued to work fine for a few months. Then things started becoming less reliable.
I hate intermittent problems. I still haven’t correctly identified my problem. In fact, I have been back to having a flawless 10-gigabit Ethernet connection across my house for more than a month. That makes my problem even more difficult to troubleshoot!
I am going to walk through my troubleshooting steps, and tell you what I currently suspect is going on. Maybe you are having similar problems, and maybe something I have done or something that popped into my mind may be of use to you!
I am extremely confident that my problem is related to the weather. It doesn’t get all that cold here in Plano, TX, but my problems started in the cooler months, got worse as the weather got colder, and the problem went away completely once the temperature outside stopped dropping below maybe 50F at night.
I don’t think it is just the cold. Things are more likely to be problematic when it is humid or raining.
Most of the length of the Cat 5e cable running from my office to my network cupboard is above the insulation in the attic.
I initially thought my SFP+ copper modules were overheating
Copper SFP+ modules do tend to get quite warm. They’re not going to burn you, but they’re hot enough that you think they might when you unplug them!
When the problems started occurring, I would move the cable in my office from the 10-gigabit SFP+ module to a 2.5-gigabit Ethernet port on the switch. That would always work fine. I would usually remember to move the cable back a few hours later, and I’d have a flawless 10-gigabit connection again. I assumed the SFP+ modules were cooling down enough in the meantime.
Each smokey point on my Smokeping graph represents me swapping ports or modules while doing heavy iperf3 testing
At first I was going almost a week between having to do this. Then every couple of days. Then several times a day.
I decided to take one of the switches apart to see what I might be able to do to keep the SFP+ modules cooler, and I thought I had a pretty good idea. Replacing the dried-out thermal compound in my cheap Intel 10-gigabit Ethernet card was necessary to keep it running at full speed. That was a good enough reason for me to give it a try here!
Attempting to keep the SFP+ modules cool in my MokerLink switches
I moved all my cables to the old gigabit switch, ordered an assortment of thermal pads from Amazon, and waited for them to be delivered. I figured I would stack several pads up between the PCB and the chassis to help the SFP+ modules transfer more heat into the shell of the switch.
My shipment got delayed a few days. I cut up some thermal pads when they finally arrived, installed them in the MokerLink switch, and moved all my cables back to the 2.5-gigabit and 10-gigabit ports.
My stacks of thermal pads installed under the SFP+ ports were inspired by the stock thermal pad that MokerLink installs under their CPU!
I couldn’t get a 10-gigabit connection at all. Did I break the switch? Did I mess up the SFP+ ports?
Probably not. I could get them to connect if the other end of the connection was a 2.5-gigabit port.
I wasn’t sure what was going on, but I figured I might as well modify both switches with the thermal pads for good measure.
I didn’t think about my 10-gigabit network much at all for a few weeks. This upgrade was just an inexpensive and fun experiment. My old 1-gigabit Ethernet gear was adequate for my daily needs, and the 2.5-gigabit upgrade was still working just fine. The 10-gigabit links were just a bonus. It didn’t hurt at all having them downgraded.
I don’t know what made me remember to try the 10-gigabit link again, but when I did, it was working perfectly. At least for a while. I’m not sure how long it was stable before I ran into trouble again.
This is when I started noticing the correlation with the weather. The 10-gigabit link was more likely to be problematic at night. It is colder at night, especially when the sun hasn’t been beating down on the attic all day.
Then I started being able to make reliable predictions. I would see rain in the forecast, then I would see network problems.
What do you think is happening in my attic? Are my cables contracting in the cold, causing something weird to happen in a bend somewhere? Do I have a tiny tear in the jacket of a cable somewhere that is letting moisture in? Is that causing a short, or is the combination of cold and moisture doing something else?
If moisture is causing a short in a damaged cable, then why does it work perfectly at 2.5-gigabit speeds?
I ordered a 5-gigabit SFP+ copper module
One of the two switches involved in this problematic network link is managed. I can set the port to 1000, 2500, or 10,000 megabit. This does not influence the link speed between the SFP+ modules. The switch always detects a 10-gigabit full-duplex link even when the other end of the connection is a 2.5-gigabit port.
When I first set up the 10-gigabit connection across the house, I set up a long chain of couplers and extra patch cables in order to coax the Xicom SFP+ modules into connecting at 5-gigabit speeds. That worked great when the wiring was good, but I haven’t been able to get them to negotiate down to 2.5-gigabit or 5-gigabit on their own.
Can you see that tiny switch labeled 2 and X on my Lianguo 5-gigabit module? That switches between 2.5-gigabit and 5-gigabit mode!
I wound up ordering a 5-gigabit SFP+ module from Aliexpress. I had two reasons for choosing this particular module. The first is that there really isn’t a big selection of 5-gigabit SFP+ modules to choose from. The second, and best, reason is that this module has a tiny switch that you can flip to set it to 2.5-gigabit or 5-gigabit mode. I figured that might come in handy!
The 5-gigabit SFP+ module works perfectly. I plugged it in and immediately saw 5 gigabits per second on all my iperf tests.
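If you would rather script that check than eyeball terminal output, the iperf3 Python wrapper makes it easy. This sketch assumes you already have an iperf3 server (`iperf3 -s`) listening on the machine at the far end of the link, and the hostname is a placeholder:

```python
# Quick scripted throughput check using the iperf3 Python wrapper
# (pip install iperf3; it wraps the system libiperf). Assumes an iperf3
# server is already running on the far side of the link: `iperf3 -s`.
import iperf3

client = iperf3.Client()
client.server_hostname = "nas.example.lan"  # placeholder hostname
client.port = 5201
client.duration = 10  # seconds

result = client.run()
if result.error:
    print(f"iperf3 failed: {result.error}")
else:
    print(f"Sent:     {result.sent_Mbps:.0f} Mbps")
    print(f"Received: {result.received_Mbps:.0f} Mbps")
```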
I want to try the 5-gigabit module while the 10-gigabit link is having problems!
Nothing has gone wrong while I am here to do anything about it. My Smokeping graphs have been very nearly solid green for the last 6 to 8 weeks. We had a cooler night with some rain about three weeks ago, and I had a few cyan blips on the graph. A cyan blip means that one out of twenty pings over a five-minute period didn’t get a response.
I am pretty sure those blips happened before the 5-gigabit module arrived at my house.
I joked in Discord that since a potential solution is here in my hands that I won’t see another problem until November. That is feeling less like a joke now.
Conclusion?!
I don’t think we really get to see a conclusion until winter rolls around again. I think the 5-gigabit SFP+ module was a good purchase. Dropping down to a 5-gigabit Ethernet connection to the other side of the house is still a huge upgrade over my old 1-gigabit connection, and that is WAY less work than pulling a new cable.
Some of this is way more obvious with the benefit of hindsight. The temperature and humidity in the attic didn’t occur to me at all early on. When it gets cold outside, we turn the heat on. Parts of the house that receive the most cooling in the summer also receive the most heat in the winter. My immediate assumption was that an extra few degrees of heat near my network cupboard was pushing things past the limit. That doesn’t seem to have been the case.
What do you think? Have you ever experienced strange network issues tied to weather or temperature? Could humidity or drops in attic temps be messing with your cabling? Maybe you have a different hypothesis to explain why my 10-gigabit link acts up when it’s cold. Share your thoughts (or war stories!) in the comments below, or jump into our Discord community to geek out over networking mysteries with fellow tech enthusiasts. Let’s solve this together—maybe before winter comes back! ❄️🔌
When I first migrated my virtualized homelab stuff from my old Debian install with KVM and virt-manager to a mini PC running Proxmox, I knew I would eventually want some sort of cluster manager. It didn’t take long before I had Proxmox running on a second mini PC at home, and this week I am migrating the Seafile server on my off-site Raspberry Pi to another mini PC running Proxmox.
How can you get all that stuff into a single web interface? Proxmox will let you add a bunch of servers to a cluster, but that pushes the high-availability services pretty hard. You need to have a minimum number of machines to maintain a quorum, you need very low latency between your Proxmox hosts, and you need a clustered file system underneath.
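The quorum requirement is the part that bites small setups. By default, corosync wants a strict majority of the votes online before the cluster will do anything interesting, which is why a two-node cluster can’t tolerate losing either node. Here is a minimal sketch of that arithmetic, assuming the usual one vote per node:

```python
# Minimum number of nodes that must stay online for a cluster to keep quorum,
# assuming the typical one-vote-per-node setup (a strict majority of votes).
def votes_needed_for_quorum(total_nodes: int) -> int:
    return total_nodes // 2 + 1

for nodes in (2, 3, 4, 5):
    needed = votes_needed_for_quorum(nodes)
    print(f"{nodes} nodes: {needed} must stay up, tolerates {nodes - needed} failure(s)")
```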
Setting up my 1.5-liter N100 off-site Proxmox host with 14 terabytes of storage
I want one of my Proxmox nodes to live in another city. My nodes aren’t a cluster, and they aren’t interchangeable. One of my nodes is plugged into a 3D printer. Two other nodes have external USB hard disks tied to specific virtual machines. I don’t need things migrating around on their own. I just want a unified GUI, and I would like to be able to manually migrate virtual machines and LXC containers around without doing a convoluted backup and restore cycle.
Proxmox’s Datacenter Manager has only had an alpha release so far, and it doesn’t have all that many features yet, but it scratches every single itch that I have.
Installation was a breeze thanks to the Proxmox Helper Script. I had an LXC container up and running in a couple of minutes, and it took less than ten minutes to add all three of my Proxmox nodes via their Tailscale addresses. Using Tailscale means my Datacenter Manager can see all my nodes no matter where they are physically located.
The dashboard shows the CPU utilization of my ten busiest guests, ten busiest nodes, and my ten most memory-hungry nodes. That is a pretty boring view for me, because my homelab isn’t all that complicated. My guests don’t tend to do anything exciting.
The exciting page for me is the remotes tab. It shows a combined list of the task history of all my nodes. This makes it easy to see at a glance if any of my backup tasks have failed.
From there, you can drill down into each remote. That will show a summary page that looks very similar to the summary page on each individual Proxmox server. Even better, though, is that there is a little paper airplane icon next to each guest. This lets you easily migrate containers and virtual machines to a different host. I don’t do this often, but I am excited to have a simple interface to make it happen when I need to balance the load on my servers!
Removing nodes from Proxmox Datacenter Manager is a breeze
Well, it is almost a breeze. You do have to manually grab the TLS key from each new server to paste into your Datacenter Manager interface. This isn’t exactly a friction-free experience, but it also isn’t a herculean effort.
I goofed up the partitioning on my new off-site Proxmox host, and I decided that the cleanest way to fix my mistake was to reinstall the node from scratch. Removing a node from a Proxmox cluster is a bit of a pain. My understanding is that if you need to remove a node that no longer exists, you might have your work cut out for you.
I don’t think this Allocations section shows up anywhere in the usual Proxmox GUI. It is a handy summary to have!
I thought I might be able to get away with updating my remote node’s certificate fingerprint, but Proxmox Datacenter Manager gave me an error when it tried to reconnect. Even though my old node was gone, I had it deleted and set back up in less than a minute.
Easy-peasy.
Was it easy to get working with Tailscale?
I already have Tailscale running on each of my Proxmox hosts. There is a simple Proxmox helper script that installs Tailscale in your LXC containers for you, so I just used that to add Tailscale to the Datacenter Manager container.
My Proxmox hosts were all grumpy about using Tailscale’s MagicDNS. That isn’t a big deal. My Proxmox hosts only need to be able to talk to each other and to my NAS devices for backup purposes. I wound up configuring all my hosts to use the local DNS server, and I added the five relevant IP addresses to the hosts file on each Proxmox server and the Datacenter Manager container.
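For reference, the hosts-file workaround is nothing fancy. The entries look something like this on each host; the Tailscale addresses and hostnames below are made-up placeholders, not my real ones:

```
# /etc/hosts additions on each Proxmox host and in the Datacenter Manager container.
# These 100.x.y.z Tailscale addresses and hostnames are placeholders.
100.101.102.1   pve-office
100.101.102.2   pve-cupboard
100.101.102.3   pve-offsite
100.101.102.4   nas-main
100.101.102.5   nas-backup
```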
I made sure to use the Tailscale hostname when I added remote hosts to the Datacenter Manager GUI. They all seem to be talking happily.
I won’t be able to do proper testing until I send my remote Proxmox server home with Brian the next time we have pizza here on a Saturday night!
Conclusion
Proxmox Datacenter Manager has been a missing piece in my homelab journey, offering a streamlined way to manage nodes both across the house and across town without the rigid demands of a full-on Proxmox cluster. By bridging standalone Proxmox hosts—whether in my office, in the network cupboard on the other side of my house, or sitting off-site at Brian’s house—it delivers the unified GUI and manual migration capabilities I craved.
Adding Tailscale to the mix erased geographical barriers, while features like cross-host task monitoring and one-click VM/LXC migrations turned previously fragmented management into a cohesive experience. For an alpha-stage tool, it is impressive that it already does everything I actually require. Even so, I am looking forward to some of the features on the Proxmox Datacenter Manager roadmap!
If you’re tinkering with Proxmox, juggling nodes in different locations, or just love geeking out over homelab workflows, I’d love to hear about your setup! Join our Discord community to swap tips, troubleshoot quirks, and explore using tools like Tailscale and Proxmox together. Whether you’re a clustering pro or a DIY novice, there’s always room to learn, share, and streamline your lab. Let’s build smarter setups—without the headaches.
I started the journey towards the Li’l Magnum! when I printed a large skeletal shell from MakerWorld for a fake Logitech F304 mouse that you can get on Aliexpress for $8. That build was around 40 grams when I started. I was able to shave it down to 33 grams, but I wanted more. I wanted a smaller, lighter, mouse that felt more solid, and I wanted to use a nicer donor mouse, so I started designing my own.
I built the first Li’l Magnum! around a $45 VXE Dragonfly R1 Pro. The first print was up near 24 grams, while the current iteration is more rigid and clicks nicer while weighing in at only 20.4 grams. It is a fantastic little mouse!
Then I modified the Li’l Magnum! to fit the $19 VXE Dragonfly R1 SE. That build currently comes in at 21.5 grams. Last week, my VXE Mad R made it through customs, and I now have a 16.4-gram Li’l Magnum! that only cost me $43!
These are all fantastic, but you may have to wait several weeks for your donor mouse to arrive from Aliexpress. To help you sidestep that problem, I ordered a McHose L7 Ultra from Amazon for $66, and it was in my hands and in a newly designed Li’l Magnum! shell the next day.
The L7 Ultra is an awesome mouse with a fantastic sensor, optical switches, and a nice lightweight PCB layout. At 18.26 grams, it doesn’t come in as light as a Li’l Magnum! built using a Mad R, but it is close, and you can have one in your hands within 24 hours.
That got me thinking. There must be a decent and extremely inexpensive mouse on Amazon that you could have in your hands in a day or two. It seemed like I should search around and see what I could find.
My feelings here are complicated. When I pushed the grips back about 20 mm to make a longboy Li’l Magnum!, I learned that the UHURU mouse makes a delightful $10 donor for a longer, Zeromouse-style ultralight build.
The longboy isn’t perfect. I threw together the bigger grips quickly and haphazardly, but it does work pretty well. The mouse is only $10. It isn’t easy to tell that the electronics aren’t equivalent to the VXE Mad R or McHose L7 Ultra while you are using it. The long version of the UHURU Li’l Magnum! comes out to 26.3 grams. That is only 10 grams heavier than my lightest fingertip-style Li’l Magnum!.
It is hard to not be a little excited about this. You can build yourself a 26-gram gaming mouse for $10, and it actually feels pretty good.
The UHURU mouse has been available on Amazon for sixteen months, and it has been priced at $9.59 for almost a month so far. It claims to use the same PAW3395 as my VXE R1 Pro and VXE Mad R, and I definitely believe that after taking these mice apart.
I skimmed through some reviews on r/MouseReview. Most of the complaints were related to the shell. The left click on some UHURU mice feels crummy. The plastic is cheap. The RGB LEDs burning out seems to be a common problem. None of this is terribly relevant to building a Li’l Magnum! ultralight mouse, because we will be throwing the heavy shell away.
NOTE: I had zero problems with my UHURU mouse. The clicks felt consistent. Motion seems fine. I don’t have a way to measure latency, but it seems identical to all my other mice. The problems noted in the reviews on Reddit imply that either UHURU’s quality control is poor, or they’ve fixed these deficiencies since those reviews were written. Even if you aren’t turning your UHURU mouse into a Li’l Magnum!, it sure seems like a decent $10 wireless gaming mouse that punches well above its price point.
The biggest reason that I started designing around the VXE R1 is that there is a model available for $19. I don’t want you to have to spend $150 on a Zeromouse Blade only to learn that you hate fingertip mice, or that you hate superlight mice.
I want you to be able to spend $10 or $20 on your ultralight gaming mouse experiment. Maybe you’ll enjoy the experience, and that will encourage you to buy a Zeromouse, build a Li’l Magnum! using a nicer donor mouse, or maybe you’ll try a 27-gram G-Wolves mouse. I am excited about lowering the barrier to entry and democratizing the world of ultralight mice.
What’s a better deal than a $19 mouse with a 19,000-DPI PAW3395SE sensor from Aliexpress? Hopefully a $10 mouse with a 26,000-DPI PAW3395 sensor that can be at your door within 24 hours.
But Pat! Don’t I need a 16-gram Li’l Magnum! build?
I suspect that lighter is better, and that it would be awesome if we could get a mouse down to 5 or 10 grams, but I also don’t think the difference would be all that noticeable.
All my Li’l Magnum! builds are set to 3,200 DPI. My heaviest Li’l Magnum! builds are using my old Corsair Katar Pro wireless mouse and the UHURU mouse. Both weigh just over 26 grams. It doesn’t matter which Li’l Magnum! I am using. After playing for a while, I start to forget which one is in my hand. I am not constantly upset that I am using a heavier model. I completely forget that the heavier mouse isn’t 16 grams.
The Li’l Magnum! built with a VXE Mad R mouse weighing in at just 16.4 grams!
I do wish I could do an actual blind test of each of all these mice, but they are all easy to identify. The VXE Mad R at 16 grams feels like a feather compared to the 21.4-gram R1 SE. My two R1 mice are within a gram of each other, but the R1 SE has heavier and louder switches, so it is obvious which one is which. You can tell the difference between most of these mice just by the feel or sound of the clicks.
If the difference in price between $10, $19, $43, or $66 doesn’t mean much to you, I would build a Li’l Magnum! around the VXE Mad R. You get a nice sensor, optical switches, an 8K receiver, and a 16-gram build. All those things easily add up to $30 in value.
I can’t believe how good the UHURU Li’l Magnum! feels!
The UHURU mouse isn’t perfect. It has a LONG circuit board. It sticks out the back of the fingertip shell by 40 mm more than any other Li’l Magnum! build. It is less than ideal for a fingertip mouse, but it works quite well for a longer grip layout like the Zeromouse. I have uploaded a first attempt at a long Li’l Magnum! shell for the UHURU WM-09.
This is the hardest Li’l Magnum! to assemble. I had to trim the leads for the microswitches off the bottom of the PCB in order to manage to slide the board into the Li’l Magnum! shell without breaking the button flappers. I also had to put a piece of electrical tape over that awful LED that indicates the DPI setting. And if you are building a longboy out of the UHURU mouse, you’re going to have to bend and flex the shell quite a bit to finagle the PCB past the upper support.
Even though the UHURU is my second-heaviest Li’l Magnum! at 25.38 grams, and I’ve gotten used to playing Team Fortress 2 with a 16.43-gram mouse, the UHURU still feels extremely light. The odds are high that this mouse has the worst latency of any of my Li’l Magnum! builds, but I couldn’t tell you that by feel. It is almost impossible to perceive the difference between 0.4 milliseconds and 1.5 milliseconds, especially when the entire system has 15 to 25 milliseconds of total latency.
I played a round of Team Fortress 2 with the first functioning prototype print, and I played exactly as I would with any of my other mice. I expected the long circuit board to bump into my palm when aiming downward, and I am pretty sure I did feel the electrical tape on that LED once or twice, but it isn’t really a problem. The part of the UHURU that sticks out the back stays extremely close to the mouse pad.
Should you buy an UHURU mouse for your Li’l Magnum! mod?
I dislike how many compromises I had to make. I didn’t want anyone to have to modify their circuit boards, but I don’t think snipping four leads is much of a modification. I don’t like the long PCB. The rotated buttons mean my post-travel stops don’t work on the UHURU.
That said, I think these compromises are extremely reasonable for a mouse that can arrive at your door tomorrow for $10. As with any other Li’l Magnum!, you’ll need to order some skates as well. I imagine that you could take a pair of scissors to the stock skates, but I have no idea how well that might work!
If you’re on a tight budget, and you can wait two weeks, I think you are much better off ordering a VXE R1 SE for $19. You’ll have a smaller, lighter, cleaner, and probably lower latency Li’l Magnum!, and it doesn’t cost that much more.
If you can’t wait two weeks, and you’re not on a tight budget, then the McHose L7 Ultra is a fantastic mouse for a premium Li’l Magnum! experience.
I know this isn’t a ringing endorsement, but the UHURU WM-09 is just fine if you can’t use Aliexpress. This is especially true if you think you’ll prefer the longboy layout. Maybe you’re on a tight budget, or maybe you’re extremely skeptical about even using an ultralight mouse. Maybe you’re looking for a fun and inexpensive project, and 3D printing a mouse tonight that you can assemble tomorrow seems like a lot of fun.
The UHURU is a good PCB for a longboy Li’l Magnum!
I feel like I should warn you before I get you excited about this. The Li’l Magnum! is extremely customizable, but the farther you get from my own preferences, the less well thought out and supported things get.
The default Li’l Magnum! is a short fingertip mouse that is made to be comfortable in my hand. One of my early experiments was to push the grips back to line up with the Zeromouse Blade. This makes for a much longer mouse, and the way I wind up gripping a mouse like that really isn’t much different than how I hold my Logitech G305. There just isn’t a hump of a mouse to rest my palm on.
The UHURU PCB is long, so I figured it would be a good mouse to try this experiment with again. I moved the grips back to where I would grab my Logitech mouse, made the grips quite a bit taller than usual, and made sure the arms were attached to the most appropriate points at the base.
If this is the style of grip layout you want, I will say that it’ll be hard to beat the $10 UHURU WM-09. The PCB still sticks out the back, but not by that much. Having that extra length of PCB means I get to brace the arms better than I could on a longboy Li’l Magnum! using any of the other shorter PCBs.
It isn’t perfect. Those tall grips are a little more squishy. The brace across the top helps, but it is still softer than a fingertip Li’l Magnum!.
This layout isn’t for me. I have grown to like the fingertip grip. My aim was usually great with the longboy, right up until it wasn’t! Sometimes my muscle memory was expecting to be able to continue to move the mouse with my wrist, but my wrist just wouldn’t go any farther. I gamed with regular mice and a longboy-style grip for years. I am sure I could get used to it again quickly. I just don’t want to.
I won’t put this Li’l Magnum! shell in my Tindie store
I have two reasons. I am not quite proud enough of this particular Li’l Magnum! build to put it up for sale. The UHURU PCB just isn’t an ideal fit, and I can’t work around its limitations well enough. I also don’t think anyone should spend twice as much on the plastic shell as they do on the electronics that go inside.
I still think it is a great Li’l Magnum! if you can print your own. Turning a $10 mouse purchase into something that is 80% or 90% as good as my most premium Li’l Magnum! is totally awesome!
This is more than a little subjective, but I really do feel that the UHURU Li’l Magnum! performs almost 90% as well as my two most premium builds. The trouble is that the part that sticks out the back is so ugly! It doesn’t tend to get in the way when I am gaming, and being able to mount the battery so far back does make the UHURU Li’l Magnum! feel lighter than it actually is, but it makes the whole thing look so much more kludged together.
This makes two shells that I won’t stock in my Tindie store. The other is for the Corsair Katar Pro wireless mouse. That one is a similar weight to the UHURU, but it doesn’t have an elongated PCB. That mouse also feels fantastic, but you can’t buy it anymore, and I hate that I had to solder a USB-C rechargeable AAA battery in to power it up.
Conclusion: Lightweight innovation is within your reach
The journey to crafting the perfect ultralight mouse isn’t about chasing a single magic number—it’s about finding the balance between cost, creativity, and performance that works for you. Whether you’re modding a $10 UHURU WM-09 for next-day tinkering, patiently waiting for a budget-friendly VXE Dragonfly R1, or splurging on a more premium McHose L7 Ultra or VXE Mad R, the Li’l Magnum! project proves that an ultralight gaming mouse doesn’t have to break the bank or test your patience. Each build, from the featherweight 16-gram Mad R to the (almost?) “good enough” UHURU, opens a door to experimentation, proving that even compromises can lead to surprisingly satisfying results.
But this isn’t just about mice—it’s about community. The enjoyment of sharing a mod, troubleshooting a print, or geeking out over sensor specs is what turns solo projects into collective breakthroughs. That’s where you come in.
Whether you’re a seasoned modder or a curious newbie, our growing community is the perfect place to:
Share your Li’l Magnum! builds
Get tips on trimming weights, tweaking shells, or choosing donor mice
Stay updated on new designs and experiments
Connect with fellow enthusiasts who believe that great gaming gear doesn’t have to cost a fortune
Let’s democratize ultralight mice together—one print, one mod, and one Discord message at a time. Join our friendly Discord community and turn your curiosity into creation!
P.S. Even if your first build has a few rough edges (literally), we’ve all been there. Bring your questions, your triumphs, and that $10 mouse you’re secretly proud of—we can’t wait to see what you’ll build next. 🐭✨
I have listed the 3D printed Li’l Magnum! mouse mod kit in my Tindie store. I would prefer that you print your own. I would be excited if you would use this as an opportunity to spend $180 on a Bambu A1 Mini 3D printer so you could print a Li’l Magnum! for yourself and your friends, but I also understand that not everyone can or would even want to own a 3D printer, and I would like to see more people trying out the Li’l Magnum! for themselves.
I would also concede that I would be able to buy more models of donor mice if enough people are buying Li’l Magnum! kits from me. The more unique mice I can buy, the more mice I can support with the Li’l Magnum! shell.
That seems like a win for everybody, and it gets me closer to my goal of democratizing ultralight fingertip mice! It would be nice if you didn’t even have to buy a mouse. If I make the Li’l Magnum! compatible with enough donor mice, then there will be a good chance that you already own the necessary hardware.
Most of my Li’l Magnum! mods use less than three grams of plastic now. They feel quite flimsy on their own. They borrow most of their rigidity from the PCB once it has been installed!
Even so, using a mouse like this was a weird experience for me at first. I kept wanting to squeeze, flex, and press different parts of the first Li’l Magnum! as I was playing. I was hardly doing that at all by the second or third day.
If you haven’t used a lightweight mouse before, you will notice that your aim is going to get worse before it gets better. Using a mouse that is 60 to 80 grams lighter is a shocking experience. It takes time to acclimate.
I stopped noticing the 3D-printed shell after a week or two. It just feels like this is how it is supposed to be now.
You may very well wind up learning that you just hate fingertip mice, hate ultralight mice, or both! I was worried that I wouldn’t like using a 20-gram mouse, so I made sure to design the Li’l Magnum! to work with a $20 donor mouse. That keeps the cost of trying something so radically different as low as possible, and if I hated the ultralight mouse, I could put the original 49-gram mouse back together.
Are you really selling PLA printed on your home FDM 3D printer?!
Yes.
I assumed that I would only be prototyping the Li’l Magnum! using PLA filament. I thought that I would quickly switch to PETG or ABS, and from there I might start sending out for nylon prints from an SLS or MJF machine.
It turned out that the PLA prints were working amazingly well, and my first ABS test mouse was brittle and way too spongy. That was when I realized that the shape and thickness of the button paddles would need to be adjusted for different materials.
That led me to a pair of conclusions. First, I would be spending a lot of money sending out for test prints from an MJF printer just to dial everything in. I also realized that if we’re going to democratize custom mice, then we need people to be able to print the Li’l Magnum! at home. Everyone who owns a 3D printer can print PLA.
I could either focus my attention on dialing in the dimensions for a good feel with ABS and nylon, or I could just assume everyone should be printing the Li’l Magnum! in PLA and put all my work into optimizing for that.
Then I strapped one of my prototype shells to my LumenPNP pick-and-place machine and set up a script to click the button as fast as it could. I let it run for more than 30 hours straight, and the LumenPNP clicked that button more than 1.2 million times. The Li’l Magnum! still felt brand new, so I feel like PLA is up to the challenge.
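For a little perspective on what that torture test works out to:

```python
# What the LumenPNP endurance test works out to per hour and per second.
total_clicks = 1_200_000
hours = 30
print(f"{total_clicks / hours:,.0f} clicks per hour")
print(f"{total_clicks / (hours * 3600):.1f} clicks per second, nonstop, for {hours} hours")
```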
What should you expect to see when you receive your Li’l Magnum! shell?
The Li’l Magnum! is balancing in a precarious spot: just durable enough not to break, stiff enough to feel right in your hand, and as light as I can possibly make it. We are also fighting some of the limitations of the FDM 3D-printing process and gravity.
I have been gaming with nothing but one iteration or another of the Li’l Magnum! for the last six weeks. It fits my hand well. It does have some flex when you squeeze the sides hard enough, but I feel that it is more than stiff enough for the job.
The underside of the finger grips will be slightly imperfect because they have to be held up by support material when printed. The same is true of the underside of the buttons.
NOTE: I’ve switched to using zero-clearance PETG supports directly under the button plungers, so they print perfectly now. I am working to verify that the change in filament isn’t contaminating the PLA and making the buttons brittle. I expect we are in good shape, and I believe I will be shipping out perfect plungers from now on!
See that tiny black dot? That is PETG support material accidentally dropped into the PLA, and it ruined the integrity of this print. That button snapped right off while trying to insert the PCB. I believe I have eliminated this problem completely.
There will also be slight imperfections on the bottom of the angled arms that connect the grips to the body. These are printed without support material, but they are at an extreme enough angle that they are just on the edge of being printable without supports. I have printed them both ways, and I think they look cleaner without supports.
Removing these supports leaves a slight discoloration behind. I prefer the ever so slightly uneven surface to the whitening of the plastic.
There will also be a stair-step texture on top of the buttons. This is caused by the 0.16-mm layer height that I have optimized for. I like this slight texture under my fingers, but I could understand why others might not feel the same way. I cut pieces of grip tape to fit my buttons, and I was running the Li’l Magnum! like that for a week. It was nice, but I preferred the bare plastic.
I don’t believe that any of this is bad, but I do think it is important to set expectations. These are not injection-molded mouse shells. These are more like custom FPV drones.
I can pay a 3D-printing service less to print a Li’l Magnum!, why should I buy one from you?!
I would first like to tell you that I didn’t put a ton of thought into the $20 price. It is a round number, it is a little more than one quarter of the price of the original Zeromouse kit, and it is half the price of some of the other 3D-printed ultralight mouse mods that I have seen. If anything, it feels a little on the inexpensive side.
I didn’t check how much a print farm might charge for a Li’l Magnum! print, but I assume they’d be able to send you a 3-gram part at a REALLY low price.
I am not a print farm. I designed the mouse that I am printing. I know what it is supposed to look like. I know how it should feel. I know what can go wrong. I will be sending you something that I am confident will work as intended.
You’re not just paying for a mouse. You’re paying for the 70 prototypes that I printed and tested up to this point, and the prototypes I will continue to print. You’re paying for the hours I spent in OpenSCAD so I could upload an open-source design for the world to use. You’re helping to pay for all my future improvements.
NOTE: I am a little behind on publishing a Git repo of the OpenSCAD source code. There are a lot of plates I am attempting to get spinning: the blog posts, the MakerWorld and Printables pages, the Tindie listing, this post to link from Tindie, YouTube videos, and finally a Git repo. I can only work on getting one plate up into the air at a time, but the GPL v3 repo should be somewhere soon!
I am not Logitech. I won’t sell enough mice that a nickel from every mouse sold will pay an engineer’s salary. I am just a guy with a blog, an underutilized YouTube channel, a 3D printer, and a dream.
I just spent eight hours perfecting the print settings for the button plungers!
The Li’l Magnum! has been in my Tindie store for less than a week, and I was about to put the finishing touches on this blog post to publish it today. Then I had an idea.
I really want to print the plungers on top of zero-clearance PETG supports, but Orca Slicer just doesn’t want to let me do it the way I want.
I need some automatic tree supports sprinkled around the grips and base, but if I have to run the extremely long purge that keeps PETG from mixing with PLA in the print, then I am going to waste tons of filament and time. I am already wasting more than the weight of an entire Li’l Magnum! on a single filament change to PETG.
I have had half a dozen failed prints trying to bend things to my will. The slicer just won’t let me mix traditional supports topped with solid PETG with PLA tree supports. I wound up designing my own support piece into the model, and I used manually placed modifiers to set up which parts should be PETG, and which bits of PLA needed to be solid.
My first couple of attempts at manually building a PETG support interface failed. The extra-narrow extrusion width I use to keep the weight down wasn’t conducive to helping the PETG squish down onto the PLA that it already doesn’t want to stick to!
Then I discovered a whole new problem. When printing with regular PLA supports on a PLA print, there is a 0.2-mm gap between the top of the support and the bottom of the plunger. This means that my plungers technically print slightly lower than what I set up in the model.
My prints with the dialed-in PETG supports with zero clearance are coming out with immaculate plungers. There is little to no black PETG left under the PLA plunger, and all my supports almost fall right off.
The trouble is that the plungers were now too high! I had to spend another 90 minutes printing and adjusting the model and modifiers two more times to get things feeling right. This means that I will need separate models with different button heights for people who want to use basic PLA supports and for people who want zero-clearance PETG supports.
This is another of those things that a print-on-demand service wouldn’t know about.
The PETG supports have increased the print time by 20%, and I am burning through a few pennies in extra material. The time that I have to spend cleaning up the print afterwards has gotten really close to zero now, so I think this is a fantastic trade!
Why can’t you buy a fully assembled Li’l Magnum!?
I would love to be able to order a big box full of the guts for dozens or hundreds of mice from VXE or ATK so that I could sell you a working, tested, functional Li’l Magnum! that you could plug in and start using immediately. I haven’t figured that out yet. I don’t even know if enough of you would be interested in purchasing mice to know if it is worth taking the risk.
I have had great luck with my prints over the last couple of weeks. All the recent STL files print well, and the clicks feel great almost every single time. I expect the clicks will always be perfect now that I am using zero-clearance multimaterial supports for the plungers.
The real trouble is that I only have one of each PCB. Yours might have slightly different tolerances. There is a slim chance that you will need to shim underneath your PCB with a layer of tape to tighten up the clicks.
There is also the chance that the manufacturers will get sneaky. They might slip a slightly different PCB into an existing mouse, and that new PCB just doesn’t quite work with the Li’l Magnum! that I ship you, because I wasn’t aware of the change.
I would love to be able to test the clicks on every Li’l Magnum! before sending them out. It would be nice to know that I paired the right PCB with the right print for perfect clicks every single time. I will be excited if I can get to that point in the future!
What donor mouse do I have to buy to build a Li’l Magnum!?
Let’s start with the table of all the mice that I have been working with:
I have been focusing on the mice from VXE/ATK. They are popular and lightweight. They do well in latency tests, and the configuration GUI works in Chrome on Linux.
I was drawn to VXE’s Dragonfly R1 series specifically because of that $19 R1 SE. I was absolutely tickled by the idea that you could spend $19 on a mouse, 3D print a 3-gram shell, and inexpensively discover whether or not you’re even interested in using an ultralight fingertip mouse.
I was also excited that the lineup didn’t stop there. The R1 Pro has a better sensor, nicer buttons, and its electronics weigh slightly less. It is even better that the R1 Pro doesn’t cost much more!
My VXE Mad R finally cleared customs and arrived here yesterday, and it is amazing. The Li’l Magnum! needed very little modification to make its PCB fit, and every part of the Mad R is lighter than the R1 Pro. The battery is 1.2 grams lighter, though it also has slightly less capacity. The wheel is more than a gram lighter. The PCB is lighter.
My VXE Mad R Li’l Magnum! weighs 16.47 grams. That is 4.2 grams lighter than my VXE R1 Pro build, and I paid roughly $43 for each of those mice.
The VXE Mad R also ships with an 8K receiver and upgraded optical switches.
My opinion is that there are only two donor mice to choose between. The VXE Dragonfly R1 SE is a fine mouse for $19, and it comes out to just over 21 grams when installed in a Li’l Magnum!, and 21 grams feels ridiculously light. If that isn’t good enough for you, skip all the other R1 models and go with the VXE Mad R. You get so much for your money there!
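If you want the numbers from the last few paragraphs side by side, here is a tiny sketch that only uses figures I have already quoted. The R1 Pro build weight is derived from the 4.2-gram difference rather than being a separate measurement, and the R1 SE figure is the “just over 21 grams” number.

```python
# Quick comparison of the builds mentioned above, using only figures from the text.
# The R1 Pro build weight is derived from the stated 4.2 g difference; prices are
# the rough figures quoted in the write-up.

mad_r_build_g = 16.47
builds = {
    "VXE Dragonfly R1 SE": {"price_usd": 19, "build_weight_g": 21.0},  # "just over 21 grams"
    "VXE R1 Pro":          {"price_usd": 43, "build_weight_g": round(mad_r_build_g + 4.2, 2)},
    "VXE Mad R":           {"price_usd": 43, "build_weight_g": mad_r_build_g},
}

for name, b in builds.items():
    print(f"{name:22s} ~${b['price_usd']:>2d}  ~{b['build_weight_g']:.2f} g")
```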
I expect that the Mad R Major with its upgraded sensor will fit in the Mad R version of the Li’l Magnum!. I don’t have one here, but I hope some brave soul will test it out for us and report back!
I don’t want to wait for a mouse to ship from China
I picked out a nice mouse from Amazon. It is the McHose L7 Ultra for $66. The specs are very much in line with the VXE Mad R Major. It looks like they both use identical optical switches, the same ultralight wheel, and the same low-profile rotary encoder.
The McHose L7 Ultra uses the PAW3950 42K DPI sensor, just like the Mad R Major, so that is an upgrade over my own Li’l Magnum! built around the base model Mad R. Everything in the McHose L7 is lightweight, but the VXE Mad R has an advantage of about one gram due to its slightly smaller battery.
I wouldn’t worry about a single gram.
My copy of the McHose L7 is the Ultra. There is a slightly cheaper L7 Pro that uses the PAW3395 26K DPI sensor just like my Mad R and R1 Pro. There is a good chance that any trim level of the McHose L7 will drop into the Li’l Magnum! shell, but I don’t have the other two on hand to verify that.
Conclusion
The goal of the Li’l Magnum! has always been to tear down the barriers to owning and customizing an ultralight mouse. Between 3D printing, open-source models, and cheap donor hardware, you no longer need an eye-watering price tag to get there. Whether you print your own shell or grab a kit from my Tindie store, you are helping prove that high-performance gear doesn’t have to cost a fortune.
Every kit that sells also feeds back into the project. It lets me buy and test more donor mice, like the newly acquired Mad R, refine the design, and expand compatibility to more models. PLA might not be the most glamorous material, but it is affordable, easy to print, and it has held up well for me.
I am not doing this to chase profit margins. I am doing it because I enjoy the prototyping, and because I want you to be able to build a mouse that fits your hand, your playstyle, and your budget. From a $19 budget build to a $60-ish premium mod, the goal is the same: ultralight mice for everyone.
Let’s keep the momentum going.
Join our Discord community to share your Li’l Magnum! builds, swap tips, and help shape where the project goes next. Whether you are a tinkerer with your own printer or a gamer curious about fingertip grips, I would love to hear from you.
One print, one mod, one click at a time, we can keep making ultralight mice cheaper and easier to get. Ready to lighten up? Let’s go!