Feed aggregator
Path of Exile 2 not using background FPS setting when out of focus
I have an issue with PoE2 where, after playing for a while, it seems to bloat up and other hardware-accelerated apps like browsers get sluggish. I'm not sure whether this is something I've done wrong or just a PoE2 issue, since performance gets laggy after a certain session length. In any case, tabbing out of the game should engage its built-in background frame limiter, set to whatever I want (default 30 FPS), but it never limits FPS; it behaves as if the game were always in focus. This keeps my GPU at 99-100% usage when I want to check on a build or look up information on loot, etc., which at times completely freezes the other apps since they seem to get zero GPU time.
Here's my setup:
CPU: AMD 5900X
GPU: RTX 3080 (10gb version)
RAM: 32 GB
Software: Arch Linux with the zen kernel, KDE Plasma, Proton-CachyOS, and the latest NVIDIA drivers.
Proton launch options:
PROTON_ENABLE_NVAPI=1 PROTON_ENABLE_NGX_UPDATER=1 PROTON_DLSS_UPGRADE=1 PROTON_DLSS_INDICATOR=1 PROTON_ENABLE_WAYLAND=1 PROTON_ENABLE_HDR=1 ENABLE_HDR_WSI=1 PROTON_USE_NTSYNC=1 PROTON_PRIORITY_HIGH=1 PROTON_VKREFLEX=1 %command%
If anyone's had a similar issue in this game or any other where the out of focus frame limiter in the game doesn't work, and perhaps some insight on how to fix it, I'd appreciate the help!
Thanks for your time!
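One workaround worth trying (my suggestion, not something from the post): gamescope can enforce its own frame cap when the nested window loses focus, independent of the game's built-in limiter. A minimal sketch for the Steam launch options, with placeholder resolution values:

```shell
# Nest the game in gamescope and let gamescope cap unfocused FPS.
# -r sets the focused refresh/framerate, -o the unfocused one,
# -f runs fullscreen. Adjust -W/-H to your monitor.
gamescope -W 2560 -H 1440 -r 144 -o 30 -f -- %command%
```

The PROTON_* environment variables from the post can stay in front of the gamescope command as before.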
submitted by /u/Lashmush
D7VK version 1.7 brings even more retro Direct3D gaming to Linux
Read the full article on GamingOnLinux.
Linux kernel 7.0 is out now
Read the full article on GamingOnLinux.
TIP: Use trainers with protonPreloader
Note: This is for offline games, not online.
• LaLa Trainers Launcher
• CheatDeck
• protonPreloader
• Using Fling trainers with Steam Tinker Launch
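One more option not on the list above: if protontricks is installed, its protontricks-launch helper can start a trainer executable inside a game's existing Proton prefix, so the trainer sees the same environment as the game. A sketch, where the app ID and trainer path are made-up examples:

```shell
# Run a trainer exe inside the Proton prefix of Steam app 1234567.
# Find a game's app ID with: protontricks -s <game name>
protontricks-launch --appid 1234567 "$HOME/trainers/example_trainer.exe"
```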
Gamescope session on Polaris
I have an old computer with an RX 480 hooked up to my TV. I initially tried Bazzite on it, but after its init phase during boot it took well over a minute to load into its gamescope session. Sometimes it wouldn't load gamescope at all and would just sit on a black screen with a cursor in the top left. So I tried vanilla Arch with the arch-deckify script as a replacement. It loads the gamescope session noticeably faster, but still occasionally fails in the same way as before. Is this a compatibility issue with Polaris cards, or something else entirely? I'm getting mixed information when I try to look this up.
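For narrowing this down, it may help to start gamescope by hand from a TTY and capture its output, instead of relying on the session autostart; a sketch (flags are the common ones, adjust as needed):

```shell
# From a TTY: start an embedded gamescope session with Steam's gamepad UI
# and keep the log for the runs where it hangs on a black screen.
gamescope -e -- steam -gamepadui > ~/gamescope-session.log 2>&1
```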
submitted by /u/tumpfy
Thinking of switching to Linux, but is gaming actually viable for a total beginner?
I’m really tempted to make the jump to Linux, but honestly, it feels a bit impossible from the outside. I’m wondering how hard it actually is to get games like GTA V (FiveM), Red Dead Redemption 2, and fan projects like Pokémon Reloaded running. Is this something a complete newbie can handle, or am I going to be stuck in a terminal for hours just to get a single game to launch?
submitted by /u/Money-Soft6437
My family's got some decent games and I don't use them much, because I've only been into the Half Sword demo.
Building MAAT-RPG: An Ethical AI RPG Without a Game Engine
Kernels for Gaming?
Hi, which do you think is the best, or closest to the best, kernel for gaming?
I am fine with my Arch Linux setup using linux-zen, but I wanted to know what you think of it in terms of task and resource management.
Thanks in advance.
submitted by /u/Antho_Rufus
Interesting bug on elden ring.
CachyOS
Nvidia 4070ti
16gb ram
Not sure how well it would record, so I'm making a text post instead. I wanted to play some Elden Ring today and make a new character. The game runs fine, but when I try to input a name, the name tab flashes for a moment and automatically deselects. This happens on both controller and KB&M. If anyone has any idea why, feel free to mention it. This bug only happens in Big Picture Mode, by the way; Desktop Mode is fine, but I sit on a couch with my computer connected to a TV, and the on-screen keyboard doesn't like to work unless I'm in Big Picture Mode. Sorry for the spiel, just asking for advice!
submitted by /u/That_Cabinet_6370
Source Games with GDK-Proton
Dear users of Reddit,
Hello. I am very new to Linux Mint Cinnamon and Linux in general, and was having trouble getting Source engine games to run. I just got Minecraft: Bedrock Edition running on Linux with the help of GDK-Proton, and thought it might be worth a shot to equip my Source engine games with it and see what happens. Doing so allowed me to play Source engine games on their most recent versions, with normal graphics settings, with little to no issues. I thought this might be an interesting thing to share, as it is my understanding that people have been having trouble playing Source engine games, especially Half-Life: Source, on Linux, and I do not think I have seen anyone else try this before. I will add a link to GDK-Proton below.
From,
Grayson W.
https://github.com/Weather-OS/GDK-Proton
Black Mesa, running on Linux Mint Cinnamon.
Half-Life 2, running on Linux Mint Cinnamon.
Half-Life: Source, running on Linux Mint Cinnamon.
Portal 2, running on Linux Mint Cinnamon.
Portal, running on Linux Mint.
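For anyone who wants to reproduce this: custom Proton builds such as GDK-Proton are normally installed by extracting the release archive into Steam's compatibilitytools.d directory and restarting Steam. A sketch, where the archive name is a placeholder (and the Steam root may instead live under ~/.local/share/Steam):

```shell
mkdir -p ~/.steam/root/compatibilitytools.d
tar -xf GDK-Proton-X.tar.gz -C ~/.steam/root/compatibilitytools.d
# Restart Steam, then per game: Properties -> Compatibility ->
# "Force the use of a specific Steam Play compatibility tool"
```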
submitted by /u/FlUffYBOI-infinity
Crimson Desert - CachyOS - RTX 3090 vs Strix Halo (iGPU)
My RTX 3090 Just Got Absolutely Clowned by my Laptop's iGPU — And Linux Is the Ultimate Troll
Picture this: I’m sitting in my office, fire-breathing desktop rig humming like a jet engine. RTX 3090 with 24 GB of VRAM, Ryzen 9 5950X, 128 GB RAM, all cooled like it’s ready for a NASA launch. Across the room on my desk is the ASUS ROG Flow Z13 — basically a chunky Windows tablet that decided to go full gaming laptop. Same 128 GB RAM, but now powered by the AMD Ryzen AI Max+ 395 “Strix Halo” chip with its Radeon 8060S iGPU (40 RDNA 3.5 compute units, roughly RTX 4060–4070 territory on a good day). Both running fresh CachyOS Linux, KDE Plasma 6.6.4 on Wayland, kernel 6.19.12.
Same game: Crimson Desert, Pearl Abyss’s gorgeous open-world beast that dropped in March 2026. Same Proton-CachyOS magic. Same love for high textures and weather effects.
The 3090 is pushing an ASUS ultrawide 3440×1440 monitor — 21% more pixels than the Z13’s crisp 2560×1600 panel. That’s a bigger viewport, more foliage to render, wider weather systems, and extra horizontal chaos in every fight. Logically, the desktop should smoke the handheld.
Instead… the iGPU is the one making me grin like an idiot.
I fired up Crimson Desert on both rigs with MangoHud glued to the screen like a nosy spectator. On the 3090 I was getting solid-but-stuttery frames, even with DLSS 4 Quality and every trick in the book, with GPU utilization sometimes lounging around 50–60% like it was on vacation. With DLSS 4.0 on the 3090 I'm looking at ~30 FPS on average. FSR 3.1 with Frame Generation actually felt snappier than DLSS 4, at ~65 FPS, which is the kind of heresy that makes NVIDIA engineers cry into their tensor cores.
Then I swapped to the little Z13. 2560×1600, FSR 3.1 (menu still only shows 3.1, but Proton’s PROTON_FSR4_RDNA3_UPGRADE=1 flag quietly upgrades it to 4.0/4.1 magic behind the scenes), Frame Generation on, and… butter. Smooth exploration through dense forests, silky combat, no weird frametime spikes. The 8060S was happily chugging along at 70–110+ FPS in cinematic territory while my 3090 felt like it was fighting the Proton translation layer with one hand tied behind its back.
It wasn’t even close in the “fun” department. The portable rig just felt better.
Why Is AMD Dominating NVIDIA on Linux?
Let's be real — Crimson Desert is basically an AMD-sponsored love letter (Blackspace Engine + heavy FSR integration). On Windows it's fine for everyone. On Linux? The difference is night and day.
AMD + RADV/Mesa + Proton-CachyOS is cooking with gas. Unified memory on the Strix Halo means the iGPU laughs at texture streaming. FSR 4 (even the fallback version) delivers sharper motion and fewer artifacts than DLSS 4 in this title, especially with Frame Generation. My launch options on the Z13 (gamemoderun mangohud PROTON_FSR4_UPGRADE=1 PROTON_FSR4_RDNA3_UPGRADE=1 %command%) just work. No drama.
Meanwhile the 3090 on the same Proton build is dealing with the usual NVIDIA Linux DX12-to-Vulkan tax: lower utilization, ray reconstruction throwing tantrums, and occasional “why is my $1500 GPU only at 48% load?” moments that the community has been complaining about since launch. Community reports on ProtonDB and Reddit are full of 40-series and 3090 owners saying the same thing I’m seeing: “NVIDIA is awful on Linux right now for this game.”
The iGPU isn’t just keeping up — it’s winning the vibe check while my desktop sits there looking expensive and slightly confused.
The Moment I Officially Ditched Microsoft for Linux
Why did Bill Gates go to Epstein Island?
…Microshaft.
This whole saga is exactly why I finally said goodbye to Windows. I used to be that guy — dual-booting, tweaking, praying to the update gods. Then one day I installed CachyOS, set up Proton, and never looked back.
(Yeah, I said it. The OS that once felt like the only option now feels like the one holding me back from this kind of pure, unfiltered gaming joy.)
Linux isn’t perfect — Wayland hotkey quirks with MangoHud still make me chuckle — but the freedom? The performance tweaks? The fact that a pocket-sized AMD powerhouse can embarrass a full desktop 3090 in a brand-new AAA title? That’s the future I want to play in.
The Moral of the Story (and Why You Should Care)
In 2026, Linux gaming isn't just "good enough" anymore. In certain titles — especially ones with strong AMD DNA — it's straight-up preferable. My ROG Flow Z13 has become my daily driver for Crimson Desert, while the 3090 rig is now the "big-screen cinematic experience" machine that I tweak more aggressively to match the handheld's smoothness.
Both setups are beasts in their own way. The desktop still wins for sheer immersion on that ultrawide. But the little Strix Halo iGPU proved something profound: raw power isn’t everything when the software stack loves your hardware.
So if you’re on the fence about Linux gaming, do yourself a favor. Grab CachyOS, fire up Proton, slap on some MangoHud, and watch an underdog iGPU humble a flagship GPU.
It’s not just entertaining.
It’s downright hilarious.
And yes — I’m still using both rigs. Because why choose when Linux lets you have the best of both worlds?
Now if you’ll excuse me, I have an open world full of crimson chaos to explore… on the machine that actually wants me to have fun.
submitted by /u/KOTNcrow
RTX3060 Fedora 43 GPU falls off the bus with no logs. Endless troubleshooting.
**TLDR:** RTX 3060 GA106 causes an instant hard system crash (blank screens, forced reboot, zero log entries) in any 3D application on Linux. The crash is 100% reproducible with the Unigine Heaven benchmark. I can game for hours at a time, and then it will crash non-stop after each boot. The issue persists across every available driver (open module, closed proprietary, GSP enabled/disabled), every kernel tested (6.12 LTS through 6.19), both Wayland and X11, and multiple distro configurations on Fedora 43. Desktop use is completely stable. GPU, PSU, RAM and thermals all verified healthy. The PCIe link consistently negotiates at 5GT/s (Gen2) despite both the GPU and root port targeting 8GT/s (Gen3) on an ASUS TUF B450-PLUS GAMING. Extensive troubleshooting documented below. Requesting NVIDIA engineer review, as this appears to be a driver- or firmware-level issue specific to GA106 on the AMD B450 platform.
System Configuration:
- OS: Fedora 43 KDE Plasma
- Motherboard: ASUS TUF B450-PLUS GAMING, BIOS 4604 (updated from 2008 during troubleshooting)
- CPU: AMD Ryzen 5 3600
- RAM: 32GB DDR4 Corsair 2133MHz (4x8GB)
- GPU: NVIDIA GeForce RTX 3060 GA106 (10DE:2504)
- PSU: Corsair 750W
- Monitors: 3x 1080p (2x DisplayPort, 1x HDMI)
- PCIe Link: Negotiating at 5GT/s (Gen2) despite being capable of 8GT/s (Gen3)
Kernels Tested - all crash:
- 6.19.11-200.fc43 (current)
- 6.19.10-200.fc43
- 6.19.9-200.fc43
- 6.17.1-300.fc43
- 6.12.80 LTS
Drivers Tested - all crash:
- RPM Fusion akmod-nvidia 580.126.18 (open module)
- RPM Fusion akmod-nvidia 595.58.03 (open module)
- negativo17 nvidia-driver 595.58.03 (open module)
- Official Nvidia .run installer 580.126.18 (closed proprietary module, GSP disabled)
Parameters Tested:
- NVreg_EnableGpuFirmware=0
- NVreg_PreserveVideoMemoryAllocations=1
- NVreg_UsePageAttributeTable=1
- pcie_aspm=off
- pcie_port_pm=off
- GPU power limited to 150W
BIOS Settings Tested:
- PCIe ARI Support disabled
- PCIe Ten Bit Tag disabled
- AER Cap disabled
- Early Link Speed Gen2
- IOMMU enabled/disabled
Symptoms:
- Hard system crash, instant blank screens, forced reboot
- Zero Xid errors captured in any log
- kdump never fires
- Reproducible with Unigine Heaven benchmark
- Crashes at both high load (165W) AND low load (18W at initialization)
- Desktop completely stable, only 3D applications crash
- Display server: X11 (also tested Wayland, same crashes)
- Crash is instant with no warning, no thermal throttling beforehand
- GPU temperatures normal at crash time (under 70°C)
What was ruled out:
- PSU insufficiency (750W Corsair, tested on different outlets/circuits, with and without a UPS)
- Thermal issues (temps normal)
- RAM defects (memtester passed)
- Power delivery from wall (tested multiple circuits)
- Steam/Proton (reproduced with native Unigine Heaven outside Steam)
- Wayland vs X11 (both crash)
Unique Finding: The PCIe link negotiates at 5GT/s (Gen2) despite both the GPU and root port being capable of, and targeting, 8GT/s (Gen3). This may be related to B450 + RTX 3060 compatibility.
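For reference, the negotiated versus maximum link speed can be read straight from lspci; a small sketch, assuming the 3060 is the only NVIDIA VGA device on the system:

```shell
# Find the GPU's PCI address, then compare LnkCap (what card/slot can do)
# against LnkSta (what was actually negotiated).
GPU=$(lspci | awk '/VGA.*NVIDIA/ {print $1; exit}')
sudo lspci -vv -s "$GPU" | grep -E 'LnkCap:|LnkSta:'
```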
Crash is 100% reproducible with Unigine Heaven benchmark at any settings including default low settings.
submitted by /u/Lonecrow66
Asus G14 2022 RX6800s Hybrid Graphics Compatibility
I own an ASUS G14 2022 model laptop. This machine is an all-AMD machine: its dGPU is an RX 6800S and it has AMD integrated graphics as well. In Windows 11 I use a third-party program called G-Helper to control the laptop's power profile. When on battery I force it into ECO mode (iGPU only), Silent (no fans), and a 60Hz refresh rate. This gives me very good battery life and keeps the laptop silent. When I want to game (usually when plugged in, and very rarely on battery), I toggle this program to remove these restrictions, and of course power usage and noise rise significantly. My question is whether a Linux distribution can easily handle my use case. I have tried to research Linux compatibility on gaming laptops, but most laptops have an NVIDIA dGPU combined with either an Intel or AMD iGPU; I can't find many examples of an AMD dGPU + AMD iGPU. I'm generally looking for some direction from the community on the best distribution for my laptop, and whether I would need any additional tools to manage the dGPU and iGPU setup to maximize battery life. Thank you in advance for the help.
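For what it's worth, on an all-AMD laptop, GPU selection on Linux is usually handled per application with the DRI_PRIME environment variable (and switcheroo-control, where installed), rather than a global ECO toggle; a sketch of what that looks like:

```shell
# List the GPUs the system knows about (requires switcheroo-control)
switcherooctl list
# Applications default to the iGPU; select the dGPU per title,
# e.g. in a game's Steam launch options:
DRI_PRIME=1 %command%
```

For the G-Helper-style fan and power profiles, the asus-linux project's asusctl is the usual equivalent on ROG machines.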
submitted by /u/canUrollwithTHIS
How to enable Anti Lag 2 on Crimson Desert?
Is there a command needed that I can't find, or is it just not supported? I thought I saw that it had become usable now.
submitted by /u/BuffaloGlum331
I'm old school
Xid 79: GPU has fallen off the bus crash on Linux (Arch and CachyOS)
I have a 5070 Ti on an AMD chipset mobo. I tried to play on both Arch and CachyOS (they are almost the same distro) and had mixed results. Games with lower requirements run very smoothly, no problems compared with Windows. But with stuff like Cyberpunk on very high settings, I keep getting this Xid 79 "GPU has fallen off the bus" error in journalctl. The crash happens fairly regularly, 5-15 minutes into the game. It is a GPU crash that results in a black screen requiring a hard reboot to recover.
On Windows, the exact same Cyberpunk settings run smoothly with no issues on the same desktop.
I tried several troubleshooting steps, but nothing worked: various kernel options to deal with GPU state changes, driver (and whole-system) rollbacks, messing with several NVIDIA options including limiting the GPU clock, several nvidia-smi options (mostly dealing with Persistence-M), different kernels (like lts and zen), different ways to run the games in Steam, messing with PRIME and Optimus (from hybrid GPU guides), several environment variables, and so on. I can list them here. I'm entering my 10th day of continuous troubleshooting.
As a last resort before going back to Windows, I will try running the game on openSUSE Tumbleweed, to see whether it's only the Arch-based distros that can't handle high settings (how is the Steam Deck able to deal with ultra-high settings?). Anyway, I'd love to hear how other Blackwell GPU owners are managing high settings in modern games.
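When chasing a crash like this, it can help to pull the GPU-related kernel messages from the boot where the crash actually happened (the current boot's log is wiped of it after the hard reboot); a sketch:

```shell
# Kernel messages from the previous boot, filtered to GPU/PCIe lines;
# Xid codes and NVRM messages usually land here if anything was logged.
journalctl -k -b -1 | grep -iE 'xid|nvrm|pcie'
```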
submitted by /u/Many_Maize_6676
Skyrim Mod Managers in the Big 2026
Been seeing a lot of posts about native, MO2-like mod managers for Skyrim, and also some more attempts at getting regular MO2 working better through Proton. Curious what people are using for modding Skyrim these days, especially those with big and complex mod lists. A while back I set up MO2 using a tool called NaK, which worked well enough. Still feels kind of janky though.
submitted by /u/rinnekaton
Not getting past expedition loading screen
Hey y'all, I recently bought a new NVIDIA RTX 5070 Ti and I've been having issues, specifically with Nightreign, getting games to load in once you select the expedition. I'm running the game on CachyOS, and I've tried everything I can think of: changing Proton versions, forcing the bleeding-edge beta build, starting the game from the CLI, trying all the launch options. Nothing seems to work. I eventually enabled Proton logging, and both computers I've tried the GPU in spit out the same error once we're supposed to load into the expedition, and the game locks up. It's just this over and over until I kill the process:
351.288:0204:033c:warn:vkd3d-proton:d3d12_device_GetResourceAllocationInfo3: Invalid resource desc.
351.289:0204:0334:warn:vkd3d-proton:d3d12_resource_validate_desc: Invalid alignment 4096 for buffer resource. Must be 0 or D3D12_DEFAULT_RESOURCE_PLACEMENT_ALIGNMENT.
351.289:0204:0334:warn:vkd3d-proton:d3d12_device_GetResourceAllocationInfo3: Invalid resource desc.
351.290:0204:0334:warn:vkd3d-proton:d3d12_resource_validate_texture_alignment: Invalid resource alignment 0x1000 (required 0x10000).
I've confirmed both computers are running the latest NVIDIA drivers. I briefly tried installing the previous release but had some installation failures, so I decided to keep trying other things for the moment.
Wondering if anyone else has come across similar issues and has a fix for this? From what I've been able to gather, it seems the game is handing invalid resource descriptions to vkd3d-proton, but I haven't got a clue why, or how you'd go about fixing that. Any help would be super appreciated. Happy to give more details if needed; trying to put everything here up front.
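To narrow down whether the invalid descriptors come from the game or from the translation layer, vkd3d-proton's log level can be raised through its environment variables alongside Proton's own logging; a sketch for the Steam launch options:

```shell
# Full Proton log (written to ~/steam-<appid>.log) plus verbose
# vkd3d-proton output for the D3D12 resource-validation path.
PROTON_LOG=1 VKD3D_DEBUG=trace VKD3D_SHADER_DEBUG=warn %command%
```

Trace output is very noisy, so it's best enabled only for the run that reproduces the lockup.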
submitted by /u/EarthGorilla9455
Animations not working on x86_64 version of Minecraft for Android
Does anyone know how to fix this annoying bug that Mojang said they won't fix? It works in the Windows GDK version, but that one is slower and doesn't work with Microsoft login. I am running the Android version with Trinity Launcher (Flatpak version). My hardware: CPU: Intel Celeron 6305; GPU: Intel UHD Graphics G4 (integrated); RAM: 4GB 3200MHz, single channel; SSD: 256GB Samsung MZVLQ256HBJD-00B. My OS is Debian 13 (Trixie), kernel Linux 6.19.11-x64v3-xanmod1. The Minecraft Bugrock version I am using is v26.13, and this also happens on 26.12 and 26.10.
submitted by /u/No_Card_3175
