Other News about gaming on Linux
Resident Evil Requiem - should Grace's face look like this? Or is there a problem?
1st pic: what it looks like in areas with less lighting (I'd say it's okay-ish). 2nd pic: what it looks like in areas with more lighting (definitely looks weird).
Mind you, I did some settings tweaking between 1st and 2nd pic.
But anyway, it still doesn't look right. Any guesses what the problem might be?
Using nvidia-driver-595 on a 5090.
submitted by /u/kennyminigun[link] [comments]
Peaks of Yore becoming 10fps if playing it for 30 minutes
So I am trying to play Peaks of Yore on CachyOS, and it is fine for about 30 minutes. Then it starts lagging heavily. If I reopen the game, it still lags. If I wait about 10 minutes before opening the game again, it is not laggy until I've played for another 30 minutes. I tried using ProtonGE and it still lags.
Output of inxi -xxACGS
System:
Host: gilbert Kernel: 6.19.6-2-cachyos arch: x86_64 bits: 64 compiler: clang
v: 21.1.8
Desktop: KDE Plasma v: 6.6.2 tk: Qt v: N/A wm: kwin_wayland dm: SDDM
Distro: CachyOS base: Arch Linux
CPU:
Info: 6-core model: AMD Ryzen 5 4600H with Radeon Graphics bits: 64
type: MT MCP arch: Zen 2 rev: 1 cache: L1: 384 KiB L2: 3 MiB L3: 8 MiB
Speed (MHz): avg: 1397 min/max: 1400/3000 boost: enabled cores: 1: 1397
2: 1397 3: 1397 4: 1397 5: 1397 6: 1397 7: 1397 8: 1397 9: 1397 10: 1397
11: 1397 12: 1397 bogomips: 71864
Flags-basic: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 sse4a
ssse3 svm
Graphics:
Device-1: NVIDIA TU117M [GeForce GTX 1650 Mobile / Max-Q] vendor: Lenovo
driver: nvidia v: 590.48.01 arch: Turing pcie: speed: 2.5 GT/s lanes: 8
ports: active: none empty: HDMI-A-1 bus-ID: 01:00.0 chip-ID: 10de:1f99
Device-2: Advanced Micro Devices [AMD/ATI] Renoir [Radeon Vega Series /
Radeon Mobile Series] vendor: Lenovo driver: amdgpu v: kernel arch: GCN-5
pcie: speed: 16 GT/s lanes: 16 ports: active: eDP-1 empty: none
bus-ID: 06:00.0 chip-ID: 1002:1636 temp: 53.0 C
Device-3: IMC Networks Integrated Camera driver: uvcvideo type: USB
rev: 2.0 speed: 480 Mb/s lanes: 1 bus-ID: 1-3:3 chip-ID: 13d3:56ff
Display: wayland server: X.org v: 1.21.1.21 with: Xwayland v: 24.1.9
compositor: kwin_wayland driver: gpu: amdgpu display-ID: 0
Monitor-1: eDP-1 model: LG Display 0x05e5 res: 1920x1080 hz: 60 dpi: 142
diag: 395mm (15.5")
API: EGL v: 1.5 platforms: device: 0 drv: nvidia device: 1 drv: radeonsi
device: 3 drv: swrast gbm: drv: kms_swrast surfaceless: drv: nvidia wayland:
drv: radeonsi x11: drv: radeonsi inactive: device-2
API: OpenGL v: 4.6.0 compat-v: 4.5 vendor: amd mesa v: 26.0.1-arch2.3
glx-v: 1.4 direct-render: yes renderer: AMD Radeon Graphics (radeonsi
renoir ACO DRM 3.64 6.19.6-2-cachyos) device-ID: 1002:1636
display-ID: :0.0
API: Vulkan v: 1.4.341 surfaces: N/A device: 0 type: integrated-gpu
driver: mesa radv device-ID: 1002:1636 device: 1 type: discrete-gpu
driver: nvidia device-ID: 10de:1f99
Info: Tools: api: clinfo, eglinfo, glxinfo, vulkaninfo
de: kscreen-console,kscreen-doctor gpu: nvidia-settings,nvidia-smi
wl: nwg-displays,wayland-info x11: xdpyinfo, xprop, xrandr
Audio:
Device-1: NVIDIA driver: snd_hda_intel v: kernel pcie: speed: 8 GT/s
lanes: 8 bus-ID: 01:00.1 chip-ID: 10de:10fa
Device-2: Advanced Micro Devices [AMD] Audio Coprocessor vendor: Lenovo
driver: N/A pcie: speed: 16 GT/s lanes: 16 bus-ID: 06:00.5
chip-ID: 1022:15e2
Device-3: Advanced Micro Devices [AMD] Ryzen HD Audio vendor: Lenovo
driver: snd_hda_intel v: kernel pcie: speed: 16 GT/s lanes: 16
bus-ID: 06:00.6 chip-ID: 1022:15e3
Device-4: ASUSTek TUF H3 Wireless driver: hid-generic,snd-usb-audio,usbhid
type: USB rev: 1.1 speed: 12 Mb/s lanes: 1 bus-ID: 1-2.3.1:6
chip-ID: 0b05:1963
API: ALSA v: k6.19.6-2-cachyos status: kernel-api
Server-1: sndiod v: N/A status: off
Server-2: JACK v: 0.126.0 status: off
Server-3: PipeWire v: 1.6.0 status: active with: 1: pipewire-pulse
status: active 2: wireplumber status: active 3: pipewire-alsa type: plugin
submitted by /u/himynameisreallyMay[link] [comments]
Gaming performance: kernel.split_lock_mitigate
I don't know if this is common knowledge and I'm the only one who missed it, but this setting made a huge difference in performance for me at least, so I have to share it in case it helps anyone else. If it is already covered in some other post you can just delete this.
Some Windows games (when running on Linux via Proton) trigger a special CPU operation called a “split lock.”
By default, the Linux kernel slows the game down by ~10 ms every time this happens — it’s a security feature to stop potential attacks.
Setting it to 0 turns that artificial slowdown off. On a local computer in your home network you might as well turn it off.
I saw this setting mentioned for running the old "The Division 2" game on Linux, which I was trying to get to run better.
The difference in framerate was huge. And then I started "Cyberpunk 2077" just to see if it made any difference there. Didn't expect much but OMG. It had defaulted to Raytracing ON and just about everything on High or Ultra which I couldn't do before this.
I have an MSI GeForce RTX 5070 running on Linux Mint by the way.
I tested it by just disabling it like this.
>sudo sysctl -w kernel.split_lock_mitigate=0
This is temporary until you turn it On again or reboot.
So I made it permanent like this.
>echo "kernel.split_lock_mitigate=0" | sudo tee /etc/sysctl.d/50-split-lock.conf
>sudo sysctl -p /etc/sysctl.d/50-split-lock.conf
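Before flipping the switch, it's easy to check whether your kernel exposes this knob at all and what it's currently set to. A minimal sketch (reading the value via /proc so it doesn't depend on the sysctl binary; the file only exists on kernels/CPUs with split-lock detection):

```shell
#!/bin/sh
# Report the current split-lock mitigation state, if the kernel exposes it.
# 1 (default) = offending processes are slowed down; 0 = penalty disabled.
f=/proc/sys/kernel/split_lock_mitigate
if [ -r "$f" ]; then
    echo "kernel.split_lock_mitigate = $(cat "$f")"
else
    echo "kernel.split_lock_mitigate not available (kernel too old or CPU lacks split-lock detection)"
fi
```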
PS. Tried "Enshrouded" now, which also seems to get a very nice performance boost. And I thought this would only affect older games.
NOTE. Just realized that it might depend on what CPU you have. Maybe this is not a problem for all CPUs. I have an "Intel Core i5-13400" and at least for me the difference was crazy.
- Sounds like the Linux gaming tool you can install called "GameMode" includes this parameter, so you won't have to set it manually like I did here. But I wasn't aware of that tool. Will try it out though! Thanks to Chromiell.
submitted by /u/bjornxon[link] [comments]
To every dev who dunks on vibe coders: your gatekeeping isn't protecting the craft, it's just protecting your ego
I don't know how to code.
I mean it. I cannot write a for loop from memory. I don't know what a pointer is. I couldn't tell you the difference between compiled and interpreted if my life depended on it. I am, by every traditional metric, a civilian.
Yesterday I spent an afternoon with an AI and built an app that does exactly what I need it to do. Nothing more, nothing less. It works. I tested it. It solved my problem.
And then I made the mistake of being excited about it on the internet.
What followed was a predictable parade of developers, presumably very serious and very credentialed developers, lining up to explain to me why what I did doesn't count. That I don't understand what's under the hood. That I'll be helpless when it breaks. That real programming is a discipline and I'm cheapening it. Someone called it a "slop app." That was a fun one. I wore it like a badge because honestly, my slop app works better for my needs than anything I could have found on the shelf, so thanks for the nickname I guess.
I posted in good faith in a community I thought might appreciate what a total newcomer had pulled off. Instead I got ratio'd into the ground and left feeling like I'd committed some kind of crime. I won't be posting here again. Not because I'm embarrassed, but because life is too short to hang around people who think the correct response to someone's excitement is a lecture. This community can have its purity. I'll find mine elsewhere.
Let me be very clear about something: nobody asked.
Here's what I think is actually happening. For a long time, being able to build software was a superpower. It was a skill that took years, created real gatekeeping by necessity, and granted access to a club that felt earned. That's legitimate. I'm not diminishing the craft.
But AI tools just handed a hammer to everyone who never had one, and some of you are furious about it. Not because we're doing harm. Not because our little scripts are threatening production systems somewhere. But because your special thing got a little less special, and that apparently requires you to go find a new person's excited post and explain to them why they should feel bad.
That's not mentorship. That's not "protecting best practices." That's just jealousy wearing a Stack Overflow avatar.
The Wright Brothers didn't invent flying to gatekeep air travel. You learned to code to solve problems, or you should have. A person who has never touched a terminal just solved their problem. Maybe celebrate that the thing you love is now accessible to more humans?
Or don't. But if your instinct when you see someone genuinely thrilled about making their first working app is to immediately explain why it's not real, that's a you problem, friend. That's a "my identity is too wrapped up in being someone normal people can't compete with" problem.
My app works.
I made it.
AI helped me.
I don't care if that upsets you.
Go touch grass. I'll be over here using my fake app to actually get things done.
To the devs who ARE supportive and welcoming in these spaces, you're the reason people stay. Thank you. This isn't about you.
Oh, and guess what I used to help me write this post - help me organize my thoughts and not just scream into the void? Does that bother you too? Cool. Don't care.
submitted by /u/zer0squaredis[link] [comments]
Snowrunner graphic issues
I've dual-booted Linux and Windows. When I start Snowrunner in Kola at night, I get issues with the northern lights. I have an RTX 3060 and use the latest NVIDIA driver on Linux, 595.
How can this issue be fixed?
I have tried with both Linux Mint and Bazzite. Same issue.
submitted by /u/areindos[link] [comments]
No working gamepad in Dynasty Warriors: Origins on Steam
Hi.
Some info of my PC:
inxi -xxACGS
System:
Host: hav4k-ryzen Kernel: 6.17.0-20-generic arch: x86_64 bits: 64
compiler: gcc v: 13.3.0
Desktop: Cinnamon v: 6.6.7 tk: GTK v: 3.24.41 wm: Muffin dm: LightDM
Distro: Linux Mint 22.3 Zena base: Ubuntu 24.04 noble
CPU:
Info: 8-core model: AMD Ryzen 7 7800X3D bits: 64 type: MT MCP arch: Zen 4
rev: 2 cache: L1: 512 KiB L2: 8 MiB L3: 96 MiB
Speed (MHz): avg: 3206 high: 4347 min/max: 426/5053 boost: enabled cores:
1: 4166 2: 4013 3: 2983 4: 2983 5: 2983 6: 2983 7: 2983 8: 2983 9: 2983
10: 4347 11: 2983 12: 2983 13: 2983 14: 2983 15: 2983 16: 2983
bogomips: 134152
Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 sse4a ssse3 svm
Graphics:
Device-1: AMD Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M]
vendor: Tul / PowerColor driver: amdgpu v: kernel arch: RDNA-3 pcie:
speed: 16 GT/s lanes: 16 ports: active: HDMI-A-1 empty: DP-1, DP-2, DP-3,
Writeback-1 bus-ID: 03:00.0 chip-ID: 1002:744c
Device-2: AMD Raphael vendor: Gigabyte driver: amdgpu v: kernel
arch: RDNA-2 pcie: speed: 16 GT/s lanes: 16 ports: active: none empty: DP-4,
DP-5, DP-6, HDMI-A-2, Writeback-2 bus-ID: 12:00.0 chip-ID: 1002:164e
temp: 32.0 C
Display: x11 server: X.Org v: 21.1.11 with: Xwayland v: 24.1.6 driver: X:
loaded: amdgpu unloaded: fbdev,modesetting,radeon,vesa dri: radeonsi
gpu: amdgpu display-ID: :0 screens: 1
Screen-1: 0 s-res: 2560x1440 s-dpi: 96
Monitor-1: HDMI-A-1 mapped: HDMI-A-0 model: XG27ACS res: 2560x1440
dpi: 109 diag: 685mm (27")
API: EGL v: 1.5 platforms: device: 0 drv: radeonsi device: 1 drv: radeonsi
device: 2 drv: swrast gbm: drv: kms_swrast surfaceless: drv: radeonsi x11:
drv: radeonsi inactive: wayland
API: OpenGL v: 4.6 compat-v: 4.5 vendor: amd mesa v: PPA glx-v: 1.4
direct-render: yes renderer: AMD Radeon RX 7900 GRE (radeonsi navi31 ACO
DRM 3.64 6.17.0-20-generic) device-ID: 1002:744c
API: Vulkan v: 1.3.275 surfaces: xcb,xlib device: 0 type: discrete-gpu
driver: N/A device-ID: 1002:744c device: 1 type: integrated-gpu driver: N/A
device-ID: 1002:164e device: 2 type: cpu driver: N/A device-ID: 10005:0000
Audio:
Device-1: AMD Navi 31 HDMI/DP Audio driver: snd_hda_intel v: kernel pcie:
speed: 16 GT/s lanes: 16 bus-ID: 03:00.1 chip-ID: 1002:ab30
Device-2: AMD Rembrandt Radeon High Definition Audio driver: snd_hda_intel
v: kernel pcie: speed: 16 GT/s lanes: 16 bus-ID: 12:00.1 chip-ID: 1002:1640
Device-3: AMD Family 17h/19h HD Audio vendor: Gigabyte
driver: snd_hda_intel v: kernel pcie: speed: 16 GT/s lanes: 16
bus-ID: 12:00.6 chip-ID: 1022:15e3
API: ALSA v: k6.17.0-20-generic status: kernel-api
Server-1: PipeWire v: 1.0.5 status: active with: 1: pipewire-pulse
status: active 2: wireplumber status: active 3: pipewire-alsa type: plugin
Already tried with a PS4 controller and an 8BitDo Pro 2 (wired), no luck. What can I do? Thanks.
Regards.
submitted by /u/Ok-Ad-6414[link] [comments]
Help with audio track setting for gpu-screen-recorder?
So what I'm trying to do is create 3 separate tracks. The first should capture every app except Discord and my mic, so it's pure gameplay sound only; the second should capture everything including the mic; and the last should be only my mic and the Discord call. The first and second tracks work, but I can't isolate Discord for the last track, as I can still hear audio from Firefox being played. How do I isolate mic input and the Discord call only for the third track? And are the settings for the first and second tracks already correct?
submitted by /u/TheSullenStallion[link] [comments]
How can I add a friend on EA without having the EA app installed?
I played some EA games years ago and still got my Origin account.
I bought the game It Takes Two on Steam, and my operating system is Linux (PC + Steam Deck).
A friend bought the game on PS5, luckily we can coop together via EA servers.
But I haven't added my friend to the EA friends list yet, and the EA app is only available for Windows PCs. I wasn't able to find a "manage friends" feature on the EA browser page.
In-game I found an "add EA friend" button and sent a friend request to him, but he didn't receive the invitation. It seems I have to use the EA app to achieve this.
Do I have to set up a Windows VM to install the EA app just to send a friend request ... ? (lol)
Thanks for help!
submitted by /u/m477h145h3rm53n[link] [comments]
Ubuntu Suspend problem
I have a question. When I use both the integrated and the dedicated GPU, I can suspend the laptop and resume without any problems. The problem starts when I turn on the MUX switch (dedicated GPU only): the screen stays black whenever I resume. So I want to ask, are there any ways to solve this?
submitted by /u/Jazzlike_Fee_3081[link] [comments]
Good Linux distribution for gaming laptop (NVIDIA + dual boot)?
Hi everyone,
I am planning to install Windows and Linux as a dual boot system on my gaming laptop. Do any of you have suggestions for a decent Linux distribution that would run well with NVIDIA GPUs?
Gaming is my primary application on my laptop, mostly single-player games, so I do need something that performs decently and has easy GPU drivers.
Here are some criteria I have in mind:
- Easy NVIDIA GPU driver installation (pre-installed, or at least straightforward to set up)
- Good gaming performance (Steam, Proton, etc.)
- Reliability in general
- Maintenance not being an issue
If you are doing the same thing, what are you running and what has your experience been like?
submitted by /u/nothing_505[link] [comments]
I am trying to install an old Japanese game called Imperishable Night on Linux using Bottles, and I am installing from a CD-ROM.
When I use the installer wizard and have it install the game into the bottle's files, it shows a window that roughly translates to "cannot find file path." Does anyone know how to fix this?
submitted by /u/OzAndApss[link] [comments]
Xbox controller via Bluetooth randomly not working or mis-mapped on Nobara (Fedora-based)
Hi everyone,
I’m having issues with my Xbox controller over Bluetooth on Nobara (Fedora-based), and I can’t figure out if I’m missing a driver, config, or something else.
This happens across multiple games (Genshin Impact, Monster Hunter Wilds, and basically any game that supports controllers).
When I connect the controller via Bluetooth and start a game, one of these usually happens:
- Controller not responding at all
- It shows as connected
- evtest detects all inputs correctly
- But the game doesn’t receive any input
- Incorrect button mapping
- Example: pressing LB triggers Y
- Only A, B, analog sticks, and D-pad seem to work properly
So the issue is not game-specific — it affects everything.
System info:
- OS: Nobara Linux 43 (Fedora-based)
- KDE Plasma: 6.6.2
- KDE Frameworks: 6.24.0
- Qt: 6.10.2
- Kernel: 6.19.10-201.nobara.fc43.x86_64 (also tested 6.19.10-203)
- GPU: NVIDIA GeForce RTX 3050
- CPU: AMD Ryzen 5 5600G
- RAM: 48 GB
- Session: Wayland
Temporary workaround:
The only way I’ve found to fix it is:
- sudo modprobe -r hid_xpadneo
- Turn off the controller
- Wait ~5–10 seconds
- Turn it back on
- Try again
Sometimes I need to repeat this twice, but it usually works afterward.
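The manual steps above could be wrapped in a small script. A hypothetical sketch (it assumes xpadneo provides the hid_xpadneo module, as in the workaround; since the real reload needs root, it falls back to only printing the commands when not run as root or when given --dry-run):

```shell
#!/bin/sh
# Reload the xpadneo HID module, mirroring the manual workaround above.
# Falls back to a dry run (printing the commands) when not root or with --dry-run.
DRY=0
[ "$1" = "--dry-run" ] && DRY=1
[ "$(id -u)" -eq 0 ] || DRY=1
run() {
    if [ "$DRY" -eq 1 ]; then echo "+ $*"; else "$@"; fi
}
run modprobe -r hid_xpadneo
echo "Now power-cycle the controller: turn it off, wait ~10 seconds, turn it back on."
run modprobe hid_xpadneo
```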
Question:
Is this a known issue with Bluetooth Xbox controllers on Fedora/Nobara?
Am I missing something like:
- a specific driver (xpad vs xpadneo conflict?)
- udev rules
- Steam Input configuration
- kernel module conflict
Any ideas on how to fix this permanently would be really appreciated 🙏
submitted by /u/DeividAGameX01[link] [comments]
How to move genshit game files to Steam?
I recently installed Nobara on my SSD, and Genshin was already installed on the HDD. My weeb Discordian brother wants to play the game, but I couldn't configure Steam to access the game's directory on the HDD, and I don't feel like re-downloading 107 gigabytes' worth via the launcher again. Is there a way to make the HoYoPlay launcher find the game's .exe file?
submitted by /u/whatthehwlly[link] [comments]
How is gaming on Linux?
I’m not the biggest fan of windows 11. So I am looking for an alternative. Back in the day my dad who got me into being a computer enthusiast used to partition his hard drive with a dual boot into windows and Ubuntu. I know pretty much everything there is to know about windows but I don’t have any experience with Linux at all. Is it super complicated for a beginner?
submitted by /u/brokeboii94[link] [comments]
Wordle for Linux
I built a Wordle-inspired game for Linux lovers.
Three daily puzzles:
- Guess the Linux command by its attributes.
- Identify the blurred distro logo.
- Name the DE/WM from a screenshot.
It's still a work in progress and I am very open to suggestions (games to add, improvements I can make, etc.).
Try it out: https://linuxdle.site
submitted by /u/afiddlemain[link] [comments]
Bazzite?
Long story short, I'm rather annoyed and frustrated at Windows being butt. I've been told by a couple of friends and acquaintances that I should give Bazzite a shot and see how it goes. Before I do so, are there any precautions I should take? I'm considering going down the dual boot route. I mainly use my PC for gaming, browsing the internet, and using Microsoft Word for novels.
submitted by /u/im0497[link] [comments]
Fixed Overwatch memory leak + performance issues on Bazzite Linux (NVIDIA laptop)
So I spent a few days trying to get Overwatch running properly on Bazzite and I finally got it. Sharing here because someone else might be having the same issues and be unable to get it running even with all the help already out there on Reddit, forums, and AI.
My setup: Acer Nitro 5, i5-12450H, RTX 4050, 16GB RAM, Bazzite KDE
What was happening:
- RAM filling up in like 5 minutes
- System freezing and having to hard reboot
- 40-50fps in the practice range doing literally nothing
- CPU sitting at 90% for no reason
- Game wouldn't even close properly
After a lot of trial and error here's what actually fixed it:
0) Switch to the closed NVIDIA drivers. Bazzite has two NVIDIA images; make sure you're on the closed-driver one (bazzite-nvidia), not the open kernel modules.
1) Disable NVIDIA GSP. This was the biggest one by far. GSP is firmware that runs inside the GPU and it has bugs on Linux hybrid laptops. Disabling it dropped my CPU usage from 90% to around 40%.
rpm-ostree kargs --append=nvidia.NVreg_EnableGpuFirmware=0
Reboot, then verify with:
nvidia-smi --query-gpu=gsp.mode.current --format=csv
It should say [N/A].
2) Force the NVIDIA GPU. On hybrid laptops Linux defaults to the integrated GPU. Add this to Overwatch's Steam launch options:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%
3) Use GE-Proton 8-17. GE-Proton 8-17 is old but it keeps the RAM stable. Install it manually:
cd /tmp
wget https://github.com/GloriousEggroll/proton-ge-custom/releases/download/GE-Proton8-17/GE-Proton8-17.tar.gz
mkdir -p ~/.steam/root/compatibilitytools.d
tar -xf GE-Proton8-17.tar.gz -C ~/.steam/root/compatibilitytools.d/
Restart Steam and force it in Overwatch's compatibility settings.
4) DXVK async config. Create ~/.config/dxvk.conf with this inside:
dxvk.numCompilerThreads = 2
dxvk.enableAsync = true
Then update launch options to:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia DXVK_CONFIG_FILE=/home/YOURUSERNAME/.config/dxvk.conf %command%
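One way to avoid hand-editing YOURUSERNAME into that path is to let the shell expand $HOME. A small sketch that writes the same two-line config from step 4 and prints the matching launch-option line:

```shell
#!/bin/sh
# Create the DXVK config from step 4 and print launch options with $HOME filled in.
conf="$HOME/.config/dxvk.conf"
mkdir -p "$HOME/.config"
printf 'dxvk.numCompilerThreads = 2\ndxvk.enableAsync = true\n' > "$conf"
echo "__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia DXVK_CONFIG_FILE=$conf %command%"
```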
What I tried that didn't work / not recommended:
- DXVK_CONFIG="dxvk.trackPipelineLifetime = True;" -> reduces the leak but kills fps, not worth it
- Newer Proton versions (experimental/hotfix) -> terrible fps
Result: ~120fps stable, CPU at 40-57%, and RAM holds (high, but stable; if anything, just restart the game every few matches).
The leak isn't fully gone since it's a Blizzard issue, but it's slow enough that it's not a problem anymore.
Hope this saves someone a few days lol.
(Sources: Reddit posts, a lot of Claude chats, and a whole lot of practical testing and shader restarting)
submitted by /u/Gansooh[link] [comments]
World of Warcraft in-game shop purchase modal is all-black
So I'm not really an in-game shop kinda guy, but I made a bet with a buddy of mine, and so I owe him the equivalent of a WoW Token in gold. One small problem: when in game, if I go to purchase a token, the checkout window - the one where you'd validate your card etc. - is completely black, like whatever it's supposed to show has not loaded.
I am launching WoW through Battle.net, and Battle.net is launched through Faugus Launcher, under GE-Proton10-34. The prefix is on my main OS drive, while the WoW install is on a secondary ext4-formatted drive, which Faugus has mapped to drive letter V.
submitted by /u/boundbylife[link] [comments]
Why pretend on paper that such an old iGPU, the Broadwell Iris Pro 6200 (Intel Gen 8), is capable of doing any Vulkan in hardware at all?
All DX11 games (the ones I have on Steam) crash anyway when using DXVK.
You can't even play Skyrim AE from Steam, because anywhere within 10 minutes to 1 hour the i915 kernel driver crashes with a GPU reset.
-> Games crashed back in 2021 when I first tested this iGPU on Debian stable (kernel 4.19-something, with whatever graphics driver was in use back then).
-> Games crash now in 2026 (Debian testing, Kernel 6.19.10+deb14, Mesa 26.0.3).
Practically nothing has changed in the last 5 years.
The DX11-to-Vulkan translation doesn't fit this hardware without causing crashes.
All drivers are buggy.
(dmesg output taken over SSH after both the driver and the display crashed on the host, after playing Skyrim AE for about 10 minutes):
i915 0000:00:02.0: [drm] Resetting rcs0 for stopped heartbeat on rcs0
i915 0000:00:02.0: [drm] GPU HANG: ecode 8:0:00000000
You will say, "use wined3d instead - the DX11-to-OpenGL translation".
Yes, if you use wined3d, performance is good enough and the crash doesn't happen with that path: 30-40 fps at 720p lowest settings in Skyrim AE (DX11). But you get texture flickering and missing textures all over the place, so the game remains unplayable.
Even the DX11-to-OpenGL translation (handled by Mesa's i965 user-space OpenGL driver) doesn't work; it is, and has long been, full of bugs on Broadwell on Linux.
When we know the hardware was made before the very first Vulkan 1.0 specification was even finalized, why deceive users under this false pretense by showing information everywhere on the Internet that makes it look like the hardware is capable of any real Vulkan gaming on Linux?
-> 'vulkaninfo --summary' shows Vulkan conformanceVersion 1.3.0 (apiVersion 1.3.335)
-> https://en.wikipedia.org/wiki/Intel_Graphics_Technology#Capabilities_(GPU_hardware) shows Vulkan 1.3 for Broadwell on Linux
-> https://en.wikipedia.org/wiki/Mesa_(computer_graphics) shows Vulkan 1.3+ for Gen8+ hardware.
Nobody writes drivers for this hardware anymore.
None of the old bugs have ever been fixed.
All the cross-API drivers are buggy on this hardware, no matter whether it's DX -> Vulkan or DX -> OpenGL.
[link] [comments]
