

Thanks for the reminder. I'd almost forgotten to pre-order.
Why would you need a circle tool? Is your monitor round?
Over the years I'd always give Linux a try for a week or so, but then crawl back to Windows. The first time I actually found it somewhat viable and stuck with it for over a month was with the Proton release, but at that point there were still too many pain points.
Then, when Windows started pushing Recall, I went to Fedora 38, and it lasted me almost 6 months before I went back to W11 due to many issues with just basic desktop use caused by the buggy nature of KDE 5, with which I lost patience.
Starting with Fedora 40, and with GNOME beginning to support VRR, I've been on Linux ever since and have had no real desire to go back. It seems that for my use case Linux has finally reached the point where Windows is no longer necessary; in fact I dread going back whenever I think about it, as there are now things I would miss by switching back to Windows.
Also, I use Windows 11 at my job and I really hate it; multi-tasking is so much better even with just a single monitor on Linux vs dual monitors on Windows… I also just really like GNOME; even before I'd tried it, I had customized my KDE to be GNOME-like without realizing it. And yes, I've tried KDE 6, but it's not for me. I do plan to try Hyprland, as that seems more interesting, but I dread moving on from Fedora since it works well for me, so I don't really have any need to distro hop.
It is (if we're talking about FSR as upscaler tech). But it won't help in CPU-bound scenarios where the GPU already has to wait for the CPU.
I only noticed that the sharpness was not the same after I'd uploaded the video. Sorry.
Because the "delayed" or real input does not correspond to the image you see on the screen. That's why FG is most useful when you already have a high base framerate: the input latency gets significantly lower, and the discrepancy between the felt input and the perceived image narrows.
Example:
30 FPS is 33.3 ms of frame-to-frame latency (+ some extra from mouse to displayed image).
With 2x FG you get at most 60 FPS, assuming there's no performance penalty for FG. So you see 16.7 ms frame to frame, but input remains at 33.3 ms (+ mouse-to-display in both cases).
Same from a base of 60 FPS (16.7 ms) to FG 120 FPS: 8.3 ms perceived, but 16.7 ms input.
Same from a base of 120 FPS (8.3 ms) to FG 240 FPS: 4.2 ms perceived…
As you can see the difference in input gets smaller and smaller between base FPS and FG FPS as you’re increasing the base framerate.
This is, however, a perfect scenario that does not represent real-world cases. Usually your base FPS fluctuates with CPU- and GPU-intensive scenes, and during those fluctuations you will get big input-delay spikes that can be felt strongly, as they suddenly widen the gap between the perceived image and the real input… Couple that with the fact that FG almost always carries a performance penalty, since it puts more strain on the GPU, so your base framerate will be lower and your input latency therefore higher.
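The numbers above can be sketched as a few lines of Python. This assumes the same idealized 2x frame generation with zero performance overhead as the example, which real FG never quite achieves:

```python
# Sketch: how the gap between perceived frame time and real input latency
# shrinks as the base framerate rises (idealized 2x FG, no overhead).

def frame_time_ms(fps: float) -> float:
    """Frame-to-frame latency in milliseconds for a given framerate."""
    return 1000.0 / fps

for base_fps in (30, 60, 120):
    fg_fps = base_fps * 2                  # 2x frame generation
    perceived = frame_time_ms(fg_fps)      # smoothness you see on screen
    real_input = frame_time_ms(base_fps)   # input still ticks at the base rate
    gap = real_input - perceived           # felt input vs perceived image
    print(f"base {base_fps:>3} FPS -> FG {fg_fps:>3} FPS: "
          f"perceived {perceived:.1f} ms, input {real_input:.1f} ms, gap {gap:.1f} ms")
```

Running it shows the gap halving each time the base framerate doubles (16.7 ms at base 30, 8.3 ms at base 60, 4.2 ms at base 120), which is the whole argument for only enabling FG on top of an already-high base framerate.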
It's not using the CPU.
It's not. The whole point of FG was to take advantage of high-refresh-rate monitors, as most games can't render 500 FPS even on the fastest CPU… alas, here we are with games requiring FG just to get you to 60 FPS on most systems (looks at Borderlands 4 and Monster Hunter Wilds).
With the INT8 model this should work on older cards as well, and on NVIDIA and Intel.
Baldur's Gate 3 AFAIK does not officially support FSR4, yet this works with it via OptiScaler (I've tried it on the Steam Deck). I wanted to try it on PC as well, but the game has updated to the officially Linux-supported version, and this does not work with it because it's Vulkan-only now. My internet is slow, so I can't be bothered to redownload almost 100 GB just to downgrade the game version. I'll probably have to check what else is in my library.
Yes, that's why FPS in this case is not a good measure of performance.
There is a modified .dll you can use to replace the one in the game folder… AMD accidentally leaked it when they were releasing some open source stuff.
I can send you a link tomorrow or upload it; I'm not at my PC right now.
edit:
here is the link: https://gofile.io/d/fiyGuj
You need to rename it to amd_fidelityfx_dx12.dll and replace the one in the game folder, and it should work (in Cyberpunk). I had to use OptiScaler for Hogwarts Legacy, as just replacing the .dll made the game crash on launch and it was necessary to spoof it as DLSS.
It's kind of the same thing. You get input lag based on the real framerate. Since interpolation requires some extra performance, the base framerate will likely be a bit lower than without interpolation, which will cause an increase in input lag while providing a smoother image.
Yes, it’s the INT8, not FP8 version.
Why would FSR have anything to do with input lag? The only reason input lag would increase is FSR4 being more difficult to run on RDNA2, which would mean lower FPS, and FPS is directly tied to input lag.
But we are talking about 120 FPS vs 150 FPS here when comparing Quality presets, so I doubt you could even tell. And even if you can, just lower the preset; it will still look better and get you to the same performance.
From the multiple games I've tested so far, my conclusion is that I am almost always CPU-limited, even with a 5800X3D (in CP2077, Hogwarts Legacy, and Kingdom Come: Deliverance 2). Most areas are CPU-heavy due to a lot of NPCs, and FPS drops enough in those areas that my GPU is bored; the only benefit of FSR4 there is that it looks better, but it won't yield any performance gains.
I have the 512 GB LCD and the 1 TB OLED. I've been too lazy to put up a listing for the LCD yet, but does 350€ seem reasonable? It's from the pre-order batch, so it's one of those early units (with the better fan).
Most likely within a week of the Fedora 43 release.
I know. It also has a more efficient modem. Battery capacity is basically identical to the iPhone 15 Pro. Apple only quotes video playback for battery longevity, which is quite a useless metric as it relies heavily on the efficiency of the HW decoder, so I'll see after the reviews. What I'm certain about, though, is that the battery life won't be worse than the Pixel 10's.
So only around a 17.25% smaller battery than the base model. That isn't as bad as I thought. The Air also has 12 GB of RAM vs 8 GB on the base model.
Now, the lack of stereo speakers and an ultra-wide camera at this price point… that is a weird trade-off. I really like the Air, but I think they could instead have easily made a new Mini model, this time with good battery life.
Hot Type