Recent testing revealed that Arch Linux, Pop!_OS, and even Nobara Linux, which is maintained by a single developer, all outstripped Windows on Windows-native games. The testing was run at the high end of quality settings, and Valve's Proton was used to run the Windows games on Linux.
I’m not surprised at the confusion, because they’re using the term… not wrong, exactly, but very confusingly.
Frametime is literally the time it takes to render a frame. So you’d expect it to be a number of milliseconds per frame, and therefore that lower is better.
But they’re not looking at frametimes; they’re looking at 1% lows and expressing those in fps, not in frametimes. So yeah, confusing.
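To make the fps ↔ frametime relationship concrete, here’s a minimal toy sketch (my own example, not anyone’s actual overlay code) of turning a list of frametimes in milliseconds into an average fps and a “1% low” fps. Note there are a couple of competing conventions for 1% lows; this one just takes the 99th-percentile frametime and converts it to fps.

```python
# Toy sketch: average fps and "1% low" fps from frametimes in milliseconds.
def fps_stats(frametimes_ms: list[float]) -> tuple[float, float]:
    ordered = sorted(frametimes_ms)  # fastest frames first
    avg_ms = sum(ordered) / len(ordered)
    # The slowest ~1% of frames, expressed as fps, is roughly what
    # benchmark overlays report as the "1% low".
    p99_ms = ordered[int(len(ordered) * 0.99)]
    return 1000.0 / avg_ms, 1000.0 / p99_ms

if __name__ == "__main__":
    # Mostly ~10 ms frames (100 fps) with one 40 ms stutter thrown in.
    sample = [10.0] * 99 + [40.0]
    avg_fps, one_percent_low = fps_stats(sample)
    print(f"average: {avg_fps:.0f} fps, 1% low: {one_percent_low:.0f} fps")
    # -> average: ~97 fps, 1% low: 25 fps — the average hides the stutter.
```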
For the record, the reason the term is becoming popular is that there are now widespread visualizations that will plot your frametimes as a line on a graph, so you can see whether the line is flat or spiky. You’ve probably seen it on the Steam Deck or in performance analysis videos or whatever. The idea is that consistent frametimes are better than a high average fps with poor 1% or 0.1% lows. So a stable 60fps can look better than a spiky 90fps, and so on.
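Here’s that “stable 60 vs spiky 90” point as a toy comparison with made-up numbers, purely to illustrate why the averages alone are misleading:

```python
# Two invented frametime traces (milliseconds): one flat, one with hitches.
def summarize(frametimes_ms: list[float]) -> str:
    ordered = sorted(frametimes_ms)
    avg_fps = 1000.0 * len(ordered) / sum(ordered)
    low_fps = 1000.0 / ordered[int(len(ordered) * 0.99)]  # "1% low", as fps
    return f"{avg_fps:.0f} fps average, {low_fps:.0f} fps 1% low"

stable_60 = [16.7] * 100             # flat line: every frame ~16.7 ms
spiky_90  = [9.0] * 95 + [50.0] * 5  # high average, but regular 50 ms hitches

print("stable:", summarize(stable_60))  # ~60 fps average, ~60 fps 1% low
print("spiky :", summarize(spiky_90))   # ~90 fps average, ~20 fps 1% low
```

The spiky trace wins on average fps but its 1% low (and the visible stutter that goes with it) is far worse, which is exactly what a flat frametime graph is supposed to surface.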