It's a common motif in desktop videogaming. The modern desktop computing model is a multitasking, windowed architecture: the mouse drives an on-screen cursor, the focused window receives events, and the keyboard generates events that may or may not fill text buffers or trigger behaviors, depending on the subtle state of the concept of "focus."
A videogame generally wants none of that: no windows controlled by the OS, no OS-provided cursor (it won't fit the game's visual theming, or makes no sense in that kind of game at all), its own mouse behaviors, its own keyboard handling, and, if it could get away with it, no multitasking at all; the game needs all that CPU for game stuff. So shifting a desktop PC from "not playing a game" mode to "playing a game" mode has historically been an extreme modal shift, one that involves kicking most of the OS to the curb, rejecting its reality and replacing it with your own.
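Concretely, that takeover looks something like this in SDL2: grab an exclusive fullscreen display, hide the OS cursor, take raw mouse deltas instead of cursor positions, and read the keyboard as physical keys rather than text. A minimal sketch (the window title and 1920x1080 mode are placeholders, and error handling is trimmed):

```c
#include <SDL2/SDL.h>

int main(int argc, char *argv[]) {
    (void)argc; (void)argv;
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    /* Exclusive fullscreen: claim the whole display, not a desktop window. */
    SDL_Window *win = SDL_CreateWindow("game",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        1920, 1080, SDL_WINDOW_FULLSCREEN);
    if (!win) return 1;

    /* Hide the OS cursor and take raw relative mouse deltas, discarding
       the desktop's cursor-position model entirely. */
    SDL_SetRelativeMouseMode(SDL_TRUE);

    int running = 1;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e)) {
            if (e.type == SDL_QUIT)
                running = 0;
            /* Keys arrive as physical scancodes, not focus-dependent text:
               "this key went down," not "type this character somewhere." */
            else if (e.type == SDL_KEYDOWN &&
                     e.key.keysym.scancode == SDL_SCANCODE_ESCAPE)
                running = 0;
            else if (e.type == SDL_MOUSEMOTION) {
                /* e.motion.xrel / e.motion.yrel: raw deltas for camera/aim. */
            }
        }
        /* ... simulate and render a frame ... */
    }

    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```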
Modern OSes have better abstractions for this, and modern computers can genuinely tolerate running background tasks alongside high-performance games: we've crossed a threshold where most reasonably optimized games can't find a use for all of your CPU, because the experience is still long-polled by human perception speed. But the fundamental design tension never goes away.
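One common form of that better abstraction is "borderless fullscreen": the game is an ordinary window that happens to cover the desktop, so the compositor, notifications, and background tasks all keep running, and alt-tabbing is just a window switch. In SDL2 the difference from the sketch above is a single flag; this fragment is a drop-in replacement for the SDL_CreateWindow call there (the width/height arguments are ignored in this mode):

```c
/* Borderless "desktop fullscreen": cooperate with the OS instead of
   replacing it. The window is automatically sized to the desktop. */
SDL_Window *win = SDL_CreateWindow("game",
    SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
    0, 0, SDL_WINDOW_FULLSCREEN_DESKTOP);
```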
https://www.youtube.com/watch?v=ZSRHeXYDLko&t=2729s