>>105870348
Well said. To the point, even if a bit oversimplified.
>not 10 millions lines of code
X.org is more like 1.2 million lines of code, but I understand it's hyperbole.
>X is a lot of useless lines of code and dead code and useless features no one cared since 1990, all of which are bypassed using modules because since 2007 you really only cares about:
>1. Get a GPU buffer
>2. Draw on it on time
>3. Occasionally handle inputs
That's the crux of the issue. The X protocol was made in a time when you expected the X server to handle fonts, text, filling boxes, and filling boxes with colours. Gradients. Shadows. Basically clients sent commands like "display A in that font with a shadow" to the X server, which then did its best. A lot of code exists purely so the X server can handle the most inane of drawing commands (see the sketch below).
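For the zoomers: here's roughly what that old paradigm looks like in plain Xlib (a real API that still works today; the window size and the string are just placeholder values). Every one of those calls is a request the *server* rasterises for you:

```c
/* Minimal sketch of 1980s-style core-protocol drawing with Xlib:
   the client asks the server to draw text and rectangles server-side. */
#include <X11/Xlib.h>
#include <string.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);          /* connect to the X server */
    if (!dpy) return 1;
    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                     0, 0, 320, 200, 1,
                                     BlackPixel(dpy, scr), WhitePixel(dpy, scr));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    GC gc = XCreateGC(dpy, win, 0, NULL);       /* graphics context lives server-side */
    XEvent ev;
    for (;;) {
        XNextEvent(dpy, &ev);
        if (ev.type == Expose) {
            /* the server does the actual rasterisation of these requests */
            XFillRectangle(dpy, win, gc, 20, 20, 100, 60);
            XDrawString(dpy, win, gc, 20, 120, "hello", (int)strlen("hello"));
        }
        if (ev.type == KeyPress) break;
    }
    XCloseDisplay(dpy);
    return 0;
}
```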
This was a 1988 paradigm. It worked. The protocol was also kinda extremely slow (each command is sent *individually* and then *acknowledged* individually, introducing many round trips, which wasn't an issue on a fast LAN in 1988 with extremely weak computers, but is one nowadays) and also kinda bad (if there is an error the X server returns *lmao there was an error* to the client, and it's up to the client to figure out what it did wrong; OpenGL had the same philosophy).
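To illustrate the error thing: in Xlib your error "handling" is a global callback that tells you an opcode and a serial number, some time after the fact. A rough sketch (the bogus window ID below is made up on purpose to trigger a BadDrawable):

```c
/* Sketch of how X reports errors: asynchronously, after the fact.
   The handler gets an opcode and a serial; figuring out which call
   actually went wrong is the client's problem. */
#include <X11/Xlib.h>
#include <stdio.h>

static int on_x_error(Display *dpy, XErrorEvent *e) {
    char msg[128];
    XGetErrorText(dpy, e->error_code, msg, sizeof msg);
    fprintf(stderr, "X error: %s (request opcode %u, serial %lu)\n",
            msg, (unsigned)e->request_code, e->serial);
    return 0;                      /* "noted, carry on" */
}

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;
    XSetErrorHandler(on_x_error);
    GC gc = XCreateGC(dpy, RootWindow(dpy, DefaultScreen(dpy)), 0, NULL);
    /* Deliberately bogus request: draw on a window ID that doesn't exist.
       The call returns immediately; the error only shows up later. */
    XFillRectangle(dpy, (Window)0xdeadbeef, gc, 0, 0, 10, 10);
    XSync(dpy, False);             /* force the round trip so the error arrives */
    XCloseDisplay(dpy);
    return 0;
}
```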
Except nowadays we expect things from our fonts, like, I dunno, anti-aliasing. And none of that is handled by the graphical stack, but by whatever library you use to actually render characters, which simply puts the finished glyphs into a GPU buffer. X hasn't handled this since, what, roughly 2008. Well, more like the early 2000s.
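That's what "the toolkit does the fonts now" means in practice: FreeType rasterises glyphs into a plain memory buffer on the client side and the server never sees a font. Rough sketch only; the font path is just an example and will differ per distro:

```c
/* Client-side glyph rasterisation with FreeType: the anti-aliased pixels
   are produced in the client's own memory, then composited into whatever
   buffer the toolkit hands to the display server. */
#include <ft2build.h>
#include FT_FREETYPE_H
#include <stdio.h>

int main(void) {
    FT_Library ft;
    FT_Face face;
    if (FT_Init_FreeType(&ft)) return 1;
    /* example path, adjust to wherever your fonts live */
    if (FT_New_Face(ft, "/usr/share/fonts/TTF/DejaVuSans.ttf", 0, &face)) return 1;
    FT_Set_Pixel_Sizes(face, 0, 32);                 /* 32 px tall */
    /* FT_LOAD_RENDER gives an anti-aliased 8-bit coverage bitmap */
    if (FT_Load_Char(face, 'A', FT_LOAD_RENDER)) return 1;

    FT_Bitmap *bmp = &face->glyph->bitmap;
    printf("glyph 'A': %ux%u pixels of anti-aliased coverage\n",
           bmp->width, bmp->rows);
    /* bmp->buffer is what actually ends up in the (usually GPU) buffer */
    FT_Done_Face(face);
    FT_Done_FreeType(ft);
    return 0;
}
```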
But the code remains. The protocol remains. The absurdly huge amount of code remains.
Since the early 2010s, all any conceivable graphical software asks of X.org is: a buffer, a draw() command, and the input on it. All of which are handled by two extensions to X.org: XInput, and whatever the one for GPU access is, I forgot the name.
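For the input half, the modern path looks roughly like this with the XInput2 extension (real API; everything else here is just setup boilerplate). The drawing half goes through client-rendered buffers and never touches the core drawing requests from the first sketch:

```c
/* Sketch of the slice of X a modern client actually uses for input:
   register with XInput2 and wait for events on a window. */
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>
#include <stdio.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    int xi_opcode, ev, err;
    if (!XQueryExtension(dpy, "XInputExtension", &xi_opcode, &ev, &err)) return 1;
    int major = 2, minor = 0;
    if (XIQueryVersion(dpy, &major, &minor) != Success) return 1;

    Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                     0, 0, 320, 200, 0, 0, 0);
    XMapWindow(dpy, win);

    /* ask XInput2 for button and key presses on this window */
    unsigned char mask_bits[XIMaskLen(XI_LASTEVENT)] = {0};
    XIEventMask mask = { XIAllDevices, sizeof mask_bits, mask_bits };
    XISetMask(mask_bits, XI_ButtonPress);
    XISetMask(mask_bits, XI_KeyPress);
    XISelectEvents(dpy, win, &mask, 1);
    XFlush(dpy);

    for (;;) {
        XEvent e;
        XNextEvent(dpy, &e);
        if (e.xcookie.type == GenericEvent && e.xcookie.extension == xi_opcode) {
            printf("got an XInput2 event (evtype %d)\n", e.xcookie.evtype);
            break;
        }
    }
    XCloseDisplay(dpy);
    return 0;
}
```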