

But not the same standard C lib. That’s probably the most important thing outside of kernelspace.


It’s not so much the character length of any specific encoding. It’s all the details that go into supporting it. Can’t assume text is read left to right. Can’t assume case insensitivity works the same way as in your language. Can’t assume the shape of a glyph won’t be affected by the glyph next to it. Can’t assume the shape of a glyph won’t be affected by a glyph five characters down.
Pile up millions of these little assumptions you can no longer make in order to support every written language ever. It gets complicated.
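A few of those broken assumptions are easy to demonstrate in plain Python with the standard library (this is just an illustrative sketch of the kinds of surprises, not an exhaustive list):

```python
import unicodedata

# Case mapping is language-dependent: Turkish "I" should lowercase to
# dotless "ı", but a locale-unaware .lower() gives plain "i".
print("I".lower())            # "i" -- wrong for Turkish text

# Case mapping can change string length: German sharp s uppercases
# to two characters.
print("ß".upper())            # "SS"

# The "same" character can be one code point or two; they render
# identically but compare unequal until normalized.
a = "\u00e9"                  # é, precomposed
b = "e\u0301"                 # e + combining acute accent
print(a == b)                             # False
print(unicodedata.normalize("NFC", b) == a)  # True
```

And that’s before rendering: Arabic letters, for instance, use the same code points regardless of position, and the text shaper picks a different glyph depending on the neighbors.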


Sorta modern.
There have been two big jumps in fundamental RAM usage during my time using Linux. The first was the move from libc to glibc, which tended to force at least 8MB as I recall. The second was adding Unicode support. That blew things up into the gigabyte range.
Edit: basing a lot of this on memory. The gigabyte range would be difficult for an OG Raspberry Pi; I think it was closer to 128MB. That seems more reasonable given the difficulty of implementing every written language.
We can’t exactly throw out Unicode support, at least not outside of specific use cases. Hypothetically, you might be able to make architectural changes to Unicode that would take less RAM, but it would likely require renegotiating all the cross-cultural deals that went into Unicode the first time. Nobody wants to go through that again.


It can be routed more efficiently and generally has lower latency. How much that matters in practice is debatable, though, and real-world data has fluctuated.
One thing it definitely enables is easier setup of home servers for games without NAT nonsense.


Microsoft even sees it as a big mistake. They’re creating APIs that won’t require anti-cheat to be in the kernel like that. There shouldn’t be any reason it needs to be in the Linux kernel.
That said, “don’t trust the client” is a nice thing to say, but it’s basically impossible to make games work like that. There are certain protocol design considerations needed for FPS games to work in multiplayer over somewhat laggy connections, and they’re not completely compatible with “don’t trust the client”. If we all had the fiber optic connections and IPv6 that we were promised in the 90s, things would be different. The whack-a-mole game against cheaters is the best that can be done otherwise.
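The usual latency-hiding technique here is client-side prediction with server reconciliation: the client simulates its own inputs immediately and the server catches up later. A toy 1-D sketch (all names and numbers are illustrative, not from any real engine) shows why the server ends up accepting the client’s claimed inputs at all:

```python
TICK = 1 / 60   # 60 Hz simulation step (illustrative)
SPEED = 5.0     # movement units per second (illustrative)

def step(pos, inp):
    """Advance one tick given an input of -1, 0, or +1 (1-D movement)."""
    return pos + inp * SPEED * TICK

# Client predicts ticks 0..3 locally, keeping unacknowledged inputs
# so it never waits a round trip to move.
inputs = [1, 1, 0, 1]
predicted = 0.0
for inp in inputs:
    predicted = step(predicted, inp)

# Server later acknowledges through tick 1 with an authoritative position.
server_pos = step(step(0.0, 1), 1)
acked = 2

# Reconcile: rewind to the server state, replay the unacked inputs.
replayed = server_pos
for inp in inputs[acked:]:
    replayed = step(replayed, inp)

# Prediction and replay match -- but only because the server accepted
# the client's claimed inputs as truth, which is exactly the opening
# that cheats exploit.
print(predicted, replayed)
```

The server can sanity-check inputs (speed caps, rate limits), but it can’t fully verify them without adding back the latency the prediction was hiding.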
Year of the Linux download!
TIL the 19th century term for undiagnosed ADHD.