

How about a 6.4TB sqlite database?
made you look
The web moves so fast that we ditched W3C standards for the WHATWG living standard because it took too long to release new features.
That’s because the W3C was focused on XHTML 2 at the time, which nobody outside of the W3C actually wanted. So any proposed amendments to HTML 4 were met with “But we’ll have XHTML 2 soon!”
I’m skeptical of claims from browser makers that the spec process wasn’t moving “fast enough”, since it’s not like they actually implemented it fully anyway.
I take it there isn’t much motivation to move to 128-bit, because 64 bits is big enough; it only takes 8 cycles (?) to fill a 512-bit register (that can’t be right?).
8 cycles would be an eternity on a modern CPU; they can achieve multiple register-sized loads per cycle.
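
The arithmetic behind the “8 cycles” guess is presumably 512 bits ÷ 64-bit loads = 8 separate loads, but a vector load moves all 512 bits in one go. A minimal sketch of the two views, assuming an x86-64 CPU with AVX-512 (compile with something like -mavx512f; the function name is made up):

    #include <immintrin.h>
    #include <stdint.h>

    /* Hypothetical illustration: two ways to move 512 bits. */
    void fill_512(const uint64_t *src, uint64_t dst[8]) {
        /* Naive view: eight 64-bit general-purpose loads (512 / 64 = 8). */
        for (int i = 0; i < 8; i++)
            dst[i] = src[i];

        /* With AVX-512, a single vector load covers all 512 bits, and a
           modern core can issue more than one load per cycle. */
        __m512i v = _mm512_loadu_si512((const void *)src);
        _mm512_storeu_si512((void *)dst, v);
    }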
If we do see a CPU with 128-bit addresses anytime soon, it’ll be something like CHERI, where the extra bits are used for flags.
I think CHERI is the only real attempt at a 128-bit system, but it uses the upper 64 bits for metadata, so the address space is still 64 bits.
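
As a rough mental model only (CHERI’s real format is a compressed encoding with an out-of-band validity tag, and the field names here are made up), a 128-bit capability still only carries a 64-bit address:

    #include <stdint.h>

    /* Purely illustrative layout, not CHERI's actual encoding. */
    typedef struct {
        uint64_t metadata; /* permissions, object type, compressed bounds */
        uint64_t address;  /* the pointer value itself: a 64-bit address space */
    } capability128;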
NTFS was designed back in the early ’90s, when the plan was for a single NT kernel with different subsystems on top of it; some of those subsystems (e.g. POSIX) needed case sensitivity while others (Win32 and OS/2) didn’t.
It only looks odd because the sole remaining subsystem in use (Win32) barely makes use of any of the kernel’s features; for example, they’re only just now enabling long file paths.
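
For what it’s worth, the Win32 layer can ask the kernel for POSIX-style, case-sensitive name matching via FILE_FLAG_POSIX_SEMANTICS on CreateFileW; whether it actually behaves case-sensitively also depends on system configuration, so treat this as a sketch (the path is made up):

    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        /* Hypothetical path; FILE_FLAG_POSIX_SEMANTICS requests POSIX-style
           (case-sensitive) name matching from the kernel. */
        HANDLE h = CreateFileW(L"C:\\temp\\README", GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, FILE_FLAG_POSIX_SEMANTICS, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            printf("CreateFileW failed: %lu\n", GetLastError());
            return 1;
        }
        CloseHandle(h);
        return 0;
    }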
Qt is overkill if all you’re using it for is to create a window you render into; something like SDL would be better.
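
A minimal sketch of the “just give me a window to render into” case with SDL2; the window title and size are arbitrary:

    #include <SDL2/SDL.h>

    int main(int argc, char *argv[]) {
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            SDL_Log("SDL_Init failed: %s", SDL_GetError());
            return 1;
        }
        SDL_Window *win = SDL_CreateWindow("demo", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 800, 600,
                                           SDL_WINDOW_SHOWN);
        SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

        /* Clear to a solid colour each frame until the window is closed. */
        int running = 1;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e))
                if (e.type == SDL_QUIT)
                    running = 0;
            SDL_SetRenderDrawColor(ren, 30, 30, 30, 255);
            SDL_RenderClear(ren);
            SDL_RenderPresent(ren);
        }

        SDL_DestroyRenderer(ren);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }

You get the window, an event loop, and a hardware-backed renderer without pulling in a whole widget toolkit.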
They were a bit too public with “Dual_EC_DRBG”, to the point where everybody just assumed it had a backdoor and avoided it; the NSA ended up having to pay people to use it.
A place I worked at did it by duplicating and modifying a function, then commenting out the existing one. The dev would leave their name and the date each time. They never deleted the old commented-out functions, of course; history is important.
They’d also copy the source tree around on burnt CDs, so good luck finding out who had the latest copy at any one point. (Hint: it was always the lead dev, because they wouldn’t share their code, so “merging to main” involved giving them a copy of your source tree on a burnt disk.)