 nostr:npub1wc2kznjw6kldwzz8eedr9pf208waa8ww3fjsqk0nz4rdv8q663qswt4sk7 nostr:npub17jqvr0kp48sjwctcvhre8lk87yr5qqe726zkunwq2uhtd35wdx4sgkv996 

> Acceptable for what?

For code I write.

> Acceptable for compiling a L'EUnuchs system?

If they've got their code written so that the compiler doesn't botch it, then sure.  I wouldn't recommend it, but I'm not the one who had to write that code and argue with a compiler over it.

> So yes, they are C99 compilers.

C99 isn't C, but some other language!

> But they do implement C20,

> When exactly was this before?

Right before the compilers implemented the behavior that optimized away infinite loops, or any of the other retarded shit they crammed in there.  The standards body has gone loopy.
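A minimal sketch of the kind of loop that complaint is about (mine, not anything from the thread): under the forward-progress assumption added around C11, a loop whose controlling expression isn't constant and whose body does no I/O, volatile access, or atomics may be assumed to terminate, so an aggressive optimizer is allowed to delete it even when it provably never exits.

```c
/* Illustration only.  This loop never terminates (x stays odd forever),
 * but C11 6.8.5p6 lets the compiler assume it does, since the controlling
 * expression isn't constant and the body has no side effects. */
#include <stdio.h>

static unsigned spin(unsigned x)
{
    while (x != 0)
        x = (x * 3u) | 1u;   /* always odd, so the condition never goes false */
    return x;
}

int main(void)
{
    /* Unoptimized this hangs; with optimization it may return immediately. */
    printf("%u\n", spin(5u));
    return 0;
}
```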

Someone on the mailing list started getting wiggy about "undefined behavior" (which, again, was created to allow compilers to do something sensible but non-portable, rather than something punitive), and Charles Forsyth's response was sensible.  The worry: "The compiler could do anything!  IT COULD REFORMAT YOUR HARD DRIVE!"  His reply: "I would recommend avoiding a compiler that did that."
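To make that reading concrete, here's a hedged illustration of my own (not from the list): signed overflow is undefined so that each implementation could do whatever its hardware did (wrap, trap, saturate); a modern optimizer instead assumes it never happens and will happily fold a naive overflow check into nothing.

```c
/* Illustration only: this check relies on wrapping, which the standard never
 * promised.  An optimizer may rewrite "a + b < a" as "b < 0" on the grounds
 * that signed overflow can't happen, so the overflow case silently vanishes. */
#include <limits.h>
#include <stdio.h>

static int add_overflows(int a, int b)
{
    return a + b < a;
}

int main(void)
{
    /* Prints 1 if the addition wrapped, but may print 0 under optimization. */
    printf("%d\n", add_overflows(INT_MAX, 1));
    return 0;
}
```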

> I don't think so, because most compilers implemented vastly different language features.

That still happens.

> Remember near/far pointers from Microsoft C?

No; for the brief period when I was writing C but didn't have a Unix system to write it on, I used Borland.  I did get put in charge of porting some code to Windows and can say for certain that Microsoft's implementation of BSD sockets is terrible; it's no wonder network code in Windows was shit.  But that was WinXP, so weird 16-bit x86 segmentation problems would not have come into play.
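For anyone who hasn't had the pleasure, a small sketch (not the code I actually ported) of the sort of shim that porting needs: Winsock demands an explicit WSAStartup() before any socket call and closes sockets with closesocket() instead of close().

```c
/* A sketch of the shim, not the real thing. */
#ifdef _WIN32
#include <winsock2.h>              /* link against ws2_32 */
#define CLOSE_SOCKET(s) closesocket(s)
#else
#include <sys/socket.h>
#include <unistd.h>
#define CLOSE_SOCKET(s) close(s)
#endif

int net_init(void)
{
#ifdef _WIN32
    WSADATA wsa;
    return WSAStartup(MAKEWORD(2, 2), &wsa);   /* 0 on success */
#else
    return 0;                                  /* nothing to do on Unix */
#endif
}

void net_fini(void)
{
#ifdef _WIN32
    WSACleanup();
#endif
}
```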

> C was never really portable, unless by portable we mean having to manually port and re-test the majority of code. It's all false advertising that people took seriously.

If by "portable" you mean "Nothing ever requires adjustment for the OS or architecture" then almost nothing is "portable".  Plenty of C code I've written runs fine without modifications on ARM or x86, 32 or 64 bits, etc., but there's stuff I've written in extremely portable languages that requires adjustment to run if I shove them into a different *distro* of the same OS with the same CPU:  paths move around, libraries vary.  Even a language that pretends to be completely machine-independent is going to behave differently:  when does a boxed integer get promoted to an unlimited-precision integer, why's a program that used to allocate almost no memory now getting OOM'd when you run it on a 32-bit machine?  Oops, Ubuntu decided pids should be 32 bits, the columns are now misaligned in your output.  ImageMagick gets compiled weird on Arch last I checked, so my code ran fine locally, it ran fine on Debian, it ran fine on OSX(!) but Arch had enabled some weird "optimize this for DSLR and this necessitates dropping some features" and I was relying on those features in something written in Ruby.  Anything you write is going to have to get jiggled a bit before it can escape its initial environment, C's no different.  The idea of standardization is to make poartng an easier process, not to make everyone account for extinct machines when bashing out a one-off.  (I wrote this tiny program to control the backlight on a portable device, compile it, slap setuid on it, I can control the backlight without sudo, binary is 10k:  people write little one-offs all the time.)

C's fine; maybe you don't like it, that's fine.  I like it, and I'd like to use good compilers when I write C.