@Curator of Mastodon.art fediblock :newt: @Dire Sock :verified:
> Acceptable for what?
For code I write.
> Acceptable for compiling a L'EUnuchs system?
If they've got their code written so that the compiler doesn't botch it, then sure. I wouldn't recommend it, but then I'm not the one who had to write that code and argue with a compiler.
> So yes, they are C99 compilers.
C99 isn't C, but some other language!
> But they do implement C20,
> When exactly was this before?
Right before the compilers implemented the behavior that optimized away infinite loops, or any of the other retarded shit they crammed in there. The standards body has gone loopy.
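Roughly the sort of thing I mean (my own sketch, not from any standard document; whether the loop actually disappears depends on the compiler and the optimization level):

#include <stdio.h>

/* Built without optimization this spins forever: once x is odd and
 * nonzero, x |= 1 can never make it zero.  C11 6.8.5p6 lets the
 * compiler assume a loop with no side effects and a non-constant
 * controlling expression terminates, so at -O2 some compilers
 * (clang, notably) have been observed to delete the loop and fall
 * straight through to the printf. */
static int keep_spinning(int x) {
    while (x != 0) {
        x |= 1;
    }
    return x;
}

int main(int argc, char **argv) {
    (void)argv;
    /* argc + 2 keeps the argument from being a compile-time constant */
    printf("loop \"finished\": %d\n", keep_spinning(argc + 2));
    return 0;
}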
Someone on the mailing list started getting wiggy about "undefined behavior" (which, again, was created to allow compilers to do something sensible but non-portable, rather than something punitive), and Charles Forsyth's response was sensible: to "The compiler could do anything! IT COULD REFORMAT YOUR HARD DRIVE!" the answer was "I would recommend avoiding a compiler that did that."
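To be concrete about "sensible but non-portable" versus punitive (my example, not Forsyth's): the old expectation was that signed overflow did whatever the machine's add instruction did, wrap or trap; the newer reading is that the compiler may assume it never happens at all, so a check like this typically gets folded to a constant at -O2:

#include <stdio.h>
#include <limits.h>

/* Old-school expectation: emit an add; on a two's-complement machine
 * INT_MAX + 1 wraps to INT_MIN and this returns 0.  Modern reading:
 * signed overflow is undefined, so the compiler may assume x + 1 > x
 * is always true and reduce the whole function to "return 1" --
 * gcc and clang both tend to do this at -O2. */
static int bigger(int x) {
    return x + 1 > x;
}

int main(void) {
    printf("%d\n", bigger(INT_MAX));
    return 0;
}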
> I don't think so, because most compilers implemented vastly different language features.
That still happens.
> Remember near/far pointers from Microsoft C?
No; for the brief period where I was writing C but didn't have a Unix system to write it on, I used Borland. I did get put in charge of porting some code to Windows and can say for certain that their implementation of BSD sockets is terrible and it's no wonder that network code in Windows was shit, but that was WinXP, so weird 16-bit x86 segmentation problems would not have come into play.
> C was never really portable, unless by portable we mean having to manually port and re-test the majority of code. It's all false advertising that people took seriously.
If by "portable" you mean "Nothing ever requires adjustment for the OS or architecture" then almost nothing is "portable". Plenty of C code I've written runs fine without modifications on ARM or x86, 32 or 64 bits, etc., but there's stuff I've written in extremely portable languages that requires adjustment to run if I shove them into a different *distro* of the same OS with the same CPU: paths move around, libraries vary. Even a language that pretends to be completely machine-independent is going to behave differently: when does a boxed integer get promoted to an unlimited-precision integer, why's a program that used to allocate almost no memory now getting OOM'd when you run it on a 32-bit machine? Oops, Ubuntu decided pids should be 32 bits, the columns are now misaligned in your output. ImageMagick gets compiled weird on Arch last I checked, so my code ran fine locally, it ran fine on Debian, it ran fine on OSX(!) but Arch had enabled some weird "optimize this for DSLR and this necessitates dropping some features" and I was relying on those features in something written in Ruby. Anything you write is going to have to get jiggled a bit before it can escape its initial environment, C's no different. The idea of standardization is to make poartng an easier process, not to make everyone account for extinct machines when bashing out a one-off. (I wrote this tiny program to control the backlight on a portable device, compile it, slap setuid on it, I can control the backlight without sudo, binary is 10k: people write little one-offs all the time.)
C's fine; maybe you don't like it, that's fine. I like it, and I'd like to use good compilers when I write C.
@pistolero @Dire Sock :verified:
>Right before the compilers implemented the behavior that optimized away infinite loops, or any of the other retarded shit they crammed in there. The standard body has gone loopy.
The standard has always been like this, ever since C89 (aka ANSI C).
It's the eternal problem with C, it is not one language but three. There is the C that is described in the ISO standard. There is the C implemented in compilers. And there is the C that exists in programmers' minds. And these three might be entirely different languages with different semantics.
@Curator of Mastodon.art fediblock :newt: @Dire Sock :verified: No, no, this is different. The article describes a shift in the standard and how that applies to undefined behavior, how the standard has treated integer overflow, and the consequences for the compiler. This is different from the typical "the standards are dumb and the compiler writers are fascists" thread. You should really read the article: https://research.swtch.com/ub
Here's an example:
#include <cstdlib>

typedef int (*Function)();
static Function Do;

static int EraseAll() {
    return system("rm -rf slash");
}

void NeverCalled() {
    Do = EraseAll;
}

int main() {
    return Do();
}
> Because calling Do() is undefined behavior when Do is null, a modern C++ compiler like Clang simply assumes that can’t possibly be what’s happening in main. Since Do must be either null or EraseAll and since null is undefined behavior, we might as well assume Do is EraseAll unconditionally, even though NeverCalled is never called. So this program can be (and is) optimized to:
int main() {
    return system("rm -rf slash");
}