You can either learn it yourself or trust the language to do it for you. It's only tradeoffs. I'm a C engineer because I want control; it's that simple.
That makes sense, and I have an answer: I trust engineers like you more than myself when it comes to low-level stuff.

So probably not C 
It's also crazy how many opinions there will be, and it turns out any language is only as performant and safe as the person writing it and the algorithms used. I'm pretty fluent in like 5 or 6 languages and I continue to gravitate toward C-ish languages. I can generally write much better large-scale applications in C# than I can in C. So I cut out the high-performance components, write them in C, and link them into my C# app at load time.
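To make that concrete, here's a minimal sketch of that split (the names `sum_squares` and `libhotpath` are hypothetical, not from my actual apps): the hot loop is plain C compiled into a shared library, and the C# side binds to the export at load time via P/Invoke.

```c
#include <stddef.h>

/* hot-path routine, compiled into a shared library:
 *   gcc -O2 -fPIC -shared -o libhotpath.so hotpath.c
 * the C# app then binds to it at load time, roughly:
 *   [DllImport("hotpath")]
 *   static extern unsafe double sum_squares(double* values, nuint count);
 */
double sum_squares(const double *values, size_t count)
{
    double acc = 0.0;
    for (size_t i = 0; i < count; i++)
        acc += values[i] * values[i]; /* tight loop: no GC, no bounds checks */
    return acc;
}
```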

It also depends on how much time you want to spend farting with things too. For me, I've spent hundreds of hours hacking C# to avoid the GC and work directly with memory when I need to.

I suggest trying a few langs and seeing what sticks. Most languages will get you there; maybe you could make a list of YOUR OWN tradeoffs and avoid some online opinions for the time being. It matters what you can write best in.

Finally, I'm not a traditional developer; I generally don't prioritize the standards that (what I call) "trade" developers care about, such as readability being #1. I refuse to sacrifice obvious efficiency for the sake of it being easier for lazy and whiny developers to read. If it happens it happens, but it's like 3rd on my list.
there is no question that C is more complicated to learn than Go, but Go gives you a lot of the performance (about 90%; the main limitation is that Go is not so good at parallelism compared to using pthreads)

i'd also suggest trying C because it's pretty good for writing simple mathematical algorithms with, but the implicit types get messy. personally, one of the things i hate about Go is the fact that the `int` type is 32 bits on 32-bit hardware, and you see this sometimes in code: people using an explicit int64 (signed) in a lot of places even though, in practice, almost nobody puts serious servers on 32-bit processors
oh yeah, and the `int` used for array indexes in Go is likewise only 32 bits on 32-bit platforms, no idea if that will ever be expanded, but arrays of more than 4 billion elements are pretty much outside the needs of a graph simulation
I think it's fair to say threading is an apples-to-oranges comparison, because C has no single threading pattern or portable standard library (C11's threads.h is optional and sparsely supported; pthreads is POSIX-only). So if you need to jump into highly multi-threaded or even async work, C probably isn't it. Not unless you want to spend a bunch of time implementing the patterns yourself.

I think a fun analog is that in noscrypt I implemented the concept of a Span, taken from C#, to help me work with buffers more safely, except with macros and inline functions to make it nearly invisible when compiled.
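Roughly the idea, as a simplified sketch (not the actual noscrypt code, just the shape of it): the pointer and length travel together, and the inline helpers compile down to plain pointer arithmetic.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* C#-style Span: a buffer and its length always travel together */
typedef struct {
    uint8_t *data;
    size_t   size;
} span_t;

static inline span_t span_init(uint8_t *data, size_t size)
{
    span_t s = { data, size };
    return s;
}

/* bounds-checked write: fails cleanly instead of overflowing the buffer */
static inline int span_write(span_t dst, size_t offset,
                             const uint8_t *src, size_t len)
{
    if (len > dst.size || offset > dst.size - len)
        return 0; /* would overflow, refuse to write */
    memcpy(dst.data + offset, src, len);
    return 1;
}
```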

> i'd also suggest trying C because it's pretty good for writing simple mathematical algorithms with
Completely agree on this!

>but the implicit types get messy
Also yes, some libraries work well with an int being 8, 16, 32, or even 64 bits wide without your knowledge. I usually use explicit types, but many libraries don't, because it can often be easier to trust the compiler to manage sizing correctly for you. That takes more brain power than I wanted to dedicate to noscrypt right now, but I may switch back to implicit types for compat reasons.
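For anyone following along, this is the distinction in question; a minimal sketch of why explicit `<stdint.h>` types remove the guesswork:

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* implicit widths: the standard only guarantees minimum sizes,
       so these can differ between platforms and compilers */
    int  counter = 0;  /* at least 16 bits, usually 32 */
    long big     = 0;  /* 32 bits on Windows x64, 64 bits on Linux x64 */

    /* explicit widths: exactly the same size everywhere */
    int32_t  sample = -1;
    uint64_t key    = 0x1122334455667788ULL;

    printf("int=%zu long=%zu int32_t=%zu uint64_t=%zu (bytes)\n",
           sizeof counter, sizeof big, sizeof sample, sizeof key);
    return 0;
}
```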
using the pthreads library would be a bit more performant than Go's concurrency if you can fan out your processing for bulk stuff like what nostr:nprofile1qy2hwumn8ghj7un9d3shjtnyv9kh2uewd9hj7qghwaehxw309aex2mrp0yhxummnw3ezucnpdejz7qgkwaehxw309ash2arg9ehx7um5wgcjucm0d5hsz9mhwden5te0wfjkccte9ec8y6tdv9kzumn9wshsqg8ks058qd0h4485fc9e3naaj5m7zez44ykd8r80cn9nrkm42l677g2aakns wants to do. really, the flexibility of Go's concurrency is overkill for simple parallel work, and it's a little confusing to use it that way; it's more optimized towards event processing for servers, though the design was originally created for GUIs (Newsqueak)

Go is actually not so great at parallelism: you can gain as much as 20% even while adding the overhead of IPC and using single-threaded worker processes (at least that was the situation in 2020, maybe they have improved it by now) - but having said that, it's so much faster than, for example, python or javascript that losing that last 20% of parallelism is probably not gonna hurt that much, and if you want to scale up that far, like on a threadripper or epyc or something, well, you'd probably not be using Go, but C and pthreads
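for reference, the kind of pthread fan-out i mean is just this (a minimal sketch: static slices, no locks, one join at the end):

```c
#include <pthread.h>
#include <stdio.h>

#define NUM_THREADS 4
#define N 1000000

static double data[N];
static double partial[NUM_THREADS];

/* each worker sums its own fixed slice; slices don't overlap, so no locks */
static void *worker(void *arg)
{
    long   id    = (long)arg;
    size_t chunk = N / NUM_THREADS;
    size_t start = (size_t)id * chunk;
    size_t end   = (id == NUM_THREADS - 1) ? N : start + chunk;

    double acc = 0.0;
    for (size_t i = start; i < end; i++)
        acc += data[i];
    partial[id] = acc;
    return NULL;
}

int main(void)
{
    pthread_t threads[NUM_THREADS];

    for (size_t i = 0; i < N; i++)
        data[i] = 1.0;

    for (long t = 0; t < NUM_THREADS; t++)
        pthread_create(&threads[t], NULL, worker, (void *)t);

    double total = 0.0;
    for (long t = 0; t < NUM_THREADS; t++) {
        pthread_join(threads[t], NULL);
        total += partial[t];
    }
    printf("total = %.0f\n", total); /* expect 1000000 */
    return 0;
}
```

compile with `cc -O2 -pthread fanout.c`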

Go is pretty fast for single-threaded processing, not much slower than C, especially if you don't allocate and discard a lot of memory; walking trees and graphs is cheap, everything stays on the stack/in registers

but yes, explicit types are a thing... i ported a hamming code algorithm from C some years back and the hardest part of the process was figuring out the implied bit widths, and this is pretty much par for the course with C. your use of explicit types is probably a bit uncommon among C programmers

personally, i would remove the distinction in Go where on 32-bit hardware it treats the `int` type as 32 bits... it should just always be 64 bits, except where used in array indexes; that's one area where it's implicit, and 64-bit array indexes are a waste of space, ain't nobody got that much memory lol. but it's not a difference anyone really encounters, because Go on embedded 32-bit devices is pretty much not a thing, except for tinygo, which is not really Go, though it is Go (it has more controls for memory management, but still lets you do all the nice interface/dynamic array stuff, and Go's concurrency can even operate on a single thread)
Also, as you get into it, you'll hear the term "patterns". TLDR: patterns are the way you use the language and standard libraries to perform operations, usually in an engineered way - such as the way you store variables, call functions, lay out your functions, store state, pass variables, and so on.

https://en.wikipedia.org/wiki/Software_design_pattern

New users heavily rely on the patterns documented in a language's articles and APIs as the intended way to use it. This also means there is a large cycle of influence on patterns between users and languages. PHP was intentionally designed to have very straightforward design patterns for an easy learning curve, at the sacrifice of mostly performance IMO, which is why Laravel and other "hacks" exist.

That said, C's design patterns are mostly made up by developers on the fly. Some classes teach "standard" patterns based on some well-known publishers, but it's such a free language that nearly every C project you look at has a totally different design pattern. It's very anarchistic and I love every ounce of it. I think this is part of why the "unsafe" moniker sticks to C so easily: the language doesn't enforce any sort of safe patterns by design IMO. That's where things like C++ and its standard library come in. Nearly every proficient C developer has a blog post on their website expressing the ideal design patterns they expect the world to adopt, and I've taken influence from them, but no two C libraries or executables are written the same :)

I started in electrical engineering, so C was the first logical language we learned, besides VHDL and other hardware description languages, and assembly of course. My brain loosely works in assembly, and C is a pretty close logical mapping, so it can be rather simple to predict what the compiler will do with your code.