This has a bunch of "But what could possibly replace C?" type comments and the author (writing in 2016) replies Rust. So that's somebody with their eyes on the ball.
They also point out that C is a bad first language. This shouldn't be relevant to many Computer Science courses, but I have seen too many Electronics students who are taught C first, or indeed as their only language.
I’m actually of the opinion that C should be taught hand in hand with assembly, or just after someone learns it. It’s admittedly a bit gatekeeper-y, but IMHO C should be approached with the same mindset you would use to develop an ASM program, just with a bunch of shortcuts and syntactic sugar.
Without understanding what code gets generated, there are so many footguns that the language is dangerous at best. Knowing things like calling conventions and how the compiler interacts with memory is really important.
I agree. I learned this way, and, for example, wrapping my head around pointers was much easier after I already understood why LEA existed. Yes, C isn't portable assembler, and no, you can't necessarily predict what ASM will be produced from a particular snippet of code. However, C operates closer to "portable assembler" than any other programming language, and it's a lot harder to go from anything else to C than it is to go from assembly to C.
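To give a rough idea of what I mean (just a sketch; whether the compiler actually emits an LEA here depends entirely on the target and optimization level):

```c
#include <stdio.h>

int main(void) {
    int a[8] = {0};
    int *p = &a[3];   /* &a[3] is "base address of a, plus 3 * sizeof(int)";
                         on x86 that address computation is exactly the kind
                         of thing a compiler may fold into a single LEA */
    *p = 42;
    printf("%d\n", a[3]);   /* prints 42 */
    return 0;
}
```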
I'm of the opposite opinion. The portable-assembler camp is responsible for a lot of the issues and footguns, because they think they know what code will be generated. But they don't. Compilers can generate any code they want, as long as the program ends up doing the correct I/O and side effects.
The assembler perspective is important, but not for correctness; it matters for optimization. When you optimize, it starts to matter how much register pressure, cache pressure, etc. you have.
I do resonate with your comment a lot, but the portable assembly aspect also drove much of the current C use. As a compromise we may teach two quite different architectures and ask students to write a single code that runs in both. This way they can avoid false beliefs like `sizeof(int) == 4` and hopefully still learn common aspects of most architectures that shaped C's design.
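For example, an exercise along these lines (just a sketch) makes the assumption visible instead of letting students bake it into their code:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* these sizes differ between, say, a 32-bit microcontroller and a
       64-bit desktop, so print them rather than assuming sizeof(int) == 4 */
    printf("int:   %zu bytes\n", sizeof(int));
    printf("long:  %zu bytes\n", sizeof(long));
    printf("void*: %zu bytes\n", sizeof(void *));

    /* when a fixed width is genuinely required, say so explicitly */
    int32_t exactly_32_bits = 0;
    (void)exactly_32_bits;
    return 0;
}
```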
I'm not saying they need to understand exactly what code gets generated; compiler optimizations can produce interesting results. What I am saying is that knowing, for instance, that an if statement in assembly is a compare and a jump... knowing what happens under the hood is kind of important for people writing C in 2024.
Like I guess the point I have to make is that if you are writing C in 2024, there is likely a good reason, and if you don't know what's going on in the assembler, you're playing with fire.
I think that's the misleading perspective the grandparent comment is referring to. An if statement in C isn't necessarily a branch. The generated assembly might have no branches at all if the branch is eliminated, it might have multiple branches, and the compiler might even insert extra checks or calls (e.g. with -fharden-conditional-branches). You can't know just by looking at the source code alone.
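A classic illustration (just a sketch; what actually gets emitted depends on the compiler, flags, and target):

```c
/* At -O2, GCC and Clang will often compile this without any branch at all,
   using a conditional move (cmov on x86, csel on AArch64) rather than a
   compare-and-jump. Whether that happens is the compiler's call, not yours. */
int max_int(int a, int b) {
    if (a > b)
        return a;
    return b;
}
```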
I've found that I'm much less accurate when writing code for non-GCC/Clang compilers because my mental model of what's going to be generated isn't accurate enough without the years of experience I've had looking at the outputs of those specific compiler families.
While true, compared to some of the other languages, what C compiles to is still relatively close to the source, e.g. the compiler will not completely change your data types behind your back.
C++ is usually better because you can start with the abstractions (e.g. new, classes, I/O streams, and strings) and then work backwards through each one once the learner is comfortable at the high level.
(Although C++ wouldn’t be my first choice to begin with for a first language)
Agreed. C++ is a language riddled with accidental complexity, which hinders the learning process. The fact that its convoluted and ever-changing syntax leads to error messages that are indecipherable even for computer science experts is, on its own, enough to make it a poor choice for learners.
Someone who knows nothing would benefit a lot from Pascal even in 2023, even if the practical relevance of the language nowadays is nil (not due to lack of merit, but due to social dynamics, as a commenter on the OP's original blog rightly said) - by the way, R.I.P. Niklaus Wirth. I guess it would be a bit like studying Latin or classical Greek to learn about grammar.
Python is "conceptually worse" than Scheme and Pascal - for learners at an academic level, IMHO - but of course practically more useful/valuable, from an industry point of view.
C gives you too many guns to shoot yourself in the foot. Beginners deserve a language with a helpful compiler, slightly more hand-holding, as well as more structure and convention to encourage a maintainable coding style.
You can read about how experts write their C code (like https://nullprogram.com/blog/2023/10/08/) but you aren't going to appreciate why they decide to do this. Indeed, beginners need to be able to blindly follow rules before they can critique them or invent their own coding styles.
The biggest problem with C is that it lets you do anything, including things you don't want to do and should not do.
"The programmer knows best" means you can easily write to arbitrary memory addresses, smash the call stack, allocate memory and never free it, etc. and as long as the syntax is correct, your program will compile.
Sometimes, in interesting low-level code, embedded code, or similar, this unrestricted behavior is needed: you write seemingly arbitrary values to a memory address because it is memory-mapped hardware and that is where the control register receives instructions. But in most application-level code, it is bad behavior and just causes segment violations.
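In that embedded setting the code looks something like this (the register address and bit position are made up for illustration; real values come from the chip's datasheet):

```c
#include <stdint.h>

/* hypothetical memory-mapped control register at a made-up address;
   volatile tells the compiler every access really has to happen */
#define CTRL_REG (*(volatile uint32_t *)0x40021000u)

void enable_peripheral(void) {
    CTRL_REG |= (1u << 3);   /* set the (hypothetical) enable bit */
}
```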
"The programmer knows best" is the primary cause of security flaws in C (and C++ because of its heavy compatibility with C) since even experienced C programmers make mistakes or find their code being used in unanticipated ways.
As a first language, C presents major stumbling blocks for a lot of students when they deal with pointers and manual memory management. It's hard enough learning how to program for the first time; I remember learning QBASIC as a ten-year-old from a textbook and online tutorials. Over twenty years later I had experience teaching introductory Python to absolute beginners, and they have to learn how to convert problem statements into code. They have to learn how simple constructs such as loops and functions work. The course would have been much more difficult for my students if I had needed to teach them about pointers and manual memory management as well. I still remember when I first encountered C in high school and struggled with segmentation faults due to my inadequate understanding of how pointers worked. It wasn't until my sophomore year of college, when I took a computer organization course that used assembly, that I finally understood pointers; that's when I gained a better understanding of how memory worked and how C's pointer syntax directly translated to assembly.
I admit that I still have a soft spot for C; once I finally understood pointers I did most of my projects in C; it helped that I had (and still have) a love for systems programming. To this day I can write C in my sleep even though it's been two years since I've last written a significant amount of C. But in grad school I got bit hard by the Lisp and Smalltalk bugs....I went from a big Bell Labs fan to a Xerox PARC fan, and in my professional career I've been largely coding in Python for the past five years since it's now the lingua franca of machine learning.
But I wouldn't recommend C as a first language; I feel it's too much for absolute beginners. I'm torn between Python and Scheme; my feeling right now is that Python is a good introductory language for helping people gain programming experience and build interesting things using Python's extensive libraries, while Scheme is an excellent vehicle for teaching how programming languages work at a high level. I have a soft spot for Structure and Interpretation of Computer Programs (the introductory CS textbook at MIT from the 1980s to the late 2000s, when MIT switched to Python), and I used it as part of an upper-division course on programming language principles and paradigms at a university where Java is the introductory language.
C is a very pragmatic language for writing software on a PDP-11 fifty years ago. You aren't doing that, so every single place where C made compromises to facilitate that you're paying a price for something you don't need or even want.
For example, you want fat pointers, particularly for slice types. On a PDP-11 spending two registers on such a type is extravagant; today that concern seems ridiculous, but C still doesn't provide any fat pointer types. So either you roll your own (and write libraries of code to use them) or put up with whatever was a good idea in the 1970s. The most famous slice type is Rust's &str or C++'s std::string_view, a fat pointer for referring to text: strings, in other words, but not the mutable, owned, auto-growing strings you might associate with higher-level languages, just the simple concept of some text. C can't do that. What C gives you is a pointer to a byte, pointer arithmetic, and a stern admonition to stop when you reach a byte with a zero value... or you can roll your own slice types.
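Rolling your own ends up looking something like this (a sketch of the idea, not anything the standard library gives you):

```c
#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* a home-grown fat pointer: pointer plus length, no terminating zero needed */
struct str_slice {
    const char *data;
    size_t      len;
};

static struct str_slice slice_from_cstr(const char *s) {
    struct str_slice sl = { s, strlen(s) };
    return sl;
}

int main(void) {
    struct str_slice s = slice_from_cstr("hello, world");
    /* %.*s prints exactly len bytes, so the slice never relies on a
       zero terminator being where you hope it is */
    printf("%.*s\n", (int)s.len, s.data);
    return 0;
}
```

And of course every library you want to pass such a slice to has its own incompatible version of the same struct, which is exactly the problem.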