It’s not mentioned in the article, but I also think that learning how to debug and observe the behavior of C programs is useful, given the limited ability to catch issues at compile time and the high likelihood of code weaknesses and errors in C. Learning not just a traditional debugger like gdb or lldb, but also at least being introduced to tools like perf, strace, and ideally more powerful things like dtrace or bpftrace, would be very helpful.
In my own Masters program we had a class on software systems which heavily featured both practical coverage of C and lots about debugging and observability. One of the most aggressively _useful_ classes I’ve taken.
This sounds like a class that was sorely missing in my CS education; looking back at what I was taught versus what skills have been useful in the industry, my university courses always seemed to treat debugging, profiling, etc. as irrelevant details. Taking time to improve my understanding of them has paid off quite well in my career, I think.
Years ago, I was a teaching assistant at my alma mater's introductory programming course, which taught C. While debugging was not officially part of the curriculum, we made sure to offer a written, step-by-step tutorial with screenshots for people on how to do debugging using our officially-suggested development environment (which was Borland C by the way). We would explicitly mention the importance of debugging when introducing the first homework assignment, and refer students to the tutorial.
Yes, the poor way this is taught is one of the reasons why so many people are stuck in the stone age of printf debugging.
Already in the mid-1990s it was possible to use C debuggers as a poor man's REPL, set tracepoints, script debugging sessions, visualize data structures, and track down memory leaks and corruption.
One thing I would try to do (in addition to the things mentioned in the article) is to delay the introduction of heap allocation and pointers as much as possible, and start with a purely "value based" code style which passes and returns structs by value, making heavy use of C99 features like compound literals and designated initializers. The result is a much saner and 'friendlier' C.
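A minimal sketch of that style (Vec2 and vec2_add are made up for illustration):

    #include <stdio.h>

    /* No heap, no pointers: structs are passed and returned by value. */
    typedef struct { float x, y; } Vec2;

    Vec2 vec2_add(Vec2 a, Vec2 b) {
        /* compound literal with designated initializers (C99) */
        return (Vec2){ .x = a.x + b.x, .y = a.y + b.y };
    }

    int main(void) {
        Vec2 v = vec2_add((Vec2){ .x = 1 }, (Vec2){ .x = 2, .y = 3 });
        printf("%g %g\n", v.x, v.y);
        return 0;
    }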
Also ignore the stdlib as much as possible, especially the string functions.
The C stdlib APIs are essentially ancient leftovers from the K&R and early C89 era and should have been modernized when C99 came around. Especially for newbies, paying too much attention to the stdlib may ingrain bad habits.
The stdlib is mostly fine. The string and character classification functions are outdated because they aren't Unicode aware. There's a more detailed write up here [1].
It's not just the missing UNICODE support (although that's probably out of scope for the stdlib, a couple of UTF-8 aware helper functions would be good enough), but footguns-in-waiting like strcpy(), strncpy(), strcat(), strncat(), ...
All the wide string stuff (wchar.h and wctype.h), as well as locale.h, is pointless today.
Everything related to filesystems and file IO is just the bare minimum that's acceptable for simple UNIX-style command line tools, but not really useful in modern applications (most notably, any support for non-blocking IO and memory-mapped files is missing).
Arguably, complex.h doesn't even belong in the stdlib (I wonder why such an esoteric feature was even considered).
Better support for custom allocation strategies would be nice (e.g. each function which allocates under the hood should accept an allocator argument instead of being hardwired to malloc).
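A sketch of what that could look like (the allocator struct and dup_bytes are hypothetical, not a proposed standard API):

    #include <stddef.h>

    typedef struct allocator {
        void *(*alloc)(void *ctx, size_t size);
        void (*release)(void *ctx, void *ptr);
        void *ctx;    /* arena, pool, or whatever backs the allocator */
    } allocator;

    /* A library function that allocates 'under the hood' would take
       the allocator explicitly instead of hardwiring malloc: */
    char *dup_bytes(const char *src, size_t n, allocator *a) {
        char *p = a->alloc(a->ctx, n);
        if (p)
            for (size_t i = 0; i < n; i++)
                p[i] = src[i];
        return p;
    }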
I'm not even asking for a more feature-rich stdlib; a stdlib2 with the obsolete parts removed and modernized IO and memory management functions would go pretty far. But then of course there's the problem that the C stdlib is also accidentally the de-facto system API on some UNIXes.
There isn't a proper universal solution unfortunately apart from looking for 3rd-party-libs which provide better abstractions over the underlying OS APIs (or writing those yourself).
The thing that really got me into this subject was building my own virtual machine and then a compiler to turn a high level language into machine code.
It sounds complicated but the VM, in Python, is basically:
    r = {}          # registers
    mem = []        # flat memory holding the program
    boot(mem)       # boot() is assumed to load a program image into mem
    pc = 0
    while True:
        op, args = decode(mem[pc])  # decode() splits opcode and operands
        pc += 1                     # (Python has no pc++)
        if op == "+":
            d, a, b = args
            r[d] = r[a] + b
        elif ...:
            ...
The “source” which gets “compiled” can also just be Python function calls that emit op codes — no need to get bogged down with lexing and parsing!
The thing you focus on is the semantics of representing a function call (and return) in machine code and how to manage the stack. It made C make a lot more sense e.g. defining locals up front so the compiler knows how much stack space to use. I wish I could remember more — this was all from teaching A-Level Computer Science a few years ago.
(note: this command places the assembly and C code side by side; I think you have to compile with the -g debug flag in gcc)
You can work out that the assembly implements the semantics of C's virtual machine model, and it is this that I find easier to understand than some algebraic systems. I kind of think of struct->ptrarray[indx]->struct.something as corresponding to the calculation of, ultimately, a single memory address, even if there are various shift-lefts or adds or ors along the way.
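A rough sketch of that idea (the struct layout here is invented to mirror the expression above):

    struct Inner { int something; };
    struct Outer { struct Inner **ptrarray; };

    int get(struct Outer *s, int indx) {
        /* The whole chain boils down to one address computation:
           load s->ptrarray, add indx * sizeof(pointer), load that
           pointer, add the offset of 'something', then load. */
        return s->ptrarray[indx]->something;
    }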
Java has its template interpreter, which is interesting to read about, and there are copy-and-patch JIT compilers.
And that capability is not remotely unique to C - look at the list of languages Godbolt supports! Not all of them support linking source code to assembly but a large number do: C++, Rust, Zig, Ada, even Dart.
After already programming for many years, the book "C: Interfaces and Implementations" by Hanson still showed me some new tricks for writing libraries in a more bullet-proof way, and I have mentioned or used it in every C course I have taught ever since.
The book contains some production-quality data type implementations that aspiring C programmers ought to read.
When I studied, C was the first language we learned. I think it was a great way, as we'd learn about memory management, how to do things from "scratch", and since there weren't many batteries included, we kind of just implemented data structures and algorithms as we went on.
It also primed us for more involved classes, like operating systems, compilers, databases, and so on.
But I have to say, it is a language that students really need to fundamentally know if it is going to be used down the road. Sounds obvious, but when I TA'd in more advanced classes (i.e. anything after "introduction to programming"), many of the students who were struggling were really just struggling with the language, not necessarily the CS concepts. A lot of them were even students who had programming experience prior to enrolling, but had mostly used higher-level languages.
When I first learned C I was a teenager coming from x86 assembly, and before that I was only involved with basic scripting. I had a base understanding of how computers worked at the lowest level, and had familiarized myself with ROP and how that works, before moving on to C, which was a breath of fresh air. I will say that learning assembly so early on was rough, but it really helped with C. I followed the same path that a lot of earlier men and women went through: a lot of individuals back in the day learned a variation of BASIC, then moved on to assembly or C (often both) when things were just too slow for their liking.
Going through college I saw exactly what you are saying when it comes to struggling with the language. However, with most of the individuals I ended up assisting on top of my own studying (one-on-one tutoring, if you will), the most glaring thing I noticed was not just a lack of understanding of how the language worked but, more importantly for our discussions, that they didn't seem to understand the fundamentals of computing: how memory is addressed, what exactly pointers are, how dereferencing actually works, heap vs. stack, passing arguments to functions and returning values, etc. Once I walked them through the basics, they seemed to understand that while they couldn't yet work effectively at the lower levels, they would eventually begin to intuit things for themselves much more easily.
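A tiny generic illustration of those fundamentals in C (not from the original discussion, just a sketch):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int on_stack = 41;           /* lives in this function's stack frame */
        int *p = &on_stack;          /* a pointer holds the address of on_stack */
        *p += 1;                     /* dereferencing writes through that address */

        int *on_heap = malloc(sizeof *on_heap);  /* heap storage, must be freed */
        if (!on_heap) return 1;
        *on_heap = *p;               /* copy the value 42 into heap memory */

        printf("%p -> %d\n", (void *)p, *p);
        free(on_heap);               /* heap memory is released by hand */
        return 0;
    }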
I don't believe all of that is required to learn how to write code; however, if you know how a machine works, you tend to be able to figure out how to treat it or use it. Like with a manual transmission: if you watch visually how a clutch engages, you'll tend not to burn it as much while letting it do its job, since you know the fundamentals of how it works. It's no longer "abstract" or "too complicated to understand". Giving people that knowledge gives them the ability and the confidence to delve deeper and genuinely engage with new ways of doing the same thing they've been doing before.
Many people complain about teaching C leading to bad habits or something of that sort.
C is crucial to understanding OS APIs, driver code, and low-level algorithms; it's not locked to a single platform like assembly, and it allows composing fairly high-level abstractions without any hidden overhead. Writing your own libc functions and understanding how they work is essential to systems programming.
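For instance, a minimal sketch of rolling your own strlen (real implementations like glibc's or musl's layer word-at-a-time tricks on top of this idea):

    #include <stddef.h>

    size_t my_strlen(const char *s) {
        const char *p = s;
        while (*p)                /* scan until the terminating '\0' */
            p++;
        return (size_t)(p - s);   /* pointer difference is the length */
    }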
The problem is not teaching C; for better or worse it is going to stay around us for a very long time, thanks to UNIX's victory in the server room and embedded systems.
The actual problem is doing a half baked job, teaching it as if PDP-11 were still current, without any consideration for safe programming practices and modern tooling.
These bad habits you mention should be taken care of by general programming-concepts classes. If you're picking up a new language like C for OS development and you're reaching for goto and other frowned-upon methods, then you really didn't do the programming-concepts class very well.
The complaint is probably more about teaching C too early and getting fixated on that. The importance and eventual need for learning C is hardly questioned.
I wrote a lot of criticism of C here. I do want to say the K&R book was a great read when I finally did some hands-on work with C. Credit to Victor Yodaiken for recommending it.
The explanations were well-done. They gradually build up the examples. They actually did warn about common gotchas even then. Their examples mostly worked today with only minor tweaks. I used it with Clang’s and MS Visual Studio’s static analyzers.
If you are a seasoned C programmer, helping GNU Hurd, for instance, with the Rump kernel add-on to add NetBSD driver support (please, no blobs) would be a good use of that skill.
Yep, C is the Unix and Unix-like ABI, and an odd form of C underlies 9front/plan9's ABI/API too, but this makes the userland much more powerful than the typical permission-isolated Unix system, where the user has nearly no capabilities, not even for mounting non-external media (by default).
> A stronger tool such as the Clang static analyzer should also be used.
While also teaching students that static code analyzers are very far from infallible and can waste a great deal of developer time with false positives.
It happens about once every month or two that someone posts on the sqlite forum, "I've found this serious bug..." and, after some back and forth, it's revealed that they used a static analyzer or, even worse, just pasted a single internal function from the sqlite core into ChatGPT and asked it to analyze the code for them (completely free of any calling context). Roughly 9 times out of 10, static code analyzers are flat out wrong in their analysis of that particular source tree. It's very rare that a report in sqlite which stems from a static code analyzer is actually correct.
The advice given to such posters is invariably: if you can demonstrate code which tickles the bug your tool is claiming to have found, it will be treated with priority. Until then... not so much.
Coverity used to suggest using strncpy instead of strcpy for a while, which, if you blindly follow it, leads to nasty errors. If someone wonders, snprintf was the correct alternative at the time; I have no idea if the C standard has anything better now.
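A quick sketch of the trap (bad_copy/ok_copy are made-up names for illustration):

    #include <stdio.h>
    #include <string.h>

    /* Blindly swapped-in strncpy: if src has 8 or more characters,
       it fills all of dst and leaves it with no '\0' terminator. */
    void bad_copy(char dst[8], const char *src) {
        strncpy(dst, src, 8);
    }

    /* The alternative mentioned above: snprintf always writes a
       terminator, truncating the copy if necessary. */
    void ok_copy(char dst[8], const char *src) {
        snprintf(dst, 8, "%s", src);
    }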
Wouldn't a better first question be _why_ we're teaching students C?
One answer might be that although students might spend most of their later professional lives in higher-level languages, they should have some experience in a low-level language.
The ideal such language would not have automatic memory management (so not Java / Python), no memory-management help from the type system (so no Rust), and no fancy facilities for building abstractions (so no Rust or C++). This language should, ideally, still be a little more pleasant to work with than assembly (so, no assembly). For historical reasons, we would still like unchecked array accesses and null-terminated strings.
Now C is an answer to this question, but it need not be the only answer. We could instead teach students a fictionalized version of real C: nice two's-complement behavior on overflow, buffer overflows that segfault, and other bad behavior, but mostly of the predictable kind. We could define undefined behaviors as the instructor sees fit.
As much as we can, we can try to make this fictionalized language agree with the semantics of gcc -O0. We can then study gcc -O0 -S itself as an empirical artifact, and use it to understand x86/x86-64/ARM/MIPS assembly.
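One classic illustration of the gap between real C and such a fictionalized dialect (exact behavior depends on compiler and flags):

    /* Signed overflow is undefined in real C, so at -O2 gcc/clang may
       fold this to 'return 1;'. At -O0 you typically get the two's-
       complement wraparound, so always_true(INT_MAX) returns 0. */
    int always_true(int x) {
        return x + 1 > x;
    }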
Finally, while we're at it, we can use the Bourbaki dangerous bend symbol (https://en.wikipedia.org/wiki/Bourbaki_dangerous_bend_symbol) to repeatedly remind students that what they're studying isn't a real language, that real C has sharp edges (including undefined behaviors), and that things like ASAN / Valgrind / ... exist.
I worry that this article is confusing what we'd like students to eventually understand (some understanding of C and an appreciation of the fact that it is an important language which still needs to be handled with care) with the didactic process of reaching that understanding. Wittgenstein's ladder is a thing.
If we went to the trouble of making a safe version of C - and we would have to actually implement it; you can't very well teach students to program without letting them actually compile the code and run it - I'd rather just... use that. Call it C24, get it into GCC and Clang, and standardize on something with less Undefined Behavior and painful sharp edges.
I feel like you're trying to merge computation as a science with an idealized language choice and I don't think it would serve students well.
The "sharp edges" of C in either user mode or kernel mode when running on an x86 or arm platform are true implementation semantics that you must deal with while trying to realize genuine "computation."
Programming languages are just collections of human shorthand for managing these semantics. Which makes me feel that inventing this language would be putting the cart before the horse.
>> adding some high-quality C code, perhaps starting with Redis, Musl, or Xv6.
Agree. Xv6 is the reason why I used C. I learnt more about various OS concepts (process scheduling, file system etc) by hacking Xv6 code. Linux is too complicated for my taste.
As a mobile app dev, I don't use C. Well... I do, very occasionally though, and only for interacting with specific Android hardware.
I was an experienced asm programmer before a friend loaned me a copy of K+R C. I understood C immediately. I imagine trying to learn C without knowing assembler would be a much tougher road.
Knowing C also helps in understanding a lot of other languages.
The first thing you learn with assembler is pointers and addresses, so I don't know how hard they would be to pick up coming from other languages.
But I did first learn to program in Basic and Fortran, which gave zero insight whatsoever when subsequently learning assembler. I was pretty baffled by it for a while, until I suddenly "got it".
It helped that on 8-bit home computers BASIC was the OS and systems language, exposing the whole hardware, including pointer-like mechanisms, via PEEK, POKE, DATA, USR, PTR, among others.
Diving into Z80 was a matter of performance, not additional features.
In my opinion the implementation of glibc (there are a bunch of good reasons for the “weirdness” and a whole lot of gcc-isms), and even the linked example from musl (with its somewhat confusing #ifdef block), belong in some kind of “Systems Programming” course, not in an introductory course on C.
> I’ve always taught C as a side effect of teaching operating systems, embedded systems, or something along those lines....One might argue that we shouldn’t be teaching C any longer, and I would certainly agree that C is probably a poor first or second language.
So this would be an upper-level class: an introduction to C programming for people with Python/Haskell/etc experience and a decent general understanding of computer science.
I love C. It's the language that makes me the most happy. I'm not entirely sure why; I think I like being in control. People always comment that C is unsafe, full of razors and knives flying around.
The thing to keep in mind is that *you* are the one throwing the razors and knives from the moment you go beyond printf("hello, world!\n");
I'd never recommend that a company start building anything in C. In terms of the various team sizes and staff churn over the lifetime of a company, it's too risky.
But afaik most companies, particularly the growth-hacking ones, are either Go or Kotlin or at that level of abstraction. Rust is slowly eating at C++, and I'm pretty sure no company in their right mind today starts a new product in C++. It will slowly become COBOL.
C on the other hand still has new C written every day. I'm a polyglot and I love C, but I never had a job in C. I hope some day that happens. The modern tools aforementioned, e.g. static analysis, clangd, are really good.
The thing I love most about C is that there is a direct relationship between me doing something stupid and feeling pain for it. In python, I can write crap all day without feeling it. Some day I'll feel it, but not today!
I like small binaries and fast code. I like that when a developer tells me I did something wrong, they can prove it, and it's not some stupid high level language "the more you knoooow". I like control over things.
> I'm pretty sure no company in their right mind today start a new product in C++
I say this as someone who loves rust and zig, but these types of statements make me feel like HN is way out of touch with the industry. New products are constantly made in C++. That doesn't make it good (and i don't think companies building in C++ are making the best choices), but to say that no big tech company is writing any new code in C++ is just not correct. I see this happen every day at my job.
I also would definitely say more C++ code is written every day than C, although I'll say I am not as familiar with the embedded world. But I know the HFT/gaming/robotics firms will have millions of new C++ lines every day.
Agreed, I also like writing and especially reading (good) C. The Linux kernel, Wine, and FFmpeg are a joy to read, and much more approachable than one would think.
I get exhausted by people denouncing C in favor of the language du jour, and suggesting it's been obsoleted by newer and especially more abstract languages. A hammer and nails are still useful for specific jobs, even if you have screws.
All tools have their trade offs. Even when you have a battery powered impact driver, sometimes you still need to pick up a basic hammer.
In my opinion, companies should consider C for new projects. Many people know it, the tooling is excellent, there is long-term stability, and compile times are fast. With other languages you run into constant problems with tooling or compatibility issues. Also, in my experience, average and inexperienced programmers can be very productive in C. Even if the code is not perfect, it can often be improved incrementally, where in some other languages I sometimes decided to throw it away because there were abstractions that did not make sense and everything depended on them. But one absolutely needs to adopt good coding guidelines, modern tools, and CI, and avoid low-level pointer fiddling or string manipulation (which can be abstracted away in C as well as in other languages). Some of the most reliable and powerful software I use daily is written in C.
I'm in the same boat. It took me a while to accept that I am a C programmer at heart; it suits me the best. I feel like I'm in full control and I can move fast with it. Not all razors are on us, though: especially if you move around between compilers, there are undefined-behaviour corners here and there, and it's easy to throw knives around.
I also wouldn't recommend starting a new thing with it if the whole team weren't well-versed in it. On the other side of the spectrum, today's world is well served by Python, C++ and Rust (that battle is ongoing; I like both), Java, heck even JS. There's something special about C though, and I've yet to see something take its place. Closest was D1, but that came and went.
> staff churn over the lifetime of a company, it's too risky.
Experience says otherwise. C code bases have serious longevity due to how many people know it and can jump in and contribute. There just aren't that many patterns and abstractions.
You and me both. I'm also a polyglot, but I originally started on interpreted languages like PHP and Python. I then learned some C, which was quite frustrating before I learned how to reason about memory ownership and hold myself to some idioms. Oh, and Valgrind.
I then rewrote a bunch of projects in Rust, and while it led to correct and working software, it didn't spark the joy that C did for me. I don't exactly know why, and at times I almost feel ashamed to mention this. I do hope there's a future where there's a version of C with some more substantial changes/improvements though, perhaps taking a lesson or two from Rust or Zig (e.g. a string type with a length).
Yes, this, most modern languages embrace this idea of Optional. I believe OCaml might have invented it (I love OCaml too!). Odin has it I think, Rust obviously, and Swift heavily as well.
What's interesting is that modern C style is promoting this move as well. Don't just return an integer; return a struct result_t with a fail bool or error bool in it, as opposed to some const char* pointing to null. Do this more and more and your C code starts becoming a lot more digestible and modern (although common sense still applies: don't go overboard with these constructs in C, but you can set up a nice contract-style API design within your code base).
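A minimal sketch of that pattern (result_t and parse_port are hypothetical names, not from any particular codebase):

    #include <stdbool.h>
    #include <stdlib.h>

    typedef struct {
        bool ok;             /* false => 'error' says what went wrong */
        int value;
        const char *error;
    } result_t;

    result_t parse_port(const char *s) {
        char *end;
        long v = strtol(s, &end, 10);
        if (*s == '\0' || *end != '\0')
            return (result_t){ .ok = false, .error = "not a number" };
        if (v < 1 || v > 65535)
            return (result_t){ .ok = false, .error = "out of range" };
        return (result_t){ .ok = true, .value = (int)v };
    }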
> I'm a polyglot and I love C, but I never had a job in C. I hope some day that happens. The modern tools aforementioned, e.g. static analysis, clangd, are really good.
If you had a job in this you’d know that they are not ;)
I have been programming in C professionally for the past ~4 years on a multi-threaded application in the ads industry that is central to the business. C has a lot of documented "foot-guns", but most (if not all) of them can be safeguarded against with "support structures" around the codebase: a standard code style (enforced via tools as much as possible), static code analysis (via the compiler and external code-scanning tools), dynamic code analysis (via ASan, UBSan, etc.), and small, medium, and large tests with the right amount of code coverage. In addition, having a standard set of libraries (C modules for high-performance data structures/algorithms, macro-based templates to work with types) and a standard threading model goes a long way toward reducing the pain to a bare minimum. On the plus side, you enjoy a "simpler" language syntax that is easy to learn, and with the above "support structures" one can become productive in no time. Plus there are newer books on the market that can teach you C properly; one I have read and recommend is "Effective C" by Robert C. Seacord. The author is one of the C standards committee members, so you can't go wrong with the choice.
This has a bunch of "But what could possibly replace C?" type comments and the author (writing in 2016) replies Rust. So that's somebody with their eyes on the ball.
They also point out that C is a bad first language. This shouldn't be relevant to many Computer Science courses, but I have seen too many Electronics students who are taught C first, or indeed as their only language.
I’m actually of the opinion that C should be taught hand in hand or just after someone learns assembly. It’s admittedly a bit gatekeeper-y, but IMHO C should be thought of with same mindset as you would develop an ASM program, but with a bunch of shortcuts and syntactic sugar.
Without understanding what code gets generated, there are so many footguns it can be dangerous at best. Knowing things like calling conventions and how the compiler interacts with memory are really important.
I agree. I learned this way, and, for example, wrapping my head around pointers was much easier after I already understood why LEA existed. Yes, C isn't portable assembler, and no, you can't necessarily predict what ASM will be produced from a particular snippet of code. However, C operates closer to "portable assembler" than any other programming language, and it's a lot harder to go from anything else to C than it is to go from assembly to C
I'm of the opposite opinion. The portable-assembler camp is responsible for a lot of the issues and footguns, because they think they know what code will be generated. But they don't. Compilers can generate any code they want, as long as the program ends up doing the correct io and side effects.
The assembler perspective is important, but not for correctness, rather for optimization. When you optimize it starts mattering how much register pressure and cache pressure etc you have.
I do resonate with your comment a lot, but the portable-assembly aspect also drove much of the current C use. As a compromise, we could teach two quite different architectures and ask students to write a single program that runs on both. This way they can avoid false beliefs like `sizeof(int) == 4` and hopefully still learn the common aspects of most architectures that shaped C's design.
I'm not saying they need to understand exactly what code gets generated; compiler optimizations can produce interesting results. What I am saying is that knowing an if statement becomes, in assembly, a compare and a jump, i.e. knowing what happens under the hood, is kind of important for people writing C in 2024.
Like I guess the point I have to make is that if you are writing C in 2024, there is likely a good reason, and if you don't know what's going on in the assembler, I feel like people are playing with fire.
I think that's the misleading perspective the grandparent comment is referring to. An if statement in C isn't necessarily a branch. The generated assembly might have no branches if the branch is eliminated, it might have multiple branches, and the compiler might even insert a function call (e.g. with -fharden-conditional-branches). You can't know just by looking at the source code alone.
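For example (a sketch; whether a branch appears depends entirely on the compiler and flags):

    /* At -O2, gcc and clang will often compile this to a branchless
       conditional move (cmov on x86) rather than a compare-and-jump. */
    int max_of(int a, int b) {
        if (a > b) return a;
        return b;
    }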
I've found that I'm much less accurate when writing code for non-GCC/Clang compilers because my mental model of what's going to be generated isn't accurate enough without the years of experience I've had looking at the outputs of those specific compiler families.
While true, compared to some of the other languages C code is still relatively close to source, e.g. it will not completely change your data types behind your back etc.
C++ is usually better because you can start with the abstractions (e.g. new, classes, I/O streams, and strings) and then work each one backwards once they’re comfortable at the high level.
(Although C++ wouldn’t be my first choice to begin with for a first language)
Agreed, C++ is a language riddled with accidental complexity, which hinders the learning process. The fact that its convoluted and ever-changing syntax leads to error messages that are indecipherable even for computer science experts alone makes it a poor choice for learners.
Someone who knows nothing would benefit from Pascal a lot even in 2023, even if the practical relevance of the language nowadays is nil (not due to lack of merit, but due to social dynamics, as a commenter on the OP's original blog rightly said) - by the way, R.I.P. Niklaus Wirth. I guess it would be a bit like studying Latin or classical Greek to learn about grammar.
Python is "conceptually worse" than Scheme and Pascal - for learners at an academic level, IMHO - but of course practically more useful/valuable, from an industry point of view.
C gives you too many guns to shoot yourself in the foot. Beginners deserve a language with a helpful compiler, slightly more hand-holding, as well as more structure and convention to encourage a maintainable coding style.
You can read about how experts write their C code (like https://nullprogram.com/blog/2023/10/08/) but you aren't going to appreciate why they decide to do this. Indeed, beginners need to be able to blindly follow rules before they can critique them or invent their own coding styles.
The biggest problem with C is that it lets you do anything, including things you don't want to do and should not do.
"The programmer knows best" means you can easily write to arbitrary memory addresses, smash the call stack, allocate memory and never free it, etc. and as long as the syntax is correct, your program will compile.
Sometimes, in interesting low-level code, embedded code, or similar, this unrestricted behavior is needed: you write seemingly arbitrary values to a memory address because it is memory-mapped hardware and that is where the control register receives instructions. But in most application-level code, it is bad behavior and just causes segment violations.
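A sketch of that memory-mapped case (the address and bit position are invented for illustration):

    /* Writing a 'seemingly arbitrary' value to a fixed address,
       because that is where a hardware control register lives. */
    #define CTRL_REG (*(volatile unsigned int *)0x40021000u)

    void enable_peripheral(void) {
        CTRL_REG |= 1u << 3;   /* set a hypothetical enable bit */
    }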
"The programmer knows best" is the primary cause of security flaws in C (and C++ because of its heavy compatibility with C) since even experienced C programmers make mistakes or find their code being used in unanticipated ways.
As a first language, a lot of students face major stumbling blocks when they deal with pointers and manual memory management. It's hard enough learning how to program for the first time; I remember learning QBASIC as a ten-year-old from a textbook and online tutorials. Over twenty years later I had experience teaching introductory Python to absolute beginners, and they have to learn how to convert problem statements into code. They have to learn how simple constructs such as loops and functions work. The course would have been much more difficult for my students if I had needed to teach them about pointers and manual memory management. I still remember when I first encountered C in high school, and I struggled with segmentation faults due to my inadequate understanding of how pointers worked. It wasn't until my sophomore year of college, when I took a computer organization course that used assembly, that I finally understood pointers. It was then that I gained a better understanding of how memory worked and how C's pointer syntax directly translated to assembly.
I admit that I still have a soft spot for C; once I finally understood pointers I did most of my projects in C; it helped that I had (and still have) a love for systems programming. To this day I can write C in my sleep even though it's been two years since I've last written a significant amount of C. But in grad school I got bit hard by the Lisp and Smalltalk bugs....I went from a big Bell Labs fan to a Xerox PARC fan, and in my professional career I've been largely coding in Python for the past five years since it's now the lingua franca of machine learning.
But I wouldn't recommend C as a first language; I feel it's too much for absolute beginners. I'm torn between Python and Scheme; my feelings right now is that Python is a good introductory language for helping people gain programming experience and allowing students to build interesting things using Python's extensive libraries, while Scheme is an excellent vehicle for teaching how programming languages work at a high level; I have a soft spot for The Structure and Interpretation of Computer Programs (this was the introductory CS textbook at MIT from the 1980s to the late 2000s when MIT switched to Python) and I used it as part of an upper-division course on programming language principles and paradigms at a university where Java is the introductory language.
C is a very pragmatic language for writing software on a PDP-11 fifty years ago. You aren't doing that, so every single place where C made compromises to facilitate that you're paying a price for something you don't need or even want.
For example you want fat pointers, particularly for slice types. On a PDP-11, spending two registers for these types is extravagant; today this seems ridiculous, but C still doesn't provide any fat pointer types. So either you have to roll your own (and with them libraries of code to use them) or put up with whatever was a good idea in the 1970s. The most famous slice types are Rust's &str and C++'s std::string_view, a fat pointer for referring to text: strings, in other words, but not the mutable, owned, auto-growing strings you might associate with higher-level languages, just the simple concept of some text. C can't do that. What C gives you is a pointer to a byte, pointer arithmetic, and a stern admonition to stop when you reach a byte with a zero value... or you can roll your own slice types.
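A sketch of such a roll-your-own slice (str_slice and the helpers are invented names):

    #include <stddef.h>
    #include <string.h>

    typedef struct {
        const char *ptr;   /* the two words of the fat pointer */
        size_t len;
    } str_slice;

    str_slice slice_from_cstr(const char *s) {
        return (str_slice){ .ptr = s, .len = strlen(s) };
    }

    str_slice slice_sub(str_slice s, size_t start, size_t end) {
        /* callers must keep start <= end <= s.len; nothing checks it */
        return (str_slice){ .ptr = s.ptr + start, .len = end - start };
    }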
I don't know. C has worked perfectly fine for me for various tasks and projects over the years, and I'm not alone, so your hyperbole is off here. Granted, I probably wouldn't choose C if safety were one of the biggest issues and it weren't a small project, but the blame C gets just for being an honest language, one which indeed gives you power and the responsibility that comes with it, is usually laughable. C also gives you simplicity, which I find delightful. I guess some big projects can indeed profit from complex features and build systems, but for smaller stuff it's so easy to just write a simple C program to do it.
You really should be using a tool correctly and for the correct use cases, otherwise the critique is meaningless at best. The fact that the 100% sensible and nonoffensive parent comment is now "flagged" just because folks don't agree with it is just ridiculous.
Their approach is wrong. You don't say at a university level that you're going to teach C in a class; that's not what university-level courses are about. If you're teaching programming applications and techniques, then use a relevant language which supports those concepts. Learning a language should be a by-product of the university course, not the fundamental reason for it. You learn C as part of any of a myriad of courses, such as networking, but then it is simply a lecture or two and students need to get up to speed themselves.
https://nostarch.com/Effective_C
https://www.packtpub.com/free-ebook/extreme-c/9781789343625
https://www.oreilly.com/library/view/fluent-c/9781492097273