Hacker News — nickpsecurity's comments

Which means policies that reverse that are immensely important. The process of offshoring our jobs and much of our I.P. took decades. Getting them back and rebuilding manufacturing support will take a long time, too.

Just need to make steady progress each year with incentives that encourage large leaps in progress.


lighttpd is awesome for a quick, local server on Ubuntu. One command installs it. You tell the firewall to allow it. Then, just move your files into the directory. Use a CDN, like BunnyCDN, for HTTPS and caching.
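For anyone who wants the concrete steps, here's a minimal sketch on Ubuntu (the package and ufw commands are the standard ones; the docroot path is the Debian/Ubuntu default, and `./site` is a made-up stand-in for your files):

```shell
# One command installs lighttpd; Ubuntu starts the service automatically.
sudo apt-get install -y lighttpd

# Tell the firewall to allow plain HTTP (the CDN terminates HTTPS).
sudo ufw allow 80/tcp

# Move your files into the default document root.
sudo cp -r ./site/* /var/www/html/
```

Then point a CDN such as BunnyCDN at the server's address for HTTPS and caching.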

It's not only easy: it runs (or ran) huge sites in production.


You don't think American companies raising hundreds of millions to ten billion for training models contributed to their model performance or market positions?

I think a pile of money and talent is largely the cause of where they're at.


I wanted that, too. Then, integrated with something like Synflow:

https://www.synflow.com/


So, many of these universities were taken over, in positions of power, by people promoting intersectionality, which also promotes systematic discrimination (e.g. DEI) against specific groups. That's a highly divisive philosophy with no proven benefits that's similar to Marxism, which killed 50 million people and wrecked countries. They did this while describing themselves as open-minded institutions committed to everyone's success.

In the degree programs, they forced these beliefs on students in "diversity" classes, rewarded those on their side, and canceled or limited people with differing views. Those who make it through the process are more likely to force it on others in government and business, which they often do. Worse, being federally funded means taxpayers are paying for students' indoctrination in intersectionality and the systematic discrimination it claimed to oppose.

Yeah, I want their funding cut entirely since they're already rich as can be. I also would like to see those running it take it back to what it used to be. That's a Christian school balancing character and intellectual education. Also, one where many views can be represented with no cancel culture. That is worth federal funding.

On top of it, how about these schools with billions in endowments put their money where their mouth is on social issues and start funding high-quality community colleges, trade schools, and Udemy-like programs everywhere? Why do they talk so much and take in so much money but do so little? (Credit to MIT for EdX and Harvard for its open courses.)


> people promoting intersectionality which also promotes systematic discrimination (eg DEI) against specific groups. That's a highly-divisive philosophy with no proven benefits that's similar to Marxism which killed 50 million people and wrecked countries

Just like all people connecting to "Kevin Bacon", and all Wikipedia pages' first links connecting to "Philosophy", every idea can be connected to mass murder if you're willing to manufacture enough links.

"Intersectionality" is a descriptive, rather than prescriptive, idea. It promotes nothing.


More like it's two philosophies with similar elements originating from places where both were taught. In both cases, those that believe in them try to force them on everyone in law, policy, etc. They've been doing that, too, so it isn't speculative.

There are also large groups pushing this stuff in businesses, forcing it on all employees, under the banner of ESG. That includes Blackrock and the World Economic Forum. There's billions of dollars behind forcing this stuff on America. Yet, we still see voters rebelling against it, like by electing Trump, because they don't want our country to keep being ruined.


> In both cases, those that believe in them try to force them on everyone in law, policy, etc. They've been doing that, too, so it isn't speculative.

I think it is speculative. I haven't seen this happen beyond a small number of isolated cases, that generally are met poorly within the organization where it happens.

To my observation, the association between "believing that intersectionality accurately describes the world today" and "attempting to force others to believe similarly" is about as strong as the association between "frequently voting Republican in the US since 2016" and "attempting to carry out a mass shooting".

Could you describe what you believe "intersectionality" to mean, as a philosophy?


> That's a Christian school

> That is worth federal funding.

... interesting.


You left off...

"Also, one where many views can be represented with no cancel culture."

...before "that is worth federal funding."

Such cherry-picking in ways that misrepresent what was said, also common in liberal media, is one reason distrust in liberal politics is at an all-time high. Put the truth of what others said side by side with your own position, like I did when I mentioned intersectionality alongside my counterpoint. See if your ideas stand up to scrutiny.


It's really not necessary since you had already invoked the notion that separation of Church and State isn't particularly important to your evaluation of what the government should fund. Everything else sort of falls by the wayside.


Which might also allow one to use tools that work on .NET bytecode. They include verification, optimization, debugging, and other transpilers. You might also get a grant or job offer from MS Research. :)


They are amazing machines designed for fault tolerance (99.999% reliability). The Wikipedia article below has design details for the many generations that were made. HP eventually bought them.

https://en.m.wikipedia.org/wiki/Tandem_Computers

I think it would be useful in open-source fault tolerance to copy one of their designs with SiFive's RISC-V cores. They could use a 20-year-old approach to dodge patent issues. Despite its age, the design would probably be competitive with, maybe better than, FOSS clusters on modern hardware in fault tolerance.

One might also combine the architecture with one of the strong-consistency DBs, like FoundationDB or CockroachDB, with modifications to take advantage of its custom hardware. At the local site, the result would be easy scaling of a system whose nodes appear to never fail. The administrator still has to do regular maintenance, though, as the system reports component failures that it works around.


I'd like to write Rust, receive its safety benefits (esp. the borrow checker), compile it to equivalent C, and then use C's tooling on the result. Why use C's tooling?

In verification, C has piles of static analyzers, dynamic analyzers, test generators (eg KLEE), code generators (eg for parsing), a prover (Frama-C), and a certified compiler. If using a subset of these, C code can be made more secure than Rust code with more effort.

There's also many tools for debugging and maintenance made for C. I can also obfuscate by swapping out processor ISA's because C supports all of them. On the business end, they may be cheaper with lower watts.

I also have more skilled people I can hire or contract to do any of the above. One source estimated 7 million C/C++ developers worldwide. There's also a ton of books, online articles, and example code for anything we need. Rust is very strong in that last area for a new language but C/C++ will maintain an advantage, esp for low-level programming.

These are the reasons I'd use Rust if I wanted C or C++ for deployment. Likewise, I wish there were still a top-notch C++-to-C compiler, to get the same benefits I described from C's tooling.


Rust is much easier to learn due to C/C++ books all being paid (even cmake wants you to buy their book) whereas Rust documentation is free. I bet more and more people are choosing to learn Rust over C/C++ for this reason, and the number of C/C++ devs will be decreasing.


What a weird take to me... C has DECADES of high-quality teaching material in the form of books and university courses, plenty of it freely available with a bit of searching.

And, if we discount the idea that "buying" a book is such a big hurdle, there are even more high-quality academic textbooks and papers to boot: anything from embedded development on the weirdest platforms, basic parsing, writing compilers, language design, and high-performance computing to algorithms, data structures, distributed systems, whatever!

Edit: I even forgot to name operating system design plus game programming, and of course the accompanying libraries, compilers, and build systems to cover all of those areas and use cases! Edit 2: networking, signal processing, automotive, learning about industry protocols and devices of any sort... If you explore computer science using C as your main language, you are in the biggest candy store in the world with regard to whatever you want to learn about, do, or implement.


> C has DECADES of high quality teaching material in form of books, university courses, plenty of which is freely available with a bit of searching.

Which means all that high quality teaching material is DECADES old. Rust development is centralised and therefore the docs are always up-to-date, unlike C/C++ which is a big big mess.


Being decades old does not make it out of date. Until a few years ago, the Linux kernel was written using C89. While it has switched to C11, the changes are fairly small such that a book on C89 is still useful. Many projects still write code against older C versions, and the C compiler supports specifying older C versions.

This is very different than Rust where every new version is abandonware after 6 weeks and the compiler does not let you specify that your code is from a specific version.


> This is very different than Rust where every new version is abandonware after 6 weeks and the compiler does not let you specify that your code is from a specific version.

Do you have any specific evidence? The Rust ecosystem is known for libraries that sit on crates.io for years with no updates but are still perfectly usable (backward-compatible) and popular. Projects usually specify their MSRV (minimum supported Rust version) in the README.



Are you asking for LTS releases? https://ferrocene.dev/en/


I was not asking for that. I was answering your question. You asked for evidence of rust releases being abandonware. I gave it to you. Someone else trying to ameliorate Rust releases does not change this reality.


Well, “abandonware” is a strange thing to call that, because nothing is actually abandoned.


Use language features not considered “stable rust” that are later discarded and you will learn it is abandonware very quickly. In any case, you asked for proof and now have it. You should be saying thank you instead of trying to argue.


I mean, thank you, but calling Rust abandonware just because it uses a rolling release model is misleading IMO. Also there's nothing wrong with unstable features being discarded, they're unstable.


The description is accurate compared to what you get from other more established systems languages.


> Being decades old does not make it out of date.

Right, the docs never get out of date if the thing they document never changes. Can you say the same about C++ though? I’ve heard they release new versions every now and then. My robotics teacher didn’t know ‘auto’ is a thing for example.


Both C and C++ release new versions. The compilers continue to support the old versions and people continue using the old versions (less so in the case of C++). Rust’s compiler drops the old version every time it has a new release.

There is no `-std=1.85` in rust 1.86. You do get `-std=c++98` in both g++ and clang++. A book on C or C++ is still useful even decades later since the version of C or C++ described does not become abandonware at some well defined point after release, unlike Rust releases.


I'm confused. Rust uses semantic versioning:

Given a version number MAJOR.MINOR.PATCH, increment the:

    MAJOR version when you make incompatible API changes
    MINOR version when you add functionality in a backward compatible manner
    PATCH version when you make backward compatible bug fixes
Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format.

What kind of versioning scheme does C/C++ use?


C and C++ are two different languages. They are versioned by years. Rust technically does not have versions. The rust tools have versions. Basically all versions of C are binary compatible with each other. I suggest you actually learn and use C rather than asking questions since you are never going to ask the right things to understand how it works without having firsthand experience.


> C and C++ are two different languages. They are versioned by years.

That sounds like Rust editions.


Only superficially. You cannot specify a past version of Rust where features existed that have since been removed by doing that. You also do not have a situation where two different incompatible languages are being accepted by the same compiler and as long as you specify which one is used, the compiler will properly compile code for it. For example, C will not accept headers with C++ exclusive features like namespaces and C++ will not accept headers with C exclusive features like variably modified types.

The only reason you see people grouping the two languages together is due to history. They are both derivatives of an ancient prestandard form of C called K&R C. They both have different standards committees who had different ideas about how to move forward from K&R C. The result is that C compilers were extended to support both, and that extension continues to this day despite the divergence between the two languages. The C standards committee accepted some of the changes the C++ committee made to K&R C, although the C++ standards committee does not reciprocate by accepting changes made by the C standards committee. This is making them increasingly different languages.

Try to spend time learning how other things work instead of posting replies that attempt to reinterpret everything people tell you through a Rust lens whenever someone is kind enough to answer your questions like you are doing here. It is like asking people how Chinese works and then telling them “English does it this way”. The “nothing matters but <insert language here>” mentality that causes that is going to annoy a large number of people from whom you would otherwise be able to learn.


Auto as it is now has been in C++ since C++11; that's more than a decade ago...

If your argument was about C, then sure, that's a C23 feature (well, the type-inference kind of auto) and is reasonably new.

This is much more a reflection on your professor than on the language. C++11 was a fundamental change to the language; anyone teaching or using C++ in 2025 should have an understanding of how to program well in a 14-year-old version of said language...


> Auto as it is now has been in C++ since C++11, thats more than a decade ago...

> anyone teaching or using C++ in 2025 should have an understanding of how to to program well in a 14 year old version of said language...

If the current year is 2025 then 14 years ago is 2011 which is not that long ago.

> If your argument was C then sure thats a C23 feature (well the type inference type of auto ) and is reasonably new.

Grandparent comment is arguing that Linux was written in C89 until a few years ago, so decades-old books on C aren't actually outdated.


Decades-old books on C are most certainly still useful, even in modern C++23, because you need to interact with other libraries written in C89.

Given that a lot of modern CS concepts were first discovered and studied in the 70s, there's no point arguing that old books are useless. Honestly, there may be sections of old books that are useless, but on the whole they are still useful.


We're talking about learning C/C++ from scratch which makes no sense to do by using a decades old book because it wouldn't teach you any modern features. Also we're not talking about computer science.


You do not need to know about modern features to write code in C. This is part of computer science.


> You do not need to know about modern features to write code in C.

Then what’s the point of adding any new features?


Some people want to use them; they are useful in some contexts and often already exist in some form elsewhere, but the majority of people often do not need them.

That said, when you learn a foreign language, you do not learn every word in the dictionary and every grammatical structure. The same is true for programming. You just don't need to have a more up to date book than one on C89 to begin learning C.


A long time ago, Victor Yodaiken told me the best way to learn C was the old K&R book. The explanations, examples, and even pitfalls were all explained well. My code worked without much fuss. That's probably because C hasn't changed much.

I ended up switching to Python for job requirements after getting through half the book. If I re-learn C, I'll go back to it. If it were C++, that would be a totally different story, since they kept piling features on over time.


Oh my... Are you serious? I'm almost triggered by this. A book about algorithms or data structures from 20 years ago has nothing more to teach? 3D game engine design from 20 years ago has nothing more to teach? No point in looking at the source code of Quake or reading K&R, and Knuth's TAOCP Volume 1 was published in 1968, so it's obviously irrelevant garbage!

I could launch into an essay about the kind of lack of understanding of the world you just portrayed, but I won't... I won't...


We’re talking about C/C++, not algorithms or data structures.


How you implement algorithms and data structures in C++/rust is semantics at best. The imperative shell of those languages are identical semantically right down to the memory model.


Right, that's why a 20 year old book on algorithms and data structures is not necessarily outdated, but a 20 year old book on C/C++ most certainly is.


My copy of The C++ Programming Language for C++98 is still useful today, as is my copy of The C Programming Language for C89. The idea that these books are no longer useful is absurd. Modern compilers still support those versions, and the new versions are largely extensions of the old (although C++11 changed some standard library definitions). The only way you could think this is if you have zero knowledge of these languages.


> The only way you could think this is if you have zero knowledge of these languages.

Exactly. For context see my original comment above about C/C++ books being paid.


Are you unable to use search engines?

https://www.learn-c.org/

There are so many free learning resources for these languages that it is ridiculous to say that you need books to learn them. The books are great, but non-essential. If you insist on reading books, there is an ancient invention called a library that you can use for free.


What C standard does that website describe?


At a glance, the code is compatible with all C standards ever published. You are too fixated on learning the latest. The latest is largely just a superset of the earlier standards and the new features are often obscure things you are unlikely to need or want to use.

The main exceptions that you actually would want to use would be the ability to declare variables anywhere in a block and use single line // comments from C++ in C99. Most C programmers do not even know about most of the things added in newer C standards beyond those and perhaps a handful of others.


I did more research and found https://isocpp.org/get-started which appears to be the authority for C++. It states that I will need a textbook for learning C++ and includes a link to the Amazon page for Bjarne Stroustrup's "Tour of C++" (not buying it lol). For C, the situation is more complicated because there appear to be multiple organizations making standards for it, and you have to pay "CHF 221.00" to even see the standard. It kind of reminds me of the web, where there are also multiple consortiums making standards that browsers hopefully implement (except the web standards are free). In conclusion, I much prefer Rust, where you can just read the docs without bullshit.


Almost nobody using C (or C++ for that matter) has read the standard. The standard exists for compiler developers. If you want to read the standard, you can get copies of the drafts for free. It is an open secret that the final draft is basically identical to the standard that costs money. However, it is very rare that a C programmer needs to read the standard.

As for C++, there is nothing at that link that says you need a textbook to learn C++ (and the idea that you need one is ludicrous). The textbooks are suggested resources. There are plenty of free resources available online that are just as good for learning C++.

You would be better off learning C before learning C++. C++ is a huge language and its common history with C means that if you do not understand C, you are likely going to be lost with C++. If you insist on learning C++ first, here is the first search result from DuckDuckGo when I search for "learn C++":

https://www.learncpp.com/

You will likely find many more.

For what it is worth, when I was young and wanted to learn C++, I had someone else tell me to learn C first. I had not intended to follow his advice, but I decided to learn C++ by taking a university class on the subject and the CS department had wisely made learning C a prerequisite for learning C++. I later learned that they had been right to do that.

After learning C++, I went through a phase where I thought C++ was the best thing ever (much like how you treat Rust). I have since changed my mind. C is far better than C++ (less is more). I am immensely proud of the C code that I have written during my life while I profoundly regret most of the C++ code that I have written. A particular startup company that I helped get off the ground after college runs their infrastructure on top of a daemon that I wrote in C++. Development of that daemon had been a disaster, with C++ features making it much harder to develop than it actually needed to be. This had been compounded by my "use all of the features" mentality, when in reality, what the software needed was a subset of features and using more language features just because I could was a mistake.

I had only been with the startup for a short time, but rejoined them as a consultant a few years ago. When I did, I saw that some fairly fundamental bugs in how operating system features were used from early development had gone unresolved for years. So much of development had been spent fighting the compiler to use various exotic language features correctly that actual bugs that were not the language's fault had gone unnoticed.

My successor had noticed that there were bugs when things had gone wrong, but put band-aids in place instead of properly fixing the bugs. For example, he used a cron job to restart the daemon at midnight instead of adding a missing `freeaddrinfo()` call and calling `accept()` until EAGAIN is received before blocking in `sigwaitinfo()`. Apparently, ~3000 lines of C++ code, using nearly every feature my younger self had known C++ to have, were too complicated for others to debug.

One of the first things I did when I returned was write a few dozen patches fixing the issues (both real ones and cosmetic ones like compiler warnings). As far as we know, the daemon is now bug free. However, I deeply regret not writing it in C in the first place. Had I written it in C, I would have spent less time fighting with the language and more time identifying mistakes I made in how to do UNIX programming. Others would have been more likely to understand it in order to do proper fixes for bugs that my younger self had missed too.


> As for C++, there is nothing at that link that says you need a textbook to learn C++.

Sorry, it says that in their FAQ[0]. It also says "Should I learn C before I learn C++?" "Don’t bother." and proceeds to advertise a Stroustrup book[1].

[0]: https://isocpp.org/wiki/faq/how-to-learn-cpp#start-learning

[1]: https://isocpp.org/wiki/faq/how-to-learn-cpp#learning-c-not-...

> If you insist on learning C++ first, here is the first search result from DuckDuckGo when I search for "learn C++":

I don't insist on learning C++ and I even agree with you that C is better. But I have a problem with learning from non-authoritative sources, especially random websites and YouTube tutorials. I like to learn from official documentation. For C there appears to be no official documentation, and my intuition tells me that, as nickpsecurity mentioned, the best way is to read the K&R book. But that brings us back to my original point that you have to buy a book.

> was the one true way (like you seem to have been with Rust)

I don't think there exists any one true way. It depends on what you do. For example I like Rust but I never really use it. I pretty much only use TypeScript.

> was the best thing ever (much like how you treat Rust)

I would actually prefer Zig over Rust but the former lacks a mature ecosystem.

> For example, they used a cron job to restart the daemon at midnight instead of adding a missing `freeaddrinfo()` call and calling `accept()` until EAGAIN is received before blocking in `sigwaitinfo()`.

This sounds like a kind of bug that would never happen in Rust because a library would handle that for you. You should be able to just use a networking library in C as well but for some reason C/C++ developers like to go as far as even implementing HTTP themselves.

> After learning C++...

Thanks for sharing your story. It's wholesome and I enjoyed reading.


> Sorry, it says that in their FAQ[0]. It also says "Should I learn C before I learn C++?" "Don’t bother." and proceeds to advertise a Stroustrup book[1].

They also would say "Don't bother" about using any other language. If you listen to them, you would never touch Rust or anything else.

> But I have a problem with learning from non-authoritative sources, especially random websites and YouTube tutorials. I like to learn from official documentation. For C there appears to be no official documentation, and my intution tells me that, as nickpsecurity mentioned, the best way is to read the K&R book. But that brings us back to my original point that you have to buy a book.

The K&R book is a great resource, although I learned C by taking a class where the textbook was "A Book On C". I later read the K&R book, although I found "A Book on C" to be quite good. My vague recollection (without pulling out my copies to review them) is that A Book On C was more instructional while the K&R book was more of a technical reference. If you do a search for "The C programming language", you might find a PDF of it on a famous archival website. Note that "the K&R book" refers to "The C Programming Language" by Kernighan and Ritchie.

Relying on "authoritative" sources by only learning from the language authors is limiting, since they are not going to tell you the problems that the language has that everyone else who has used the language has encountered. It is better to learn programming languages from the community, who will give you a range of opinions and avoid presenting a distorted view of things.

There are different kinds of authoritative sources. The language authors are one, compiler authors are another (although this group does not teach), engineers who have actually used the language to develop production software (such as myself) would be a third, and educational institutions would be a fourth. If you are curious about my background, I am the ryao listed here:

https://github.com/openzfs/zfs/graphs/contributors

You could go to edx.org and audit courses from world renowned institutions for free. I will do you a favor by looking through what they have and making some recommendations. For C, there really is only 1 option on edX, which is from Dartmouth. Dartmouth is a world renowned university, so it should be an excellent teacher as far as learning C is concerned. They appear to have broken a two semester sequence into 7 online courses (lucky you, I only got 1 semester at my university; there was another class on advanced UNIX programming in C, but they did not offer it the entire time I was in college). Here is what you want to take to learn C:

https://www.edx.org/learn/c-programming/dartmouth-college-c-...

https://www.edx.org/learn/c-programming/dartmouth-college-c-...

https://www.edx.org/learn/c-programming/dartmouth-college-c-...

https://www.edx.org/learn/c-programming/dartmouth-college-c-...

https://www.edx.org/learn/c-programming/dartmouth-college-c-...

https://www.edx.org/learn/linux/dartmouth-college-linux-basi...

https://www.edx.org/learn/c-programming/dartmouth-college-c-...

There is technically a certificate you can get for completing all of this if you pay, but if you just want to learn without getting anything to show for it, you can audit the courses for free.

As for C++, there are two main options on edX. One is IBM and the other is Codio. IBM is a well known titan of industry, although I had no idea that they had an education arm. On the other hand, I have never heard of Codio. Here is the IBM sequence (note that the ++ part of C++ is omitted from the URLs):

https://www.edx.org/learn/c-programming/ibm-fundamentals-of-...

https://www.edx.org/learn/object-oriented-programming/ibm-ob...

https://www.edx.org/learn/data-structures/ibm-data-structure...

There actually are two more options on edX for C++, which are courses by Peking University and ProjectUniversity. Peking University is a world class university in China, but they only offer 1 course on edx that is 4 weeks long, so I doubt you would learn very much from it. On the other hand, I have never heard of ProjectUniversity, and their sole course on C++ is only 8 weeks long, which is not very long either. The 3 IBM courses together are 5 months long, which is what you really want.

> I pretty much only use TypeScript.

Learn C, POSIX shell scripting (or bash) and 1 functional programming language (Haskell is a popular choice). You will probably never use the functional programming language, but knowing about functional programming concepts will make you a better programmer.

> This sounds like a kind of bug that would never happen in Rust because a library would handle that for you. You should be able to just use a networking library in C as well but for some reason C/C++ developers like to go as far as even implementing HTTP themselves.

First, I was using a networking library. The C standard library on POSIX platforms is a networking library thanks to its inclusion of the Berkeley sockets API. Second, mistakes are easy to criticize in hindsight with "just use a library", but in reality, even if you use a library, you could still make a mistake, just as I did here. This code also did much more than what my description of the bug suggested. The reason for using asynchronous I/O is to be able to respond to events other than just network I/O, such as SIGUSR1. Had I not been doing that, it would not have had that bug, but it needed to respond to other things than just I/O on a socket.

I described the general idea to Grok and it produced a beautiful implementation of this in Rust using the tokio "crate". The result had the same bug that the C++ code had, because it made the understandable assumption my younger self made that 1 SIGIO = 1 connection, but that is wrong. If two connection attempts are made simultaneously from the perspective of the software, you only get 1 SIGIO. Thus, you need to call accept() repeatedly to drain the backlog before returning to listening for signals.

This logical error is not something even a wrapper library would prevent, although a wrapper library might have prevented the memory leak, but what library would I have used? Any library that wraps this would be a very thin wrapper, and the use of an additional dependency that might not be usable on then-future systems is a problem in itself. Qt has had two major version changes since I wrote this code. If I had used Qt 4's network library, this could have had to be rewritten twice in order to continue running on future systems. This code has been deployed on multiple systems since 2011 and it has never once needed a rewrite to work on a new system.

Finally, it is far more natural for C developers and C++ developers to use a binary format over network sockets (like I did) than HTTP. Libcurl is available when people need to use HTTP (and a wide variety of other protocols). Interestingly, an early version of my code had used libcurl for sending emails, but it was removed by my successor in favor of telling a PHP script to send the emails over a network socket (using a binary format).


> Thus, you need to call accept() repeatedly to drain the backlog before returning to listening for signals.

It's not just accept. If your socket is non-blocking the same applies to read, write, and everything else. You keep syscalling until it returns EAGAIN.
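A minimal C sketch of that drain loop (assuming a listening socket opened non-blocking, e.g. with SOCK_NONBLOCK on Linux; `drain_accept` is a made-up name for illustration):

```c
#include <arpa/inet.h>
#include <assert.h>
#include <errno.h>
#include <netinet/in.h>
#include <stdio.h>
#include <sys/socket.h>
#include <unistd.h>

/* Drain the listen backlog: keep calling accept() until it reports
 * EAGAIN/EWOULDBLOCK, because one readiness notification may cover
 * several queued connections. Returns the number of connections
 * accepted. Assumes listen_fd is non-blocking. */
static int drain_accept(int listen_fd)
{
    int accepted = 0;
    for (;;) {
        int conn = accept(listen_fd, NULL, NULL);
        if (conn < 0) {
            if (errno == EAGAIN || errno == EWOULDBLOCK)
                break;            /* backlog empty: go back to waiting */
            if (errno == EINTR || errno == ECONNABORTED)
                continue;         /* transient: retry */
            perror("accept");     /* real error */
            break;
        }
        /* a real server would hand conn off here; this sketch just closes it */
        close(conn);
        accepted++;
    }
    return accepted;
}
```

The same "loop until EAGAIN" shape applies to read() and write() on any non-blocking descriptor.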

> I described the general idea to Grok and it produced a beautiful implementation of this in Rust using the tokio "crate". The result had the same bug that the C++ code had, because it made the understandable assumption my younger self made that 1 SIGIO = 1 connection, but that is wrong.

I don't know what your general idea was but tokio uses epoll under the hood (correctly), so what you are describing could only have happened if you specifically instructed grok to use SIGIO.

> Finally, it is far more natural for C developers and C++ developers to use a binary format over network sockets (like I did) than HTTP.

Designing a custom protocol is way more work than just using HTTP. <insert reasons why http + json is so popular (everyone is familiar with it blah blah blah)>.


> It's not just accept. If your socket is non-blocking the same applies to read, write, and everything else. You keep syscalling until it returns EAGAIN.

You do not call read/write on a socket that is listening for connections.

> I don't know what your general idea was but tokio uses epoll under the hood (correctly), so what you are describing could only have happened if you specifically instructed grok to use SIGIO.

That is correct. There is no other way to handle SIGUSR1 in a sane way if you are not using SIGIO. At least, there was no other way until signalfd was invented, but that is not cross platform. epoll isn't either.

> Designing a custom protocol is way more work than just using HTTP. <insert reasons why http + json is so popular (everyone is familiar with it blah blah blah)>.

You are wrong about that. The code is just sending packed structures back and forth. HTTP would overcomplicate this, since you would need to implement code to go from binary to ASCII and ASCII to binary on both ends, while just sending the packed structures avoids that entirely. The only special handling this needs is to have functions that translate the structures from host byte order into network byte order and back, to ensure that endianness is not an issue should there ever be an opposite endian machine at one end of the connection, but those were trivial to write.
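A minimal sketch of that approach (the struct and field names are made up for illustration; packed-attribute syntax assumes GCC/Clang):

```c
#include <arpa/inet.h>   /* htonl/ntohl, htons/ntohs */
#include <assert.h>
#include <stdint.h>

/* Hypothetical wire message: fixed-width fields, packed so the
 * in-memory layout matches the wire layout on both ends. */
struct __attribute__((packed)) msg {
    uint32_t type;
    uint16_t flags;
    uint32_t payload_len;
};

/* Convert in place to network byte order before send()... */
static void msg_to_net(struct msg *m)
{
    m->type = htonl(m->type);
    m->flags = htons(m->flags);
    m->payload_len = htonl(m->payload_len);
}

/* ...and back to host order after recv(). On a big-endian host
 * these conversions are no-ops. */
static void msg_to_host(struct msg *m)
{
    m->type = ntohl(m->type);
    m->flags = ntohs(m->flags);
    m->payload_len = ntohl(m->payload_len);
}
```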

Do yourself a favor and stop responding. You have no idea what you are saying and it is immensely evident to anyone who has a clue about software engineering.


> You are wrong about that. The code is just sending packed structures back and forth.

Among other things, this would only work if your client is written in a language that supports C structures.

> Do yourself a favor and stop responding. You have no idea what you are saying and it is immensely evident to anyone who has a clue about software engineering.

Says the one who didn't know how to use non-blocking sockets.

> That is correct. There is no other way to handle SIGUSR1 in a sane way if you are not using SIGIO. At least, there was no other way until signalfd was invented, but that is not cross platform. epoll isn't either.

```
use std::io;
use tokio::{
    net::UnixListener,
    select,
    signal::unix::{SignalKind, signal},
};

#[tokio::main(flavor = "current_thread")]
async fn main() -> io::Result<()> {
    let mut signal = signal(SignalKind::user_defined1())?;
    let listener = UnixListener::bind("./hi")?;

    loop {
        select! {
            _ = signal.recv() => {
                todo!();
            }

            _ = listener.accept() => {
                todo!();
            }
        }
    }
}
```


> Among other things, this would only work if your client is written in a language that supports C structures.

Such languages are used at both ends. Otherwise, this would not have been working in production for ~13 years.

> Says the one who didn't know how to use non-blocking sockets.

Neither did you until you learned it. If you successfully begin a career in software engineering and, years later, have the humility to admit the mistakes you made when starting out for the benefit of others, you will deserve to have an incompetent know-it-all deride you for having been so kind as to admit them, just like you are doing to me here.

Anyone with a modicum of programming knowledge can write code snippets free from mistakes immediately after being told about the mistakes that would be made when given a simplified explanation of one thing that is done in production software. The problem with software engineering is that nobody tells you everything you can do wrong before you do it, and you are not writing code snippets, but production software.


> You could go to edx.org and audit courses...

This is great advice, thanks!


> edit2: networking, signal processing, automotive, learning about industry protocols and devices of any sort...

I admit there are many great products written in C that aren’t going anywhere any time soon, notably SQLite, but there is no reason to write new software in C or C++.


I wrote a new daemon in C the other day. Plenty of people will continue to write software in C. Some will even write it in C++. There is nothing you can write that will convince the world to stop writing software in these languages. We went through this once with Java. We are going through it with Rust. We will go through it with Zig or whatever it is that people say everyone should adopt next instead of Rust and all else that came before it.


There are tools that can prove the absence of runtime errors in industrially useful subsets of C. Frama-C, RV-Match, and Meta's Infer come to mind. That code is then usable in just about anything because so much of it is written in or can call C. Until Rust has that combo, there's still a reason to use C.

Personally, I'd use both Rust and C with equivalent code. The Rust types and borrow checker give some extra assurance that C might not have. The C tooling gives extra assurance Rust doesn't have. Win, win. If I want, I can also do a certified compile or cross-compile of the Rust-equivalent, C code.
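For a flavor of what those tools consume: Frama-C's WP plugin, for example, checks ACSL contracts written as special comments, so the annotated file still compiles as plain C (the function below is a made-up example, not from any of those tools' docs):

```c
#include <assert.h>

/*@ requires n > 0;
    requires \valid_read(a + (0 .. n-1));
    assigns \nothing;
    ensures \forall integer i; 0 <= i < n ==> \result >= a[i];
*/
int max_of(const int *a, int n)
{
    /* find the maximum element; the contract above lets the prover
     * check absence of out-of-bounds reads and the postcondition */
    int best = a[0];
    for (int i = 1; i < n; i++)
        if (a[i] > best)
            best = a[i];
    return best;
}
```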


Astree claims to be able to prove the absence of runtime errors in both C and C++, without requiring the use of subsets:

https://www.absint.com/astree/index.htm

By the way, C has a formally verified C compiler:

https://compcert.org/compcert-C.html


Yeah, CompCert will certify a binary for those willing to buy it or use a GPL license. Empirical testing showed it had no bugs in its middle end, unlike other compilers.

On Astree, I couldn't believe it supported all language constructs. I found this on your link:

"and is subject to the same restrictions as Astrée for C.

The high-level abstraction features and template library of C++ facilitate the design of very complex and dynamic software. Extensive use of these features may violate the established principles of safety-critical embedded software development and lead to unsatisfactory analysis times and results. The Astrée manual gives recommendations on the use of C++ features to ensure that the code can be well analyzed. For less constrained (less critical) C++ code, we recommend using the standalone RuleChecker."

So, it still does require a language subset that reduces complexity to benefit from the full analysis. They have greatly expanded what errors they catch since I first read about them. So, thanks for the link.


I researched and discovered kani https://github.com/model-checking/kani, it's pretty cool.

```rust
#[kani::proof]
fn main() {
    let mut array: [i32; 10] = kani::any();
    array.sort_unstable();
    let index: usize = kani::any_where(|i| *i > 0 && *i < array.len());
    assert!(array[index] >= array[index - 1]);
}
```


I do, and will; the industry does and will, for at least a few more decades. And I even enjoy doing so (with C; C++ is more forced upon me, but that'll be the case for some time to come).


That’s what I’m saying. In a few decades you and most of those alleged 7 million C/C++ developers will have retired and there won’t be anyone to replace them because everyone will be using Rust or Zig or Go.


Very strong statement, one I don't really believe


That’s what happened to COBOL, right?


In COBOL’s case, nobody really wanted to write software in it in the first place. People used assembly or FORTRAN at the time. MBAs wanted a language corporate secretaries could use, so they made COBOL. It failed at its main purpose, but since IBM pushed it for business processing, people were coerced into writing software with it since if they did not use it, they would not have a job. As the industry developed, they stopped finding new people that they could coerce into using it and the old people started retiring/dying. Unlike COBOL adoption, C and C++ adoption occurred because software engineers wanted to use them due to merit.


> anything from embedded on the weirdest platforms, basic parsing, writing compilers, language design, high performance computing, teaching of algorithms, data structures, distributed systems, whatever

All that is language-agnostic and doesn’t necessarily have anything to do with C.


Yes, but there is material covering all of those aspects with implementations in C plus libraries plus ecosystem support! From teaching material to real world reference implementations to look at and modify and learn from.

And C maps so directly to so many concepts; it's easy to pick up any of those topics with C; and it being so loose makes it even perfect to explore many of those areas to begin with, since very quickly you're not fighting the language to let you do things.


People may learn from material that uses C for illustration purposes, but that won’t prompt them to write their own C. And don’t even mention ecosystem support in C/C++ where developers are notorious for reimplementing everything on their own because there is no package manager.


"rust import csv-parser" "you've installed 89 packages"

why is my executable 20 mb, not as performant as a 50 kb C file & doesn't build a year later if I try to build with 89 new versions of those packages

obligatory xkcd reference: https://imgs.xkcd.com/comics/dependency_2x.png this is what package managers lead to


Conversely, "why does my hand-written CSV parser only support one of the 423 known variants of CSV, and it isn't the one our customer sent us yesterday?"

You kind of have a point behind dependency hell, but the flip side is that one needn't become an expert on universal CSV parsing just to have a prayer of opening them successfully.


You are greatly exaggerating the numbers, and Rust and its ecosystem are known for being stable. Are you saying everyone should write their own csv parser? Also it’s highly likely that an existing CSV library would be optimized, unlike whatever you wrote ad-hoc, so your performance argument doesn’t hold.


I'm saying package managers automate dependency hell, and software is and will be less stable and more bloated as a consequence. And one should know how to write a csv parser and yes, even consider writing one if there are obvious custom constraints and it's an important part of one's project.

(and yes, numbers were exaggerated; I picked something trivial like a csv parser pulling in 89 packages for effect; the underlying principle sadly holds true)


Anyone can read a file, split it by newline and split it by comma, but what if values contain newlines or commas? How do you unescape? What about other edge cases? In Rust an existing library would handle all that, and you would also get structure mapping, type coercion, validation etc for free.
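To make those edge cases concrete, here is a tiny C sketch of RFC 4180-style field reading, the part naive comma-splitting gets wrong (`read_field` is a made-up helper; a real parser also has to handle newlines inside quoted fields, CRLF records, and streaming input):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Read one CSV field from *p into out (capacity outsz), advancing *p
 * past the trailing comma. Quoted fields may contain commas, and a
 * doubled quote ("") inside them means a literal quote character. */
static void read_field(const char **p, char *out, size_t outsz)
{
    const char *s = *p;
    size_t n = 0;
    if (*s == '"') {                   /* quoted field */
        s++;
        while (*s) {
            if (*s == '"') {
                if (s[1] == '"') {     /* "" -> literal quote */
                    if (n + 1 < outsz) out[n++] = '"';
                    s += 2;
                } else {               /* closing quote */
                    s++;
                    break;
                }
            } else {
                if (n + 1 < outsz) out[n++] = *s;
                s++;
            }
        }
    } else {                           /* unquoted field: stop at comma */
        while (*s && *s != ',') {
            if (n + 1 < outsz) out[n++] = *s;
            s++;
        }
    }
    if (*s == ',') s++;                /* skip the field separator */
    out[n] = '\0';
    *p = s;
}
```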


The number of dependencies does not indicate how bloated your app is. Package managers actually reduce bloat because your dependencies can have shared dependencies. The reason C programs may seem lightweight is because their number of dependencies is low, but each dependency is actually super fat, and they tend to link dynamically.


In the context of Rust it is not about "bloat" indeed. The compiler includes only used bits and nothing more. However, there are other problems, like software supply chain security. The more dependencies you have, the more projects you have to track. Was there a vulnerability that affects me? Is the project unmaintained, and has some bad actor taken it over?

In C this was actually less of a problem, since you had to copy-paste the shared code into your program, and at some level you were manually reviewing it all the time.

Also, in Rust people tend to write very small libraries, which increases the number of dependencies. However, many still don't follow SemVer and the like, and packages tend to be unstable too, on top of the additional security issues. They may be useful for a short time, but in many cases you need to think about the lifetime of your application, which may be up to 10 years.


> However, there are other problems, like the software supply chain security.

It's not a problem with Rust specifically though. It's not unique to Rust.

> Also in Rust people tend to write very small libraries and that increases the number of dependencies. However, many still not follow SemVer et. al and packages tend to be unstable too.

Don't use random unpopular crates maintained by unknown people without reviewing the code.


That applies to your distro package manager too.


Probably not. Many people prefer C/C++ to Rust, which has its own fair share of problems.


Two people is many people. The general trend I see is that Rust is exploding in adoption.


The last I checked various stats (GitHub Language stats, TIOBE, etc.), Rust wasn't even in the top 10. I'm sure its adoption is increasing. However, other languages like Go seem to be doing much better. Neither will replace C++ or C anytime soon.


C/C++ will be replaced incrementally and it’s already happening. Cloudflare recently replaced nginx with their own alternative written in Rust for example.


That's nice, but a couple of Rust rewrites are not proof of a general trend.

I've been working with C for over 30 years, both professionally and as a hobbyist. I have experimented with Rust but not done anything professionally with it. My gut feel is Rust is too syntactically and conceptually complex to be a practical C replacement. C++ also has language complexity issues; however, it can be adopted piecemeal and applied to most existing C code.


> My gut feel is Rust is too syntactically and conceptually complex to be a practical C replacement.

That would depend on what you use C for. But I sure can imagine people complain that Rust gets in the way of their prototyping while their C code is filled with UB and friends.


> That's nice, but a couple of Rust rewrites are not proof of a general trend.

It’s not just a couple. We’ve seen virtually all JS tooling migrate to Rust, and there are many more examples, but I can’t remember them by name.


It is, but it is still tiny compared to C/C++. And many people also do not like it.


There are two categories of people who don’t like Rust:

1. C/C++ developers who are used to C/C++ and don’t want to learn Rust.

2. Go developers who claim Rust is too difficult and unreadable.

Which one is you?


I think the "don't want to learn" is a very poor argument. I learn stuff every day, but I want want to decide myself what I learn to solve my problems, not because Rust folks think everybody has to learn Rust now. I learned a bit of Rust out of curiosity, but not so much I could do a lot with it. I do not want to learn more, because I think the language has severe flaws and and much less suitable than C for my use cases.


I know a guy who used to love Rust and his cannot stand it. The hype wore off in his case.


Does he prefer fighting libtool, make, cmake, configure, automake, autoconf and autoreconf just to add a couple libraries into his project? When I tried to use C, I wrote 10 lines of code and spent hours trying to make all that shit work. It makes me laugh when people say Rust is complicated.


It really is not that complicated. You just use -llibrary when linking and it links.


Oh I never realized I was supposed to install -dev packages, I thought I had to compile myself.


Whether you need -dev packages (for headers) depends on your operating system. I run Gentoo. All headers are always on the system. Other distributions ship the headers separately in -dev packages to save space and you need to install them. You likely can install all -dev packages for everything on your system so you do not need to get the individual ones.


Autocorrect seems to have made a typo worse here. It was supposed to be now, not his.


I think we will get the same safety benefits of Rust in a version of C relatively soon.


Borrow checker is not the only feature that makes Rust great though.


Yes, it also has many aspects I do not like about it. Let's not pretend everybody shares your enthusiasm for it.


What aspects do you not like about Rust?


Too much complexity, long build times, monomorphization, lack of stability, poor portability, supply chain issues, no dynamic linking, no proper standard, not enough different implementations, etc. It is a nice language though, but I do not prefer it over C.


> long build times, monomorphization

Monomorphization is what causes long build times, but it brings better performance than dynamic dispatch.

> lack of stability

There was another comment which also never elaborated on how Rust is not stable.

> supply chain issues

Not a language issue, you choose your dependencies.

> no proper standard, not enough different implementations

Is that a practical problem?

> no dynamic linking

There is.


If you like Rust, this is fine, but I will stay with C. I find it much better for my purposes.


He said elsewhere that he does not even use Rust. He uses TypeScript. I am confused why he is bothering to push Rust when he does not even use it.


I would use it if I had the opportunity, that's why.


> > no dynamic linking

> There is.

Eh, I'm a Rust fan, and I hate the dynamic linking situation too.

I genuinely cannot see how Rust would be able to scale to something usable for all system applications the way it is now. Is every graphical application supposed to duplicate and statically link the entire set of GNOME/GTK or KDE/Qt libraries it needs? The system would become ginormous.

The only shared library support we have now is either using the C ABI, which would make for a horrible way to use Rust dependencies, or by pinning an exact version of the Rust compiler, which makes developing for the system almost impossible.

Hopefully we'll get something with #[export] [1] and extern "crabi" [2], but until then Rust won't be able to replace many things C and C++ are used for.

[1] https://github.com/rust-lang/rfcs/pull/3435

[2] https://github.com/rust-lang/rfcs/pull/3470


> I genuinely cannot see how Rust would be able to scale to something usable for all system applications the way it is now. Is every graphical application supposed to duplicate and statically link the entire set of GNOME/GTK or KDE/Qt libraries it needs? The system would become ginormous.

You don't have to statically link C libraries.


I am referring to Rust scaling up to where C is now (i.e. try to remove C from the picture).

As in, if there will ever be a Rust equivalent of KDE/GNOME (e.g. maybe COSMIC), it will have similar libraries. They will have to be in Rust, and will have to be statically linked. And they will eventually grow up to be the size of KDE/GNOME (as opposed to the tiny thing COSMIC is now).


If it is any consolation, early C did not have dynamic linking either. It was added afterward. From what I understand, UNIX systems had statically linked binaries in /bin until ELF was introduced and gradually switched to dynamically linked binaries afterward.


How about some A.I. news that's not neural networks and is also useful to businesses? Or just something different?

Planning software is a classic use of A.I. to solve problems. They come up with improvements to both the tools and benchmarks every year. This article summarizes recent improvements.

Far as useful for business, I'd like to see a survey of those, especially open source. The last one I read about was Optaplanner:

https://www.optaplanner.org/


It was the one many of us thought was most likely to happen. Some aspects of it were in prior, media reports. It connected to people's memories on top of believable speculation. Then, something like that happened in China.

I believe it was popular for those reasons.

