I had to write business logic in C++20 for an embedded Linux box and o-m-g. It sucked. So much. The developer experience was horrendous. All the PRs dragged along because of micro-optimizations every step of the way. "Let me see the generated assembly in Godbolt" why, why?
I know this is probably not all due to the language, but at least one bit of it has to be. It's really cool if you're writing bare-metal/RTOS-level embedded stuff and you're worrying about how many assignments you can fit into one loop iteration to optimize cache lines, but I don't understand why anyone would ever try to talk to the web using C++.
> "Let me see the generated assembly in Godbolt" why, why?
This is an embedded thing, not a C or C++ thing.
The other day someone was saying here on HN that writing Verilog feels like following a process where "you already more or less know which circuit you want; you're just trying to figure out which specific Verilog code will get your synthesizer to generate that circuit".
On embedded platforms (or generally anywhere where you count memory usage in units of KB or less), that's exactly what many people do. They already know more or less the assembly code, they are just looking for the right higher-level program that will translate to that assembly. That's one of the reasons they get angry when the language tries to be too smart.
The reason you don't just code in assembly directly is that it's still a pain and the chances of mistakes increase (e.g. with complex arithmetic expressions).
Does that also mean that embedded programming involves being conservative about compiler updates? Because otherwise those choices might become completely invalid one upgrade later.
It's not unreasonable to stick with the same compiler version for the entire life of a product. I was involved with a hardware project where we attempted to upgrade the compiler mid-lifecycle; it caused a subtle malfunction (almost certainly due to a latent bug or undocumented compiler behavior dependency in our code, but we couldn't find it) and we simply decided never to upgrade the compiler for the life of the product. It wasn't considered a big deal to do so.
> All the PRs dragged along because of micro-optimizations every step of the way. "Let me see the generated assembly in Godbolt" why, why?
> I know this is probably not all due to the language, but at least one bit of it has to be
No, there’s a cultural problem. C++ gives you the power and flexibility to really optimize for space or performance but rarely is that worth it early in development. Instead, just start by simply writing the code. Good design will give you affordances for appropriate optimization later.
And if your “embedded” system is so massive it can run Linux then you could end up with better code density by using Python source and including the interpreter!
> All the PRs dragged along because of micro-optimizations every step of the way. "Let me see the generated assembly in Godbolt" why, why?
I think a large part of it is due to the language. C++ code can be simultaneously very low-level and highly abstracted, and then you will get reviewers complaining about needlessly copying 48 bytes while making a network request...
It might depend on whether your program does one request every now and then, or several thousand network requests per second (then I might complain too).
Of course it depends on what you are building, but my point is that the language gives you access to the low-level facilities, which nudges people to worry/think about them even when they are irrelevant and unimportant. Because details like copy elision are usually an obvious point that can be improved upon, and people generally have a need to participate and contribute, small things will be mentioned in review and delay the feature even when they make no difference. It's not the fault of the language itself but rather the culture around it, and the easy and obvious answer to this, "just don't use C++ if you don't need it", stops being easy and obvious when you try to actually interop with other languages. </rant>
Idk, I once sped up a particle renderer a few dozen-fold because people were doing unnecessary copies. It burnt CPU cycles for nothing for a long time before that, which should really be made illegal.
Better that than having to convince someone your code is fine and doesn't need to have variables pulled out of the loop for "optimization". Compilers are good at code motion, and many engineers will micro-optimize things in ways that hurt readability for no benefit unless you show them Godbolt.