Hacker News | almostgotcaught's comments

> Every single package manager couldn’t handle my very basic and very popular dependencies

Well there's your problem - no serious project uses one.

> I’m convinced it’s a bunch of masochists

People use cpp because it's a mature language with mature tooling and an enormous number of mature libraries. Same exact reason anyone uses any language for serious work.


How can you simultaneously call cpp a mature language with mature tooling and acknowledge that there's no working package manager used by any "serious" project?

Package managers per language are a (relatively) new endeavor. The oldest language I can think of that widely adopted one was Perl. Although, Perl was quite ahead of its time in a lot of ways, and PHP undid some of the work of Perl and went back to popularizing include-type dependencies instead of formal modules with a package manager.

C++ "gets away" with it because of templates. Many (most?) libraries are mostly templates, or at the very least contain templates. So you're forced into include-style dependencies and it's pretty painless. For a good library, it's often downloading a single file and just #include-ing it.

C++ is getting modules now, and maybe that will spur a new interest in package managers. Or maybe not, it might be too late.


It's a relatively new endeavor, but it's also a requirement in 2025 if you want to be portable. The Linux ecosystem focused on installing dependencies system-wide for decades (that's how a traditional `./configure` script expects things to work), and this approach is just inferior in so many ways.

The shenanigans people get into with CMake, Conan, vcpkg, and so on are a patchwork of nightmares and a huge time sink compared to the superior solutions people have gotten used to in other languages, including Rust.


apt install xxxxx-dev

Because cpp is not meant for "rapid prototyping" that involves importing half of GitHub with a single command. And the reality is that it works.

Works for whom?

C++ build systems are notoriously brittle. When porting a project to a new platform, you're never just porting the code, you are also porting your build system. Every single project is bespoke in some way, sometimes because of taste, but most of the time because of necessity.

It works because people spend a huge amount of time to make it work.


Works for numerous projects which "run the world".

Everyone knows the system is brittle, but somehow manages to handle it.


This seems hyperbolic. At work we cross-compile the same code for a decent number of different platforms - six different OSes (Linux, Mac, Windows, and some embedded ones) across 20-odd CPU architectures.

It’s the same build system for all of them.


Do you people really not realize how completely asinine you sound with these lowbrow comments? I'll give you a hint: did you know that C also has no package manager?

Yeah, and it's also much worse for it. There's a reason everyone in C uses their own linked list implementation and it's not because it's a platonic ideal of perfect software.

The question wasn't whether C/C++ are platonic ideals, the question was whether a language can be mature without a package manager.

If we take “mature” to mean “old” then yes - C and C++ are certainly old. If we take “mature” to mean “good”, then my answer changes.

Agreed. Getting started with a C or C++ project is such a pain in the ass that I won't even bother. Then there is the fact that unless you have special requirements that necessitate C/C++, those languages have nothing going for them.

And removing fluoride from the water

Why does American drinking water need fluoride - for the few seconds people brush their teeth? Other developed nations seem to get by just fine without it (e.g., most of Europe). Does modern toothpaste not contain the components for proper cleaning? I feel like I'm missing something here, because I don't swallow my toothpaste but I drink my tap water. But if fluoride is fine in water that we drink, why not just add all the other vital chemicals to the tap water that our bodies crave, like soma? Because it really smells like people's opposition to this is not science-based but emotion-based (i.e., anti-RFK and Trump admin).

"For the few seconds people brush their teeth"? That's not how fluoridated drinking water works. Fluoridated water works all of the time, not just when brushing teeth, and it's not a vital chemical that the body craves.

You are missing something. If you're this confused about a topic, you should at a bare minimum read the Wikipedia page.


I've read it and am not convinced we need to be ingesting fluoridated water. Nor is Europe or most other countries, and their dental health is fine.

Said another way - brush yo teeth, brush yo GD teeth: https://youtu.be/GlKL_EpnSp8?si=NeKJWKNlcHxtDUYD&t=112


Yeah because in Europe we add fluoride and iodine to table salt, as well as to our toothpaste.

Also, we don't have anywhere close to the sugar consumption the US has, which keeps both our diabetes and dental health issues at rates far below the US.


The questions you posed are not questioning fluoride, they're asking what the basic premise even is. If you don't understand that, you are far from the position needed to be evaluating and analyzing the necessity or benefits of it.

The Wikipedia page you mentioned reading also points out that it's not only a US thing. Or even a water-only thing.


When I see an argument with a phrase like "basic premise" I know I'm reading some word mumbo jumbo, otherwise the author would just give their summary of that "basic premise" instead of deadlinking it (referring to something without actually referring to it).

You don't have an argument yourself, you just wanted to share that you are pro some position.


There are clear factual errors in the underlying assumptions of what was stated about water fluoridation. Those are simply table stakes for having a discussion about anything at all. If one thinks that water fluoridation is useful "just for a few seconds," that it's not done outside of the United States, that it's a replacement for toothpaste, that it's a vital chemical, or that we don't fortify other foods, then they do not know enough about the topic to talk about it, let alone hold the opinion that they know better.

If someone came in with a curious mindset, that'd be one thing. But this is someone walking into a room with an agenda (get rid of fluoride) and a shocking lack of knowledge about that agenda.


>If someone came in with a curious mindset, that'd be one thing. But this is someone walking into a room with an agenda (get rid of fluoride) and a shocking lack of knowledge about that agenda.

But since "Internet People Lie About Fluoride,"[0] why are you surprised? And that's nothing new.

Why? I have no idea. Perhaps cpursley[1] could enlighten us?

[0] https://www.youtube.com/watch?v=GefwcsrChHk

[1] https://news.ycombinator.com/item?id=43894013


Yeah, exactly - she made my point. Buy proper toothpaste with fluoride. Brush after every meal. I understand the chemistry and am an obsessive brusher. If the Danish don't need it in their DRINKING water, nor do we.

> Buy proper toothpaste with fluoride.

It's funny you should mention that because now the Texas AG is starting to come after fluoridated toothpastes.


You must've watched a different video than I did.

Because you're the "Internet people lie about fluoride" guy to a tee.

Have a good day!


Just brush your teeth after every meal, you will be fine like the Finns. And prob a higher IQ like them, as well (without all the unnecessary fluoride in the water).

PSA: brushing your teeth directly after eating is actually detrimental, because the acids in food soften the enamel on your teeth. That layer needs to harden first (wait 20-30 minutes), otherwise your toothbrush will strip it away: https://www.cuimc.columbia.edu/news/brushing-immediately-aft...

Now that’s really interesting, thanks.

Where are the studies supporting this claim? A random person on the internet saying "prob higher IQ" is not a convincing argument.

Because just like we have stupid people who don’t vaccinate their children from measles, we have stupid people who don’t make them brush their teeth.

So rather than have them suffer with a lifetime of oral health problems, you can intervene in a transparent and cheap way to prevent these issues altogether.

The introduction of fluoride dramatically improved oral health. NYC has been doing it since the 1960s, so one would think we’d see evidence of the supposed negative effects.


Any actual stats on people not brushing their teeth? It's not 1960 any more... And my entire point was to compare to other nations with similar development levels that don't pump it into their water supply and are doing just fine in terms of oral health.

And by your metric, should we also pump in vitamins and other substances that our bodies crave? Maybe the Fed gov't could just skip that and force drip IV everyone a compliance cocktail after their breakfast of USDA approved and SNAP subsidized Captain Crunch?


> Any actual stats on people not brushing their teeth? It's not 1960 any more... And my entire point was to compare to other nations with similar development levels that don't pump it into their water supply and are doing just fine in terms of oral health.

Something like 30% of people report not brushing their teeth at least once a day. Unclear if that means most of them brush every other day or some even lower frequency, but I’d assume if you report not brushing at least once a day then you likely aren’t brushing consistently every other day or something.

> And by your metric, should we also pump in vitamins and other substances that our bodies crave? Maybe the Fed gov't could just skip that and force drip IV everyone a compliance cocktail after their breakfast of USDA approved and SNAP subsidized Captain Crunch?

We already do this, all the time! Vitamin and mineral fortified foods are everywhere. Iodine is in a lot of salt. It’s a good thing, not something to be mocked. Most vitamins and minerals have minimal cost, no issues with taking “too much” of them, and have significant health benefits if you are deficient in that particular thing.


> Something like 30% of people report not brushing their teeth at least once a day.

Gross, but that's their problem, not mine. There's a multitude of bad health habits; if we were actually serious, there'd be no soda or cereal on the shelves. But big ag and big health actively oppose that because they financially benefit from SNAP. Your fortified foods mention is an example of exactly how insane it all is (we subsidize the corn syrup farmers to produce garbage food and then give poor people money to buy it, instead of you know - real food).


> Gross, but that's their problem, not mine. There's a multitude of bad health habits, if we were actually serious, there'd be no soda or cereal on the shelves.

This absolutist mindset is not helpful for making progress. People want tasty, potentially bad for them foods. You can have bad food that’s made up of “real food” just fine. Fortifying bad foods to make them marginally less bad is a good thing. Don’t let great be the enemy of good. Nobody is looking at a bag of chips and saying “well because it’s got added Vitamin A, it’s good for me now!” Instead, it’s just a silent benefit.


Or just address the actual cause instead of the "problem". Get the shit like food coloring and corn syrup out of our food. Other nations do just fine with their food situation without all these made-up excuses and nonsense like "fortified" food. And stop subsidizing the garbage food and cultivation of it via SNAP.

> Or just address the actual cause instead of the "problem".

Again, you’re asking for behavioral change in humans by fiat. Fortification extends well beyond just adding vitamins to chips or junk food. It’s added in many basic building blocks (milk, and most flours and rice) because it literally is solving nutritional deficiencies caused by poverty. Nobody is somehow making purchasing decisions on junk food based on fortification content.

> Get the shit like food coloring and corn syrup out of our food.

Irrelevant to nutritional content, unless you mean overly sugary foods relating to corn syrup. Which, you can have the exact same health outcomes and hyper palatability by just using regular old sugar.

> Other nations do just fine with their food situation without all this made up nonsense and excuses like "fortified" food.

Other nations fortify their food too, including many “first world” countries. I don’t know why you think this is somehow a uniquely American thing.


> Nobody is somehow making purchasing decisions on junk food based on fortification content

They absolutely are. Food marketing and other tricks work, even on edjamahcated people. Think terms like “150% more antioxidants” and “100% natural fruit gummies!”


We do. Table salt is iodized. We add vitamin A & D to milk and bread.

You’re looking for facts to stuff a straw man. There is clear, obvious correlation between fluoridation and improved oral health. They discovered this decades ago where it was observed that oral health was better in regions where groundwater was used and fluoride occurred naturally.

By my metric, we should take reasonable measures to improve public health. I don’t suppose you’re in favor of making dental care affordable to those who can’t afford it?

If you choose to align yourself with the pseudo intellectual descendants of the John Birch society to protect your “precious bodily fluids”, I’m sorry for you.


> There is clear, obvious correlation between fluoridation and improved oral health.

Not once have I argued against fluoride for oral care. But I don't want it in my DRINKING water.

> We add vitamin A & D to milk and bread.

Right, and our (American) "bread" and milk is not just bad, it's total garbage. I mean really really bad vs civilized countries.

Here's another solution: Just eat real food and brush your teeth with proper toothpaste. I know, shocking!


What a novel suggestion. Sure just do what you’re supposed to do.

What’s your answer when the same idiots campaigning against fluoride decide that toothpaste is a problem? Or that the ADA is a scam and there is no proof that toothbrushing has any effect?

And what’s real food? That is a question that doesn’t have an answer.


> same idiots campaigning against fluoride decide that toothpaste is a problem?

Their teeth will rot out.

> And what’s real food? That is a question that doesn’t have an answer.

It's not fortified Captain Crunch.


How about the fortified rice? The salt? The flour? Bread? Milk? Pasta?

> Any actual stats on people not brushing their teeth?

google is free - it's not anyone's responsibility to educate you and answer your naive questions. and if you did google it and you're still not convinced, well then i'm glad you're not an elected official wherever it is you live (though if you live in the US i guess you probably voted for the current admin)


And who in the actual flying fuck are you to assume with such confidence who I voted for?

> that would be insane and you'd gas out in seconds.

I mean there are lots of people that dumbbell row 95s or 100s or 105s for 8-10 reps (I used to be one of 'em...). That's not really "seconds" but sure, it's not a lot either. But then again no one literally only trains dumbbell rows, so it's not at all unbelievable to me that you could do this (train to draw a high-weight bow many times without "gassing").


Plus, a dumbbell is the same weight the whole time, while the bow only reaches its draw weight at full draw.

On the other hand, as someone noted: you don't bring up a dumbbell by a thin string with 3 fingers. I think when trained, and without holding it like in the movies, you could go a bit longer than you'd expect with dumbbells, but goddamn your fingers must hurt after.

Yeah but spend your whole life malnourished and march 20 miles THEN do the reps (as peasant archers probably had to do).

This is highly contextual based on time and place. While most people would have had access to fewer calories compared to the modern day, the average person wasn’t starving to death under normal circumstances. We’re talking a population that engaged in regular manual labour, so sufficient nutrition was necessary. I’d also guess the societies starving their populace were unlikely to call them up to war unless they were really desperate.

I think only trained professional archers could use a 170lb pull bow at all.

My understanding is that English longbowmen trained from their teens on the weapon and you can see on the skeletons how it warped their bodies.

Yes but they weren’t starving them for their whole lives.

Agreed. I'm an amateur archer and I asked my archery instructor the highest poundage recurve bow he's ever seen someone fire, and he says that one time someone came to the range with a 100 pound draw bow, but he's only seen that once in 10 years.

Compound bows of course you can go higher because of mechanical advantage, but either way I don't think that people realize how difficult it is to draw a 100 pound bow. Typical professional recurve bow users would rarely want to exceed 50 pounds as I understand it.


There’s also a bit of a different optimization goal in modern archery - the goal is to put as many arrows precisely on a target. More draw weight helps up to a certain extent, but ranges are pre-set and limited and once your draw weight is high enough to comfortably propel the arrow that far, more weight will not improve things. Aiming gets harder at a higher weight. You could shoot a heavier arrow, but the benefits are somewhat limited - it punches a bigger hole which helps a bit, but you’re not trying to kill the target - so the added penetration is not interesting.

In a war setting, higher draw weights increase both distance and penetration, which are desirable.


Why would it ever be impossible/unbelievable? The whole point is it was commonplace for this type of person.

It's just surprising that the number's that large.


These are the kinds of comments that I write when I work really hard (and very long) on a PR and I know no one will really dig into it (kind of like "well at least I committed the findings to posterity").

If you keep at it someone someday will be blown away.

Nah, ain't no one got time for that - and I don't blame anyone either (not like I read other people's comments).

> Halide [1] pioneered (AFAIK) the concept of separating algorithm from implementation at the language level.

you don't need to go all the way to Halide to do what the article is claiming isn't possible - you can do it just by including a "micro-kernel" in your library and having the code branch to that impl (depending on something at runtime) instead of whatever the C code compiled down to. this is done every single day in every single GPU lib (famously cublas ships with hundreds/thousands of such ukernels for gemms depending on shapes).


I was going for something different: I don't want to choose a different implementation in runtime, I want the compiler to see through my code and apply constant propagation -- not just for constant inputs, but inputs with known properties, like `n < 1000` or `(n & 7) == 0`. I want it to also learn facts about the output values, e.g. that my `isqrt(n) -> m` function always returns `m` such that `m^2 <= n`. None of this is possible with runtime selection because runtime selection was never the point.

A lot of people have ways of accomplishing this, but my way is using compile-time execution in Zig (I know at least D, C++, and Terra have their own versions of this feature). You can specify a parameter as `comptime` and then do different things based on whatever conditions you want. You can also execute a lot of code at compile-time, including your sqrt check.

E.g. I wrote a `pextComptime` function, which will compile to just a `pext` instruction on machines that have a fast implementation, otherwise it will try to figure out if it can use a few clever tricks to emit just a couple of instructions, but if those aren't applicable it will fallback on a naïve technique.

https://github.com/Validark/Accelerated-Zig-Parser/blob/8782...


I think we're all talking past each other here.

Your suggestions introduce, in effect, a hypothetical `if` statement, only one branch of which is taken. I can change the condition arbitrarily, but ultimately it's still going to be either one or the other.

I want the `if` to take both branches at once. I want the compiler to assume that both branches trigger the exact same side effects and return the same results. I want it to try both approaches and determine the better one depending on the environment, e.g. the number of free registers, (lack of) inlining, facts statically known about the input -- all those things that you can't write a condition for on the source level.

Think about it this way. A standard compiler like LLVM contains passes which rewrite the program in order. If something has been rewritten, it will never be rolled back, unless another pass performs a separate rewrite that explicitly does that. In contrast, e-graphs-based compilers like Cranelift maintain an equivalence graph that represents all possible lowerings, and after the whole graph is built, an algorithm finds a single optimal lowering.

Existing solutions make me choose immediately without knowing all the context. The solution I'd like to see would delay the choice until lowering.


> e-graphs-based compilers like Cranelift maintain an equivalence graph that represents all possible lowerings, and after the whole graph is built, an algorithm finds a single optimal lowering

Do you have a good entrypoint reference for learning about how this works? This (and the associated mention in the article) is the first time I've heard of this approach.


@thrtythreeforty I think this RFC is a good start: https://github.com/bytecodealliance/rfcs/blob/main/accepted/.... Then read through these docs: https://docs.rs/egg/latest/egg/tutorials/. They document the behavior of a particular crate, but they also act as a very accessible high-level overview.

I recently ran down approximately the same rabbit hole when trying to figure out what to do about x86 treating addition and bitwise OR differently. There's https://llvm.org/docs/LangRef.html#id171, but it can't generally be synthesized in Rust. So I went on a short-lived quest:

- https://internals.rust-lang.org/t/expose-llvms-or-disjoint-i...

- https://github.com/rust-lang/libs-team/issues/373

- https://github.com/rust-lang/rust/pull/124601

Which ultimately culminated in an opinion that should sound familiar - https://github.com/rust-lang/rust/pull/124601#issuecomment-2....


Oh, yes, I understand now. I've thought to myself before it would be nice if I could have implementation 1 go into variable x. And implementation 2 go into variable y. Then I do `assert(x == y)` and a compiler like Cranelift should know it only needs to pick one of them.

I'm glad to know that's the design of Cranelift, since that's how I would think it should be done, although I haven't written a middle or backend for a compiler yet.


Another cool thing you could do is fuzz test a version of the code that actually does take both branches (in separate runs with fresh memory etc.) and aborts if they give different results.

> I want the `if` to take both branches at once.

this is called symbolic execution - as i have already told you, many compilers do this in the form of sccp and scev. you don't need silly things like egraphs for this.

> If something has been rewritten, it will never be rolled back

this is patently false - not only do passes get rolled back to catch/correct errors but there is a fixed-point iteration system (at least in MLIR) that will apply passes as long as they are "profitable".


I don't see how symbolic execution is relevant to this. Yes, symbolic execution does "check both paths" for some definition of "check", but ultimately I as a programmer still need to write a condition, and that condition is on the source level, so it can't access information on e.g. register pressure, which is what I would like to comptime-branch on.

> not only do passes get rolled back to catch/correct errors but there is a fixed-point iteration system (at least in MLIR) that will apply passes as long as they are "profitable"

Huh? I'm not arguing against fixed-point iteration, that's perfectly fine because it's still a unidirectional process. What do you mean by "passes get rolled back to catch/correct errors" though? Certain rewrites can certainly be not performed in the first place if they pessimize the code, but that's not what I'm talking about.

If there's pass 1 that chooses between rewrite A and B, pass 2 that chooses between rewrite C or D, and pass 3 choosing between E or F, in a typical compiler these choices would be made one by one, mostly greedily. An e-graph style approach allows all eight combinations to be tried out without necessarily leading to a combinatorial explosion.


i have no idea what that has to do with what op quoted from your article:

> There is no way to provide both optimized assembly and equivalent C code and let the compiler use the former in the general case and the latter in special cases.

this is manifestly obviously possible (as i've said).

what you're talking about is something completely different that goes by many names and uses many techniques (symbex, conex, sccp, scev, blah blah blah). many of these things are implemented in eg LLVM.


Ah ok, I see what you mean (and likely the sibling comment too w.r.t. the gcc feature). Yes, that is a fair point - though it still has the substantial downside of maintaining many different implementations of any given algorithm.

> There are alternative universes where these wouldn't be a problem

Do people that say these things have literally any experience of merit?

> For example, if we didn't settle on executing compiled machine code exactly as-is, and had an instruction-updating pass

You do understand that at the end of the day, hardware is hard (fixed) and software is soft (malleable), right? There will always be friction at some boundary - it doesn't matter where you hide the rigidity of a literal rock, you eventually reach a point where you cannot reconfigure something that you would like to. And the parts of that rock that are useful are extremely expensive (so no one is adding instruction-updating silicon just because it would be convenient). That's just physics - the rock is very small but fully baked.

> we could have had every instruction SIMDified

Tell me you don't program GPUs without telling me. Not only is SIMT a literal lie today (cf warp level primitives), there is absolutely no reason to SIMDify all instructions (and you better be a wise user of your scalar registers and scalar instructions if you want fast GPU code).

I wish people would just realize there's no grand paradigm shift that's coming that will save them from the difficult work of actually learning how the device works in order to be able to use it efficiently.


The point of updating the instructions isn't to have optimal behavior in all cases, or to reconfigure programs for wildly different hardware, but to be able to easily target contemporary hardware, without having to wait for the oldest hardware to die out first to be able to target a less outdated baseline without conditional dispatch.

Users are much more forgiving about software that runs a bit slower than software that doesn't run at all. ~95% of x86_64 CPUs have AVX2 support, but compiling binaries to unconditionally rely on it makes the remaining users complain. If it was merely slower on potato hardware, it'd be an easier tradeoff to make.

This is the norm on GPUs thanks to shader recompilation (they're far from optimal for all hardware, but at least get to use the instruction set of the HW they're running on, instead of being limited to the lowest common denominator). On CPUs it's happening in limited cases: Zen 4 added AVX-512 by executing two 256-bit operations serially, and plenty of less critical instructions are emulated in microcode, but it's done by the hardware, because our software isn't set up for that.

Compilers already need to make assumptions about pipeline widths and instruction latencies, so the code is tuned for specific CPU vendors/generations anyway, and that doesn't get updated. Less explicitly, optimized code also makes assumptions about cache sizes and compute vs memory trade-offs. Code may need L1 cache of certain size to work best, but it still runs on CPUs with a too-small L1 cache, just slower. Imagine how annoying it would be if your code couldn't take advantage of a larger L1 cache without crashing on older CPUs. That's where CPUs are with SIMD.


i have no idea what you're saying - i'm well aware that compilers do lots of things but this sentence in your original comment

> compiled machine code exactly as-is, and had an instruction-updating pass

implies there should be silicon that implements the instruction-updating - what else would be "executing" compiled machine code other than the machine itself...........


I was talking about a software pass. Currently, the machine code stored in executables (such as ELF or PE) is only slightly patched by the dynamic linker, and then expected to be directly executable by the CPU. The code in the file has to be already compatible with the target CPU, otherwise you hit illegal instructions. This is a simplistic approach, dating back to when running executables was just a matter of loading them into RAM and jumping to their start (old a.out or DOS COM).

What I'm suggesting is adding a translation/fixup step after loading a binary, before the code is executed, to make it more tolerant to hardware changes. It doesn’t have to be full abstract portable bytecode compilation, and not even as involved as PTX to SASS, but more like a peephole optimizer for the same OS on the same general CPU architecture. For example, on a pre-AVX2 x86_64 CPU, the OS could scan for AVX2 instructions and patch them to do equivalent work using SSE or scalar instructions. There are implementation and compatibility issues that make it tricky, but fundamentally it should be possible. Wilder things like x86_64 to aarch64 translation have been done, so let's do it for x86_64-v4 to x86_64-v1 too.


that's certainly more reasonable so i'm sorry for being so flippant. but even this idea i wager the juice is not worth the squeeze outside of stuff like Rosetta as you alluded, where the value was extremely high (retaining x86 customers).

the only people that complain about this don't actually write C++

I write C++, and complain about still not having a portable C++17 implementation that fully supports parallel STL, not having a fully compliant C++20 compiler, embedded compilers still kind of catching up with C++14, and I could rant on.

Are there any embedded compilers left that try to implement their own C++ frontend? To me it looks like everyone gave up on that and uses the clang/gcc/EDG frontends.

Yes, and even those that have compiler forks aren't on vLatest.

Interesting you mention EDG, as it is now famously known as the root cause of why the Visual Studio development experience lags behind cl.exe, pointing out errors in code that compiles just fine, especially when using anything related to C++20 modules.

Apparently since the modules support introduced in VS 2019, there have been other priorities on their roadmap.


Yes to all that, but oneapi::dpl has been out for a while.

And how much does that help anyone in systems that can't make use of it?

As much as C helps on TinyOS with a MICaz board I guess.

And what has that to do with ISO C++ compliance for portable code?

It has to do with how ridiculous it is to complain about systems where the aforementioned, very portable library is not supported.

NVIDIA ain't spent much time in the NFL else they would've known "...when you’re bleeding a guy you don’t squeeze him dry right away. Contrarily, you let him do his bidding suavely. So you can bleed him next week and the week after at minimum."


What is the point of this? Fortran is both faster than cpp and easier to write than cpp. It's also by no means a dead or dying or whatever language. Smells like literally "your scientists were so busy they forgot to ask why".


Seems pretty obvious to me, and I’ve written my fair share of both Fortran and C++. I think it is mostly that very few people know Fortran anymore and even fewer people want to maintain it. A vast number of people in 2025 will happily work in C++ and are skilled at it.

Fortran also hasn’t been faster than C++ for a very long time. This was demonstrable even back when I worked in HPC, and Fortran can be quite a bit worse for some useful types of performance engineering. The only reason we continued to use it was that a lot of legacy numerics code was written in it. Almost all new code was written in C++ because it was easier to maintain. I actually worked in Fortran before I worked in HPC, it was already dying in HPC by the time I got there. Nothing has changed in the interim. If anything, C++ is a much stronger language today than it was back then.


Fortran is still quite modern despite its age, and relevant enough that not only has one of the success factors of CUDA, the availability of Fortran on the CUDA SDK, LLVM project also started a Fortran frontend project.

Also to me seems more likely that people that enjoy Fortran in HPC are more likely to change to Chapel than use C++.


> Almost all new code was written in C++ because it was easier to maintain.

What makes you say so? See musicale's comment above. I have a hard time seeing C++ as easier to maintain, if we are just talking about the language. The ecosystem is a different story.


For pure number crunching, Fortran is nicer. Unfortunately, performance for most codes became about optimizing memory bandwidth utilization at some point in the 2000s, and systems languages are much more effective at expressing that in a sane way. It was easier to make C/C++ do numerics code than to make Fortran do systems code. Some popular HPC workloads were also quite light on numerics code generally, being more about parallelization of data processing.

This was before modern C++ existed, so most of the code was “C with classes” style C++. If you can read C then you can read that code. I don’t consider that to be particularly maintainable by modern standards but neither is Fortran.

Modern C++ dialects, on the other hand, are much more maintainable than either. Better results with a fraction of the code. The article doesn’t say but I would expect at least idiomatic C++11, if not later dialects.


Some people at LANL seem to be on a holy crusade to replace Fortran with C++. They occasionally produce stuff like papers saying Fortran is dying and whatever. Perhaps it makes sense for their in-house applications and libraries, but one shouldn't read too much into it outside their own bubble.


I wonder if they feel that the toolchains are just rotting.



Unlike C++, they even got their standard package manager - fpm[1].

[1] https://fpm.fortran-lang.org/
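For a sense of what fpm looks like in practice, here is a minimal manifest in the style its docs describe (the package name and the choice of stdlib as a dependency are illustrative, not taken from any real project):

```toml
# fpm.toml -- a minimal Fortran Package Manager manifest (hypothetical project)
name = "demo"
version = "0.1.0"
license = "MIT"

[build]
auto-executables = true   # pick up programs under app/ automatically

[dependencies]
# Git-based dependencies are fpm's usual mechanism.
stdlib = { git = "https://github.com/fortran-lang/stdlib" }
```

With that in place, `fpm build` and `fpm run` handle fetching and compiling dependencies, which is exactly the workflow C++ still assembles from CMake plus vcpkg or Conan.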


It is as standard as vcpkg and conan.

ISO Fortran does not acknowledge the existence of fpm; as with any programming language's ISO standard, the ecosystem is not part of the standard.


This. If someone can't correctly articulate the advantages of Fortran they shouldn't be migrating away from it. This is not to say that migrations should never happen.


Chesterton's Fence


Yes the opposition thinks evil is evil. The opposition also thinks water is wet. Check back here tomorrow for more obvious things rational people think.


The opposition reductively believes this is an existential battle between “good and evil”, they’re the “good”, and that’s a position from which one can justify almost anything to eradicate “evil”.


Well, Trump is the one that almost always frames things in a very binary way. If someone contradicts him, it is "fake news". His opposition is typically much less so, and much more rational and thoughtful.

Even many in the opposition agree with many of his goals (control immigration, protect American industries, shrink the government).


How many Supreme Court rulings does it take for a Trump supporter to admit the Trump administration is unjust? The world may never know.


You can always know, if you want to, by actually engaging in constructive dialog. Which probably isn’t going to happen in this thread because it’s ostensibly about a raspberry pi LiDAR scanner, and thus neither really the time nor place.


The MAGA crowd is not even remotely interested in 'constructive dialog' and is so far down the hole of drinking the kool-aid that constructive dialog with them will likely never be possible.

You cannot have constructive dialog about astronomy with someone who thinks the sky is made of green and purple polkadots because that's what someone told them, and dismiss all evidence to the contrary as a massive conspiracy.

They don't even believe in democracy or constitutional rights - at least, for anyone but them.


I’m interested in constructive dialog, and I believe in democracy and constitutional rights. However, this is a thread about a neat LiDAR scanner.


It's funny - first you call me reductive but now it's all "I'm staying out of this one". Interesting how that goes.


Which many people could have afforded to build a few weeks ago, but now can't.

