No, the point does not remain that “research” “shows” anything. There seems to be agreement that ‘meaningful experiments have never been done because it is too complex and expensive’, so there is no significant research data.
But the basic argument is that a programming approach — and having a compiler and a type system is not a “style”, by the way — that employs development-time tools to reduce the burden of runtime operational tools, affords a wider set of optimization techniques at runtime, and adds to the information bandwidth of source code via type annotations is reasonably expected to be more rigorous than the alternative approach.
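As a minimal sketch of the “information bandwidth” point (the function and names below are illustrative, not something from this thread): the annotation states the contract in the source itself, so the compiler can reject misuse at development time rather than leaving it to a runtime check.

```typescript
// Illustrative only: the signature documents intent for the reader AND is
// machine-checked at every call site during development, so no operational
// "is this really an array of numbers?" check is needed at runtime.
function totalCents(prices: ReadonlyArray<number>): number {
  return prices.reduce((sum, price) => sum + price, 0);
}

// totalCents(["1.99"]) would be a compile error, caught before any test runs.
console.log(totalCents([199, 250])); // 449
```

The same signature also gives the compiler license to optimize: it knows it is summing numbers, not dispatching on arbitrary values.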
Fair point. If there were good research showing no difference, it would be equally pointless to argue that either side is right, but at least these wouldn't be mere opinions, like they are today.
As for the move from runtime to development-time tools, that argument still misses its cost.
We could move all our code to Haskell and have absolute guarantees against a lot of commonly found bugs, but we don't, because it's costly. And I don't mean rewrite cost; I mean the cost of its own complexities.
Nobody argues that typed languages aren't more rigorous, but that's not the only variable we care about.
Rigor is not the only variable, agreed. The issue (again) is that the other variables are many and, in the main, non-linear. (For example, the variable of ‘competence of the development organization’ is not smoothly distributed. Our field is not a “smooth” ___domain; it is ‘chunky’.)
So where does that leave us? Opinions are one option - comparative views of other ‘industrial-age’ activities may be informative.
I propose to you that “we moderns” live in a typed world. It is not strongly typed, but it is typed. One could argue that this is a side effect of physical-world artifacts produced at scale. I would be interested in hearing the argument as to why that near-universal phenomenon* does not apply to software, in your opinion.
(* Industrial production at scale and emergence of standards)
Maybe my point got lost within the threads, but I never said types aren't useful or that we should reject them altogether. After all, what you said is true: whether we want it or not, everything is "typed" in one way or another.
My issue is a practical one. Using languages with limited type systems, like TS, has several drawbacks for little benefit. Using a strongly typed language like Haskell would add a ton of greatly needed rigor and correctness, but it's also not without huge drawbacks. Same goes for dynamic languages.
It's not a question of whether or not we should model our software on our worldly types; it's a question of how strict we should be about it, and of the benefits and drawbacks that come with each point on that spectrum. For that reason I argue there's no single general answer to this, and claiming there is one is nonsense.