Looking at the state of the software world over the last several years always makes me think of Vernor Vinge's notion of a "Mature Programming Environment" from _A Deepness in the Sky_ (1999):
"The word for all this is ‘mature programming environment.’ Basically, when hardware performance has been pushed to its final limit, and programmers have had several centuries to code, you reach a point where there is far more significant code than can be rationalized. The best you can do is understand the overall layering, and know how to search for the oddball tool that may come in handy"
(A longer excerpt, including an earlier bit about how rewrites eventually just move the set of bugs, inconsistencies, and limitations around rather than improving them, is here: http://akkartik.name/post/deepness )
And of Danny Hillis's idea of the Entanglement (excerpt from a 2012 interview with SciAm):
"But what's happened though, and I don't think most people realize this has happened yet, is that our technology has actually now gotten so complicated that we actually no longer do understand it in that same way. So the way that we understand our technology, the most complicated pieces of technology like the Internet for example, are almost like we understand nature; which is that we understand pieces of them, we understand some basic principles according to which they operate, in which they operate. We don't really understand in detail their emerging behaviors. And so it's perfectly capable for the Internet to do something that nobody in the world can figure out why it did it and how it did it; it happens all the time actually. We don't bother to and but many things we might not be able to."
(more: https://www.scientificamerican.com/podcast/episode/the-comin...)
You've written almost exactly what I was going to write, but there are a few things I wanted to add / would have said.
The author seems to be treating software almost as if it were some "thing" outside of human control and development. By this, I mean he waxes philosophical about software development as though it were some divine practice handed down by the angels themselves - not what it actually is, which is a crude implementation and abstraction of the Universe's actual programming language - subatomic particles.
We have the software we have because we as humans are so limited in our thinking and scope, and because every human has slight variations on their idea of the "ideal", whatever that ideal might be.
If that were not the case, how could we have 50+ programming languages, when the very purpose of a programming language is to express your ideas in a somewhat tangible (insofar as one can claim the electronic written word to be tangible) form that can then be communicated to others?
Maybe now I'm the one waxing philosophical, but my background is that of an evolutionary biologist; I was not formally trained in software or computer engineering. Still, it seems to me that the "point" of every programming language is to express ideas, and the point of every piece of software is to create a tool. Humanity and our ancestors have been making tools for millions of years, so why would we be expected to stop now?
> If that were not the case, how could we have 50+ programming languages, when the very purpose of a programming language is to express your ideas in a somewhat tangible (insofar as one can claim the electronic written word to be tangible) form that can then be communicated to others?
I think part of the issue is that we don't all agree on what the point of a computer program is, or on how best to reach the same results even for the points we do agree on.
Similarly, the reason we don't understand the Internet is that "the Internet" is a conceptual handwave used by humans. We use it to communicate, which means it's a lot of different things at once: very messy, and much of it organic.
But the universe isn't 'composed' of subatomic particles in the way that a machine is 'composed' of parts. The laws governing subatomic particles, so far as speech is capable of representing them, are probabilistic. And, as we see in practice, the sciences have resigned themselves to working with aspects: when useful, we speak of light as a wave; when useful, as a particle. Really it is neither, or both. The same gap runs through the software engineer's work of depicting the world in language. Any mode of speech can depict only one aspect of the truth at a time; even if a single language is capable of more than one mode, it can only ever express one aspect at once.
Your being a Neo-Darwinian and your confusion at the possibility of regression or a halt in progress are one and the same. Once the mollusc has "conceived" of his shell as an "adaptation" to a change in conditions, he has "responded" so harshly to environmental dangers that he has closed them off almost entirely, bringing the process of speciation and adaptation to a near halt. In fact, there are countless examples of "tools" "conceived" by organisms that have been so immaculate that development has ground to a halt. Not all is progress... the world is not a machine...
> Your being a Neo-Darwinian and your confusion at the possibility of regression or a halt in progress are one and the same. Once the mollusc has "conceived" of his shell as an "adaptation" to a change in conditions, he has "responded" so harshly to environmental dangers that he has closed them off almost entirely, bringing the process of speciation and adaptation to a near halt. In fact, there are countless examples of "tools" "conceived" by organisms that have been so immaculate that development has ground to a halt.
To expand on your point a bit, this brings us around to hill-climbing optimization and being trapped in a local maximum.
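To make the local-maximum trap concrete, here's a minimal hill-climbing sketch in Java (the toy fitness function, starting point, and names are mine, purely illustrative):

    import java.util.function.DoubleUnaryOperator;

    public class HillClimb {
        public static void main(String[] args) {
            // Toy fitness landscape: the global peak is at x = 0 (f = 1.5),
            // with lesser peaks on either side for a climber to stall on.
            DoubleUnaryOperator f =
                x -> Math.cos(x) + 0.5 * Math.cos(3 * x) - 0.01 * x * x;

            double x = 5.0;     // deliberately poor starting point
            double step = 0.01; // only small, local moves are allowed

            while (true) {
                // Greedy rule: take a step only if it strictly improves fitness.
                double best = x;
                if (f.applyAsDouble(x + step) > f.applyAsDouble(best)) best = x + step;
                if (f.applyAsDouble(x - step) > f.applyAsDouble(best)) best = x - step;
                if (best == x) break; // no uphill neighbor: stuck on a local maximum
                x = best;
            }

            System.out.printf("stalled at x = %.2f, f(x) = %.3f%n", x, f.applyAsDouble(x));
            System.out.printf("global peak: f(0) = %.3f%n", f.applyAsDouble(0));
        }
    }

Starting from x = 5, the climber stalls on a minor peak well below f(0) = 1.5; escaping takes something other than pure hill climbing (random restarts, larger mutations, an invader arriving from a different hill), which is roughly the generalist-species story below.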
Biological evolution is marked by episodes of relatively generalist organisms spreading to new niches, speciating, and sometimes re-invading the environment they came from by outcompeting the original specialized denizens (I'm not necessarily just talking about large scale punctuated equilibrium, but smaller scale species ebb and flow).
So too with software: the cycle of specialization, optimization, ossification, and displacement by a generalist competitor from elsewhere happens over and over ("worse is better" is probably the pithiest expression of this, but "premature optimization is the root of all evil" is pretty nice too).
Evolution itself has evolved to increase generativity, in order to speed adaptation not only to change in general, but also to unprecedented change (especially when the changes are themselves driven or exemplified by other organisms).
So too with software, where the change that software must adapt to is often driven by other software.
So software keeps getting invented and changed to optimize for and colonize changing environments (social, economic, hardware, network, and software envs), and languages keep getting invented to improve the processes of optimization & adaptation to change, as well as generativity, for both new and old niches. And of course, the boundary between software and programming language is just as fuzzy as similar boundaries in biology; frameworks and DSLs are just two obvious examples that straddle the division.
Not often appreciated is that all of the above applies just as much to the human social/cultural practices of developing software as it does to the tools those practices co-evolve with (e.g. writing/editing, testing, change control, distribution, building, ticketing/issues, deployment, etc.). And we can flip our view around and see how parallel mechanisms have always been operating on human culture, since even before we were human.
> If that were not the case, how could we have 50+ programming languages, when the very purpose of a programming language is to express your ideas in a somewhat tangible (insofar as one can claim the electronic written word to be tangible) form that can then be communicated to others?
For the same reason we have different specialties in the sciences/arts/etc. Even math itself has different languages to express the same ideas. Having different computer languages allows people to express ideas (solve problems) more efficiently given the ___domain of the problem. Very basic example: I wouldn't use zsh to write an xmpp server implementation (but it's possible), and I wouldn't use Java to call a handful of unix commands (also possible).
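To make that mismatch concrete: in zsh, "a handful of unix commands" is a one-liner like "ls -l | head", while the Java equivalent is something like the sketch below (illustrative only; class name and line count are mine):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class RunLs {
        public static void main(String[] args) throws Exception {
            // Roughly the shell one-liner "ls -l | head":
            ProcessBuilder pb = new ProcessBuilder("ls", "-l");
            pb.redirectErrorStream(true); // fold stderr into stdout
            Process p = pb.start();
            try (BufferedReader out = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                // Print only the first 10 lines: the "head" part of the pipeline.
                out.lines().limit(10).forEach(System.out::println);
            }
            p.waitFor();
        }
    }

Both directions are possible, as you say; the mismatch just shows up as ceremony when a language is used outside its home ___domain.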
I agree with this but would take it even further. I think humans are varied enough in how they approach and solve problems that some programming languages just suit some people better. There is a great conference talk (I think about OCaml) from a few years ago that explores this idea; I'll try and dig it up.
> "you reach a point where there is far more significant code than can be rationalized. The best you can do is understand the overall layering, and know how to search for the oddball tool that may come in handy"
Honestly, as someone trying to get into developing web apps, I believe we are already at this point. Because of writings like Paul Graham's (e.g., http://paulgraham.com/icad.html), I figured I'd go with Common Lisp, and as an added bonus there'd be less of a paradox of choice for libraries. Not so. Already I'm looking at a half dozen different approaches to persistent storage. I used to love Perl's concept of TIMTOWTDI, but more and more I find myself drowning in a sea of options that all seem pointless.
"The word for all this is ‘mature programming environment.’ Basically, when hardware performance has been pushed to its final limit, and programmers have had several centuries to code, you reach a point where there is far more significant code than can be rationalized. The best you can do is understand the overall layering, and know how to search for the oddball tool that may come in handy" (Longer excerpt with an earlier bit about rewriting things always eventually just moving around the set of bugs, inconsistencies, and limitations rather than improving them here: http://akkartik.name/post/deepness )
And and Danny Hillis' idea about the Entanglement ( excerpt from a 2012 interview with SciAm) -
"But what's happened though, and I don't think most people realize this has happened yet, is that our technology has actually now gotten so complicated that we actually no longer do understand it in that same way. So the way that we understand our technology, the most complicated pieces of technology like the Internet for example, are almost like we understand nature; which is that we understand pieces of them, we understand some basic principles according to which they operate, in which they operate. We don't really understand in detail their emerging behaviors. And so it's perfectly capable for the Internet to do something that nobody in the world can figure out why it did it and how it did it; it happens all the time actually. We don't bother to and but many things we might not be able to." (more: https://www.scientificamerican.com/podcast/episode/the-comin...)