To respond to the quote about "few software innovations identified in recent times": there's been plenty of innovation in the last decade or so. It might not be as fundamental as the innovations of the 20th century - perhaps because the fundamental things only needed to be invented once, and now have been - but it's there. The items below are not directly smartphone-related, since I agree that most of the innovation in smartphones in particular has been in areas other than computer science; but your posts and the page you linked to are more broadly critical of progress in computer science, and most of these innovations are present in or related to smartphones anyway. Anyway:
SMP - parallelism not among separate, isolated computers whose fastest connection is probably an Ethernet port, but among multiple cores on the same die, accessing the same main memory. Of course someone had an SMP system decades ago - it's not that complicated an idea - but only recently has it become ubiquitous and critical to taking continued advantage of Moore's Law. Although it's fundamentally a hardware innovation, the ways we write multithreaded programs have evolved to take advantage of it: only recently has the sort of software used on millions of systems had to focus on fine-grained locking and lock-free algorithms, rather than slapping big locks on everything with latency as the only downside. And more unorthodox approaches are gaining traction: CSP, though invented some 35 years ago, is being rediscovered with Go (see the sketch below); various languages have experimented with software transactional memory (e.g. PyPy, Clojure); and Intel's new processors will finally bring hardware transactional memory mainstream, which might at last realize the dream of fast transactional memory.
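To make the CSP point concrete, here's a minimal sketch of the style Go encourages - goroutines that share work by communicating over channels, rather than threads contending on locks. (The job counts and the squaring "work" are made up for illustration.)

    // A minimal CSP sketch in Go: workers receive jobs and send
    // results over channels; no shared memory, no explicit locks.
    package main

    import "fmt"

    func worker(jobs <-chan int, results chan<- int) {
        for j := range jobs {
            results <- j * j // stand-in for real work
        }
    }

    func main() {
        jobs := make(chan int, 10)
        results := make(chan int, 10)
        for w := 0; w < 4; w++ {
            go worker(jobs, results) // four concurrent workers
        }
        for j := 1; j <= 10; j++ {
            jobs <- j
        }
        close(jobs) // each worker's range loop ends once jobs drain
        for i := 0; i < 10; i++ {
            fmt.Println(<-results)
        }
    }

The point is less the mechanics than the mindset: don't communicate by sharing memory; share memory by communicating.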
GPUs - massive parallelism of relatively small, minimally branching algorithms, again on a single chip found in millions of devices; again, a hardware innovation that requires new ways to write software. Yes, I know how old the transputer is, but now it's mainstream.
Virtual machines - a new consummation of the idea of time-sharing, innovative in practice if not in theory. My personal opinion is that they're a massive hack: a poor man's multi-user system that accomplishes nothing a traditional kernel couldn't have, given the right operating system design, with all the kludginess you'd expect from taking kernels designed to run directly on hardware and hacking them into running side by side. But by the time disk space and RAM became cheap enough that it was obviously better for each user of a server to have their own isolated copy of all software - free to install and maintain whatever versions of whatever packages they need - Unix had developed so much around the idea of a central administrator that the new paradigm had to evolve rather than be designed. But who cares? Worse is better - the heritage of Unix, and perhaps its conqueror. However it came about, ordinary users of multi-user systems now have more power on average than ever before; consider the difference between a PHP script on shared hosting and a modern webapp stack. And maybe a new design will come along to replace it all one of these days.
Closely related, cloud computing - driven, I suppose, by the decreasing price of hardware. The idea of computing as a commodity is hardly new, but in the last few years it has become a reality: virtual servers can be spun up and down in minutes, as part of massive server farms provided as a service to a huge number of users, at low cost. This is fundamentally changing the way software is designed: scaling up is easier than ever, but in exchange it has become more and more important to write distributed systems that can tolerate internal failure (see the sketch below).
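As a small illustration of what "tolerate internal failure" means day to day: every call between services gets a timeout and a retry policy, because the other end is assumed to be down some of the time. A minimal sketch in Go - the endpoint URL and the retry numbers are invented for the example:

    // Bound each attempt with a timeout and retry with a simple
    // backoff; the caller sees an error only if every attempt fails.
    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func fetchWithRetry(url string, attempts int) (*http.Response, error) {
        client := &http.Client{Timeout: 2 * time.Second}
        var err error
        for i := 0; i < attempts; i++ {
            resp, e := client.Get(url)
            if e == nil {
                return resp, nil
            }
            err = e
            time.Sleep(time.Duration(i+1) * 100 * time.Millisecond) // crude backoff
        }
        return nil, fmt.Errorf("all %d attempts failed: %v", attempts, err)
    }

    func main() {
        // hypothetical internal endpoint
        resp, err := fetchWithRetry("http://internal-service.local/status", 3)
        if err != nil {
            fmt.Println("giving up:", err)
            return
        }
        defer resp.Body.Close()
        fmt.Println(resp.Status)
    }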
HTML5. You can enter a URL and instantly and safely run fast code. Yes, it's just another VM; yes, Java had this a long time ago. But we avoided some of Java's mistakes, and CPUs are faster, so who knows - write-once-run-anywhere might become a reality this time.
Sandboxing. We might still be stuck with C and unsafe code, but sandboxing is starting to make software a lot harder to exploit anyway. Software security in general is receiving a lot of attention these days.
Functional languages and other languages with strong static type systems have had a resurgence lately. Going back a bit farther, you could say the same about dynamic languages such as Python, Ruby, and JavaScript. So many different languages have taken so many different paths that it's hard to identify a single clear direction programming languages have gone in, but there are commonalities, and they add up to a significant amount of incremental innovation. There's a lot more to say about this, but I'm getting tired and it would take a lot to do it justice.
Ways to write software: test-driven development, agile, etc.
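For what test-driven development looks like in practice, here's the loop in miniature using Go's standard testing package (Reverse is an invented example; in TDD the test is written first and fails until the function is implemented):

    // reverse_test.go - run with `go test`; the test drives the design.
    package stringutil

    import "testing"

    func TestReverse(t *testing.T) {
        cases := []struct{ in, want string }{
            {"hello", "olleh"},
            {"", ""},
            {"a", "a"},
        }
        for _, c := range cases {
            if got := Reverse(c.in); got != c.want {
                t.Errorf("Reverse(%q) == %q, want %q", c.in, got, c.want)
            }
        }
    }

    // The implementation written afterwards to make the test pass.
    // (Byte-wise for brevity; real code would reverse runes.)
    func Reverse(s string) string {
        b := []byte(s)
        for i, j := 0, len(b)-1; i < j; i, j = i+1, j-1 {
            b[i], b[j] = b[j], b[i]
        }
        return string(b)
    }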