This person is asking about career prospects 10-15 years out though.
I'm sorry, but the landscape then might be as alien to someone asking today as today's would have been to someone asking 15 years ago (2008).
What John said is correct, but personally I think he's underplaying how much people could be affected. Those "product skills" take years of grinding to really sharpen, and in 15 years only a few people might actually be needed to apply them.
I doubt AI will replace any job in my lifetime (got 40-50 years left).
Progress will grind to a halt just like self-driving cars did, because the real world is just too chaotic and 'random' to be captured by a formula/equation/algorithm.
My prediction is: AGI is theoretically possible, but would require impractical amounts of computing power - kinda like how intergalactic travel will never happen.
And regarding the comparison with self-driving cars: they are still improving, the bar for them is just much higher. If autopilot works 99.9% of the time, then 1 out of 1000 drivers will die, so the technology has to be even better than that. For an LLM, being 90% good is enough to be broadly useful.
It's not about replacing all programmers. If one programmer with an AI assistant can do the same work as 2 programmers, then one position is redundant.
The same goes for self-driving trucks: if one truck driver leads a second, AI-controlled truck behind them, and just for safety you have a C&C center somewhere with one person monitoring 4 such AI trucks, ready to remotely take over in case of an unexpected event, then one truck driver position is redundant.
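To make that arithmetic concrete, here's a rough sketch. The pairing (one human lead per AI follower) and the one-operator-per-4-AI-trucks ratio come from the scenario above; the 8-truck fleet is just an assumed example, not a number from any real deployment:

```typescript
// Back-of-the-envelope staffing math for the platooning scenario above.
// Assumptions: each human-driven truck leads exactly one AI-driven
// follower, and one C&C operator supervises 4 AI trucks (both ratios from
// the comment; the fleet size is an arbitrary example).
function staffing(trucks: number): { today: number; platooned: number } {
  const today = trucks;                      // currently: one driver per truck
  const leadDrivers = Math.ceil(trucks / 2); // one human leads each two-truck pair
  const aiTrucks = trucks - leadDrivers;     // the AI-controlled followers
  const monitors = Math.ceil(aiTrucks / 4);  // one C&C operator per 4 AI trucks
  return { today, platooned: leadDrivers + monitors };
}

console.log(staffing(8)); // { today: 8, platooned: 5 } -- 3 of 8 positions redundant
```

Even at this small scale, 3 of 8 positions disappear without replacing "all drivers".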
While I do think there is some threshold where increased productivity makes positions redundant, I don't think 2x would do it in most orgs. My current team easily has enough work for us all to be 2x more productive.
fwiw, self-driving cars did not grind to a halt; development just did not move as quickly as the pundits and self-promoters claimed. I just rode in a fully driverless car on public streets in downtown Austin this week.
Well, I'm in web dev (though I was studying CS in 2008) and the 2008 landscape had almost none of the same things. jQuery was not yet a household name, let alone SPAs. Facebook had barely 100 million users. Marc Andreessen hadn't yet written about "software eating the world". Personally I was more optimistic. If anything, the last 15 years have seen the growth of an attitude of tech "entitlement", because hackers got to a lot of the ideas that now seem obvious in hindsight before a lot of the big corps could.
I'm sure there's still room for innovation, but I think a lot of it going forward will be driven by rapid improvement in AI capabilities.
In 2008, tech wasn't everywhere. iPhones were brand new and very few people had them. There was no "mobile browser" market share (though we did have SMS gateways). 77% of the global population hadn't even been on the internet yet.
AI looks like it's going to be at the forefront of the next big wave of fundamental changes to society, and it's really hard to predict where that will lead us. But I suspect it's going to become apparent that this relatively brief period of tech-elite empowerment was a historical anomaly, because the AI underlings are going to be willing to do a lot more work with none of the coddling, and they're going to improve very quickly.
I totally don't see that. If you showed me AWS and modern machines and Go and React in 2008, I would certainly see that, yeah, there was some incremental progress, but by no means would my mind be blown. Not much has changed. We still write clients and servers and use HTTP, and most of the same languages are still popular, just slightly updated. Databases are essentially the same. How good phones are would probably be the most exciting thing apart from GPT.
Or TypeScript! I was writing ActionScript 3 in 2008, which is essentially the same spec.
Huh? Gmail was an SPA and that dates from 2004. I don't think a VC writing a blog post says much about how tech had changed. Smartphones existed before the iPhone and Android; they just weren't as popular.
The term "SPA" wasn't in use until at least after 2009 and gmail was probably using some hacky AJAX (XMLHTTPRequest wasn't even standardized until, what, 2006?). Chrome wasn't launched until 2008 so they weren't able to get away with just adding the APIs they needed into the browser. Backbone wasn't even released until 2010 and Angular probably wasn't conceived of internally until late 2008.
Yes, Gmail might have had some SPA-like behaviour in 2004-2006, but it was nothing like what we have today. Pretty sure I got access in 2005 because I knew someone who worked at Google, and it was mostly doing full refreshes between actions at the time, like pretty much the entire rest of the web.
SPA is just an abbreviation of "single page application" and only means a web app that doesn't do full page reloads; it doesn't require the use of any specific framework, so Gmail definitely qualifies, and it worked that way from version one. It wasn't even the first: XMLHttpRequest was created by Microsoft for Outlook Web Access and shipped in 1999 in IE5, so people started using it to build non-reloading web apps almost immediately. Before Gmail there was also Oddpost, another SPA webmail app. Gmail was the iPhone of webmail: not the first, no real new tech, but just very well done overall, and it popularized the concept.
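For anyone who never wrote it by hand, here's roughly what that pre-standardization XHR dance looked like (a sketch, not Gmail's or Oddpost's actual code; the "Microsoft.XMLHTTP" ProgID is the real IE5-era identifier, while the endpoint and element id are invented for illustration):

```typescript
// A sketch of circa-2004 "hacky AJAX": pick whichever XHR implementation
// the browser has, fetch markup, and swap it into the page in place.
declare const ActiveXObject: new (progId: string) => XMLHttpRequest; // IE5/6 only

function createXhr(): XMLHttpRequest {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest();                 // Mozilla, Safari, Opera, IE7+
  }
  return new ActiveXObject("Microsoft.XMLHTTP"); // IE5/IE6 ActiveX fallback
}

const xhr = createXhr();
xhr.open("GET", "/inbox", true);                 // hypothetical endpoint
xhr.onreadystatechange = () => {
  if (xhr.readyState === 4 && xhr.status === 200) {
    // Update part of the page without a full reload -- the core
    // "single page application" trick, framework or no framework.
    document.getElementById("mail-list")!.innerHTML = xhr.responseText;
  }
};
xhr.send();
```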
You seem to be trying to redefine SPA to mean something else and much vaguer - the use of some specific frameworks, or not being "hacky", whatever that means - but your history here is just off.
Also, jQuery was written in 2005 and launched in 2006, and became popular very fast. It was definitely pretty well known by 2008, and of course jQuery itself was nothing new; most companies had very similar sets of utility libraries for JS that they used. jQuery just happened to be open source and have a monomaniacal focus on terseness at almost any cost.
The reality is the web has changed relatively little since 2008. It got some more APIs that weren't new (they had been used in native desktop/mobile apps for many years beforehand), and that's about it.
Regarding 2008 vs 2023… how to view it probably depends on where you were in your career in 2008. To me 2008 -> 2023 looks like mostly shifting details.
SPAs certainly were a thing back then; the pattern was just called AJAX. (Not to mention the desktop apps that were, architecturally, almost the same thing.) jQuery was a response to the popularity of putting interactivity in the browser, not a precursor.
The questions remain the same, not just from 2008, but going back a long ways… Where does the code live? How is it transformed to execute, where does it execute and how is it moved there? Where does the data live, how is it transformed and moved to where it needs to be displayed, and how are changes moved back? When some of the answers shift, due to changing network capabilities, processing capabilities of nodes, or scaling needs, it doesn't really change the overall picture.
We've had LLMs for about 5 years so far outside of academic research. If we're talking 10 years out, that means today's tech is only about 1/3 of the way through the development it will have had by then.
Take any mature-ish technology that you use today and compare the version from 1/3 of the way through its life to the version you use now. Look at Chrome 20 compared to Chrome 111, or React 0.14 compared to React 18, or an iPhone 4 compared to an iPhone 14, or a car from 1950 compared to a car today...
The difference is always quite significant. Superficially they're still the same thing, but if you look at the detail everything is just better. AI will be the same.
You can't extrapolate from an arbitrary selection of technologies and assume that LLMs will have the same trajectory. They could be like the iPhone, or they could be like self-driving cars, which have been a year away from replacing all drivers for 10 years now.
Self-driving cars a few years ago seemed particularly close in hype level and apprehension to LLMs today, but progress on those has not matched expectations at all. What if GPT-4 is the last major advance in LLMs for a really long time?
Might just be me, but I think the big difference here is the level of adoption. Everybody with an internet connection can use an LLM. It hits closer to home that way, whereas driving is very dangerous and most people haven't used a self-driving car before.
Superficially they're similar in the "they both have 4 wheels and an engine" sense, but you could examine literally any part of a car today compared to one from the 1950s and find huge improvements. The efficiency, safety, comfort, tech, manufacturing... Everything is better.
I agree with the safety angle, but besides that, driving a car from 2023 is not substantially different from the 1950s (in the sense of opening up a lot of new possibilities).
When I started reading your comment I thought you were going to argue the opposite. Getting my first iPhone (3G) was a huge change. iPhone 4 to the latest are mostly incremental improvements. Aside from the camera, I could probably live with an iPhone 4 without many issues. Only the software is a lot more bloated now.
We still had a Moto X from 2013 that my wife would power on every now and then to test an app that they were developing (iOS household), and besides the camera it still looks like a perfectly usable modern smartphone. When using it, it doesn't feel like a phone from prehistory.
The whole mobile economy pretty much started in 2008. The first iPhone was released in 2007, but the App Store was launched in 2008. This changed the landscape dramatically, even if you just consider software development. Before 2008 you were fine writing just a Windows-only desktop app in Delphi - no smartphones, tablets, smartwatches, or smart TVs - and you could leave out supporting macOS or Linux.
> as today's would have been to someone asking 15 years ago (2008).
I don't think, if you took someone from 15 years ago and transplanted them here today, that they'd find it all that different technologically. Sure, machines are faster, slightly different, and such, but the fundamentals haven't changed. A software engineer could just as well write an app today as they could have 15 years ago.
You'd have to go back 30 years for computers (and the landscape of computing) to have been different enough that you couldn't transplant a software engineer.
30 years ago (1993): Linux existed, Python existed, the web existed (Mosaic), DOOM (3D graphics) and even the Apple Newton (mobile) existed; and C, shell, windows (GUI), spreadsheets, SQL, etc. were known long before that.
What exactly happened in the last 30 years that was revolutionary? JavaScript? (A two-week project.)
Amazon, Google, Facebook, Netflix, the iPhone, Instagram, TikTok -- the execution is great, but it seems inevitable that somebody would have created them. OK, for non-IT people the iPhone was a game changer (the first personal computer that your grandmother could actually use).
The ability of generative AI to produce BS indistinguishable from human BS is very impressive, but it remains to be seen whether it is a net positive for an average developer (the time wasted correcting it and waiting for its output could be spent understanding the problem better -- typing the code itself is a small part of the work for a programmer who knows what they are doing).
Was the tech landscape much different 10-15 years ago? This is a genuine question; the iPhone App Store was really the last "big thing" to happen to the industry in my mind, and it came out in 2008.
> I'm sorry, but the landscape then might be as alien to someone asking today as today's would have been to someone asking 15 years ago (2008).
Hahahah. Yes. Who could have foreseen the trailblazing advances in the tech industry such as "television, but over the internet", "booking rooms, but via a website" or "posting messages on a forum"
Don't forget the stuff powering it: "RPC, but over HTTP", "scripting languages, but compiled", or "Key-value stores"
2008 was extremely similar to today, although the webdev ecosystem wasn't quite as degenerate. I'd say you'd have to go back to the pre internet era to find a work environment that was fundamentally different.
You have plenty of time: you can learn CS and earn a lot of money for years even if, exactly 120 months from now, your job is made obsolete. It doesn't take 9 years to learn to code.
The premise of all this seems to be that learning how to program computers is difficult or complex. It is not.
Also, AI will not replace human reasoning in 10-15 years. If it does, it means AGI, and we all have much bigger problems than layoffs.