> His prediction was based on the trends he was seeing at the time. But it wasn't even AI. Instead, he made this prediction because he saw the rise of no-code tools and expected them to replace software developers, since managers could finally cut out the pesky "translators", i.e. the developers themselves.
It might sound really crazy and stupid today, but when SQL came out, it was advertised as one of the "program-generating" languages and was supposed to reduce the need to code.
My career predates relational/SQL databases so I can confirm what you wrote. When Oracle came out the buzz was that a whole bunch of programmers would go away because managers and executives could write queries in a “natural” English-like language.
That never happened. SQL is hard to master, but it’s the easy part of understanding the relational model and any particular schema. Instead Oracle and the other RDBMSs that followed created more jobs for programmers and database analysts and admins.
As someone trying to learn SQL, it feels like there are many more steps:
* language syntax
* the relational model (relatively easy? It is just "there"?)
* the database schema, where I have 3,000 tables, or 50,000 tables (that's how ERPs are made)
* actually knowing how to use the language
* building those queries that join columns from 15 different tables together to deliver the needed data -> it sounds easy, but I'm struggling to do it better/faster - I never saw a book or website that focuses on that (multiple joins across multiple tables), when it feels like my work is mostly that
* understanding what is needed
* actual "programming" problems (say: you have 100k items in stock, those are parts of BOMs, make a 'where used' list... yes, I know you can google it and try to copy from Stack Overflow)
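For that 'where used' problem specifically, a recursive CTE is the standard SQL tool. A minimal sketch, using Python's built-in sqlite3 and a made-up two-column `bom(parent, child)` table standing in for a real BOM schema:

```python
import sqlite3

# Hypothetical BOM table: each row says an assembly contains a child part.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE bom (parent TEXT, child TEXT);
INSERT INTO bom VALUES
  ('bike',  'wheel'),
  ('wheel', 'spoke'),
  ('cart',  'wheel');
""")

# Recursive CTE: walk upward from one part to every assembly that
# uses it, directly or indirectly.
where_used = con.execute("""
WITH RECURSIVE used(assembly) AS (
    SELECT parent FROM bom WHERE child = 'spoke'
    UNION
    SELECT b.parent FROM bom b JOIN used u ON b.child = u.assembly
)
SELECT assembly FROM used ORDER BY assembly
""").fetchall()

print([row[0] for row in where_used])  # ['bike', 'cart', 'wheel']
```

The `UNION` (rather than `UNION ALL`) also guards against infinite loops if the BOM data accidentally contains a cycle.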
Seriously, I am now trying to learn how to set up a simple DB to consolidate data (20 CSV files on a shared drive) and the guides are often like the 'how to draw an owl' meme...
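As a rough starting point for the CSV-consolidation problem, a few lines of Python's standard csv and sqlite3 modules are often enough; the file names, columns, and table here are invented for illustration:

```python
import csv
import io
import sqlite3

# Hedged sketch of consolidating CSV files into one SQLite table.
# Two small in-memory "files" stand in for the 20 CSVs on a share drive;
# in real use you would loop over glob.glob() and open() each path.
files = {
    "north.csv": "region,amount\nnorth,10\n",
    "south.csv": "region,amount\nsouth,25\n",
}

con = sqlite3.connect(":memory:")  # use a filename to keep the DB on disk
con.execute("CREATE TABLE sales (source TEXT, region TEXT, amount REAL)")

for name, text in files.items():
    reader = csv.DictReader(io.StringIO(text))
    con.executemany(
        "INSERT INTO sales VALUES (?, ?, ?)",
        [(name, row["region"], row["amount"]) for row in reader],
    )

total = con.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 35.0
```

Keeping a `source` column per row makes it easy to trace bad data back to the originating file later.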
Controlling/reporting/analyst jobs feel in some ways like "programmer" jobs but without the tools/training/salary - you just figure stuff out on your own. I'm doing it, but apart from the SQLzoo website I didn't manage to find any decent guide for the 'usual' problems. Also, since those are like 10% of my work I can't focus on those problems properly - and have to try to learn it after work.
Also, SQLzoo is a lot like the 'how to draw the owl' meme: the easy tasks are easy, the hard ones are impossible, and there are no medium tasks. There also don't seem to be any real-life examples like 'join 15 tables to make some report'.
Shoot me an email, I’d be more than willing to pair with you. I’ve been a “software engineer” of some flavor for about twenty years now, and about five of those were spent writing SQL almost exclusively.
You can reach me at nominallyanonymous-at-protonmail-dot-com. From there I’ll give you my “durable” contact info - Slack, Discord, SMS, Signal, Telegram… whatever you use regularly, basically.
Tools matter. When I learned SQL long ago it was with FoxPro for DOS, and it was a great tool for doing both SQL and text-based UIs (as with curses). Later I used MS Access 97, which was an even better tool and sparked a lifelong interest in data modeling. The UI for building up tables (specifying column data types, etc.) was really trail-blazing at the time, and the interaction remains good today. The built-in ERD charting tool was good too, showing your tables in relationship to each other. The visual query builder was... well, I never used it, but I suppose it was good? You just had lots of good tools to build tables, flip through their contents, and visualize their relationships.
I don't know of any modern environment that functions like that, on any platform.
I'm posting this to invite others to either a) correct me and tell me I'm wrong that tooling doesn't matter (a legitimate view, but wrong), and/or b) recommend some modern, OSS Access-like tools that might help flatten the learning curve for you. (And if you're more comfortable with a CLI and a REPL already, then you don't even need this hypothetical tool, but I myself am curious about it.)
EDIT: looks like LibreOffice Base is pretty close to MS Access circa 1997.
At some point you have to learn the relational model. And you have to make the mental switch to understand SQL as a declarative language, not an imperative language.
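To make that declarative/imperative distinction concrete, here is the same per-customer aggregation done both ways (Python with sqlite3; the table and data are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (customer TEXT, total REAL);
INSERT INTO orders VALUES ('a', 5), ('a', 7), ('b', 3);
""")

# Imperative: spell out *how* to walk the rows and accumulate.
sums = {}
for customer, total in con.execute("SELECT customer, total FROM orders"):
    sums[customer] = sums.get(customer, 0) + total

# Declarative: state *what* you want; the query planner decides how.
sql_sums = dict(con.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer"))

assert sums == sql_sums  # both give {'a': 12.0, 'b': 3.0}
```

The mental switch is that the `GROUP BY` version says nothing about loops, ordering, or intermediate state; that freedom is exactly what lets the engine optimize it.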
As someone who did a lot of SQL back in the day, and is now doing some again for a startup POC, I'd say you're basically right but there is also:
* Rewriting queries, and also parts of the schema, when it turns out the things that made sense in the design phase cause massive bottlenecks once you get more data, or just get it faster, than you tested with.
Of course the good news is that you can now run the best RDBMSs on your laptop, or on a $5/mo VPS; or have a small managed one starting at <= $15/mo. Plus ChatGPT can help you remember how to do that inner join. ;-)
Building the queries is the easy part. Making the queries run fast on large tables is difficult, and training is available, but it is very focused and a bit expensive compared with the courses on large training websites (e.g. LinkedIn Learning). For example, Brent Ozar (brentozar.com) has lots of blog posts and two sets of training courses for MS SQL Server, some for beginners and some for experts, that are extremely useful for people with such needs. Problem is, expert tuning for MS SQL is totally different from expert tuning for Oracle, which is totally different from Postgres and the others.
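A tiny illustration of what that tuning work looks like, using SQLite's EXPLAIN QUERY PLAN from Python (the plan text and the tuning levers differ across engines, which is exactly the point above):

```python
import sqlite3

# Same query, before and after adding an index (SQLite shown; every
# engine prints plans differently and has different tuning knobs).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, total REAL)")

q = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE total > 10000"
before = con.execute(q).fetchall()   # plan says SCAN: full table scan

con.execute("CREATE INDEX idx_orders_total ON orders (total)")
after = con.execute(q).fetchall()    # plan now says SEARCH ... USING INDEX

print(before[0][-1])
print(after[0][-1])
```

The query text is identical in both cases; only the physical plan changes, which is why tuning is a separate skill from writing the query in the first place.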
On a side note, if you have to join 15 tables for a report, it's a sign you may be going in the wrong direction. In 99% of cases I never needed more than 5-6 tables if the data is well structured, but that takes years to learn.
I'm not an SQL expert, but data persistence and consistency are generally very hard problems. It's a lot of steps, but I'm not sure it's more steps. More than what? Than if these data were stored as a big binary blob?
Mine doesn't predate it but it's very confusing for me to read this opinion.
From my point of view, it totally did happen?
Can you imagine how many programmers the company would've needed to get all the data a business analyst casually queries per day?
What you're looking at is the quantity of people actually employed in the industry, not how many SQL made obsolete. The industry just grew so much that it didn't become an issue.
A few things happened. Relational databases enabled more new development, larger databases, interoperability, all of which needed programmers.
With more flexibility in the database companies could collect, store, and use more data. And that data had to get into the DBMS somehow: more code.
Setting up and managing databases required DBAs, a role often filled by programmers in smaller companies, and still filled by programmers today. And only larger companies had business analysts. In smaller companies programmers or maybe technically-proficient managers did that work.
Anyone who had devoted their career solely to building bespoke database systems had to pivot or walk in the late ‘80s, but very few programmers only did that — it was part of the larger application development. If you were good enough to write robust database code pre-Oracle you had plenty of options.
In the ‘80s when RDBMSs started to take over I worked in enterprise logistics. Oracle created jobs at the places I worked, and no programmers had to leave because we got a better tool.
I’ve worked in large and small organizations for 40+ years. I’ve never worked with a manager or analyst who could write SQL more complex than “SELECT * FROM orders WHERE total > 10000.” I’m sure they exist, but not in career-threatening numbers.
In my department I recruited DBAs as recently as last month, and this is a permanent position filled by a team of several people who do just DBA work. I saw developers ("programmers") try to do this work in some small companies, or in areas where the databases were small (hundreds of MB, a few GB), but I did not see that when the databases exceed 100 GB or when there are dozens of production SQL servers that need to run 24x7. Solutions are implemented based on needs.
The industry grew because the desire for data analysis grew, which is because the technology's ability to meet the desire grew. This can repeat itself a couple more times.
You're assuming that every company that needs SQL today would hire enough developers to essentially write most of it from scratch. While some might, most companies that use SQL would not exist, because the cost of developing their product would be prohibitive.
Maybe. Before Oracle (mid-80s) every company did write their own database code.
I think a lot of smaller companies would struggle if that was still a requirement, but if relational/SQL had not come along we’d have something else like it.
That's exactly my point, though. In the mid 80s there were a lot fewer companies producing software. Nowadays we have many more.
Indeed, SQL was not the only local maximum we could have gone for, but the point is that having an easy-to-use database with a powerful query language did not reduce the number of jobs; it increased them. Instead of a few companies hiring a lot of developers, we have a lot of companies hiring a few. The latter usually means more jobs.
If hardware were cheaper but writing software still required an army of developers, the costs would still be too great. If you read carefully, my point isn't that SQL was the cause of the tech boom, but rather that SQL and other technologies that make developers more productive didn't really take jobs away, because the market for tech would be smaller if you needed too many developers to do anything.
Imagine if every little web startup writing a custom CRUD app for its business needed to write a new bespoke database. It simply would not be feasible, or they'd need some crazy funding.
>Before Oracle (mid-80s) every company did write their own database code.
Not really. There were a ton of ISVs competing with Oracle and pretty much every mainframe and minicomputer maker also had their own database products, many of them non-SQL.
Oracle was the first commercial RDBMS (1979), with an early version of SQL. At that time every mainframe and minicomputer company offered one or more “databases,” often little more than file managers with a library of utilities. ISAM was a popular model but that describes a lot of data management systems, not a product or standard.
All commercial databases pre-Oracle were non-SQL. Watching that history get rediscovered and pushed as an improvement by people mostly too young to know better — so-called NoSQL databases — brings to mind Alan Kay’s quip about the software industry constantly reinventing the flat tire.
The hash-based Pick OS/database came out before Oracle but only ran on a few computers from Microdata and later Pr1me and Honeywell. Pick-based systems remained popular into the early 2000s in some sectors. A friend of mine still works on one, for a state government agency.
You could construct a database management system from the ISAM-based libraries for COBOL or Fortran, but I wouldn’t call those database management systems comparable to Oracle. Mostly they didn’t have a query language per se — you had to write code to get anything in or out. And they ran as part of the application process, not as a separate server dedicated to running the database engine.
I was thinking Db2 might have been a little earlier but you're right. And it's fair that the earlier databases definitely lacked elements of what we'd consider a database management system today even if they handled a lot of the low-level database heavy lifting for companies.
I have seen this multiple times. Got called into a corporate audit once because the CFO's spreadsheet didn't match reports from the database. His SUM() function was missing some rows, ZIP codes got converted to scientific notation and he didn't know why, that kind of thing.
Maybe I’m misreading, but “not much harder than” implies SQL and Excel aren’t that hard to learn. “At least as hard as” implies both are similarly difficult.
Funny - COBOL was also intended to let the 'business' community write their own programs.
Even funnier - we've already lived through the great 'software development adoption' by business and hardly noticed, except some of us got work out of it. A lot of small businesses (legal practices and accountancies were particular suspects) grabbed MS Access with both hands and went wild in the 90s/early 2000s. Timesheets and other HR functions were popular applications.
I’m quite sad there isn’t anything like Access these days. I feel like I see fairly clever solutions being built in Excel, but they can’t scale very well to multiple users. For a lot of processes, something like Access would be great for bridging the gap.
My oversimplified observation is that, Adobe products at the high end notwithstanding, MS Office basically crystallized what a mainstream office productivity suite was, and companies stopped buying one-off products. Rather than buying a low-end/midrange desktop publishing program, they made it work with Word. Rather than using a low-end database (of which there were many), they made do with Excel.
I didn't mean SQL wasn't a success. I meant, as the commenter above me said, that a higher-level tool doesn't necessarily replace the lower-level ones, or reduce the need for them.
Relational databases replaced a whole raft of vendor-supplied databases and custom-built solutions. It was a new tool for software developers, not a replacement.
I don't actually agree. SQL did replace the need for programmers, especially for complex questions. We sabotaged its interfaces to existing GUIs for mundane questions and made CRUD a limitless profession.
The funny/sad part about computer science is that people don't want to understand the costs of customization and sales/management/marketing forever want it as a differentiator.
SQL could have eliminated us from the flow of many niches as easily as the spreadsheet did from business operations. I think why it didn't has more to do with market timing.
SQL was too far ahead of its time, but we are indeed slowly adopting the "avoid operational specifications, use high-level languages, make your language fit the problem instead of the opposite" philosophy of the 4th-generation languages.
It was very true, and SQL is one of the most successful DSLs ever invented. And many people wrote SQL who never would have touched other languages with a ten-foot pole.
It’s just that, with more powerful tools we can create more advanced and intricate things, so it never ends…
I don't exactly welcome, but rather anticipate, a near future where, instead of reinventing every kind of pattern and library in terms of cloud and infrastructure management, we will see the same meta-management of different AI engines and services…
But it is true, just in a different area than predicted. I hear all the time that scientists and analysts, while being non-programmers, do write various queries without asking a programmer to do it… I am a programmer and I'm not even sure how much time it would take to implement their regular joins with windowing, aggregates, etc. over a record-based file format. Can I even do that?
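As a concrete example of the kind of analyst query being described, here is a window function plus a partitioned aggregate, run through SQLite from Python (the sales table and values are made up):

```python
import sqlite3

# The sort of thing a non-programmer analyst writes casually in SQL:
# rank each sale within its region, and attach the per-region total --
# painful to reimplement over a record-based file, a few lines here.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('n', 10), ('n', 30), ('s', 20);
""")

rows = con.execute("""
SELECT region, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rk,
       SUM(amount) OVER (PARTITION BY region) AS region_total
FROM sales ORDER BY region, rk
""").fetchall()

print(rows)
# [('n', 30.0, 1, 40.0), ('n', 10.0, 2, 40.0), ('s', 20.0, 1, 20.0)]
```

Doing the same by hand means sorting, grouping, and a second pass for the running totals, which is exactly the "can I even do that?" point above.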
SQL was a typical failure in this regard. It was seen as a savior but at the same time was designed for at least math-versed people. The prediction was right, its scope was wrong. Since then we failed many times and as I see it, we tend to diverge from this idea more and more. And the reason is there’s no one controlling our stacks vertically, so it takes more jobs than it could to make an app.
I think it used to be that one needed to write dedicated programs to go through large amounts of data --- you needed to know a programming language, all its toolchains, and all the proper algorithms for doing the query __out-of-memory__ --- certainly above your ordinary analyst's pay grade.
> It might sound really crazy and stupid today, but when SQL came out, it was advertised as one of the "program-generating" languages and was supposed to reduce the need to code.
https://en.wikipedia.org/wiki/Fourth-generation_programming_...
(I mean, in some sense it's true because it's much less code than writing our own database...)