If you enjoyed the Romero article, there is an informative and belly-achingly funny podcast episode of Blindboy interviewing John and his spouse Brenda in front of a live audience in Ireland: https://podcasts.apple.com/us/podcast/brenda-and-john-romero...
John is great in that interview, but Brenda almost outshines him - she has a really fascinating history and a wealth of experience in the gaming industry.
If you're not far enough down the rabbit hole after listening to the above, Brenda does a solo interview on the similarly awesome "Retro Hour" podcast that is a great listen as well: https://audioboom.com/posts/7797932-brenda-romero-wizardry-s...
Another thing I found impressive is how fast Carmack grew in terms of technical skill.
In 1989 and 1990 he was still programming Apple ][ tile-engine games, essentially Ultima spinoffs. Not bad, but not a new thing either. Starting from 1991, when he joined Softdisk, it took him a few months to figure out smooth horizontal scrolling, and in a bit more than a year, around late 1992, he had gone from that to a mature ray casting engine - you can see his evolution from Hovertank to Wolfenstein. It took him another year to go from Wolfenstein to ShadowCaster to Doom. Quake took longer, but it was still impressive considering he had to solve it before anyone else did.
I mean, this guy learned really fast. And it's not only the rendering engine, mind you - he had to do a lot of the other parts of the engine too.
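For readers who have never poked at one, the core of a Wolfenstein-style ray caster is conceptually tiny. Here's a toy sketch in C (my own illustration with made-up map size and screen height, nothing like id's actual code): cast one ray per screen column through a 2D grid, and make the wall's on-screen height inversely proportional to the distance at which the ray hits a wall.

    #include <math.h>

    /* Toy ray caster sketch: a tiny grid map, 1 = wall, 0 = empty. */
    #define MAP_W 8
    #define MAP_H 8
    #define SCREEN_H 200

    static const int map[MAP_H][MAP_W] = {
        {1,1,1,1,1,1,1,1}, {1,0,0,0,0,0,0,1}, {1,0,0,0,0,0,0,1},
        {1,0,0,1,1,0,0,1}, {1,0,0,1,1,0,0,1}, {1,0,0,0,0,0,0,1},
        {1,0,0,0,0,0,0,1}, {1,1,1,1,1,1,1,1},
    };

    /* March one ray from (posx, posy) at the given angle until it hits a
       wall, then return the wall height in pixels for that screen column.
       A real engine would use a DDA grid walk and fish-eye correction;
       the naive fixed-step march here is just for clarity. */
    int cast_column(float posx, float posy, float angle)
    {
        float step = 0.01f;
        float dx = cosf(angle) * step, dy = sinf(angle) * step;
        float x = posx, y = posy, dist = 0.0f;

        while (map[(int)y][(int)x] == 0) {
            x += dx; y += dy; dist += step;
        }
        return (int)(SCREEN_H / (dist + 0.0001f)); /* nearer walls look taller */
    }

Repeat that for all 320 columns, fanning the angle across the field of view, and you have the skeleton of Hovertank/Wolfenstein-style rendering.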
Yes, but back then you had to create the whole thing from scratch. Nowadays there are libraries that will do basically anything you can dream of. There's middleware. Whole game engines.
That era? Want to blit on the screen? Do it yourself. Want to load an asset? Not many pixmap formats to choose from. You may have to write the code to load it yourself. Want to play some beeps? Better control the PC speaker frequencies by turning it on and off (without spending too many cycles on it). Sound cards didn't make the problem any simpler.
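To make the "do it yourself" point concrete, here is roughly what plotting a pixel looked like - a minimal sketch assuming a 16-bit DOS compiler such as Borland Turbo C (MK_FP, int86 and the far keyword come from that toolchain), not anyone's actual engine code:

    #include <dos.h>

    /* VGA mode 13h: 320x200, 256 colors, linear framebuffer at A000:0000. */
    static unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);

    void set_mode_13h(void)
    {
        union REGS r;
        r.x.ax = 0x0013;          /* BIOS "set video mode" call */
        int86(0x10, &r, &r);
    }

    void put_pixel(int x, int y, unsigned char color)
    {
        vga[y * 320 + x] = color; /* one byte per pixel, written by hand */
    }

No SDL, no DirectX - your "blitter" was whatever loop you wrote around code like this, and every cycle it burned was a cycle your game logic didn't get.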
You can't even code in one window and watch the results in another window. Unless you had loads of cash like Carmack at some point and could buy a NeXT. It's code, run the compiler, run the game. If it crashes it might take the entire machine with it. Reboot and try to figure out what went wrong. You _might_ be able to get a debugger, but they were extremely primitive - and with no memory protection, the debugger (which would likely modify the code and add side effects) could crash too.
Yeah, simpler games. But glacially slow to code anything. And very little publicly available information.
You try figuring out a highly performant ray casting engine without the internet. Or implement smooth scrolling after entire companies had tried and failed.
> You try figuring out a highly performant ray casting engine without the internet.
Not that it detracts from what he accomplished in any way, but Carmack's alternative to the internet seems to have been books and research papers.
From GEBB DOOM (the Game Engine Black Book: DOOM):
"John started searching around for 3D research papers. He had several
VHS tapes of math conferences, and compendiums of graphics papers from
conferences because game books were a rare thing back then, and there was
nothing printed that could help us create the engine we were building – he had
to figure out where to get information that was not directly applicable to games
and figure out how to adapt it to his problem.
Bruce Naylor’s May 1993 AT&T Bell Labs paper was titled "Constructing Good
Partitioning Trees" and was published in the proceedings of Graphics Interface
’93. John had this book in his collection. Bruce’s explanation of BSPs was
mostly to cull backfaces from 3D models, but the algorithm seemed like the
right direction, so John adapted it for Wolfenstein 3D.
— John Romero"
This is still the way things work now. A lot of techniques new to games come from earlier academic research; there's even cross-pollination back from industry as well.
Doom uses BSP trees for its levels and Quake uses them as well, but in 3D. The main difference was that Doom's BSPs could be built on the fly (2D) while Quake's needed to be compiled ahead of time: a BSP compiler had to be run, along with a visibility optimizer and a lighting program, in order to get the compiled map file for Quake.
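For anyone who hasn't seen it, the payoff of a BSP tree at render time is a trivially simple recursive walk. This is a minimal sketch in C of the front-to-back traversal idea (my own illustration with an arbitrary sign convention, not id's code): recurse into whichever side of the partition the viewer is on first, so nearer geometry comes out before farther geometry.

    /* One node of a 2D BSP tree: a partition line through (px, py) with
       direction (dx, dy), plus front/back children. NULL child = leaf. */
    typedef struct BspNode {
        float px, py, dx, dy;
        struct BspNode *front, *back;
    } BspNode;

    /* Which side of the partition line is the point on? The sign of the
       2D cross product decides; the <= 0 convention here is arbitrary. */
    static int on_front_side(const BspNode *n, float x, float y)
    {
        return (x - n->px) * n->dy - (y - n->py) * n->dx <= 0.0f;
    }

    /* Visit the viewer's side first: that yields front-to-back order,
       which is what lets a Doom-style renderer stop early once every
       screen column has been covered. */
    void render_bsp(const BspNode *n, float viewx, float viewy)
    {
        if (n == NULL) { /* draw_leaf(...) would go here */ return; }
        if (on_front_side(n, viewx, viewy)) {
            render_bsp(n->front, viewx, viewy);
            render_bsp(n->back,  viewx, viewy);
        } else {
            render_bsp(n->back,  viewx, viewy);
            render_bsp(n->front, viewx, viewy);
        }
    }

Building a good tree (choosing partitions that minimize splits) is the expensive part, which is exactly why Quake pushed it into an offline compile step.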
Smooth scrolling on the PC was a particularly difficult problem because of design decisions made when creating PC expansion card type video systems.
Smooth scrolling, in general, was a known thing on other hardware. It was achieved either with a CPU-throughput and display-resolution combination fast enough for the CPU to drive the display directly, or with the assistance of video display chips that offered the kind of control needed to scroll smoothly in hardware - all while leaving enough of the system's resources available for an actual game to be built around the effect.
I've been making games since I was a kid in the 80s, and whilst what you say is kind of true, the hardware and the interfaces to it were spectacularly simpler as well. There was also more public info than you might suspect, as making games was a popular thing to do then as well.
> You can't even code in one window and watch the results in another window. Unless you had loads of cash like Carmack at some point and could buy a NeXT.
Yes, you could. PCs from the beginning could support dual monitor setups with MDA plus any of the graphics cards, and this wasn't an uncommon setup for a long time because MDA had sharper text than any of the IBM color graphics standards until VGA, which had a text mode with slightly better resolution.
> And very little publicly available information.
There was lots of publicly available information. Less of it was free, but it was in books: the less esoteric stuff filled decently sized computer sections in general bookstores and computer hardware/software stores, plus magazines you could subscribe to, while the more esoteric material was hard to find outside of libraries (especially university libraries). It was more work to find some information, but the signal-to-noise ratio was better.
Back in the 80s on the Apple ][ (both Johns started on the Apple ][), whoever wanted to develop competent games would switch to assembly ASAP. Whoever couldn't figure it out would drop out eventually. Whoever managed to join professional gamedev (be it Origin or Softdisk) HAD to know a lot of low-level stuff and be fluent in assembly, Pascal and perhaps C. Example: Carmack had to code the rendering engine and network code for DOOM, probably singlehandedly.
Today's world is a lot easier, so a larger percentage of people are retained. It's a lot easier to write C#, use Unity and get a demo out in, say, a few days or weeks. No low-level programming skills are needed at all.
This creates a bias when we compare "classic professional programmers" versus "modern professional programmers". They simply don't need the same set of knowledge.
I find constraints are good. Limitations inspire creativity. It might be easier because the problem is obvious, difficult but obvious. Modern day programming with 1k nodejs module dependencies is a nightmare.
Quake and beyond was when he and id brought in Michael Abrash, and IIRC Abrash helped a lot with the low level graphics of the Quake engine. Not to say Carmack couldn't handle it or it was beyond him but Abrash, a legend in the graphics programming world at the time, definitely helped move technology forward.
From what I remember reading back then, Carmack was already a huge fan of Abrash's tech articles in magazines before Quake, so he had already been learning from Abrash for a long time before they worked together.
Not a lot of people, but other people were building similar games at the same time. Carmack gets a lot of recognition because the games were spectacularly successful. For example, System Shock was released just a little after Doom and was technically superior. Ultima Underworld was released the year before and had an engine better than Wolf3D. Underworld implemented texture mapping first and actually inspired Carmack to do the same.
Curious what makes you say that. While it was full 3D, it ran so much slower than DooM that, to me, "technically superior" is very subjective.
>Ultima Underworld was released the year before and had an engine better than Wolf3D
This one is definitely true, but with a caveat - Wolf3D could be played fullscreen, while Ultima's first-person window was only a small part of the screen. The rest was an inventory and other UI, which allowed it to run faster since the actual rendered area was significantly smaller.
Hey, just noticed this reply, HN doesn't notify people so you're lucky I skimmed down the first page of my own comments! :D
I agree technically superior is a subjective notion, particularly as it relates to games and how good a game is. I do think broadly that a full 3D engine is more complex than a 2.5D engine like Doom. There were quite a few new challenges in terms of making Quake for example. System Shock and Doom are also very different games in terms of their complexity. So I'd justify my opinion based on the game and the technology underpinning it being more complex.
For Ultima that becomes more of a production choice. Do they optimize the renderer for fullscreen resolution, or build the game? Again, a significantly more complex game than Wolf3D, so I can understand doing the latter.
None of this takes away from the achievements of the respective teams but we do well to remember what actually happened rather than just lionizing one man.
>we do well to remember what actually happened rather than just lionizing one man
This statement alone was worth the price of admission. As a fan of game engine history (highly recommend Fabian's books: https://fabiensanglard.net/), I have to remember that all of these things were built on the shoulders of giants. It wasn't just one person that contributed, even if it may be only their code that lives on.
Yup, I have his books and am old enough to have been around for all these games back in the day, although I didn't start modding games until Duke3D and Quake. I was actually poking around in the source of Doom the other day to see how it handled vector normalization in fixed point (answer: convert the vector to an angle, then back to a length-one vector by virtue of the conversion). It's very fun to go back and pootle through.
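If that parenthetical is hard to picture, here is the idea in plain floating-point C - just an illustration of the trick, not the engine's code; Doom itself does this in 16.16 fixed point with angle and sine/cosine lookup tables:

    #include <math.h>

    /* Normalize a 2D vector "via angles": convert it to a direction angle,
       then take (cos, sin) of that angle, which is unit length by
       construction. (The zero vector isn't handled here.) */
    void normalize_via_angle(float x, float y, float *nx, float *ny)
    {
        float angle = atan2f(y, x);  /* direction of the vector */
        *nx = cosf(angle);
        *ny = sinf(angle);
    }

In fixed point this is attractive because the table lookups sidestep the square root and division a conventional normalize would need.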
A lot of the skills were very similar back then, with almost everything just about how to blit 2D data to the screen in the shortest amount of time. At the time period you're talking about communities were starting to appear, certainly on Usenet and BBSes where coding techniques were discussed and traded. There was a huge amount of competition too between coders back then to get more and more out of the limited hardware we had to work with.
And you had to code everything yourself from scratch. There was no throwing a bunch of vertices and textures at some hardware and getting back a fully rendered scene. It was a fun era. I know I did the same as Carmack and went from writing 2D platformers to writing Phong-shaded, perspective-corrected texture-mapped worlds in under a year.
Everything that Carmack did had been done before, but Carmack made it cool. He had way more vision and imagination than the guys that had done it before.
The smooth scrolling was easy; the EGA had more or less the same hardware capabilities as a C64 in terms of smooth scrolling. It's just that few or no games had tried to use it. For VGA, Mode X was the real game changer because it allowed you to use both double buffering and smooth scrolling at the same time. A later 2D platform game that took Keen's smooth scrolling to the next level is Jazz Jackrabbit.
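The hardware trick behind that kind of smooth scrolling and page flipping is just telling the VGA where in video memory to start scanning out the next frame. A rough sketch, assuming a Borland-style DOS compiler with outportb (the register indices are the standard VGA CRTC ones):

    #include <dos.h>

    /* Point the CRTC display-start address at a new offset in video memory.
       Scrolling or flipping pages is then a couple of port writes per frame
       instead of copying the whole screen with the CPU. */
    void set_display_start(unsigned int offset)
    {
        outportb(0x3D4, 0x0C);                  /* index: start address high */
        outportb(0x3D5, (offset >> 8) & 0xFF);
        outportb(0x3D4, 0x0D);                  /* index: start address low  */
        outportb(0x3D5, offset & 0xFF);
    }

Carmack's adaptive tile refresh for Keen combined this kind of start-address scrolling with redrawing only the tiles that actually changed.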
But as CPUs got more powerful, Carmack recognized that you didn't need hardware help anymore and that removed constraints on what games he could program. Keen to Wolf3D to Doom is when id games went from good to wow to insane.
things were simpler, so the scope of what small teams could achieve was greater... but there was also complexity in different ways. (weak memory protection so lots of reboots, buggier tooling, near and far pointers, memory and cpu constraints)
i think maybe the most exciting part of that time was the rate of innovation in hardware. new capabilities were opening up every few years and you didn't need an enormous team or tons of capital to push the envelope.
Synthesis is underrated. Yes, front-to-back BSP tree rendering "had been done before" by like two academics, who didn't use it to make a game, let alone a great game.
To solve a new problem space, Carmack would buy a bunch of college textbooks, then check into a random motel for a few weeks and just dissect all the topics and books until he'd come up with the design of, for instance, a new rendering engine.
"In the last six months of 1991, we started and shipped five games,"
What was so different about development back then that they were able to do that? That's unheard of today despite better hardware and a lot more tools. And they didn't have stackoverflow.com!
First, all the developers were extremely talented, and some had bagged years of production experience under their belts before they formed id.
Second, as others mentioned, these are relatively simple games, not ones like Ultima or the Gold Box titles that usually take years to develop.
Third, I think there is a unique culture in teams such as early id: no bullshit, devs willing to devote 80 to 100 hours per week, no management overhead as they KNOW what they are going to do.
> No bullshit, devs willing to devote 80 to 100 hours per week, no management overhead as they KNOW what they are going to do.
Yes, I think nowadays there is rightly a reluctance to do this sort of thing without either an ownership stake or very high compensation - neither of which are common in the gaming industry.
Agreed, although I'd say it's a different type of ownership they had - a better type. Not only did they own the company legally, they also got to make the games they dreamed of. Today one might be able to lure a technical person with stock and options, but he or she wouldn't be that devoted without loving the product (that much).
Like, Commander Keen and the other games of that era took about the same amount of effort that might go into a particularly well-staffed month-long game jam today, just in terms of assets and total code needed. CV-11s video on Quake has a section covering that transition from mostly 2D to mostly 3D. It caught a lot of folks off guard.
Yes and no.. assets were simpler, games were simpler.
But they had to write a lot of their tools to create the assets. They had no middleware, no engine to license. They had to write stuff in assembly. The computers they used were super slow and you could crash them so easily. At some points in id's early history they were smuggling computers from Softdisk out of their offices, working over the weekend, and then taking the computers back Monday morning. The tools were terrible. Documentation was a lot harder to come by. A lot of the people at id in the beginning were also juggling a day job; they were moonlighting making those earliest games. IIRC Wolfenstein was the first one they worked on full time.
Doom was developed on NeXT workstations, under the NeXTSTEP operating system. The final game engine was programmed in C, and the editing tools were written in Objective-C. The engine was first compiled with Intel's C compiler for DOS, but later Watcom's compiler was used.
Over the entire course of Doom and Quake 1’s development we probably spent $100,000 on NeXT computers, which isn’t much at all in the larger scheme of development. We later spent more than that on Unix SMP server systems (first a quad Alpha, then an eventually 16-way SGI system) to run the time consuming lighting and visibility calculations for the Quake series. I remember one year looking at the Top 500 supercomputer list and thinking that if we had expanded our SGI to 32 processors, we would have just snuck in at the bottom.
Perhaps that is another differentiator contributing to the ID team moving as quickly as they did. SGI NUMA systems were not cheap! How many teams had that kind of best in class hardware in house?
16 MIPS processors housed in metal heat sinks that look and feel like small bricks! Real steel and aluminum all over the place!
IRIX delivered NUMA-style multiprocessing running as a single OS image that scaled up to 2K CPUs! I believe NASA had a 2K-CPU system for quite a while.
Running one of those beasties was an excellent overall computing experience. At that time, perhaps unexpectedly, one of the most impressive features was the documentation. The early version of that system, called "Online Books", was installed through the software manager - itself essentially what we know as a package manager today - and featured a Windows-Help-style application designed to render the docs and many high-quality illustrations in a compact, searchable, selectable form. Selecting text meant being able to work through advanced system examples copy-paste-into-terminal style, among other things.
For people who enjoy command-line, in-the-terminal coding, the SGI terminal font in white on blue was, and in my view remains, one of the easiest-on-the-eyes, fastest, lowest-fatigue text interfaces ever done.
I will often grab a modern font similar to the fast bitmap fonts SGI used, and use white on blue today. My first home computer was an 8-bit Atari, also set up to display white on blue, similar to the C64, which can be set that way quickly and easily.
Anyhoo, I was using and setting up those systems for a time, from workstations up through 8-CPU Origin and Onyx class NUMA hardware, and absolutely loved it. Exemplary computing experiences.
A comment can be found in the SGI freeware Doom package, "SGI graphics run Doom real sweet." (Or something to that effect.)
Newbies would be surprised to find even their "slow" SGI Indy could run multiple copies of Doom at the same time, rendering into a window with full sound effects and no dropped frames. I would play one while a few others were running in attract, or "demo" mode, all smooth, display frame locked looking crisp and responsive.
To add on to the other comments about games being simpler back then: most of the games they released in 1991 were sequels and/or shared codebases and tooling with other games.
Three of those releases were Commander Keen games; Shadow Knights and Dangerous Dave shared engines, as did Catacomb 3-D and Hovertank; Rescue Rover I/II were simple puzzle games built on an old demo code base. I'm not trying to take away from their accomplishments, merely pointing out that they didn't start from scratch every time.
I think id was trying to get out of some contractual obligations to Softdisk.
If anything, this makes the achievement that much more impressive. They built tools instead of just hammering out assembly. The five games in six months was just collecting on that previous effort. But knowing what tools to build, and how, is the real achievement.
It's not just about knowing what tools to build ahead of time. That might well be impossible.
It's about finding the intersection in the design space between "the game I want to make" and "the capabilities of the tools I have" that allow you to adapt your tools to make something 90 % of the way there with 10 % of the effort.
This goes for any fast product development! Lockheed's Skunk Works was amazing at repurposing tools and parts to invent completely new planes with few components that were actually new.
Right. That's what's impressive. They built the right tools, presumably during previous projects. The tools they built were useful enough to enable future work.
You may not know what tools you will need in the future but when you need a new tool you can build it in a way that it is reusable.
In other words a good tool solves an abstract problem. I don't need an Ikea-Bergmund-chair-leg-attacher. I need a screwdriver.
Once you have a toolbox full of these basic tools you are better prepared to tackle those future projects.
While you are technically correct and I agree completely, I think your emphasis on abstraction might lead to higher (or at least more variable) cycle times rather than lower.
It sounds like you're suggesting that one spends, say, 80 % of one's creativity coming up with general, abstract problems to tool for, and then 20 % of the creativity on finding ways to apply general tools. That absolutely works, I think, if you're fine with high variability in the time you spend between each release.
What I tried to suggest is that instead you spend 40 % of your creativity on finding abstract problems to tool for, and the other 60 % of your creativity in finding ways to apply your distinctly less general tools to new situations, despite their lack of generality.
What concerns me about the high-abstraction route is that people, in my experience, have a tendency to over-abstract. To try to predict every future use case ahead of time, instead of accepting their ignorance and building for ease of change.
That is why I want to emphasise modding non-general tools rather than future-proofing tools at the design stage. Because you can't know at the design stage in which direction future innovation will go.
It should not be understated, though, how good the toolset was for the time. The original idea for Keen came after they (Carmack and Tom Hall) recreated the first level of Super Mario Bros. 3 in one night. They had to break from Softdisk, at Romero's suggestion, because Softdisk would claim ownership of the engine. They actually "borrowed" their work computers from Softdisk over a weekend to work on Keen and created the first level in 72 hours. After the success of the first Keen it was a no-brainer to keep making more and milking it for all they could.
Have you ever been bit by the bug of absolutely needing to get something done and finished just so you can keep riding the high of finishing something? You chase that feeling into the next thing, and the next thing, and so on.
They were doing that. Finishing and shipping anything feels really really good. If you're a small team and you feel a deep personal investment in the product then shipping becomes addictive.
Brandon Sanderson and writing is a similar example. Is the quality always there? No. But, the guy is very very good at finishing and clearly rides to wherever his passion takes him. His output is, in comparison to other authors writing in similar genres, incredible.
Also, games were much simpler and players had much simpler expectations.
One thing I've found as an avid gamer is the lack of "humanity" in modern games. Modern high-budget games are streamlined, risk-averse, and predictable.
If you look at older games you'll often find the creators' / developers' quirks and mannerisms have seeped through, showing mild biases, prejudices, stereotypes, etc. (humanity). The freedom to express yourself in your work resulted in people becoming emotionally invested in the product. This investment likely meant they spent many additional hours, if not directly working on the product then thinking about it, resulting in a faster shipping date or a higher-quality product in the same timespan.
Maybe in aggregate modern gaming exhibits less humanity, but the best games today are on an entirely different level of story telling and emotional investment than 1991.
You could still produce a AAA title with a small team (e.g 10) in the early to mid-90s. By 2000 an average team was about 100 people. There are exceptions to this in both directions but that was the overall trend.
In that period, you moved from mostly 2D to mostly 3D games so more complex code generally, plus the need for artists to do 3D modelling, texturing, rendering, optimisation, etc etc. Sound designers, composers, etc etc. AAA Games just got bigger, plus multiplayer. Hardware got more powerful (sound and graphics cards) so there was a desire/demand to make more use of that hardware. Generally, it all became a lot more "Hollywood".
There are still great games made by small teams of people in a short period of time. But to hit the AAA mark you need to be with a big studio and have the marketing budget behind you. Again, Hollywood.
id Software went independent after working for a company named Softdisk making a monthly floppy disk called "Gamer's Edge":
Softdisk is most famous for being the former workplace of several of the founders of id Software, who worked on a short-lived game subscription product, Gamer's Edge. Gamer's Edge was a monthly[3] PC game disk started in 1990 by John Romero.
Having a monthly deadline making MS-DOS games on floppy is why they were so prolific at that point. None of the Cmdr Keen games are much different from each other, and there were 6 of them.
The original 'trilogy' is a bit more distinct from the rest IMO; yes, the basic platforming concepts are the same, but Keen 3 to Keen 4 is a pretty substantial jump in style and features, like swimming. (Also setting aside the odd duck of Keen Dreams.)
Did anyone ever actually successfully buy just one episode? I mean, 10 year old me did from a different publisher (Epic Megagames), but they just sent the whole trilogy anyway.
But overall yes, a lot of those folks got good at churning out games, and part of that was understanding the value of good tooling.
The level editor used for Keen, called TEd (Tile Editor), was actually used for something like 30 different titles, both 2D and 2.5D, pretty much up to the original Rise of the Triad.
I've listened to a bunch of John Romero's apple time warp podcasts about apple2 development. It was very small companies (sometimes single developers). This was in the 80s... A little bit before.
They're kind of fun when they get into old stories about long gone companies. They don't come out very often however: They're 10% annoying and 90% really fun.
You can do that today, you could ship five Chess or Candy Crush games in six months if you wanted to. It's just a matter of where you set the bar for complexity and effort.
Assets were much simpler. The screen was ~320x200 pixels, so your 2D images were a handful of pixels drawn from a limited palette of a couple hundred colors. An artist could bang out tons of images and sprites in the amount of time it takes someone to meticulously craft, model, paint, animate, etc. a full 3D monster today. Same thing for sound and music - it was much simpler and more manageable by one person or a small number of people.
Look at game jams today, like the 48 hour game creation contests. A lot of stuff that comes out of those are little 2D things reminiscent of games of that early 90s era. It's because it's still pretty fast and easy to make assets for that style of game.
On the other hand, 2D animation requires A LOT of these hand-drawn sprites, to the point where I suspect the cost benefits of 2D become questionable. For example, Heroes of Might & Magic 3 moved to sprites pre-rendered from 3D models (possibly also painted over in every frame), which look worse than hand-drawn pixel-art sprites. I suspect they did it because it was just less work to do it in 3D.
I think it's possible to have that same output in middle age. The difference is enthusiasm. It's still novel.
Further, one of the games in the spring Future Games Show was mentioned as having been developed by a small team, using UE5 and AWS GameLift, in about 9-12 months.
I think a game in six months is doable with the proper talent. It's not gonna be Elden Ring in scope. But NES Legend of Zelda in 180 days seems possible now.
Tools like GitHub Copilot and Nvidia GauGAN can also assist. But there may be a steep research curve before they result in actual workflow compression ;)
Games had to be smaller projects. The console and home computer hardware of that time put a tight limit on how much code and data you could ship. Filling that up wasn't hard; it was mostly a matter of using those resources well.
This changed a few years later when CDs came along. Many game devs originally didn't quite know what to do with so much storage space. Then processors also got much faster still and 3D acceleration became a thing. Game dev project scopes exploded accordingly to give you the game industry we have today.
Technically, I was probably born into a world where they had already revolutionised gaming -- I was a couple years late to watch it unfold personally.
On the other hand, I didn't grow up in an extremely well-off household, and our computers were for a long time pretty much dumpster-dive finds. This meant I was always exposed to hardware a few years old, which in turn meant I got to witness the technical revolution in real time, a few years after the cultural revolution. (In other words, when Doom had taken over the world, I still couldn't play it. But a few years later, I could, and marvelled at what the computer could do.)
In another twist of fate, one of my parents strongly disliked that type of entertainment, and once they understood what I was doing, there was a strict ban on (most) violent media. (With very unclear boundaries of what constitutes violence.)
So, in effect, I missed the revolution, caught a glimpse of it, and then got cut off from it.
Reading Masters of Doom gave me back a chunk of my childhood that I feel was taken from me, in a weird sort of way.
You needed that DX turbo boost too, huh? You pretty much summed up my experience as well, except I had an older brother who gave me his old PC when he went to college.
Quake 1 used to play the music via cdda. I “borrowed” my copy as a teen which did not include that music. My friend at school burned me a cd of Led Zeppelin 1. The summer of me playing quake listening to zeppelin is nostalgia heaven for me.
Agreed. id was really a dream team at that time (probably before Romero left). Small team; everyone had already bagged multi-year gamedev experience (I think Carmack had shipped a few games since 1989 and Romero a lot more since the mid 80s); everyone was passionate, so no management overhead; very disciplined.
It makes Carmack out to have some sociopathic characteristics, and it makes Romero seem like he became lazy due to being famous (lessons he seems to have learned the hard way), but why the intense dislike of American McGee, Tom Hall, Adrian Carmack, Michael Abrash, Sandy Petersen, or any of the others?
I can't wait for this to go up on Youtube. GDC post mortems are amazing, especially the ones on Diablo[0] and Fallout[1]. I've loved Fallout since I was 12 and it really got me into post-apocalypse fiction. I ended up watching and reading all the media that Tim cites in his presentation and discovered some wonderful things like On the Beach (the book by Nevil Shute AND the movie), A Canticle for Leibowitz, etc
Also an insider kind of thing. To get info directly from the .plan file, one had to be on the network and generally running a UNIX, or able to build tools on whatever OS...
Was really fun to get online early on. I had a shell account, useful for the many times I was using a DOS PC
On occasion, I would end up showing people "the Internet" and explaining things like finger, email and USENET. It was fascinating what people would ask, or what they did when they wanted to drive. People would often want to print stuff out so they could read it later, like USENET posts, or talk about it with others. "I was on the Internet with a friend, lookie here..."
I did that too, and wish I had kept a few pages.
But yeah. Kind of amazing really. We all would just read John's file, in his home directory, no worries. Crazy to think about today.
When I was at Zynga, I helped start the publishing dept. One of our first published games was Ravenwood Faire done by Romero and Hall. My primary memories of that engagement are that Tom Hall was one of the nicest people I ever met in gaming, and Romero had the best conditioned hair of anyone on this planet.
I was in the habit of getting games via shareware, through one of the common catalogs then - yep, an actual paper catalog of shareware games you could order for the price of shipping, or close enough. They were cheap and I was entertained by them - Commander Keen, Prince of Persia, Pharaoh's Tomb, etc. My friends and I all played some awesome games of the day, but I remember loading up W3D and just being blown away - it really was the first FPS I ever played.
Many of the other first-person perspective games I had experience with were the D&D games which were largely still-frame, turn-based games like Eye of the Beholder from SSI[0]. Also played Falcon 3.0 over modem, which was super cool; Gunship 2000 where I first started using a hex editor, diff'ing pre-and post-mission save game files via stare-and-compare to understand how to give myself more medals. But W3D was a whole different level, game-wise.
I called all my friends and told them, basically, "holy sh*t, you're not going to believe how flipping awesome this is! It's like nothing you've ever seen before, come over. It's better than anything you've ever played." To a man (teen boy) they were all dismissive - like, whatever dude. Until they came over to my house and I loaded it up. I think I got it on a bunch of 3.5" floppies, and quickly handed it out, made copies, etc.
I played that game so much I would go to bed, close my eyes, and still be running down the hallways, so burned into my retinas were the graphics. Truly the first game I was completely and unapologetically addicted to.
Then some time later a kid at school who wasn't into computers handed me an unopened game box he had gotten as a present from his folks. He didn't care about it, knew I liked computer games, asked me if I wanted it. I had no idea what was in that box, but said sure, why the heck not? The thing was like a black monolith. It was Ultima VII - The Black Gate. It had a cloth world map, a book to translate runes. I pretty much forgot all about Wolfenstein after that, mostly. A friend and I would play it "together" while talking over the landline for hours (local call).
Doom was cool, so was Descent; I played the heck out of them. That was also the time I started taking programming classes in school, including a semester of Pascal. It was always in the back of my mind that Pharaoh's Tomb was programmed in Pascal.
Was first for me too. I was amazed and immediately hooked. Managed to cobble together my first 386 machine trading code and CAD / CNC programming work for parts.
It had 5 MB of RAM, just enough to run a browser and Winsock. No sound card.
I had an amazing experience on that machine playing W3D:
I got whomped on a level - I forget which one - but the gist of it was having a couple of health points and no ammo, and ending up at the dead end of a long, narrow hall.
I turned the other way, knife ready, determined to finish the level, and proceeded to knife a nice string of baddies with godlike speed and precision! There were three or four and I just had to knife them all, right at the end of the hall, poking each of them often enough to keep their shots at bay!
Truth is my 386 was a bit pokey, and that gave me just the little edge I needed to take them down and score that ammo! From there, and a couple more quick pop out and shoot kills later, med pack! Home free, gonna make it...
This was in the middle of the week, me up super late, tired at work the next day, no regrets.
I love how raw, simple, maybe pure is the right word, the game is. Twitchy as hell. Distilled down to the elements needed for fun.
I attended this talk and afterwards was sitting at a table outside focusing on some work when I looked over and noticed Romero was sitting at the table next to me. He was surrounded by fans listening to some stories and asking him for advice. He seemed like a very chill guy, and I thought it was nice that he took the time to chat with everyone.
Was that music actually in the game playing on a DOS PC, or added separately to the video? If the former, it sure doesn't sound like the usual game music of that time playing on an Adlib or Sound Blaster. Maybe it was using a higher-end sound card of that time, like a Roland MT-32.
When I was a kid, my dad was a member of some club that would send you shareware floppy disks of Mac software every month.
The first few levels of Wolfenstein arrived via that club. 30 years later, when I think of a floppy disk, it's probably the one that had Wolfenstein on it.
Romero, Steve Jobs. This is what I fear about taking on cofounders: the marketing types steal all the credit and try to downplay the technical cofounder after becoming successful.