Remembering Intel 740 (raspberrypi.com)
69 points by pmarin on Oct 27, 2021 | 40 comments



Let's hop in the time machine and look at John Carmack's .plan

Intel i740 -- Good throughput, good fillrate, good quality, good features. A very competent chip. I wish intel great success with the 740. I think that it firmly establishes the baseline that other companies (especially the ones that didn't even make this list) will be forced to come up to.

Voodoo rendering quality, better than voodoo1 performance, good 3D on a desktop integration, and all textures come from AGP memory so there is no texture swapping at all.

Lack of 24 bit rendering is the only negative of any kind I can think of.

Their current MCD OpenGL on NT runs quake 2 pretty well. I have seen their ICD driver on ’95 running quake 2, and it seems to be progressing well. The chip has the potential to outperform voodoo 1 across the board, but 3DFX has more highly tuned drivers right now, giving it a performance edge. I expect intel will get the performance up before releasing the ICD.

It is worth mentioning that of all the drivers we have tested, intel’s MCD was the only driver that did absolutely everything flawlessly. I hope that their ICD has a similar level of quality (it’s a MUCH bigger job).

An 8mb i740 will be a very good setup for 3D development work.


All the names and terminology, MMX CPU, 3Dfx Voodoo, VideoLogic PowerVR, AGP, i740, Cyrix 6×86, AMD K6, ALi Aladdin, Super Socket 7, 440BX.... etc. It just put a smile on my face. :) And I have every single one of them somewhere in my house.

But it didn't touch on the real reason why the i740 failed: it was drivers.

Intel failed to invest enough resources to catch up on driver quality. The constant rendering bugs, crashes, and dismal performance in non-optimised games (the ones that weren't on reviewers' benchmark lists) sank it.

That is why I am sceptical of Intel re-entering the GPU space. This time around they do have their iGPU market share as a test bed and a foundation to build on, but driver quality will continue to dictate gaming performance. Even AMD barely manages ~15% of the discrete GPU market.

And as a side note: Nvidia, at a ~$630B market cap, is now worth three times as much as Intel. I wonder if Nvidia could be another trillion-dollar company.


It is always lack of execution. I was at the Larrabee session at GDCE 2009, and wow, it was going to take over the gaming world. History followed another path, as we all know.

On a similar note, Solaris has successfully used memory tagging with SPARC ADI, and ARM is following along with MTE; meanwhile Intel removed the buggy MPX from their chips.

Then we have the whole StrongARM and Edison board stories.

NVidia is very strong in tooling, documentation and being open to polyglot programming on their GPUs, to a level that Intel and AMD keep failing at.


Even long after the 740, with relatively recent iGPUs, Intel's drivers leave something to be desired. I've worked on codebases that had plenty of specific Intel GPU driver versions called out as buggy. This has gotten better over time, but I think you're right to have this concern.


And yet INTC's net income is 3x NVDA's: $21B vs $7B.


I disagree. The 740 didn't have the triangle pipeline speed that competitors had as poly counts went up.

It had great memory and rasterizer bandwidth, it was just starved by transformation. The pipeline was balanced for polycounts that came before it, not what was to come.
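
To put toy numbers on that balance shift (all the rates below are illustrative assumptions for a sketch, not measured i740 figures):

    # Frame time is set by whichever stage is slower: the transform/setup
    # front end feeding the chip, or the rasterizer drawing pixels.
    def frame_time_ms(triangles, pixels, tri_rate, fill_rate):
        transform_ms = triangles / tri_rate * 1000.0
        raster_ms = pixels / fill_rate * 1000.0
        stage = "transform" if transform_ms > raster_ms else "raster"
        return max(transform_ms, raster_ms), stage

    TRI_RATE = 500_000        # triangles/s the front end can feed (assumed)
    FILL_RATE = 55_000_000    # pixels/s the rasterizer can draw (assumed)
    PIXELS = 640 * 480 * 1.5  # one 640x480 frame with ~1.5x overdraw

    for tris in (2_000, 5_000, 20_000):  # per-frame polycounts, 1998 and beyond
        ms, stage = frame_time_ms(tris, PIXELS, TRI_RATE, FILL_RATE)
        print(f"{tris:>6} tris/frame: {ms:5.1f} ms ({stage}-limited)")

With those made-up numbers the chip is raster-limited at 2,000 triangles per frame but flips to transform-limited by 5,000; past that point the extra fill rate just sits idle.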


Yeah, around that time there were quite a few "hopeful" 3D cards/chips that ultimately didn't quite cut it. I couldn't spare the cash for a 3dfx card at the time, but I wanted a 3D card, so I got an S3 ViRGE-based one. That chip later became infamous as a "3D decelerator" because with hardware 3D rendering most games were slower on a decent PC (albeit a little prettier) than with software rendering. The only game I ever played on it with 3D enabled was the bundled copy of Descent...


In 1996 I got my first PC. I wanted a 3D card but didn't have the technical knowledge to understand the differences between cards, so I got a Matrox Mystique. It ran the couple of games it came bundled with well, but it was completely incapable of playing GLQuake or any of the other titles I was interested in.

The sad thing is a salesman had tried to convince me to get a 3dfx Monster3D, but I thought the passthrough dongle thing looked trashy and cheap. Learned my lesson hard. To this day I research the crap out of anything I buy in the PC space before buying it.


That's the one I got for my first PC as well. I managed to outdo myself by pairing it with a Cyrix 686+. So not only did I have an unsupported gaming card, I had a CPU that didn't like doing floating point and ran Quake like PowerPoint. ...as you said, you learn from your mistakes.


I emailed Matrox and asked if support for GLQuake was coming, and they emailed back: "We are currently working on the drivers." Total lie. The Mystique was completely incapable of supporting GLQuake. Live and learn. I had Voodoo 2 SLI a year and a half later.


To be fair, 20 years later, I'm telling people "I'm aware of your issue and addressing it is on our backlog" [which currently extends beyond the collapse of the stars]


I remember being 11 years old and feverishly researching video cards to play Unreal and Quake 2, basically deciding between a PowerVR and a Voodoo 2. That 3-4 month period, along with building a PC from parts a few years later, contributed to my career in software more than anything else I can think of.


I got a version of Tomb Raider to run on the S3 decelerator also, not sure if it was the first, the sequel or both. Eventually a Creative 3D Blaster Banshee saved the day.

Intel's Iris Xe DG1 desktop card was not very performant either. Curious that they decided to build the DG2 Arc/Alchemist graphics cards on TSMC's 6nm process... I wonder why they chose not to build a CPU there instead - an area where they are actually competitive.

Here's some nostalgia - https://ctrl-alt-rees.com/2020-12-06-s3-virge-officially-sup...


I think Intel is using TSMC because Intel is having issues with their smaller process nodes. It probably helps that their lead GPU guy, Raja Koduri[0], worked at ATI (later AMD), so he's familiar with how TSMC does things (TSMC has manufactured all of ATI/AMD's GPUs).

[0] https://en.wikipedia.org/wiki/Raja_Koduri


And you get a nice bonus: every wafer Intel buys can't be used by AMD/Nvidia.


S3 Virge 3D rendering was faster than software rendering unless you wanted bilinear filtering - then it wasn't :)


The ViRGE's fastest modes were around the software-rendering speed of a 100MHz Pentium.


I worked on the 740, under Tom Piazza (RIP) from Real3D when Intel acquired them.

Fun project. Uncompetitive before it even shipped.

Got questions? Ask away! It was a very simple pipeline.


Why was the decision made to use system memory as VRAM instead of shipping more on the card? It seems like a cost-cutting measure, but it had an outsized effect on performance.


AGP was designed to allow faster access to system memory because VRAM was expensive; there was just enough on the card for the framebuffers. Textures were known to be getting huge, and they would live in system RAM - RDRAM, ideally, since Intel was investing heavily in Rambus at the time (remember that?) - because enough VRAM would have been crazy expensive (I think the dual 3Dfx setup at the time was like $1k).

The idea of integrating graphics into the chipset was also on the long-term roadmap, using UMA (unified memory architecture) to make cheap mobos, which board vendors were begging for. So this figured into the decision too: it wasn't about the graphics crown, it was about selling CPUs. These became the 810 and 815, which game developers HATED because they held back progress for years: the 810 was the most common graphics chipset by, I believe, more than an order of magnitude.

So yes, cost cutting, driven by the desire to dominate the motherboard market - which they did, and which led to even more market for their CPUs - even at the expense of overall graphics performance for years.
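
For a feel of what UMA costs, here's a back-of-the-envelope sketch (the bandwidth figure is approximate and the display mode is just an example, not a chipset spec):

    MEM_BW = 800e6  # PC100 SDRAM on a 64-bit bus: ~800 MB/s total (approx.)

    def scanout_bytes_per_sec(width, height, bytes_per_pixel, refresh_hz):
        # Bandwidth the display controller consumes just to refresh the screen.
        return width * height * bytes_per_pixel * refresh_hz

    bw = scanout_bytes_per_sec(1024, 768, 2, 75)  # 16-bit desktop at 75 Hz
    print(f"scanout alone: {bw / 1e6:.0f} MB/s = {bw / MEM_BW:.0%} of the bus")

That's roughly 118 MB/s - about 15% of the bus - gone before the CPU or the 3D engine touch memory at all; textures, framebuffer writes, and ordinary CPU traffic all fight over what's left.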


Thanks for the insight. It seems they went down a similar line of thinking as Nintendo did with the N64: Nintendo gave SGI a lot of money to develop a world-beating GPU for the time, then gave it no dedicated memory and expected it to share RDRAM with the CPU. I don't think they even gave it DMA support. And I don't care what Apple or Sony or anyone else says in their keynotes: UMA is always a cost-cutting measure.

From the perspective of the late '90s I can see the appeal. Rambus was supposed to be the future. One would think the high latency of the memory would have been more apparent, though. And of course RDRAM had its downfall not all that long after. I still have in my collection a Willamette P4 and the RDRAM modules I salvaged from one of those ill-fated systems.

I'm not sure how much the Intel IGP held back gaming; as best I can remember, nobody seriously tried to game on it. Many games outright didn't support the chipset, and those that did didn't work well. For years I gave the advice that gaming meant a dedicated video card and the IGP was only good for displaying normal applications. This changed temporarily with the first Iris GPUs in the early Core era; they were actually competitive with AMD's and Nvidia's IGPs at the time. That lasted maybe a year or two.


I remember when AGP came out, it was advertised as making it possible to store textures in system RAM, so your GPU only needed enough VRAM to act as a cache and frame buffer.

It quickly became apparent that AGP - even once AGP 8x became a thing - wasn't fast enough to do the job, which is why every GPU still shipped with tons of VRAM.
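
The rough arithmetic (the AGP rates are the standard bus spec; the card figures are approximate and from memory, so treat them as ballpark):

    agp = {"AGP 1x": 266, "AGP 2x": 533, "AGP 4x": 1066, "AGP 8x": 2133}  # MB/s
    vram = {  # approximate local-memory bandwidth of retail cards
        "i740 local SGRAM (1998)": 800,    # 64-bit @ 100 MHz
        "GeForce 256 DDR (1999)": 4800,    # 128-bit DDR @ 150 MHz
        "Radeon 9700 Pro (2002)": 19800,   # 256-bit DDR @ 310 MHz
    }
    for name, mb_s in {**agp, **vram}.items():
        print(f"{name:25} {mb_s / 1000:6.2f} GB/s")

Local VRAM pulled ahead of the bus by roughly an order of magnitude at every step, so texturing over AGP only ever made sense as a fallback path, not the main one.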


Quite an odd article for raspberrypi.com

While I am not a reader of the site, why would this be of interest to a Raspberry Pi enthusiast?


They apparently sell Custom PC magazine through the site. Other articles are nominally about the Raspberry Pi, but some are PC-focused. It's very confusing.

The article is interesting but the site is confusing.


https://en.wikipedia.org/wiki/Custom_PC_(magazine)

>> In February 2019 the magazine, along with Digital SLR Photography Magazine, was sold to Raspberry Pi Trading, a subsidiary of the Raspberry Pi Foundation.


This. Custom PC was founded at Dennis Publishing, the UK publisher of PC Pro, Computer Shopper UK, MacUser UK, Computer Buyer & multiple other titles. I wrote for most of them, starting out on the staff of PC Pro; later I wrote the Linux column for Custom PC for its first year. (After that, they cancelled both their columns, sadly.)

It's a good mag with a high level of tech expertise. They generously kept my free contributor's subscription running for well over a decade. :-)

Dennis seems to be outsourcing its magazines now, and CPC got spun off to an independent production company headed by its editor, Ben Hardwidge, who I know personally from our previous career intersections. They've always given a lot of coverage to the RasPi and RasPi-related projects.

Which is probably why the RasPi Foundation bought it. :-)


Thanks for the additional info! I don't check the RPi foundation site with any regularity and when I have it's only been RPi articles.


I assume one of the reasons Intel could never develop its own graphics adapter business is that any proper GPU would initially decrease the load on the CPU, partly reducing the justification for more expensive versions of their main product. Marketing-driven tunnel vision and all that.


Just one year later they released the i810, which was Intel's first attempt at integrating their own GPU into the motherboard chipset. They didn't get to 'Intel HD Graphics', integrated into the CPU, for almost another ten years.


I remember those early days of 3D accelerators...

When the 3dfx Voodoo came out, I really wanted one so I could run all the games written for the Glide API. I asked for one for Christmas, and my dad got me a Rendition 2200. I was actually pretty upset, because so many games were written for Glide, which was a 3dfx-proprietary API. GLQuake existed and used OpenGL, but OpenGL drivers were either non-existent or didn't work - I don't remember which.

I ended up hating 3dfx for how they had ruled the 3D accelerator market for a short period of time. I was glad when the Voodoo 5 was a failure and AMD and NVIDIA ended up taking over.


I've heard similar stories about wrong initial choices probably a hundred times over the years: whole 8-bit (Amstrad/Spectrum instead of Atari/C64) and 16-bit (ST instead of Amiga) computers, CPUs (386SX, Cyrix, K6, the P4 Celeron and the P4 generation on the Intel side in general, AMD Bulldozer), CPU sockets (anything SS7 instead of a Slot 1 Celeron 300A), graphics cards (too many to list), motherboard chipsets (VIA/SiS instead of Intel), RAM (P4 with SDRAM), the Zune :) etc. It always ends the same way, with the victim seemingly doubling down and choosing to endure the pain out of spite instead of doing the logical thing and selling off the bad components. A very weird way of coping with cognitive dissonance: being stubborn, with a mix of post-purchase rationalization and Stockholm syndrome. And no, it's not always about money; some of the examples I stumbled upon were worse and more expensive (picking the SS7 platform in 1998 is one great example).


>> Lockheed Martin created a spin-off dedicated to consumer 3D graphics tech called Real3D, mostly using employees from GE Aerospace.

Imagine aerospace trying to compete with startups like NVIDIA and 3Dfx


Well, they did have decades of experience with real-time 3D from flight simulators, so what could some johnny-come-latelies really do, especially when you're paired with the mighty Intel? (Well, it didn't work out... [1])

Even if the chip hadn't been a stinker, I suspect Real3D would've ended up being spun out of Lockheed Martin somehow, as it doesn't seem a great fit for the rest of the business. Presumably the spinoff would have had a less embarrassing exit than being sold to Intel on the cheap and having the tech used for the (motherboard-)integrated "Intel Extreme"(ly bad) graphics.

[1] Although ATI wasn't exactly a startup by the 3D era, and 3DFX crashed and burned, so it wasn't quite "startups win".


TBH it was super impressive stuff at the time: https://web.archive.org/web/19970102114017/http://www.real3d...

Solid 60FPS in 1994 on the system's debut (and probably most famous) title: https://www.youtube.com/watch?v=GAcMqFTkAs8


Skunkworks


Brings back fond memories of computing at the time. I remember saving up for a 3dfx card and being amazed at how much of an improvement there was.


Wohaa!! I had one of those. Those were my first few years with computers, and a friend and I were just learning how to build and tune them to better run Doom, Quake, and all the games we were playing at the time. I think this stuff was the ignition of my profession.

So many good memories. I feel old :-s.


I've got somewhat fond memories of that card. I think it sat between my M3D and my first Nvidia card (TNT?). I remember it being incredibly cheap - and with DirectX support, it was my first card out of the years of the dark proprietary wilderness.


This! The i740 was my first "real" 3D graphics card - I remember running the Final Reality benchmark on it and feeling genuinely excited about the purchase.

After that I got the Riva TNT (a Diamond Viper V550, IIRC), which changed everything from a consumer point of view. Good times.


The i740 (the 8MB models only) was outright good compared to the bottom of the low-end market at the time: Permedia, SiS, the ATI Rage Pro Turbo, the Cirrus Logic Laguna3D, anything Trident.



