> The idea of implementing a CPU core inside an FPGA is not new, of course.
Indeed. I took a computer engineering class in undergrad. The capstone project was implementing from scratch a multi-stage pipelined RISC CPU (including the ALU and a very basic L1 cache) in Verilog, which we flashed to FPGAs that we then programmed to play checkers using hand-compiled C. The FPGA was easier to flash and debug than the Spartan-6 mentioned in TFA, but it was significantly more expensive as well.
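For flavor, here's a minimal sketch of the kind of Verilog ALU a project like that grows out of; the opcode encoding and width are invented for illustration, not taken from the actual course:

    // Minimal combinational ALU sketch -- opcodes and width are
    // illustrative, not from any particular course project.
    module alu #(parameter WIDTH = 32) (
        input  wire [WIDTH-1:0] a,
        input  wire [WIDTH-1:0] b,
        input  wire [2:0]       op,
        output reg  [WIDTH-1:0] result,
        output wire             zero
    );
        always @(*) begin
            case (op)
                3'b000: result = a + b;       // ADD
                3'b001: result = a - b;       // SUB
                3'b010: result = a & b;       // AND
                3'b011: result = a | b;       // OR
                3'b100: result = a ^ b;       // XOR
                3'b101: result = a << b[4:0]; // shift left
                default: result = {WIDTH{1'b0}};
            endcase
        end
        assign zero = (result == {WIDTH{1'b0}}); // handy for branch decisions
    endmodule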
It was a brutal class, but it totally demystified computers and the whole industry in a way that made me feel like I really could understand the "whole" stack. Nothing scares me any more, and I no longer fear "magic" in software or hardware.
I had a similar course, using VHDL on Spartan-3E kits (which I still have sitting in a box 14 years later). Our professor gave us three C programs -- along with the input that would be entered and the expected output -- and we had to implement everything to make them work: the CPU, VGA output (with "font" definition), keyboard input, and translation of the C programs into our CPU's instruction set.
That was a difficult yet extremely rewarding class. My wife, then girlfriend, still remembers that semester because she barely saw me.
IIRC, bonus points were given to the team with the highest clock speed. I didn't win, but I seem to remember mine being somewhere in the 18MHz range and the winner in the low to mid 20s.
> Nothing scares me any more, and I no longer fear "magic" in software or hardware.
You sound exactly like my professor for computer architecture. “Computers are not magic!” He mentioned this at least once during every lecture, and my experience was similar to yours.
I'd say the field effect's influence on electrical flow is probably the weirdest part to look at, and even that doesn't reach 'magic' levels.
The atoms below that are relatively straightforward, and the fact that they're made up of smaller building blocks is fine, I guess; it's increasingly irrelevant to how a computer works.
And going up, everything starts to get very non-magical as you turn response curves into binary signals and then string gates together.
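To make "stringing gates together" concrete, here's a toy structural sketch using only Verilog's built-in gate primitives; once the response curve has been abstracted into 0s and 1s, a full adder is nothing more than this:

    // Full adder built from primitive gates -- purely illustrative.
    module full_adder (
        input  wire a, b, cin,
        output wire sum, cout
    );
        wire ab_xor, ab_and, xor_cin_and;
        xor g1 (ab_xor, a, b);              // a XOR b
        xor g2 (sum, ab_xor, cin);          // sum bit
        and g3 (ab_and, a, b);
        and g4 (xor_cin_and, ab_xor, cin);
        or  g5 (cout, ab_and, xor_cin_and); // carry out
    endmodule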
Yeah, for me it's the parasitic BJT/collector that forms in a FET (cf. latch-up). Also that BJTs work in reverse-active mode to some extent, despite the emitter nominally being the injector of the carriers. Weird!
Also, I never understood things like slew rate, noise, or gain-bandwidth. Too high up the stack.
Gain-bandwidth is directly related to slew rate. Slew rate is mostly an effect of capacitances (gate capacitance in FETs, for example). Noise is mostly caused by thermal effects (atoms bumping around randomly).
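As a rough first-order sketch of those relationships (textbook formulas for a simple Miller-compensated op-amp stage and a resistor's thermal noise; the example numbers are made up):

    SR    = I_tail / C_c           slew rate: tail current charging the compensation cap
    GBW   = g_m / (2 * pi * C_c)   gain-bandwidth set by the same capacitance
    v_n^2 = 4 * k * T * R * df     Johnson-Nyquist thermal noise over bandwidth df

    Example: I_tail = 20 uA into C_c = 10 pF gives SR = 2 V/us.

The same C_c shows up in both the slew rate and the gain-bandwidth, which is exactly why the two track each other.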
There's also adjacent stacks... like being able to build your own computer out of crabs and rocks if you get stranded on a desert island. So that you can get them to write SOS in the sand and play pong while you wait.
Are there any resources online that would let me do a similarly deep study myself? I'm enjoying the famous "Nand to Tetris" course, but it uses a simplified architecture (no pipelining, no cache, no interrupts) and runs in an emulator instead of an FPGA.
Awesome base to start with! Lots of things in the next step, though: PCIe, JTAG, differential signaling protocols, debuggers/monitors, UEFI firmware, cache coherency, TPM-type modules, linkers, loaders, vector instruction sets, sound codecs, DACs, GPUs, multi-channel SDRAM, MMUs, display protocols, device trees. Plus latency, throughput, and thermal management in any of the above in a real system.
As someone who started pretty high up the stack (Visual Basic!) and kept peeling layers away out of curiosity, I will say working with a CPLD/FPGA was really eye-opening. The latency difference between SRAM and DRAM, and the "why" behind pipelining, made a bunch of disjointed approaches to optimization just drop into place.
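Here's a minimal sketch of that "why", assuming a made-up multiply-accumulate datapath: the register between the stages means the clock only has to cover the slowest stage, not the whole multiply-plus-add path.

    // Two-stage pipeline sketch: multiply, then add, split by a
    // register so each stage's logic fits in one clock cycle.
    // Widths and the operation are invented for illustration.
    module mac_pipe (
        input  wire        clk,
        input  wire [15:0] a, b,
        input  wire [31:0] c,
        output reg  [31:0] y
    );
        reg [31:0] prod_q; // pipeline register between the stages
        reg [31:0] c_q;    // carry c along so it stays lined up

        always @(posedge clk) begin
            // Stage 1: multiply
            prod_q <= a * b;
            c_q    <= c;
            // Stage 2: add, using last cycle's product
            y      <= prod_q + c_q;
        end
    endmodule

The result appears two cycles after the inputs, but a new input pair can be accepted every cycle, which is the whole trade.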
I started with GW-BASIC, then QBasic, and moved up to "Visual Basic for DOS" (ncurses-style UI stuff, similar to Turbo Pascal, IIRC). I don't think many people even knew it existed since Win 3.11 was already big, but I loathed it and only switched because of... Trumpet Winsock for the internet! Even so, that did not stop me from playing with driving serial (COM) ports or parallel (LPT) ports with printer escape sequences. Not really FPGA-level low (not even close), but DOS-based stuff was really easy to start hacking on!