
I've never written an LLVM backend, but one difficulty that immediately jumps out at me is the 16-bit architecture.

LLVM specifies integer types with explicit bit widths, and 32 bits is the most commonly used, so a backend would have to either emulate 32-bit operations, forbid frontends from generating 32-bit types, or silently truncate them all down to 16 bits.
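For context on the bit-width point: LLVM integer types carry an explicit width at the IR level, and it's the backend's job to legalize whatever the frontend emits. A minimal C++ sketch against the LLVM API (purely illustrative, not taken from any particular backend):

    // LLVM integer types carry an explicit bit width.
    #include "llvm/IR/DerivedTypes.h"
    #include "llvm/IR/LLVMContext.h"
    #include "llvm/Support/raw_ostream.h"

    int main() {
      llvm::LLVMContext Ctx;
      // What most frontends emit for a C 'int' on 32/64-bit targets:
      llvm::IntegerType *I32 = llvm::Type::getInt32Ty(Ctx);
      // The natural word size on a 16-bit target:
      llvm::IntegerType *I16 = llvm::Type::getInt16Ty(Ctx);
      // Odd widths are valid IR too; the backend has to legalize them:
      llvm::IntegerType *I13 = llvm::IntegerType::get(Ctx, 13);
      llvm::outs() << I32->getBitWidth() << " " << I16->getBitWidth() << " "
                   << I13->getBitWidth() << "\n";  // prints: 32 16 13
      return 0;
    }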




32-bit types are only common in LLVM because the target architecture is typically 32- or 64-bit. LLVM will handle 16-bit (and even 8-bit or 13-bit) targets just fine.
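To make that concrete, here's a rough sketch of how a 16-bit backend declares its legal integer types, modeled loosely on the in-tree MSP430 backend's TargetLowering constructor (the My* names are hypothetical placeholders). Anything wider than i16 is then broken into 16-bit pieces by LLVM's type legalizer, so the backend never has to special-case i32 itself:

    // Fragment of a hypothetical 16-bit target's lowering setup.
    // Only i8/i16 are registered as legal; the type legalizer expands
    // i32/i64 operations into sequences of 16-bit ones automatically.
    MyTargetLowering::MyTargetLowering(const TargetMachine &TM,
                                       const MySubtarget &STI)
        : TargetLowering(TM) {
      addRegisterClass(MVT::i8,  &My::GR8RegClass);
      addRegisterClass(MVT::i16, &My::GR16RegClass);
      computeRegisterProperties(STI.getRegisterInfo());
    }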

That said, an LLVM backend is overkill for this, and I'm sure the simple interpreter Notch has rigged up will be just fine for its purpose.



