I've come across this in the past when researching modern 68k CPUs.
Pretty cool. It's too bad there's not more energy put into manufacturing legacy hardware with modern processes… something like a Pi Zero sized 68k machine capable of running System 7.5 or Mac OS 8 (which handle monochrome displays well) hooked up to an e-ink display in an ultrabook case would make for a very compelling writing/focus machine… it'd boot nearly instantly from modern storage and would have resource usage in the ballpark of a headless Linux install while sporting a full UI.
That is awesome :D I know it should work... it's too easy to make work today, but I can't help imagining being back in the 90s and showing someone classic MacOS running on a tablet. It would blow their mind.
A lot of e-ink devices just run Android so it's relatively easy to run whatever you want on them.
Refresh rate depends on how much of the screen is updating at once. Scrolling a whole page is pretty bad, updating a small region is reasonably quick. Some devices have an 'A2 mode' that makes it somewhat faster by just doing black/white instead of grayscale.
I mean, classic Macs didn't exactly have turbo charged refresh anyways. Yes, things like scrolling chunks of text is probably faster on an actual first-gen Mac than an e-Ink screen, but they weren't really... turbo for graphics rendering.
Drop-in replacement for the 68000 that boots in milliseconds (about the same as a real 68000) and leverages the whole rest of the system as is, but with 1000 MIPS of 68K goodness and 512MB of RAM.
> something like a Pi Zero sized 68k machine capable of running ... hooked up to an e-ink display in an ultrabook case would make for a very compelling writing/focus machine
I had a similar idea for a portable writing/focus sub-notebook, except using Linux rather than MacOS. At the time, e-ink displays weren't good enough, but now they are nearly there in terms of refresh rate.
I was thinking of an appliance that would mostly be used for editing text: an A5-format or 10" sub-notebook with a very good keyboard and e-ink screen, and potentially a week-long battery life (due to a low-power CPU and the fact that e-ink doesn't consume electricity unless the image changes).
The iPad isn't a competitor to this in the sense that it has no keyboard (or no good one) and a poor screen for professionally writing/reading/editing text (glossy, nothing like e-paper quality).
Older processors were not necessarily more energy efficient than newer ones. I'm willing to bet you'd get better battery life out of something like a modern ARM Cortex-A than any 68k, FPGA-implemented or otherwise, and have it be significantly more powerful besides. But IANA processor nerd.
> It's too bad there's not more energy put into manufacturing legacy hardware with modern processes…
You can try http://www.mosis.com/ or https://europractice-ic.com/ depending on which side of the Atlantic you are. They are intended to make small batch/prototype fabrication processes available to industry and academia.
OTOH, I am not optimistic the demand for such a CPU would be sufficient for any size of run that could end up in a low-cost part.
Having an order of magnitude (maybe two) fewer lines of code goes a long way to having fewer vulnerabilities. Given the longer and more rigid development cycle and smaller attack surface, it's reasonable to think there are relatively few defects.
This is a proprietary core from a hobbyist/retro team (mainly Amiga people, nothing to do with Motorola) that has had quite a lot of murkiness associated with it.
There have been accusations that it is simply a ColdFire ripoff based on leaked information [1], and a lot of people have also complained that the team deletes critical posts from its web forum and Discord.
The core is buggy and not really fully compatible with Motorola processors. They've also implemented their own extensions (AMMX). The Amiga scene has had countless peddlers of proprietary hardware trying to lock a substantial userbase into their products. My opinion is that people are better served by MiSTer FPGA, which is completely open, and/or other open solutions (the Buffee project, PiStorm).
I've said it before here, but I'll say it again: from the outside looking in, this seems to be an ailment affecting the Amiga community more than other retro communities. On other retro machines so much is open sourced or rewritten now, but on the Amiga there seems to be some fantasy that people are going to be making money there. Lots of proprietary closed stuff.
I mean, MiST/MiSTer comes out of the Atari ST community, for example. That's what it was originally designed for, an ST core, and all the VHDL is open source.
It's actually not a fantasy. AmigaOS 3.2 was released this year as a commercial product, and Apollo make enough money to employ Gunnar full time and some part time folks too. Giving a niche community exactly the things they want will never be unicorn territory but it can produce liveable income.
There is also the open source stuff: for example Apollo have abandoned CoffinOS due to licensing issues and are now developing ApolloOS, which is a fork of AROS, which is an open source reimplementation of AmigaOS. Free Software ideas aren't as prevalent as I'd like in the Amiga community but they are there (plenty of GNU software was ported to Amiga via ixemul.library so they've always been there).
I mean it's funny, I wasn't familiar with CoffinOS so I googled it, and most of the results I found were links to forum posts of people fighting and bickering about piracy, compensation, and unethical behaviour around IP.
We're a smaller community, but I don't see this kind of fighting on Atari ST forums. And we have a good quality fully open sourced GPL reimplementation of the operating system, too, with a pretty awesome development team.
It's not exclusive to the Amiga - the same applies to the Acorn / RISC OS community too (to an extent).
There's a font organiser which used to be sold by its author for £10, but was bought up by another company and is now being sold at £45. A personal accounts package that was £30 is now £95. A long outdated word processor, £60. Oh, and the tool to format SD cards - that's £15.
Who really pays that kind of money for a retro 'toy'?
And then I click Order and it takes me to http://www.apollo-computer.com/ , in a new tab. Can you just have stuff on one ___domain (with an SSL certificate) and consolidate information / headers / footers? It's also concerning there's an entire forum (with tens of thousands of messages) with no https...
Then you get an email from a ___domain that has "apollo" in it but is not mentioned or linked to anywhere on the main site. The person sending the email says your order is ready and asks you to send ~500 EUR to some personal bank account in Ireland.
Uhh... no? I said thanks, but let's use PayPal and I'll cover the extra fees. The idea was that if it turned out to be a scam I could get the money back.
In the end everything was fine. Got the V1200, it worked. Also I got some candy, yay! But yeah, the whole process seemed a bit sketchy.
If you count the Coldfire as part of the 68k line then yes, fairly popular microcontroller. Basically a reduced 68K instruction set. https://en.wikipedia.org/wiki/NXP_ColdFire
I think there are places that are invested in them. If you have an in-house software stack (possibly in assembly or with significant parts written in assembly) and years of expertise and familiarity, moving to ARM is maybe not worth the investment.
I don't know what "Legacy" means to NXP, exactly. They're probably not making new, compatible cores. But if you're buying more than a few hundred per year you're likely to be able to keep doing so for a while yet. Microchip, I know, use the phrase "customer-driven obsolescence", which means that as long as people are buying the chips they will keep making them. They still have a few 5V programmable logic devices that are maybe 25 years old but still available in onesies, and some from that series that I've heard they make on a sort of "print on demand" basis if you buy the 500 or however many are in one batch. In general, obsolescence in embedded hardware is very different from PC software, especially the kind that's normally discussed here.
Definitely true. Also, investment in on-chip peripherals can lock companies into an architecture much longer than on higher-level platforms. We spent a lot of time debugging DMA, SPI, interrupt controllers, etc. on a micro platform and that work is not something we'd want to redo.
Basically, in embedded products you end up touching everything from the app, OS (if there is one), device drivers all the way down to the on-chip peripherals.
I mean honestly, I look at the ColdFire MCUs and they're not terrible from a spec-sheet POV. I heard at one point that ColdFire was popular with people building network hardware because it is naturally network (big) endian. I believe Freescale made chips with onboard networking support specifically for that market, too.
I just doubt they can compete against ARM on price at this point. Or I suspect on power consumption.
"Today, the 680x0 is still used by industrial machines, the aircraft industry, cars vendors and fans of retro-computing around the world." it's right there above the fold
A friend and colleague who is a hardcore electrical engineer swears by 68K chips because he almost adores their instruction set. Says coding the assembly on them is superior to any other option for medium to large use cases.
The 68k has an excellent instruction set but imho it suffers from two things: a) It's big-endian. This is a bit of a religious thing, but in the end little-endian won out, and probably for good reasons. It's arguably easier to read a big-endian hexdump but harder for the processor itself to manage and harder for some bit twiddling. b) Separate "data" and "address" register sets. Awkward to program, and makes the instruction set more complicated.
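To make the hexdump point concrete, here's a tiny C illustration (mine, not the parent's): the same 32-bit constant dumped byte by byte looks like the literal on a big-endian 68k, and byte-reversed on a little-endian x86/ARM box.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint32_t v = 0x12345678;
        const unsigned char *p = (const unsigned char *)&v;

        /* Big-endian (68k):    12 34 56 78  -- reads like the literal.
           Little-endian (x86): 78 56 34 12  -- bytes reversed in the dump. */
        for (int i = 0; i < 4; i++)
            printf("%02x ", p[i]);
        printf("\n");
        return 0;
    }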
Looking back, the similar but not nearly as successful NS320xx architecture was better. Atari apparently even prototyped it for use in what became the (68k based) Atari ST, but the first chips produced in that series were too buggy.
Both the 68k series and the NS32k series were heavily 'inspired' by the VAX.
One thing that makes the 68k cool is that it's about as "retro" of an architecture as you can get that still supports a modern C/C++ toolchain. GCC still (mostly) supports it.
I worked on the ST and helped evaluate the NS32032/32016. We started counting clock cycles for various operations and got some bad vibes, and then we started finding out about chip bugs (definitely not pretty, and National was coy about them and just said stuff like "we're working on it"), so we decided to use the 68K.
The NS parts were really a due-diligence affair, and we looked at them just to make sure we weren't missing out on something really great (we weren't).
Definitely a prettier architecture than the 68K, with better toolchains and development support at the time. Didn't make up for the bad, though.
Thanks, I always appreciate it when you chime in. Looking back, did Atari Corp actually build any hardware (prototype, wirewrap, etc.) with the 320xx? Or was it just evaluation from the POV of reading datasheets and the like? I've heard conflicting stories around this.
That and the story about Atari talking with Microsoft about an early version of Windows for their hardware.
Whether or not anyone wire-wrapped a board (pretty sure no one did), we didn't write a single line of code for the National parts. There were a number of visits by National sales reps and they dropped a lot of documentation on us, though. We were generally doubtful that the chip would be fast enough, and needed to make a processor decision very quickly; in late July 1984 we were talking about it, and I don't recall anything past August or so. By October the software folks were mostly in Monterey with a bunch of 68K-based systems.
Another factor in the 68K's favor was that it was pretty easy to buy off-the-shelf 68K-based workstations (e.g., Motorola VME-10, the Apple Lisa), which let us do 4-5 months of software development while waiting for the actual ST hardware.
There's some upside to having multiple register sets, though, and many quite successful ISAs have used them to good effect (CDC 6600, Cray-1/-2, &c.). It gives you more registers without having to use more bits for designators.
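To put some numbers on the designator-bits point (this is my reading of the 68000 encoding, not something the parent spelled out): a unified file of 16 registers would need a 4-bit field per operand, whereas the 68000 keeps its register fields at 3 bits and lets the rest of the instruction say which bank is meant, e.g. in the 6-bit effective-address field:

    mode  register   meaning
    000   nnn        data register Dn
    001   nnn        address register An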
>> Says coding the assembly on them is superior to any other option for medium to large use cases.
I still have a preliminary instruction set manual for the 68000 printed in 1979. At that time I was learning 8080A machine code. When I saw that book the hardware became something I HAD to get my hands on eventually. I finally got to code one in assembly in college and it was every bit as enjoyable as I had imagined.
BTW my dad wrote a disassembler for the 8080 in BASIC, and to simplify everything he rewrote all the mnemonics in a far simpler form than the official syntax from Intel. I still have the 16x16 instruction map. Because of that, the 68000's primary advantage was the increased number of registers and their 32-bit size, along with some better instructions. The pain of proper 8080 syntax was never a thing for me.
> my dad wrote a disassembler for the 8080 in basic, and to simplify everything he rewrote all the mnemonics in a far simpler form than the official syntax from Intel
If you or he still has his notes, would you be able to ZIP them up and upload them to archive.org?
>> If you or he still has his notes, would you be able to ZIP them up and upload them to archive.org?
Not sure why; we were the only 2 people on earth to ever use it. To summarize, he used single letters for verbs and then single letters for register names: a, b, c, d, e, h, l for the 8-bit registers and x, y, z for the 16-bit pairs (b,c), (d,e), (h,l). It turns out you could always figure out some implied stuff. MAY was move A to Y, but since A is 8 bits and Y is 16 this meant Y was an address, whereas MAD would be move A to D as an 8-bit register-to-register copy. This led to all mnemonics being 3 characters or less. There could also be 8 or 16 bit immediate values. The disassembler used a 4-byte decode table entry: 3 characters plus the size (0, 1, or 2) of the immediate data. It was super clean. I'll post it somewhere some day. I still have my Interact computer in its original box waiting to find a nice museum somewhere. And lots of tapes, including a bunch of little-kid-created BASIC programs. Hard copies of all the "Interaction" newsletters from the local users group too.
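For anyone curious what that looks like in practice, here's a rough C sketch of the decode-table idea. The original was BASIC, and apart from NOP and MAB the compact mnemonics below are my guesses at the scheme, not copies from his notes.

    #include <stdio.h>

    /* One entry per 8080 opcode: a short mnemonic plus the size (0, 1 or 2
       bytes) of the immediate data that follows. The original packed this
       into exactly 4 bytes per entry; the extra NUL here is just for printf. */
    struct entry { char mnem[4]; unsigned char imm_size; };

    static const struct entry table[256] = {
        [0x00] = { "NOP", 0 },  /* no operand */
        [0x47] = { "MAB", 0 },  /* MOV B,A in Intel syntax: move A to B */
        [0x3E] = { "MAI", 1 },  /* MVI A,d8: the 'I' for immediate is a guess */
        [0xC3] = { "J",   2 },  /* JMP a16: 16-bit address follows */
        /* ... remaining opcodes ... */
    };

    /* Disassemble one instruction at buf[pos]; returns bytes consumed. */
    static int disasm_one(const unsigned char *buf, int pos) {
        const struct entry *e = &table[buf[pos]];
        printf("%04X  %s", pos, e->mnem);
        if (e->imm_size == 1) printf(" %02X", buf[pos + 1]);
        if (e->imm_size == 2) printf(" %02X%02X", buf[pos + 2], buf[pos + 1]);
        printf("\n");
        return 1 + e->imm_size;
    }

    int main(void) {
        /* MVI A,42h; MOV B,A; JMP 0100h */
        const unsigned char prog[] = { 0x3E, 0x42, 0x47, 0xC3, 0x00, 0x01 };
        for (int pos = 0; pos < (int)sizeof prog; )
            pos += disasm_one(prog, pos);
        return 0;
    }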
This is so true, and also the reason for them being less popular. The 6502 wasn't nice like the 6809. Intel ISA is a mess but was simpler in hardware than M68K.
"IBM considered the 68000 for the IBM PC but chose the Intel 8088 because the 68000 was not ready; Walden C. Rhines wrote that thus "Motorola, with its superior technology, lost the single most important design contest of the last 50 years" -- Wikipedia
Too bad we missed out on M68K IBM PCs, but in the end Macs ended up running x86 (ARM next I suppose).
Fun fact: IBM released a 68k microcomputer aimed at the laboratory market in 1982. It faded away into obscurity, largely because it was more expensive than the PC, and sold into a role which the PC was more than capable of handling.
The corollary is that the same thing would have happened that caused Apple to move to Intel CPUs: basically, Motorola being unable to squeeze more performance out of it.
On the other hand, with the volume of units that IBM Compatibles had, it may have allowed more resources to push that boundary.
The 68k insn set is a rather traditional CISC architecture, there's not much that makes it "superior" to modern designs. It's comparatively simple because it never really got extended beyond recognition as other architectures have been, but other than that I can't see any reason to recommend it.
Back in the nineties I heard that, although the SNES had more capabilities in the way of color and 3D, "hard core" devs like David Perry could get better raw performance out of the 68000-based Sega Genesis for similar reasons.
Clock for clock a 6502/65816 would outperform a 68000 at raw byte moving and responding to interrupts, but the 68k would smoke it on anything involving heavy integer calculations, especially 16-bit or 32-bit math or longer words.
So as games got more complex, and clock rates higher, a 68k would definitely outperform.
The 6502 is very cycle efficient and its ridiculously small register set makes it fast to respond to interrupts, which I guess in a game machine is a good thing. But machines like the Amiga, Sharp X68000, and lots of arcade games packed in a bunch of custom video hardware too.
The 68000, with its neat ISA, linear address space and large register set, is much easier to program than a 6502/65816. All other things (SDKs, dev hardware) being equal, that will result in more, better games hitting the market at a faster cadence.
That statement might have been true when it was first written. It isn't anymore -- nobody is seriously using the 68000 in new hardware designs. Parts availability is limited, and is only going to get more so in the future.
Not really. The vast majority of the 68k series -- and in particular, all of the higher-performance parts -- have been out of production for many years. Some 68000 variants are still available, but are "not recommended for new designs" and are likely to be discontinued once supplies run out.
There are still systems lying around that these can be plugged into, giving the performance improvements with full backwards-compatibility and the original chipsets.
I have one of these. It's fun to play around with. You can download Amiga OS distros (one is called "Coffin") with tons of old apps and games already preinstalled. It takes me back to my youth.
It's not intended to be a replacement for a modern desktop. But it is impressive if you're into the retrocomputing / emulation scene.
They want to call it a successor to the m68040 / m68060, but they're more interested in adding incompatible instructions than they are in finishing emulation.
Not a fan of yet another target for Amiga software - we already have 68000, '020+, PowerPC AmigaOS, PowerPC MorphOS, PowerPC WarpUp, PowerPC PowerUp... And now they want to add "68080"? No, thank you.
I was about to reply to this, thinking what a coincidence, I just bought a copy of Amiga Format magazine this week. Of course, my mind was muddled and I meant Linux Format, but I guess this just means they occupy a similar place in my mind.
Do they have full programming information publicly available for their extensions yet? Something akin to the Motorola manuals but for their extended instructions and addressing modes, including the instruction encodings.
It’s about €500, IIRC. Unless you are too attached to hardware, software emulation is a more cost-effective option.
And it can JIT the 68K code into native instructions as it goes, so it'll end up much faster than anything implementable on a cheap FPGA. A much faster 68K doesn't make much sense for a gaming machine, but makes a lot of sense for emulating a Unix workstation or a 68K Mac.
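For anyone wondering what "JIT as it goes" means structurally, here's a heavily simplified C sketch of the usual shape (mine, not a description of any particular emulator): translated blocks are cached by guest PC, so each stretch of 68K code only pays the translation cost once. A real JIT would emit host machine code; the stub below just stands in for that step.

    #include <stdint.h>
    #include <stddef.h>

    /* A translated block: host code equivalent to the 68K instructions
       starting at some guest PC; it returns the next guest PC to run. */
    typedef uint32_t (*translated_block)(void);

    struct cache_entry {
        uint32_t guest_pc;      /* 68K address this block starts at */
        translated_block run;   /* cached host code for that block  */
    };

    #define CACHE_SIZE 4096
    static struct cache_entry cache[CACHE_SIZE];

    /* Stand-in for the real translator, which would decode the 68K
       instructions at pc and emit equivalent host machine code. */
    static uint32_t stub_block(void) { return 0; }
    static translated_block translate_block(uint32_t pc) { (void)pc; return stub_block; }

    uint32_t run_one_block(uint32_t pc) {
        struct cache_entry *e = &cache[pc % CACHE_SIZE];
        if (e->run == NULL || e->guest_pc != pc) {  /* miss: translate once */
            e->guest_pc = pc;
            e->run = translate_block(pc);
        }
        return e->run();                            /* hit: reuse cached code */
    }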