Discover the world of microcontrollers through Rust (japaric.github.io)
527 points by ingve on Sept 17, 2017 | 128 comments



This is exactly, exactly what I've been looking for. As a software engineer whose professional career has focused entirely around the web, embedded systems are a strange, enticing landscape with what seems to be an "old guard" that eschews the frenetic pace of web technologies and frameworks and relies on old, proven, get-shit-done tooling and technology. Which is amazing. But it also means that no one is writing blog posts on Medium about this stuff, and beginner tutorials are sparse. For spoiled Elixir devs like me, this is perfect.


Microcontrollers have become a lot more accessible since the Arduino. There's lots of beginner material out there, although admittedly not on Medium. (Unsurprisingly, the people involved prefer a more bare-metal approach and put it on blogs or forums instead.)

The tooling is nearly always C, so it's interesting to see Rust moving into this space. Memory management is not so much of an issue, but multitasking correctness is; perhaps there will be some new micro-RTOS framework with provably correct interrupt handling. (We have seL4, but that's not quite the same thing.)

As you get more low level, the tutorials get sparser. You move into FPGAs, where you have a choice of two languages with 1970s design principles, or third-party tools which work despite the manufacturer's total closedness. As you go deeper into actual IC design, nobody will attempt it without supervision from someone experienced to tell you all the little tricks. And then there's analog IC design, which is basically black magic.


> The tooling is nearly always C

With the ESP8266 and low-end ARM controllers, there has been an explosion of languages for embedded applications: MicroPython, Lua, JavaScript (ES), BASIC, even Lisp. Particularly for beginners, this is a good thing. Having said that, the dev environments are somewhat lacking at this stage, and I'm wondering if we'll soon be at a point where there'll be enough resources to just put Linux or another OS straight onto the microcontroller.


Yes, but the ecosystem is so heavily geared towards C there that if you want to do anything non-trivial, you'll need to use C anyway.

I wonder how easily one can link C libraries with MicroPython, that'd be the best of both worlds.


Shouldn't be too hard considering that the entire thing is written in C.


You can also use inline assembly on certain platforms (last I checked, anyway).


> "You move into FPGAs, where you have a choice of two languages with 1970s design principles"

I'm assuming you're referring to VHDL and Verilog. There are plenty of other hardware description languages out there: Chisel, CλaSH, MyHDL, etc. You can see a more complete list here:

https://en.wikipedia.org/wiki/Hardware_description_language


Sure, those languages might exist as a design document somewhere, but you are not going to have a lot of luck finding an FPGA vendor that actually supports them.


Chisel compiles to Verilog, so you don't need vendor support.


Similarly, CλaSH and MyHDL also compile to Verilog (or VHDL).


Oh, that's pretty cool. I might look into that then.


Also in this stack, one step below using an Arduino, would be to use existing microcontrollers (or other ICs), but design your own circuit board. In contrast to IC design, this is something you can do on a hobbyist budget. Actually, if you don't etch the board yourself, but have it made and mailed, all you need is soldering equipment to populate the board with the components.


I've got the equipment to do hot air soldering, and I flatter myself that I've gotten pretty good at some fine SMD work, but these days I rarely bother. The premium on a service like macrofab.com usually is less than the money I waste on spare parts I accidentally destroy and postage from many different distributors.


Can not thank you enough for that comment. I was looking for a local shop that could assemble a couple of prototypes for me and the quotes they provided were at least 3-4 times what macrofab shows me.


Any cheap China-based alternatives?


C++, Oberon, Ada, Java, Pascal, and BASIC are also options:

https://www.mikroe.com/products/#compilers-software

http://www.astrobe.com/default.htm

http://www.microej.com/resources/supported-platforms/

http://www.adacore.com/press/8-bit-avr-microcontroller/

As for sources of information, for me it used to be Elektor magazine, available in a few languages.

Up until a few years ago, it was still common to occasionally see Pascal listings in it.

But you are right, the embedded culture is mostly C, and using alternatives, even C++, tends to end in culture clash, as Dan Saks referred to it in his CppCon 2016 talk.


What kinds of things would you want proven correct about your code's interrupt handling?


Mostly whether the mechanism for passing data to and from the main program is sound; but also whether things like interrupt priority make sense and whether your re-trigger handling or re-entrancy are going to work properly (an interrupt happens while you're in the handler for the same interrupt). It might be sufficient simply to have a good set of primitives for this, rather than an actual proof.

Basically whether it's going to deadlock or miss interrupts. Deadlock is an immediate disaster, but at least the JTAG will help you... if the device hasn't self-destructed. Missing interrupts is worse because it's extremely hard to debug.

Forcing DMA to behave would also be great, although this isn't strictly a microcontroller issue. I've seen a few war stories where people are trying to debug memory corruption where the program is completely correct - but some other device has simply DMA'd over it. I think this was involved in https://googleprojectzero.blogspot.co.uk/2017/04/over-air-ex... too.


I believe Rust's safety guarantees are enough to ensure the soundness of passing data to/from interrupt contexts and interrupt handler re-entrancy. See RTFM[0] for an example and a good explanation. I think DMA handling can be made safe by using the right abstractions, and then you'd just need to trust that any piece of code that accesses the memory uses the abstraction instead of accessing it directly.

Checking interrupt priorities sounds like an interesting problem. What is the state of the art in deadlock prevention? It would also be cool if you could tell the compiler "I need this interrupt handler to return in less than n clock cycles." I wonder if someone could write a Rust compiler plugin to check that.

[0]: http://blog.japaric.io/fearless-concurrency/
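
To make that concrete, here's a minimal sketch (not from the linked post) of the usual pattern for sharing data with an interrupt handler, assuming the cortex-m crate's critical-section Mutex; the handler registration itself is elided:

    // Sketch only: sharing a counter between main code and an interrupt
    // handler, assuming the cortex-m crate's Mutex + critical-section API.
    use core::cell::RefCell;
    use cortex_m::interrupt::{self, Mutex};
    
    // Anything shared with an interrupt must be a Sync static; Mutex<RefCell<T>>
    // only hands out access inside a critical section, so data races are ruled
    // out at compile time rather than by programmer discipline.
    static COUNTER: Mutex<RefCell<u32>> = Mutex::new(RefCell::new(0));
    
    fn bump_from_main() {
        // interrupt::free disables interrupts for the closure's duration;
        // `cs` is the token proving we're inside the critical section.
        interrupt::free(|cs| {
            *COUNTER.borrow(cs).borrow_mut() += 1;
        });
    }
    
    // The interrupt handler (however your device crate registers it) would
    // use the same interrupt::free pattern to read or reset COUNTER.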


Top 10 Causes of Nasty Firmware Bugs

https://embeddedgurus.com/blog/2010/12/top-10-causes-of-nast...

It would be great if all of those were easy issues, or already solved.


> The tooling is nearly always C, so it's interesting to see Rust moving into this space. Memory management is not so much of an issue, but multitasking correctness is; perhaps there will be some new micro-RTOS framework with provably correct interrupt handling.

Give Céu a look[0]. It was specifically designed for embedded computing, and compiles to C, making it very easy to interface with existing libraries. It requires no manual memory management, and has very nice structured synchronous reactive concurrency primitives.

(EDIT: I realise that multitasking and concurrency aren't quite the same thing; it also has experimental interrupt support though)

Here's a made-up example for Arduino (for which it has support out of the box[1]). It fades two LEDs concurrently at different rates, resetting at the push of a button:

    #include "arduino/arduino.ceu"
    
    input  int PIN_02; // button input
    output int PWM_05; // LED outputs, remember that pins 5 and 6
    output int PWM_06; // have a higher PWM frequency on the UNO
    
    // a code block that concurrently fades an LED in and out
    // `pin`   the output pin of the LED
    // `min`   min value of the fade
    // `max`   max value of the fade
    // `delay`   number of milliseconds to wait between increasing/decreasing `analogWrite`
    code/await Fade_forever(var u8 pin, var u8 min, var u8 max, var uint delay) -> void do
      loop do
        var int i;
        loop i in [min->max] do // fade in loop
          if pin == 5 then
            emit PWM_05(i);
          else/if pin == 6 then
            emit PWM_06(i);
          end
          await delay ms;
        end
        loop i in [min<-max] do // fade out loop
          if pin == 5 then
            emit PWM_05(i);
          else/if pin == 6 then
            emit PWM_06(i);
          end
          await delay ms;
        end
      end
    end
    
    loop do //endless loop
      // if *any* of these three code blocks (trails) end, all of the remaining trails in
      // a `par/or` are aborted and code resumes (in this case, the loop restarts).
      // By comparison, a `par/and` construct would require *all* of the trails to
      // terminate before continuing.
      par/or do
        await PIN_02; // .. meaning that if a button is pressed, we reset the loop
      with
        // fade the LED at pin 5 quickly between 64 and 192
        await Fade_forever(5, 64, 192, 5);
      with
        // fade the LED at pin 6 slowly between 0 and 255.
        // Note that because it fades over almost twice the range (255 vs 128),
        // but not exactly twice, it's around eight times slower, not four, and
        // the two will slowly drift out of sync.
        // We can push the button to reset, however!
        await Fade_forever(6, 0, 255, 20);
      end
    end
[0] http://ceu-lang.org/, http://fsantanna.github.io/ceu/out/manual/v0.20/

[1] https://github.com/fsantanna/ceu-arduino


Can confirm; in my emag applications class I had tools from a "black magic something something" company printed on the top.


Small microcontrollers don't have the resources you're used to. The smallest thing I ever wrote code for was in the PIC family and had 176 bytes of RAM and like 4 or 8K of flash (EEPROM?). Even on more common parts you may find only a few K of RAM and tens or hundreds of K of flash. In that space we don't do any dynamic memory allocation, never mind garbage collection. If you did, it's entirely possible you'd run out of heap space and the program would crash. It's a really different world than what web developers and app developers are used to. I highly recommend taking the dive even if it's just for fun and to see a different type of software development.
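
To illustrate the style that implies, here's a small sketch (sizes made up) where the buffer is fixed at compile time and overflow is handled by truncation instead of a growing heap allocation:

    // Sketch of the "no heap" style used on tiny parts: buffer sizes are
    // decided at compile time; there is no malloc and no garbage collector.
    const LINE_MAX: usize = 16; // all the RAM we're willing to spend on a line
    
    /// Copy bytes into `buf` until a newline or until the buffer is full.
    /// Overflow is handled by truncation rather than by reallocating.
    fn read_line(input: &[u8], buf: &mut [u8; LINE_MAX]) -> usize {
        let mut n = 0;
        for &b in input {
            if b == b'\n' || n == LINE_MAX {
                break;
            }
            buf[n] = b;
            n += 1;
        }
        n
    }
    
    fn main() {
        let mut buf = [0u8; LINE_MAX];
        let n = read_line(b"hello, world\n", &mut buf);
        assert_eq!(&buf[..n], b"hello, world");
    }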


The recommended development board isn't that bad: 256 KB of flash, 48 KB of RAM. It's not as tight as a classic Arduino.

This is where Rust's safety helps. Debugging embedded code on small machines is a huge pain. The more problems caught at compile time, the better off you are. A compile time error beats JTAG debugging every time.

The article gets kind of vague once they get beyond LED-blinking and busy-waiting. They implement a brute-force CPU dispatcher and call it "async" programming. They never get to interrupts at all.

Rust on little machines makes sense, but it needs more support underneath to deal with timers, interrupts, and concurrency. There are projects working on this.[1]

[1] https://users.rust-lang.org/t/rust-for-embedded-development-...


> Rust on little machines makes sense, but it needs more support underneath to deal with timers, interrupts, and concurrency. There are projects working on this.[1]

NB. the OP is part of that project, from the same author.


Rust needs a CPU dispatcher underneath, so the thread and lock primitives can work. A minimal single-process multiple-thread OS, something like VxWorks, is all that's needed. This would probably be something you'd link with the program.

It's better to have one good CPU dispatcher than to make users roll their own crappy one for each application.


What do you mean by 'need'?

If you mean every Rust program requires a dispatcher to run, then no, Rust does not need a CPU dispatcher. Threads and locks are not primitives: they're implemented in the standard library (the std part, specifically). The thread-safety guarantees don't rely on that functionality; instead the arrows go the other way: the spawning and locking constructs build on the guarantees (driven by the Send and Sync traits, which are purely compile-time constructs) to provide an expressive yet safe API. For instance, there are numerous operating systems built in Rust; intermezzOS seems to be pure Rust except for the single file https://github.com/intermezzOS/kernel/blob/master/src/asm/bo... .
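
A tiny sketch of what that means in practice, using nothing but the standard library: the Send bound on thread::spawn is enforced entirely at compile time, no dispatcher involved:

    use std::rc::Rc;
    use std::thread;
    
    fn main() {
        // Rc is !Send: its reference count isn't safe to touch from two threads.
        let shared = Rc::new(42);
    
        // Uncommenting this line is a *compile-time* error, because
        // thread::spawn requires its closure to be Send + 'static:
        // thread::spawn(move || println!("{}", shared));
        drop(shared);
    
        // Plain owned data is Send, so this is fine.
        let owned = 42;
        thread::spawn(move || println!("{}", owned)).join().unwrap();
    }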

If you mean that it would be really nice if there was a general purpose dispatcher library available, then sure, that seems like something that would be great on crates.io.

In any case, I don't see how your comment relates to mine.


Rust with a CPU dispatcher offers the opportunity to get beyond the raw busy-wait and basic interrupt handling of Arduino-type code without going all the way to a full Linux system. That's what I'm getting at here.


The first micro I ever used (as a kid in the late '90s) was a PIC16C62A. 128 bytes of RAM, 3.5kB of EPROM. Not EEPROM, this was the UV-erasable stuff. I didn't have a UV lamp so I could only program during the day when I could take it outside & peel back the black electrical tape from the window to erase it.


And you might want to avoid multiplication operations, because the compiler will likely have to implement those in software, which will be slow, and might make your program too large to fit in the memory… :-)
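
For the curious, here's a rough sketch of the shift-and-add loop such a software multiply routine boils down to (an illustration in Rust, not the actual compiler support code):

    // Sketch: multiplying without a hardware multiplier, one shift-and-add
    // per bit of the second operand. Slow, and it costs code space too.
    fn soft_mul(mut a: u16, mut b: u16) -> u16 {
        let mut acc: u16 = 0;
        while b != 0 {
            if b & 1 != 0 {
                acc = acc.wrapping_add(a); // add the shifted operand in
            }
            a = a.wrapping_shl(1); // shift left, discarding overflow like the hardware would
            b >>= 1;
        }
        acc
    }
    
    fn main() {
        assert_eq!(soft_mul(123, 45), 123u16.wrapping_mul(45));
    }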


This is targeted at the Cortex-M4: ARM cores with hardware integer multiply and divide (and even integer SIMD).


Got it. I was thinking of the world of microcontrollers in general.


As a rule, I now try to avoid anything that isn't 32 bits with hardware multiplier. Unless you need a controller for under 50 cents I don't think that's a high bar any more. Floating point - much as I'd like to make it a minimum requirement I still enjoy doing some math in fixed point.

Oh for the day when RV32IMAF can be considered low end.


For me, it's the elegance of making do with the least sophisticated components. Using a microcontroller at all can be seen as bringing out the big guns.


The AVRs, at least some of them, have reasonable Flash and RAM and EEPROM (v nice), and a single-instruction multiply. Much nicer than the 8-bit CPUs that I used to program back in days of yore...


>a single-instruction multiply

But not single cycle ;)

This is important!


MUL/MULS/FMULS are all two cycles, to be exact.


OK, thank you for the correction!

Still a lot faster than a bunch of shift+add!


Pff, look at the fancy people over here!


The recommended 32Fs have FPUs so the hit isn't all that bad


I highly recommend nerdkits (http://www.nerdkits.com) for someone in your shoes. You basically build an Arduino and it explains each part along the way. The included PDF is worth the price alone.


All their kits and components are out of stock.


Yeah, front page last-updated 2013, too.


This embedded dude longs to use Erlang on his MCUs. Closest I can get to it is one of the embedded *nixes. Raspberry Pis are great for prototyping but not all that easy a sell on the BOM.


There's a new MCU from Microchip that has 32 MB of DRAM, for ~$7. Might get you closer to your Erlang dream.

It uses new, cheap multi-die assembly techniques, which Microchip also used in their ~$1 Bluetooth chip. So I think with time, and with an attractive software ecosystem, we'll see interesting new MCUs and price points.


> There's a new MCU from Microchip that has 32 MB of DRAM

...and a long list of errata, some of them quite serious: http://ww1.microchip.com/downloads/en/DeviceDoc/80000736A.pd...

(note that if you look for it on the Microchip website, the official link to the errata is also wrong at the moment!)

Some people also reported a few heat issues, since it's got to dissipate heat from the memory and heat from the MCU, which starts being a bit large and a bit fast (for an MCU). Nothing extreme, but it has to be considered in some setups.

IMO, it's better to wait for other revisions of this chip, other versions in this family, or something else.


This is quite a list. There are so many "Module x is not functional. Workaround: None." entries it's amazing. How did they manage to screw up the VBAT pin?


~$1 Bluetooth chip, is that the ATBTLC1000?


https://www.newbiehack.com/MicrocontrollerTutorial.aspx is an OK (not great, but not bad either) way to learn about microcontrollers. We used this site in a course I took.

YMMV though. People in my class who lacked the experience I had with C and C++ from before had to struggle quite a lot.


One could argue that blog posts wouldn't be needed since the books written on the material are generally accepted as mature coverage of the subject


At the recent RustConf I got to take a course working with TockOS. I highly recommend people check it out: https://www.tockos.org/documentation/getting-started/

It was a lot of fun to work with and looks to have a growing set of common drivers, etc.

The OP guide looks to be lower level and wonderful for learning. Reading through it very quickly, it looks very thorough, well done!


I like TockOS too. As a hardware product manager, I look forward to using it in a future project.


This is well done, with lots of explanation around the bits that hang up a lot of people. Also, the Windows Subsystem for Linux (WSL) has become useful enough for embedded systems development without running a virtual machine. That helps minimize complexity a bit further for Windows users.


Anyone know what would be required to get Rust running on the ESP32 boards? I picked up a couple for $6 each (Wemos LoLin32), and it's an impressive board for the price. Wifi and bluetooth built in.

I assume it's non-trivial or someone would have done it already, but I'm curious if it might happen eventually. Been poking at C which is already a bit out of my wheelhouse, but it'd be a good excuse to check out Rust too if I could.


There's been some experimentation with using https://github.com/thepowersgang/mrustc to target the ESP8266: https://github.com/emosenkis/arduino-esp-rs

The same would work for the ESP32.


I don't know about Rust, but MicroPython runs quite well on the ESP8266/ESP32 boards. Might be worth investigating if you're looking for something a bit more accessible than C.

https://github.com/micropython/micropython-esp32



MicroPython kind of misses the point in this case though. The desire for Rust is for something not only safer and higher level but also fast and low on memory usage. Rust on the esp32 with Rusty bindings to esp-idf would be fantastic.


I'm sure I'd be more productive in MicroPython, but I'm using it more as a fun excuse to pick up a lower level language than anything else.


I agree that the ESP32 would be a really cool way to get some practical experience with Rust.

If you want to get started more easily take a look at Neil Kolban's book on the ESP32.[1] Other environments besides C include C simplified with the Arduino IDE, Python, Lua, and JavaScript.

[1] https://leanpub.com/kolban-ESP32


If you're looking to make a commercial application with the ESP32 or ESP8266, I'd say bite the bullet and learn C. The Espressif SDKs are pretty decent.

I'm making a commercial product with the ESP8266 with the FreeRTOS SDK, and it's been relatively painless so far.


I've recently bitten the bullet and started porting my C++ Arduino code to C to run on ESP32s and take advantage of some of the more advanced features: watchdogs and FreeRTOS tasks.

I would much prefer to write the code in Rust, but even if it were possible and I got it all working, eventually some other chip will be the new hotness, and using C ensures I need to do the least amount of work to get it running on that chip.


Did you take a look at ChibiOS?


Given that ESP32 is already a bit better than an Amstrad PCW 1512, it would be nice to have other options as well.

MS-DOS had lots of programming language options.


There isn't any LLVM support yet, and it doesn't look like anyone's working on it.


I heard that Espressif was working on an LLVM port, but we'll see...


That has died. Every now and then I'm thinking this might be a fun project, but... well.


You could try compiling with the resurrected LLVM C backend, llvm-cbe, and then compile that C code with existing tools. Good luck debugging that mess if you ever need to, but Rust has compile-time safety, right? ;)


I've dabbled with microcontrollers on and off.

I started with the Basic STAMP, played around with things like the OOPic, tried my hand at PIC assembly on the 16F84 and 16F877, enjoyed the AVRs (ATmega328 and ATmega644) for a while, dabbled a bit with the ultra-low-power MSP430 (via the Launchpad), backed the MicroPython project and got a controller out of it, and have some ESP8266 boards that are just waiting to be played with.

All of that was self-taught via online tutorials and forums. While some of this stuff is obscure, there's a ton of information out there for anyone to learn it.

The biggest barrier to entry is money - developing for these platforms requires buying hardware, both the controllers themselves and the programming tools (e.g., I use an AVRISP mkII to program ATmega chips). Many controller manufacturers make development boards that include a programmer, controller, and various peripherals to play with - this particular post focuses on the STM32F3DISCOVERY board, which seems to include the programming tool, a button, some LEDs, plus breaks out the controller's pins so that you can hook up whatever you need.


The biggest barrier to entry is not money but time. Every year the boards get cheaper while becoming more powerful. An ESP8266 costs just $5 now and it comes with WiFi. A WiFi board for older boards that don't come with WiFi costs > $5.

The biggest barrier to entry is time because, unlike the web or other higher-level software stacks, blog and GitHub resources are nowhere near the level where you feel you can figure out how to make something work outside the Arduino and RPi platforms. Want to learn how to code the BeagleBone's PRU with rproc and not the deprecated uio? Good luck. Want to learn how to configure a TMC2130? Good luck. In order to drive the microcontroller space forward, the IC manufacturers really need to produce thorough tutorials that newbies can understand and get up to speed with, and keep them up to date. Just take a look at deep learning, which became popular no sooner than the microcontroller world did. I can easily list 5 top tutorials and MOOCs that a newbie like myself can follow and get some meaningful understanding or output from. Deep learning is arguably a much tougher and vaster topic as well.


That's because basic deep learning is simpler than the advanced microcontroller features you listed.

There are plenty of Arduino courses too.

Also microcontroller manufacturers don't really make their money from hobbyists.


I would place Arduino among higher-level programming, since you are not touching registers and you don't know what is under the hood of Serial.print() (buffers? A circular one that can overflow? Blocking? Interrupt-driven?). It is more for playing around with a prototype.
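
For reference, what typically hides under an interrupt-driven Serial.print() is a small ring buffer along these lines (a sketch; the capacity and the drop-when-full policy are made up, real cores differ):

    // Sketch of an interrupt-driven TX ring buffer like the one a Serial
    // driver typically hides behind print().
    const TX_LEN: usize = 64;
    
    struct TxRing {
        buf: [u8; TX_LEN],
        head: usize, // next slot to write (filled by print calls)
        tail: usize, // next slot to read (drained by the TX-empty interrupt)
    }
    
    impl TxRing {
        const fn new() -> Self {
            TxRing { buf: [0; TX_LEN], head: 0, tail: 0 }
        }
    
        /// Producer side: returns false when the buffer is full
        /// (a real driver might block or silently overwrite instead).
        fn push(&mut self, byte: u8) -> bool {
            let next = (self.head + 1) % TX_LEN;
            if next == self.tail {
                return false;
            }
            self.buf[self.head] = byte;
            self.head = next;
            true
        }
    
        /// Consumer side: called from the UART "data register empty" interrupt.
        fn pop(&mut self) -> Option<u8> {
            if self.head == self.tail {
                return None; // nothing queued; a real driver disables the interrupt here
            }
            let byte = self.buf[self.tail];
            self.tail = (self.tail + 1) % TX_LEN;
            Some(byte)
        }
    }
    
    fn main() {
        let mut ring = TxRing::new();
        for &b in b"hi" {
            ring.push(b);
        }
        assert_eq!(ring.pop(), Some(b'h'));
        assert_eq!(ring.pop(), Some(b'i'));
        assert_eq!(ring.pop(), None);
    }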


You can buy an Arduino for $10 and program it with a USB cable and any Windows/Mac/Linux computer.


The esp8266/nodemcu stuff is super cheap to get into. AliExpress has the boards for $2-3 each and I bought a bunch of sensors (temp/humidity, accelerometer, magnetometer) for around the same price. For less than $100 you can have a bag full of this stuff for weekend projects.


I've been really impressed with the esp8266. The only question is what I want to do with it!


Likewise. It has been decades since I last touched a soldering-iron, but this year I've built a few fun & useful projects with ESP8266 boards.

To make them attractive I've also started getting into 3D-printing, paying other people to make cases/boxes/shells for me.


You can get Arduino type boards for a few bucks on Amazon, which can be pretty easily turned into an ICSP programmer for arbitrary other microcontrollers. That helps a lot. Even just using it as an Arduino if you skip the "IDE" is a great way to start.


Yeah, the Arduino really opened up the world of microcontrollers to a lot of people. It's easy to get started and the price is reasonable.

I've used an Arduino as a glorified breakout board thanks to the ICSP header (connected to an AVRISP mkII), and I've also used it with the Arduino IDE.

It's a surprisingly versatile platform.


Is there any significant work to add 8-bit microcontroller support to Rust?


Yeah, there's progress on 8-bit and 16-bit support. We're tracking our work on improving Rust for embedded systems in this roadmap issue [1], with details in this RFC repo [2]. As part of that, we have preliminary support for the 16-bit MSP430 CPU in-tree [3], and a fork for 8-bit AVR support in [4].

[1]: https://github.com/rust-lang/rust-roadmap/issues/15

[2]: https://github.com/rust-embedded/rfcs

[3]: https://github.com/rust-embedded/rfcs/issues/20

[4]: https://github.com/avr-rust/rust


I didn't know LLVM had MSP430 and AVR backends. Neat!


I believe the AVR backend was broken (and out of tree) for a long time, but revived by someone interested in getting Rust onto it.


Whoa sounds awesome. It's nice to see 8 and 16-bit CPUs are not neglected :)



Not that I've seen, but I also think it would not be productive given 8 bit micros are going the way of discrete TTL logic these days. When it is cheaper to buy a 32 bit machine with more flash and ram than an 8 bit machine in a similar package, the software folks always ask the hardware folks for the nicer machine.


Well, sort of. It depends on your BOM cost targets. I remember seeing (a few years ago now) an AVR in a tiny 6 pin surface mount package about 0805-ish size. 8 bit micros are the new 555.

I have a friend that does a lot of consumer electronics development on contract. One of his favorite parts is an 8-bit micro that cost 6 cents in bare die, and that was a few years ago. The part is probably 4 cents now. A 4 dollar ARM looks ridiculous next to a 4 cent part in certain application spaces.

But in my world - low-volume mechatronics - the ARM with 1 MB of flash is the no-brainer choice.


Hey, any chance your hardware developer friend could spare a few minutes to talk about his/her work? I'm a freelance hardware+software developer and there's not many of us out there it seems. It would be great to get some feedback on some long-term business goals. You can reach me at [email protected] Thanks!


I'm one too, and I lurk on several subreddits these days.


OT, but are you from Jarnac? A good friend of mine lives in Tilloux


No, my handle comes from a famous duel. A case of judicial combat that undermined the French crown.

http://www.thearma.org/essays/DOTC.htm#.Wb8agK3Mzyg


Any idea where one could get that 4-cent MCU and how to assemble it?


Chinese supplier. It comes as passivated bare die. You glue it to the board, need a wire-bonding machine to attach connectivity, and then encapsulate it with a blob of epoxy. Serious volume stuff. The dev kit is a few hundred bucks though, so it is reasonable to work with. Reply if you are still interested. But if you aren't already assembling in China, it probably isn't for you.


Those are volume prices.


The 8bit micros are becoming smaller and cheaper though, they're still very useful in certain applications that require very little power (both electrical and processing).


You are correct that there are places for them, but generally not places that need a high-level language.

Which is the interesting thing for me. At the current process nodes (22nm and even 45nm) the cost of the packaging and bin testing dominates. It costs, to a fairly close approximation, exactly the same to put in a chip that is 8 bits as one that is 32 bits.

I had a great discussion with a product manager from ST Micro about this at their recent Developers Conference in the Bay Area. They had some collateral on the STM8 series and I asked about it and he shared the above but said that it was either legacy designs, essentially CPLD type designs, or HW engineers without a software person to support them that seemed to still use 8 bit machines. From ST's perspective the cost to produce was the same.

At Digikey the lowest cost 32 bit processor is 58 cents[1], and the lowest cost 8 bit processor is 40 cents[2].

[1] https://www.digikey.com/product-detail/en/stmicroelectronics...

[2] https://www.digikey.com/product-detail/en/stmicroelectronics...


> But generally not places that need a high level language.

Are you referring to Rust here?


Sometimes even C. Those little chips don't have enough RAM or flash to handle the overhead. Often the only choice is assembly.


16-bit seems more common than 32-bit; there are LOTS of low-end processors. 32-bit processors are (relatively) expensive, and often don't have anywhere near as good low-power operation.

On our product we have up to 4 processors: one 32-bit, two 16-bit, and one 8-bit. The smallest only has 20 bytes of RAM.

Ideally I'd like to program them all with a super-typesafe language.


Rust is by definition for new projects, so there are simply no more 8-bit microcontrollers around. Those things are moving into museums. And good riddance, M0s are soo much better.


There are plenty. They're ridiculously cheap, so they get used everywhere. Checking DigiKey for the cheapest 8 bit and 32 bit micros: Atmel's ATTiny5-MAHR is $0.196 each at qty 100 (no further price breaks listed, full reel of 3000 is non-stock/call for price). The Cypress CY8C4013SXI-400T is $0.49 each at qty 2500 (full reel). 2.5x the price is a big difference, and matters a lot when you're producing thousands or millions of something. Tons of power supplies use an 8-bit (or even 4-bit) microcontroller instead of or in addition to a dedicated switchmode controller ASIC. They're also ridiculously common as LED dimmer controllers, keypad controllers for microwave ovens, etc. 8-bit micros aren't going anywhere any time soon. Price matters too much.


> so there are simply no more 8-bit microcontrollers around

Atmel still produce a significant number of 8bit AVRs.


Yeah ..

http://www.mouser.at/Semiconductors/Integrated-Circuits-ICs/...

Apropos of "M0s being so much better", this is not really an attitude that produces effective results in the embedded world.

True fact: in embedded, many times you want the least powerful solution. 8-bit MCUs are freakin' everywhere, and they are not going anywhere.


Modern ARM Cortex-M0 MCUs are cheaper than AVRs, use less power, and have much better peripherals. The only reason to use an AVR MCU these days is if (i) you need to run it off a 5V rail or (ii) you prototyped your product with an Arduino.


Cheapest M0 on Digikey: https://www.digikey.com/product-detail/en/cypress-semiconduc...

Cheapest 8-bit AVR on Digikey: https://www.digikey.com/product-detail/en/microchip-technolo...

2.5x the price is not cheaper.


Ok, yes, at the very very low end (that thing has 6 pins!) you might get an AVR for slightly cheaper. My point was that you typically get better price, power usage and performance from a Cortex M0. If you have $1 to spend, for example, there is rarely any advantage to going with AVR.


2.5x doesn't seem like a useful way of looking at it. It's $0.24 vs $0.49. What kind of devices are these things going in? It matters if the rest of the components are $1, not so much if they're $10.


I used to sell parts to Compaq when they made their own server boards. Design manager was trying to iron millibucks out of the BOM cost. That's tenths of pennies. It matters even if you're charging $100. Those pennies add up when you're selling 10-100ku per month on multiple product lines.


For very low quantities, the price difference won't matter much.

If you're selling hundreds of thousands or millions of something, suddenly that is real money that justifies the incremental increase of engineering effort to use a cheaper part.


A lot of the devices they're going in are at or below the $5 retail price. So the COGS has to be under $2 to make a reasonable profit. That doesn't leave much room to waste money on your BOM.


Cortex M has higher I/O latency.


Are you referring to the fact that you can switch a GPIO port with a single instruction on an 8-bit micro?

That's true, but you can usually run the ARMs at higher clock speeds.


Atmel have just been bought by Microchip in an industry-wide reorganization that is sure to swallow the 8 bit micros.

And of course they still make tons of them, it's embedded, you need to supply these chips for another 10 years for old designs.

But you shouldn't use an 8-bit uC in a new design.


Microchip's bread and butter is still cheap 8-bit controllers like the PIC10/12/14/16/18 families. Many devices don't need a 32-bit core, so wasting a few cents on each unit by specifying one drives up cost at high volume.

While the core of a 32-bit controller doesn't take up much die area at smaller process nodes, core size has never really been the limiting factor with MCUs. Look at any MCU die - the flash and RAM both dominate in size over the logic. Going to a 32 bit controller generally means that your instructions are going to be wider, so your program space is going to take up more flash. You will also tend to use more RAM unless you are judicious about word size, and a context switch on an RTOS will cost more memory since the context contains much wider registers.

Another pain point with the 32-bit MCUs is that ARM currently dominates the market, and their licensing fees add a few cents to the cost of each part - which can be a killer at volume. There have been attempts to break ARM's monopoly (most notably Microchip's MIPS-based PIC32 series) but it seems like ARM will continue to dominate this space in the years to come.

As a final note, SiLabs recently introduced an ultra-low power microcontroller line based on the venerable 8-bit 8051 core, so clearly they think that 8-bit MCUs have a role to play in the years to come.


> that is sure to swallow the 8 bit micros.

What makes you say that?

Microchip released new XMEGAs in May. [0]

[0] http://www.microchip.com/wwwproducts/en/ATXMEGA64A1U


No: pick the right tool for the job. Money wasted on the MCU is probably wasted 5x to 10x over in the price of the product embedding the MCU. And if that's a consumer product, good luck with the competition.


Having a more powerful MCU (and more RAM) can reduce time to market. You can use an RTOS, heavy libraries, oversized buffers, etc.


Intel 8051 cores are widely deployed in IoT devices.


>"But what's this ttyUSB0 thing? It's a file of course! Everything is a file in *nix."

Isn't this not really true? I remember it was a long point of discussion in an embedded class I took years ago but can't remember exactly what was said.


Many of the APIs for devices go through special ioctls, not through read/write of the file contents. That's possibly what was being referred to.


I know that was originally the idea, but I doubt very much they've religiously kept up with that abstraction.


It's semi-true. Most things are still basically files, it's just that a lot of things get shoved through ioctls rather than reads, writes, and mmap. You still configure PCIe DMA through a file-based interface, but the ring buffers get mapped directly into your process.

I think technically "everything" at least starts with an access to a file, but maybe not every interaction is done with a file.
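
A small illustration of that split, assuming Linux and the libc crate (error handling kept minimal): the serial device opens like a file, but reading the modem-control lines goes through an ioctl, not read():

    // Sketch: /dev/ttyUSB0 behaves like a file, but some of its API goes
    // through ioctl instead of read()/write(). Assumes Linux + the libc crate.
    use std::fs::File;
    use std::os::unix::io::AsRawFd;
    
    fn main() -> std::io::Result<()> {
        let port = File::open("/dev/ttyUSB0")?; // "everything is a file"...
        let mut status: libc::c_int = 0;
        // ...except that querying the DTR/RTS/CD lines is an ioctl, not a read.
        let rc = unsafe {
            libc::ioctl(
                port.as_raw_fd(),
                libc::TIOCMGET as libc::c_ulong,
                &mut status as *mut libc::c_int,
            )
        };
        if rc < 0 {
            return Err(std::io::Error::last_os_error());
        }
        println!("modem control bits: {:#x}", status);
        Ok(())
    }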


Started out with the 8052 in college. Graduated to the PIC. Upgraded to the AVRs. Last seen with Xilinx FPGAs before being kidnapped by fullstack web development.

Man do I miss this stuff. Articles like these make me want to get back into it.


The current gen Xilinx stuff is pretty awesome. Digilent is even making boards aimed at the "maker" community: http://store.digilentinc.com/arty-a7-artix-7-fpga-developmen...


Why did you leave? I'm making the opposite journey right now, minus the job switch.


Not the parent. But I've done a bunch of both hardware and software, and frankly I would go for software any day of the week. It's a lot harder to get into a flow with hardware. And it just feels a whole lot less free than software. You can go from a pretty vague idea to a working prototype very rapidly with software, but with hardware that both takes a lot longer and the types of things you can imagine are a lot more constrained.

If you want a taste, go build some stuff in pure x86. You'll pretty quickly get tired of it and will want to build higher order abstractions, and then you're back in software-land.


Interesting, I felt the opposite.

Doing stuff with hardware expands your creative horizon to do things in the real world, freeing yourself from the confines of the virtual realm. 3D print what you need, make it come alive with MCUs.

Also much of modern software development feels like just plugging lego bricks, but in embedded you actually have to know how stuff works, down to the register level.

But then those were purely hobby projects, maybe it feels very different when doing it for a living.


At the time, I was in NYC and I wanted to stay. I couldn't find any hardware related jobs there so I switched over to software.

Now I live in the Bay Area and am working on a startup. Iteration cycles are much faster so I can learn quicker and ship quicker. Also, distribution is a lot easier.


I got an STM32 Nucleo board some time ago that is sitting in a drawer. I want to learn Rust, would it apply?




