It used to be that real programmers programmed in Hex/Assembly, then FORTRAN, and now I'd probably say C.
And now with web-based apps, cloud computing and distributed systems, will the current low-level languages remain relevant to the average software developer?
The lower you go, the more you will know. But seriously, there is a lot of upside to knowing exactly what's going on in the hardware when you are pushing bits around at a high level. A downside is that you can waste time writing at a level where a compiler/interpreter would generate much better code for you.
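To make that concrete, here is a small C sketch (the function names and the particular bit trick are just for illustration, not from any specific project): the hand-tuned version and the obvious loop compute the same thing, and a modern optimizing compiler will often turn the simple loop into a single machine instruction anyway, which is the "waste of time" trade-off mentioned above.

    #include <stdint.h>
    #include <stdio.h>

    /* Hand-rolled population count using the classic SWAR bit trick. */
    static unsigned popcount_manual(uint32_t x)
    {
        x = x - ((x >> 1) & 0x55555555u);
        x = (x & 0x33333333u) + ((x >> 2) & 0x33333333u);
        x = (x + (x >> 4)) & 0x0F0F0F0Fu;
        return (unsigned)((x * 0x01010101u) >> 24);
    }

    /* The obvious loop: with optimizations on, gcc/clang will often
     * recognize this idiom and emit a single popcount instruction. */
    static unsigned popcount_simple(uint32_t x)
    {
        unsigned n = 0;
        while (x) {
            n += x & 1u;
            x >>= 1;
        }
        return n;
    }

    int main(void)
    {
        uint32_t v = 0xDEADBEEFu;
        printf("%u %u\n", popcount_manual(v), popcount_simple(v));
        return 0;
    }

Knowing why the first version works is the "upside" of understanding the hardware; knowing when not to bother writing it is the other half of the lesson.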
On the one hand, software developers need more powerful languages that can do more and abstract more. These improve their output and efficiency; you can build better software faster with higher-level languages. They aren't constrained by the computers they're working on; they themselves are the constraint.
On the other hand, embedded systems developers need lower-level languages because they're constrained by the system. You need to write efficient code if your entire program is to fit onto a microcontroller with 64K of program memory and 16K of RAM (see the sketch at the end of this answer).
Regular "programmers" used to use lower-level languages because they were also constrained by the system.
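For flavor, here is a rough C sketch of the kind of code that lives at that level, written for a hypothetical microcontroller (the register addresses, names, and pin number are invented for illustration; a real part's datasheet defines the actual map). You talk to hardware through memory-mapped registers and you count every byte of RAM.

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO registers -- addresses are made up
     * for this example; a real device's datasheet defines the real ones. */
    #define GPIO_DIR  (*(volatile uint32_t *)0x40020000u)
    #define GPIO_OUT  (*(volatile uint32_t *)0x40020004u)

    #define LED_PIN   (1u << 5)

    /* Pack several on/off flags into one byte instead of one int each;
     * with only 16K of RAM, those savings add up. */
    static uint8_t status_flags;
    #define FLAG_LED_ON (1u << 0)

    void led_init(void)
    {
        GPIO_DIR |= LED_PIN;           /* configure the pin as an output */
    }

    void led_toggle(void)
    {
        GPIO_OUT ^= LED_PIN;           /* flip the output bit directly */
        status_flags ^= FLAG_LED_ON;   /* track state in a single bit of RAM */
    }

There is no runtime, no garbage collector, and no room for one; that is exactly the constraint that keeps low-level languages relevant here.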