Both RISC and CISC are usually used in the context of describing Turing-complete instruction sets, so I'm not sure the distinction is relevant here.
If you want to make a comparison in this flavour: Turing machines are a bit like CPUs in that they execute arbitrary programs one step at a time. All the flavours of machine learning are more like GPUs: they thrive on oodles of big, parallelisable matrix multiplications interspersed with simple non-linear transformations.
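To make that concrete, here's a toy sketch (all shapes and the layer count are invented for illustration) of why that workload maps so well onto GPU-style hardware: the entire hot path of a small network's forward pass is just two big matmuls with a cheap elementwise nonlinearity in between.

```python
import numpy as np

rng = np.random.default_rng(0)
x  = rng.standard_normal((64, 512))    # a batch of 64 inputs
W1 = rng.standard_normal((512, 1024))  # layer 1 weights
W2 = rng.standard_normal((1024, 10))   # layer 2 weights

# Two matrix multiplications, one simple non-linear transformation.
h = np.maximum(x @ W1, 0.0)  # matmul + ReLU
y = h @ W2                   # matmul
print(y.shape)               # (64, 10)
```

Every multiply-accumulate inside those matmuls is independent of the others, which is exactly the kind of embarrassingly parallel work GPUs eat for breakfast.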
Well, we're talking about a "native" implementation of both for comparison, right? Neural nets as they're used today are emulated on our Turing-machine-like processors, which is why they run like ass in practice. Something like an analog circuit that sums voltages would be a native NN implementation, and it would vastly outperform any Turing machine on the wide, highly parallel, memory-heavy tasks it's suited for. Either architecture emulating the other is slow and bloated.
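You can get a rough feel for the emulation penalty even without analog hardware. A sketch below compares a matmul done one scalar multiply-accumulate at a time (the strictly sequential, step-by-step style of work) against the same matmul handed to vectorised hardware via NumPy's BLAS backend. Caveat: Python interpreter overhead exaggerates the ratio enormously, so treat the numbers as directional, not as a clean hardware comparison.

```python
import time
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((128, 128))
B = rng.standard_normal((128, 128))

def matmul_scalar(A, B):
    # One multiply-accumulate at a time, the way a strictly
    # sequential machine steps through the work.
    n, k = A.shape
    m = B.shape[1]
    C = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            acc = 0.0
            for p in range(k):
                acc += A[i, p] * B[p, j]
            C[i, j] = acc
    return C

t0 = time.perf_counter(); C_slow = matmul_scalar(A, B); t1 = time.perf_counter()
t2 = time.perf_counter(); C_fast = A @ B;               t3 = time.perf_counter()

assert np.allclose(C_slow, C_fast)
print(f"scalar loop: {t1 - t0:.3f}s   vectorised: {t3 - t2:.6f}s")
```

The gap runs to several orders of magnitude, and the same logic cuts the other way: a pile of voltage adders would make a miserable general-purpose sequential machine.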