That was my first instinct as well, but I think he was just asking whether ReLU is required by the hardware design, or whether other activation functions can be used too. If ReLU were baked into the hardware itself, it wouldn't be possible to substitute tanh or sigmoid (which may be better in certain situations); so the real question is whether ReLU is mandatory or there's flexibility in the choice of activation function.
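For context on why the flexibility matters: the three activations behave quite differently on the same inputs. A minimal numpy sketch (the function names and sample values here are just illustrative, not from any particular hardware API):

```python
import numpy as np

# If ReLU were fixed in silicon, these alternatives would have to be
# computed elsewhere or approximated.
def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))      # zeroes out negatives, unbounded above
print(np.tanh(x))   # saturates smoothly into (-1, 1)
print(sigmoid(x))   # saturates smoothly into (0, 1)
```

The saturating functions (tanh, sigmoid) preserve sign/magnitude information for negative inputs that ReLU discards entirely, which is one reason a designer might want the option.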