
That was my first instinct to that question as well, but I think he was just asking whether ReLU was required by the hardware design or whether other activation functions could be used too. If ReLU were baked into the hardware itself, it wouldn't be possible to swap in tanh or sigmoid (which may work better in certain situations); so the question was really whether there's any flexibility in the choice of activation function.
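
For anyone following along, here's a rough sketch (plain NumPy, nothing to do with the specific hardware being discussed) of the activations in question; in software they're interchangeable one-liners, which is exactly the flexibility that's lost if the silicon hardwires ReLU:

    import numpy as np

    def relu(x):
        # Clamps negatives to zero; cheap to implement in hardware.
        return np.maximum(0.0, x)

    def sigmoid(x):
        # Squashes input to (0, 1); needs an exponential.
        return 1.0 / (1.0 + np.exp(-x))

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(relu(x))      # [0.  0.  0.  0.5 2. ]
    print(np.tanh(x))   # squashes to (-1, 1)
    print(sigmoid(x))   # squashes to (0, 1)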



ah, makes sense. guess I was the fool :P
