I think the post represents the typical data scientist's approach to a machine learning problem pretty well. Deep learning is exciting but, for practical problems, most people aren't using it yet. XGBoost is still dominant in Kaggle competitions. Lots of companies with lots of data still use simple, single-machine models.
So, I think it's a little misleading to say that deep learning eliminates the need for feature engineering. If we're going to provide business value to a company, we're often dealing with somewhat structured data and not "unstructured blobs of pixels or blobs of text". There are several simple linear models with highly engineered features in production where I work that resist attempts to replace them with more clever models and less feature engineering.
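To make that concrete, here's roughly the shape of those models (a toy sketch in Python, not our actual production code; the column names and feature transforms are invented for illustration):

    # Toy sketch: a plain logistic regression on top of hand-engineered
    # features. The raw columns and the log/ratio transforms are made up.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    def engineer_features(df):
        # Domain-driven transformations tend to matter more than the model class here.
        out = pd.DataFrame(index=df.index)
        out["log_spend"] = np.log1p(df["spend"])
        out["spend_per_visit"] = df["spend"] / (df["visits"] + 1)
        out["is_new_user"] = (df["account_age_days"] < 30).astype(int)
        return out

    # Fake data just so the sketch runs end to end.
    rng = np.random.default_rng(0)
    raw = pd.DataFrame({
        "spend": rng.gamma(2.0, 50.0, size=500),
        "visits": rng.poisson(5, size=500),
        "account_age_days": rng.integers(1, 365, size=500),
    })
    y = (raw["spend"] / (raw["visits"] + 1) > 30).astype(int)

    model = make_pipeline(StandardScaler(), LogisticRegression())
    model.fit(engineer_features(raw), y)

The point is that the modeling step is trivial; the value is in the feature function, which encodes domain knowledge a generic model won't discover on its own from this kind of data.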
Deep learning is awesome and there may be a time when it solves every problem. But let's not oversell it. For now, if someone wants to do machine learning and get paid for it, they're more likely to see their colleagues using the techniques in this article than training deep multi-layer neural nets.
Deep Learning is heavily used in Kaggle competitions when there is image data.
Even in non-image data competitions, a deep neural network will often be one of the models chosen for the ensemble. They generally perform slightly worse than XGBoost models, but have the advantage that their predictions often aren't closely correlated with XGBoost's, which helps fight overfitting when tuning the ensembling hyperparameters.
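As a rough illustration of that kind of blend (a toy sketch assuming scikit-learn and xgboost, with a synthetic dataset and an arbitrary 50/50 weighting, not anyone's actual competition pipeline):

    # Sketch: average an XGBoost model with a small neural net on the same
    # tabular data, then compare validation AUC and prediction correlation.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import roc_auc_score
    import xgboost as xgb

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

    gbm = xgb.XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05)
    gbm.fit(X_tr, y_tr)

    nn = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    nn.fit(X_tr, y_tr)

    p_gbm = gbm.predict_proba(X_val)[:, 1]
    p_nn = nn.predict_proba(X_val)[:, 1]

    # The NN is usually a bit weaker on its own, but if its errors aren't
    # closely correlated with the trees', the blend can edge out either model.
    print("xgb AUC:  ", roc_auc_score(y_val, p_gbm))
    print("nn AUC:   ", roc_auc_score(y_val, p_nn))
    print("blend AUC:", roc_auc_score(y_val, 0.5 * p_gbm + 0.5 * p_nn))
    print("pred corr:", np.corrcoef(p_gbm, p_nn)[0, 1])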
For image competitions you are right: neural networks are often part of winning teams' ensembles, but they require a lot more work than something like xgboost (gradient-boosted decision trees). For datasets that aren't images or NLP, xgboost is in general much more widely used than neural nets. Neural nets demand far more computing resources and expertise to apply well, though given unlimited knowledge and compute they are probably on par with or better than xgboost. And if you need to analyze an image, they are great.