Funnily enough, from a mathematical point of view, it's the opposite: integration is a really nice operation that can be applied to basically anything, whereas differentiation is really finicky. Derivatives sometimes don't exist, and it's not always obvious when, so you need to be extra careful. This carries over to numerical computing: integrating an arbitrary function is easy (for smooth 1d functions it's a solved problem), while differentiating even a reasonably smooth function numerically is much harder. Noise in your function doesn't matter much for integration, but it can completely break the derivative.
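Here's a minimal numpy sketch of that last point (the signal, noise level, and grid are arbitrary choices, nothing canonical): the same small noise that barely perturbs the integral wrecks the finite-difference derivative.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 2.0 * np.pi, 1001)
    h = x[1] - x[0]
    f = np.sin(x) + rng.normal(scale=1e-3, size=x.size)  # smooth signal + small noise

    # Integration averages the noise out: the error stays near the noise level.
    integral = np.sum((f[1:] + f[:-1]) / 2.0) * h   # trapezoid rule; exact answer is 0
    print(f"integration error:     {abs(integral):.2e}")    # ~1e-4

    # Differentiation divides the noise by the step size: the error explodes.
    deriv = np.gradient(f, h)                       # central differences
    err = np.max(np.abs(deriv - np.cos(x)))
    print(f"differentiation error: {err:.2e}")      # ~1e-1, orders of magnitude worse

The culprit is the division by h: shrinking the step to reduce truncation error amplifies the noise proportionally, so there's a floor you can't get under.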
Integration can definitely not be applied to basically anything. See the entire subject of measure theory, and the concepts of the Riemann, Riemann–Stieltjes, and Lebesgue integrals.
The class of C^1 functions is quite easy. The class of integrable functions is much more difficult; all we know is that it is larger. Consider this: to prove a function isn't differentiable, you need only give a single point where the derivative as a limit doesn't converge. To prove a function has no integral, you need to consider all possible partitions of that function's ___domain. (You also need to specify exactly which measure is being used, etc.)
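A toy illustration of the "single point" half (nothing special about this choice of f): the one-sided difference quotients of f(x) = |x| at x = 0 settle on different values, which is already enough to rule out differentiability.

    # One-sided difference quotients of f(x) = |x| at x = 0.
    f = abs
    for h in (0.1, 0.01, 0.001):
        right = (f(0 + h) - f(0)) / h      # -> 1.0
        left = (f(0 - h) - f(0)) / (-h)    # -> -1.0
        print(right, left)
    # The two limits disagree, so f'(0) does not exist: one point suffices.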
Perhaps he means from a practical point of view. Signals arising from the types of natural phenomena measured in engineering are always real, continuous, and bounded. An integrating op-amp circuit is going to be stable up to the limits of the power supply, but a differentiator is likely unstable and unusable due to noise. Fourier transforms and frequency analysis rely on integration, as do feedback loops with delay, etc.
One way to interpret why noise = bad for differentiation: the Fourier transform of f'(x) is iωF(ω), where F(ω) is the transform of f(x). Multiplying the spectrum by ω weights high frequencies more, so differentiation is essentially a high-pass filter, and it amplifies exactly the part of the spectrum where noise dominates the signal.
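A small numpy sketch of that statement (the signal and frequencies are arbitrary picks): computing the derivative as ifft(iω · fft(f)) boosts a faint high-frequency component by exactly its frequency.

    import numpy as np

    n, L = 1024, 2.0 * np.pi
    x = np.arange(n) * (L / n)
    w = np.fft.fftfreq(n, d=L / n) * 2.0 * np.pi   # angular frequencies

    f = np.sin(x) + 0.01 * np.sin(100.0 * x)       # low freq + faint high freq
    F = np.fft.fft(f)
    deriv = np.fft.ifft(1j * w * F).real           # spectral derivative: iw * F(w)

    # The faint 0.01 * sin(100 x) term comes out of the derivative as a
    # full-amplitude cos(100 x): the factor of w is the high-pass gain.
    print(np.max(np.abs(deriv - np.cos(x))))       # ~1.0, i.e. 100x the input wiggle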
I think you mean 'analytically' in the same sense as the comment I replied to, i.e. having a closed form you can write down by hand, but that's not right. And it's not the really important thing from a mathematical point of view anyway: what you really want to know is which useful properties the result has, and integration gives you nice mathematical objects with good properties, very much unlike differentiation. It's wrong to think of getting a closed form as the goal; that's nice, but not as important as the other stuff. The numerical consequences just follow from the mathematics, so they can illustrate the difference.
Still, that's not what you get graded on in school. Differentiation is much easier than integration, because for the former you have, within the scope of what they can throw at you on a test, a well-defined set of rules you pretty much mechanically apply to the formula until you can't simplify the answer anymore. Integration, on the other hand, is constant guesswork: performing algebraic magic tricks to maybe make the formula look like something you can tackle with one of the two or three generic methods you've been taught.
Sure, school != reality, but it's the former we get tortured by...
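For what it's worth, a computer algebra system shows the same asymmetry (sympy here; x**x is just one convenient example of a function with no elementary antiderivative):

    import sympy as sp

    x = sp.symbols('x')
    f = x ** x

    # Differentiation is mechanical: product/chain/power rules always apply.
    print(sp.diff(f, x))       # x**x*(log(x) + 1)

    # Integration is a search that can fail: sympy hands the integral back
    # unevaluated, because no elementary antiderivative exists.
    print(sp.integrate(f, x))  # Integral(x**x, x)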
Finding the closed form of the integral of a closed form, yes, then what you say is true (this is different from 'analytic', which has a different meaning in mathematics). The scope of even baby integration of a function is much, much larger, and OP is talking about that.
Note the key phrase there is 'a function', not 'a function with a closed form'; the latter is a tiny subset.
"Scope of the concept of even baby integration of a function is much much larger, and OP is talking about that."
The OP said the opposite: that differentiation is harder ('more finicky'). I agree that the concept of integration is much richer.
Also, I didn't mean 'closed-form solution' when I said 'analytic', and I didn't mean 'analytic functions' either. I meant that the analytic machinery you have to develop in order to have a theory of integration is far richer than for differentiation - e.g., proving the multivariate change of variables theorem, ∫_{φ(U)} f(x) dx = ∫_U f(φ(u)) |det Dφ(u)| du.