There was a trend in the late 90s and early 2000s to call everything in computing "modern", and nowadays one reliable sign that a technology is outdated is that it calls itself "modern".
While much of what's in this textbook is useful content, it's not modern; it's classical. Definitely worth learning and being familiar with, but not really the kinds of techniques that are used in practice or even in research.
I've read it and think it gives great historical perspective on the field. It is important that one understands the successful and unsuccessful methods and subfields developed since Turing's paper. IMO it is hard to appreciate the modern advances without the big picture. Some of the classic symbolic techniques combined with modern ML architectures may come full circle toward solving AGI, the way that NNs came back to become the foundation of machine learning.
"Modern" now would mean something post-"Attention Is All You Need". And I think there are plenty of follow-up papers on that concept. Interesting times ahead.