There was a trend in the late 90s and early 2000s to call everything in computing "modern", and nowadays one way you can identify that a technology is likely outdated is if it calls itself modern.
While much of what's in this textbook is useful content, it's not modern; it's classical. Definitely worth learning and being familiar with, but not really the kind of techniques used in practice or even in research today.
I've read it and think it gives a great historical perspective on the field. It's important to understand the successful and unsuccessful methods and subfields developed since Turing's paper; IMO it's hard to appreciate the modern advances without the big picture. Some of the classic symbolic techniques, combined with modern ML architectures, may come full circle toward solving AGI, much as neural networks came back to become the foundation of machine learning.
Modern now would mean something post-"Attention Is All You Need," and I think there are plenty of follow-up papers on that concept. Interesting times ahead.
Reminds me of the "modern control systems" courses and books I had in graduate school that contained material developed in the '60s. A lot of engineering research from the Apollo-program era seems to have been called "modern" for some reason.
Reading through the first edition (published 1995) is actually pretty interesting. At this point it's basically a historical document. The section on neural nets contains this gem from John Denker: "Neural networks are the second best way of doing just about anything."