You would have to block entire wordlists to combat subdomains like that. It would make more sense to whitelist subdomains instead, but that takes far more effort, since someone has to work out which subdomains a site actually needs in order to function. And if the site ever shuffled things around, someone would have to catch the breaking change and update the whitelist before the site worked again.
Machine learning could help here: block different domains, analyze what actually renders on the page, and learn which domains the site really depends on. Bots could run that process continuously and push the results to a decentralized database.
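As a rough illustration of the probing part (not the ML itself), here is a minimal sketch that loads a page with one host blocked at a time and flags hosts whose absence visibly changes what renders. It assumes Playwright for Python; the example URL, candidate hostnames, text-similarity check, and 0.95 threshold are all placeholder assumptions, not anyone's actual implementation.

```python
# Probe which third-party hosts a page actually needs by blocking them
# one at a time and comparing the rendered result against a baseline.
# Requires: pip install playwright && playwright install chromium
from difflib import SequenceMatcher
from urllib.parse import urlparse

from playwright.sync_api import sync_playwright


def render(url: str, blocked_host: str | None = None) -> str:
    """Return the page's visible text, optionally blocking one host."""
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()

        def route_handler(route):
            host = urlparse(route.request.url).hostname or ""
            # Naive suffix match stands in for a real blocklist entry.
            if blocked_host and host.endswith(blocked_host):
                route.abort()
            else:
                route.continue_()

        page.route("**/*", route_handler)
        page.goto(url, wait_until="networkidle")
        text = page.inner_text("body")
        browser.close()
        return text


def required_hosts(url: str, candidates: list[str],
                   threshold: float = 0.95) -> list[str]:
    """Hosts whose removal visibly changes the page count as required."""
    baseline = render(url)
    needed = []
    for host in candidates:
        blocked = render(url, blocked_host=host)
        similarity = SequenceMatcher(None, baseline, blocked).ratio()
        if similarity < threshold:  # page broke without this host
            needed.append(host)
    return needed


if __name__ == "__main__":
    # Candidates would normally come from observing the page's own
    # network requests; these are placeholders.
    print(required_hosts("https://example.com",
                         ["cdn.example.com", "ads.example.net"]))
```

A real crawler would compare screenshots or DOM structure rather than raw body text, but the loop is the same: the hosts that survive the comparison become the whitelist, and re-running the bot catches the breaking changes mentioned above.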