Let's take as an example a website that compares prices across different stores and shows which one is cheapest. You could do this manually, but automation is faster and more convenient. It doesn't make sense to read the sites by hand when a script or a language model can do it for you.
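Here is a minimal sketch of what such a comparison could look like in Python, assuming each store exposes the price in an element matched by a known CSS selector. The store URLs and selectors below are placeholders, not real sites.

```python
# Sketch: fetch a price from each store and report the cheapest one.
# URLs and CSS selectors are hypothetical examples.
import requests
from bs4 import BeautifulSoup

STORES = {
    "store-a": ("https://store-a.example/product/123", ".price"),
    "store-b": ("https://store-b.example/item/123", "#product-price"),
}

def fetch_price(url: str, selector: str) -> float | None:
    """Download the page and parse the first element matching the selector."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    tag = BeautifulSoup(resp.text, "html.parser").select_one(selector)
    if tag is None:
        return None
    # Strip currency symbols and other non-numeric characters before converting.
    digits = "".join(ch for ch in tag.get_text() if ch.isdigit() or ch == ".")
    return float(digits) if digits else None

def cheapest() -> tuple[str, float]:
    prices = {name: fetch_price(url, sel) for name, (url, sel) in STORES.items()}
    prices = {name: p for name, p in prices.items() if p is not None}
    return min(prices.items(), key=lambda kv: kv[1])

if __name__ == "__main__":
    store, price = cheapest()
    print(f"Cheapest: {store} at {price:.2f}")
```

In a real setup you would also cache responses and respect each site's robots.txt and rate limits, but the core loop is this simple.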
Obviously it is better for the consumer to be able to scrape sites. It is only the store owners (greedy capitalists) who don't want consumers to know that their prices are inflated.
The other use case is looking something up: rather than reading someone's site with white letters on a black background and a weird font, it is easier to have a language model go around the web and summarize the data for you.
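A rough sketch of "let the model read the page for you" follows: download the page, strip it down to plain text, and ask a model to summarize it. This assumes the OpenAI Python client with an API key in the environment; the URL and model name are placeholders, and any hosted chat model would work the same way.

```python
# Sketch: fetch a page, extract readable text, and summarize it with an LLM.
import requests
from bs4 import BeautifulSoup
from openai import OpenAI

def page_text(url: str, max_chars: int = 8000) -> str:
    """Fetch the page and return its visible text, truncated to fit a prompt."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()  # drop markup that carries no readable content
    return soup.get_text(separator=" ", strip=True)[:max_chars]

def summarize(url: str) -> str:
    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": f"Summarize the key facts from this page:\n\n{page_text(url)}",
        }],
    )
    return reply.choices[0].message.content

if __name__ == "__main__":
    print(summarize("https://example.com/some-article"))
```

The point is that the ugly presentation of the source page stops mattering: you only ever see the model's clean summary.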