It really feels like tech companies are taking the approach of "we'll guarantee you anything you want!" as a sales strategy.
The cynical part of me wants to say it shows that they have high confidence they can manipulate the legal system enough to dictate the outcome of any challenges.
Not a lawyer, but I imagine the infringement on the input side, during dataset creation and training, would take place wherever the model is being trained? So presumably the US? On the other hand, yeah, memorizing input data and spitting out protected characters on the output side seems like an issue.