This is what intentionality is about. No intentionality, no truth.
An LLM doesn't deal in propositions, and it is propositions that are the subjects of truth claims. LLMs produce strings of characters that, when interpreted by a human reader, can look like propositions and give rise to propositions in the human mind; it is those propositions that have intentionality and truth value. But what an LLM produces is not a direct expression of propositional content, only a recombination of a large number of expressions of propositional content authored by many human writers.
People are projecting subjective convention onto the objective. The objective truth about LLMs is far poorer in substance than the conventional readings we give to their output. A good deal of superstition and magical thinking surrounds LLMs.