Ad hoc RPC[1] that involves JSON request/response payloads and is wedded to HTTP transport is arguably worse than conforming to the JSON-RPC 2.0 specification[2].
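For concreteness, the entire surface area of a JSON-RPC 2.0 exchange is small. The subtract method below is the spec's own example; the error response shows one of the spec's reserved codes:

    Request:  {"jsonrpc": "2.0", "method": "subtract", "params": [42, 23], "id": 1}
    Response: {"jsonrpc": "2.0", "result": 19, "id": 1}
    Error:    {"jsonrpc": "2.0", "error": {"code": -32601, "message": "Method not found"}, "id": 1}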
The big irony behind HATEOAS is that LLMs are the mythical "evolvable agents" needed to make HATEOAS work in the first place. HATEOAS was essentially built around human-level intelligence that can automatically crawl your endpoints and read documentation written in natural language, and then its proponents scratched their heads over why it didn't catch on.
Only browser-like clients could conform to HATEOAS, because they essentially delegate all the hard parts (dealing with a dynamically changing, structureless API) to a human.
With e.g. JSON over HTTP, you can implement an API that satisfies the statelessness constraint of REST and so on. Without hypermedia controls it would sit at Level 2 of the Richardson Maturity Model (RMM), more or less, along the lines of the sketch below.
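Roughly, the difference in shape looks like this (the /orders resource and paths are illustrative, not from any spec):

    Level 2, resource-oriented:
        GET    /orders/42     -> 200, order representation
        POST   /orders        -> 201, Location: /orders/43
        DELETE /orders/42     -> 204

    Ad hoc RPC shape, same functionality:
        POST /api/getOrder    {"orderId": 42}
        POST /api/cancelOrder {"orderId": 42}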
In that shape, it would still be a different beast from RPC. And a disciplined team or API overlord could get it into that shape and keep it there, especially if they start out with that intention.
The problem I've seen many times is that a JSON-over-HTTP API can start out as, or devolve into, a messy mix of client-server interactions and wind up as ad hoc RPC that's difficult to maintain and very brittle.
So, if a team/project isn't committed to REST, and it's foreseeable that the API will end up dominated by RPC or RPC-like interactions, then why not embrace that reality and do RPC properly? Conforming to a specification like JSON-RPC can be helpful in that regard.
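As a minimal sketch of what "doing RPC properly" can look like on the client side, here is a JSON-RPC 2.0 call helper in TypeScript; the /rpc endpoint URL and the cancelOrder method are made up for illustration:

    // Minimal JSON-RPC 2.0 client sketch over HTTP POST.
    // Per the spec, a response carries exactly one of "result" or "error".
    type RpcSuccess<T> = { jsonrpc: "2.0"; id: number; result: T };
    type RpcError = {
      jsonrpc: "2.0";
      id: number | null; // null when the server couldn't determine the request id
      error: { code: number; message: string; data?: unknown };
    };

    let nextId = 1;

    async function rpcCall<T>(method: string, params?: unknown[] | object): Promise<T> {
      const res = await fetch("https://example.com/rpc", { // endpoint is illustrative
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ jsonrpc: "2.0", method, params, id: nextId++ }),
      });
      const msg = (await res.json()) as RpcSuccess<T> | RpcError;
      if ("error" in msg) {
        // Spec-reserved codes: -32700 parse error, -32600 invalid request,
        // -32601 method not found, -32602 invalid params, -32603 internal error.
        throw new Error(`JSON-RPC error ${msg.error.code}: ${msg.error.message}`);
      }
      return msg.result;
    }

    // Usage (method name is hypothetical):
    const result = await rpcCall<{ cancelled: boolean }>("cancelOrder", { orderId: 42 });

Because the envelope is fixed, transport, error handling, and request correlation live in one place instead of being reinvented per endpoint.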
[1] if it’s not REST (even giving a pass on HATEOAS) then it’s probably, eventually, effectively RPC, and it’s still ad hoc even if it’s well documented
[2] https://www.jsonrpc.org/specification