Exactly! For a lot of my work, I use Claude as a first source, then verify what it gives me with a search engine. If the search engine starts hallucinating too (I'm starting to see that on Google, if I'm not crazy), it's of zero use to me. I want results that match my search query, period.
Their Quick Answer feature generates an AI summary of your search results. By default it appears automatically when the engine has high confidence in the answer. You can disable it in settings, or force it to appear by appending "?" to your query.
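For example, something like this (made-up query, just to show the syntax):

  longest river in europe?

Without the trailing "?", the same query would only get a Quick Answer if the engine was confident; with it, the summary shows up regardless.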
If you want to jump into an AI chat session, you can add bangs to your queries. "!expert" launches a top-of-the-line research agent and "!code" is geared toward software development. Both use the underlying search engine to ground themselves in current facts.
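Hypothetical examples of what that looks like in the search box:

  !expert compare the trade-offs between heat pumps and gas boilers
  !code write a bash one-liner that finds duplicate files

Assuming these behave like Kagi's other bangs, everything after the bang gets handed to the agent as your first prompt.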
Kagi even maintains its own LLM benchmark to track how well different models perform, and occasionally swaps out the default models to keep performance SOTA. You can also pick a specific model if you want.