You could use any search engine and just gather 'decent' results -- scrape the first three results, then have a fresh instance of a language model compare them and tell you what's inaccurate in the first one relative to its fine-tuning data.
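Something like the sketch below, just to make the idea concrete. The search step and the model call are stubbed out -- `ask_model` is a hypothetical hook for whatever LLM API you use, and the URLs are assumed to come from whichever search engine you queried.

```python
# Rough sketch: fetch the top few results and ask a separate model
# instance to flag inaccuracies in the first one.

from html.parser import HTMLParser
import requests


class TextExtractor(HTMLParser):
    """Crude tag-stripper: collects visible text from an HTML page."""

    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())


def page_text(url, max_chars=4000):
    """Fetch a result page and return its visible text, truncated."""
    resp = requests.get(url, timeout=10)
    extractor = TextExtractor()
    extractor.feed(resp.text)
    return " ".join(extractor.chunks)[:max_chars]


def ask_model(prompt):
    """Placeholder for a call to a fresh LLM instance (wire up your own provider)."""
    raise NotImplementedError("connect this to an LLM API")


def cross_check(result_urls, query):
    """Compare result #1 against the next two and ask the model what looks wrong."""
    texts = [page_text(u) for u in result_urls[:3]]
    prompt = (
        f"Query: {query}\n\n"
        f"Result 1:\n{texts[0]}\n\n"
        f"Result 2:\n{texts[1]}\n\n"
        f"Result 3:\n{texts[2]}\n\n"
        "Based on results 2 and 3 and on what you already know, "
        "list any claims in result 1 that look inaccurate."
    )
    return ask_model(prompt)
```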
I'm wondering what happens when most people start using chatbots and stop using search. There'd need to be a big shift away from using users' navigation as the signal search engines rely on to determine the quality and authority of content.
What would the next-gen web look like? Would people even create webpages anymore? How would those sites be funded if people aren't navigating to them and are just using chatbots?
Google's already given up on search: google any programming topic and you get Stack Overflow-scraped spam everywhere, often ranked even BEFORE the Stack Overflow post it was scraped from. Google's no better than AltaVista circa 1997 at this point, imho. Brave Search is better for most things, and if it misses, then consult Google, etc.
Yeah, if I can't trust the initial answer, I also can't trust any of the refinements. It will be a long time before asking a number of different language models gives you the same level of assurance you get now by looking at various sources around the web.
What if you google something, check the links that seem most likely to help you, and the engine remembers that (reinforcement learning)? Then 10 other people search almost the same phrase and pick maybe 60% of your links, and after a while it has a decent idea of which links are the best. At that point it could just return those links along with a summary of the details from each, without overlap: say one page has a top 4, another a top 3, another a top 5 -- it gives you a venn-diagram summary of those three pages, and if, of the 12 listings, 3 are common, it only summarizes the 9 unique ones. (Toy sketch of that dedup step below.)
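A toy sketch of that "venn diagram summary" idea: given the listings pulled from a few result pages, report the items they all agree on once, and keep only the page-specific ones for individual summarization. The page contents here are made-up placeholders; a real version would extract listings from the pages the ranking step picked.

```python
from collections import Counter


def venn_summary(pages):
    """pages: dict mapping page name -> list of listing titles."""
    # Count on how many pages each distinct item appears.
    counts = Counter(item for items in pages.values() for item in set(items))
    # Items present on every page get summarized once.
    common = {item for item, n in counts.items() if n == len(pages)}
    # Everything else stays attached to its page for individual summaries.
    unique = {
        name: [item for item in items if item not in common]
        for name, items in pages.items()
    }
    return common, unique


if __name__ == "__main__":
    # 12 listings across three pages; "widget 2" shows up on all of them,
    # so 3 of the 12 are common and only the 9 unique ones get summarized.
    pages = {
        "page_a": ["widget 1", "widget 2", "widget 3", "widget 4"],
        "page_b": ["widget 2", "widget 5", "widget 6"],
        "page_c": ["widget 2", "widget 7", "widget 8", "widget 9", "widget 10"],
    }
    common, unique = venn_summary(pages)
    print("mentioned everywhere (summarize once):", sorted(common))
    for name, items in unique.items():
        print(f"unique to {name} (summarize individually):", items)
```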