llm-answer-engine
Suggestion: Divide the user question into sub-questions
This is quite similar to how Perplexity Pro seems to work. The idea: pass the user's question through an LLM to generate sub-questions, use each sub-question to fetch the top n results via Serper, and then pass all the sub-questions with their respective results to an LLM to generate the final answer. A rough sketch of the flow is below.
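A minimal sketch of what I mean, assuming the OpenAI Node SDK and the Serper `google.serper.dev/search` endpoint (the function names like `decomposeQuestion` are just illustrative, not from the existing codebase):

```typescript
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// 1. Ask the LLM to break the user question into sub-questions.
async function decomposeQuestion(question: string): Promise<string[]> {
  const completion = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [
      {
        role: 'system',
        content:
          'Split the user question into 2-4 focused sub-questions. ' +
          'Return them as a JSON array of strings only.',
      },
      { role: 'user', content: question },
    ],
  });
  return JSON.parse(completion.choices[0].message.content ?? '[]');
}

// 2. Fetch the top n organic results for one sub-question via Serper.
async function searchSerper(query: string, n = 5): Promise<string[]> {
  const res = await fetch('https://google.serper.dev/search', {
    method: 'POST',
    headers: {
      'X-API-KEY': process.env.SERPER_API_KEY ?? '',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ q: query }),
  });
  const data = await res.json();
  return (data.organic ?? [])
    .slice(0, n)
    .map((r: { title: string; snippet: string }) => `${r.title}: ${r.snippet}`);
}

// 3. Pass all sub-questions and their results back to the LLM for a final answer.
async function answerWithSubQuestions(question: string): Promise<string> {
  const subQuestions = await decomposeQuestion(question);
  const context = await Promise.all(
    subQuestions.map(async (sq) => {
      const results = await searchSerper(sq);
      return `Sub-question: ${sq}\nResults:\n${results.join('\n')}`;
    }),
  );
  const completion = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [
      {
        role: 'system',
        content: 'Answer the original question using only the search context provided.',
      },
      { role: 'user', content: `Question: ${question}\n\n${context.join('\n\n')}` },
    ],
  });
  return completion.choices[0].message.content ?? '';
}
```

The searches for the sub-questions could run in parallel (as in the `Promise.all` above), so the extra decomposition step would mostly add one LLM round trip rather than multiplying latency.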