@oshoma @20001LastOrder @bharatji30 this too: I'm checking this out, and I had written more details than I thought. Could you let me know what other details would be useful to make this more concrete?
-
Describe the bug
Sherpa sometimes fails to answer questions because of context-size limits. Prompts are currently assembled on an ad hoc basis, so their size can vary without control.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
Relying on ad hoc prompts is not robust. We need strict rules for handling size limits so the system behaves predictably regardless of how a prompt is dynamically constructed. Throwing an error is also poor UX: since the prompt is written mostly by the system, the user has no concrete action to take when the error occurs.
Screenshots
n/a
Environment
production
Additional context
My suggestion is to create a function called prompt_crafter that is either a wrapper around the LLM calls or used in other contexts, and that becomes the only way prompts are created and used. This function would take strings such as an "instruction" and a "context", plus objects such as JSON data, and assemble them into the prompt. It would also accept size-control strategies as flags, with sensible but overridable defaults. Some examples could be:
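A minimal sketch of what such a prompt_crafter could look like. Everything here is an assumption for illustration: the function name comes from the suggestion above, but the parameter names, the whitespace-split word count standing in for real token counting, and the "truncate_context" / "error" strategies are all hypothetical.

```python
import json


def prompt_crafter(instruction, context="", data=None,
                   max_tokens=4000, strategy="truncate_context"):
    """Assemble a prompt from parts while enforcing a size budget.

    Hypothetical sketch: 'tokens' are approximated as
    whitespace-split words rather than real model tokens.
    """
    parts = [instruction]
    if data is not None:
        # Serialize structured input (e.g. JSON data) into the prompt.
        parts.append(json.dumps(data))
    if context:
        parts.append(context)
    prompt = "\n\n".join(parts)

    def count(text):
        return len(text.split())

    if count(prompt) <= max_tokens:
        return prompt

    if strategy == "truncate_context":
        # Drop words from the end of the context until within budget,
        # keeping the instruction (and data) intact.
        budget = max_tokens - count("\n\n".join(parts[:-1]))
        words = context.split()
        parts[-1] = " ".join(words[:max(budget, 0)])
        return "\n\n".join(parts)
    if strategy == "error":
        raise ValueError("prompt exceeds size budget")
    raise ValueError(f"unknown strategy: {strategy}")
```

With a single entry point like this, every prompt in the system goes through the same size check, and new strategies (summarize the context, drop the oldest chunks, etc.) can be added behind the same flag without touching call sites.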