Fully leverage LLM APIs to allow users to specify different roles in custom prompts #2020
wan-robert
started this conversation in Ideas
Current Functionality

The user writes a custom prompt in a .md file. All text that the user writes is sent to the LLM with the "user" role.

Desired Functionality
LLM APIs allow users to speak for the system, the user, and the assistant at the start of the conversation. In some literature, this is known as "pre-filling". It helps the LLM retrieve the correct knowledge for the conversation and sets the tone of the LLM's response.
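For illustration, a pre-filled conversation in the common OpenAI-style chat format might look like the sketch below. The role names follow that API convention; the exact wire format depends on the provider, and the content strings here are made up.

```python
# Hypothetical pre-filled conversation in the OpenAI-style chat format.
# The final assistant turn is left open so the model continues from it.
messages = [
    {"role": "system", "content": "You are a concise note-taking assistant."},
    {"role": "user", "content": "Summarize my meeting notes."},
    {"role": "assistant", "content": "Sure. Here is a summary:"},  # pre-fill
]

# This list would be sent as the conversation history; the model picks up
# where the pre-filled assistant message leaves off.
```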
Obsidian Copilot chat histories are already stored in a conversation-like format, so I guess a lot of the code to realize this feature already exists.
For example, if the user writes the following text in a custom prompt
When the user calls this prompt, it will be sent to the LLM in a format similar to this:
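The original example text did not survive extraction, so as a hypothetical sketch: assuming a simple role-marker syntax such as `{system}:` / `{user}:` / `{assistant}:` lines in the .md file (this is not Copilot's actual syntax, just an illustration), parsing the prompt into role-tagged messages could look like:

```python
import re

def parse_prompt(text):
    """Split a custom prompt into role-tagged messages.

    Assumes a hypothetical marker syntax where a line like
    '{system}:' starts a new message. Text before any marker
    keeps today's behavior and is treated as a "user" message.
    """
    messages = []
    role = "user"
    buf = []
    for line in text.splitlines():
        m = re.match(r"\{(system|user|assistant)\}:\s*$", line.strip())
        if m:
            if buf:
                messages.append({"role": role, "content": "\n".join(buf).strip()})
                buf = []
            role = m.group(1)
        else:
            buf.append(line)
    if buf:
        messages.append({"role": role, "content": "\n".join(buf).strip()})
    return messages

prompt = """{system}:
You are a helpful writing assistant.
{user}:
Improve the tone of my draft.
{assistant}:
Here is the revised draft:"""

print(parse_prompt(prompt))
```

Since chat histories are already stored as role-tagged message lists, output in this shape could be handed straight to the existing conversation-sending code.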