Prompts
Seamless Prompt Creation with Intuitive Editor
The Prst Prompt is a core component of the prst.ai system, representing the input prompt used by AI models. By configuring prompts and parameters and collecting feedback, users can build a flexible prompt management solution with distributed access to each operation.
A prompt is a named configuration entity that identifies a particular prompt and can hold multiple prompt versions.
Each prompt version comprises the prompt configuration, parameters, and execution settings such as the connector and connector version. These settings let users test prompts via the UI or run them with the execute API, either through the SDK or directly via the API. Defining prompt parameters creates a flexible structure that can handle many different scenarios.
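As a rough illustration only (the base URL, endpoint path, payload fields, and auth header below are assumptions made for this sketch, not documented prst.ai identifiers), executing a prompt with parameters over HTTP could look like this:

```python
import requests

# Hypothetical sketch: the URL, endpoint path, payload fields, and auth header
# are assumptions for illustration, not documented prst.ai names.
PRST_BASE_URL = "https://your-prst-instance.example.com"

payload = {
    "prompt": "welcome-email",        # which prompt to execute (hypothetical field)
    "version": "v2",                  # a specific prompt version (hypothetical field)
    "connector": "openai",            # connector configured for this version (hypothetical field)
    "parameters": {                   # values for the parameters marked in the editor
        "customer_name": "Ada",
        "product": "prst.ai",
    },
}

response = requests.post(
    f"{PRST_BASE_URL}/api/execute",   # hypothetical execute endpoint
    json=payload,
    headers={"Authorization": "Bearer <your-api-token>"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```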
In addition to parameters, a prompt can also be connected to Feedbacks and Executions.
After a prompt is created, it can be configured with new parameters and execution information.
The Prst Prompt Editor provides many configuration options to simplify prompt management. Following the number indicators on the image above, here is a list of possibilities:
1. Edit Prompt Name: click the input to update the prompt name. The system saves the change automatically without confirmation.
2. Mark Parameter: when text in the editor is selected, click this button to convert it into a parameter. The text itself does not disappear, and the change can always be undone.
3. Refresh Parameters: clears all parameters in the prompt, reverting the changes back to the original text.
4. Clear Editor: removes all text from the Prompt Body and resets the editor.
5. Play Button: clicking this button redirects you to the Debug Console, where you can provide parameters and test the prompt.
6. Prompt Status: changes the status of the prompt. Useful for internal collaboration.
7. Selected Version: the currently selected Prompt Version.
8. Default Connector: the connector that will be used to test the prompt.
9. Default Connector Version: the connector version that will be used to test the prompt.
10. Default Connector Proxy Parameters: connectors let you define proxy parameters that are useful for testing prompts, such as temperature, model, and role.
11. Editor: the main input where the prompt and its parameters can be edited.
12. Mark Param: right-clicking opens the Context Menu, where a parameter can be defined in the same way as in #2.
13. Remove Selected Parameter: the clicked parameter can be removed from the Context Menu.
14. Prompt Version Parameters: the table of all prompt parameters.
15. Prompt Version Feedbacks: all feedback collected for the Prompt Version.
16. Prompt Debug Console: the Debug Console with information about recent executions and real-time tests.
17. Prompt Parameter Name: use this input to change the parameter name. This name is used via the API to compile the prompt (see the sketch after this list).
18. Prompt Parameter Default Value: a default value can be defined for the parameter. It is passed automatically when the parameter is not present in the request.
19. Parameter Type: in some cases it may be important to define a parameter type other than "string".
20. Remove Parameter: another way to remove a parameter from the Prompt.
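As a minimal sketch of how the parameter settings above might come together in an execute request (again, every field name here is an assumption rather than the documented prst.ai schema), a request body could combine named parameters, a non-string typed value, and connector proxy parameters, while relying on the default value for any parameter that is left out:

```python
# Hypothetical request body, reusing the execute sketch shown earlier; all field names are assumptions.
payload = {
    "prompt": "welcome-email",
    "parameters": {
        "customer_name": "Ada",   # must match the Prompt Parameter Name from the editor
        "max_items": 3,           # a parameter whose Parameter Type is a number, not a string
        # "tone" is omitted here, so its Prompt Parameter Default Value would be used
    },
    "proxy_parameters": {         # Default Connector Proxy Parameters, e.g. model and temperature
        "model": "gpt-4o",
        "temperature": 0.2,
    },
}
```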
The Feedback section displays all collected feedback associated with the selected Prompt Version. This view also shows the feedback "feeling", derived from a combination of the ranking and the sentiment analysis of the user comment.
This information can be very useful for prompt engineers and may help fine-tune the prompt.
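The exact formula is not documented here, but purely as an illustrative assumption, a combined "feeling" could blend the numeric ranking with the comment's sentiment score roughly like this:

```python
def combined_feeling(rating: float, sentiment: float) -> str:
    """Illustrative assumption only, not the actual prst.ai formula:
    blend a 1-5 rating with a -1..1 sentiment score into a rough label."""
    normalized_rating = (rating - 1) / 4        # map 1..5 onto 0..1
    normalized_sentiment = (sentiment + 1) / 2  # map -1..1 onto 0..1
    score = (normalized_rating + normalized_sentiment) / 2
    if score >= 0.66:
        return "positive"
    if score >= 0.33:
        return "neutral"
    return "negative"

print(combined_feeling(rating=4, sentiment=0.3))  # -> "positive"
```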
The Debug Console allows you to provide custom parameters and test the prompt with the selected connector.
Following the number indicators on the image above, here is a list of possibilities:
1. Parameters Value Input: enter the actual value for each parameter here.
2. Execution Logs: the log of actions and operations during request execution.
3. Feedback Form: allows you to rate the result directly during testing.
During testing, it may also be useful to see all the original information passed through the system to ensure it behaves as expected.
To do so, click any debug item to see the result in JSON format:
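For example, a debug item might expand into JSON along the following lines (a purely illustrative structure; the actual fields depend on your prompt and connector):

```python
import json

# Purely illustrative: these fields are assumptions about what a debug item
# might contain (input parameters, compiled prompt, connector response),
# not the documented prst.ai payload.
debug_item = {
    "parameters": {"customer_name": "Ada"},
    "compiled_prompt": "Write a short welcome message for Ada.",
    "connector": {"name": "openai", "version": "v1"},
    "response": {"text": "Welcome aboard, Ada!"},
}

print(json.dumps(debug_item, indent=2))
```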