Connectors
PRST Connector is an HTTP wrapper on top of the request that allows you to define request and response parameters, map them to internal entities, and proxy custom parameters. It can be used during testing, or directly as a wrapper on top of the integration to reduce the time spent on model switching.
💡 Note: PRST Connector offers a unified interface to integrate with various AI providers. Using UI configurations, it's possible to change the ID in a request without rebuilding the whole integration.
Using a Connector, you can use the same PRST Prompt across multiple AI models and providers.
After creating a Connector, you can configure it to match the expected behavior. To do so, click on the new Connector and start configuring.
The input parameters represent everything that should be passed to the API request during execution. PRST Connector allows you to define two types of input parameters: Credentials and Proxy Parameters.
Credentials are sensitive information that is kept encrypted in the PRST system and used for authentication during request execution.
Many models allow configuring additional parameters to improve generation results. PRST Connector Proxy Parameters let you define a set of parameters that can be used during execution for finer tuning of your generation.
💡 Note: There is no strict limit on the number of parameters. However, it's good practice to keep only the necessary ones.
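As an illustration, a Proxy Parameter set for a typical chat-completion model might look like the sketch below. The parameter names (temperature, max_tokens, top_p) are common provider settings used here as assumptions, not PRST-specific fields:

```python
# Hypothetical Proxy Parameters for a chat-completion model.
# The names below are common provider settings used for illustration;
# define only the ones you actually need.
proxy_parameters = {
    "temperature": 0.7,  # randomness of the generation
    "max_tokens": 512,   # upper bound on the response length
    "top_p": 0.9,        # nucleus-sampling cutoff
}
```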
Some APIs may require extra authentication before the request. To define a strategy for retrieving a token or other credentials, you can use the Auth request.
💡 Note: The request configuration allows you to define custom input and output to simplify integration. Use it to keep only the data you need during the request.
If Auth is not required, you can skip it.
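As a rough sketch of what an Auth request typically does, the snippet below exchanges a client ID and secret for a short-lived token. The endpoint URL and field names are hypothetical and depend on your provider:

```python
import requests

# Hypothetical token exchange: the URL and field names are illustrative only.
response = requests.post(
    "https://api.example-provider.com/oauth/token",
    json={"client_id": "YOUR_CLIENT_ID", "client_secret": "YOUR_CLIENT_SECRET"},
)
token = response.json()["access_token"]  # then passed to the main request as a credential
```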
With the PRST Connector Editor it's very simple to configure the request and its parameters. By right-clicking, you can choose the desired parameter and insert it into the request. The system will then automatically pass it during request execution.
The context menu contains several parameter types: System, Credentials, Proxy, and Custom. Each type corresponds to the parameter's source and makes it easy to identify.
Depending on the request method, PRST Connector allows different request schemes. For example, a POST request allows configuring the request body, while a GET request does not.
To configure the request body, you can copy the request from external documentation and then use the context menu to mark parameters in the JSON, as shown in the sketch below.
❗️ Note: The type of the parameter depends on how it's presented in the JSON. If your parameter is of type String, don't forget to keep the double quotes, like this: "YOUR_PARAM"
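For example, a body copied from a typical chat-completion API might look like the snippet below. The "MODEL_ID" and "USER_PROMPT" placeholders stand in for the parameters you would mark via the context menu; the field names follow common provider conventions and are not PRST-specific:

```python
# Hypothetical request body for a chat-completion API. "MODEL_ID" and
# "USER_PROMPT" are String parameters, so they keep their double quotes.
request_body = {
    "model": "MODEL_ID",  # marked as a Proxy or Custom parameter
    "messages": [
        {"role": "user", "content": "USER_PROMPT"},
    ],
    "temperature": 0.7,   # numeric parameters are inserted without quotes
}
```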
To assign the parameter, just do the following:
Select the place where the parameter should be and right-click
Select the desired parameter, and it will be automatically assigned to your request
PRST Connector also allows you to customize the result and receive the information in the way you want. In addition to the System Parameters, it's possible to define Custom Parameters and custom paths to them.
💡 Note: Path definitions work in both directions, construct and destruct, so the following construction is also allowed:
"custom[0].total.tokens"
In the example below, the parameter prompt_tokens will be mapped to the custom path: "custom.total.tokens"
Then, in the Preview section, it's possible to check the parameter mapping against the designed structure.
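To illustrate how such a mapping behaves, the sketch below takes prompt_tokens from a typical provider response and places it under the custom.total.tokens path. The response shape is an assumption based on common usage fields, not the exact PRST output:

```python
# Hypothetical provider response containing token usage.
provider_response = {"usage": {"prompt_tokens": 42, "completion_tokens": 128}}

# Mapping "usage.prompt_tokens" to the custom path "custom.total.tokens"
# produces a nested structure in the Connector output.
mapped_output = {"custom": {"total": {"tokens": provider_response["usage"]["prompt_tokens"]}}}

print(mapped_output)  # {'custom': {'total': {'tokens': 42}}}
```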
To make sure that the system behaves as needed and the request is created properly, you can test the integration directly from the PRST Connector configuration. To do so, simply click the Test Connection button and make sure that the request finishes with a successful result.
❗️ Note: The test request is a real request to the AI provider and may be charged according to your AI provider's pricing
PRST Connector is used in many places and can be called via the API. For example, it's possible to set a default Connector for a prompt to use the Prompt Debug Console, or use it for a Knowledge Base as your internal assistant.