Setup

Follow these steps to install and configure the Ottic SDK:

1. Install Ottic

Install the Ottic Node.js SDK.
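For example (the npm package name shown here is an assumption; use the exact install command provided in your Ottic dashboard):

```bash
# Assumed package name; copy the exact command from your Ottic dashboard.
npm install ottic
```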

2. Obtain Your Ottic API Key

Visit the Integrations page to copy your Ottic API key.

This key is required to authenticate and use Ottic in your application.

3. Integrate a Published Prompt

Use the snippet below to set up the Ottic SDK and begin working with a published prompt in your application:
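A minimal setup sketch is shown here for reference. The package name (`ottic`), the `OtticClient` export, and its options are assumptions based on common SDK conventions; copy the exact import and initialization from your Ottic dashboard.

```typescript
// Minimal setup sketch. The "ottic" package name and OtticClient export
// are assumptions; use the exact import shown in your Ottic dashboard.
import { OtticClient } from "ottic";

// Authenticate with the API key copied from the Integrations page.
const ottic = new OtticClient({
  apiKey: process.env.OTTIC_API_KEY,
});
```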

Using a published prompt in production

Ottic lets you generate LLM responses from your published prompts directly in your application. Below are three use cases demonstrating how to generate responses with published prompts or how to render prompt text.

1. Generate a response using a prompt

This snippet demonstrates how to use a published prompt with variable placeholders to generate a response from the model:
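A sketch of the request is shown below. The `generateResponse` method name is hypothetical and the option names mirror the parameters documented next; substitute the exact call from your Ottic dashboard.

```typescript
// Hypothetical call shape; option names mirror the parameters documented below.
const response = await ottic.generateResponse({
  promptId: "prm_123",                              // ID of the published prompt
  variables: { customerName: "Ada" },               // fills {{customerName}}-style placeholders
  messages: [{ role: "user", content: "Where is my order?" }],
  metadata: { environment: "production" },          // optional: extra request information
  chainId: "chain_abc",                             // optional: groups related requests and responses
  tags: ["support", "orders"],                      // optional: labels for filtering requests
});

console.log(response); // output generated by the LLM using the prompt's published configuration
```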

promptId (string, required): The ID of the published prompt you want to use.

variables (object, optional): The variables to substitute into the prompt's placeholders. If no variables are provided, the prompt is used as is.

messages (array, optional): A list of messages comprising the conversation so far. If no messages are provided, the prompt is used as is.

metadata (object, optional): Additional information about the request.

chainId (string, optional): An identifier for the chain of requests and responses.

tags (array, optional): An array of strings containing tags for the request.

metadata, chainId, and tags are optional parameters you can attach to a request to monitor your requests and responses.

The response contains the output generated by the LLM based on the configuration of your Ottic prompt. Because the request uses the prompt's published settings, you can update the LLM configuration directly in Ottic and generate responses without modifying your code.

2. Retrieve a rendered prompt with variable replacements

To fetch a prompt with placeholders replaced by specified variable values, use the following code:
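A sketch is shown below; the `getRenderedPrompt` method name is an assumption, so use the call provided in your Ottic dashboard.

```typescript
// Hypothetical call shape: fetch the prompt with its placeholders rendered.
const renderedPrompt = await ottic.getRenderedPrompt({
  promptId: "prm_123",
  variables: { customerName: "Ada" }, // values substituted into the placeholders
});

console.log(renderedPrompt); // prompt text with the provided variables filled in
```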

promptId (string, required): The ID of the published prompt you want to use.

variables (object, optional): The variables to substitute into the prompt's placeholders. If no variables are provided, the prompt is used as is.

If any variables are missing, they will remain as placeholders in the returned prompt.

3. Retrieve a prompt with placeholders intact

To retrieve a prompt without any variable replacements, use this snippet:
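A sketch is shown below; the `getPrompt` method name is an assumption, so use the call provided in your Ottic dashboard.

```typescript
// Hypothetical call shape: fetch the prompt exactly as published.
const rawPrompt = await ottic.getPrompt({ promptId: "prm_123" });

console.log(rawPrompt); // original prompt text, placeholders left intact
```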

This will return the original prompt without any modifications.