API Publishing and Embedding

Deploy PromptOwl prompts as REST APIs, generate API keys, embed chatbots via iframe, and integrate AI into your applications.

This guide explains how to publish prompts as APIs, generate API keys, embed chatbots in external applications, and integrate PromptOwl with your systems.



Publishing Overview

PromptOwl allows you to expose your prompts as APIs for external applications.

What You Can Do

Feature               Description
API Access            Call prompts via HTTP POST
Embed Chatbot         Add chat widget to any website
Custom Variables      Pass runtime parameters
Conversation History  Maintain context across calls
Model Override        Change LLM settings per request

Integration Options

Method        Best For
REST API      Backend integrations, apps
iFrame Embed  Website chat widgets
JavaScript    Custom web implementations


Making a Prompt Live

Before a prompt can be accessed via API, it must be set to "Live" status.

Publishing a Prompt

  1. Open your prompt

  2. Navigate to Publish tab

  3. Toggle status to Live

  4. Prompt is now accessible via API

Screenshot: Publish Toggle

Live vs Draft Status

Status  API Access  Internal Use
Live    Enabled     Yes
Draft   Blocked     Yes

Checking Publish Status

  • Live prompts show green indicator

  • Draft prompts show gray indicator

  • Status visible on prompt card and publish page

Note: Non-live prompts return error 400 when called via API.


Generating API Keys

API keys authenticate external requests to your prompts.

Creating an API Key

  1. Open your prompt

  2. Go to Publish tab

  3. Find API Key section

  4. Click Generate API Key

  5. Copy the key immediately

Screenshot: API Key Generation

API Key Format

The key is a long random string; PromptOwl displays it only once, immediately after generation.

Important: Save Your Key

Warning             Details
One-time display    Key shown only at generation
Cannot retrieve     No way to view key again
Regenerate if lost  Creates new key, invalidates old

Regenerating Keys

If you lose or need to rotate your key:

  1. Go to Publish tab

  2. Click Regenerate API Key

  3. Old key immediately invalidated

  4. New key generated

  5. Update all integrations

Key Properties

Property        Description
One per prompt  Single active key per prompt
User-bound      Tied to your account
Toggleable      Can be enabled/disabled
Secure          Stored as a hash, never plaintext


Using the API

Endpoint
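
The exact endpoint URL is shown on your prompt's Publish tab; the host and path below are placeholders illustrating the general shape, not the real route:

```
POST https://YOUR_PROMPTOWL_HOST/api/prompt/PROMPT_ID
```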

Authentication

Include API key in header:
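
```
X-API-Key: YOUR_API_KEY
```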

Basic Request
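
A minimal request can be sketched in Python using only the standard library. The endpoint URL is a placeholder (copy the real one from your Publish tab); the body fields follow the request-body table below.

```python
import json
import urllib.request

# Placeholder endpoint -- copy the real URL from your prompt's Publish tab.
API_URL = "https://YOUR_PROMPTOWL_HOST/api/prompt/PROMPT_ID"
API_KEY = "YOUR_API_KEY"

# Minimal body: sessionId and message are the only required fields.
payload = {
    "sessionId": "user-123",
    "message": "What are your support hours?",
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-API-Key": API_KEY,  # authenticates the request
    },
    method="POST",
)

# Uncomment to actually send the request:
# with urllib.request.urlopen(request) as response:
#     print(json.loads(response.read()))
```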

Request Body

Field             Type     Required  Description
sessionId         string   Yes       User/session identifier
message           string   Yes       User's input message
previousMessages  array    No        Conversation history
variables         object   No        Runtime variable values
llmType           string   No        Override model provider
llmSettings       object   No        Override model parameters
streaming         boolean  No        Enable streaming response
conversationId    string   No        Continue specific conversation

Response Format
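
An illustrative response shape, built from the fields listed below (the values themselves are invented for the example):

```json
{
  "id": "conv_abc123",
  "conversationId": "conv_abc123",
  "messages": [
    { "role": "user", "content": "What are your support hours?" },
    { "role": "assistant", "content": "We are available 9am-5pm EST." }
  ],
  "totalTokenUsed": 142,
  "citations": []
}
```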

Response Fields

Field           Description
id              Conversation identifier
conversationId  Same as id, for reference
messages        Full message history
totalTokenUsed  Token count for billing
citations       Source references (if RAG)


Embedding Chatbots

Embed a chat widget directly in any website.

Getting Embed Code

  1. Open your prompt

  2. Go to Publish tab

  3. Find Chatbot Embed Generator

  4. Copy the embed code

Screenshot: Embed Generator

iFrame Embed
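
The Chatbot Embed Generator produces the exact snippet; the src URL below is a placeholder showing the general shape, with SESSION_ID and PROMPT_ID as described in the parameter table:

```html
<!-- Placeholder src: copy the real embed URL from the Chatbot Embed Generator -->
<iframe
  src="https://YOUR_PROMPTOWL_HOST/embed?promptId=PROMPT_ID&sessionId=SESSION_ID"
  width="350"
  height="500"
  frameborder="0">
</iframe>
```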

JavaScript Embed
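
For the script-based embed, the loader URL and data attributes below are hypothetical; copy the actual snippet from your Publish tab:

```html
<!-- Hypothetical loader script: use the snippet from your Publish tab -->
<script src="https://YOUR_PROMPTOWL_HOST/embed.js"
        data-prompt-id="PROMPT_ID"
        data-session-id="SESSION_ID"
        async></script>
```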

Embed Parameters

Parameter   Description      Example
SESSION_ID  User identifier  user-123
PROMPT_ID   Your prompt ID   abc123def456


Customizing Embedded Chat

Color Customization

Configure chat appearance in the publish interface:

Setting            Description
Header Background  Chat header color
Header Text        Header text color
User Bubble        User message background
User Text          User message text color

Screenshot: Chat Customization

Branding Options

Option         Description
Hide Logo      Remove PromptOwl branding
Custom Colors  Match your brand
Size           Adjust width/height

Size Recommendations

Use Case        Width  Height
Sidebar widget  350px  500px
Full panel      480px  860px
Mobile          100%   100%


Variables and Parameters

Pass dynamic values to your prompts at runtime.

Using Variables

Include variables in your API request:
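
For example, a request body that fills two runtime variables (the variable names here are made up for illustration):

```json
{
  "sessionId": "user-123",
  "message": "Summarize my account status",
  "variables": {
    "customer_name": "Alice",
    "plan_tier": "Pro"
  }
}
```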

Variable Syntax

In your prompt, reference variables with:
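
A double-brace placeholder convention is common in prompt templates; the line below assumes that convention, so check your prompt editor for PromptOwl's exact syntax:

```
Hello {{customer_name}}, you are on the {{plan_tier}} plan.
```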

Overriding LLM Settings

Change model settings per request:
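
For example (the llmType value and model name below are illustrative; use the providers and models available in your workspace):

```json
{
  "sessionId": "user-123",
  "message": "Write a product description",
  "llmType": "openai",
  "llmSettings": {
    "model": "gpt-4o",
    "temperature": 0.9,
    "max_tokens": 500
  }
}
```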

Available LLM Settings

Setting      Description       Range
model        Specific model    Provider-dependent
temperature  Creativity        0-2
max_tokens   Response length   Model-dependent
top_p        Nucleus sampling  0-1


Streaming Responses

Get real-time token-by-token responses.

Enabling Streaming
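
Set the streaming flag in the request body:

```json
{
  "sessionId": "user-123",
  "message": "Explain our refund policy",
  "streaming": true
}
```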

Handling Streamed Response

The response is sent as Server-Sent Events (SSE):
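
A minimal Python sketch for reassembling text from an SSE stream. The exact chunk format PromptOwl emits is not shown here; this assumes each event's `data:` line carries a text fragment.

```python
def parse_sse_events(raw: str) -> list[str]:
    """Collect the payload of each `data:` line from an SSE stream.

    Assumption: each event's `data:` line carries one text fragment.
    """
    chunks = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            chunks.append(line[len("data:"):].strip())
    return chunks

# Example: reassemble a response from a captured SSE stream.
sample_stream = "data: Hello\n\ndata: world\n\n"
print(" ".join(parse_sse_events(sample_stream)))  # -> Hello world
```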

Streaming Benefits

Benefit                    Description
Faster perceived response  See text as it generates
Better UX                  Users see progress
Long responses             Handle large outputs


Conversation Management

Maintain context across multiple API calls.

Continuing Conversations

Pass conversationId to continue:
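
For example, reusing a conversationId returned by an earlier response:

```json
{
  "sessionId": "user-123",
  "message": "And what about weekends?",
  "conversationId": "conv_abc123"
}
```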

Using Previous Messages

Pass conversation history manually:
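
For example (the role/content shape of each history entry is an assumption; match the messages array the API returns):

```json
{
  "sessionId": "user-123",
  "message": "And what about weekends?",
  "previousMessages": [
    { "role": "user", "content": "What are your support hours?" },
    { "role": "assistant", "content": "We are available 9am-5pm EST." }
  ]
}
```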

Session ID Best Practices

Use Case     Session ID Pattern
Per-user     user-{userId}
Per-session  session-{uuid}
Anonymous    anon-{timestamp}


Security Considerations

API Key Security

Do                                   Don't
Store keys in environment variables  Hardcode in client-side code
Use server-side proxy                Expose in browser
Rotate keys periodically             Share keys publicly
Use HTTPS only                       Send over HTTP

Backend Proxy Pattern

Instead of calling API directly from browser:
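
A minimal proxy sketch using only the Python standard library. The upstream URL is a placeholder; the key is read from an environment variable, so the browser calls your route and never sees the key.

```python
import json
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder upstream URL -- use your prompt's real endpoint.
PROMPTOWL_URL = "https://YOUR_PROMPTOWL_HOST/api/prompt/PROMPT_ID"


def build_upstream_request(client_body: bytes) -> urllib.request.Request:
    """Attach the secret key server-side; the browser never sees it."""
    return urllib.request.Request(
        PROMPTOWL_URL,
        data=client_body,
        headers={
            "Content-Type": "application/json",
            "X-API-Key": os.environ["PROMPTOWL_API_KEY"],
        },
        method="POST",
    )


class ChatProxy(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the browser's JSON body and forward it upstream unchanged.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        upstream = build_upstream_request(body)
        with urllib.request.urlopen(upstream) as resp:
            data = resp.read()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(data)


# To run the proxy locally:
# HTTPServer(("127.0.0.1", 8080), ChatProxy).serve_forever()
```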

CORS

PromptOwl API allows cross-origin requests:

  • All origins permitted (*)

  • POST and OPTIONS methods

  • X-API-Key header allowed

Rate Limiting

Note: Contact your administrator or PromptOwl support to configure rate limiting for your API endpoints.


Best Practices

API Integration

Do:

  • Use a backend proxy

  • Store keys securely

  • Handle errors gracefully

  • Implement retry logic

  • Log API calls for debugging

Don't:

  • Expose keys in frontend

  • Ignore error responses

  • Skip validation

  • Overload with requests

Embedded Chat

Do:

  • Match brand colors

  • Test on mobile

  • Consider position carefully

  • Provide clear instructions

Don't:

  • Make chat too small

  • Clash with page colors

  • Block important content

  • Forget mobile users

Performance

Do:

  • Use streaming for long responses

  • Cache when possible

  • Set appropriate token limits

  • Monitor usage

Don't:

  • Request unnecessarily large responses

  • Poll continuously

  • Ignore token costs


Troubleshooting

"PromptOwl API key not found"

Cause: Missing API key header

Solutions:

  1. Check header name: X-API-Key

  2. Verify key is included in request

  3. Check for typos in key

"Invalid API key"

Cause: Key doesn't match or is inactive

Solutions:

  1. Verify key is correct

  2. Confirm the key hasn't been regenerated (regeneration invalidates the old key)

  3. Ensure key is active

  4. Generate new key if needed

"This prompt is not live"

Cause: Prompt is in Draft status

Solutions:

  1. Go to Publish tab

  2. Toggle to Live status

  3. Save changes

  4. Retry API call

Empty or Error Responses

Solutions:

  1. Check request body format

  2. Verify JSON is valid

  3. Ensure required fields present

  4. Check message isn't empty

Embed Not Loading

Solutions:

  1. Verify prompt ID is correct

  2. Check prompt is Live

  3. Ensure HTTPS on your page

  4. Check browser console for errors

Streaming Not Working

Solutions:

  1. Verify streaming: true in request

  2. Check client handles SSE

  3. Ensure connection isn't closed early

  4. Test with non-streaming first


Quick Reference

API Endpoint
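
Placeholder URL; copy the real endpoint from your prompt's Publish tab:

```
POST https://YOUR_PROMPTOWL_HOST/api/prompt/PROMPT_ID
X-API-Key: YOUR_API_KEY
```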

Minimum Request
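
```json
{
  "sessionId": "user-123",
  "message": "Hello"
}
```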

Full Request Options
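
All optional fields together (the llmType value and settings are illustrative):

```json
{
  "sessionId": "user-123",
  "message": "Hello",
  "previousMessages": [],
  "variables": {},
  "llmType": "openai",
  "llmSettings": { "temperature": 0.7 },
  "streaming": false,
  "conversationId": "conv_abc123"
}
```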

HTTP Status Codes

Code  Meaning
200   Success
400   Prompt not live or bad request
401   Invalid or missing API key
500   Server error

Embed Template
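
Placeholder src; copy the actual snippet from the Chatbot Embed Generator:

```html
<iframe
  src="https://YOUR_PROMPTOWL_HOST/embed?promptId=PROMPT_ID&sessionId=SESSION_ID"
  width="350"
  height="500"
  frameborder="0">
</iframe>
```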

