Reviewing and Managing Annotations

Collect and analyze user feedback on AI responses in PromptOwl with annotations, sentiment tracking, and improvement workflows.

This guide explains how to collect user feedback through annotations and use that feedback to improve your AI prompts in PromptOwl.


Overview

Annotations are user feedback messages attached to AI responses. They help you:

  • Understand how well your prompts are performing

  • Identify areas for improvement

  • Build evaluation datasets for testing

  • Track user satisfaction over time


Understanding Annotations

What Are Annotations?

Annotations combine two types of feedback:

  1. Sentiment - Quick thumbs up/down rating

  2. Detailed Feedback - Written comments explaining the rating

Types of Annotations

| Type | Scope | Use Case |
| --- | --- | --- |
| Message Annotation | Single AI response | Feedback on a specific answer |
| Conversation Annotation | Entire conversation | Overall experience feedback |
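
If you work with annotations programmatically (for example, after exporting them), each record combines the elements above. The sketch below shows one way such a record could be represented; the field names are assumptions for illustration, not PromptOwl's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Illustrative record shape only -- the field names are assumptions,
# not PromptOwl's actual data model.
@dataclass
class Annotation:
    scope: str                # "message" or "conversation"
    sentiment: Optional[str]  # "up", "down", or None (sentiment is optional)
    feedback: str             # required written feedback, up to 2,000 characters
    user: str                 # who submitted the annotation
    submitted_at: datetime    # when it was submitted

example = Annotation(
    scope="message",
    sentiment="down",
    feedback="The answer cited an outdated pricing table.",
    user="jane@example.com",
    submitted_at=datetime(2024, 5, 1, 9, 30),
)
print(example.scope, example.sentiment)
```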


Collecting Annotations (User View)

Adding Feedback to a Response

Users can annotate any AI response:

  1. Hover over an AI response in the chat

  2. Click the Annotation icon (speech bubble)

  3. The annotation modal opens

Screenshot: Annotation Button

The Annotation Modal

The modal shows:

  • Preview of the AI response (first 200 characters)

  • Sentiment buttons (thumbs up/down)

  • Text area for detailed feedback

Screenshot: Annotation Modal

Providing Feedback

  1. Select Sentiment (optional):

    • Click Thumbs Up for positive feedback

    • Click Thumbs Down for negative feedback

    • Click again to deselect

  2. Write Details (required):

    • Describe what was good or bad

    • Suggest improvements

    • Note any inaccuracies

    • Maximum 2,000 characters

  3. Click Submit to save
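
If you validate feedback before submitting it (for example, in a script that collects feedback on users' behalf), the rules above reduce to a simple check. This is a minimal sketch of the documented constraints, not PromptOwl code.

```python
from typing import Optional

MAX_FEEDBACK_LENGTH = 2_000  # documented limit for the detailed feedback text

def validate_feedback(feedback: str, sentiment: Optional[str] = None) -> list:
    """Return a list of problems; an empty list means the feedback can be submitted."""
    problems = []
    if not feedback.strip():
        problems.append("Detailed feedback is required.")
    if len(feedback) > MAX_FEEDBACK_LENGTH:
        problems.append(f"Feedback exceeds {MAX_FEEDBACK_LENGTH} characters.")
    if sentiment not in (None, "up", "down"):  # sentiment is optional
        problems.append("Sentiment must be thumbs up, thumbs down, or omitted.")
    return problems

print(validate_feedback("Great answer, but please cite the source.", "up"))  # -> []
```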

Annotating Entire Conversations

For overall conversation feedback:

  1. Open the conversation history panel

  2. Click Annotate Conversation in the header

  3. Provide sentiment and feedback

  4. Submit the annotation


Reviewing Annotations (Admin View)

Administrators can review all feedback through the Monitor interface.

Accessing the Monitor

  1. Open a prompt from the Dashboard

  2. Click Monitor in the top navigation

  3. Navigate to the All Annotations tab

Screenshot: Monitor Navigation

The All Annotations View

This view displays all feedback collected for your prompt:

Screenshot: All Annotations Table

Table Columns

| Column | Description |
| --- | --- |
| User | Who submitted the feedback |
| Topic | Conversation topic/title |
| Question | What the user asked |
| Response | What the AI answered |
| Annotation | The feedback text (highlighted) |
| Sentiment | Thumbs up, down, or neutral |
| Date | When feedback was submitted |

Filtering and Sorting

  • Search: Find specific feedback by keyword

  • Sort: Order by most recent first

  • Filter by Sentiment: View only positive or negative feedback


Viewing Annotation Context

To see the full conversation:

  1. Click on any annotation row

  2. The conversation panel opens on the right

  3. View the complete message history

  4. See where the annotation was placed

Screenshot: Annotation Context

Sentiment Analysis

Understanding Sentiment Badges

| Badge | Icon | Meaning |
| --- | --- | --- |
| Green | Thumbs Up | Positive feedback |
| Red | Thumbs Down | Negative feedback |
| Gray | Minus | Neutral / no sentiment |

Sentiment Patterns to Watch

  • Consistent negative sentiment → Prompt needs major revision

  • Mixed sentiment on same topics → Inconsistent AI responses

  • Positive sentiment declining → Recent changes may have issues

  • Negative on specific questions → Knowledge gaps to address
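
One way to spot these patterns is to chart sentiment over time from an annotations export (see Exporting Annotations below). A minimal sketch, assuming a pandas DataFrame with `sentiment` and `date` columns similar to those in the All Annotations table; the column names in your export may differ.

```python
import pandas as pd

# Assumed columns: "sentiment" ("up"/"down"/None) and "date" -- adjust to match your export.
df = pd.DataFrame({
    "sentiment": ["up", "down", "up", "down", "down", "up"],
    "date": pd.to_datetime([
        "2024-05-01", "2024-05-03", "2024-05-08",
        "2024-05-10", "2024-05-15", "2024-05-20",
    ]),
})

# Weekly share of thumbs-up feedback: a falling trend suggests recent changes may have issues.
weekly_positive = (
    df.assign(positive=df["sentiment"].eq("up"))
      .set_index("date")
      .resample("W")["positive"]
      .mean()
)
print(weekly_positive)
```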


Exporting Annotations

Export to CSV

Download all annotations for external analysis:

  1. Go to the All Annotations tab

  2. Click Export CSV

  3. Save the downloaded file

The CSV includes:

  • User information

  • Question and response text

  • Annotation content

  • Sentiment values

  • Timestamps

Screenshot: Export CSV Button
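
Once downloaded, the file can be analyzed with any spreadsheet or scripting tool. The sketch below loads it with pandas; the column names mirror the fields listed above but are assumptions, so check the header row of your actual export.

```python
import pandas as pd

# Column names are assumptions based on the fields described above;
# inspect the header row of your export and adjust as needed.
df = pd.read_csv("annotations_export.csv", parse_dates=["date"])

negative = df[df["sentiment"] == "down"]  # focus on thumbs-down feedback
print(negative[["user", "question", "annotation"]].head())
print(f"{len(negative)} of {len(df)} annotations are negative")
```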

Use Cases for Exports

  • Share with stakeholders

  • Analyze trends in spreadsheets

  • Create reports on AI performance

  • Archive feedback for compliance


Creating Evaluation Sets

Turn high-quality annotations into test cases for prompt evaluation.

What Are Eval Sets?

Eval Sets are collections of question-response pairs with expected outcomes. Use them to:

  • Test prompt changes before publishing

  • Compare different prompt versions

  • Ensure quality doesn't regress
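
Conceptually, an eval set is a list of question/expected-outcome pairs. The sketch below is only an illustration of that idea; the actual eval set structure lives inside PromptOwl.

```python
# Illustrative only: a conceptual view of an eval set as question/expected-outcome pairs.
eval_set = {
    "name": "billing-questions-v1",
    "cases": [
        {
            "question": "How do I update my payment method?",
            "expected": "Mentions the Billing page and the Update card button.",
        },
        {
            "question": "Do you offer refunds?",
            "expected": "States the refund policy without inventing exceptions.",
        },
    ],
}
print(f"{eval_set['name']}: {len(eval_set['cases'])} cases")
```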

Creating an Eval Set from Annotations

  1. Go to the All Annotations tab

  2. Check the boxes next to relevant annotations

  3. Click Save to Eval Set

  4. Name your eval set

  5. Click Create

Screenshot: Selection Checkboxes

Best Annotations for Eval Sets

Include annotations that:

  • Represent common user questions

  • Show clear expected behavior

  • Cover edge cases

  • Include both positive and negative examples

Running Evaluations

  1. Go to the prompt's Eval tab

  2. Select your eval set

  3. Run the evaluation

  4. Compare results against expected outcomes
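
If you export or copy the run results, you can summarize them yourself. A rough sketch of scoring responses against expectations with a simple keyword check; PromptOwl's own evaluation scoring may work differently.

```python
def keyword_pass(actual, required_phrases):
    """Very rough check: the response must contain every required phrase."""
    text = actual.lower()
    return all(phrase.lower() in text for phrase in required_phrases)

# (actual response, phrases the expected outcome requires) -- example data only.
results = [
    ("Go to the Billing page and click Update card.", ["billing page", "update card"]),
    ("We do not offer refunds.", ["refund policy"]),
]

passed = sum(keyword_pass(actual, required) for actual, required in results)
print(f"{passed}/{len(results)} cases passed")  # -> 1/2 cases passed
```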


Using Annotations to Improve Prompts

Feedback Analysis Workflow

  1. Review Weekly: Check annotations at least weekly

  2. Identify Patterns: Look for recurring issues

  3. Prioritize Fixes: Address frequent negative feedback first

  4. Update Prompts: Make targeted improvements

  5. Test Changes: Use eval sets to verify fixes

  6. Monitor Results: Track if sentiment improves
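
To help with step 2, you can mine exported feedback for recurring words. A small sketch using only the Python standard library, assuming you have the text of the thumbs-down annotations:

```python
from collections import Counter
import re

# Assumed input: the annotation text of thumbs-down feedback, e.g. from a CSV export.
negative_feedback = [
    "Wrong information about pricing tiers.",
    "The pricing answer was outdated.",
    "Response was too verbose and missed the pricing question.",
]

# Count longer words to surface recurring issues worth prioritizing.
words = Counter()
for text in negative_feedback:
    words.update(w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 4)

print(words.most_common(3))  # "pricing" recurring here would suggest a knowledge gap
```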

Common Issues and Solutions

| Issue Pattern | Likely Cause | Solution |
| --- | --- | --- |
| "Wrong information" | Outdated documents | Update Data Room |
| "Didn't understand" | Unclear prompt | Improve system context |
| "Too verbose" | No length guidance | Add response length instructions |
| "Missed context" | Memory disabled | Enable conversation memory |
| "Wrong format" | No format instructions | Specify output format in prompt |


Best Practices

For Collecting Quality Feedback

  • Enable annotations for all production prompts

  • Train users on how to give useful feedback

  • Make it easy - ensure the annotation button is visible

  • Follow up on critical feedback quickly

For Reviewing Feedback

  • Schedule regular reviews (daily or weekly)

  • Look for patterns, not just individual complaints

  • Track sentiment trends over time

  • Celebrate positive feedback with your team

For Acting on Feedback

  • Prioritize by impact - fix issues affecting most users

  • Test before publishing - use eval sets

  • Document changes - note why changes were made

  • Close the loop - let users know issues are fixed


Enterprise Settings

Enabling Annotations

Annotations can be enabled/disabled at the enterprise level:

  1. Go to Admin Settings

  2. Find the Response Annotation setting

  3. Toggle to enable/disable

| Setting | Description |
| --- | --- |
| Show Response Feedback | Enable thumbs up/down buttons |
| Show Response Annotation | Enable detailed annotation modal |
| Show Response Save | Allow saving responses as artifacts |
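
These toggles are independent, so the feedback buttons can be on while the annotation modal is off (or vice versa). The sketch below is purely illustrative of how a client might check them; the keys mirror the setting names above and are not an actual PromptOwl API payload.

```python
# Illustrative only: keys mirror the settings listed above, not a real API payload.
enterprise_settings = {
    "show_response_feedback": True,    # thumbs up/down buttons
    "show_response_annotation": True,  # detailed annotation modal
    "show_response_save": False,       # saving responses as artifacts
}

def annotation_button_visible(settings):
    """The annotation (speech bubble) icon appears only when annotations are enabled."""
    return bool(settings.get("show_response_annotation", False))

print(annotation_button_visible(enterprise_settings))  # -> True
```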


Troubleshooting

Annotation button not visible

  1. Check enterprise settings have annotations enabled

  2. Verify user is logged in

  3. Ensure user owns the conversation

  4. Check permission settings for the prompt

Cannot submit annotation

  1. Verify feedback text is not empty

  2. Check text is under 2,000 characters

  3. Ensure network connection is stable

  4. Try refreshing the page

Annotations not appearing in Monitor

  1. Verify you have access to the prompt

  2. Check you're viewing the correct prompt

  3. Refresh the All Annotations view

  4. Check date filters aren't excluding recent feedback

Export not working

  1. Check browser allows downloads

  2. Try a different browser

  3. Verify there are annotations to export

  4. Check for popup blockers


Privacy and Data Handling

What's Stored

  • Annotation text

  • Sentiment value

  • Timestamp

  • User who submitted

  • Associated conversation

Data Retention

  • Annotations are retained with conversations

  • Deleting a conversation removes its annotations

  • Export data for archival before deletion

Access Control

  • Only prompt owners/admins can view all annotations

  • Users can only annotate their own conversations

  • Team members see annotations for shared prompts

