Reviewing and Managing Annotations
Collect and analyze user feedback on AI responses in PromptOwl with annotations, sentiment tracking, and improvement workflows.
This guide explains how to collect user feedback through annotations and use that feedback to improve your AI prompts in PromptOwl.
Annotations are user feedback messages attached to AI responses. They help you:
Understand how well your prompts are performing
Identify areas for improvement
Build evaluation datasets for testing
Track user satisfaction over time
Understanding Annotations
What Are Annotations?
Annotations combine two types of feedback:
Sentiment - Quick thumbs up/down rating
Detailed Feedback - Written comments explaining the rating
Types of Annotations
Response annotation - Feedback on a specific AI answer
Conversation annotation - Overall experience feedback for the whole conversation
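Conceptually, each annotation pairs an optional sentiment with a required written comment and is attached either to a single response or to a whole conversation. The sketch below shows one way such a record might be shaped; the field names and values are assumptions for illustration, not PromptOwl's actual schema.

```python
# Illustrative only: field names and values are assumptions, not PromptOwl's schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Annotation:
    target: str                # "response" or "conversation"
    target_id: str             # ID of the annotated response or conversation
    user: str                  # who submitted the feedback
    sentiment: Optional[str]   # "up", "down", or None (sentiment is optional)
    feedback: str              # required written comment explaining the rating
    created_at: str            # when the feedback was submitted
```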
Collecting Annotations (User View)
Adding Feedback to a Response
Users can annotate any AI response:
Hover over an AI response in the chat
Click the Annotation icon (speech bubble)
The annotation modal opens
Screenshot: Annotation Button
The Annotation Modal
The modal shows:
Preview of the AI response (first 200 characters)
Sentiment buttons (thumbs up/down)
Text area for detailed feedback
Screenshot: Annotation Modal
Providing Feedback
Select Sentiment (optional):
Click Thumbs Up for positive feedback
Click Thumbs Down for negative feedback
Write Details (required):
Describe what was good or bad
Annotating Entire Conversations
For overall conversation feedback:
Open the conversation history panel
Click Annotate Conversation in the header
Provide sentiment and feedback
Reviewing Annotations (Admin View)
Administrators can review all feedback through the Monitor interface.
Accessing the Monitor
Open a prompt from the Dashboard
Click Monitor in the top navigation
Navigate to the All Annotations tab
Screenshot: Monitor Navigation
The All Annotations View
This view displays all feedback collected for your prompt:
Screenshot: All Annotations Table
User - Who submitted the feedback
Annotation - The feedback text (highlighted)
Sentiment - Thumbs up, down, or neutral
Date - When feedback was submitted
Filtering and Sorting
Search: Find specific feedback by keyword
Sort: Order by most recent first
Filter by Sentiment: View only positive or negative feedback
Viewing Annotation Context
To see the full conversation:
Click on any annotation row
The conversation panel opens on the right
View the complete message history
See where the annotation was placed
Screenshot: Annotation Context
Sentiment Analysis
Understanding Sentiment Badges
Each annotation in the Monitor carries a sentiment badge: thumbs up for positive feedback, thumbs down for negative, or neutral when no sentiment was selected.
Sentiment Patterns to Watch
Consistent negative sentiment → Prompt needs major revision
Mixed sentiment on same topics → Inconsistent AI responses
Positive sentiment declining → Recent changes may have issues
Negative on specific questions → Knowledge gaps to address
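These patterns are easy to check programmatically once annotations are exported (see Exporting Annotations below). The rough sketch here surfaces questions that repeatedly receive negative feedback; the column names ("question", "sentiment") are assumptions about the export layout, so adjust them to match your file.

```python
# Rough sketch: find questions that repeatedly receive thumbs-down feedback.
# Column names ("question", "sentiment") are assumptions; match them to your export.
import pandas as pd

annotations = pd.read_csv("annotations.csv")

negative_by_question = (
    annotations[annotations["sentiment"] == "down"]
    .groupby("question")
    .size()
    .sort_values(ascending=False)
)

# Questions near the top are likely knowledge gaps to address in the prompt
print(negative_by_question.head(10))
```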
Exporting Annotations
Download all annotations for external analysis:
Go to the All Annotations tab
Click the Export CSV button
The CSV includes:
Question and response text
Screenshot: Export CSV Button
Use Cases for Exports
Analyze trends in spreadsheets
Create reports on AI performance
Archive feedback for compliance
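As a worked example of trend analysis on an export, the sketch below computes the weekly share of thumbs-up feedback; the "date" and "sentiment" column names are assumptions, so rename them to match your CSV.

```python
# Minimal sketch: weekly share of positive feedback from an exported CSV.
# Assumes "date" and "sentiment" columns; adjust to your actual export.
import pandas as pd

annotations = pd.read_csv("annotations.csv", parse_dates=["date"])

weekly_positive = (
    annotations
    .assign(positive=annotations["sentiment"].eq("up"))
    .set_index("date")
    .resample("W")["positive"]
    .mean()
)

# A sustained decline suggests recent prompt changes may have introduced issues
print(weekly_positive.tail(8))
```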
Creating Evaluation Sets
Turn high-quality annotations into test cases for prompt evaluation.
What Are Eval Sets?
Eval Sets are collections of question-response pairs with expected outcomes. Use them to:
Test prompt changes before publishing
Compare different prompt versions
Ensure quality doesn't regress
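Conceptually, each eval case pairs a question with the outcome you expect from a good response. The structure below is a hypothetical illustration of that idea, not PromptOwl's internal format.

```python
# Illustrative only: a hypothetical shape for eval cases built from annotations.
eval_set = [
    {
        "question": "How do I reset my password?",
        "expected": "Points the user to the password reset flow with the correct steps.",
        "expected_keywords": ["reset", "password"],
    },
    {
        "question": "What file formats can I upload?",
        "expected": "Lists only the formats the product actually supports.",
        "expected_keywords": ["format"],
    },
]
```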
Creating an Eval Set from Annotations
Go to the All Annotations tab
Check the boxes next to relevant annotations
Screenshot: Selection Checkboxes
Best Annotations for Eval Sets
Include annotations that:
Represent common user questions
Show clear expected behavior
Include both positive and negative examples
Running Evaluations
Go to the prompt's Eval tab
Run your eval set against the prompt
Compare results against expected outcomes (see the simplified sketch below)
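To illustrate what comparing results against expected outcomes means, the sketch below checks whether each new response still contains its case's expected keywords. It uses the hypothetical eval case structure sketched earlier and is a deliberate simplification, not how PromptOwl scores evaluations.

```python
# Simplified, illustrative comparison of responses against expected outcomes.
eval_set = [
    {"question": "How do I reset my password?", "expected_keywords": ["reset", "password"]},
]

# "responses" stands in for the answers produced by the updated prompt
responses = {
    "How do I reset my password?": "Open Settings, choose Security, then click Reset password.",
}

for case in eval_set:
    answer = responses.get(case["question"], "")
    passed = all(kw.lower() in answer.lower() for kw in case["expected_keywords"])
    print("PASS" if passed else "FAIL", "-", case["question"])
```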
Using Annotations to Improve Prompts
Feedback Analysis Workflow
Review Weekly: Check annotations at least weekly
Identify Patterns: Look for recurring issues
Prioritize Fixes: Address frequent negative feedback first
Update Prompts: Make targeted improvements
Test Changes: Use eval sets to verify fixes
Monitor Results: Track if sentiment improves
Common Issues and Solutions
Responses run too long or too short → Add response length instructions
AI loses track of earlier messages in the conversation → Enable conversation memory
Output formatting is inconsistent between responses → Specify output format in prompt
Best Practices
For Collecting Quality Feedback
Enable annotations for all production prompts
Train users on how to give useful feedback
Make it easy - ensure annotation button is visible
Follow up on critical feedback quickly
For Reviewing Feedback
Schedule regular reviews (daily or weekly)
Look for patterns not just individual complaints
Track sentiment trends over time
Celebrate positive feedback with your team
For Acting on Feedback
Prioritize by impact - fix issues affecting most users
Test before publishing - use eval sets
Document changes - note why changes were made
Close the loop - let users know issues are fixed
Enterprise Settings
Enabling Annotations
Annotations can be enabled/disabled at the enterprise level:
Open your enterprise settings and find the Response Annotation setting, which lets you:
Enable thumbs up/down buttons
Enable detailed annotation modal
Allow saving responses as artifacts
Troubleshooting
Annotation option not available
Check enterprise settings have annotations enabled
Ensure user owns the conversation
Check permission settings for the prompt
Cannot submit annotation
Verify feedback text is not empty
Check text is under 2,000 characters
Ensure network connection is stable
Annotations not appearing in Monitor
Verify you have access to the prompt
Check you're viewing the correct prompt
Refresh the All Annotations view
Check date filters aren't excluding recent feedback
Export not working
Check browser allows downloads
Verify there are annotations to export
Privacy and Data Handling
Annotations are retained with conversations
Deleting a conversation removes its annotations
Export data for archival before deletion
Only prompt owners/admins can view all annotations
Users can only annotate their own conversations
Team members see annotations for shared prompts