Reviewing and Managing Annotations
This guide explains how to collect user feedback through annotations and how to use that feedback to improve your AI prompts in PromptOwl.
Overview
Annotations are user feedback messages attached to AI responses. They help you:
- Understand how well your prompts are performing
- Identify areas for improvement
- Build evaluation datasets for testing
- Track user satisfaction over time
Understanding Annotations
What Are Annotations?
Annotations combine two types of feedback:
- Sentiment: Quick thumbs up/down rating
- Detailed Feedback: Written comments explaining the rating
Types of Annotations
| Type | Scope | Use Case |
|---|---|---|
| Message Annotation | Single AI response | Feedback on a specific answer |
| Conversation Annotation | Entire conversation | Overall experience feedback |
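Both types share the same underlying shape: an optional sentiment paired with written feedback, attached at either message or conversation scope. A minimal sketch of that record in Python (the field names are illustrative assumptions, not PromptOwl's actual schema):

```python
from dataclasses import dataclass
from typing import Literal, Optional

@dataclass
class Annotation:
    """Illustrative annotation record; field names are assumptions."""
    scope: Literal["message", "conversation"]   # what the feedback applies to
    sentiment: Optional[Literal["up", "down"]]  # optional thumbs rating
    feedback: str                               # required written comment
```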
Collecting Annotations (User View)
Adding Feedback to a Response
Users can annotate any AI response:
- Hover over an AI response in the chat
- Click the Annotation icon (speech bubble)
- The annotation modal opens
The Annotation Modal
The modal shows:
- Preview of the AI response (first 200 characters)
- Sentiment buttons (thumbs up/down)
- Text area for detailed feedback
Providing Feedback
- Select Sentiment (optional):
  - Click Thumbs Up for positive feedback
  - Click Thumbs Down for negative feedback
  - Click again to deselect
- Write Details (required):
  - Describe what was good or bad
  - Suggest improvements
  - Note any inaccuracies
  - Maximum 2,000 characters
- Click Submit to save (see the validation sketch below)
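The submission rules above are simple to mirror if you pre-screen feedback in your own tooling: sentiment is optional, text is required, and text is capped at 2,000 characters. A minimal sketch with a hypothetical validate_annotation helper, assuming 'up'/'down' sentiment values:

```python
MAX_FEEDBACK_CHARS = 2_000  # limit stated in the annotation modal

def validate_annotation(feedback: str, sentiment: str | None = None) -> None:
    """Raise ValueError if the input would be rejected by the modal's rules.

    Hypothetical helper; the 'up'/'down' values are assumptions.
    """
    if sentiment not in (None, "up", "down"):
        raise ValueError("sentiment must be 'up', 'down', or omitted")
    if not feedback.strip():
        raise ValueError("detailed feedback text is required")
    if len(feedback) > MAX_FEEDBACK_CHARS:
        raise ValueError(f"feedback exceeds {MAX_FEEDBACK_CHARS} characters")
```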
Annotating Entire Conversations
For overall conversation feedback:
- Open the conversation history panel
- Click Annotate Conversation in the header
- Provide sentiment and feedback
- Submit the annotation
Reviewing Annotations (Admin View)
Administrators can review all feedback through the Monitor interface.
Accessing the Monitor
- Open a prompt from the Dashboard
- Click Monitor in the top navigation
- Navigate to the All Annotations tab
The All Annotations View
This view displays all feedback collected for your prompt:
Table Columns
| Column | Description |
|---|---|
| User | Who submitted the feedback |
| Topic | Conversation topic/title |
| Question | What the user asked |
| Response | What the AI answered |
| Annotation | The feedback text (highlighted) |
| Sentiment | Thumbs up, down, or neutral |
| Date | When feedback was submitted |
Filtering and Sorting
- Search: Find specific feedback by keyword
- Sort: Order by most recent first
- Filter by Sentiment: View only positive or negative feedback
Viewing Annotation Context
To see the full conversation:
- Click on any annotation row
- The conversation panel opens on the right
- View the complete message history
- See where the annotation was placed
Sentiment Analysis
Understanding Sentiment Badges
| Badge | Icon | Meaning |
|---|---|---|
| Green | Thumbs Up | Positive feedback |
| Red | Thumbs Down | Negative feedback |
| Gray | Minus | Neutral/No sentiment |
Sentiment Patterns to Watch
- Consistent negative sentiment → Prompt needs major revision
- Mixed sentiment on same topics → Inconsistent AI responses
- Positive sentiment declining → Recent changes may have issues
- Negative on specific questions → Knowledge gaps to address
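These patterns are easy to check programmatically against the CSV export described in the next section. The sketch below buckets annotations by ISO week to surface a declining positive share; the 'sentiment' and 'date' column names and value formats are assumptions, so adjust them to your actual export headers.

```python
import csv
from collections import defaultdict
from datetime import datetime

def weekly_positive_share(csv_path: str) -> dict[str, float]:
    """Share of thumbs-up annotations per ISO week from an exported CSV."""
    counts = defaultdict(lambda: [0, 0])  # week -> [positive, total]
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            ts = datetime.fromisoformat(row["date"])  # assumes ISO 8601 dates
            week = f"{ts.isocalendar().year}-W{ts.isocalendar().week:02d}"
            counts[week][1] += 1
            if row["sentiment"] == "up":  # assumed sentiment encoding
                counts[week][0] += 1
    return {week: pos / total for week, (pos, total) in sorted(counts.items())}
```

A share that falls steadily after a prompt change is a strong signal to revisit that change.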
Exporting Annotations
Export to CSV
Download all annotations for external analysis:
- Go to the All Annotations tab
- Click Export CSV
- Save the downloaded file
The CSV includes:
- User information
- Question and response text
- Annotation content
- Sentiment values
- Timestamps
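Because the export is plain CSV, it parses with standard tooling for quick triage outside a spreadsheet. The sketch below tallies sentiment and lists the most recent negative feedback; the filename and the 'sentiment', 'date', 'question', and 'annotation' column names are assumptions about the export.

```python
import csv
from collections import Counter

with open("annotations_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Overall sentiment breakdown (empty sentiment treated as neutral).
print(Counter(row["sentiment"] or "neutral" for row in rows))

# Most recent thumbs-down feedback, for follow-up.
negatives = sorted(
    (r for r in rows if r["sentiment"] == "down"),
    key=lambda r: r["date"],
    reverse=True,
)
for r in negatives[:10]:
    print(f'{r["date"]}  {r["question"][:60]!r} -> {r["annotation"][:80]!r}')
```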
Use Cases for Exports
- Share with stakeholders
- Analyze trends in spreadsheets
- Create reports on AI performance
- Archive feedback for compliance
Creating Evaluation Sets
Turn high-quality annotations into test cases for prompt evaluation.
What Are Eval Sets?
Eval Sets are collections of question-response pairs with expected outcomes. Use them to:
- Test prompt changes before publishing
- Compare different prompt versions
- Ensure quality doesn’t regress
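Conceptually, an eval set is just a named list of question/expected-outcome pairs. A sketch of that idea (the layout and field names are illustrative, not PromptOwl's internal format):

```python
import json

# Illustrative eval set assembled from annotated exchanges; field names
# are assumptions, not PromptOwl's internal format.
eval_set = {
    "name": "billing-questions-v1",
    "cases": [
        {
            "question": "How do I update my payment method?",
            "expected": "Steps for changing the payment method on the Billing page",
            "source": "annotation",  # provenance: derived from user feedback
        },
    ],
}
print(json.dumps(eval_set, indent=2))
```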
Creating an Eval Set from Annotations
- Go to the All Annotations tab
- Check the boxes next to relevant annotations
- Click Save to Eval Set
- Name your eval set
- Click Create
Best Annotations for Eval Sets
Include annotations that:
- Represent common user questions
- Show clear expected behavior
- Cover edge cases
- Include both positive and negative examples
Running Evaluations
- Go to the prompt’s Eval tab
- Select your eval set
- Run the evaluation
- Compare results against expected outcomes
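The comparison step boils down to scoring each new answer against its expected outcome. A toy sketch using crude lexical similarity; real evaluations usually rely on semantic or rubric-based scoring, so treat this only as an illustration of the idea:

```python
from difflib import SequenceMatcher

def score_case(expected: str, actual: str) -> float:
    """Crude lexical similarity in [0, 1]; illustration only."""
    return SequenceMatcher(None, expected.lower(), actual.lower()).ratio()

expected = "Refunds take 5-7 business days."
actual = "Refunds usually take 5-7 business days."
print(f"match: {score_case(expected, actual):.2f}")  # high score -> behavior held
```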
Using Annotations to Improve Prompts
Feedback Analysis Workflow
- Review Regularly: Check annotations at least weekly
- Identify Patterns: Look for recurring issues
- Prioritize Fixes: Address frequent negative feedback first
- Update Prompts: Make targeted improvements
- Test Changes: Use eval sets to verify fixes
- Monitor Results: Track if sentiment improves
Common Issues and Solutions
| Issue Pattern | Likely Cause | Solution |
|---|---|---|
| "Wrong information" | Outdated documents | Update Data Room |
| "Didn't understand" | Unclear prompt | Improve system context |
| "Too verbose" | No length guidance | Add response length instructions |
| "Missed context" | Memory disabled | Enable conversation memory |
| "Wrong format" | No format instructions | Specify output format in prompt |
Best Practices
For Collecting Quality Feedback
- Enable annotations for all production prompts
- Train users on how to give useful feedback
- Make it easy: ensure the annotation button is visible
- Follow up on critical feedback quickly
For Reviewing Feedback
- Schedule regular reviews (daily or weekly)
- Look for patterns, not just individual complaints
- Track sentiment trends over time
- Celebrate positive feedback with your team
For Acting on Feedback
- Prioritize by impact: fix issues affecting the most users
- Test before publishing: use eval sets
- Document changes: note why changes were made
- Close the loop: let users know when issues are fixed
Enterprise Settings
Enabling Annotations
Annotations can be enabled/disabled at the enterprise level:
- Go to Admin → Settings
- Find the Response Annotation setting
- Toggle to enable/disable
Related Settings
| Setting | Description |
|---|---|
| Show Response Feedback | Enable thumbs up/down buttons |
| Show Response Annotation | Enable detailed annotation modal |
| Show Response Save | Allow saving responses as artifacts |
Troubleshooting
Annotation button not visible
- Check that annotations are enabled in enterprise settings
- Verify user is logged in
- Ensure user owns the conversation
- Check permission settings for the prompt
Cannot submit annotation
- Verify feedback text is not empty
- Check text is under 2,000 characters
- Ensure network connection is stable
- Try refreshing the page
Annotations not appearing in Monitor
- Verify you have access to the prompt
- Check you’re viewing the correct prompt
- Refresh the All Annotations view
- Check date filters aren’t excluding recent feedback
Export not working
- Check browser allows downloads
- Try a different browser
- Verify there are annotations to export
- Check for popup blockers
Privacy and Data Handling
What’s Stored
- Annotation text
- Sentiment value
- Timestamp
- User who submitted
- Associated conversation
Data Retention
- Annotations are retained with conversations
- Deleting a conversation removes its annotations
- Export data for archival before deletion
Access Control
- Only prompt owners/admins can view all annotations
- Users can only annotate their own conversations
- Team members see annotations for shared prompts