Collecting feedback on agent outputs helps teams understand where responses fall short, whether in accuracy, tone, or usefulness. It captures real user signals that guide what to improve next. Annotating feedback transforms those signals into structured insights: by labeling and triaging issues, teams can identify patterns, prioritize fixes, and create a clear feedback loop that drives continuous agent improvement. This is typically part of the quality assurance (QA) process. You can do both seamlessly on Contextual, with built-in tools to collect and annotate feedback in one place.

Feedback Collection

The first phase is to collect feedback from your users.
  1. Users can provide feedback on an agent’s output using a thumbs up/thumbs down rating or by flagging responses that need review. They can do so via the Contextual UI or via API.
On the Contextual UI, you can click the feedback icons to leave feedback. We also provide an API for leaving feedback programmatically. In the request body, set the feedback field to thumbs_up, thumbs_down, or flagged, and optionally include an explanation. Here’s an example:
curl --request POST \
  --url https://api.contextual.ai/v1/agents/{agent_id}/feedback \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
  "message_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "feedback": "thumbs_down",
  "explanation": "The response did not answer the question",
  "content_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a"
}'
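The same request can be made from Python. The sketch below mirrors the curl call above using only the standard library; the agent ID, API token, and IDs are placeholders you would substitute with your own values.

```python
# Sketch: submitting feedback programmatically to the Contextual feedback
# endpoint. AGENT_ID, API_TOKEN, and the example IDs are placeholders.
import json
import urllib.request

API_TOKEN = "YOUR_API_KEY"   # placeholder
AGENT_ID = "YOUR_AGENT_ID"   # placeholder


def build_feedback_payload(message_id, feedback, explanation=None, content_id=None):
    """Assemble the JSON body for POST /v1/agents/{agent_id}/feedback.

    `feedback` is one of "thumbs_up", "thumbs_down", or "flagged".
    Optional fields are omitted when not provided.
    """
    payload = {"message_id": message_id, "feedback": feedback}
    if explanation is not None:
        payload["explanation"] = explanation
    if content_id is not None:
        payload["content_id"] = content_id
    return payload


def submit_feedback(payload):
    """POST the payload to the feedback endpoint and return the parsed response."""
    req = urllib.request.Request(
        url=f"https://api.contextual.ai/v1/agents/{AGENT_ID}/feedback",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Build the same body as the curl example (call submit_feedback(payload)
# with real credentials to actually send it).
payload = build_feedback_payload(
    message_id="3c90c3cc-0d44-4b50-8888-8dd25736052a",
    feedback="thumbs_down",
    explanation="The response did not answer the question",
    content_id="3c90c3cc-0d44-4b50-8888-8dd25736052a",
)
```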
  2. On the Contextual UI, if you leave a thumbs down, you will be prompted to leave freeform feedback and select the reasons for the feedback.
  3. Admins can configure the feedback workflow that users go through. Navigate to the agent configuration menu, click User Experience under the Advanced section, and scroll down to Feedback Customization.
  4. In the Feedback Customization section, you have three configs:
    1. Enable Mandatory Feedback: Toggle this if you require your users to provide feedback before submitting a new query.
    2. Show Feedback Dialog for Thumbs Up: On the Contextual UI, we prompt the user to leave details if they submit a thumbs down. With this option enabled, we will also prompt users if they submit a thumbs up.
    3. Customized feedback options for thumbs down: When a user gives a thumbs down, they’ll be prompted to provide additional context by selecting one or more reasons for their feedback. This setting allows you to customize the list of reasons users can choose from. If this section is blank, we will show users a list of default reasons.
    You can click Preview to see how the feedback workflow will look.

Feedback Annotation

After collecting feedback from real users, you can annotate the feedback as part of your quality assurance (QA) workflow.
  1. Before starting your annotation, you will need to configure your annotation labels. Navigate to the agent configuration menu and click on Feedback Annotation under the Advanced section.
  2. There are four annotation columns you can define labels for. The first column is Query Categories. To add a label, click Add Query Category. You can remove a label by clicking the - icon. Finally, you can turn on Intelligent Categorization, which will automatically classify all flagged queries into one of the labels you’ve defined.
  3. Repeat the same process for the other three columns: Error Categories (annotate any issues identified), Response Quality (annotate the overall quality of the response), and Resolution Status (annotate the status of the issue, e.g., open or closed).
  4. You can also rename any of the existing columns. If you’d like to introduce a new column type beyond the four provided by default, simply update one of the column names to match the new category you want to track.
  5. Click Save at the bottom when you are done.
  6. Navigate to the Feedback Annotation module through the agent menu.
  7. To annotate a piece of feedback, simply click on it. The annotation UI will pop up and you can annotate with the custom labels you defined previously.
  8. You can click on citations in the annotation UI to view source chunks.
  9. You can also filter the feedback list to focus on what matters most. Filters can be applied by any column or by the date the feedback was submitted.
  10. There are a few other capabilities in the annotation dashboard:
  • Click Export to CSV to download the annotated data
  • Click the eye icon to select which columns to display
  • Choose which column to sort by
  11. Finally, you can click the Dashboard button at the top to view an auto-generated dashboard that visualizes your annotated feedback results.
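Once you’ve downloaded annotated data via Export to CSV, you can summarize it offline. The sketch below assumes hypothetical column headers that mirror the default annotation columns (Error Categories, Resolution Status); adjust the names to match the labels you configured.

```python
# Sketch: tallying an exported annotation CSV offline.
# The column names used here ("Error Categories", "Resolution Status") are
# assumptions modeled on the default annotation columns -- rename them to
# match your own export.
import csv
from collections import Counter


def summarize_annotations(path):
    """Count annotated feedback rows per error category and resolution status.

    Rows with an empty cell in a column are skipped for that column's tally.
    """
    error_counts = Counter()
    status_counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("Error Categories"):
                error_counts[row["Error Categories"]] += 1
            if row.get("Resolution Status"):
                status_counts[row["Resolution Status"]] += 1
    return error_counts, status_counts
```

A summary like this can help you spot which error categories recur most often and how many flagged issues remain open.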