Viewing AI Validation Failures & Providing Feedback

This guide explains how to identify and understand image validation failures flagged by AI after a workflow has run. Failures can be viewed on the project show page and in the Photo Viewer. If this feature is turned off, users will not see any AI Image Analysis data, even if AI Image Analysis has already been run on their projects. Users can also provide feedback on AI results and view that feedback.


Step 1: Identify Fields with Failed Validations

Once an AI Workflow has run on a project, fields with failed validations display an error message below the photo thumbnails on the project show page. The same error message also appears at the bottom of the photo icons.

FailedValidations_01.png

πŸ“ Example Use Case (with Image Validation)

Imagine you're reviewing a roof inspection project. The field for β€œRoof Condition” is configured with the following requirements:

  • βœ… Image Validation: Roof must be present and detectable in the image 

During review:

You notice two thumbnail images with a red warning icon: 

Clicking one of the images opens it in the viewer, where the right-hand metadata panel displays an error message:

Image Validation Failed: The object 'roof' was not found in the image.

This indicates that the AI-driven image validation could not identify a roof in the image, possibly due to poor angle, obstruction, or the photo being unrelated.

This helps you quickly flag or replace invalid content, ensuring your inspection reports maintain accuracy and completeness.


πŸ‘€ Step 2: Reviewing Failed Photos in the Photo Viewer

When a user opens the Photo Viewer in a completed project:

πŸ”Ή Thumbnail Indicators

  • Any photo that fails validation will display a red triangle icon overlay on its thumbnail. 
  • This helps you quickly spot which images need attention without opening each one.


πŸ–Ό Step 3: Viewing Details of Image Analysis Results

Clicking a photo or video thumbnail opens it in full view.

πŸ“‹ Right Sidebar: AI Metadata

  • In the right-hand column, alongside other AI metadata, you’ll see a labeled section:

Screenshot 2025-04-15 at 12.10.19β€―PM.png
  • This section lists:
    • Category
    • Caption
    • Tags
    • Description
    • Validations: Pass (Green Text) & Fail (Red Text) 
    • Extracted Text

Example of a Failed Validation:
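As a hypothetical sketch, the sidebar data for a photo with a failed validation might be modeled as follows. The field names mirror the list above (Category, Caption, Tags, Description, Validations, Extracted Text), but the exact structure is illustrative, not the product's actual schema.

```python
# Illustrative model of the AI Image Analysis sidebar data.
# Field names and shape are assumptions, not a documented format.
analysis_result = {
    "category": "Roofing",
    "caption": "View of a residential property",
    "tags": ["exterior", "building"],
    "description": "Ground-level photo of a building exterior.",
    "validations": [
        {
            "rule": "Roof must be present and detectable in the image",
            "status": "fail",  # shown in red text in the sidebar
            "message": "The object 'roof' was not found in the image.",
        },
    ],
    "extracted_text": "",
}

def failed_validations(result):
    """Return only the validation entries whose status is 'fail'."""
    return [v for v in result["validations"] if v["status"] == "fail"]

for failure in failed_validations(analysis_result):
    print(failure["message"])
```

Filtering for failed entries like this mirrors the triage workflow: you only need to inspect photos whose validations did not pass.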

βš™οΈ Step 4: Give Feedback on AI Results

Clicking "Give Feedback on AI Results" opens a full-size modal where you can submit your input.

πŸ“ How to Submit Feedback

  1. Select a Category from the dropdown menu:
    • 🏷 Tags
    • πŸ”Ž Extracted Text
    • βœ… Validations
    • πŸ“Œ Other


  2. Enter Your Comments (required)

  • Provide details about the issue or suggestion.
  • Be as specific as possible to help improve AI performance.

  3. Choose one of the following:

  • πŸ’Ύ Save – Submit your feedback.
  • ❌ Cancel – Close the modal without submitting.
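The steps above can be sketched as a small validation routine. The category names come from the dropdown list in step 1, and the rule that comments are required comes from step 2; the function name and payload shape are hypothetical, not a documented API.

```python
# Hypothetical sketch of assembling a feedback submission from the modal.
# VALID_CATEGORIES matches the dropdown; everything else is illustrative.
VALID_CATEGORIES = {"Tags", "Extracted Text", "Validations", "Other"}

def build_feedback(category, comments):
    """Apply the modal's rules: known category, non-empty comments."""
    if category not in VALID_CATEGORIES:
        raise ValueError(f"Unknown category: {category}")
    if not comments.strip():
        raise ValueError("Comments are required")
    return {"category": category, "comments": comments.strip()}

# Specific comments help improve AI performance.
payload = build_feedback(
    "Validations",
    "The roof is visible in the upper-left corner; this should have passed.",
)
```

Choosing Cancel corresponds to discarding the payload; choosing Save corresponds to submitting it.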

πŸ’‘ Why Submit Feedback?

Your feedback helps improve AI accuracy and ensures better results across future projects.


πŸ’‘ Tips

  • Use the validation icons as a triage tool before deeper photo review.
  • You can still manually override or add comments to explain or justify failed photos, if applicable.
  • Make sure your project teams understand what each validation rule is checking.