Aival allows you to simply and quickly evaluate AI products in terms of performance, fairness, and explainability.
What to Expect
The product has six or seven main pages, depending on the type of task the AI product you are evaluating performs: classification or segmentation. After setting up, you can access these via the sidebar or header navigation.
Summary: Explore an overview of the AI product's performance, including fairness insights that may indicate potential bias.
Performance: Take a closer look at how the AI product performed for each studied group, using metrics such as specificity, sensitivity, F1 score, and false positive rate.
Fairness: Compare metrics across studied groups to check whether performance is unequally distributed for certain groups. This is key to determining whether the AI product is fair.
Image Explorer: Investigate specific images in detail with a DICOM and NIfTI viewer.
Explainability Explorer: If this module is set up, you can investigate further what areas in an image were significant in the AI product's decision.
Configuration: Change aspects of your project, such as studied groups, non-indicated groups, and fairness thresholds. You can also reset the project from here.
PDF Report: Export your project's information in a printable, easy-to-read report.
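As an illustration of the metrics named above, here is a minimal sketch of how they are typically computed for a binary classifier. This is illustrative only: the function name and the label convention (1 = positive class) are assumptions, not Aival's actual implementation.

```python
# Illustrative sketch only -- not Aival's code. Assumes binary labels
# where 1 marks the positive class.

def classification_metrics(y_true, y_pred):
    """Compute sensitivity, specificity, false positive rate, and F1 score."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

    sensitivity = tp / (tp + fn) if tp + fn else 0.0  # true positive rate
    specificity = tn / (tn + fp) if tn + fp else 0.0  # true negative rate
    fpr = fp / (fp + tn) if fp + tn else 0.0          # false positive rate
    precision = tp / (tp + fp) if tp + fp else 0.0
    f1 = (2 * precision * sensitivity / (precision + sensitivity)
          if precision + sensitivity else 0.0)

    return {"sensitivity": sensitivity, "specificity": specificity,
            "false_positive_rate": fpr, "f1": f1}
```

Comparing such metrics between studied groups (for example, the difference in sensitivity between two groups) is the kind of check the Fairness page's thresholds are built around.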
Segmentation
When investigating segmentation AI products, Aival offers all the same features and pages as for classification products, except the Explainability Explorer.
🏗️ If you are getting set up, start here.
🗺️ If you want a more detailed walkthrough of the features and pages of the product, start here.
💭 If you have any questions or comments, please email us at email@example.com.