Model Performance Statistics

Detailed analytics and performance metrics of the waste classification model

Confusion Matrix

Shows the performance of the classification model across all categories

Normalized Confusion Matrix

Normalized view of classification performance, making it easier to compare categories with different sample counts

Metrics Comparison

Comparative view of precision, recall, and F1 score across categories

Precision Score

Measures the accuracy of positive predictions for each category

Recall Score

Measures the ability to find all positive instances of each category

F1 Score

Harmonic mean of precision and recall, giving a single balanced performance metric

Specificity Score

Measures the ability to identify negative instances correctly

About These Metrics

Confusion Matrix

Visualizes the performance of the classification model by comparing predicted and actual classifications. Darker colors indicate higher values.
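
As a minimal sketch of how such a matrix can be built and plotted (assuming scikit-learn and matplotlib; the label arrays and category names below are hypothetical placeholders, not the project's actual data):

```python
# Minimal sketch of building and plotting a confusion matrix with scikit-learn.
# y_true, y_pred, and the category names below are hypothetical placeholders.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix, ConfusionMatrixDisplay

categories = ["cardboard", "glass", "metal", "paper", "plastic", "trash"]
y_true = np.array([0, 1, 2, 2, 3, 4, 5, 1])  # ground-truth class indices
y_pred = np.array([0, 1, 2, 3, 3, 4, 5, 1])  # model predictions

cm = confusion_matrix(y_true, y_pred, labels=list(range(len(categories))))
ConfusionMatrixDisplay(cm, display_labels=categories).plot(cmap="Blues")
plt.show()
```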

Precision & Recall

Precision measures the accuracy of positive predictions, while recall measures the ability to find all positive instances. Both are crucial for model evaluation.
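
As a short sketch, per-class precision and recall can be read off a confusion matrix (the matrix values here are illustrative only; rows are actual classes, columns are predicted classes):

```python
# Minimal sketch: per-class precision and recall derived from a confusion matrix.
import numpy as np

cm = np.array([[50, 3, 2],
               [4, 45, 6],
               [1, 5, 40]])

tp = np.diag(cm)                  # true positives per class
precision = tp / cm.sum(axis=0)   # TP / (TP + FP): a column holds every prediction of that class
recall = tp / cm.sum(axis=1)      # TP / (TP + FN): a row holds every actual instance of that class
print(precision, recall)
```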

F1 Score

The harmonic mean of precision and recall, providing a single metric that balances both concerns. Higher F1 scores indicate better overall performance.
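
For example, given per-class precision and recall values (the numbers here are illustrative only), the F1 score is the harmonic mean of the two:

```python
# Minimal sketch: per-class F1 as the harmonic mean of precision and recall.
import numpy as np

precision = np.array([0.91, 0.85, 0.83])  # illustrative values
recall = np.array([0.88, 0.82, 0.87])     # illustrative values

f1 = 2 * precision * recall / (precision + recall)
print(f1)
```

A class with precision 0.91 and recall 0.88 gets F1 ≈ 0.89; because it is a harmonic mean, a large gap between precision and recall pulls the score down more than an arithmetic mean would.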

Specificity

Measures how well the model identifies negative instances. Important for avoiding false positives in waste classification.
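
Specificity is not included in standard precision/recall reports, but it can be derived per class from the confusion matrix, as in this sketch (illustrative values again):

```python
# Minimal sketch: per-class specificity, TN / (TN + FP), from a confusion matrix.
import numpy as np

cm = np.array([[50, 3, 2],     # rows = actual classes
               [4, 45, 6],     # columns = predicted classes
               [1, 5, 40]])    # values are illustrative only

tp = np.diag(cm)
fp = cm.sum(axis=0) - tp        # predicted as this class but actually another class
fn = cm.sum(axis=1) - tp        # actually this class but predicted as another class
tn = cm.sum() - (tp + fp + fn)  # samples that involve this class in neither role
specificity = tn / (tn + fp)
print(specificity)
```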