Confusion Matrix
Shows the performance of the classification model across all categories

Normalized Confusion Matrix
Classification performance normalized per class, making categories of different sizes easier to compare

Metrics Comparison
Comparative view of precision, recall, and F1-score across categories

Precision Score
Measures the accuracy of positive predictions for each category

Recall Score
Measures the ability to find all actual positive instances of each category

F1 Score
Harmonic mean of precision and recall, giving a single balanced performance metric

Specificity Score
Measures the ability to identify negative instances correctly

About These Metrics
Confusion Matrix
Visualizes the performance of the classification algorithm by comparing predicted vs actual classifications. Darker colors indicate higher values.
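The matrix behind this visualization can be sketched in plain Python. The category names and example labels below are hypothetical, for illustration only:

```python
from collections import Counter

def confusion_matrix(actual, predicted, labels):
    """Count (actual, predicted) pairs into a labels x labels grid.
    Rows are actual classes, columns are predicted classes."""
    counts = Counter(zip(actual, predicted))
    return [[counts[(a, p)] for p in labels] for a in labels]

# Hypothetical waste categories, used only to illustrate the layout.
labels    = ["plastic", "paper", "glass"]
actual    = ["plastic", "paper", "glass", "plastic", "paper"]
predicted = ["plastic", "paper", "paper", "plastic", "glass"]

cm = confusion_matrix(actual, predicted, labels)
# Diagonal cells are correct predictions; off-diagonal cells are errors,
# which is why darker diagonal colors indicate better performance.
```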
Precision & Recall
Precision measures accuracy of positive predictions, while recall measures the ability to find all positive instances. Both are crucial for model evaluation.
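Both metrics fall out of the confusion matrix directly. A minimal sketch, assuming rows are actual classes and columns are predicted classes (the counts below are made up):

```python
def precision_recall(cm, class_index):
    """Per-class precision and recall from a confusion matrix
    whose rows are actual classes and columns are predicted classes."""
    tp = cm[class_index][class_index]
    predicted_positive = sum(row[class_index] for row in cm)  # column sum
    actual_positive = sum(cm[class_index])                    # row sum
    precision = tp / predicted_positive if predicted_positive else 0.0
    recall = tp / actual_positive if actual_positive else 0.0
    return precision, recall

cm = [[50, 10], [5, 35]]  # illustrative two-class counts
p, r = precision_recall(cm, 0)  # precision = 50/55, recall = 50/60
```

Precision divides true positives by everything predicted as the class (the column); recall divides by everything that actually was the class (the row).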
F1 Score
The harmonic mean of precision and recall, providing a single metric that balances both concerns. Higher F1 scores indicate better overall performance.
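The harmonic mean can be written as a one-line helper; the guard for the zero case is a common convention, not something the dashboard specifies:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall; defined as 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

Because it is a harmonic mean, F1 is pulled toward the smaller of the two inputs, so a model cannot score well by excelling at only one of precision or recall.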
Specificity
Measures how well the model identifies negative instances. Important for avoiding false positives in waste classification.
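For a multi-class matrix, specificity is computed one-vs-rest: everything off the class's row and column counts as a true negative. A sketch under the same row/column convention as above:

```python
def specificity(cm, class_index):
    """TN / (TN + FP) for one class, one-vs-rest, from a confusion
    matrix whose rows are actual classes and columns are predicted."""
    total = sum(sum(row) for row in cm)
    tp = cm[class_index][class_index]
    fp = sum(row[class_index] for row in cm) - tp  # column minus diagonal
    fn = sum(cm[class_index]) - tp                 # row minus diagonal
    tn = total - tp - fp - fn
    return tn / (tn + fp) if (tn + fp) else 0.0

cm = [[50, 10], [5, 35]]  # illustrative counts
s = specificity(cm, 0)    # 35 / (35 + 5)
```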