Karl Ho
School of Economic, Political and Policy Sciences
University of Texas at Dallas
Cohen's Kappa adjusts for the possibility of agreement occurring by chance. It ranges from -1 to 1: values near 1 indicate strong agreement beyond chance, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.
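For reference, Cohen's Kappa is conventionally defined in terms of the observed agreement $p_o$ and the agreement expected by chance $p_e$:

$$ \kappa = \frac{p_o - p_e}{1 - p_e} $$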
Kappa is one of the metrics used to evaluate a model's performance. The Kappa value, along with other metrics such as Accuracy, Sensitivity, and Specificity, can help you better understand the strengths and weaknesses of your model.
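In caret, all of these metrics can be obtained together from a confusion matrix. A minimal sketch, where the `actual` and `predicted` factor vectors are hypothetical placeholders:

```r
library(caret)

# Hypothetical observed and predicted class labels
actual    <- factor(c("yes", "yes", "no", "no", "yes", "no", "yes", "no"))
predicted <- factor(c("yes", "no",  "no", "no", "yes", "yes", "yes", "no"))

# confusionMatrix() reports Accuracy and Kappa in $overall,
# and Sensitivity / Specificity in $byClass
cm <- confusionMatrix(data = predicted, reference = actual)
cm$overall[c("Accuracy", "Kappa")]
cm$byClass[c("Sensitivity", "Specificity")]
```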
Mean Absolute Error (MAE) is a metric for evaluating the performance of regression models. It measures the average of the absolute differences between the predicted values and the actual (ground truth) values, giving a clear, interpretable value for the average magnitude of the prediction errors without considering their direction.

$$ \text{MAE} = \frac{1}{n} \sum_{i=1}^{n} |y_i - \hat{y}_i| $$

where:

- $n$ is the number of observations
- $y_i$ is the actual (ground truth) value for observation $i$
- $\hat{y}_i$ is the predicted value for observation $i$
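caret also provides helpers for regression metrics. A minimal sketch, with made-up numeric vectors (`obs` and `pred` here are hypothetical):

```r
library(caret)

# Hypothetical observed and predicted values from a regression model
obs  <- c(3.2, 4.8, 5.1, 6.0, 7.3)
pred <- c(3.0, 5.0, 4.7, 6.4, 7.0)

# caret's MAE() computes mean(abs(obs - pred))
MAE(pred = pred, obs = obs)

# postResample() returns RMSE, R-squared, and MAE together
postResample(pred = pred, obs = obs)
```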
Max Kuhn: The caret package
In the 2×2 confusion matrix below (rows = predicted class, columns = observed class, following the standard caret layout), cells A and D are the correct predictions, while B and C are the errors:

|                    | Observed Event | Observed No Event |
|--------------------|----------------|-------------------|
| Predicted Event    | A              | B                 |
| Predicted No Event | C              | D                 |
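With that layout, the headline metrics follow directly from the cell counts (standard definitions, assuming "Event" is the positive class):

$$ \text{Accuracy} = \frac{A + D}{A + B + C + D}, \qquad \text{Sensitivity} = \frac{A}{A + C}, \qquad \text{Specificity} = \frac{D}{B + D} $$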