F Score
- Measures classification performance by combining precision and recall into a single score.
- Uses a parameter beta to weight the relative importance of precision versus recall (beta = 1 gives equal weight).
- Commonly reported per class and can be aggregated to summarize overall model performance.
Definition
F-score is a metric used to evaluate the performance of a model in classification tasks. It is a weighted harmonic mean of precision and recall. Precision is the number of true positives divided by the sum of true positives and false positives, and recall is the number of true positives divided by the sum of true positives and false negatives. The F-score is calculated using the following formula:

$$F_\beta = (1 + \beta^2) \cdot \frac{\text{precision} \cdot \text{recall}}{\beta^2 \cdot \text{precision} + \text{recall}}$$

Precision and recall (as defined in the source) can be written:

$$\text{precision} = \frac{TP}{TP + FP}, \qquad \text{recall} = \frac{TP}{TP + FN}$$
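A minimal Python sketch of these definitions; the function names (`precision`, `recall`, `f_beta`) and the example counts are illustrative assumptions, not from the source:

```python
def precision(tp: int, fp: int) -> float:
    """True positives divided by all positive predictions (TP + FP)."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """True positives divided by all actual positives (TP + FN)."""
    return tp / (tp + fn)

def f_beta(prec: float, rec: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall; beta weights recall."""
    b2 = beta ** 2
    return (1 + b2) * prec * rec / (b2 * prec + rec)

# Illustrative counts: 40 true positives, 10 false positives, 20 false negatives.
p, r = precision(40, 10), recall(40, 20)
print(f_beta(p, r))  # F1 ≈ 0.73
```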
Explanation
- The parameter beta controls the relative weight of recall versus precision. When beta = 1, precision and recall are weighted equally.
- By combining precision and recall, the F-score provides a single-number summary that accounts for both false positives and false negatives.
- Different beta values shift importance toward precision (beta < 1) or recall (beta > 1) according to application requirements, as illustrated in the sketch below.
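A small illustration of this effect, using assumed numbers (not from the source): a classifier with high precision but low recall is rewarded by beta < 1 and penalized by beta > 1.

```python
# Assumed values: a precision-heavy classifier.
prec, rec = 0.90, 0.50

for beta in (0.5, 1.0, 2.0):
    b2 = beta ** 2
    f = (1 + b2) * prec * rec / (b2 * prec + rec)
    print(f"beta={beta}: F={f:.2f}")
# beta=0.5 -> 0.78 (leans on precision), 1.0 -> 0.64, 2.0 -> 0.55 (leans on recall)
```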
Examples
Cats and dogs classification
Confusion matrix (as given in the source):
| | Actual Cat | Actual Dog |
|---|---|---|
| Predicted Cat | 40 | 10 |
| Predicted Dog | 20 | 30 |
From this confusion matrix, precision and recall for each class are (reproduced in the sketch after this list):
- Precision for cats: 40 / (40 + 10) = 0.80
- Recall for cats: 40 / (40 + 20) = 0.67
- Precision for dogs: 30 / (30 + 20) = 0.60
- Recall for dogs: 30 / (30 + 10) = 0.75
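A minimal Python sketch that recovers these per-class values from the confusion matrix above (the dictionary layout and variable names are illustrative, not from the source); rows are predicted classes and columns are actual classes, matching the table:

```python
confusion = {
    "cat": {"cat": 40, "dog": 10},  # predicted cat: 40 actual cats, 10 actual dogs
    "dog": {"cat": 20, "dog": 30},  # predicted dog: 20 actual cats, 30 actual dogs
}

for cls in ("cat", "dog"):
    true_positives = confusion[cls][cls]
    predicted_total = sum(confusion[cls].values())               # TP + FP
    actual_total = sum(row[cls] for row in confusion.values())   # TP + FN
    precision = true_positives / predicted_total
    recall = true_positives / actual_total
    print(f"{cls}: precision={precision:.2f}, recall={recall:.2f}")
```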
Using beta = 1 in the F-score formula, the per-class scores work out to (see the sketch after this example for the full calculation):

- F-score for cats: 2 × (0.80 × 0.67) / (0.80 + 0.67) ≈ 0.73
- F-score for dogs: 2 × (0.60 × 0.75) / (0.60 + 0.75) ≈ 0.67
The source reports the overall F-score for this model as 0.71.
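A follow-on sketch (again illustrative, not from the source) that plugs the rounded per-class precision and recall into the formula with beta = 1 and averages across classes; with these rounded inputs the macro average lands near 0.70, with small differences from the reported figure coming from rounding.

```python
def f_score(precision: float, recall: float, beta: float = 1.0) -> float:
    """Weighted harmonic mean of precision and recall."""
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

per_class = {"cat": (0.80, 0.67), "dog": (0.60, 0.75)}  # (precision, recall)

scores = {cls: f_score(p, r) for cls, (p, r) in per_class.items()}
print(scores)  # cat ≈ 0.73, dog ≈ 0.67

macro = sum(scores.values()) / len(scores)
print(f"macro-averaged F1 ≈ {macro:.2f}")  # ≈ 0.70 with these rounded inputs
```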
Use cases
- Use beta to emphasize precision or recall depending on the specific requirements of an application (the source notes that beta is often set to 1 but may be changed when more weight should be given to precision or recall); see the usage sketch below.
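In practice, an off-the-shelf implementation is usually preferable to a hand-rolled formula. A usage sketch with scikit-learn's `fbeta_score` (assuming scikit-learn is installed; the label arrays below recreate the cats-and-dogs counts from the example above):

```python
from sklearn.metrics import fbeta_score

# Rebuild (y_true, y_pred) pairs from the confusion matrix counts:
# 40 cats predicted as cat, 20 cats predicted as dog,
# 10 dogs predicted as cat, 30 dogs predicted as dog.
y_true = ["cat"] * 40 + ["cat"] * 20 + ["dog"] * 10 + ["dog"] * 30
y_pred = ["cat"] * 40 + ["dog"] * 20 + ["cat"] * 10 + ["dog"] * 30

# beta < 1 favours precision, beta > 1 favours recall.
for beta in (0.5, 1.0, 2.0):
    score = fbeta_score(y_true, y_pred, beta=beta, pos_label="cat", average="binary")
    print(f"F{beta} for the 'cat' class: {score:.2f}")
```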
Related terms
- Precision
- Recall
- Confusion matrix