---
title: FBeta_Score
tags:
- evaluate
- metric
description: Calculate FBeta_Score
sdk: gradio
sdk_version: 3.50.0
app_file: app.py
pinned: false
---

# Metric Card for FBeta_Score

## Metric Description
Computes the F-beta score: the weighted harmonic mean of precision and recall, reaching its optimal value at 1 and its worst value at 0. The `beta` parameter determines the weight of recall in the combined score: `beta < 1` lends more weight to precision, while `beta > 1` favors recall (`beta -> 0` considers only precision, `beta -> +inf` only recall).

Note: `beta` defaults to 1.5, which computes the frequently used FBeta 1.5 score. Set a different `beta` value according to your needs.
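
For reference, the weighted harmonic mean described above corresponds to the standard F-beta formula (with precision and recall computed for the positive class):

$$
F_\beta = (1 + \beta^2) \cdot \frac{\text{precision} \cdot \text{recall}}{\beta^2 \cdot \text{precision} + \text{recall}}
$$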

## How to Use
```python
import evaluate

fbeta_score = evaluate.load("leslyarun/fbeta_score")
results = fbeta_score.compute(references=[0, 1], predictions=[0, 1], beta=1.5)
print(results)
# {'f_beta_score': 1.0}
```
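
Since the card cites scikit-learn and the Further References link points to `sklearn.metrics.fbeta_score`, you can cross-check results against scikit-learn directly. A minimal sketch, assuming binary labels and scikit-learn's default `average='binary'` setting:

```python
from sklearn.metrics import fbeta_score

references = [0, 1, 1, 0]   # ground-truth labels
predictions = [0, 1, 0, 0]  # model predictions

# beta=1.5 weights recall more heavily than precision
score = fbeta_score(references, predictions, beta=1.5)
print(score)
```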

## Citation
@article{scikit-learn,
    title={Scikit-learn: Machine Learning in {P}ython},
    author={Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V.
           and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P.
           and Weiss, R. and Dubourg, V. and Vanderplas, J. and Passos, A. and
           Cournapeau, D. and Brucher, M. and Perrot, M. and Duchesnay, E.},
    journal={Journal of Machine Learning Research},
    volume={12},
    pages={2825--2830},
    year={2011}
}

## Further References
https://scikit-learn.org/stable/modules/generated/sklearn.metrics.fbeta_score.html#sklearn.metrics.fbeta_score