Implement precision and recall metrics for classification evaluation (#49671)

Author: Przemysław Witek (committed via GitHub)
Date:   2019-12-19 16:07:09 +01:00
Commit: 786ead630a (parent efdba2b347)
54 changed files with 2489 additions and 374 deletions


@@ -53,7 +53,9 @@ include-tagged::{doc-tests-file}[{api}-evaluation-classification]
 <3> Name of the field in the index. Its value denotes the predicted (as per some ML algorithm) class of the example.
 <4> The remaining parameters are the metrics to be calculated based on the two fields described above
 <5> Accuracy
-<6> Multiclass confusion matrix of size 3
+<6> Precision
+<7> Recall
+<8> Multiclass confusion matrix of size 3
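For context, a minimal sketch of what the extended evaluation request might look like in the high-level REST client. It assumes the `PrecisionMetric` and `RecallMetric` classes this commit adds alongside the existing `AccuracyMetric` and `MulticlassConfusionMatrixMetric`; the index and field names are placeholders, and the exact constructor signatures may differ between client versions:

[source,java]
----
import org.elasticsearch.client.ml.EvaluateDataFrameRequest;
import org.elasticsearch.client.ml.dataframe.evaluation.classification.AccuracyMetric;
import org.elasticsearch.client.ml.dataframe.evaluation.classification.Classification;
import org.elasticsearch.client.ml.dataframe.evaluation.classification.MulticlassConfusionMatrixMetric;
import org.elasticsearch.client.ml.dataframe.evaluation.classification.PrecisionMetric;
import org.elasticsearch.client.ml.dataframe.evaluation.classification.RecallMetric;

// Evaluate predicted vs. actual classes stored in the same index.
Classification classification = new Classification(
    "actual_class",                          // field holding the true class (placeholder name)
    "predicted_class",                       // field holding the class predicted by the ML algorithm
    new AccuracyMetric(),                    // <5> Accuracy
    new PrecisionMetric(),                   // <6> Precision (added by this commit)
    new RecallMetric(),                      // <7> Recall (added by this commit)
    new MulticlassConfusionMatrixMetric(3)); // <8> Multiclass confusion matrix of size 3

EvaluateDataFrameRequest request =
    new EvaluateDataFrameRequest("my-index", null, classification); // null: no query filter
----

The metrics are a varargs list mirroring callouts <5> through <8>: any subset can be requested, and the response returns results keyed by metric name.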
 ===== Regression
@@ -104,9 +106,13 @@ include-tagged::{doc-tests-file}[{api}-results-classification]
 <1> Fetching accuracy metric by name
 <2> Fetching the actual accuracy value
-<3> Fetching multiclass confusion matrix metric by name
-<4> Fetching the contents of the confusion matrix
-<5> Fetching the number of classes that were not included in the matrix
+<3> Fetching precision metric by name
+<4> Fetching the actual precision value
+<5> Fetching recall metric by name
+<6> Fetching the actual recall value
+<7> Fetching multiclass confusion matrix metric by name
+<8> Fetching the contents of the confusion matrix
+<9> Fetching the number of classes that were not included in the matrix
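A corresponding sketch of reading the new results back, continuing from the `request` above (`client` is assumed to be an initialized `RestHighLevelClient`; the accessor and nested type names are best-effort assumptions, not verbatim from this commit):

[source,java]
----
EvaluateDataFrameResponse response =
    client.machineLearning().evaluateDataFrame(request, RequestOptions.DEFAULT);

// Each metric result is looked up by the metric's registered name.
AccuracyMetric.Result accuracyResult = response.getMetricByName(AccuracyMetric.NAME);
double accuracy = accuracyResult.getValue();                                // <2>

PrecisionMetric.Result precisionResult = response.getMetricByName(PrecisionMetric.NAME);
double precision = precisionResult.getValue();                              // <4>

RecallMetric.Result recallResult = response.getMetricByName(RecallMetric.NAME);
double recall = recallResult.getValue();                                    // <6>

MulticlassConfusionMatrixMetric.Result matrixResult =
    response.getMetricByName(MulticlassConfusionMatrixMetric.NAME);
// Row type and accessor names below are assumptions based on the callouts.
List<MulticlassConfusionMatrixMetric.ActualClass> confusionMatrix =
    matrixResult.getConfusionMatrix();                                      // <8>
long otherClassCount = matrixResult.getOtherActualClassCount();             // <9>
----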
 ===== Regression
@@ -118,4 +124,4 @@ include-tagged::{doc-tests-file}[{api}-results-regression]
 <1> Fetching mean squared error metric by name
 <2> Fetching the actual mean squared error value
 <3> Fetching R squared metric by name
-<4> Fetching the actual R squared value
\ No newline at end of file
+<4> Fetching the actual R squared value
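The regression path, unchanged by this commit apart from the trailing newline, follows the same fetch-by-name pattern. A sketch continuing from the `response` above; treat the exact accessor names (`getError()`, `getValue()`) as assumptions:

[source,java]
----
MeanSquaredErrorMetric.Result mseResult =
    response.getMetricByName(MeanSquaredErrorMetric.NAME);  // <1>
double meanSquaredError = mseResult.getError();             // <2> accessor name is an assumption

RSquaredMetric.Result rSquaredResult =
    response.getMetricByName(RSquaredMetric.NAME);          // <3>
double rSquared = rSquaredResult.getValue();                // <4>
----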