Metadata-Version: 2.1
Name: Cls_Evaluation
Version: 0.0.1
Summary: Basic metrics for evaluating classification results
Home-page: UNKNOWN
Author: Swayanshu Shanti Pragnya
Author-email: swayanshu1997@gmail.com
License: MIT
Keywords: summary,F1 score,accuracy,confusion matrix,FDR
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Education
Classifier: Operating System :: Microsoft :: Windows :: Windows 10
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
License-File: LICENSE.txt

A confusion matrix summarizes the prediction results of a classification problem:
the number of correct and incorrect predictions is tallied with count values and broken down by class.

The function takes two arrays of the same length and returns a list of four counts: TN, FP, FN, and TP.
From the confusion matrix we can calculate precision, recall, F1 score, FDR, and accuracy.
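A minimal sketch of such a function for binary labels (0 = negative, 1 = positive); the name `confusion_matrix` and the return order shown here are illustrative and may differ from the package's actual API:

```python
def confusion_matrix(y_true, y_pred):
    """Count TN, FP, FN, TP from two equal-length arrays of 0/1 labels.

    Illustrative sketch, not necessarily the package's exact function.
    """
    if len(y_true) != len(y_pred):
        raise ValueError("arrays must have the same length")
    tn = fp = fn = tp = 0
    for t, p in zip(y_true, y_pred):
        if t == 1 and p == 1:
            tp += 1          # predicted positive, actually positive
        elif t == 1 and p == 0:
            fn += 1          # predicted negative, actually positive
        elif t == 0 and p == 1:
            fp += 1          # predicted positive, actually negative
        else:
            tn += 1          # predicted negative, actually negative
    return [tn, fp, fn, tp]
```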
- Confusion matrix: counts of true positives, true negatives, false positives, and false negatives.
- Accuracy: the fraction of predictions the classifier gets right.
- F1 score: harmonic mean of precision and recall.
- Precision: what percentage of predicted positives are actually positive?
- Recall: how many actual positives are correctly classified?
- False discovery rate (FDR): what fraction of predicted positives are actually negative?
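The metrics listed above all follow from the four confusion-matrix counts; a sketch of the standard formulas (the function name `classification_metrics` is illustrative, not the package's API):

```python
def classification_metrics(tn, fp, fn, tp):
    """Derive the five metrics from the four confusion-matrix counts.

    Illustrative sketch; assumes at least one predicted positive and
    one actual positive, so no division by zero occurs.
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return {
        "accuracy": (tp + tn) / (tn + fp + fn + tp),
        "precision": precision,
        "recall": recall,
        "f1": 2 * precision * recall / (precision + recall),
        "fdr": fp / (fp + tp),  # FDR = 1 - precision
    }
```

For example, with TN=1, FP=1, FN=1, TP=2, precision and recall are both 2/3, so the F1 score is also 2/3, the FDR is 1/3, and accuracy is 3/5.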

CHANGE LOG
==========

0.0.1 (16/01/2022)
------------------
- First Release

