
2. Precision

Trust in YES

Definition

When the model predicts Positive (➕), what percentage of the time is it actually correct?
Or: every time the model says "this one is the thief", what percentage of the time is it right?

To find this, we first need to know how many times the model said Positive (➕) in its responses.

Answer: 2 times

Model Response

The model's prediction is read from the second letter (the P in TP and FP).

How?

  1. TP - once it says Positive here, and that prediction is correct
  2. FP - and once it says Positive here, but that prediction is wrong

Now we calculate what percentage of those Positive predictions were correct.

Model Response

Whether the model's prediction was correct is read from the first letter (T for true, F for false).

TP - correct this many times; FP - wrong this many times

\[ Precision = \frac{TP}{TP + FP} \]
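A minimal sketch of this calculation in Python. The labels below are made up for illustration and mirror the example above: the model says Positive twice, correct once (TP = 1) and wrong once (FP = 1).

```python
def precision(y_true, y_pred, positive=1):
    """Precision = TP / (TP + FP).

    TP: model said Positive and was right.
    FP: model said Positive but was wrong.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t != positive)
    return tp / (tp + fp) if (tp + fp) else 0.0

# Two Positive predictions: one TP, one FP → precision = 1 / (1 + 1)
y_true = [1, 0, 0, 1]
y_pred = [1, 1, 0, 0]
print(precision(y_true, y_pred))  # → 0.5
```

Note the guard for the case where the model never says Positive (TP + FP = 0), where the ratio would otherwise divide by zero.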

TL;DR

High precision → "the model rarely calls a Negative a Positive"
FP must be low for precision to be high.

  • Low FP means the model rarely labeled Negatives as Positive
  • FP means: the model called a Negative a Positive

Note

High precision matters most where mistakenly calling a Negative a Positive (FP) is costly.
EX: Spam detection (Negative means not spam). If the model labels a legitimate (Negative) email as spam (FP), a real email gets buried in the spam folder, and that is not good.
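To make the spam example concrete, here is a tiny worked calculation. The counts (90 true spam, 10 legitimate emails flagged) are hypothetical, chosen only for illustration:

```python
# Hypothetical spam-filter counts (invented for illustration):
# the filter flags 100 emails as spam; 90 really are spam (TP),
# 10 are legitimate emails wrongly flagged (FP).
tp, fp = 90, 10
precision = tp / (tp + fp)
print(precision)  # → 0.9
# Every FP here is a real email the user never sees, which is why
# spam filtering wants FP low, i.e. precision high.
```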

“Precision tells us how many predicted positives are actually positive, so it focuses on reducing false positives.”

Precision is like a guard who stops the wrong person from getting access to the vault; that doesn't mean it should block the right person too.

Examples

When a model says Positive, someone usually takes action:

  • Mark email as spam
  • Block a transaction
  • Unlock a phone
  • Raise an alert

Precision answers:

“Can I trust the model when it says YES?”