# Probability of Gaussian Naive Bayes

Hi,

How would I go about attaching a probability to the prediction output by a Gaussian Naive Bayes model?
I'm asking because the `predict_proba` function you can use with sklearn's Gaussian Naive Bayes classifier only gives me zeros and ones, which is not what I want.

Can you clarify what outcome you were expecting?

That's cruel man, you know the best finger painting stores are closed during lockdown. So listen here, Delores, the problem I'm facing is this: say I've built a classifier that can distinguish between four classes [green, blue, yellow, red], just to stay in the finger painting theme. How can I know the likelihood of each class for a given item/sample? Using the `predict_proba` function I get the outcome [0, 0, 0, 1]. I'm guessing the function just returns a 1 for the class with the highest likelihood, which is not what I'm looking for; I want something that returns values like [0.63, 0.50, 0.20, 0.99]. I hope it's clear what I'm trying to accomplish.

I was expecting an outcome along the lines of [0.13, 0.90, 0.50] for a multiclass classifier, not [0, 1, 0]. It seems the function just returns a 1 for the class with the highest probability/likelihood. I'm looking for a calculation/function that returns how certain the classifier is about each of the classes.
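To make this concrete, here's a minimal sketch on a synthetic 4-class dataset (toy data, not my real setup). For what it's worth, `predict_proba` does return proper posteriors whose rows sum to 1; when the output saturates to hard 0s and 1s, it's usually because the class log-likelihoods differ by many orders of magnitude, which `predict_log_proba` makes visible:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB

# Synthetic 4-class problem standing in for [green, blue, yellow, red]
X, y = make_classification(
    n_samples=400, n_features=6, n_informative=4, n_redundant=0,
    n_classes=4, n_clusters_per_class=1, random_state=0,
)

clf = GaussianNB().fit(X, y)

proba = clf.predict_proba(X[:5])          # each row sums to 1 across the 4 classes
log_proba = clf.predict_log_proba(X[:5])  # log-posteriors; avoids float underflow

print(proba.round(3))
print(log_proba.round(1))
```

If `proba` is stuck at exactly 0/1, the log-posteriors will show huge gaps between classes; rescaling the features (e.g. `StandardScaler`) or inspecting the log values directly is often more informative than the rounded probabilities.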

Log-likelihood values cannot be used alone as an index of fit because they are a function of sample size, but they can be used to compare the fit of different coefficients. Does this make sense to you?
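To illustrate the sample-size point, here's a quick plain-NumPy sketch (an illustrative Gaussian fit by maximum likelihood, nothing to do with your actual model): the total log-likelihood grows in magnitude with the number of samples, so only per-sample (or same-data) comparisons are meaningful.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_loglik(x):
    # Total log-likelihood of x under a Gaussian with the sample mean/std (MLE fit)
    mu, sigma = x.mean(), x.std()
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu) ** 2 / (2 * sigma**2))

small = rng.normal(size=100)
large = rng.normal(size=1000)

# Total log-likelihood scales with sample size, even for the same distribution...
print(gaussian_loglik(small), gaussian_loglik(large))
# ...while the per-sample (mean) log-likelihood stays comparable
print(gaussian_loglik(small) / 100, gaussian_loglik(large) / 1000)
```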

Not even slightly. Sounds pretty secretive though. Best of luck to you.