Naive Bayes classifier
From Wikipedia, the free encyclopedia
In machine learning, naive Bayes classifiers are a family of simple probabilistic classifiers based on applying Bayes' theorem with strong (naive) independence assumptions between the features.

Naive Bayes has been studied extensively since the 1950s. It was introduced under a different name into the text retrieval community in the early 1960s,[1]:488 and remains a popular (baseline) method for text categorization, the problem of judging documents as belonging to one category or the other (such as spam or legitimate, sports or politics, etc.) with word frequencies as the features. With appropriate preprocessing, it is competitive in this domain with more advanced methods including support vector machines.[2] It also finds application in automatic medical diagnosis.[3]

Naive Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem. Maximum-likelihood training can be done by evaluating a closed-form expression,[1]:718 which takes linear time, rather than by expensive iterative approximation as used for many other types of classifiers.

In the statistics and computer science literature, naive Bayes models are known under a variety of names, including simple Bayes and independence Bayes.[4] All these names reference the use of Bayes' theorem in the classifier's decision rule, but naive Bayes is not (necessarily) a Bayesian method.[4] Russell and Norvig note that "[naive Bayes] is sometimes called a Bayesian classifier, a somewhat careless usage that has prompted some Bayesians to call it the idiot Bayes model."[1]:482
Contents

1 Introduction
2 Probabilistic model
  2.1 Constructing a classifier from the probability model
3 Parameter estimation and event models
  3.1 Gaussian naive Bayes
  3.2 Multinomial naive Bayes
  3.3 Bernoulli naive Bayes
  3.4 Semi-supervised parameter estimation
4 Discussion
  4.1 Relation to logistic regression
5 Examples
  5.1 Gender classification
    5.1.1 Training
    5.1.2 Testing
  5.2 Document classification
6 See also
7 References
  7.1 Further reading
8 External links
Introduction
Naive Bayes is a simple technique for constructing classifiers: models that assign class labels to problem instances, represented as vectors of feature values, where the class labels are drawn from some finite set. It is not a single algorithm for training such classifiers, but a family of algorithms based on a common principle: all naive Bayes classifiers assume that the value of a particular feature is independent of the value of any other feature, given the class variable. For example, a fruit may be considered to be an apple if it is red, round, and about 10 cm in diameter. A naive Bayes classifier considers each of these features to contribute independently to the probability that this fruit is an apple, regardless of any possible correlations between the color, roundness and diameter features.

For some types of probability models, naive Bayes classifiers can be trained very efficiently in a supervised learning setting. In many practical applications, parameter estimation for naive Bayes models uses the method of maximum likelihood; in other words, one can work with the naive Bayes model without accepting Bayesian probability or using any Bayesian methods.

Despite their naive design and apparently oversimplified assumptions, naive Bayes classifiers have worked quite well in many complex real-world situations. In 2004, an analysis of the Bayesian classification problem showed that there are sound theoretical reasons for the apparently implausible efficacy of naive Bayes classifiers.[5] Still, a comprehensive comparison with other classification algorithms in 2006 showed that Bayes classification is outperformed by other approaches, such as boosted trees or random forests.[6]

An advantage of naive Bayes is that it only requires a small amount of training data to estimate the parameters necessary for classification.
Probabilistic model

Abstractly, naive Bayes is a conditional probability model: given a problem instance to be classified, represented by a vector \mathbf{x} = (x_1, \ldots, x_n) representing some n features (independent variables), it assigns to this instance probabilities

    p(C_k \mid x_1, \ldots, x_n)

for each of K possible outcomes or classes C_k.[7]

The problem with the above formulation is that if the number of features n is large or if a feature can take on a large number of values, then basing such a model on probability tables is infeasible. We therefore reformulate the model to make it more tractable. Using Bayes' theorem, the conditional probability can be decomposed as

    p(C_k \mid \mathbf{x}) = \frac{p(C_k)\, p(\mathbf{x} \mid C_k)}{p(\mathbf{x})}

In plain English, using Bayesian probability terminology, the above equation can be written as

    posterior = (prior x likelihood) / evidence

In practice, there is interest only in the numerator of that fraction, because the denominator does not depend on C and the values of the features x_i are given, so that the denominator is effectively constant. The numerator is equivalent to the joint probability model

    p(C_k, x_1, \ldots, x_n)
which can be rewritten as follows, using the chain rule for repeated applications of the definition of conditional probability:

    p(C_k, x_1, \ldots, x_n) = p(x_1 \mid x_2, \ldots, x_n, C_k)\, p(x_2 \mid x_3, \ldots, x_n, C_k) \cdots p(x_{n-1} \mid x_n, C_k)\, p(x_n \mid C_k)\, p(C_k)

Now the "naive" conditional independence assumptions come into play: assume that each feature x_i is conditionally independent of every other feature x_j for j \neq i, given the category C_k. This means that

    p(x_i \mid x_{i+1}, \ldots, x_n, C_k) = p(x_i \mid C_k).

Thus, the joint model can be expressed as

    p(C_k, x_1, \ldots, x_n) = p(C_k) \prod_{i=1}^{n} p(x_i \mid C_k)

This means that under the above independence assumptions, the conditional distribution over the class variable C is:

    p(C_k \mid x_1, \ldots, x_n) = \frac{1}{Z}\, p(C_k) \prod_{i=1}^{n} p(x_i \mid C_k)

where the evidence Z = p(\mathbf{x}) = \sum_k p(C_k)\, p(\mathbf{x} \mid C_k) is a scaling factor dependent only on x_1, \ldots, x_n, that is, a constant if the values of the feature variables are known.
Constructing a classifier from the probability model

The discussion so far has derived the independent feature model, that is, the naive Bayes probability model. The naive Bayes classifier combines this model with a decision rule. One common rule is to pick the hypothesis that is most probable; this is known as the maximum a posteriori or MAP decision rule. The corresponding classifier, a Bayes classifier, is the function that assigns a class label \hat{y} = C_k for some k as follows:

    \hat{y} = \underset{k \in \{1, \ldots, K\}}{\operatorname{argmax}}\; p(C_k) \prod_{i=1}^{n} p(x_i \mid C_k)
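As a minimal illustration (not part of the original article), the MAP rule can be implemented directly, working in log space so the product over many features does not underflow; the data structures here are hypothetical:

```python
import math

def map_predict(priors, likelihoods, x):
    """MAP decision rule: argmax over k of p(C_k) * prod_i p(x_i | C_k).

    priors: dict mapping class -> p(C_k).
    likelihoods: dict mapping class -> list of per-feature functions,
    each returning p(x_i | C_k). Sums of logs replace the raw product
    to avoid floating-point underflow when there are many features.
    """
    best_class, best_score = None, -math.inf
    for c, prior in priors.items():
        score = math.log(prior) + sum(
            math.log(p(xi)) for p, xi in zip(likelihoods[c], x))
        if score > best_score:
            best_class, best_score = c, score
    return best_class
```

With equal priors, this simply picks the class whose likelihood product for the observed features is largest.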
Parameter estimation and event models

A class's prior may be calculated by assuming equiprobable classes (i.e., priors = 1/(number of classes)), or by calculating an estimate for the class probability from the training set (i.e., (prior for a given class) = (number of samples in the class) / (total number of samples)). To estimate the parameters for a feature's distribution, one must assume a distribution or generate nonparametric models for the features from the training set.[8]
The assumptions on distributions of features are called the event model of the naive Bayes classifier. For discrete features like the ones encountered in document classification (including spam filtering), multinomial and Bernoulli distributions are popular. These assumptions lead to two distinct models, which are often confused.[9][10]
Gaussian naive Bayes

When dealing with continuous data, a typical assumption is that the continuous values associated with each class are distributed according to a Gaussian distribution. For example, suppose the training data contain a continuous attribute, x. We first segment the data by the class, and then compute the mean and variance of x in each class. Let \mu_c be the mean of the values in x associated with class c, and let \sigma_c^2 be the variance of the values in x associated with class c. Suppose we have collected some observation value v. Then, the probability distribution of v given a class c, p(x = v \mid c), can be computed by plugging v into the equation for a Normal distribution parameterized by \mu_c and \sigma_c^2. That is,

    p(x = v \mid c) = \frac{1}{\sqrt{2\pi\sigma_c^2}}\, e^{-\frac{(v - \mu_c)^2}{2\sigma_c^2}}

Another common technique for handling continuous values is to use binning to discretize the feature values, to obtain a new set of Bernoulli-distributed features; some literature in fact suggests that this is necessary to apply naive Bayes, but it is not, and the discretization may throw away discriminative information.[4]
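The class-conditional density above is just the Normal probability density function; a small helper (an illustration, not code from the article) makes the later worked example easy to reproduce:

```python
import math

def gaussian_pdf(v, mean, var):
    """Normal density with the given mean and variance, evaluated at v."""
    return math.exp(-(v - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
```

Note that, being a density, its value can exceed 1 when the variance is small.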
Multinomial naive Bayes

With a multinomial event model, samples (feature vectors) represent the frequencies with which certain events have been generated by a multinomial (p_1, \ldots, p_n) where p_i is the probability that event i occurs (or K such multinomials in the multiclass case). A feature vector \mathbf{x} = (x_1, \ldots, x_n) is then a histogram, with x_i counting the number of times event i was observed in a particular instance. This is the event model typically used for document classification, with events representing the occurrence of a word in a single document (see bag of words assumption). The likelihood of observing a histogram x is given by

    p(\mathbf{x} \mid C_k) = \frac{(\sum_i x_i)!}{\prod_i x_i!} \prod_i p_{ki}^{x_i}

The multinomial naive Bayes classifier becomes a linear classifier when expressed in log-space:[2]

    \log p(C_k \mid \mathbf{x}) \propto \log\left( p(C_k) \prod_{i=1}^{n} p_{ki}^{x_i} \right) = \log p(C_k) + \sum_{i=1}^{n} x_i \log p_{ki} = b + \mathbf{w}_k^\top \mathbf{x}

where b = \log p(C_k) and w_{ki} = \log p_{ki}.
If a given class and feature value never occur together in the training data, then the frequency-based probability estimate will be zero. This is problematic because it will wipe out all information in the other probabilities when they are multiplied. Therefore, it is often desirable to incorporate a small-sample correction, called pseudocount, in all probability estimates such that no probability is ever set to be exactly zero. This way of regularizing naive Bayes is called Laplace smoothing when the pseudocount is one, and Lidstone smoothing in the general case.

Rennie et al. discuss problems with the multinomial assumption in the context of document classification and possible ways to alleviate those problems, including the use of tf-idf weights instead of raw term frequencies and document-length normalization, to produce a naive Bayes classifier that is competitive with support vector machines.[2]
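Putting the multinomial event model and the pseudocount correction together, a compact training and classification sketch might look as follows (function names and the tiny corpus are hypothetical, not from the article; alpha=1.0 gives Laplace smoothing, other values Lidstone smoothing):

```python
import math
from collections import Counter

def train_multinomial_nb(docs, labels, alpha=1.0):
    """Estimate log-priors and smoothed log p(word | class).

    docs: list of token lists; labels: parallel list of class names.
    Each word probability is (count + alpha) / (total + alpha * |vocab|),
    so no probability is ever exactly zero.
    """
    vocab = {w for d in docs for w in d}
    log_prior, log_lik = {}, {}
    for c in set(labels):
        class_docs = [d for d, y in zip(docs, labels) if y == c]
        log_prior[c] = math.log(len(class_docs) / len(docs))
        counts = Counter(w for d in class_docs for w in d)
        denom = sum(counts.values()) + alpha * len(vocab)
        log_lik[c] = {w: math.log((counts[w] + alpha) / denom) for w in vocab}
    return log_prior, log_lik

def classify(doc, log_prior, log_lik):
    """Linear decision in log-space: b + sum of per-word log-probabilities."""
    return max(log_prior, key=lambda c: log_prior[c] +
               sum(log_lik[c][w] for w in doc if w in log_lik[c]))
```

Words not seen at training time are simply skipped at prediction time in this sketch; other conventions are possible.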
Bernoulli naive Bayes

In the multivariate Bernoulli event model, features are independent booleans (binary variables) describing inputs. Like the multinomial model, this model is popular for document classification tasks,[9] where binary term occurrence features are used rather than term frequencies. If x_i is a boolean expressing the occurrence or absence of the i'th term from the vocabulary, then the likelihood of a document given a class C_k is given by[9]

    p(\mathbf{x} \mid C_k) = \prod_{i=1}^{n} p_{ki}^{x_i} (1 - p_{ki})^{(1 - x_i)}
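Unlike the multinomial model, this likelihood explicitly penalizes the absence of a term via the (1 - p_{ki}) factors. A direct transcription (a generic helper, not from the article):

```python
def bernoulli_likelihood(x, p):
    """p(x | C_k) = prod_i p_i^x_i * (1 - p_i)^(1 - x_i) for binary features x,
    where p[i] is the class-conditional probability that term i occurs."""
    out = 1.0
    for xi, pi in zip(x, p):
        out *= pi if xi else (1.0 - pi)
    return out
```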
Semi-supervised parameter estimation

Given a way to train a naive Bayes classifier from labeled data, it's possible to construct a semi-supervised training algorithm that can learn from a combination of labeled and unlabeled data by running the supervised learning algorithm in a loop:[11]

    Given a collection D = L ⊎ U of labeled samples L and unlabeled samples U, start by training a naive Bayes classifier on L.
    Until convergence, do:
        Predict class probabilities P(C | x) for all examples x in D.
        Re-train the model based on the probabilities (not the labels) predicted in the previous step.

Convergence is determined based on improvement to the model likelihood P(D | θ), where θ denotes the parameters of the naive Bayes model.

This training algorithm is an instance of the more general expectation-maximization algorithm (EM): the prediction step inside the loop is the E-step of EM, while the re-training of naive Bayes is the M-step. The algorithm is formally justified by the assumption that the data are generated by a mixture model, and the components of this mixture model are exactly the classes of the classification problem.[11]
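The loop above can be sketched generically; the `train`, `predict_proba`, and `loglik` callables here are hypothetical interfaces that a concrete naive Bayes implementation would supply, not functions from any particular library:

```python
def semi_supervised_nb(train, predict_proba, loglik, L, U,
                       tol=1e-6, max_iter=100):
    """EM-style loop: train on labeled data L, then repeatedly
    (E-step) predict soft class probabilities for all of L + U and
    (M-step) re-train on those probabilities, stopping when the
    model likelihood stops improving.

    train(weighted_data) -> model, where weighted_data is a list of
    (x, {class: weight}) pairs; predict_proba(model, x) -> {class: prob};
    loglik(model, data) -> model log-likelihood of the data.
    """
    model = train([(x, {y: 1.0}) for x, y in L])  # supervised start on L
    data = [x for x, _ in L] + list(U)
    prev = loglik(model, data)
    for _ in range(max_iter):
        weighted = [(x, predict_proba(model, x)) for x in data]  # E-step
        model = train(weighted)                                  # M-step
        cur = loglik(model, data)
        if abs(cur - prev) < tol:
            break
        prev = cur
    return model
```

A common refinement (not shown) keeps the known labels of L fixed instead of replacing them with predictions.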
Discussion

Despite the fact that the far-reaching independence assumptions are often inaccurate, the naive Bayes classifier has several properties that make it surprisingly useful in practice. In particular, the decoupling of the class conditional feature distributions means that each distribution can be independently estimated as a one-dimensional distribution. This helps alleviate problems stemming from the curse of dimensionality, such as the need for data sets that scale exponentially with the number of features. While naive Bayes often fails to produce a good estimate for the correct class probabilities,[12] this may not be a requirement for many applications. For example, the naive Bayes classifier will make the correct MAP decision rule classification so long as the correct class is more probable than any other class. This is true regardless of whether the probability estimate is slightly, or even grossly inaccurate. In this manner, the overall classifier can be robust enough to ignore serious deficiencies in its underlying naive probability model.[3] Other reasons for the observed success of the naive Bayes classifier are discussed in the literature cited below.
Relation to logistic regression

In the case of discrete inputs (indicator or frequency features for discrete events), naive Bayes classifiers form a generative-discriminative pair with (multinomial) logistic regression classifiers: each naive Bayes classifier can be considered a way of fitting a probability model that optimizes the joint likelihood p(C, \mathbf{x}), while logistic regression fits the same probability model to optimize the conditional p(C \mid \mathbf{x}).[13]

The link between the two can be seen by observing that the decision function for naive Bayes (in the binary case) can be rewritten as "predict class C_1 if the odds of p(C_1 \mid \mathbf{x}) exceed those of p(C_2 \mid \mathbf{x})". Expressing this in log-space gives:

    \log \frac{p(C_1 \mid \mathbf{x})}{p(C_2 \mid \mathbf{x})} = \log p(C_1 \mid \mathbf{x}) - \log p(C_2 \mid \mathbf{x}) > 0

The left-hand side of this equation is the log-odds, or logit, the quantity predicted by the linear model that underlies logistic regression. Since naive Bayes is also a linear model for the two "discrete" event models, it can be reparametrised as a linear function b + \mathbf{w}^\top \mathbf{x} > 0. Obtaining the probabilities is then a matter of applying the logistic function to b + \mathbf{w}^\top \mathbf{x}, or in the multiclass case, the softmax function.

Discriminative classifiers have lower asymptotic error than generative ones; however, research by Ng and Jordan has shown that in some practical cases naive Bayes can outperform logistic regression because it reaches its asymptotic error faster.[13]
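The reparametrisation can be made concrete for the two-class Bernoulli event model. Expanding the log-odds and collecting the x_i terms folds both the "term present" and "term absent" log-ratios into a single weight per feature. A sketch under those assumptions (the function names are illustrative):

```python
import math

def nb_to_linear(prior1, prior2, p1, p2):
    """Rewrite a two-class Bernoulli naive Bayes model as logit = b + w.x.

    p1[i] = p(x_i = 1 | C_1), p2[i] = p(x_i = 1 | C_2).
    w_i = log[ p1_i (1 - p2_i) / (p2_i (1 - p1_i)) ],
    b   = log(prior1 / prior2) + sum_i log[(1 - p1_i) / (1 - p2_i)].
    """
    w = [math.log(q1 * (1 - q2) / (q2 * (1 - q1))) for q1, q2 in zip(p1, p2)]
    b = math.log(prior1 / prior2) + sum(
        math.log((1 - q1) / (1 - q2)) for q1, q2 in zip(p1, p2))
    return b, w

def logit(b, w, x):
    """Log-odds of C_1 versus C_2 for binary feature vector x."""
    return b + sum(wi * xi for wi, xi in zip(w, x))
```

Applying the logistic function to this logit recovers p(C_1 | x), exactly as in logistic regression, though the weights here are fit generatively rather than discriminatively.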
Examples

Gender classification

Problem: classify whether a given person is a male or a female based on the measured features. The features include height, weight, and foot size.

Training

Example training set below.
Gender   height (feet)   weight (lbs)   foot size (inches)
male     6               180            12
male     5.92 (5'11")    190            11
male     5.58 (5'7")     170            12
male     5.92 (5'11")    165            10
female   5               100            6
female   5.5 (5'6")      150            8
female   5.42 (5'5")     130            7
female   5.75 (5'9")     150            9
The classifier created from the training set using a Gaussian distribution assumption would be (given variances are unbiased sample variances):

Gender   mean (height)   variance (height)   mean (weight)   variance (weight)   mean (foot size)   variance (foot size)
male     5.855           3.5033e-02          176.25          1.2292e+02          11.25              9.1667e-01
female   5.4175          9.7225e-02          132.5           5.5833e+02          7.5                1.6667e+00
Let's say we have equiprobable classes, so P(male) = P(female) = 0.5. This prior probability distribution might be based on our knowledge of frequencies in the larger population, or on frequency in the training set.
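The entries of the parameter table can be reproduced with a short helper (an illustration, not part of the article), applied per class and per feature to the values in the training table:

```python
def mean_and_unbiased_variance(values):
    """Sample mean and unbiased (n - 1 denominator) sample variance."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / (n - 1)
    return m, var
```

For example, feeding it the four male heights should return approximately 5.855 and 3.5033e-02, matching the table above.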
Testing

Below is a sample to be classified as a male or female.

Gender   height (feet)   weight (lbs)   foot size (inches)
sample   6               130            8

We wish to determine which posterior is greater, male or female. For the classification as male the posterior is given by

    posterior(male) = \frac{P(male)\, p(height \mid male)\, p(weight \mid male)\, p(foot\,size \mid male)}{evidence}

For the classification as female the posterior is given by

    posterior(female) = \frac{P(female)\, p(height \mid female)\, p(weight \mid female)\, p(foot\,size \mid female)}{evidence}

The evidence (also termed normalizing constant) may be calculated:

    evidence = P(male)\, p(height \mid male)\, p(weight \mid male)\, p(foot\,size \mid male) + P(female)\, p(height \mid female)\, p(weight \mid female)\, p(foot\,size \mid female)

However, given the sample the evidence is a constant and thus scales both posteriors equally. It therefore does not affect classification and can be ignored. We now determine the probability distribution for the sex of the sample.
    P(height \mid male) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( \frac{-(6 - \mu)^2}{2\sigma^2} \right) \approx 1.5789,

where \mu = 5.855 and \sigma^2 = 3.5033e-02 are the parameters of the normal distribution which have been previously determined from the training set. Note that a value greater than 1 is OK here; it is a probability density rather than a probability, because height is a continuous variable. Proceeding in the same way for the remaining features:

    P(weight \mid male) = 5.9881e-06
    P(foot\,size \mid male) = 1.3112e-03
    posterior numerator (male) = 6.1984e-09

    P(height \mid female) = 2.2346e-01
    P(weight \mid female) = 1.6789e-02
    P(foot\,size \mid female) = 2.8669e-01
    posterior numerator (female) = 5.3778e-04

Since the posterior numerator is greater in the female case, we predict the sample is female.
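The whole testing step can be checked numerically. This sketch (illustrative code, not from the article) plugs the parameter values from the training table into the Gaussian density and multiplies them with the prior:

```python
import math

def gaussian(v, mean, var):
    """Normal density with the given mean and variance, evaluated at v."""
    return math.exp(-(v - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# (mean, unbiased variance) per class and feature, from the training table.
params = {
    "male":   {"height": (5.855, 3.5033e-02), "weight": (176.25, 1.2292e+02),
               "foot":   (11.25, 9.1667e-01)},
    "female": {"height": (5.4175, 9.7225e-02), "weight": (132.5, 5.5833e+02),
               "foot":   (7.5, 1.6667e+00)},
}
priors = {"male": 0.5, "female": 0.5}
sample = {"height": 6, "weight": 130, "foot": 8}

def posterior_numerator(cls):
    """P(class) times the product of the class-conditional densities."""
    num = priors[cls]
    for feat, v in sample.items():
        mean, var = params[cls][feat]
        num *= gaussian(v, mean, var)
    return num
```

Running it reproduces the two numerators above (to the precision of the tabulated parameters) and confirms the female class wins.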
Document classification

Here is a worked example of naive Bayesian classification applied to the document classification problem. Consider the problem of classifying documents by their content, for example into spam and non-spam e-mails. Imagine that documents are drawn from a number of classes of documents which can be modelled as sets of words where the (independent) probability that the i-th word of a given document occurs in a document from class C can be written as

    p(w_i \mid C)

(For this treatment, we simplify things further by assuming that words are randomly distributed in the document; that is, words are not dependent on the length of the document, position within the document with relation to other words, or other document context.)

Then the probability that a given document D contains all of the words w_i, given a class C, is

    p(D \mid C) = \prod_i p(w_i \mid C)

The question that we desire to answer is: "what is the probability that a given document D belongs to a given class C?" In other words, what is p(C \mid D)?

Now by definition

    p(D \mid C) = \frac{p(D \cap C)}{p(C)}

and

    p(C \mid D) = \frac{p(D \cap C)}{p(D)}
Bayes' theorem manipulates these into a statement of probability in terms of likelihood:

    p(C \mid D) = \frac{p(C)}{p(D)}\, p(D \mid C)

Assume for the moment that there are only two mutually exclusive classes, S and \neg S (e.g. spam and not spam), such that every element (e-mail) is in either one or the other;

    p(D \mid S) = \prod_i p(w_i \mid S)

and

    p(D \mid \neg S) = \prod_i p(w_i \mid \neg S)

Using the Bayesian result above, we can write:

    p(S \mid D) = \frac{p(S)}{p(D)} \prod_i p(w_i \mid S)

    p(\neg S \mid D) = \frac{p(\neg S)}{p(D)} \prod_i p(w_i \mid \neg S)

Dividing one by the other gives:

    \frac{p(S \mid D)}{p(\neg S \mid D)} = \frac{p(S) \prod_i p(w_i \mid S)}{p(\neg S) \prod_i p(w_i \mid \neg S)}

Which can be re-factored as:

    \frac{p(S \mid D)}{p(\neg S \mid D)} = \frac{p(S)}{p(\neg S)} \prod_i \frac{p(w_i \mid S)}{p(w_i \mid \neg S)}

Thus, the probability ratio p(S | D) / p(\neg S | D) can be expressed in terms of a series of likelihood ratios. The actual probability p(S | D) can be easily computed from log(p(S | D) / p(\neg S | D)) based on the observation that p(S | D) + p(\neg S | D) = 1.

Taking the logarithm of all these ratios, we have:

    \ln \frac{p(S \mid D)}{p(\neg S \mid D)} = \ln \frac{p(S)}{p(\neg S)} + \sum_i \ln \frac{p(w_i \mid S)}{p(w_i \mid \neg S)}

(This technique of "log-likelihood ratios" is a common technique in statistics. In the case of two mutually exclusive alternatives (such as this example), the conversion of a log-likelihood ratio to a probability takes the form of a sigmoid curve: see logit for details.)
Finally, the document can be classified as follows. It is spam if p(S \mid D) > p(\neg S \mid D) (i.e., \ln \frac{p(S \mid D)}{p(\neg S \mid D)} > 0), otherwise it is not spam.
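The derivation above translates directly into code. In this sketch (illustrative names and word probabilities, not from the article), the classifier works entirely with the log-likelihood ratio and only converts back to a probability at the end via the sigmoid:

```python
import math

def log_likelihood_ratio(doc_words, prior_s, word_probs_s, word_probs_ns):
    """ln(p(S|D) / p(not-S|D)) = ln(p(S)/p(not-S))
    + sum over words of ln(p(w|S) / p(w|not-S))."""
    llr = math.log(prior_s / (1 - prior_s))
    for w in doc_words:
        llr += math.log(word_probs_s[w] / word_probs_ns[w])
    return llr

def is_spam(doc_words, prior_s, word_probs_s, word_probs_ns):
    """Classify as spam when the log-likelihood ratio is positive."""
    return log_likelihood_ratio(doc_words, prior_s,
                                word_probs_s, word_probs_ns) > 0

def probability_spam(llr):
    """Convert the log-likelihood ratio back to p(S|D) via the sigmoid,
    using p(S|D) + p(not-S|D) = 1."""
    return 1 / (1 + math.exp(-llr))
```

In practice the word probabilities would come from smoothed frequency estimates on a training corpus, as described in the multinomial and Bernoulli sections above.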
See also

AODE
Bayesian spam filtering
Bayesian network
Random naive Bayes
Linear classifier
Logistic regression
Perceptron
Take-the-best heuristic
References

1. Russell, Stuart; Norvig, Peter (2003) [1995]. Artificial Intelligence: A Modern Approach (2nd ed.). Prentice Hall. ISBN 978-0137903955.
2. Rennie, J.; Shih, L.; Teevan, J.; Karger, D. (2003). Tackling the poor assumptions of Naive Bayes classifiers (PDF). ICML.
3. Rish, Irina (2001). An empirical study of the naive Bayes classifier (PDF). IJCAI Workshop on Empirical Methods in AI.
4. Hand, D. J.; Yu, K. (2001). "Idiot's Bayes - not so stupid after all?". International Statistical Review 69 (3): 385-399. doi:10.2307/1403452. ISSN 0306-7734.
5. Zhang, Harry. The Optimality of Naive Bayes (PDF). FLAIRS 2004 conference.
6. Caruana, R.; Niculescu-Mizil, A. (2006). An empirical comparison of supervised learning algorithms. Proc. 23rd International Conference on Machine Learning. CiteSeerX: 10.1.1.122.5901.
7. Narasimha Murty, M.; Susheela Devi, V. (2011). Pattern Recognition: An Algorithmic Approach. ISBN 0857294946.
8. John, George H.; Langley, Pat (1995). Estimating Continuous Distributions in Bayesian Classifiers. Proc. Eleventh Conf. on Uncertainty in Artificial Intelligence. Morgan Kaufmann. pp. 338-345.
9. McCallum, Andrew; Nigam, Kamal (1998). A comparison of event models for Naive Bayes text classification (PDF). AAAI-98 workshop on learning for text categorization.
10. Metsis, Vangelis; Androutsopoulos, Ion; Paliouras, Georgios (2006). Spam filtering with Naive Bayes - which Naive Bayes?. Third conference on email and anti-spam (CEAS).
11. Nigam, Kamal; McCallum, Andrew; Thrun, Sebastian; Mitchell, Tom (2000). "Learning to classify text from labeled and unlabeled documents using EM" (PDF). Machine Learning.
12. Niculescu-Mizil, Alexandru; Caruana, Rich (2005). Predicting good probabilities with supervised learning (PDF). ICML. doi:10.1145/1102351.1102430.
13. Ng, Andrew Y.; Jordan, Michael I. (2002). On discriminative vs. generative classifiers: A comparison of logistic regression and naive Bayes. NIPS.
Further reading

Domingos, Pedro; Pazzani, Michael (1997). "On the optimality of the simple Bayesian classifier under zero-one loss". Machine Learning 29: 103-137.
Webb, G. I.; Boughton, J.; Wang, Z. (2005). "Not So Naive Bayes: Aggregating One-Dependence Estimators". Machine Learning (Springer) 58 (1): 5-24. doi:10.1007/s10994-005-4258-6.
Mozina, M.; Demsar, J.; Kattan, M.; Zupan, B. (2004). Nomograms for Visualization of Naive Bayesian Classifier (PDF). Proc. PKDD-2004. pp. 337-348.
Maron, M. E. (1961). "Automatic Indexing: An Experimental Inquiry". JACM 8 (3): 404-417. doi:10.1145/321075.321084.
Minsky, M. (1961). Steps toward Artificial Intelligence. Proc. IRE. pp. 8-30.
External links

Book Chapter: Naive Bayes text classification, Introduction to Information Retrieval (http://nlp.stanford.edu/IRbook/html/htmledition/naivebayestextclassification1.html)
Naive Bayes for Text Classification with Unbalanced Classes (http://www.cs.waikato.ac.nz/~eibe/pubs/FrankAndBouckaertPKDD06new.pdf)
Benchmark results of Naive Bayes implementations (http://tunedit.org/results?d=UCI/&a=bayes)
Hierarchical Naive Bayes Classifiers for uncertain data (http://www.biomedcentral.com/14712105/7/514) (an extension of the Naive Bayes classifier).

Software

Naive Bayes classifiers are available in many general-purpose machine learning and NLP packages, including Apache Mahout, Mallet (http://mallet.cs.umass.edu/), NLTK, Orange, scikit-learn and Weka.
IMSL Numerical Libraries - Collections of math and statistical algorithms available in C/C++, Fortran, Java and C#/.NET. Data mining routines in the IMSL Libraries include a Naive Bayes classifier.
Winnow content recommendation (http://doc.winnowtag.org/opensource) - Open source Naive Bayes text classifier; works with very small training and unbalanced training sets. High performance, C, any Unix.
An interactive Microsoft Excel spreadsheet Naive Bayes implementation (http://downloads.sourceforge.net/naivebayesclass/NaiveBayesDemo.xls?use_mirror=osdn) using VBA (requires enabled macros) with viewable source code.
jBNC - Bayesian Network Classifier Toolbox (http://jbnc.sourceforge.net/)
Statistical Pattern Recognition Toolbox for Matlab (http://cmp.felk.cvut.cz/cmp/software/stprtool/).
ifile (http://people.csail.mit.edu/jrennie/ifile/) - the first freely available (Naive) Bayesian mail/spam filter
NClassifier (http://nclassifier.sourceforge.net/) - NClassifier is a .NET library that supports text classification and text summarization. It is a port of Classifier4J.
Classifier4J (http://classifier4j.sourceforge.net/) - Classifier4J is a Java library designed to do text classification. It comes with an implementation of a Bayesian classifier.
Retrieved from "https://en.wikipedia.org/w/index.php?title=Naive_Bayes_classifier&oldid=719377593"

Categories: Classification algorithms | Statistical classification

This page was last modified on 9 May 2016, at 09:47.
Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy. Wikipedia is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization.