So, in this case, the next algorithm we're going to look at

is what's called Naive Bayes and we are going to say,

suppose we want to detect spam emails,

as a learning example.

You can use Naive Bayes to do that.

First, I want to talk about Bayes Theorem.

It's named after Reverend Thomas Bayes,

Who's heard of this before?

This was new to me. You guys know?

Did you learn about it in statistics?

That was my problem:

I never took statistics when I was at university.

This was new to me last year when I was learning about this stuff,

so, I'll buzz through it pretty quickly here.

So, in essence, it's an equation that allows new evidence to update a set of beliefs.

So, A and B are events,

and this nomenclature, P(B) ≠ 0, means the probability of B is not zero,

and we have P(A), the probability of A, and P(B), the probability of B.

P(B|A) is the probability of B given A, and P(A|B) is the probability of A occurring given B.

The equation is: the probability of A given B is equal to

the probability of B given A times the probability of A, divided by the probability of B.
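The equation can be sketched as a small function; the numbers in the example are made up purely for illustration.

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B), with P(B) != 0."""
    return p_b_given_a * p_a / p_b

# Illustrative values: P(B|A) = 0.9, P(A) = 0.2, P(B) = 0.3
posterior = bayes(0.9, 0.2, 0.3)
print(posterior)  # 0.6
```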

In this scheme, probability measures the degree of

belief and the theorem links the degree of belief

in a proposition before and after accounting for evidence.

So, for a proposition A and evidence B,

P(A) is the initial degree of belief in A, called the

prior, and P(A|B) is the degree of belief in A

having accounted for B, once the evidence

has been discovered or made known.
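To tie this back to the spam example: here's a sketch of one prior-to-posterior update, where the evidence is a particular word appearing in an email. All the probabilities below are hypothetical numbers chosen for illustration, not real spam statistics.

```python
# Prior: degree of belief an email is spam before seeing any evidence.
p_spam = 0.3

# Likelihoods (assumed for illustration): how often the word shows up
# in spam vs. non-spam emails.
p_word_given_spam = 0.5
p_word_given_ham = 0.05

# Evidence P(B): total probability of seeing the word at all,
# by the law of total probability.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Posterior P(A|B): belief the email is spam after seeing the word.
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 3))  # 0.811
```

Seeing the word moved the belief from 0.3 (the prior) up to about 0.81 (the posterior), which is exactly the "update a set of beliefs with new evidence" idea.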