Friday, April 20, 2018

Is anybody here familiar with Bayes who can tell me if this is the appropriate way to consider multiple pieces of evidence for one event?

I was wondering whether it would be possible to calculate the probability of an event given multiple pieces of evidence one at a time, using the posterior probability from each calculation as the next calculation's prior probability (i.e. P(A | B) becomes P(A) for the P(A | C) step).

I was able to get numbers similar to the Bayesian network's output by using a calculator like this one, but I'm not sure whether that's just a coincidence.

https://play.google.com/store/apps/details?id=com.thenewboston.katherine
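
To make the chaining idea concrete, here is a minimal sketch of it in Python. The prior and the likelihood/marginal figures for B and C are made up purely for illustration, and the bayes_update helper is not taken from the app above; it is just the arithmetic written out, one multiply-and-divide per piece of evidence.

# Sequential Bayesian updating: the posterior from one piece of evidence
# is fed back in as the prior for the next piece.
# All numbers below are hypothetical, chosen only for illustration.

def bayes_update(prior, likelihood, marginal):
    # One application of Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)
    return likelihood * prior / marginal

p_a = 0.25                         # prior P(A)
p_b_given_a, p_b = 0.75, 0.50      # evidence B: P(B|A) and P(B)
p_c_given_a, p_c = 0.50, 0.25      # evidence C: P(C|A) and P(C)

# Update on B first, then reuse that posterior as the prior for C.
p_a_given_b = bayes_update(p_a, p_b_given_a, p_b)            # 0.375
p_a_given_bc = bayes_update(p_a_given_b, p_c_given_a, p_c)   # 0.75

print(p_a_given_b, p_a_given_bc)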

3 comments:

  1. Missed this the first time round, but yes: as I mentioned in the other post, suppose you have some hypothesis H and two pieces of relevant evidence, E and F.

    Your initial level of belief in H is P(H). When you encounter evidence E, you can update your belief by Bayes' rule:

    P(H|E) = P(E|H) * P(H) / P(E)

    Notice that this is just multiplying your old belief level by [P(E|H) / P(E)], which we can call the "effect" of E.
    Since this update is just a multiplication, we can multiply the effects of the two pieces of evidence together to get their combined effect (assuming E and F are independent of each other, both on their own and given H):

    P(H|E&F) = [P(E|H) * P(F|H) / (P(E) * P(F))] * P(H)

    This incidentally also demonstrates that it doesn't matter in which order you consider the two pieces of evidence (there is a quick numerical check of this after the comments).

  2. Thank you! I'm still reading your other reply, but I want to thank you first for your time and effort!

  3. Sorry, I didn't really answer the question very well... yes, P(H | E&F) is equal to the result of Bayes' rule with evidence E and prior P(H|F); or equivalently with evidence F and prior P(H|E).

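As a quick numerical check of the comments above (using the same made-up numbers as the sketch in the post, and treating E and F as independent both on their own and given H), the combined-effect formula and the two possible orders of chaining all land on the same posterior. This is only a spot check with one set of numbers, not a proof for the general dependent case.

# Check that the "combined effect" formula matches sequential updating
# in either order. Numbers are hypothetical, and E and F are assumed
# independent (marginally and given H), as noted in the first comment.

p_h = 0.25                         # prior P(H)
p_e_given_h, p_e = 0.75, 0.50      # evidence E: P(E|H) and P(E)
p_f_given_h, p_f = 0.50, 0.25      # evidence F: P(F|H) and P(F)

effect_e = p_e_given_h / p_e       # the "effect" of E, i.e. P(E|H) / P(E)
effect_f = p_f_given_h / p_f       # the "effect" of F, i.e. P(F|H) / P(F)

combined = effect_e * effect_f * p_h     # P(H|E&F) via the combined effect
e_then_f = effect_f * (effect_e * p_h)   # update on E first, then on F
f_then_e = effect_e * (effect_f * p_h)   # update on F first, then on E

print(combined, e_then_f, f_then_e)      # all three print 0.75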