{{stub}}

== What is Bayesian Reasoning? ==
Bayesian reasoning, rooted in Bayes' theorem, is a method of statistical inference in which the probability of a hypothesis is updated as more evidence or information becomes available. It is a foundational approach to probabilistic reasoning and provides a framework for revising beliefs in light of new data.

Bayes' theorem is mathematically expressed as:
:<math>P(A \mid B) = \frac{P(B \mid A) \times P(A)}{P(B)}</math>

Where:
: <math>P(A \mid B)</math> is the posterior probability: the probability of hypothesis <math>A</math> given the new evidence <math>B</math>.
: <math>P(B \mid A)</math> is the likelihood: the probability of observing evidence <math>B</math> given that hypothesis <math>A</math> is true.
: <math>P(A)</math> is the prior probability: the probability of hypothesis <math>A</math> before the new evidence is considered.
: <math>P(B)</math> is the evidence: the overall probability of observing evidence <math>B</math>.
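
For concreteness, here is a small worked example with made-up numbers. Suppose a disease affects 1% of a population (<math>P(A) = 0.01</math>), a test detects it 99% of the time when the disease is present (<math>P(B \mid A) = 0.99</math>), and it gives a false positive 5% of the time (<math>P(B \mid \neg A) = 0.05</math>). Then

:<math>P(B) = 0.99 \times 0.01 + 0.05 \times 0.99 = 0.0594</math>
:<math>P(A \mid B) = \frac{0.99 \times 0.01}{0.0594} \approx 0.17</math>

so even after a positive test, the probability of actually having the disease is only about 17%, because the disease is rare to begin with.
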
Here's a conceptual breakdown:
* Prior Belief (<math>P(A)</math>): Before seeing any new evidence, you have a certain belief or probability assigned to a hypothesis, often based on earlier observations, background knowledge, or a default assumption.
* Likelihood (<math>P(B \mid A)</math>): Given the hypothesis, this is the probability of observing the new evidence. It usually comes from a model or from data.
* Evidence (<math>P(B)</math>): A normalization factor ensuring the posterior probabilities sum to one. It can be thought of as the total probability of the evidence under all possible hypotheses.
* Posterior Belief (<math>P(A \mid B)</math>): After observing the new evidence, this updated probability reflects how likely the hypothesis is. It integrates your prior belief with the new evidence.

In simpler terms, Bayesian reasoning combines our prior beliefs about something with new data to give us an updated, or posterior, belief.
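
The update rule is simple enough to express in a few lines of code. Below is a minimal sketch in Python of a single Bayesian update over a discrete set of hypotheses; the coin-bias hypotheses and all the numbers are hypothetical, chosen only to illustrate how the four quantities above fit together.

<syntaxhighlight lang="python">
# Minimal sketch of one Bayesian update over a discrete set of hypotheses.
# All hypothesis names and probabilities here are made up for illustration.

def bayes_update(priors, likelihoods):
    """Return the posterior P(hypothesis | evidence) for each hypothesis.

    priors      -- dict mapping hypothesis -> P(A), the prior
    likelihoods -- dict mapping hypothesis -> P(B | A), the likelihood
    """
    # Evidence P(B): total probability of the observation under all hypotheses.
    evidence = sum(priors[h] * likelihoods[h] for h in priors)
    # Bayes' rule per hypothesis: P(A | B) = P(B | A) * P(A) / P(B).
    return {h: priors[h] * likelihoods[h] / evidence for h in priors}

# Hypothetical example: is a coin fair, or biased towards heads,
# given that we just observed a single head?
priors = {"fair": 0.9, "biased": 0.1}       # prior beliefs P(A)
likelihoods = {"fair": 0.5, "biased": 0.8}  # P(heads | A)

print(bayes_update(priors, likelihoods))
# {'fair': 0.849..., 'biased': 0.150...}
</syntaxhighlight>

Note how the prior belief of 0.9 in a fair coin only drops to about 0.85 after one head: a single observation is weak evidence, so the posterior stays close to the prior.
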
Bayesian reasoning is fundamental in various fields, from machine learning (e.g., Bayesian networks, Naive Bayes classifiers) to medical testing, finance, and scientific research. It offers a structured way to handle uncertainty and integrate diverse sources of information in a coherent probabilistic framework.
== See also ==
 
*[https://www.youtube.com/watch?v=lJ3CD9M3nEQ Excellent vid on basic number theory behind crypto]
*[https://www.youtube.com/watch?v=sj8Sg8qnjOg About Phi being the most irrational number]
