In this post, we walk through the essential definitions in probability theory. Understanding these concepts is critical for grasping more advanced topics in probability theory. Here, you will learn:

• Key concepts such as random variables and conditional probability.
• How these notions are related to one another.

## Random Variables

The random variable is one of the essential definitions in probability theory. A random variable is a variable whose values are determined by the outcomes of a random phenomenon. We must be able to *measure* a random variable, which is what allows us to assign probabilities to its possible values. The domain of a random variable is the sample space. For example, in the case of rolling a die, only six outcomes are possible: {1, 2, 3, 4, 5, 6}.

### Notation

In mathematics, a random variable is usually denoted with an upper-case Roman letter such as $X$ or $Y$. However, such notation is not fully consistent, and you should expect to see variations. Don't worry: the notation in use is usually stated in whatever article you are reading.

Using more precise mathematical notation, a random variable is a measurable function $X: \Omega \to E$, which maps the space $\Omega$ of all possible outcomes to some measurable event space $E$. Let's have an illustrative example. Assume we would like to roll a die, and the measurable event of interest is "all numbers less than 4". Here, we have $\Omega = \{1, 2, 3, 4, 5, 6\}$, the random variable $X$ is the outcome of rolling the die, and we define the event as $E = \{X < 4\} = \{1, 2, 3\}$.
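The die example above can be sketched in a few lines of Python (an illustrative simulation, not part of the original post): we repeatedly sample the random variable $X$ from the sample space and estimate the probability of the event $\{X < 4\}$, which should come out close to the exact value $3/6 = 0.5$.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Sample space of a fair six-sided die.
omega = [1, 2, 3, 4, 5, 6]

# Simulate the random variable X = "outcome of one roll" many times
# and estimate the probability of the event {X < 4} = {1, 2, 3}.
n_trials = 100_000
hits = sum(1 for _ in range(n_trials) if random.choice(omega) < 4)

estimate = hits / n_trials
print(f"Estimated P(X < 4) = {estimate:.3f}  (exact value: {3/6:.3f})")
```

With enough trials, the empirical frequency converges to the true probability, which is exactly the sense in which probabilities are assigned to measurable events.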

## Conditional probability

One of the essential definitions in probability theory is conditional probability. This is because many events depend on preceding events or on partial information that is already available. Recognizing and calculating this dependency leads to more precise probability estimates. As an everyday example: how many layers you should wear depends on the weather!

### Conditional probability definition

Conditional probability, in simple terms, refers to the likelihood of an event (or chain of events) given that another event (or chain of events) has happened.

Let's have an example. Assume we roll a die and then flip a fair coin. What is the probability of getting the number one in rolling the die (event $A$) and getting heads in flipping the coin (event $B$)? Clearly $P(A) = 1/6$ and $P(B) = 1/2$. Since the die and the coin do not influence each other, the probability of getting the number one (in rolling the die) and getting heads (in flipping the coin) is:

$$P(A \cap B) = P(A)\,P(B) = \frac{1}{6} \cdot \frac{1}{2} = \frac{1}{12}$$

Now let's take a look at a conditional situation. Assume we want to calculate the probability of getting heads (in flipping the coin) given that we got one (in rolling the die). This is a conditional statement: basically, $B$ is conditioned on $A$ happening, written $P(B \mid A)$. The mathematical formulation of the conditional probability of two events is:

$$P(B \mid A) = \frac{P(A \cap B)}{P(A)}$$
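The die-and-coin example can be checked empirically with a short Python simulation (illustrative only): we estimate $P(B \mid A)$ by counting, among the trials where the die showed one, the fraction where the coin also showed heads.

```python
import random

random.seed(42)  # fixed seed for reproducibility

n_trials = 100_000
count_A = 0        # die shows 1
count_A_and_B = 0  # die shows 1 AND coin shows heads

for _ in range(n_trials):
    die = random.randint(1, 6)
    coin = random.choice(["H", "T"])
    if die == 1:
        count_A += 1
        if coin == "H":
            count_A_and_B += 1

# P(B | A) = P(A and B) / P(A); since the coin is independent of the die,
# this should come out close to P(B) = 1/2.
p_b_given_a = count_A_and_B / count_A
print(f"Estimated P(B | A) = {p_b_given_a:.3f}")
```

Because the coin does not care what the die did, conditioning on $A$ changes nothing here: the estimate lands near $P(B) = 0.5$, foreshadowing the independence discussion below.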

### Conditional probability calculation

If $P(A) > 0$, then the conditional probability of $B$ given $A$ is as below:

$$P(B \mid A) = \frac{P(A \cap B)}{P(A)}$$

NOTE: In case $P(A \cap B) = P(A)\,P(B)$ (the two events $A$ and $B$ are independent), we have:

$$P(B \mid A) = \frac{P(A)\,P(B)}{P(A)} = P(B)$$

## Bayes’ Rule

In order to explain Bayes’ rule, let’s start with something simple as a special case. Assume we have two events $A$ and $B$. Do you agree with the following statements?

• $A \cap S = A$: The intersection of any event with the sample space $S$ is that particular event.
• $B \cup B^c = S$: The union of any event with its complement is the whole sample space.

NOTE: The events $B$ and $B^c$ are mutually exclusive. Think about why!

Now, let’s calculate the probability of event $A$:

$$P(A) = P(A \cap S) = P\big(A \cap (B \cup B^c)\big) = P(A \cap B) + P(A \cap B^c) = P(B)\,P(A \mid B) + P(B^c)\,P(A \mid B^c) \tag{1}$$

Above, we used the conditional probability rule to expand the event intersections. Equation (1) asserts that the probability of event $A$ is a weighted average of the conditional probability of $A$ given that $B$ has occurred and the conditional probability of $A$ given that $B$ has not occurred. This is a very useful formula, as directly calculating the probability of an event such as $A$ is often not easy, or even possible. This rule conditions the likelihood of the event on different events.
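Equation (1) can be verified exactly on a small example using Python's `fractions` module (a hypothetical check, with events chosen just for illustration): take a fair die, let $B$ be "the outcome is even" and $A$ be "the outcome is less than 4", and compare both sides.

```python
from fractions import Fraction

# Fair die: each outcome in {1,...,6} has probability 1/6.
sample_space = range(1, 7)
p = Fraction(1, 6)

def prob(event):
    """Probability of an event, given as a predicate over outcomes."""
    return sum(p for w in sample_space if event(w))

A = lambda w: w < 4        # outcome less than 4
B = lambda w: w % 2 == 0   # outcome is even
Bc = lambda w: not B(w)    # complement of B

# Left side of (1): P(A) directly.
lhs = prob(A)

# Right side of (1): P(B) P(A|B) + P(B^c) P(A|B^c),
# with each conditional computed as P(A and ·) / P(·).
rhs = prob(B) * (prob(lambda w: A(w) and B(w)) / prob(B)) \
    + prob(Bc) * (prob(lambda w: A(w) and Bc(w)) / prob(Bc))

print(lhs, rhs)  # both sides equal 1/2
```

Using exact fractions instead of floats makes the two sides match identically rather than approximately.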

### NOTE

The fact is, in Bayes’ rule, the probability of an event is conditioned on different mutually exclusive events ($B$ and $B^c$) which together form the whole sample space, as $B \cup B^c = S$.

The above special case can be extended to the more general rule below, called Bayes’ rule.

### Bayes’ rule

Assume we have mutually exclusive events $B_1, B_2, \dots, B_n$ that together cover the whole sample space, $\bigcup_{i=1}^{n} B_i = S$, i.e., exactly one of the events MUST and will occur. Then, the probability of an event $A$ is calculated as below:

$$P(A) = \sum_{i=1}^{n} P(B_i)\,P(A \mid B_i) \tag{2}$$

Combining (2) with the definition of conditional probability gives Bayes’ rule, which inverts the conditioning to tell us how likely each $B_j$ is once we know $A$ occurred:

$$P(B_j \mid A) = \frac{P(B_j)\,P(A \mid B_j)}{\sum_{i=1}^{n} P(B_i)\,P(A \mid B_i)}$$

## The concept of independence

We previously discussed conditional probabilities. You learned the concept of the probability of an event conditioned on another event, $P(A \mid B)$. Intuitively, we can infer that if $A$ and $B$ are independent, then $P(A \mid B)$ does not depend on $B$, and it simply equals $P(A)$. We previously investigated the formulation $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$ as well. Technically, being independent is mutual, i.e., when $P(A \mid B) = P(A)$, then $P(B \mid A) = P(B)$ and vice versa. We can conclude that when $A$ and $B$ are independent, then $P(A \cap B) = P(A)\,P(B)$. This formulation leads to the following definition:

### Independent Events

Two events $A$ and $B$ are said to be independent if $P(A \cap B) = P(A)\,P(B)$. Otherwise, they are called dependent.
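The definition above is easy to test mechanically. As a minimal sketch (with events picked purely for illustration), take a fair die and check whether $A$ = "the outcome is even" and $B$ = "the outcome is less than 5" satisfy $P(A \cap B) = P(A)\,P(B)$:

```python
from fractions import Fraction

# Fair die: each outcome in {1,...,6} has probability 1/6.
outcomes = range(1, 7)
p = Fraction(1, 6)

def prob(event):
    """Probability of an event, given as a predicate over outcomes."""
    return sum(p for w in outcomes if event(w))

A = lambda w: w % 2 == 0  # even outcome: {2, 4, 6}, P(A) = 1/2
B = lambda w: w < 5       # less than 5: {1, 2, 3, 4}, P(B) = 2/3

p_a, p_b = prob(A), prob(B)
p_ab = prob(lambda w: A(w) and B(w))  # A and B: {2, 4}, P = 1/3

# Independence test: P(A and B) == P(A) * P(B)
print(p_ab == p_a * p_b)  # True: these two events are independent
```

Perhaps surprisingly, these two events pass the test: $1/3 = 1/2 \cdot 2/3$, so knowing the outcome is less than 5 tells you nothing about whether it is even.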

## Conclusion

In a previous post, you learned what probability is, along with some mathematical background. In this post, you acquired knowledge about the fundamentals of probability theory and its key concepts. What you have learned so far aims to prepare you to use probability notions and to further strengthen your background for more advanced probabilistic concepts in Machine Learning. Do you have any questions or suggestions? Feel free to comment and share your point of view.