Definition
A random variable is a variable whose value depends on the outcome of a random experiment. Formally, it is defined as a mapping from the sample space to the real numbers.
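Written in LaTeX notation (a standard formalization of the definition above, where the symbol Ω for the sample space is the usual convention, not from the source):

```latex
% A random variable X assigns a real number to each outcome \omega
% of the sample space \Omega (requires amssymb for \mathbb).
X : \Omega \to \mathbb{R}, \qquad \omega \mapsto X(\omega)
```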
- Continuous Random Variable: a random variable that can take infinitely many values within a range or interval.
- Discrete Random Variable: a random variable whose set of possible values is fixed and countable, not a range.
Note: Random variables are denoted by uppercase letters (e.g., X, Y).
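A minimal Python sketch of the distinction, using the standard `random` module (the die and the unit interval are illustrative choices, not from the source):

```python
import random

# Discrete: a fair six-sided die has a fixed, countable set of values.
die_roll = random.randint(1, 6)          # one of {1, 2, 3, 4, 5, 6}

# Continuous: a draw from an interval can be any real number within it.
measurement = random.uniform(0.0, 1.0)   # any real number in [0.0, 1.0]

print(die_roll, measurement)
```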
Example
- A player rolls a die; a win or loss amount is associated with each outcome
- The game can be judged based on its expected value, as shown in the sketch after the note below
Note: Here, the win/loss amount associated with each outcome is the random variable.
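A minimal Python sketch of judging the game by its expected value, E[X] = Σ x · P(X = x). The payoff table is hypothetical; the source does not specify the actual win/loss amounts:

```python
from fractions import Fraction

# Hypothetical win/loss amount for each face of a fair die
# (these payoffs are an assumption for illustration).
payoff = {1: -2, 2: -2, 3: -2, 4: 1, 5: 1, 6: 4}

prob = Fraction(1, 6)  # fair die: each face is equally likely

# E[X] = sum over all outcomes of (payoff * probability)
expected_value = sum(x * prob for x in payoff.values())
print(expected_value)  # 0 for this payoff table
```

A positive expected value means the game favors the player on average, a negative one means it favors the house, and zero (as with this payoff table) means the game is fair.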