Subject: Business Statistics
A variable whose value depends on the outcome of a chance experiment is known as a random variable. It can also be described in terms of the range of potential values that can be obtained from a random experiment. Random variables can be divided into two categories: discrete random variables and continuous random variables. Discrete variables take finitely many (or countably many) values, while continuous variables take values within an interval. The probability-weighted sum, or integral, of the potential values of a random variable is its mathematical expectation, commonly referred to as the expected value.
There are two categories of random variables: discrete random variables and continuous random variables. A discrete random variable takes finitely many or countably many values.
For example, suppose we observe how many vehicles arrive at a takeaway restaurant each week. The random variable of interest is X, where
X = number of vehicles arriving each week
The possible values of X are given by the range space of X, which is denoted by \(R_X\). Here,
\(R_X = \{0, 1, 2, 3, \ldots\}\)
Let X be a discrete random variable. With each possible outcome \(x_i\) in \(R_X\), a number \(p(x_i) = P(X = x_i)\) gives the probability that the random variable takes the value \(x_i\). The numbers \(p(x_i)\), \(i = 1, 2, 3, \ldots\), must satisfy the following two conditions:
1. \(p(x_i) \ge 0\) for all \(i\), and
2. \(\sum_{i} p(x_i) = 1\).
The collection of pairs \((x_i, p(x_i))\), \(i = 1, 2, \ldots\), is called the probability distribution of X, and \(p(x_i)\) is called the probability mass function (pmf) of X.
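As a quick numerical check, the short Python sketch below verifies the two pmf conditions for a small, made-up pmf; the values and probabilities are hypothetical, chosen only for illustration.

```python
# A minimal sketch (Python) that checks the two pmf conditions
# for a hypothetical discrete random variable.

# Hypothetical pmf: pairs (x_i, p(x_i)) chosen only for illustration.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

# Condition 1: every p(x_i) must be non-negative.
all_nonnegative = all(p >= 0 for p in pmf.values())

# Condition 2: the probabilities must sum to 1 (allow a tiny floating-point tolerance).
sums_to_one = abs(sum(pmf.values()) - 1.0) < 1e-9

print("p(x_i) >= 0 for all i:", all_nonnegative)   # True
print("sum of p(x_i) equals 1:", sums_to_one)      # True
```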
Cumulative Distribution Function (cdf):
The cumulative distribution function (cdf) of a random variable is represented by F(x) and defined as:
$$F(x) = P(X \le x) = \sum_{x_i \le x} p(x_i)$$
A cdf must be a non-decreasing function; thus, if \(x_1 \le x_2\), then \(F(x_1) \le F(x_2)\).
For example, let X be the random variable that counts the number of tails obtained when a fair coin is tossed three times. The sample space consists of the eight equally likely outcomes HHH, HHT, HTH, THH, HTT, THT, TTH, TTT. The probability mass function is:
x | 0 | 1 | 2 | 3 |
p(x) | \(\frac{1}{8}\) | \(\frac{3}{8}\) | \(\frac{3}{8}\) | \(\frac{1}{8}\) |
The cdf for the random variable X is then:
x | 0 | 1 | 2 | 3 |
F(x) | \(\frac{1}{8}\) | \(\frac{4}{8}\) | \(\frac{7}{8}\) | 1 |
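The pmf and cdf above can be reproduced by direct enumeration. The following Python sketch is a minimal illustration: it lists the eight equally likely outcomes of three tosses of a fair coin, counts tails in each, and accumulates the probabilities using exact fractions.

```python
from itertools import product
from fractions import Fraction

# Enumerate the 8 equally likely outcomes of tossing a fair coin three times.
outcomes = list(product("HT", repeat=3))

# pmf: X = number of tails in an outcome; each outcome has probability 1/8.
pmf = {}
for outcome in outcomes:
    x = outcome.count("T")
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(outcomes))

# cdf: F(x) = P(X <= x), the running total of the pmf.
cdf = {}
running = Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

for x in sorted(pmf):
    print(f"x={x}  p(x)={pmf[x]}  F(x)={cdf[x]}")
# x=0  p(x)=1/8  F(x)=1/8
# x=1  p(x)=3/8  F(x)=1/2
# x=2  p(x)=3/8  F(x)=7/8
# x=3  p(x)=1/8  F(x)=1
```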
Continuous Random Variables:
If a random variable takes values in an interval or a collection of intervals, it is called a continuous random variable. In the case of a continuous random variable X, the probability that X lies in the interval [a, b] is given by
$$P(a \le X \le b) = \int_a^b f(x)\,dx$$
where the function f(x) is called the probability density function (pdf) of the random variable X. The pdf satisfies the following three conditions:
1. \(f(x) \ge 0\) for all x in \(R_X\),
2. \(\int_{R_X} f(x)\,dx = 1\), and
3. \(f(x) = 0\) if x is not in \(R_X\).
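To illustrate these conditions numerically, the sketch below uses a hypothetical density f(x) = x/2 on the interval [0, 2] (chosen only for this example) and approximates the integrals with a simple midpoint sum, so no external libraries are needed.

```python
# A minimal sketch (Python): numerically check a hypothetical pdf and
# compute P(a <= X <= b) by approximating the integral with a midpoint sum.

def f(x):
    """Hypothetical pdf for illustration: f(x) = x/2 on [0, 2], 0 elsewhere."""
    return x / 2 if 0 <= x <= 2 else 0.0

def integrate(func, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of func over [a, b]."""
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

# Condition 2: the pdf integrates to 1 over the range space [0, 2].
print(integrate(f, 0, 2))        # ~1.0

# P(0.5 <= X <= 1.5) = integral of f from 0.5 to 1.5 = 0.5 exactly.
print(integrate(f, 0.5, 1.5))    # ~0.5
```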
Mathematical expectation, commonly referred to as the expected value, is the probability-weighted sum (or, for continuous variables, integral) of the possible values of a random variable. It can also be thought of as combining the value observed when an event occurs with the probability of that event occurring, P(x). The expected value of any random variable is a useful summary measure. It is typically denoted E(X) and is calculated by adding up all the different possible values that the random variable might take, each weighted by its probability of occurrence, so that \(E(X) = x_1 p_1 + x_2 p_2 + \cdots + x_n p_n\), where the \(x_i\) are the possible values, the \(p_i\) are their probabilities, and n is the number of possible values.
Suppose a random variable, X, can take finitely many possible values \(x_1, x_2, \ldots, x_n\), each with associated probabilities \(p_1, p_2, \ldots, p_n\). The expectation of X is then defined as:
$$\langle X \rangle = x_1 p_1 + x_2 p_2 + \cdots + x_n p_n$$
We can see that an expectation is simply the weighted average of all potential outcomes.
For a discrete random variable, the expectation is the sum of the possible values of the variable, each multiplied by its probability mass function. The formula is:
$$E(X) = \sum_{x} x\,P(x)$$
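As a worked instance of this formula, the short Python sketch below applies it to the coin-tossing pmf from the earlier example (X = number of tails in three tosses of a fair coin), giving E(X) = 1.5.

```python
# A minimal sketch (Python): E(X) = sum of x * P(x) over all possible values.
# pmf for X = number of tails in three tosses of a fair coin (from the earlier example).
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

expectation = sum(x * p for x, p in pmf.items())
print(expectation)   # 1.5
```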
Similarly, the mathematical expectation of a continuous random variable is given by the integral of its values weighted by the probability density function. The formula is:
E(X) = \(\int x\,f(x)\,dx\) for a continuous random variable.
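Reusing the hypothetical density f(x) = x/2 on [0, 2] from the earlier sketch, E(X) = \(\int x\,f(x)\,dx\) works out to 4/3; the Python sketch below approximates this integral numerically.

```python
# A minimal sketch (Python): E(X) = integral of x * f(x) dx for a continuous random variable.

def f(x):
    """Hypothetical pdf for illustration: f(x) = x/2 on [0, 2], 0 elsewhere."""
    return x / 2 if 0 <= x <= 2 else 0.0

def integrate(func, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of func over [a, b]."""
    h = (b - a) / n
    return sum(func(a + (i + 0.5) * h) for i in range(n)) * h

expectation = integrate(lambda x: x * f(x), 0, 2)
print(expectation)   # ~1.3333 (exact value 4/3)
```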
Example 3:
Suppose we roll a fair, 6-sided die 100 times, and at the end, we add up all the results of each roll. What would be the likely value of this sum?
Solution:
The sample space for a single roll can be described by the following random variable:
Random Variable, X
Outcomes | 1 | 2 | 3 | 4 | 5 | 6 |
Probabilities | \(\frac{1}{6}\) | \(\frac{1}{6}\) | \(\frac{1}{6}\) | \(\frac{1}{6}\) | \(\frac{1}{6}\) | \(\frac{1}{6}\) |
Since 100 is much bigger than 6 (the total number of possible outcomes), by the Law of Large Numbers, we should expect 1/6th of the outcomes to be a 1, 1/6th of the outcomes to be a 2, and so on.
Thus, the total sum can be calculated as follows:
sum = 100×\(\frac{1}{6}\)(1)+100×\(\frac{1}{6}\)(2)+100×\(\frac{1}{6}\)(3)+100×\(\frac{1}{6}\)(4)+100×\(\frac{1}{6}\)(5)+100×\(\frac{1}{6}\)(6)
Solving this, we get the value 350. We call the quantity \(\left[\frac{1}{6}(1) + \frac{1}{6}(2) + \frac{1}{6}(3) + \frac{1}{6}(4) + \frac{1}{6}(5) + \frac{1}{6}(6)\right]\) the expectation of the random variable X, and denote it by \(\langle X \rangle\). It is simply the average number of dots we get on each roll (note, however, that we can never roll exactly 3.5). In general, we use the following definition:
Suppose a random variable, X, can take finitely many possible values \(x_1, x_2, \ldots, x_n\), each with associated probabilities \(p_1, p_2, \ldots, p_n\). The expectation of X is then defined as:
$$\langle X \rangle = x_1 p_1 + x_2 p_2 + \cdots + x_n p_n$$
We can conclude that the expectation is the weighted average of all possible outcomes.
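The reasoning in this example can also be checked by simulation. The Python sketch below (a minimal illustration, not part of the original example) rolls a fair six-sided die 100 times and compares the simulated sum with the expected sum 100 × 3.5 = 350.

```python
import random

# Simulate rolling a fair six-sided die 100 times and sum the results.
rolls = [random.randint(1, 6) for _ in range(100)]

expected_value = sum(range(1, 7)) / 6            # <X> = (1+2+3+4+5+6)/6 = 3.5
print("simulated sum:", sum(rolls))              # varies run to run, typically near 350
print("expected sum :", 100 * expected_value)    # 350.0
```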