Let us say that there is some huge random variable $X$ that is very complicated and will not be fit by any simple distribution. Then, let us say we want to get the mean $E[X]$ without knowing the distribution. What we can do, hopefully, is to take samples $X_1, X_2, \dots, X_n$ and hope that $\frac{1}{n}\sum_{i=1}^{n} X_i \to E[X]$.
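The sampling idea above can be sketched numerically. Here the "complicated" distribution is stood in for by an exponential (an assumption purely so the example runs; the recipe itself never inspects the distribution, it only draws samples):

```python
import random

random.seed(0)

# Hypothetical stand-in for the complicated random variable X.
# An exponential with rate 0.5 has true mean 1/0.5 = 2, so we can
# check that the sample mean lands near it.
def sample_X():
    return random.expovariate(0.5)

n = 100_000
sample_mean = sum(sample_X() for _ in range(n)) / n
print(sample_mean)  # should be close to the true mean 2
```

The point is that the estimator $\frac{1}{n}\sum X_i$ uses only draws from $X$, never its density.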
Markov's inequality.
Let $X$ be a random variable that takes non-negative values. Then $E[X]$ can be partitioned as $\sum_{x < a} x\, p_X(x)$ and $\sum_{x \ge a} x\, p_X(x)$, where one sum runs over the values bigger than or equal to $a$ and the other over the remaining part, for some constant $a > 0$.
Using the total expectation theorem, $E[X] = \sum_{x < a} x\, p_X(x) + \sum_{x \ge a} x\, p_X(x)$. Since the expected value of any positive (read: non-negative) random variable is non-negative, the first sum is $\ge 0$. Hence $$E[X] \ge \sum_{x \ge a} x\, p_X(x) \ge a \sum_{x \ge a} p_X(x) = a P(X \ge a).$$
So Markov's inequality states that for any constant $a > 0$ and any non-negative random variable $X$, $$E[X] \ge a P(X \ge a), \quad \text{i.e.} \quad P(X \ge a) \le \frac{E[X]}{a}.$$
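A quick empirical sanity check of the bound $a\,P(X \ge a) \le E[X]$. The choice of an exponential distribution is an assumption for illustration; any non-negative distribution works:

```python
import random

random.seed(1)

# Draw from a non-negative distribution (exponential, E[X] = 1)
# and compare the empirical tail probability against Markov's bound.
samples = [random.expovariate(1.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)

for a in (0.5, 1.0, 2.0, 5.0):
    tail = sum(x >= a for x in samples) / len(samples)  # P(X >= a), empirically
    print(a, tail, mean / a)  # tail should never exceed mean / a
```

Note how loose the bound gets for large $a$: the exponential tail decays like $e^{-a}$ while the Markov bound only decays like $1/a$.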
Applying Markov's inequality to the non-negative random variable $(X - \mu)^2$ (on the variance, essentially), with $a = c^2$ for any $c > 0$: $$P\big((X - \mu)^2 \ge c^2\big) \le \frac{E[(X-\mu)^2]}{c^2} = \frac{\sigma^2}{c^2},$$ i.e. $$P(|X - \mu| \ge c) \le \frac{\sigma^2}{c^2}$$ (where $\mu = E[X]$ and $\sigma^2 = \mathrm{Var}(X)$). This is Chebyshev's inequality.
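Chebyshev's inequality can be checked the same way. The Gaussian here is an illustrative assumption; unlike Markov's inequality, Chebyshev's applies to any random variable with finite variance, not just non-negative ones:

```python
import random
import statistics

random.seed(2)

# Check P(|X - mu| >= c) <= sigma^2 / c^2 empirically.
samples = [random.gauss(10, 3) for _ in range(100_000)]
mu = statistics.fmean(samples)
var = statistics.pvariance(samples, mu)

for c in (3, 6, 9):
    tail = sum(abs(x - mu) >= c for x in samples) / len(samples)
    print(c, tail, var / c ** 2)  # tail should never exceed var / c^2
```

At $c = 2\sigma$ the true two-sided Gaussian tail is about $0.046$, while Chebyshev only guarantees $\le 0.25$: the bound is distribution-free, so it is conservative.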
Now, these inequalities hold for continuous (non-negative) random variables as well, just using integrals instead of sums.
Using $E[X] = \mu$ and Markov's inequality, we have for any $a > 0$: $$aP(X \ge a) \le \mu.$$