9-probability - Processes

Support/Figures/Pasted image 20250119094642.png
The sample space for a Bernoulli process is the set of all (infinite) binary sequences.

For a given number of trials, how many successes do we have? Call this random variable $S_n$ (number of successes in $n$ trials).
What is $p_{S_n}$? Well, for any $k = 0, 1, \dots, n$,
$$p_{S_n}(k) = \binom{n}{k} p^k (1-p)^{n-k}$$
$$E[S_n] = np, \qquad \mathrm{var}(S_n) = np(1-p)$$
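A quick numerical sanity check of the PMF and its moments (a Python sketch; `binom_pmf` is just a made-up helper name):

```python
import math

def binom_pmf(n, k, p):
    # p_{S_n}(k) = C(n, k) p^k (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 20, 0.3
# the PMF sums to 1 over k = 0..n
total = sum(binom_pmf(n, k, p) for k in range(n + 1))
# E[S_n] and var(S_n) computed straight from the PMF
mean = sum(k * binom_pmf(n, k, p) for k in range(n + 1))
var = sum(k * k * binom_pmf(n, k, p) for k in range(n + 1)) - mean**2
print(total)  # ≈ 1
print(mean)   # ≈ np = 6
print(var)    # ≈ np(1-p) = 4.2
```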
Next, for a given number of successes, how many trials do we need? Call it $T_n$ (number of trials for $n$ successes).
What is $p_{T_n}(k)$ for any $k \in \mathbb{N}$? That is, what is the probability that I spend $k$ trials to get $n$ successes?
Well, if $k < n$, then the probability is zero.
If $k = n$, then the probability is simply $p_{T_n}(n) = p^n$.
What is $p_{T_n}(n+r)$? Well, the last, $(n+r)$-th trial has to be a success: if we already have $n$ successes before the last trial, then we used fewer than $n+r$ trials to get $n$ successes, and if we have fewer than $n-1$ successes before the last trial, then we never get $n$ successes even including the last trial.
Therefore $p_{T_n}(n+r) = p \cdot p_{S_{n+r-1}}(n-1)$.
That is, the last trial has to be a success, and independently, in the previous $n+r-1$ trials, we need a total of $n-1$ successes. Hence, $$p_{T_{n}}(n+r) = \binom{n+r-1}{n-1} \, p^n (1-p)^r$$ (Pascal PMF)
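To convince ourselves of the decomposition $p_{T_n}(n+r) = p \cdot p_{S_{n+r-1}}(n-1)$, a small Python sketch (the helper names are made up):

```python
import math

def binom_pmf(n, k, p):
    # p_{S_n}(k) = C(n, k) p^k (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def pascal_pmf(n, r, p):
    # p_{T_n}(n+r) = C(n+r-1, n-1) p^n (1-p)^r
    return math.comb(n + r - 1, n - 1) * p**n * (1 - p) ** r

n, p = 3, 0.4
# last trial is a success, times n-1 successes among the first n+r-1 trials
for r in range(10):
    assert abs(pascal_pmf(n, r, p) - p * binom_pmf(n + r - 1, n - 1, p)) < 1e-12
# the PMF sums to 1 over k = n, n+1, ... (tail truncated at r = 500)
total = sum(pascal_pmf(n, r, p) for r in range(500))
print(total)  # ≈ 1
```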

Now what is $E[T_n]$? Well, $E[T_n] = \sum_{r=0}^{\infty} (n+r)\, p_{T_n}(n+r)$, which seems harder to calculate.
Well the memoryless property (which comes from independence) is an absolute UNIT here.
Let us say your cousin saw $n-1$ successes so far, where $T_{n-1}$ is the random variable for the number of trials needed to see $n-1$ successes. And now you walk into the room. How many more trials do you need to see another success? That is modeled by $T_1$, of course. So $E[T_n] = E[T_{n-1}] + E[T_1]$. Doing this repeatedly,
we have $E[T_n] = n\,E[T_1] = \frac{n}{p}$.
Similarly, $\mathrm{var}(T_n) = n\,\mathrm{var}(T_1) = \frac{n(1-p)}{p^2}$.
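A simulation confirms both moments (a sketch; the function name is made up):

```python
import random

def trials_until_n_successes(n, p, rng):
    # run Bernoulli(p) trials until the n-th success; return the trial count
    count = successes = 0
    while successes < n:
        count += 1
        if rng.random() < p:
            successes += 1
    return count

rng = random.Random(0)
n, p, runs = 5, 0.25, 50_000
samples = [trials_until_n_successes(n, p, rng) for _ in range(runs)]
mean = sum(samples) / runs
var = sum((x - mean) ** 2 for x in samples) / runs
print(mean)  # ≈ n/p = 20
print(var)   # ≈ n(1-p)/p^2 = 60
```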

Poisson process

Here we talk about continuous time and use the idea of time homogeneity: in any time interval of length $\tau$, the probability of $k$ arrivals is modeled by $p(k, \tau)$, so equal-length time intervals have equal probabilities for any number of arrivals,
and disjoint time intervals are independent.
For any small time interval $\delta$, we have $p(k,\delta) \approx 1-\lambda\delta$ if $k=0$, $\lambda\delta$ if $k=1$, and $0$ if $k>1$.
Discretize the interval $[0,t]$ as $[0,\delta], (\delta, 2\delta], \dots, ((n-1)\delta, n\delta]$, where $n\delta = t$.
Then each discretized interval acts as a Bernoulli trial.

Hence we ask the question again:
For an interval of length $t$, discretized as $t = n\delta$, let the random variable $A_t$ denote the number of arrivals in the interval $[0,t]$. Then the probability mass, approximated by discretization, looks like that of a Bernoulli process: $$p_{A_t}(k) \approx \binom{n}{k} (\lambda\delta)^k (1-\lambda\delta)^{n-k}$$

Now, $$p_{A_t}(k) = \lim_{n\to\infty} \binom{n}{k} \left(\frac{\lambda t}{n}\right)^k \left(1 - \frac{\lambda t}{n}\right)^{n-k}.$$
Without doing any calculus, because we don't want to, we write the result:
$$p_{A_t}(k) = \frac{(\lambda t)^k e^{-\lambda t}}{k!}$$ (Poisson distribution)
$E[A_t] = \lambda t = \mathrm{var}(A_t)$. This is pretty cool! So the standard deviation of the Poisson is the square root of the mean.
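We can watch the binomial approximation converge to the Poisson PMF numerically (a sketch, fixing $k=4$ and $\lambda t = 6$ arbitrarily):

```python
import math

def poisson_pmf(k, lam_t):
    # p_{A_t}(k) = (λt)^k e^(-λt) / k!
    return lam_t**k * math.exp(-lam_t) / math.factorial(k)

k, lam_t = 4, 6.0
for n in (10, 100, 10_000):
    q = lam_t / n  # success probability per slot: λδ with δ = t/n
    approx = math.comb(n, k) * q**k * (1 - q) ** (n - k)
    print(n, approx)  # approaches poisson_pmf(4, 6.0) ≈ 0.1339 as n grows
print(poisson_pmf(k, lam_t))
```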

Now, for a number of arrivals $k$, let the random variable $T_k$ denote the length of the time interval needed to see $k$ arrivals.
Using the analogous argument: if the $k$-th arrival happens at time $t$, we have to see exactly one arrival in the interval $(t-\delta, t]$ and exactly $k-1$ arrivals in the interval $[0, t-\delta]$.
Since the two intervals are disjoint, and since $T_k$ is a continuous random variable,
we have $f_{T_k}(t)\,\delta \approx p_{A_{t-\delta}}(k-1) \cdot \lambda\delta$ for small $\delta$. Do some rigorous cancellation of the deltas, and some plug and chug:

$$f_{T_k}(t) = \frac{\lambda^k t^{k-1} e^{-\lambda t}}{(k-1)!}$$ (Erlang distribution)

In particular, $f_{T_1}(t) = \lambda e^{-\lambda t}$, the exponential distribution, with $E[T_1] = \frac{1}{\lambda}$.
And due to the fact that the Poisson process is memoryless,
$E[T_k] = k\,E[T_1] = \frac{k}{\lambda}$.
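A numeric check that the Erlang density above integrates to 1 and has mean $k/\lambda$ (a crude Riemann-sum sketch; the helper name is made up):

```python
import math

def erlang_pdf(t, k, lam):
    # f_{T_k}(t) = λ^k t^(k-1) e^(-λt) / (k-1)!
    return lam**k * t ** (k - 1) * math.exp(-lam * t) / math.factorial(k - 1)

k, lam = 4, 2.0
dt, T = 1e-3, 20.0  # step size and cutoff; the tail beyond T is negligible
ts = [i * dt for i in range(1, int(T / dt))]
total = sum(erlang_pdf(t, k, lam) * dt for t in ts)
mean = sum(t * erlang_pdf(t, k, lam) * dt for t in ts)
print(total)  # ≈ 1
print(mean)   # ≈ k/λ = 2
```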

Merging and splitting:

Poisson processes:

Remember that the arrival rate $\lambda$ is enough to determine a Poisson process. Imagine we have two Poisson signals with rates $\lambda_1, \lambda_2$, and a machine that records an arrival if, in some small time interval, either one or both of the Poisson signals record an arrival.
Let us use some notation and call the two Poisson signals $Q(\lambda_1)$ and $Q(\lambda_2)$, and the merged signal $Q$.
Then, for $Q$ to record an arrival in the small interval $\delta$, we have three mutually exclusive cases: only $Q(\lambda_1)$ records an arrival, with probability $\lambda_1\delta(1-\lambda_2\delta)$; only $Q(\lambda_2)$ records an arrival, with probability $(1-\lambda_1\delta)\lambda_2\delta$; or they both do, with probability $\lambda_1\lambda_2\delta^2$.
And so the probability that $Q$ records an arrival in the small interval $\delta$ is the sum of these three possibilities, which is $(\lambda_1+\lambda_2)\delta$. Therefore $Q$ is equivalent to the Poisson process with arrival rate $\lambda_1+\lambda_2$ !!!!!!!! (we have to ignore higher-order terms like $\delta^2$).
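Simulating the merge (a sketch; arrivals are generated from exponential inter-arrival gaps, which is exactly the $f_{T_1}$ above):

```python
import random

def poisson_arrival_times(lam, horizon, rng):
    # arrival times of a rate-λ Poisson process on [0, horizon],
    # generated from exponential inter-arrival gaps
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(1)
lam1, lam2, horizon = 1.5, 2.5, 20_000
merged = sorted(poisson_arrival_times(lam1, horizon, rng)
                + poisson_arrival_times(lam2, horizon, rng))
rate = len(merged) / horizon
print(rate)  # ≈ λ1 + λ2 = 4
```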

In the small interval $\delta$, given that we record an arrival in $Q$, what is the probability that $Q(\lambda_1)$ recorded a signal in that interval? Using conditioning, we get (ignoring $\delta^2$ terms) that given $Q$ recorded an arrival, the probability that $Q(\lambda_1)$ recorded that signal is $\frac{\lambda_1}{\lambda_1+\lambda_2}$.
Analogously, we can merge two Bernoulli processes $B(p_1), B(p_2)$: the probability that in any single trial the merged $B$ succeeds is $p_1(1-p_2) + p_2(1-p_1) + p_1 p_2$, hence $B$ is the Bernoulli process with each trial succeeding with probability $p_1 + p_2 - p_1 p_2$, not $p_1 + p_2$.
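The Bernoulli merge probability is easy to check by simulation (a sketch):

```python
import random

rng = random.Random(2)
p1, p2, trials = 0.3, 0.5, 100_000
# the merged process succeeds if either independent trial succeeds
hits = sum(1 for _ in range(trials)
           if rng.random() < p1 or rng.random() < p2)
freq = hits / trials
print(freq)  # ≈ p1 + p2 - p1*p2 = 0.65
```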

Splitting Poisson
Support/Figures/Pasted image 20250119133047.png

Given a Poisson signal $Q$, we split it into two signals $Q_1, Q_2$ such that (independently) in any small time interval of length $\delta$, if $Q$ records an arrival, with probability $p$ I send it to $Q_1$ and with probability $1-p$ I send it to $Q_2$.
It is not difficult to see that $Q_1$ is a Poisson process with arrival rate $\lambda p$ and $Q_2$ is one with arrival rate $\lambda(1-p)$, where $\lambda$ is the arrival rate of $Q$ itself.
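A quick simulation of the split (a sketch; each arrival of $Q$ is routed by an independent coin flip):

```python
import random

rng = random.Random(3)
lam, p, horizon = 3.0, 0.4, 20_000
n1 = n2 = 0
t = 0.0
while True:
    t += rng.expovariate(lam)  # next arrival of the original process Q
    if t > horizon:
        break
    if rng.random() < p:       # route to Q1 with probability p ...
        n1 += 1
    else:                      # ... else to Q2
        n2 += 1
print(n1 / horizon)  # ≈ λp = 1.2
print(n2 / horizon)  # ≈ λ(1-p) = 1.8
```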

We can analogously split one Bernoulli process into two this way as well. :)