Tuesday, October 30, 2012

Expectation

E[X] = Σ x P(X = x) = ∫ x p(x) dx  [p() is the pdf here]

Clearly E[f(X)] = Σ y P(Y = y) [where Y = f(X)] = Σ f(x) P(X = x) = ∫ f(x) p(x) dx [by mapping each y back to the x's with f(x) = y]

since P(Y = y) = Σ P(X = x)
                [∀ x s.t. f(x) = y]

For any two random variables X and Y, E[X+Y] = E[X] + E[Y] [linearity of expectation; no independence is needed]

Some candy expectations :-
V[X] (variance) = E[(X - E[X])^2] = E[X^2] - E[X]^2 = σ(X)^2 [σ(X) is the standard deviation]
E[X] = μ(X) [mean]

Moreover
x s.t P(X ≤ x) = 1/2 [median]
x s.t P(X = x) = max_s P(X = s) [mode]

E[X^n] =  n'th moment of X.
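The definitions above can be computed directly for any discrete distribution. A minimal sketch, assuming a fair six-sided die as the example pmf:

```python
# n'th raw moment E[X^n] = Σ x^n p(x) of a discrete pmf given as {value: probability}.
def moment(pmf, n):
    return sum(x**n * p for x, p in pmf.items())

# Fair six-sided die (hypothetical example, just for illustration).
pmf = {x: 1 / 6 for x in range(1, 7)}

mean = moment(pmf, 1)                  # E[X]  = 3.5
variance = moment(pmf, 2) - mean**2    # V[X] = E[X^2] - E[X]^2 = 35/12
```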

Moment Generating Function:
M(X,t) = E[e^(Xt)] is the moment generating function, where ∂^n M(X,t)/∂t^n | t = 0 = n'th moment E[X^n]
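The derivative property of the MGF can be sanity-checked numerically with finite differences; the fair-die pmf below is just an illustrative assumption, not anything from the text.

```python
from math import exp

# M(t) = E[e^{Xt}] for a fair six-sided die (hypothetical example pmf).
pmf = {x: 1 / 6 for x in range(1, 7)}
M = lambda t: sum(p * exp(x * t) for x, p in pmf.items())

# Central-difference approximations of the derivatives of M at t = 0:
h = 1e-4
first = (M(h) - M(-h)) / (2 * h)             # ≈ M'(0)  = E[X]   = 3.5
second = (M(h) - 2 * M(0) + M(-h)) / h**2    # ≈ M''(0) = E[X^2] = 91/6
```

The approximations agree with the moments computed directly from the pmf, which is exactly the "derivatives at 0 give moments" claim above.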

Probability Generating Function :

G(X,z) = E[z^X], and (1/n!) ∂^n G(X,z)/∂z^n | z = 0 = P(X = n)

Covariance : Cov(X,Y) = E[(X - E[X])(Y-E[Y])] = E[XY] - E[X]E[Y]

Lemma :
V(X+Y) = V(X) + V(Y) + 2Cov(X,Y) = E[(X+Y)^2] - E[(X+Y)]^2

X ⊥ Y [X independent of Y] => Cov(X,Y) = 0, i.e. E[XY] = E[X]E[Y] [the converse is not true in general]
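A small numeric check of the covariance formula, and of the fact that zero covariance does not imply independence. The joint pmf below is a made-up example: X uniform on {-1, 0, 1} and Y = X^2.

```python
# Expectations over a joint pmf given as {(x, y): probability}.
def E(joint, f):
    return sum(f(x, y) * p for (x, y), p in joint.items())

# Cov(X,Y) = E[XY] - E[X]E[Y]
def cov(joint):
    return E(joint, lambda x, y: x * y) - E(joint, lambda x, y: x) * E(joint, lambda x, y: y)

# X uniform on {-1, 0, 1}, Y = X^2.  Cov(X,Y) = 0, yet X and Y are clearly
# dependent: P(X=0, Y=0) = 1/3 while P(X=0)P(Y=0) = 1/3 * 1/3 = 1/9.
joint = {(-1, 1): 1 / 3, (0, 0): 1 / 3, (1, 1): 1 / 3}
print(cov(joint))  # 0.0
```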

Probability and Probability Theory

Probability is the measure of the expectation that an event will occur or that a statement is true.
Probability theory is the branch of mathematics that analyses probability with the help of some basic axioms.

The probability measure P is defined on the space (Ω, 2^Ω, P) as follows:
 Ω : [sample space] the set of all possible outcomes
 E : a subset of Ω, called an event

 P : 2^Ω → [0,1] s.t. the following three axioms [Kolmogorov axioms] hold:

1. P(E) ≥ 0
2. P(Ω) = 1
3. For a countable sequence of disjoint ( mutually exclusive ) events E1 , E2 ...
    P(E1 ∪ E2 ∪ ...) = P(E1) + P(E2) + ...

It further follows that:
1. P(A ∪ B) = P(A) + P(B) - P(A ∩ B), and more generally the inclusion-exclusion principle.
2. P(∅) = 0
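The axioms and the consequences above can be checked mechanically on a finite uniform space; the die and the events below are illustrative assumptions.

```python
from fractions import Fraction

# Uniform probability on a finite sample space (a single die roll),
# as a sanity check of the Kolmogorov axioms and inclusion-exclusion.
omega = set(range(1, 7))
P = lambda E: Fraction(len(E), len(omega))

A = {2, 4, 6}   # "roll is even"
B = {4, 5, 6}   # "roll is greater than 3"

assert P(omega) == 1 and P(set()) == 0            # axioms 2 and P(∅) = 0
assert P(A | B) == P(A) + P(B) - P(A & B)         # inclusion-exclusion
assert P(A - B) + P(A & B) == P(A)                # additivity on disjoint parts
```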

Conditional Probability :
P(A|B) = probability that event A occurs given that B occurred = P(A ∩ B)/P(B) [this is a definition]
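A sketch of this definition on the uniform space of two die rolls; the particular events A and B are assumed here for illustration.

```python
from fractions import Fraction

# P(A|B) = P(A ∩ B) / P(B) on the uniform space of two die rolls.
omega = {(i, j) for i in range(1, 7) for j in range(1, 7)}
P = lambda E: Fraction(len(E), len(omega))

A = {w for w in omega if w[0] + w[1] == 7}   # "the sum is 7"
B = {w for w in omega if w[0] == 3}          # "the first die shows 3"

print(P(A & B) / P(B))  # 1/6: given the first die is 3, only (3,4) gives sum 7
```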

In the continuous and general case Ω may be uncountable, and some e ∈ Ω may have probability 0. There we enumerate events as instances of a random variable X [taking values in a subset of R] (here X = x is an event) and introduce two new terms: the cdf (cumulative distribution function) F(x) = P(X ≤ x) and the pdf (probability density function) p(x) = dF(x)/dx.

Obviously in the continuous case P(X = x) = 0 for some or all x, while in the discrete case P(X = x) is the probability of the event X = x.

X ⊥ Y [X is independent of Y] iff P(X ∩ Y) = P(X)P(Y)

Sunday, October 7, 2012

Lucas Series

Like Fibonacci, the Lucas numbers follow this definition:

(L(n)) | L(n) = L(n-1) + L(n-2) , L(1) = 1 , L(0) = 2

See this link for further info http://en.wikipedia.org/wiki/Lucas_number

Properties of Lucas Series:
1. L(n) = F(n-1) + F(n+1)
2. L(m+n) = L(m+1)F(n) + L(m)F(n-1)
3. F( 2n ) = L(n)F(n)
4. F(n) = (L(n-1) + L(n+1)) / 5
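The four properties can be verified numerically with a short script; the `fib` and `lucas` helpers below are plain iterative implementations written for this check, not from any particular library.

```python
# Iterative Fibonacci: F(0) = 0, F(1) = 1, F(n) = F(n-1) + F(n-2).
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Iterative Lucas: L(0) = 2, L(1) = 1, L(n) = L(n-1) + L(n-2).
def lucas(n):
    a, b = 2, 1
    for _ in range(n):
        a, b = b, a + b
    return a

for n in range(2, 15):
    assert lucas(n) == fib(n - 1) + fib(n + 1)              # property 1
    assert fib(2 * n) == lucas(n) * fib(n)                  # property 3
    assert 5 * fib(n) == lucas(n - 1) + lucas(n + 1)        # property 4
for m in range(2, 10):
    for n in range(1, 10):
        assert lucas(m + n) == lucas(m + 1) * fib(n) + lucas(m) * fib(n - 1)  # property 2
```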

Fibonacci Number

Let's look at a different topic today. It's time to take a fresh look at some beautiful series.

Fibonacci Number : http://en.wikipedia.org/wiki/Fibonacci_number Here you can find ample info, but I want to collect some candy information in one place :-

1. (F(n)) | F(n) = F(n-1) + F(n-2), F(0) = 0, F(1) = 1. This is the definition of the sequence, called the
    Fibonacci sequence.
2. It is an infinite diverging sequence:
    lim (n → ∞) F(n+1)/F(n) = 1 + 1/(1 + 1/(...)) = y (say) => y = 1 + 1/y => y^2 - y - 1 = 0 => y = φ > 1,
    thus it is divergent.

3. gcd(F(n),F(m)) = F(gcd(n,m)) 

4. F(n) = Σ( k = 0 to floor((n-1)/2) , C(n-k-1,k)) the second diagonal sum of Pascal triangle.   

5. (F(n+1) ; F(n)) = (1 1 ; 1 0)^n (1 ; 0)
6. F(m)F(n) + F(m-1)F(n-1) = F(m+n-1) [ From above relation ]

7. G(x) =  generating function = x / (1 - x - x^2) [ From relation 1]

8. F(0) + F(1) + ... + F(n) = F(n+2) - 1
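Properties 3, 4, 6 and 8 can all be verified with a short script; the `fib` helper below is a plain iterative implementation written for this check.

```python
from math import comb, gcd

# Iterative Fibonacci: F(0) = 0, F(1) = 1, F(n) = F(n-1) + F(n-2).
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

for n in range(1, 20):
    # property 4: second-diagonal sums of Pascal's triangle
    assert fib(n) == sum(comb(n - k - 1, k) for k in range((n - 1) // 2 + 1))
    # property 8: partial sums
    assert sum(fib(i) for i in range(n + 1)) == fib(n + 2) - 1
for m in range(1, 12):
    for n in range(1, 12):
        # property 3: gcd identity
        assert gcd(fib(m), fib(n)) == fib(gcd(m, n))
        # property 6: addition law
        assert fib(m) * fib(n) + fib(m - 1) * fib(n - 1) == fib(m + n - 1)
```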

You can go further by seeing this http://en.wikipedia.org/wiki/Pisano_period