
This site is an archived version of Voteview.com archived from University of Georgia on May 23, 2017. This point-in-time capture includes all files publicly linked on Voteview.com at that time. We provide access to this content as a service to ensure that past users of Voteview.com have access to historical files. This content will remain online until at least January 1st, 2018. UCLA provides no warranty or guarantee of access to these files.

45-733 PROBABILITY AND STATISTICS I Notes #5B


February 2000



Poisson Distribution

  1. The Poisson distribution is a discrete distribution that is a good model for discrete events that occur over space, time, or volume. Indeed, many physical processes, such as the decay of radioactive particles, are modeled precisely by the Poisson distribution, and we say that a Poisson process is at work. Examples are legion: the number of calls coming into a switchboard during a specified period of time is distributed as a Poisson random variable; the number of people arriving at a banking machine; the number of cars arriving at a turnpike entrance; typing errors on a page; automobile accidents at an intersection; failures of machines in a factory; etc.
    The Poisson distribution is written as:
    
                             ⎧ (λ^x e^(-λ))/x!    x = 0, 1, 2, 3, 4, ...
                      f(x) = ⎨
                             ⎩ 0                   otherwise
    

    where lambda, λ, is the average number of occurrences of the phenomenon in one unit of time, space, or volume.
    This is a legal probability distribution. To see this, note first that:
    f(x) ≥ 0 and
    Σ_{x=0}^{∞} f(x) = Σ_{x=0}^{∞} (λ^x e^(-λ))/x! = e^(-λ) Σ_{x=0}^{∞} λ^x/x! = 1
    because Σ_{x=0}^{∞} λ^x/x! = 1 + λ + λ^2/2! + λ^3/3! + λ^4/4! + ... = e^λ
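The claim that these probabilities sum to 1 is easy to check numerically. A minimal Python sketch (the helper name poisson_pmf is mine, not from the text):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for a Poisson random variable with mean lam."""
    return lam ** x * exp(-lam) / factorial(x)

# Probabilities are nonnegative and sum to 1; with lam = 2 the
# tail beyond x = 100 is negligible.
total = sum(poisson_pmf(x, 2.0) for x in range(100))
print(total)  # 1.0 (to within floating-point rounding)
```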

  2. The mean of the Poisson distribution:
    E(X) = Σ_{x=0}^{∞} x f(x) = Σ_{x=0}^{∞} x(λ^x e^(-λ))/x! = Σ_{x=1}^{∞} x(λ^x e^(-λ))/x! =
    Σ_{x=1}^{∞} (λ^x e^(-λ))/(x-1)! = λ[Σ_{x=1}^{∞} (λ^(x-1) e^(-λ))/(x-1)!]

    Now, let y = x - 1
    E(X) = λ[Σ_{y=0}^{∞} (λ^y e^(-λ))/y!] = λ, since the bracketed sum is the full Poisson pmf and equals 1.

  3. The variance of the Poisson distribution is also λ.

  4. Problem 3.82 p.117
    Let X = number of customers per hour. We are given that λ = 7.
    1. What is the probability that no more than 3 customers arrive?
      P(X ≤ 3) = F(3) = .082
      Using the table on pp. 725-729.
      Note that the correct interpretation of a problem like this is that we are randomly drawing an hour from a large number of hours. This is the probability that we would get 3 or fewer customers during a randomly drawn hour.

    2. What is the probability that at least 2 customers arrive?
      P(X ≥ 2) = 1 - F(1) = 1 - .007 = .993
    3. What is the probability that exactly 5 customers arrive?
      P(X = 5) = F(5) - F(4) = .301 - .173 = .128
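Instead of the table, all three answers can be computed directly from the Poisson CDF. A sketch with λ = 7 (the helper name poisson_cdf is mine):

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    """F(k) = P(X <= k) for a Poisson random variable with mean lam."""
    return sum(lam ** x * exp(-lam) / factorial(x) for x in range(k + 1))

lam = 7.0
print(round(poisson_cdf(3, lam), 3))                        # 0.082 = P(X <= 3)
print(round(1 - poisson_cdf(1, lam), 3))                    # 0.993 = P(X >= 2)
print(round(poisson_cdf(5, lam) - poisson_cdf(4, lam), 3))  # 0.128 = P(X = 5)
```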
  5. Traffic Accident Problem
    Let X = Number of traffic accidents during a 48-hour period from midnight Friday to midnight Sunday. Suppose that:
    λ = 3.2
    P(4 ≤ X ≤ 6) = F(6) - F(3) = .955 - .603 = .352
    Using the Table on pp.725-729.
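Computing the same interval probability directly (helper name mine) gives .3529; the table answer .352 reflects rounding F(6) and F(3) to three places before subtracting:

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    return sum(lam ** x * exp(-lam) / factorial(x) for x in range(k + 1))

# P(4 <= X <= 6) = F(6) - F(3) with lam = 3.2
prob = poisson_cdf(6, 3.2) - poisson_cdf(3, 3.2)
print(round(prob, 4))  # 0.3529
```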

  6. Problem 3.95 p.118
    Let Y = Number of defects per foot in the rope.
    We are given that λ = 2 and that
    Profit = X = 50 - 2Y - Y^2.
    E(X) = E(50 - 2Y - Y^2) = 50 - 2E(Y) - E(Y^2) = 50 - 4 - E(Y^2)
    using the fact that E(Y) = λ = 2.
    Recall that: VAR(Y) = E(Y^2) - [E(Y)]^2
    Hence: E(Y^2) = VAR(Y) + [E(Y)]^2 = λ + λ^2 = 2 + 4 = 6.
    And E(X) = 50 - 4 - 6 = $40.00

  7. Poisson Approximation to the Binomial
    If n is large and p is small, then the Poisson distribution can be used to approximate binomial probabilities by setting:
    λ = np
    To see this:
    f(x) = [n choose x] p^x (1 - p)^(n-x) = {[n(n - 1)(n - 2)(n - 3)···(n - x + 1)]/x!} p^x (1 - p)^(n-x)
    Now, let λ = np so that p = λ/n. Hence:
    {[n(n - 1)(n - 2)(n - 3)···(n - x + 1)]/x!} (λ/n)^x (1 - λ/n)^(n-x) =
    [λ^x/x!][n/n][(n-1)/n][(n-2)/n]···[(n-x+1)/n][1 - (λ/n)]^n [1 - (λ/n)]^(-x)

    Let n → +∞ and p → 0 with λ = np held fixed; the limit of the expression above is:
    (λ^x e^(-λ))/x!, because [1 - (λ/n)]^n goes to e^(-λ) and the other terms go to 1.
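The quality of the approximation is easy to see side by side. A sketch with a hypothetical n = 1000 and p = .003 (so λ = 3); the function names are mine:

```python
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    return lam ** x * exp(-lam) / factorial(x)

# Large n, small p: the binomial pmf and the Poisson pmf with lam = n*p
# agree to about three decimal places.
n, p = 1000, 0.003
for x in range(6):
    print(x, round(binom_pmf(x, n, p), 5), round(poisson_pmf(x, n * p), 5))
```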

  8. To see why this is true, consider a chunk of time or space. Suppose we know that a Poisson process is at work with mean λ = 2. We could divide this chunk into a large number, n, of smaller sub-chunks and treat each of these as a Bernoulli trial. That is, let Xi = 1 if the phenomenon occurs in sub-chunk i, and 0 if not. The probability of a success here is simply λ/n. Letting Z = X1 + X2 + ... + Xn be the total number of occurrences:
    E(Z) = E[X1 + X2 + ... + Xn] = E(X1) + E(X2) + ... + E(Xn) =
    λ/n + λ/n + ... + λ/n = λ

    The difficulty with this formulation is that it does not rule out two (or more) occurrences of the phenomenon within a single sub-chunk.
    Hence, we could make the sub-chunks smaller. But no matter how small the sub-chunks, the same objection could be raised. Consequently, we let n go to +∞ so that p goes to zero, and the Poisson distribution emerges as the limit of a sum of ever more Bernoulli trials, each defined over a vanishingly small unit of time or space.

  9. Cancer Example
    In a large population the occurrence of a particular form of cancer is 1 in 10,000. Hence, the probability that a randomly drawn person will have the cancer is .0001. If we take a sample of 30,000 people, what is the probability that at least 3 have the cancer?
    Let X = Number of people with cancer.
    p = .0001 and n = 30,000, so that λ = np = 3
    P(X ≥ 3) = 1 - F(2) = 1 - .423 = .577
    Using the Table on pp.725-729.
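Here n is so large and p so small that the approximation is essentially exact, which can be confirmed by computing the binomial answer directly:

```python
from math import comb, exp, factorial

# Poisson approximation with lam = n*p = 30000 * .0001 = 3
approx = 1 - sum(3.0 ** x * exp(-3.0) / factorial(x) for x in range(3))
# Exact binomial with n = 30,000 and p = .0001
exact = 1 - sum(comb(30000, x) * 0.0001 ** x * 0.9999 ** (30000 - x)
                for x in range(3))
print(round(approx, 3), round(exact, 3))  # 0.577 0.577
```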

  10. Problem 3.93 p.117
    From the problem statement we have:
    n = 30 inoculated mice, and p = .2 is the probability that an inoculated mouse will contract the disease. Hence
    λ = np = 30(.2) = 6
    Let X = "number of inoculated mice that will contract the disease"
    Therefore, P(X ≤ 3) = F(3) = .151, using the table on p. 726.
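In this problem the Poisson approximation is noticeably rough, because p = .2 is not small; comparing it with the exact binomial answer makes the point:

```python
from math import comb, exp, factorial

# Poisson approximation with lam = n*p = 30 * .2 = 6
approx = sum(6.0 ** x * exp(-6.0) / factorial(x) for x in range(4))
# Exact binomial with n = 30 and p = .2
exact = sum(comb(30, x) * 0.2 ** x * 0.8 ** (30 - x) for x in range(4))
print(round(approx, 3), round(exact, 3))  # 0.151 0.123
```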