Introduction to Modeling and Analysis of Stochastic Systems (Springer Texts in Statistics)

This book provides a self-contained review of all the relevant topics in probability theory. It is suited for undergraduate students in engineering, operations research, statistics, mathematics, actuarial science, business management, computer science, and public policy. It employs a large number of examples to teach the students to use stochastic models of real-life systems to predict their performance, and to use this analysis to design better systems. The book is devoted to the study of important classes of stochastic processes, and all the material is illustrated with many examples and case studies.

The book provides a concise review of probability in the appendix and emphasizes numerical answers to the problems. From the reviews of the second edition: "The author has added a new chapter on Poisson processes and another one on Brownian motion. The discussion is kept on an elementary level and does not require any knowledge of measure theory or advanced calculus."

The next theorem gives a simple method of computing the occupancy times: the occupancy times matrix is given by M(n) = Σ_{r=0}^{n} P^r. To see this, fix i and j; we have, from Theorem 2., m_{i,j}(n) = Σ_{r=0}^{n} [P^r]_{i,j}. Note that this includes the visit at time 0. As an illustration, consider the three-state weather model described in Example 2. Suppose it is sunny in Heavenly today, and compute the expected number of rainy days in the week starting today. Let Xn be the weather in Heavenly on the nth day.

Then, from Example 2., {Xn} is a DTMC, and the required quantity is given by the entry m_{1,3} of the occupancy matrix M(n) for a seven-day horizon.

In this section, we study the limiting behavior of Xn as n tends to infinity. We start with the most obvious question: does the pmf of Xn approach a limit as n tends to infinity? If the limiting distribution exists, is it unique? This question makes sense since it is conceivable that the limit may depend upon the starting state, or the initial distribution of the DTMC.

Finally, the question of practical importance is: if there is a unique limiting distribution, how do we compute it? It so happens that the answers to the first two questions are complex, but the answer to the last question is easy; hence we give that first in the following theorem. Assuming that the limiting distribution exists, we see that it must satisfy the balance equations together with the normalizing equation. We illustrate Theorem 2. by computing P^n numerically for increasing n. Furthermore, all the rows of P^n are the same in the limit, implying that the limiting distribution of Xn is the same regardless of the initial distribution.
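The two computations just described (the occupancy matrix of the previous section and the matrix powers P^n) are easy to carry out numerically. The following Python sketch uses a hypothetical three-state transition matrix, not the actual data of the weather example, and is only meant to show the mechanics.

```python
import numpy as np

# Hypothetical 3-state transition matrix (e.g., sunny/cloudy/rainy); rows sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.3, 0.3],
              [0.2, 0.5, 0.3]])

def occupancy_matrix(P, n):
    """M(n) = sum_{r=0}^{n} P^r: entry (i, j) is the expected number of visits
    to state j during times 0, 1, ..., n, starting in state i."""
    M = np.zeros_like(P)
    Pr = np.eye(P.shape[0])          # P^0
    for _ in range(n + 1):
        M += Pr
        Pr = Pr @ P
    return M

M6 = occupancy_matrix(P, 6)          # a 7-day horizon: times 0 through 6
print(M6)                            # M6[0, 2]: expected days in state 2 if we start in state 0

# Limiting behavior: for an irreducible aperiodic DTMC, P^n converges and all
# rows of P^n become identical; each row is then the limiting distribution.
print(np.linalg.matrix_power(P, 100))
```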

For the DTMC of Example 2., however, the pmf of Xn does not approach a limit: it fluctuates between two pmfs that depend on the initial distribution. That DTMC has no limiting distribution. Although there is no limiting distribution for the DTMC of that example, we can still solve the balance equations and the normalizing equation.


We call this initial distribution a stationary distribution. Formally, we have the following definition. The questions about the limiting distribution (namely, existence, uniqueness, and method of computation) can be asked about the stationary distribution as well. We have a slightly stronger result for the stationary distribution, as given in the following theorem: it implies that if the DTMC starts with a stationary distribution, then Xn has that same distribution for all n. But, using n = 0 in 2., this yields the following corollary: a limiting distribution, when it exists, is also a stationary distribution.


Then, from Theorem 2., we obtain the balance equations; but these are the same as 2. Hence, from Theorem 2., the result follows. The next example shows that the limiting or stationary distributions need not be unique: the limiting distribution may exist and yet depend on the initial distribution, in which case it is not unique. It should be clear by now that we need to find a normalized solution, i.e., a solution of the balance equations that also satisfies the normalizing equation. There is another important interpretation of the normalized solution to the balance equations, as discussed below.
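Before turning to that interpretation, here is a minimal Python sketch of how a normalized solution can be computed: it solves the balance equations πP = π together with the normalizing equation Σ_j π_j = 1 by replacing one (redundant) balance equation with the normalization. The matrix is a stand-in, not the one from the book's example.

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi P = pi together with sum(pi) = 1 for a finite-state DTMC.

    The balance equations (P^T - I) pi = 0 have one redundant row, so we
    replace the last row with all ones, which encodes the normalizing equation.
    """
    n = P.shape[0]
    A = P.T - np.eye(n)
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.3, 0.3],
              [0.2, 0.5, 0.3]])
pi = stationary_distribution(P)
print(pi)          # normalized solution
print(pi @ P)      # should reproduce pi, confirming the balance equations
```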

We studied the expected value of this quantity in Section 2. The occupancy of state j is defined as the long-run expected fraction of the time the DTMC spends in state j; the occupancies satisfy the balance equations and the normalization equation 2. Will there always be a solution? Will it be unique? When can this solution be interpreted as a limiting distribution, stationary distribution, or occupancy distribution? Although these questions can be answered strictly in terms of solutions of linear systems of equations, it is more useful to develop the answers in terms of the DTMC framework. That is what we do below. Note that the condition in 2. says that every state can be reached from every other state; a DTMC with this property is called irreducible.

It is in general easy to check whether a DTMC is irreducible. The usefulness of the concept of irreducibility arises from the following two theorems, whose proofs are beyond the scope of this book. A finite-state irreducible DTMC has a unique stationary distribution, i.e., a unique normalized solution of the balance equations. A finite-state irreducible DTMC also has a unique occupancy distribution, and it is equal to the stationary distribution. Next we introduce the concept of periodicity. This will help us decide when the limiting distribution exists.

A DTMC with period d can return to its starting state only at times d, 2d, 3d, .... It is an interesting fact about irreducible DTMCs that it is sufficient to find the largest d satisfying 2. for any one state; all other states are guaranteed to produce the same d. This makes it easy to establish the periodicity of an irreducible DTMC. Periodicity is also easy to spot from the transition diagrams. First, define a directed cycle in the transition diagram as a directed path from any node to itself.

If the lengths of all the directed cycles in the transition diagram of the DTMC are multiples of some integer d, and d is the largest such integer, then this is the d of the definition above. The usefulness of the concepts of irreducibility and aperiodicity arises from the following main theorem, whose proof is beyond the scope of this book: a finite-state irreducible aperiodic DTMC has a unique limiting distribution. We shall restrict ourselves to irreducible and aperiodic DTMCs in our study of limiting behavior, and refer the reader to an advanced text for a more complete discussion of the other cases.
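As a small illustration of how the period can be found computationally, the sketch below takes the greatest common divisor of the return times n for which the n-step return probability to a fixed state is positive; for an irreducible DTMC one state suffices. Scanning n up to a modest bound is a practical heuristic, not a proof, and the example matrix is made up.

```python
import numpy as np
from math import gcd

def period_of_state(P, i, max_n=None):
    """Period of state i: gcd of all n >= 1 with (P^n)[i, i] > 0
    (scanned up to max_n, a heuristic bound)."""
    N = P.shape[0]
    max_n = max_n or 2 * N * N
    d = 0
    Pn = np.eye(N)
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:
            d = gcd(d, n)
            if d == 1:          # aperiodic: no need to look further
                break
    return d

# A two-state chain that alternates deterministically has period 2,
# so it has no limiting distribution (although it has a stationary one).
P_periodic = np.array([[0.0, 1.0],
                       [1.0, 0.0]])
print(period_of_state(P_periodic, 0))   # prints 2
```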

We end this section with several examples. This DTMC is irreducible and aperiodic. Hence the limiting distribution, the stationary distribution, and the occupancy distribution all exist and are given by the unique solution to the balance equations together with the normalizing equation. Note that although we have four equations in three unknowns, one of the balance equations is redundant. This matches our answer in Example 2. The second DTMC is irreducible but periodic; hence there is no limiting distribution.

This matches the numerical analysis presented in Example 2. Let Xn be the number of packets in the buffer at the beginning of the nth time slot. We want to compute the long-run fraction of the time the buffer is full, i.e., the occupancy of state 7 (the buffer capacity). The expected number of packets in the buffer in steady state is then given by lim_{n→∞} E[Xn] = Σ_{j=0}^{7} j π_j.
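The computation of such steady-state quantities is mechanical once the stationary distribution is in hand. The following Python sketch builds a hypothetical capacity-7 buffer chain (one packet transmitted per slot, 0 to 2 arrivals with made-up probabilities, overflow lost), so the numbers are illustrative rather than those of the example.

```python
import numpy as np

arrivals = {0: 0.5, 1: 0.3, 2: 0.2}   # hypothetical arrival distribution per slot
cap = 7
P = np.zeros((cap + 1, cap + 1))
for i in range(cap + 1):
    after_service = max(i - 1, 0)      # one packet transmitted if the buffer is nonempty
    for a, pa in arrivals.items():
        j = min(after_service + a, cap)   # arrivals beyond the capacity are lost
        P[i, j] += pa

# Stationary/occupancy distribution: balance equations plus normalization.
A = P.T - np.eye(cap + 1)
A[-1, :] = 1.0
b = np.zeros(cap + 1)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

print("long-run fraction of time the buffer is full:", pi[cap])
print("steady-state expected number of packets:", sum(j * pi[j] for j in range(cap + 1)))
```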

Consider the manufacturing operation of the Gadgets-R-Us company as described in Example 2. Compute the long-run fraction of the time that both machines are operating. This is an irreducible and aperiodic DTMC; hence the occupancy distribution exists and is given by the solution to the balance and normalizing equations.

In this section, we develop methods of computing such costs. We start with a simple cost model first. Let Xn be the state of a system at time n. Suppose the system incurs a random cost of C(Xn) at time n, with c(i) denoting the expected cost incurred per visit to state i. Although we think of c(i) as a cost, it may be any other quantity, like reward per visit, loss per visit, profit per visit, etc.

We shall consider two cost-performance measures in the two subsections below. The actual cost incurred at time r is C(Xr). Hence the actual total cost up to time n is given by Σ_{r=0}^{n} C(Xr). Next, let c be the column vector of the expected costs c(i) per visit. The next theorem gives a method of computing g, the vector of expected total costs over a finite horizon. We illustrate the theorem with several examples. Consider the manufacturing model of Example 2.

Assume that both bins are empty at the beginning of a shift. Compute the expected total number of assembled units produced during an 8-hour shift. The transition probability matrix is as given in 2. Thus, if i = 0, both the bins are empty, and a unit is assembled in the next hour if both machines produce nondefective components. Hence the expected number of assembled units produced per visit to state 0 is c(0) = a1 a2. A similar analysis for the other states yields the vector c.

We want to compute g. Note that the production during the eighth hour is counted as production at time 7. If there were no defectives, the production would be 8 units; thus the loss due to defective production is 8 minus the expected total production computed above.

Next, consider the PC store (Computers-R-Us) of Example 2. Compute the net revenue the store expects to get over the next 10 weeks, assuming that it begins with five PCs in stock at the beginning of the week.

We are given X0 = 5. If there are i PCs at the beginning of the nth week, the expected storage cost during that week is 50i. Let Dn be the demand during the nth week. Then the expected number of PCs sold during the nth week is E[min(i, Dn)]. Hence the expected net revenue per week starting in state i is c(i), the expected sales revenue minus the storage cost 50i. Computing the expectations above, we get the vector c; hence we need to compute g. Note that the figure is higher if the initial inventory is 4! This is the result of storage costs.

In such cases, it makes more sense to compute the expected long-run cost rate, defined as the limit of the expected cost per period, g. The theorem that gives it is intuitive: the DTMC incurs a cost of c(j) every time it visits state j.

Since the DTMC spends a long-run fraction π_j of its time in state j, the expected cost per period in the long run must be g = Σ_j π_j c(j). We can use Theorem 2. to compute it.
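Both cost measures reduce to matrix arithmetic, as the examples that follow (the payroll and packet-loss computations) illustrate. Here is a hedged Python sketch with invented numbers: the finite-horizon expected total cost Σ_{r=0}^{n} P^r c and the long-run cost rate Σ_j π_j c(j).

```python
import numpy as np

P = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])       # hypothetical 3-state DTMC
c = np.array([10.0, 25.0, 40.0])      # hypothetical expected cost per visit to each state

def expected_total_cost(P, c, n):
    """g(n)[i] = E[ sum_{r=0}^{n} c(X_r) | X_0 = i ] = sum_{r=0}^{n} (P^r c)[i]."""
    g = np.zeros_like(c)
    Prc = c.copy()                    # P^0 c
    for _ in range(n + 1):
        g += Prc
        Prc = P @ Prc
    return g

print(expected_total_cost(P, c, 7))   # e.g., an 8-period horizon (times 0 through 7)

# Long-run cost rate: g = sum_j pi_j c(j), with pi the occupancy distribution.
A = P.T - np.eye(3)
A[-1, :] = 1.0
pi = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
print("long-run cost per period:", pi @ c)
```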

Suppose the company has 70 employees and this level does not change with time. Compute the long-run weekly payroll expenses for the company. The grade of an employee evolves according to a DTMC with state space {1, 2, 3, 4} and transition probability matrix as given in 2. Since this is an irreducible DTMC, the unique occupancy distribution is obtained using 2.

Next, consider the model of the data switch as described in Examples 2. Compute the long-run packet-loss rate if the parameters of the problem are as in Example 2. Following the analysis of Example 2., we compute the loss rate; it is too high for practical applications.

This loss can be reduced by either increasing the buffer size or reducing the input packet rate. Note that the expected number of packets lost during the nth slot, as computed in Example 2., agrees quite well with the long-run loss rate computed in this example.

In this section, we study the first-passage times in DTMCs. The first-passage time T into state N is the random number of steps that it takes to actually visit state N. Typically T can take values in {0, 1, 2, 3, ...}. We shall study the expected value of this random variable in detail below. Let m_i = E[T | X0 = i]. We condition on X1.

Suppose X0 = i and X1 = j. The following examples illustrate the theorem above. Suppose both machines are up at time 0. Compute the expected time until both machines are down for the first time. We are interested in m2 = E[T | X0 = 2]. Solving the equations simultaneously, we obtain m1 and m2 (in days); thus the expected time until both machines are down is m2.

Next, consider the manpower model of Example 2. Compute the expected amount of time a new recruit spends with the company. Let Xn be the grade of the new recruit at the beginning of the nth week. If the new recruit has left the company by the beginning of the nth week, we set Xn = 0.

Then, using the data in Example 2., let T be the first-passage time into state 0. We are interested in m1 = E[T | X0 = 1]. Solving the equations simultaneously, we obtain m1; thus the new recruit stays with the company for m1 weeks on average. So far we have dealt with a first-passage time into a single state.


What if we are interested in a first-passage time into a set of states? We consider such a case next. Following the same argument as in the proof of Theorem 2., we obtain a system of linear equations for the expected first-passage times, and a matrix language package can be used to solve this system easily. We illustrate this with an example. Consider the model of stock movement as described in Example 2. What is the expected amount of time that Mr. Jones will end up holding the stock? Let Xn be the value of the stock in dollars at the end of the nth day. We are given that X0 = 5. Mr. Jones will sell the stock as soon as Xn is 8, 9, or 10. Thus we are interested in the first-passage time T into the set A = {8, 9, 10}, in particular in m5.
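In matrix form, if Q denotes the transition matrix restricted to the states outside the target set, the expected first-passage times solve (I - Q) m = 1. A hedged Python sketch with a made-up chain, showing both a single target state and a set of target states:

```python
import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.0, 0.4, 0.6]])      # hypothetical DTMC on states {0, 1, 2}

def mean_first_passage(P, target):
    """Expected number of steps to first reach any state in `target`,
    starting from each non-target state: solves (I - Q) m = 1."""
    target = set(target)
    others = [i for i in range(P.shape[0]) if i not in target]
    Q = P[np.ix_(others, others)]        # transitions among the non-target states
    m = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    return dict(zip(others, m))

print(mean_first_passage(P, target={2}))        # first passage into a single state
print(mean_first_passage(P, target={1, 2}))     # first passage into a set of states
```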

Two gamblers, A and B, bet on successive independent tosses of a coin that lands heads up with probability p. If the coin turns up heads, gambler A wins a dollar from gambler B, and if the coin turns up tails, gambler B wins a dollar from gambler A. Thus the total number of dollars between the two gamblers stays fixed, say N. The game stops as soon as either gambler is ruined, i.e., has no money left. Compute the expected duration of the game, assuming that the game stops as soon as one of the two gamblers is ruined. Assume the initial fortune of gambler A is i.
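Using the formulation given next (gambler A's fortune is a DTMC on {0, 1, ..., N} that stops at 0 and N), the expected duration solves m_0 = m_N = 0 and m_i = 1 + p m_{i+1} + (1 - p) m_{i-1}. The sketch below solves this system numerically with illustrative values; for a fair coin, the well-known closed form m_i = i(N - i) provides a check.

```python
import numpy as np

def ruin_duration(N, p):
    """Expected number of tosses until one gambler is ruined, for every
    possible initial fortune i of gambler A (total capital N, P(heads) = p)."""
    A = np.zeros((N + 1, N + 1))
    b = np.zeros(N + 1)
    A[0, 0] = 1.0                      # boundary condition m_0 = 0
    A[N, N] = 1.0                      # boundary condition m_N = 0
    for i in range(1, N):
        A[i, i - 1] = -(1.0 - p)       # m_i - p m_{i+1} - (1-p) m_{i-1} = 1
        A[i, i] = 1.0
        A[i, i + 1] = -p
        b[i] = 1.0
    return np.linalg.solve(A, b)

print(ruin_duration(N=10, p=0.5)[3])   # fair coin: equals 3 * (10 - 3) = 21
print(ruin_duration(N=10, p=0.6)[3])   # biased coin, solved numerically
```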

Let Xn be the amount of money gambler A has after the nth toss. If Xn = 0, then gambler A is ruined and the game stops. If Xn = N, then gambler B is ruined and the game stops. Otherwise the game continues. Thus we are interested in m_i, the expected time until the game ends, starting with X0 = i.

Passport is a consumer credit card company that has a large number of customers or accounts. These customers charge some of their purchases on their Passport cards. The charges made in one month are due by the end of the next month. If a customer fails to make the minimum payment in a given month, the company flags the account as delinquent.

The company keeps track of the payment history of each customer so that it can identify customers who are likely to default on their obligations and not pay their debt to the company. Here we describe the simplest method by which Passport tracks its accounts. A customer is said to be in state (or delinquency stage) k if he or she has missed making the minimum payment for the last k consecutive months.

A customer in state k has four possible futures. Currently the company follows a simple policy. To make the discussion above more precise, let pk be the probability that a customer in state k fails to make the minimum payment in the current period and thus moves to state k + 1. Let qk be the probability that a customer in state k declares bankruptcy in the current period and thus moves to state D. Also, let bk be the average outstanding balance of a customer in state k. We will build stochastic models using DTMCs to help the management of Passport analyze the performance of this policy in a rational way.

First we assume that the state of an account changes in a Markov fashion. Also, when a customer account is terminated or the customer declares bankruptcy, we shall simply replace that account with an active one, so that the number of accounts does not change. This is the same modeling trick we used in the manpower planning model of Example 2. Let Xn be the state of a particular customer account at time n, i.e., in the nth month.

When the customer goes bankrupt or the account is closed, we start a new account in state 0. We assume that {Xn} is a DTMC. Using the data from Table 2., we obtain its transition probabilities. To analyze the performance of the policy, we need a performance measure. Although one can devise many performance measures, here we concentrate on the expected annual loss due to bankruptcies and account closures.

Let lk be the expected loss from an account in state k in one month. Additionally, in state 6, a customer fails to make the minimum payment with probability p6, in which case the account is terminated with probability 1 and creates a loss of b6. Using the data in Table 2. and the transition probability matrix P of 2., we obtain the long-run expected loss rate in dollars per month. Of course, the company must generate far more than this in revenue on average from each account to stay in business. If a customer declares bankruptcy, Passport loses the entire outstanding balance as before.

However, if a customer does not declare bankruptcy, the company can decide to terminate the account and turn it over to the WMB company. When an account is turned over to WMB, it collects the outstanding balance on the account from the account holder by barely legal means. Passport management wants to decide if they should hire WMB and, if they do, when they should turn over an account to them. We give the main steps in the performance evaluation of policy P2.

For comparison, we have also included the annual loss rate of the current policy Pc. It is clear that among the policies above it is best to follow the policy P6; that is, turn over the account to WMB as soon as the account holder fails to make six minimum payments in a row. This policy saves Passport the difference in annual loss rates per account. We are told that Passport has 14 million accounts, which is much larger than the number above; scaled over all accounts, the policy will save Passport correspondingly more per year.

At this point, we should examine what assumptions we have made that may not be accurate. First of all, what we have presented here is an enormously simplified version of the actual problem faced by a credit card company.

We have assumed that all accounts are stochastically similar and independent. Both these assumptions are patently untrue. In practice, the company will classify the accounts into different classes so that accounts within a class are similar. The independence assumption might be invalidated if a large fraction of the account holders work in a particular sector of the economy (such as real estate); if that sector suffers, then the bankruptcy rates will be affected by the health of that sector.

The Markov nature of account evolution is another assumption that may or may not hold. This can be validated by further statistical analysis. Another important assumption concerns the accuracy of the data in Table 2. One needs to be aware of all such pitfalls before trusting the results of such an exercise. We end this section with the Matlab function that we used to do the computations to produce Table 2.

Then, for i0, i1, ...: Consider the machine reliability model of Example 2. Now suppose that there are three independent and identically behaving machines in the shop. Let Xn be the number of working machines at the beginning of the nth day.

Using the probabilistic interpretation, show that P^n is also a stochastic matrix. Suppose X0 = i with probability 1; show that Ti is a geometric random variable. Consider a machine that works as follows. It takes exactly 2 days for the repairs, at the end of which the machine is as good as new. Items arrive at a machine shop in a deterministic fashion at a rate of one per minute. Each item is tested before it is loaded onto the machine. If an item is found defective, it is discarded; otherwise, it is loaded onto the machine. The machine takes exactly 1 minute to process the item, after which it is ready to process the next one.

Let Xn be 0 if the machine is idle at the beginning of the nth minute and 1 if it is starting the processing of an item. Consider the system of Conceptual Problem 2. Now suppose the machine can process two items simultaneously. However, it takes 2 minutes to complete the processing. There is a bin in front of the machine where there is room to store two nondefective items. As soon as there are two items in the bin, they are loaded onto the machine and the machine starts processing them.

Model this system as a DTMC. However, now suppose that the machine starts working on whatever items are waiting in the bin when it becomes idle. It takes 2 minutes to complete the processing whether the machine is processing one or two items. Processing on a new item cannot start unless the machine is idle.

The weather at a resort city is either sunny or rainy. The weather tomorrow depends on the weather today and yesterday as follows. If it was sunny yesterday and today, it will be sunny tomorrow with a given probability. If it was rainy yesterday but sunny today, it will be sunny tomorrow with a given probability. If it was sunny yesterday but rainy today, it will be sunny tomorrow with a given probability. If it was rainy yesterday and today, it will be sunny tomorrow with a given probability.

Model this system as a DTMC, making appropriate independence assumptions. Consider the following weather model. The weather normally behaves as in Example 2.; however, when the cloudy spell lasts for two or more days, it continues to be cloudy for another day with a given probability. Develop a four-state DTMC model to describe this behavior.

A particle moves on N points, labeled 1, 2, ..., N. Let Xn be the position of the particle at time n. Let ui be the probability that the DTMC visits state 1 before it visits state N, starting from state i.

Two urns, A and B, contain N balls in total. At each step, one ball is chosen at random from the N balls.

If it is from urn A, it is moved to urn B, and vice versa. Let Xn be the number of balls in urn A after n steps. Display the transition probability matrix of the DTMC.

The following selection procedure is used to select one of two brands, say A and B, of infrared light bulbs. One light bulb from each brand is turned on simultaneously, and the experiment ends when one of the two light bulbs fails. Brand A wins a point if the brand A light bulb outlasts brand B, and vice versa.

The probability that the bulbs fail simultaneously is zero. The experiment is repeated with new light bulbs until one of the brands accumulates five points more than the other, and that brand is selected as the better brand. Let Xn be the number of points accumulated by brand A minus the number of points accumulated by brand B after n experiments.

Suppose the system incurs a cost of c(i) per visit to state i. Derive the following equations. Another useful cost model is one in which the system incurs a random cost of C(Xn, Xn+1) at each transition; this model is called the cost per transition model. Define c(i) = Σ_{j=1}^{N} p_{ij} c(i, j), and show that g can be computed as before.

Consider the telecommunications model of Example 2. Suppose the buffer is full at the end of the third time slot. Compute the expected number of packets in the buffer at the end of the fifth time slot. Suppose the particle starts on point 1. Compute the probability distribution of its position at time 3.

Consider the stock market model of Example 2. Compute the expected net change in the value of Mr. Jones's investment in 5 days.

Mr. Smith is a coffee addict. He keeps switching between three brands of coffee, say A, B, and C, from week to week according to a DTMC with a given transition probability matrix.

Suppose the buffer is full at the beginning. Compute the expected number of packets in the buffer at time n for n = 1, 2, 5, and 10, assuming that the buffer size is 10 and that the number of packets arriving during one time slot is a binomial random variable with parameters 5 and a given p.

Consider the machine described in Conceptual Problem 2. Suppose the machine is initially up. Compute the probability that the machine is up at times n = 5, 10, 15, ....

Consider the manpower planning model. Suppose the company has employees at the beginning of week 1, distributed over the grades as follows. If employees behave independently of each other, compute the expected number of employees in each grade at the beginning of week 4.

Consider the machine reliability model in Example 2. Suppose the machine is up at the beginning of day 0. Compute the probability that the state of the machine at the beginning of the next three days is up, down, down, in that order.

Suppose both machines are up at the beginning of day 0. Compute the probability that the number of working machines at the beginning of the next three days is two, one, and two, in that order.

Consider the weather model of Example 2. Compute the probability that once the weather becomes sunny, the sunny spell lasts for at least 3 days. Compute the expected length of a rainy spell in the weather model of Example 2.

Consider the inventory system of Example 2. Compute the probability that the inventory trajectory over the next four Mondays follows a given sequence. Compute the probability that an order is placed at the end of the first week for more PCs.

Suppose both bins are empty at time 0. Compute the probability that both bins stay empty at times n = 1, 2, and that machine 1 is then shut down at time n = 4.

Compute the occupancy matrix M. Using this, compute the expected number of weeks that Computers-R-Us starts with a full inventory, i.e., with the maximum number of PCs in stock.

Suppose that at time 0 there is one item in bin 1 and bin 2 is empty.


Compute the expected amount of time that machine 1 is turned off during an 8-hour shift.

Suppose the buffer is empty at time 0. Compute the expected number of slots, among the next 50, that end with an empty buffer.

Classify the irreducible DTMCs with the given transition matrices as periodic or aperiodic. Do Computational Problem 2.

Consider Computational Problem 2. Compute the expected number of employees in each grade in steady state. Consider the weather model of Conceptual Problem 2.

Compute the long-run fraction of days that are rainy. Compute the long-run fraction of days that are sunny. What fraction of the time does the coffee addict of Computational Problem 2. spend drinking each brand? What is the long-run fraction of the time that this machine is up? Compute the expected number of components in bins A and B in steady state. What fraction of the time does the chief financial officer have to interfere in the stock market to control the price of the stock?

Consider the single-machine production system of Conceptual Problem 2. Compute the expected number of items processed by the machine in 10 minutes, assuming that the bin is empty and the machine is idle to begin with. Which one of the two production systems described in Conceptual Problems 2. is more productive in the long run?

Consider the three-machine workshop described in Conceptual Problem 2.


What is the net rate of revenue per day in steady state? Can we consider the problem with one machine to obtain the answer for three machines? Compute the long-run expected cost per day of operating this system.

Consider the manufacturing system of Example 2. Compute the expected number of assemblies produced per hour in steady state. What will be the increase in the production rate (in assemblies per hour) if we provide bins of capacity 3 to the two machines in Example 2.?

Compute the long-run expected number of packets transmitted per unit time by the data switch of Example 2. How is this connected to the packet-loss rate computed in Example 2.?

Consider the brand-switching model of Computational Problem 2. If Mr. Smith consumes one pound of coffee per week, what is his long-run expected coffee expense per week?

Consider the selection procedure of Conceptual Problem 2. Suppose the mean lifetime of Brand A light bulbs is 1, while that of Brand B light bulbs is 1. Compute the expected number of experiments done before the selection procedure ends.

Suppose the buffer is full to begin with.

Compute the expected amount of time (counted in number of time slots) before the buffer becomes empty. Compute the expected time in hours before one of the two machines is shut down, assuming that both bins are empty at time 0. You may use the Matlab program of Section 2.

Suppose Passport has decided not to employ the services of WMB. However, this has generated discussion within the company about whether it should terminate accounts earlier. Which policy should Passport follow?

Consider the current policy Pc. One of the managers wants to see if it would help to alert the customers of their impending account termination in a more dire form, by a phone call, when the customer has missed six minimum payments in a row.

This will cost a dollar per call. The manager estimates that this will decrease the missed-payment probability from its current value p6. Is this policy cost-effective? In this changed environment, should Passport engage the services of WMB? When should it turn over the accounts to WMB? Passport has been approached by another collection agency, which is willing to work with no annual service contract fee. Is this option better than hiring WMB?

Now we would like to study a continuous-time stochastic process {X(t), t ≥ 0}. We shall see in the next chapter that a finite-state CTMC spends an exponentially distributed amount of time in a given state before jumping out of it.

Thus exponential distributions play an important role in CTMCs. Hence we study these topics in this chapter. The pdf given by Equation 3. is f(x) = λ e^{-λx}, x ≥ 0; the cdf given by Equation 3. is F(x) = 1 - e^{-λx}, x ≥ 0. A random variable with the cdf given in Equation 3. is called an exponential random variable with parameter λ. All three are denoted by Exp(λ). We leave it to the reader to verify that E[X] = 1/λ and Var(X) = 1/λ^2. Suppose a new machine is put into operation at time zero. Its lifetime is known to be an Exp(λ) random variable. What are the mean and variance of the lifetime of the machine?
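By the formulas above, the answers are 1/λ and 1/λ^2. As a quick numerical sketch (with a purely hypothetical failure rate λ per hour, since the example's value is not reproduced here), the following Python fragment checks the mean, the variance, and the memoryless property discussed next.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.01                                            # hypothetical failure rate per hour
X = rng.exponential(scale=1.0 / lam, size=500_000)    # Exp(lam) lifetimes

print("mean:", X.mean(), " theory 1/lam:", 1 / lam)
print("var :", X.var(),  " theory 1/lam^2:", 1 / lam**2)

# Memoryless property: P(X > 48 | X > 24) should equal P(X > 24).
p_cond = np.mean(X > 48) / np.mean(X > 24)
print("P(X > 48 | X > 24):", p_cond, "  P(X > 24):", np.mean(X > 24))
```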

What is the probability that the machine will give trouble-free service continuously for 1 day? To use consistent units, we use 24 hours instead of 1 day. We compute the required probability as P(X > 24) = e^{-24λ}. Suppose the machine has not failed by the end of the first day. What is the probability that it will give trouble-free service for the whole of the next day? The required probability is given by P(X > 48 | X > 24) = P(X > 48)/P(X > 24) = e^{-48λ}/e^{-24λ} = e^{-24λ}. But this is the same as the earlier answer.

Thus, given that the machine has not failed by day 1, it is as good as new! The property discovered in the example above is called the memoryless property and is one of the most important properties of the exponential random variable. We define it formally below: a random variable X has the memoryless property if P(X > s + t | X > s) = P(X > t) for all s, t ≥ 0. A continuous nonnegative random variable has the memoryless property if and only if it is an Exp(λ) random variable. Thus X has the memoryless property. Next, suppose X is a continuous random variable with pdf f; define G(x) = ∫_x^∞ f(u) du. Next consider a random variable with parameters k = 1, 2, 3, ... and λ > 0: a random variable with the cdf given in Equation 3. is called an Erlang random variable with parameters k and λ.

All three are denoted by Erl(k, λ). We leave it to the reader to verify that E[X] = k/λ and Var(X) = k/λ^2. We shall redo Example 3. The second probability is lower than the first, indicating that the machine deteriorates with age. This is what we would expect. The next theorem shows that Erlang random variables appear naturally as sums of iid exponential random variables: if X1, X2, ... are iid Exp(λ) random variables and Zn = X1 + ... + Xn, then Zn is an Erl(n, λ) random variable. Since an Erl(1, λ) random variable is just an Exp(λ) random variable, the result is true for n = 1. We shall show that it holds for a general n by induction. So suppose the result is true for n = k; we shall show that it holds for n = k + 1.

Hence the result follows by induction.

Next we study the minimum of independent exponential random variables. Let Xi be an Exp(λi) random variable, i = 1, 2, ..., k, and suppose the Xi are independent. Let X = min{X1, X2, ..., Xk}. Then X is the time when the first of these k events occurs. The main result is given in the next theorem: the X of Equation 3. is an Exp(λ1 + λ2 + ... + λk) random variable. Next, let X be as in Equation 3., and let Z be the index of the smallest of the Xi. Since the Xi are continuous, ties occur with probability zero; thus there is no ambiguity in defining Z. The next theorem gives the joint distribution of Z and X.

Distribution of (Z, X): P(Z = i, X > t) = (λi / (λ1 + ... + λk)) e^{-(λ1 + ... + λk)t}, so Z and X are independent. It is counterintuitive that Z and X are independent random variables, because it implies that the time until the occurrence of the first of the k events is independent of which of the k events occurs first! In yet other terms, the conditional distribution of X, given that Z = i, is Exp(λ1 + ... + λk), regardless of i.

Boy or a girl? A maternity ward at a hospital currently has seven pregnant women waiting to give birth. Three of them are expected to give birth to boys, and the remaining four are expected to give birth to girls.



From prior experience, the hospital staff knows that a mother spends on average 6 hours in the hospital before delivering a boy and 5 hours before delivering a girl. Assume that these times are independent and exponentially distributed. What is the probability that the first baby born is a boy and is born in the next hour? Let Xi be the time to delivery of the ith expectant mother. Assume mothers 1, 2, and 3 are expecting boys and mothers 4, 5, 6, and 7 are expecting girls. Let B be the time when the first boy is born, and G be the time when the first girl is born.

By the results above, B, the minimum of three iid Exp(1/6) delivery times, is an Exp(3/6) random variable, and G is an Exp(4/5) random variable (rates per hour). Also, B and G are independent. Thus the probability that the first baby is born within the hour is given by P(min(B, G) ≤ 1) = 1 - e^{-1.3}.

In the next section, we study another random variable that is quite useful in building stochastic models. Consider n independent Bernoulli trials, each a success with probability p, and let X be the number of successes among the n trials. The state space of X is {0, 1, 2, ..., n}. The random variable X is called a binomial random variable with parameters n and p and is denoted as Bin(n, p).
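Returning to the maternity-ward example for a moment, the sketch below evaluates the full answer, the probability that the first baby is a boy and arrives within the hour, both exactly (using the independence of the index and the minimum) and by simulation. It assumes the rates 1/6 and 1/5 per hour implied by the stated means.

```python
import numpy as np

rate_boy, rate_girl = 1 / 6, 1 / 5     # per-hour delivery rates (means of 6 and 5 hours)
n_boys, n_girls = 3, 4

lam_B = n_boys * rate_boy              # B = min of 3 iid Exp(1/6) -> Exp(3/6)
lam_G = n_girls * rate_girl            # G = min of 4 iid Exp(1/5) -> Exp(4/5)

# P(first baby is a boy) * P(first birth within 1 hour), using independence.
exact = lam_B / (lam_B + lam_G) * (1 - np.exp(-(lam_B + lam_G) * 1.0))
print("exact:", exact)

# Monte Carlo check of the same probability.
rng = np.random.default_rng(1)
n = 200_000
boys = rng.exponential(1 / rate_boy, size=(n, n_boys)).min(axis=1)
girls = rng.exponential(1 / rate_girl, size=(n, n_girls)).min(axis=1)
print("simulated:", np.mean((boys < girls) & (boys <= 1.0)))
```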

The pmf of X is given by P(X = k) = (n choose k) p^k (1 - p)^{n-k}, k = 0, 1, ..., n. To avoid such difficulties, it is instructive to look at the limit of the binomial distribution as n → ∞ with np held fixed. A random variable with the pmf given by Equation 3. is called a Poisson random variable with parameter λ, denoted P(λ); the pmf given in Equation 3. arises as this limit of the binomial pmf. We leave it to the reader to verify that, if X is a P(λ) random variable, then E[X] = Var(X) = λ. Suppose accidents occur one at a time at a dangerous traffic intersection during a 24-hour day. We divide the day into minute-long intervals and assume that there can be only 0 or 1 accidents during each 1-minute interval.

Compute the probability that there are exactly k accidents during one day. Let X be the number of accidents during a 24-hour period. Using the assumptions above, we see that X is a Bin(1440, p) random variable, where p is the probability of an accident in any given minute. Since n = 1440 is large and p is small, we approximate X as a P(1440p) random variable.
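The quality of this approximation is easy to inspect numerically. The sketch below compares the Bin(1440, p) and P(1440p) pmfs for an assumed per-minute accident probability p; the value of p is purely hypothetical, since the example's data are not reproduced here.

```python
from scipy.stats import binom, poisson

n = 24 * 60            # number of one-minute intervals in a 24-hour day
p = 0.002              # hypothetical per-minute accident probability
lam = n * p            # parameter of the approximating Poisson distribution

for k in range(6):
    print(k, round(binom.pmf(k, n, p), 6), round(poisson.pmf(k, lam), 6))
```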



Customers arrive at a service facility one at a time. Suppose that the total number of arrivals during a 1-hour period is a Poisson random variable with parameter 8. Compute the probability that at least three customers arrive during 1 hour. Let X be the number of arrivals during 1 hour. Then we are given that X is a P(8) random variable, so the required probability is P(X ≥ 3) = 1 - e^{-8}(1 + 8 + 8^2/2). What are the mean and variance of the number of arrivals in a 1-hour period? Both equal 8.

Next we study the sums of independent Poisson random variables. Suppose {Xi, i = 1, 2, ...} are independent random variables, with Xi a P(λi) random variable, and let Zn = X1 + ... + Xn. Then Zn is a P(λ1 + ... + λn) random variable. We shall show the result for n = 2; the general result then follows by induction. Since adding two Poisson random variables generates another Poisson random variable, it is natural to ask whether the operation can be reversed: that is, can we produce two Poisson random variables starting from one Poisson random variable? The answer is affirmative. The reverse process is called thinning and can be visualized as follows. Each event may or may not be registered; suppose each event is registered with probability p, independently of everything else.

Let X1 be the number of registered events and X2 be the number of unregistered events. Then X1 and X2 are Poisson random variables (with parameters λp and λ(1 - p)), and they are independent of each other. The independence is quite surprising!

Using the properties of the exponential and Poisson distributions studied above, we shall build our first continuous-time stochastic process in the next section. The modeling of such systems becomes more tractable if we assume that the successive inter-event times are iid exponential random variables.
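Before constructing that process, here is a brief numerical illustration of the splitting (thinning) result just stated, with purely illustrative parameters: each of a Poisson number of events is kept with probability p, and the kept and discarded counts come out Poisson and essentially uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, p, n = 10.0, 0.3, 200_000            # hypothetical parameters

X = rng.poisson(lam, size=n)              # total number of events in each replication
X1 = rng.binomial(X, p)                   # registered events: each kept with probability p
X2 = X - X1                               # unregistered events

print("E[X1] vs lam*p     :", X1.mean(), lam * p)
print("E[X2] vs lam*(1-p) :", X2.mean(), lam * (1 - p))
print("corr(X1, X2)       :", np.corrcoef(X1, X2)[0, 1])   # theory: independent, so ~0
```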

For reasons that will become clear shortly, such a stream of events is given a special name: a Poisson process. We define it formally below. Let Sn be the occurrence time of the nth event, and let Tn denote the successive inter-event times; thus Tn is the time between the occurrence of the nth and the (n + 1)st events. Let N(t) be the number of events up to (and including) time t; thus an event at 0 is not counted, but an event at t, if any, is counted in N(t). One can formally define it as N(t) = max{n ≥ 0 : Sn ≤ t}. The stochastic process {N(t), t ≥ 0} is called a Poisson process with rate λ if the inter-event times are iid Exp(λ) random variables. A typical sample path of a Poisson process is shown in Figure 3. The next theorem gives the pmf of N(t); it also justifies the name Poisson process for this stochastic process: for a given t, N(t) is a P(λt) random variable. We shall first compute P(N(t) ≥ k).

Suppose N(t) ≥ k. Then the kth event in the Poisson process must have occurred at or before t, i.e., Sk ≤ t. Conversely, suppose Sk ≤ t. Then the kth event has occurred at or before t; hence there must be at least k events up to t, i.e., N(t) ≥ k. Thus the events {N(t) ≥ k} and {Sk ≤ t} are identical, so P(N(t) ≥ k) = P(Sk ≤ t). However, Sk is a sum of k iid Exp(λ) random variables; hence, from Theorem 3., it is an Erl(k, λ) random variable. As an immediate consequence of the theorem above, we get E[N(t)] = Var(N(t)) = λt. The next theorem gives an important property of the Poisson process: it has the Markov property at each time t; that is, the conditional distribution of N(t + s), given N(t) and the entire history of the process up to time t, depends only on N(t).

This can be seen as follows. Suppose N(t) = k; then S_{k+1} is the time of occurrence of the first event after t. Using the strong memoryless property (see Conceptual Problem 3.), the time until the occurrence of the next event in a Poisson process is always an Exp(λ) random variable, regardless of the history up to time t. The result above can be stated in another way; in particular, the increments of the process over non-overlapping intervals are independent. In fact, the argument in the proof above shows that the number of events in any time interval of length s is a P(λs) random variable, regardless of where the interval starts. We say that the increments are stationary. Together, we say that the Poisson process has stationary and independent increments.

We shall see this property again when we study Brownian motion in Chapter 7. What are the mean and variance of the number of births in an 8-hour shift? Using days as the time unit, we see that we want to compute the mean and variance of N(1/3), since an 8-hour shift is one-third of a day. Hence we get E[N(1/3)] = Var(N(1/3)) = λ/3, where λ is the birth rate per day. What is the probability that there are no births from noon to 1 p.m.?

By the stationarity of increments, the number of births in any one-hour interval is a P(λ/24) random variable. Hence the required probability is given by P(no births in an interval of length 1/24 day) = e^{-λ/24}.
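Finally, a sketch of the Poisson process construction itself: generate iid Exp(λ) inter-event times, accumulate them into event times, and count the events up to t. With a hypothetical birth rate λ per day, the simulated count N(1/3) should have mean and variance λ/3 and P(N(1/3) = 0) = e^{-λ/3}, mirroring the calculations above.

```python
import numpy as np

rng = np.random.default_rng(3)
lam = 12.0             # hypothetical birth rate per day
t = 1.0 / 3.0          # an 8-hour shift, in days

def poisson_process_count(lam, t, rng):
    """Simulate one Poisson process path up to time t and return N(t)."""
    count, time = 0, 0.0
    while True:
        time += rng.exponential(1.0 / lam)    # iid Exp(lam) inter-event times
        if time > t:
            return count
        count += 1

samples = np.array([poisson_process_count(lam, t, rng) for _ in range(50_000)])
print("E[N(t)]  :", samples.mean(), " theory:", lam * t)
print("Var[N(t)]:", samples.var(),  " theory:", lam * t)
print("P(N(t)=0):", np.mean(samples == 0), " theory:", np.exp(-lam * t))
```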