Editor: L. Marder

STOCHASTIC PROCESSES
R. COLEMAN

The aim of this series is to provide an inexpensive source of fully solved problems in a wide range of mathematical topics. Initial volumes cater mainly for the needs of first-year and some second-year undergraduates (and other comparable students) in mathematics, engineering and the physical sciences, but later ones deal also with more advanced material. To allow the optimum amount of space to be devoted to problem solving, explanatory text and theory is generally kept to a minimum, and the scope of each book is carefully limited to permit adequate coverage. The books are devised to be used in conjunction with standard lecture courses in place of, or alongside, conventional texts. They will be especially useful to the student as an aid to solving exercises set in lecture courses. Normally, further problems with answers are included as exercises for the reader.

Here the author provides a concise introduction to stochastic processes through fully worked examples. Whenever possible, probability arguments are used rather than methods which rely on mathematical manipulation. The book is suitable for mathematics undergraduates in their second or third years who are planning to take courses in stochastic processes. An elementary background in the calculus and in matrices is necessary, but no knowledge of more advanced mathematics is assumed; consequently the book will be valuable to graduates in technology, engineering, business studies or the social sciences who have the necessary mathematical background and who seek to understand the construction of models of queuing processes, population processes and gambling situations. The book is very self-contained, and a brief account of the probability theory that is used is included, so that the reader can successfully use it for self-study.

Rodney Coleman graduated in 1964 from Leeds University and then studied at Churchill College and the Statistical Laboratory in the University of Cambridge, where he gained his Ph.D. and a Diploma in Mathematical Statistics. Since 1968 he has been a Lecturer in Mathematics at Imperial College in the University of London, where he gives courses in Stochastic Processes, Probability Theory, Statistical Theory and Industrial Statistics.

Problem Solvers in this series:
2 CALCULUS OF SEVERAL VARIABLES—L. Marder
3 VECTOR ALGEBRA—L. Marder
4 ANALYTICAL MECHANICS—D. F. Lawden
5 CALCULUS OF ONE VARIABLE—K. Hirst
VECTOR FIELDS—L. Marder
8 MATRICES AND VECTOR SPACES—F. Brickell
9 CALCULUS OF VARIATIONS—J. Craggs
10 LAPLACE TRANSFORMS—J. Williams
11 STATISTICS I—A. K. Shahani & P. K. Nandi
12 FOURIER SERIES AND BOUNDARY VALUE PROBLEMS—W. E. Williams
15 FLUID MECHANICS—J. Williams
16 GROUPS—D. A. R. Wallace

Stochastic Processes
RODNEY COLEMAN
Lecturer in Mathematics, Imperial College, University of London

LONDON · GEORGE ALLEN & UNWIN LTD
RUSKIN HOUSE · MUSEUM STREET

This book is copyright under the Berne Convention. All rights reserved. Apart from any fair dealing for the purpose of private study, research, criticism or review, as permitted under the Copyright Act 1956, no part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, electrical, chemical, mechanical, optical, photocopying, recording or otherwise, without the prior permission of the copyright owner.
Inquiries should be addressed to the publishers.

© George Allen & Unwin Ltd, 1974

ISBN 0 04 519016 X hardback
ISBN 0 04 519017 8 paperback

Set in 10 on 12 pt 'Monophoto' Times Mathematics Series 569
Printed in Great Britain by Page Bros (Norwich) Ltd., Norwich

CONTENTS

1 WHAT IS A STOCHASTIC PROCESS?
2 RESULTS FROM PROBABILITY THEORY
  Introduction to probability theory
  Bivariate distributions
  Multivariate distributions
  Probability generating functions
  Characteristic functions
3 THE RANDOM WALK
  3.1 The unrestricted random walk
  3.2 Types of stochastic process
  3.3 The gambler's ruin
  3.4 Generalisations of the random-walk model
4 MARKOV CHAINS
  4.1 Definitions
  4.2 Equilibrium distributions
  4.3 Applications
  4.4 Classification of the states of a Markov chain
5 THE POISSON PROCESS
6 MARKOV CHAINS WITH CONTINUOUS TIME PARAMETERS
  6.1 The theory
  6.2 Applications
7 NON-MARKOV PROCESSES IN CONTINUOUS TIME WITH DISCRETE STATE SPACES
  7.1 Renewal theory
  7.2 Population processes
  7.3 Queuing theory
8 DIFFUSION PROCESSES
RECOMMENDATIONS FOR FURTHER READING
INDEX

1 What is a Stochastic Process?

The word stochastic is jargon for random. A stochastic process is a system which evolves in time while undergoing chance fluctuations. We describe such a system by defining a family of random variables, {X_t}, where X_t measures, at time t, the aspect of the system which is of interest. For example, X_t might be the number of customers in a queue at time t. As time passes, customers will arrive and leave, and so the value of X_t will change. At any time t, X_t takes one of the values 0, 1, 2, ...; and t can be any value in a subset of (−∞, ∞), the infinite past to the infinite future.

If we observe the queue continuously, and customers arrive one at a time to be served by a single server, then, when a customer arrives, the value of X_t, the queue size, increases by one, and when a customer departs after being served, X_t decreases by one (Figure 1.1).

[Figure 1.1 The number of customers, X_t, in a queue at time t]

The values which X_t can take are called its states, and changes in the value of X_t are called transitions between its states. If we observe the queue size not continuously but at unit intervals, say once every quarter of an hour, then more than one customer can arrive or leave in each time interval. This will lead to larger fluctuations in the value of X_t. These obvious statements represent the basis of the construction of a model of a queue which incorporates random intervals between the arrivals of customers and random periods spent at the service point. It is the often complex consequences of such idealised models which we shall be studying. Simple as these models are, by incorporating a little of the randomness of the real world they bring us far closer to understanding the real world than could possibly be achieved with models which neglect such random behaviour.

Stochastic models are applicable to any system involving chance variability as time passes. In geophysics they have been used for the prediction of the size and whereabouts of earthquakes, in geography to study the spread of shoe shops in a growing city, in entomology the way in which aphids congregate on the leaves of plants, in nature conservancy the way in which birds, turtles and eels navigate, and in industry they have been used for the prediction of the durations of strikes.
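To make the queue example concrete, the following short Python sketch (an editorial illustration, not part of the original text) generates one realisation of a queue size X_t observed at unit intervals; the arrival and departure probabilities used are arbitrary assumed values, not quantities taken from this chapter.

```python
import random

def queue_realisation(n_steps=20, arrival_prob=0.4, service_prob=0.5, seed=1):
    """One realisation of a queue size X_t observed at unit time intervals.

    In each interval a customer arrives with probability `arrival_prob`
    and, if anyone is present, the customer in service departs with
    probability `service_prob`.  These Bernoulli rates are illustrative
    assumptions only.
    """
    random.seed(seed)
    x = 0
    path = [x]
    for _ in range(n_steps):
        if random.random() < arrival_prob:
            x += 1                      # an arrival: one step up
        if x > 0 and random.random() < service_prob:
            x -= 1                      # a departure: one step down
        path.append(x)
    return path

print(queue_realisation())
```

Each run with a different seed produces a different realisation of the process; the formal definitions now follow.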
By a stochastic process, we shall mean a family of random variables {X_t}, where t is a point in a space T called the parameter space, and where, for each t ∈ T, X_t is a point in a space S called the state space. The family {X_t} may be thought of as the path of a particle moving 'randomly' in space S, its position at time t being X_t. A record of one of these paths is called a realisation of the process. We are interested in relations between the X_t for different fixed values of t. We apply the theory of probability to determine these relationships. Other aspects of stochastic processes will also interest us. For example, the time which elapses before a gambler loses all his capital, or the chance that any customer who might enter a shop in a fixed interval (0, t) will be served without having to wait.

By thinking in terms of a particle travelling in a space we can often demonstrate the applicability of the same model to widely differing situations. For example, the time to a gambler's ruin, the time before a reservoir dries up, the time for the server of a queue to become free and the time before an animal population becomes extinct are all equivalent to the time before the particle first hits a point 0. At that time the gambler's resources have been reduced to zero, the water level is down to zero, the number of waiting customers is zero and the population size is zero.

Problem 1.1 What are the state space and parameter space for a stochastic process which is the score during a football match?

Solution. The state space S is the set of possible values the score can take, so S = {(x, y) : x, y = 0, 1, 2, ...}. If we measure time in minutes, then the parameter space T is (0, 90). The process starts in state (0, 0), and transitions take place between the states of S whenever a goal is scored. A goal increases x or y by one, so the score (x, y) will then go to (x+1, y) or (x, y+1).

Problem 1.2 Describe how we might use a stochastic process to study the resources of an insurance company.

Solution. We let the resources at time t be a random variable X_t. Then X_t will increase at a randomly fluctuating but fairly steady rate as premiums come in, but is subject to sudden falls as claims are met.

Problem 1.3 What are the state space and parameter space for a stochastic process which is the depth of the sea in position x at time τ?

Solution. The depth of the sea is measured from the top of a wave down to the seabed. As the waves move about, the depth at any fixed point x will vary with time, regardless of any larger-scale influences such as tides. We can measure the depth at any time τ, and in any position x, so the parameter space T is the set of all t = (τ, x) for which −∞ < τ < ∞ and x lies in the set of map references for the entire sea. Here t is not just time, but a combination of time and space coordinates. The state space S is the set of all values which the depth can possibly be, so S = [0, ∞), where the depth is 0 when the seabed is exposed, and we do not limit the height the waves can reach, although a wave of infinite height will not occur without a miracle.

Problem 1.4 An epidemic process. A fatal disease is brought into a closed community by a single infected individual. Describe how the spread of the disease may be studied as a stochastic process.

Solution. We suppose that, for a period, infected persons show no symptoms and are not infectious. They then become carriers, and are infectious, but still show no symptoms.
Carriers, after a period, exhibit symptoms of the disease and are isolated. These people are cured and become immune, or else they die. Let M be the class of immune members of the community; S be the class of susceptibles, that is, people at risk; N be the noninfectious incubators of the disease; C, the carriers; I, the carriers who have been isolated; and D, the dead. Initially all members of the community are in M or S, except the single infected person in N or C. Transitions between classes take place only according to the arrows in Figure 1.2. The random variables of interest are the numbers in each of the classes at each time t.

[Figure 1.2]

The progress of the disease will depend on the degree of immunity, the amount of contact between carriers and susceptibles, the rate at which carriers are detected, and the chance of a cure being effected. Epidemiologists use the theory of stochastic processes to seek ways of influencing the rates of transition between the classes.

Problem 1.5 A library book loan process. A reader visits a library regularly at the same time each week. There, if he has finished the book he is currently borrowing, he exchanges it; otherwise he has its loan renewed. Consider the stochastic process {Z_n : n = ..., −1, 0, 1, 2, ...}, where Z_n is the number of renewals of the book currently being borrowed as the reader leaves the library in week n, where the weeks are measured from an arbitrary time point. If a book has just been exchanged, then Z_n = 0. What is the state space for this process, and what are the possible transitions of state?

Solution. The state space S is {0, 1, 2, ...}, since Z_n can take any value 0, 1, 2, ...; though it would have to be a massive tome for very large values. If Z_n = k (k = 0, 1, 2, ...), then in week n+1, if the book being read is completed, a new book is borrowed, so Z_{n+1} = 0. Otherwise, the loan is renewed for another week, so Z_{n+1} = k+1.

Problem 1.6 A dam storage process. Consider a dam which can hold at most w units of water. Suppose that during day n, y_n units of water flow into the dam, any overflow being lost. Provided the dam is not dry, one unit of water is released at the end of each day. Suppose that {y_n} is a sequence of nonnegative integers, and that w is a positive integer. What is the state space for {Z_n : n = 0, 1, 2, ...}, where Z_n is the content of the dam after the release (if any) of the unit of water on day n? Show how Z_{n+1} depends on Z_n and y_{n+1}.

Solution. If the dam is full on day n, then after the unit release, Z_n will take the value w−1. If we then have a dry day (i.e. y_{n+1} = 0), Z_{n+1} = w−2. After another dry day (y_{n+2} = 0), Z_{n+2} = w−3; and so on, until after the (w−1)st dry day, Z_{n+w−1} = 0. Clearly these will be the only possible values that Z_n can take. The state space is therefore {0, 1, 2, ..., w−1}. If Z_n = i (i = 0, 1, ..., w−1) and y_{n+1} = k (k = 0, 1, 2, ...) then

    Z_{n+1} = 0        (i+k = 0 or 1)
            = i+k−1    (i+k = 2, 3, ..., w−1)
            = w−1      (i+k = w, w+1, ...)

Problem 1.7 A factory has two machines, but on any given day not more than one is in use. This machine has a constant probability p of breaking down and, if it does, the breakdown occurs at the end of the day's work. A single repairman is employed. It takes him two days to repair a machine, and he works on only one machine at a time. Construct a stochastic process which will describe the working of this factory.

Solution. We must select a suitable random variable to observe.
Clearly its value need be recorded only at the end of each day, since all transitions occur just before then. The parameter space T will therefore be the set of working days during which this system is in use. Let us call the first day 1, the second 2, and so on; then T = {1, 2, 3, ...}. A suitable stochastic process is {X_n : n ∈ T}, where we record at the end of day n the value of X_n, the number of days that would be needed to get both machines back in working order. If both machines are in working order, then X_n is 0. If one machine is in working order and the other has already had one day's repair carried out on it, then X_n is 1. If one machine is in working order, and the other has just broken down, then X_n is 2. If one machine has just broken down, and the other has had one day's repair carried out on it, then X_n is 3. These are the only possible cases, so the state space is S = {0, 1, 2, 3}.

EXERCISES
1. List the set of possible transitions between the states of the factory described in Problem 1.7.
2. Choose random variables suitable for studying the behaviour of traffic at the junction between a main road and a side road.
3. Construct a process for studying the counting of the votes in an election fight between just two candidates.
4. Describe the process {X_t}, where X_t is the number of teeth an individual has at time t if he was born at time t = 0. Clearly X_0 = 0.

2 Results from Probability Theory

This chapter contains results from elementary probability theory. Those to whom this is all familiar should nevertheless read it to acquaint themselves with the notation, some of which has been introduced specifically to facilitate the study of stochastic processes, and so differs from that generally used.

2.1 Introduction to probability theory

An experiment is any situation in which there is a set of possible outcomes. For example, a competition or game, a horse race, a ballot, a law suit are all experiments since their results are uncertain. A random variable (which we abbreviate to rv) is a number associated with the outcome of an experiment. For example, the experiment might be for a dentist to observe the number of natural teeth a patient has. Then the rv, X, will take value 28 if the patient has 28 teeth. An rv is called discrete if it can take only a finite or countably infinite number of distinct values. For example, the number of teeth must be one of the numbers 0 to 32.

Since the outcomes of an experiment are uncertain, associated with each outcome there will be a probability. The probability that rv X takes value x will be the probability that the outcome associated with x occurs. We write this: pr(X = x). If a discrete rv, X, can take values x_1, x_2, ... in a set S, then the sequence of numbers p_{x_1}, p_{x_2}, ..., where

    p_x = pr(X = x)    (x ∈ S)

is called the probability distribution of the rv X, and

    p_x > 0 (x ∈ S),    Σ_{x∈S} p_x = p_{x_1} + p_{x_2} + ... = 1

For any subset S* of S

    pr(X ∈ S*) = Σ_{x∈S*} p_x    (2.1)

For example, in throwing a die, if X is the value it shows, then S = {1, 2, 3, 4, 5, 6}. If the die is fair then p_x = 1/6 (x ∈ S). To find the probability that an even number shows, we define S* = {2, 4, 6}; then

    pr(an even number shows) = pr(X ∈ S*) = pr(X = 2 or 4 or 6) = p_2 + p_4 + p_6

If pr(X = x) = 0 for every real x, then rv X is called continuous. For example, X might be a height or a weight.
Its value will have an infinite ee a oe decimal expansion, although in practice we must round off our measure- The distribution function (abbreviation: df) of an rv X is the function F(x) = pr(X < x) F is a nondecreasing function of x, and F(-0)=0, -F() =1 If there is a nonnegative function f(x) such that we can write Fe) =|" _ foray then is the probability density function (abbreviation: density) of X, and X is continuous. For example, if rv X has an exponential distribution with — parameter 2, then 0 (x < 0) sult (x <0) BOS &>g FA (eos («> 0) We write: X is 6(2). ie We can interpret the density as a probability by writing flx)dx = pr{X € (x,x-+dx)} IEX is an ry, then so is y(X) for any real-valued function W. IfX is discrete, then so also is y(X), and we define the expectation (or expected value) of W(X) by Ey(X) = x Wx)pr(X = x) Similarly, if X is continuous, the expectation of y(X) is By) = [* verse) ax The expectation of X, EX, is called the mean of the distribution of X, and is generally denoted by j.. By taking (x) = ax +b, where a and b are con~ stants and expanding the right sides of equations 2.3 and 2.4, we find that E(aX +b) = aEX +b (2.5) ‘The variance of the distribution of X, VX, is defined as E(X — 1), where = EX, and is generally denoted by o®, We can write VX in the con- venient form VX = EX?-(Ex) Bs olation: From equdtions 22 and of EX = is xhe“** dx = 1/d EX? = \ x*he-* dx = 2/)? f Problem 2.2 Find EX and VX ifry Xhas the Bernoulli distribution with ‘parameter 0. Solution. We write: X is (0). The parameter is called the probability of Bciccess, since the Bernoulli distribution is often used as a model for experi- a ‘ments which have only two outcomes: success and failure. If the outcome is success we set X = 1, if failure then X = 0. Other applications are 5 when the outcomes are on and off, yes and no, or heads and tails, The _ Probability distribution of X is therefore cae: p(X =1)=6, pr(X =0)= 1-0 From equations 2.3 and 2.6 EX = Ixp(X = 1)+0xpr(X = 0) = p(X =1)=0 (27) 4 EX? = 1?xpr(X = 1)+0?x pr(X = 0) = 6 _ Therefore VX = EX?—(EX) = 6-6? = 1-6) o blem 2.3 Construct a model for the experiment in which a fair coin is tossed n times. _ Solution. Consider the experiment, or trial, in which a coin is tossed just ‘once. Then the large experiment is a sequence of n of these trials, The out- come of a trial can be either ‘a head shows’ or ‘a tail shows’. We shall label _ these H and Trespectively, as a convenient abbreviation. Then, since the iy is fair, pr(H) = pr(T) = 4. Let us define an ry X which takes value 1 if and value 0 if 7. Then X is AU). The trial is called a Bernoulli tial ___ Nowsuppose that we repeat this trial n times under identical conditions; _ that is, we carry out a sequence of n independent Bernoulli trials. (Inde- pendence will be defined formally later.) We then define rvs Fs 1 if Hon kth trial ss { tT on kth triad = 162--->m) "where each X, is independently (3). The sequence of rvs X, Xn... ee ; i Me descihen the Sate ctaiag expert ig Tee ae ltbeea iy tho mabe of bas tet lo erro ed a experiment, we define rv ‘a Z, =X ,+..4X, which, as the sequence of trials proceeds, increases by one every time the coin comes down head u ost. We note that as n increases, sequence {Z,} is a stochastic pr a very important process, being the basis for random walk mi 2.2 Bivariate distributions (X, Y) of rvs is the function Fy y F(x, y) = Fy yy) It is the probability that the value of rv Midoes n6t Gliecd sand the value of rv ¥ does not exceed y. 
The subscripts will be used when we wish to ; avoid ambiguity; for example, in Section 2.1 for the rv ~~ have % used fy, Fy, Uy, 0%, Ey, Vx. For (X,Y), F(x,00) = p(X 0) i The value y is given and so is a known constant. The expectation of X given Y = yis ; ExirayX = [| Saray) dx (2.14) _ IfX and Y are independent then, from equation 2.9, Fixx Y) = Sx) Fy) for every (x, y) (2.15) so, from equation 2.13, Sivas) = £,0) (2.16) Now EME = J" |" voeaoovddedy (2.17) a z WC, y)p(X =x, Y = y) We define the covariance of rvs X and Y to be — cov(X, ¥) = Ex yf(X — Ex X)(¥ — Exy ¥)} Ati? = Ex (XY)-(E,X)(E,¥) (2.18) Problem 2.4 Bivariate rv (X, ¥) takes values (—2, 4), (—1, 1), (1, 1), (2,4), each with probability }. (i) Find the marginal distributions of X and ¥, and calculate coX, Y). (i) Show that X and Y are not independent. Soltition. (i) The marginal distributions of X and Y are: pax = -2) = p(X = -2 Y=4 =} p(X = —1) = pr(X = 1) = p(X = 2)= 3 similarly; a pa = 1) = pel = —1, Y= tok = 1, Y= Da dtb=} pr(Y = 4) =} ; —s X= H(-2)4(- ie. EY -4044=4 and E(XY) = }{(-2« 4)+(-1 x 1)+(1 x 1I)4+(2x4)} = 0. Therefore cov(X, ¥) = E(X¥)—(EX)(EY) = Gi) 0 = pr(X = -2, Y = 1) # p(X = —2)pr'¥ = 1) =}xd= Therefore, X and Y are not independent. We note that Y = X?. If X and Y are both continuous rvs, from equations 2.17 and 2.13, EON) = [I ves Mqresf 0) dedy oa NR T. VO Mar as)4x} fy) dy = Ey{Exyar WX, Yh Similarly, if one or other or both of X and ¥ is discrete, li ExyWX ¥) = EyExyayW(%¥) 9 This will be called the decomposition rule and is the most important singl ‘ device we shall use in our treatment of stochastic processes. For example, if {Z,,} is the path of a particle, then to determine Ey(Z,) we can condition on the position Z, (k < n) at an earlier time, ie. EWZ,) = Ex2,V(Z,)= ExEzinenV(Z,) 220 A special case of equation 2.19 is when depends only on X.Then Ex V(X) = ExEyy=x W(X) = Ex WX) since, given X = X, (X) is a constant, so Eyjxex W(X) = YOO This result was used in equation 2.18. ? Problem 2.5 If the distribution of X| ¥ = y is (0())), show that the marginal distribution of X is @(E, 4(¥)). Solution. From equation 2.7, ExyayX = pr(X = 1|¥ = y) = 0) Reareesore ; = pr(Y, = yy Dor = yal ¥, = y, PHYS = 1X = Yo = Ye) p(X = 1) = Ey X = EyEyyayX = Ey AY) oh = FAN 9 7 gO) ) ey X can take only 2 values, 0 and 1; therefore From this we can generalise the decomposition rule to pr(X = 0) = 1—pr(X = 1) = 1-E, &Y) Ey, YM s+ ¥) = Ey, Evaysers Evsiniens, sory X is @(E, OY). o eee myablew 2.6 Show that The result of Problem 2.7 generalises. Ver ¥) = EyVyirer VY) + VyExrarWX¥) (2.22) Problem 2.8 Show that if X,,..., X, are independently distributed Solution, We shall abbreviate ¥(X, Y) to W. By equation 2.6, rvs, then ‘ Vy ¥ = Ex? (Ex WP Beal (Xo Va (XQ) = (Bx, Hy (Xy )} +: (Ex, V(X} ; = {Ey Exy-r¥? —E(Exyar¥} = AAge Ay SAY +{Ey(Eqyay¥)? — (Ey Eyyar¥)?} Solution. We use the method of induction. By Problem 2.7 = EyVy-r¥ + Vy Exiyar By= Ady where we used equation 2.19 and introduced two middle terms which Suppose cancel. oO Bpg = Apo Problem 2.7 Show that if X and ¥ are independent rvs, then for any Thetninee X71, Bi, Bre maependeat, j real-valued functions $ and y Fret atten AM 0 Xp MO) = Eyres Aha ee 4 Ex PAW} = {Ey OX} {Ey WV)} Xq-1) i Solution, By equations 2.14 and 2.16 Re ae oa aed =, AX) = Ey (X) Ein 11%ne Xa VAX 1) 2 W(X Bitte ae = Ee VRE, axe aKa Xp) = ApByog 5 = A,(A, A.A, Egy GCOW)} = Ey Exy=r($COWD)} = By WOE gr 620 n(As Aa---4,-1) F by equation 2.5, since, given Y = Y, y(Y) is a constant. 
Therefore The result is thus proved. or Ex yf XW Y)} = Ey WAY) {Ey 6X} = {Ey 6X0} {Ey W()} The converse is not necessarily true. In Problem 2.4 we had rvs X and Y since, with respect to the marginal distribution of Y, Ey #(X) is a constant. which were not independent, but for which Bi o E, (XY) = (E, X)(E, ¥) In a stochastic process {X,,} the states of the system, X,, X,,... will — 2.3 Multivariate distributions Rvs X,,...., X, are mutually independent in general not be independent, and we shall deal with consequences of this. if, for every subset, X,,,..., X,,, of two or more, In particular we shall study the special case in which X, depends on X,_,, but not on X,_,,X,_35++++ pr(X,, 2Y)=1 Note that in (iv), PeX > 2¥)m 0 ff Saxe Nandy 5. If rv X has distribution Pa = fie = 3, pr(X = 1) =}, what is | the distribution of rv Z = 1/X? What is the joint distribution of X and Show that cov(X,Z) = —3. 6. Prove formula 2.19 when X and Y are both continuous, and when X__ is discrete and Y is continuous. 7. Ifrv X has pgf G(s), verify that VX = G"(1)+G'(1)—{G(1)}?. 8. Ifry X is A(2), show that EX = VX = 1, 9. Ifrv X is Bin(n, 0), show that EX = nO, VX = nO(1—6). he rv Y which SBE of uae b Bigvod tuctese: 8) op ol bueMexclding ther aist nicceeattiay & metric distribution with parameter 0. We write Y is 9(8), and prY = y)= (1-970 (y= 0,1,2,...) that Y has pgf G(s) = 6/{1—(1—6)s} and that EY = (1—0)/6, VY = (1-0/0, 1, Therv W which is the number of failures in a sequence of independent - Bernoulli trials (probability of success 6) prior to the nth success has a _ negative binomial distribution with parameters n and 0. We write W is ee piW = w) ("e-"ra-or 6 =0,1,2,..) uppose that Y,, Y,,..., Y, are independent (4) rvs. By writing ee W=Y,+Y,+...+¥, "deduce that W has pgf G(s) = [6/{1—(1 —6)s}]*, and that bi EW =n(1-0)/0, VW = n(1—0/0? . If rv X is (2), and if the conditional distribution of Y given that x is Bin(x, 0), show that the marginal distribution of Y is A(A0). se Problem 2.14.) Show that V(aX +bY) = a?VX +b?VY + 2abcov(X, Y). If rv X is &(A), show that its cf (6) is 2/(A—i0). An rv Y having car arte” W290 f <9 where T(a) = f° ute i has the gamma (z;) distribution. Show that _ Y has cf {2/(A—i6)}*. An 6() rv therefore has a gamma (I; 2) distribution. Show that EY = a/, VY = a/?. 16. IfX,,...,X, are independent rvs, where X, is gamma(o,;2), show Fx) = r that Z, = X,+...4+X, is gamma ($434 et The Random Walk 3.1 The unrestricted random walk Problem 3.1 Consider the stochastic process which is the path of particle which moves along an axis with steps of one unit at time intervals also of one unit. Suppose that the probability is p of any step being to the right, and is q = 1 —p of being to the left. Suppose also that e: step is taken independently of every other step. Then this process is ca the unrestricted random walk. If the particle is in position 0 at time determine the probability that it will be in position k after n steps. Solution. Let {Z, } be the stochastic process, where Z, is the position the particle at time n, that is, after n steps from its starting point, 0. stochastic process has a discrete time parameter space {0,1,2,...} and discrete state space {—00,..., —1,0,1,...,00}. Now each step X is an independent rv having distribution oa p(X =1)=p, p(X = -l)=q Initially Z, = 0. After n steps Z, = X,+X,+...+X, where each X, is independently distributed as X. We can write equati 3.1L as Z, = Z,-1tX, where X, is independent of Z,_,. 
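Before the exact distribution of Z_n is derived, the recursion Z_n = Z_{n−1} + X_n can be checked numerically. The Python sketch below is an added illustration, not part of the original solution; the values p = 0.6 and n = 10 are arbitrary choices. It estimates pr(Z_n = k | Z_0 = 0) by simulation.

```python
import random

def simulate_walk_positions(n=10, p=0.6, trials=100_000, seed=0):
    """Monte Carlo estimate of pr(Z_n = k | Z_0 = 0) for the unrestricted walk.

    Each step is +1 with probability p and -1 with probability q = 1 - p,
    independently of all other steps.  Parameter values are illustrative.
    """
    random.seed(seed)
    counts = {}
    for _ in range(trials):
        z = sum(1 if random.random() < p else -1 for _ in range(n))
        counts[z] = counts.get(z, 0) + 1
    return {k: c / trials for k, c in sorted(counts.items())}

print(simulate_walk_positions())
```

The exact form of these probabilities is obtained next.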
We wish to determine the value of Poy = priZ, = k|Z, = 0) Let rv ie) sia a1 PP as Y= : xs pC 1am) That is, let rv Y, = }(X,+1); then each Y, is an independent Bernoulli trial with probability of success p. Then rv R, = ¥,+...+¥, = 4Z, +n a is Bin(n, p). Therefore, by equation 2.26 pe = priZ, = k|Zy = 0) = pr{R, = HZ,+n) = Hk-+n)} [ura)ne G(k+n) eS = {0,1,2,....n}) make 0 (otherwise). Since ER, = np, VR, = "pq, = - E(QR,—n) = 2ER,—n = es (ince pq =v = V(2R,—n) = 4vR, = 4npq O (34) ‘We can call upon some quite deep theorems of probability theory to _ obtain the behaviour of Z, when n is large. For example, by the strong law of large numbers, with Deobabty one 42, ~1Ez, =p-q a n+x0 - That is, for large n, the particle will, if p > q, almost certainly drift in a positive direction along the axis of motion, the mean step length being Be Also, by the central limit theorem, e W,= FAD ~ an NOD rvasn +0 So, from tables of the N(0, 1) distribution, for large n, pr(—1:96 < W, < 1.96) » 095 ie. pr{n(p—a)—1:96,/(4npq) < Z, < nlp—a)+1-96/(4npq)} = 095. _ Problem 3.2 Find approximate 95% bounds for Z9 999 if p = 06. Solution. q = 0-4 ‘Therefore, EZ = 10000(0-6—0-4) = VVZ = V(4x 10000 x 0-6 x 0-4) = /9600 = 40/6 = 98 fore, 196 /VZ ~ 196 x98 ~ 192 Therefore pr(2000-192 < Z,oo99 < 2000 +192) x 095 PI(1808 < Z, 999 < 2192) = 095 A superficial acquaintance with the theory of statistics is sufficient to tell _us that with n = 10000 the approximation will be very good indeed. _ Problem 3.3 Find approximate 95% bounds for Zo 999 if p = 0-5. Solution. EZ = 0../VZ = (4x 10000 x 05 x 0-5) = 100. Therefore, pr(—200 < Zro000 < 200) = 095 o _ Now ,/VZ is maximum when p = 4. Therefore after 10000 steps, which would allow the particle to be up 6 10000 steps on either side of its starting point, it is, with 95% certainty, effectively restricted to within +200 steps. é | Problem 3.4 Evaluate the generating function Pols) = Da for k = 0 and k = 1, where pW is defined in equation 3.3. Solution. These are not pefs, since pf) (n = 0,1, 2,...) isnot a probability distribution, it being possible for the particle to be in position k for many _ different values of n on the same random walk. For example, p'®) = 1, es (;)o0 = 2pq, 80 Poo > 14+2pq >1. = (2m redid= % (f "pare = E (myer : 0,24 m=3n=0 = (1—4pqs?)"* = ufs), say (3.5) (We might recognise this as the Legendre polynomial generating function.) ‘ Now (es _ Qn+1)! _ An+1)-1 2x) n )” (n+ilnt and im LOL -+-[EG He Shds pees : = ith equation 3.5) t mas. Wy Comparnon with et = i{1 — vl—4w)} Therefore Poil6)= x pert tgantt = EM reat aa n=0 wn 5 Choon 2 LC) a0 i n=0 = 2ps(1—4pqs*) ‘a (i= ~4pas!)) = sifu) 1} (by equations 3.5 and 3.7) Oo We shall in future try to find solutions which are based on probability — . arguments rather than analytical techniques. d In Section 4.4 we shall meet a formula by which, from Po9(s) and Pol) 4 we can calculate P,,(s) for any integer k. 3.2 Types of stochastic process A process {Z,:t¢ T S (— 0, 20)} such that for any {1,,t,,-..5f,}€ 7, Where t, < t, <... 0). ie A Markov process is a stochastic process {Z, :t € T S (— 00, 00)} for _ which, given the value of Z, the distribution of Z, (s > #) inno way depends _ onaknowledge of Z, (w < 1). The future behaviour, when the present state ‘the process is known, i is unchanged by additional knowledge about its ist behaviour. Thus if ae pty yen Se Sty < tee Fe en the joint distributions of Za ts2y |Z the same. This is referred to as the Markov property. 
Markov processes are by far the most important class of stochastic _ Processes and their theory is highly developed. Every stochastic process e ith Esdependent increments is a Markov process. Problem 3.5 Show that the unrestricted random walk is a stochastic with independent increments (and so is also Markov), and is onary under translations by integer intervals. "Solution. If {X,} is a sequence of independent rvs, then {Z.,}, where = ES X,, has independent increments since if ny Zee f0F all me (...,=1,0,1,2,...). 33 The panes rin ‘ Problem 3.6 The random walk with two absorbing barriers. Two faale 4 versaries, A and B, have resources £a and £b respectively. They play agame j in which each play results in A winning £1 from B with probability p, or Bwinning £1 from A with probability g = 1—p. Each play is independent _ of every other play. Find the probability that A will eventually have to aa withdraw on losing his entire £a. Solution. Define rv y= 1. if A is eventually ruined ~ (0 otherwise Then rv Y has a Bernoulli distribution with, as its probability of the probability that A is eventually ruined. We consider the random-walk _ process {Z, :n = 0,1,2,...}, where Z,+a are the resources (in £) of A _ after the nth play. Then Z, = 0 and the state space et S = {-a,—a4,...,—1,0,1,2,...,6—1,5} % The game will cease when A has won B’s £b, or when B has won 4’s £a, i.e. when Z, first reaches —a or b; thereafter Z,,,,Z,,>,--- Will all take the same value as Z, . These states —a and b are called absorbing states. RvY =1ifZ, reaches —a before it reaches b, and Y = O otherwise. We seek pr(¥ = '1|Z, = 0). q Let us write Gs) = Eyjgy018" = 1-0,+.0,8 = 1-1-9), (ie S) the pgf for the Bernoulli ry Y given that the walk starts in state i, where _ 0, = pXY = 1|Z, =i. a We find a set of recurrence relations for the G,, which we solve for 0). Clearly 0_,=1, 0, = 0. We need some preliminary results. Clearly prY = 1|Z, =i) = 0, irrespective of n. Also, since the plays are independent, the random walk — has the Markov property, so Eyjzy-nzoei = Eyjzyat G(s) (ike S) Then, by the decomposition rule, + G8) = Eyzgei8” = Eyzizoni” = E, = Exstze=i G2,(9) Therefore 1-6, = G0) = deat fu Y zi\20=1E yz, =2,,20=15 Ez,129=1 92,(0). = Ez,\2g=41 —92,) = 1—Ez,12,=19, 9, = Ex,izo=1 92, = 6,-;pr(Z, = i-1|Z, = )+0,,, pr(Z, = i+1|Z, =) = 90,_,+P8,,, (i= —a4l,...,b-1). (3.8) _ This is a sequence of a+b—2 second-order difference equations in _ @+b—2unknowns. These equations can generally be written down directly “with just a brief explanation. The use of generating functions introduces an extra flexibility in allowing us to abstract from our results any of those aspects of the process which interest us. In this problem there is a one-one correspondence between G, and @,, so this is unnecessary. The decomposi- tion rule applied in this way is sometimes referred to as a decomposition based on the first step. We now solve the equations for 0,. We write (p+9)0, = 99,_,+P9,.; P(O:+1 —9,) = 4,-9.4) (= —a4l,...,b- _ Write 1, = 6, -0,_,,and 4 = q/p. Then I, =Al_, = Aal,_,) =... = aT, - and similarly, by writing a = 0, —0, = G=1, iep=q=) qa =a #l) _ Problem 3.7 Whaat is the probability that A is eventually triumphant? Solution. 
Let rv yee 1 if A is eventually triumphant 0 otherwise and let _¢, = pr(¥* = 1|Z, = i) = pr{A is eventually triumphant|A starts with £(a+i)} = pr{Bis eventually ruined| B starts with £(b—i)} Then we would obtain the same equations as in Problem 3.6, but with a and b interchanged, p and q interchanged, and i changed to —i. Thus, from equation 3.9, a — G = a/p #1) Problem 3.8 Show that it is certain that the game ea ends. that A is either eventually ruined or tlumphant = pr(¥ = 1|Z, = 0)+pr(¥* = 1|Z, = 0) = O,+¢) =1 (all p,q > 0 for which p+q = 1) 2a Thus with probability 1, the game will eventually end. That is, for the random walk with two absorbing barriers, absorption is certain with probability 1. oY Problem 3.9 Find the distribution of the duration, T, of the game. Solution. Let F,(s) = Exyzy=i8" = ye fps where S$ = pr(T = n|Z, = i) = pr{game ends on nth play| A starts with £(a+i} Given Z, = j, then by the Markov property and time independence T=1+T where T’ has pgf F (s) ie. Betty, ia = Brass’? = SFO) Then, by a decomposition based on the first step, FA) = Ex.zy2-15" = Exyizo=iErizy=2120-1" = SEzizyaiFz,(9) = s(9F,_,)+PF,.,(9} (= —a+l,...,6-1) We have the boundary conditions that F_{8) = Epjzy= 287 = ES = and F,(s) = 1 similarly. We seek F,(s). The general solution is F{s) = A(s) {a(s)}' + Bis) {B(s)}* — rel and f(s) are the roots Re Stas quate v = s(q-+pv’) (3.10) and A(s) and B(s) are determined from the boundary conditions. Then F_s) = 1 = Aa~*4+BBp-* Fs) Aa? + BB pp x xP pe Bit ae gre _ Then F,(s) = A+B. o When s = 1, a(1) =1 and f(1) = 2 = q/p; and for 2 4 1 we find F (1) = 1. When 2 = 1 we can set B = a(1+<) in the solution and let £0. Since i :, haye solution A= Popa Fn= sp = e PAT > k|Zy= 0) =F f9+0 as k +00 _ With probability 1, therefore, the game eventually ends. This is another solution to Problem 3.8. Problem 3.10 The random walk with a single absorbing barrier. What is __ the probability of 4’s eventual ruin if he is playing against a casino which has unlimited capital? Solution. If we let the casino be adversary B, and have unlimited re- sources, then the probability of A’s eventual ruin, given that he begins with £4, is tim a messy a> a Panto Be pear eae) er Sa i im p= a (= a/p < 1) rt 1 @<4q) . i eos probability (q/p)* that gambler A is eventually ruined, but there is ability 1—(q/p? that the game never ends. We say that the rando: y walk has positive drift if p >q. On the other hand, if p < q (that is, there is zero or negative drift) gambler A will almost certainly be ruined, ie. with probability 1 he will reach the barrier at — a. oO Thus if p < q, as it always will be in any casino, the gambler knows that _ eventually he will be ruined. However, what will be of interest to him the distribution of the time before he loses all his money. We investigate this time to absorption in the next problem. 7 Problem 3.11 The time to absorption. Find the probability distribution of the rv To, the number of plays made before a gambler who se £a is ruined, given that p < q. Solution. For convenience we now take our single absorbing barrier to be state 0 (not state —a as before), and consider the stochastic process i {Z, :n = 0,1,2,...}, where Z, is the capital of the gambler after the mg play. Then Z, = a. Now Fol) = Exjzq=aS Z where F,(s) is the solution of Problem 3.9. nee p <4, then, from equation 3.10, 2(3)6(3) = a/p > 1. Therefore |6(s)| > 1, 0 {A(s)}* is unbounded as b + co. Therefore, since F,(s) is bounded as b — co, B(s) = 0. 
Then Est = lim Fy) Foe(s) = lim F_{s) = 1 = AG)(a(9)* so F,g(9) = lim Fa(s) = As) = {a(9}* = {iae—soer This also holds when p = q = 4. oa Alternatively we note that for £a to be reduced to £0 for the first time, it must be reduced successively for the first time to £(a—1), to £a—2), and so on o£ and finally to £0. During each ofthese unit reductions the gambler’s capital can rise, but eventually, since p < gq, it will fall. Let rv T,, (j < i) be the number of plays for the gambler’s capital to be reduced a for the first time from fi to £j. Then Tyo = Ta,o-1+ Ty 14-2 -+-+Tio aay But the time taken to lose a pound does not depend on which pound itis, so by this time independence each 7, is distributed as Tq, and these T,,_ are mutually independent by the Markov property. Thus : 27 5 f , © Fes) = Fol)" ¢ need therefore determine only F , (s). We find an alternative derivation f this in Problem 4.18. Problem 3.12 The random walk with a single reflecting barrier. Consider _arandom walk such that the particle, when it reaches state 0, moves on the t step to state 1 with probability 1, and resumes its random walk o 2 4 6 6 0 @ 4 6 18 20 2 26 9 Figure 3.1 The random walk with a single reflecting barrier If this random walk process {Z,:n = 0,1,2,...} starts in state i (iS = {0,1,2,...}), determine a set of difference equations for the pgf ITs) for the position of the particle at time n. ‘Solution. We consider a decomposition based on the last step. T1"*%3) = E, suv = B ne Wome an\to=i Ezy hzy=Zmto~iS””'} Lwith probability g (Z, # 0) 1 (Z, = 0) "$0, by the Markov property, Ege sltn=Znto=i9"* = Bz, shzqaz45 x . Bess = (ps+qs-")s* (Z, #0) “Ys = (pst qs) +qs—s") (Z, = 0) [ei with probability p (Z, # 0) Zaus - As—s~")pr(Z, = 0|Z. = )+Epp,-(P5-+ 9s" *)s% = q(s—s” *)11%0) + (ps + gs” *10%s) (3.12) o It will therefore spend periods away from the reflecting barrier broken t by single steps onto it. By the Markov property, after first stepping onto the barrier, the process is in a statistical equilibrium, independent of the initial starting state, and the periods off the barrier are independent and identically distributed random variables. Find the equilibrium distribu-_ tion of Z, as n — oo, and show that this solution exists only if p is strictly less than g. a f Solution, Let I1("s) + II(s) = Y° n,s! as n + co. Then, letting m —> co in equation 3.12, Sao zi II(s) = (s—s~ *)q (0) + (ps +.qs~ ")H(s) ie. _ (s-s qn, _ (145) ols WO osrq) . Tes” where v = p/q Now I(1) = 1. Therefore, 1 = 2n,/(1—v), so 11(0) = m = (1-v) = 1=1/(2q). We note that if v = p/q = 1 then x, = 0; and so II(s) = 0, which is not a pgf. Similarly if v > 1, ie. p > g, then x, < 0, so m, is not even a probability. If v < 1 then ” a d-v+s) (1=y)s ie a 1—vs = Mo Iy{s)+(1—n gH y(5) where [1,(s) = 1 is the pgf of an rv U which is 0 with probability 1, and Tly(s) = (1 — v)s/(1 — vs) = s ITy(s) is the pgf of an rv W = V +1, where V- has a geometric distribution with parameter 1—v = 2—q"'. The ry U_ corresponds to the time on the barrier and W to the time off it. a The form of I(s) is that of a mixture of the pgfs of the two rvs U and W. — A mixture is the result ofa two-stage experiment. First we decide according — to the result of a Bernoulli trial with probability of success 7, whether to observe rv U or rv W. Ifa success results then we observe U, otherwise W. This model has the same probability behaviour as the equilibrium — distribution of the random walk. We shall be meeting a theory for equilibrium distributions later. 
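The equilibrium distribution found in Problem 3.13 can be checked by simulation. The following Python sketch is an added illustration; the value p = 0.3 is an arbitrary choice satisfying p < q. It records the long-run occupation frequencies of the reflected walk and compares them with the mixture form derived above, namely π_0 = (q−p)/(2q) and π_k = (1−π_0)(1−v)v^{k−1} for k ≥ 1, where v = p/q.

```python
import random

def reflected_walk_equilibrium(p=0.3, steps=200_000, burn_in=10_000, seed=2):
    """Long-run occupation frequencies for the walk with a reflecting barrier at 0.

    From state 0 the particle moves to 1 with probability one; from k >= 1 it
    moves to k+1 with probability p and to k-1 with probability q = 1 - p.
    Stability requires p < q.  Parameter values are illustrative.
    """
    random.seed(seed)
    q = 1 - p
    z, counts = 0, {}
    for n in range(steps):
        z = 1 if z == 0 else (z + 1 if random.random() < p else z - 1)
        if n >= burn_in:
            counts[z] = counts.get(z, 0) + 1
    total = steps - burn_in
    empirical = {k: c / total for k, c in sorted(counts.items())}
    # Equilibrium distribution in the mixture form of Problem 3.13, v = p/q:
    #   pi_0 = (q - p)/(2q),  pi_k = (1 - pi_0)(1 - v) v**(k-1)  for k >= 1.
    v = p / q
    theoretical = {0: (q - p) / (2 * q)}
    for k in range(1, max(empirical) + 1):
        theoretical[k] = (1 - theoretical[0]) * (1 - v) * v ** (k - 1)
    return empirical, theoretical

emp, theo = reflected_walk_equilibrium()
for k in sorted(theo):
    print(k, round(emp.get(k, 0.0), 4), round(theo[k], 4))
```

The empirical frequencies settle close to the theoretical values, as the general theory of equilibrium distributions mentioned above leads us to expect.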
Problem 3.14 Consider a random walk on (0,1,2,...,a), where at each 4. State 0 is absorbing, and state a is impenetrable with | pr(Z,,, = |Z, = 4) = 4, pr(Z,,, = a—1|Z, = a) =. (That is, if the particle is in an impenetrable state, then at the next step with some fixed probability @ it remains in that state for a further time unit, otherwise with — #1 -v)+4(1+). ity 1 — 9 it retreats a step and resumes its random-walk behaviour.) Show that if the particle starts in state i(0) then absorption is certain, and determine the expected time to absorption. Solution. We consider a decomposition based on the first step. Let F,(s) be the pgf for the first passage time T, from state k to state 0. Then, for aise) -<.,4—1, Fs) a = Es!*7-tpr(Z, = k-1|Zy = &) +Es'*T'pr(Z, = k+1|Zy = k) = SF,_ 1) +35F, 16) (3.13) ‘The boundary conditions are F,(s) = Es = Es? = 1 (3.14) and F,{s) = Es'*™*-*pr(Z, = a—1|Z, = a) +Es!*™pr(Z, = a|Z, = a) = 4sF,_,(s)+4sF, (s) (3.15) We consider the case s = 1, then solving equation 3.13 successively starting with the last equation (3.15) we find F,(1) = 1 (k = 0,1,2,...,a), so absorption must take place with some finite n (with probability 1). (See Problem 3.9.) Let M, = ET, = £{F,0) js=a then M, < © and, from equations 3.13, 3.14 and 3.15, satisfies M, = 14+3M,_,4+4M,4, (k= 1,2...,a-1) M,=0 (3.16) M, = 1+3M,_,+3M, We note that these are solved by M, = 0, M, = «0 (k = 1,2,...,a), but since absorption occurs with probability 1 within a finite time, we can tule out this solution. Let J, = M, — M,-, (k = 1, 2,..., a), then equations eit? 212.01, 1, I, = Aat1—-k) (k =1,2,...,a) Therefore M,= M.-M, = (M,—M,_,)+(M,_,—M,_2)+--.+(M,—Mg) =D =2 5, @+1-W = ia+1-9, a Problem 3.15 A particle executes a random walk on the vertices of a cube. The steps are independent, and the particle always steps with equal probability to one of its three neighbouring vertices. If A and H are opposite vertices, find the pgf for the first passage time from A to H, i.e. the time at which a particle which sets out from A first reaches H. If A and H are absorbing states, what is the probability of absorption at H given that the walk starts at a vertex which is neither A nor H? Solution. Label the vertices of the cube as in Figure 3.2. a) pees Db) (2) Figure 3.2 Define states for the system by the minimum number of steps from the vertices to A. Then if the particle is at A, it is in state 0; if at B, C or Ditis in state 1; if at E, F or G it is in state 2; and if at H it is in state 3. Let T, be the first passage time from state i to state 3 (i = 0, 1,2, 3), then T, = 0. If F(s) = Es", we seek F,(s). We consider decompositions based on the first step X. F,(s) = Est? = Es!*"! = sF,(s) since from state 0 the first step must be to state 1. F,(s) = EyEpjyay 57 = 4Es!*7042Bs!*+ = 19F,(s)+25F (5) since, with probability 3, the first step is to state 0 or, with probability 3, to state 2. Similarly F,(s) = 3sF, (s)+4sF; (s) = 3sF, (s)+4s since F,(s) = Es® = 1. We can solve these to obtain F,(3) = 2839-78?) (80 ET, = 10) Now suppose that states 0 and 3 are absorbing Let @,(i = 0, 1, 2, 3) be the probability of absorption in state 3 if the particle starts in state i. We seek the values of @, and @,. Clearly 0, = 0, 6, = 1. By decompositions based on the first step, starting in state 1 and state 2 respectively, we obtain 31 0, = 405+30,, 0, = 30, +405 %=-% 5 3.4 Generalisations of the random-walk model Instead of just a single step at each time instant we can allow larger jumps. 
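Before such walks are treated formally in Problem 3.16 below, a Monte Carlo sketch can indicate how absorption probabilities behave when larger jumps are allowed. In the Python fragment that follows (an added illustration), the step distribution, the barrier positions a and b, and the starting state are arbitrary assumed values; the function estimates θ_i, the probability that the walk is absorbed at or below −a.

```python
import random

def absorption_at_lower_barrier(i=0, a=5, b=5,
                                steps=((-2, 0.2), (-1, 0.3), (1, 0.3), (2, 0.2)),
                                trials=50_000, seed=3):
    """Monte Carlo estimate of theta_i: the chance that the generalised walk,
    started in state i, reaches or passes the barrier at -a before it reaches
    or passes the barrier at b.  The step distribution `steps` (a tuple of
    (value, probability) pairs) is an illustrative choice.
    """
    random.seed(seed)
    values, probs = zip(*steps)
    lower_hits = 0
    for _ in range(trials):
        z = i
        while -a < z < b:
            z += random.choices(values, probs)[0]   # one jump of the walk
        lower_hits += (z <= -a)
    return lower_hits / trials

print(absorption_at_lower_barrier())
```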
Problem 3.16 Suppose that in a random walk the steps are independently distributed as a discrete rv X, where p(X=K)=p, (k= .++, —1,0, 1, 2, ...); and that there are absorbing barriers at integers —a and b (a, b > 0), the process being absorbed as soon as the walk reaches ‘or passes over a barrier. Suppose that the process starts in state i. Find difference equations for @,, the probability that absorption eventually _ occurs at —a. Find difference equations for F((s), the pgf for the time to absorption of the process. Solution. We argue as in Problem 3.6. Define Bernoulli rv { if absorption eventually occurs at —a Y= 5 0 otherwise ‘and consider the random walk process {Z, :n = 0,1,2,...}, where Z, is the position of the particle after n steps. We seek difference equations for 6, = pr(¥ = 1|Z, = i). We use a decomposition based on the first step: pry = 1|Z, =i) = 2, Pe, = 1125 = DeAY = 125 = 1,2, =) ie. o 6= D p-.0, Gel = {-at1,-a+2,...,b-1}) ie by the Markov property. Clearly 0, =--=0_,,=9,=1 6=4,,= Therefore <8 tea @ 0, = 1x ee Bat Heh Pj 9; + Ox BP = 0 i i 9+ 3 P)-18, (ed ae Pp, = pr(X < —a—i) As in Problem 39, if rv T is the time to absorption, then if Z, = i (ie and Z, = jwecan write T = 1+7", where, if I, T’ has pet F,(s), but ifj¢J then T’ = 0; ic. 32 SF, Enasaj29-15" = EpizgnjS 7 = {: i nO Therefore F(3) = EzyzgaiE- = s|Zo=t TZ =Z4,Zo=t =a baa =f Emit, miki Fn} ten =. s{or+ Zk} (ien) where gf = p(X < -a-i or X >b-i) Problem 3.17 Consider the random walk in which the steps are in- dependently distributed like a continuous rv X, where X has densi pix) (— co b-2) o EXERCISES 1. A particle at each unit time point makes a step to the right with probability p, a step to the left with probability g, or no step at all with probability r = 1—p—q. Find the pef, G,(s), for the position, Z, , of the particle after time lly it is in position 0. Find the mean and variance of this distribution. Show that G5.) =F Gio" = s/{—pste+sl tr) 1g} 2. Consider a particle which takes an unrestricted random walk as in Problem 3.1. Suppose that the walk has positive drift (A = q/p < 1) and that the particle starts in state 0. Use Problem 3.10 to find the distribution of rv W, where — W is the leftmost point the particle reaches. 3. Consider the special case of Problem 3.11 in which a =1 and p=q=}. Expand F,,(s) in powers of s to obtain a formula for Py = Pr(T,o = k). Evaluate p, for k = 1, 2,..., 5. Describe a coin-tossing experiment by which this process can be realised, and carry out 10 realisations. 33 pter 4 Markov Chains 4.1 Definitions A Markov chain (abbreviation: MC) is a Ma process {Z, : ne T} with a discrete time parameter space T, and a finite or countably infinite state space S. We take, without losing generality, both Sand T to be subsets of integers. The time-independent transition prob- _ d to the next question only after a unit of further reading. The third swer carries no marks, and the student selecting it must undertake two of further reading before proceeding to the next question. If the udent makes a random and independent choice of answer to each abilities of a MC are tion, what is the pef for the total number of units of further reading ? = prZy.,=i|Z_ =i (bie) must do before he selects an answer which carries two marks? (independent of m). For example, for the unrestricted random walk, oe Solve Problem 3.14 for the general impenetrable barrier having P (G =i+1) oP ={1-p G=i-1) _ parameter 6. y 0 (otherwise) Solve Problem 3.14 for the case in which p = able barrier also has parameter 0. 
By convention we write p,, for p{}). The one-step transition matrix (or j transition matrix) is the matrix P=(p,) GieS) Now p,, > 0, Eu = 1; the first because the p, are probabilities, 0, q = @ and the second because, given Z, = i, Z, must be in some state j of S. For th same reasons the elements of the n-step transition matrix P= 6) Ges) Bp > 0, x yt We call matrices with these properties—that is, square with nonneg elements and unit row sums—stochastic matrices. Problem 4.1 What is the transition matrix of the MC {Z, } where each Z,, is independently distributed as rv Z, which has distributic piZ=K=p, (k=0,1,2,..)2 BI Solution. Py satisfy pr(Z, = j|Z, = i) = pr(Z, = j) (by independence) =p, 1 Therefore nee ae ‘(see equation 32), and the steps X, are independently buted asrv X,which has distribution pr(X = k) = p,(k = 0,1, 2 if X has distribution pr(X = k) = q, (k =..., 1,0, 1,2,...)? = pr(X, = j—i) (since Z, and X, are independent) Po Pa 0 BE OBST 8 HR 0 9 Poe's Es $ u 4.3 A rat is put in the maze illustrated in Figure 4.1. At each instant it changes room, choosing its exit at random. What is the jon matrix of the MC {Z,}, where Z, is the room the rat is occupy- uring (n, n+1)? Figure 4.1 Problem 4.4 Show that the factory described in Problem 1.7 can “ analysed as an MC. Solution. Exercise 1 of Chapter 1 asked for a list of the possible tions between the states of the factory. We list the transitions here, their corresponding probabilities. They were found by considering event at the factory which caused its state to change. Let g = 1—p. Transition from state i: 0 0 1 1 2 tostate j: 0 2 0 2 with probability py: 9 Pq P 3 otherwise A iy ie, transition matrix 1 q 1 0 0 Reippieed 0 2 q KH ovunNnyun Soss oo eovoow ~ 0 Problem 4.5 Consider the library book loan process described Problem 1.5. Suppose that the times in weeks it takes the reader to. library books are independent rvs having distribution function F(t). {F(/) :j = 0,1,2,...} is a sequence with F(j) < F(j+1) show that th process is an MC, and find its transition matrix. Solution, Let the time to read a book be an rv T. A book exchange and only if the reader finished it during the previous week. This has t Markoy property, and so the process isan MC. The only possible transitio are i to 0 or to i+] (i= 0, 1, 2, ...), so py (Gi #0, i+), Pio +Piis1 = 1. Now pr(Z, = i) = p(T > f) = 1—F(i), and Pissr = PUZ, 4, = 1+1Z, = i) = p(T > i+1|T > 9) p(T >i+1,T >i) _ p(T >i+l)_ 1-Fi+l) 5 t=), ts ee (i = 0,1,2, , er ee eT " e F(0) = 0; and Piis1 <1 since {F({)} is a monotone increasing ence. Therefore . Fi+1)—Fi) L Mo = Pita = TFo _ These probabilities are the elements of the transition matrix. Qo blem 4.6 Consider a time-dependent MC {X,:n = 0,1,2,...} ed on the state space {0,1}. If pr(X, : a = 1X, = 1) = 6,(m = 1,2,...,showthat {Z, : Z, X,, is S also a time-dependent MC, aa determine the marginal distribution ‘of Z,. Solution. Zim Zs (KX) ZimexS = 0, then Z, = 0 (so state 0 is an absorbing barrier of the time- - ee random ‘walk {Z,} on {0,1}). If Z,_, = 1, then X, = 1 = 0, 1,...,n—1), and by the Markov property of {X, } pz, = 1Z,_, = 1) = pnX, = 1/X,_, =) = 9, pr(Z, = 0/Z,_, = 1) = herefore, given Z,_,, we need no further information in predicting Z,,80 {Z,} is an MC. -Z,, is a Bernoulli rv (Exercise 3 of Chapter 2). Also _ fo Z.,=0) Primera ite 8, eel ity of success, pe 1) Pe E,,2, an Ez, Balan 1=2Z, Z, = E,,_,0,Z,-1 = 0,pr(Z,_, = 1) = 0,(0,_, pr\Z,_. 
= 0} 0, 1-0, Pr(Zy = 1) = 6,...0, PXg = 1) a Consider an MC {Z,,:n = 0, 1, 2, ...} having time-independent transi- n matrix P. If the distribution of the initial state is pr(Z, = i) = p, (i = 0, 1, 2,...), then by applying the Markov property to equation 2.23 e obtain the joint distribution of as Bin see PZ = fgg Zy = iy ores Zy= i) = PigPioiyPirins**Pinsin (41) Let us denote pr(-|Z, = i) by rit then pr{Z, = k) = p,, and by he Markov property and time independence prZ,=J1Z, =") = ptZ,=/1Z,- Wry After its first step the particle must be in one of the states of S. Therefore, by equation 2.12, f f= puiZ, = N= Ye pr Z, = j|Z, = Npr(Z, = = F PaPay In matrix notation the right side is the (i,j) element of PP = P?, where P is the one-step transition matrix. The two-step transition matrix fp) is thus P?. Similarly (9) = P®, and so on. In general pint = YP pfmpty Kes This is called the Chapman-Kolmogorov equation. In matrix notation pete bors We can define P® ina natural way to beI = 6y) the identity matrix, where is the Kronecker delta (= 1ifi = j, = Oifi # j). Oy Problem4.7 Consider an MC {X, S = {0,1,...,s} and transition matrix P = (p,,). Suppose that the MC has been thinned according to the matrix P = (y,,). That is, given X, =i and X,,, = j, we delete X,,, from the realisation with probability and retain it with probability 1—y,,. This thinning has the Markov property. Show that the thinned process {Z,: n = 0,1,2,...}, where Z, = X, is an MG, and find its transition matrix P* = (p). Solution. Each Z, of the thinned process is an X, of the MC. Therefore PZ, 41 = JZ =i Zp = ty > A= ih) f = priX,.,, =J)Xo=io X= --> X= 4) forsome 7 SEQUENCE M57135---s Mya i,) by the Markov property of {X, } ) i, = Pr, = prZ,41 = so {Z,} isan MC. We consider a decomposition based on the first step, whether or not it is deleted. Let IIx, Iz 1 ifthe first step is deleted is Y= y te 0 otherwise Then Baeoet = Bry rzuaacit™ Exstto=t Ex =Xi.20~1Ezsix = xnY=¥izont We pick out the coefficient of s/, ie. (Z, = j|Z =i Pee ee, =i|X,=%, Y=¥ pr(X, = klZ, == Py Eyyjenzo-1¥ = PUY =1/X,=k, Z=D=% (43) by the Markov property, if X, = k, then it is as though the thinned starts again from k. Therefore (Z, =j|X, =k Y=0, Z, = i) = 4,, (the Kronecker delta) (Z,=j|X,=k, Y=1, Z, =i) = pr(Z, =j|Z, =) = pty (Z, =j|X, =k Y=y, Z) =i) =(1-y,+yPh (Y= 0,1) onl V06z,,+ ¥P%,)} = Eg yzonilll-71x,)5x,)+ 7x, PRs} (by 4:3) = Y Pull—79)5)+%aPh} = Pyl—-ry+D Pa Ya Phy is is © = Py 4+ Y duPhy where Q = (a,) = (Py7y) P* = P-Q+QP* d-Qp* = P-Q Pt = (1-Q)""P-Q) o : Equilibrium distributions We seek the limiting form of P" as n + 00, limit exists. ym 4.8 Find II = lim P* for a time-independent MC having state = {0,1}. ‘Solution, A realisation of this process will be a sequence of 0s and Is. transition matrix P. must be of the form p=('5? re @ F(s) as n — co; then oF = (1— —enee 1F = ary Therefore 4 F(s) = {c(1+s)}*, where c is a constant. F(l) = El =1 = (2c =e 3 FQ) = ¥ pest = Lats 2 ¥ ("se en eene oa 2 F bo \k eb CG) oat ie, the equilibrium distribution of {X,} is Bin(2a,}). i eo ‘ ey ee ae te : ee te If we set s = 1 in equation id put M, = F'(1) we obtai a = a. However, if we differentiate equation 4.9 we obtain @F,, (8) = (1—s*)F(3) + (@—2)s F's) +0F,(s) Now, if we set s = 1 and « = 2a, we obtain 2aM,., = 2a—2)M,+2a if M,.,-4 = (1-Z)o,-a Therefore 1 ly M,-a= (Dom: —a)= (2) (M,_.—4) = (1-2) (My — ie. M, = a+(1 -i) a a Problem 4.12 A simple queue model. A single server at time it 1,2, ... Serves a customer, if any are waiting. 
Suppose that in time int (n,n+1) X, customers arrive, where {X,,} is a sequence of indeper tvs, each distributed as rv X, where pr(X = k) = p, (k = 0,1, 2,. rv Z, is the number of customers present (including the one being se1 at time instant n, show that {Z, : 1, 2,...} isan MC. Find its tran tion matrix and equilibrium distribution. p Solution. We prove that {Z,} is an MC by showing that Z,,,, dep only on Z, and X, and on no earlier information. The number of custo present at time n+ 1 is the number present at time n, less the one served | time n, plus the number of new arrivals in (n, n+ 1), i.e. ae fe-1+%, (Z, = 1,2..) pa (Z, = 9) = Z,-4,+X, me Ta Therefore Po; = PUZ,41 = j|Z, = 0) = p(X, =j) =p, G= 01,2. and for i # 0 Py = PrZ,,, = J|Z, = ) = pr(Z,-1+X, =Jj|Z, =) i Por Gm tL hitt.) riot) 4 OG wi Ee via inet Pr Ps Pas, Pr Po Po Pi We solve = WP form’ = (zy x, x3...) and find = 1 He MePt YA Prrj K=O.) (4.11) hese may be solved successively to find , in terms of x, Then E x, = 1 vill give the value of m5, Alternatively, let us define pgfs Bp i 19) = ¥ ms Pt) = ¥ sts n P(s) = Es* is known. We call EX = p the traffic intensity (= rati expected number of arrivals to expected number of departures). We ply equation 4.11 by s* and sum for k = 0 to co: Por EE tts oS < Pf s)+- ¥ xs! g changing the order of PCS) ae . resrraseat eae summation) = my PES)+H{M9)—9} 1) my = =A a = {(1—s)P(s)} xp lim U=9PO) _ 5. tim“ (rHopital's rule) ae, Rayer led aE {P(s)—s} im LSPS) — Pls) = P@-1 (=p) P(s) % The condition for stability (ie. an equilibrium solution) is p <1, be- cause ifp = 1, then, = 0 so MI(s) = 0, and ifp > 1,7) < 0, andisnota probability. If p > 1 the queue size will tend to grow without limit. Problem 4.13 Suppose that in the simple queue model there is wail room for only m customers including the one being served. Customers arrive to find the waiting room full leave without being served and do — not return. What is the transition matrix? Show how to find the equilibrium — distribution. Solution. Z,,, = min(Z,—4,+X,,m) a (Z, =1,2....m; X, = 0,1,...,m+1-Z,) II(s) = ee (Z,=0; X, =0,1,....m—1) be m (otherwise) Therefore Po; = PtZy41 = i|Z, = 0 ; {rts G =01,...5m—1) =} PAX, > M) = Ppt Pmerte-+= Py» say (i = m) t 0 (otherwise) and, for i = 1,2,....m, 4 Py = PrZ,., = JZ, = 9 a PX, =J+1-) =Pyiy, G = i-Li...5m—1) nf 3 ei G=m) 0 (otherwise) Oxad 2 Sata O [Po Py Pa + Pmoa Vo [Po Py Pa ++ maa Pee2 [0 my Prov Paca m OP 0" Or Ps, In solving n*” = n*P*, the first m equations are the same as in (4.11 Therefore x* = cx, (j = 0,1,...,m—T), where ¢ is a constant, This is because each n* (j = 0, 1,..., m—1) is the same function of n$ as x, is of T- The last equation is a= = mp, me Cre =P, 2+ at Peas jtmBP, Te Ph, = Pmt Pm +1 +++ This reduces, where »,, is known by Problem 12, to Poftm = (1—P,)nn = {roPat aed = Cm say maa TF = Poe Ym +c i enlipaed ‘roblem 4.14 The discrete branching process. Each individual in a popu- 7 gives rise to X new individuals for the next generation, where rv X _ has pgf G(s). Individuals reproduce independently of one another and die. This occurs at the same instant for each individual after a fixed constant lifetime. If the process starts with a single individual, show that (5), the pgf for Z,, the population size in the nth generation, satisfies -Tecurrence relations T+1(9) = H,{G9)}, H+ (8) = G{,(9)} (4.12) where clearly I7,(s) = s and IT,(s) = G(s). Solution. We consider decompositions based on the last and the first ations respectively. 
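The equilibrium distribution of Problem 4.13 can also be obtained numerically once an arrival distribution {p_k} is chosen. The Python sketch below is an added illustration: it assumes, purely for definiteness, that the number of arrivals per service interval is Poisson with mean 0.8, builds the transition matrix given above, and finds the equilibrium row vector by repeatedly applying π ← πP.

```python
import math

def stationary_queue_with_waiting_room(m=5, rate=0.8, iters=5000):
    """Equilibrium distribution for the queue of Problem 4.13 with waiting
    room for m customers.  Arrivals per service interval are taken to be
    Poisson(rate) purely for illustration; any distribution {p_k} would do.
    """
    # Arrival probabilities p_k, truncated far enough out that the tail is tiny.
    p = [math.exp(-rate) * rate ** k / math.factorial(k) for k in range(50)]

    def tail(k):                      # pr(X >= k)
        return max(0.0, 1.0 - sum(p[:k]))

    # Build the (m+1) x (m+1) transition matrix of Problem 4.13.
    P = [[0.0] * (m + 1) for _ in range(m + 1)]
    for j in range(m):
        P[0][j] = p[j]
    P[0][m] = tail(m)
    for i in range(1, m + 1):
        for j in range(i - 1, m):
            P[i][j] = p[j + 1 - i]
        P[i][m] = tail(m + 1 - i)

    # Power iteration: repeatedly apply pi <- pi P until it settles.
    pi = [1.0 / (m + 1)] * (m + 1)
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(m + 1)) for j in range(m + 1)]
    return pi

print([round(x, 4) for x in stationary_queue_with_waiting_room()])
```

Because the chain has finitely many states and p_0, p_1 > 0, repeated multiplication by P converges to the equilibrium distribution regardless of the starting vector.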
We can write Zyey = Xyt..+Xy, (4.13) nce each of the Z, individuals in the nth generation has his own family. family sizes {X,:i = 1,2,...,Z,} are independently distributed, each with pgf G(s). Then, by Problem 2.14, Z, , , has pgf I1,{G(s)}. Alternatively we can consider the populations which grow in the re- gn generations from the Z, = X born in the first generation. Then Zui = Z+., +ZM (4.14) p “where the Z\? are independently each distributed like Z,. Then, again by jblem 2.14, Z,, , has pef G{IT,(s)}. o ~ Making parents die when they give birth is no restriction since we could have a parent survive as one of its offspring if G(0) = pr(X = 0) = 0. ‘This is because individuals behave independently and alike, so parents and offspring will be indistinguishable. On the other hand, if we have _G(0) > 0, the number of children is not restricted to one or more, so it will be possible for the population to die out. We then have the problem of - determining the chance of extinction. t Problem 4.15 Dees Me rz, Ae We Vz, Boise size in the nth generation. Solution. Z, = 1 so M, = 1 and V, = 0. From equation 4.13, Ep estes Zar = EO to Xy) = KEX = ke where « = EX =. G'(1) is the mean family size. Therefore Myes = E2541 Br Eznestenntn Zar = Bz,(Z,%) = This is a recurrence relation, and obviously M, = oM,_, = o(aM,_,)=...=@"M, =o" Also, since the X, are independent, Vz, ssizneZons = VX, +... X,) = EVE = kp where B = VX = G"(1)+a—a?. Therefore, from equation 2.22, = Vy 41 = B2V 2, sstznetnZatit Ven Bz 5 sla Zalutd = E,,(Z,B)+Vz(Z,0) = BM,+02V, = pal +0?V, This is a recurrence relation. If 2 = 1, then we easily see that V, = nf. If « # 1, we can solve successively with n = 0, 1, 2, .... Alternatively, let us use equation 4.14 By gums Zyer BZ, © x, = sot = xVZ, = xv, Yars layesixex Zn Therefore Very Vlics = Ea Neugi ox 2itat VaEacyinex2iva ; = E(XV)+V(Xo") = VEX +a"VX = a*"B+aV, 4.16) This is another recurrence relation for V,. If« # 1, we solve by subtracting : (4.15) from (4.16) to obtain a(1—a)¥, = Ba'(1—a") ie. Bi @=1) Problem 4.16 Show that the population process {Z, : 5 is a Markov chain, and find its transition matrix. If Z, = 1, what is the chance that the population will eventually become extinct? What if the process starts with i individuals? Solution. Since Z, nt = oe 2, = 1,2.) 0 Z, 9%) and the X, are independent rvs, each having known pgf G(s), we see that the. process is an MC. Let py = PrZ,.. = JZ, =) Po; = 5g, (the Kronecker delta), ie. state 0 is absorbing 1y = PAX = j) = pj, = But what is p,, (i = 2, 3, 4, = i, from (4.17) and Problem pp 212, FO) = 3 Put! = East 80 p, is the coefficient of s! in {G(s)}. IfZ, = 1, the probability that the population is extinct on or before the nth generation is p", = pr(Z, = 0|Z, = 1) = q, say. Clearly q, increases as time passes, that is as n increases. We can assume that 0 < py = Py = q; < 1, since if p, = 0 no extinction is possible. The sequence {q, :n = _ 1,2,...} is therefore bounded above (by 1) and increasing, and so must have a limit g. But by (4.12) 9, = 11,0) = G{I,_ ,(0)} = G(a,_,) (4.18) Therefore, as n - 00, limit q satisfies q = Gq) (4.19) ey be another solution and, if there is, it is the one we want. To prove _ this, let q* be any root. Then since G(q) is a power series in q with positive coefficients (they are probabilities) it increases in < q < 1 asq increases, so by (4.18) 7 = G0) < Gq") = q* (4.20) E % = Glq,) < Gig") = q* (since, by (4.20) q, < q*, so G(q,) < G(q*)). By induction, q, < q*. 
Therefore, as n + co q = lim 4g, < a* and so the chance of extinction, q, is the smallest positive root of q = G(q). Ifthe process started with i individuals, the probability of extinction is ‘the chance that each of the i lines independently dies out. It is therefore q', where q is the smallest positive root of q = G(q). a Computationally we have described a simple iterative procedure for ving equation 4.19, namely Gu+1 = GG,» 91 = Po The condition separating the cases g = 1 and g < 1 is given by obsery- "ing the slope at s = 1 of the function G(s) (see Figure 4.2). i he Figure 4.2 The slope at s = 1 is G'(1) = EX = a, the expected family size; and so =1 ; <1 q " 1 8ecording to whether # {s i 4.4 Classification of the states of a Markov chain State j is periodic with period d if pm >0 (m=1,2..), po =0 (n# md) ie. the particle can return to its initial sas j only at times d, 2d, The period d is the greatest common divisor of all n for which p\ > 0. If d = 1, then state j is called aperiodic. Each state of the random walk on the integers (Problem 3.1) has period 2 since the parti can only return to it in an even number of steps. Let ry T), ie the time at which the particle returns to state j for the time, aiete, 1 if the particle stays in j for a time unit. The state j is See Rtmeeye Teal meaty Sere Let rv T,, be t fe time for the particle to go from state i to state for the ‘ first time. Then T,, is called the first passage time from i to j. (See Problem is) 3.15) Let T,, have paf F fs). If P,(s) = x pis", then Pi), = Fy(9Py(9) where 6,, is the Kronecker delta. This is the renewal equation. Problem 4.17 Derive the renewal equation. Solution. Let ry Mf) be the time at which the particle starting in state i is in state jfor the kth time, Then M{?) = T,, and Mi) =T, +7 Pe TD. TG) (k= 2,3,..) where each ae by ai eae property, is independently distributed, — b the time independence ofthe MC, each Ty? is distributed as T if MP has pef G,a(s), fey Fs) {F {9}! = FifS)G, 4-108) (k= 1,2,3,..). pe Gace» or M=n or ...) = map = n)+pr(M{) = n)+.... PMultiply by *, and sum over n = 1,2... PS) = 5 DMS" = Gij(s) + Gils) + Gials) +... (4.22) . = Fis) (1 + Gijy(s) + G(s) + ---) = Fy (1 +{P, (Py }] ‘by setting i = jin (422). Then (4.21) follows by noting that pl? = 6,. 0 When i = j and s = 1 we have P, (1) = {1-F,()} then, since pr(T,, < 00) = F,,(I), the condition F,,(1) = 1 is equivalent to P,,(1) = 0, and F,,(1) < 1 to P,,(1) < 00. ‘Tide tate j tevecterrant ox transient according to whether P.,(1) = co or P,y(I) < 20. -roblem 4.18 Determine F,,(s) and P,,(s) for the unrestricted random walk of Problem 3.1 for every integer i and j. oF -_ Solution. The solution of Problem 3.4 is that 5 Pools) = us), Pox (s) = {u(s)—1}/2gs (4.23) here 1s) = (1—4pqs*)~#. By symmetry we can interchange p and q to Po (s) = {u(s)—1}/2ps state i, the particle cannot be allowed to pass through state 0 it is reaching 0 for the first time. State 0 can therefore be treated as an absorbing barrier. Then (for i > 0) F,,(s) is the pgf for the time to a gambler’s ruin if he starts with £i. In Problem 3.11 we saw that eS Fio(9) = {Fio(}' @ > 0). By the stationarity of the random walk (Problem 3.5) Pig sj) = Pig S. Fis j(3) = Fin) os equation 4.21, Fo9(s) = 1—{Poo(s)}"* = 1—{u(s)}-* Pig(8) = Fio(S)Poo (8) = {Fy (S)}‘Pools) (i = 1,2...) 
ae TThetelees od Tora t = Fo(8) = Pyols) {Pools} * = [1—{us)}“V/2ps “The alternative solution of Problem 3111 is completod by equation aaa Together the displayed equations from (4.23) to (4.24) enable us to deter- mine F,,(s) and P,,(s) for every i > j. Similarly F_,o(8) = {F_10(9}' where, by symmetry, F_, (8) is F,o(s) with p and q interchanged, ic. [1 = {us)}~*]/2¢s. Then P_,o(8) = F_1,0(8)Poo (8) = {F_-1,0(9}'Pools) (i = 1,2...) and so we obtain F,,(s) and P,,(s) for every integer i and j. Problem 4.19 Determine ET, and VT. for the unrestricted rand walk. Solution. From F , .(s) by equations 2.24 we find that if p < q ET) =(9-p)*, VT yo = 4pa/a—p)* Therefore, from equation 3.11 = GET, = G—p)*, VT) = iVT 9 = 4paiq—py If p > q, F,(1) = ©, Fi(1) = & and so the moments ET, and VT, do not exist. State i is mull-recurrent if iis recurrent and ET,, = 00; state iis peste recurrent if iis recurrent and ET,, < 0. State i leads to state j (we write: i -+ j) if for some integer k > 0, pi) > 0. The particle then has positive probability of reaching state j from state i. States i and j communicate (we write: i j and j > i. Now is — an equivalence relation, so the states may be partitioned into equivalen classes, called the irreducible classes of the Markov chain. A chain of just one class is called irreducible. Clearly, whilst it is possible fo a particle to leave an irreducible class to visit another, it can never return, otherwise the two irreducible classes would communicate and be a single — class. The random walk with 2 absorbing barriers has 3 classes. Both periodicity and recurrence/transience are class properties. For an MC with only finitely many states, at least one state must be recurrent, — and every recurrent state must be positive recurrent. The reader should be able to argue why this is so. ; Kolmogorov's theorem is that if an MC is irreducible and aperiodic, then — there is am, such that a pin, as n+o c ero “ e {n,} is the equilibrium distribution, which by this theorem exists, hich we have seen satisfies equation 4.4: w=xP ‘an MC has an equilibrium distribution it is called ergodic. It then forms a si irreducible class, and every state is positive recurrent and aperiodic. _ Problem 4.20 Show that the unrestricted random walk with p = q = } is null-recurrent. Py(s) = Poo(s) = (1—-s?)-* P, (1) = © "so the random walk is recurrent. F, (8) = 1—{P,(s)}-* = 1-1-3? ET, = F,(1) = lim s(1-s)-# = 0 sat refore the random walk is null-recurrent. o for a disposable or edible commodity, such as razor blades or s of wine. At each time instant an item may be consumed. When stocks run out, j items (j = 0, 1, 2, ...) are purchased with probability r,. Consider an irreducible MC with transition probabilities Poy =, G = 0,1,2,..) ; ei =P @ =1,2...) cs Puy-1 =F =1-p (i= 1,2...) Find the pf F,, (s) for the first passage time T,. from state i to state 0. S that the MC is positive or null-recurrent according to whether the of the distribution {r,} is finite or infinite. Solution. In state i (>0) the next step is to i—1 with probability q or no _ change with probability p. Now r To = Tyger +Ti-1y-at +++ Tho _ where each component is independently distributed like T, 4. We consider ‘a decomposition based on the first step, X. Given X = —1, T,) = 1 and Ber ee = s; and given X = 0, T,, = 1+T7, where Ti, is distributed like T,,, so Eryyxa0 57" = Bnjgs)° = SF io(8) | Therefore Fyo(s) = Est! = Ey Ey. 
x57 = PAX = —Is+pr(X = OsFy9(8) = 95+PSF 19(8) Therefore F ,9(s) = 9s/(1—ps) Therefore Fyo(s) = {Fy9(s)}' = {as(1—ps)}! The chain is given to be irreducible, (The sequence {r, } must therefore have an infinite number of nonzero elements.) To study recurrence we need therefore investigate only one state. It is convenient to examine state 0. — GiventhatX = 0,thenT,, = 1;andgiventhatX = j,thenT,, = 1+Ty- Therefore Ertan? = Erg! = s{Fio(3)} Therefore o i % Fool) = ExExygenx5™® = roses © 7,{Fio()) = 5 (Ss) 1 ma jo 1—ps, = sR{qs/(1—ps)}, where R(s) = ¥ r,s! fH Therefore Foo(l) = R(l) = 1 since {r,} is a distribution State 0 is therefore recurrent, and so the MC is recurrent. Further, Foo(1) = Me (1), where R’(1) is the mean of the distribution {r,}- Foo(1) is infinite or finite according to whether R’(I) is infinite or _ finite. This is the condition for state 0 (and so also the MC) to be null or positive recurrent. oO Problem4.22 Each morning, records of the previous day's business arrive at the accounts office to be entered in the ledgers. The office on any day is capable of writing up k days’ records with probability p, (k = 0, 1, 2,...)._ Show that, if the office staff work diligently, Z,, the number of days’ records at the start of day n waiting to be written up, forms a Markov _ chain, and find its transition matrix. f Show that the trial solution x, = (1—A)4/~" j= 1, 2, ...) satisfies the equations for the equilibrium distribution. What in this equilibrium situation is the probability that the previous day's accounts will be dealt. with immediately on arrival at the office? etn Xb uni: art od al 9 iting on day n plus that for day n, less those dealt with on day n, i. Zep, = Z,+1-X, & this depends only on Z, and an rv X, which depends only on Z,, $0 - the process is an MC. Now X, is the number capable of being dealt with if at least that number is waiting, otherwise X, is the number waiting. ‘T erefore ‘ pax, =/1Z,= 9-43 n= Psy U=9 0 G =0,1,. a) (otherwise) By =pdZ,,, =/1Z, = = prZ,+1-X, =j1Z, = 9 = Pax, =i+1-j|Z, =) = (otherwise) the transition matrix is : Pa Udy Bi iden Aireda=s 12/2 pp 0; 040 P=2/P, Py Pp 9 0 P. Mea 7 Se eitrinm cautions, x = xP, are % Py +m, Pa +m, Pt: M1 = MPot hs PitMeaPat ? (= 1,2...) Using the trial solution: n, = (1—2)4/~', we find that 2 must satisfy a = a p> Be ere G(s) is a known pgf. This equation emerged in Problem 4.16 about e discrete branching process, We saw there that if G'(1) < 1 then the equation has just one root, 4 = 1. Then x, = 0 and does not form a - distribution. For equilibrium, therefore, we must have G'(1) > 1, and take to be the smaller root, 45, of = G). A= Gi) 4 ate They sic eles A Uy wale, i ledgers first thing on day n is prZ,_, = 1, X, #0) +2, (1—py) = (1—Ag)(1—py) as nm +20 Problem 4.23 Consider the library loan process of Problems 1.5 and of If F(t) has a finite mean, show that the MC is irreducible. aperiodic and Positive recurrent. Find its equilibrium distribution. Solution. The MC is irreducible since, for each (i,j), k = j+1 is suc that p > 0. The possible path from i to j in j+1 steps is i to 0 on the fi eae on the second, then I to 2, and so on, ending on th CR ee Pi” > Pig Por Pia--Pj1,j > 9 Aperiodicity and recurrence are class properties so we consider only the one state, 0, of the irreducible chain. Since p{!) = F(1) > 0, the MC is _ aperiodic. State 0 is recurrent if pr([7+1] < co) = 1, where [T+] the number of weeks the book is on loan, ie. the smallest integer no smaller than T, and is the first return time to state 0. 
Now pr([T +1] < «) > pr(T+1 < 0) = F(a) = is Now since E[T +1] < E(T+1) = ET +1, and ET < co, we have that E[T +1] < ©, so the MC is positive recurrent. An irreducible, aperiodic, positive recurrent MC is ergodic and has a equilibrium distribution, r’, given by x’ = WP, a FQ)-FQ)_ FG)—FQ) T= SAU eryy “omy Ries = aa (=0,1,2,...) ms 1-F() 1—F(j+1) 1-F(j+1) = "1 —FG-Df | TFG) § ~ 1 =FG=1) = % {1—FUi+ D} Therefore 1, =m {IF} G= 01,2...) Since = 2 =% Y {1-F}, oO =o Leer) (1- FW} "= . . rer Hi oe > can easily show that > {1—F(®)} = E[T +1]. lem 4.24 Achilles.and the tortoise. Zeno (495-435BC) reported on a between Achilles and a tortoise. Suppose that the tortoise is given -w time units start. Suppose also that during each time unit it runs a distance ‘of a yard or else it stays where it is according to an independent Bernoulli trial with probability of moving 6 (0 < @ < 1), and that Achilles runs at a steady rate of one yard each time unit. During the initial w time units the tortoise will run X,, yards to position _Z, = X,. While Achilles is reaching Z, , the tortoise will run on a further X,, yards to position Z, = X, +X,. In general while Achilles is running ‘from Z,_, to Z,, the tortoise is running a further X,, , yards to Z,,, = , in = 1, 2,...} is a Markov chain, and that EX, = w0"; find EZ, and the mean of Z = lim Z,, the position of the tortoise "when Achilles catches it up. ao Determine the distribution of Z directly. Solution. X,., , depends only on X, (it is as though the race starts again __ with the tortoise having X,, time units start), so {X,} isan MC. Clearly 0 is an absorbing state, when Achilles catches up with the tortoise. Now X, is Bin(w, 6) and X,,,|X, = k is Bin(k, 6). Therefore Myer = EX ys = Ex ExysstayeeXnes = Ex,(%q 6) = OM, = 01M, = &-1(w6) = wo" (70 as no). «BZ, = BX, +..4X,)= DMA Py ket Directly: the tortoise runs a distance Z before Achilles catches up with it, _ where Z is the number of successes before the wth failure in a sequence __ of independent Bernoulli trials with probability of success 0; therefore Z __ is negative binomial, NB(w, 1 — 6). o EXERCISES “1. Nuclear chain reactions. Consider a population of neutrons which ~ are being bombarded by other particles. The neutrons are inactive unless 58 TAS ws directly hit by a particle. By fission the particle is then broken into fixed number, m, of the neutrons, and energy is released. Under a stream of bombarding particles each neutron independently in each short unit time interval has a small probability @ of receiving a direct hit. If the number of neutrons initially is mi,, show that the number present at the nth unit time point is a Markov chain. What is its transition matrix? 2. By defining the sequence {X, : j = 0,1,2,...} of Bernoulli rvs, wi X,=1 if Z, =i, and X, = 0 otherwise, use the result of Problem 4. to establish equation 4.1. 3. The simple queue model. Show directly from equation 4.10 that TI, (5) = Es satisfies recurrence relation $I, ., 6) = {(6—DpxZ, = 0)+11,(8)}P43) where P(s) is the pgf for X. If 11,3) + 11) ~ 359 anises show directly from EZ,,, = EZ,—E4,+EX, that n, = 1—p, where p = EX. By letting n co in equation 4.25 find I1(s). 4. A single-server queue with batch arrivals. Let us call the nth customer served C,. Let B, be the number of customers in the first batch to arrive _ after the departure of C,, and A, be the total number of customers who arrive while C, is being served. 
If Z, is the size of the queue at the departure of C,, show that 2 Sa An 2m lo) tt 1B.-14+4,,, (Z, =0) Suppose that when Z, = 0, B, is independent of A, ,, the B, have pgf B(s) and the A, have pgf A(s). Use the method of Exercise 3 to show that T1,(9), the pgf for Z,,, satisfies $1,.1(8) = [{B)—1}p(Z, = 0) +11, (9)] A) Deduce the form of the limiting pgf. 5. Consider the 2-state Markov chain {X, :n = 0,1,2,...} of Problem 4.9. Show that the process {Y,: = 0, 1, 2,...}, where if Xan» Xone) = 1) if(XaqsXaq41) = (1,0) if (Xan Xone1) = (0,0) if (Xan, Xone) = (11) ta he=0,'1, 2; <2} BREE Te ts eres ot icd Graco, ‘2 and 3 from the sequence {Y, }. Show that {Z,} is a 2-state MC h transition matrix poo = Pi; = Por = Pio = 1—A, where G-2-A)*. ‘A branching process. Suppose that every individual at each time 0,1,2,... becomes a family, where the family sizes are independently distributed with pgf G(s). During each unit time interval (n,n+1) immi- ints arrive in numbers which are independently distributed with pgf Show that J, (s), the pgf for the population size just after the nth eration, satisfies TI. (8) = H{G()}11, {G(s)} A branching process. For the discrete branching process of Problem let rv Y, = Z,+...+Z, be the total number of descendants in the st k generations. By a decomposition based on Z, , the number in the ration, show that H, (s), the pgf for Y, satisfies recurrence relation A, (s) = G{H,_,(9)} luce a formula for EY, in terms of EY, _, and the mean family size EZ, , d use it to find EY,. Find the classes formed by the states of the Markov chains with the ‘ing transition matrices. Describe the classes as transient, positive- nt or null-recurrent. Poo 1 P= Pan = Pur 2 = 23% Py =0 (otherwise). Seam oO ONMNE aha O OREEE _ Arandom walk with a single impenetrable barrier. Show that the Markov with states {0,1,2,...} and having transition probabilities Poo = 40> Por = Po Puna =% Paar = Bh = 1,23...) Py = 0 (otherwise) only ifthe series Er, eee where = (Po Ps +++ Pa M41 42-+-Ie+1) 10. The dam storage model. Consider the dam storage model of Pro! 1.6. Suppose that the daily inputs {Y,} are independent, identic: distributed rvs with pgf G(s) =) g, s*. Show that {Z, }, where Z, is the io 3 content after the unit release on day n, is a Markov chain, and find transition matrix. i Tf (m,2,,-:+s%»—,) is the equilibrium distribution, prove that % = m/mq (k = 0, 1, 2, ..., w—2) does not depend on w, and that _ 4, ‘ ©, "5 = GO)(1—s{G(s)—s}. Ifthe inputs have a geometric distribu- bution, what is this equilibrium distribution? ’ 11. Achilles and the tortoise. For Problem 4.24, show that VX, = W010), — cov(X,,X,,,,) = wor*™(1— 6"). . now consider stochastic processes in which changes of state occur at dom time points. First we define the Poisson process {No, : t € [0, 20)}, which gives the times of these jumps. Let rv N,, be the number of point events which occur in time interval (t,c]. If the stochastic process — {No, :t € [0, 0)} is ‘time independent, i. for each k, pr(N,,,, =) depends only ont (5.1) ii) has independent increments, and is - (iii) orderly, ic. pr(N,,,, > 2) = o(t) ast +0 “where, if A(t) = o(t) as t + 0, then A(z)/t > O.as t+ 0 n the process is a Poisson process. priN,gea = 0) = 1-Ad-+0(4) pr(N,eea = 1) = 24 +0(4) PHN +4 2 2) = 4) (ii) N,,..4'8 independent of No, for all t and A. Solution. By (ii) {N g, } has independent increments. Since the probabilities in (i do not depend on 1, {No,} is time independent. We are given that (No, } is orderly; so {No,} is a Poisson process. 
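As a small check on the infinitesimal description used in Problem 5.1, the sketch below chops (0, t] into many short slices and, independently in each slice of length Δ, places an event with probability λΔ. The rate, horizon and slice length are arbitrary choices, and NumPy is assumed; the simulated counts should have mean and variance close to λt, consistent with N being Poisson with mean λt (Problem 5.2 below).

    import numpy as np

    rng = np.random.default_rng(0)
    lam, t, delta = 2.0, 5.0, 1e-3             # rate, horizon, slice length: arbitrary values
    n_slices = int(t / delta)

    # In each short interval of length delta an event occurs with probability
    # lam*delta, independently of every other interval, as in Problem 5.1.
    events = rng.random((2000, n_slices)) < lam * delta
    N = events.sum(axis=1)                      # number of events in (0, t] per realisation

    # For a Poisson process these counts should be approximately P(lam*t):
    print(N.mean(), N.var())                    # both close to lam*t = 10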
Oo roblem 5.2 Prove that, for fixed t, No, is P(At). Solution, We shall abbreviate No, to N, here and in the following prob- Let IT, (s) = Es", then T1,. (8) = Est 4 = Eset evan “By (5.2), N, and N,,,—N, are independent and, by (5.1), N,44—N, is stributed as N,—N, = N, (since Ng = 0). Therefore a 1045, pri, = we} ; = I1,(s){(1—24)s° +24s'+0(4)} bycondition (i) of Problem 5.1 Therefore TM) a fd) Sea = Ml -9),6)+ HO 10-99 TI, (s) =e #09 the pgf for a P(At) rv, where we have used I,(s) = Es’ = Es = 1. O Problem 5.3. The unrestricted random walk in continuous time. Suppose — that a particle starts at the origin and makes a sequence of independet steps: +1 with probability p, —1 with probability q = 1—p at instants T,, T,,..., which occur as a Poisson process having rate p meter / (see Figure 5.1). Figure 5.1 What is the distribution of Z,, the position of the particle at time te [0, co)? Solution. Essentially we have the bivariate process {Z,,N, : t € [0,00)}. The kth step is a rv ye 1 with probability p **)-1. with probability q which has pgf ps+qs~*. Then Z=X,t..4Xy, Z,=0 (iN, =0) 80, by Problem 2.14, G,(s) = Ez, s* = Ey,(ps+qs')"* = exp{—2t(1—ps—qs~")} Problem 5.4 Determine EZ,, VZ,. Solution. By equations 3.4, Exine=eZ: = KP—9) — Vaxine=nZ, = 4kDq Therefore, by equation 2.20, Ez, Z, = Ey,Ezinan, 2: = (P—QEn,N, = (P—Qat; and by equation 2.22 = 4pq Ey, N,+ V{(P—QN,} = 4pqit+(p—g)PAt = At " Alternatively, we could have applied equation 2.24 to G,(s). Problem 5.5 Find the distribution of rv T, the time to the first event of __ a Poisson process of rate 2. Solution. pr(T > 1) = pr (the first event occurs later than time t) : = Pr(No, = 0) = e* Pherefore, pr(T < 1) = 1—e~*, which, by equation 2.2, is the distribution ~ function of an (A) rv. o Problem 5.6 Find the distribution of rv T*, the time between events of process of rate 2. Solution. pr(T* > t) = Tim PrN. acrare =0|Nra =D = lim pr(N ys 424441 = 9) 4-0 since N,....4 and N,..4..+4+1 are independent) = pr(No, = 0) wy time independence), and so by Problem 5.5, T* is distributed as T nd is &(2). o Problem 5.7 First passage times for the unrestricted random walk. se p < q. Find the distribution of rv T,,, the first passage time from state k to state 0 (see Figure 5.2). Es 4 ° im 7 ; Figure 5.2 __ Solution. As in the discrete time case of Problem 3.11, a first passage _ from state k to state 0 implies successive first passages from k to k—1, from k—1 to k—2,..., from 1 to 0, i. Tho = Tha-1 +Tr-14-2 +-+-+Ti0 each T, ,_, is independently distributed as T,,. The characteristic function ¢4o (0) for Ty is therefore {¢,,(0)}*. We condition on the first el Which Seat ats rato teas 7) okies by Pot oe ease or a rs ee cals {rate =T with probability q OT Plas with probability p depending on whether the first jump, X, , was —1 or’ +1. Therefore $10(0) = Ep, 0° = Ex Erixy-x, 00? = qEe"+ pEeltt*T20) = qW0)+ py(O)Ee*™” = WO) {a+Pb29} = WO) [4+ P{>1()}7] where /(8) = 4/(A—i6) is, by Exercise 14 of Chapter 2, the cf for T. We — solve this quadratic equation in 1, keeping 0 constant; and since 10 (0) = 1 (for p 0, Pe Pift) = 1, ie. P(t) is a stochastic matrix. a (ii) A continuity condition. As t — 0, p;,{t) + 1; then by (i) p,{t) > 0 as 1 +0 for i # j; therefore p,{t) + 5, 4 ie. P(t) > Last +0. (iii) The Chapman-Kolmogorov equation holds, ie. P(u+v) = Pw)P(r) (u,v € [0, <0) If S is finite, then the unique solution of equation 6.1 is Pu) = e'@ = 3 war where Q is a constant matrix. Given Q we can find P(t), and given P(t) we can find Q. 
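A minimal numerical sketch of the relation P(t) = e^{tQ}: the generator below is an arbitrary two-state example (not taken from the text), the matrix exponential is approximated by truncating the series sum of t^n Q^n / n! given above, and NumPy is assumed. The rows of each P(t) sum to one, and the Chapman-Kolmogorov equation P(u+v) = P(u)P(v) holds.

    import numpy as np

    # Illustrative generator (Q-matrix) for a two-state chain: off-diagonal entries
    # are transition rates and each row sums to zero.  The values are arbitrary.
    Q = np.array([[-1.5,  1.5],
                  [ 0.7, -0.7]])

    def transition_matrix(t, Q, terms=60):
        # P(t) = e^{tQ}, approximated by the truncated series sum_{n} (tQ)^n / n!.
        P, term = np.eye(len(Q)), np.eye(len(Q))
        for n in range(1, terms):
            term = term @ (t * Q) / n
            P = P + term
        return P

    P1 = transition_matrix(1.0, Q)
    print(P1, P1.sum(axis=1))                   # each row sums to 1 (a stochastic matrix)

    # Chapman-Kolmogorov equation: P(u+v) = P(u) P(v).
    print(np.allclose(transition_matrix(2.0, Q),
                      transition_matrix(1.2, Q) @ transition_matrix(0.8, Q)))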
Thus Q can be used to ‘represent’ the process. é Ifa remainder term A(t) = O(t) as t + 0, then |A(0/t| < c, a positi constant, as t + 0. Then from equation 6.2, as 1 0 P(t) = 14+ 1Q+0(") Pift) = 5 +tq,,+O(0?) fg lt ee 4y = fig HEM =F oy0}| = HO) say since 5,, = p,{0) (by (i). If A(®) ~ ct as t+ 0, then A(t + ¢ as t> 0. Therefore, for i # j, p,jt) ~ qyt as t+ 0. Thus q,,t is the probability of a jump from i to j in short time interval t; q,, is termed the transition rate 6.3 we can show that if i # j then q,, > 0 and that q,, < 0. We th set q, = —qy 2 0. Problem 6.1 Let rv T, be the time spent in state i before a jump from it. Prove that 7, is &(q,). Solution. In Problem 5.6 we proved the special case for the Poisson process. For fixed , define ar , tr 4 es er ae oe le _ PO) = pr{Z, =i at times t = t/n,2t/n,...,(n—I)t/n,t|Z, = {p,(t/n)}"_ by the Markov property and time independence = {1—g,t/n+O(P?/n)}" + ee asn + 0 asn + co, Pi) > pr{Z,=i (<< 2)} = pr; > 2). Therefore é pr(T; < t) = 1—e~*, the df of an 8(q,) rv. o _ If0 < q, < ~, then state i is called stable; if g, = 00, then i is called instantaneous; and if g, = 0, then i is absorbing. By Problem 6.1 a particle instantaneously jumps from an instantaneous state, and never leaves an The Kolmogorov differential equations relate P(t) to Q. The backward equations are 70 _ ape PA) = & Gu Prj(t) are generally valid. The forward equations are ah) _ simian vif0) = 5 Paley these do not always hold. 2 Applications _ Problem 6.2 The general time-independent birth and death process with mmigration. Consider the Markov chain {Z, :t € [0, o0)}, where Z, is the ion size at a time r. If Z, = i(i = 0, 1, 2,...), then in a short time nterval (t,t+A), Z, increases by one (by a birth or the arrival of an immigrant) with probability 4,4+0(4), decreases by one (by a death _ or the departure of an emigrant) with probability .,4 +.0(4), or does not change with probability 1—(2,+y,)4+0(4). Any other changes must efore have probability o(4). Clearly 4) = 0. What is the matrix Q? the Kolmogorov forward differential equations, and determine their Solution. The state space is {0,1,2,...}- Since the transition matrix P(d)is 2,4+0(4) (= i+) 1-G,+u)4+0(4) G =i 1A +0(4) G=i-1 0(4) (otherwise) (64) a 0 —W-A, A, ” 9 a a eal Let us abbreviate p,{t) to p,, and use pj, to denote (d/dt)p,(t). Then, i Z, = i, the forward equations are Pio = —AgPio + Hs Pix Pry = Aya Puy- 1-H Ay tHe rPajer UF = 12,.-) For the equilibrium solution, let p,{t) > 7, as t + co, then 0 = ~Agm y+ Hm, 0 = Ay 1-H) —Aj— Hyer) G=12.-) Write Ty Amy MyerF%er UV = 0,12.) Then =-l 0=1,,-1, G=1,2..) 80 1,=0 G=0,1,2,....) Therefore Rye /My = Altyes Therefore My = (y/my_ 1) (Hj /j-2)-+-( [ro)o = Vil where y= SoBe zeae Yom 1 Hgftj— yoo By Then, since Ex, = 1, l=1 pe so m= H/d% = 01,2...) This will be a probability distribution if and only if Ev, converges. Problem 6.3 The linear birth and death process with immigration. Con- sider a population process {Z,:t€[0,0o)}, where Z, is the size of the population at time t. People in the population behave independently ibility -o(: eens Will give? Terie to “andthe” wasrnter: Pith bability 44 +-0(4) the member will die or emigrate, and with probability —G+n4+o(4) nothing will happen to the member. In 4 also, an m nt will join the population with probability «A +0(4). If initially - the population contains i members, what is the pgf, 17s), for Z,? Solution. We consider a decomposition based on the last 4. Then Zysa = Z,AX +. 
4X, 4W we 1 ifthere is an immigrant in (t, t+ 4), i.e. with probability «4 +0(4) x '0 otherwise, ie. with probability 1—a4+0(4) 1 ifthe Ith member gives birth in (t,t + A), ie. with probability 4A+0(4) —1 if the Ith member dies or emigrates in (t,t+ A), i.e. with probability pA-+0(A) 0 otherwise, i.e. with probability 1—(2+u)4+0(4). Es” = ads'+(1—ad)s°+0(A) = 1—(a—as)4 +0(4) Es*| = 2As'+{1—(A+p)4}s°+pAs~!+0(A) = 1+ {4s—(At+p)+us™'}4+0(4) “eX (Es) (Es%). ..(Es*) (Es) = (Es™) Es”) = Been —(iA+ipta)A +0(A)} + {ind +0(A)}s~! +0(4) the expansion and collection of terms which are O(1) and O(4). lore Zi+4=5|Z, = )= p(X, +...+X,+W = j-i (iA+a)4+0(4) G =i+1) 1-G+int+a)s+o(4) ipA+o(4) G=i-1) (4) (otherwise) It is clear that we could have written this, or equivalently the Q-matrix, it down. It is of the form (6.4), with 2, = id+a, y, = in, and so quations 6.5 are (0 = — Pio + Pi Py = (i -A+a}p, 1. —UA+u) + a}p+G+ Dap. je. G= een Sh a = Oforj=. a hold for j = —1, 0, 1, 2,.... We 68 by s! Hs! = As? j— Isp, j_, +ass~'p, , ‘We then sum over j = Ts) = EzjzgniS* = >, p,(t)s! ai aes gpa an Gs Font +p Then ot és ie. oT asa eS aT ot és We shalll solve this partial differential equation for IT. We shall not dis here why the method works. We solve the auxiliary equations dt ds aq 1 @s—p0—9§ ~ —a(l-s We shall suppose 2 # 41 (Exercise 5 is the case 4 = 12), then a—yar = 24544 as. al ay ie a eee les As—u I-s (GES) - —p)"IT = constant Since the partial differential equation is of the first order, it can have o one arbitrary constant, so the two constants must be functions of another, ic, ——e7 G-#! = constant Therefore (s— wr = Afteoa We determine / from the initial condition: Z, .€. I 9(s) = (s—yys! = (22) By setting r = (4s—p)/(1—s), so s = (u+r)(Z+0), we determine th function f(r). This, when substituted into (6.10) gives I = I7,(s). o siren Z, = 4 Z; hs the ma dnibtion a 29-7 «-+¥?, where Z* is the number of immigrants who arrived in 0 d are still present at time t plus their descendants who are still present _ at time ¢, and 1 if the jth of the i original members is still present at time t iia 0 otherwise (i= 12...) Then Z} and each of the Y¥are mutually independent, and Z** is distri- like Z, given Z, = 0. Also, each Yis a Bernoulli rv with probability ~ of success, pr(T > t) = e™ (by Problem 5.5) Therefore Ts) = EAH Belew te! fe need therefore have considered only the case i = 0. special cases have been given names: A = yt = 0: the Poisson process _ 2 = 0: the immigration-emigration process a = = 0: the linear growth process or the Yule process a = 0: the linear birth and death process roblem 6.4 Find EZ, for the population process of Problem 6.3. Solution. We can find EZ, and VZ, from the pef IT,(s). However, it is ot necessary to solve equation 6.9 to determine them. If we differentiate equation 69 with repost to s, we obtain oF is— yo gett {As—w(-1)+A01-)— as—1} 2 oll (6.11) 1) =1, en tae - ari 5 1 in equation 6.11 we thus obtain a first-order ordinary dif- _ ferential equation in M,: Meu aM, =a (6.12) st mmmeaie Since en 2 (em d Hemel... 7 (3) | ae EZZ AD} = {vam M) =% z, See ae where V, = VZ,, we can differentiate a tidied-up equation 6.11 with respect. to s, and let s > 1, to obtain a first-order ordinary differential equatio for V,, into which we can substitute for M, from equation 6.13. Note the KW = VZ, = 0. Problem 6.5 The single-server queue with Poisson arrivals and exponen tial service times, Customers arrive in a Poisson process of rate 4, and join the queue. 
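Before continuing, here is a small simulation check of the mean found in Problem 6.4. Taking equation 6.12 to read dM_t/dt = (λ−μ)M_t + α, the sketch compares a Gillespie-style simulation of the linear birth and death process with immigration against a crude numerical integration of that equation. All parameter values are arbitrary, and NumPy is assumed.

    import numpy as np

    rng = np.random.default_rng(1)
    lam, mu, alpha, z0, T = 0.4, 0.5, 0.3, 5, 10.0   # arbitrary illustrative values

    def simulate(T):
        # Gillespie-style simulation of the birth-death-immigration chain up to time T.
        t, z = 0.0, z0
        while True:
            rate = lam * z + mu * z + alpha           # total jump rate from state z
            t += rng.exponential(1.0 / rate)          # exponential holding time
            if t > T:
                return z
            u = rng.random() * rate
            z += 1 if u < lam * z + alpha else -1     # birth/immigration, otherwise a death

    sample = np.array([simulate(T) for _ in range(2000)])

    # Euler integration of dM/dt = (lam - mu) M + alpha with M_0 = z0.
    M, dt = float(z0), 1e-3
    for _ in range(int(T / dt)):
        M += ((lam - mu) * M + alpha) * dt

    print(sample.mean(), M)                           # the two values should be close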
The service times for customers are independent &(ui) Determinea partial differential equation for I7,(3), the pef for Z,, the num! of customers in the queue (including the one being served, if any) at time Find the equilibrium distribution of the queue size. Solution. In (t,t +), pr(a new customer arrives) = 14 +0(4) pr(a customer finishes being served and he leaves) = A +0(4) pr(no customer arrives and none completes being served) =1- (i+masold) pr(anything else) = 0(4) Let p,(t) = pr(Z, = k), and abbreviate it to p,. The forward equations are Po = —APot HP Pe = AP, 1-H ADT Heer (k= 1,2...) Multiply the kth equation by s* and sum over k = 0,1,2,.. I1,{s) to IT; then isn yut—p.)—anr+" ap.) = (F *) és o99) If p,(t) > m, as t + 0, then TT + Xo oH = 0, Therefore ot +3 abbr ns, and {n,} is given by setting »E(i)? ™ = (1—p)p* (k = 0,1,2,...) a1 —; = E(service time)/E(arrival interval) _ is the traffic intensity. Then x, is a probability only if p < 1. For p <1 the sequence {r,} is the geometric distribution %(p). a - Alternatively, equation 6.7 gives this result with 2, = 2, 4, = 1. This queue is in fact a random walk in continuous time with steps _ +1, -1, having probabilities 4/(A+y), /(A+1) respectively, and steps __ taking place as a Poisson process of rate 1+, There is an impenetrable barrier at the origin at which the particle stays for a time which is an _ 6(2) rv and then jumps to +1. BP EXERCISES 1 Accident proneness. Suppose that if no previous accidents have occurred If < o, then the model represents the case in which the lesson has been learnt, but if 6 > a then we have the case in which the shock leads to a _ state in which more blunders are likely. Find the Kolmogorov forward differential equations for p,(t) = pr(N, = k), where N, is the number of accidents in (0, t), and show that I7,(s), the pgf for N,, satisfies pus = (B—a)(1—s)e™™ "Find EN, from this equation (by the method of Problem 6.4). Note that this model reduces to the Poisson process when « = f. _ 2. Suppose that the probability of an accident in (t,t+4) given that /k accidents have already occurred is (a+kd)A+o(4) (k = 0,1,2,...) Find the Kolmogorov forward differential equations for p(t) in this case. Verify that they are the same as for a linear birth process with immigra- _ tion, the birth rate per individual being and the immigration rate being o. _ Solve the equations to show that N, has a negative binomial distribution. 3 Race relations. For the linear birth process with immigration of _ Exercise 2, show that if there are n individuals present at time zero, then _ the population size at time t has pgf sel!— se — 1) 09 If the individuals present at time zero and their descendants are called — natives, and those who enter after time zero and their descendants are called immigrants, prove that if ni > a the expected number of natives always exceeds the expected number of immigrants, 4. The linear birth and death process. If there is no immigration and if the birth rate per individual is 4 and the death rate perindividual isu(a # p), show that the population size at time 1, Z,, has pef (a—by)/(a—cy)), if initially Zy=1, where a=As—p, b=(s—ly, c =(s-l)4, p= exp{(2—y)t}. Determine EZ,, VZ, and the probability of extinction. 5. Do Exercise 4 and Problem 6.3 for the cases in which 2 = y. Check that the results could have been derived by setting » = A+e in the solu- — tions for A # y, and letting ¢ > 0. 4 6. 
Find VZ, for the population process of Problem 6.3, using the method outlined following the solution to Problem 6.4. community contains i (i = 0, 1, 2, ...,) members at time ¢, then, during a short interval (t, t+ 4),a new member joins with probability(n—i)Ad+0(4); each member independently leaves with probability 44+0(4), and there is no change in membership with probability 1—ni.A+o(4). Set out the Kolmogorov forward differential equations for the distribution of the size, Z,, of the community at time t. If initially the community had no members, show that Z, is Bin © (n,3(1-e" 4), 8. The incoming traffic to a car park with n spaces is a Poisson process of rate A, but only while spaces remain unfilled. In any short time interval of length 4 each vehicle already in the car park independently leaves with probability v4+0(4). Write down the Kolmogorov forward _ differential equations for the distribution of the number, Z,, of spaces filled at time t. ¥ Find the equilibrium distribution of Z,. If the car park is large enough that there is always room for incoming traffic, and if initially k spaces are filled, find EZ,. 9. A single-server queue with constant service times. Customers arrive in a Poisson process of rate 4, and the service time is unity. Let rv Z, be the number of waiting customers, including the one being served, at time 1. Show that if ¥, customers arrive in time interval (t,¢+1) then eee eae . "7 fete Gabe.) tet = Vy (Z,=0) recurrence relations for p,(t) = pr(Z, = k), and show that if G,(s) $6, 19) = {(8—I)polt) + G,(s)} exp{a(s— 1} ee Problem 4.12 and Exercises 3 and 4 of Chapter 4.) sae at Non-Markov Processes in Continuous Time Discrete State Spaces We now consider population processes for which the lifetimes of. members — are not necessarily exponential rvs. The remaining lifetime of an individual _ will depend on how old he is now, and so the future size of the population — will depend not only on present size, but also on the present ages of its members. The process is still Markovian if we include this information, but we would not wish to keep records on every individual throughou their lives. There is no change i in the population size except when an event occurs: when a child is born, or someone dies, or an immigrar arrives or an emigrant departs. We consider, therefore, decompositi based on the times of the first or last of these events. 7.1 Renewal theory We consider first the special case of a renewa process. The function of a single component—for example, a continuously burning light bulb—is observed and it is replaced by another immediately it fails. The components will have lifetimes which are independent {X,}, each distributed like a positive rv X. Then the rv Z,, the time to the 1 failure of the nth component if the first is fitted at time 0, is given by th random walk Z,=X,+...4X,, Zy =0 See Figure 7.1. We take a fixed time r and are interested in the following — tvs: i the number of renewals in (0, a. N, = max{n:Z, < t} the forward recurrence time, 7” = Zy,,,—t and the backward recurrence time, T, =t-Zy, ————SS Zu + 7 Figure 7.1 Now, for a fixed t, Ty Ape By Eg Ng is not distributed as X, since in any realisation of the process the fixed 7 ‘ ie ¢ likely to be in a long interval non-Markov since T;* depends on T;. For example, if a light ee Sree Se ee ree “than a week or a good one which lasts about a year—then, if a bulb has survived 6 months, it is of proven quality with about another 6 months’ lifetime. 
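The length-biasing just described is easy to see numerically. The sketch below simulates a renewal process with an arbitrary, deliberately variable, gamma lifetime and records the length of the interval that happens to cover a fixed inspection time; NumPy is assumed and the parameter values are illustrative only.

    import numpy as np

    rng = np.random.default_rng(2)
    shape, scale = 0.5, 4.0          # arbitrary gamma lifetime with mean EX = shape*scale = 2
    t_star = 50.0                    # fixed inspection time

    lengths = []
    for _ in range(5000):
        z = 0.0                      # time of the most recent renewal so far
        while True:
            x = rng.gamma(shape, scale)
            if z + x > t_star:       # this lifetime spans the inspection time t_star
                lengths.append(x)
                break
            z += x

    # The interval containing a fixed time point is length-biased, so its mean
    # comes out noticeably larger than the mean lifetime EX = 2.
    print(np.mean(lengths))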
"There is an important and fundamental relationship between the counting process {N,:t€(0,0o)} and the random walk process {Z,: = 0,1,2,...}. The events ‘the number of renewals in time t is at least n’ and ‘the time to the nth renewal does not exceed ¢’ are the same, and so have the same probability. Therefore pr(N, > n) = pr(Z, < 1) (7.1) “Suppose that EX =u, VX = 0%. We can approximate the distribution of __N, for large t by working directly with Z,,; that is, 4 pr(N, t, then N, = 0; if U =u <1, then N, = 1+N;_,, where aN. the number of further renewals in the remaining t—u up to ¢, and so ‘is distributed like N,_,. We are using the fact that the Markov property holds at renewal points, since then it is as though the process were starting Es" = ByEgyyaus = [2° fl) du Eyyyaus™ = [f epdu Este ++ | fudu Bs? II{s) = sf lM, (9) du | flu) du Problem 7.2 Determine integral equations for the renewal function H, = EN,, and the renewal density h, = (d/dt) H,. Solution. We differentiate equation 7.2 with respect to s, and let s > 1. as) _ an, 35 ly SMM dues | fa) Se Therefore H,= {flu dut J, S00H,. du We differentiate this with respect to t: h, = fO+SOH.+ f° flwh,_, du =S0+ [ Oh, du since Hy = EN, = E0 = 0. Since N, ~ t/u ast > 20,H, ~ t/wandh, ~ 1/p. Problem7.3 Show that h, is the instantaneous transition rate defined by pr{a renewal occurs in (t,t+4)|a renewal occurred at 0} = h,A+o(A) Solution. If we write equation 7.3 in the form h,A+0(4) = {f04-+0(4)} + f (Fw) du} {h,,+0(4)} then the right side is pr{the first renewal is in (t, t+ A) or it is in (u,u-+du) for some ue (0,1) and there is a renewal in an interval of length A aftera further t—u} ie. it is the probability that there is a renewal in (t,t + 4) given thet was a renewal at 0. ; Problem 7.4 When the lifetime rv is 6(2), that is, ex; ial wit i ponential with p mee 2, show that rv N, is A(22), that is, Poisson with mean At, and 1 is 2. Solution. We use the definition of h, given by Problem 7.3. If h, then by Problem 5.2 N, is Azz), and em i eee , is Az), and by Problem 56 the intervals The other way round: we put Sf) =te-*™ O 0), or the last renewal before t is in (v,v+do)(0 < v < 1) for some v €(0,t), and the next renewal is in (t+x,t+x-+ A) ie. in (t+x—v, t+x—v+4) later} G9. (¥)4 +04) = {f(t+x)4+0(4)} + fhe do {f(t+x—0)4 +0(4)}. fe set u = t—v in the integral, divide by A, and let A + 0. Then 3 a (8) = flt+x)+ J" hy, flux) du (74) _ We must suppose that f(x) + 0 as x + oo. Equation 7.4 is true for fixed t. t + 00, then h,_, + u~1, so (3) > as) = [° flux) du ts A, = &, since Ay = Iq(s) = Es®° = 1 =a =s) 5 JF saw = j{I-Fe) oo _ Problem 7.6 Find the distribution of the backward recurrence time T,, and its limit as t + co. - pr{T, €(%,x+4)} ‘there is no renewal in (0,1), in which case T> = t, or there renewal in (t—x,t—x +4) (0 < x < t)and there areno more renewals in (t—x, t)} Nowpr{norenewalin(u,v)|arenewalatu} = pr(X > v—u) = 1—F(o—w). Therefore p(T, = t) = pr(X >t) = 1-F() 9, &) =h_,{1-F(X)} O 00,F(t) > landh,_, > p-}, therefore the distribution of T, tends to a density g(x) = {1—F(x)}/u (0 < x < 00), the same lim istribution as that of T,". od 7.2 Population processes If instead of replacing a failed component by — another we replace it with X components, where X is an rv having pgf G(8), then the situation is that of the family tree described in Problem 4.14, except that the lifetimes of individuals are independent rvs each having density f(x) (0 < x < 0). Problem 7.7 The continuous-time branching process. 
Suppose that at _ time 0 the process starts with the birth of a single individual, the founder. _ Find an integral equation for the pf, IZ(s), of the size, Z,, at time f, of the population. a Solution. We consider a decomposition based on the time, U, of the founder's death. Let N be the number of offspring left by the founder. Then Ey s¥ = G(s). If U = u, then, for t u z, Be ZO +... Z0, (N = 1,2,...) a ei (N =0) where each ZY, is independently distributed like Z,_,. Therefore, for t > u, by Problem 2.14, Ezju-veiS” = G{M,_(9)} Therefore Ts) = Ez,s* = EyEziyaus* = [° flu) du Ezyyays™ = J) £00) du G1, _(0)} du-+[* fu) du Bs! = s(1-FO}+ ff FONGET, 9} au O05) Problem 7,8 Determine ITs) in the special case in which the number of offspring is always 2 and the lifetime rv is &(4). Solutio ME i ah eae tl a al es feGh waa Je (0 < u < 00). This is the linear th process, a special case of Problem 6.3. Equation 7.5 becomes wath Ts) = se-**+ |" de“ ™{T1,_(9)}? du Multiply by e*, set v = t—u in the integral, differentiate with respect to t, divide out e*, and abbreviate JT,(s) to 17. Then ; al re Gy 7 M-1), a iol |a7-1| Adt= aay = (ea-z)a = dlog—- es |m-1| _ 4 log’, — = 2t-+constant = sew a low TI,(s) = s, so I,(s) = isa-e ie. Z,—1 is Y(e~*), a geometrically distributed rv. o een the arrivals of customers are independently and identically _ distributed continuous rvs, that the service times of customers are inde- ‘tly and identically distributed rvs, and that there is just a single . Label the kth customer C,. Denote by ¥, the time between the s of C, and C, , ,; by X, the service time of C,; and by W, the waiting (which excludes the time in which heis being served) of C,. Ifthe density f the rv V, = X,—Y, is g(v) (—co < v < oo), and the distribution func- _ tion of W, is F,(w), show that v : Faw) = |" For—v)o(o) do Solution. Assume that C, arrives at t = 0 and finds no one ahead of ‘him; therefore W, = 0. Customer C, is in the queue for a time W+X,. “If %> +X, then W,,, = 0; that is, if Cys; arrives late enough he has no waiting. If ¥, < W,+X,,thenW,, = W+X,-Y = With. is, W., = max(W,+ V,,0). Now Fi(w) = 0(w <0, for all k), There. , for w > 0, fi rarer Pasa is Fest) = pre Sw) = = prmax+ ¥,0)< w) = hth ~ jh pr{h, €(v,0+do)}pr(W, < w—v| V, = 0) = J" a) dv F(w—v) since pr(W, < w—v|K, = v) = 0(v > w) and using the independence of _ W, and Vj, ae ‘As k — 00, Fi(w) + a limit F(w) which satisfies Fw) = ie F(w—v)g(v) dv. It can be shown that F(w) = po S ¥, < w for all n) (w > 0); and if EX, > O then F(w) = 0, i.e. the queue grows without limit; but if BY; < a then F(w) + 1 as w — oo so the limit F(w) is a df. Problem 7.10 The busy period of a single-server queue with Poi arrivals. Suppose that customers join the queue at times which form a_ Poisson process of rate a, and that the service times are independent rvs _ having characteristic function (8). Find equations for the pf G(s) re the number of customers served during a busy period, and for the of (0) for the busy period, B. ; Solution. The starting times of the busy periods—that is, the time points when a customer arrives and can be served without waiting—are ren points. The ends of the busy periods—that is, the time points when server becomes idle—are also. The busy periods are therefore indepen- dently and identically distributed rvs. The number, A, of customers who arrive during the service time, Bi of the first customer, C,, is P(aX), so os — E,jx=x5* = exp{—ax(1—s)} y These customers are C>, C3, .... C,,,. 
Without losing generality y imposea last-come first-: served queue discipline. ‘Then C,,,,, isserved second. — While C,, , is being served, other customers arrive and will all be served before C,. The number served starting with C,,, before C, is reached is a rv N,, distributed exactly like N, the number served in a busy peri Similarly, the number, N,_,, served, starting with C,, before C,_, reached is also distributed as N, and is independent of N,. Therefore, — given A, N =14N,+N,+...4N, where the 1 is for C,’s service, so Enanas” = s{G(s)}* 7 Se? 4 f : orbs population EE af tino'e ¥ inltally there steno Bemhenon G(s) = Eys" = Bx, ay8" = SExEgr=x{G(9)}* POLS an ’ = sEyexp[ —aX{1—G(s)}] = s¥[ia{1—G(s)}] Gs) = eet ee TT,_,{3)®,_,(s)ae~™ du If N =n, then B = X,+...+X,, 80, by equation 2.28, where the immigration rate is «, and I7(s) is the pgf for the process with o : no immigration from a single member given in Exercise 4 of Chapter 6. : H(0) = G{V(O)} = WON/Lia{ 1 —6(0)}] Qo Solve this equation for (s) and verify that the solution is that derived in This problem can be formulated as a random walk and, as such, a special Problem 6.3. case was solved in Problems 5.7 and 5.8, Show that as t + 00, if yu. Derive, by a decomposition based on the admission of the _ events of a Poisson process having parameter 2, show by a decomposition first new member, the pgf for the membership size at time ¢, conditional - based on the final A, that p,, the probability that the counter is locked at ‘on the founder still being alive then. Hence find the pgf for the member-_ time ¢ satisfies ship size just after the founder’s death. Fi+(+Ap, = 2 7. For the model of accident proneness described in Exercise 1 of : 4 Chapter 6, by means of a decomposition based on the time of the first agen ne potution, accident, show that the pgf for the number of accidents in (0, t) is "3. Find the solution, 17(s), of equation 7.5 in the special case in which ee t the number of offspring of an individual is equally likely to be 0 or 2, fase“#-*—(B—a)(1 sje} /{Bs—(B—a)}. __ and the lifetime distribution is &(2x). What is the chance that the population Note that this reduces to the pgf for a Poisson rv when « = f. ~ ig extinct by time ¢, if it starts at time 0 with the birth ofa single individual? 4, The linear birth and death process with immigration. Show by a decom- _ position based on the arrival of the first immigrant that the pgf, #(s), 84 t/At einen steps. Tae from ee 3. a 2, q0-aax = wa e are Markov processes {Z;:t ¢ T} having continuous time para- e meter spaces and continuous state spaces, and for which a small change in Z (ay? " t results in only a small change in Z,. A realisation can be thought of as VZ, = yr fPatx? = = Apa the path ofa particle moving very erratically in a continuous medium, v depending only on its current position. If we let Ax and At —0 in such a way that (Ax)/At > « and ae important class of diffusion processes are the Gaussian processes. fig Veg If {Z,:t€T} is a process whose state space i the real line, R, and if its p= 3+ 34 9=5-7,4* meter space, T, is any subset of R, then it is a Gaussian process if, for a finite n, 2, am Z,,) has a multivariate normal distribution. A where a and f are constants, then EZ, + ft and VZ, ~ at. By the c wussian process is stationary if its covariance function g(s,t) = g(t—s). limit theorem (Z, — Bt)/,/(at) > an N(Q, 1) rv. 
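A quick numerical check of this scaling: the sketch below simulates many scaled random walks taking steps of ±Δx every Δt, with p chosen as above so that (Δx)²/Δt = α and the drift per unit time is β. The numerical values of α, β and t are arbitrary, and NumPy is assumed.

    import numpy as np

    rng = np.random.default_rng(3)
    alpha, beta, t = 1.0, 0.5, 2.0                 # variance rate, drift, horizon (arbitrary)
    dt = 1e-3
    dx = np.sqrt(alpha * dt)                       # so that (dx)^2 / dt = alpha
    p = 0.5 + beta * dx / (2 * alpha)              # p - q = beta*dx/alpha, as above
    n_steps = int(t / dt)

    # 4000 independent walks, each observed at time t after n_steps short steps.
    steps = np.where(rng.random((4000, n_steps)) < p, dx, -dx)
    Z = steps.sum(axis=1)

    print(Z.mean(), Z.var())                       # approximately beta*t = 1.0 and alpha*t = 2.0
    # Standardised values (Z - beta*t)/sqrt(alpha*t) are approximately N(0, 1).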
Wiener (or Brownian motion) process with parameter 4, {Z,:t € R}, is the Gaussian process with independent increments having EZ, = 0 and Problem 8.2 Show that u, ,, where - g(s.t) = Amin(s,, (2 > 0). | fusion Processes u,, dx = pr{Z, €(x,x+dx)} em 8.1 If {Z,} is Wiener with parameter «, show that {Y,} = ‘Z.2a:} is a stationary Gaussian process, and determine its covariance ion, g(s, 1). lution. If Z, isa normally distributed ry, then so is a(t)Z,(9, Where a and Barscal functions of f, since, for fixed 1, a(t) and b(t) are real constants. Solution, From equation 2.12, by a decomposition based on the last t e process {Y;} is therefore Gaussian, since Z, is normal. Now EY, = e “EZ. =0 since EZ, = 0 for all r. Therefore a(t,t+t) = covlY, ¥,.) = EY, Ka.) we Efe" *Z 6 PZ Sasa} =e 9cov{ ZZ ecient ' = e424 min {624 @2ME+9} eA ae gr coe not depend on t. oO mi Merton = ut (At 5 + O(a? ‘ rene procesecs can arise as the limit of a random walk. Let us consider eu, in the unrestricted random walk, but suppose that the particle takes : Uy any = M+ (— Ax) aa y—axp 4 o(axy ependent short steps of length 4x, to the right with probability p, to the left with probability q = 1—p, after short time intervals of ou gth At, It is a delicate matter how we let both 4x and dt tend to zero. Hany = HC) Se Ho) 5+ OC satisfies the forward partial differential equation Ou a eu ax a ~ Pex pr(Z,,.4, = x) = pr(Z, = x—Ax)pr(Z,, 4, = x|Z, = x— Ax) +pr(Z, = x+Ax)pr(Z,.4, = x|Z, = x+Ax) | Therefore Maar ar = Menace Pt Met aeed We expand each term in a Taylor series up to O(Az)* or O(Ax)?, abbreviate u,,, to u. wr (A) + O18)? = Mp +a)+(—p+ qa) ++ gids Fe ota) P+q=1,-pt+q= oe therefore ou — _B f(x?) du | 1 f(Ax)?) du (Ax)? = Be eee +0(ax@2") Ax and At +0 in such a way that (Ax)?/At > a, we obtain tion 8.1. d 7m 8.3 Show that the Brownian motion process satisfies equation -_ Solution. Here EZ, = 0, so f = 0; and VZ, = amin(t,t) = at. There- fore, since the displacement Z, is N(0,a1), from equation 2.29 we have P(x, y) dy = pr{Z,€(y,y+dy)|Z, = x} 1 1 - Jean"? {0-27} dy (-~ oin (Axj/Ar—+y, ade +B! : Solution. Let X¥ = X,—a and k = i-a If X¥ = Xty = kAx = say, then i (k+ Ax = x+Ax with probability ? 2 aka = 1 —x/ad (k-1)Ax = x—Ax with probability a 3+tk/a = {1 +x/aAx) ea Xa = Xie nae = ut (ant a O(At? x 1 u 1 Ltt yuct tac 1 i.e Ou A +5(4bs {us canst +(x) 53 + Ol a (ear (2)2) scar outa Therefore ae a oan) = (a lz So (F3)+0 (ax *) os? Therefore ou é @u a Ba (uta 5 a a This limiting equation is is the diffusion equation for the Ornstein-Uhi Process, which is defined to be the Gaussian process with EZ, = 0 covariance function g(s,t) = ae~**-"! (¢ >0, B > 0). Problem a showed a method of constructing it from a Wiener process. EXERCISES 1. If {X, : te [0, 00)} is the Wiener process with parameter A, for e: {Z,:te[0,0)} find EZ, covZ,,Z)) (< n Bate tietier tion process is stationary in the wide sense, and whether is Gauseian. =at+X, (@>0, (ii) Z,= X,,,-X, (@ >0), =(1-9Xy4-9 O)), Z, = Y,,, —Y,, where Y, is Gaussian with EY,=a+ft, cowY,,¥,,,.)=e7" (y > 0) . Consider the diffusion limit of an unrestricted random walk on the axis. If at time t, X, = x, then at time t+At (n—1)Ax = x-Ax with probability p, atta = nx = x with probability r, (n+1)Ax = x+Ax with probability p, and r, are continuous functions of x, and 2p, +r, = | for every =X, Let u,,dx = pr{X, ¢(x,x+dx)}. If Ax and Ar both tend to zero in a way that (Ax)?/Ar — , a constant, show that in the limit u = ou 2 Gn Pe.) 
Consider a particle which starts at the origin and carries out a random according to Problem 8.2. If @,(6) is the characteristic function for and if y(0) is the cf for a N(f, a) rv, show by a decomposition of 4, 4 (0) d on the last At that @,(6) = ev _In a model for gas diffusion through a porous membrane there are 2 A and B, cach containing 2c molecules. These molecules are of 2 there are 2c black ones and 2c white ones. After each unit interval are interchanged. The rv X, is the number of black molecules in A time n. Consider the Markov chain {Z,:n = 0,1,2,...}, where = X,—c. Find difference equations in k and n for pf = pr(Z, = k) 1 a decomposition based on the last step. _ Consider the diffusion process arising as follows. If at time instant t, z, then at ¢+At, Z,,4, = z—Az, z or z+Az, where the process has such a way that (Az)?/At > y and (c At)" — f, show that in the limit satisfies the Ornstein-Uhlenbeck or differential equation ou # penny where u,,dz = pr{Z, €(z,2+dz)} 5. Suppose that {Z, :€[0,0)} is the Omnstein-Uhlenbeck process. If Z, given that Zo = z is N{ze~™, (1 —e~?#)/(2)}, show that u,.dx = pr{Z,€(x,x-+dx)|Zo = z} satisfies equation 8.2. Note that the equilibrium distribution of Z, is given by letting t > co, and so is N(0,7/28), ba does not depend on z. Recommendations for Further Reading A fairly elementary introduction to probability theory is oy Wil is of General treatments of stochastic processes at about the level of Problem Solver are given in The Theory of Stochastic Processes David R. Cox and Hilton D. Miller, published by Methuen in 1965, and i The Elements of Stochastic Processes with Applications to the Natu Sciences by Norman T. J. Bailey, published by John Wiley in 1964. Some specific applications are made in Queues by David R. Cox and Walter L. Smith, published by Methuen in 1961, The Theory of Storage by P. A. P. Moran, published by Methuen in 1959, and Stochastic Models for Social Processes by David J. Bartholomew, the second edition of which was published by John Wiley in 1973. 
