Stochastic OR

5.1 The \(M(n)/M(n)/1\) Queue

In this section, we specify appropriate functions for \(\lambda(n)\) and \(\mu(n)\) in Eq. (4.6.2), or \(\lambda^{+}(n)\) in Eq. (4.6.10) in case of blocking, for several of the queueing models discussed in Section 3.2. We emphasize that understanding how to specify the level-crossing equations is more important than deriving closed-form expressions for the performance measures (the code of Section 5.2 deals with the numerical aspects). Further, it is assumed that a system without blocking is stable (a system that blocks demand is stable anyway).

In the \(M(n)/M(n)/1\) queue the inter-arrival and service times are exponentially distributed conditional on \(L=n\), that is,

\begin{align*} X|L=n & \sim \Exp{\lambda(n)}, & S|L=n & \sim \Exp{\mu(n)}. \end{align*}

Somewhat more specifically, similar to Eq. (1.2.6), we require that (in the general case, arrivals need not enter, so we consider \(\lambda^{+}\) rather than \(\lambda\))

\begin{align*} \P{\L(t+\Delta t) = n+1 \given \L(t) = n } & = \lambda^{+}(n) \Delta t + o(\Delta t), \quad n\geq 0, \\ \P{\L(t+\Delta t) = n-1 \given \L(t) = n } & = \mu(n) \Delta t + o(\Delta t), \quad n \geq 1. \end{align*}

Note that, when all arrivals are accepted (in which case \(\lambda^{+}(n) = \lambda(n)\)), we write \(\lambda(n)\), but otherwise, we use \(\lambda^{+}(n)\). The next models are all examples of the \(M(n)/M(n)/1\) queue with different choices for \(\lambda(n)\) (or \(\lambda^{+}(n)\)) and \(\mu(n)\).

The \(M/M/1\) queue has a constant arrival and service rate, so

\begin{align*} \lambda(n) & = \lambda^{+}(n) = \lambda, & \mu(n) & =\mu. \end{align*}

The level-crossing equations Eq. (4.6.5) result in

\begin{equation*} p(n+1) = \frac{\lambda(n)}{\mu(n+1)} p(n) = \frac{\lambda}{\mu} p(n) = \rho p(n) = \rho^{n+1} p(0),\quad \rho = \frac{\lambda}{\mu}, \end{equation*}

and from Eq. (4.6.6),

\begin{equation*} p(0)^{-1} = \sum_{n=0}^{\infty} \rho^{n} = \frac{1}{1-\rho} \quad \implies\quad p(n) = (1-\rho)\rho^{n}, \quad n \geq 0. \tag{5.1.1} \end{equation*}

With this expression for \(p(n)\), it is easy to compute the key performance measures. The load equals the utilization since all jobs are accepted and served. With a bit of algebra (see Ex 5.1.8),

\begin{align*} \E \L & = \frac \rho{1-\rho}, & \E{\Ls} & = \rho, \\ \E \QQ & = \E \L - \E{\Ls} = \frac{\rho^2}{1-\rho}, & \P{\L> n} & = \rho^{n+1}. \tag{5.1.2} \end{align*}
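These formulas are easy to check numerically. The following sketch truncates the geometric distribution of Eq. (5.1.1) and compares the resulting sums with Eq. (5.1.2); the parameter values are chosen for illustration only.

```python
# Numerical check of the M/M/1 performance measures in Eq. (5.1.2).
# The values of lam and mu are illustrative.
lam, mu = 3.0, 5.0
rho = lam / mu

N = 200  # truncation level; rho**N is negligible here
p = [(1 - rho) * rho**n for n in range(N)]  # Eq. (5.1.1)

EL = sum(n * p[n] for n in range(N))            # E L
ELs = sum(min(n, 1) * p[n] for n in range(N))   # E Ls, number in service
EQ = EL - ELs                                   # E Q

print(abs(EL - rho / (1 - rho)) < 1e-8)
print(abs(ELs - rho) < 1e-8)
print(abs(EQ - rho**2 / (1 - rho)) < 1e-8)
```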

The \(M/M/c\) queue has \(c\) identical servers to process jobs, thus

\begin{align*} \lambda(n) = \lambda^{+}(n) & = \lambda, & \mu(n) & = \mu \min\{n, c\}. \end{align*}

Closed-form expressions for \(\E \QQ\) and \(p(n)\) exist (see Opt 5.1.9), but are not very informative. However, two measures are particularly simple. The load is \(\rho = \lambda/c\mu\), and

\begin{align*} \E{\Ls} & \stackrel1= \sum_{n=0}^{\infty}\min\{n,c\} p(n) \stackrel2= \frac{\lambda}\mu, \end{align*}

where 1 shows how to compute \(\E{\Ls}\) as an expectation, and 2 follows from Little's law because jobs spend an average time \(\E S=1/\mu\) in service, and jobs arrive at a rate \(\lambda\) at the servers. Note that \(\E{\Ls} \neq \rho\).
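As a sanity check on equality 2, we can build \(p(n)\) from the level-crossing recursion of Eq. (4.6.5) and compute \(\E{\Ls}\) directly; the parameter values below are illustrative.

```python
# Check E Ls = lam/mu for the M/M/c queue via the recursion
# p(n+1) = lam / (mu * min(n+1, c)) * p(n).
lam, mu, c = 5.0, 2.0, 4   # rho = lam/(c*mu) = 0.625 < 1
N = 300                    # truncation level
p = [1.0]
for n in range(N - 1):
    p.append(p[-1] * lam / (mu * min(n + 1, c)))
tot = sum(p)
p = [x / tot for x in p]   # normalize

ELs = sum(min(n, c) * p[n] for n in range(N))
print(abs(ELs - lam / mu) < 1e-8)
```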

The \(M/M/1/K\) queue blocks jobs when \(L\geq K\), so that \(\lambda(n) \neq \lambda^{+}(n)\); hence

\begin{align*} \lambda(n) &= \lambda, & \lambda^{+}(n) & = \lambda \1{n < K}, & \mu(n) & = \mu. \end{align*}

This gives \(\lambda p(n) = \mu p(n+1)\) for \(n<K\), and \(p(n) = 0\) for \(n>K\), because \(\lambda^{+}(n) = 0\) for \(n\geq K\). Thus, in general, \(\lambda^{+}(n) p(n) = \mu(n+1)p(n+1)\), from which it follows that

\begin{align*} \rho & = \frac{\lambda}{\mu}, & p(n) & = \frac{1-\rho}{1-\rho^{K+1}} \rho^n. \end{align*}

There is a subtle point to make here. When the system contains \(K\) jobs, the rate at which jobs arrive at the system is \(\lambda (K) = \lambda\) because jobs still arrive as a Poisson process. However, the rate at which jobs enter is \(\lambda^{+}(K)= 0\), consequently,

\begin{equation*} \lambda \pi(K) = \lambda(K) p(K) \neq \lambda^{+}(K) p(K) = 0. \end{equation*}

Since \(\lambda(K) = \lambda\), we still have \(\pi(K) = p(K)\) by PASTA, and therefore the fraction of jobs rejected upon arrival is equal to \(p(K)\). Note that because jobs are blocked, the utilization is no longer equal to the load \(\rho=\lambda/\mu\). Specifically, if we define the mean acceptance rate as

\begin{equation*} \lambda^{+} = \sum_{n=0}^{\infty} \lambda^{+}(n) p(n), \tag{5.1.3} \end{equation*}

we see for the \(M/M/1/K\) queue that \(\lambda^{+} = \lambda(1-p(K))\). With this, the utilization becomes

\begin{align*} \tilde \rho & = \lambda^{+}/\mu = \lambda (1-p(K))/\mu = \E\Ls, \tag{5.1.4} \end{align*}

where the last equation follows from Little's law.
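A small numerical check of Eqs. (5.1.3) and (5.1.4) for the \(M/M/1/K\) queue; the parameter values are chosen for illustration only.

```python
# Verify that the utilization lam(1 - p(K))/mu equals E Ls for the
# M/M/1/K queue; lam, mu, K are illustrative.
lam, mu, K = 4.0, 5.0, 10
rho = lam / mu
p = [(1 - rho) / (1 - rho**(K + 1)) * rho**n for n in range(K + 1)]

lam_plus = lam * (1 - p[K])   # mean acceptance rate, Eq. (5.1.3)
ELs = sum(min(n, 1) * p[n] for n in range(K + 1))  # = 1 - p(0)

print(abs(sum(p) - 1) < 1e-12)           # p is a distribution
print(abs(lam_plus / mu - ELs) < 1e-12)  # Eq. (5.1.4)
```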

The \(M/M/c/K\) queue is a multi-server queue with blocking (hence, it combines the \(M/M/c\) and the \(M/M/1/K\) queue),

\begin{align*} \lambda(n) &= \lambda, & \lambda^{+}(n) & = \lambda\1{n<K}, & \mu(n) & = \mu \min\{n, c\}. \end{align*}

The \(M/M/c/c\) queue (also known as the Erlang-\(B\) model) blocks jobs when all of its \(c\) servers are occupied (why is it not necessary to write \(\mu(n) = \mu \min\{c, n\}\)?),

\begin{align*} \lambda(n) &= \lambda, & \lambda^{+}(n) & = \lambda\1{n<c}, & \mu(n) & = n \mu. \end{align*}

When \(n<c\), \(\lambda p(n) = (n+1)\mu p(n+1)\), hence

\begin{align*} p(n+1) = \frac{\lambda}{(n+1)\mu }p(n) = \cdots = \frac{\lambda^{n+1}}{(n+1)!\mu^{n+1}} p(0). \end{align*}

The normalizing constant \(p(0)\) follows from the constraint \(1 = \sum_{n=0}^{c}p(n)\). As there are as many servers as jobs, \(\E \QQ = 0\). Moreover, by PASTA the fraction of accepted jobs is \(1-p(c)\), so that jobs enter at rate \(\lambda(1-p(c))\), and by Little's law,

\begin{align*} \E \L = \E \Ls = \frac{\lambda}{\mu} \left(1- p(c)\right). \end{align*}
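The identity \(\E\L = (\lambda/\mu)(1-p(c))\) can be verified numerically from the closed-form probabilities; the parameters below are illustrative.

```python
# The M/M/c/c (Erlang-B) queue: build p(n) and verify
# E L = (lam/mu)(1 - p(c)).
from math import factorial

lam, mu, c = 3.0, 1.0, 5
a = lam / mu  # offered load
p = [a**n / factorial(n) for n in range(c + 1)]
G = sum(p)                  # normalization constant 1/p(0)
p = [x / G for x in p]

EL = sum(n * p[n] for n in range(c + 1))
print(abs(sum(p) - 1) < 1e-12)
print(abs(EL - a * (1 - p[c])) < 1e-12)
```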

The \(M/M/\infty\) queue assigns a free server to each arrival (this system is said to have ample servers),

\begin{align*} \lambda(n) &= \lambda^{+}(n) =\lambda, & \mu(n) & = n \mu. \end{align*}

By taking the limit \(c\to \infty\) in the expressions of the \(M/M/c/c\) queue (or the \(M/M/c\) queue) we obtain

\begin{align*} p(n) & = e^{-\lambda/\mu} \frac{(\lambda/\mu)^n}{n!}, & \E \QQ & =0, & \E \L = \E \Ls = \frac \lambda \mu. \end{align*}

We see that the number of busy servers in the \(M/M/\infty\) queue is Poisson distributed with parameter \(\lambda \E S\).
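To illustrate the limit, the following sketch compares the \(M/M/c/c\) probabilities for a large \(c\) with the Poisson probabilities of the \(M/M/\infty\) queue; parameter values are illustrative.

```python
# As c grows, the M/M/c/c probabilities converge to the Poisson
# probabilities of the M/M/infty queue.
from math import exp, factorial

lam, mu = 3.0, 1.0
a = lam / mu

def mmcc(c):
    """Steady-state probabilities of the M/M/c/c queue."""
    q = [a**n / factorial(n) for n in range(c + 1)]
    G = sum(q)
    return [x / G for x in q]

p = mmcc(50)  # c = 50 is large relative to a = 3
poisson = [exp(-a) * a**n / factorial(n) for n in range(5)]
print(all(abs(p[n] - poisson[n]) < 1e-10 for n in range(5)))
```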

A finite calling population has a finite number \(N\) of jobs. This model is useful for analyzing repair systems and inventory systems of spare parts. To see this, consider a factory with \(N\) machines and \(c\) mechanics. (When \(c=N\), we obtain the Ehrenfest model of diffusion, which is used to provide some insight into the second law of thermodynamics.) A machine can be in one of two states: working or failed. When a machine breaks down, it waits at the repair department until repaired. When \(n\) machines are in repair, there are still \(N-n\) machines running. Thus, if \(\lambda\) is the rate at which an individual machine fails,

\begin{align*} \lambda(n) & = \lambda (N-n), & \mu(n) & = \mu \min\{n, c\}. \end{align*}

A customer balks when it finds the queue too long and decides not to join. Balking differs from blocking. In the latter case, \(\lambda^{+}(n)= \lambda \1{n < K}\); in the former case, a fraction of the customers may already choose not to join the system at a lower level than \(K\). A simple model for balking is

\begin{align*} \lambda^{+}(n) & = \lambda [K-n]^{+}, & \mu(n) & =\mu. \end{align*}
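The balking model is just as easy to solve numerically with the level-crossing recursion; a minimal sketch with illustrative parameters:

```python
# Balking model: lam_plus(n) = lam * max(K - n, 0), mu(n) = mu.
# Since lam_plus(n) = 0 for n >= K, p(n) = 0 for n > K.
lam, mu, K = 1.0, 3.0, 5
p = [1.0]
for n in range(K):
    p.append(p[-1] * lam * (K - n) / mu)
tot = sum(p)
p = [x / tot for x in p]

# Detailed balance: lam_plus(n) p(n) = mu p(n+1).
ok = all(abs(lam * (K - n) * p[n] - mu * p[n + 1]) < 1e-12 for n in range(K))
print(abs(sum(p) - 1) < 1e-12)
print(ok)
```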
TF 5.1.1

Consider the level-crossing analysis of the \(M(n)/M(n)/1\) queue. Claim: it is necessary that the inter-arrival times of jobs are iid.

Solution

False. In this queueing model, the arrival rate \(\lambda(n)\) can depend on the number of jobs in the system. If this is the case, the distribution of an inter-arrival time depends on the state, so the inter-arrival times need not be identically distributed.

TF 5.1.2

For the \(M/M/1\) queue, the following reasoning leads to the expected number of jobs in the system.

\begin{equation*} \begin{split} M_L(s) & = \E{e^{s L}} = \sum_{n=0}^\infty e^{s n}p(n) = (1-\rho) \sum_n e^{s n} \rho^n \\ & =\frac{1-\rho}{1-e^{s}\rho}, \end{split} \end{equation*}

where we assume that \(s\) is such that \(e^s \rho < 1\). Then,

\begin{equation*} M_L'(s) = (1-\rho) \frac{1}{(1-e^s\rho)^2} e^s \rho. \end{equation*}

Claim: This implies \(\E \L = M_L'(0) = \rho/(1-\rho)\).

Solution

True. See Ex 5.1.8.

TF 5.1.3

Claim: To model the \(M/M/c/c\) queue as an \(M(n)/M(n)/1\) queue, we must set \(\lambda(n) = \lambda\) for all \(n\).

Solution

False. For the \(M/M/c/c\) queue, \(\lambda^{+}(n) = \lambda\1{n<c}\), so \(\lambda^{+}(n) = 0\) when \(n \geq c\).

TF 5.1.4

Claim: For the \(M/M/c\) queue, \(\E{\QQ} = \sum_{n=0}^\infty \max\{n-c, 0\} p(n)\), where \(p(n) = \P{L=n}\).

Solution

True.

TF 5.1.5

Claim: For the \(M/M/c\) queue, \(p(n) = p(0) \frac{(\lambda/\mu)^n}{n!}\) for all \(n \geq 0\).

Solution

False. This formula holds only for \(n \leq c\). For \(n > c\), all \(c\) servers are busy and the formula changes to \(p(n) = p(0) \frac{\lambda^n}{c! \, c^{n-c} \mu^n}\).

Ex 5.1.6

Show that for the \(M/M/1\) queue, \(\E{\QQ} = \sum_{n=1}^\infty (n-1)\pi(n)\) and use this to find \(\E{\QQ}=\rho^2/(1-\rho)\).

Solution

By PASTA, the fraction of arrivals that find \(n\) jobs in the system, \(\pi(n)\), equals the fraction of time \(p(n)\) the system contains \(n\) jobs. When the system contains \(n>0\) jobs, the number in the queue is one less, namely, \(n-1\). Hence,

\begin{align*} \E{\QQ} & = \sum_{n=1}^\infty (n-1)\pi(n) =\sum_{n=1}^\infty n\pi(n) -\sum_{n=1}^\infty \pi(n) = \E\L - (1-\pi(0)) = \E\L - \rho. \end{align*}

Substituting \(\E\L = \rho/(1-\rho)\) gives \(\E{\QQ} = \rho/(1-\rho) - \rho = \rho^2/(1-\rho)\).
Ex 5.1.7

Derive the steady-state probabilities \(p(n)\) for a queue with a finite calling population with \(N\) jobs and \(N\) servers. What happens if \(N\to\infty\)?

Solution

Take \(\lambda(n) = (N-n)\lambda\) and \(\mu(n) = n \mu\). Then

\begin{align*} p(n+1) & = \frac{\lambda(n)}{\mu(n+1)} p(n) = \frac{(N-n)\lambda}{(n+1)\mu} p(n) = \frac{(N-n)(N-(n-1))}{(n+1)n}\frac{\lambda^2}{\mu^2} p(n-1) \\ & = \frac{N!}{(N-(n+1))!}\frac1{(n+1)!}\rho^{n+1} p(0) = {N \choose n+1}\rho^{n+1} p(0). \end{align*}

Therefore, the normalization constant is \(G=\sum_{k=0}^N \rho^k { N \choose k}\). Now, using that \((a+b)^{m} = \sum_{k=0}^{m} {m \choose k} a^{k} b^{m-k}\), we see that \((1+\rho)^{N} = \sum_{k=0}^N { N \choose k} \rho^k 1^{N-k} = G\). So,

\begin{align*} p(n+1) & = {N \choose n+1} \frac{\rho^{n+1}}{(1+\rho)^{N}}. \end{align*}

When \(\rho = 0\), \(p(0)=1\) and \(p(n)=0\) for all \(n\geq 1\). When \(\rho>0\), then \((1+\rho)^{-N} \to 0\) exponentially fast as \(N\to \infty\), while \({N \choose n}\) grows only polynomially in \(N\) (roughly as \(N^{n}/n!\)). Hence, \(p(n)\to 0\) as \(N\to\infty\) for every fixed \(n\).
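As a check on this derivation, the following sketch compares the level-crossing recursion with the binomial expression; the parameters are illustrative.

```python
# Finite calling population with N jobs and N servers (Ex 5.1.7):
# compare the recursion p(n+1) = (N-n) lam / ((n+1) mu) p(n)
# with the binomial form p(n) = C(N, n) rho^n / (1 + rho)^N.
from math import comb

lam, mu, N = 2.0, 3.0, 10
rho = lam / mu
p = [1.0]
for n in range(N):
    p.append(p[-1] * (N - n) * lam / ((n + 1) * mu))
tot = sum(p)
p = [x / tot for x in p]

binom = [comb(N, n) * rho**n / (1 + rho)**N for n in range(N + 1)]
print(all(abs(p[n] - binom[n]) < 1e-12 for n in range(N + 1)))
```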

Ex 5.1.8

Use moment-generating functions to derive \(\E\L\) and \(\V\L\) for the \(M/M/1\) queue, and show that \(\P{L>n} = \rho^{n+1}\).

Hint

Use Eq. (5.1.1), show that \(M_L(s) = (1-\rho) \sum_n e^{s n} \rho^n\), then use the familiar expression of the sum over a geometric series. Similarly, \(\P{\L\geq n} = \sum_{k\geq n} p(k)\).

Solution
\begin{equation*} M_L(s) = \E{e^{s L}} = \sum_{n=0}^\infty e^{s n}p(n) = (1-\rho) \sum_n e^{s n} \rho^n=\frac{1-\rho}{1-e^{s}\rho}, \end{equation*}

where we assume that \(s\) is such that \(e^s \rho < 1\). Then,

\begin{align*} M_L'(s) & = (1-\rho) \frac{1}{(1-e^s\rho)^2} e^s \rho \implies \E\L = M_L'(0)= \frac{\rho}{1-\rho}, \\ \E{\L^2} & = M_L''(0)= \frac{2\rho^2}{(1-\rho)^2} + \frac{\rho}{1-\rho} = \rho \frac{1+\rho}{(1-\rho)^{2}}, \\ \V\L & = \E{\L^2} - (\E\L)^2 = \frac{\rho(1+\rho)}{(1-\rho)^2}-\frac{\rho^2}{(1-\rho)^2} = \frac{\rho}{(1-\rho)^2}, \\ \P{\L\geq n} & = \sum_{k=n}^\infty p(k) = \sum_{k=n}^\infty p(0)\rho^k = (1-\rho)\sum_{k=n}^\infty \rho^k = (1-\rho)\rho^n \sum_{k=0}^\infty\rho^k = (1-\rho) \rho^n \frac1{1-\rho} = \rho^n. \end{align*}

Finally, \(\P{\L > n} = \P{\L \geq n+1} = \rho^{n+1}\).
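These results can be verified numerically by truncating the distribution of Eq. (5.1.1); a sketch with an illustrative value of \(\rho\):

```python
# Numerical check of Var L = rho/(1-rho)^2 and P(L >= n) = rho^n
# for the M/M/1 queue.
rho = 0.5
N = 400  # truncation level
p = [(1 - rho) * rho**n for n in range(N)]

EL = sum(n * p[n] for n in range(N))
VL = sum(n * n * p[n] for n in range(N)) - EL**2

def tail(k):
    """P(L >= k), truncated at N."""
    return sum(p[k:])

print(abs(VL - rho / (1 - rho)**2) < 1e-10)
print(abs(tail(4) - rho**4) < 1e-10)
```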

In the remainder, we derive the KPIs of the above models. (These exercises are optional, i.e., you may skip them if you don't like algebra.)

Opt 5.1.9

Derive the following expressions for the \(M/M/c\) queue:

\begin{subequations} \begin{align*} \rho & = \frac{\lambda}{ c\mu}, \\ p(n) & = \frac 1 {G} \frac{1}{\prod_{k=1}^{n}\min\{c, k\}}(c\rho)^{n} , \quad n\geq 1, \\ G & = \frac 1{p(0)} = \sum_{n=0}^{c-1} \frac{(c\rho)^n}{n!} + \frac{(c\rho)^c}{(1-\rho)c!}, \label{eq:501} \\ \E{\QQ} & = \sum_{n=c}^\infty (n-c) p(n) = \frac{(c\rho)^c}{c! G}\frac{\rho}{(1-\rho)^2}. \tag{5.1.5} \end{align*} \end{subequations}
Solution

For \(n\geq 1\),

\begin{align*} p(n) & = \frac{\lambda(n-1)}{\mu(n)}p(n-1) = \frac{\lambda}{\min\{c, n\} \mu }p(n-1) = \frac{1}{\min\{c, n\}}(c\rho) p(n-1) \\ & = \frac{1}{\min\{c, n\}\min\{c, n-1\}}(c\rho)^2 p(n-2) \\ & = \frac{1}{\prod_{k=1}^{n}\min\{c, k\}}(c\rho)^{n} p(0). \end{align*}

To obtain the normalization constant \(G = 1/p(0)\),

\begin{align*} 1 & = \sum_{n=0}^\infty p(n) = p(0) + \sum_{n=1}^{c-1} p(n) + \sum_{n=c}^\infty p(n) \\ & =p(0) + p(0) \sum_{n=1}^{c-1}\frac{(c\rho)^n}{n!} + p(0)\sum_{n=c}^{\infty} \frac{c^c}{c!} \rho^{n} \\ & =p(0)\sum_{n=0}^{c-1}\frac{(c\rho)^n}{n!} + p(0) \sum_{n=c}^{\infty} \frac{(c\rho)^c}{c!} \rho^{n-c} \\ & = p(0)\sum_{n=0}^{c-1}\frac{(c\rho)^n}{n!} + p(0)\frac{(c\rho)^c}{c!} \sum_{n=0}^{\infty} \rho^n \\ & = p(0) \sum_{n=0}^{c-1}\frac{(c\rho)^n}{n!} + p(0)\frac{(c\rho)^c}{c!(1-\rho)}. \end{align*}

For \(n\geq c\), observe that \(p(n) = (c \rho)^{n}/(c^{n-c}c!) = c^{c} \rho^{n}/c!\), so that

\begin{align*} \E{\QQ} & =\sum_{n=c}^\infty (n-c) p(n) =\sum_{n=c}^\infty (n-c) \frac{c^c}{c!}\rho^{n} p(0) \\ & =\frac{c^c\rho^c}{G c!} \sum_{n=c}^\infty (n-c) \rho^{n-c} =\frac{c^c\rho^c}{G c!} \sum_{n=0}^\infty n \rho^n \\ & =\frac{c^c\rho^c}{G c!} \frac{\rho}{(1-\rho)^2}. \end{align*}

The derivation of the expected number of jobs in service becomes easier if we pre-multiply the normalization constant \(G\):

\begin{align*} G \E{\Ls} & = G \left( \sum_{n=0}^{c} n p(n) + \sum_{n=c+1}^{\infty} c p(n) \right) \\ & = \sum_{n=1}^{c} n \frac{(c\rho)^n}{n!} + \sum_{n=c+1}^{\infty} c \frac{c^c\rho^n}{c!} = \sum_{n=1}^{c} \frac{(c\rho)^n}{(n-1)!} + \frac{c^{c+1}}{c!}\sum_{n=c+1}^{\infty} \rho^n \\ & = \sum_{n=0}^{c-1} \frac{(c\rho)^{n+1}}{n!} + \frac{(c\rho)^{c+1}}{c!}\sum_{n=0}^{\infty} \rho^n = c\rho \left(\sum_{n=0}^{c-1} \frac{(c\rho)^n}{n!} + \frac{(c\rho)^{c}}{c!(1-\rho)}\right). \end{align*}

Observe that the RHS is precisely equal to \(\rho c G\), so that \(\E\Ls = c \rho\).

Opt 5.1.10

Check that the performance measures of the \(M/M/c\) queue reduce to those of the \(M/M/1\) queue if \(c=1\).

Hint

Fill in \(c=1\). Realize that this is a check on the formulas.

Solution

Take \(c=1\):

\begin{align*} p(0) & = \frac{1}G \frac{(c\rho)^0}{0!}=\frac1 G, \\ p(n) & = \frac{1}G \frac{c^c\rho^n}{c!} = \frac{1}G \frac{1^1\rho^n}{1!} =\frac{\rho^n}G , \quad n=1,2, \ldots \\ G & =\sum_{n=0}^{c-1} \frac{(c\rho)^n}{n!} + \frac{(c\rho)^c}{(1-\rho)c!} =\sum_{n=0}^{0} \frac{\rho^0}{0!} + \frac{\rho}{(1-\rho)} = 1 + \frac{\rho}{1-\rho} = \frac1{1-\rho}, \\ \E{\QQ} & = \frac{(c\rho)^c}{c! G}\frac{\rho}{(1-\rho)^2} = \frac{\rho}{1/(1-\rho)}\frac{\rho}{(1-\rho)^2} = \frac{\rho^2}{1-\rho}, \\ \E{\Ls} & = \sum_{n=0}^{c}n p(n) + \sum_{n=c+1}^\infty c p(n) = p(1) + \sum_{n=2}^\infty p(n) = 1- p(0) = \rho. \end{align*}

Everything agrees with the formulas we derived earlier for the \(M/M/1\) queue.

Opt 5.1.11

Show that Eq. (5.1.4) holds.

Solution

With the other results for the \(M/M/1/K\) queue,

\begin{align*} 1-p(0) & = 1-\frac{1-\rho}{1-\rho^{K+1}} = \frac{1-\rho^{K+1} - 1 + \rho}{1-\rho^{K+1}} = \rho \frac{1-\rho^{K}}{1-\rho^{K+1}}, \\ \rho (1-p(K)) & = \rho \frac{1-\rho^{K+1} - \rho^K + \rho^{K+1}}{1-\rho^{K+1}} = \rho \frac{1-\rho^{K}}{1-\rho^{K+1}}. \end{align*}
Opt 5.1.12

Show that as \(K\to\infty\), the performance measures of the \(M/M/1/K\) converge to those of the \(M/M/1\) queue.

Hint

Use that \(\sum_{i=0}^n x^i = (1-x^{n+1})/(1-x)\).

Is it necessary for this expression to be true that \(|x|<1\)? What should you require for \(|x|\) when you want to take the limit \(n\to\infty\)?

Solution

To take the limit \(K\to\infty\) (not the limit \(n\to\infty\)), write

\begin{equation*} G= \frac{1-\rho^{K+1}}{1-\rho} = \frac{1}{1-\rho} -\frac{\rho^{K+1}}{1-\rho}. \end{equation*}

Since \(\rho^{K+1}\to 0\) as \(K\to \infty\) (recall, \(\rho<1\)), we get

\begin{equation*} G \to \frac{1}{1-\rho}, \end{equation*}

as \(K\to\infty\). Hence, \(p(n)=\rho^n/G \to \rho^n(1-\rho)\), and the latter are the steady-state probabilities of the \(M/M/1\) queue. Finally, if the steady-state probabilities coincide, the performance measures derived from \(p(n)\) coincide as well.
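A numerical illustration of this convergence; the value of \(\rho\) is illustrative.

```python
# The M/M/1/K probabilities converge to the M/M/1 probabilities
# as K grows, provided rho < 1.
rho = 0.8

def pK(n, K):
    """Steady-state probability p(n) of the M/M/1/K queue."""
    return (1 - rho) / (1 - rho**(K + 1)) * rho**n

# Compare with the M/M/1 probabilities (1 - rho) rho^n.
err = max(abs(pK(n, 200) - (1 - rho) * rho**n) for n in range(10))
print(err < 1e-12)
```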