
Chapter 9

One- and Two-Sample Estimation Problems
9.1 From Example 9.1 on page 271, we know that E(S²) = σ². Therefore,

E(S′²) = E[((n − 1)/n)S²] = ((n − 1)/n)E(S²) = ((n − 1)/n)σ².

9.2 (a) E(X) = np; E(P̂ ) = E(X/n) = E(X)/n = np/n = p.


(b) E(P′) = (E(X) + √n/2)/(n + √n) = (np + √n/2)/(n + √n) ≠ p.
9.3 lim_{n→∞} (np + √n/2)/(n + √n) = lim_{n→∞} (p + 1/(2√n))/(1 + 1/√n) = p.

9.4 n = 30, x̄ = 780, and σ = 40. Also, z0.02 = 2.054. So, a 96% confidence interval for the
population mean can be calculated as
780 − (2.054)(40/√30) < µ < 780 + (2.054)(40/√30),

or 765 < µ < 795.
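As a quick sanity check, the interval arithmetic above can be reproduced in a few lines (a sketch using only the Python standard library; the tabled value z0.02 = 2.054 is hard-coded, not computed):

```python
from math import sqrt

# 96% confidence interval for a mean with known sigma (Exercise 9.4).
# z = 2.054 is the tabled z_{0.02}; the script only redoes the arithmetic.
n, xbar, sigma, z = 30, 780.0, 40.0, 2.054
half_width = z * sigma / sqrt(n)
lower, upper = xbar - half_width, xbar + half_width
print(round(lower), "< mu <", round(upper))
```

The same pattern applies to every known-sigma interval in this chapter; only the inputs change.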

9.5 n = 75, x̄ = 0.310, σ = 0.0015, and z0.025 = 1.96. A 95% confidence interval for the
population mean is
0.310 − (1.96)(0.0015/√75) < µ < 0.310 + (1.96)(0.0015/√75),

or 0.3097 < µ < 0.3103.

9.6 n = 50, x̄ = 174.5, σ = 6.9, and z0.01 = 2.33.

(a) A 98% confidence interval for the population mean is
174.5 − (2.33)(6.9/√50) < µ < 174.5 + (2.33)(6.9/√50), or 172.23 < µ < 176.77.

(b) e < (2.33)(6.9)/√50 = 2.27.


9.7 n = 100, x̄ = 23,500, σ = 3900, and z0.005 = 2.575.

(a) A 99% confidence interval for the population mean is
23,500 − (2.575)(3900/10) < µ < 23,500 + (2.575)(3900/10), or 22,496 < µ < 24,504.

(b) e < (2.575)(3900/10) = 1004.

9.8 n = [(2.05)(40)/10]2 = 68 when rounded up.

9.9 n = [(1.96)(0.0015)/0.0005]2 = 35 when rounded up.

9.10 n = [(1.96)(40)/15]2 = 28 when rounded up.

9.11 n = [(2.575)(5.8)/2]2 = 56 when rounded up.
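Exercises 9.8 through 9.11 all use the same sample-size formula; a small helper makes the rounding-up step explicit (a sketch; the z values are read from the normal table, not computed):

```python
from math import ceil

# n = (z * sigma / e)^2, rounded up to the next integer.
def sample_size(z, sigma, e):
    return ceil((z * sigma / e) ** 2)

print(sample_size(2.05, 40, 10))   # Exercise 9.8
print(sample_size(2.575, 5.8, 2))  # Exercise 9.11
```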

9.12 n = 20, x̄ = 11.3, s = 2.45, and t0.025 = 2.093 with 19 degrees of freedom. A 95%
confidence interval for the population mean is
11.3 − (2.093)(2.45/√20) < µ < 11.3 + (2.093)(2.45/√20),

or 10.15 < µ < 12.45.
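The same computation with the t table (a sketch; t0.025 = 2.093 for 19 degrees of freedom is hard-coded from the table):

```python
from math import sqrt

# 95% t-interval for a mean with unknown sigma (Exercise 9.12).
n, xbar, s, t = 20, 11.3, 2.45, 2.093
half_width = t * s / sqrt(n)
lower, upper = xbar - half_width, xbar + half_width
print(round(lower, 2), round(upper, 2))
```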

9.13 n = 9, x̄ = 1.0056, s = 0.0245, and t0.005 = 3.355 with 8 degrees of freedom. A 99%
confidence interval for the population mean is

1.0056 − (3.355)(0.0245/3) < µ < 1.0056 + (3.355)(0.0245/3),

or 0.978 < µ < 1.033.

9.14 n = 10, x̄ = 230, s = 15, and t0.005 = 3.25 with 9 degrees of freedom. A 99% confidence
interval for the population mean is
230 − (3.25)(15/√10) < µ < 230 + (3.25)(15/√10),

or 214.58 < µ < 245.42.

9.15 n = 12, x̄ = 48.50, s = 1.5, and t0.05 = 1.796 with 11 degrees of freedom. A 90%
confidence interval for the population mean is
48.50 − (1.796)(1.5/√12) < µ < 48.50 + (1.796)(1.5/√12),

or 47.722 < µ < 49.278.

9.16 n = 12, x̄ = 79.3, s = 7.8, and t0.025 = 2.201 with 11 degrees of freedom. A 95%
confidence interval for the population mean is
79.3 − (2.201)(7.8/√12) < µ < 79.3 + (2.201)(7.8/√12),

or 74.34 < µ < 84.26.



9.17 n = 25, x̄ = 325.05, s = 0.5, γ = 5%, and 1 − α = 90%, with k = 2.208. So, 325.05 ± (2.208)(0.5) yields (323.946, 326.154). Thus, we are 95% confident that this tolerance interval will contain 90% of the aspirin contents for this brand of buffered aspirin.

9.18 n = 15, x̄ = 3.7867, s = 0.9709, γ = 1%, and 1 − α = 95%, with k = 3.507. So, by
calculating 3.7867 ± (3.507)(0.9709) we obtain (0.382, 7.192) which is a 99% tolerance
interval that will contain 95% of the drying times.
9.19 n = 100, x̄ = 23,500, s = 3,900, 1 − α = 0.99, and γ = 0.01, with k = 3.096. The tolerance interval is 23,500 ± (3.096)(3,900), which yields (11,425, 35,574).
9.20 n = 12, x̄ = 48.50, s = 1.5, 1 − α = 0.90, and γ = 0.05, with k = 2.655. The tolerance
interval is 48.50 ± (2.655)(1.5) which yields (44.52, 52.48).
9.21 By definition, MSE = E(Θ̂ − θ)², which can be expressed as

MSE = E[Θ̂ − E(Θ̂) + E(Θ̂) − θ]² = E[Θ̂ − E(Θ̂)]² + E[E(Θ̂) − θ]² + 2E[Θ̂ − E(Θ̂)][E(Θ̂) − θ].

The third term on the right-hand side is zero since E[Θ̂ − E(Θ̂)] = E(Θ̂) − E(Θ̂) = 0. Hence the claim is valid.
9.22 (a) The bias is E(S′²) − σ² = ((n − 1)/n)σ² − σ² = −σ²/n.

(b) lim_{n→∞} Bias = lim_{n→∞} (−σ²/n) = 0.

9.23 Using Theorem 8.4, we know that X² = (n − 1)S²/σ² follows a chi-squared distribution with n − 1 degrees of freedom, whose variance is 2(n − 1). So,

Var(S²) = Var(σ²X²/(n − 1)) = [σ²/(n − 1)]²[2(n − 1)] = 2σ⁴/(n − 1),

and

Var(S′²) = Var((n − 1)S²/n) = [(n − 1)/n]² Var(S²) = 2(n − 1)σ⁴/n².

Therefore, the variance of S′² is smaller.

9.24 Using Exercises 9.21 and 9.23,

MSE(S²)/MSE(S′²) = [Var(S²) + [Bias(S²)]²]/[Var(S′²) + [Bias(S′²)]²]
= [2σ⁴/(n − 1)]/[2(n − 1)σ⁴/n² + σ⁴/n²]
= 1 + (3n − 1)/(2n² − 3n + 1),

which is always larger than 1 when n is larger than 1. Hence the MSE of S′² is usually smaller.
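The closed form derived here can be confirmed numerically (a sketch with σ = 1; the loop just checks the algebra over a range of n):

```python
# Verify MSE(S^2)/MSE(S'^2) = 1 + (3n - 1)/(2n^2 - 3n + 1) with sigma = 1.
# MSE(S^2) = 2/(n-1); MSE(S'^2) = 2(n-1)/n^2 + 1/n^2 (variance plus squared bias).
for n in range(2, 100):
    ratio = (2 / (n - 1)) / ((2 * (n - 1) + 1) / n**2)
    closed_form = 1 + (3 * n - 1) / (2 * n**2 - 3 * n + 1)
    assert abs(ratio - closed_form) < 1e-9
    assert ratio > 1
print("ratio formula confirmed for n = 2, ..., 99")
```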

9.25 n = 20, x̄ = 11.3, s = 2.45, and t0.025 = 2.093 with 19 degrees of freedom. A 95%
prediction interval for a future observation is
11.3 ± (2.093)(2.45)√(1 + 1/20) = 11.3 ± 5.25,

which yields (6.05, 16.55).
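The prediction-interval half width differs from the one-sample t-interval only by the factor √(1 + 1/n); a quick check of Exercise 9.25 (t value hard-coded from the table):

```python
from math import sqrt

# 95% prediction interval for one future observation (Exercise 9.25).
n, xbar, s, t = 20, 11.3, 2.45, 2.093
half_width = t * s * sqrt(1 + 1 / n)
lower, upper = xbar - half_width, xbar + half_width
print(round(lower, 2), round(upper, 2))
```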



9.26 n = 12, x̄ = 79.3, s = 7.8, and t0.025 = 2.201 with 11 degrees of freedom. A 95%
prediction interval for a future observation is
79.3 ± (2.201)(7.8)√(1 + 1/12) = 79.3 ± 17.87,

which yields (61.43, 97.17).

9.27 n = 15, x̄ = 3.7867, s = 0.9709, and t0.025 = 2.145 with 14 degrees of freedom. A 95%
prediction interval for a new observation is
3.7867 ± (2.145)(0.9709)√(1 + 1/15) = 3.7867 ± 2.1509,

which yields (1.6358, 5.9376).

9.28 n = 9, x̄ = 1.0056, s = 0.0245, 1 − α = 0.95, and γ = 0.05, with k = 3.532. The tolerance interval is 1.0056 ± (3.532)(0.0245), which yields (0.919, 1.092).

9.29 n = 15, x̄ = 3.84, and s = 3.07. To calculate an upper 95% prediction limit, we obtain t0.05 = 1.761 with 14 degrees of freedom. So, the upper limit is 3.84 + (1.761)(3.07)√(1 + 1/15) = 3.84 + 5.58 = 9.42. This means that a new observation will fall in the interval (−∞, 9.42) with 95% probability. To obtain an upper 95% tolerance limit, using 1 − α = 0.95 and γ = 0.05, with k = 2.566, we get 3.84 + (2.566)(3.07) = 11.72. Hence, we are 95% confident that (−∞, 11.72) will contain 95% of the orthophosphorous measurements in the river.

9.30 n = 50, x̄ = 78.3, and s = 5.6. Since t0.05 = 1.677 with 49 degrees of freedom, the bound of a lower 95% prediction interval for a single new observation is 78.3 − (1.677)(5.6)√(1 + 1/50) = 68.91. So, the interval is (68.91, ∞). On the other hand, with 1 − α = 95% and γ = 0.01, the k value for a one-sided tolerance limit is 2.269, and the bound is 78.3 − (2.269)(5.6) = 65.59. So, the tolerance interval is (65.59, ∞).

9.31 Since the manufacturer would be more interested in the tensile strength of future products, it is conceivable that a prediction interval or a tolerance interval may be more interesting than just a confidence interval.

9.32 This time 1 − α = 0.99 and γ = 0.05 with k = 3.126. So, the tolerance limit is
78.3 − (3.126)(5.6) = 60.79. Since 62 exceeds the lower bound of the interval, yes, this
is a cause of concern.

9.33 In Exercise 9.27, a 95% prediction interval for a new observation is calculated as (1.6358, 5.9376). Since 6.9 falls outside the prediction interval, this new observation is likely to be an outlier.

9.34 n = 12, x̄ = 48.50, s = 1.5, 1 − α = 0.95, and γ = 0.05, with k = 2.815. The lower
bound of the one-sided tolerance interval is 48.50 − (2.815)(1.5) = 44.275. Their claim
is not necessarily correct.

9.35 n1 = 25, n2 = 36, x̄1 = 80, x̄2 = 75, σ1 = 5, σ2 = 3, and z0.03 = 1.88. So, a 94%
confidence interval for µ1 − µ2 is
(80 − 75) − (1.88)√(25/25 + 9/36) < µ1 − µ2 < (80 − 75) + (1.88)√(25/25 + 9/36),

which yields 2.9 < µ1 − µ2 < 7.1.
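The two-sample interval adds the two variance contributions under the square root; a quick check of Exercise 9.35 (z0.03 = 1.88 hard-coded from the table):

```python
from math import sqrt

# 94% CI for mu1 - mu2 with known variances (Exercise 9.35).
x1, x2, z = 80.0, 75.0, 1.88
var1, n1, var2, n2 = 25.0, 25, 9.0, 36
half_width = z * sqrt(var1 / n1 + var2 / n2)
lower, upper = (x1 - x2) - half_width, (x1 - x2) + half_width
print(round(lower, 1), round(upper, 1))
```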

9.36 nA = 50, nB = 50, x̄A = 78.3, x̄B = 87.2, σA = 5.6, and σB = 6.3. It is known that
z0.025 = 1.96. So, a 95% confidence interval for the difference of the population means
is

(87.2 − 78.3) ± 1.96√(5.6²/50 + 6.3²/50) = 8.9 ± 2.34,
or 6.56 < µA − µB < 11.24.

9.37 n1 = 100, n2 = 200, x̄1 = 12.2, x̄2 = 9.1, s1 = 1.1, and s2 = 0.9. It is known that
z0.01 = 2.327. So
(12.2 − 9.1) ± 2.327√(1.1²/100 + 0.9²/200) = 3.1 ± 0.30,

or 2.80 < µ1 − µ2 < 3.40. The treatment appears to reduce the mean amount of metal
removed.

9.38 n1 = 12, n2 = 10, x̄1 = 85, x̄2 = 81, s1 = 4, s2 = 5, and sp = 4.478 with t0.05 = 1.725
with 20 degrees of freedom. So
(85 − 81) ± (1.725)(4.478)√(1/12 + 1/10) = 4 ± 3.31,

which yields 0.69 < µ1 − µ2 < 7.31.
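The pooled standard deviation quoted in these solutions is the usual weighted average of the two sample variances; a sketch of Exercise 9.38 (t0.05 = 1.725 hard-coded):

```python
from math import sqrt

# Pooled two-sample t-interval (Exercise 9.38).
n1, n2, x1, x2, s1, s2, t = 12, 10, 85.0, 81.0, 4.0, 5.0, 1.725
sp = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
half_width = t * sp * sqrt(1 / n1 + 1 / n2)
lower, upper = (x1 - x2) - half_width, (x1 - x2) + half_width
print(round(sp, 3), round(lower, 2), round(upper, 2))
```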

9.39 n1 = 12, n2 = 18, x̄1 = 84, x̄2 = 77, s1 = 4, s2 = 6, and sp = 5.305 with t0.005 = 2.763
with 28 degrees of freedom. So,
(84 − 77) ± (2.763)(5.305)√(1/12 + 1/18) = 7 ± 5.46,

which yields 1.54 < µ1 − µ2 < 12.46.

9.40 n1 = 10, n2 = 10, x̄1 = 0.399, x̄2 = 0.565, s1 = 0.07279, s2 = 0.18674, and sp = 0.14172
with t0.025 = 2.101 with 18 degrees of freedom. So,
(0.565 − 0.399) ± (2.101)(0.14172)√(1/10 + 1/10) = 0.166 ± 0.133,

which yields 0.033 < µ1 − µ2 < 0.299.

9.41 n1 = 14, n2 = 16, x̄1 = 17, x̄2 = 19, s21 = 1.5, s22 = 1.8, and sp = 1.289 with t0.005 =
2.763 with 28 degrees of freedom. So,
(19 − 17) ± (2.763)(1.289)√(1/16 + 1/14) = 2 ± 1.30,

which yields 0.70 < µ1 − µ2 < 3.30.



9.42 n1 = 12, n2 = 10, x̄1 = 16, x̄2 = 11, s1 = 1.0, s2 = 0.8, and sp = 0.915 with t0.05 = 1.725
with 20 degrees of freedom. So,
(16 − 11) ± (1.725)(0.915)√(1/12 + 1/10) = 5 ± 0.68,

which yields 4.3 < µ1 − µ2 < 5.7.

9.43 nA = nB = 12, x̄A = 36,300, x̄B = 38,100, sA = 5,000, sB = 6,100, and

v = (5000²/12 + 6100²/12)² / [(5000²/12)²/11 + (6100²/12)²/11] = 21,

with t0.025 = 2.080 with 21 degrees of freedom. So,

(36,300 − 38,100) ± (2.080)√(5000²/12 + 6100²/12) = −1,800 ± 4,736,

which yields −6,536 < µA − µB < 2,936.
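The degrees-of-freedom formula used here (the Satterthwaite approximation) is easy to get wrong by hand; a sketch reproducing the v = 21 of Exercise 9.43:

```python
from math import floor, sqrt

# Satterthwaite approximation for unequal variances (Exercise 9.43).
n1 = n2 = 12
w1, w2 = 5000**2 / n1, 6100**2 / n2          # s_i^2 / n_i terms
v = (w1 + w2) ** 2 / (w1**2 / (n1 - 1) + w2**2 / (n2 - 1))
half_width = 2.080 * sqrt(w1 + w2)           # tabled t_{0.025, 21}
print(floor(v), round(half_width))
```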

9.44 n = 8, d̄ = −1112.5, sd = 1454, with t0.005 = 3.499 with 7 degrees of freedom. So,

−1112.5 ± (3.499)(1454/√8) = −1112.5 ± 1798.7,

which yields −2911.2 < µD < 686.2.

9.45 n = 9, d̄ = 2.778, sd = 4.5765, with t0.025 = 2.306 with 8 degrees of freedom. So,

2.778 ± (2.306)(4.5765/√9) = 2.778 ± 3.518,

which yields −0.74 < µD < 6.30.

9.46 nI = 5, nII = 7, x̄I = 98.4, x̄II = 110.7, sI = 8.735, and sII = 32.185, with

v = (8.735²/5 + 32.185²/7)² / [(8.735²/5)²/4 + (32.185²/7)²/6] = 7.

So, t0.05 = 1.895 with 7 degrees of freedom, and

(110.7 − 98.4) ± 1.895√(8.735²/5 + 32.185²/7) = 12.3 ± 24.2,

which yields −11.9 < µII − µI < 36.5.

9.47 n = 10, d̄ = 14.89%, and sd = 30.4868, with t0.025 = 2.262 with 9 degrees of freedom. So,

14.89 ± (2.262)(30.4868/√10) = 14.89 ± 21.81,

which yields −6.92 < µD < 36.70.

9.48 nA = nB = 20, x̄A = 32.91, x̄B = 30.47, sA = 1.57, sB = 1.74, and sp = 1.657.

(a) t0.025 ≈ 2.042 with 38 degrees of freedom. So,

(32.91 − 30.47) ± (2.042)(1.657)√(1/20 + 1/20) = 2.44 ± 1.07,

which yields 1.37 < µA − µB < 3.51.

(b) Since it is apparent that the type A battery has a longer life, it should be adopted.
9.49 nA = nB = 15, x̄A = 3.82, x̄B = 4.94, sA = 0.7794, sB = 0.7538, and sp = 0.7667 with
t0.025 = 2.048 with 28 degrees of freedom. So,
(4.94 − 3.82) ± (2.048)(0.7667)√(1/15 + 1/15) = 1.12 ± 0.57,
which yields 0.55 < µB − µA < 1.69.
9.50 n1 = 8, n2 = 13, x̄1 = 1.98, x̄2 = 1.30, s1 = 0.51, s2 = 0.35, and sp = 0.416. t0.025 =
2.093 with 19 degrees of freedom. So,
(1.98 − 1.30) ± (2.093)(0.416)√(1/8 + 1/13) = 0.68 ± 0.39,
which yields 0.29 < µ1 − µ2 < 1.07.
9.51 (a) n = 200, p̂ = 0.57, q̂ = 0.43, and z0.02 = 2.05. So,

0.57 ± (2.05)√((0.57)(0.43)/200) = 0.57 ± 0.072,

which yields 0.498 < p < 0.642.

(b) Error ≤ (2.05)√((0.57)(0.43)/200) = 0.072.
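A quick check of the proportion interval in part (a) (z0.02 = 2.05 hard-coded from the table):

```python
from math import sqrt

# Large-sample CI for a proportion (Exercise 9.51).
n, p_hat, z = 200, 0.57, 2.05
margin = z * sqrt(p_hat * (1 - p_hat) / n)
print(round(p_hat - margin, 3), round(p_hat + margin, 3))
```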
9.52 n = 500, p̂ = 485/500 = 0.97, q̂ = 0.03, and z0.05 = 1.645. So,

0.97 ± (1.645)√((0.97)(0.03)/500) = 0.97 ± 0.013,

which yields 0.957 < p < 0.983.
9.53 n = 1000, p̂ = 228/1000 = 0.228, q̂ = 0.772, and z0.005 = 2.575. So,

0.228 ± (2.575)√((0.228)(0.772)/1000) = 0.228 ± 0.034,

which yields 0.194 < p < 0.262.
9.54 n = 100, p̂ = 8/100 = 0.08, q̂ = 0.92, and z0.01 = 2.33. So,

0.08 ± (2.33)√((0.08)(0.92)/100) = 0.08 ± 0.063,

which yields 0.017 < p < 0.143.

9.55 (a) n = 40, p̂ = 34/40 = 0.85, q̂ = 0.15, and z0.025 = 1.96. So,

0.85 ± (1.96)√((0.85)(0.15)/40) = 0.85 ± 0.111,

which yields 0.739 < p < 0.961.

(b) Since p = 0.8 falls in the confidence interval, we cannot conclude that the new system is better.
9.56 n = 100, p̂ = 24/100 = 0.24, q̂ = 0.76, and z0.005 = 2.575.

(a) 0.24 ± (2.575)√((0.24)(0.76)/100) = 0.24 ± 0.110, which yields 0.130 < p < 0.350.

(b) Error ≤ (2.575)√((0.24)(0.76)/100) = 0.110.

9.57 n = 1600, p̂ = 2/3, q̂ = 1/3, and z0.025 = 1.96.

(a) 2/3 ± (1.96)√((2/3)(1/3)/1600) = 2/3 ± 0.023, which yields 0.644 < p < 0.690.

(b) Error ≤ (1.96)√((2/3)(1/3)/1600) = 0.023.

9.58 n = (1.96)²(0.32)(0.68)/(0.02)² = 2090 when rounded up.

9.59 n = (2.05)²(0.57)(0.43)/(0.02)² = 2576 when rounded up.

9.60 n = (2.575)²(0.228)(0.772)/(0.05)² = 467 when rounded up.

9.61 n = (2.33)²(0.08)(0.92)/(0.05)² = 160 when rounded up.

9.62 n = (1.96)²/[(4)(0.01)²] = 9604 when rounded up.

9.63 n = (2.575)²/[(4)(0.01)²] = 16577 when rounded up.

9.64 n = (1.96)²/[(4)(0.04)²] = 601 when rounded up.
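Exercises 9.58 through 9.64 use two sample-size formulas, one with a prior estimate of p and one with the conservative worst case p = 1/2; both can be wrapped in small helpers (a sketch; z values are tabled):

```python
from math import ceil

# n = z^2 * p * q / e^2 with an estimate of p (Exercises 9.58-9.61).
def n_with_estimate(z, p, e):
    return ceil(z**2 * p * (1 - p) / e**2)

# Worst case p = 1/2 gives n = z^2 / (4 e^2) (Exercises 9.62-9.64).
def n_worst_case(z, e):
    return ceil(z**2 / (4 * e**2))

print(n_with_estimate(1.96, 0.32, 0.02))  # Exercise 9.58
print(n_worst_case(2.575, 0.01))          # Exercise 9.63
```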

9.65 nM = nF = 1000, p̂M = 0.250, q̂M = 0.750, p̂F = 0.275, q̂F = 0.725, and z0.025 = 1.96. So,

(0.275 − 0.250) ± (1.96)√((0.250)(0.750)/1000 + (0.275)(0.725)/1000) = 0.025 ± 0.039,

which yields −0.0136 < pF − pM < 0.0636.
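For a difference of two proportions the binomial variances add; a check of Exercise 9.65 (z0.025 = 1.96 hard-coded):

```python
from math import sqrt

# 95% CI for pF - pM (Exercise 9.65).
n_m = n_f = 1000
p_m, p_f, z = 0.250, 0.275, 1.96
margin = z * sqrt(p_m * (1 - p_m) / n_m + p_f * (1 - p_f) / n_f)
lower, upper = (p_f - p_m) - margin, (p_f - p_m) + margin
print(round(lower, 4), round(upper, 4))
```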

9.66 n1 = 250, n2 = 175, p̂1 = 80/250 = 0.32, p̂2 = 40/175 = 0.2286, and z0.05 = 1.645. So,

(0.32 − 0.2286) ± (1.645)√((0.32)(0.68)/250 + (0.2286)(0.7714)/175) = 0.0914 ± 0.0713,

which yields 0.0201 < p1 − p2 < 0.1627. From this study we conclude that there is a significantly higher proportion of women in electrical engineering than there is in chemical engineering.
9.67 n1 = n2 = 500, p̂1 = 120/500 = 0.24, p̂2 = 98/500 = 0.196, and z0.05 = 1.645. So,

(0.24 − 0.196) ± (1.645)√((0.24)(0.76)/500 + (0.196)(0.804)/500) = 0.044 ± 0.0429,

which yields 0.0011 < p1 − p2 < 0.0869. Since 0 is not in this confidence interval, we conclude, at the level of 90% confidence, that inoculation has an effect on the incidence of the disease.

9.68 n5◦ C = n15◦ C = 20, p̂5◦ C = 0.50, p̂15◦ C = 0.75, and z0.025 = 1.96. So,
(0.50 − 0.75) ± (1.96)√((0.50)(0.50)/20 + (0.75)(0.25)/20) = −0.25 ± 0.2899,
which yields −0.5399 < p5◦ C − p15◦ C < 0.0399. Since this interval includes 0, the
significance of the difference cannot be shown at the confidence level of 95%.

9.69 nnow = 1000, p̂now = 0.2740, n91 = 760, p̂91 = 0.3158, and z0.025 = 1.96. So,
(0.2740 − 0.3158) ± (1.96)√((0.2740)(0.7260)/1000 + (0.3158)(0.6842)/760) = −0.0418 ± 0.0431,
which yields −0.0849 < pnow − p91 < 0.0013. Hence, at the confidence level of 95%,
the significance cannot be shown.

9.70 n90 = n94 = 20, p̂90 = 0.337, and p̂94 = 0.362.

(a) n90 p̂90 = (20)(0.337) ≈ 7 and n94 p̂94 = (20)(0.362) ≈ 7.

(b) Since z0.025 = 1.96,

(0.337 − 0.362) ± (1.96)√((0.337)(0.663)/20 + (0.362)(0.638)/20) = −0.025 ± 0.295,

which yields −0.320 < p90 − p94 < 0.270. Hence there is no evidence, at the confidence level of 95%, that there is a change in the proportions.

9.71 s² = 0.815 with v = 4 degrees of freedom. Also, χ²0.025 = 11.143 and χ²0.975 = 0.484. So,

(4)(0.815)/11.143 < σ² < (4)(0.815)/0.484, which yields 0.293 < σ² < 6.736.

Since this interval contains 1, the claim that σ² = 1 seems valid.

9.72 s² = 16 with v = 19 degrees of freedom. It is known that χ²0.01 = 36.191 and χ²0.99 = 7.633. Hence,

(19)(16)/36.191 < σ² < (19)(16)/7.633, or 8.400 < σ² < 39.827.
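The variance interval just divides (n − 1)s² by the two tabled chi-squared values; a check of Exercise 9.72:

```python
# 98% CI for sigma^2 from a chi-squared pivot (Exercise 9.72).
n, s2 = 20, 16
chi2_upper, chi2_lower = 36.191, 7.633  # tabled chi^2_{0.01} and chi^2_{0.99}, 19 df
lower = (n - 1) * s2 / chi2_upper
upper = (n - 1) * s2 / chi2_lower
print(round(lower, 3), round(upper, 3))
```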
9.73 s² = 6.0025 with v = 19 degrees of freedom. Also, χ²0.025 = 32.852 and χ²0.975 = 8.907. Hence,

(19)(6.0025)/32.852 < σ² < (19)(6.0025)/8.907, or 3.472 < σ² < 12.804.
9.74 s² = 0.0006 with v = 8 degrees of freedom. Also, χ²0.005 = 21.955 and χ²0.995 = 1.344. Hence,

(8)(0.0006)/21.955 < σ² < (8)(0.0006)/1.344, or 0.00022 < σ² < 0.00357.
9.75 s² = 225 with v = 9 degrees of freedom. Also, χ²0.005 = 23.589 and χ²0.995 = 1.735. Hence,

(9)(225)/23.589 < σ² < (9)(225)/1.735, or 85.845 < σ² < 1167.147,

which yields 9.27 < σ < 34.16.

9.76 s² = 2.25 with v = 11 degrees of freedom. Also, χ²0.05 = 19.675 and χ²0.95 = 4.575. Hence,

(11)(2.25)/19.675 < σ² < (11)(2.25)/4.575, or 1.258 < σ² < 5.410.
9.77 s1² = 1.00, s2² = 0.64, f0.01(11, 9) = 5.19, and f0.01(9, 11) = 4.63. So,

(1.00/0.64)(1/5.19) < σ1²/σ2² < (1.00/0.64)(4.63), or 0.301 < σ1²/σ2² < 7.234,

which yields 0.549 < σ1/σ2 < 2.690.
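The two tabled F values enter on opposite sides of the ratio; a check of Exercise 9.77, including the square-root step for the standard deviations:

```python
from math import sqrt

# 98% CI for sigma1^2/sigma2^2 and for sigma1/sigma2 (Exercise 9.77).
s1_sq, s2_sq = 1.00, 0.64
f_12, f_21 = 5.19, 4.63        # tabled f_{0.01}(11, 9) and f_{0.01}(9, 11)
lower = (s1_sq / s2_sq) / f_12
upper = (s1_sq / s2_sq) * f_21
print(round(lower, 3), round(upper, 3))
print(round(sqrt(lower), 3), round(sqrt(upper), 3))
```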

9.78 s21 = 50002 , s22 = 61002 , and f0.05 (11, 11) = 2.82. (Note: this value can be found by
using “=finv(0.05,11,11)” in Microsoft Excel.) So,
(5000/6100)²(1/2.82) < σ1²/σ2² < (5000/6100)²(2.82), or 0.238 < σ1²/σ2² < 1.895.

Since the interval contains 1, it is reasonable to assume that σ12 = σ22 .

9.79 sI² = 76.3, sII² = 1035.905, f0.05(4, 6) = 4.53, and f0.05(6, 4) = 6.16. So,

(76.3/1035.905)(1/4.53) < σI²/σII² < (76.3/1035.905)(6.16), or 0.016 < σI²/σII² < 0.454.

Hence, we may assume that σI² ≠ σII².

9.80 sA = 0.7794, sB = 0.7538, and f0.025(14, 14) = 2.98. (Note: this value can be found by using “=finv(0.025,14,14)” in Microsoft Excel.) So,

(0.7794/0.7538)²(1/2.98) < σA²/σB² < (0.7794/0.7538)²(2.98), or 0.359 < σA²/σB² < 3.186.

Since the interval contains 1, it is reasonable to assume the equality of the variances.

9.81 The likelihood function is

L(x1, . . . , xn) = ∏_{i=1}^n f(xi; p) = ∏_{i=1}^n p^xi (1 − p)^(1−xi) = p^(nx̄)(1 − p)^(n(1−x̄)).

Hence, ln L = n[x̄ ln(p) + (1 − x̄) ln(1 − p)]. Taking the derivative with respect to p and setting it to zero, we obtain

∂ ln L/∂p = n[x̄/p − (1 − x̄)/(1 − p)] = 0,

which yields x̄/p − (1 − x̄)/(1 − p) = 0. Therefore, p̂ = x̄.
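The analytic result p̂ = x̄ can be confirmed numerically: scan the Bernoulli log-likelihood over a grid of p values and check that the maximizer lands on the sample mean (a sketch; the 0/1 data vector is made up for illustration):

```python
from math import log

# Bernoulli log-likelihood: sum of x*ln(p) + (1 - x)*ln(1 - p) (Exercise 9.81).
data = [1, 0, 1, 1, 0, 1, 0, 1]  # hypothetical sample
x_bar = sum(data) / len(data)

def loglik(p):
    return sum(x * log(p) + (1 - x) * log(1 - p) for x in data)

# Grid search over (0, 1); the maximizer should coincide with x_bar.
p_hat = max((i / 1000 for i in range(1, 1000)), key=loglik)
print(x_bar, p_hat)
```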

9.82 (a) The likelihood function is

L(x1, . . . , xn) = ∏_{i=1}^n f(xi; α, β) = (αβ)^n (∏_{i=1}^n xi)^(β−1) e^(−α Σ_{i=1}^n xi^β).

(b) So, the log-likelihood can be expressed as

ln L = n[ln(α) + ln(β)] − α Σ_{i=1}^n xi^β + (β − 1) Σ_{i=1}^n ln(xi).

To solve for the maximum likelihood estimates, we need to solve the following two equations:

∂ ln L/∂α = 0, and ∂ ln L/∂β = 0.

9.83 (a) The likelihood function is

L(x1, . . . , xn) = ∏_{i=1}^n f(xi; µ, σ) = ∏_{i=1}^n [1/(√(2π) σ xi)] e^(−[ln(xi) − µ]²/(2σ²))
= [(2π)^(n/2) σ^n ∏_{i=1}^n xi]^(−1) exp{−(1/(2σ²)) Σ_{i=1}^n [ln(xi) − µ]²}.

(b) It is easy to obtain

ln L = −(n/2) ln(2π) − (n/2) ln(σ²) − Σ_{i=1}^n ln(xi) − (1/(2σ²)) Σ_{i=1}^n [ln(xi) − µ]².

So, setting 0 = ∂ ln L/∂µ = (1/σ²) Σ_{i=1}^n [ln(xi) − µ], we obtain µ̂ = (1/n) Σ_{i=1}^n ln(xi), and setting 0 = ∂ ln L/∂σ² = −n/(2σ²) + (1/(2σ⁴)) Σ_{i=1}^n [ln(xi) − µ]², we get σ̂² = (1/n) Σ_{i=1}^n [ln(xi) − µ̂]².

9.84 (a) The likelihood function is

L(x1, . . . , xn) = [1/(β^(nα) Γ(α)^n)] (∏_{i=1}^n xi)^(α−1) e^(−Σ_{i=1}^n xi/β).

(b) Hence,

ln L = −nα ln(β) − n ln(Γ(α)) + (α − 1) Σ_{i=1}^n ln(xi) − (1/β) Σ_{i=1}^n xi.

Taking derivatives of ln L with respect to α and β, setting both equal to zero, and solving the resulting equations yields the maximum likelihood estimates.

9.85 L(x) = p^x (1 − p)^(1−x), and ln L = x ln(p) + (1 − x) ln(1 − p). Setting ∂ ln L/∂p = x/p − (1 − x)/(1 − p) = 0, we obtain p̂ = x = 1.
9.86 From the density function b*(x; k, p) = (x−1 choose k−1) p^k (1 − p)^(x−k), we obtain

ln L = ln(x−1 choose k−1) + k ln(p) + (x − k) ln(1 − p).

Setting ∂ ln L/∂p = k/p − (x − k)/(1 − p) = 0, we obtain p̂ = k/x.

9.87 For the estimator S²,

Var(S²) = [1/(n − 1)²] Var[Σ_{i=1}^n (xi − x̄)²] = [1/(n − 1)²] Var(σ²χ²_{n−1}) = [σ⁴/(n − 1)²][2(n − 1)] = 2σ⁴/(n − 1).

For the estimator σ̂², we have

Var(σ̂²) = 2σ⁴(n − 1)/n².

9.88 n = 7, d̄ = 3.557, sd = 2.776, and t0.025 = 2.447 with 6 degrees of freedom. So,

3.557 ± (2.447)(2.776/√7) = 3.557 ± 2.567,

which yields 0.99 < µD < 6.12. Since 0 is not in the interval, the claim appears valid.
9.89 n = 75, x = 28, hence p̂ = 28/75 = 0.3733. Since z0.025 = 1.96, a 95% confidence interval for p can be calculated as

0.3733 ± (1.96)√((0.3733)(0.6267)/75) = 0.3733 ± 0.1095,

which yields 0.2638 < p < 0.4828. Since the interval contains 0.421, the claim made by the Roanoke Times seems reasonable.

9.90 n = 12, d̄ = 40.58, sd = 15.791, and t0.025 = 2.201 with 11 degrees of freedom. So,

40.58 ± (2.201)(15.791/√12) = 40.58 ± 10.03,

which yields 30.55 < µD < 50.61.

9.91 n = 6, d̄ = 1.5, sd = 1.543, and t0.025 = 2.571 with 5 degrees of freedom. So,

1.5 ± (2.571)(1.543/√6) = 1.5 ± 1.62,

which yields −0.12 < µD < 3.12.

9.92 n = 12, d̄ = 417.5, sd = 1186.643, and t0.05 = 1.796 with 11 degrees of freedom. So,

417.5 ± (1.796)(1186.643/√12) = 417.5 ± 615.23,

which yields −197.73 < µD < 1032.73.

9.93 np = nu = 8, x̄p = 86,250.000, x̄u = 79,837.500, σp = σu = 4,000, and z0.025 = 1.96. So,

(86,250 − 79,837.5) ± (1.96)(4,000)√(1/8 + 1/8) = 6,412.5 ± 3,920,

which yields 2,492.5 < µp − µu < 10,332.5. Hence, polishing does increase the average endurance limit.
9.94 nA = 100, nB = 120, p̂A = 24/100 = 0.24, p̂B = 36/120 = 0.30, and z0.025 = 1.96. So,

(0.30 − 0.24) ± (1.96)√((0.24)(0.76)/100 + (0.30)(0.70)/120) = 0.06 ± 0.117,

which yields −0.057 < pB − pA < 0.177.

9.95 nN = nO = 23, sN² = 105.9271, sO² = 77.4138, and f0.025(22, 22) = 2.358. So,

(105.9271/77.4138)(1/2.358) < σN²/σO² < (105.9271/77.4138)(2.358), or 0.58 < σN²/σO² < 3.23.

For the ratio of the standard deviations, the 95% confidence interval is approximately

0.76 < σN/σO < 1.80.

Since the intervals contain 1, we will assume that the variability did not change with the local supplier.
9.96 nA = nB = 6, x̄A = 0.1407, x̄B = 0.1385, sA = 0.002805, sB = 0.002665, and sp = 0.002736. Using a 90% confidence interval for the difference in the population means, t0.05 = 1.812 with 10 degrees of freedom, we obtain

(0.1407 − 0.1385) ± (1.812)(0.002736)√(1/6 + 1/6) = 0.0022 ± 0.0029,

which yields −0.0007 < µA − µB < 0.0051. Since the 90% confidence interval contains 0, we conclude that wire A was not shown to be better than wire B, with 90% confidence.
9.97 To calculate the maximum likelihood estimator, we need to use

ln L = ln[e^(−nµ) µ^(Σ_{i=1}^n xi) / ∏_{i=1}^n xi!] = −nµ + ln(µ) Σ_{i=1}^n xi − ln(∏_{i=1}^n xi!).

Taking the derivative with respect to µ and setting it to zero, we obtain µ̂ = (1/n) Σ_{i=1}^n xi = x̄. On the other hand, using the method of moments, we also get µ̂ = x̄.
9.98 µ̂ = x̄ and σ̂² = (1/(n − 1)) Σ_{i=1}^n (xi − x̄)².
i=1
9.99 Equating x̄ = e^(µ+σ²/2) and s² = e^(2µ+σ²)(e^(σ²) − 1), we get ln(x̄) = µ + σ²/2, or µ̂ = ln(x̄) − σ̂²/2. On the other hand, ln(s²) = 2µ + σ² + ln(e^(σ²) − 1). Plugging in the form of µ̂, we obtain σ̂² = ln(1 + s²/x̄²).

9.100 Setting x̄ = αβ and s² = αβ², we get α̂ = x̄²/s² and β̂ = s²/x̄.
9.101 n1 = n2 = 300, x̄1 = 102,300, x̄2 = 98,500, s1 = 5700, and s2 = 3800.

(a) z0.005 = 2.575. Hence,

(102,300 − 98,500) ± (2.575)√(5700²/300 + 3800²/300) = 3800 ± 1018.46,

which yields 2781.54 < µ1 − µ2 < 4818.46. There is a significant difference in salaries between the two regions.

(b) Since the sample sizes are large enough, it is not necessary to assume normality, due to the Central Limit Theorem.

(c) We assumed that the two variances are not equal. Here we obtain a 95% confidence interval for the ratio of the two variances. It is known that f0.025(299, 299) = 1.255. So,

(5700/3800)²(1/1.255) < σ1²/σ2² < (5700/3800)²(1.255), or 1.793 < σ1²/σ2² < 2.824.

Since the confidence interval does not contain 1, the difference between the variances is significant.
9.102 The error in estimation, with 95% confidence, is (1.96)(4000)√(2/n). Equating this quantity to 1000, we obtain

n = 2[(1.96)(4000)/1000]² = 123,

when rounded up. Hence, the sample sizes in Review Exercise 9.101 are sufficient to produce a 95% confidence interval on µ1 − µ2 having a width of $1,000.

9.103 n = 300, x̄ = 6.5 and s = 2.5. Also, 1 − α = 0.99 and 1 − γ = 0.95. Using Table A.7,
k = 2.522. So, the limit of the one-sided tolerance interval is 6.5+(2.522)(2.5) = 12.805.
Since this interval contains 10, the claim by the union leaders appears valid.

9.104 n = 30, x = 8, p̂ = 8/30 = 4/15, and z0.025 = 1.96. So,

4/15 ± (1.96)√((4/15)(11/15)/30) = 4/15 ± 0.158,

which yields 0.108 < p < 0.425.


9.105 n = (1.96)²(4/15)(11/15)/(0.05)² = 301, when rounded up.

9.106 n1 = n2 = 100, p̂1 = 0.1, and p̂2 = 0.06.

(a) z0.025 = 1.96. So,

(0.1 − 0.06) ± (1.96)√((0.1)(0.9)/100 + (0.06)(0.94)/100) = 0.04 ± 0.075,

which yields −0.035 < p1 − p2 < 0.115.

(b) Since the confidence interval contains 0, it does not show sufficient evidence that p1 > p2.

9.107 n = 20 and s2 = 0.045. It is known that χ20.025 = 32.825 and χ20.975 = 8.907 with 19
degrees of freedom. Hence the 95% confidence interval for σ 2 can be expressed as
(19)(0.045)/32.825 < σ² < (19)(0.045)/8.907, or 0.012 < σ² < 0.045.
Therefore, the 95% confidence interval for σ can be approximated as

0.110 < σ < 0.212.

Since 0.3 falls outside of the confidence interval, there is strong evidence that the
process has been improved in variability.
9.108 nA = nB = 15, ȳA = 87, sA = 5.99, ȳB = 75, sB = 4.85, sp = 5.450, and t0.025 = 2.048
with 28 degrees of freedom. So,
(87 − 75) ± (2.048)(5.450)√(1/15 + 1/15) = 12 ± 4.076,
which yields 7.924 < µA − µB < 16.076. Apparently, the mean operating costs of type
A engines are higher than those of type B engines.
9.109 Since the unbiased estimators of σ1² and σ2² are S1² and S2², respectively,

E(Sp²) = [(n1 − 1)E(S1²) + (n2 − 1)E(S2²)]/(n1 + n2 − 2) = [(n1 − 1)σ1² + (n2 − 1)σ2²]/(n1 + n2 − 2).

If we assume σ1² = σ2² = σ², the right-hand side of the above is σ², which means that Sp² is unbiased for σ².
9.110 n = 15, x̄ = 3.2, and s = 0.6.

(a) t0.01 = 2.624 with 14 degrees of freedom. So, a 99% one-sided confidence interval has an upper bound of 3.2 + (2.624)(0.6/√15) = 3.607 seconds. We assumed normality in the calculation.

(b) 3.2 + (2.624)(0.6)√(1 + 1/15) = 4.826. Still, we need to assume normality in the distribution.

(c) 1 − α = 0.99 and 1 − γ = 0.95. So, k = 3.520 with n = 15. The upper bound is 3.2 + (3.520)(0.6) = 5.312. Hence, we are 99% confident that 95% of the pilots will have reaction times less than 5.312 seconds.
9.111 n = 400, x = 17, so p̂ = 17/400 = 0.0425.

(a) z0.025 = 1.96. So,

0.0425 ± (1.96)√((0.0425)(0.9575)/400) = 0.0425 ± 0.0198,

which yields 0.0227 < p < 0.0623.

(b) z0.05 = 1.645. So, the upper bound of a one-sided 95% confidence interval is 0.0425 + (1.645)√((0.0425)(0.9575)/400) = 0.0591.

(c) Using both intervals, we do not have evidence to dispute the supplier's claim.
