Please help!
Type your G# here: G12345678
Answer in the highlighted boxes.
Submit to Blackboard by the deadline.
Problem   Marks   Topic                    Chapter
1         2.5     Bayesian                 5
2         2.5     Poisson                  4
3         2.5     Distributions            4
4         2.5     Test for μ; σ known      5
5         2.5     Test for μ; σ unknown    7
6         2.5     Nonparametric            14
7         2.5     Regression               11
8         2.5     Joint Density            3
9         2.5     Source of data           10
10        2.5     Probability              2
25 total marks
Our prior distribution, for the probability of tails with a coin, is:
f(p) = 1; ie uniform on (0,1)
We toss the coin twice, and we get tails both times.
Under the resulting posterior distribution, complete the table:

p       0.28    0.53    0.78
f(p)
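Under the uniform prior, two observed tails give a posterior proportional to p², which normalizes to a Beta(3, 1) density. A small sketch of that normalization (the helper name is ours, not from the assignment):

```python
from math import factorial

def posterior_density(p, tails=2, heads=0):
    """Posterior for P(tails) under a uniform prior, after observing
    `tails` tails and `heads` heads: a Beta(tails+1, heads+1) density,
    normalized with integer factorials."""
    a, b = tails + 1, heads + 1
    norm = factorial(a + b - 1) / (factorial(a - 1) * factorial(b - 1))
    return norm * p ** tails * (1 - p) ** heads

# With two tails the posterior is f(p) = 3 * p^2:
for p in (0.28, 0.53, 0.78):
    print(p, posterior_density(p))
```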
Hurricane occurrences, per year, are Poisson-distributed with λ = 1.9
Complete the table:

Y     H     P(at least H hurricanes in Y years)
1     1
1     2
1     3
10    19    (you are allowed 1% tolerance)
100   163   (you are allowed 1% tolerance)
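Each row can be filled by noting that the count over Y years is Poisson(1.9·Y), so the answer is one minus the CDF at H−1. A sketch (the running-term recurrence avoids floating-point overflow on the 100-year row):

```python
from math import exp

def poisson_tail(lam, h):
    """P(N >= h) for N ~ Poisson(lam): 1 minus the CDF at h - 1."""
    term = exp(-lam)          # P(N = 0)
    cdf = term
    for k in range(1, h):
        term *= lam / k       # P(N = k) from P(N = k - 1)
        cdf += term
    return 1.0 - cdf

# Over Y years the count is Poisson(1.9 * Y):
for years, h in [(1, 1), (1, 2), (1, 3), (10, 19), (100, 163)]:
    print(years, h, poisson_tail(1.9 * years, h))
```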
A random variable, X, is known to have a mean of 6
Complete the table:

If X follows          then Variance(X) is
Uniform(0, n)
Poisson
Geometric
Exponential
Chi-square
Binomial(p = .1)
H0: population mean = 732
The underlying population is normal with variance = 312
Test H0 using a 5% chance of Type 1 error (two tails)

sample mean
5% critical values:   low         high
decision ("reject" or "cannot reject")
Sample (n = 40); Problems 4-5-6 all use this same sample
702  734  735  722  732  732  733  740  715  699
726  710  725  730  735  721  738  733  720  728
731  735  724  724  734  736  732  709  734  732
714  695  720  735  741  736  777  660  712  709
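A sketch of the Problem 4 z-test (1.959964 is the usual two-tailed 5% normal critical value; the acceptance band is centered on the hypothesized mean):

```python
from math import sqrt

sample = [702, 734, 735, 722, 732, 732, 733, 740, 715, 699,
          726, 710, 725, 730, 735, 721, 738, 733, 720, 728,
          731, 735, 724, 724, 734, 736, 732, 709, 734, 732,
          714, 695, 720, 735, 741, 736, 777, 660, 712, 709]

mu0, var0, n = 732, 312, len(sample)
se = sqrt(var0 / n)                 # std error of the mean, sigma known
z = 1.959964                        # two-tailed 5% normal critical value
low, high = mu0 - z * se, mu0 + z * se
xbar = sum(sample) / n
decision = "cannot reject" if low <= xbar <= high else "reject"
print(xbar, (low, high), decision)
```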
H0: population mean = 732
The underlying population is normal with unknown variance
Test H0 using a 5% chance of Type 1 error (two tails)

sample mean
5% critical values:   low         high
decision ("reject" or "cannot reject")
Sample (n = 40); Problems 4-5-6 all use this same sample (listed under Problem 4)
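A sketch of the Problem 5 t-test; 2.0227 is the standard two-tailed 5% t critical value for 39 degrees of freedom, hardcoded here because the Python standard library has no t quantile function:

```python
import statistics as st
from math import sqrt

sample = [702, 734, 735, 722, 732, 732, 733, 740, 715, 699,
          726, 710, 725, 730, 735, 721, 738, 733, 720, 728,
          731, 735, 724, 724, 734, 736, 732, 709, 734, 732,
          714, 695, 720, 735, 741, 736, 777, 660, 712, 709]

mu0, n = 732, len(sample)
s = st.stdev(sample)                # sample std dev (n - 1 divisor)
se = s / sqrt(n)
t_crit = 2.0227                     # two-tailed 5% t critical value, df = 39
low, high = mu0 - t_crit * se, mu0 + t_crit * se
xbar = st.mean(sample)
decision = "cannot reject" if low <= xbar <= high else "reject"
print(xbar, s, (low, high), decision)
```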
H0: population median = 732
We do not know anything about the underlying population
Test H0 using a 5% chance of Type 1 error (two tails)

test statistic
5% critical values:   low         high
decision ("reject" or "cannot reject")
Sample (n = 40); Problems 4-5-6 all use this same sample (listed under Problem 4)
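Problem 6 can be run as a sign test: drop the observations tied with 732, then compare the count of values above 732 against a Binomial(n, 1/2) reference. A sketch:

```python
from math import comb

sample = [702, 734, 735, 722, 732, 732, 733, 740, 715, 699,
          726, 710, 725, 730, 735, 721, 738, 733, 720, 728,
          731, 735, 724, 724, 734, 736, 732, 709, 734, 732,
          714, 695, 720, 735, 741, 736, 777, 660, 712, 709]

median0 = 732
above = sum(v > median0 for v in sample)
below = sum(v < median0 for v in sample)
n = above + below                   # ties at 732 are dropped

# Two-sided sign-test p-value: 2 * P(Binomial(n, 1/2) <= min(above, below))
k = min(above, below)
p_value = 2 * sum(comb(n, j) for j in range(k + 1)) / 2 ** n
decision = "cannot reject" if p_value > 0.05 else "reject"
print(above, below, p_value, decision)
```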
Using the x and y data below, run a linear regression to find:
intercept
slope
s2
std dev of slope estimator
p-value for testing H0: slope = 0 using 2 tails

x      y
4234   256.9
3811   211.6
3430   238.1
3087   211.8
2778   194.1
2500   124.5
2250   187.3
2025   110.5
1823   233.1
1641   150.3
1477   124.7
1329   41.2
1196   182.1
1076   118.1
968    31.9
871    114.3
784    144.9
706    59.7
635    126.9
572    43.9
515    136.3
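A sketch of the least-squares computations; the t statistic at the end would be compared to a t distribution with n − 2 = 19 degrees of freedom to get the two-tailed p-value:

```python
from math import sqrt

x = [4234, 3811, 3430, 3087, 2778, 2500, 2250, 2025, 1823, 1641, 1477,
     1329, 1196, 1076, 968, 871, 784, 706, 635, 572, 515]
y = [256.9, 211.6, 238.1, 211.8, 194.1, 124.5, 187.3, 110.5, 233.1, 150.3,
     124.7, 41.2, 182.1, 118.1, 31.9, 114.3, 144.9, 59.7, 126.9, 43.9, 136.3]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = ybar - slope * xbar
resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
s2 = sum(e ** 2 for e in resid) / (n - 2)   # error variance estimate
se_slope = sqrt(s2 / sxx)                   # std dev of slope estimator
t = slope / se_slope                        # compare to t with n - 2 df
print(intercept, slope, s2, se_slope, t)
```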
A speedboat has two engines, A and B.
The joint density function of the lifetimes of the two engines, measured in hours, is:
f(A, B) = .125 (A + B)
defined on the square: 0 < A < 2 and 0 < B < 2.
Calculate the probability that the speedboat fails during its first 78 minutes (1.3 hours) of operation, according to each of the following:
The boat fails upon either engine failing:
The boat fails only when both engines have failed:
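Both answers reduce to double integrals of f over sub-squares: "both engines fail" integrates over [0, 1.3]², while "either fails" is one minus the integral over [1.3, 2]² (both engines surviving). A numeric sketch; since f is linear, the midpoint rule is essentially exact here:

```python
def prob_box(f, a1, b1, a2, b2, steps=400):
    """Midpoint-rule double integral of density f over [a1,b1] x [a2,b2]."""
    da, db = (b1 - a1) / steps, (b2 - a2) / steps
    total = 0.0
    for i in range(steps):
        a = a1 + (i + 0.5) * da
        for j in range(steps):
            b = a2 + (j + 0.5) * db
            total += f(a, b) * da * db
    return total

f = lambda a, b: 0.125 * (a + b)
t = 78 / 60                                     # 78 minutes = 1.3 hours
p_both_fail = prob_box(f, 0, t, 0, t)           # both lifetimes < 1.3
p_either_fails = 1 - prob_box(f, t, 2, t, 2)    # not (both survive past 1.3)
print(p_either_fails, p_both_fail)
```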
H0: the source distribution is Beta with parameters r = s = 3; in other words f(x) = 30 * x^2 * (1-x)^2 and F(x) = 30 * (x^3/3 - x^4/2 + x^5/5)
Using the strata, and data, shown below, test H0 using a 5% chance of committing a Type I error

test statistic
critical value
decision ("reject" or "cannot reject")
Strata
.0 to .2
.2 to .325
.325 to .425
.425 to .5
.5 to .575
.575 to .675
.675 to .8
.8 to 1
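The expected count in each stratum comes from differencing F at the stratum edges. A sketch of that step (14.067 is the 5% chi-square critical value for 7 degrees of freedom: 8 strata, no estimated parameters):

```python
def F(x):
    """Hypothesized Beta(3,3) CDF: F(x) = 30 * (x^3/3 - x^4/2 + x^5/5)."""
    return 30 * (x ** 3 / 3 - x ** 4 / 2 + x ** 5 / 5)

edges = [0, .2, .325, .425, .5, .575, .675, .8, 1]
n = 100
expected = [n * (F(b) - F(a)) for a, b in zip(edges, edges[1:])]
# With observed counts O_i tallied from the data below, the test statistic
# is sum((O_i - E_i)^2 / E_i), compared to the chi-square critical value
# 14.067 (7 df).
print(expected)
```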
Data (n=100), sorted values Y1 … Y100:
0.0770 0.1135 0.1364 0.1542 0.1692 0.1824 0.1943 0.2052 0.2153 0.2248
0.2338 0.2423 0.2505 0.2584 0.2659 0.2733 0.2803 0.2872 0.2940 0.3005
0.3069 0.3132 0.3194 0.3254 0.3313 0.3372 0.3430 0.3486 0.3542 0.3598
0.3653 0.3707 0.3761 0.3814 0.3866 0.3919 0.3971 0.4022 0.4074 0.4125
0.4176 0.4226 0.4276 0.4327 0.4377 0.4426 0.4476 0.4526 0.4576 0.4625
0.4675 0.4724 0.4774 0.4824 0.4874 0.4923 0.4973 0.5024 0.5074 0.5124
0.5175 0.5226 0.5278 0.5329 0.5381 0.5434 0.5486 0.5539 0.5593 0.5647
0.5702 0.5758 0.5814 0.5870 0.5928 0.5987 0.6046 0.6106 0.6168 0.6231
0.6295 0.6360 0.6428 0.6497 0.6567 0.6641 0.6716 0.6795 0.6877 0.6962
0.7052 0.7147 0.7248 0.7357 0.7476 0.7608 0.7758 0.7936 0.8165 0.8530
The dealer has a deck of cards that DOES contain a joker.
The joker has no suit, ie it is not a spade, not a heart, not a diamond, and not a club.
Thus there are 53 cards in the deck.
You are dealt 5 cards, completely randomly.
Complete the table:

Hand          P(Hand)    further description of the hand
4-of-a-kind              4 aces and any other card (the other could be the joker)
straight                 A-2-3-4-5 or 2-3-4-5-6 or … or 9-10-J-Q-K or 10-J-Q-K-A
flush                    5 cards all of the same suit; for example 5 spades; the joker cannot be 1 of the 5 cards
joker hand               the 1 joker and any other 4 cards
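A sketch for two of the rows; the straight and flush rows additionally need the rank/suit bookkeeping described in the table:

```python
from math import comb

total = comb(53, 5)                  # 5-card hands from the 53-card deck

p_four_aces = 49 / total             # the 4 aces plus any 1 of the other 49 cards
p_joker_hand = comb(52, 4) / total   # the joker plus any 4 of the other 52 cards
print(p_four_aces, p_joker_hand)
```

Note that the joker-hand probability simplifies to 5/53: each of the 53 cards is equally likely to be among the 5 dealt.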
We have a coin.
Our prior distribution, for the probability of tails with this coin, is:
f(p) = 2 - 2p
We toss the coin, and we get tails.
Under the resulting posterior distribution, complete the table:

p       0.25    0.5     0.75
f(p)    1.125   1.5     1.125

You must integrate p(2-2p) dp from 0 to 1, obtaining 1/3
1/3 is the denominator in the posterior distribution
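A quick check of the table values: the posterior is the numerator p(2 − 2p) divided by the 1/3 normalizer, i.e. 6p(1 − p):

```python
# Posterior after one tail, with prior 2 - 2p: numerator p * (2 - 2p),
# normalizing constant 1/3, so the posterior density is 6 * p * (1 - p).
def posterior(p):
    return p * (2 - 2 * p) / (1 / 3)

for p in (0.25, 0.5, 0.75):
    print(p, posterior(p))
```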
We have the density function:
fT(t) = 9.4e^(-9.4t) + .79 * (3e^(-3t) - 9.4e^(-9.4t))      over the range 0 < t < ∞
Find the following:
E(T) = 0.28567
Var(T) = 0.09870
The easy way (and that's what you see above) is to notice that we have a weighted average of two exponential distributions:

parameter λ    9.4    3
weight         21%    79%

The exponential distribution has expectation 1/λ and variance 1/λ^2.
E(T) is the weighted average of the two expectations; for Var(T), take the weighted average of the second moments 2/λ^2 and then subtract E(T)^2 (variances themselves do not average).
The hard way is to use calculus.
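A sketch recomputing the mixture moments from the 21%/79% weights, using E(T²) = 2/λ² for each exponential component:

```python
# Moments of the two-component exponential mixture
# f(t) = 0.21 * 9.4 exp(-9.4 t) + 0.79 * 3 exp(-3 t).
weights, rates = [0.21, 0.79], [9.4, 3.0]

mean = sum(w / lam for w, lam in zip(weights, rates))
second_moment = sum(w * 2 / lam ** 2 for w, lam in zip(weights, rates))
variance = second_moment - mean ** 2
print(mean, variance)
```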
Your parameter is P(Tails) with a $20 gold piece that you found in an old trunk.
Your prior is U(0,1) which of course gives E(P) = .5
You create your sample by tossing the coin 8 times.
After 1 toss you determine your first posterior distribution. Then you toss again, allowing you to further refine your idea of the posterior distribution.
And so on, so that after each toss you have a new edition of the posterior distribution.
Define Ei(P) to be E(P) using the posterior distribution based on the first i tosses.
Complete the table:

i           0       1       2       3         4           5           6           7           8
tossi               T       H       T         H           T           T           T           T
Ei(P)       0.5000  0.6667  0.5000  0.6000    0.5000      0.5714      0.6250      0.6667      0.7000

cumulative counts:
T           0       1       1       2         2           3           4           5           6
H           0       0       1       1         2           2           2           2           2

posterior after each toss:
f(x)        1       x       x(1-x)  x^2(1-x)  x^2(1-x)^2  x^3(1-x)^2  x^4(1-x)^2  x^5(1-x)^2  x^6(1-x)^2
multiplier  1       2       6       12        30          60          105         168         252

(Computing E(x) from each normalized posterior reproduces the Ei(P) row above.)
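The whole table follows from one update rule: after t tails and h heads the posterior is Beta(t+1, h+1), whose mean is (t+1)/(t+h+2). A sketch:

```python
# Sequential posterior means for P(tails) under a uniform prior.
tosses = "THTHTTTT"
t = h = 0
means = [0.5]                       # E0(P), from the prior
for result in tosses:
    if result == "T":
        t += 1
    else:
        h += 1
    means.append((t + 1) / (t + h + 2))   # mean of Beta(t+1, h+1)

print([round(m, 4) for m in means])
```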
Draw 12 times from U(-6, 6); refer to the values drawn as Y1, Y2, …, Y12
Consider the following estimators for the mean of the density (which is in fact zero): Ῡ, Y1, and (Y1+Y12)/2
Complete the table:

estimator     Ῡ    Y1    (Y1+Y12)/2
expectation   0    0     0
variance      1    12    6

The variance of a draw from U(0,1) is 1/12, so a draw from U(-6,6) has variance 12^2/12 = 12.
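The three variances follow from Var(single draw) = 12 and the usual rules for combining independent draws. A sketch:

```python
# Var(U(a, b)) = (b - a)^2 / 12, so a single draw from U(-6, 6) has variance 12.
var_draw = (6 - (-6)) ** 2 / 12

var_mean = var_draw / 12        # Var(Ybar): mean of 12 iid draws
var_first = var_draw            # Var(Y1): just one draw
var_pair = 2 * var_draw / 4     # Var((Y1 + Y12) / 2): two iid draws, halved
print(var_mean, var_first, var_pair)
```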
We wish to approximate the distribution of Ῡ from problem 4.
We do this with a normal distribution. Using the method of moments, what parameter values do we assign to the normal distribution?

μ = 0
σ = 1
How large a sample must be taken from the triangular density
in order for the sample mean to have a 90% probability
of being in the range [.9, 1.1]
The triangular density is:
y = x      for x from 0 to 1
y = 2 - x  for x from 1 to 2
You can assume that the CLT applies.

Required sample size is n = 46

With calculus, you should find:
E(Y) = 1
Var(Y) = 1/6 ≈ 0.16667
Std dev(Y) ≈ 0.40825
So we will use N(1, .40825/√n) to approximate the density of Ῡ
The z-score we need is 1.644854
That z, times .40825/√n, must equal .1
Solving, n = 45.09, which we round up to n = 46
With coin tossing, our prior estimate for the distribution of P is U(0,1).
Our experimental result, from 4 tosses, is HTTH
Determine the posterior distribution, and use it to calculate:
Probability that .4 < P < .6 = 0.3651
Variance of P = 0.0357
Repeat, but this time using 20 heads and 20 tails.
Probability that .4 < P < .6 = 0.8291
Variance of P = 0.0058
For the 4-toss case:
The prior is Beta(1, 1)
The posterior density is Beta(3, 3) = 5!/(2! * 2!) * p^2 * (1-p)^2 = 30 * p^2 * (1-p)^2
Integrate the density to find:
p      F(p)
0.4    0.31744
0.6    0.68256
The variance, from the Beta-distribution formula on Wikipedia: 9/(36 * 7) ≈ 0.0357

For the 40-toss case:
The integration of p^20 * (1-p)^20 must be done by number crunching
The variance, from the Beta-distribution formula on Wikipedia: 441/(1764 * 43) ≈ 0.005814
r = 21 (20 failures)        s = 21 (20 successes)
This is the density of p conditioned on having seen r-1 failures and s-1 successes out of r+s-2 Bernoulli trials (p is probability of success)
Non-integral r and s can be handled by the gamma function.
Below uses factorials; only valid for integral r and s.
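A quick numeric check of the Beta(3, 3) CDF values and the variance formula quoted above:

```python
# F(p) for Beta(3,3) is 30 * (p^3/3 - p^4/2 + p^5/5); the variance of
# Beta(r, s) is r*s / ((r+s)^2 * (r+s+1)).
def beta33_cdf(p):
    return 30 * (p ** 3 / 3 - p ** 4 / 2 + p ** 5 / 5)

def beta_var(r, s):
    return r * s / ((r + s) ** 2 * (r + s + 1))

print(beta33_cdf(0.6) - beta33_cdf(0.4))   # P(.4 < P < .6), 4-toss case
print(beta_var(3, 3), beta_var(21, 21))
```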
[Chart: the Beta(21,21) posterior density f(p) on 0 ≤ p ≤ 1, peaking at f(.5) ≈ 5.14; the area between .4 and .6 gives P(.4 < p < .6) = 0.829054]
A Bayesian PROBABILITY problem, as opposed to the more difficult Bayesian ESTIMATION problems
Josh takes a twenty-question multiple-choice exam where each question has five possible answers. Some of the answers he knows and gets right, while others he gets right just by making lucky guesses. Suppose that the conditional probability of his knowing the answer to a randomly selected question, given that he got it right, is 0.92. How many of the twenty questions was he prepared for?

We can organize things as follows, where n is the value we seek:

         prepared    not prepared
right    n           (20-n)(.2)
wrong    0           (20-n)(.8)

We know that .92 = n / (n + (20-n)(.2))
From there we can solve for n, obtaining 13.93939
n should be integral, so we round up to 14
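A sketch of the algebra, solving .92 = n / (n + (20 − n)·.2) in closed form:

```python
# .92 * (n + .2 * (20 - n)) = n  =>  .92 * .8 * n + 3.68 = n
# =>  n = 3.68 / (1 - .736) = 3.68 / .264
n = 0.92 * 0.2 * 20 / (1 - 0.92 * 0.8)
print(n, round(n))
```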