If $X_i\sim U(0,\theta)$, then there exists a UMP test for $H_0:\theta=\theta_0$ vs $H_1:\theta>\theta_0$
Let $X_{1},\dots,X_{n} \sim \mathrm{Unif}(0,\theta)$ independently. Show that there exists a UMP size-$\alpha$ test for testing $H_{0}:\theta=\theta_{0}$ vs $H_{1}:\theta>\theta_{0}$.
My attempt:
I attempted to show that the family has a monotone (non-decreasing) likelihood ratio by computing $$\frac{f(X_{1},\dots, X_{n}\mid\theta_{1})}{f(X_{1},\dots, X_{n}\mid\theta_{2})}$$
This computation gave me: $$\frac{f(X_{1},\dots, X_{n}\mid\theta_{1})}{f(X_{1},\dots ,X_{n}\mid\theta_{2})}=\frac{\theta_{2}^{n}\,I(X_{(n)}<\theta_{1})}{\theta_{1}^{n}\,I(X_{(n)}<\theta_{2})}$$ This already confuses me, because I'm not familiar with what happens when dividing indicator functions. Moreover, this does not seem to be a non-decreasing function of the test statistic $t(x)=X_{(n)}$, which, according to my notes, means we cannot apply the Karlin–Rubin theorem to find a UMP test. Since I get stuck here, I can neither prove that a UMP size-$\alpha$ test exists nor find its form.
Question: What is going wrong in my approach above, and how can I solve this question?
Thanks!
statistics probability-distributions statistical-inference uniform-distribution hypothesis-testing
See math.stackexchange.com/questions/1736322/….
– StubbornAtom
Dec 3 '18 at 18:41
edited Dec 4 '18 at 15:09
StubbornAtom
asked Dec 3 '18 at 15:47
S. Crim
1 Answer
Just to elaborate on the division of indicator functions:
To test $H_0:\theta=\theta_0$ against $H_1:\theta=\theta_1\,(>\theta_0)$ using the Neyman–Pearson lemma, note that
the likelihood ratio $\lambda$ is of the form
\begin{align}
\lambda(x_1,\ldots,x_n)&=\frac{f_{H_1}(x_1,\ldots,x_n)}{f_{H_0}(x_1,\ldots,x_n)}
\\&=\left(\frac{\theta_0}{\theta_1}\right)^n\frac{\mathbf 1_{\{x_{(n)}<\theta_1\}}}{\mathbf 1_{\{x_{(n)}<\theta_0\}}}
\\&=\begin{cases}\left(\dfrac{\theta_0}{\theta_1}\right)^n&\text{ if }0<x_{(n)}<\theta_0\\ \infty&\text{ if }\theta_0< x_{(n)}<\theta_1\end{cases}
\end{align}
It follows from here that $\lambda$ is a monotone non-decreasing function of $x_{(n)}$.
By the Neyman–Pearson lemma, an MP test of size $\alpha$ is given by $$\varphi(x_1,\ldots,x_n)=\mathbf 1_{\{\lambda(x_1,\ldots,x_n)>k\}},$$ where $k$ is chosen so that $$E_{H_0}\varphi(X_1,\ldots,X_n)=\alpha.$$
Now you can proceed with your proof.
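To sketch how the argument can be finished (this is the standard size computation for the uniform maximum): since $\lambda$ only takes the values $(\theta_0/\theta_1)^n$ and $\infty$, any size-$\alpha$ test that rejects whenever $x_{(n)}>\theta_0$ is MP. One such test rejects when the maximum exceeds a cutoff $c$:
$$\varphi(x_1,\ldots,x_n)=\mathbf 1_{\{x_{(n)}>c\}},\qquad E_{\theta_0}\varphi=P_{\theta_0}\!\left(X_{(n)}>c\right)=1-\left(\frac{c}{\theta_0}\right)^n=\alpha\ \Longrightarrow\ c=\theta_0(1-\alpha)^{1/n}.$$
Because $c$ does not depend on $\theta_1$, this single test is MP against every alternative $\theta_1>\theta_0$ simultaneously, hence UMP.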
Could you go into more detail on why we can consider the function $\lambda (x_{1}, \dots, x_{n})$ a monotone non-decreasing function of $x_{(n)}$? I know the definition of such a function, but am kind of lost as to how it follows directly from this representation of $\lambda$. Thanks!
– S. Crim
Dec 5 '18 at 7:12
@S.Crim Roughly, plotting $\lambda$ against $x_{(n)}$, we see that it takes a constant value when $0<x_{(n)}<\theta_0$ and increases without bound when $\theta_0<x_{(n)}<\theta_1$.
– StubbornAtom
Dec 5 '18 at 12:56
answered Dec 3 '18 at 19:39
StubbornAtom
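As a quick numerical sanity check of the cutoff test $X_{(n)}>\theta_0(1-\alpha)^{1/n}$, one can simulate its rejection rate under the null and under an alternative (a sketch; the helper names `reject` and `rejection_rate` are mine, not from the thread):

```python
import random

def reject(sample, theta0, alpha):
    """Reject H0: theta = theta0 when the sample maximum exceeds the cutoff
    c = theta0 * (1 - alpha)**(1/n), which gives the test exact size alpha."""
    n = len(sample)
    c = theta0 * (1 - alpha) ** (1 / n)
    return max(sample) > c

def rejection_rate(theta, theta0, alpha, n=10, reps=100_000, seed=0):
    """Monte Carlo estimate of P_theta(reject) for n iid Unif(0, theta) draws."""
    rng = random.Random(seed)
    hits = sum(
        reject([rng.uniform(0, theta) for _ in range(n)], theta0, alpha)
        for _ in range(reps)
    )
    return hits / reps

size = rejection_rate(theta=1.0, theta0=1.0, alpha=0.05)   # under H0: close to 0.05
power = rejection_rate(theta=1.2, theta0=1.0, alpha=0.05)  # under H1: noticeably larger
print(size, power)
```

The first rate should hover near $\alpha=0.05$, while the second should be close to the exact power $1-(c/1.2)^{10}$ with $c=(0.95)^{1/10}$.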