Relation between independence and correlation of uniform random variables
My question is fairly simple: let $X$ and $Y$ be two uncorrelated uniform random variables on $[-1,1]$. Are they independent?
I was under the impression that two uncorrelated random variables are only necessarily independent if their joint distribution is normal; however, I can't come up with a counterexample to disprove the claim I ask about. Either a counterexample or a proof would be greatly appreciated.
Tags: correlation, independence, uniform
asked 2 hours ago by Peiffap
1 Answer
Independent implies uncorrelated, but the implication doesn't go the other way.
Uncorrelated implies independent only under certain conditions; e.g., if you have a bivariate normal, then uncorrelated does imply independent (as you said).
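(As a quick reminder of why the normal case is special: setting $\rho = 0$ in the bivariate normal density makes it factor into the product of the two normal marginals, which is exactly independence.)
$$f(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\!\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(x-\mu_X)^2}{\sigma_X^2} - \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y} + \frac{(y-\mu_Y)^2}{\sigma_Y^2}\right]\right) \;\xrightarrow{\;\rho=0\;}\; f_X(x)\,f_Y(y).$$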
It is easy to construct bivariate distributions with uniform margins where the variables are uncorrelated but not independent. Here are a few examples:

1. Consider an additional random variable $B$, independent of $X$, which takes the values $\pm 1$ each with probability $\frac12$. Then let $Y = BX$.
2. Take the bivariate distribution of two independent uniforms and slice it into 4 equal-size sections on each margin (yielding $4\times 4 = 16$ pieces, each of size $\frac12\times\frac12$). Now take all the probability from the 4 corner pieces and the 4 center pieces and spread it evenly over the other 8 pieces.
3. Let $Y = 2|X| - 1$.

In each case, the variables are uncorrelated but not independent (e.g. if $X = 1$, what is $P(-0.1 < Y < 0.1)$?).
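For instance, a quick check that the first and third constructions are uncorrelated, using $E[X] = 0$, $E[B] = 0$, and the independence of $B$ and $X$:
$$\operatorname{Cov}(X, BX) = E[BX^2] - E[X]\,E[BX] = E[B]\,E[X^2] - 0 = 0,$$
$$\operatorname{Cov}(X, 2|X|-1) = 2\,E[X|X|] - E[X]\,E[2|X|-1] = 0 - 0 = 0,$$
the latter because $X|X|$ is an odd function of the symmetric variable $X$.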
If you specify some particular family of bivariate distributions with uniform margins, it might be that within that family the only uncorrelated member is the independent one. Then, under that condition, being uncorrelated would imply independence -- but you haven't said anything about the bivariate distribution, only about the marginal distributions.
For example, if you restrict your attention to, say, the Gaussian copula, then I think the only uncorrelated one has independent margins; you can readily rescale that so that each margin is on $(-1,1)$.
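For concreteness, here is a minimal sketch of that rescaled Gaussian-copula family (this assumes the MASS package for the multivariate normal draws; rho is the correlation of the underlying normals, so rho = 0 is the uncorrelated member):
library(MASS)                      # for mvrnorm (assumed available)
n   <- 100000
rho <- 0                           # rho = 0: the uncorrelated member of the family
z   <- mvrnorm(n, mu = c(0, 0),
               Sigma = matrix(c(1, rho, rho, 1), 2, 2))
u   <- 2 * pnorm(z) - 1            # probability-integral transform, rescaled to (-1, 1)
cor(u[, 1], u[, 2])                # near 0, and here the margins are also independent
With rho = 0 the Gaussian copula reduces to the independence copula, so within this particular family uncorrelated does coincide with independent, as suggested above.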
Some R code for sampling from and plotting these bivariates (not necessarily efficiently):
n <- 100000

# X ~ Uniform(-1, 1) and an independent random sign B = +/- 1
x <- runif(n, -1, 1)
b <- rbinom(n, 1, .5) * 2 - 1

# Example 1: Y1 = B*X is uniform on (-1, 1) and uncorrelated with X, but |Y1| = |X|
y1 <- b * x

# Example 2: the 16-piece construction -- middle band when |X| > 1/2,
# outer bands (with random sign) when |X| < 1/2
y2 <- ifelse(0.5 < abs(x) & abs(x) < 1,
             runif(n, -.5, .5),
             runif(n, 0.5, 1) * b)

# Example 3: Y3 = 2|X| - 1 is a deterministic, yet uncorrelated, function of X
y3 <- 2 * abs(x) - 1

# Scatterplots of the three examples side by side
par(mfrow = c(1, 3))
plot(x, y1, pch = 16, cex = .3, col = rgb(.5, .5, .5, .5))
plot(x, y2, pch = 16, cex = .5, col = rgb(.5, .5, .5, .5))
abline(h = c(-1, -.5, 0, .5, 1), col = 4, lty = 3)
abline(v = c(-1, -.5, 0, .5, 1), col = 4, lty = 3)
plot(x, y3, pch = 16, cex = .3, col = rgb(.5, .5, .5, .5))
(In this formulation, $(Y_2, Y_3)$ gives a fourth example)
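A quick numerical check on the simulated draws above (the sample correlations with $X$ should all be near zero, even though each $Y_i$ is clearly dependent on $X$):
round(c(cor(x, y1), cor(x, y2), cor(x, y3)), 3)   # all approximately 0
cor(abs(x), y3)                                   # essentially 1: y3 is a linear function of |x|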
answered 1 hour ago (edited 40 mins ago) by Glen_b♦
Thank you. I'm struggling to see why the examples you provided still guarantee that $Y$ is uniformly distributed on $[-1, 1]$, though. – Peiffap, 1 hour ago

Do the plots of the bivariate densities help? In each case the shaded parts are all of constant density. – Glen_b♦, 1 hour ago

They make it visually clearer, yes. Thank you, again. – Peiffap, 1 hour ago