Relation between independence and correlation of uniform random variables



My question is fairly simple: let $X$ and $Y$ be two uncorrelated uniform random variables on $[-1,1]$. Are they independent?



I was under the impression that two uncorrelated random variables are necessarily independent only if their joint distribution is normal; however, I can't come up with a counterexample to disprove the claim I ask about. Either a counterexample or a proof would be greatly appreciated.










Tags: correlation, independence, uniform

asked 2 hours ago by Peiffap
          1 Answer



















          Independent implies uncorrelated, but the implication doesn't go the other way.



          Uncorrelated implies independence only under certain conditions; e.g., if you have a bivariate normal distribution, it is the case that uncorrelated implies independent (as you said).



          It is easy to construct bivariate distributions with uniform margins where the variables are uncorrelated but are not independent. Here are a few examples:



          1. Consider an additional random variable $B$ which takes the values $\pm 1$ each with probability $\frac12$, independent of $X$. Then let $Y=BX$.


          2. Take the bivariate distribution of two independent uniforms and slice it into 4 equal-size sections on each margin (yielding $4\times 4=16$ pieces, each of size $\frac12\times\frac12$). Now take all the probability from the 4 corner pieces and the 4 center pieces and put it evenly into the other 8 pieces.


          3. Let $Y = 2|X|-1$.


          In each case, the variables are uncorrelated but not independent (e.g. if $X=1$, what is $P(-0.1<Y<0.1)$?).
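
          For example 1 in particular, both properties are quick to verify directly, using $E[X]=0$ and $E[X^2]=\frac13$ for $X$ uniform on $[-1,1]$, together with the independence of $B$ and $X$:

          $$\operatorname{Cov}(X,Y) = E[XY]-E[X]\,E[Y] = E[BX^2] = E[B]\,E[X^2] = 0\cdot\tfrac13 = 0,$$

          so $X$ and $Y=BX$ are uncorrelated, and $Y$ is again uniform on $[-1,1]$ because $X$ is symmetric about $0$. Yet they are not independent: $P(-0.1<Y<0.1 \mid X=1)=0$, while unconditionally $P(-0.1<Y<0.1)=0.1$.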



          [Plot of the bivariate distribution for each case]



          If you specify some particular family of bivariate distributions with uniform margins, it might be possible that, under that formulation, the only uncorrelated one is the independent one. Then, under that condition, being uncorrelated would imply independence; but you haven't said anything about the bivariate distribution, only about the marginal distributions, so the broader answer applies to the question as asked.



          For example, if you restrict your attention to, say, the Gaussian copula, then I think the only uncorrelated one has independent margins; you can readily rescale that so that each margin is on $(-1,1)$.
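
          For instance, here is a minimal base-R sketch of that rescaling (a hand-rolled bivariate normal pushed through the probability-integral transform; the parameter $\rho = 0$ gives the independent case):

          n   <- 100000
          rho <- 0                                      # Gaussian-copula correlation parameter
          z1  <- rnorm(n)
          z2  <- rho * z1 + sqrt(1 - rho^2) * rnorm(n)  # bivariate normal with correlation rho
          u   <- 2 * pnorm(z1) - 1                      # margins rescaled to be uniform on (-1, 1)
          v   <- 2 * pnorm(z2) - 1
          cor(u, v)                                     # approximately 0 when rho = 0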




          Some R code for sampling from and plotting these bivariates (not necessarily efficiently):



          n  <- 100000
          x  <- runif(n, -1, 1)                     # X ~ U(-1, 1)
          b  <- rbinom(n, 1, .5) * 2 - 1            # B = +/-1 with probability 1/2 each
          y1 <- b * x                               # example 1: Y = BX
          y2 <- ifelse(0.5 < abs(x) & abs(x) < 1,   # example 2: outer columns get the middle band,
                       runif(n, -.5, .5),           #            inner columns get the outer bands
                       runif(n, 0.5, 1) * b)
          y3 <- 2 * abs(x) - 1                      # example 3: Y = 2|X| - 1

          par(mfrow = c(1, 3))
          plot(x, y1, pch = 16, cex = .3, col = rgb(.5, .5, .5, .5))
          plot(x, y2, pch = 16, cex = .5, col = rgb(.5, .5, .5, .5))
          abline(h = c(-1, -.5, 0, .5, 1), col = 4, lty = 3)
          abline(v = c(-1, -.5, 0, .5, 1), col = 4, lty = 3)
          plot(x, y3, pch = 16, cex = .3, col = rgb(.5, .5, .5, .5))


          (In this formulation, $(Y_2, Y_3)$ gives a fourth example.)
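
          As a quick numerical check of the claims above (reusing the simulated x, y1, y2 and y3 from the code just given), the sample correlations with $X$ should all be near zero and each margin should look approximately uniform on $(-1,1)$:

          # sample correlations with x: all close to 0
          round(c(cor(x, y1), cor(x, y2), cor(x, y3)), 3)

          # Kolmogorov-Smirnov tests of each margin against U(-1, 1)
          ks.test(y1, "punif", min = -1, max = 1)
          ks.test(y2, "punif", min = -1, max = 1)
          ks.test(y3, "punif", min = -1, max = 1)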






          answered 2 hours ago by Glen_b (edited 2 mins ago)












          • Thank you. I'm struggling to see why the examples you provided still guarantee that $Y$ is uniformly distributed on $[-1, 1]$, though. – Peiffap, 1 hour ago











          • Do the plots of the bivariate densities help? In each case the shaded parts are all of constant density. – Glen_b, 1 hour ago











          • They make it visually clearer, yes. Thank you, again. – Peiffap, 1 hour ago









