independent, identically distributed (IID) random variables












I am having trouble understanding IID random variables. I've tried reading http://scipp.ucsc.edu/~haber/ph116C/iid.pdf, http://www.math.ntu.edu.tw/~hchen/teaching/StatInference/notes/lecture32.pdf, and http://www-inst.eecs.berkeley.edu/%7Ecs70/sp13/notes/n17.sp13.pdf but I don't get it.



Would someone explain in simple terms what IID random variables are and give me an example?










probability probability-theory random-variables independence






asked Aug 13 '13 at 21:35 by Frank Epps
edited Nov 12 '15 at 9:25 by BCLC












  • Say you are sampling from a known distribution with replacement. Each draw you make is a random variable. The drawn samples are independent of each other, and the distribution never changes. Thus, IID random variables.
    – Eleven-Eleven
    Aug 13 '13 at 21:41


















3 Answers














I'm sure you know that iid means independent, identically distributed. I think the most prominent example is a coin toss repeated several times.

If $X_1, X_2, \dots$ denote the results of the 1st, 2nd, and subsequent tosses (where $X_i = 1$ means the $i$-th toss came up heads and $X_i = 0$ means tails), then $X_1, X_2, \dots$ are iid.

They are independent, since every time you flip the coin the previous results don't influence the current result. (There is a precise mathematical definition of independence, but I don't think it is necessary at the moment.)

They are identically distributed, since every time you flip the coin the chances of getting heads (or tails) are the same, no matter whether it's the 1st or the 100th toss; the probability distribution is identical over time. If the coin is "fair", the chance is 0.5 for each outcome (heads or tails).

Does that help?
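
To make this concrete, here is a small simulation sketch (plain Python using only the standard library; it illustrates the idea above and is not part of the original answer). Each flip is drawn from the same Bernoulli(0.5) distribution and pays no attention to earlier flips, which is exactly the iid assumption:

import random

def iid_coin_flips(n, p=0.5, seed=42):
    """Return n iid Bernoulli(p) flips: 1 = heads, 0 = tails."""
    rng = random.Random(seed)
    # identically distributed: every flip uses the same p
    # independent: no flip looks at the previous results
    return [1 if rng.random() < p else 0 for _ in range(n)]

flips = iid_coin_flips(10_000)
print(sum(flips) / len(flips))   # close to 0.5, the common P(X_i = 1)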






answered Aug 13 '13 at 21:41 by Dima McGreen
edited Aug 13 '13 at 23:41

  • An example of an identically distributed but dependent situation would help me :).
    – jiggunjer
    Sep 5 '16 at 5:46






    @jiggunjer How about this: Let $X_1$ be the result of flipping a coin, then let $X_{i+1}=X_i$ for all $i$. The $X_i$ are identically distributed - each variable has .5 chance of being 0 and .5 chance of being 1 - but they're as correlated as can possibly be.
    – Steven Stadnicki
    Nov 6 '16 at 2:41
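
The construction in the comment above is easy to simulate; the following is just an illustrative sketch (not part of the thread). Draw one fair flip and copy it, and you get variables that are identically distributed yet completely dependent:

import random

rng = random.Random(0)
x1 = 1 if rng.random() < 0.5 else 0   # X_1 ~ Bernoulli(0.5)
xs = [x1] * 10                        # X_{i+1} = X_i for all i
# Marginally each X_i is Bernoulli(0.5) (identically distributed),
# but X_1 determines every later X_i, so they are not independent.
print(xs)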





















"Independent" means for any $x_i in X$, $P(x_0, x_1,..., x_i) = prod_0^i P(x_i)$



For example, toss 2 dice. Let $X_1$ be the indicator RV of the first being {1, 2}, and let $X_2$ be the indicator RV of the second being {6}. It's intuitive to conclude that $P(X_1=1, X_2 = 1) = P(X_1=1) P(X_2=1) = frac{1}{3} cdot frac{1}{6}$, so are other combinations.



However, synonyms of "identical" include "alike" and "equal". That's, the probability of every variable should be equal, or identical. In the above example $P(X_1) neq P(X_2)$. To make $X_1$ and $X_2$ be indentical variables, we can let $X_2$ be the indicator RV of the second die being {1, 6}. Then we get $P(X_1=1) = P(X_2=1)=frac{1}{3}$.



"Independent but not identical" as shown in the above two instances, the first is this case, since $P(X_1=1, X_2 = 1) = P(X_1=1) P(X_2=1) = frac{1}{3} cdot frac{1}{6}$ but $P(X_1) neq P(X_2)$.



"Identical and independent", the second example is for this case, since $P(X_1=1, X_2 = 1) = P(X_1=1) P(X_2=1)$ and $P(X_1) = P(X_2)$.



"Identical but not independent", let $X_2$ be the indicator RV of the first die being {5, 6}, we can get: $P(X_1=1, X_2 = 1) = 0 neq P(X_1=1) P(X_2=1)$ but $P(X_1) = P(X_2)$.



Refrence: https://math.stackexchange.com/a/994136/351322
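
These cases can also be checked numerically. Below is a rough Monte Carlo sketch (illustrative Python, not from the original answer; the helper names are made up): it rolls the two dice many times and compares the joint frequency with the product of the marginal frequencies.

import random

def simulate(n=200_000, seed=1):
    rng = random.Random(seed)
    rolls = [(rng.randint(1, 6), rng.randint(1, 6)) for _ in range(n)]

    def freq(event):
        return sum(event(d1, d2) for d1, d2 in rolls) / n

    # Independent but not identical: X_1 = [first in {1,2}], X_2 = [second = 6]
    x1 = lambda d1, d2: d1 in (1, 2)
    x2 = lambda d1, d2: d2 == 6
    print(freq(lambda a, b: x1(a, b) and x2(a, b)), freq(x1) * freq(x2))  # both ~ 1/18

    # Identical but not independent: Y_1 = [first in {1,2}], Y_2 = [first in {5,6}]
    y1 = lambda d1, d2: d1 in (1, 2)
    y2 = lambda d1, d2: d1 in (5, 6)
    print(freq(lambda a, b: y1(a, b) and y2(a, b)), freq(y1) * freq(y2))  # ~ 0 vs ~ 1/9

simulate()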






answered Nov 6 '16 at 2:24 by lerner
edited Dec 5 '18 at 12:13 by Community

So basically you consider events where the outcome in one case does not depend on the outcome of the other cases. It is called identical because in every case you consider, the possible outcomes and their probabilities are the same as in the previous event. Someone has already suggested that tossing a coin is a good example. To take a more statistical view: go through a few statistical distributions, such as the normal and the gamma, and look at their additive property. In proving the additive property you first assume the variates are independent, then show, using the MGF of that particular distribution, that the sum of the independent variates follows the same family of distribution, and then you can specialise to the case where the variates are also identically distributed. Hope you got your answer.
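
As a concrete instance of the additive property mentioned here (a sketch, taking the normal distribution as the assumed example): if $X_1$ and $X_2$ are iid $N(\mu, \sigma^2)$, independence lets the MGF of the sum factor, and identical distribution makes the two factors equal,

$$M_{X_1+X_2}(t) = E\left[e^{t(X_1+X_2)}\right] = M_{X_1}(t)\, M_{X_2}(t) = \left(e^{\mu t + \sigma^2 t^2/2}\right)^2 = e^{2\mu t + (2\sigma^2) t^2/2},$$

which is the MGF of $N(2\mu, 2\sigma^2)$, so the sum stays in the normal family.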






answered Mar 11 '16 at 14:26 by Tejas Suresh

  • Welcome to Math.SE. This question has a well-accepted answer. You have provided nothing new, and what you have provided does not exactly answer the question. Please refrain from opening up old questions which have good answers unless you have something significant to contribute. There are many new questions begging for answers.
    – Shailesh
    Mar 11 '16 at 14:46












