Proof for a matrix that the following is valid…











The task is the following:



Let $M$ be a matrix. Show that the minimal number of rows and columns that together contain all $1$’s of $M$ equals the size of the largest set of $1$’s in $M$, such that no two are in the same row or column.
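For instance, both quantities equal $2$ for
$$M = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 0 & 0 \\ 1 & 0 & 0 \end{pmatrix}:$$
row $1$ together with column $1$ contains all five $1$'s, while no single row or column does; and the $1$'s at positions $(1,2)$ and $(2,1)$ share no row or column, while any three such $1$'s would need three distinct rows, which is impossible because rows $2$ and $3$ have $1$'s only in column $1$.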



How can I prove this?










      matrices






asked 2 days ago by plsneedhelp






















          2 Answers

































Let $$R := \{\text{set of rows and columns that together contain all } 1\text{'s of } M\}$$ Let $$S := \{\text{largest set of } 1\text{'s in } M\text{, such that no two are in the same row or column}\}$$ We need at least one row or column to cover each of the $1$'s in $S$ (as each of them is in a different row and column), so $$|R| \geq |S|$$



Claim: There is a combination of rows and columns, created by assigning either a row or a column to each $1$ in $S$ (e.g. the first $1$ in $S$ is contained in a row of $R$, the second in a column of $R$, etc.), that contains each of the $1$'s in $M$.



Suppose no such combination exists; then some $1$ in the matrix is left out by every combination of rows and columns in $R$. This $1$ is not in $S$, since we covered every element of $S$ by assigning it either a row or a column containing it. This contradicts the size of $S$, since we chose $S$ to be the largest set of $1$'s whose row and column indices differ from all the others.
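For instance, on the $3\times 3$ example matrix above, with $S = \{(1,2), (2,1)\}$, assigning the row to the first $1$ and the column to the second gives the combination $\{\text{row } 1, \text{column } 1\}$, which indeed contains every $1$ of $M$.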






– ab123, answered 2 days ago, edited 2 days ago (accepted)























• The proof is not quite correct, I think. The issue is that the mere fact that the additional $1$ (let's call it $x$) is not covered by a set of rows and columns that covers $S$ does not imply that $x$ can be added to $S$, i.e. that it is not in the same row as some other $1$ (let's call it $s$) that is already in $S$. After all: suppose that $R$ is so big that both the row AND the column of $s$ are in $R$; then we could have just chosen to use the column, rather than the row, and thereby leave $x$ uncovered, in spite of it being in the same row as $s$.
            – Vincent
            2 days ago










• @Vincent But I said there does not exist any combination of rows and columns (also, I built $R$ such that it will only contain either a row containing $s$ or a column containing $s$), thus both the row and the column containing $s$ can't exist in $R$, by the definition of $S$ and the construction of $R$.
            – ab123
            2 days ago












• The sentence between brackets is not true as written; we know very little about how you built $R$: for all we know, you might take $R$ to be all rows and all columns. The thing that contains either a row or a column containing $s$ doesn't have a name yet, but let's call it $T$. There are many possible choices for $T$. Some of them will miss some $1$'s in $M$, and your claim is that there is one that doesn't. So the situation you want to rule out is: every choice of a subset $T$ of $R$ that still covers $S$ misses some element $x$, possibly dependent on the choice of $T$. (ctd below)
            – Vincent
            2 days ago












          • (continuation) what you do rule out is this: there is a 'universal' element $x$ that is missed by all possible choices of $T$. While it is true that no such $x$ exists, that is weaker than the statement that there is a $T$ that misses nothing.
            – Vincent
            2 days ago












          • How can $|R|$ be bigger than $|S|$?
            – plsneedhelp
            2 days ago































I edited my answer to make it more informative.



Here is a somewhat unsatisfactory proof, as it reduces this theorem to another one about bipartite graphs without telling you anything about why that theorem is true. (But I'll provide a link where you can look it up.)



Make a bipartite graph: one set of vertices consists of the rows, the other set of the columns, and we draw an edge from vertex 'row $i$' to vertex 'column $j$' if and only if there is a $1$ in entry $(i, j)$ of $M$.



Now $1$'s correspond to edges in the graph, and a collection of $1$'s with no common rows or columns corresponds to a set of edges with no common vertices. Such a collection of edges is called a matching.



          A set of rows and columns that together 'see' all 1's corresponds to a collection of vertices that together see all edges. Such a set of vertices is called a vertex cover.



In every graph we have (for obvious reasons) that the size of the biggest matching is less than or equal to the size of the smallest vertex cover. However, in bipartite graphs (of which the graph we constructed from our matrix $M$ is an example) we have the highly non-obvious fact that these sizes are actually equal. (This is called König's matching theorem, and the proof can be found on page 41 of these lecture notes by Lex Schrijver: https://homepages.cwi.nl/~lex/files/dict.pdf.)



The equality given by König's theorem for the very particular graph coming from our matrix $M$ is exactly the equality you were asking about.
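To make the correspondence concrete, here is a minimal Python sketch (assuming nothing beyond the standard library): it computes the maximum matching of $1$'s with Kuhn's augmenting-path algorithm and, for a small matrix, finds the minimum covering set of rows and columns by brute force. By König's theorem the two numbers always agree.

    from itertools import combinations

    # Example matrix: all 1's lie in row 0 or column 0 (0-indexed).
    M = [[1, 1, 1],
         [1, 0, 0],
         [1, 0, 0]]
    n_rows, n_cols = len(M), len(M[0])

    def max_matching(M):
        """Largest set of 1's with no two in the same row or column (Kuhn's augmenting paths)."""
        match_col = {}  # column j -> row currently matched to it

        def try_row(i, seen):
            # Try to match row i, re-routing previously matched rows if necessary.
            for j in range(n_cols):
                if M[i][j] == 1 and j not in seen:
                    seen.add(j)
                    if j not in match_col or try_row(match_col[j], seen):
                        match_col[j] = i
                        return True
            return False

        return sum(1 for i in range(n_rows) if try_row(i, set()))

    def min_cover(M):
        """Fewest rows and columns that together contain all 1's (brute force, small matrices only)."""
        ones = [(i, j) for i in range(n_rows) for j in range(n_cols) if M[i][j] == 1]
        lines = [('row', i) for i in range(n_rows)] + [('col', j) for j in range(n_cols)]
        for k in range(len(lines) + 1):
            for chosen in combinations(lines, k):
                rows = {x for kind, x in chosen if kind == 'row'}
                cols = {x for kind, x in chosen if kind == 'col'}
                if all(i in rows or j in cols for i, j in ones):
                    return k

    print(max_matching(M), min_cover(M))  # prints "2 2": matching size equals cover size

The brute force is there only as an independent check; on larger matrices one would instead read a minimum vertex cover off the maximum matching via the alternating-path construction used in the proof of König's theorem.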






– Vincent, answered 2 days ago, edited 2 days ago






















