Proof for a matrix that the following is valid…
The task is the following:
Let $M$ be a matrix. Show that the minimal number of rows and columns that together contain all $1$'s of $M$ equals the size of the largest set of $1$'s in $M$ such that no two are in the same row or column.
How can I prove this?
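For example (an illustration of the statement), for
$$M=\begin{pmatrix}1&1&1\\1&0&0\\1&0&0\end{pmatrix}$$
the largest set of $1$'s with no two in a common row or column has size $2$ (take the $1$'s in positions $(1,2)$ and $(2,1)$), and row $1$ together with column $1$ contains every $1$ of $M$, so both quantities equal $2$.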
matrices
asked 2 days ago by plsneedhelp
2 Answers
Accepted answer (ab123, answered 2 days ago, edited 2 days ago):
Let
$$R := \{\text{set of rows and columns that together contain all $1$'s of $M$}\}$$
and let
$$S := \{\text{largest set of $1$'s in $M$ such that no two are in the same row or column}\}.$$
We need at least one row or column to cover each of the $1$'s in $S$ (since no two of them share a row or a column), so
$$|R| \geq |S|.$$
Claim: there is a combination of rows and columns, obtained by assigning either a row or a column to each $1$ in $S$ (e.g. the first $1$ in $S$ is covered by its row, the second by its column, etc.), that contains every $1$ of $M$.
Suppose no such combination exists. Then for every such choice of rows and columns, some $1$ of $M$ is left uncovered. That $1$ is not in $S$, since every $1$ in $S$ is covered by the row or column assigned to it. This contradicts the choice of $S$ as a largest set of $1$'s with no two in the same row or column.
The proof is not quite correct, I think. The issue is that the mere fact that the additional $1$ (call it $x$) is not covered by a set of rows and columns that covers $S$ does not imply that $x$ can be added to $S$, i.e. that it is not in the same row as some other $1$ (call it $s$) that is already in $S$. After all: suppose $R$ is so big that both the row and the column of $s$ are in $R$; then we could have chosen to use the column rather than the row, and thereby leave $x$ uncovered, even though it is in the same row as $s$.
– Vincent
2 days ago
@Vincent but I said there does not exist any combination of rows and columns (also, I built $R$ so that it contains either a row containing $s$ or a column containing $s$, but not both), so the row and the column containing $s$ cannot both be in $R$, by the definition of $S$ and the construction of $R$.
– ab123
2 days ago
The sentence in parentheses is not true as written: we know very little about how you built $R$; for all we know you might take $R$ to be all rows and all columns. The thing that contains either a row or a column containing $s$ doesn't have a name yet, so let's call it $T$. There are many possible choices of $T$. Some of them will miss some $1$'s in $M$, and your claim is that there is one that doesn't. So the situation you want to rule out is: every choice of a subset $T$ of $R$ that still covers $S$ misses some element $x$, possibly depending on the choice of $T$. (ctd below)
– Vincent
2 days ago
(continuation) what you do rule out is this: there is a 'universal' element $x$ that is missed by all possible choices of $T$. While it is true that no such $x$ exists, that is weaker than the statement that there is a $T$ that misses nothing.
– Vincent
2 days ago
How can $|R|$ be bigger than $|S|$?
– plsneedhelp
2 days ago
Answer (Vincent, answered 2 days ago, edited 2 days ago):
I edited my answer to make it more informative.
Here is a somewhat unsatisfactory proof as it reduces this theorem to another one about bipartite graphs without telling you anything about why that theorem is true. (But I'll provide a link where you can look it up.)
Make a bipartite graph: one set of vertices are the rows, the other set of vertices are the columns, and we draw an edge from vertex 'row i' to vertex 'column j' if and only if there is a $1$ in entry $(i, j)$ of $M$.
Now $1$'s correspond to edges of the graph, and a collection of $1$'s with no common rows or columns corresponds to a set of edges with no common vertices. Such a collection of edges is called a matching.
A set of rows and columns that together 'see' all $1$'s corresponds to a collection of vertices that together see all edges. Such a set of vertices is called a vertex cover.
In every graph we have (for obvious reasons) that the size of the biggest matching is less than or equal to the size of the smallest vertex cover. However, in bipartite graphs (of which the graph we constructed from our matrix $M$ is an example) we have the highly non-obvious fact that these sizes are actually equal. (This is called König's matching theorem; a proof can be found on page 41 of these lecture notes by Lex Schrijver: https://homepages.cwi.nl/~lex/files/dict.pdf.)
The equality given by König's theorem for the very particular graph coming from our matrix $M$ is exactly the equality you were asking about.
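The correspondence above is easy to experiment with. Below is a short Python sketch (my own illustration, not taken from the answer): it treats a 0/1 matrix as a bipartite graph, computes a maximum matching with Kuhn's augmenting-path algorithm, and then extracts a minimum row/column cover of the same size using the alternating-path construction from the standard proof of König's theorem. The function names `max_matching` and `min_cover` are ad hoc.

```python
def max_matching(M):
    """Return match_col, where match_col[j] is the row matched to column j
    (or None), for the bipartite graph with an edge (row i, column j)
    whenever M[i][j] == 1. Uses Kuhn's augmenting-path algorithm."""
    n_rows, n_cols = len(M), len(M[0])
    match_col = [None] * n_cols

    def try_augment(i, seen):
        # Try to find an augmenting path starting at row i.
        for j in range(n_cols):
            if M[i][j] == 1 and j not in seen:
                seen.add(j)
                if match_col[j] is None or try_augment(match_col[j], seen):
                    match_col[j] = i
                    return True
        return False

    for i in range(n_rows):
        try_augment(i, set())
    return match_col


def min_cover(M, match_col):
    """König's construction: from a maximum matching, return sets of rows and
    columns that together cover every 1 of M, with |rows| + |cols| equal to
    the size of the matching."""
    n_rows, n_cols = len(M), len(M[0])
    match_row = [None] * n_rows
    for j, i in enumerate(match_col):
        if i is not None:
            match_row[i] = j

    # Alternating search from unmatched rows: reach columns via edges of M,
    # and reach further rows only via matching edges.
    visited_rows = {i for i in range(n_rows) if match_row[i] is None}
    visited_cols = set()
    stack = list(visited_rows)
    while stack:
        i = stack.pop()
        for j in range(n_cols):
            if M[i][j] == 1 and j not in visited_cols:
                visited_cols.add(j)
                i2 = match_col[j]
                if i2 is not None and i2 not in visited_rows:
                    visited_rows.add(i2)
                    stack.append(i2)

    rows = set(range(n_rows)) - visited_rows  # matched rows not reached
    cols = visited_cols                       # columns reached
    return rows, cols


if __name__ == "__main__":
    M = [[1, 1, 1],
         [1, 0, 0],
         [1, 0, 0]]
    match_col = max_matching(M)
    matching_size = sum(1 for i in match_col if i is not None)
    rows, cols = min_cover(M, match_col)
    # Every 1 of M is covered, and the cover has the size of the matching.
    assert all(M[i][j] == 0 or i in rows or j in cols
               for i in range(len(M)) for j in range(len(M[0])))
    assert len(rows) + len(cols) == matching_size
    print("max matching:", matching_size)       # 2
    print("min cover:", len(rows) + len(cols))  # 2
```

Run on the $3\times 3$ example from the question section, it reports a maximum matching of size $2$ and the cover consisting of row $0$ and column $0$ (0-indexed), which is the equality asked about.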