Characterization of zero-dimensional frames via lattices of ideals

My question concerns the left-to-right implication of the following:



Theorem. A frame $L$ is compact and zero-dimensional iff it is isomorphic to the lattice of ideals $\mathcal{I}(B)$ of some Boolean algebra $B$.



A frame $L$ is a complete and bounded lattice satisfying the infinite distributive law $x\wedge \bigvee_{i\in I}a_i=\bigvee_{i\in I} (x\wedge a_i)$ (equivalently, it is a complete Heyting algebra). An element $a\in L$ is complemented iff there is an $a^\ast$ (necessarily unique, by distributivity) such that $a\wedge a^\ast=0$ and $a\vee a^\ast=1$. We take $Z(L)$ to be the set of all complemented elements of $L$; $Z(L)$ is a Boolean algebra, usually called the center of $L$.
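
For intuition (a standard example, assuming $X$ is a topological space and $\mathcal{O}(X)$ is its frame of open sets): an open set $U$ is complemented in $\mathcal{O}(X)$ exactly when $X\setminus U$ is open as well, so
$$Z(\mathcal{O}(X))=\{U\in\mathcal{O}(X)\mid U\text{ is clopen in }X\}.$$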



$L$ is zero-dimensional iff for every $a\in L$ there exists $S\subseteq Z(L)$ such that $a=\bigvee S$. Finally, $L$ is compact iff its top element is compact: whenever $1=\bigvee S$ there is a finite $F\subseteq S$ with $1=\bigvee F$.
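
(Not needed for the question below, but a quick sanity check of the right-to-left direction under these definitions: for a Boolean algebra $B$, every principal ideal $\downarrow b$ is complemented in $\mathcal{I}(B)$, since
$$\downarrow b \,\wedge\, \downarrow\neg b=\{0\},\qquad \downarrow b \,\vee\, \downarrow\neg b=B,$$
and every ideal is the join of the principal ideals it contains, so $\mathcal{I}(B)$ is zero-dimensional.)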



So, in a proof of the theorem we take the lattice of ideals of the center of $L$ and define $h\colon L\rightarrow \mathcal{I}(Z(L))$ by $h(a)=(a]\cap Z(L)$, where $(a]=\{x\in L\mid x\leqslant a\}$. Of course, $h(a)$ must be an ideal, but I fail to see how to prove that it is a downward set (the other two properties of ideals are rather obvious: $0\in h(a)$, and $h(a)$ must be closed under finite joins since $Z(L)$ is a sublattice of $L$). But taking $x\in h(a)$ and $y\leqslant x$, I cannot see how to prove that $y\in Z(L)$.



Could you please help me with this?

      ideals boolean-algebra lattice-orders heyting-algebra

asked Nov 28 at 10:31 by Mad Hatter

          1 Answer

















          You want to prove that $h(a)$ is an ideal of $Z(L)$. Thus, it has to be downward closed in $Z(L)$, not in $L$.



          Showing that $h(a)$ is downward closed in $Z(L)$ is very easy because $h(a)$ is the intersection of a downward closed subset of $L$ with $Z(L)$.
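
Spelled out, with $y\in Z(L)$ (this is just that one-line argument made explicit):
$$y\leqslant x\ \text{and}\ x\in h(a)\ \implies\ y\leqslant x\leqslant a\ \implies\ y\in (a]\cap Z(L)=h(a).$$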






answered Dec 12 at 22:34 by Luca Carai





















          • Blind me ;). Thanks!
            – Mad Hatter
            2 days ago