Ranking Big O Functions By Complexity

I am trying to rank these functions: 2^n, n^100, (n + 1)^2, n·lg(n), 100n, n!, lg(n), and n^99 + n^98, so that each function is the big-O of the next, but I do not know a method for determining whether one function is the big-O of another. I'd really appreciate it if someone could explain how to go about this.










time-complexity big-o complexity-theory






asked Nov 20 at 4:54 by HoneyBunchers, edited Nov 21 at 19:42 by hidefromkgb

  • 1

    Surely you have learned a definition of what O is? If so, please specify what the difficulty in applying it is. Also, the question might be more suitable for math.stackexchange.com.
    – user10605163
    Nov 20 at 6:53
















2 Answers

Assuming you have some programming background, say you have the code below:



void SomeMethod(int x)
{
    for (int i = 0; i < x; i++)
    {
        // Do some work
    }
}


Notice that the loop runs for x iterations. Generalizing, we say that you get the solution after N iterations (where N is the value of x, e.g. the number of items in the array/input).
So this type of implementation/algorithm is said to have a time complexity of order N, written O(n).



Similarly, a nested for (two loops) is O(n squared), i.e. O(n^2).
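
As a small illustration of that nested case (this snippet is an added sketch, not code from the question; the method name is made up):

void SomeNestedMethod(int x)
{
    // The inner loop runs x times for each of the x outer iterations,
    // so the body executes x * x times: O(n^2) for input size n = x.
    for (int i = 0; i < x; i++)
    {
        for (int j = 0; j < x; j++)
        {
            // Do some work
        }
    }
}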



If you make binary decisions, reducing the possibilities by half each time and keeping only one half, then the complexity is O(log n).



I found this link to be interesting.



For: Himanshu



While the link explains very well how the log2(N) complexity comes into the picture, let me put the same in my own words.



Suppose you have a pre-sorted list like:



1,2,3,4,5,6,7,8,9,10



Now you have been asked to find whether 10 exists in the list. The first solution that comes to mind is to loop through the list and find it, which means O(n). Can it be made better?



Approach 1:



Since we know the list is already sorted in ascending order:




  • Break the list at the center (say at 5).

  • Compare the value at the center (5) with the search value (10).

  • If the center value == the search value => item found.

  • If the center value < the search value => repeat the steps above on the right half of the list.

  • If the center value > the search value => repeat the steps above on the left half of the list.


For this simple example we will find 10 after 3 or 4 breaks (at 5, then 8, then 9), depending on how you implement it.



That means for N = 10 items, the search took 3 (or 4) lookups. Putting some mathematics to this:



2^3 + 2 = 10; for simplicity's sake let's say



2^3 = 10 (nearly; this is just to keep the base-2 logarithm simple)



This can be re-written as:



log2(10) = 3 (again, nearly)






We know 10 was the number of items and 3 was the number of breaks/lookups we had to do to find the item. It becomes:



log2(N) = K



That is the complexity of the algorithm above: O(log N).
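
To make the halving steps concrete, here is one possible C sketch of the search described in Approach 1 (an added illustration; the function and variable names are my own, and a real implementation may differ in details):

#include <stdio.h>

/* Illustrative binary search over a sorted int array.
   Each iteration halves the remaining range, so for n items it takes
   roughly log2(n) comparisons: O(log n). */
int binary_search(const int *list, int n, int value)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;    /* break the list at the center */
        if (list[mid] == value)
            return mid;                  /* item found */
        else if (list[mid] < value)
            lo = mid + 1;                /* search the right half */
        else
            hi = mid - 1;                /* search the left half */
    }
    return -1;                           /* not found */
}

int main(void)
{
    int list[] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    printf("index of 10: %d\n", binary_search(list, 10, 10));
    return 0;
}

On the list 1..10 this finds 10 after a handful of halvings, matching the 3-or-4 lookups counted above.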






answered Nov 20 at 5:05 by Prateek Shrivastava (accepted), edited Nov 22 at 7:13























  • I don't quite see how this is helpful for solving the problem in OP. They are supposed to order big-O classes by inclusion. Algorithms are not relevant at all.
    – user10605163
    Nov 20 at 6:58












  • Exactly, this is just n*n; would you please explain via a log(n) example as well, because n*n is the simplest of all?
    – Himanshu Ahuja
    Nov 21 at 19:49


















Generally, when one loop is nested inside another, we multiply the bounds: O(outer-loop max value * inner-loop max value), and so on. E.g. for (i to n) { for (j to k) { } }: for i = 1, j runs from 1 to k, i.e. 1 * k iterations; for i = 2, j again runs from 1 to k; and so on, so O(max(i) * max(j)) gives O(n * k). Further, to find the order you need to recall how basic operations compare, e.g. O(n + n) (addition) versus O(n * n) (multiplication); a logarithm shrinks the value inside it, so O(log n) < O(n) < O(n * n), and so on (note that n + n = 2n is still just O(n)). In this way you can rank the other functions as well.
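
To see that ordering numerically, here is a small added C program (not part of the original answer) that tabulates a few representative growth rates; for ranking purposes, whichever column eventually stays smaller is big-O of the other:

#include <stdio.h>
#include <math.h>

/* build: cc growth.c -lm */
int main(void)
{
    /* Compare a few growth rates for increasing n. Doubles are used to
       avoid integer overflow; the values are approximate. */
    printf("%6s %12s %12s %14s %16s %22s\n",
           "n", "lg n", "100n", "n lg n", "2^n", "n!");
    for (int n = 2; n <= 20; n += 2)
    {
        double lg = log2((double)n);
        printf("%6d %12.2f %12.2f %14.2f %16.0f %22.0f\n",
               n, lg, 100.0 * n, n * lg, pow(2.0, n), tgamma(n + 1.0));
    }
    return 0;
}

By n = 20, 2^n and n! already dwarf the other columns while lg n has barely moved, which is exactly the ordering described above.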




A better approach is to first generalise the expression before working out the time complexity, e.g. n! = n * (n-1) * (n-2) * ... * (n - (n-1)). Similarly, a polynomial worst case can be generalised as O(n^k); written this way you can compare directly: if k = 2, then O(n^k) = O(n * n).
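
To connect this back to the question's functions, one worked comparison (an added illustration, assuming the exponent readings from the question): for n >= 2,

n^99 + n^98 <= n^99 + n^99 = 2 * n^99 <= n * n^99 = n^100

so n^99 + n^98 is O(n^100); and since every fixed-power polynomial eventually grows slower than an exponential, n^100 is O(2^n), which in turn is O(n!).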







answered Nov 21 at 20:00 by Himanshu Ahuja, edited Nov 21 at 20:08