
Thread: Big O Notation

  1. #1

    Big O Notation

    I'm having a hard time understanding the concept of big O notation. I've looked it up on Google, and I understand it expresses the complexity of an algorithm. But I don't understand how to find the big O notation of a for loop, while loop, etc.
    Any good tutorials?

  2. #2
    Well, I don't know of any tutorials exactly (only books), but I can try to explain it as simply as possible. The thing with big O notation is that it measures how the running time grows as the length of the input grows. And it's not an exact measurement; it's an upper bound. I'll try to clarify with an example:
    Say you have written a program with a for loop looping through the chars in some input:

    "
    int i=0;
    for (i=0; i<strlen(argv[1]); i++)
    {
    //Do something with the chars
    }
    "
    This will run in linear time, O(n). From a running-time point of view it's the same as calling a sleep method that sleeps for an amount of time proportional to the length of the input. (Note that strlen itself walks the whole string, which is why it's computed once before the loop; calling it in the loop condition would make the loop quadratic.)
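    The same reasoning applies to a while loop. A minimal sketch (walking the string with a pointer instead of an index, just to show a different loop shape):

    "
    const char *p = argv[1];
    while (*p != '\0') // one iteration per char: O(n) again
    {
        // Do something with *p
        p++;
    }
    "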

    If your program had nested for loops instead (for building pairs of letters, say):
    "
    int o=0;
    int i=0;
    for (i=0; i<strlen(argv[1]); i++)
    {
    for (o=0;o<strlen(argv[1]); o++)
    {
    //Do something with the chars
    }
    }
    "
    This would run in O(n²) time, since the inner loop runs n times for each of the n iterations of the outer loop: n · n = n² iterations in total. (From a running-time perspective it's the same as calling a sleep method that sleeps for n² time units, where n is the length of the input.)

    A slightly more mathematical way to put it: a running time f(n) is O(g(n)) if, from some input length onward, the graph of f(n) never rises above the graph of g(n) scaled by a suitable constant. Ok, that still wasn't very mathematical ;)
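    In symbols, the standard textbook definition (not something specific to these examples) reads:

    "
    f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0 : f(n) \le c \cdot g(n) \text{ for all } n \ge n_0
    "

    In words: beyond some input length n₀, f(n) never exceeds a constant multiple of g(n).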

    A thing to keep in mind is that operations that take constant time are said to be O(1), and constant and lower-order terms are usually not counted at all: an algorithm that takes n + 1 steps is said to be O(n), and one that takes n + n² steps is said to be O(n²). The notation keeps only the fastest-growing term, because that term dominates the running time for large inputs. So when designing algorithms, you should always try to get as low an order of growth as possible (an O(n²) algorithm is far worse than an O(n) one, for example).
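    Here is a minimal, self-contained sketch of that dominant-term rule (assuming, as in the examples above, that the input string arrives as argv[1]; the step counter is only there to make the counts visible):

    "
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char *argv[])
    {
        if (argc < 2)
            return 1; // expect the input string as argv[1]

        size_t len = strlen(argv[1]); // n = length of the input
        size_t steps = 0;

        for (size_t i = 0; i < len; i++) // O(n) pass over the input
            steps++;

        for (size_t i = 0; i < len; i++) // O(n²) nested pass
            for (size_t o = 0; o < len; o++)
                steps++;

        // total: n + n² steps, which is O(n²) since n² dominates for large n
        printf("%zu steps for n = %zu\n", steps, len);
        return 0;
    }
    "

    Feed it longer and longer strings and the step count tracks n² ever more closely; that is exactly why the lone n term gets dropped.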

    I don't know if that made anything clearer, but I hope it helps. Sorry about my worthless explaining skills, I haven't explained anything in English for years... nor calculated orders of growth :)
    Student in computer science at Chalmers University of Technology, Gothenburg, Sweden
