
Thread: The Importance of proper Analysis and Design

  1. #1
    Senior Member
    Join Date
    Nov 2001
    Posts
    1,255

    The Importance of proper Analysis and Design

    One of the most important and most frequently overlooked or ignored steps in writing software is the Analysis and Design step. Analysis and Design is the key first stage of any new software development, and doing it properly is not stressed nearly enough in schools or in tutorials and guides. You need a clearly defined idea of what the functional requirements are, and of how best to implement them. The latter point, in particular, seems to be overlooked or ignored far too often. This understanding is obtained from a variety of sources: the requirements come from the users and/or management, research will yield algorithmic and design improvements (as the example that follows demonstrates), and implementation ideas can be worked out using pseudo-code. Pseudo-code is simply non-functioning code written in whatever notation you like (plain English, C, etc.). The purpose of pseudo-code is to construct the application on a logical basis rather than a programmatic one, which also helps when you refer back to the design documents later on. I would also encourage the use of UML (http://www.uml.org/), which was designed specifically with modelling applications and processes in mind.
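    As a trivial illustration of the idea, a plain-English pseudo-code sketch of the factoring program used below might look something like this (purely illustrative, not a formal design document):
    Code:
    read n from the command line, or prompt the user for it
    for each candidate i from 2 up to the chosen limit:
        if i divides n evenly:
            record i as a factor
    if no factors were recorded:
        report that n is prime
    otherwise:
        print the count and the list of factors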

    It is my endeavour here to provide a tangible illustration of the benefits of proper Analysis and Design, in the hope that it will at least educate some readers as to the gains it can yield. To illustrate my point, I will use the example of a program which factors numbers to determine whether they are prime. For non-prime numbers, it returns all factors of the supplied number. Additionally, I will be timing it using the Linux "time" command and taking the elapsed ("real") value, to show exactly how much performance can be gained from seemingly minor changes. The code goes through three stages: the initial version, Step A, and Step B. Follow the directions in the Core Block section of the code regarding which lines to comment or uncomment for each step.

    Step A is simply a trivial mathematical principle applied to speed things up. The largest factor of a number n, other than n itself, can be at most n/2, because the smallest possible factor (other than 1) is 2. For example, the largest proper factor of 1001 is 143, well below 1001/2. The net effect of using n/2 as the loop limit is that it cuts the number of candidates to check in half.

    Step B takes Step A further: whenever we find a factor a of n, we know that b = n / a is also a factor. We therefore record both factors at once and reduce the loop's upper limit to b, so we never have to test the same candidates twice. Algebraically: n / a = b, with the result taken only when n % a == 0.
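    To see Steps A and B in action, take n = 1001 (Number 12 in the test set below). With Step A the loop limit starts at 1001 / 2 + 1 = 501. At i = 7 we record the pair 7 and 143 and drop the limit to 143; at i = 11 we record 11 and 91 and drop the limit to 91; at i = 13 we record 13 and 77 and drop the limit to 77, after which no further divisor turns up before the loop ends. Those are exactly the six factors the program reports for 1001, found after testing only 75 candidates instead of the 999 the initial version checks.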

    The following code is the application in question:
    Code:
    #include <iostream>
    #include <cstdlib>
    
    using namespace std;
    
    #define DEFAULT_FACTOR_COUNT 300
    
    int main(int argc, char *argv[])
    {
      int num = 0;  // Number entered
      int max = 0;  // Maximum loop value
      int i = 2;    // Loop counter
      int fcount = 0;  // Factor count
      int factors[DEFAULT_FACTOR_COUNT];  // Factors array
    
      // Initialize the factors array
      for (i=0; i<DEFAULT_FACTOR_COUNT; i++) {
        factors[i] = 0;
      }
    
      /*
       * Determine if a number was passed on the commandline. If it was, we skip
       * asking for a number.
       */
      if (argc > 1) {
        num = atoi(argv[1]); // set it to the first argument.
      } else {
        cout << "Please enter a number: ";
        cin >> num;
      }
    
      /**** CORE BLOCK ****/
      //max = num / 2 + 1;   // Uncomment this for Step A.
      max = num;             // Comment this for Step A.
      
      
      /*
       *  We loop from 2 up to max (n initially, or n/2 + 1 once Step A is applied).
       *  2 is the smallest possible factor other than 1, and n/2 is the largest
       *  possible factor other than n. With Step B, max drops to the paired factor
       *  on each successful factor find.
       */
      for (i=2; i<max; i++) {
        // If there is no remainder after the division, we add a factor.
        if (num % i == 0) {
          factors[fcount++] = i;
          //max = num / i;            // Uncomment this for Step B.
          //factors[fcount++] = max;  // Uncomment this for Step B.
        }
      }
    
      /**** END CORE BLOCK ****/
      
      /*
       * First we save some work by determining if we are dealing with a prime number.
       * If we are, we simply output that the number is prime and exit.
       */
      if (fcount == 0) {
        cout << "Number is prime." << endl;
        return EXIT_SUCCESS;
      }
      
      /*
       * Now we print out the list of factors we found (the trivial factors 1 and n
       * are not included), with some simple formatting.
       */
      cout << "Found " << fcount << " factors of " << num << "." << endl;
      cout << "Factors: " << endl << "  ";
      for (i=0; i<fcount; i++) {
        if (i > 1 && i % 15 == 0){
          cout << endl << "  ";
        }
        cout << " " << factors[i];
      }
      cout << endl;
      return EXIT_SUCCESS;
    }
    Now let's look at some numbers. For each run I use a simple bash script that times the program against a set of 15 test numbers, one at a time.
    The script is as follows:
    Code:
    #!/bin/bash
    echo -ne "Number 1: "; /usr/bin/time -f %E -a ./algo 8035737 > /dev/null
    echo -ne "Number 2: "; /usr/bin/time -f %E -a ./algo 1089053 > /dev/null
    echo -ne "Number 3: "; /usr/bin/time -f %E -a ./algo 28508191 > /dev/null
    echo -ne "Number 4: "; /usr/bin/time -f %E -a ./algo 1009095 > /dev/null
    echo -ne "Number 5: "; /usr/bin/time -f %E -a ./algo 98031 > /dev/null
    echo -ne "Number 6: "; /usr/bin/time -f %E -a ./algo 7641367 > /dev/null
    echo -ne "Number 7: "; /usr/bin/time -f %E -a ./algo 418763 > /dev/null
    echo -ne "Number 8: "; /usr/bin/time -f %E -a ./algo 7689231 > /dev/null
    echo -ne "Number 9: "; /usr/bin/time -f %E -a ./algo 5000101 > /dev/null
    echo -ne "Number 10: "; /usr/bin/time -f %E -a ./algo 5974125 > /dev/null
    echo -ne "Number 11: "; /usr/bin/time -f %E -a ./algo 51320898 > /dev/null
    echo -ne "Number 12: "; /usr/bin/time -f %E -a ./algo 1001 > /dev/null
    echo -ne "Number 13: "; /usr/bin/time -f %E -a ./algo 2103559887 > /dev/null
    echo -ne "Number 14: "; /usr/bin/time -f %E -a ./algo 1025488 > /dev/null
    echo -ne "Number 15: "; /usr/bin/time -f %E -a ./algo 12783 > /dev/null
    The program's actual output for these numbers (captured without the redirect to /dev/null):
    Code:
    Number 1: 
       Found 6 factors of 8035737.
       Factors: 3 2678579 163 49299 489 16433
    Number 2: 
       Found 2 factors of 1089053.
       Factors: 7 155579
    Number 3: Number is prime.
    Number 4: 
       Found 6 factors of 1009095.
       Factors: 3 336365 5 201819 15 67273
    Number 5: 
       Found 6 factors of 98031.
       Factors: 3 32677 41 2391 123 797
    Number 6: Number is prime.
    Number 7: Number is prime.
    Number 8: 
       Found 22 factors of 7689231.
       Factors: 3 2563077 9 854359 11 699021 33 233007 99 77669 101 76131 303 25377 769
          9999 909 8459 1111 6921 2307 3333
    Number 9: Number is prime.
    Number 10: 
       Found 30 factors of 5974125.
       Factors: 3 1991375 5 1194825 15 398275 25 238965 75 79655 89 67125 125 47793 179
          33375 267 22375 375 15931 445 13425 537 11125 895 6675 1335 4475 2225 2685
    Number 11: 
       Found 62 factors of 51320898.
       Factors: 2 25660449 3 17106966 6 8553483 9 5702322 18 2851161 27 1900774 47 1091934 
          54 950387 73 703026 94 545967 141 363978 146 351513 219 234342 277 185274 282 
          181989 423 121326 438 117171 554 92637 657 78114 831 61758 846 60663 1269 40442 
          1314 39057 1662 30879 1971 26038 2493 20586 2538 20221 3431 14958 3942 13019 4986 
          10293 6862 7479
    Number 12: 
       Found 6 factors of 1001.
       Factors: 7 143 11 91 13 77
    Number 13: 
       Found 14 factors of 2103559887.
       Factors: 3 701186629 11 191232717 13 161812299 33 63744239 39 53937433 143 14710209 
          429 4903403
    Number 14: 
       Found 18 factors of 1025488.
       Factors: 2 512744 4 256372 8 128186 16 64093 107 9584 214 4792 428 2396 599 1712 
          856 1198
    Number 15: 
       Found 2 factors of 12783.
       Factors: 3 4261
    So the timings for the initial version are:
    Code:
    Number 1: 0:00.38
    Number 2: 0:00.05
    Number 3: 0:01.33
    Number 4: 0:00.05
    Number 5: 0:00.00
    Number 6: 0:00.36
    Number 7: 0:00.02
    Number 8: 0:00.40
    Number 9: 0:00.25
    Number 10: 0:00.27
    Number 11: 0:02.40
    Number 12: 0:00.00
    Number 13: 1:41.72
    Number 14: 0:00.06
    Number 15: 0:00.00
    Not bad: everything finishes in under three seconds with the exception of run 13, which is the only 10-digit number in the set. This is a good illustration of how software often behaves: most of the time it looks like a reasonably quick piece of software, but there is one exception, and that exception is a large one. It takes almost two full minutes to factor number 13. Our total execution time is 107.29 seconds (1:47.29).

    Now, applying the changes for Step A (the difference from the initial version is shown in parentheses):
    Code:
    Number 1: 0:00.19   (-0.19)
    Number 2: 0:00.03   (-0.02)
    Number 3: 0:00.63   (-0.70)
    Number 4: 0:00.01   (-0.04)
    Number 5: 0:00.00   
    Number 6: 0:00.15   (-0.21)
    Number 7: 0:00.00   (-0.02)
    Number 8: 0:00.18   (-0.22)
    Number 9: 0:00.11   (-0.14)
    Number 10: 0:00.13  (-0.14)
    Number 11: 0:01.17  (-1.23)
    Number 12: 0:00.00  
    Number 13: 0:47.68  (-54.04)
    Number 14: 0:00.01  (-0.05)
    Number 15: 0:00.00
    This optimization knocks a solid 57.00 seconds off our execution time, bringing the total down to 50.29 seconds (0:50.29). Most of that gain comes from the longest run, number 13, but every non-trivial run improves.

    Applying changes in Step B:
    Code:
    Number 1: 0:00.00   (-0.19)  [-0.38]
    Number 2: 0:00.01   (-0.02)  [-0.05]
    Number 3: 0:00.63            [-0.70]
    Number 4: 0:00.00   (-0.01)  [-0.05]
    Number 5: 0:00.00     
    Number 6: 0:00.15            [-0.21]
    Number 7: 0:00.00            [-0.02]
    Number 8: 0:00.00   (-0.18)  [-0.40]
    Number 9: 0:00.11            [-0.14]
    Number 10: 0:00.00  (-0.13)  [-0.27]
    Number 11: 0:00.00  (-1.17)  [-2.40]
    Number 12: 0:00.00  
    Number 13: 0:00.21  (-47.47) [-1:41.51]
    Number 14: 0:00.00  (-0.01)  [-0.06]
    Number 15: 0:00.00
    (parentheses: difference between Step B and Step A)
    [brackets: difference between Step B and the initial version]
    This optimization knocks a further 49.18 seconds off our execution time. The final total after both optimizations is 1.11 seconds (0:01.11). That makes the program roughly 97 times as fast as the initial version; it takes less than 1/96th of the time! All that extra speed comes from a few minor changes that are only possible once you understand the logic of what the application should be doing. Granted, not every application will see such drastic speed increases just from a better understanding of how it should work. Consider, however, that speed is only one facet of the improvement you will see by understanding and designing correctly from the start; I used speed in this illustration merely because it is the easiest thing to measure. More efficient design leads to improvement in all areas, and can contribute greatly to the security of an application (consider race conditions, for example) as well as to its stability.
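    For reference, here is the same core logic with both Step A and Step B applied, pulled out into a small standalone function (the function name and the use of std::vector are mine; the loop itself is just the toggled lines from the listing above):
    Code:
    #include <vector>
    
    // Collect the non-trivial factors of num, using the Step A bound and the
    // Step B pairing trick described above. An empty result means num is prime.
    std::vector<int> find_factors(int num)
    {
      std::vector<int> factors;
      int max = num / 2 + 1;          // Step A: no factor other than num itself can exceed num / 2.
    
      for (int i = 2; i < max; i++) {
        if (num % i == 0) {           // i divides num evenly, so it is a factor...
          factors.push_back(i);
          max = num / i;              // Step B: ...and so is num / i, which also bounds the rest of the search.
          factors.push_back(max);
        }
      }
      return factors;
    }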

    For more info, check out the Unified Modelling Language homepage.
    Chris Shepherd
    The Nelson-Shepherd cutoff: The point at which you realise someone is an idiot while trying to help them.
    \"Well as far as the spelling, I speak fluently both your native languages. Do you even can try spell mine ?\" -- Failed Insult
    Is your whole family retarded, or did they just catch it from you?

  2. #2
    Senior Member
    Join Date
    Jan 2002
    Posts
    1,207
    In a lot of software, I think too much emphasis is placed on analysis and design.

    You can analyse and design for years, and still be wrong. The only way to find the real solutions is usually to implement them:

    - During analysis and design, requirements change
    - During development, requirements change
    - Requirements change all the time
    - Mistakes are made during analysis and then design, quite serious ones, which go unnoticed until someone actually tries to use the first prototype
    - Design and analysis take time away from other tasks; as a result the business becomes less efficient because the solution is delivered later
    - People with absolutely ****all skill get moved to being systems analysts, because it's perceived as "easier" than programming; they make a balls-up of it, then throw their shoddy, half-done, inaccurate "solutions" at the developers

    I'm not advocating not TEACHING people design and analysis, just disputing how much time is given to them in modern software development.

    IMHO, the "waterfall" model is definitely out of date, and should not be used by anyone in the real world (of course I'm excluding banking and military apps from the real world here)

    Slarty

  3. #3
    Senior Member
    Join Date
    Nov 2001
    Posts
    1,255
    Originally posted here by slarty
    In a lot of software, I think too much emphasis is placed on analysis and design.

    You can analyse and design for years, and still be wrong. The only way to find the real solutions is usually to implement them:

    - During analysis and design, requirements change
    - During development, requirements change
    - Requirements change all the time
    - Mistakes are made during analysis and then design, quite serious ones, which go unnoticed until someone actually tries to use the first prototype
    - Design and analysis take time away from other tasks; as a result the business becomes less efficient because the solution is delivered later
    - People with absolutely ****all skill get moved to being systems analysts, because it's perceived as "easier" than programming; they make a balls-up of it, then throw their shoddy, half-done, inaccurate "solutions" at the developers
    I've worked under such conditions (Moving Target Development); however, it is not as you suggest:
    - During Analysis and Design, requirements are supposed to change until a satisfactory set is hammered out. Once you hit the design stage, it should be clearly defined what requirements the business has NOW.
    - If requirements change during development, they wait for another version, or the change is heavily documented.
    You are correct that requirements change quite frequently; however, no successful software development effort will survive constant change. It is akin to trying to future-proof your computer: it is not generally possible, and your best course of action is to build for NOW. This is how most enterprise software development occurs; changes do NOT happen overnight. The point of a lengthy, thorough analysis and design process is to prevent many of the very issues you mention. Software development is not inexpensive, but would you rather have a product that is fast or slow? Unstable or stable? The added expense is worth it, IME.
    Chris Shepherd
    The Nelson-Shepherd cutoff: The point at which you realise someone is an idiot while trying to help them.
    \"Well as far as the spelling, I speak fluently both your native languages. Do you even can try spell mine ?\" -- Failed Insult
    Is your whole family retarded, or did they just catch it from you?

  4. #4
    Ninja Code Monkey
    Join Date
    Nov 2001
    Location
    Washington State
    Posts
    1,027
    Most who preach analysis and design also preach an iterative and/or two-way model. You should do a good deal of research and design before you start coding so that you know what you are building, and you must alter that design as obstacles or better solutions pop up, or as the customer's needs change.

    The design should also be reviewed by the customer and by people who don't have emotional ownership of the product, to ensure it is indeed what the customer wants and that the design is sound. This is where third-party contractors or other architects in your organization come in handy.

    Design and analysis take time away from other tasks in the short term; in the long term, if done well, they will save you far more time by finding problems before you are too far down a path.

    And system analysts are not system architects. You should have a mix of folks representing the customer, the analyst/PM types for the project, the architects/development heads, Quality Assurance, and the Security organization involved in the analysis and design of any product. Sometimes people wear a couple of hats; you just have to watch where those multiple hats can have conflicting viewpoints that may cause problems in the long run.

    The waterfall model is definitely old school; since then we have seen the Rational Unified Process, Microsoft's MSF (which is actually decent), Extreme Programming, etc. For more info and discussion, look up my old thread on the issues with the different types in the programming security forum. There are links to related materials there.

    To argue against good analysis and design for the solution is foolish and a noob mistake. To accept bad analysis and design is just as bad.
    "When I get a little money I buy books; and if any is left I buy food and clothes." - Erasmus
    "There is no programming language, no matter how structured, that will prevent programmers from writing bad programs." - L. Flon
    "Mischief my ass, you are an unethical moron." - chsh
    Blog of X

  5. #5
    Senior Member nihil's Avatar
    Join Date
    Jul 2003
    Location
    United Kingdom: Bridlington
    Posts
    17,188
    And system analysts are not system architects. You should have a mix of folks representing the customer, the analyst/PM types for the project, the architects/development heads, Quality Assurance
    In a perfect World, I cannot fault that................ trouble is that you can never get the resource and end up having to multitask. I have never been offered multiple salaries for doing more than one job.

    I think that you have to take a "horses for courses" approach and use whatever methodology suits the task. In some instances a "prototyping" approach, such as slarty suggests, is appropriate. In others, a more formal, classical approach is indicated.

    My personal experience is that the biggest problems are bugger all to do with your development methodology. They come from a failure to agree what the objectives and critical success factors are with the user community.

    USER: "I want a motorised vehicle that can go across grass" (he means an ARV)

    HEAD OF DEVELOPMENT: "Design him a self-drive lawnmower"

    No matter how smart we think we are............the user pays our wages?

    Just my thoughts

  6. #6
    Ninja Code Monkey
    Join Date
    Nov 2001
    Location
    Washington State
    Posts
    1,027
    "My personal experience is that the biggest problems are bugger all to do with your development methodology. They come from a failure to agree what the objectives and critical success factors are with the user community. "

    This should have been handled in the analysis and design phase. If you don't know what the problem area is, how do you expect to solve it?
    "When I get a little money I buy books; and if any is left I buy food and clothes." - Erasmus
    "There is no programming language, no matter how structured, that will prevent programmers from writing bad programs." - L. Flon
    "Mischief my ass, you are an unethical moron." - chsh
    Blog of X

  7. #7
    Senior Member nihil's Avatar
    Join Date
    Jul 2003
    Location
    United Kingdom: Bridlington
    Posts
    17,188
    This should have been handled in the analysis and design phase. If you don't know what the problem area is, how do you expect to solve it?
    OK Juridian, I appreciate that this depends on what development methodology/model we use, but I would say it should be happening before analysis/design........ as I understand that phase.

    I would say that this should be sorted out in the Project Initiation Document/Terms Of Reference/Quality Plan, which should happen before any detailed analysis and certainly any design?

    This is our contract with the user........... hell, never give them an open-ended cheque.

    My cynical attitude comes from over 20 years of dealing with the devious little b******s.


  8. #8
    Ninja Code Monkey
    Join Date
    Nov 2001
    Location
    Washington State
    Posts
    1,027
    Understandable, I was going by the lawnmower example. The vehicle that moves across grass is a decent one-liner that can kick off the process. Everything after that could be handled in analysis/design... you need a vehicle that moves across grass. How big? What color? We can make it cut grass, would that be useful to you? Gas or electric? And then on down the line to possible technologies to use, high-level architecting of a solution, on down to actually developing some kind of hard specs, APIs, etc.
    "When I get a little money I buy books; and if any is left I buy food and clothes." - Erasmus
    "There is no programming language, no matter how structured, that will prevent programmers from writing bad programs." - L. Flon
    "Mischief my ass, you are an unethical moron." - chsh
    Blog of X

  9. #9
    Senior Member nihil's Avatar
    Join Date
    Jul 2003
    Location
    United Kingdom: Bridlington
    Posts
    17,188
    Hi Juridian,

    As I suspected, we are both saying the same thing..........

    Everything after that could be handled in analysis/design... you need a vehicle that moves across grass. How big? What color? We can make it cut grass, would that be useful to you? Gas or electric? And then on down the line to possible technologies to use, high-level architecting of a solution, on down to actually developing some kind of hard specs, APIs, etc.
    Well, almost. My take is that you have to do sufficient "analysis" (we are at the bid/proposal stage here?) to pin the user down to a firm understanding of what the deliverables are.

    Sure, we are professionals and like to play with new toys (oooops!), sorry, I meant "add value" to a project, but we have to get the budget authorised nonetheless.

    The conversations that you describe are those that I would have with the user at the Terms Of Reference/Project Initiation Document stage; then I have a "contract" that we can all sign off on.

    My real point is that it is a big mistake to spend real analysis and design resource up front until you have a firm definition of the user requirements.

    Hey, I have worked for some real &^^$£%&^ clients, so I am rather cautious?


  10. #10
    I definitely have to say this thread is one of the better reads that I have seen in a while.

    I personally have no professional experience dealing with customers when it comes to writing applications. (College kid.)

    As far as Design and Analysis goes, I will say that it is often overlooked by beginners and experienced programmers alike.

    I will be honest: whenever I write a program that is over 100 lines, I usually end up writing the entire thing in my head before I start coding it. The problem here is that my design and analysis is only as good as the pseudo-code I use to write it. (Though I don't usually use pseudo-code, it does come in handy often.)

    I will say that I definitely think colleges and schools should go over Design and Analysis more; if it weren't for my small background in C, I would never have learned about pseudo-code or Design and Analysis.

    Now I have a question: what is the Waterfall model?

    It has been referred to a couple of times; what is it?
