January 10th, 2004, 10:36 PM
Security engineering and your programming methodology
Recently I was reading about incorporating security engineering into projects run under different programming methodologies. One of the author's points was that eXtreme Programming (and agile methods in general) makes it difficult to properly design and implement security in most programming projects. They held, and strongly promoted, the belief that cyclical models such as the one promoted by MSF allow for better design and implementation.
I'm wondering what your opinions on this subject might be...
Have you practiced extreme programming, and has it hampered you in any way? Was your experience any better or worse than when you tried a cyclical model of development?
Do you have any horror stories about trying to incorporate security into your programming project?
Any advice for people on how you believe it should be done?
If you need some background on the methodologies mentioned:
extreme programming - http://www.extremeprogramming.org
Microsoft Solutions Framework - http://www.microsoft.com/technet/tre...ol/default.asp
A good listing of other methodologies can be found easily via Google. It's worth a look to see what's out there, I think.
"When I get a little money I buy books; and if any is left I buy food and clothes." - Erasmus
"There is no programming language, no matter how structured, that will prevent programmers from writing bad programs." - L. Flon
"Mischief my ass, you are an unethical moron." - chsh
Blog of X
January 10th, 2004, 11:47 PM
Awesome read, Juri. XP is what we should have been doing long ago. Maybe if Microsoft did that, Windows wouldn't suck so bad.
January 11th, 2004, 03:19 AM
More on my view about XP vis-à-vis security:
Unless I have totally missed the point of the article, I would think that XP would actually INCREASE the security of code, for two reasons:
1. Better debugging. XP stresses testing and debugging during ALL phases of development, not just at the end. This, I would think, allows more time to find security-related holes in the code.
2. Because programs are kept simpler under XP, the number of places where a security hole could exist may decrease.
January 11th, 2004, 04:13 AM
As you no doubt have heard, Juridian, I had the fortunate opportunity to work in a place that quite liked the loose, write-it-as-needed, document-nothing style of code development. That being said, for the little development I was able to push through properly, I adopted the extreme style of programming. IMO, neither development methodology promotes security more than the other; it falls to your individual developers to handle that as they work. Security considerations should be a constant in development: not a step in the process, but a key piece of the entire process, considered at each stage, from analysis to design to development to implementation.
The MSF, while quite a structured development methodology, does not necessarily increase the security potential of an application merely by adding proper code review stages and the like; it essentially falls to the approach you favour. Nor does the Extreme methodology necessarily lend itself to being more vulnerable, though were you to read into it that there is little code review, one might expect that more bugs could be present. While that may be the case, the MSF carries the dubious distinction of coming from a company that has itself been plagued by bugs and vulnerabilities. If we infer that Microsoft has indeed followed similar steps and procedures, it appears that, contrary to what should be the case, code review steps are not sufficient for eliminating, or even vastly limiting, application bugs.
A code review is really only as good as the programmer doing the review, and depends largely on their understanding of the code they are reviewing. Simple development flaws can be picked up by a code review, anything ranging from typographical errors to overflowable buffers. What generally can't be picked up in a code review is flawed application design. This is not the fault of the reviewer, but rather a simple truth that a reviewing developer needs a complete understanding of how the application or portion of the application is intended to function, and what the needs are. Indeed, this can be a minor detail in overall application design, but a flaw -- even an extraordinarily minute one -- could present numerous vulnerabilities.
From that, I would put forth this analysis: Neither methodology lends itself to writing more secure software, as that is a consideration that every developer must take into account in their code development.
Definitely an interesting point for debate.
The Nelson-Shepherd cutoff: The point at which you realise someone is an idiot while trying to help them.
"Well as far as the spelling, I speak fluently both your native languages. Do you even can try spell mine ?" -- Failed Insult
Is your whole family retarded, or did they just catch it from you?