
Programmers’ War of Survival

7 Jun 2010 · GPL3 · 6 min read
Would you like to share how honorable you would feel being the owner of “compromised” applications?

Introduction

According to the 2008 Data Breach Study published by the Verizon Business Risk Team, the most common attacks by cyber criminals targeted applications and software. The study further reports that 73% of such attacks came from external sources exploiting existing vulnerabilities in those applications. Imagine you were the programmer of one of these applications: how honorable would you feel being the owner of a “compromised” application, now that the jewel assets of you and your organization are nothing but "hacked websites" and "compromised applications"?

Studies further show that around 57% of vulnerabilities are caused by coding and design errors. Stating the bitter facts further, every application contains on average 8 vulnerabilities due to coding and design errors. Research by the Gartner Group and Symantec suggests that close to 90% of software attacks are aimed at the application layer.

Sirens are echoing, lights are flashing, and the message is loud and clear: applications are the target. It would not be wrong to say that it is a state of war for programmers. Remember: if you fail to plan, you plan to fail.

It is a serious matter of concern. If this continues, software developers will lose their reputation, organizations will withdraw their confidence in developers, and average market salaries may start to contract, because every breach not only damages an organization's reputation but imposes extra cost as well. According to the studies, the estimated cost of such breaches is around $180 billion per year.

The Point of Interest

Let's not dwell on these terrifying figures any further; let's peel the onion and come to the point.

Folks, our old school of thought is "build it first and fix it later". Security is often the last issue addressed in development or testing, and fixing a security issue is not even worth a mention when budgets are tight and a project overruns.

The cure for this malady is to secure from the beginning, which is best understood by an analogy: if security were the electrical wiring, it would be far more difficult to put the wiring in after the house is built.

This means application security matters more than ever in today's dynamic application development landscape. The root cause of many application security vulnerabilities lies in the application source code, so is secure coding the answer to all these problems? The answer is "no": secure coding addresses only part of the problem. Many breaches result not from coding errors but from bad design, which highlights the fact that security requirements should be considered at every stage of the software development life cycle.

What's the Game Plan?

Application development is a fine balance between functional requirements and business needs, and between resources and deadlines; now it must also be a fine balance between security and risk. Security is an attribute of software like usability, performance, reliability and scalability, and it deserves the same respect as those other attributes.

"Security is just another attribute of software like usability,
performance, reliability and scalability. The idea of incorporating security
into the SDLC begins with evaluating the relative importance of this
attribute and then incorporating controls in line with that"
	- Tallah Mir, Sr. Program Manager - Microsoft

Acknowledging these facts, embedding security considerations into every phase of the SDLC is the right approach.

Requirements

How can you secure an application if you do not know what you have built? Do you know your application?

A good starting point is the technical and requirements specifications. Understanding the technology used to build the application satisfies one of the four security principles: "know your application". During the requirements specification stage, ask the following questions:

  1. What are the technology security considerations?
  2. Are there any business model security considerations?
  3. What are the main threats to the system?

Architecture / Design

No one designs things badly on purpose, but remember that two heads are better than one: involve the security risk team from day one. Ask how security requirements can be designed in. What could go wrong? What are the main threats to the design? Consider the following design principles:

Economy of mechanism
  Meaning: Keep the design simple and less complex.
  Example: Modular code, centralized services.

Fail-safe defaults
  Meaning: Access is denied by default and granted explicitly.
  Example: Denied transactions.

Complete mediation (mandatory access control)
  Meaning: Check permission each time a subject requests access to an object.
  Example: Credentials are not cached.

Open design
  Meaning: The design itself is not the secret.
  Example: Popular cryptographic algorithms.

Separation of privilege
  Meaning: More than one condition is required to complete a task.
  Example: Split keys, compartmentalized functions.

Least privilege (discretionary access control)
  Meaning: Rights are minimal and granted explicitly.
  Example: Non-admin accounts, need-to-know basis.

Least common mechanism
  Meaning: Mechanisms are not shared between users, processes or roles.
  Example: Role-based security.

Psychological acceptability
  Meaning: Security mechanisms should be communicated to the end user for ease of use and acceptance.
  Example: Help dialogues, appealing icons.

Coding

During coding, make sure that the code is robust and secure. Adopt the most pessimistic approach of "TRUST NOTHING", and practice "DEFENSE IN DEPTH". For example, when taking input, do not rely on its content and do not assume users enter correct information. Remember the golden rule: "When in doubt, check it out!" Identify the main threats to the code.

Acquaint yourself with the security features of a programming language before deciding to code in it. Interestingly, each programming language involves a trade-off between efficiency and security. C and C++ provide efficiency and speed, but the code pays little attention to security issues, so knowing what can go wrong in such code is important. Remember:

"No language can prevent insecure code, although there are language features which could aid or hinder a security-conscious developer."

-Chris Shiflett

Be careful when choosing a development methodology. When adopting security in every phase, what kind of methodology should we use: waterfall, iterative, spiral, extreme or agile? Are these methodologies flexible enough to incorporate your security considerations in each phase?

In a nutshell, you cannot create secure applications and software with rules alone; the principles of secure application and software development apply across development methodologies and across programming languages as well.

Testing

There are several types and levels of testing, which I won't cover extensively. Suffice it to say, augment UAT with SAT (Security Acceptance Testing): where UAT signals that we are all GOOD to go, SAT testifies that we are all SAFE to go.

Bottom Line: Keep an Eye on the Big Bad Wolf

"In the 80's we wired the world with cables and in the
90's we wired the world with the computer networks. Today we are wiring the
world with applications (software).... Having a skilled professional
capable of designing, developing and deploying secure software is now critical
to this evolving world" - Mark Curphey, Director & Product Unit Manager, Microsoft,
and Founder of OWASP (Open Web Application Security Project)

It is important to remember that nothing is permanent except change itself, and if you won't change, it's likely that you will not survive much longer!

License

This article, along with any associated source code and files, is licensed under The GNU General Public License (GPLv3)


Written By
United Arab Emirates
Altaf Al-Amin Najvani
Project Manager
Commercial Bank Of Dubai


Qualifications:
BS - Computer Science
Masters In Project Management
Masters In Business Administration

Certifications:
CISA - Certified Information Systems Auditor
ITIL (Foundation)
Microsoft Certified Technology Specialist (SQL Server 2005)
Microsoft Certified Technology Specialist (Web Applications)
Microsoft Certified Application Developer (.NET)
Microsoft Certified Solution Developer (.NET)

Comments and Discussions

 
General: Interesting.. a few questions though.
ZakPatat, 9-Jun-10 11:35
First of all, interesting article. I work in the defense-contracting sector and security (across the board, not just code, of course) is obviously of extreme importance so articles like these are definitely interesting to read.

I do have a few questions/remarks though.

Could you explain this 'Open Design' a bit more? I ask this because, per definition, a 'design' needs to be 'open', otherwise my analyst teams, development teams and testing teams cannot properly do the job. In the case of the end-user, only the end-users who are part of the design process itself have the need to see the design(s); any other end-user has no business with the design of the software, they just want it to work and have it work correctly (some more on end-users below). Bottom line, I don't quite get what you mean with 'the design is not a secret' (heck, often, in my job, the designs ARE actually classified, depending on the software and its use(s)).

You speak of 'Physiological acceptability' and include 'help dialogs & appealing icon' as examples. Here I do understand what you mean but, to a degree, disagree with your statement. Appealing icons is nice, but they are nothing more than 'visual', they do not influence security in the slightest, nor do they influence the functionality of the software. I do not see how this would improve security. Help dialogs are a given, but only when the end-user requests them. Anything that 'distracts' an end-user from doing his or her job, is wasted time. Adding a plethora of help dialogs that come and go whenever the software feels the need to 'communicate' its design to the user is only going to prevent them from doing their job. Also, if the high security causes performance issues (some of your examples are definitely able to cripple performance) then a whole other series of issues appear.

In fact, (most) end-users don't care about how the software is designed or how it works internally, they just want it to work. Since 'software security' is an internal affair for software, (most) end-users are simply not interested in it beyond the usual logging in and out.

Basically, I am saying that pretty much all of your article needs to be entirely transparent to the end-user, invisible so to speak. Security matters are of interest to the analysts, developers and, to a degree, the testers (unless it is a dedicated testing team that doesn't contain any end-users, then it definitely matters a lot more since those testers need a much more in-depth view of the software than an end-user).

It all comes down to the usual in the end, money. The more time an end-user spends not working (using your example, looking at help dialogs with appealing icons is not considered 'doing work') the less money the business makes. In the end, the money is what makes the business 'care' about its software, with that I don't mean 'how much does the software cost', but 'how much money does the software save the business'. Bogging down end-users with all sorts of security related things that do not help them perform their duties, is not going to save the business money, it is only going to cost more money.

So, while I fully agree with your call for better security in software, I do not entirely agree with your approach.

End-users should not be involved in the development of software apart from initial design (e.g., the usual 'what do you need the software to do' and, 'how do you do what you need the software to do now', etc sort of involvement, as well as 'proof-read' the parts of the design that matter to them, all this pre-development, of course), and a degree of alpha/beta testing (functionality testing!) to make sure the end-user wishes/requests/dreams are implemented in a way they can deal with, and that the software does what it needs to do correctly.

Security can also be overdone, too much security can make the software sluggish, it can also introduce a whole host of bugs that are 'uncommon' to a normal developer. Highly secured code can also hinder your ability to 'smoothly' debug it. We've done a lot of software security research internally (we develop internal software only, since I work for a huge corporation that does not sell software but uses hundreds (heck, probably thousands) of software applications, many internally developed, across a multitude of platforms) and finding that 'sweet spot' between security and usability, performance, functionality, etc' is not an easy task, a task that needs to be repeated a number of times during development and is different for every project. In my experience, a task better suited to analysts than developers.

Thankfully, we have to work by VERY strict guidelines, which makes the above slightly more 'predictable' during the design phase (we already know what need to be secured and, usually, how). I can see that for a software house, such 'predictability' can be entirely gone, making the design and development of secure software quite a nightmare.

In summary, and in my opinion, good security is a direct result of the software design (assuming the design process was done properly and resulting in a good design), not (directly) the code. Step number one in developing a piece of software is designing it, this is where the security needs to be applied (depending on methodology, inside the technical design documentation, not the functional design documentation, with the exception of the 'log in/log out/related' processes).

It gives the developers clear guidelines on how to develop the software, without them having to worry about the potential security issues (although they do need to keep an eye on such things, in case something was missed during design). This can result in a shorter development time, it can also lower the required experience level a little. In the end, it creates more secure software, with a large potential to have the development process done cheaper and faster. Where you stress security through code, I stress security through design and organization, at the end of the day, the result is the same.

However, personally, I say, leave the end-user out of it as much as possible, they have a job to do, a job that in the end pays your salary, if they don't work, you don't get paid.

All in all, I look forward to any follow up articles regarding security. You did a pretty good job on this one.
Re: Interesting.. a few questions though. (Altaf Al-Amin, 9-Jun-10 20:24)
Re: Interesting.. a few questions though. (ZakPatat, 10-Jun-10 20:37)
Reserving grade... (Ray Cassick, 6-Jun-10 8:30)
Re: Reserving grade... (Altaf Al-Amin, 6-Jun-10 8:57)
Re: Reserving grade... (Ray Cassick, 6-Jun-10 9:24)
Re: Reserving grade... (Altaf Al-Amin, 7-Jun-10 19:10)