Software development is an art; it can't be quantified, approximated, or anything of the sort. I am talking here about true software, which does not crash, mislead, or complicate life more than it already is. In the same way a tree grows, an application should grow too: branch by branch, leaf by leaf, without planning each leaf or even each branch. It grows according to the environment it lives in, adapting every step of the way. Since you can't predict the environment, you can't say when it will be fully grown.
Think about Leonardo or any other genuine artist: what would have happened if someone had hired him to paint the Mona Lisa in, say, two weeks?
Well, nice dream, but what about the money?
This is quite simple: there are low-risk investments and high-risk investments, and software is a high-risk one. It is the businessman's job to take risks and the programmer's job to build applications; not the other way around.
Of course, if a genius programmer is also a successful businessman, then the whole software world goes to hell, fast...
In your article you made the assumption that all coders produce equal output, because you did not know how it could be estimated. However, there are empirical models that provide a better estimate based on the language used and the experience of the developer.
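For readers wondering what such empirical models look like, here is a minimal sketch of one well-known family: the basic COCOMO formulas, which derive effort from estimated code size. The `a`, `b`, `c`, `d` coefficients are the published basic-COCOMO values for an "organic" (small, familiar, in-house) project; the experience multiplier at the end is an illustrative assumption, not part of basic COCOMO, and none of this is necessarily the model the commenter has in mind.

```python
# Basic COCOMO (Boehm, 1981): an example of an empirical effort model.
# effort (person-months) = a * KLOC^b; schedule (months) = c * effort^d.

def cocomo_organic(kloc: float, experience_multiplier: float = 1.0):
    """Estimate effort and schedule for an 'organic' project of kloc
    thousand lines of code. experience_multiplier < 1.0 models a more
    experienced team (an illustrative assumption, not basic COCOMO)."""
    a, b = 2.4, 1.05   # published basic-COCOMO organic coefficients
    c, d = 2.5, 0.38
    effort = a * kloc ** b * experience_multiplier  # person-months
    schedule = c * effort ** d                      # calendar months
    return effort, schedule

effort, schedule = cocomo_organic(32, experience_multiplier=0.9)
print(f"{effort:.1f} person-months over {schedule:.1f} months")
```

The point such models make is only that effort grows non-linearly with size and is scaled by team factors; the coefficients themselves were fitted to historical project data, which is what makes them "empirical" rather than derived.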
I assumed, for the sake of simplicity, that all those involved in development can be treated as equals in terms of effort and salary, and I have mentioned separate roles, man-hours, and cost in the sample software estimate. In this article I was focusing on how to estimate a project without a design in place, using the documents created during requirements and analysis. At this stage it might not be possible to fix the exact number of people and their roles. Besides, a company may have too many or too few senior/junior developers for a given technology and would normally use the available resources.
I am not aware of any empirical formulas for calculating man-hours by role. To my understanding (from my engineering background), empirical formulas are based upon experience, not an exact mathematical derivation. (In some cases there were several empirical formulas for calculating the same thing, and we could choose whichever we preferred based upon our experience.) Please explain the empirical models you are talking about, as this may help other readers (besides me), and we could debate their importance.
Thanks for your praise, Shiva. This article focuses on how to estimate the man-hours involved in developing a project, and the rates mentioned here are arbitrary. The final cost of the project/product would further depend upon the development company's business interests, market trends, government policies, competition from other companies/products, and so on. As soon as I figure all that out, I might try doing business for myself.
BTW, if any venture capitalist out there wants to invest in developing an estimation product based upon the ideas mentioned here, please get back to me.
I normally don't like to be critical of articles, but in my opinion this article describes a shotgun approach to estimating a project: break the project into minuscule, ridiculous pieces and then overestimate everything. If the example given were to come across my desk as a project bid, it would be dismissed in five minutes.
There are several factors that seem to have been left out, or at least glossed over, in your article. The biggest one concerns risk assessment and handling critical risk factors.
You state that a high risk is one in which all stakeholders have an interest, and a low risk is one in which few do. I have to disagree: risk isn't based on who's looking at it; it's based on the complexity of a particular area, feature, or task, and on the availability and ability to complete it. For instance, a client may require that a particular report be included in the system. Since this is a requirement, the development firm also sees it as a necessary item, and the subcontractor hired to write it sees it as important. By your analysis, this is a high-risk item. Not so, because everyone (including the client) knows that writing the report is a simple matter; it has already been prototyped as an example. The development firm knows the required technology is already there, and the report subcontractor has done this many times and knows the necessary tools exist. This really is a very low-risk item.
A more realistic example of risk would be something along the lines of logging into a legacy mainframe system and pulling some amount of data. The customer does it every day and assumes it can all be done, but doesn't consider it a high priority because it's not in the package of results they want their new system to produce. The development firm sees this as a more involved item because they are not completely familiar with the old system. According to the article, the risk would be medium or low, since only the development firm really sees it as a risk factor. However, the development firm does not know whether the proper interfaces are in place to allow the new system to access the old data, or, for that matter, whether the technology exists at all. Will new program modules need to be written on the old system to get the data out quickly, and is the system so old that there are few to no programmers capable of writing those modules? The development firm also knows that if this data doesn't get pulled, the final deliverable reports will be missing critical information that will need to be retrieved some other way. So now we have identified a potential project-stopping risk factor. This is an extremely high risk and, as another commenter stated, should be one of the first items researched and tackled in any development project. "Rush to failure", I believe, is the term sometimes used: minimize the project impact by determining failure and alternatives early.
Now, when you take into account the description of risk factors I have put forth, and then factor in the risk description in the article (which, by the way, is more aptly named "perceived risk", a slightly different thing), you have a metric for the project estimate that helps gauge potential failure and assists in managing client expectations.
I've gone on about this enough, but one other thing you failed to mention is that risk factors are always changing and need to be updated. For instance, with the legacy system, once it is determined that the interface technology exists and the data can be retrieved without additional programming (or additional programming is needed and a subcontractor is found), that risk factor obviously decreases; and when that particular project area is complete, the risk is removed. Also, estimators should be familiar with the development staff and their capabilities. Grossly overestimated projects often come from estimators, business analysts, and salespeople who have no contact with the developers and no clue as to what may already be in their own library arsenal.
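The "living" risk factor described above can be sketched with the common probability-times-impact scoring model. This is a generic illustration, not the commenter's or the article's actual method, and the names and numbers are made up:

```python
# Sketch of a living risk register using the common probability x impact
# exposure model. All risks, probabilities, and impacts are illustrative.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float  # 0.0-1.0, chance the problem materialises
    impact: float       # cost in man-hours if it does

    @property
    def exposure(self) -> float:
        return self.probability * self.impact

register = {r.name: r for r in [
    Risk("legacy mainframe interface missing", probability=0.5, impact=400),
    Risk("client report format churn", probability=0.2, impact=40),
]}

# After research shows the interface technology exists, the risk shrinks...
register["legacy mainframe interface missing"].probability = 0.1

# ...and once that project area is actually complete, it is removed entirely.
del register["legacy mainframe interface missing"]

total_exposure = sum(r.exposure for r in register.values())
print(f"remaining exposure: {total_exposure:.0f} man-hours")
```

Re-scoring the register at each phase boundary is what keeps the estimate's risk contingency honest as the unknowns get resolved.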
Yep, this is a shotgun view of estimation. It could be read along with another one of my articles at the following address: http://www.codeproject.com/gen/design/Estimation.asp
(Not that it is a better one either.)
Could you suggest some websites where more info on estimation (and risk) can be found for the non-initiated?
I haven't gotten too much into risk, as I have limited knowledge of the subject. I mentioned it only because it is an important factor affecting the cost of the project. Risk estimation based upon specific modules/issues, done with a full-fledged design, is always a better option. My aim was to work out an estimate even without getting into the design.
BTW, just FYI for all readers: all the articles I have written here (so far) are not based upon information from other articles/websites/books. They are only my own views on the subject (refer to the subtitle), based upon my limited experience.
Typically, estimates are generated for the purpose of deciding whether a project should start or continue.
I think it is vital, therefore, that each estimate detail line includes a percentage confidence factor.
I always break up projects (even little ones) into phases, tackling the highest (perceived) risk in the first phase. Confidence estimates usually get weaker for later phases which depend on the quality of results from the first phase(s).
Using this method I can present a realistic estimate with best- and worst-case values rather than an optimistic guess.
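One simple way to turn per-line confidence factors into a best/worst-case range is to let each line's worst case grow inversely with its confidence, so low-confidence, later-phase lines widen the range the most. This is only a sketch of the idea under that assumed widening rule, not the commenter's actual formula:

```python
# Sketch: derive a best/worst-case range from line-item estimates that
# each carry a confidence factor. The widening rule (worst = hours /
# confidence) is an illustrative assumption, not an established formula.

def estimate_range(line_items):
    """line_items: list of (hours, confidence), confidence in (0, 1]."""
    best = sum(h for h, _ in line_items)      # everything goes to plan
    worst = sum(h / c for h, c in line_items) # low confidence -> big overrun
    return best, worst

phase1 = [(40, 0.9), (16, 0.8)]  # highest perceived risk, tackled first
phase2 = [(80, 0.6)]             # weaker confidence: depends on phase 1
best, worst = estimate_range(phase1 + phase2)
print(f"estimate: {best:.0f} to {worst:.0f} hours")
```

Presenting the pair (here, roughly 136 to 198 hours) makes the uncertainty visible to the client instead of burying it in a single padded number.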