|
I don't even understand the question. If your C++ program is not broken up into .cpp files of less than 1000 lines, usually paired 1-1 with .h files of less than 100 lines, there is far more wrong with your coding than Hungarian notation. You know what the data type of a variable is because each class has only a few member variables, and their types are given in the .h file.
If you have a boatload of global variables, or big sprawling files with a zillion classes in them, or you're declaring members public and peeking inside other classes to look at their variables, I could see how you'd have a problem remembering the types. But this is just bad practice.
Hungarian notation solved a problem of the 1990s C-based world, where you had thousands of globals in a big project. That problem belongs to the past. If you have that problem now, then your code belongs to the past.
|
|
|
|
|
Because it's not the 90's anymore and there are better ways to know what a variable type is?
You should be using an editor that will tell you what the type is; then you will *know*. You will know even when the type changed from signed to unsigned or from short to long.
Besides, it has been my experience that most people use Hungarian notation so they don't have to come up with decent variable names.
I always see functions with declarations like this:
int iValue;
char *strzValue;
double dValue;
float fValue;
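To contrast, here is a minimal C++ sketch of the same idea with descriptive names instead of type prefixes. The Order struct and every name in it are invented purely for illustration:

```cpp
#include <string>

// The prefixed declarations above (iValue, strzValue, ...) encode the type
// but say nothing about intent. Descriptive names carry the meaning and let
// the compiler and IDE worry about the type. All names here are hypothetical:
struct Order {
    int retryCount = 3;          // instead of: int iValue;
    std::string customerName;    // instead of: char *strzValue;
    double unitPrice = 0.0;      // instead of: double dValue;
    float discountRate = 0.0f;   // instead of: float fValue;
};

// Reads naturally without any prefix decoding.
double finalPrice(const Order& order) {
    return order.unitPrice * (1.0 - order.discountRate);
}
```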
|
|
|
|
|
I always publish a data dictionary with every project, plus a style guide for any deviations from "standards" and to define which "standard" I am using. A single file holds the definition and type of each variable. It is easy to find the datatype, and you actually know what the variable means in the real world. This also helps keep business variable names distinct and consistent across an entire project (e.g., never "name"; always clientName or providerName, and the same clientName everywhere), which I feel is a critical practice. I also publish "global" names that are reused (e.g., ndx is always the innermost loop index). You can write scripts that will find most variables and add them to a dictionary.
For property backing fields I use _whatever, so I know the object will have a property Whatever. Though I sometimes shortcut when it is very clear (e.g., on the "Client" object I may have _clientName as the backing field, but "Name" on the property). I always (and only) use type prefixes for GUI object types in stuff like ASP.NET (e.g., "asp:Button btn_SaveClient") and always publish the prefixes (btn = button, tb = textbox) in the data dictionary. This really helps in the code-behind when VS creates event handler names for you: tb_Client_OnFocus, tb_Client_TextChanged.
Every naming convention is only designed to make code clearer. A data dictionary does this best (IMHO), and if you inherit multi-convention code it really helps "fix" it (e.g., if one file has clntCode, another ClientID, and another intClient, I can add them all to the dictionary, make sure I understand them, and then find-replace each one: clntCode int; -> clientCode Int32; ClientID decimal(6,0); -> clientCode Int32;). Not perfect, but a good start.
Which convention you use doesn't matter if you are consistent and document well. A data dictionary does this, provides a big step toward documentation, and helps provide a global view of your application.
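The harvesting script mentioned above could be sketched roughly as follows. This is a deliberately naive, hypothetical pass: it matches simple `type name` declarations with a regex and would miss templates, pointers, and multi-declarator lines, which need a real parser:

```cpp
#include <map>
#include <regex>
#include <string>

// Naive sketch of a dictionary-harvesting pass: scan source text for simple
// "type name" declarations and record name -> type. Real C++ declarations
// (templates, pointers, multiple declarators) would defeat this regex.
std::map<std::string, std::string> harvest(const std::string& source) {
    static const std::regex decl(R"((int|double|float|char|long|bool)\s+(\w+))");
    std::map<std::string, std::string> dictionary;
    for (std::sregex_iterator it(source.begin(), source.end(), decl), end;
         it != end; ++it) {
        dictionary[(*it)[2]] = (*it)[1];  // name -> declared type
    }
    return dictionary;
}
```

The output could then be merged into the project's dictionary file by hand, keeping a human in the loop for the business meaning.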
|
|
|
|
|
I was on a team a few years back tasked with writing coding standards. We started to define prefixes for all commonly-used types and user controls. We quickly realized that there were just too many variations for us to account for in modern Object-oriented systems. This includes dozens (or hundreds) of classes as well as custom subclassed UI controls.
Additionally, Visual Studio (and other IDEs) are kind enough to display tool-tips indicating the data type. I also frequently use the "Go to definition" feature to further explore the origin of variables or types.
It's important to remember that the type of some variables can change over time (e.g., Int32 to Int64, or Double to Decimal). Coupling the variable name to its type just increases the potential need for future refactoring that could be avoided by simply using a more generic (but still descriptive) name.
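In C++ terms, a hypothetical sketch of that coupling (the names and numbers are invented): once the underlying type changes, a type-prefixed name either lies to the reader or forces a rename at every use site:

```cpp
#include <cstdint>

// Before: a 32-bit row count with a type-encoded name.
//   std::int32_t iRowCount = ...;
// After the table outgrows 2^31 rows, the type changes but the name lies:
std::int64_t iRowCount = 5'000'000'000LL;  // "i" prefix now misleads the reader

// A descriptive, type-neutral name survives the same refactoring untouched:
std::int64_t rowCount = 5'000'000'000LL;
```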
I realize the big exceptions to the IDE-sugar are when you print code or view it in a plain text editor. But what my team decided is that these were very rare in our situation.
We ended up recommending a "good descriptive name" for variables and UI controls. We realized that with the Intellisense-driven Visual Studio experience you only have to type a variable or control's name ONCE (in most cases) and Intellisense will assist with all subsequent references to it.
|
|
|
|
|
I actually worked at a place that was very strict about Hungarian notation. After getting used to it, the whole thing felt natural and I really enjoyed it. The conventions were "sane" Hungarian notation, as you can go overboard with it. We had maybe 10 or so prefixes everybody knew and used. Maintaining the code was very fun. I wish more places would adopt those notations. We were coding in C, so I don't know if it would work the same for C++.
|
|
|
|
|
VuNic wrote: Though it's C++ or C#, we do have primitive data types everywhere. In fact, for
smaller projects, primitive data types would account for 90% of the variables.
I doubt that is generally true.
Especially since, presumably, you are not claiming that one should use Hungarian for local variables.
VuNic wrote: How do I know what datatype it is?
Context and naming. Are you unsure about the types of the following variables?
fullName = lastName + " " + firstName;
connection.Commit();
for(int i=0; i < accounts.Count; i++)
VuNic wrote: For example, the code is a 100K line code and I cannot copy the entire project
to my disk to review that at home.
Say what?
Your "disk"? Where exactly do you work? Exactly what kind of "computer" do you have at home?
My memory stick is old and was cheap when I bought it, yet it holds everything I need to move entire code trees (plural, at the same time). Just one of the code trees that I commonly move around has 7,000+ files on it, and probably 90% of those are source files (and I do not do GUIs, so there are no image files of any sort).
So if you presume 5000 files with 100 lines on average then that one code tree has 500,000 lines of code. I suspect the actual count is higher.
And how exactly are you going to "work" on something at home if the piece you are working on, in some context, is not complete?
VuNic wrote: If anybody has a valid reason against it, I'm all ears to it.
Because it will not provide any measurable benefit AND because as a standard it will annoy probably everyone else except you.
And morale has been proven to have an impact on productivity.
|
|
|
|
|
It's a subjective preference of course, so there's no absolute reason. Back in the day I used Hu notation; we pretty much all did when it was the "thing you should do".
However, here are some points that collectively tilt the scale towards not using Hu these days.
* Hu originally solved a problem, the difficulty of quickly differentiating typed variables in primitive development environments.
However, modern, more powerful IDEs with robust IntelliSense or similar capabilities make it trivial to "see" the applicable metadata about variables (etc.), which addresses the need for a quick means of figuring out what is what.
Though you say you like to take snippets of logic to review in a simple text editor rather than "have the entire project" local and use your IDE of choice, I would think you are an exception. With source repositories it is trivial and normal to have a copy of all source on your workstation or laptop, to work "disconnected", and to sync deltas on either side with the repository when connected and you wish to. I'm not sure what benefit you would gain from working with snippets in a "dead" environment when it is so easy to work on a compilable, runnable local version.
For that matter, these days the source repository is generally only a VPN connection away unless you are on the road or in the air in most corporate environments.
This usage of snippets to take home may be an old habit of yours that bears reevaluation; maybe it's no longer necessary.
* In the style of OOP currently en vogue, injection is often favored. Many things that used to be variables are often parameters or properties now.
Any localized variables tend to be very localized, as in declared very close to their usage, and at method scope or even less (lambdas, etc).
Short, concise methods are greatly preferred over long rambling ones, making it much more likely that variable declaration is in close proximity to its usage.
* Refactoring is commonly practiced, ideally altering implementation while maintaining functionality. What is a double today might be an int tomorrow; a string might become a char[] or vice versa. Ideally the name shouldn't need to change.
Related to this is the consideration of once-and-only-once: if the name of a variable contains some token or segment that is a direct restatement of its (original) underlying type, you have effectively duplicated that information. If the underlying type changes, the "duplicated" type information should also change, but might not, causing a classic information-synchronization issue.
* In C#, the var keyword and compile-time type inference are pretty useful features, but they do not cooperate very well with Hu notation.
Something like:
var intResult = SomeMethod(...);
is obviously kind of silly...especially if SomeMethod(...) later gets refactored to return int? vs int, or any other type at all.
Also...what is the Hu notation for a nullable primitive?
int? nintResult;
int? n_intResult;
int? nIntResult;
Not loving that idea.
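A C++ analogue of the same friction, added for illustration since the thread covers both languages (the function names here are invented): auto hides the type by design, and wrapping a result in std::optional changes its type with no prefix keeping up:

```cpp
#include <optional>

// What would the Hungarian prefix be here? auto deliberately hides the type:
auto ComputeResult() { return 42; }  // returns int today...

// ...and if it is later refactored to report failure, the type changes again.
// "nintResult"? "optIntResult"? The prefix churns with every refactoring:
std::optional<int> ComputeResultChecked(bool ok) {
    if (!ok) return std::nullopt;
    return 42;
}
```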
* .NET and C# in particular takes some pains to smooth the differences between value types and reference types, and encourages you to use an instance of a primitive object on the stack similarly to an instance of a reference type on the heap aside from the obvious localized scoping of changes to value types. As complex types tend to be much more common in most applications, naming conventions have evolved to tend to favor complex types more than primitive types.
Complex types with Pascal-cased names like MyVerySpecialClass : MyVeryAbstractBase sit oddly with Hu if you are of a very literal bent...
MyVeryAbstractBase mvabSubclass = new MyVerySpecialClass();
Early on in .NET (and this seemed to be encouraged by MS, based on the writings and examples of the time), people tended to just use a sort of acronymization as a kind of vague Hu, something like this:
MyVeryAbstractBase mvab = new MyVerySpecialClass();
or, vexingly, an implicit assumption of specific concreteness such as:
MyVeryAbstractBase mvsc = new MyVerySpecialClass();
or, more irritating, an unnecessary concrete declaration:
MyVerySpecialClass mvsc = new MyVerySpecialClass();
This habit seems to be falling away finally, though it still crops up in throwaway variables in for loops, foreaches, and lambdas, where the brevity is basically harmless due to the very narrow scope (though I still prefer clear, full names).
Having one naming convention for primitives (Hu, for instance), and a different naming convention for complex types (PascalCase, for instance) works against the efforts made to unify usage of both.
Having said all of that, the MS Press Framework Design Guidelines book has a couple of pages on Hungarian notation and why it is restricted / discouraged in .NET.
A key point is, privately scoped members are not held to the same rigor as non-privately scoped members. If you want to name all your private variables using Hu, then only the other people on your team or future devs who land on your code base later are going to care.
Another key point is that the main value of conventions is consistency. It's better to be consistent and "wrong" per current prevailing mores than it is to inconsistently adhere to them. Personally I find this to be a mark against Hu, as it is more difficult to remain consistent over time if underlying data types change, but still.
Personally...the only place I still use Hu notation is in designer / template centric environments such as .aspx web pages where type declarations are effectively split between the template and the backing object.
Thus, I find a prefixed notation like this to be handy:
<asp:textbox runat="server" id="tbSalutation" ... />
<asp:textbox runat="server" id="tbUserName" ... />
...SomeBackingBindMethod(SomeDataRecord record)
{
if(record == null) return;
tbUserName.Text = record.UserName;
tbSalutation.Text = record.Salutation;
...
}
There is a nice secondary benefit of having an immediate mnemonic of what sort of control object I'm dealing with without needing to look at the designer as often, but the MAIN benefit the prefix gives me is that it naturally groups similar types of control for intellisense usage.
I don't have to remember the exact name of every textbox or checkbox or label; I can just type "tb..." and IntelliSense assists by showing a drop-down list of all the textboxes thus prefixed, allowing me to home in on the one I want quickly.
This is very pragmatic in my opinion, and also helps bridge the gap between OOP and parsed / merged view templates.
|
|
|
|
|
I would put forward a couple of reasons that Hu notation is not needed and why I do not use it.
1) All the names I assign within my objects (properties, methods, fields, etc.) derive their names from what the business owners call the things in the real world, when there is a correlation. As a developer, I can only do my job well when I also understand the real-world objects that my code represents. As such, a new developer must obtain some level of this knowledge for the data types to be evident. The example of AccountNumber in the posts above should be understood within the logical context of your system. If I call a variable accountNumber, then I likely have some idea of what that account number is within the context of my system. Hu notation will not tell me anything more than the common business knowledge will. The reality is that if I name it strAccountNumber, all I know is that it is a string of some length. The "str" alone would not tell me whether the business rules allow special characters, whether the max length is 8 characters, etc.
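For example, a hypothetical C++ sketch of that point (the 8-digit rule is an assumed business rule, invented for illustration): the "str" prefix tells you less than the rules themselves, which have to live in code regardless of what the variable is called:

```cpp
#include <cctype>
#include <string>

// "strAccountNumber" would only say "string". The actual business rules
// (assumed here: exactly 8 characters, digits only) must be expressed in
// code either way, and the descriptive name points you to them:
bool isValidAccountNumber(const std::string& accountNumber) {
    if (accountNumber.size() != 8) return false;
    for (char c : accountNumber)
        if (!std::isdigit(static_cast<unsigned char>(c))) return false;
    return true;
}
```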
2) Modern IDEs virtually remove the need for Hu notation. If the declaration of a variable is not in view and the data type is not apparent from my knowledge of the business domain, then (as a .NET developer) all I have to do is hover over the variable and the type is immediately shown, thanks to IntelliSense.
|
|
|
|
|
I used to use Hungarian notation when working in a weakly-typed language, but quickly gave it up when working with C# and .NET.
It wasn't just because C# is strongly typed; it was also because of:
- Property names and data binding
- Mapping database table and column names to classes and ORM's
- Serializing / deserializing objects to / from XML
- Configuration files
- Reflection.
Hungarian notation might be fine if your "names" never see the light of day; in most other cases, it gets ugly and geeky really fast.
|
|
|
|
|
Hungarian notation is very useful as long as you don't concatenate too many prefixes and you only use it when it actually helps.
I find the 'm_' prefix for class member and the 'p' prefix for pointer very helpful and will protest at code that doesn't use it. After using 'm_', I feel free to add another prefix as in 'm_pCursor' as this remains easy on the eye. After using the 'p' prefix I really want a capital letter so that the p stands alone and is not mistaken for anything else. For me, prefixing the data type is secondary to this as most IDEs will tell you the data type simply by hovering the mouse.
I use 'n' when my logic requires a positive whole number and 'i' when my logic requires an iterator, although its implementation may be size_t or int depending on whether -1 is needed to represent "pointing nowhere".
I use prefixes for controls: btnName, edtName.
I use established concatenations such as lpszString but try to avoid creating new ones.
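A small sketch of the conventions described above, with the class and its members invented for illustration:

```cpp
#include <cstddef>
#include <vector>

// Illustrates the prefixes described above: m_ for members, p for pointers
// (combined as m_p), n for counts, i for loop indices. All names invented.
class Document {
public:
    explicit Document(std::vector<int>* pPages) : m_pPages(pPages) {}

    // nVisible is a count (positive whole number); i is an index.
    std::size_t CountVisible() const {
        std::size_t nVisible = 0;
        for (std::size_t i = 0; i < m_pPages->size(); ++i)
            if ((*m_pPages)[i] > 0) ++nVisible;
        return nVisible;
    }

private:
    std::vector<int>* m_pPages;  // member + pointer: m_p, capital letter after p
};
```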
|
|
|
|
|
VuNic wrote: For example, the code is a 100K line code and I cannot copy the entire project
to my disk to review that at home. I choose to copy a 300 lines code block with
multiple functions to review it at home.
100K lines of code? Correct me if I am wrong: that amount of code fits on pretty much anything produced since the late 80s.
I think the main intention behind advising against Hungarian notation is this:
Hungarian notation is information that is redundant with your declaration.
|
|
|
|
|
100K lines of code is 100,000 lines of code, right?
Then I'm wrong. It should be around 130,000+, considering the code that we later converted into libraries.
It was a medical research project. It was HUGE. And not just this one: I've worked on projects that casually run over 100K lines of code. Of course I do not write them from scratch, but I clean up the scratches here and there, and make some show-stopping critical fixes too.
I'm still finding Hu notation helpful, so I'm sticking with it for C#.
Starting to think people post kid pics in their profiles because that was the last time they were cute - Jeremy.
|
|
|
|
|
... and said she recognised me from vegetarian club ...
... I was confused, I'd never met herbivore.
[Sorry, I'll get my coat]
Panic, Chaos, Destruction. My work here is done.
Drink. Get drunk. Fall over - P O'H
OK, I will win to day or my name isn't Ethel Crudacre! - DD Ethel Crudacre
I cannot live by bread alone. Bacon and ketchup are needed as well. - Trollslayer
Have a bit more patience with newbies. Of course some of them act dumb - they're often *students*, for heaven's sake - Terry Pratchett
|
|
|
|
|
Samantha Brick approves of your pun!
|
|
|
|
|
Samantha does not like your response though.
Countered.
Henry Minute
Girl: (staring) "Why do you need an icy cucumber?"
“I want to report a fraud. The government is lying to us all.”
I wouldn't let CG touch my Abacus!
When you're wrestling a gorilla, you don't stop when you're tired, you stop when the gorilla is.
Cogito ergo thumb - Sucking my thumb helps me to think.
|
|
|
|
|
Henry Minute wrote: Samantha does not like your response though.
My day is ruined. I was hoping to impress her with that sentence.
|
|
|
|
|
Samantha has recently been doing a lot of work with Rick Stein. She's amazed at how he can fillet.
[Or is that the wrong Samantha?]
|
|
|
|
|
I would have enjoyed that more if I hadn't heard it on t'wireless last week.
Good 'un though.
|
|
|
|
|
Found my coat and grabbed my hat
Made the bus in seconds flat
|
|
|
|
|
Nagy Vilmos wrote: ... and said she recognised me from vegetarian club ... I'd never met herbivore
Join the club, meat her everyday, maybe she likes you.
|
|
|
|
|
Bravo! Bartender, 5's all around!
|
|
|
|
|
Hold still, now. We're going to have to hurt you.
Software Zen: delete this;
|
|
|
|
|
Until I saw this[^], that is.
|
|
|
|
|
Ah numba 2 (best Chinese accent)
|
|
|
|
|
I always wake up with the oddest things still running in my head.
Today, I was trying to calculate the hippopotamus of a right triangle.
Needless to say, I was not successful.
|
|
|
|
|