Even though we use String, Int32, etc. correctly per Microsoft's own documentation, IntelliSense in Visual Studio displays the suggestion "Name can be simplified" with a "Show potential fixes" link (IDE0001: C# Name can be simplified).
Doesn't "Show potential fixes" imply there is a bug, and here is the potential fix?
Why hasn't Microsoft fixed this to clear up the confusion? After all, VS is a Microsoft product, right?
Maybe there's a reason behind it, and we're all going to like it.
I believe the reason it suggests simplifying the name is that string, int, etc. do not require a using directive for the System namespace, while String, Int32, etc. still do. That can help clean up your using directives if the file doesn't use the System namespace for anything other than the simple types. I'm sure there are other reasons to suggest simplification, but that's the one that immediately comes to mind.
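To illustrate the point, here's a minimal sketch: the keyword forms compile without any using directive, and refer to exactly the same types as the System names (class and variable names here are my own, just for demonstration):

```csharp
// No "using System;" needed: string and int are built-in C# aliases.
class AliasDemo
{
    static void Main()
    {
        string s = "hello"; // alias for System.String
        int n = 42;         // alias for System.Int32

        // The alias and the fully qualified name are the same type:
        System.Console.WriteLine(typeof(string) == typeof(System.String)); // True
        System.Console.WriteLine(typeof(int) == typeof(System.Int32));     // True
    }
}
```

So IDE0001 isn't reporting a bug; both spellings produce identical IL, and the analyzer is only suggesting the shorter, directive-free form.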
EDIT: Now that I think about it, in a way the simple names "decouple" the developer from the exact underlying type too. They could transparently change int to map to Int64 in the future, for example.
I used to do the same: string for variables with String.Format(...), and int for variables with Int32.Parse(...). There was a good reason I did that, though: Microsoft recommended it! Then I switched to Visual Studio 2015 and all of a sudden it started giving me "tips" to simplify String to string and Int32 to int... Thanks, Microsoft, for sticking to your own guidelines.
For the same reason I stopped using this. and base. qualifiers (unless necessary), and writing TheClassImIn.StaticMethod instead of simply StaticMethod.
And yes, I know I can turn off those rules, but I like sticking to defaults
I write process control applications, so I have to deal with lots of values that are defined as a particular number of bits in a network interface or a hardware register. For that reason I use Int32, UInt16, and so on. While I realize the chances of the aliases changing underlying type in .NET are effectively zero, I have too many battle scars from prior apps written in C. A variable declared as an int could be 16 bits, 32 bits, 64 bits, or something else.
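To show what that looks like in practice, here's a small sketch of the register-parsing style described above (the register layout and values are invented for illustration):

```csharp
using System;

class RegisterDemo
{
    static void Main()
    {
        // Two raw bytes read from a 16-bit hardware register, big-endian.
        byte[] raw = { 0x12, 0x34 };

        // Spelling the type as UInt16 documents the exact wire width,
        // independent of whatever the language's aliases happen to mean.
        UInt16 register = (UInt16)((raw[0] << 8) | raw[1]);
        Console.WriteLine($"0x{register:X4}"); // 0x1234

        // Extract a 4-bit field occupying bits 8..11 of the register:
        Int32 field = (register >> 8) & 0xF;
        Console.WriteLine(field); // 4
    }
}
```

In C#, of course, ushort and UInt16 are guaranteed identical, so this is purely a documentation choice, but it makes the intended bit width obvious at the declaration site.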
That said, strings are another story entirely. Character sets, code pages, encoding, decoding: you still end up doing conversions of one sort or another regardless of your 'native' representation. I don't think I've ever declared a String in almost 10 years of C#. I always use the string alias.
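To make the encoding point concrete, here's a minimal sketch: however you spell the type, crossing an I/O boundary still forces an explicit encoding choice (the sample text is my own):

```csharp
using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        string text = "café"; // the string alias, as usual

        // The in-memory representation is UTF-16, but bytes on the wire
        // or on disk depend entirely on the encoding you pick:
        byte[] utf8 = Encoding.UTF8.GetBytes(text);     // 'é' takes 2 bytes
        byte[] utf16 = Encoding.Unicode.GetBytes(text); // 2 bytes per char

        Console.WriteLine(utf8.Length);  // 5
        Console.WriteLine(utf16.Length); // 8

        // Decoding with the matching encoding round-trips cleanly:
        Console.WriteLine(Encoding.UTF8.GetString(utf8) == text); // True
    }
}
```

The alias vs. full-name choice never matters here; what matters is being explicit about the encoding at every conversion.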