Hi guys,
I'm converting an old VB application into C#, and part of the system requires me to convert certain special characters to their ASCII equivalents.
In VB, the code is:
sValue = Asc("œ") 'which gives 156
sValue = Asc("°") 'which gives 176
sValue = Asc("£") 'which gives 163
These are the correct values according to http://www.ascii-code.com/.
But when I do the same conversion in C#, the first character (œ) gives a strange answer: 339 as an int, or 83 as a byte, instead of 156.
Here is the code:
As ints:
int i1 = (int)Convert.ToChar("œ");
int i2 = (int)Convert.ToChar("°");
int i3 = (int)Convert.ToChar("£");
As bytes:
byte i1 = (byte)Convert.ToChar("œ");
byte i2 = (byte)Convert.ToChar("°");
byte i3 = (byte)Convert.ToChar("£");
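In case it helps, here is a self-contained console repro with the outputs I'm seeing (the Program class and Main wrapper are just my test harness, not part of the real app):
using System;
class Program
{
    static void Main()
    {
        Console.WriteLine((int)Convert.ToChar("œ"));  // 339, not 156
        Console.WriteLine((int)Convert.ToChar("°"));  // 176
        Console.WriteLine((int)Convert.ToChar("£"));  // 163
        Console.WriteLine((byte)Convert.ToChar("œ")); // 83, not 156
        Console.WriteLine((byte)Convert.ToChar("°")); // 176
        Console.WriteLine((byte)Convert.ToChar("£")); // 163
    }
}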
What gives?! :( I suspect it's something to do with the sign bit, but I can't see how.
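To test the sign-bit theory I also tried masking the value down to the low byte (the 0xFF mask is just my own experiment; this goes in the same Main as above), but that still doesn't give 156:
char c = Convert.ToChar("œ");
int masked = c & 0xFF;     // keep only the low 8 bits
Console.WriteLine(masked); // 83, still not 156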
Many thanks