
Nick Fisher (Consultant) asked:

Hi guys,
 
I'm converting an old VB application into C#, and part of the system requires me to convert certain special characters to their ASCII equivalents.
 
In VB, the code is:
 
 
sValue = Asc("œ")  'which gives 156
sValue = Asc("°")  'which gives 176
sValue = Asc("£")  'which gives 163
 
 
These are the correct values according to http://www.ascii-code.com/.
 

But when I do the same conversion in C#, the first of these gives a strange answer.
 
Here is the code:
 
 
As ints:
 
int i1 = (int)Convert.ToChar("œ");    // which gives 339
int i2 = (int)Convert.ToChar("°");    // which gives 176
int i3 = (int)Convert.ToChar("£");    // which gives 163

 
As bytes:
 
byte i1 = (byte)Convert.ToChar("œ");    // which gives 83
byte i2 = (byte)Convert.ToChar("°");    // which gives 176
byte i3 = (byte)Convert.ToChar("£");    // which gives 163

 
 
What gives?! :( I suspect it's something to do with the sign bit, but I can't see what.
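
In case it's useful, here is the smallest complete repro I could put together (a console app; assuming the source file is saved as UTF-8 so the œ/°/£ literals survive compilation):

using System;

class AscRepro
{
    static void Main()
    {
        string[] samples = { "œ", "°", "£" };
        foreach (string s in samples)
        {
            // Convert.ToChar requires a one-character string.
            char c = Convert.ToChar(s);
            // (I assume the cast to byte just keeps the low 8 bits of the
            // UTF-16 code unit, since the default context is unchecked.)
            Console.WriteLine("'{0}': int = {1} (0x{1:X4}), byte = {2} (0x{2:X2})",
                              s, (int)c, (byte)c);
        }
    }
}

Output on my machine:

'œ': int = 339 (0x0153), byte = 83 (0x53)
'°': int = 176 (0x00B0), byte = 176 (0xB0)
'£': int = 163 (0x00A3), byte = 163 (0xA3)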
 
Many thanks
Tags: C#, VB






