Hello guys,

I have a question about (int) and variables
(I'm new to C# and to bigger programming languages in general).
What exactly does (int)... do?
In the code below I try to multiply 3 and 2.5 if something is true, but if I just write something like this:
var testa = 2.5;
var testb = 3;
var test = testa * testb;
var test2 = 0;

if (true) {
    test2 = testb * testa;
};

I then get a CS0266 error.
I googled and found that I should use "(int)...":



var testa = 2.5;
var testb = 3;
var test = testa * testb;
var test2 = 0;

if (true) { //"true" is just a placeholder
    test2 = (int)testb * (int)testa;
};

Console.WriteLine(test);
Console.WriteLine(test2);


//Output Console
7,5
6


Using "(int)..." dosnt Help much because the Result is Wrong

Thanks for any Help ;)
Comments
BillWoodruff 21-Dec-15 13:29pm
   
You have very good answers here, but I want to strongly suggest that you do not use 'var' while you are learning C#: not using 'var' will force you to think about the types you are using, and that's very important.

FYI: if you had written var test2 = 0.0; you would not have had the compile error.

Solution 3

It sounds like there are some programming concepts you still have to understand, and the particular issue you faced is typing, which is one of the basics. If so, this should cover what you are missing: https://en.wikipedia.org/wiki/Type_system#Strong_and_weak_typing[^].

Note that reading articles like this one won't really teach you typing or how to use a programming language; it just defines the field of knowledge you need to be aware of. Programming itself needs to be learned by studying one or more programming languages and doing software development. The C# and .NET typing paradigm combines different approaches, but the primary ones are static, strong, safe and nominative; the others are better considered supplementary: https://en.wikipedia.org/wiki/C_Sharp_%28programming_language%29[^].

In particular, var is not a primary expressive device of the language; it is not a type, it is not always applicable, and it relies on type inference:
https://en.wikipedia.org/wiki/Type_inference[^],
https://msdn.microsoft.com/en-us/library/dd233180.aspx[^].

In other words, code using var is strictly equivalent to code with explicit type declarations, which you could substitute yourself once type inference has been performed statically.
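
As an illustration (a minimal sketch, using the variable names from the question), this is what the compiler infers for the original snippet, written out with explicit types:

// What "var" resolves to after type inference, written explicitly:
double testa = 2.5;            // a literal like 2.5 is a double
int testb = 3;                 // an integer literal is an int
double test = testa * testb;   // int * double is promoted to double
int test2 = 0;                 // inferred as int, hence the later CS0266 on test2 = testb * testa;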

Your first code sample makes no sense because testa is not defined. And everywhere, if (true) is just weird, as it is equivalent to not having any "if" at all. In the second sample, you are doing a bad thing, a typecast, which has nothing to do with variable declaration. It truncates 2.5 to 2. That makes no sense, because if you really wanted 2, you should declare const int a = 2. (Don't forget const.) You need to declare what you want and control your types, instead of being surprised when something goes out of your control.

—SA
   
Comments
QWE-R 21-Dec-15 2:22am
   
Thanks for the informative links, very useful

"Your first code sample makes no sense because testa is not defined."
Sorry, I forgot to copy the first line "var testa = 2.5;"

"And everywhere, if (true) is just weird, as it is equivalent to not having any "if" at all."
"true" was just a placeholder for me
   
You are welcome. Will you accept this answer formally, too?
—SA
BillWoodruff 21-Dec-15 13:24pm
   
+5 good advice

Solution 2

I am not sure what you are trying to achieve with this code, but the reason for the error is that when you declared the variable
var test2 = 0;

it was declared as an int. Then, when you tried to do
test2 = testb * testa;

you were trying to assign a double value to an int variable, because
var testa = 2.5;

makes
testa * testb

a double.

To get rid of it, try
var test2 = 0.0;

in your first example. I hope you will get what you want.
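
For illustration, here is a minimal sketch of the first example with only that change applied (variable names taken from the question):

var testa = 2.5;           // double
var testb = 3;             // int
var test = testa * testb;  // double
var test2 = 0.0;           // now inferred as double

if (true) // placeholder condition, as in the question
{
    test2 = testb * testa; // double assigned to double: compiles fine
}

Console.WriteLine(test);   // 7.5 (shown as "7,5" under some culture settings)
Console.WriteLine(test2);  // 7.5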

Cheers.
   

Solution 1

You should use double instead of int, because multiplying the int 3 with the double 2.5 gives 7.5, which is a double (a literal like 2.5 is a double in C#).

You are casting the variable testa, which is a double, to an int, so the number 2.5 becomes 2.

test2 = (int)testb * (int)testa;

so test2 = 3 * 2, which is 6.
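
As a small sketch of what the cast does (assuming the variable names from the question):

double testa = 2.5;
int testb = 3;

int truncated = (int)testa;        // (int) drops the fractional part: 2
int test2 = testb * (int)testa;    // 3 * 2 = 6, the "wrong" result
double test = testb * testa;       // keeping the result as a double gives 7.5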
   

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)




CodeProject, 503-250 Ferrand Drive Toronto Ontario, M3C 3G8 Canada +1 416-849-8900 x 100