In most programming languages the integer is a 16-bit value, whereas in Java and Visual Basic it is a 32-bit integer. Research the reasons behind such subtle design differences.

What I have tried:

I searched on Google, but there's no answer.

It is a historical thing. Basically, the integer type size was chosen to represent a 'meaningful' range of numbers efficiently on the computers available at the time the programming language was designed.
See, for instance: c - What is the historical context for long and int often being the same size? - Stack Overflow.
 
No, that's quite wrong. Very few languages default to a 16-bit integer these days, as its range of values is far too small for modern requirements: only −32,768 to 32,767 (that is, −2^15 to 2^15 − 1 for a signed two's-complement value).

In Java, an integer is not 16 bits but 32 - a short is 16 bits - and the same applies in nearly every modern compiler: C#, C++, and even modern C compilers use a 32-bit integer.
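You can verify this yourself in Java. Here is a minimal sketch (the class name IntegerSizes is just for illustration) that prints the bit widths and value ranges the language guarantees for short and int:

public class IntegerSizes {
    public static void main(String[] args) {
        // Short.SIZE and Integer.SIZE are the widths in bits fixed by the
        // Java Language Specification, identical on every platform.
        System.out.println("short: " + Short.SIZE + " bits, "
                + Short.MIN_VALUE + " to " + Short.MAX_VALUE);
        System.out.println("int:   " + Integer.SIZE + " bits, "
                + Integer.MIN_VALUE + " to " + Integer.MAX_VALUE);
    }
}

This prints 16 bits with the −32,768 to 32,767 range for short, and 32 bits with −2,147,483,648 to 2,147,483,647 for int. Note the design difference: Java fixes these sizes in its specification, whereas C only guarantees a minimum size for int (at least 16 bits) and leaves the exact width to the implementation.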

That wasn't always the case: when processors were 8-bit and memory was tiny (32K or 64K address spaces don't encourage 32-bit integers), C compilers used 16-bit integers by default.

Only seriously old systems and compilers aimed at embedded environments will default to 16-bit integers these days.
 