The question itself does not make much sense: what makes a language client-side or server-side? Nothing but historical reasons. If a language is, for example, imperative (http://en.wikipedia.org/wiki/Imperative_programming[^]), it could potentially be used on the server side.
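As a hypothetical illustration of this point (assuming a JavaScript runtime outside the browser, such as Node.js; the `renderPage` function and its arguments are made up for the example), JavaScript itself can just as well produce pages on the server side:

```javascript
// Hypothetical sketch: JavaScript running on the server (e.g. under Node.js)
// building an HTML page. Nothing in the language ties it to browsers;
// renderPage and its parameters are assumptions for illustration only.
function renderPage(title, items) {
  const listItems = items.map(item => `<li>${item}</li>`).join("");
  return `<html><head><title>${title}</title></head>` +
         `<body><ul>${listItems}</ul></body></html>`;
}

// A server would send this string as the HTTP response body:
console.log(renderPage("Demo", ["first", "second"]));
```

The language is the same one browsers run; only the host environment differs.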
However, there is one important question here: why are there so many server-side languages and technologies, while the Web client side has only JavaScript/ECMAScript? (I don't count VBScript, as it is not supported by most browsers.)
The answer: this is an apparent social phenomenon which emerged "in response" to the basic Web technologies and the HTTP protocol. It is a matter of the "ecology" of the developing technosphere; see http://en.wikipedia.org/wiki/Novel_ecosystem[^].
Features of the client side must be universal and supported on all platforms, while the content itself, including script code, is loaded from the server. Multiple different languages would badly damage the simple compatibility principle of the Web: a Web application should work on nearly all platforms and browsers. Even though there are many incompatibilities, especially in rendering, accurately written Web applications remain compatible with all major platforms and browsers. For this reason, multiple client-side scripting languages could not survive; for historical reasons, only JavaScript did. Exceptions, such as requirements to use incompatible things like ActiveX plug-ins, only prove the rule.
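In code, this compatibility principle usually shows up as feature detection. Here is a minimal sketch (the helper name and the passed-in object shape are assumptions for illustration, not any particular library's API):

```javascript
// Hypothetical sketch of feature detection, the usual way carefully
// written Web applications cope with browser incompatibilities.
// The global object is passed in so the check can run in any host.
function describeAjaxSupport(globalObject) {
  if (typeof globalObject.XMLHttpRequest !== "undefined")
    return "XMLHttpRequest";   // standard browsers
  if (typeof globalObject.ActiveXObject !== "undefined")
    return "ActiveXObject";    // very old Internet Explorer plug-in route
  return "none";               // no AJAX support at all
}
```

In a browser one would call `describeAjaxSupport(window)` and branch on the result, instead of sniffing browser names, which is exactly how one application can stay compatible with incompatible platforms.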
The server side, in contrast, belongs to a single company, or to a single user of a Web host, which usually offers a range of server-side technologies. The server side remains completely invisible to the users of a Web application. Because of this, the choice of server-side technology is up to the owner of the Web site, and server-side languages and technologies have flourished.
—SA