|
Mur2501 wrote: As they all are translated to another language when they are built.
That's just it... interpreted languages aren't really translated to another language; they are kept in their original form until run by the user. When the user runs the program, he's really running the interpreter, which takes the script in and interprets what is to be done.
The definition is not always as cut and dried as you might think; a lot of languages nowadays produce byte-code or some other in-between by-product that is further interpreted at run-time.
|
|
|
|
|
Another difference to consider is their behavior.
For an interpreted language, the program is run line by line when you request it to run. If there is a problem in the logic, or any other 'bad thing', it will run until it hits one. Thus, if you misspelled a variable's name, nothing would happen until you ran it and it came across the misspelled version. Also, if you left out a BEGIN or END statement, it would run until that caused a problem, if ever. Some feel that these languages are easier for beginners to start with because they're not all-or-nothing, but not everyone agrees with that.
Now, for a compiled language, the entire program is converted to 'machine code' (often in several steps). This conversion has to take the entire program into consideration at once. If there is a problem, the compiler will not be able to figure out what to do and you'll get an error; the program will NOT be compiled. A misspelled variable is 'undefined' and is therefore an error. Similarly, the compiler looks for an END for every BEGIN, in the correct order, or it is an error. Compiling a program is an all-or-nothing affair.
Because of this difference in when the program is turned into real computer instructions, the interpreted languages are much slower than compiled languages when running.
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein | "As far as we know, our computer has never had an undetected error." - Weisert | "If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010 |
|
|
|
|
|
Many people say that Let Us C is old and hard, but when I read it I had no problem other than the IDE (and I already know how to use an IDE, Code::Blocks). It is also the one book from which I got my grasp of the technical things, because it is easy to understand. So what should I do: change to a new book, or stay with it?
|
|
|
|
|
That is hardly a question that can be answered here. You need to decide for yourself which way you want to go in your career, and research which books may help you.
|
|
|
|
|
In programming there are no better or worse languages; they are just tools, and most of programming has nothing to do with the language, it is about the process behind the programming. In the commercial industry we refer to people who like to push a language by the derogatory term 'codermonkeys', and at a job interview, saying that language xxx is a better language will generally see you immediately overlooked. If you make programming a career you will probably code fluently in at least 3 or 4 languages, and probably dabble, with limited understanding, in 3 or 4 more.
C is indeed old, and it can be hard, but it and Java are also the most widely used programming languages. In some sections of industry it is in fact the only choice, and here I refer to industries like the microcontroller industry. The manufacturers of those processors don't make many programming tools, so generally the only options are assembler or C.
The current spectrum rating of the most widely used languages is (1) Java, (2) C, (3) C++, (4) Python, (5) C# ... daylight to everything else.
So the bulk of us old commercial programmers can write in two of the three (Java, C, C++).
So that is the commercial world; but, all that said, if you are just programming for fun, feel free to use whatever language works for you. The language does not change the problem at all: if I ask you to write a bubble sort algorithm, the language does not change the problem. Rosettacode.org actually lists the code for a bubble sort in 117 programming languages, and none of them are better or worse than any other; they all conform to the same pseudocode.
The ability to pseudocode is what separates programmers from codermonkeys, and you may care to read about it on Wikipedia. Any real programmer can write what they are doing in pseudocode, and that pseudocode requires no choice of language, showing ultimately just how irrelevant the choice of language is.
The choice of language generally comes down to availability and ease of use of the compiler and tools and your familiarity with it.
In vino veritas
|
|
|
|
|
You (Leon de Boer) are absolutely right. Well, I am not discriminating against any programming language, and I also believe that every programming language has its own power and uses. I have decided to learn C because I want to be a systems programmer; Java is not suitable for many low-level services, whereas C/C++ are the ones that shine there.
|
|
|
|
|
Hi,
Well, there are a few considerations here. 'C' and 'CPP' are examples of languages which produce code running close to the OS and the hardware. One works there at the coal face, dealing directly with actual memory locations, the operating system, the vagaries of the hardware components, etc. This can lead on occasion to hard-to-detect bugs and crashes; a stack smash is a famous example. Another consideration is that the supplier of the OS can literally pull the carpet from underneath your feet by deprecating your favourite OS. Another disadvantage is that you must maintain different versions of source code if you want to write for more than one platform.
On the other hand, because you deal directly with the OS and hardware, you can do all sorts of tricks that cannot be done in 'synthetic' languages such as, say, C# or Java. Those languages run on a 'virtual machine' in a 'virtual environment'. When you find yourself in such an environment, you may forget about playing even the most innocent trick: that virtual machine knows nothing about memory, but talks in variables. The advantage is that this is a far friendlier environment to write in; it tries not to allow you to write wrong code. Also, your code will probably run from now till kingdom come on every computer and OS.
Now, it should also be remembered that, as a society, we cannot ever dispense with languages such as C and CPP. Languages such as C#, Java, and many others are themselves actually written using 'C' and 'CPP'.
I personally think that you could do worse than learning 'C' and 'CPP', in particular if in the latter you incorporate 'MFC'.
Note: C# and Java, versus 'C' and 'CPP', are very similar in syntax. The devil is in the syntactical detail!
Regards,
Bram van Kampen
|
|
|
|
|
Hi there, I am learning OpenGL from the NeHe tutorials (no GLUT).
I am getting the error
undefined reference to 'auxDIBImageLoadA' while compiling.
I am using Code::Blocks.
The code block is:
AUX_RGBImageRec *LoadBMP(char *Filename)
{
    FILE *File = NULL;
    if (!Filename)
    {
        return NULL;
    }
    File = fopen(Filename, "r");
    if (File)
    {
        fclose(File);
        return auxDIBImageLoad(Filename);
    }
    return NULL;
}
I have added libglaux.a already in the linker settings.
HELP!!!!
modified 7-Apr-16 11:10am.
|
|
|
|
|
Perhaps you are missing a header file.
Ratul Thakur wrote: i am getting the error undefined reference to 'auxDIBImageLoadA' while compiling. What is the EXACT error message you are receiving?
Ratul Thakur wrote:
I have added libglaux.a already in the linker settings. Which is irrelevant because you are dealing with a compiler error. The linker does not kick in until the compiler succeeds.
"One man's wage rise is another man's price increase." - Harold Wilson
"Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons
"You can easily judge the character of a man by how he treats those who can do nothing for him." - James D. Miles
modified 7-Apr-16 11:58am.
|
|
|
|
|
No, I'm not missing the header (-_-).
I'm not that much of a noob. I know about the
#include <gl/glaux.h>
header file.
modified 7-Apr-16 11:41am.
|
|
|
|
|
It looks like the compiler is generating an ASCII version of the call to auxDIBImageLoad, so it may be that you need a different library to be linked with. Check the GL documentation.
|
|
|
|
|
auxDIBImageLoad comes from the GLAUX library, which is obsolete and no longer supported by Visual Studio, which is why it can't be found. Even adding #include "glaux.h" won't help, as the library file has been removed, as well as the DLL.
If you are just playing around, the source code and a precompiled GLAUX.DLL are available on the internet, but for anything beyond that, do not use it.
All that function does is load an image file as a texture; give me a second and let me fashion you a replacement. It will take longer to explain how to do it than to do it. Can you tell me which lesson on the NeHe site this is from? I will post the result in my next message and send it to them to update the lesson.
In vino veritas
modified 8-Apr-16 5:08am.
|
|
|
|
|
Okay, I found you are trying to do Lesson 6, and the function LoadGLTextures() basically puts a texture into this variable:
GLuint texture[1]; // Storage For One Texture
I have given you a new function which directly loads the texture:
bool NeHeLoadBitmap(LPTSTR szFileName, GLuint &texid);
The call to it looks like:
NeHeLoadBitmap(_T("Data/NeHe.bmp"), texture[0]);
Yes, I Unicoded the whole thing and set the optimal screen resolution again, and it's in a VS2013 project.
Code link: http://s000.tinyupload.com/?file_id=94912123294851978142
I have a little job to do this weekend I guess which is to patch each lesson and send them to the site
In vino veritas
modified 8-Apr-16 5:16am.
|
|
|
|
|
Thanks a lot for your effort; I'll try this one out and tell you if it works.
|
|
|
|
|
YEH!!!!!!!! It worked. Thanks once again.
But what is the use of "_T"? If I compile without including it, the application still works fine.
|
|
|
|
|
The _T is a macro provided by TCHAR.H for Unicode/multilingual support. If you go to Project Settings -> General tab -> Character Set, you will see that it is on the "Not Set" setting. That macro allows you to use the other choices, Unicode and multi-byte character sets, making your code work multilingually, for example on Chinese Windows.
For you, in ASCII mode, the macro actually does nothing (which you worked out), but if you select the other modes you will get an error on every static text string; the _T tells the compiler to build the string in the correct mode and removes the error.
Being commercial programmers, and as Microsoft has made it so easy, we generally try to use the multilingual code calls since Visual Studio 2013. This became almost compulsory when trying to write true 64-bit applications. The default setting of an empty project is actually the Unicode character set.
Essentially TCHAR becomes a replacement for the standard char, and its size varies with the compilation mode. TCHAR.H provides new string functions that match the old string functions but have different code for the different modes. Let me give you an example:
strlen becomes _tcslen. Those calls work identically, the difference being that _tcslen will work in any language-mode compilation, while strlen will only work in the "Not Set" language mode like you have. Here is the MSDN link to what is going on:
strlen, wcslen, _mbslen, _mbslen_l, _mbstrlen, _mbstrlen_l[^]
This code is designed specifically for Windows (it uses the Win32 API); it is not general code that could also work on Linux, so there is no reason to write it generically, but we should try to cover the different modes of Windows compilation, especially as it is easy.
So for me the changes are just habit.
There is a funny side to this: so many of us are writing in that style that the Linux community is having issues trying to port our code. So if writing general code, I would probably try to avoid this style of programming.
In vino veritas
modified 8-Apr-16 23:28pm.
|
|
|
|
|
Hi
I am using mailslots for interprocess communication. Process A does a WriteFile with NULL for the overlapped parameter.
Process B has an OVERLAPPED parameter on the ReadFile.
My question is: can process A do a WaitForSingleObject on the hEvent of process B's OVERLAPPED structure to know when the read has completed?
Thanks
|
|
|
|
|
|
I did a ReadFile on a mailslot; when the I/O completes, shouldn't hOverLap.hEvent be signaled?
|
|
|
|
|
|
|
I read the article, which basically said that if I do a CreateFile on the client side with FILE_FLAG_OVERLAPPED, then when doing an I/O (which I would assume includes a ReadFile on the server side) an OVERLAPPED structure would be used (a member of which is m_hEvent).
I did an OpenEvent on the client side, as CreateProcess let me inherit objects, and tried WaitForSingleObject on the m_hEvent of the OVERLAPPED of the ReadFile.
The first wait worked; the second didn't, though I did three writes.
|
|
|
|
|
ForNow wrote: First wait worked the second didn't as I did three writes
"Didn't work" is hardly a technical description of the problem. Are you forgetting to reset a manual-reset event?
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
Don't think so; the flag is set to FALSE, so it should auto-reset:
sysblk.mail = CreateEvent(&sa, FALSE, FALSE, (LPCTSTR)"MyEvent");
|
|
|
|
|
Well maybe the writes are occurring so quickly that the system only has time to signal the event one time.
Whenever the event is signaled, you should check for more messages before looping back to the wait function.
Don't expect the event to be signaled exactly once for each message you write.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|