I heard a colleague of mine mention this, but I'm uncertain of its truth...
software built with Microsoft's .Net (and later) compilers <br />
creates executables in a "common format" that can be run through<br />
commercially-available disassemblers to fully recover the original<br />
source code. The protection against this is to use scramblers.
Simplistically, code written against .NET is compiled into IL (intermediate language). This IL can be reversed back into source code. Obfuscators can be used to make this reversing much more complicated.
Now, it is important to note that no source code (in any language) is 100% safe from reverse engineering. Given enough time and motivation, you can reverse any binary.
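As a small illustration of how much survives compilation to IL (the method and assembly here are made up):

```csharp
// A trivial method in a hypothetical assembly.
public static int Add(int a, int b)
{
    return a + b;
}

// Running a disassembler (e.g. ildasm) over the compiled assembly yields IL
// from which a decompiler can reconstruct a nearly identical method:
//
//   .method public hidebysig static int32 Add(int32 a, int32 b) cil managed
//   {
//       ldarg.0
//       ldarg.1
//       add
//       ret
//   }
//
// Comments and formatting are gone, but method names, parameter names and
// logic are all recoverable. An obfuscator renames identifiers like Add to
// something meaningless to make the decompiled result harder to read.
```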
the last thing I want to see is some pasty-faced geek with skin so pale that it's almost translucent trying to bump parts with a partner - John Simmons / outlaw programmer
Deja View - the feeling that you've seen this post before.
software built with Microsoft's .Net (and later) compilers
creates executables in a "common format" that can be run through
commercially-available disassemblers to fully recover the original
The original source code cannot be recovered, as it is not included in the compiled code.
Reverse engineering can recreate code that does the same thing as the original and compiles into an identical executable, but the recreated code will not be the same as the original source code.
the protection against this is to use scramblers.
Scramblers provide some level of protection, but it's impossible to completely protect any code from reverse engineering.
Hi everybody,
Can I customize the paper size and print a Crystal Report directly from code? Also, my client has pre-printed forms, and he needs the data printed only at the required places. For example, with "Name : Code Project", "Name :" is already printed on the form, so the report has to print "Code Project" at exactly the specified position.
How can we do it? Please advise me. Thank you.
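A minimal sketch of the usual approach, assuming the report was designed with Crystal Reports (the report path and printer name are placeholders):

```csharp
using CrystalDecisions.CrystalReports.Engine;
using CrystalDecisions.Shared;

// Load the report (path is hypothetical).
ReportDocument report = new ReportDocument();
report.Load(@"C:\Reports\report.rpt");

// Choose printer and paper settings in code.
report.PrintOptions.PrinterName = "MyPrinter";            // hypothetical printer
report.PrintOptions.PaperOrientation = PaperOrientation.Portrait;
report.PrintOptions.PaperSize = PaperSize.PaperA4;        // or a custom size defined in the printer driver

// Print directly, without a preview: (copies, collated, startPage, endPage).
report.PrintToPrinter(1, false, 0, 0);
```

For the pre-printed form, the usual trick is to position the field objects in the report designer at exactly the measured positions on the paper and leave the pre-printed labels out of the report; a truly custom paper size is normally defined as a custom form in the printer driver and then selected via PrintOptions.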
I created a C# application using the .NET 1.1 framework, which was tested on a system with the .NET 2.0 framework installed. The application opens and mostly works fine, but in some places it throws the message "Object reference not set to an instance of an object." and then continues to work. These messages never appear on my system, which has only the .NET 1.1 framework. And when I install the .NET 1.1 framework on the test machine, the application uses 1.1 and works fine for the same test case. Why do some classes not behave the same when a 1.1 assembly runs on the 2.0 framework? Is .NET 2.0 backward compatible with .NET 1.1???
I guess you've found that the answer is 'no'. In fact, you can mark your app to require the exact runtime version that was used to build it. Overall I wouldn't expect a problem, but you should try to work out the issue, just in case it needs resolving either way.
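As a sketch of that marking: an app.config can tell the loader which runtime versions the application supports, in order of preference (v1.1.4322 is the 1.1 runtime's version string):

```xml
<configuration>
  <startup>
    <!-- Require the 1.1 runtime; without this element, a 1.1-built app
         will happily load on 2.0 when 1.1 is not installed. -->
    <supportedRuntime version="v1.1.4322" />
  </startup>
</configuration>
```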
Christian Graus - Microsoft MVP - C++ Metal Musings - Rex and my new metal blog
In general, this is how we work: we build the application on .NET 1.1, and at some later point users may install .NET 2.0, .NET 3.0, or some future release... We cannot revisit each piece of code again and again and add a workaround for every framework version. The .NET framework itself should take care of this issue, but it is not doing so.
Can anybody please comment on the statement ".NET 2.0 is not backward compatible with .NET 1.1"?
Well, you can either track down the issue and understand it, or you can trawl online forums, hoping someone will make a statement that makes you feel better about the issue you are having. I've never had issues moving from 1.1 to 2.0, and I've moved at least one large project over.
One more thing I observed: the Encoding class (the ASCIIEncoding.ASCII.GetBytes(msg) method) is giving different results. When I create a key using the application under .NET 2.0 and then create a key under .NET 1.1, the two keys are different. No error message appears, but the keys do not match.
This makes me feel that .NET 2.0 is not backward compatible with .NET 1.1.
If anybody has faced the same problem, please throw some light on this...
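One way to narrow this down is to dump the actual bytes on both runtimes and compare them; a sketch (the input string is hypothetical):

```csharp
using System;
using System.Text;

// A string containing a character outside the 7-bit ASCII range.
string msg = "Key–Data";   // note the non-ASCII dash

byte[] bytes = Encoding.ASCII.GetBytes(msg);

// Characters the ASCII encoding cannot represent are replaced with '?' (0x3F).
// Running this on both runtimes and comparing the hex dumps shows exactly
// which bytes differ, which is far more useful than comparing final keys.
Console.WriteLine(BitConverter.ToString(bytes));
```

If the dumps differ, the input string itself (culture, string formatting, etc.) is the more likely suspect than GetBytes.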
The only reason backward compatibility should matter is that you want to use .NET 2.0 features or improvements. In that case, you will have to correct for incompatibilities. You cannot prevent Microsoft from correcting bugs and potentially unsafe code.
Microsoft designed the .NET framework to avoid compatibility problems as far as possible. You can specify which DLL versions must be used in an application. I recommend reading Jeffrey Richter's book Applied Microsoft .NET Framework Programming (or something like that). In this book you'll find four chapters on deployment, assemblies and versioning. This addresses exactly your topic (which is very complex).
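As a sketch of one mechanism that book covers: an app.config binding redirect that forces a specific assembly version to be loaded (the assembly name, token and version numbers here are made up):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="SomeLibrary" publicKeyToken="32ab4ba45e0a69a1" />
        <!-- Any request for an old version is redirected to the deployed one. -->
        <bindingRedirect oldVersion="1.0.0.0-1.5.0.0" newVersion="2.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```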
It cannot be 100% compatible unless either no existing class is ever modified or each framework version also contains all prior versions. Neither is really an option.
I don't remember exactly where this happened, but I had a case where .NET 1.1 had a small bug and I had to build a workaround into my code to get around it. After moving to .NET 2.0, I realized that this bug had been fixed (which is a good thing), but as a result my workaround (logically) began throwing errors.
Sure, backward compatibility is a great thing in theory; in practice we call it "DLL Hell", and it is a blessing Microsoft didn't make that mistake with the .NET Framework. As far as I remember, you can set your application to not even attempt to run on the 2.0 Framework (I do wish this had been the default setting, but we can't have it all).
Long time since I posted anything here...
Well, I've written a small app that reads a file from the disk into objects and stores them in a list. Then, what I wanted was to edit each of these objects individually and save them back to the disk. Yeah, just a stupid editor.
So I discovered a nice thing about DataGridView: I can use it together with BindingSource to access items in a List&lt;&gt;. Since I already have the list, I just do this:
_list = new Records("C:\\somefile.list");<br />
BindingSource bs = new BindingSource(_list, "");<br />
this.dgEntries.DataSource = bs; // Set the datasource for the DataGridView<br />
Records is a class that implements the IList interface. So far, so good.
Thing is, I go ahead and delete a few rows from the list and expect to see the result. But unfortunately, the BindingSource seems unable to propagate the change correctly. No change inflicted upon the DataGridView is propagated to the list. Actually, I can't even sort the list.
Does anyone have any insight on the subject? Anything that would shed light on how to correctly use the BindingSource would help me a lot.
Have a look at one of my articles[^]. It addresses the issue with the missing sorting and filtering capabilities. For this I implemented a new class named BindingListView (which is contained in the project). It needs a type and an IList to work, and it can be bound to the DataSource property of the DataGridView. I haven't actually tested whether change operations work, but you have a good chance that they do, because I've implemented the complete IBindingListView interface, which is the interface the DataGridView internally works on no matter what you bind to its DataSource property.
Be sure to download the DataGridView version and not the DataGrid version, and please give me some feedback on whether it works or not.
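A lighter-weight alternative sketch, if you only need change propagation: wrap the items in a BindingList&lt;T&gt;, which raises the ListChanged notifications the DataGridView listens for (Record, _list and dgEntries are the poster's names; the wrapping is the only new part):

```csharp
using System.ComponentModel;
using System.Windows.Forms;

// A plain IList raises no change notifications, so the grid and the list
// drift apart. BindingList<T> implements IBindingList, so adds and deletes
// made through the grid (or through the list) propagate in both directions.
BindingList<Record> records = new BindingList<Record>(_list);

BindingSource bs = new BindingSource();
bs.DataSource = records;
this.dgEntries.DataSource = bs;

// Deleting through the BindingSource now updates the grid immediately.
bs.RemoveAt(0);
```

Note that BindingList&lt;T&gt; still does not support sorting out of the box; for that you need something like the BindingListView class described above.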
Dispose is mainly provided to allow the prompt release of objects available in finite, small quantities, like window handles or DB connections. The GC will eventually free the memory the object used, but it isn't smart enough to realize that some objects need to be disposed of ASAP to free shared resources.
Rules of thumb should not be taken for the whole hand.
Is it mandatory to call the Dispose() method of an object that implements it, or will the reserved memory be freed by the GC at a later time anyhow? If it is freed anyway, why should I use IDisposable at all?
It's not really the memory that is the concern, but unmanaged resources. If you don't call Dispose, the object has to be finalized to free the resources, and that is not guaranteed to take place at any specific time, or actually at all. If there are too many objects that need finalizing when the application ends, it won't have time to finalize them all, so it will just kill them off anyway.
Also, calling Dispose makes collecting the object more efficient, as it can then be garbage collected directly. If you don't call Dispose, the object will go through one garbage collection, be placed in the finalization queue, be finalized, and then go through a second garbage collection before it's finally freed.
If I use GC.SuppressFinalize(this) as in the example, but forget to "manually" dispose a used managed component in the Dispose method, do I then have a memory leak?
You have a potential resource leak. If this unmanaged resource happens to be unmanaged memory, you have a potential memory leak.
If the component that you did not dispose uses any unmanaged resources, they are not guaranteed to be freed. Sometimes the garbage collector will manage to free it in time anyway before it becomes a problem, sometimes not.
If Dispose() is not called, the Finalizer will do the cleanup.
Multiple calls to Dispose() do no harm.
There is just one cleanup method to maintain: Dispose(bool disposing).
And this method can be extended by derived classes.
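Putting those rules together, a sketch of the canonical pattern (MyResource is a made-up class name):

```csharp
using System;

public class MyResource : IDisposable
{
    private bool _disposed;

    public void Dispose()
    {
        Dispose(true);
        // Cleanup is done, so the finalizer has nothing left to do.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;   // multiple calls do no harm
        if (disposing)
        {
            // Called via Dispose(): free managed resources
            // (other IDisposables) here.
        }
        // Free unmanaged resources (handles, native memory) here.
        _disposed = true;
    }

    ~MyResource()
    {
        // If Dispose() was never called, the finalizer cleans up
        // unmanaged resources only.
        Dispose(false);
    }
}
```

Derived classes extend cleanup by overriding Dispose(bool) and calling base.Dispose(disposing) at the end of their override.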
As Guffa already pointed out, the IDisposable interface should only be used if the class uses unmanaged resources.